hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2e3b888b916d9dde01c5fb956823a62bdc40c899 | 2,215 | py | Python | tests/layers.py | Brickstertwo/git-commands | 87fa9a6573dd426eecece098fbadc3f5550c8976 | ["MIT"] | 1 | 2018-10-17T11:09:32.000Z | 2018-10-17T11:09:32.000Z | tests/layers.py | Brickstertwo/git-commands | 87fa9a6573dd426eecece098fbadc3f5550c8976 | ["MIT"] | 122 | 2015-01-06T19:10:23.000Z | 2017-09-26T14:22:11.000Z | tests/layers.py | Brickster/git-commands | 87fa9a6573dd426eecece098fbadc3f5550c8976 | ["MIT"] | null | null | null |
class Functional(object):
@classmethod
def setUp(cls):
pass
class Unit(object):
@classmethod
def setUp(cls):
pass
class GitAbandon(Unit):
@classmethod
def setUp(cls):
pass
class GitAbandonFunctional(Functional):
@classmethod
def setUp(cls):
pass
class GitChanges(Unit):
@classmethod
def setUp(cls):
pass
class GitChangesFunctional(Functional):
@classmethod
def setUp(cls):
pass
class GitReindex(Unit):
@classmethod
def setUp(cls):
pass
class GitReindexFunctional(Functional):
@classmethod
def setUp(cls):
pass
class GitRestash(Unit):
@classmethod
def setUp(cls):
pass
class GitRestashFunctional(Functional):
@classmethod
def setUp(cls):
pass
class GitSettings(Unit):
@classmethod
def setUp(cls):
pass
class GitSettingsFunctional(Functional):
@classmethod
def setUp(cls):
pass
class GitSnapshot(Unit):
@classmethod
def setUp(cls):
pass
class GitSnapshotFunctional(Functional):
@classmethod
def setUp(cls):
pass
class GitState(Unit):
@classmethod
def setUp(cls):
pass
class GitStateFunctional(Functional):
@classmethod
def setUp(cls):
pass
class GitStateExtensions(GitState):
@classmethod
def setUp(cls):
pass
class GitUpstream(Unit):
@classmethod
def setUp(cls):
pass
class GitUpstreamFunctional(Functional):
@classmethod
def setUp(cls):
pass
class Utils(Unit):
@classmethod
def setUp(cls):
pass
class UtilsDirectories(Utils):
@classmethod
def setUp(cls):
pass
class UtilsExecute(Utils):
@classmethod
def setUp(cls):
pass
class UtilsGit(Utils):
@classmethod
def setUp(cls):
pass
class UtilsMessages(Utils):
@classmethod
def setUp(cls):
pass
class UtilsParseActions(Utils):
@classmethod
def setUp(cls):
pass
class UtilsParseString(Utils):
@classmethod
def setUp(cls):
pass
class Issues(Functional):
@classmethod
def setUp(cls):
pass
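
# Hedged usage sketch: these classes look like nose2-style test layers (plain
# classes exposing classmethod setUp hooks). Under that assumption, a test
# case opts into a layer through a `layer` attribute; TestAbandon below is
# hypothetical, not part of this file:
#
#   import unittest
#
#   class TestAbandon(unittest.TestCase):
#       layer = GitAbandon  # GitAbandon.setUp runs before tests in this layer
#
#       def test_placeholder(self):
#           self.assertTrue(True)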
| 13.757764 | 40 | 0.623025 | 216 | 2,215 | 6.388889 | 0.157407 | 0.273913 | 0.371739 | 0.430435 | 0.724638 | 0.724638 | 0.676087 | 0 | 0 | 0 | 0 | 0 | 0.291648 | 2,215 | 160 | 41 | 13.84375 | 0.879541 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
2e503c1848eea6265fb162e5fcd1f196f7dce334 | 3,120 | py | Python | test/pyaz/reservations/reservation/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | ["MIT"] | null | null | null | test/pyaz/reservations/reservation/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | ["MIT"] | 9 | 2021-09-24T16:37:24.000Z | 2021-12-24T00:39:19.000Z | test/pyaz/reservations/reservation/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | ["MIT"] | null | null | null |
import json, subprocess
from ... pyaz_utils import get_cli_name, get_params
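
# The wrappers below all repeat the same shell-out pattern: build an `az`
# command string from the caller's locals, echo it, run it, and parse stdout
# as JSON, raising on failure. A minimal shared helper could factor that out;
# _run_az is a hypothetical name and is not used by the original functions:
def _run_az(command):
    print(command)
    output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout = output.stdout.decode("utf-8")
    stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    raise Exception(stderr)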
def list(reservation_order_id):
params = get_params(locals())
command = "az reservations reservation list " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def show(reservation_id, reservation_order_id, expand=None):
params = get_params(locals())
command = "az reservations reservation show " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def list_history(reservation_id, reservation_order_id):
params = get_params(locals())
command = "az reservations reservation list-history " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def update(reservation_order_id, reservation_id, applied_scope_type, applied_scopes=None, instance_flexibility=None):
params = get_params(locals())
command = "az reservations reservation update " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def split(reservation_order_id, reservation_id, quantity_1, quantity_2):
params = get_params(locals())
command = "az reservations reservation split " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def merge(reservation_order_id, reservation_id_1, reservation_id_2):
params = get_params(locals())
command = "az reservations reservation merge " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
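
# Hedged usage sketch (requires a logged-in Azure CLI; the GUID is a
# placeholder). Note that `list` above shadows the built-in of the same name:
#
#   reservations = list("00000000-0000-0000-0000-000000000000")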
| 35.454545 | 117 | 0.678526 | 373 | 3,120 | 5.576408 | 0.13941 | 0.080769 | 0.057692 | 0.060577 | 0.898077 | 0.832212 | 0.832212 | 0.832212 | 0.832212 | 0.725481 | 0 | 0.006515 | 0.212821 | 3,120 | 87 | 118 | 35.862069 | 0.840391 | 0 | 0 | 0.825 | 0 | 0 | 0.086538 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075 | false | 0 | 0.025 | 0 | 0.175 | 0.225 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cf2d710e81869a0f0677ffc327d1197f3b5ef6c3 | 5,830 | py | Python | ViCE_pkg/test.py | 5teffen/ViCE | 3652a4d74d50c6ffdf3a239cb00eb5799ec7b63d | ["MIT"] | 8 | 2020-03-07T11:48:36.000Z | 2022-03-20T23:57:43.000Z | ViCE_pkg/test.py | nyuvis/ViCE | 3652a4d74d50c6ffdf3a239cb00eb5799ec7b63d | ["MIT"] | 25 | 2020-02-08T11:02:42.000Z | 2020-04-21T09:52:02.000Z | ViCE_pkg/test.py | nyuvis/ViCE | 3652a4d74d50c6ffdf3a239cb00eb5799ec7b63d | ["MIT"] | 5 | 2020-04-22T07:11:26.000Z | 2022-03-19T21:23:10.000Z |
import pandas as pd
import numpy as np
import data
import model
import vice
from sklearn import model_selection
from sklearn.linear_model import LogisticRegression
import pickle
# How to deal with the preprocessing? Generate and then delete after?
path = "diabetes.csv"
model_path = 'finalized_model.sav'
d = data.Data(path)
m = model.Model().load_model(model_path)
# m.train_model(d, type='svm')
d2 = data.Data(example = "diabetes")
v = vice.Vice(d,m)
v.generate_explanation(10)
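
# Hedged reading of the call above: the integer argument presumably selects
# the index of the sample to explain (the Vice API itself is defined
# elsewhere in this package).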
# --- Setting random seed --
# np.random.seed(150)
# --- Parameters ---
# data_path = "static/data/delinquency/delinquency.csv"
# preproc_path = "static/data/delinquency/delinquency_preproc.csv"
# data_path = "static/data/heart/heart.csv"
# preproc_path = "static/data/heart/heart_preproc.csv"
# no_bins = 10
# model_path = "TBD" # Manual?
# # --- Advanced Parameters
# density_fineness = 1000
# categorical_cols = [] # Categorical columns can be customized
# monotonicity_arr = []
# df = pd.read_csv(data_path)
# feature_names = np.array(df.columns)[:-1]
# all_data = np.array(df.values)
# # -- Split data and target values --
# data = all_data[:,:-1]
# target = all_data[:,-1]
# no_samples, no_features = data.shape
# svm_model = SVM_model(data,target)
# svm_model.train_model(0.001)
# svm_model.test_model()
# index = 7
# test_sample = data[index]
#
# bins_centred, X_pos_array, init_vals, col_ranges = divide_data_bins(data,no_bins) # Note: Does not account for categorical features
#
# single_bin_result = bin_single_sample(test_sample, col_ranges)
# aggr_data = prep_for_D3_aggregation(preproc_path, data, feature_names, [0,1,2,3,4,10], bins_centred, X_pos_array, False)
# density_fineness = 1000
# all_den, all_median, all_mean = all_kernel_densities(data,feature_names,density_fineness) # Pre-load density distributions
# cols_lst = [3,9,11,12]
# anchs = False
# print(ids_with_combination(preproc_path, cols_lst, anchs))
# sample_no = 1
# locked = [1,2,3]
# monotonicity_arr = mono_finder(svm_model, data, col_ranges)
# print("MONOTONICITY ARRAY:")
# print(monotonicity_arr)
# change_vector, change_row, anchors, percent = instance_explanation(svm_model, data, data[sample_no], sample_no, X_pos_array, bins_centred,
# no_bins, monotonicity_arr, col_ranges)
# for s in range(no_samples):
# change_vector, change_row, anchors, percent = instance_explanation(svm_model, data, data[s], s, X_pos_array, bins_centred,
# no_bins, monotonicity_arr, col_ranges)
# print(change_vector)
# instance_explanation(svm_model, data, row, sample, X_pos_array,
# bins_centred, no_bins, monotonicity_arr, col_ranges)
# create_summary_file(data, target, svm_model, bins_centred, X_pos_array, init_vals, no_bins, monotonicity_arr, preproc_path)
# res = prepare_for_D3(data[sample_no], bins_centred, change_row, change_vector, anchors, percent, feature_names, False, monotonicity_arr)
| 31.684783 | 145 | 0.693997 | 792 | 5,830 | 4.787879 | 0.183081 | 0.037975 | 0.028481 | 0.044304 | 0.890295 | 0.890295 | 0.890295 | 0.890295 | 0.890295 | 0.890295 | 0 | 0.017581 | 0.190223 | 5,830 | 184 | 146 | 31.684783 | 0.785639 | 0.875643 | 0 | 0 | 0 | 0 | 0.065327 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.533333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
cf2fa6e2f10668fb2acf901e278b890d711e867f | 41 | py | Python | gesso/gesso/command_line.py | machineeeee/builder-python | a01415ef8675e5a11afaa0fe33f794f8ab2a98dc | ["Apache-2.0"] | null | null | null | gesso/gesso/command_line.py | machineeeee/builder-python | a01415ef8675e5a11afaa0fe33f794f8ab2a98dc | ["Apache-2.0"] | null | null | null | gesso/gesso/command_line.py | machineeeee/builder-python | a01415ef8675e5a11afaa0fe33f794f8ab2a98dc | ["Apache-2.0"] | null | null | null |
import gesso
def main():
gesso.gesso()
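
# Hedged note: given the module name, `main` is presumably wired up as a
# console script via setuptools entry points, e.g. (hypothetical setup.py
# fragment, not shown in this dump):
#
#   entry_points={"console_scripts": ["gesso = gesso.command_line:main"]}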
| 8.2 | 14 | 0.682927 | 6 | 41 | 4.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170732 | 41 | 4 | 15 | 10.25 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d83e2687aeda1f8b192bea1e8e40b2109354190a | 2,675 | py | Python | test/integration/tfp/tfp_models/test_disc_bounded.py | spinkney/stanc3 | 41f78705dd8fa68759df61cfbab35540edcfe103 | ["BSD-3-Clause"] | 123 | 2018-11-22T01:34:47.000Z | 2022-02-06T17:41:05.000Z | test/integration/tfp/tfp_models/test_disc_bounded.py | spinkney/stanc3 | 41f78705dd8fa68759df61cfbab35540edcfe103 | ["BSD-3-Clause"] | 947 | 2018-11-30T15:49:31.000Z | 2022-03-30T15:56:17.000Z | test/integration/tfp/tfp_models/test_disc_bounded.py | spinkney/stanc3 | 41f78705dd8fa68759df61cfbab35540edcfe103 | ["BSD-3-Clause"] | 50 | 2019-02-14T15:13:39.000Z | 2022-03-18T18:14:17.000Z |
import numpy as np__
import tensorflow as tf__
import tensorflow_probability as tfp__
tfd__ = tfp__.distributions
tfb__ = tfp__.bijectors
from tensorflow.python.ops.parallel_for import pfor as pfor__
class test_disc_bounded_model(tfd__.Distribution):
def __init__(self, n, y):
self.n = n
self.y = y
def log_prob_one_chain(self, params):
target = 0
n = self.n
y = self.y
p_binom = tf__.cast(params[0], tf__.float64)
p_binom_logit = tf__.cast(params[1], tf__.float64)
target += tf__.reduce_sum(tfd__.Binomial(n, None, p_binom).log_prob(y))
target += tf__.reduce_sum(tfd__.Binomial(n, p_binom).log_prob(y))
return target
def log_prob(self, params):
return tf__.vectorized_map(self.log_prob_one_chain, params)
def parameter_shapes(self, nchains__):
n = self.n
y = self.y
return [(nchains__, ), (nchains__, ), (nchains__, n)]
def parameter_bijectors(self):
n = self.n
y = self.y
return [tfb__.Chain([tfb__.Shift(tf__.cast(0, tf__.float64)),
tfb__.Scale(tf__.cast(1, tf__.float64) - tf__.cast(0, tf__.float64)),
tfb__.Sigmoid()]), tfb__.Identity(),
tfb__.Identity()]
def parameter_names(self):
return ["p_binom", "p_binom_logit", "cat_theta_logit"]
model = test_disc_bounded_model
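
# Hedged usage sketch (placeholder data; nothing below is part of the
# generated file). Parameters carry a leading chain axis and are scored in
# unconstrained space via the bijectors:
#
#   m = model(n=tf__.cast(10, tf__.float64), y=tf__.cast([3., 7.], tf__.float64))
#   shapes = m.parameter_shapes(4)        # one entry per parameter, 4 chains
#   bijectors = m.parameter_bijectors()   # unconstrained -> constrained maps
#   # m.log_prob(params) then scores a stack of per-chain parameter values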
import numpy as np__
import tensorflow as tf__
import tensorflow_probability as tfp__
tfd__ = tfp__.distributions
tfb__ = tfp__.bijectors
from tensorflow.python.ops.parallel_for import pfor as pfor__
class test_disc_bounded_model(tfd__.Distribution):
def __init__(self, n, y):
self.n = n
self.y = y
def log_prob_one_chain(self, params):
target = 0
n = self.n
y = self.y
p_binom = tf__.cast(params[0], tf__.float64)
p_binom_logit = tf__.cast(params[1], tf__.float64)
target += tf__.reduce_sum(tfd__.Binomial(n, None, p_binom).log_prob(y))
target += tf__.reduce_sum(tfd__.Binomial(n, p_binom_logit).log_prob(y))
return target
def log_prob(self, params):
return tf__.vectorized_map(self.log_prob_one_chain, params)
def parameter_shapes(self, nchains__):
n = self.n
y = self.y
return [(nchains__, ), (nchains__, )]
def parameter_bijectors(self):
n = self.n
y = self.y
return [tfb__.Chain([tfb__.Shift(tf__.cast(0, tf__.float64)),
tfb__.Scale(tf__.cast(1, tf__.float64) - tf__.cast(0, tf__.float64)),
tfb__.Sigmoid()]), tfb__.Identity()]
def parameter_names(self):
return ["p_binom", "p_binom_logit"]
model = test_disc_bounded_model
 | 29.722222 | 94 | 0.657196 | 380 | 2,675 | 4.086842 | 0.155263 | 0.038635 | 0.030908 | 0.051513 | 0.986478 | 0.986478 | 0.951062 | 0.951062 | 0.951062 | 0.951062 | 0 | 0.015437 | 0.225047 | 2,675 | 90 | 95 | 29.722222 | 0.733719 | 0 | 0 | 0.869565 | 0 | 0 | 0.020561 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.173913 | false | 0 | 0.115942 | 0.057971 | 0.463768 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7
d88fc3274e4dea48595d167a3036e8ac7ac1bb86 | 2,592 | py | Python | circular_economy/models.py | CzechInvest/ciis | c6102598f564a717472e5e31e7eb894bba2c8104 | ["MIT"] | 1 | 2019-05-26T22:24:01.000Z | 2019-05-26T22:24:01.000Z | circular_economy/models.py | CzechInvest/ciis | c6102598f564a717472e5e31e7eb894bba2c8104 | ["MIT"] | 6 | 2019-01-22T14:53:43.000Z | 2020-09-22T16:20:28.000Z | circular_economy/models.py | CzechInvest/ciis | c6102598f564a717472e5e31e7eb894bba2c8104 | ["MIT"] | null | null | null |
from django.db import models
from addresses.models import Address
from contacts.models import ContactPerson as MyContactPerson
from django.utils.translation import ugettext_lazy as _
import uuid
class ContactPerson(MyContactPerson):
mail = models.EmailField(blank=True)
class Keyword(models.Model):
keyword = models.TextField(max_length=32)
def __str__(self):
return self.keyword
class Company(models.Model):
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
name = models.CharField(
max_length=200,
help_text=_("Name")
)
url = models.URLField(
null=False,
blank=False
)
contact_person = models.ForeignKey(ContactPerson, on_delete=models.PROTECT)
address = models.ForeignKey(Address,
on_delete=models.PROTECT)
characteristics = models.TextField()
def __str__(self):
return self.name
class Municipality(models.Model):
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
name = models.CharField(
max_length=200,
help_text=_("Name")
)
activity = models.CharField(
max_length=200,
help_text=_("Activity name")
)
url = models.URLField(
null=False,
blank=False
)
contact_person = models.ForeignKey(ContactPerson, on_delete=models.PROTECT)
address = models.ForeignKey(Address,
on_delete=models.PROTECT)
characteristics = models.TextField()
project_description = models.TextField()
challange = models.TextField()
result = models.TextField()
keywords = models.ManyToManyField("Keyword")
def __str__(self):
return self.name
class Pilot(models.Model):
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
name = models.CharField(
max_length=200,
help_text=_("Name")
)
activity = models.CharField(
max_length=200,
help_text=_("Activity name")
)
url = models.URLField(
null=False,
blank=False
)
contact_person = models.ForeignKey(ContactPerson, on_delete=models.PROTECT)
address = models.ForeignKey(Address,
on_delete=models.PROTECT)
characteristics = models.TextField()
project_description = models.TextField()
challange = models.TextField()
result = models.TextField()
keywords = models.ManyToManyField("Keyword")
def __str__(self):
return self.name
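
# Municipality and Pilot above declare identical fields; a hedged refactoring
# sketch using an abstract base model (hypothetical name, and adopting it
# would require schema migrations):
#
#   class CaseStudyBase(models.Model):
#       id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
#       name = models.CharField(max_length=200, help_text=_("Name"))
#       # ...remaining shared fields...
#
#       class Meta:
#           abstract = True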
| 25.92 | 79 | 0.653549 | 271 | 2,592 | 6.073801 | 0.239852 | 0.09113 | 0.051033 | 0.076549 | 0.803159 | 0.791009 | 0.791009 | 0.770352 | 0.770352 | 0.770352 | 0 | 0.010309 | 0.251543 | 2,592 | 99 | 80 | 26.181818 | 0.838144 | 0 | 0 | 0.710526 | 0 | 0 | 0.020062 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.013158 | 0.065789 | 0.052632 | 0.631579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
d8d4746fc921bd34abfe9d2dab7d0f3430946153 | 237,805 | py | Python | tccli/services/iotexplorer/iotexplorer_client.py | tencentcloudapi-test/tencentcloud-cli | da9733765df2b405b83b7acff48256f31e053ab1 | ["Apache-2.0"] | null | null | null | tccli/services/iotexplorer/iotexplorer_client.py | tencentcloudapi-test/tencentcloud-cli | da9733765df2b405b83b7acff48256f31e053ab1 | ["Apache-2.0"] | null | null | null | tccli/services/iotexplorer/iotexplorer_client.py | tencentcloudapi-test/tencentcloud-cli | da9733765df2b405b83b7acff48256f31e053ab1 | ["Apache-2.0"] | null | null | null |
# -*- coding: utf-8 -*-
import os
import sys
import six
import json
import tccli.options_define as OptionsDefine
import tccli.format_output as FormatOutput
from tccli import __version__
from tccli.utils import Utils
from tccli.exceptions import ConfigurationError, ClientError, ParamError
from tencentcloud.common import credential
from tencentcloud.common.profile.http_profile import HttpProfile
from tencentcloud.common.profile.client_profile import ClientProfile
from tencentcloud.iotexplorer.v20190423 import iotexplorer_client as iotexplorer_client_v20190423
from tencentcloud.iotexplorer.v20190423 import models as models_v20190423
from jmespath import search
import time
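
# Every do<Action> helper below follows the same generated template: resolve
# credentials (CVM role, assumed STS role, or static secret id/key), build an
# HttpProfile/ClientProfile pair, instantiate the versioned IotexplorerClient,
# deserialize the CLI arguments into the matching *Request model, then poll
# the call until the optional waiter condition is met before formatting the
# response.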
def doGetCOSURL(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetCOSURLRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetCOSURL(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doModifyStudioProduct(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifyStudioProductRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifyStudioProduct(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeDevice(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeDeviceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeDevice(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
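
# Hedged CLI usage sketch for the helper above (values are placeholders; the
# exact flag set comes from the generated parameter definitions):
#
#   tccli iotexplorer DescribeDevice --ProductId ABC123DEF --DeviceName dev01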
def doDescribeTopicPolicy(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeTopicPolicyRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeTopicPolicy(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doGetProjectList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetProjectListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetProjectList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doCallDeviceActionSync(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.CallDeviceActionSyncRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.CallDeviceActionSync(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeGatewayBindDevices(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeGatewayBindDevicesRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeGatewayBindDevices(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDeleteStudioProduct(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeleteStudioProductRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeleteStudioProduct(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeDeviceBindGateway(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeDeviceBindGatewayRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeDeviceBindGateway(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doGetDeviceLocationHistory(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetDeviceLocationHistoryRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetDeviceLocationHistory(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doPublishMessage(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.PublishMessageRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.PublishMessage(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDeleteDevice(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeleteDeviceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeleteDevice(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doGetLoRaGatewayList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetLoRaGatewayListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetLoRaGatewayList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doReleaseStudioProduct(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ReleaseStudioProductRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ReleaseStudioProduct(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
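# ModifyFenceBind: builds a ModifyFenceBindRequest from `args` and dispatches
# it through IotexplorerClient.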
def doModifyFenceBind(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifyFenceBindRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifyFenceBind(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
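# DeletePositionSpace: same generated credential/profile/waiter flow; request
# model DeletePositionSpaceRequest.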
def doDeletePositionSpace(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeletePositionSpaceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeletePositionSpace(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
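# DescribeBindedProducts: generated handler for the DescribeBindedProducts
# action (DescribeBindedProductsRequest).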
def doDescribeBindedProducts(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeBindedProductsRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeBindedProducts(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
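# ModifyLoRaGateway: builds a ModifyLoRaGatewayRequest from `args` and
# dispatches it through IotexplorerClient.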
def doModifyLoRaGateway(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifyLoRaGatewayRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifyLoRaGateway(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
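# ControlDeviceData: `args` arrives as an already-parsed dict of CLI
# parameters; json.dumps plus from_json_string round-trips it into a
# ControlDeviceDataRequest, the same way every handler populates its model.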
def doControlDeviceData(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ControlDeviceDataRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ControlDeviceData(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
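# DeleteTopicRule: same generated credential/profile/waiter flow; request
# model DeleteTopicRuleRequest.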
def doDeleteTopicRule(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeleteTopicRuleRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeleteTopicRule(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
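# PublishRRPCMessage: generated handler for the PublishRRPCMessage action
# (PublishRRPCMessageRequest).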
def doPublishRRPCMessage(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.PublishRRPCMessageRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.PublishRRPCMessage(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
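# SearchPositionSpace: builds a SearchPositionSpaceRequest from `args` and
# dispatches it through IotexplorerClient.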
def doSearchPositionSpace(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.SearchPositionSpaceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.SearchPositionSpace(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
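# DeleteLoRaGateway: same generated credential/profile/waiter flow; request
# model DeleteLoRaGatewayRequest.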
def doDeleteLoRaGateway(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeleteLoRaGatewayRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeleteLoRaGateway(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
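# CreateStudioProduct: generated handler for the CreateStudioProduct action
# (CreateStudioProductRequest).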
def doCreateStudioProduct(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.CreateStudioProductRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.CreateStudioProduct(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
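# EnableTopicRule: builds an EnableTopicRuleRequest from `args` and dispatches
# it through IotexplorerClient.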
def doEnableTopicRule(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.EnableTopicRuleRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.EnableTopicRule(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
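# DescribeGatewaySubDeviceList: same generated credential/profile/waiter flow;
# request model DescribeGatewaySubDeviceListRequest.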
def doDescribeGatewaySubDeviceList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeGatewaySubDeviceListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeGatewaySubDeviceList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
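# SearchStudioProduct: generated handler for the SearchStudioProduct action
# (SearchStudioProductRequest).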
def doSearchStudioProduct(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.SearchStudioProductRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.SearchStudioProduct(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
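# ModifyTopicRule: builds a ModifyTopicRuleRequest from `args` and dispatches
# it through IotexplorerClient.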
def doModifyTopicRule(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifyTopicRuleRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifyTopicRule(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
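# ListFirmwares: same generated credential/profile/waiter flow; request model
# ListFirmwaresRequest.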
def doListFirmwares(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ListFirmwaresRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ListFirmwares(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
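# Hypothetical shell invocation of the next handler (the parameter names, the
# waiter expression, and the --waiter flag spelling are illustrative
# assumptions; only the expr/to/timeout/interval keys are actually read by
# the waiter loop above):
#
#   tccli iotexplorer CreateDevice --ProductId EXAMPLEID --DeviceName dev01 \
#       --waiter '{"expr": "Response.Status", "to": "done", "timeout": 600, "interval": 5}'
#
# CreateDevice: builds a CreateDeviceRequest from `args` and dispatches it
# through IotexplorerClient.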
def doCreateDevice(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.CreateDeviceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.CreateDevice(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
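# DeleteFenceBind: generated handler for the DeleteFenceBind action
# (DeleteFenceBindRequest).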
def doDeleteFenceBind(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeleteFenceBindRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeleteFenceBind(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
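# DescribeFenceEventList: builds a DescribeFenceEventListRequest from `args`
# and dispatches it through IotexplorerClient.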
def doDescribeFenceEventList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeFenceEventListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeFenceEventList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
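# DescribeFirmwareTask: same generated credential/profile/waiter flow; request
# model DescribeFirmwareTaskRequest.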
def doDescribeFirmwareTask(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeFirmwareTaskRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeFirmwareTask(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
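# GetPositionSpaceList: generated handler for the GetPositionSpaceList action
# (GetPositionSpaceListRequest).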
def doGetPositionSpaceList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetPositionSpaceListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetPositionSpaceList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
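# DescribeLoRaFrequency: builds a DescribeLoRaFrequencyRequest from `args` and
# dispatches it through IotexplorerClient.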
def doDescribeLoRaFrequency(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeLoRaFrequencyRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeLoRaFrequency(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
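# ModifyPositionSpace: same generated credential/profile/waiter flow; request
# model ModifyPositionSpaceRequest.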
def doModifyPositionSpace(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifyPositionSpaceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifyPositionSpace(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
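# GetTopicRuleList: generated handler for the GetTopicRuleList action
# (GetTopicRuleListRequest).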
def doGetTopicRuleList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetTopicRuleListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetTopicRuleList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
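# CreateProject: builds a CreateProjectRequest from `args` and dispatches it
# through IotexplorerClient.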
def doCreateProject(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.CreateProjectRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.CreateProject(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError:
json_obj = json.loads(result.decode('utf-8'))  # Python < 3.6: json.loads rejects bytes
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Waiter timed out: `%s` never reached `%s`; last response was %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
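# CallDeviceActionAsync: same generated credential/profile/waiter flow;
# request model CallDeviceActionAsyncRequest.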
def doCallDeviceActionAsync(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CallDeviceActionAsyncRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.CallDeviceActionAsync(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDeleteProject(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DeleteProjectRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DeleteProject(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDescribeTopicRule(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeTopicRuleRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DescribeTopicRule(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

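# Waiter sketch (added; hypothetical values -- assuming the global --waiter
# option is what populates OptionsDefine.WaiterInfo): something like
#   --waiter '{"expr": "Rule.RuleDisabled", "to": false, "timeout": 60, "interval": 5}'
# would make the loop above re-call DescribeTopicRule every 5 seconds until
# the expression's value matches, or raise ClientError after 60 seconds.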
def doUnbindProducts(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.UnbindProductsRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.UnbindProducts(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doGetBatchProductionsList(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.GetBatchProductionsListRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.GetBatchProductionsList(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDescribeDeviceData(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeDeviceDataRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DescribeDeviceData(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDirectBindDeviceInFamily(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DirectBindDeviceInFamilyRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DirectBindDeviceInFamily(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doCreatePositionFence(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreatePositionFenceRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.CreatePositionFence(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doCreateLoRaGateway(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateLoRaGatewayRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.CreateLoRaGateway(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doCreateTopicRule(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateTopicRuleRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.CreateTopicRule(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doCreateLoRaFrequency(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateLoRaFrequencyRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.CreateLoRaFrequency(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doCreateTopicPolicy(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateTopicPolicyRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.CreateTopicPolicy(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doUploadFirmware(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.UploadFirmwareRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.UploadFirmware(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doUpdateDevicesEnableState(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.UpdateDevicesEnableStateRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.UpdateDevicesEnableState(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doModifyProject(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.ModifyProjectRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.ModifyProject(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doBindProducts(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.BindProductsRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.BindProducts(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDescribeFenceBindList(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeFenceBindListRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DescribeFenceBindList(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doModifyTopicPolicy(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.ModifyTopicPolicyRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.ModifyTopicPolicy(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDescribeSpaceFenceEventList(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeSpaceFenceEventListRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DescribeSpaceFenceEventList(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doUpdateFirmware(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.UpdateFirmwareRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.UpdateFirmware(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDescribeGatewaySubProducts(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeGatewaySubProductsRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DescribeGatewaySubProducts(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDescribeDevicePositionList(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeDevicePositionListRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DescribeDevicePositionList(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doBindDevices(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.BindDevicesRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.BindDevices(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doCreateBatchProduction(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateBatchProductionRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.CreateBatchProduction(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDisableTopicRule(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DisableTopicRuleRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.DisableTopicRule(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doGetFamilyDeviceUserList(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
        cred = credential.CVMRoleCredential()
    elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
        cred = credential.STSAssumeRoleCredential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
            g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
        )
    else:
        cred = credential.Credential(
            g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
        )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint],
        proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.GetFamilyDeviceUserListRequest()
    model.from_json_string(json.dumps(args))
    start_time = time.time()
    while True:
        rsp = client.GetFamilyDeviceUserList(model)
        result = rsp.to_json_string()
        try:
            json_obj = json.loads(result)
        except TypeError as e:
            json_obj = json.loads(result.decode('utf-8'))  # python3.3
        if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
            break
        cur_time = time.time()
        if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
            raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
            (g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
            search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
        else:
            print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
        time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doGetStudioProductList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetStudioProductListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetStudioProductList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeBatchProduction(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeBatchProductionRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeBatchProduction(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDeleteDevices(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeleteDevicesRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeleteDevices(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doGetGatewaySubDeviceList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetGatewaySubDeviceListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetGatewaySubDeviceList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doListEventHistory(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ListEventHistoryRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ListEventHistory(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeDeviceDataHistory(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeDeviceDataHistoryRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeDeviceDataHistory(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doModifyPositionFence(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifyPositionFenceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifyPositionFence(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeStudioProduct(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeStudioProductRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeStudioProduct(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribePositionFenceList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribePositionFenceListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribePositionFenceList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDeleteLoRaFrequency(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeleteLoRaFrequencyRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeleteLoRaFrequency(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doModifySpaceProperty(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifySpacePropertyRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifySpaceProperty(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDeleteTopicPolicy(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeleteTopicPolicyRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeleteTopicPolicy(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doGetDeviceList(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GetDeviceListRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GetDeviceList(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doListTopicPolicy(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ListTopicPolicyRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ListTopicPolicy(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doGenSingleDeviceSignatureOfPublic(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.GenSingleDeviceSignatureOfPublicRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.GenSingleDeviceSignatureOfPublic(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doModifyLoRaFrequency(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifyLoRaFrequencyRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifyLoRaFrequency(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDeletePositionFence(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DeletePositionFenceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DeletePositionFence(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doSearchTopicRule(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.SearchTopicRuleRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.SearchTopicRule(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doModifyModelDefinition(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ModifyModelDefinitionRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.ModifyModelDefinition(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doUnbindDevices(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.UnbindDevicesRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.UnbindDevices(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeProject(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeProjectRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeProject(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeModelDefinition(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeModelDefinitionRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.DescribeModelDefinition(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doCreatePositionSpace(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.CreatePositionSpaceRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.CreatePositionSpace(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doCreateFenceBind(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
if g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
cred = credential.CVMRoleCredential()
elif g_param[OptionsDefine.RoleArn.replace('-', '_')] and g_param[OptionsDefine.RoleSessionName.replace('-', '_')]:
cred = credential.STSAssumeRoleCredential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.RoleArn.replace('-', '_')],
g_param[OptionsDefine.RoleSessionName.replace('-', '_')]
)
else:
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint],
proxy=g_param[OptionsDefine.HttpsProxy.replace('-', '_')]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.IotexplorerClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.CreateFenceBindRequest()
model.from_json_string(json.dumps(args))
start_time = time.time()
while True:
rsp = client.CreateFenceBind(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
if not g_param[OptionsDefine.Waiter] or search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj) == g_param['OptionsDefine.WaiterInfo']['to']:
break
cur_time = time.time()
if cur_time - start_time >= g_param['OptionsDefine.WaiterInfo']['timeout']:
raise ClientError('Request timeout, wait `%s` to `%s` timeout, last request is %s' %
(g_param['OptionsDefine.WaiterInfo']['expr'], g_param['OptionsDefine.WaiterInfo']['to'],
search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj)))
else:
print('Inquiry result is %s.' % search(g_param['OptionsDefine.WaiterInfo']['expr'], json_obj))
time.sleep(g_param['OptionsDefine.WaiterInfo']['interval'])
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
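# Generated dispatch tables: CLIENT_MAP and MODELS_MAP resolve the requested
# API version string to the matching client and model modules, so supporting a
# new version means registering it here and in AVAILABLE_VERSION_LIST below.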
CLIENT_MAP = {
"v20190423": iotexplorer_client_v20190423,
}
MODELS_MAP = {
"v20190423": models_v20190423,
}
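# ACTION_MAP binds each CLI action name to its do<Action> handler. A sketch of
# how a caller might resolve an action (illustrative only, not an interface
# this module promises):
#
#     handler = ACTION_MAP["DisableTopicRule"]   # -> doDisableTopicRule
#     handler(args, parsed_globals)              # args: the request payload dict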
ACTION_MAP = {
"GetCOSURL": doGetCOSURL,
"ModifyStudioProduct": doModifyStudioProduct,
"DescribeDevice": doDescribeDevice,
"DescribeTopicPolicy": doDescribeTopicPolicy,
"GetProjectList": doGetProjectList,
"CallDeviceActionSync": doCallDeviceActionSync,
"DescribeGatewayBindDevices": doDescribeGatewayBindDevices,
"DeleteStudioProduct": doDeleteStudioProduct,
"DescribeDeviceBindGateway": doDescribeDeviceBindGateway,
"GetDeviceLocationHistory": doGetDeviceLocationHistory,
"PublishMessage": doPublishMessage,
"DeleteDevice": doDeleteDevice,
"GetLoRaGatewayList": doGetLoRaGatewayList,
"ReleaseStudioProduct": doReleaseStudioProduct,
"ModifyFenceBind": doModifyFenceBind,
"DeletePositionSpace": doDeletePositionSpace,
"DescribeBindedProducts": doDescribeBindedProducts,
"ModifyLoRaGateway": doModifyLoRaGateway,
"ControlDeviceData": doControlDeviceData,
"DeleteTopicRule": doDeleteTopicRule,
"PublishRRPCMessage": doPublishRRPCMessage,
"SearchPositionSpace": doSearchPositionSpace,
"DeleteLoRaGateway": doDeleteLoRaGateway,
"CreateStudioProduct": doCreateStudioProduct,
"EnableTopicRule": doEnableTopicRule,
"DescribeGatewaySubDeviceList": doDescribeGatewaySubDeviceList,
"SearchStudioProduct": doSearchStudioProduct,
"ModifyTopicRule": doModifyTopicRule,
"ListFirmwares": doListFirmwares,
"CreateDevice": doCreateDevice,
"DeleteFenceBind": doDeleteFenceBind,
"DescribeFenceEventList": doDescribeFenceEventList,
"DescribeFirmwareTask": doDescribeFirmwareTask,
"GetPositionSpaceList": doGetPositionSpaceList,
"DescribeLoRaFrequency": doDescribeLoRaFrequency,
"ModifyPositionSpace": doModifyPositionSpace,
"GetTopicRuleList": doGetTopicRuleList,
"CreateProject": doCreateProject,
"CallDeviceActionAsync": doCallDeviceActionAsync,
"DeleteProject": doDeleteProject,
"DescribeTopicRule": doDescribeTopicRule,
"UnbindProducts": doUnbindProducts,
"GetBatchProductionsList": doGetBatchProductionsList,
"DescribeDeviceData": doDescribeDeviceData,
"DirectBindDeviceInFamily": doDirectBindDeviceInFamily,
"CreatePositionFence": doCreatePositionFence,
"CreateLoRaGateway": doCreateLoRaGateway,
"CreateTopicRule": doCreateTopicRule,
"CreateLoRaFrequency": doCreateLoRaFrequency,
"CreateTopicPolicy": doCreateTopicPolicy,
"UploadFirmware": doUploadFirmware,
"UpdateDevicesEnableState": doUpdateDevicesEnableState,
"ModifyProject": doModifyProject,
"BindProducts": doBindProducts,
"DescribeFenceBindList": doDescribeFenceBindList,
"ModifyTopicPolicy": doModifyTopicPolicy,
"DescribeSpaceFenceEventList": doDescribeSpaceFenceEventList,
"UpdateFirmware": doUpdateFirmware,
"DescribeGatewaySubProducts": doDescribeGatewaySubProducts,
"DescribeDevicePositionList": doDescribeDevicePositionList,
"BindDevices": doBindDevices,
"CreateBatchProduction": doCreateBatchProduction,
"DisableTopicRule": doDisableTopicRule,
"GetFamilyDeviceUserList": doGetFamilyDeviceUserList,
"GetStudioProductList": doGetStudioProductList,
"DescribeBatchProduction": doDescribeBatchProduction,
"DeleteDevices": doDeleteDevices,
"GetGatewaySubDeviceList": doGetGatewaySubDeviceList,
"ListEventHistory": doListEventHistory,
"DescribeDeviceDataHistory": doDescribeDeviceDataHistory,
"ModifyPositionFence": doModifyPositionFence,
"DescribeStudioProduct": doDescribeStudioProduct,
"DescribePositionFenceList": doDescribePositionFenceList,
"DeleteLoRaFrequency": doDeleteLoRaFrequency,
"ModifySpaceProperty": doModifySpaceProperty,
"DeleteTopicPolicy": doDeleteTopicPolicy,
"GetDeviceList": doGetDeviceList,
"ListTopicPolicy": doListTopicPolicy,
"GenSingleDeviceSignatureOfPublic": doGenSingleDeviceSignatureOfPublic,
"ModifyLoRaFrequency": doModifyLoRaFrequency,
"DeletePositionFence": doDeletePositionFence,
"SearchTopicRule": doSearchTopicRule,
"ModifyModelDefinition": doModifyModelDefinition,
"UnbindDevices": doUnbindDevices,
"DescribeProject": doDescribeProject,
"DescribeModelDefinition": doDescribeModelDefinition,
"CreatePositionSpace": doCreatePositionSpace,
"CreateFenceBind": doCreateFenceBind,
}
AVAILABLE_VERSION_LIST = [
"v20190423",
]
def action_caller():
return ACTION_MAP
def parse_global_arg(parsed_globals):
g_param = parsed_globals
is_exist_profile = True
if not parsed_globals["profile"]:
is_exist_profile = False
g_param["profile"] = "default"
configure_path = os.path.join(os.path.expanduser("~"), ".tccli")
is_conf_exist, conf_path = Utils.file_existed(configure_path, g_param["profile"] + ".configure")
is_cred_exist, cred_path = Utils.file_existed(configure_path, g_param["profile"] + ".credential")
conf = {}
cred = {}
if is_conf_exist:
conf = Utils.load_json_msg(conf_path)
if is_cred_exist:
cred = Utils.load_json_msg(cred_path)
if not (isinstance(conf, dict) and isinstance(cred, dict)):
raise ConfigurationError(
"file: %s or %s is not json format"
% (g_param["profile"] + ".configure", g_param["profile"] + ".credential"))
if OptionsDefine.Token not in cred:
cred[OptionsDefine.Token] = None
if not is_exist_profile:
if os.environ.get(OptionsDefine.ENV_SECRET_ID) and os.environ.get(OptionsDefine.ENV_SECRET_KEY):
cred[OptionsDefine.SecretId] = os.environ.get(OptionsDefine.ENV_SECRET_ID)
cred[OptionsDefine.SecretKey] = os.environ.get(OptionsDefine.ENV_SECRET_KEY)
cred[OptionsDefine.Token] = os.environ.get(OptionsDefine.ENV_TOKEN)
if os.environ.get(OptionsDefine.ENV_REGION):
conf[OptionsDefine.Region] = os.environ.get(OptionsDefine.ENV_REGION)
if os.environ.get(OptionsDefine.ENV_ROLE_ARN) and os.environ.get(OptionsDefine.ENV_ROLE_SESSION_NAME):
cred[OptionsDefine.RoleArn] = os.environ.get(OptionsDefine.ENV_ROLE_ARN)
cred[OptionsDefine.RoleSessionName] = os.environ.get(OptionsDefine.ENV_ROLE_SESSION_NAME)
for param in g_param.keys():
if g_param[param] is None:
if param in [OptionsDefine.SecretKey, OptionsDefine.SecretId, OptionsDefine.Token]:
if param in cred:
g_param[param] = cred[param]
elif not g_param[OptionsDefine.UseCVMRole.replace('-', '_')]:
raise ConfigurationError("%s is invalid" % param)
elif param in [OptionsDefine.Region, OptionsDefine.Output]:
if param in conf:
g_param[param] = conf[param]
else:
raise ConfigurationError("%s is invalid" % param)
elif param.replace('_', '-') in [OptionsDefine.RoleArn, OptionsDefine.RoleSessionName]:
if param.replace('_', '-') in cred:
g_param[param] = cred[param.replace('_', '-')]
try:
if g_param[OptionsDefine.ServiceVersion]:
g_param[OptionsDefine.Version] = "v" + g_param[OptionsDefine.ServiceVersion].replace('-', '')
else:
version = conf["iotexplorer"][OptionsDefine.Version]
g_param[OptionsDefine.Version] = "v" + version.replace('-', '')
if g_param[OptionsDefine.Endpoint] is None:
g_param[OptionsDefine.Endpoint] = conf["iotexplorer"][OptionsDefine.Endpoint]
except Exception as err:
raise ConfigurationError("config file:%s error, %s" % (conf_path, str(err)))
if g_param[OptionsDefine.Version] not in AVAILABLE_VERSION_LIST:
raise Exception("available versions: %s" % " ".join(AVAILABLE_VERSION_LIST))
if g_param[OptionsDefine.Waiter]:
param = eval(g_param[OptionsDefine.Waiter])
if 'expr' not in param:
raise Exception('`expr` in `--waiter` must be defined')
if 'to' not in param:
raise Exception('`to` in `--waiter` must be defined')
if 'timeout' not in param:
if 'waiter' in conf and 'timeout' in conf['waiter']:
param['timeout'] = conf['waiter']['timeout']
else:
param['timeout'] = 180
if 'interval' not in param:
if 'waiter' in conf and 'interval' in conf['waiter']:
param['interval'] = conf['waiter']['interval']
else:
param['interval'] = 5
param['interval'] = min(param['interval'], param['timeout'])
g_param['OptionsDefine.WaiterInfo'] = param
# When values are read from the configuration file, json.load under Python 2 returns unicode strings, so convert the type here
if six.PY2:
for key, value in g_param.items():
if isinstance(value, six.text_type):
g_param[key] = value.encode('utf-8')
return g_param
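# Illustrative defaults (derived from the branches above): with
#   --waiter '{"expr": "Response.Status", "to": "Ready"}'
# the omitted `timeout` and `interval` keys fall back to 180s and 5s
# respectively, unless a `waiter` section in the profile's .configure file
# overrides them; the expression and target value here are placeholders.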
| 53.391334 | 155 | 0.680654 | 25,442 | 237,805 | 6.136349 | 0.017137 | 0.099039 | 0.301209 | 0.130956 | 0.926487 | 0.924712 | 0.923905 | 0.922663 | 0.921164 | 0.919883 | 0 | 0.004044 | 0.184744 | 237,805 | 4,453 | 156 | 53.403324 | 0.801235 | 0.004041 | 0 | 0.82708 | 0 | 0 | 0.139805 | 0.073989 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021645 | false | 0 | 0.003848 | 0.000241 | 0.025974 | 0.021164 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d8f14829e4e4a30049f0efda28a3287c95ae435f | 37,231 | py | Python | nl_mc.py | mohapsat/martech101 | 6f8c66df9919db1ac2319bb688a2b2fc694667a9 | [
"MIT"
] | 1 | 2022-03-14T10:49:33.000Z | 2022-03-14T10:49:33.000Z | nl_mc.py | mohapsat/martech101 | 6f8c66df9919db1ac2319bb688a2b2fc694667a9 | [
"MIT"
] | 2 | 2021-03-31T18:49:12.000Z | 2021-06-01T22:09:57.000Z | nl_mc.py | mohapsat/Martech-Weekly-100 | 6f8c66df9919db1ac2319bb688a2b2fc694667a9 | [
"MIT"
] | 1 | 2018-09-26T10:55:47.000Z | 2018-09-26T10:55:47.000Z | # python-mailchimp-api
# pip install mailchimp3
from mailchimp3 import MailChimp
import idx_config as conf
import json
client = MailChimp(mc_api=conf.MAILCHIMP_API_KEY, mc_user=conf.MAILCHIMP_USER)
# client = MailChimp(mc_api=MAILCHIMP_API_KEY, mc_user=MAILCHIMP_USER)
# client.campaigns.get(campaign_id='f25d3b3267')
# MASTER_MC_CAMPAIGN_ID = 'xxxxx' # master regular campaign object
# replicate existing campaign to create a new campaign
campaign = client.campaigns.actions.replicate(campaign_id=conf.MASTER_MC_CAMPAIGN_ID)
# Get campaign_id of new campaign
campaign_id = campaign.get('id')
# update content for the new campaign
# content_1 = {
# # 'plain_text': '',
# 'html':'<html>Hello Chimps!</html>',
# # 'archive_html': '<html>Hello world!</html>',
# '_links': [{
# 'rel': 'parent',
# 'href': 'https://us17.api.mailchimp.com/3.0/campaigns/f25d3b3267',
# 'method': 'GET',
# 'targetSchema': 'https://us17.api.mailchimp.com/schema/3.0/Definitions/Campaigns/Response.json'
#
# }, {
# 'rel': 'self',
# 'href': 'https://us17.api.mailchimp.com/3.0/campaigns/f25d3b3267/content',
# 'method': 'GET',
# 'targetSchema': 'https://us17.api.mailchimp.com/schema/3.0/Definitions/Campaigns/Content/Response.json'
#
# }, {
# 'rel': 'create',
# 'href': 'https://us17.api.mailchimp.com/3.0/campaigns/f25d3b3267/content',
# 'method': 'PUT',
# 'targetSchema': 'https://us17.api.mailchimp.com/schema/3.0/Definitions/Campaigns/Content/PUT.json'
# }]
# }
# load records from file to json object
with open('mt101_data.txt') as mt101_file:
    json_mt101_data = json.load(mt101_file)
# prep html file, mobile optimized
# html_data = '<html> Title: '+json_mt101_data[0]['title']+\
# '<br/>'\
# +'Author: '+json_mt101_data[0]['author']+'</html>'
html_data = '<html><head><meta property=\"og:title\" content=\"MARTECH101 - Top Sold and Read Marketing Books Weekly\">' \
'<meta name=\"viewport\" content=\"width=device-width, initial-scale=1, maximum-scale=1\">' \
'<meta name=\"referrer\" content=\"origin\"> ' \
'<!--[if gte mso 15]> <xml> <o:OfficeDocumentSettings> <o:AllowPNG/> <o:PixelsPerInch>96</o:PixelsPerInch> ' \
'</o:OfficeDocumentSettings> </xml> <![endif]--><meta charset=\"UTF-8\">' \
'<meta http-equiv=\"X-UA-Compatible\" content=\"IE=edge\">' \
'<meta name=\"viewport\" content=\"width=device-width, initial-scale=1\">' \
'<title>Martech Weekly - Most Sold and Read Marketing Books Weekly</title>' \
'<style type=\"text/css\">html{font-weight:300}' \
'p{margin:10px 0;padding:0}table{border-collapse:collapse}' \
'h1,h2,h3,h4,h5,h6{display:block;margin:0;padding:0;font-weight:300 !important}' \
'a{font-weight:300 !important;color:#107896 !important;text-decoration:none !important}' \
'a.title-link{color:#0c596e !important;font-size:1.1rem !important}' \
'h3.title{font-size:16px !important;line-height:1.5 !important}' \
'img,a img{border:0;height:auto;outline:none;text-decoration:none}' \
'body,#bodyTable,#bodyCell{height:100%;margin:0;padding:0;width:100%}' \
'#outlook a{padding:0}img{-ms-interpolation-mode:bicubic}table{mso-table-lspace:0pt;mso-table-rspace:0pt}' \
'.ReadMsgBody{width:100%}.ExternalClass{width:100%}p,a,li,td,blockquote{mso-line-height-rule:exactly}' \
'a[href^=tel],a[href^=sms]{color:inherit;cursor:default;text-decoration:none}' \
'p,a,li,td,body,table,blockquote{-ms-text-size-adjust:100%;-webkit-text-size-adjust:100%}' \
'.ExternalClass,.ExternalClass p,.ExternalClass td,.ExternalClass div,.ExternalClass span,.ExternalClass font{line-height:100%}a[x-apple-data-detectors]{color:inherit !important;text-decoration:none !important;font-size:inherit !important;font-family:inherit !important;font-weight:inherit !important;line-height:inherit !important}#bodyCell{padding:10px}.templateContainer{max-width:600px !important}a.mcnButton{display:block}.mcnImage{vertical-align:bottom}.mcnTextContent{word-break:break-word}' \
'.mcnTextContent img{height:auto !important}.mcnDividerBlock{table-layout:fixed !important}' \
'body,#bodyTable{background-color:white}#bodyCell{border-top:0}.templateContainer{border:0}' \
'h1{color:#202020;font-family:Helvetica;font-size:26px;font-style:normal;font-weight:bold;line-height:125%;letter-spacing:normal;text-align:left}' \
'h2{color:#202020;font-family:Helvetica;font-size:22px;font-style:normal;font-weight:bold;line-height:125%;letter-spacing:normal;text-align:left}' \
'h3{color:#202020;font-family:Helvetica;font-size:20px;font-style:normal;font-weight:bold;line-height:125%;letter-spacing:normal;text-align:left}' \
'h4{color:#202020;font-family:Helvetica;font-size:18px;font-style:normal;font-weight:bold;line-height:125%;letter-spacing:normal;text-align:left}' \
'#templatePreheader{background-color:white;border-top:0;border-bottom:0;padding-top:9px;padding-bottom:9px}' \
'#templatePreheader .mcnTextContent,#templatePreheader .mcnTextContent p{color:#656565;font-family:Helvetica;font-size:12px;line-height:150%;text-align:left}' \
'#templatePreheader .mcnTextContent a,#templatePreheader .mcnTextContent p a{color:#656565;font-weight:normal;text-decoration:underline}' \
'#templateHeader{background-color:#FFF;border-top:0;border-bottom:0;padding-top:9px;padding-bottom:0}' \
'#templateHeader .mcnTextContent,#templateHeader .mcnTextContent p{color:#202020;font-family:Helvetica;font-size:16px;line-height:150%;text-align:left}' \
'#templateHeader .mcnTextContent a,#templateHeader .mcnTextContent p a{color:#2BAADF;font-weight:normal;text-decoration:underline}' \
'#templateBody{background-color:#FFF;border-top:0;border-bottom:2px solid #EAEAEA;padding-top:0;padding-bottom:9px}' \
'#templateBody .mcnTextContent,#templateBody .mcnTextContent p{color:#202020;font-family:Helvetica;font-size:16px;line-height:150%;text-align:left}' \
'#templateBody .mcnTextContent a,#templateBody .mcnTextContent p a{color:#2BAADF;font-weight:normal;text-decoration:underline}' \
'#templateFooter{background-color:white;border-top:0;border-bottom:0;padding-top:9px;padding-bottom:9px}' \
'#templateFooter .mcnTextContent,#templateFooter .mcnTextContent p{color:#656565;font-family:Helvetica;font-size:12px;line-height:150%;text-align:center}' \
'#templateFooter .mcnTextContent a,#templateFooter .mcnTextContent p a{color:#656565;font-weight:normal;text-decoration:underline}' \
'@media only screen and (min-width:768px){.templateContainer{width:600px !important}}' \
'@media only screen and (max-width: 480px){body,table,td,p,a,li,blockquote{-webkit-text-size-adjust:none !important}}' \
'@media only screen and (max-width: 480px){body{width:100% !important;min-width:100% !important}}' \
'@media only screen and (max-width: 480px){#bodyCell{padding-top:10px !important}}' \
'@media only screen and (max-width: 480px){.mcnImage{width:100% !important}}' \
'@media only screen and (max-width: 480px){.mcnCartContainer,.mcnCaptionTopContent,.mcnRecContentContainer,.mcnCaptionBottomContent,.mcnTextContentContainer,.mcnBoxedTextContentContainer,.mcnImageGroupContentContainer,.mcnCaptionLeftTextContentContainer,.mcnCaptionRightTextContentContainer,.mcnCaptionLeftImageContentContainer,.mcnCaptionRightImageContentContainer,.mcnImageCardLeftTextContentContainer,.mcnImageCardRightTextContentContainer{max-width:100% !important;width:100% !important}}' \
'@media only screen and (max-width: 480px){.mcnBoxedTextContentContainer{min-width:100% !important}}' \
'@media only screen and (max-width: 480px){.mcnImageGroupContent{padding:9px !important}}' \
'@media only screen and (max-width: 480px){.mcnCaptionLeftContentOuter .mcnTextContent,.mcnCaptionRightContentOuter .mcnTextContent{padding-top:9px !important}}' \
'@media only screen and (max-width: 480px){.mcnImageCardTopImageContent,.mcnCaptionBlockInner .mcnCaptionTopContent:last-child .mcnTextContent{padding-top:18px !important}}' \
'@media only screen and (max-width: 480px){.mcnImageCardBottomImageContent{padding-bottom:9px !important}}' \
'@media only screen and (max-width: 480px){.mcnImageGroupBlockInner{padding-top:0 !important;padding-bottom:0 !important}}' \
'@media only screen and (max-width: 480px){.mcnImageGroupBlockOuter{padding-top:9px !important;padding-bottom:9px !important}}' \
'@media only screen and (max-width: 480px){.mcnTextContent,.mcnBoxedTextContentColumn{padding-right:18px !important;padding-left:18px !important}}' \
'@media only screen and (max-width: 480px){.mcnImageCardLeftImageContent,.mcnImageCardRightImageContent{padding-right:18px !important;padding-bottom:0 !important;padding-left:18px !important}}' \
'@media only screen and (max-width: 480px){.mcpreview-image-uploader{display:none !important;width:100% !important}}' \
'@media only screen and (max-width: 480px){h1{font-size:22px !important;line-height:125% !important}}' \
'@media only screen and (max-width: 480px){h2{font-size:20px !important;line-height:125% !important}}' \
'@media only screen and (max-width: 480px){h3{font-size:18px !important;line-height:125% !important}}' \
'@media only screen and (max-width: 480px){h4{font-size:16px !important;line-height:150% !important}}' \
'@media only screen and (max-width: 480px){.mcnBoxedTextContentContainer .mcnTextContent,.mcnBoxedTextContentContainer .mcnTextContent p{font-size:14px !important;line-height:150% !important}}' \
'@media only screen and (max-width: 480px){#templatePreheader{display:block !important}}' \
'@media only screen and (max-width: 480px){#templatePreheader .mcnTextContent,#templatePreheader .mcnTextContent p{font-size:14px !important;line-height:150% !important}}' \
'@media only screen and (max-width: 480px){#templateHeader .mcnTextContent,#templateHeader .mcnTextContent p{font-size:16px !important;line-height:150% !important}}' \
'@media only screen and (max-width: 480px){#templateBody .mcnTextContent,#templateBody .mcnTextContent p{font-size:16px !important;line-height:150% !important}}' \
'@media only screen and (max-width: 480px){#templateFooter .mcnTextContent,#templateFooter .mcnTextContent p{font-size:14px !important;line-height:150% !important}}' \
'</style></head><body id=\"archivebody\" style=\"height: 100%;margin: 0;padding: 0;width: 100%;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;background-color: white;\">' \
'<center><table align=\"left\" border=\"0\" cellpadding=\"0\" cellspacing=\"0\" style=\"max-width: 100%;min-width: 100%;border-collapse: collapse;mso-table-lspace: 0pt;mso-table-rspace: 0pt;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;\" width=\"100%\" class=\"mcnTextContentContainer\"><tbody><tr>' \
'<td valign=\"top\" class=\"mcnTextContent\" style=\"padding-top: 0;padding-right: 18px;padding-bottom: 9px;padding-left: 18px;mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;word-break: break-word;color: #202020;font-family: Helvetica;font-size: 16px;line-height: 150%;text-align: left;\">' \
'<h1 style=\"font-weight: bold;display: block;margin: 0;padding: 0;color: #202020;font-family: Helvetica;font-size: 26px;font-style: normal;line-height: 125%;letter-spacing: normal;text-align: left;\"><span><a href=\"http://www.martech101.com\" style=\"text-decoration: none;mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;color: #107896 !important;font-weight: 300 !important;\">Your weekly fix of top marketing books.</a></span></h1>' \
'<div style=\"margin-top: 40px;\"><table style=\"border-collapse: collapse;mso-table-lspace: 0pt;mso-table-rspace: 0pt;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;\"><tbody>'
# The twenty book rows share identical markup, so they are generated in a loop
# instead of being written out inline twenty times; the rendered HTML string is
# unchanged (the loop assumes at least 20 entries, as the original 0-19
# indexing did).
book_row_template = (
    '<tr><td style=\"mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;\">'
    '<div style=\"height: 100px; width: 80px; padding-right: 15px;\">'
    '<a href=\"{url}\"><img src=\"{image}\" style=\"max-height: 100px;max-width: 80px;border: 0;height: auto !important;outline: none;text-decoration: none;-ms-interpolation-mode: bicubic;\"></a></div></td>'
    '<td style=\"padding-top: 20px;mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;\">'
    '<h3 class=\"title\" style=\"display: block;margin: 0;padding: 0;color: #202020;font-family: Helvetica;font-size: 16px !important;font-style: normal;font-weight: 300 !important;line-height: 1.5 !important;letter-spacing: normal;text-align: left;\">'
    '<a class=\"title-link\" href=\"{url}\" style=\"mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;color: #0c596e !important;font-weight: 300 !important;text-decoration: none !important;font-size: 1.1rem !important;\">{rank}. {title}</a></h3></td></tr>'
)
for rank, book in enumerate(json_mt101_data[:20], start=1):
    html_data += book_row_template.format(
        rank=rank,
        url=book['detail_page_url'],
        image=book['large_image_url'],
        title=book['title'],
    )
html_data += (
    '</tbody></table><br>'
    '<p style=\"margin: 10px 0;padding: 0;mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;color: #202020;font-family: Helvetica;font-size: 16px;line-height: 150%;text-align: left;\"></p>'
    '<h3 style=\"display: block;margin: 0;padding: 0;color: #202020;font-family: Helvetica;font-size: 20px;font-style: normal;font-weight: 300 !important;line-height: 125%;letter-spacing: normal;text-align: left;\"> '
    '<a href=\"http://www.martech101.com\" style=\"mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%;color: #107896 !important;font-weight: 300 !important;text-decoration: none !important;\">See more top selling and read marketing books on Martech101...</a></h3>'
    '<p></p></div></td></tr></tbody></table></center></body></html>'
)
# def htmlfiletostr(file):
# '''
# method reads a html file and returns a str
# :return: str
# '''
# str = ""
# with open(file) as report_file:
# raw_html = report_file.readlines()
# str = ''.join(raw_html)
# return str
#
# set dyn_html_content
# dyn_html_content = htmlfiletostr("nl_sample.htm")
dyn_html_content = html_data
dyn_content = {
# 'plain_text': '',
'html': dyn_html_content,
# 'archive_html': '<html>Hello world!</html>',
'_links': [{
'rel': 'parent',
'href': 'https://us17.api.mailchimp.com/3.0/campaigns/'+campaign_id,
'method': 'GET',
'targetSchema': 'https://us17.api.mailchimp.com/schema/3.0/Definitions/Campaigns/Response.json'
}, {
'rel': 'self',
'href': 'https://us17.api.mailchimp.com/3.0/campaigns/'+campaign_id+'/content',
'method': 'GET',
'targetSchema': 'https://us17.api.mailchimp.com/schema/3.0/Definitions/Campaigns/Content/Response.json'
}, {
'rel': 'create',
'href': 'https://us17.api.mailchimp.com/3.0/campaigns/'+campaign_id+'/content',
'method': 'PUT',
'targetSchema': 'https://us17.api.mailchimp.com/schema/3.0/Definitions/Campaigns/Content/PUT.json'
}]
}
# print(json.dumps(dyn_content))
# update content of the new campaign_id
client.campaigns.content.update(campaign_id=campaign_id, data=dyn_content)
# send the latest campaign_id
client.campaigns.actions.send(campaign_id=campaign_id)
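# Optionally confirm the send afterwards, reusing the campaigns API shown
# above (this assumes the campaign object exposes a 'status' field, as
# MailChimp's campaign responses do):
# print(client.campaigns.get(campaign_id=campaign_id)['status'])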
# remove old campaign id
# TBD | 251.560811 | 29,285 | 0.718998 | 5,425 | 37,231 | 4.871152 | 0.060461 | 0.041474 | 0.07258 | 0.08749 | 0.828729 | 0.805457 | 0.799175 | 0.792553 | 0.757133 | 0.739083 | 0 | 0.063512 | 0.069619 | 37,231 | 148 | 29,286 | 251.560811 | 0.699385 | 0.051812 | 0 | 0.089552 | 0 | 1.38806 | 0.899526 | 0.47147 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.149254 | 0 | 0.149254 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11
2b5832a5619cbd9c866288f302dacdb38d63c0dc | 7,323 | bzl | Python | tests/core/nuget_single/workspace.in.bzl | renovate-bot/rules_dotnet | e19f307c6d69c2cc15a6912c1b977b664ea99843 | [
"Apache-2.0"
] | null | null | null | tests/core/nuget_single/workspace.in.bzl | renovate-bot/rules_dotnet | e19f307c6d69c2cc15a6912c1b977b664ea99843 | [
"Apache-2.0"
] | null | null | null | tests/core/nuget_single/workspace.in.bzl | renovate-bot/rules_dotnet | e19f307c6d69c2cc15a6912c1b977b664ea99843 | [
"Apache-2.0"
] | null | null | null | load("@io_bazel_rules_dotnet//dotnet:defs.bzl", "dotnet_register_toolchains", "dotnet_repositories", "mono_register_sdk", "net_register_sdk")
dotnet_repositories()
dotnet_register_toolchains()
mono_register_sdk()
load("@io_bazel_rules_dotnet//dotnet:defs.bzl", "dotnet_nuget_new", "nuget_package")
dotnet_nuget_new(
name = "npgsql",
package = "Npgsql",
version = "3.2.7",
build_file_content = """
package(default_visibility = [ "//visibility:public" ])
load("@io_bazel_rules_dotnet//dotnet:defs.bzl", "dotnet_import_library")
dotnet_import_library(
name = "lib",
src = "lib/net451/Npgsql.dll"
)
""",
)
dotnet_nuget_new(
name = "commandlineparser",
package = "commandlineparser",
version = "2.3.0",
build_file_content = """
package(default_visibility = [ "//visibility:public" ])
load("@io_bazel_rules_dotnet//dotnet:defs.bzl", "dotnet_import_library")
dotnet_import_library(
name = "lib",
src = "lib/net45/CommandLine.dll"
)
""",
)
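# With the two repositories above in place, BUILD files can depend on the
# wrapped assemblies through the targets declared in build_file_content,
# e.g. (labels follow directly from the names used above):
#   deps = ["@npgsql//:lib", "@commandlineparser//:lib"]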
nuget_package(
name = "nuget.frameworks",
package = "nuget.frameworks",
version = "4.8.0",
core_lib = {
"netcoreapp2.0": "lib/netstandard1.6/NuGet.Frameworks.dll",
"netcoreapp2.1": "lib/netstandard1.6/NuGet.Frameworks.dll",
},
net_lib = {
"net45": "lib/net40/NuGet.Frameworks.dll",
"net451": "lib/net40/NuGet.Frameworks.dll",
"net452": "lib/net40/NuGet.Frameworks.dll",
"net46": "lib/net46/NuGet.Frameworks.dll",
"net461": "lib/net46/NuGet.Frameworks.dll",
"net462": "lib/net46/NuGet.Frameworks.dll",
"net47": "lib/net46/NuGet.Frameworks.dll",
"net471": "lib/net46/NuGet.Frameworks.dll",
"net472": "lib/net46/NuGet.Frameworks.dll",
"netstandard1.6": "lib/netstandard1.6/NuGet.Frameworks.dll",
"netstandard2.0": "lib/netstandard1.6/NuGet.Frameworks.dll",
},
mono_lib = "lib/net46/NuGet.Frameworks.dll",
core_files = {
"netcoreapp2.0": [
"lib/netstandard1.6/NuGet.Frameworks.dll",
"lib/netstandard1.6/NuGet.Frameworks.xml",
],
"netcoreapp2.1": [
"lib/netstandard1.6/NuGet.Frameworks.dll",
"lib/netstandard1.6/NuGet.Frameworks.xml",
],
},
net_files = {
"net45": [
"lib/net40/NuGet.Frameworks.dll",
"lib/net40/NuGet.Frameworks.xml",
],
"net451": [
"lib/net40/NuGet.Frameworks.dll",
"lib/net40/NuGet.Frameworks.xml",
],
"net452": [
"lib/net40/NuGet.Frameworks.dll",
"lib/net40/NuGet.Frameworks.xml",
],
"net46": [
"lib/net46/NuGet.Frameworks.dll",
"lib/net46/NuGet.Frameworks.xml",
],
"net461": [
"lib/net46/NuGet.Frameworks.dll",
"lib/net46/NuGet.Frameworks.xml",
],
"net462": [
"lib/net46/NuGet.Frameworks.dll",
"lib/net46/NuGet.Frameworks.xml",
],
"net47": [
"lib/net46/NuGet.Frameworks.dll",
"lib/net46/NuGet.Frameworks.xml",
],
"net471": [
"lib/net46/NuGet.Frameworks.dll",
"lib/net46/NuGet.Frameworks.xml",
],
"net472": [
"lib/net46/NuGet.Frameworks.dll",
"lib/net46/NuGet.Frameworks.xml",
],
"netstandard1.6": [
"lib/netstandard1.6/NuGet.Frameworks.dll",
"lib/netstandard1.6/NuGet.Frameworks.xml",
],
"netstandard2.0": [
"lib/netstandard1.6/NuGet.Frameworks.dll",
"lib/netstandard1.6/NuGet.Frameworks.xml",
],
},
mono_files = [
"lib/net46/NuGet.Frameworks.dll",
"lib/net46/NuGet.Frameworks.xml",
],
)
nuget_package(
name = "nuget.common",
package = "nuget.common",
version = "4.8.0",
core_lib = {
"netcoreapp2.0": "lib/netstandard1.6/NuGet.Common.dll",
"netcoreapp2.1": "lib/netstandard1.6/NuGet.Common.dll",
},
net_lib = {
"net46": "lib/net46/NuGet.Common.dll",
"net461": "lib/net46/NuGet.Common.dll",
"net462": "lib/net46/NuGet.Common.dll",
"net47": "lib/net46/NuGet.Common.dll",
"net471": "lib/net46/NuGet.Common.dll",
"net472": "lib/net46/NuGet.Common.dll",
"netstandard1.6": "lib/netstandard1.6/NuGet.Common.dll",
"netstandard2.0": "lib/netstandard1.6/NuGet.Common.dll",
},
mono_lib = "lib/net46/NuGet.Common.dll",
core_deps = {
"net46": [
"@nuget.frameworks//:net46_net",
],
"net461": [
"@nuget.frameworks//:net461_net",
],
"net462": [
"@nuget.frameworks//:net462_net",
],
"net47": [
"@nuget.frameworks//:net47_net",
],
"net471": [
"@nuget.frameworks//:net471_net",
],
"net472": [
"@nuget.frameworks//:net472_net",
],
"netstandard1.6": [
"@nuget.frameworks//:netstandard1.6_net",
],
"netstandard2.0": [
"@nuget.frameworks//:netstandard2.0_net",
],
},
net_deps = {
"net46": [
"@nuget.frameworks//:net46_net",
],
"net461": [
"@nuget.frameworks//:net461_net",
],
"net462": [
"@nuget.frameworks//:net462_net",
],
"net47": [
"@nuget.frameworks//:net47_net",
],
"net471": [
"@nuget.frameworks//:net471_net",
],
"net472": [
"@nuget.frameworks//:net472_net",
],
"netstandard1.6": [
"@nuget.frameworks//:netstandard1.6_net",
],
"netstandard2.0": [
"@nuget.frameworks//:netstandard2.0_net",
],
},
mono_deps = [
"@nuget.frameworks//:mono",
],
core_files = {
"netcoreapp2.0": [
"lib/netstandard1.6/NuGet.Common.dll",
"lib/netstandard1.6/NuGet.Common.xml",
],
"netcoreapp2.1": [
"lib/netstandard1.6/NuGet.Common.dll",
"lib/netstandard1.6/NuGet.Common.xml",
],
},
net_files = {
"net46": [
"lib/net46/NuGet.Common.dll",
"lib/net46/NuGet.Common.xml",
],
"net461": [
"lib/net46/NuGet.Common.dll",
"lib/net46/NuGet.Common.xml",
],
"net462": [
"lib/net46/NuGet.Common.dll",
"lib/net46/NuGet.Common.xml",
],
"net47": [
"lib/net46/NuGet.Common.dll",
"lib/net46/NuGet.Common.xml",
],
"net471": [
"lib/net46/NuGet.Common.dll",
"lib/net46/NuGet.Common.xml",
],
"net472": [
"lib/net46/NuGet.Common.dll",
"lib/net46/NuGet.Common.xml",
],
"netstandard1.6": [
"lib/netstandard1.6/NuGet.Common.dll",
"lib/netstandard1.6/NuGet.Common.xml",
],
"netstandard2.0": [
"lib/netstandard1.6/NuGet.Common.dll",
"lib/netstandard1.6/NuGet.Common.xml",
],
},
mono_files = [
"lib/net46/NuGet.Common.dll",
"lib/net46/NuGet.Common.xml",
],
)
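# By the same convention used in net_deps above (e.g. "@nuget.frameworks//:net46_net"),
# the package defined here is consumed from BUILD files through
# framework-specific labels such as "@nuget.common//:netstandard2.0_net".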
#
| 29.647773 | 141 | 0.532022 | 726 | 7,323 | 5.258953 | 0.081267 | 0.239654 | 0.143007 | 0.132006 | 0.881613 | 0.875851 | 0.721844 | 0.720272 | 0.614458 | 0.592981 | 0 | 0.069614 | 0.29578 | 7,323 | 246 | 142 | 29.768293 | 0.670739 | 0 | 0 | 0.725738 | 0 | 0 | 0.560366 | 0.450697 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.016878 | 0 | 0.016878 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
2b735b7fca3cda1abfc6e97dcb24c602d06589b8 | 4,311 | py | Python | tests/snapshots/snap_test_schema_to_typed_dict.py | dan-wave-security-test/avro-to-python-types | c5c24a3e01b4adc7f914e459bd4e0eda0d2eba33 | [
"MIT"
] | null | null | null | tests/snapshots/snap_test_schema_to_typed_dict.py | dan-wave-security-test/avro-to-python-types | c5c24a3e01b4adc7f914e459bd4e0eda0d2eba33 | [
"MIT"
] | null | null | null | tests/snapshots/snap_test_schema_to_typed_dict.py | dan-wave-security-test/avro-to-python-types | c5c24a3e01b4adc7f914e459bd4e0eda0d2eba33 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['SnapshotTypedDictFromSchemaFile::test_snapshot_expandable_schemas common.ChildA.avsc'] = '''from typing import TypedDict, Optional
class common_ChildA(TypedDict):
name: str
favorite_number: Optional[int]
favorite_color: Optional[str]
'''
snapshots['SnapshotTypedDictFromSchemaFile::test_snapshot_expandable_schemas common.ChildB.avsc'] = '''from typing import TypedDict, Optional
class common_ChildB(TypedDict):
streetaddress: str
city: str
'''
snapshots['SnapshotTypedDictFromSchemaFile::test_snapshot_expandable_schemas domain.Parent.avsc'] = '''from typing import TypedDict, Optional
class common_ChildA(TypedDict):
name: str
favorite_number: Optional[int]
favorite_color: Optional[str]
class common_ChildB(TypedDict):
streetaddress: str
city: str
class domain_CompositeItem(TypedDict):
composite_a: common_ChildA
composite_b: common_ChildB
class domain_Parent(TypedDict):
first_item: common_ChildA
second_item: common_ChildA
composite_item: domain_CompositeItem
favorite_color: Optional[str]
'''
snapshots['SnapshotTypedDictFromSchemaFile::test_snapshot_self_contained_schemas nested_record.avsc'] = '''from typing import TypedDict, Optional
class example_avro_AddressUSRecord(TypedDict):
streetaddress: str
city: str
class example_avro_User(TypedDict):
name: str
favorite_number: Optional[int]
favorite_color: Optional[str]
address: example_avro_AddressUSRecord
'''
snapshots['SnapshotTypedDictFromSchemaFile::test_snapshot_self_contained_schemas nested_records.avsc'] = '''from typing import TypedDict, Optional
class example_AddressUSRecord(TypedDict):
streetaddress: str
city: str
class example_OtherThing(TypedDict):
thing1: str
thing2: Optional[int]
class example_User(TypedDict):
name: str
favorite_number: Optional[int]
favorite_color: Optional[str]
address: example_AddressUSRecord
other_thing: example_OtherThing
'''
snapshots['SnapshotTypedDictFromSchemaFile::test_snapshot_self_contained_schemas nested_records_deep.avsc'] = '''from typing import TypedDict, Optional
class example_avro_AddressUSRecord(TypedDict):
streetaddress: str
city: str
class example_avro_NextOtherThing(TypedDict):
thing1: str
thing2: Optional[int]
class example_avro_OtherThing(TypedDict):
thing1: str
other_thing: example_avro_NextOtherThing
class example_avro_User(TypedDict):
name: str
favorite_number: Optional[int]
favorite_color: Optional[str]
address: example_avro_AddressUSRecord
other_thing: example_avro_OtherThing
'''
snapshots['SnapshotTypedDictFromSchemaString::test_snapshot_all_schemas nested_record.avsc'] = '''from typing import TypedDict, Optional
class example_avro_AddressUSRecord(TypedDict):
streetaddress: str
city: str
class example_avro_User(TypedDict):
name: str
favorite_number: Optional[int]
favorite_color: Optional[str]
address: example_avro_AddressUSRecord
'''
snapshots['SnapshotTypedDictFromSchemaString::test_snapshot_all_schemas nested_records.avsc'] = '''from typing import TypedDict, Optional
class example_AddressUSRecord(TypedDict):
streetaddress: str
city: str
class example_OtherThing(TypedDict):
thing1: str
thing2: Optional[int]
class example_User(TypedDict):
name: str
favorite_number: Optional[int]
favorite_color: Optional[str]
address: example_AddressUSRecord
other_thing: example_OtherThing
'''
snapshots['SnapshotTypedDictFromSchemaString::test_snapshot_all_schemas nested_records_deep.avsc'] = '''from typing import TypedDict, Optional
class example_avro_AddressUSRecord(TypedDict):
streetaddress: str
city: str
class example_avro_NextOtherThing(TypedDict):
thing1: str
thing2: Optional[int]
class example_avro_OtherThing(TypedDict):
thing1: str
other_thing: example_avro_NextOtherThing
class example_avro_User(TypedDict):
name: str
favorite_number: Optional[int]
favorite_color: Optional[str]
address: example_avro_AddressUSRecord
other_thing: example_avro_OtherThing
'''
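# Illustrative use of one generated type (hypothetical values; assumes the
# snapshotted module above has been generated and imported):
# user: example_avro_User = {
#     'name': 'Ada',
#     'favorite_number': 42,
#     'favorite_color': None,
#     'address': {'streetaddress': '1 Example St', 'city': 'Springfield'},
# }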
| 25.814371 | 151 | 0.779865 | 469 | 4,311 | 6.886994 | 0.136461 | 0.068111 | 0.059443 | 0.055728 | 0.902477 | 0.902477 | 0.900929 | 0.849226 | 0.716409 | 0.681734 | 0 | 0.003512 | 0.141267 | 4,311 | 166 | 152 | 25.96988 | 0.86899 | 0.014382 | 0 | 0.831858 | 0 | 0 | 0.925106 | 0.387894 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.097345 | 0 | 0.097345 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
2b7a970aa344127285b5b5d06995b7c8fb17388c | 25,194 | py | Python | prospector/processing/commit/test_feature_extractor.py | pombredanne/vulnerability-assessment-kb | 3a32e3eea5413b7acd6ac70a6cba46ccc84e83d2 | [
"Apache-2.0"
] | 41 | 2019-02-21T03:27:21.000Z | 2020-06-04T13:39:52.000Z | prospector/processing/commit/test_feature_extractor.py | pombredanne/vulnerability-assessment-kb | 3a32e3eea5413b7acd6ac70a6cba46ccc84e83d2 | [
"Apache-2.0"
] | 3 | 2019-02-28T19:07:34.000Z | 2020-05-18T19:34:28.000Z | prospector/processing/commit/test_feature_extractor.py | pombredanne/vulnerability-assessment-kb | 3a32e3eea5413b7acd6ac70a6cba46ccc84e83d2 | [
"Apache-2.0"
] | 32 | 2019-02-26T16:09:35.000Z | 2020-05-30T13:47:40.000Z | import pandas
import pytest

# from datamodel import advisory
from datamodel.advisory import AdvisoryRecord
from datamodel.commit import Commit
from git.git import Git
from util.sample_data_generation import random_list_of_cve

from .feature_extractor import (  # extract_features,
    extract_changed_relevant_paths,
    extract_other_CVE_in_message,
    extract_path_similarities,
    extract_references_vuln_id,
    extract_referred_to_by_nvd,
    extract_referred_to_by_pages_linked_from_advisories,
    extract_time_between_commit_and_advisory_record,
    is_commit_in_given_interval,
    is_commit_reachable_from_given_tag,
)
from .preprocessor import preprocess_commit


@pytest.fixture
def repository():
    repo = Git("https://github.com/apache/struts")
    repo.clone()
    return repo


# def test_extract_features(repository, requests_mock):
#     requests_mock.get(
#         "https://for.testing.purposes/reference/to/some/commit/7532d2fb0d6081a12c2a48ec854a81a8b718be62"
#     )
#     requests_mock.get(
#         "https://for.testing.purposes/containing_commit_id_in_text",
#         text="some text 7532d2fb0d6081a12c2a48ec854a81a8b718be62 blah",
#     )
#     repo = repository
#     commit = repo.get_commit("7532d2fb0d6081a12c2a48ec854a81a8b718be62")
#     processed_commit = preprocess_commit(commit)
#     advisory_record = AdvisoryRecord(
#         vulnerability_id="CVE-2020-26258",
#         repository_url="https://github.com/apache/struts",
#         published_timestamp=1607532756,
#         references=[
#             "https://for.testing.purposes/reference/to/some/commit/7532d2fb0d6081a12c2a48ec854a81a8b718be62",
#             "https://for.testing.purposes/containing_commit_id_in_text",
#         ],
#         paths=["pom.xml"],
#     )
#     extracted_features = extract_features(processed_commit, advisory_record)
#     assert extracted_features.references_vuln_id is True
#     assert extracted_features.time_between_commit_and_advisory_record == 1000000
#     assert extracted_features.changes_relevant_path == [
#         "pom.xml",
#     ]
#     assert extracted_features.other_CVE_in_message == [
#         "CVE-2020-26259",
#     ]
#     assert extracted_features.referred_to_by_pages_linked_from_advisories == [
#         "https://for.testing.purposes/containing_commit_id_in_text",
#     ]
#     assert extracted_features.referred_to_by_nvd == [
#         "https://for.testing.purposes/reference/to/some/commit/7532d2fb0d6081a12c2a48ec854a81a8b718be62",
#     ]


def test_extract_references_vuln_id():
    commit = Commit(
        commit_id="test_commit",
        repository="test_repository",
        cve_refs=[
            "test_advisory_record",
            "another_advisory_record",
            "yet_another_advisory_record",
        ],
    )
    advisory_record = AdvisoryRecord(vulnerability_id="test_advisory_record")
    result = extract_references_vuln_id(commit, advisory_record)
    assert result is True
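

# Hedged reference sketch of what the test above pins down; the real
# implementation lives in feature_extractor.py and may differ in details.
def _extract_references_vuln_id_sketch(commit, advisory_record) -> bool:
    # A commit "references" the advisory when the advisory's id appears among
    # the CVE ids mentioned in the commit message.
    return advisory_record.vulnerability_id in commit.cve_refs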


def test_time_between_commit_and_advisory_record():
    commit = Commit(
        commit_id="test_commit", repository="test_repository", timestamp=142
    )
    advisory_record = AdvisoryRecord(
        vulnerability_id="test_advisory_record", published_timestamp=100
    )
    assert (
        extract_time_between_commit_and_advisory_record(commit, advisory_record) == 42
    )
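

# Hedged reference sketch consistent with the assertion above (142 - 100 == 42);
# the real implementation may handle missing timestamps differently.
def _time_between_commit_and_advisory_sketch(commit, advisory_record) -> int:
    # Positive when the commit happened after the advisory was published.
    return commit.timestamp - advisory_record.published_timestamp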


@pytest.fixture
def paths():
    return [
        "fire-nation/zuko/lightning.png",
        "water-bending/katara/necklace.gif",
        "air-nomad/aang/littlefoot.jpg",
        "earth-kingdom/toph/metal.png",
    ]


@pytest.fixture
def sub_paths():
    return [
        "lightning.png",
        "zuko/lightning.png",
        "fire-nation/zuko",
        "water-bending",
    ]


class TestExtractChangedRelevantPaths:
    @staticmethod
    def test_sub_path_matching(paths, sub_paths):
        commit = Commit(
            commit_id="test_commit", repository="test_repository", changed_files=paths
        )
        advisory_record = AdvisoryRecord(
            vulnerability_id="test_advisory_record", paths=sub_paths
        )
        matched_paths = {
            "fire-nation/zuko/lightning.png",
            "water-bending/katara/necklace.gif",
        }
        assert extract_changed_relevant_paths(commit, advisory_record) == matched_paths

    @staticmethod
    def test_same_path_only(paths):
        commit = Commit(
            commit_id="test_commit", repository="test_repository", changed_files=paths
        )
        advisory_record = AdvisoryRecord(
            vulnerability_id="test_advisory_record", paths=paths[:2]
        )
        assert extract_changed_relevant_paths(commit, advisory_record) == set(paths[:2])

    @staticmethod
    def test_same_path_and_others(paths):
        commit = Commit(
            commit_id="test_commit",
            repository="test_repository",
            changed_files=[paths[0]],
        )
        advisory_record = AdvisoryRecord(
            vulnerability_id="test_advisory_record", paths=paths[:2]
        )
        assert extract_changed_relevant_paths(commit, advisory_record) == {
            paths[0],
        }

    @staticmethod
    def test_no_match(paths):
        commit = Commit(
            commit_id="test_commit",
            repository="test_repository",
            changed_files=paths[:1],
        )
        advisory_record = AdvisoryRecord(
            vulnerability_id="test_advisory_record", paths=paths[2:]
        )
        assert extract_changed_relevant_paths(commit, advisory_record) == set()

    @staticmethod
    def test_empty_list(paths):
        commit = Commit(
            commit_id="test_commit", repository="test_repository", changed_files=[]
        )
        advisory_record = AdvisoryRecord(
            vulnerability_id="test_advisory_record", paths=paths
        )
        assert extract_changed_relevant_paths(commit, advisory_record) == set()

        commit = Commit(
            commit_id="test_commit",
            repository="test_repository",
            changed_files=paths,
        )
        advisory_record = AdvisoryRecord(
            vulnerability_id="test_advisory_record", paths=[]
        )
        assert extract_changed_relevant_paths(commit, advisory_record) == set()
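

# Hedged reference sketch matching every case in the class above: a changed
# file is relevant when any advisory path occurs inside it (full paths match
# themselves; "fire-nation/zuko" matches "fire-nation/zuko/lightning.png").
# The real matcher may be stricter about path-component boundaries.
def _extract_changed_relevant_paths_sketch(commit, advisory_record) -> set:
    return {
        changed
        for changed in commit.changed_files
        for path in advisory_record.paths
        if path and path in changed
    }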


def test_extract_other_CVE_in_message():
    commit = Commit(
        commit_id="test_commit",
        repository="test_repository",
        cve_refs=["CVE-2021-29425", "CVE-2021-21251"],
    )
    advisory_record = AdvisoryRecord(vulnerability_id="CVE-2020-31284")
    assert extract_other_CVE_in_message(commit, advisory_record) == {
        "CVE-2021-29425",
        "CVE-2021-21251",
    }
    advisory_record = AdvisoryRecord(vulnerability_id="CVE-2021-29425")
    result = extract_other_CVE_in_message(commit, advisory_record)
    assert result == {
        "CVE-2021-21251",
    }
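

# Hedged reference sketch: the "other" CVEs are simply the commit's CVE
# references minus the advisory's own id, as both assertions above require.
def _extract_other_cve_in_message_sketch(commit, advisory_record) -> set:
    return set(commit.cve_refs) - {advisory_record.vulnerability_id}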


def test_is_commit_in_given_interval():
    assert is_commit_in_given_interval(1359961896, 1359961896, 0)
    assert is_commit_in_given_interval(1359961896, 1360047896, 1)
    assert is_commit_in_given_interval(1359961896, 1359875896, -1)
    assert not is_commit_in_given_interval(1359961896, 1359871896, -1)
    assert not is_commit_in_given_interval(1359961896, 1360051896, 1)
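

# Hedged reference sketch consistent with all five assertions above: the
# second timestamp must fall within `days` days of the first, on the signed
# side given by `days` (0 requires an exact match). Argument semantics are
# inferred from the test values (86000 s is just under a day; 90000 s is over).
def _is_commit_in_given_interval_sketch(base_timestamp, timestamp, days) -> bool:
    window = days * 86400  # seconds per day
    if days >= 0:
        return base_timestamp <= timestamp <= base_timestamp + window
    return base_timestamp + window <= timestamp <= base_timestamp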


def test_extract_referred_to_by_nvd(repository):
    advisory_record = AdvisoryRecord(
        vulnerability_id="CVE-2020-26258",
        references=[
            "https://lists.apache.org/thread.html/r97993e3d78e1f5389b7b172ba9f308440830ce5f051ee62714a0aa34@%3Ccommits.struts.apache.org%3E",
            "https://other.com",
        ],
    )
    commit = Commit(
        commit_id="r97993e3d78e1f5389b7b172ba9f308440830ce5",
        repository="test_repository",
    )
    assert extract_referred_to_by_nvd(commit, advisory_record) == {
        "https://lists.apache.org/thread.html/r97993e3d78e1f5389b7b172ba9f308440830ce5f051ee62714a0aa34@%3Ccommits.struts.apache.org%3E",
    }
    commit = Commit(
        commit_id="f4d2eabd921cbd8808b9d923ee63d44538b4154f",
        repository="test_repository",
    )
    assert extract_referred_to_by_nvd(commit, advisory_record) == set()
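

# Hedged reference sketch: an advisory (NVD) reference "refers to" the commit
# when the commit id occurs inside the reference URL, as in the positive case
# above where the id is embedded in the lists.apache.org thread link.
def _extract_referred_to_by_nvd_sketch(commit, advisory_record) -> set:
    return {ref for ref in advisory_record.references if commit.commit_id in ref}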


def test_is_commit_reachable_from_given_tag(repository):
    repo = repository
    commit = repo.get_commit("7532d2fb0d6081a12c2a48ec854a81a8b718be62")
    test_commit = preprocess_commit(commit)
    advisory_record = AdvisoryRecord(
        vulnerability_id="CVE-2020-26258",
        repository_url="https://github.com/apache/struts",
        paths=["pom.xml"],
        published_timestamp=1000000,
        versions=["STRUTS_2_1_3", "STRUTS_2_3_9"],
    )
    assert not is_commit_reachable_from_given_tag(
        test_commit, advisory_record, advisory_record.versions[0]
    )
    assert is_commit_reachable_from_given_tag(
        preprocess_commit(repo.get_commit("2e19fc6670a70c13c08a3ed0927abc7366308bb1")),
        advisory_record,
        advisory_record.versions[1],
    )
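

# Hedged note on the test above: "reachable from tag" presumably means the
# commit is an ancestor of the given release tag in the cloned struts repo
# (the relation `git merge-base --is-ancestor <commit> <tag>` checks); the
# exact mechanism lives in the Git wrapper and is not shown in this file.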


def test_extract_referred_to_by_pages_linked_from_advisories(repository, requests_mock):
    requests_mock.get(
        "https://for.testing.purposes/containing_commit_id_in_text_2",
        text="some text r97993e3d78e1f5389b7b172ba9f308440830ce5 blah",
    )
    advisory_record = AdvisoryRecord(
        vulnerability_id="CVE-2020-26258",
        references=["https://for.testing.purposes/containing_commit_id_in_text_2"],
    )
    commit = Commit(
        commit_id="r97993e3d78e1f5389b7b172ba9f308440830ce5",
        repository="test_repository",
    )
    assert extract_referred_to_by_pages_linked_from_advisories(
        commit, advisory_record
    ) == {
        "https://for.testing.purposes/containing_commit_id_in_text_2",
    }
    commit = Commit(
        commit_id="f4d2eabd921cbd8808b9d923ee63d44538b4154f",
        repository="test_repository",
    )
    assert (
        extract_referred_to_by_pages_linked_from_advisories(commit, advisory_record)
        == set()
    )


def test_extract_referred_to_by_pages_linked_from_advisories_wrong_url(repository):
    advisory_record = AdvisoryRecord(
        vulnerability_id="CVE-2020-26258",
        references=["https://non-existing-url.com"],
    )
    commit = Commit(
        commit_id="r97993e3d78e1f5389b7b172ba9f308440830ce5",
        repository="test_repository",
    )
    assert not extract_referred_to_by_pages_linked_from_advisories(
        commit, advisory_record
    )
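

# Hedged reference sketch covering the two tests above: fetch each advisory
# reference and keep the URLs whose page body mentions the commit id, with
# unreachable URLs contributing nothing (the wrong_url case).
def _extract_referred_to_by_pages_sketch(commit, advisory_record) -> set:
    import requests  # assumed transport: the tests patch HTTP via requests_mock

    matching = set()
    for url in advisory_record.references:
        try:
            page_text = requests.get(url).text
        except requests.RequestException:
            continue
        if commit.commit_id in page_text:
            matching.add(url)
    return matching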


def test_extract_path_similarities():
    code_tokens = [
        "TophBeifong_Zuko_IknikBlackstoneVarrick_AsamiSato",
        "Bolin+Bumi+Ozai+Katara",
        "Jinora.Appa.Unalaq.Zaheer",
        "Naga.LinBeifong",
        "Sokka.Kya",
        "Bumi=Momo=Naga=Iroh",
        "Sokka_Unalaq",
        "Sokka.Iroh.Pabu",
        "LinBeifong=Zuko",
        "TenzinBolinSokka",
        "Korra-AsamiSato-Pabu-Iroh",
        "Mako.Naga",
        "Jinora=Bumi",
        "BolinAppaKuvira",
        "TophBeifongIroh",
        "Amon+Zuko+Unalaq",
    ]
    paths = [
        "Unalaq/Aang/Suyin Beifong",
        "Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer",
        "Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko",
        "Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi",
        "Momo",
        "Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq",
    ]
    commit = Commit(changed_files=paths)
    advisory = AdvisoryRecord(
        vulnerability_id=random_list_of_cve(max_count=1, min_count=1)[0],
        code_tokens=code_tokens,
    )
    similarities: pandas.DataFrame = extract_path_similarities(commit, advisory)
    expected = (
",changed file,code token,jaccard,sorensen-dice,otsuka-ochiai,levenshtein,damerau-levenshtein,length diff,inverted normalized levenshtein,inverted normalized damerau-levenshtein\n"
"0,Unalaq/Aang/Suyin Beifong,TophBeifong_Zuko_IknikBlackstoneVarrick_AsamiSato,0.09090909090909091,0.16666666666666666,0.17677669529663687,8,8,4,0.19999999999999996,0.19999999999999996\n"
"1,Unalaq/Aang/Suyin Beifong,Bolin+Bumi+Ozai+Katara,0.0,0.0,0.0,4,4,0,0.6,0.6\n"
"2,Unalaq/Aang/Suyin Beifong,Jinora.Appa.Unalaq.Zaheer,0.14285714285714285,0.25,0.25,4,4,0,0.6,0.6\n"
"3,Unalaq/Aang/Suyin Beifong,Naga.LinBeifong,0.16666666666666666,0.2857142857142857,0.2886751345948129,3,3,1,0.7,0.7\n"
"4,Unalaq/Aang/Suyin Beifong,Sokka.Kya,0.0,0.0,0.0,4,4,2,0.6,0.6\n"
"5,Unalaq/Aang/Suyin Beifong,Bumi=Momo=Naga=Iroh,0.0,0.0,0.0,4,4,0,0.6,0.6\n"
"6,Unalaq/Aang/Suyin Beifong,Sokka_Unalaq,0.2,0.3333333333333333,0.35355339059327373,4,4,2,0.6,0.6\n"
"7,Unalaq/Aang/Suyin Beifong,Sokka.Iroh.Pabu,0.0,0.0,0.0,4,4,1,0.6,0.6\n"
"8,Unalaq/Aang/Suyin Beifong,LinBeifong=Zuko,0.16666666666666666,0.2857142857142857,0.2886751345948129,4,4,1,0.6,0.6\n"
"9,Unalaq/Aang/Suyin Beifong,TenzinBolinSokka,0.0,0.0,0.0,4,4,1,0.6,0.6\n"
"10,Unalaq/Aang/Suyin Beifong,Korra-AsamiSato-Pabu-Iroh,0.0,0.0,0.0,5,5,1,0.5,0.5\n"
"11,Unalaq/Aang/Suyin Beifong,Mako.Naga,0.0,0.0,0.0,4,4,2,0.6,0.6\n"
"12,Unalaq/Aang/Suyin Beifong,Jinora=Bumi,0.0,0.0,0.0,4,4,2,0.6,0.6\n"
"13,Unalaq/Aang/Suyin Beifong,BolinAppaKuvira,0.0,0.0,0.0,4,4,1,0.6,0.6\n"
"14,Unalaq/Aang/Suyin Beifong,TophBeifongIroh,0.16666666666666666,0.2857142857142857,0.2886751345948129,4,4,1,0.6,0.6\n"
"15,Unalaq/Aang/Suyin Beifong,Amon+Zuko+Unalaq,0.16666666666666666,0.2857142857142857,0.2886751345948129,4,4,1,0.6,0.6\n"
"16,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,TophBeifong_Zuko_IknikBlackstoneVarrick_AsamiSato,0.25,0.4,0.4008918628686366,8,8,0,0.19999999999999996,0.19999999999999996\n"
"17,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Bolin+Bumi+Ozai+Katara,0.1,0.18181818181818182,0.1889822365046136,8,8,4,0.19999999999999996,0.19999999999999996\n"
"18,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Jinora.Appa.Unalaq.Zaheer,0.1,0.18181818181818182,0.1889822365046136,7,7,4,0.30000000000000004,0.30000000000000004\n"
"19,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Naga.LinBeifong,0.1111111111111111,0.2,0.2182178902359924,7,7,5,0.30000000000000004,0.30000000000000004\n"
"20,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Sokka.Kya,0.0,0.0,0.0,8,8,6,0.19999999999999996,0.19999999999999996\n"
"21,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Bumi=Momo=Naga=Iroh,0.1,0.18181818181818182,0.1889822365046136,8,8,4,0.19999999999999996,0.19999999999999996\n"
"22,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Sokka_Unalaq,0.0,0.0,0.0,8,8,6,0.19999999999999996,0.19999999999999996\n"
"23,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Sokka.Iroh.Pabu,0.0,0.0,0.0,8,8,5,0.19999999999999996,0.19999999999999996\n"
"24,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,LinBeifong=Zuko,0.1111111111111111,0.2,0.2182178902359924,7,7,5,0.30000000000000004,0.30000000000000004\n"
"25,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,TenzinBolinSokka,0.1111111111111111,0.2,0.2182178902359924,7,7,5,0.30000000000000004,0.30000000000000004\n"
"26,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Korra-AsamiSato-Pabu-Iroh,0.2,0.3333333333333333,0.3380617018914066,6,6,3,0.4,0.4\n"
"27,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Mako.Naga,0.0,0.0,0.0,8,8,6,0.19999999999999996,0.19999999999999996\n"
"28,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Jinora=Bumi,0.125,0.2222222222222222,0.2672612419124244,7,7,6,0.30000000000000004,0.30000000000000004\n"
"29,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,BolinAppaKuvira,0.0,0.0,0.0,8,8,5,0.19999999999999996,0.19999999999999996\n"
"30,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,TophBeifongIroh,0.1111111111111111,0.2,0.2182178902359924,7,7,5,0.30000000000000004,0.30000000000000004\n"
"31,Tenzin/Asami Sato/Suyin Beifong/Tenzin/Bumi/Zaheer,Amon+Zuko+Unalaq,0.0,0.0,0.0,8,8,5,0.19999999999999996,0.19999999999999996\n"
"32,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,TophBeifong_Zuko_IknikBlackstoneVarrick_AsamiSato,0.23076923076923078,0.375,0.375,8,8,0,0.19999999999999996,0.19999999999999996\n"
"33,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Bolin+Bumi+Ozai+Katara,0.09090909090909091,0.16666666666666666,0.17677669529663687,7,7,4,0.30000000000000004,0.30000000000000004\n"
"34,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Jinora.Appa.Unalaq.Zaheer,0.0,0.0,0.0,8,8,4,0.19999999999999996,0.19999999999999996\n"
"35,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Naga.LinBeifong,0.1,0.18181818181818182,0.20412414523193154,8,8,5,0.19999999999999996,0.19999999999999996\n"
"36,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Sokka.Kya,0.0,0.0,0.0,8,8,6,0.19999999999999996,0.19999999999999996\n"
"37,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Bumi=Momo=Naga=Iroh,0.09090909090909091,0.16666666666666666,0.17677669529663687,7,7,4,0.30000000000000004,0.30000000000000004\n"
"38,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Sokka_Unalaq,0.0,0.0,0.0,8,8,6,0.19999999999999996,0.19999999999999996\n"
"39,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Sokka.Iroh.Pabu,0.0,0.0,0.0,8,8,5,0.19999999999999996,0.19999999999999996\n"
"40,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,LinBeifong=Zuko,0.1,0.18181818181818182,0.20412414523193154,7,7,5,0.30000000000000004,0.30000000000000004\n"
"41,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,TenzinBolinSokka,0.1,0.18181818181818182,0.20412414523193154,7,7,5,0.30000000000000004,0.30000000000000004\n"
"42,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Korra-AsamiSato-Pabu-Iroh,0.18181818181818182,0.3076923076923077,0.31622776601683794,7,7,3,0.30000000000000004,0.30000000000000004\n"
"43,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Mako.Naga,0.1111111111111111,0.2,0.25,7,7,6,0.30000000000000004,0.30000000000000004\n"
"44,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Jinora=Bumi,0.0,0.0,0.0,8,8,6,0.19999999999999996,0.19999999999999996\n"
"45,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,BolinAppaKuvira,0.0,0.0,0.0,8,8,5,0.19999999999999996,0.19999999999999996\n"
"46,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,TophBeifongIroh,0.0,0.0,0.0,8,8,5,0.19999999999999996,0.19999999999999996\n"
"47,Asami Sato/Tenzin/Tonraq/Katara/Tarrlok/Naga/Zuko,Amon+Zuko+Unalaq,0.1,0.18181818181818182,0.20412414523193154,8,8,5,0.19999999999999996,0.19999999999999996\n"
"48,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,TophBeifong_Zuko_IknikBlackstoneVarrick_AsamiSato,0.3333333333333333,0.5,0.5,9,9,1,0.09999999999999998,0.09999999999999998\n"
"49,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Bolin+Bumi+Ozai+Katara,0.2,0.3333333333333333,0.35355339059327373,8,8,5,0.19999999999999996,0.19999999999999996\n"
"50,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Jinora.Appa.Unalaq.Zaheer,0.0,0.0,0.0,9,9,5,0.09999999999999998,0.09999999999999998\n"
"51,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Naga.LinBeifong,0.1,0.18181818181818182,0.20412414523193154,8,8,6,0.19999999999999996,0.19999999999999996\n"
"52,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Sokka.Kya,0.0,0.0,0.0,9,9,7,0.09999999999999998,0.09999999999999998\n"
"53,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Bumi=Momo=Naga=Iroh,0.09090909090909091,0.16666666666666666,0.17677669529663687,8,8,5,0.19999999999999996,0.19999999999999996\n"
"54,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Sokka_Unalaq,0.0,0.0,0.0,9,9,7,0.09999999999999998,0.09999999999999998\n"
"55,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Sokka.Iroh.Pabu,0.0,0.0,0.0,9,9,6,0.09999999999999998,0.09999999999999998\n"
"56,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,LinBeifong=Zuko,0.1,0.18181818181818182,0.20412414523193154,8,8,6,0.19999999999999996,0.19999999999999996\n"
"57,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,TenzinBolinSokka,0.1,0.18181818181818182,0.20412414523193154,8,8,6,0.19999999999999996,0.19999999999999996\n"
"58,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Korra-AsamiSato-Pabu-Iroh,0.18181818181818182,0.3076923076923077,0.31622776601683794,7,7,4,0.30000000000000004,0.30000000000000004\n"
"59,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Mako.Naga,0.0,0.0,0.0,9,9,7,0.09999999999999998,0.09999999999999998\n"
"60,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Jinora=Bumi,0.1111111111111111,0.2,0.25,8,8,7,0.19999999999999996,0.19999999999999996\n"
"61,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,BolinAppaKuvira,0.2222222222222222,0.36363636363636365,0.4082482904638631,8,8,6,0.19999999999999996,0.19999999999999996\n"
"62,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,TophBeifongIroh,0.2222222222222222,0.36363636363636365,0.4082482904638631,7,7,6,0.30000000000000004,0.30000000000000004\n"
"63,Amon/Asami Sato/Bumi/Kuvira/Toph Beifong/Bolin/Bumi,Amon+Zuko+Unalaq,0.1,0.18181818181818182,0.20412414523193154,8,8,6,0.19999999999999996,0.19999999999999996\n"
"64,Momo,TophBeifong_Zuko_IknikBlackstoneVarrick_AsamiSato,0.0,0.0,0.0,8,8,7,0.19999999999999996,0.19999999999999996\n"
"65,Momo,Bolin+Bumi+Ozai+Katara,0.0,0.0,0.0,4,4,3,0.6,0.6\n"
"66,Momo,Jinora.Appa.Unalaq.Zaheer,0.0,0.0,0.0,4,4,3,0.6,0.6\n"
"67,Momo,Naga.LinBeifong,0.0,0.0,0.0,3,3,2,0.7,0.7\n"
"68,Momo,Sokka.Kya,0.0,0.0,0.0,2,2,1,0.8,0.8\n"
"69,Momo,Bumi=Momo=Naga=Iroh,0.25,0.4,0.5,3,3,3,0.7,0.7\n"
"70,Momo,Sokka_Unalaq,0.0,0.0,0.0,2,2,1,0.8,0.8\n"
"71,Momo,Sokka.Iroh.Pabu,0.0,0.0,0.0,3,3,2,0.7,0.7\n"
"72,Momo,LinBeifong=Zuko,0.0,0.0,0.0,3,3,2,0.7,0.7\n"
"73,Momo,TenzinBolinSokka,0.0,0.0,0.0,3,3,2,0.7,0.7\n"
"74,Momo,Korra-AsamiSato-Pabu-Iroh,0.0,0.0,0.0,5,5,4,0.5,0.5\n"
"75,Momo,Mako.Naga,0.0,0.0,0.0,2,2,1,0.8,0.8\n"
"76,Momo,Jinora=Bumi,0.0,0.0,0.0,2,2,1,0.8,0.8\n"
"77,Momo,BolinAppaKuvira,0.0,0.0,0.0,3,3,2,0.7,0.7\n"
"78,Momo,TophBeifongIroh,0.0,0.0,0.0,3,3,2,0.7,0.7\n"
"79,Momo,Amon+Zuko+Unalaq,0.0,0.0,0.0,3,3,2,0.7,0.7\n"
"80,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,TophBeifong_Zuko_IknikBlackstoneVarrick_AsamiSato,0.13333333333333333,0.23529411764705882,0.23570226039551587,9,9,2,0.09999999999999998,0.09999999999999998\n"
"81,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Bolin+Bumi+Ozai+Katara,0.08333333333333333,0.15384615384615385,0.16666666666666666,9,9,6,0.09999999999999998,0.09999999999999998\n"
"82,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Jinora.Appa.Unalaq.Zaheer,0.08333333333333333,0.15384615384615385,0.16666666666666666,10,10,6,0.0,0.0\n"
"83,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Naga.LinBeifong,0.2,0.3333333333333333,0.3849001794597505,8,8,7,0.19999999999999996,0.19999999999999996\n"
"84,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Sokka.Kya,0.1,0.18181818181818182,0.23570226039551587,9,9,8,0.09999999999999998,0.09999999999999998\n"
"85,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Bumi=Momo=Naga=Iroh,0.0,0.0,0.0,10,10,6,0.0,0.0\n"
"86,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Sokka_Unalaq,0.2222222222222222,0.36363636363636365,0.47140452079103173,8,8,8,0.19999999999999996,0.19999999999999996\n"
"87,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Sokka.Iroh.Pabu,0.09090909090909091,0.16666666666666666,0.19245008972987526,9,9,7,0.09999999999999998,0.09999999999999998\n"
"88,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,LinBeifong=Zuko,0.2,0.3333333333333333,0.3849001794597505,8,8,7,0.19999999999999996,0.19999999999999996\n"
"89,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,TenzinBolinSokka,0.2,0.3333333333333333,0.3849001794597505,8,8,7,0.19999999999999996,0.19999999999999996\n"
"90,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Korra-AsamiSato-Pabu-Iroh,0.07692307692307693,0.14285714285714285,0.14907119849998599,10,10,5,0.0,0.0\n"
"91,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Mako.Naga,0.1,0.18181818181818182,0.23570226039551587,9,9,8,0.09999999999999998,0.09999999999999998\n"
"92,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Jinora=Bumi,0.0,0.0,0.0,10,10,8,0.0,0.0\n"
"93,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,BolinAppaKuvira,0.2,0.3333333333333333,0.3849001794597505,9,9,7,0.09999999999999998,0.09999999999999998\n"
"94,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,TophBeifongIroh,0.2,0.3333333333333333,0.3849001794597505,8,8,7,0.19999999999999996,0.19999999999999996\n"
"95,Kuvira/Bolin/Lin Beifong/Sokka/Mako/Korra/Toph Beifong/Unalaq,Amon+Zuko+Unalaq,0.09090909090909091,0.16666666666666666,0.19245008972987526,9,9,7,0.09999999999999998,0.09999999999999998\n"
    )
    assert similarities.to_csv() == expected
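

# Hedged worked example: reproduce the set-based columns of row 6 of the
# expected CSV above ("Unalaq/Aang/Suyin Beifong" vs "Sokka_Unalaq"). The
# tokenisation used here (split on delimiters, lowercase) is an assumption
# inferred from the expected numbers, not taken from the implementation.
def _set_similarity_demo():
    import math
    import re

    a = set(re.split(r"[/ _]", "Unalaq/Aang/Suyin Beifong".lower()))  # 4 tokens
    b = set(re.split(r"[/ _]", "Sokka_Unalaq".lower()))  # 2 tokens
    inter = len(a & b)  # {"unalaq"} -> 1
    assert inter / len(a | b) == 0.2  # jaccard
    assert 2 * inter / (len(a) + len(b)) == 1 / 3  # sorensen-dice
    assert inter / math.sqrt(len(a) * len(b)) == 1 / math.sqrt(8)  # otsuka-ochiai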
| 56.111359 | 232 | 0.725927 | 3,559 | 25,194 | 5.011239 | 0.095532 | 0.026577 | 0.03095 | 0.030502 | 0.808298 | 0.739221 | 0.700533 | 0.667171 | 0.546454 | 0.4582 | 0 | 0.271571 | 0.138406 | 25,194 | 448 | 233 | 56.236607 | 0.550053 | 0.066802 | 0 | 0.228412 | 0 | 0.275766 | 0.636588 | 0.503238 | 0 | 0 | 0 | 0 | 0.064067 | 1 | 0.047354 | false | 0 | 0.022284 | 0.005571 | 0.08078 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
993de489f75ab00ab6721da3fe00087160888e67 | 30,223 | py | Python | operations/migrations/0001_initial.py | mark-bondo/moondance | 3347c3fb8ac3e40a5c66b61a21cfb562841531ba | [
"MIT"
] | null | null | null | operations/migrations/0001_initial.py | mark-bondo/moondance | 3347c3fb8ac3e40a5c66b61a21cfb562841531ba | [
"MIT"
] | null | null | null | operations/migrations/0001_initial.py | mark-bondo/moondance | 3347c3fb8ac3e40a5c66b61a21cfb562841531ba | [
"MIT"
] | null | null | null | # Generated by Django 3.1.5 on 2021-05-31 14:58
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import simple_history.models
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name="Product_Code",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"_active",
models.BooleanField(default=True, verbose_name="Is Active"),
),
(
"_created",
models.DateTimeField(
auto_now_add=True, verbose_name="Datetime Created"
),
),
(
"_last_updated",
models.DateTimeField(
auto_now=True, verbose_name="Datetime Updated"
),
),
(
"type",
models.CharField(
choices=[
("Finished Goods", "Finished Goods"),
("WIP", "WIP"),
("WIP - Labor", "WIP - Labor"),
("Raw Materials", "Raw Materials"),
("Labor", "Labor"),
],
max_length=200,
),
),
("family", models.CharField(max_length=200)),
("category", models.CharField(max_length=200, unique=True)),
(
"freight_factor_percentage",
models.DecimalField(
blank=True,
decimal_places=2,
help_text="Percentage adder to material cost. Use whole numbers with 2 decimals maximum.",
max_digits=12,
null=True,
),
),
(
"sales_channel_type",
models.CharField(
blank=True,
choices=[
("All", "All"),
("FBA", "FBA"),
("MoonDance", "MoonDance"),
],
max_length=100,
null=True,
),
),
(
"_created_by",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="product_code_created_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"_last_updated_by",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="product_code_last_updated_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Updated By",
),
),
],
options={
"verbose_name": "Product Category",
"verbose_name_plural": "Product Categories",
"ordering": ("type", "family", "category"),
},
),
migrations.CreateModel(
name="Product",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"_active",
models.BooleanField(default=True, verbose_name="Is Active"),
),
(
"_created",
models.DateTimeField(
auto_now_add=True, verbose_name="Datetime Created"
),
),
(
"_last_updated",
models.DateTimeField(
auto_now=True, verbose_name="Datetime Updated"
),
),
(
"sku",
models.CharField(max_length=200, unique=True, verbose_name="SKU"),
),
("description", models.CharField(max_length=200)),
(
"upc",
models.CharField(
blank=True, max_length=200, null=True, verbose_name="UPC"
),
),
(
"unit_of_measure",
models.CharField(
choices=[
("grams", "grams"),
("oz", "oz"),
("lbs", "lbs"),
("each", "each"),
("minutes", "minutes"),
],
default="lbs",
max_length=200,
),
),
(
"unit_weight",
models.DecimalField(
blank=True, decimal_places=2, max_digits=12, null=True
),
),
(
"unit_sales_price",
models.DecimalField(
blank=True, decimal_places=2, max_digits=12, null=True
),
),
(
"unit_material_cost",
models.DecimalField(
blank=True, decimal_places=5, max_digits=12, null=True
),
),
(
"unit_labor_cost",
models.DecimalField(
blank=True, decimal_places=5, max_digits=12, null=True
),
),
("notes", models.TextField(blank=True, null=True)),
(
"_created_by",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="product_created_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"_last_updated_by",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="product_last_updated_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Updated By",
),
),
(
"product_code",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="Product_product_code_fk",
to="operations.product_code",
),
),
],
options={
"verbose_name": "Product",
"verbose_name_plural": "Product List",
"ordering": ("sku",),
},
),
migrations.CreateModel(
name="Order_Cost_Overlay",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"_active",
models.BooleanField(default=True, verbose_name="Is Active"),
),
(
"_created",
models.DateTimeField(
auto_now_add=True, verbose_name="Datetime Created"
),
),
(
"_last_updated",
models.DateTimeField(
auto_now=True, verbose_name="Datetime Updated"
),
),
(
"sales_channel",
models.CharField(
choices=[
("Shopify Website", "Shopify Website"),
("Amazon FBA", "Amazon FBA"),
("Amazon FBM", "Amazon FBM"),
("POS", "POS"),
],
max_length=200,
),
),
("name", models.CharField(max_length=200, unique=True)),
(
"type",
models.CharField(
choices=[
("Fulfillment Labor", "Fulfillment Labor"),
("Shipping Materials", "Shipping Materials"),
],
max_length=200,
),
),
(
"apply_to",
models.CharField(
choices=[
("Each Order", "Each Order"),
("Each Order Line", "Each Order Line"),
],
max_length=200,
),
),
(
"labor_hourly_rate",
models.DecimalField(
blank=True, decimal_places=2, max_digits=5, null=True
),
),
("labor_minutes", models.IntegerField(blank=True, null=True)),
(
"material_cost",
models.DecimalField(
blank=True, decimal_places=2, max_digits=16, null=True
),
),
(
"_created_by",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="order_cost_overlay_created_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"_last_updated_by",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="order_cost_overlay_last_updated_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Updated By",
),
),
],
options={
"verbose_name": "Cost Overlay",
"verbose_name_plural": "Cost Overlays",
"ordering": ("sales_channel", "name"),
},
),
migrations.CreateModel(
name="HistoricalRecipe_Line",
fields=[
(
"id",
models.IntegerField(
auto_created=True, blank=True, db_index=True, verbose_name="ID"
),
),
(
"_active",
models.BooleanField(default=True, verbose_name="Is Active"),
),
(
"_created",
models.DateTimeField(
blank=True, editable=False, verbose_name="Datetime Created"
),
),
(
"_last_updated",
models.DateTimeField(
blank=True, editable=False, verbose_name="Datetime Updated"
),
),
("quantity", models.DecimalField(decimal_places=5, max_digits=12)),
(
"unit_of_measure",
models.CharField(
choices=[
("grams", "grams"),
("oz", "oz"),
("lbs", "lbs"),
("each", "each"),
("minutes", "minutes"),
],
default="grams",
max_length=200,
),
),
("history_id", models.AutoField(primary_key=True, serialize=False)),
("history_date", models.DateTimeField()),
("history_change_reason", models.CharField(max_length=100, null=True)),
(
"history_type",
models.CharField(
choices=[("+", "Created"), ("~", "Changed"), ("-", "Deleted")],
max_length=1,
),
),
(
"_created_by",
models.ForeignKey(
blank=True,
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"_last_updated_by",
models.ForeignKey(
blank=True,
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Updated By",
),
),
(
"history_user",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to=settings.AUTH_USER_MODEL,
),
),
(
"sku",
models.ForeignKey(
blank=True,
db_constraint=False,
limit_choices_to=models.Q(
_negated=True, product_code__type__in=["Finished Goods"]
),
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to="operations.product",
),
),
(
"sku_parent",
models.ForeignKey(
blank=True,
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to="operations.product",
),
),
],
options={
"verbose_name": "historical Recipe Line",
"ordering": ("-history_date", "-history_id"),
"get_latest_by": "history_date",
},
bases=(simple_history.models.HistoricalChanges, models.Model),
),
migrations.CreateModel(
name="HistoricalProduct",
fields=[
(
"id",
models.IntegerField(
auto_created=True, blank=True, db_index=True, verbose_name="ID"
),
),
(
"_active",
models.BooleanField(default=True, verbose_name="Is Active"),
),
(
"_created",
models.DateTimeField(
blank=True, editable=False, verbose_name="Datetime Created"
),
),
(
"_last_updated",
models.DateTimeField(
blank=True, editable=False, verbose_name="Datetime Updated"
),
),
(
"sku",
models.CharField(db_index=True, max_length=200, verbose_name="SKU"),
),
("description", models.CharField(max_length=200)),
(
"upc",
models.CharField(
blank=True, max_length=200, null=True, verbose_name="UPC"
),
),
(
"unit_of_measure",
models.CharField(
choices=[
("grams", "grams"),
("oz", "oz"),
("lbs", "lbs"),
("each", "each"),
("minutes", "minutes"),
],
default="lbs",
max_length=200,
),
),
(
"unit_weight",
models.DecimalField(
blank=True, decimal_places=2, max_digits=12, null=True
),
),
(
"unit_sales_price",
models.DecimalField(
blank=True, decimal_places=2, max_digits=12, null=True
),
),
(
"unit_material_cost",
models.DecimalField(
blank=True, decimal_places=5, max_digits=12, null=True
),
),
(
"unit_labor_cost",
models.DecimalField(
blank=True, decimal_places=5, max_digits=12, null=True
),
),
("notes", models.TextField(blank=True, null=True)),
("history_id", models.AutoField(primary_key=True, serialize=False)),
("history_date", models.DateTimeField()),
("history_change_reason", models.CharField(max_length=100, null=True)),
(
"history_type",
models.CharField(
choices=[("+", "Created"), ("~", "Changed"), ("-", "Deleted")],
max_length=1,
),
),
(
"_created_by",
models.ForeignKey(
blank=True,
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"_last_updated_by",
models.ForeignKey(
blank=True,
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Updated By",
),
),
(
"history_user",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to=settings.AUTH_USER_MODEL,
),
),
(
"product_code",
models.ForeignKey(
blank=True,
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to="operations.product_code",
),
),
],
options={
"verbose_name": "historical Product",
"ordering": ("-history_date", "-history_id"),
"get_latest_by": "history_date",
},
bases=(simple_history.models.HistoricalChanges, models.Model),
),
migrations.CreateModel(
name="HistoricalOrder_Cost_Overlay",
fields=[
(
"id",
models.IntegerField(
auto_created=True, blank=True, db_index=True, verbose_name="ID"
),
),
(
"_active",
models.BooleanField(default=True, verbose_name="Is Active"),
),
(
"_created",
models.DateTimeField(
blank=True, editable=False, verbose_name="Datetime Created"
),
),
(
"_last_updated",
models.DateTimeField(
blank=True, editable=False, verbose_name="Datetime Updated"
),
),
(
"sales_channel",
models.CharField(
choices=[
("Shopify Website", "Shopify Website"),
("Amazon FBA", "Amazon FBA"),
("Amazon FBM", "Amazon FBM"),
("POS", "POS"),
],
max_length=200,
),
),
("name", models.CharField(db_index=True, max_length=200)),
(
"type",
models.CharField(
choices=[
("Fulfillment Labor", "Fulfillment Labor"),
("Shipping Materials", "Shipping Materials"),
],
max_length=200,
),
),
(
"apply_to",
models.CharField(
choices=[
("Each Order", "Each Order"),
("Each Order Line", "Each Order Line"),
],
max_length=200,
),
),
(
"labor_hourly_rate",
models.DecimalField(
blank=True, decimal_places=2, max_digits=5, null=True
),
),
("labor_minutes", models.IntegerField(blank=True, null=True)),
(
"material_cost",
models.DecimalField(
blank=True, decimal_places=2, max_digits=16, null=True
),
),
("history_id", models.AutoField(primary_key=True, serialize=False)),
("history_date", models.DateTimeField()),
("history_change_reason", models.CharField(max_length=100, null=True)),
(
"history_type",
models.CharField(
choices=[("+", "Created"), ("~", "Changed"), ("-", "Deleted")],
max_length=1,
),
),
(
"_created_by",
models.ForeignKey(
blank=True,
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"_last_updated_by",
models.ForeignKey(
blank=True,
db_constraint=False,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Updated By",
),
),
(
"history_user",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to=settings.AUTH_USER_MODEL,
),
),
],
options={
"verbose_name": "historical Cost Overlay",
"ordering": ("-history_date", "-history_id"),
"get_latest_by": "history_date",
},
bases=(simple_history.models.HistoricalChanges, models.Model),
),
migrations.CreateModel(
name="Recipe_Line",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"_active",
models.BooleanField(default=True, verbose_name="Is Active"),
),
(
"_created",
models.DateTimeField(
auto_now_add=True, verbose_name="Datetime Created"
),
),
(
"_last_updated",
models.DateTimeField(
auto_now=True, verbose_name="Datetime Updated"
),
),
("quantity", models.DecimalField(decimal_places=5, max_digits=12)),
(
"unit_of_measure",
models.CharField(
choices=[
("grams", "grams"),
("oz", "oz"),
("lbs", "lbs"),
("each", "each"),
("minutes", "minutes"),
],
default="grams",
max_length=200,
),
),
(
"_created_by",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="recipe_line_created_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"_last_updated_by",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="recipe_line_last_updated_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Updated By",
),
),
(
"sku",
models.ForeignKey(
limit_choices_to=models.Q(
_negated=True, product_code__type__in=["Finished Goods"]
),
on_delete=django.db.models.deletion.PROTECT,
related_name="Recipe_sku_fk",
to="operations.product",
),
),
(
"sku_parent",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="Recipe_sku_parent_fk",
to="operations.product",
),
),
],
options={
"verbose_name": "Recipe Line",
"verbose_name_plural": "Recipe Lines",
"ordering": ("sku_parent", "sku"),
"unique_together": {("sku", "sku_parent")},
},
),
]
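# Hedged sketch (not part of the migration): the kind of models.py definition
# the "Recipe_Line" CreateModel above is typically generated from; the actual
# operations/models.py may differ in details:
#
#     class Recipe_Line(models.Model):
#         quantity = models.DecimalField(max_digits=12, decimal_places=5)
#         sku = models.ForeignKey(Product, on_delete=models.PROTECT, related_name="Recipe_sku_fk")
#         sku_parent = models.ForeignKey(Product, on_delete=models.PROTECT, related_name="Recipe_sku_parent_fk")
#
#         class Meta:
#             unique_together = {("sku", "sku_parent")}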
| 37.731586 | 114 | 0.352215 | 1,878 | 30,223 | 5.422258 | 0.090522 | 0.061573 | 0.032996 | 0.051851 | 0.887459 | 0.879603 | 0.872042 | 0.842483 | 0.836492 | 0.836492 | 0 | 0.010214 | 0.556199 | 30,223 | 800 | 115 | 37.77875 | 0.748975 | 0.001489 | 0 | 0.80454 | 1 | 0 | 0.11824 | 0.01299 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.005044 | 0 | 0.010088 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
99705c2f48ee40fccd34305029bb10d8b28efeb3 | 3,135 | py | Python | data/variables.py | mkumatag/openbmc-automation | 4800237bcf6f516387b5510699396154b05a5b34 | [
"Apache-2.0"
] | 7 | 2015-12-21T06:58:29.000Z | 2016-08-09T09:21:01.000Z | data/variables.py | mkumatag/openbmc-automation | 4800237bcf6f516387b5510699396154b05a5b34 | [
"Apache-2.0"
] | 55 | 2015-11-05T05:54:22.000Z | 2016-06-15T10:04:01.000Z | data/variables.py | mkumatag/openbmc-automation | 4800237bcf6f516387b5510699396154b05a5b34 | [
"Apache-2.0"
] | 16 | 2015-11-03T20:01:55.000Z | 2020-11-12T16:26:46.000Z |
# Valid properties expected for each fru_type after a boot.
INVENTORY_ITEMS = {
    "CPU": [
        "Custom Field 1",
        "Custom Field 2",
        "Custom Field 3",
        "Custom Field 4",
        "Custom Field 5",
        "Custom Field 6",
        "Custom Field 7",
        "Custom Field 8",
        "FRU File ID",
        "Manufacturer",
        "Name",
        "Part Number",
        "Serial Number",
        "fault",
        "fru_type",
        "is_fru",
        "present",
        "version",
    ],
    "DIMM": [
        "Asset Tag",
        "Custom Field 1",
        "Custom Field 2",
        "Custom Field 3",
        "Custom Field 4",
        "Custom Field 5",
        "Custom Field 6",
        "Custom Field 7",
        "Custom Field 8",
        "FRU File ID",
        "Manufacturer",
        "Model Number",
        "Name",
        "Serial Number",
        "Version",
        "fault",
        "fru_type",
        "is_fru",
        "present",
        "version",
    ],
    "MEMORY_BUFFER": [
        "Custom Field 1",
        "Custom Field 2",
        "Custom Field 3",
        "Custom Field 4",
        "Custom Field 5",
        "Custom Field 6",
        "Custom Field 7",
        "Custom Field 8",
        "FRU File ID",
        "Manufacturer",
        "Name",
        "Part Number",
        "Serial Number",
        "fault",
        "fru_type",
        "is_fru",
        "present",
        "version",
    ],
    "FAN": [
        "fault",
        "fru_type",
        "is_fru",
        "present",
        "version",
    ],
    "DAUGHTER_CARD": [
        "Custom Field 1",
        "Custom Field 2",
        "Custom Field 3",
        "Custom Field 4",
        "Custom Field 5",
        "Custom Field 6",
        "Custom Field 7",
        "Custom Field 8",
        "FRU File ID",
        "Manufacturer",
        "Name",
        "Part Number",
        "Serial Number",
        "fault",
        "fru_type",
        "is_fru",
        "present",
        "version",
    ],
    "BMC": [
        "fault",
        "fru_type",
        "is_fru",
        "manufacturer",
        "present",
        "version",
    ],
    "MAIN_PLANAR": [
        "Custom Field 1",
        "Custom Field 2",
        "Custom Field 3",
        "Custom Field 4",
        "Custom Field 5",
        "Custom Field 6",
        "Custom Field 7",
        "Custom Field 8",
        "Part Number",
        "Serial Number",
        "Type",
        "fault",
        "fru_type",
        "is_fru",
        "present",
        "version",
    ],
    "SYSTEM": [
        "Custom Field 1",
        "Custom Field 2",
        "Custom Field 3",
        "Custom Field 4",
        "Custom Field 5",
        "Custom Field 6",
        "Custom Field 7",
        "Custom Field 8",
        "FRU File ID",
        "Manufacturer",
        "Model Number",
        "Name",
        "Serial Number",
        "Version",
        "fault",
        "fru_type",
        "is_fru",
        "present",
        "version",
    ],
    "CORE": [
        "fault",
        "fru_type",
        "is_fru",
        "present",
        "version",
    ],
}
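

# Hedged helper sketch (not used elsewhere in this automation; `record` is a
# hypothetical mapping of property name to value as read from the BMC):
def missing_inventory_properties(fru_type, record):
    """Return the expected properties absent from a FRU record of fru_type."""
    expected = set(INVENTORY_ITEMS.get(fru_type, []))
    return sorted(expected - set(record))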
| 21.040268 | 74 | 0.415949 | 290 | 3,135 | 4.417241 | 0.175862 | 0.412178 | 0.084309 | 0.098361 | 0.852459 | 0.839188 | 0.839188 | 0.766589 | 0.766589 | 0.766589 | 0 | 0.027618 | 0.445614 | 3,135 | 148 | 75 | 21.182432 | 0.709436 | 0.022967 | 0 | 0.903448 | 0 | 0 | 0.44085 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
510755e7be5ffcf500346d5a7549031acbf423fc | 117 | py | Python | test/integration/tfp/models_data/normal_lub_data.py | spinkney/stanc3 | 41f78705dd8fa68759df61cfbab35540edcfe103 | [
"BSD-3-Clause"
] | 123 | 2018-11-22T01:34:47.000Z | 2022-02-06T17:41:05.000Z | test/integration/tfp/models_data/normal_lub_data.py | spinkney/stanc3 | 41f78705dd8fa68759df61cfbab35540edcfe103 | [
"BSD-3-Clause"
] | 947 | 2018-11-30T15:49:31.000Z | 2022-03-30T15:56:17.000Z | test/integration/tfp/models_data/normal_lub_data.py | spinkney/stanc3 | 41f78705dd8fa68759df61cfbab35540edcfe103 | [
"BSD-3-Clause"
] | 50 | 2019-02-14T15:13:39.000Z | 2022-03-18T18:14:17.000Z |
data = dict(y_lub=[-1,-2,1,1,2,-3,5,5,-6,8.9], y_lb=[-1,-2,1,1,2,-3,5,5,-6,8.9], y_ub=[-1,-2,1,1,2,-3,5,5,-6,8.9])
| 29.25 | 114 | 0.461538 | 41 | 117 | 1.243902 | 0.317073 | 0.235294 | 0.176471 | 0.235294 | 0.686275 | 0.686275 | 0.686275 | 0.686275 | 0.686275 | 0.686275 | 0 | 0.3 | 0.059829 | 117 | 3 | 115 | 39 | 0.163636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
513ac27905117099ca9e9f1e4bb35b20968e13ee | 71,448 | py | Python | apis/nb/clients/topology_client/TopologyApi.py | CiscoDevNet/APIC-EM-Generic-Scripts- | 74211d9488f1e77cf56ef86dba20ec8e8eb49cc1 | [
"ECL-2.0",
"Apache-2.0"
] | 45 | 2016-06-09T15:41:25.000Z | 2019-08-06T17:13:11.000Z | apis/nb/clients/topology_client/TopologyApi.py | CiscoDevNet/APIC-EM-Generic-Scripts | 74211d9488f1e77cf56ef86dba20ec8e8eb49cc1 | [
"ECL-2.0",
"Apache-2.0"
] | 36 | 2016-06-12T03:03:56.000Z | 2017-03-13T18:20:11.000Z | apis/nb/clients/topology_client/TopologyApi.py | CiscoDevNet/APIC-EM-Generic-Scripts | 74211d9488f1e77cf56ef86dba20ec8e8eb49cc1 | [
"ECL-2.0",
"Apache-2.0"
] | 15 | 2016-06-22T03:51:37.000Z | 2019-07-10T10:06:02.000Z | #!/usr/bin/env python
#pylint: skip-file
# This source code is licensed under the Apache license found in the
# LICENSE file in the root directory of this project.
import sys
import os
import urllib.request, urllib.parse, urllib.error
from .models import *
class TopologyApi(object):
def __init__(self, apiClient):
self.apiClient = apiClient
def getTopologyApplications(self, **kwargs):
"""getTopologyApplications
Args:
Returns: TopologyApplicationListResult
"""
allParams = []
params = locals()
for (key, val) in list(params['kwargs'].items()):
if key not in allParams:
raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplications" % key)
params[key] = val
del params['kwargs']
resourcePath = '/topology/application'
resourcePath = resourcePath.replace('{format}', 'json')
method = 'GET'
queryParams = {}
headerParams = {}
formParams = {}
files = {}
bodyParam = None
headerParams['Accept'] = 'application/json'
headerParams['Content-Type'] = 'application/json'
postData = (formParams if formParams else bodyParam)
response = self.apiClient.callAPI(resourcePath, method, queryParams,
postData, headerParams, files=files)
if not response:
return None
responseObject = self.apiClient.deserialize(response, 'TopologyApplicationListResult')
return responseObject
def updateTopologyApplication(self, **kwargs):
"""updateTopologyApplications
Args:
topologyApplicationDtoList, list[TopologyApplicationDto]: application (required)
Returns: TaskIdResult
"""
allParams = ['topologyApplicationDtoList']
params = locals()
for (key, val) in list(params['kwargs'].items()):
if key not in allParams:
raise TypeError("Got an unexpected keyword argument '%s' to method updateTopologyApplication" % key)
params[key] = val
del params['kwargs']
resourcePath = '/topology/application'
resourcePath = resourcePath.replace('{format}', 'json')
method = 'PUT'
queryParams = {}
headerParams = {}
formParams = {}
files = {}
bodyParam = None
headerParams['Accept'] = 'application/json'
headerParams['Content-Type'] = 'application/json'
if ('topologyApplicationDtoList' in params):
bodyParam = params['topologyApplicationDtoList']
postData = (formParams if formParams else bodyParam)
response = self.apiClient.callAPI(resourcePath, method, queryParams,
postData, headerParams, files=files)
if not response:
return None
responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
return responseObject
def createTopologyApplications(self, **kwargs):
"""createTopologyApplications
Args:
topologyApplicationDtoList, list[TopologyApplicationDto]: application (required)
Returns: TaskIdResult
"""
allParams = ['topologyApplicationDtoList']
params = locals()
for (key, val) in list(params['kwargs'].items()):
if key not in allParams:
raise TypeError("Got an unexpected keyword argument '%s' to method createTopologyApplications" % key)
params[key] = val
del params['kwargs']
resourcePath = '/topology/application'
resourcePath = resourcePath.replace('{format}', 'json')
method = 'POST'
queryParams = {}
headerParams = {}
formParams = {}
files = {}
bodyParam = None
headerParams['Accept'] = 'application/json'
headerParams['Content-Type'] = 'application/json'
if ('topologyApplicationDtoList' in params):
bodyParam = params['topologyApplicationDtoList']
postData = (formParams if formParams else bodyParam)
response = self.apiClient.callAPI(resourcePath, method, queryParams,
postData, headerParams, files=files)
if not response:
return None
responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
return responseObject
def getTopologyApplication(self, **kwargs):
"""getTopologyApplication
Args:
applicationUuid, str: Topology Application Uuid (required)
Returns: TopologyApplicationResult
"""
allParams = ['applicationUuid']
params = locals()
for (key, val) in list(params['kwargs'].items()):
if key not in allParams:
raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplication" % key)
params[key] = val
del params['kwargs']
resourcePath = '/topology/application/{applicationUuid}'
resourcePath = resourcePath.replace('{format}', 'json')
method = 'GET'
queryParams = {}
headerParams = {}
formParams = {}
files = {}
bodyParam = None
headerParams['Accept'] = 'application/json'
headerParams['Content-Type'] = 'application/json'
if ('applicationUuid' in params):
replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
replacement = urllib.parse.quote(replacement)
resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}',
replacement)
postData = (formParams if formParams else bodyParam)
response = self.apiClient.callAPI(resourcePath, method, queryParams,
postData, headerParams, files=files)
if not response:
return None
responseObject = self.apiClient.deserialize(response, 'TopologyApplicationResult')
return responseObject
def deleteTopologyApplication(self, **kwargs):
"""deleteTopologyApplication
Args:
applicationUuid, str: Topology Application Uuid (required)
Returns: TaskIdResult
"""
allParams = ['applicationUuid']
params = locals()
for (key, val) in list(params['kwargs'].items()):
if key not in allParams:
raise TypeError("Got an unexpected keyword argument '%s' to method deleteTopologyApplication" % key)
params[key] = val
del params['kwargs']
resourcePath = '/topology/application/{applicationUuid}'
resourcePath = resourcePath.replace('{format}', 'json')
method = 'DELETE'
queryParams = {}
headerParams = {}
formParams = {}
files = {}
bodyParam = None
headerParams['Accept'] = 'application/json'
headerParams['Content-Type'] = 'application/json'
if ('applicationUuid' in params):
replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
replacement = urllib.parse.quote(replacement)
resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}',
replacement)
postData = (formParams if formParams else bodyParam)
response = self.apiClient.callAPI(resourcePath, method, queryParams,
postData, headerParams, files=files)
if not response:
return None
responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
return responseObject
def getTopologyApplicationPages(self, **kwargs):
"""getTopologyApplicationPages
Args:
applicationUuid, str: Topology Application Uuid (required)
Returns: TopologyPageListResult
"""
allParams = ['applicationUuid']
params = locals()
for (key, val) in list(params['kwargs'].items()):
if key not in allParams:
raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplicationPages" % key)
params[key] = val
del params['kwargs']
resourcePath = '/topology/application/{applicationUuid}/page'
resourcePath = resourcePath.replace('{format}', 'json')
method = 'GET'
queryParams = {}
headerParams = {}
formParams = {}
files = {}
bodyParam = None
headerParams['Accept'] = 'application/json'
headerParams['Content-Type'] = 'application/json'
if ('applicationUuid' in params):
replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
replacement = urllib.parse.quote(replacement)
resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}',
replacement)
postData = (formParams if formParams else bodyParam)
response = self.apiClient.callAPI(resourcePath, method, queryParams,
postData, headerParams, files=files)
if not response:
return None
responseObject = self.apiClient.deserialize(response, 'TopologyPageListResult')
return responseObject
def updateTopologyPages(self, **kwargs):
"""updateTopologyPages
Args:
applicationUuid, str: Topology Application Uuid (required)
topologyPageDtoList, list[TopologyPageDto]: page (required)
Returns: TaskIdResult
"""
allParams = ['applicationUuid', 'topologyPageDtoList']
params = locals()
for (key, val) in list(params['kwargs'].items()):
if key not in allParams:
raise TypeError("Got an unexpected keyword argument '%s' to method updateTopologyPages" % key)
params[key] = val
del params['kwargs']
resourcePath = '/topology/application/{applicationUuid}/page'
resourcePath = resourcePath.replace('{format}', 'json')
method = 'PUT'
queryParams = {}
headerParams = {}
formParams = {}
files = {}
bodyParam = None
headerParams['Accept'] = 'application/json'
headerParams['Content-Type'] = 'application/json'
if ('applicationUuid' in params):
replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
replacement = urllib.parse.quote(replacement)
resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}',
replacement)
if ('topologyPageDtoList' in params):
bodyParam = params['topologyPageDtoList']
postData = (formParams if formParams else bodyParam)
response = self.apiClient.callAPI(resourcePath, method, queryParams,
postData, headerParams, files=files)
if not response:
return None
responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
return responseObject
def createTopologyPages(self, **kwargs):
"""createTopologyPages
Args:
applicationUuid, str: Topology Application Uuid (required)
topologyPageDtoList, list[TopologyPageDto]: page (required)
Returns: TaskIdResult
"""
allParams = ['applicationUuid', 'topologyPageDtoList']
params = locals()
for (key, val) in list(params['kwargs'].items()):
if key not in allParams:
raise TypeError("Got an unexpected keyword argument '%s' to method createTopologyPages" % key)
params[key] = val
del params['kwargs']
resourcePath = '/topology/application/{applicationUuid}/page'
resourcePath = resourcePath.replace('{format}', 'json')
method = 'POST'
queryParams = {}
headerParams = {}
formParams = {}
files = {}
bodyParam = None
headerParams['Accept'] = 'application/json'
headerParams['Content-Type'] = 'application/json'
if ('applicationUuid' in params):
replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
replacement = urllib.parse.quote(replacement)
resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}',
replacement)
if ('topologyPageDtoList' in params):
bodyParam = params['topologyPageDtoList']
postData = (formParams if formParams else bodyParam)
response = self.apiClient.callAPI(resourcePath, method, queryParams,
postData, headerParams, files=files)
if not response:
return None
responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
return responseObject

    def getTopologyApplicationPage(self, **kwargs):
        """getTopologyApplicationPage
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
        Returns: TopologyPageResult
        """
        allParams = ['applicationUuid', 'pageUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplicationPage" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/application/{applicationUuid}/page/{pageUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyPageResult')
        return responseObject

    def deleteTopologyPage(self, **kwargs):
        """deleteTopologyPage
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'pageUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method deleteTopologyPage" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/application/{applicationUuid}/page/{pageUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'DELETE'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getTopologyApplicationPageViews(self, **kwargs):
        """getTopologyApplicationPageViews
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
        Returns: TopologyViewListResult
        """
        allParams = ['applicationUuid', 'pageUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplicationPageViews" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/application/{applicationUuid}/page/{pageUuid}/view'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyViewListResult')
        return responseObject

    def updateTopologyViews(self, **kwargs):
        """updateTopologyViews
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
            topologyViewDtoList, list[TopologyViewDto]: page (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'pageUuid', 'topologyViewDtoList']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method updateTopologyViews" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/application/{applicationUuid}/page/{pageUuid}/view'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'PUT'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        if ('topologyViewDtoList' in params):
            bodyParam = params['topologyViewDtoList']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def createTopologyViews(self, **kwargs):
        """createTopologyViews
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
            topologyViewDtoList, list[TopologyViewDto]: view (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'pageUuid', 'topologyViewDtoList']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method createTopologyViews" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/application/{applicationUuid}/page/{pageUuid}/view'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'POST'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        if ('topologyViewDtoList' in params):
            bodyParam = params['topologyViewDtoList']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getTopologyApplicationPageView(self, **kwargs):
        """getTopologyApplicationPageView
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
            viewUuid, str: Topology Application Page View Uuid (required)
        Returns: TopologyViewResult
        """
        allParams = ['applicationUuid', 'pageUuid', 'viewUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplicationPageView" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/application/{applicationUuid}/page/{pageUuid}/view/{viewUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        if ('viewUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['viewUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'viewUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyViewResult')
        return responseObject

    def deleteTopologyView(self, **kwargs):
        """deleteTopologyView
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
            viewUuid, str: Topology Application View Uuid (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'pageUuid', 'viewUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method deleteTopologyView" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/application/{applicationUuid}/page/{pageUuid}/view/{viewUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'DELETE'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        if ('viewUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['viewUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'viewUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getTopologyApplications(self, **kwargs):
        """getTopologyApplications
        Args:
        Returns: TopologyApplicationListResult
        """
        allParams = []
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplications" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyApplicationListResult')
        return responseObject

    def updateTopologyApplication(self, **kwargs):
        """updateTopologyApplication
        Args:
            topologyApplicationDtoList, list[TopologyApplicationDto]: application (required)
        Returns: TaskIdResult
        """
        allParams = ['topologyApplicationDtoList']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method updateTopologyApplication" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'PUT'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('topologyApplicationDtoList' in params):
            bodyParam = params['topologyApplicationDtoList']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def createTopologyApplications(self, **kwargs):
        """createTopologyApplications
        Args:
            topologyApplicationDtoList, list[TopologyApplicationDto]: application (required)
        Returns: TaskIdResult
        """
        allParams = ['topologyApplicationDtoList']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method createTopologyApplications" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'POST'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('topologyApplicationDtoList' in params):
            bodyParam = params['topologyApplicationDtoList']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getTopologyApplication(self, **kwargs):
        """getTopologyApplication
        Args:
            applicationUuid, str: Topology Application Uuid (required)
        Returns: TopologyApplicationResult
        """
        allParams = ['applicationUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplication" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyApplicationResult')
        return responseObject

    def deleteTopologyApplication(self, **kwargs):
        """deleteTopologyApplication
        Args:
            applicationUuid, str: Topology Application Uuid (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method deleteTopologyApplication" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'DELETE'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getTopologyApplicationPages(self, **kwargs):
        """getTopologyApplicationPages
        Args:
            applicationUuid, str: Topology Application Uuid (required)
        Returns: TopologyPageListResult
        """
        allParams = ['applicationUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplicationPages" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyPageListResult')
        return responseObject

    def updateTopologyPages(self, **kwargs):
        """updateTopologyPages
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            topologyPageDtoList, list[TopologyPageDto]: page (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'topologyPageDtoList']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method updateTopologyPages" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'PUT'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('topologyPageDtoList' in params):
            bodyParam = params['topologyPageDtoList']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def createTopologyPages(self, **kwargs):
        """createTopologyPages
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            topologyPageDtoList, list[TopologyPageDto]: page (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'topologyPageDtoList']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method createTopologyPages" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'POST'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('topologyPageDtoList' in params):
            bodyParam = params['topologyPageDtoList']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getTopologyApplicationPage(self, **kwargs):
        """getTopologyApplicationPage
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
        Returns: TopologyPageResult
        """
        allParams = ['applicationUuid', 'pageUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplicationPage" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page/{pageUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyPageResult')
        return responseObject

    def deleteTopologyPage(self, **kwargs):
        """deleteTopologyPage
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'pageUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method deleteTopologyPage" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page/{pageUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'DELETE'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getTopologyApplicationPageViews(self, **kwargs):
        """getTopologyApplicationPageViews
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
        Returns: TopologyViewListResult
        """
        allParams = ['applicationUuid', 'pageUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplicationPageViews" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page/{pageUuid}/view'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyViewListResult')
        return responseObject

    def updateTopologyViews(self, **kwargs):
        """updateTopologyViews
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
            topologyViewDtoList, list[TopologyViewDto]: page (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'pageUuid', 'topologyViewDtoList']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method updateTopologyViews" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page/{pageUuid}/view'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'PUT'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        if ('topologyViewDtoList' in params):
            bodyParam = params['topologyViewDtoList']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def createTopologyViews(self, **kwargs):
        """createTopologyViews
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
            topologyViewDtoList, list[TopologyViewDto]: view (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'pageUuid', 'topologyViewDtoList']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method createTopologyViews" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page/{pageUuid}/view'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'POST'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        if ('topologyViewDtoList' in params):
            bodyParam = params['topologyViewDtoList']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getTopologyApplicationPageView(self, **kwargs):
        """getTopologyApplicationPageView
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
            viewUuid, str: Topology Application Page View Uuid (required)
        Returns: TopologyViewResult
        """
        allParams = ['applicationUuid', 'pageUuid', 'viewUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getTopologyApplicationPageView" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page/{pageUuid}/view/{viewUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        if ('viewUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['viewUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'viewUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyViewResult')
        return responseObject

    def deleteTopologyView(self, **kwargs):
        """deleteTopologyView
        Args:
            applicationUuid, str: Topology Application Uuid (required)
            pageUuid, str: Topology Application Page Uuid (required)
            viewUuid, str: Topology Application View Uuid (required)
        Returns: TaskIdResult
        """
        allParams = ['applicationUuid', 'pageUuid', 'viewUuid']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method deleteTopologyView" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/controllerapp/{applicationUuid}/page/{pageUuid}/view/{viewUuid}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'DELETE'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('applicationUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['applicationUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'applicationUuid' + '}', replacement)
        if ('pageUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['pageUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'pageUuid' + '}', replacement)
        if ('viewUuid' in params):
            replacement = str(self.apiClient.toPathValue(params['viewUuid']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'viewUuid' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def loadCustomTopology(self, **kwargs):
        """loadCustomTopology
        Args:
        Returns: TopologyResult
        """
        allParams = []
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method loadCustomTopology" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/custom'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyResult')
        return responseObject

    def saveCustomTopology(self, **kwargs):
        """saveCustomTopology
        Args:
            topo, Topology: Topology (required)
        Returns: TaskIdResult
        """
        allParams = ['topo']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method saveCustomTopology" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/custom'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'POST'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('topo' in params):
            bodyParam = params['topo']
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TaskIdResult')
        return responseObject

    def getL2Topology(self, **kwargs):
        """getL2Topology
        Args:
            vlanID, str: ID of VLAN (required)
        Returns: TopologyResult
        """
        allParams = ['vlanID']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getL2Topology" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/l2/{vlanID}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('vlanID' in params):
            replacement = str(self.apiClient.toPathValue(params['vlanID']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'vlanID' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyResult')
        return responseObject

    def getL3TopologyForVrf(self, **kwargs):
        """getL3TopologyForVrf
        Args:
            vrfName, str: VRF Name (required)
        Returns: TopologyResult
        """
        allParams = ['vrfName']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getL3TopologyForVrf" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/l3/vrf/{vrfName}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('vrfName' in params):
            replacement = str(self.apiClient.toPathValue(params['vrfName']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'vrfName' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyResult')
        return responseObject

    def getL3Topology(self, **kwargs):
        """getL3Topology
        Args:
            topologyType, str: Type of topology(OSPF,ISIS,etc) (required)
        Returns: TopologyResult
        """
        allParams = ['topologyType']
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getL3Topology" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/l3/{topologyType}'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        if ('topologyType' in params):
            replacement = str(self.apiClient.toPathValue(params['topologyType']))
            replacement = urllib.parse.quote(replacement)
            resourcePath = resourcePath.replace('{' + 'topologyType' + '}', replacement)
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyResult')
        return responseObject

    def getPhysicalTopology(self, **kwargs):
        """getPhysicalTopology
        Args:
        Returns: TopologyResult
        """
        allParams = []
        params = locals()
        for (key, val) in list(params['kwargs'].items()):
            if key not in allParams:
                raise TypeError("Got an unexpected keyword argument '%s' to method getPhysicalTopology" % key)
            params[key] = val
        del params['kwargs']
        resourcePath = '/topology/physical-topology'
        resourcePath = resourcePath.replace('{format}', 'json')
        method = 'GET'
        queryParams = {}
        headerParams = {}
        formParams = {}
        files = {}
        bodyParam = None
        headerParams['Accept'] = 'application/json'
        headerParams['Content-Type'] = 'application/json'
        postData = (formParams if formParams else bodyParam)
        response = self.apiClient.callAPI(resourcePath, method, queryParams, postData, headerParams, files=files)
        if not response:
            return None
        responseObject = self.apiClient.deserialize(response, 'TopologyResult')
        return responseObject
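
    # --- Usage sketch (not part of the generated client) -------------------
    # A minimal, hypothetical example of how this class might be driven,
    # assuming an ApiClient exposing the callAPI()/deserialize() methods used
    # above; the import names below are placeholders for the actual package
    # layout.
    #
    #     from apic_em_client import ApiClient, TopologyApi   # hypothetical names
    #
    #     client = ApiClient('https://controller/api/v1', 'X-Auth-Token')
    #     topo_api = TopologyApi(client)
    #     result = topo_api.getPhysicalTopology()
    #     if result is not None:
    #         print(result.response)   # the deserialized TopologyResult payload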
| 28.329897 | 122 | 0.534207 | 5,071 | 71,448 | 7.525932 | 0.029777 | 0.040536 | 0.065795 | 0.025941 | 0.972094 | 0.972094 | 0.972094 | 0.972094 | 0.962583 | 0.959962 | 0 | 0.000267 | 0.371991 | 71,448 | 2,521 | 123 | 28.341134 | 0.850279 | 0.092613 | 0 | 0.967185 | 0 | 0 | 0.170225 | 0.040776 | 0 | 0 | 0 | 0 | 0 | 1 | 0.031952 | false | 0 | 0.003454 | 0 | 0.098446 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5acdf41213f4317e9b3c847b82a8b62f2defe067 | 934 | py | Python | tests/core/test_utils.py | Finistere/dependency_manager | 5a183d46ac5d760944dc507d1281813d02d2c75e | ["MIT"] | null | null | null | tests/core/test_utils.py | Finistere/dependency_manager | 5a183d46ac5d760944dc507d1281813d02d2c75e | ["MIT"] | null | null | null | tests/core/test_utils.py | Finistere/dependency_manager | 5a183d46ac5d760944dc507d1281813d02d2c75e | ["MIT"] | null | null | null |
from antidote.core import DependencyDebug, DependencyValue, Scope


def test_dependency_value():
    ref = DependencyValue("test", scope=Scope.singleton())
    assert ref == DependencyValue("test", scope=Scope.singleton())
    assert ref != DependencyValue("test2", scope=Scope.singleton())
    assert ref != DependencyValue("test", scope=None)


def test_dependency_debug():
    ref = DependencyDebug("info", scope=Scope.singleton(), wired=[1], dependencies=[2])
    assert ref == DependencyDebug("info", scope=Scope.singleton(), wired=[1], dependencies=[2])
    assert ref != DependencyDebug("info2", scope=Scope.singleton(), wired=[1], dependencies=[2])
    assert ref != DependencyDebug("info", scope=None, wired=[1], dependencies=[2])
    assert ref != DependencyDebug("info", scope=Scope.singleton(), wired=[10], dependencies=[2])
    assert ref != DependencyDebug("info", scope=Scope.singleton(), wired=[1], dependencies=[20])
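
# A minimal way to run just this module, assuming pytest is installed:
#
#     pytest tests/core/test_utils.py -q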
| 51.888889 | 96 | 0.707709 | 105 | 934 | 6.257143 | 0.238095 | 0.121766 | 0.231355 | 0.205479 | 0.812785 | 0.812785 | 0.812785 | 0.812785 | 0.733638 | 0.563166 | 0 | 0.019512 | 0.122056 | 934 | 17 | 97 | 54.941176 | 0.781707 | 0 | 0 | 0 | 0 | 0 | 0.044968 | 0 | 0 | 0 | 0 | 0 | 0.615385 | 1 | 0.153846 | false | 0 | 0.076923 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
51b3ddca380c12f91e135158ad36379ff79cc8c3 | 165 | py | Python | SmerekaRoman/HW_1/2 .py | kolyasalubov/Lv-639.pythonCore | 06f10669a188318884adb00723127465ebdf2907 | ["MIT"] | null | null | null | SmerekaRoman/HW_1/2 .py | kolyasalubov/Lv-639.pythonCore | 06f10669a188318884adb00723127465ebdf2907 | ["MIT"] | null | null | null | SmerekaRoman/HW_1/2 .py | kolyasalubov/Lv-639.pythonCore | 06f10669a188318884adb00723127465ebdf2907 | ["MIT"] | null | null | null |
a = int(input('a ='))
b = int(input('b ='))
print('a+b =', a+b)
print('a-b =', a-b)
print('a*b =', a*b)
print('a/b =', a/b)
print('a%b =', a%b)
print('a//b =', a//b)
| 20.625 | 21 | 0.448485 | 38 | 165 | 1.947368 | 0.131579 | 0.351351 | 0.567568 | 0.648649 | 0.743243 | 0.743243 | 0.743243 | 0.743243 | 0.743243 | 0.743243 | 0 | 0 | 0.151515 | 165 | 8 | 22 | 20.625 | 0.528571 | 0 | 0 | 0 | 0 | 0 | 0.222892 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.75 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 11
51dd27ea69d5a3138347b2c38107d36fddc17e34 | 62,303 | py | Python | tests/src/python/test_qgsserver_wms_getmap.py | dyna-mis/Hilabeling | cb7d5d4be29624a20c8a367162dbc6fd779b2b52 | ["MIT"] | null | null | null | tests/src/python/test_qgsserver_wms_getmap.py | dyna-mis/Hilabeling | cb7d5d4be29624a20c8a367162dbc6fd779b2b52 | ["MIT"] | null | null | null | tests/src/python/test_qgsserver_wms_getmap.py | dyna-mis/Hilabeling | cb7d5d4be29624a20c8a367162dbc6fd779b2b52 | ["MIT"] | 1 | 2021-12-25T08:40:30.000Z | 2021-12-25T08:40:30.000Z |
# -*- coding: utf-8 -*-
"""QGIS Unit tests for QgsServer WMS GetMap.
From build dir, run: ctest -R PyQgsServerWMSGetMap -V
.. note:: This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
"""
__author__ = 'Alessandro Pasotti'
__date__ = '25/05/2015'
__copyright__ = 'Copyright 2015, The QGIS Project'
# This will get replaced with a git SHA1 when you do a git archive
__revision__ = '176c06ceefb5f555205e72b20c962740cc0ec183'
import os
# Needed on Qt 5 so that the serialization of XML is consistent among all executions
os.environ['QT_HASH_SEED'] = '1'
import re
import urllib.request
import urllib.parse
import urllib.error
from qgis.testing import unittest
from qgis.PyQt.QtCore import QSize
import osgeo.gdal # NOQA
from test_qgsserver import QgsServerTestBase
from utilities import unitTestDataPath
from qgis.core import QgsProject
# Strip path and content length because path may vary
RE_STRIP_UNCHECKABLE = rb'MAP=[^"]+|Content-Length: \d+'
RE_ATTRIBUTES = rb'[^>\s]+=[^>\s]+'

class TestQgsServerWMSGetMap(QgsServerTestBase):

    """QGIS Server WMS Tests for GetMap request"""

    # regenerate_reference = True

    def test_wms_getmap_basic_mode(self):
        # 1 bit
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png; mode=1bit",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Mode_1bit", 20000)

        # 8 bits
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png; mode=8bit",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Mode_8bit", 20000)

        # 16 bits
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png; mode=16bit",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Mode_16bit", 20000)
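
    # Note: the trailing 20000 passed to _img_diff_error() above is, as used
    # by this test base class, a loosened image-difference tolerance; the
    # reduced bit-depth renderings differ more from the reference images than
    # the default threshold used by the other tests in this file allows.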

    def test_wms_getmap_basic(self):
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Basic")

        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country,dem",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Basic2")

        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectUseLayerIdsPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "country20131022151106556",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Basic3")

        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country,db_point",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Basic4")

        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "sERVICE": "WMS",
            "VeRSION": "1.1.1",
            "REqUEST": "GetMap",
            "LAYeRS": "Country,db_point",
            "STYLeS": "",
            "FORMAt": "image/png",
            "bBOX": "-16817707,-4710778,5696513,14587125",
            "HeIGHT": "500",
            "WIDTH": "500",
            "CRs": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Basic4")
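
    # All of these tests build their query strings with the same idiom; a
    # standalone sketch of that pattern (hypothetical helper, not part of
    # this suite):
    #
    #     def build_query_string(params):
    #         return "?" + "&".join("%s=%s" % item for item in params.items())
    #
    #     build_query_string({"SERVICE": "WMS", "REQUEST": "GetMap"})
    #     # -> '?SERVICE=WMS&REQUEST=GetMap'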

    def test_wms_getmap_complex_labeling(self):
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "pointlabel",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Labeling_Complex")

    def test_wms_getmap_context_rendering(self):
        project = os.path.join(self.testdata_path, "test_project_render_context.qgs")
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(project),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "points",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-119.8,20.4,-82.4,49.2",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:4326"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_ContextRendering")

    def test_wms_getmap_dpi(self):
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857",
            "DPI": "112.5"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        self._img_diff_error(r, h, "WMS_GetMap_Basic5")

    def test_wms_getmap_invalid_parameters(self):
        # invalid format
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "pdf",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"The format \'pdf\' from FORMAT is not supported." in r
        self.assertTrue(err)

        # height should be an int
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "FOO",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"HEIGHT (\'FOO\') cannot be converted into int" in r
        self.assertTrue(err)

        # width should be an int
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "FOO",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"WIDTH (\'FOO\') cannot be converted into int" in r
        self.assertTrue(err)

        # bbox should be formatted like "double,double,double,double"
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"BBOX (\'-16817707,-4710778,5696513\') cannot be converted into a rectangle" in r
        self.assertTrue(err)

        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,FOO",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"BBOX (\'-16817707,-4710778,5696513,FOO\') cannot be converted into a rectangle" in r
        self.assertTrue(err)

        # test invalid bbox : xmin > xmax
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "1,0,0,1",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"cannot be converted into a rectangle" in r
        self.assertTrue(err)

        # test invalid bbox : ymin > ymax
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "0,1,0,0",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"cannot be converted into a rectangle" in r
        self.assertTrue(err)

        # opacities should be a list of int
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "OPACITIES": "253,FOO",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"OPACITIES (\'253,FOO\') cannot be converted into a list of int" in r
        self.assertTrue(err)

        # filters should be formatted like "layer0:filter0;layer1:filter1"
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country,Hello",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857",
            "FILTER": "Country \"name\" = 'eurasia'"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"FILTER (\'Country \"name\" = \'eurasia\'\') is not properly formatted" in r
        self.assertTrue(err)

        # selections should be formatted like "layer0:id0,id1;layer1:id0"
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country,Hello",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "SRS": "EPSG:3857",
            "SELECTION": "Country=4"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"SELECTION (\'Country=4\') is not properly formatted" in r
        self.assertTrue(err)

        # invalid highlight geometries
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country_Labels",
            "HIGHLIGHT_GEOM": "POLYGONN((-15000000 10000000, -15000000 6110620, 2500000 6110620, 2500000 10000000, -15000000 10000000))",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"HIGHLIGHT_GEOM (\'POLYGONN((-15000000 10000000, -15000000 6110620, 2500000 6110620, 2500000 10000000, -15000000 10000000))\') cannot be converted into a list of geometries" in r
        self.assertTrue(err)

        # invalid highlight label colors
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country_Labels",
            "HIGHLIGHT_LABELCOLOR": "%2300230000;%230023000",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"HIGHLIGHT_LABELCOLOR (\'#00230000;#0023000\') cannot be converted into a list of colors" in r
        self.assertTrue(err)

        # invalid list of label sizes
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country_Labels",
            "HIGHLIGHT_LABELSIZE": "16;17;FOO",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"HIGHLIGHT_LABELSIZE (\'16;17;FOO\') cannot be converted into a list of int" in r
        self.assertTrue(err)

        # invalid list of label buffer sizes
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country_Labels",
            "HIGHLIGHT_LABELBUFFERSIZE": "1.5;2;FF",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"HIGHLIGHT_LABELBUFFERSIZE (\'1.5;2;FF\') cannot be converted into a list of float" in r
        self.assertTrue(err)

        # invalid buffer color
        qs = "?" + "&".join(["%s=%s" % i for i in list({
            "MAP": urllib.parse.quote(self.projectPath),
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "Country_Labels",
            "HIGHLIGHT_LABELBUFFERCOLOR": "%232300FF00;%232300FF0",
            "STYLES": "",
            "FORMAT": "image/png",
            "BBOX": "-16817707,-4710778,5696513,14587125",
            "HEIGHT": "500",
            "WIDTH": "500",
            "CRS": "EPSG:3857"
        }.items())])

        r, h = self._result(self._execute_request(qs))
        err = b"HIGHLIGHT_LABELBUFFERCOLOR (\'#2300FF00;#2300FF0\') cannot be converted into a list of colors" in r
        self.assertTrue(err)
def test_wms_getmap_transparent(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"TRANSPARENT": "TRUE"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Transparent")
def test_wms_getmap_labeling_settings(self):
# Test the `DrawRectOnly` option with 1 candidate (`CandidatesPolygon`).
# May fail if the labeling position engine is tweaked.
d = unitTestDataPath('qgis_server_accesscontrol') + '/'
project = os.path.join(d, "project_labeling_settings.qgs")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(project),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Labels",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"TRANSPARENT": "TRUE"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_LabelingSettings")
def test_wms_getmap_background(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"BGCOLOR": "green"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Background")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"BGCOLOR": "0x008000"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Background_Hex")
def test_wms_getmap_order(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Hello,Country",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_LayerOrder")
def test_wms_getmap_srs(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-151.7,-38.9,51.0,78.0",
"HEIGHT": "500",
"WIDTH": "500",
"SRS": "EPSG:4326"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_SRS")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-151.7,-38.9,51.0,78.0",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:4326"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_SRS")
def test_wms_getmap_style(self):
# default style
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Labels",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_StyleDefault")
# custom style with STYLES parameter
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Labels",
"STYLES": "custom",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_StyleCustom")
# custom style with STYLE parameter
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Labels",
"STYLE": "custom",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_StyleCustom")
# mixed custom and default style with STYLES parameter
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Labels,Hello",
"STYLES": "custom,",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_StyleMixed")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Hello,Country_Labels",
"STYLES": "default,custom",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_StyleMixed_LayerOrder")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Hello,Country_Labels",
"STYLES": ",custom",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_StyleMixed_LayerOrder")
def test_wms_getmap_filter(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": "Country:\"name\" = 'eurasia'"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter")
# try to display a feature yet filtered by the project
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectStatePath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": "Country:\"name\" = 'africa'"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter2")
# display all features to check that initial filter is restored
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectStatePath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter3")
# display multiple features filtered from multiple layers
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectStatePath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello,Hello_Filter_SubsetString",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": "Country: \"name\" IN ( 'arctic' , 'eurasia' );Hello: \"color\" = 'red';Hello_Filter_SubsetString: \"color\" = 'slate'"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter4")
# display multiple features filtered from multiple layers with same filter for some
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Country_Diagrams,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "1017529,-4226661,11271098,17063190",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": "Country,Country_Diagrams: \"name\" IN ( 'africa' , 'eurasia' );Hello: \"color\" IN ( 'magenta' , 'cerese' )"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter5")
# Error in filter (missing quote after africa) with multiple layer filter
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Country_Diagrams,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "1017529,-4226661,11271098,17063190",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": "Country,Country_Diagrams: \"name\" IN ( 'africa , 'eurasia' );Hello: \"color\" IN ( 'magenta' , 'cerese' )"
}.items())])
expected = self.strip_version_xmlns(b'<ServiceExceptionReport >\n <ServiceException code="Security">The filter string "name" IN ( \'africa , \'eurasia\' ) has been rejected because of security reasons. Note: Text strings have to be enclosed in single or double quotes. A space between each word / special character is mandatory. Allowed Keywords and special characters are AND,OR,IN,<,>=,>,>=,!=,\',\',(,),DMETAPHONE,SOUNDEX. Not allowed are semicolons in the filter expression.</ServiceException>\n</ServiceExceptionReport>\n')
r, h = self._result(self._execute_request(qs))
self.assertEqual(self.strip_version_xmlns(r), expected)
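# The FILTER parameter above uses the "Layer1,Layer2: <expr>;Layer3: <expr>"
# syntax, and the expected ServiceException documents the server's keyword
# whitelist (semicolons are not allowed inside an expression, so ';' only
# separates per-layer filters). A hedged parsing sketch, not the server's
# actual implementation; the function name is hypothetical:
def _split_filter_param(filter_param):
    result = {}
    for chunk in filter_param.split(";"):
        layer_list, _, expression = chunk.partition(":")
        for layer in layer_list.split(","):
            result[layer.strip()] = expression.strip()
    return result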
def test_wms_getmap_filter_ogc(self):
filter = "<Filter><PropertyIsEqualTo><PropertyName>name</PropertyName>" + \
"<Literal>eurasia</Literal></PropertyIsEqualTo></Filter>"
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": filter
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter_OGC")
def test_wms_getmap_filter_ogc_with_empty(self):
filter = "(<Filter><PropertyIsEqualTo><PropertyName>name</PropertyName>" + \
"<Literal>eurasia</Literal></PropertyIsEqualTo></Filter>)()"
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": filter
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter_OGC")
# empty filter
filter = ("(<ogc:Filter xmlns=\"http://www.opengis.net/ogc\">"
"</ogc:Filter>)")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": filter
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter_OGC2")
# filter on the second layer
filter_hello = ("(<Filter></Filter>)")
filter_country = ("(<Filter><PropertyIsEqualTo><PropertyName>name"
"</PropertyName><Literal>eurasia</Literal>"
"</PropertyIsEqualTo></Filter>)")
filter = "{}{}".format(filter_hello, filter_country)
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Hello,Country",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": filter
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter_OGC3")
def test_wms_getmap_filter_ogc_v2(self):
# with namespace
filter = ('<fes:Filter xmlns:fes=\"http://www.opengis.net/fes/2.0\">'
'<fes:PropertyIsEqualTo>'
'<fes:ValueReference>name</fes:ValueReference>'
'<fes:Literal>eurasia</fes:Literal>'
'</fes:PropertyIsEqualTo>'
'</fes:Filter>')
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": filter
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter_OGC_V2")
# without namespace (only with prefix)
filter = ('<fes:Filter>'
'<fes:PropertyIsEqualTo>'
'<fes:ValueReference>name</fes:ValueReference>'
'<fes:Literal>eurasia</fes:Literal>'
'</fes:PropertyIsEqualTo>'
'</fes:Filter>')
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"FILTER": filter
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Filter_OGC_V2")
def test_wms_getmap_selection(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"SRS": "EPSG:3857",
"SELECTION": "Country: 4,1;Hello: 2,5"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Selection")
def test_wms_getmap_diagrams(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Diagrams,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Diagrams")
def test_wms_getmap_opacities(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"OPACITIES": "125, 50"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Opacities")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello,dem",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"OPACITIES": "125,50,150"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Opacities2")
# Test OPACITIES with specific STYLES
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello,dem",
"STYLES": "origin,default,default",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857",
"OPACITIES": "125,50,150"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Opacities3")
def test_wms_getmap_highlight(self):
# highlight layer with color separated from sld
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Labels",
"HIGHLIGHT_GEOM": "POLYGON((-15000000 10000000, -15000000 6110620, 2500000 6110620, 2500000 10000000, -15000000 10000000))",
"HIGHLIGHT_SYMBOL": "<StyledLayerDescriptor><UserStyle><Name>Highlight</Name><FeatureTypeStyle><Rule><Name>Symbol</Name><LineSymbolizer><Stroke><SvgParameter name=\"stroke\">%23ea1173</SvgParameter><SvgParameter name=\"stroke-opacity\">1</SvgParameter><SvgParameter name=\"stroke-width\">1.6</SvgParameter></Stroke></LineSymbolizer></Rule></FeatureTypeStyle></UserStyle></StyledLayerDescriptor>",
"HIGHLIGHT_LABELSTRING": "Highlight Layer!",
"HIGHLIGHT_LABELSIZE": "16",
"HIGHLIGHT_LABELCOLOR": "%2300FF0000",
"HIGHLIGHT_LABELBUFFERCOLOR": "%232300FF00",
"HIGHLIGHT_LABELBUFFERSIZE": "1.5",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Highlight")
def test_wms_getmap_highlight_point(self):
# checks that SLD stroke-width works for Points; see issue 19795 comments
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Labels",
"HIGHLIGHT_GEOM": "POINT(-6250000 8055310)",
"HIGHLIGHT_SYMBOL": "<?xml version=\"1.0\" encoding=\"UTF-8\"?><StyledLayerDescriptor xmlns=\"http://www.opengis.net/sld\" xmlns:ogc=\"http://www.opengis.net/ogc\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" version=\"1.1.0\" xmlns:xlink=\"http://www.w3.org/1999/xlink\" xsi:schemaLocation=\"http://www.opengis.net/sld http://schemas.opengis.net/sld/1.1.0/StyledLayerDescriptor.xsd\" xmlns:se=\"http://www.opengis.net/se\"><UserStyle><se:FeatureTypeStyle><se:Rule><se:PointSymbolizer><se:Graphic><se:Mark><se:WellKnownName>circle</se:WellKnownName><se:Stroke><se:SvgParameter name=\"stroke\">%23ff0000</se:SvgParameter><se:SvgParameter name=\"stroke-opacity\">1</se:SvgParameter><se:SvgParameter name=\"stroke-width\">7.5</se:SvgParameter></se:Stroke><se:Fill><se:SvgParameter name=\"fill\">%237bdcb5</se:SvgParameter><se:SvgParameter name=\"fill-opacity\">1</se:SvgParameter></se:Fill></se:Mark><se:Size>28.4</se:Size></se:Graphic></se:PointSymbolizer></se:Rule></se:FeatureTypeStyle></UserStyle></StyledLayerDescriptor>",
"HIGHLIGHT_LABELSTRING": "Highlight Point :)",
"HIGHLIGHT_LABELSIZE": "16",
"HIGHLIGHT_LABELCOLOR": "%2300FF0000",
"HIGHLIGHT_LABELBUFFERCOLOR": "%232300FF00",
"HIGHLIGHT_LABELBUFFERSIZE": "1.2",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Highlight_Point")
# TODO make Server Getmap work with SLD Highlight feature and enable test
def test_wms_getmap_highlight_line(self):
# checks that SLD stroke-width works for Lines; see issue 19795 comments
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Labels",
"HIGHLIGHT_GEOM": "LINESTRING(-15000000 8055310, 2500000 8055310)",
"HIGHLIGHT_SYMBOL": "<?xml version=\"1.0\" encoding=\"UTF-8\"?><StyledLayerDescriptor xmlns=\"http://www.opengis.net/sld\" xmlns:ogc=\"http://www.opengis.net/ogc\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" version=\"1.1.0\" xmlns:xlink=\"http://www.w3.org/1999/xlink\" xsi:schemaLocation=\"http://www.opengis.net/sld http://schemas.opengis.net/sld/1.1.0/StyledLayerDescriptor.xsd\" xmlns:se=\"http://www.opengis.net/se\"><UserStyle><se:FeatureTypeStyle><se:Rule><se:LineSymbolizer><se:Stroke><se:SvgParameter name=\"stroke\">%23ff0000</se:SvgParameter><se:SvgParameter name=\"stroke-opacity\">1</se:SvgParameter><se:SvgParameter name=\"stroke-width\">17.3</se:SvgParameter><se:SvgParameter name=\"stroke-linecap\">round</se:SvgParameter></se:Stroke></se:LineSymbolizer></se:Rule></se:FeatureTypeStyle></UserStyle></StyledLayerDescriptor>",
"HIGHLIGHT_LABELSTRING": "",
"HIGHLIGHT_LABELSIZE": "10",
"HIGHLIGHT_LABELCOLOR": "black",
"HIGHLIGHT_LABELBUFFERCOLOR": "white",
"HIGHLIGHT_LABELBUFFERSIZE": "1",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Highlight_Line")
def test_wms_getmap_annotations(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectAnnotationPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,Hello",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Annotations")
def test_wms_getmap_sld(self):
import socketserver
import threading
import http.server
# Bring up a simple HTTP server
os.chdir(unitTestDataPath())
handler = http.server.SimpleHTTPRequestHandler
httpd = socketserver.TCPServer(('localhost', 0), handler)
port = httpd.server_address[1]
httpd_thread = threading.Thread(target=httpd.serve_forever)
httpd_thread.daemon = True
httpd_thread.start()
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,db_point",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_SLDRestored")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"REQUEST": "GetMap",
"VERSION": "1.1.1",
"SERVICE": "WMS",
"SLD": "http://localhost:" + str(port) + "/qgis_local_server/db_point.sld",
"BBOX": "-16817707,-4710778,5696513,14587125",
"WIDTH": "500",
"HEIGHT": "500",
"LAYERS": "db_point",
"STYLES": "",
"FORMAT": "image/png",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_SLD")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,db_point",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_SLDRestored")
httpd.server_close()
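# A hedged alternative to the throwaway server above (not used by the
# test): binding TCPServer to port 0 lets the OS pick a free port, and a
# daemon serve_forever thread plus shutdown()/server_close() tears it down
# cleanly. The helper name _serve_directory is hypothetical.
def _serve_directory(directory):
    import http.server, os, socketserver, threading
    os.chdir(directory)
    httpd = socketserver.TCPServer(("localhost", 0), http.server.SimpleHTTPRequestHandler)
    thread = threading.Thread(target=httpd.serve_forever, daemon=True)
    thread.start()
    # caller should eventually call httpd.shutdown() and httpd.server_close()
    return httpd, httpd.server_address[1]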
def test_wms_getmap_sld_body(self):
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,db_point",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_SLDRestored")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"REQUEST": "GetMap",
"VERSION": "1.1.1",
"SERVICE": "WMS",
"SLD_BODY": "<?xml version=\"1.0\" encoding=\"UTF-8\"?><StyledLayerDescriptor xmlns=\"http://www.opengis.net/sld\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ogc=\"http://www.opengis.net/ogc\" xsi:schemaLocation=\"http://www.opengis.net/sld http://schemas.opengis.net/sld/1.1.0/StyledLayerDescriptor.xsd\" version=\"1.1.0\" xmlns:se=\"http://www.opengis.net/se\" xmlns:xlink=\"http://www.w3.org/1999/xlink\"> <NamedLayer> <se:Name>db_point</se:Name> <UserStyle> <se:Name>db_point_style</se:Name> <se:FeatureTypeStyle> <se:Rule> <se:Name>Single symbol</se:Name> <ogc:Filter xmlns:ogc=\"http://www.opengis.net/ogc\"> <ogc:PropertyIsEqualTo> <ogc:PropertyName>gid</ogc:PropertyName> <ogc:Literal>1</ogc:Literal> </ogc:PropertyIsEqualTo> </ogc:Filter> <se:PointSymbolizer uom=\"http://www.opengeospatial.org/se/units/metre\"> <se:Graphic> <se:Mark> <se:WellKnownName>square</se:WellKnownName> <se:Fill> <se:SvgParameter name=\"fill\">5e86a1</se:SvgParameter> </se:Fill> <se:Stroke> <se:SvgParameter name=\"stroke\">000000</se:SvgParameter> </se:Stroke> </se:Mark> <se:Size>0.007</se:Size> </se:Graphic> </se:PointSymbolizer> </se:Rule> </se:FeatureTypeStyle> </UserStyle> </NamedLayer> </StyledLayerDescriptor>",
"BBOX": "-16817707,-4710778,5696513,14587125",
"WIDTH": "500",
"HEIGHT": "500",
"LAYERS": "db_point",
"STYLES": "",
"FORMAT": "image/png",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_SLD")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country,db_point",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_SLDRestored")
def test_wms_getmap_group(self):
"""A WMS shall render the requested layers by drawing the leftmost in the list
bottommost, the next one over that, and so on."""
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectGroupsPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "Country_Diagrams,Country_Labels,Country",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r_individual, _ = self._result(self._execute_request(qs))
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectGroupsPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "CountryGroup",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r_group, _ = self._result(self._execute_request(qs))
""" Debug check:
f = open('grouped.png', 'wb+')
f.write(r_group)
f.close()
f = open('individual.png', 'wb+')
f.write(r_individual)
f.close()
#"""
self.assertEqual(r_individual, r_group, 'Individual layers query and group layers query results should be identical')
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectGroupsPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"LAYERS": "group_short_name",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r_group, _ = self._result(self._execute_request(qs))
self.assertEqual(r_individual, r_group, 'Individual layers query and group layers query with short name results should be identical')
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectGroupsPath),
"SERVICE": "WMS",
"VERSION": "1.1.1",
"REQUEST": "GetMap",
"SLD_BODY": "<?xml version=\"1.0\" encoding=\"UTF-8\"?><StyledLayerDescriptor xmlns=\"http://www.opengis.net/sld\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ogc=\"http://www.opengis.net/ogc\" xsi:schemaLocation=\"http://www.opengis.net/sld http://schemas.opengis.net/sld/1.1.0/StyledLayerDescriptor.xsd\" version=\"1.1.0\" xmlns:se=\"http://www.opengis.net/se\" xmlns:xlink=\"http://www.w3.org/1999/xlink\"> <NamedLayer> <se:Name>CountryGroup</se:Name></NamedLayer> </StyledLayerDescriptor>",
"STYLES": "",
"FORMAT": "image/png",
"BBOX": "-16817707,-4710778,5696513,14587125",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:3857"
}.items())])
r_group_sld, _ = self._result(self._execute_request(qs))
self.assertEqual(r_individual, r_group_sld, 'Individual layers query and SLD group layers query results should be identical')
def test_wms_getmap_group_regression_20810(self):
"""A WMS shall render the requested layers by drawing the leftmost in the list
bottommost, the next one over that, and so on. Even if the layers are inside groups."""
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(os.path.join(self.testdata_path, 'test_project_wms_grouped_layers.qgs')),
"SERVICE": "WMS",
"VERSION": "1.3.0",
"REQUEST": "GetMap",
"BBOX": "613402.5658687877003,5809005.018114360981,619594.408781287726,5813869.006602735259",
"CRS": "EPSG:25832",
"WIDTH": "429",
"HEIGHT": "337",
"LAYERS": "osm,areas and symbols",
"STYLES": ",",
"FORMAT": "image/png",
"DPI": "200",
"MAP_RESOLUTION": "200",
"FORMAT_OPTIONS": "dpi:200"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_GroupedLayersUp")
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(os.path.join(self.testdata_path, 'test_project_wms_grouped_layers.qgs')),
"SERVICE": "WMS",
"VERSION": "1.3.0",
"REQUEST": "GetMap",
"BBOX": "613402.5658687877003,5809005.018114360981,619594.408781287726,5813869.006602735259",
"CRS": "EPSG:25832",
"WIDTH": "429",
"HEIGHT": "337",
"LAYERS": "areas and symbols,osm",
"STYLES": ",",
"FORMAT": "image/png",
"DPI": "200",
"MAP_RESOLUTION": "200",
"FORMAT_OPTIONS": "dpi:200"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_GroupedLayersDown")
def test_wms_getmap_datasource_error(self):
"""Must throw a server exception if datasource if not available"""
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(os.path.join(self.testdata_path, 'test_project_wms_invalid_layers.qgs')),
"SERVICE": "WMS",
"VERSION": "1.3.0",
"REQUEST": "GetMap",
"BBOX": "613402.5658687877003,5809005.018114360981,619594.408781287726,5813869.006602735259",
"CRS": "EPSG:25832",
"WIDTH": "429",
"HEIGHT": "337",
"LAYERS": "areas and symbols,osm",
"STYLES": ",",
"FORMAT": "image/png",
"DPI": "200",
"MAP_RESOLUTION": "200",
"FORMAT_OPTIONS": "dpi:200"
}.items())])
r, h = self._result(self._execute_request(qs))
self.assertIn('ServerException', str(r))
@unittest.skipIf(os.environ.get('TRAVIS', '') == 'true', 'Can\'t rely on external resources for continuous integration')
def test_wms_getmap_external(self):
# 1 bits
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(self.projectPath),
"SERVICE": "WMS",
"REQUEST": "GetMap",
"LAYERS": "EXTERNAL_WMS:landsat",
"landsat:layers": "GEBCO_LATEST",
"landsat:dpiMode": "7",
"landsat:url": "https://www.gebco.net/data_and_products/gebco_web_services/web_map_service/mapserv",
"landsat:crs": "EPSG:4326",
"landsat:styles": "default",
"landsat:format": "image/jpeg",
"landsat:bbox": "-90,-180,90,180",
"landsat:version": "1.3.0",
"STYLES": "",
"BBOX": "-90,-180,90,180",
"HEIGHT": "500",
"WIDTH": "500",
"CRS": "EPSG:4326"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_External", 20000)
def test_wms_getmap_root_custom_layer_order_regression_21917(self):
"""When drawing root layer, custom layer order order should be respected."""
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(os.path.join(self.testdata_path, 'bug_21917_root_layer_order.qgs')),
"SERVICE": "WMS",
"VERSION": "1.3.0",
"REQUEST": "GetMap",
"BBOX": "44.9014,8.20346,44.9015,8.20355",
"CRS": "EPSG:4326",
"WIDTH": "400",
"HEIGHT": "400",
"LAYERS": "group",
"STYLES": ",",
"FORMAT": "image/png",
"DPI": "200",
"MAP_RESOLUTION": "200",
"FORMAT_OPTIONS": "dpi:200"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Group_Layer_Order")
# Check with root_layer
qs = "?" + "&".join(["%s=%s" % i for i in list({
"MAP": urllib.parse.quote(os.path.join(self.testdata_path, 'bug_21917_root_layer_order.qgs')),
"SERVICE": "WMS",
"VERSION": "1.3.0",
"REQUEST": "GetMap",
"BBOX": "44.9014,8.20346,44.9015,8.20355",
"CRS": "EPSG:4326",
"WIDTH": "400",
"HEIGHT": "400",
"LAYERS": "root_layer",
"STYLES": ",",
"FORMAT": "image/png",
"DPI": "200",
"MAP_RESOLUTION": "200",
"FORMAT_OPTIONS": "dpi:200"
}.items())])
r, h = self._result(self._execute_request(qs))
self._img_diff_error(r, h, "WMS_GetMap_Group_Layer_Order")
if __name__ == '__main__':
unittest.main()
| 39.784802 | 1,239 | 0.509574 | 6,675 | 62,303 | 4.63236 | 0.085993 | 0.009573 | 0.017205 | 0.019663 | 0.816888 | 0.795285 | 0.780246 | 0.769283 | 0.767731 | 0.757123 | 0 | 0.094015 | 0.306005 | 62,303 | 1,565 | 1,240 | 39.810224 | 0.62112 | 0.041651 | 0 | 0.829787 | 0 | 0.006079 | 0.339791 | 0.115631 | 0.057751 | 0 | 0.000135 | 0.000639 | 0.015198 | 1 | 0.022796 | false | 0 | 0.010638 | 0 | 0.034195 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
51f09105dd5d94de50e06e5f71f9ca1078a66d53 | 14,509 | py | Python | MC/runner.py | ucbtrans/sumo-project | a1232cfa6705d79313733ae118777a276536b088 | ["BSD-2-Clause"] | 5 | 2017-10-26T23:18:45.000Z | 2021-12-13T08:48:54.000Z | MC/runner_platoons.py | ucbtrans/sumo-project | a1232cfa6705d79313733ae118777a276536b088 | ["BSD-2-Clause"] | null | null | null | MC/runner_platoons.py | ucbtrans/sumo-project | a1232cfa6705d79313733ae118777a276536b088 | ["BSD-2-Clause"] | 3 | 2016-06-14T19:33:22.000Z | 2021-03-23T03:41:31.000Z |
#!/usr/bin/env python
#@file runner.py
import os
import sys
import optparse
import subprocess
import random
import pdb
import settings
# import glob
# glob.platoons = []
# glob.platoonedvehicles=[]
# glob.platoonleaderspeed=[]
settings.init()
# import python modules from $SUMO_HOME/tools directory
try:
sys.path.insert(0,os.path.join(os.path.dirname(
__file__), '..', '..', 'code', 'sumo-0.30.0', "tools"))
sys.path.append(os.path.join(os.environ.get("SUMO_HOME", os.path.join(
os.path.dirname(__file__), "..", "..", "..")), "tools"))
from sumolib import checkBinary
except ImportError:
sys.exit("please declare environment variable 'SUMO_HOME' as the root directory of your sumo installation (it should contain folders 'bin', 'tools' and 'docs')")
from platoon_functions import *
import traci
PORT = 8873 # the port used for communicating with your sumo instance
print("MODULE:", traci.__file__)
# Runs the simulation, and allows you to change traffic phase
def run(platoon_flag):
## execute the TraCI control loop
traci.init(PORT)
programPointer = 0  # initiates at start (use len(PROGRAM) - 1 to initiate at the end instead)
step = 0
platooning = platoon_flag
## Platoon settings - platoon_comm should be a divisor of platoon_create so that there's no misalignment
# (i.e. forming platoons without first adjusting the current ones - if platoon_comm = 20 and platoon_create = 50,
# a platoon could be created before the existing ones, which may have changed lanes, are updated)
platoon_create = 50  # how often platoons update and are checked, every X ticks
platoon_comm = 10  # how often platoons communicate, every X ticks
numplatoons = 0
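# Hedged sanity check (not in the original script): enforce the divisor
# relationship described in the comment above, so platoon maintenance
# always runs on the ticks where platoons may also be created.
assert platoon_create % platoon_comm == 0, "platoon_comm must divide platoon_create"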
while traci.simulation.getMinExpectedNumber() > 0 and step <= 60*60*3*settings.step_frac:  # 3 hours (was 60*60*1.5, i.e. 1.5 hours)
traci.simulationStep() # advance a simulation step
if step % settings.step_frac == 0:
programPointer = (programPointer + 1) % 120
################################# PLATOONING #################################
## PLATOON CREATION
# Creates platoons if active, one line for each intersection and road segment.
start_range = 1
end_range = 120
targetTau = 0.8
targetMinGap = 3
accTau = 1.1
accMinGap = 3
## PLATOON CONTROL
if platooning and (step % platoon_comm == 0):
#step >= 88*settings.step_frac
platoon_control(accTau, accMinGap, targetTau, targetMinGap, platoon_comm, step/settings.step_frac) #step%150 == 0 and step >= 1000)
if platooning and (step % platoon_create == 0):
## top left intersection
create_platoons("116069075#0", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#0", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#0", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#0.376", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#0.376", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#0.376", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#0.376", "_3", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#1", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#1", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#1", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#0.394", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#0.394", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#0.394", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#0.394", "_3", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
## right of top left intersection
create_platoons("116069075#1.264", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1.264", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1.264", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1.264", "_3", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1.338", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1.338", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1.338", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1.338", "_3", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069075#1.338", "_4", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#0", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#0", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("5982169#0", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("26467810", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("26467810", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("26467810", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069186#1.128", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069186#1.128", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069186#1.128", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069186#1.128", "_3", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069186#1.128", "_4", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069186#1.112", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069186#1.112", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("116069186#1.112", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
## to the right of above intersection
# create_platoons("26467810.54", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("26467810.54", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("26467810.54", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("26467810.54", "_3", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("26467810.54", "_4", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
#
# create_platoons("116069186#1", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("116069186#1", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("116069186#1", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
#
# create_platoons("116069261", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("116069261", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
#
# create_platoons("50846753.54", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("50846753.54", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("50846753.54", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
#
# create_platoons("-50846769", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("-50846769", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("-50846769", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
#
# ## finish main route
# create_platoons("50846755#0", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("50846755#0", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
#
# create_platoons("50846755#0.517", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("50846755#0.517", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
# create_platoons("50846755#0.517", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#0.671", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#0.671", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#0.671", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#0.671", "_3", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#2.0", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#2.0", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#2.0.590", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#2.0.590", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("50846755#2.0.590", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("-50846720#6.20", "_0", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("-50846720#6.20", "_1", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("-50846720#6.20", "_2", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("-50846720#6.20", "_3", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
create_platoons("-50846720#6.20", "_4", start_range, end_range, accTau, accMinGap, targetTau, targetMinGap, programPointer)
## Final Simulation Step Actions
# sets traffic light at intersection 13 at the phase indicated
sys.stdout.flush()
step += 1
traci.close()
sys.stdout.flush()
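# A minimal sketch of collapsing the repetitive create_platoons() calls in
# run() into data plus a loop. The PLATOON_EDGES table below lists only a
# couple of the edges as an illustration; the full edge/lane set from the
# calls above would need to be filled in, and the helper name is
# hypothetical.
PLATOON_EDGES = {
    "116069075#0": 3,        # edge id -> number of lanes to platoon
    "116069075#0.376": 4,
}
def create_all_platoons(start_range, end_range, accTau, accMinGap,
                        targetTau, targetMinGap, programPointer):
    for edge, lane_count in PLATOON_EDGES.items():
        for lane in range(lane_count):
            create_platoons(edge, "_%d" % lane, start_range, end_range,
                            accTau, accMinGap, targetTau, targetMinGap, programPointer)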
def get_options():
optParser = optparse.OptionParser()
optParser.add_option("--nogui", action="store_true",
default=False, help="run the commandline version of sumo")
options, args = optParser.parse_args()
return options
# this is the main entry point of this script
if __name__ == "__main__":
options = get_options()
# this script has been called from the command line. It will start sumo as a
# server, then connect and run
if options.nogui:
sumoBinary = checkBinary('sumo')
else:
sumoBinary = checkBinary('sumo-gui')
sumoProcess = subprocess.Popen([sumoBinary,
"-c", "network/testmap.sumocfg",
"--step-length", str(settings.step_length),
"--tripinfo-output", "tripinfo.xml",
"--xml-validation", "never",
"--lanechange.overtake-right", "true",
"--lanechange.duration", "3",
"--ignore-junction-blocker", "10",
"--remote-port", str(PORT)],
stdout=sys.stdout, stderr=sys.stderr)
run(True)
sumoProcess.wait()
| 65.06278 | 165 | 0.688056 | 1,591 | 14,509 | 6.056568 | 0.160277 | 0.078871 | 0.18929 | 0.283935 | 0.741179 | 0.735367 | 0.72665 | 0.721046 | 0.721046 | 0.721046 | 0 | 0.082014 | 0.194914 | 14,509 | 222 | 166 | 65.355856 | 0.742916 | 0.284237 | 0 | 0.016667 | 0 | 0.008333 | 0.124036 | 0.009376 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016667 | false | 0 | 0.091667 | 0 | 0.116667 | 0.008333 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a400938d7c5183c50d82351eec0113546c4fa1cd | 74,366 | py | Python | trackme/bin/trackme_rest_handler_data_hosts.py | dritanbitincka/trackme | 5b2e1eb6b8d46619d2ba053d985647410253e656 | ["Apache-2.0"] | 1 | 2021-06-06T11:51:36.000Z | 2021-06-06T11:51:36.000Z | trackme/bin/trackme_rest_handler_data_hosts.py | duanshuaimin/trackme | 78d0ec64e3ae2e40878b282ba2f375f978a28d73 | ["Apache-2.0"] | null | null | null | trackme/bin/trackme_rest_handler_data_hosts.py | duanshuaimin/trackme | 78d0ec64e3ae2e40878b282ba2f375f978a28d73 | ["Apache-2.0"] | null | null | null |
import logging
import os, sys
import splunk
import splunk.entity
import splunk.Intersplunk
import json
logger = logging.getLogger(__name__)
splunkhome = os.environ['SPLUNK_HOME']
sys.path.append(os.path.join(splunkhome, 'etc', 'apps', 'trackme', 'lib'))
import rest_handler
import splunklib.client as client
class TrackMeHandlerDataHosts_v1(rest_handler.RESTHandler):
def __init__(self, command_line, command_arg):
super(TrackMeHandlerDataHosts_v1, self).__init__(command_line, command_arg, logger)
# Get the entire data hosts collection as a Python array
def get_dh_collection(self, request_info, **kwargs):
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
try:
describe = resp_dict['describe']
if describe in ("true", "True"):
describe = True
except Exception as e:
describe = False
else:
# body is not required in this endpoint, if not submitted do not describe the usage
describe = False
if describe:
response = "{\"describe\": \"This endpoint retrieves the entire data hosts collection returned as a JSON array, it requires a GET call with no data required\"}"\
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Render
return {
"payload": json.dumps(collection.data.query(), indent=1),
'status': 200 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
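# A hedged alternative (not used by this handler) to hand-escaping the
# describe payloads built throughout this class: construct a dict and let
# json.dumps produce the JSON, which avoids the quote escaping and stray
# line continuations above. The helper name is hypothetical.
def _describe_response(text, options=None):
    import json
    body = {"describe": text}
    if options is not None:
        body["options"] = options
    return {"payload": json.dumps(body, indent=1), "status": 200}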
# Get data host by _key
def get_dh_by_key(self, request_info, **kwargs):
describe = False
# By object_category and object
key = None
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
try:
describe = resp_dict['describe']
if describe in ("true", "True"):
describe = True
except Exception as e:
describe = False
if not describe:
key = resp_dict['_key']
else:
# body is required in this endpoint, if not submitted describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint retrieves an existing data host record by the Kvstore key, it requires a GET call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"_key\": \"KVstore unique identifier for this record\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Get the record
record = json.dumps(collection.data.query_by_id(key), indent=1)
# Render result
if record is not None and len(record)>2:
return {
"payload": str(record),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found ' + str(key),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
# Get data host by object name
def get_dh_by_name(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
try:
describe = resp_dict['describe']
if describe in ("true", "True"):
describe = True
except Exception as e:
describe = False
if not describe:
data_host = resp_dict['data_host']
else:
# body is required in this endpoint, if not submitted describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint retrieves an existing data host record by the data host name (data_host), it requires a GET call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Get the record
record = json.dumps(collection.data.query(query=str(query_string)), indent=1)
# Render result
if record is not None and len(record)>2:
return {
"payload": str(record),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
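# Hedged note: building query_string by string concatenation breaks if
# data_host contains quotes or backslashes. A sketch of the safer
# construction with json.dumps (helper name hypothetical):
def _kv_query_for_host(data_host):
    import json
    # json.dumps escapes the value, producing a valid KVstore query document
    return json.dumps({"data_host": data_host})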
# Disable monitoring by object name
def post_dh_disable_monitoring(self, request_info, **kwargs):
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
try:
describe = resp_dict['describe']
if describe in ("true", "True"):
describe = True
except Exception as e:
describe = False
if not describe:
data_host = resp_dict['data_host']
else:
# body is required in this endpoint, if not submitted describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint disables data monitoring for an existing data host by the data host name (data_host), it requires a POST call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
# Note: the record is returned as an array; since we search for a specific record, we expect one record only
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# define the new state
data_monitored_state = "disabled"
# Render result
if key is not None and len(key)>2:
# Update the record
collection.data.update(str(key), json.dumps({
"object_category": record[0].get('object_category'),
"data_host": record[0].get('data_host'),
"data_index": record[0].get('data_index'),
"data_sourcetype": record[0].get('data_sourcetype'),
"data_last_lag_seen": record[0].get('data_last_lag_seen'),
"data_last_ingestion_lag_seen": record[0].get('data_last_ingestion_lag_seen'),
"data_eventcount": record[0].get('data_eventcount'),
"data_first_time_seen": record[0].get('data_first_time_seen'),
"data_last_time_seen": record[0].get('data_last_time_seen'),
"data_last_ingest": record[0].get('data_last_ingest'),
"data_max_lag_allowed": record[0].get('data_max_lag_allowed'),
"data_lag_alert_kpis": record[0].get('data_lag_alert_kpis'),
"data_monitored_state": str(data_monitored_state),
"data_monitoring_wdays": record[0].get('data_monitoring_wdays'),
"data_override_lagging_class": record[0].get('data_override_lagging_class'),
"data_host_state": record[0].get('data_host_state'),
"data_tracker_runtime": record[0].get('data_tracker_runtime'),
"data_previous_host_state": record[0].get('data_previous_host_state'),
"data_previous_tracker_runtime": record[0].get('data_previous_tracker_runtime'),
"data_host_st_summary": record[0].get('data_host_st_summary'),
"data_host_alerting_policy": record[0].get('data_host_alerting_policy'),
"OutlierMinEventCount": record[0].get('OutlierMinEventCount'),
"OutlierLowerThresholdMultiplier": record[0].get('OutlierLowerThresholdMultiplier'),
"OutlierUpperThresholdMultiplier": record[0].get('OutlierUpperThresholdMultiplier'),
"OutlierAlertOnUpper": record[0].get('OutlierAlertOnUpper'),
"OutlierTimePeriod": record[0].get('OutlierTimePeriod'),
"OutlierSpan": record[0].get('OutlierSpan'),
"isOutlier": record[0].get('isOutlier'),
"enable_behaviour_analytic": record[0].get('enable_behaviour_analytic'),
"latest_flip_state": record[0].get('latest_flip_state'),
"latest_flip_time": record[0].get('latest_flip_time'),
"priority": record[0].get('priority')
}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "disable monitoring",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json.dumps(collection.data.query_by_id(key), indent=1)),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": json.dumps(collection.data.query_by_id(key), indent=1),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
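    # The KV store queries in this class are built by raw string concatenation
    # ('{ "data_host": "' + data_host + '" }'), which yields invalid JSON whenever
    # the host name contains quotes or backslashes. A minimal sketch of a safer
    # builder that delegates the escaping to json.dumps; this helper is
    # illustrative only and is not called by the original endpoints.
    @staticmethod
    def _build_kv_query(field_name, field_value):
        import json
        return json.dumps({field_name: field_value})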
# Enable monitoring by object name
def post_dh_enable_monitoring(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
            try:
                describe = resp_dict['describe']
                if describe in ("true", "True", True):
                    describe = True
                else:
                    describe = False
            except Exception as e:
                describe = False
if not describe:
data_host = resp_dict['data_host']
else:
            # a body is required for this endpoint; if not submitted, describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint enables data monitoring for an existing data host by the data host name (data_host), it requires a POST call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
            # Note: the query returns an array; since we search for a specific record, we expect exactly one result
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# define the new state
data_monitored_state = "enabled"
# Render result
if key is not None and len(key)>2:
# Update the record
collection.data.update(str(key), json.dumps({
"object_category": record[0].get('object_category'),
"data_host": record[0].get('data_host'),
"data_index": record[0].get('data_index'),
"data_sourcetype": record[0].get('data_sourcetype'),
"data_last_lag_seen": record[0].get('data_last_lag_seen'),
"data_last_ingestion_lag_seen": record[0].get('data_last_ingestion_lag_seen'),
"data_eventcount": record[0].get('data_eventcount'),
"data_first_time_seen": record[0].get('data_first_time_seen'),
"data_last_time_seen": record[0].get('data_last_time_seen'),
"data_last_ingest": record[0].get('data_last_ingest'),
"data_max_lag_allowed": record[0].get('data_max_lag_allowed'),
"data_lag_alert_kpis": record[0].get('data_lag_alert_kpis'),
"data_monitored_state": str(data_monitored_state),
"data_monitoring_wdays": record[0].get('data_monitoring_wdays'),
"data_override_lagging_class": record[0].get('data_override_lagging_class'),
"data_host_state": record[0].get('data_host_state'),
"data_tracker_runtime": record[0].get('data_tracker_runtime'),
"data_previous_host_state": record[0].get('data_previous_host_state'),
"data_previous_tracker_runtime": record[0].get('data_previous_tracker_runtime'),
"data_host_st_summary": record[0].get('data_host_st_summary'),
"data_host_alerting_policy": record[0].get('data_host_alerting_policy'),
"OutlierMinEventCount": record[0].get('OutlierMinEventCount'),
"OutlierLowerThresholdMultiplier": record[0].get('OutlierLowerThresholdMultiplier'),
"OutlierUpperThresholdMultiplier": record[0].get('OutlierUpperThresholdMultiplier'),
"OutlierAlertOnUpper": record[0].get('OutlierAlertOnUpper'),
"OutlierTimePeriod": record[0].get('OutlierTimePeriod'),
"OutlierSpan": record[0].get('OutlierSpan'),
"isOutlier": record[0].get('isOutlier'),
"enable_behaviour_analytic": record[0].get('enable_behaviour_analytic'),
"latest_flip_state": record[0].get('latest_flip_state'),
"latest_flip_time": record[0].get('latest_flip_time'),
"priority": record[0].get('priority')
}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "enable monitoring",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json.dumps(collection.data.query_by_id(key), indent=1)),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": json.dumps(collection.data.query_by_id(key), indent=1),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
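    # Hedged refactor sketch: every endpoint in this class inlines the same
    # audit-insert block (epoch-millisecond timestamp plus a fixed set of audit
    # fields written to kv_trackme_audit_changes). The helper below mirrors that
    # block one-to-one; the argument names are assumptions and the original
    # endpoints do not call it.
    @staticmethod
    def _record_audit_change(collection_audit, user, change_type, object_name, object_attrs, update_comment):
        import json
        import time
        collection_audit.data.insert(json.dumps({
            "time": str(int(round(time.time() * 1000))),
            "user": str(user),
            "action": "success",
            "change_type": str(change_type),
            "object": str(object_name),
            "object_category": "data_host",
            "object_attrs": str(object_attrs),
            "result": "N/A",
            "comment": str(update_comment)
        }))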
# Reset by object name
def post_dh_reset(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
            try:
                describe = resp_dict['describe']
                if describe in ("true", "True", True):
                    describe = True
                else:
                    describe = False
            except Exception as e:
                describe = False
if not describe:
data_host = resp_dict['data_host']
else:
            # a body is required for this endpoint; if not submitted, describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint resets (removal of index and sourcetype knowledge) an existing data host by the data host name (data_host), it requires a POST call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
            # Note: the query returns an array; since we search for a specific record, we expect exactly one result
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# Render result
if key is not None and len(key)>2:
# Update the record
collection.data.update(str(key), json.dumps({
"object_category": record[0].get('object_category'),
"data_host": record[0].get('data_host'),
"data_last_lag_seen": record[0].get('data_last_lag_seen'),
"data_last_ingestion_lag_seen": record[0].get('data_last_ingestion_lag_seen'),
"data_eventcount": record[0].get('data_eventcount'),
"data_first_time_seen": record[0].get('data_first_time_seen'),
"data_last_time_seen": record[0].get('data_last_time_seen'),
"data_last_ingest": record[0].get('data_last_ingest'),
"data_max_lag_allowed": record[0].get('data_max_lag_allowed'),
"data_lag_alert_kpis": record[0].get('data_lag_alert_kpis'),
"data_monitored_state": record[0].get('data_monitored_state'),
"data_monitoring_wdays": record[0].get('data_monitoring_wdays'),
"data_override_lagging_class": record[0].get('data_override_lagging_class'),
"data_host_state": record[0].get('data_host_state'),
"data_tracker_runtime": record[0].get('data_tracker_runtime'),
"data_previous_host_state": record[0].get('data_previous_host_state'),
"data_previous_tracker_runtime": record[0].get('data_previous_tracker_runtime'),
"data_host_alerting_policy": record[0].get('data_host_alerting_policy'),
"OutlierMinEventCount": record[0].get('OutlierMinEventCount'),
"OutlierLowerThresholdMultiplier": record[0].get('OutlierLowerThresholdMultiplier'),
"OutlierUpperThresholdMultiplier": record[0].get('OutlierUpperThresholdMultiplier'),
"OutlierAlertOnUpper": record[0].get('OutlierAlertOnUpper'),
"OutlierTimePeriod": record[0].get('OutlierTimePeriod'),
"OutlierSpan": record[0].get('OutlierSpan'),
"isOutlier": record[0].get('isOutlier'),
"enable_behaviour_analytic": record[0].get('enable_behaviour_analytic'),
"latest_flip_state": record[0].get('latest_flip_state'),
"latest_flip_time": record[0].get('latest_flip_time'),
"priority": record[0].get('priority')
}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "reset data",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json.dumps(collection.data.query_by_id(key), indent=1)),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": json.dumps(collection.data.query_by_id(key), indent=1),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
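    # Hedged refactor sketch: the update endpoints re-emit every stored field and
    # override a few of them. Assuming the untouched fields should simply be
    # carried over, the same update can be expressed as a dict copy; note that the
    # reset endpoint above instead drops data_index, data_sourcetype and
    # data_host_st_summary on purpose, which this helper would have to pop
    # explicitly. Illustrative only; not used by the original code.
    @staticmethod
    def _updated_record(record, **overrides):
        new_record = dict(record)     # shallow copy of the existing KV record
        new_record.pop('_key', None)  # the key is passed separately to update()
        new_record.update(overrides)  # apply the changed fields
        return new_record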
# Update priority by object name
def post_dh_update_priority(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
            try:
                describe = resp_dict['describe']
                if describe in ("true", "True", True):
                    describe = True
                else:
                    describe = False
            except Exception as e:
                describe = False
if not describe:
data_host = resp_dict['data_host']
priority = resp_dict['priority']
else:
            # a body is required for this endpoint; if not submitted, describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint updates the priority definition for an existing data host by the data host name (data_host), it requires a POST call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\", "\
+ "\"priority\": \"priority value, valid options are low / medium / high\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
            # Note: the query returns an array; since we search for a specific record, we expect exactly one result
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# Render result
if key is not None and len(key)>2 and priority in ("low", "medium", "high"):
# Update the record
collection.data.update(str(key), json.dumps({
"object_category": record[0].get('object_category'),
"data_host": record[0].get('data_host'),
"data_index": record[0].get('data_index'),
"data_sourcetype": record[0].get('data_sourcetype'),
"data_last_lag_seen": record[0].get('data_last_lag_seen'),
"data_last_ingestion_lag_seen": record[0].get('data_last_ingestion_lag_seen'),
"data_eventcount": record[0].get('data_eventcount'),
"data_first_time_seen": record[0].get('data_first_time_seen'),
"data_last_time_seen": record[0].get('data_last_time_seen'),
"data_last_ingest": record[0].get('data_last_ingest'),
"data_max_lag_allowed": record[0].get('data_max_lag_allowed'),
"data_lag_alert_kpis": record[0].get('data_lag_alert_kpis'),
"data_monitored_state": record[0].get('data_monitored_state'),
"data_monitoring_wdays": record[0].get('data_monitoring_wdays'),
"data_override_lagging_class": record[0].get('data_override_lagging_class'),
"data_host_state": record[0].get('data_host_state'),
"data_tracker_runtime": record[0].get('data_tracker_runtime'),
"data_previous_host_state": record[0].get('data_previous_host_state'),
"data_previous_tracker_runtime": record[0].get('data_previous_tracker_runtime'),
"data_host_st_summary": record[0].get('data_host_st_summary'),
"data_host_alerting_policy": record[0].get('data_host_alerting_policy'),
"OutlierMinEventCount": record[0].get('OutlierMinEventCount'),
"OutlierLowerThresholdMultiplier": record[0].get('OutlierLowerThresholdMultiplier'),
"OutlierUpperThresholdMultiplier": record[0].get('OutlierUpperThresholdMultiplier'),
"OutlierAlertOnUpper": record[0].get('OutlierAlertOnUpper'),
"OutlierTimePeriod": record[0].get('OutlierTimePeriod'),
"OutlierSpan": record[0].get('OutlierSpan'),
"isOutlier": record[0].get('isOutlier'),
"enable_behaviour_analytic": record[0].get('enable_behaviour_analytic'),
"latest_flip_state": record[0].get('latest_flip_state'),
"latest_flip_time": record[0].get('latest_flip_time'),
"priority": str(priority)
}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "modify priority",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json.dumps(collection.data.query_by_id(key), indent=1)),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": json.dumps(collection.data.query_by_id(key), indent=1),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found or request is incorrect ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
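    # Hedged note: the priority endpoint above returns a single 404 both when the
    # host does not exist and when the submitted priority is invalid. A sketch that
    # separates the two cases (illustrative, not the original behaviour):
    #     if key is None or len(key) <= 2:
    #         return {"payload": 'Warn: resource not found ' + str(query_string), 'status': 404}
    #     if priority not in ("low", "medium", "high"):
    #         return {"payload": 'Warn: invalid priority ' + str(priority), 'status': 400}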
# Update lagging policy by object name
def post_dh_update_lag_policy(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
            try:
                describe = resp_dict['describe']
                if describe in ("true", "True", True):
                    describe = True
                else:
                    describe = False
            except Exception as e:
                describe = False
if not describe:
data_host = resp_dict['data_host']
data_lag_alert_kpis = resp_dict['data_lag_alert_kpis'] # all_kpis / lag_ingestion_kpi / lag_event_kpi
data_max_lag_allowed = int(resp_dict['data_max_lag_allowed']) # seconds
data_override_lagging_class = resp_dict['data_override_lagging_class'] # true / false
data_host_alerting_policy = resp_dict['data_host_alerting_policy'] # global_policy / track_per_sourcetype / track_per_host
else:
            # a body is required for this endpoint; if not submitted, describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint resets (removal of index and sourcetype knowledge) an existing data host by the data host name (data_host), it requires a POST call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
            # Note: the query returns an array; since we search for a specific record, we expect exactly one result
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# Render result
if key is not None and len(key)>2 and data_override_lagging_class in ("true", "false") and data_lag_alert_kpis in ("all_kpis", "lag_ingestion_kpi", "lag_event_kpi") and data_host_alerting_policy in ("global_policy", "track_per_sourcetype", "track_per_host"):
# Update the record
collection.data.update(str(key), json.dumps({
"object_category": record[0].get('object_category'),
"data_host": record[0].get('data_host'),
"data_index": record[0].get('data_index'),
"data_sourcetype": record[0].get('data_sourcetype'),
"data_last_lag_seen": record[0].get('data_last_lag_seen'),
"data_last_ingestion_lag_seen": record[0].get('data_last_ingestion_lag_seen'),
"data_eventcount": record[0].get('data_eventcount'),
"data_first_time_seen": record[0].get('data_first_time_seen'),
"data_last_time_seen": record[0].get('data_last_time_seen'),
"data_last_ingest": record[0].get('data_last_ingest'),
"data_max_lag_allowed": str(data_max_lag_allowed),
"data_lag_alert_kpis": str(data_lag_alert_kpis),
"data_monitored_state": record[0].get('data_monitored_state'),
"data_monitoring_wdays": record[0].get('data_monitoring_wdays'),
"data_override_lagging_class": str(data_override_lagging_class),
"data_host_state": record[0].get('data_host_state'),
"data_tracker_runtime": record[0].get('data_tracker_runtime'),
"data_previous_host_state": record[0].get('data_previous_host_state'),
"data_previous_tracker_runtime": record[0].get('data_previous_tracker_runtime'),
"data_host_st_summary": record[0].get('data_host_st_summary'),
"data_host_alerting_policy": str(data_host_alerting_policy),
"OutlierMinEventCount": record[0].get('OutlierMinEventCount'),
"OutlierLowerThresholdMultiplier": record[0].get('OutlierLowerThresholdMultiplier'),
"OutlierUpperThresholdMultiplier": record[0].get('OutlierUpperThresholdMultiplier'),
"OutlierAlertOnUpper": record[0].get('OutlierAlertOnUpper'),
"OutlierTimePeriod": record[0].get('OutlierTimePeriod'),
"OutlierSpan": record[0].get('OutlierSpan'),
"isOutlier": record[0].get('isOutlier'),
"enable_behaviour_analytic": record[0].get('enable_behaviour_analytic'),
"latest_flip_state": record[0].get('latest_flip_state'),
"latest_flip_time": record[0].get('latest_flip_time'),
"priority": record[0].get('priority')
}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "modify monitoring lag policy",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json.dumps(collection.data.query_by_id(key), indent=1)),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": json.dumps(collection.data.query_by_id(key), indent=1),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found or request is incorrect ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
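    # Hedged sketch: the lag-policy validation above compares against inline tuples;
    # the accepted values could be declared once as constants (illustrative, not
    # present in the original code):
    #     VALID_LAG_KPIS = ("all_kpis", "lag_ingestion_kpi", "lag_event_kpi")
    #     VALID_ALERTING_POLICIES = ("global_policy", "track_per_sourcetype", "track_per_host")
    #     VALID_OVERRIDE_FLAGS = ("true", "false")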
# Update monitoring week days by object name
def post_dh_update_wdays(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
            try:
                describe = resp_dict['describe']
                if describe in ("true", "True", True):
                    describe = True
                else:
                    describe = False
            except Exception as e:
                describe = False
if not describe:
data_host = resp_dict['data_host']
# Week days monitoring can be:
# manual:all_days / manual:monday-to-friday / manual:monday-to-saturday / [ 0, 1, 2, 3, 4, 5, 6 ] where Sunday is 0
data_monitoring_wdays = resp_dict['data_monitoring_wdays']
else:
            # a body is required for this endpoint; if not submitted, describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint configures the week days monitoring rule for an existing data host, it requires a POST call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\", "\
+ "\"data_monitoring_wdays\": \"the week days rule, valid options are manual:all_days / manual:monday-to-friday / manual:monday-to-saturday / [ 0, 1, 2, 3, 4, 5, 6 ] where Sunday is 0\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
            # Note: the query returns an array; since we search for a specific record, we expect exactly one result
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# Render result
if key is not None and len(key)>2:
# Update the record
collection.data.update(str(key), json.dumps({
"object_category": record[0].get('object_category'),
"data_host": record[0].get('data_host'),
"data_index": record[0].get('data_index'),
"data_sourcetype": record[0].get('data_sourcetype'),
"data_last_lag_seen": record[0].get('data_last_lag_seen'),
"data_last_ingestion_lag_seen": record[0].get('data_last_ingestion_lag_seen'),
"data_eventcount": record[0].get('data_eventcount'),
"data_first_time_seen": record[0].get('data_first_time_seen'),
"data_last_time_seen": record[0].get('data_last_time_seen'),
"data_last_ingest": record[0].get('data_last_ingest'),
"data_max_lag_allowed": record[0].get('data_max_lag_allowed'),
"data_lag_alert_kpis": record[0].get('data_lag_alert_kpis'),
"data_monitored_state": record[0].get('data_monitored_state'),
"data_monitoring_wdays": str(data_monitoring_wdays),
"data_override_lagging_class": record[0].get('data_override_lagging_class'),
"data_host_state": record[0].get('data_host_state'),
"data_tracker_runtime": record[0].get('data_tracker_runtime'),
"data_previous_host_state": record[0].get('data_previous_host_state'),
"data_previous_tracker_runtime": record[0].get('data_previous_tracker_runtime'),
"data_host_st_summary": record[0].get('data_host_st_summary'),
"data_host_alerting_policy": record[0].get('data_host_alerting_policy'),
"OutlierMinEventCount": record[0].get('OutlierMinEventCount'),
"OutlierLowerThresholdMultiplier": record[0].get('OutlierLowerThresholdMultiplier'),
"OutlierUpperThresholdMultiplier": record[0].get('OutlierUpperThresholdMultiplier'),
"OutlierAlertOnUpper": record[0].get('OutlierAlertOnUpper'),
"OutlierTimePeriod": record[0].get('OutlierTimePeriod'),
"OutlierSpan": record[0].get('OutlierSpan'),
"isOutlier": record[0].get('isOutlier'),
"enable_behaviour_analytic": record[0].get('enable_behaviour_analytic'),
"latest_flip_state": record[0].get('latest_flip_state'),
"latest_flip_time": record[0].get('latest_flip_time'),
"priority": record[0].get('priority')
}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "modify week days monitoring",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json.dumps(collection.data.query_by_id(key), indent=1)),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": json.dumps(collection.data.query_by_id(key), indent=1),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found or request is incorrect ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
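    # Hedged sketch: data_monitoring_wdays accepts either a manual rule or a list
    # of week day numbers where Sunday is 0, but the endpoint above stores the
    # value without validation. An illustrative check (assumption, not original
    # behaviour):
    #     valid_rules = ("manual:all_days", "manual:monday-to-friday", "manual:monday-to-saturday")
    #     wdays = str(data_monitoring_wdays)
    #     is_valid = wdays in valid_rules or all(
    #         part.strip().isdigit() and 0 <= int(part) <= 6
    #         for part in wdays.strip("[] ").split(","))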
# Update outliers configuration by object name
def post_dh_update_outliers(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
            try:
                describe = resp_dict['describe']
                if describe in ("true", "True", True):
                    describe = True
                else:
                    describe = False
            except Exception as e:
                describe = False
if not describe:
data_host = resp_dict['data_host']
OutlierMinEventCount = resp_dict['OutlierMinEventCount'] # integer, default to 0 (disabled)
OutlierLowerThresholdMultiplier = resp_dict['OutlierLowerThresholdMultiplier'] # integer, defaults to 4
OutlierUpperThresholdMultiplier = resp_dict['OutlierUpperThresholdMultiplier'] # integer, defaults to 4
OutlierAlertOnUpper = resp_dict['OutlierAlertOnUpper'] # true / false
OutlierTimePeriod = resp_dict['OutlierTimePeriod'] # relative time period, default to -7d
OutlierSpan = resp_dict['OutlierSpan'] # span period Splunk notation, defaults to 5m
enable_behaviour_analytic = resp_dict['enable_behaviour_analytic'] # true / false
else:
            # a body is required for this endpoint; if not submitted, describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint configures the week days monitoring rule for an existing data host, it requires a POST call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"OutlierMinEventCount\": \"the minimal number of events, if set to anything bigger than 0, the lower bound becomes a static value, needs to be an integer, default to 0 (disabled)\", "\
+ "\"OutlierLowerThresholdMultiplier\": \"The lower bound threshold multiplier, must be an integer, defaults to 4\", "\
+ "\"OutlierUpperThresholdMultiplier\": \"The upper bound threshold multiplier, must be integer, defaults to 4\", "\
+ "\"OutlierAlertOnUpper\": \"Enables / Disables alerting on upper outliers detection, valid options are true / false, defaults to false\", "\
+ "\"OutlierTimePeriod\": \"relative time period for outliers calculation, default to -7d\", "\
+ "\"OutlierSpan\": \"span period Splunk notation for outliers UI rendering, defaults to 5m\", "\
+ "\"enable_behaviour_analytic\": \"Enables / Disables outliers detection for that object, valid options are true / false, defaults to true\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\", "\
+ "\"data_host\": \"name of the data host\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
            # Note: the query returns an array; since we search for a specific record, we expect exactly one result
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# Render result
if key is not None and len(key)>2:
# Update the record
collection.data.update(str(key), json.dumps({
"object_category": record[0].get('object_category'),
"data_host": record[0].get('data_host'),
"data_index": record[0].get('data_index'),
"data_sourcetype": record[0].get('data_sourcetype'),
"data_last_lag_seen": record[0].get('data_last_lag_seen'),
"data_last_ingestion_lag_seen": record[0].get('data_last_ingestion_lag_seen'),
"data_eventcount": record[0].get('data_eventcount'),
"data_first_time_seen": record[0].get('data_first_time_seen'),
"data_last_time_seen": record[0].get('data_last_time_seen'),
"data_last_ingest": record[0].get('data_last_ingest'),
"data_max_lag_allowed": record[0].get('data_max_lag_allowed'),
"data_lag_alert_kpis": record[0].get('data_lag_alert_kpis'),
"data_monitored_state": record[0].get('data_monitored_state'),
"data_monitoring_wdays": record[0].get('data_monitoring_wdays'),
"data_override_lagging_class": record[0].get('data_override_lagging_class'),
"data_host_state": record[0].get('data_host_state'),
"data_tracker_runtime": record[0].get('data_tracker_runtime'),
"data_previous_host_state": record[0].get('data_previous_host_state'),
"data_previous_tracker_runtime": record[0].get('data_previous_tracker_runtime'),
"data_host_st_summary": record[0].get('data_host_st_summary'),
"data_host_alerting_policy": record[0].get('data_host_alerting_policy'),
"OutlierMinEventCount": str(OutlierMinEventCount),
"OutlierLowerThresholdMultiplier": str(OutlierLowerThresholdMultiplier),
"OutlierUpperThresholdMultiplier": str(OutlierUpperThresholdMultiplier),
"OutlierAlertOnUpper": str(OutlierAlertOnUpper),
"OutlierTimePeriod": str(OutlierTimePeriod),
"OutlierSpan": str(OutlierSpan),
"isOutlier": record[0].get('isOutlier'),
"enable_behaviour_analytic": str(enable_behaviour_analytic),
"latest_flip_state": record[0].get('latest_flip_state'),
"latest_flip_time": record[0].get('latest_flip_time'),
"priority": record[0].get('priority')
}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "modify outliers",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json.dumps(collection.data.query_by_id(key), indent=1)),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": json.dumps(collection.data.query_by_id(key), indent=1),
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found or request is incorrect ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
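    # Hedged note: the outlier parameters above are documented as integers
    # (OutlierMinEventCount and the two threshold multipliers) but are stored
    # without coercion, unlike data_max_lag_allowed in the lag-policy endpoint.
    # A defensive variant could normalise them first (illustrative):
    #     OutlierMinEventCount = int(OutlierMinEventCount)
    #     OutlierLowerThresholdMultiplier = int(OutlierLowerThresholdMultiplier)
    #     OutlierUpperThresholdMultiplier = int(OutlierUpperThresholdMultiplier)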
    # Remove data host temporarily by object name
def delete_dh_delete_temporary(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
            try:
                describe = resp_dict['describe']
                if describe in ("true", "True", True):
                    describe = True
                else:
                    describe = False
            except Exception as e:
                describe = False
if not describe:
data_host = resp_dict['data_host']
else:
            # a body is required for this endpoint; if not submitted, describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint performs a temporary deletion of an existing data host, it requires a DELETE call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
            # Note: the query returns an array; since we search for a specific record, we expect exactly one result
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# Render result
if key is not None and len(key)>2:
# Store the record for audit purposes
json_data = str(json.dumps(collection.data.query_by_id(key), indent=1))
# Remove the record
collection.data.delete(json.dumps({"_key":key}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "delete temporary",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json_data),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": "Record with _key " + str(key) + " was temporarily deleted from the collection.",
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found or request is incorrect ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
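    # Hedged note: delete_dh_delete_temporary above and delete_dh_delete_permanent
    # below are identical except for the change_type written to the audit record
    # ("delete temporary" vs "delete permanent") and the response message; the
    # temporary/permanent semantics are presumably enforced elsewhere from that
    # audit trail. A shared core taking change_type as an argument would remove
    # the duplication (refactor sketch, not part of the original code).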
    # Remove data host permanently by object name
def delete_dh_delete_permanent(self, request_info, **kwargs):
# By data_host
data_host = None
query_string = None
describe = False
# Retrieve from data
try:
resp_dict = json.loads(str(request_info.raw_args['payload']))
except Exception as e:
resp_dict = None
if resp_dict is not None:
            try:
                describe = resp_dict['describe']
                if describe in ("true", "True", True):
                    describe = True
                else:
                    describe = False
            except Exception as e:
                describe = False
if not describe:
data_host = resp_dict['data_host']
else:
            # a body is required for this endpoint; if not submitted, describe the usage
describe = True
if describe:
response = "{\"describe\": \"This endpoint performs a permanent deletion of an existing data host, it requires a DELETE call with the following information:\""\
+ ", \"options\" : [ { "\
+ "\"data_host\": \"name of the data host\", "\
+ "\"update_comment\": \"OPTIONAL: a comment for the update, comments are added to the audit record, if unset will be defined to: API update\""\
+ " } ] }"
return {
"payload": json.dumps(json.loads(str(response)), indent=1),
'status': 200 # HTTP status code
}
# Update comment is optional and used for audit changes
try:
update_comment = resp_dict['update_comment']
except Exception as e:
update_comment = "API update"
# Define the KV query
query_string = '{ "data_host": "' + data_host + '" }'
# Get splunkd port
entity = splunk.entity.getEntity('/server', 'settings',
namespace='trackme', sessionKey=request_info.session_key, owner='-')
splunkd_port = entity['mgmtHostPort']
try:
# Data collection
collection_name = "kv_trackme_host_monitoring"
service = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection = service.kvstore[collection_name]
# Audit collection
collection_name_audit = "kv_trackme_audit_changes"
service_audit = client.connect(
owner="nobody",
app="trackme",
port=splunkd_port,
token=request_info.session_key
)
collection_audit = service_audit.kvstore[collection_name_audit]
# Get the current record
            # Note: the query returns an array; since we search for a specific record, we expect exactly one result
try:
record = collection.data.query(query=str(query_string))
key = record[0].get('_key')
except Exception as e:
key = None
# Render result
if key is not None and len(key)>2:
# Store the record for audit purposes
json_data = str(json.dumps(collection.data.query_by_id(key), indent=1))
# Remove the record
collection.data.delete(json.dumps({"_key":key}))
# Record an audit change
import time
current_time = int(round(time.time() * 1000))
user = request_info.user
try:
# Insert the record
collection_audit.data.insert(json.dumps({
"time": str(current_time),
"user": str(user),
"action": "success",
"change_type": "delete permanent",
"object": str(data_host),
"object_category": "data_host",
"object_attrs": str(json_data),
"result": "N/A",
"comment": str(update_comment)
}))
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
return {
"payload": "Record with _key " + str(key) + " was permanently deleted from the collection.",
'status': 200 # HTTP status code
}
else:
return {
"payload": 'Warn: resource not found or request is incorrect ' + str(query_string),
'status': 404 # HTTP status code
}
except Exception as e:
return {
'payload': 'Warn: exception encountered: ' + str(e) # Payload of the request.
}
# ---------------------------------------------------------------------------
# File: hackathon/mBrainAligner/src/3D_U-Net/model/unet_model.py
# Repo: zzhmark/vaa3d_tools (MIT license)
# ---------------------------------------------------------------------------
import numpy as np
import keras
from keras import backend as K
from keras.engine import Input, Model
from keras.layers import Conv3D, MaxPooling3D, AveragePooling3D, UpSampling3D, Add, Activation, BatchNormalization, \
    PReLU, Deconvolution3D, LeakyReLU, add, Dense, Flatten, GlobalAveragePooling3D, Reshape, Multiply, Subtract
from keras.optimizers import Adam,SGD
from keras.layers.core import Lambda
from model.base_block import ASPP_Block,SpatialChannelAttentionBlcok
from model.class_block import class_supervised_block
from model.base_block import create_convolution_block,get_up_convolution,ChannelSep,Conv_shortcut_Block,\
CurrentConvBlock,deeplab_atrous_Block
from metrics import dice_coefficient_loss, get_label_dice_coefficient_function, dice_coefficient,\
focal_loss,dice_coefficient1,dice_coefficient_loss1
K.set_image_data_format("channels_last")
keras.initializers.TruncatedNormal(mean=0.0, stddev=0.05, seed=None)  # note: the returned initializer is discarded, so this call has no effect on the layers below
try:
    # Keras 2.x location of the merge layers; the skip connections below rely on
    # concatenate() from this module
    from keras.layers.merge import concatenate
except ImportError:
    # fallback for very old Keras releases that exposed merge utilities in
    # keras.engine instead
    from keras.engine import merge
# -------------------- current unet -------------------- #
def current_unet_model(input_shape, pool_size=(2, 2, 2), n_labels=9, initial_learning_rate=0.00001, deconvolution=False,
TrainOP='Adam',
depth=4, batch_normalization=False, instance_normalization=True, activation_name="softmax",
include_label_wise_dice_coefficients=False, metrics=dice_coefficient):
print('the shape of the input : ', input_shape)
level_filters = [30, 60, 120, 240, 320, 480]
inputs = Input(input_shape)
current_layer = inputs
levels = list()
print("###", current_layer._keras_shape)
current_layer = CurrentConvBlock(input_layer=current_layer,
n_filters=level_filters[0],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='deconv_' + str(0))
levels.append(current_layer)
current_layer = MaxPooling3D(pool_size=pool_size, name='maxpool' + str(0))(current_layer)
current_layer = CurrentConvBlock(input_layer=current_layer,
n_filters=level_filters[1],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='deconv_' + str(1))
levels.append(current_layer)
current_layer = MaxPooling3D(pool_size=pool_size, name='maxpool' + str(1))(current_layer)
current_layer = CurrentConvBlock(input_layer=current_layer,
n_filters=level_filters[2],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='deconv_' + str(2))
levels.append(current_layer)
current_layer = MaxPooling3D(pool_size=pool_size, name='maxpool' + str(2))(current_layer)
current_layer = CurrentConvBlock(input_layer=current_layer,
n_filters=level_filters[3],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='deconv_' + str(3))
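    # Decoder: three up-convolution stages, each concatenated with the matching
    # encoder level (channels_last, hence axis=4) before a CurrentConvBlock refinement.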
current_layer = get_up_convolution(pool_size=pool_size,
deconvolution=deconvolution,
n_filters=level_filters[2],
name='Upsample' + str(2))(current_layer)
current_layer = concatenate([current_layer, levels[2]], axis=4)
current_layer = CurrentConvBlock(input_layer=current_layer,
n_filters=level_filters[2],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='upconv_' + str(2))
current_layer = get_up_convolution(pool_size=pool_size,
deconvolution=deconvolution,
n_filters=level_filters[1],
name='Upsample' + str(1))(current_layer)
current_layer = concatenate([current_layer, levels[1]], axis=4)
current_layer = CurrentConvBlock(input_layer=current_layer,
n_filters=level_filters[1],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='upconv_' + str(1))
current_layer = get_up_convolution(pool_size=pool_size,
deconvolution=deconvolution,
n_filters=level_filters[0],
name='Upsample' + str(0))(current_layer)
current_layer = concatenate([current_layer, levels[0]], axis=4)
current_layer = CurrentConvBlock(input_layer=current_layer,
n_filters=level_filters[0],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='upconv_' + str(0))
final_convolution = Conv3D(n_labels, (1, 1, 1), name='FinalConv1')(current_layer)
print("###", final_convolution._keras_shape)
act1 = Activation(activation_name, name='act1')(final_convolution)
model = Model(inputs=inputs, outputs=act1)
if TrainOP == 'Adam':
model.compile(optimizer=Adam(lr=initial_learning_rate), loss=dice_coefficient_loss, metrics=[metrics])
else:
model.compile(optimizer=SGD(lr=initial_learning_rate), loss=dice_coefficient_loss, metrics=[metrics])
return model
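# Hedged usage sketch for current_unet_model (the patch size and channel count are
# assumptions, not values fixed by this file): with channels_last data, a 64x64x64
# single-channel patch and 9 labels could be built as
#     model = current_unet_model(input_shape=(64, 64, 64, 1), n_labels=9,
#                                initial_learning_rate=1e-4, TrainOP='Adam')
#     model.summary()
# The spatial size must be divisible by 2**3 = 8 because of the three pooling stages.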
def FRnet(input_shape,input_shape1, pool_size=(2, 2, 2), n_labels=9, initial_learning_rate=0.00001, deconvolution=False,
TrainOP='Adam',depth=4, batch_normalization=False, instance_normalization=True, activation_name="softmax",
include_label_wise_dice_coefficients=False, metrics=dice_coefficient):
print('the shape of the input : ', input_shape)
level_filters = [32,64,128,256,512]
inputs = Input(input_shape)
inputs1 = Input(input_shape1)
current_layer1 = inputs1
current_layer = inputs
levels = list()
print("model2 : ", current_layer1._keras_shape)
# add levels with max pooling
    for layer_depth in range(depth):  # encoder levels 0 .. depth-1
if layer_depth > -1:
layer2 = create_convolution_block(input_layer=current_layer,
n_filters=level_filters[layer_depth],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name = 'conv1_' + str(layer_depth))
layer2 = create_convolution_block(input_layer=layer2,
n_filters=level_filters[layer_depth],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name = 'conv2_' + str(layer_depth))
print("###", layer2._keras_shape)
levels.append(layer2)
if layer_depth < depth - 1:
current_layer = MaxPooling3D(pool_size=pool_size,name = 'maxpool' + str(layer_depth))(layer2)
else:
# current_layer = create_convolution_block(input_layer=layer2,
# n_filters=level_filters[4],
# batch_normalization=batch_normalization,
# instance_normalization=instance_normalization,
# activation=LeakyReLU,
# name='conv3')
# print("###", current_layer._keras_shape)
# current_layer = create_convolution_block(input_layer=current_layer,
# n_filters=level_filters[5],
# batch_normalization=batch_normalization,
# instance_normalization=instance_normalization,
# activation=LeakyReLU,
# name='conv4')
# print("###", current_layer._keras_shape)
current_layer = layer2
# add levels with up-convolution or up-sampling
for layer_depth in range(depth - 2, -1, -1):
up_convolution = get_up_convolution(pool_size=pool_size,
deconvolution=deconvolution,
n_filters=level_filters[layer_depth],
name = 'Upsample' + str(layer_depth))(current_layer)
print("###",up_convolution._keras_shape)
# current_layer = create_convolution_block(n_filters=level_filters[layer_depth],
# input_layer=levels[layer_depth],
# batch_normalization=batch_normalization,
# instance_normalization=instance_normalization,
# activation=LeakyReLU,
# name='Up_conv3_' + str(layer_depth))
concat = concatenate([up_convolution, levels[layer_depth]], axis=4)
current_layer = create_convolution_block(n_filters=level_filters[layer_depth],
input_layer=concat,
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name = 'Up_conv1_' + str(layer_depth))
current_layer = create_convolution_block(n_filters=level_filters[layer_depth],
input_layer=current_layer,
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name = 'Up_conv2_' + str(layer_depth))
if include_label_wise_dice_coefficients and n_labels > 1:
label_wise_dice_metrics = [get_label_dice_coefficient_function(index) for index in range(n_labels)]
if metrics:
metrics = metrics + label_wise_dice_metrics
else:
metrics = label_wise_dice_metrics
if not isinstance(metrics, list):
metrics = [metrics]
# current_layer1 = ChannelSep(current_layer1,
# batch_normalization=batch_normalization,
# instance_normalization=instance_normalization,
# name='CS1', nlayers=4,
# nfilters=8)
# print("###", current_layer1._keras_shape)
# current_layer1 = ChannelSep(current_layer1,
# batch_normalization=batch_normalization,
# instance_normalization=instance_normalization,
# name='CS3', nlayers=4,
# nfilters=8)
# print("###", current_layer1._keras_shape)
#
current_layer1 = create_convolution_block(input_layer=current_layer1,
n_filters=level_filters[0],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='convA1')
print("###", current_layer1._keras_shape)
current_layer1 = create_convolution_block(input_layer=current_layer1,
n_filters=level_filters[0],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='convA2')
print("###", current_layer1._keras_shape)
current_layer1 = concatenate([current_layer, current_layer1], axis=4)
# current_layer1 = SpatialChannelAttentionBlcok(current_layer1, level_filters[0], name='SC2', layernumbers=3)
# current_layer1 = ASPP_Block(current_layer1,name = 'ASPP1',n_filters = 30)
final_convolution1 = Conv3D(n_labels, (1, 1, 1), name='FinalConv1')(current_layer)
final_convolution2 = Conv3D(n_labels, (1, 1, 1), name='FinalConv3')(current_layer1)
print("###", final_convolution2._keras_shape)
act1 = Activation(activation_name, name='act1')(final_convolution1)
act2 = Activation(activation_name, name='act2')(final_convolution2)
model = Model(inputs=[inputs,inputs1], outputs=[act1,act2])
if TrainOP == 'Adam':
model.compile(optimizer=Adam(lr=initial_learning_rate),
loss={'act1': dice_coefficient_loss,
'act2': dice_coefficient_loss},
loss_weights={'act1': 1.,
'act2': 1.0},
metrics=metrics)
else:
model.compile(optimizer=SGD(lr=initial_learning_rate),
loss={'act1': dice_coefficient_loss,
'act2': dice_coefficient_loss},
loss_weights={'act1': 1.,
'act2': 1.0},
metrics=metrics)
return model
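# Hedged note on FRnet: it consumes two inputs (the main volume and an auxiliary
# volume of shape input_shape1, whose spatial size must match the decoder output
# for the final concatenate to work) and emits two softmax maps, act1 from the
# plain U-Net decoder and act2 from the decoder features fused with the auxiliary
# branch. Both heads are trained against the same label volume with equal Dice
# weights, so a fit call passes lists (shapes are assumptions):
#     model.fit([x, x_aux], [y, y], batch_size=1, epochs=n_epochs)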
def FRnet_C(input_shape,input_shape1, pool_size=(2, 2, 2), n_labels=9, initial_learning_rate=0.00001, deconvolution=False,
TrainOP='Adam',depth=4, batch_normalization=False, instance_normalization=True, activation_name="softmax",
include_label_wise_dice_coefficients=False, metrics=dice_coefficient):
print('the shape of the input : ', input_shape)
level_filters = [32,64,128,256,512]
inputs = Input(input_shape)
inputs1 = Input(input_shape1)
current_layer1 = inputs1
current_layer = inputs
levels = list()
print("model2 : ", current_layer1._keras_shape)
# add levels with max pooling
    for layer_depth in range(depth):  # encoder levels 0 .. depth-1
if layer_depth > -1:
layer2 = create_convolution_block(input_layer=current_layer,
n_filters=level_filters[layer_depth],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name = 'conv1_' + str(layer_depth))
layer2 = create_convolution_block(input_layer=layer2,
n_filters=level_filters[layer_depth],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name = 'conv2_' + str(layer_depth))
print("###", layer2._keras_shape)
levels.append(layer2)
if layer_depth < depth - 1:
current_layer = MaxPooling3D(pool_size=pool_size,name = 'maxpool' + str(layer_depth))(layer2)
else:
current_layer = layer2
# add levels with up-convolution or up-sampling
for layer_depth in range(depth - 2, -1, -1):
up_convolution = get_up_convolution(pool_size=pool_size,
deconvolution=deconvolution,
n_filters=level_filters[layer_depth],
name = 'Upsample' + str(layer_depth))(current_layer)
print("###",up_convolution._keras_shape)
concat = concatenate([up_convolution, levels[layer_depth]], axis=4)
current_layer = create_convolution_block(n_filters=level_filters[layer_depth],
input_layer=concat,
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name = 'Up_conv1_' + str(layer_depth))
current_layer = create_convolution_block(n_filters=level_filters[layer_depth],
input_layer=current_layer,
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name = 'Up_conv2_' + str(layer_depth))
if include_label_wise_dice_coefficients and n_labels > 1:
label_wise_dice_metrics = [get_label_dice_coefficient_function(index) for index in range(n_labels)]
if metrics:
metrics = metrics + label_wise_dice_metrics
else:
metrics = label_wise_dice_metrics
if not isinstance(metrics, list):
metrics = [metrics]
current_layer1 = create_convolution_block(input_layer=current_layer1,
n_filters=level_filters[0],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='convA1')
print("###", current_layer1._keras_shape)
current_layer1 = create_convolution_block(input_layer=current_layer1,
n_filters=level_filters[0],
batch_normalization=batch_normalization,
instance_normalization=instance_normalization,
activation=LeakyReLU,
name='convA2')
print("###", current_layer1._keras_shape)
current_layer1 = concatenate([current_layer, current_layer1], axis=4)
# current_layer1 = SpatialChannelAttentionBlcok(current_layer1, level_filters[0], name='SC2', layernumbers=3)
# current_layer1 = ASPP_Block(current_layer1,name = 'ASPP1',n_filters = 30)
final_convolution1 = Conv3D(n_labels, (1, 1, 1), name='FinalConv1')(current_layer)
final_convolution2 = Conv3D(n_labels, (1, 1, 1), name='FinalConv3')(current_layer1)
# final_convolution3, final_convolution4, final_convolution5 = class_supervised_block(final_convolution1,
# nbasefilters=n_labels,name='CS')
final_convolution6, final_convolution7, final_convolution8 = class_supervised_block(final_convolution2,
nbasefilters=n_labels, name='CS1')
print("###", final_convolution2._keras_shape)
act1 = Activation(activation_name, name='act1')(final_convolution1)
act2 = Activation(activation_name, name='act2')(final_convolution2)
# w_act_x1 = Activation(activation_name, name='act_x1')(final_convolution3)
# w_act_y1 = Activation(activation_name, name='act_y1')(final_convolution4)
# w_act_z1 = Activation(activation_name, name='act_z1')(final_convolution5)
w_act_x2 = Activation(activation_name, name='act_x2')(final_convolution6)
w_act_y2 = Activation(activation_name, name='act_y2')(final_convolution7)
w_act_z2 = Activation(activation_name, name='act_z2')(final_convolution8)
model = Model(inputs=[inputs,inputs1], outputs=[act1,act2, w_act_x2, w_act_y2, w_act_z2])
# model = Model(inputs=[inputs, inputs1],
# outputs=[act1, act2, w_act_x1, w_act_y1, w_act_z1, w_act_x2, w_act_y2, w_act_z2])
    beishu = 0.5  # "beishu" (Chinese pinyin for "multiplier"): weight of the auxiliary class-supervision losses
if TrainOP == 'Adam':
model.compile(optimizer=Adam(lr=initial_learning_rate),
loss={'act1': dice_coefficient_loss,
'act2': dice_coefficient_loss,
'act_x2': dice_coefficient_loss1,
'act_y2': dice_coefficient_loss1,
'act_z2': dice_coefficient_loss1},
loss_weights={'act1': 1.,
'act2': 1.,
'act_x2': 1.0*beishu,
'act_y2': 1.0*beishu,
'act_z2': 1.0*beishu},
metrics=metrics)
else:
model.compile(optimizer=SGD(lr=initial_learning_rate),
loss={'act1': dice_coefficient_loss,
'act2': dice_coefficient_loss,
'act_x2': dice_coefficient_loss1,
'act_y2': dice_coefficient_loss1,
'act_z2': dice_coefficient_loss1},
loss_weights={'act1': 1.,
'act2': 1.,
'act_x2': 1.0*beishu,
'act_y2': 1.0*beishu,
'act_z2': 1.0*beishu},
metrics=metrics)
# if TrainOP == 'Adam':
# model.compile(optimizer=Adam(lr=initial_learning_rate),
# loss={'act1': dice_coefficient_loss,
# 'act2': dice_coefficient_loss,
# 'act_x1': dice_coefficient_loss1,
# 'act_y1': dice_coefficient_loss1,
# 'act_z1': dice_coefficient_loss1,
# 'act_x2': dice_coefficient_loss1,
# 'act_y2': dice_coefficient_loss1,
# 'act_z2': dice_coefficient_loss1},
# loss_weights={'act1': 1.,
# 'act2': 1.,
# 'act_x1': 1.0 * beishu,
# 'act_y1': 1.0 * beishu,
# 'act_z1': 1.0 * beishu,
# 'act_x2': 1.0 * beishu,
# 'act_y2': 1.0 * beishu,
# 'act_z2': 1.0 * beishu},
# metrics=metrics)
# else:
# model.compile(optimizer=SGD(lr=initial_learning_rate),
# loss={'act1': dice_coefficient_loss,
# 'act2': dice_coefficient_loss,
# 'act_x1': dice_coefficient_loss1,
# 'act_y1': dice_coefficient_loss1,
# 'act_z1': dice_coefficient_loss1,
# 'act_x2': dice_coefficient_loss1,
# 'act_y2': dice_coefficient_loss1,
# 'act_z2': dice_coefficient_loss1},
# loss_weights={'act1': 1.,
# 'act2': 1.,
# 'act_x1': 1.0 * beishu,
# 'act_y1': 1.0 * beishu,
# 'act_z1': 1.0 * beishu,
# 'act_x2': 1.0 * beishu,
# 'act_y2': 1.0 * beishu,
# 'act_z2': 1.0 * beishu},
# metrics=metrics)
return model
# -------------------- unet -------------------- #
def unet_model_3d(input_shape, pool_size=(2, 2, 2), n_labels=9, initial_learning_rate=0.00001, deconvolution=False, TrainOP='Adam',
                  depth=4, batch_normalization=False, instance_normalization=True, activation_name="softmax",
                  include_label_wise_dice_coefficients=False, metrics=dice_coefficient):
    print('the shape of the input: ', input_shape)
    level_filters = [32, 64, 128, 256, 512]
# create and compile model
inputs = Input(input_shape)
current_layer = inputs
levels = list()
print("###", current_layer._keras_shape)
# add levels with max pooling
    for layer_depth in range(depth):  # layer_depth = 0, 1, ..., depth - 1
        if layer_depth > -1:  # always true for non-negative layer_depth; kept as in the source
            layer2 = create_convolution_block(input_layer=current_layer,
                                              n_filters=level_filters[layer_depth],
                                              batch_normalization=batch_normalization,
                                              instance_normalization=instance_normalization,
                                              activation=LeakyReLU,
                                              name='conv1_' + str(layer_depth))
            layer2 = create_convolution_block(input_layer=layer2,
                                              n_filters=level_filters[layer_depth],
                                              batch_normalization=batch_normalization,
                                              instance_normalization=instance_normalization,
                                              activation=LeakyReLU,
                                              name='conv2_' + str(layer_depth))
print("###", layer2._keras_shape)
levels.append(layer2)
if layer_depth < depth - 1:
            current_layer = MaxPooling3D(pool_size=pool_size, name='maxpool' + str(layer_depth))(layer2)
else:
# current_layer = create_convolution_block(input_layer=layer2,
# n_filters=level_filters[4],
# batch_normalization=batch_normalization,
# instance_normalization=instance_normalization,
# activation=LeakyReLU,
# name='conv3')
# print("###", current_layer._keras_shape)
# current_layer = create_convolution_block(input_layer=current_layer,
# n_filters=level_filters[5],
# batch_normalization=batch_normalization,
# instance_normalization=instance_normalization,
# activation=LeakyReLU,
# name='conv4')
# print("###", current_layer._keras_shape)
current_layer = layer2
# add levels with up-convolution or up-sampling
for layer_depth in range(depth - 2, -1, -1):
        up_convolution = get_up_convolution(pool_size=pool_size,
                                            deconvolution=deconvolution,
                                            n_filters=level_filters[layer_depth],
                                            name='Upsample' + str(layer_depth))(current_layer)
        print("###", up_convolution._keras_shape)
# current_layer = create_convolution_block(n_filters=level_filters[layer_depth],
# input_layer=levels[layer_depth],
# batch_normalization=batch_normalization,
# instance_normalization=instance_normalization,
# activation=LeakyReLU,
# name='Up_conv3_' + str(layer_depth))
concat = concatenate([up_convolution, levels[layer_depth]], axis=4)
        current_layer = create_convolution_block(n_filters=level_filters[layer_depth],
                                                 input_layer=concat,
                                                 batch_normalization=batch_normalization,
                                                 instance_normalization=instance_normalization,
                                                 activation=LeakyReLU,
                                                 name='Up_conv1_' + str(layer_depth))
        current_layer = create_convolution_block(n_filters=level_filters[layer_depth],
                                                 input_layer=current_layer,
                                                 batch_normalization=batch_normalization,
                                                 instance_normalization=instance_normalization,
                                                 activation=LeakyReLU,
                                                 name='Up_conv2_' + str(layer_depth))
if include_label_wise_dice_coefficients and n_labels > 1:
label_wise_dice_metrics = [get_label_dice_coefficient_function(index) for index in range(n_labels)]
if metrics:
metrics = metrics + label_wise_dice_metrics
else:
metrics = label_wise_dice_metrics
if not isinstance(metrics, list):
metrics = [metrics]
    final_convolution = Conv3D(n_labels, (1, 1, 1), name='FinalConv1')(current_layer)
    print("###", final_convolution._keras_shape)
    act1 = Activation(activation_name, name='act1')(final_convolution)
model = Model(inputs=inputs, outputs=act1)
if TrainOP == 'Adam':
model.compile(optimizer=Adam(lr=initial_learning_rate), loss=dice_coefficient_loss, metrics=metrics)
else:
model.compile(optimizer=SGD(lr=initial_learning_rate), loss=dice_coefficient_loss, metrics=metrics)
#print(model.summary())
return model
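# A brief usage sketch (not from the original file): the shape below is an
# illustrative assumption for a channels_last 5D tensor, which is what the
# axis=4 concatenations above require; with depth=4 there are three poolings,
# so each spatial dimension must be divisible by 2**(depth - 1) = 8.
#
#   model = unet_model_3d(input_shape=(144, 144, 144, 1),
#                         n_labels=9, depth=4, TrainOP='Adam')
#   model.summary()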
| 57.613139 | 132 | 0.521506 | 2,718 | 31,572 | 5.710449 | 0.075055 | 0.061852 | 0.135816 | 0.0451 | 0.912248 | 0.894981 | 0.891566 | 0.891566 | 0.878487 | 0.876425 | 0 | 0.028044 | 0.399151 | 31,572 | 547 | 133 | 57.718464 | 0.790142 | 0.208856 | 0 | 0.852547 | 0 | 0 | 0.029597 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010724 | false | 0 | 0.037534 | 0 | 0.058981 | 0.058981 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cfd1e1e166ddf516f7f4bfd47b68f3a4239592b7 | 112 | py | Python | back/api/serializers/__init__.py | QLevaslot/TheBigPicture | 98a37e619ba8e4dee6a4113742a652f3d7a928c4 | ["MIT"] | 2 | 2020-08-10T11:49:22.000Z | 2020-08-10T11:49:52.000Z | back/api/serializers/__init__.py | QLevaslot/TheBigPicture | 98a37e619ba8e4dee6a4113742a652f3d7a928c4 | ["MIT"] | null | null | null | back/api/serializers/__init__.py | QLevaslot/TheBigPicture | 98a37e619ba8e4dee6a4113742a652f3d7a928c4 | ["MIT"] | null | null | null |
from api.serializers.bigpicture import *
from api.serializers.vote import *
from api.serializers.user import *
| 22.4 | 40 | 0.803571 | 15 | 112 | 6 | 0.466667 | 0.233333 | 0.6 | 0.533333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116071 | 112 | 4 | 41 | 28 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
32066dfd81ff8a00e9490bd1168ee188a8555602 | 86,760 | py | Python | metal/models/ip_addresses_api.py | displague/metal-python | 96e64e9ac41025d85ff6f61693165e29e1c366db | ["MIT"] | null | null | null | metal/models/ip_addresses_api.py | displague/metal-python | 96e64e9ac41025d85ff6f61693165e29e1c366db | ["MIT"] | 3 | 2021-09-27T05:10:36.000Z | 2021-09-27T06:10:57.000Z | metal/models/ip_addresses_api.py | displague/metal-python | 96e64e9ac41025d85ff6f61693165e29e1c366db | ["MIT"] | null | null | null | # coding: utf-8
"""
Metal API
This is the API for Equinix Metal. The API allows you to programmatically interact with all of your Equinix Metal resources, including devices, networks, addresses, organizations, projects, and your user account. The official API docs are hosted at <https://metal.equinix.com/developers/api>. # noqa: E501
The version of the OpenAPI document: 1.0.0
Contact: support@equinixmetal.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from metal.api_client import ApiClient
from metal.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class IPAddressesApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
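    # Hypothetical setup sketch (not part of the generated file). The
    # Configuration/api_key names follow the usual openapi-generator python
    # layout and are assumptions here; only ApiClient is imported by this
    # module.
    #
    #   from metal.api_client import ApiClient
    #   from metal.configuration import Configuration
    #
    #   config = Configuration()
    #   config.api_key['x_auth_token'] = 'YOUR_METAL_TOKEN'  # placeholder
    #   api = IPAddressesApi(ApiClient(config))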
def create_ip_assignment(self, id, ip_assignment, **kwargs): # noqa: E501
"""Create an ip assignment # noqa: E501
Creates an ip assignment for a device. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_ip_assignment(id, ip_assignment, async_req=True)
>>> result = thread.get()
:param id: Device UUID (required)
:type id: str
:param ip_assignment: IPAssignment to create (required)
:type ip_assignment: IPAssignmentInput
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: IPAssignment
"""
kwargs['_return_http_data_only'] = True
return self.create_ip_assignment_with_http_info(id, ip_assignment, **kwargs) # noqa: E501
def create_ip_assignment_with_http_info(self, id, ip_assignment, **kwargs): # noqa: E501
"""Create an ip assignment # noqa: E501
Creates an ip assignment for a device. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_ip_assignment_with_http_info(id, ip_assignment, async_req=True)
>>> result = thread.get()
:param id: Device UUID (required)
:type id: str
:param ip_assignment: IPAssignment to create (required)
:type ip_assignment: IPAssignmentInput
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(IPAssignment, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'ip_assignment'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_ip_assignment" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `create_ip_assignment`") # noqa: E501
# verify the required parameter 'ip_assignment' is set
if self.api_client.client_side_validation and ('ip_assignment' not in local_var_params or # noqa: E501
local_var_params['ip_assignment'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `ip_assignment` when calling `create_ip_assignment`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'ip_assignment' in local_var_params:
body_params = local_var_params['ip_assignment']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
201: "IPAssignment",
401: "Error",
404: "Error",
422: "Error",
}
return self.api_client.call_api(
'/devices/{id}/ips', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
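    # Hypothetical call sketch for the async_req pattern documented above
    # (device_id and ip_assignment are placeholders; the request runs on a
    # worker thread and thread.get() joins it for the IPAssignment result):
    #
    #   thread = api.create_ip_assignment(device_id, ip_assignment,
    #                                     async_req=True)
    #   assignment = thread.get()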
def create_self_service_reservation(self, project_id, reservation, **kwargs): # noqa: E501
"""Create a reservation # noqa: E501
Creates a reservation. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_self_service_reservation(project_id, reservation, async_req=True)
>>> result = thread.get()
:param project_id: Project UUID (required)
:type project_id: str
:param reservation: reservation to create (required)
:type reservation: CreateSelfServiceReservationRequest
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: SelfServiceReservationResponse
"""
kwargs['_return_http_data_only'] = True
return self.create_self_service_reservation_with_http_info(project_id, reservation, **kwargs) # noqa: E501
def create_self_service_reservation_with_http_info(self, project_id, reservation, **kwargs): # noqa: E501
"""Create a reservation # noqa: E501
Creates a reservation. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_self_service_reservation_with_http_info(project_id, reservation, async_req=True)
>>> result = thread.get()
:param project_id: Project UUID (required)
:type project_id: str
:param reservation: reservation to create (required)
:type reservation: CreateSelfServiceReservationRequest
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(SelfServiceReservationResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'project_id',
'reservation'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_self_service_reservation" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `create_self_service_reservation`") # noqa: E501
# verify the required parameter 'reservation' is set
if self.api_client.client_side_validation and ('reservation' not in local_var_params or # noqa: E501
local_var_params['reservation'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `reservation` when calling `create_self_service_reservation`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['project_id'] = local_var_params['project_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'reservation' in local_var_params:
body_params = local_var_params['reservation']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
201: "SelfServiceReservationResponse",
401: "Error",
422: "Error",
}
return self.api_client.call_api(
'/projects/{project_id}/self-service/reservations', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def delete_ip_address(self, id, **kwargs): # noqa: E501
"""Unassign an ip address # noqa: E501
        Note! This call can be used to un-assign an IP assignment or delete an IP reservation. Un-assign an IP address record. Use the assignment UUID you get after attaching the IP. This will remove the relationship between an IP and the device and will make the IP address available to be assigned to another device. Delete an IP reservation. Use the reservation UUID you get after adding the IP to the project. This will permanently delete the IP block reservation from the project. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_ip_address(id, async_req=True)
>>> result = thread.get()
:param id: IP Address UUID (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.delete_ip_address_with_http_info(id, **kwargs) # noqa: E501
def delete_ip_address_with_http_info(self, id, **kwargs): # noqa: E501
"""Unassign an ip address # noqa: E501
        Note! This call can be used to un-assign an IP assignment or delete an IP reservation. Un-assign an IP address record. Use the assignment UUID you get after attaching the IP. This will remove the relationship between an IP and the device and will make the IP address available to be assigned to another device. Delete an IP reservation. Use the reservation UUID you get after adding the IP to the project. This will permanently delete the IP block reservation from the project. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_ip_address_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: IP Address UUID (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_ip_address" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `delete_ip_address`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/ips/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def find_ip_address_by_id(self, id, **kwargs): # noqa: E501
"""Retrieve an ip address # noqa: E501
Returns a single ip address if the user has access. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_address_by_id(id, async_req=True)
>>> result = thread.get()
:param id: IP Address UUID (required)
:type id: str
        :param include: Nested attributes to include. Included objects will return their full attributes. Attribute names can be dotted (up to 3 levels) to include deeply nested objects.
:type include: list[str]
:param exclude: Nested attributes to exclude. Excluded objects will return only the href attribute. Attribute names can be dotted (up to 3 levels) to exclude deeply nested objects.
:type exclude: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: IPAssignment
"""
kwargs['_return_http_data_only'] = True
return self.find_ip_address_by_id_with_http_info(id, **kwargs) # noqa: E501
def find_ip_address_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve an ip address # noqa: E501
Returns a single ip address if the user has access. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_address_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: IP Address UUID (required)
:type id: str
        :param include: Nested attributes to include. Included objects will return their full attributes. Attribute names can be dotted (up to 3 levels) to include deeply nested objects.
:type include: list[str]
:param exclude: Nested attributes to exclude. Excluded objects will return only the href attribute. Attribute names can be dotted (up to 3 levels) to exclude deeply nested objects.
:type exclude: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(IPAssignment, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'include',
'exclude'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method find_ip_address_by_id" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `find_ip_address_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'include' in local_var_params and local_var_params['include'] is not None: # noqa: E501
query_params.append(('include', local_var_params['include'])) # noqa: E501
collection_formats['include'] = 'csv' # noqa: E501
if 'exclude' in local_var_params and local_var_params['exclude'] is not None: # noqa: E501
query_params.append(('exclude', local_var_params['exclude'])) # noqa: E501
collection_formats['exclude'] = 'csv' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
200: "IPAssignment",
401: "Error",
403: "Error",
404: "Error",
}
return self.api_client.call_api(
'/ips/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def find_ip_address_customdata(self, id, **kwargs): # noqa: E501
"""Retrieve the custom metadata of an IP Reservation or IP Assignment # noqa: E501
        Provides the custom metadata stored for this IP Reservation or IP Assignment in JSON format # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_address_customdata(id, async_req=True)
>>> result = thread.get()
:param id: Ip Reservation UUID (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.find_ip_address_customdata_with_http_info(id, **kwargs) # noqa: E501
def find_ip_address_customdata_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve the custom metadata of an IP Reservation or IP Assignment # noqa: E501
        Provides the custom metadata stored for this IP Reservation or IP Assignment in JSON format # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_address_customdata_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Ip Reservation UUID (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method find_ip_address_customdata" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `find_ip_address_customdata`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/ips/{id}/customdata', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def find_ip_assignments(self, id, **kwargs): # noqa: E501
"""Retrieve all ip assignments # noqa: E501
Returns all ip assignments for a device. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_assignments(id, async_req=True)
>>> result = thread.get()
:param id: Device UUID (required)
:type id: str
        :param include: Nested attributes to include. Included objects will return their full attributes. Attribute names can be dotted (up to 3 levels) to include deeply nested objects.
:type include: list[str]
:param exclude: Nested attributes to exclude. Excluded objects will return only the href attribute. Attribute names can be dotted (up to 3 levels) to exclude deeply nested objects.
:type exclude: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: IPAssignmentList
"""
kwargs['_return_http_data_only'] = True
return self.find_ip_assignments_with_http_info(id, **kwargs) # noqa: E501
def find_ip_assignments_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve all ip assignments # noqa: E501
Returns all ip assignments for a device. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_assignments_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Device UUID (required)
:type id: str
        :param include: Nested attributes to include. Included objects will return their full attributes. Attribute names can be dotted (up to 3 levels) to include deeply nested objects.
:type include: list[str]
:param exclude: Nested attributes to exclude. Excluded objects will return only the href attribute. Attribute names can be dotted (up to 3 levels) to exclude deeply nested objects.
:type exclude: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(IPAssignmentList, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'include',
'exclude'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method find_ip_assignments" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `find_ip_assignments`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'include' in local_var_params and local_var_params['include'] is not None: # noqa: E501
query_params.append(('include', local_var_params['include'])) # noqa: E501
collection_formats['include'] = 'csv' # noqa: E501
if 'exclude' in local_var_params and local_var_params['exclude'] is not None: # noqa: E501
query_params.append(('exclude', local_var_params['exclude'])) # noqa: E501
collection_formats['exclude'] = 'csv' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
200: "IPAssignmentList",
401: "Error",
404: "Error",
}
return self.api_client.call_api(
'/devices/{id}/ips', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def find_ip_availabilities(self, id, cidr, **kwargs): # noqa: E501
"""Retrieve all available subnets of a particular reservation # noqa: E501
        Provides a list of available subnets of a particular IP reservation. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_availabilities(id, cidr, async_req=True)
>>> result = thread.get()
:param id: IP Reservation UUID (required)
:type id: str
:param cidr: Size of subnets in bits (required)
:type cidr: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: IPAvailabilitiesList
"""
kwargs['_return_http_data_only'] = True
return self.find_ip_availabilities_with_http_info(id, cidr, **kwargs) # noqa: E501
def find_ip_availabilities_with_http_info(self, id, cidr, **kwargs): # noqa: E501
"""Retrieve all available subnets of a particular reservation # noqa: E501
        Provides a list of available subnets of a particular IP reservation. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_availabilities_with_http_info(id, cidr, async_req=True)
>>> result = thread.get()
:param id: IP Reservation UUID (required)
:type id: str
:param cidr: Size of subnets in bits (required)
:type cidr: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(IPAvailabilitiesList, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'cidr'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method find_ip_availabilities" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `find_ip_availabilities`") # noqa: E501
# verify the required parameter 'cidr' is set
if self.api_client.client_side_validation and ('cidr' not in local_var_params or # noqa: E501
local_var_params['cidr'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `cidr` when calling `find_ip_availabilities`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'cidr' in local_var_params and local_var_params['cidr'] is not None: # noqa: E501
query_params.append(('cidr', local_var_params['cidr'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
200: "IPAvailabilitiesList",
401: "Error",
403: "Error",
404: "Error",
}
return self.api_client.call_api(
'/ips/{id}/available', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
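    # Hypothetical call sketch: cidr selects the size (in bits) of the free
    # subnets to list within the reservation, passed as a string per the
    # signature above (the reservation UUID is a placeholder):
    #
    #   availabilities = api.find_ip_availabilities(reservation_id, '28')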
def find_ip_reservations(self, id, **kwargs): # noqa: E501
"""Retrieve all ip reservations # noqa: E501
        Provides a list of IP reservations for a single project. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_reservations(id, async_req=True)
>>> result = thread.get()
:param id: Project UUID (required)
:type id: str
:param include: Nested attributes to include. Included objects will return their full attributes. Attribute names can be dotted (up to 3 levels) to included deeply nested objects.
:type include: list[str]
:param exclude: Nested attributes to exclude. Excluded objects will return only the href attribute. Attribute names can be dotted (up to 3 levels) to exclude deeply nested objects.
:type exclude: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: IPReservationList
"""
kwargs['_return_http_data_only'] = True
return self.find_ip_reservations_with_http_info(id, **kwargs) # noqa: E501
def find_ip_reservations_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve all ip reservations # noqa: E501
        Provides a list of IP reservations for a single project. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_ip_reservations_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Project UUID (required)
:type id: str
:param include: Nested attributes to include. Included objects will return their full attributes. Attribute names can be dotted (up to 3 levels) to included deeply nested objects.
:type include: list[str]
:param exclude: Nested attributes to exclude. Excluded objects will return only the href attribute. Attribute names can be dotted (up to 3 levels) to exclude deeply nested objects.
:type exclude: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(IPReservationList, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'include',
'exclude'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method find_ip_reservations" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `find_ip_reservations`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'include' in local_var_params and local_var_params['include'] is not None: # noqa: E501
query_params.append(('include', local_var_params['include'])) # noqa: E501
collection_formats['include'] = 'csv' # noqa: E501
if 'exclude' in local_var_params and local_var_params['exclude'] is not None: # noqa: E501
query_params.append(('exclude', local_var_params['exclude'])) # noqa: E501
collection_formats['exclude'] = 'csv' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
200: "IPReservationList",
401: "Error",
403: "Error",
404: "Error",
}
return self.api_client.call_api(
'/projects/{id}/ips', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
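    # Hypothetical call sketch for the include/exclude query parameters;
    # both lists are serialized as CSV (see collection_formats above), and
    # the attribute names here are illustrative only:
    #
    #   reservations = api.find_ip_reservations(
    #       project_id, include=['facility'], exclude=['project'])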
def find_self_service_reservation(self, id, project_id, **kwargs): # noqa: E501
"""Retrieve a reservation # noqa: E501
Returns a reservation # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_self_service_reservation(id, project_id, async_req=True)
>>> result = thread.get()
:param id: Reservation short_id (required)
:type id: str
:param project_id: Project UUID (required)
:type project_id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: SelfServiceReservationResponse
"""
kwargs['_return_http_data_only'] = True
return self.find_self_service_reservation_with_http_info(id, project_id, **kwargs) # noqa: E501
def find_self_service_reservation_with_http_info(self, id, project_id, **kwargs): # noqa: E501
"""Retrieve a reservation # noqa: E501
Returns a reservation # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_self_service_reservation_with_http_info(id, project_id, async_req=True)
>>> result = thread.get()
:param id: Reservation short_id (required)
:type id: str
:param project_id: Project UUID (required)
:type project_id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(SelfServiceReservationResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'project_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method find_self_service_reservation" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `find_self_service_reservation`") # noqa: E501
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `find_self_service_reservation`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
if 'project_id' in local_var_params:
path_params['project_id'] = local_var_params['project_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
200: "SelfServiceReservationResponse",
401: "Error",
404: "Error",
}
return self.api_client.call_api(
'/projects/{project_id}/self-service/reservations/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def find_self_service_reservations(self, project_id, **kwargs): # noqa: E501
"""Retrieve all reservations # noqa: E501
Returns all reservations. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_self_service_reservations(project_id, async_req=True)
>>> result = thread.get()
:param project_id: Project UUID (required)
:type project_id: str
:param page: Page to return
:type page: int
:param per_page: Items returned per page
:type per_page: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: SelfServiceReservationList
"""
kwargs['_return_http_data_only'] = True
return self.find_self_service_reservations_with_http_info(project_id, **kwargs) # noqa: E501
def find_self_service_reservations_with_http_info(self, project_id, **kwargs): # noqa: E501
"""Retrieve all reservations # noqa: E501
Returns all reservations. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_self_service_reservations_with_http_info(project_id, async_req=True)
>>> result = thread.get()
:param project_id: Project UUID (required)
:type project_id: str
:param page: Page to return
:type page: int
:param per_page: Items returned per page
:type per_page: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: if True, return the response data only,
                                       without the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(SelfServiceReservationList, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'project_id',
'page',
'per_page'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method find_self_service_reservations" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'project_id' is set
if self.api_client.client_side_validation and ('project_id' not in local_var_params or # noqa: E501
local_var_params['project_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_id` when calling `find_self_service_reservations`") # noqa: E501
if self.api_client.client_side_validation and 'page' in local_var_params and local_var_params['page'] > 100000: # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `find_self_service_reservations`, must be a value less than or equal to `100000`") # noqa: E501
if self.api_client.client_side_validation and 'page' in local_var_params and local_var_params['page'] < 1: # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `find_self_service_reservations`, must be a value greater than or equal to `1`") # noqa: E501
if self.api_client.client_side_validation and 'per_page' in local_var_params and local_var_params['per_page'] > 1000: # noqa: E501
raise ApiValueError("Invalid value for parameter `per_page` when calling `find_self_service_reservations`, must be a value less than or equal to `1000`") # noqa: E501
if self.api_client.client_side_validation and 'per_page' in local_var_params and local_var_params['per_page'] < 1: # noqa: E501
raise ApiValueError("Invalid value for parameter `per_page` when calling `find_self_service_reservations`, must be a value greater than or equal to `1`") # noqa: E501
collection_formats = {}
path_params = {}
if 'project_id' in local_var_params:
path_params['project_id'] = local_var_params['project_id'] # noqa: E501
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'per_page' in local_var_params and local_var_params['per_page'] is not None: # noqa: E501
query_params.append(('per_page', local_var_params['per_page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
200: "SelfServiceReservationList",
401: "Error",
}
return self.api_client.call_api(
'/projects/{project_id}/self-service/reservations', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
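    # Pagination sketch (illustrative only; the attribute name on the returned
    # SelfServiceReservationList is an assumption):
    #   page = 1
    #   while page <= 100000:  # upper bound enforced by the validation above
    #       batch = api.find_self_service_reservations(project_id, page=page,
    #                                                  per_page=1000)
    #       if not batch.reservations:
    #           break
    #       page += 1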
def request_ip_reservation(self, id, ip_reservation_request, **kwargs): # noqa: E501
"""Requesting IP reservations # noqa: E501
        Request more IP space for a project in order to have additional IP addresses to assign to devices. If the request is within the max quota, an IP reservation will be created. If the request would exceed the project's IP quota, it will be submitted for review and will return an IP Reservation with a `state` of `pending`. You can have the request fail automatically with HTTP status 422, instead of triggering the review process, by setting the `fail_on_approval_required` parameter to `true` in the request. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.request_ip_reservation(id, ip_reservation_request, async_req=True)
>>> result = thread.get()
:param id: Project UUID (required)
:type id: str
:param ip_reservation_request: IP Reservation Request to create (required)
:type ip_reservation_request: IPReservationRequestInput
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: IPReservation
"""
kwargs['_return_http_data_only'] = True
return self.request_ip_reservation_with_http_info(id, ip_reservation_request, **kwargs) # noqa: E501
def request_ip_reservation_with_http_info(self, id, ip_reservation_request, **kwargs): # noqa: E501
"""Requesting IP reservations # noqa: E501
        Request more IP space for a project in order to have additional IP addresses to assign to devices. If the request is within the max quota, an IP reservation will be created. If the request would exceed the project's IP quota, it will be submitted for review and will return an IP Reservation with a `state` of `pending`. You can have the request fail automatically with HTTP status 422, instead of triggering the review process, by setting the `fail_on_approval_required` parameter to `true` in the request. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.request_ip_reservation_with_http_info(id, ip_reservation_request, async_req=True)
>>> result = thread.get()
:param id: Project UUID (required)
:type id: str
:param ip_reservation_request: IP Reservation Request to create (required)
:type ip_reservation_request: IPReservationRequestInput
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: if True, return the response data only,
                                       without the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(IPReservation, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'ip_reservation_request'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method request_ip_reservation" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `request_ip_reservation`") # noqa: E501
# verify the required parameter 'ip_reservation_request' is set
if self.api_client.client_side_validation and ('ip_reservation_request' not in local_var_params or # noqa: E501
local_var_params['ip_reservation_request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `ip_reservation_request` when calling `request_ip_reservation`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'ip_reservation_request' in local_var_params:
body_params = local_var_params['ip_reservation_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
201: "IPReservation",
401: "Error",
403: "Error",
404: "Error",
422: "Error",
}
return self.api_client.call_api(
'/projects/{id}/ips', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
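    # Sketch of the fail-fast behaviour described in the docstring above.
    # IPReservationRequestInput field names other than
    # `fail_on_approval_required` are assumptions, and ApiException is the
    # exception type generated clients typically raise for non-2xx responses:
    #   request = IPReservationRequestInput(quantity=8, type="public_ipv4",
    #                                       fail_on_approval_required=True)
    #   try:
    #       reservation = api.request_ip_reservation(project_id, request)
    #   except ApiException as e:
    #       if e.status == 422:
    #           ...  # the request would have required manual review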
def update_ip_address(self, id, details, customdata, **kwargs): # noqa: E501
"""Update an ip address # noqa: E501
Update details about an ip address # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_ip_address(id, details, customdata, async_req=True)
>>> result = thread.get()
:param id: IP Address UUID (required)
:type id: str
:param details: Notes for this IP Assignment (required)
:type details: str
:param customdata: Provides the custom metadata stored for this IP Assignment in json format (required)
:type customdata: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: IPAssignment
"""
kwargs['_return_http_data_only'] = True
return self.update_ip_address_with_http_info(id, details, customdata, **kwargs) # noqa: E501
def update_ip_address_with_http_info(self, id, details, customdata, **kwargs): # noqa: E501
"""Update an ip address # noqa: E501
Update details about an ip address # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_ip_address_with_http_info(id, details, customdata, async_req=True)
>>> result = thread.get()
:param id: IP Address UUID (required)
:type id: str
:param details: Notes for this IP Assignment (required)
:type details: str
:param customdata: Provides the custom metadata stored for this IP Assignment in json format (required)
:type customdata: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: if True, return the response data only,
                                       without the HTTP status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(IPAssignment, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'details',
'customdata'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method update_ip_address" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `update_ip_address`") # noqa: E501
# verify the required parameter 'details' is set
if self.api_client.client_side_validation and ('details' not in local_var_params or # noqa: E501
local_var_params['details'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `details` when calling `update_ip_address`") # noqa: E501
# verify the required parameter 'customdata' is set
if self.api_client.client_side_validation and ('customdata' not in local_var_params or # noqa: E501
local_var_params['customdata'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `customdata` when calling `update_ip_address`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'details' in local_var_params and local_var_params['details'] is not None: # noqa: E501
query_params.append(('details', local_var_params['details'])) # noqa: E501
if 'customdata' in local_var_params and local_var_params['customdata'] is not None: # noqa: E501
query_params.append(('customdata', local_var_params['customdata'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['x_auth_token'] # noqa: E501
response_types_map = {
200: "IPAssignment",
401: "Error",
403: "Error",
404: "Error",
}
return self.api_client.call_api(
'/ips/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
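    # Usage sketch (illustrative only; `api` is assumed to be an instance of
    # this API class). Note that, as the code above shows, `details` and
    # `customdata` travel as query parameters on the PATCH request rather than
    # in the request body:
    #   updated = api.update_ip_address(ip_id, "rack-7 uplink",
    #                                   '{"team": "network"}')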
| 47.152174 | 531 | 0.603331 | 9,889 | 86,760 | 5.055921 | 0.033269 | 0.038882 | 0.061322 | 0.025921 | 0.964139 | 0.960578 | 0.954278 | 0.946118 | 0.938858 | 0.929957 | 0 | 0.01514 | 0.326233 | 86,760 | 1,839 | 532 | 47.177814 | 0.84017 | 0.48361 | 0 | 0.740651 | 0 | 0.004825 | 0.187011 | 0.047894 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030157 | false | 0.004825 | 0.006031 | 0 | 0.066345 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3216655666fe457c37063de371211aedb4716815 | 15,024 | py | Python | test/exploits/regex/test_redos.py | drjerry/acsploit | fbe07fb0eb651e3c5fc27a0dbdfcd0ec4c674381 | [
"BSD-3-Clause"
] | 107 | 2018-05-03T16:53:01.000Z | 2022-02-23T14:47:20.000Z | test/exploits/regex/test_redos.py | drjerry/acsploit | fbe07fb0eb651e3c5fc27a0dbdfcd0ec4c674381 | [
"BSD-3-Clause"
] | 7 | 2019-04-28T00:41:35.000Z | 2021-05-04T20:35:54.000Z | test/exploits/regex/test_redos.py | drjerry/acsploit | fbe07fb0eb651e3c5fc27a0dbdfcd0ec4c674381 | [
"BSD-3-Clause"
] | 16 | 2019-03-29T12:39:16.000Z | 2021-03-03T11:09:45.000Z | from exploits.regex import redos
from exploits.regex.regex_common import RegexParser
from test.exploits.dummy_output import DummyOutput
import pytest
def test_run_redos_wikipedia_vuln1():
output = DummyOutput()
length = 10
redos.options['regex'] = '(a+)+x'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
def test_run_redos_wikipedia_vuln2():
output = DummyOutput()
length = 10
redos.options['regex'] = '([a-zA-Z]+)*\.'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
def test_run_redos_wikipedia_vuln3():
output = DummyOutput()
length = 10
redos.options['regex'] = '(a|aa)+[A-Z0-9]'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert output[0] == 'a' * length
def test_run_redos_wikipedia_vuln4():
output = DummyOutput()
length = 30
redos.options['regex'] = '[0-5]{7}_(a|a?)+5'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
    for i in range(7):
        assert 0 <= int(output[0][i]) <= 5
assert output[0][7:] == '_' + ('a' * (length - 8))
def test_run_redos_wikipedia_vuln6():
output = DummyOutput()
length = 10
redos.options['regex'] = '(([a-z])+.)+[A-Z]([a-z])+'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = False
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
def test_run_redos_wikipedia_vuln7():
output = DummyOutput()
length = 50
redos.options['regex'] = '([a-zA-Z0-9])(([-.]|[_]+)?([a-zA-Z0-9]+))*(@){1}[a-z0-9]+[.]{1}(([a-z]{2,3})|([a-z]{2,3}[.]{1}[a-z]{2,3}))'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = False
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
def test_run_redos_wikipedia_vuln7_negations():
output = DummyOutput()
length = 50
redos.options['regex'] = '([^a])(([-.]|[_]+)?([^a]+))*(@){1}[^a]+[.]{1}(([^a]{2,3})|([^a]{2,3}[.]{1}[^a]{2,3}))'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = False
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
def test_run_redos_stackoverflow():
output = DummyOutput()
length = 10
redos.options['regex'] = r'((ab)*)+\\'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
def test_run_redos():
output = DummyOutput()
length = 10
redos.options['regex'] = '(a|b|ab)*c'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = True
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
assert output[0] == 'ab' * (length // len('ab'))
def test_run_redos_dot():
output = DummyOutput()
length = 10
redos.options['regex'] = '(.|.)*c'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
# dots are replaced with 'a' in stringify_result
assert output[0] == 'a' * length
def test_run_redos_quadratic():
output = DummyOutput()
length = 10
redos.options['regex'] = '(a|b)*(a|c)*d'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = True
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
assert output[0] == 'a' * length
def test_run_redos_quadratic_but_excluding_quadratic():
output = DummyOutput()
length = 10
redos.options['regex'] = '(a|b)*(a|c)*d'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = False
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['parallelize'] = False
with pytest.raises(ValueError):
redos.run(output)
def test_run_redos_no_vulnerability():
output = DummyOutput()
length = 10
redos.options['regex'] = '(a|b)*c'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['parallelize'] = False
with pytest.raises(ValueError):
redos.run(output)
def test_run_redos_longer_length():
output = DummyOutput()
length = 100
redos.options['regex'] = '(a|b|ab)*c'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
assert output[0] == 'ab' * (length // len('ab'))
def test_run_redos_no_vulnerability_exponential_missing_terminator():
output = DummyOutput()
length = 10
redos.options['regex'] = '(a|b|ab)*'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = False
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['parallelize'] = False
with pytest.raises(ValueError):
redos.run(output)
def test_run_redos_end_string_anchor():
output = DummyOutput()
length = 11
redos.options['regex'] = '(a|b|ab)*$'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = False
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
assert output[0][:10] == 'ab' * (length // len('ab'))
assert output[0][10] != 'a'
assert output[0][10] != 'b'
def test_run_redos_no_vulnerability_quadratic_missing_terminator():
output = DummyOutput()
length = 10
redos.options['regex'] = '(a|b)*(a|c)*'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['parallelize'] = False
with pytest.raises(ValueError):
redos.run(output)
def test_exponential_whitespace_word_digit_whitespace_terminator():
output = DummyOutput()
length = 11
redos.options['regex'] = r'^\s*(\w|\d)+\s*$'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = False
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
    for ch in output[0][:-2]:
        assert '0' <= ch <= '9'
    assert not ('0' <= output[0][length - 1] <= '9')
def test_multicharacter_terminator():
output = DummyOutput()
length = 11
redos.options['regex'] = r'(\w|\d)+(\s*|.)$'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = False
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
    for ch in output[0][:-2]:
        assert '0' <= ch <= '9'
    assert not ('0' <= output[0][length - 2] <= '9')
    assert not ('0' <= output[0][length - 1] <= '9')
def test_redos_two_dots():
output = DummyOutput()
length = 11
redos.options['regex'] = '.*b.*a'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = True
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
assert output[0][1:] == ('b' * (length - 1))
def test_redos_negative_number():
output = DummyOutput()
length = 11
redos.options['regex'] = '^(-(([0-9]+\\.[0-9]*[1-9][0-9]*)|([0-9]*[1-9][0-9]*\\.[0-9]+)|([0-9]*[1-9][0-9]*)))$'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
assert output[0][0] == '-'
    for ch in output[0][1:-1]:
        assert ('0' <= ch <= '9') or ch == '.'
    assert not ('0' <= output[0][-1] <= '9') and output[0][-1] != '.'
def test_redos_two_dots_not_vulnerable():
output = DummyOutput()
length = 11
redos.options['regex'] = '.*bal.*a'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
with pytest.raises(ValueError):
redos.run(output)
def test_redos_two_dots_anchor():
output = DummyOutput()
length = 12
redos.options['regex'] = '.*bal.*a$'
redos.options['max_length'] = length
redos.options['show_nfa'] = False
redos.options['include_quadratic'] = True
redos.options['use_file'] = False
redos.options['show_progress_bar'] = False
redos.options['show_only_vulnerable'] = True
redos.options['verify'] = False
redos.options['parallelize'] = False
redos.run(output)
assert len(output) == 1
assert len(output[0]) == length
assert output[0] == 'bal' * (length // len('bal'))
def test_nfa_intersection_with_dot_accepting_state():
nfa_1 = RegexParser().run('a.c')
nfa_2 = RegexParser().run('abc')
intersection_nfa = nfa_1.intersection(nfa_2)
assert intersection_nfa.has_accepting_states()
def test_nfa_intersection_both_dot_accepting_state():
nfa_1 = RegexParser().run('a.c')
nfa_2 = RegexParser().run('.ac')
intersection_nfa = nfa_1.intersection(nfa_2)
assert intersection_nfa.has_accepting_states()
def test_nfa_intersection_with_dot_no_accepting_state():
nfa_1 = RegexParser().run('a.c')
nfa_2 = RegexParser().run('abd')
intersection_nfa = nfa_1.intersection(nfa_2)
assert not intersection_nfa.has_accepting_states()
def test_verify_string_non_vulnerable_string():
with pytest.raises(RuntimeError):
redos.verify_string('ab*c', 'a', 'b', 'c', True)
with pytest.raises(RuntimeError):
redos.verify_string('ab*c', 'a', 'b', 'c', False)
def test_verify_string_quadratic_regex_exponential_test():
with pytest.raises(RuntimeError):
redos.verify_string('(a|b)*(a|c)*d', '', 'a', '', True)
def test_verify_string_quadratic_regex():
redos.verify_string('(a|b)*(a|c)*d', '', 'a', '', False)
def test_verify_string_exponential_regex():
redos.verify_string('(a|b|ab)*d', '', 'ab', '', True)
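# A possible shared helper to cut the repeated option boilerplate above
# (a sketch only, not part of the original suite; the defaults mirror the
# values most commonly used in these tests):
def _set_redos_options(regex, max_length, **overrides):
    options = {
        'regex': regex,
        'max_length': max_length,
        'show_nfa': False,
        'include_quadratic': True,
        'use_file': False,
        'show_progress_bar': False,
        'show_only_vulnerable': True,
        'verify': False,
        'parallelize': False,
    }
    options.update(overrides)
    for name, value in options.items():
        redos.options[name] = value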
| 34.858469 | 137 | 0.652223 | 1,946 | 15,024 | 4.853546 | 0.059609 | 0.257914 | 0.16739 | 0.102276 | 0.917523 | 0.902488 | 0.891265 | 0.856432 | 0.824775 | 0.802435 | 0 | 0.01729 | 0.180045 | 15,024 | 430 | 138 | 34.939535 | 0.749411 | 0.003062 | 0 | 0.801567 | 0 | 0.007833 | 0.196648 | 0.020032 | 0 | 0 | 0 | 0 | 0.133159 | 1 | 0.078329 | false | 0 | 0.010444 | 0 | 0.088773 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
5ce7a0644afb109ce7ca47866a60873e18181de4 | 117 | py | Python | zookeeper/test_version.py | AdamHillier/zookeeper | f9a2c19d429bf61a3bc64dc51ba942872b22c5a1 | [
"Apache-2.0"
] | 4 | 2019-05-30T17:07:28.000Z | 2019-07-07T14:01:29.000Z | zookeeper/test_version.py | AdamHillier/zookeeper | f9a2c19d429bf61a3bc64dc51ba942872b22c5a1 | [
"Apache-2.0"
] | 58 | 2020-10-15T06:39:46.000Z | 2022-03-29T12:05:45.000Z | zookeeper/test_version.py | AdamHillier/zookeeper | f9a2c19d429bf61a3bc64dc51ba942872b22c5a1 | [
"Apache-2.0"
] | 1 | 2019-06-02T17:01:13.000Z | 2019-06-02T17:01:13.000Z | import zookeeper
def test_version():
assert hasattr(zookeeper, "__version__") and "." in zookeeper.__version__
| 19.5 | 77 | 0.752137 | 13 | 117 | 6.076923 | 0.692308 | 0.405063 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145299 | 117 | 5 | 78 | 23.4 | 0.79 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
7a847b3bd1c38d012fb7f64d409ee803900300d9 | 436 | py | Python | tools/__init__.py | JunboLu/CP2K_kit | 0950f37f253c3f90d6a0539c57f1be1045e7317d | [
"Apache-2.0"
] | 16 | 2021-04-19T03:40:32.000Z | 2022-02-21T12:53:33.000Z | tools/__init__.py | JunboLu/CP2K_kit | 0950f37f253c3f90d6a0539c57f1be1045e7317d | [
"Apache-2.0"
] | null | null | null | tools/__init__.py | JunboLu/CP2K_kit | 0950f37f253c3f90d6a0539c57f1be1045e7317d | [
"Apache-2.0"
] | 2 | 2021-11-28T02:55:31.000Z | 2022-02-21T12:54:52.000Z | from CP2K_kit.tools import atom
from CP2K_kit.tools import call
from CP2K_kit.tools import get_cell
from CP2K_kit.tools import data_op
from CP2K_kit.tools import numeric
from CP2K_kit.tools import read_input
from CP2K_kit.tools import traj_info
from CP2K_kit.tools import traj_tools
from CP2K_kit.tools import read_lmp
from CP2K_kit.tools import log_info
from CP2K_kit.tools import file_tools
from CP2K_kit.tools import revise_cp2k_inp
| 33.538462 | 42 | 0.862385 | 82 | 436 | 4.317073 | 0.256098 | 0.271186 | 0.372881 | 0.542373 | 0.841808 | 0.468927 | 0 | 0 | 0 | 0 | 0 | 0.033505 | 0.110092 | 436 | 12 | 43 | 36.333333 | 0.878866 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
8fd90aa97fba0707892ff216ac52a587421a56f3 | 6,309 | py | Python | tests/test_main.py | tsavko/schemachange | e9643b1270522bb59992fd3026c675b2a6004569 | [
"Apache-2.0"
] | null | null | null | tests/test_main.py | tsavko/schemachange | e9643b1270522bb59992fd3026c675b2a6004569 | [
"Apache-2.0"
] | null | null | null | tests/test_main.py | tsavko/schemachange | e9643b1270522bb59992fd3026c675b2a6004569 | [
"Apache-2.0"
] | null | null | null | import os
import sys
import pytest
import unittest.mock as mock
import schemachange.cli
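# The `expected` tuples below are positional. For the no-subcommand and deploy
# cases the order, inferred from the flag each case sets, is:
#   (config_folder, root_folder, modules_folder, snowflake_account,
#    snowflake_user, snowflake_role, snowflake_warehouse, snowflake_database,
#    change_history_table, vars, create_change_history_table, autocommit,
#    verbose, dry_run)
# For the render cases it is:
#   (config_folder, root_folder, modules_folder, vars, verbose, script)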
@pytest.mark.parametrize("args, expected", [
(["schemachange"], ('.', None, None, None, None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "--config-folder", "test"], ('test', None, None, None, None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "-f", '.'], ('.', '.', None, None, None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "--modules-folder", "modules-folder"], ('.', None, "modules-folder", None, None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "--snowflake-account", "account"], ('.', None, None, "account", None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "--snowflake-user", "user"], ('.', None, None, None, "user", None, None, None, None, None, False, False, False, False)),
(["schemachange", "--snowflake-role", "role"], ('.', None, None, None, None, "role", None, None, None, None, False, False, False, False)),
(["schemachange", "--snowflake-warehouse", "warehouse"], ('.', None, None, None, None, None, "warehouse", None, None, None, False, False, False, False)),
(["schemachange", "--snowflake-database", "database"], ('.', None, None, None, None, None, None, "database", None, None, False, False, False, False)),
(["schemachange", "--change-history-table", "db.schema.table"], ('.', None, None, None, None, None, None, None, "db.schema.table", None, False, False, False, False)),
(["schemachange", "--vars", '{"var1": "val"}'], ('.', None, None, None, None, None, None, None, None, {'var1' : 'val'}, False, False, False, False)),
(["schemachange", "--create-change-history-table"], ('.', None, None, None, None, None, None, None, None, None, True, False, False, False)),
(["schemachange", "--autocommit"], ('.', None, None, None, None, None, None, None, None, None, False, True, False, False)),
(["schemachange", "--verbose"], ('.', None, None, None, None, None, None, None, None, None, False, False, True, False)),
(["schemachange", "--dry-run"], ('.', None, None, None, None, None, None, None, None, None, False, False, False, True))
])
def test_main_no_subcommand_given_arguments_make_sure_arguments_set_on_call(args, expected):
sys.argv = args
with mock.patch("schemachange.cli.schemachange") as mock_schemachange:
schemachange.cli.main()
mock_schemachange.assert_called_once_with(*expected)
@pytest.mark.parametrize("args, expected", [
(["schemachange", "deploy"], ('.', None, None, None, None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "deploy", "--config-folder", "test"], ('test', None, None, None, None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "deploy", "-f", '.'], ('.', '.', None, None, None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "deploy", "--modules-folder", "modules-folder"], ('.', None, "modules-folder", None, None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "deploy", "--snowflake-account", "account"], ('.', None, None, "account", None, None, None, None, None, None, False, False, False, False)),
(["schemachange", "deploy", "--snowflake-user", "user"], ('.', None, None, None, "user", None, None, None, None, None, False, False, False, False)),
(["schemachange", "deploy", "--snowflake-role", "role"], ('.', None, None, None, None, "role", None, None, None, None, False, False, False, False)),
(["schemachange", "deploy", "--snowflake-warehouse", "warehouse"], ('.', None, None, None, None, None, "warehouse", None, None, None, False, False, False, False)),
(["schemachange", "deploy", "--snowflake-database", "database"], ('.', None, None, None, None, None, None, "database", None, None, False, False, False, False)),
(["schemachange", "deploy", "--change-history-table", "db.schema.table"], ('.', None, None, None, None, None, None, None, "db.schema.table", None, False, False, False, False)),
(["schemachange", "deploy", "--vars", '{"var1": "val"}'], ('.', None, None, None, None, None, None, None, None, {'var1' : 'val'}, False, False, False, False)),
(["schemachange", "deploy", "--create-change-history-table"], ('.', None, None, None, None, None, None, None, None, None, True, False, False, False)),
(["schemachange", "deploy", "--autocommit"], ('.', None, None, None, None, None, None, None, None, None, False, True, False, False)),
(["schemachange", "deploy", "--verbose"], ('.', None, None, None, None, None, None, None, None, None, False, False, True, False)),
(["schemachange", "deploy", "--dry-run"], ('.', None, None, None, None, None, None, None, None, None, False, False, False, True))
])
def test_main_deploy_subcommand_given_arguments_make_sure_arguments_set_on_call(args, expected):
sys.argv = args
with mock.patch("schemachange.cli.schemachange") as mock_schemachange:
schemachange.cli.main()
mock_schemachange.assert_called_once_with(*expected)
@pytest.mark.parametrize("args, expected", [
(["schemachange", "render", "script.sql"], ('.', None, None, None, False, "script.sql")),
(["schemachange", "render", "--config-folder", "test", "script.sql"], ("test", None, None, None, False, "script.sql")),
(["schemachange", "render", "--root-folder", '.', "script.sql"], ('.', ".", None, None, False, "script.sql")),
(["schemachange", "render", "--modules-folder", "modules-folder", "script.sql"], ('.', None, "modules-folder", None, False, "script.sql")),
(["schemachange", "render", "--vars", '{"var1": "val"}', "script.sql"], ('.', None, None, {'var1' : 'val'}, False, "script.sql")),
(["schemachange", "render", "--verbose", "script.sql"], ('.', None, None, None, True, "script.sql")),
])
def test_main_render_subcommand_given_arguments_make_sure_arguments_set_on_call(args, expected):
sys.argv = args
with mock.patch("schemachange.cli.render_command") as mock_render_command:
schemachange.cli.main()
mock_render_command.assert_called_once_with(*expected)
| 86.424658 | 184 | 0.622127 | 741 | 6,309 | 5.22807 | 0.08502 | 0.446051 | 0.529685 | 0.545173 | 0.914559 | 0.885906 | 0.863449 | 0.851575 | 0.82731 | 0.82731 | 0 | 0.001116 | 0.147884 | 6,309 | 72 | 185 | 87.625 | 0.719494 | 0 | 0 | 0.258065 | 0 | 0 | 0.268347 | 0.036931 | 0 | 0 | 0 | 0 | 0.048387 | 1 | 0.048387 | false | 0 | 0.080645 | 0 | 0.129032 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
8fed835a544b3ec5fc35ad5568fc914cb5545979 | 180 | py | Python | nanobrew/core/application/mapper/sensor_type_mapper.py | nanobrew/nanobrew-core | ef180faa1e33af58ca7b7ff76a4ae016becb6cfc | [
"MIT"
] | 1 | 2020-04-02T08:54:11.000Z | 2020-04-02T08:54:11.000Z | nanobrew/core/application/mapper/sensor_type_mapper.py | nanobrew/nanobrew-core | ef180faa1e33af58ca7b7ff76a4ae016becb6cfc | [
"MIT"
] | 19 | 2020-05-02T10:04:07.000Z | 2020-06-01T09:59:13.000Z | nanobrew/core/application/mapper/sensor_type_mapper.py | nanobrew/nanobrew-core | ef180faa1e33af58ca7b7ff76a4ae016becb6cfc | [
"MIT"
] | 1 | 2020-03-13T15:59:19.000Z | 2020-03-13T15:59:19.000Z | from ...domain.sensor_type import SensorType
class SensorTypeMapper:
async def sensor_type_to_dict(self, sensor_type: SensorType):
return await sensor_type.to_dict()
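# Usage sketch (illustrative only; assumes a running event loop and a
# SensorType instance named `sensor_type`):
#   payload = await SensorTypeMapper().sensor_type_to_dict(sensor_type)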
| 25.714286 | 65 | 0.772222 | 24 | 180 | 5.5 | 0.625 | 0.30303 | 0.181818 | 0.242424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155556 | 180 | 6 | 66 | 30 | 0.868421 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
8ffdb423cacd35349fa7dece1d7d8ba8f4cebe28 | 3,695 | py | Python | images/core/context/free5gc/lib/nextepc/gtp/support/cache/tlv_msg_167.py | my5G/OPlaceRAN | cc6fe5223b9b2a32d7963b2304762fe2a0265298 | [
"Apache-2.0"
] | 1 | 2022-02-22T07:19:59.000Z | 2022-02-22T07:19:59.000Z | images/core/context/free5gc/lib/nextepc/gtp/support/cache/tlv_msg_167.py | my5G/OPlaceRAN | cc6fe5223b9b2a32d7963b2304762fe2a0265298 | [
"Apache-2.0"
] | 1 | 2022-01-15T20:26:01.000Z | 2022-01-15T20:26:01.000Z | images/core/context/free5gc/lib/nextepc/gtp/support/cache/tlv_msg_167.py | my5G/OPlaceRAN | cc6fe5223b9b2a32d7963b2304762fe2a0265298 | [
"Apache-2.0"
] | 1 | 2022-01-07T18:49:10.000Z | 2022-01-07T18:49:10.000Z | ies = []
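# NOTE: this is a generated GTP (3GPP TS 29.274) message-definition fragment.
# It appears to be executed in an enclosing scope that provides `type_list`,
# `msg_list` and `key` (an assumption inferred from the undefined names below).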
ies.append({ "ie_type" : "Cause", "ie_value" : "Cause", "presence" : "M", "instance" : "0", "comment" : ""})
ies.append({ "ie_type" : "F-TEID", "ie_value" : "Sender F-TEID for Control Plane", "presence" : "C", "instance" : "0", "comment" : "This IE shall be included by an SGW if the SGW receives a Sender F-TEID for Control Plane IE from an MME/SGSN in a Create Indirect Data Forwarding Tunnel Request message.See also NOTE 1 in Table 7.2.18-1."})
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 0", "presence" : "M", "instance" : "0", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 1", "presence" : "O", "instance" : "1", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "2"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 2", "presence" : "O", "instance" : "2", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "3"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 3", "presence" : "O", "instance" : "3", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "4"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 4", "presence" : "O", "instance" : "4", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "5"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 5", "presence" : "O", "instance" : "5", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "6"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 6", "presence" : "O", "instance" : "6", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "7"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 7", "presence" : "O", "instance" : "7", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "8"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 8", "presence" : "O", "instance" : "8", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "9"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 9", "presence" : "O", "instance" : "9", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
type_list["Bearer Context"]["max_instance"] = "10"
ies.append({ "ie_type" : "Bearer Context", "ie_value" : "Bearer Context 10", "presence" : "O", "instance" : "10", "comment" : "Several IEs with this type and instance value may be included as necessary to represent a list of Bearers"})
ies.append({ "ie_type" : "Recovery", "ie_value" : "Recovery", "presence" : "CO", "instance" : "0", "comment" : "This IE shall be included if contacting the peer for the first time"})
msg_list[key]["ies"] = ies
| 142.115385 | 339 | 0.690663 | 555 | 3,695 | 4.513514 | 0.140541 | 0.160878 | 0.061477 | 0.083832 | 0.811976 | 0.79481 | 0.774052 | 0.774052 | 0.744511 | 0.744511 | 0 | 0.013668 | 0.148579 | 3,695 | 25 | 340 | 147.8 | 0.782581 | 0 | 0 | 0 | 0 | 0.04 | 0.711502 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
64e458993dcf3a802011ce54699fed491e6165e3 | 6,977 | py | Python | ad_api/api/portfolios.py | mkdir700/python-amazon-ad-api | e82429be4c56f4b56bddfcd70c18dabd4c109406 | [
"MIT"
] | null | null | null | ad_api/api/portfolios.py | mkdir700/python-amazon-ad-api | e82429be4c56f4b56bddfcd70c18dabd4c109406 | [
"MIT"
] | null | null | null | ad_api/api/portfolios.py | mkdir700/python-amazon-ad-api | e82429be4c56f4b56bddfcd70c18dabd4c109406 | [
"MIT"
] | null | null | null | from ad_api.base import Client, sp_endpoint, fill_query_params, ApiResponse, Utils
class Portfolios(Client):
"""
"""
@sp_endpoint('/v2/portfolios', method='GET')
def list_portfolios(self, **kwargs) -> ApiResponse:
r"""
list_portfolios(**kwargs) -> ApiResponse
Retrieves a list of portfolios, optionally filtered by identifier, name, or state. Note that this operation returns a maximum of 100 portfolios.
query **portfolioIdFilter**:string | Optional. The returned list includes portfolios with identifiers matching those in the specified comma-delimited list. There is a maximum of 100 identifiers allowed
query **portfolioNameFilter**:string | Optional. The returned list includes portfolios with identifiers matching those in the specified comma-delimited list. There is a maximum of 100 identifiers allowed
query **portfolioStateFilter**:string | Optional. The returned list includes portfolios with states matching those in the specified comma-delimited list. Available values : enabled, paused, archived
"""
return self._request(kwargs.pop('path'), params=kwargs)
@sp_endpoint('/v2/portfolios/extended', method='GET')
def list_portfolios_extended(self, **kwargs) -> ApiResponse:
r"""
list_portfolios_extended(**kwargs) -> ApiResponse
Retrieves a list of portfolios, optionally filtered by identifier, name, or state. Note that this operation returns a maximum of 100 portfolios.
query **portfolioIdFilter**:string | Optional. The returned list includes portfolios with identifiers matching those in the specified comma-delimited list. There is a maximum of 100 identifiers allowed
query **portfolioNameFilter**:string | Optional. The returned list includes portfolios with identifiers matching those in the specified comma-delimited list. There is a maximum of 100 identifiers allowed
query **portfolioStateFilter**:string | Optional. The returned list includes portfolios with states matching those in the specified comma-delimited list. Available values : enabled, paused, archived
"""
return self._request(kwargs.pop('path'), params=kwargs)
@sp_endpoint('/v2/portfolios/{}', method='GET')
def get_portfolio(self, portfolioId, **kwargs) -> ApiResponse:
r"""
get_portfolio(portfolioId) -> ApiResponse
Retrieves a portfolio data with the portfolioId identifier provided
query **portfolioId**:number | Required. The identifier of an existing portfolio.
"""
return self._request(fill_query_params(kwargs.pop('path'), portfolioId), params=kwargs)
@sp_endpoint('/v2/portfolios/extended/{}', method='GET')
def get_portfolio_extended(self, portfolioId, **kwargs) -> ApiResponse:
r"""
get_portfolio_extended(portfolioId) -> ApiResponse
Gets an extended set of properties for a portfolio specified by identifier.
query **portfolioId**:number | Required. The identifier of an existing portfolio.
"""
return self._request(fill_query_params(kwargs.pop('path'), portfolioId), params=kwargs)
@sp_endpoint('/v2/portfolios', method='POST')
def create_portfolios(self, **kwargs) -> ApiResponse:
r"""
create_portfolios(body: (list, str, dict)) -> ApiResponse
Creates one or more portfolios.
body: | REQUIRED {'description': 'A list of portfolio resources with updated values.}'
| **name** | **string** | The portfolio name.
| **budget** | **dict** |
| **amount** | **number** | The budget amount associated with the portfolio. Cannot be null.
| **currencyCode** | **string** | The currency used for all monetary values for entities under this profile. Cannot be null.
| **policy** | **string** | The budget policy. Set to dateRange to specify a budget for a specific period of time. Set to monthlyRecurring to specify a budget that is automatically renewed at the beginning of each month. Cannot be null. Enum: [ dateRange, monthlyRecurring ]
| **startDate** | **string** | The starting date in YYYYMMDD format to which the budget is applied. Required if policy is set to dateRange. Not specified if policy is set to monthlyRecurring. Note that the starting date for monthlyRecurring is the date when the policy is set.
| **endDate** | **string** | The end date after which the budget is no longer applied. Optional if policy is set to dateRange or monthlyRecurring.
| **inBudget** | **boolean** | Indicates the current budget status of the portfolio. Set to true if the portfolio is in budget, set to false if the portfolio is out of budget.
| **state** | **string** | The current state of the portfolio. Enum: [ enabled, paused, archived ]
"""
body = Utils.convert_body(kwargs.pop('body'))
return self._request(kwargs.pop('path'), data=body, params=kwargs)
@sp_endpoint('/v2/portfolios', method='PUT')
def edit_portfolios(self, **kwargs) -> ApiResponse:
r"""
edit_portfolios(body: (list, str, dict)) -> ApiResponse
Updates one or more portfolios.
body: | REQUIRED {'description': 'A list of portfolio resources with updated values.}'
| **portfolioId** | **number** | The portfolio identifier.
| **name** | **string** | The portfolio name.
| **budget** | **dict** |
| **amount** | **number** | The budget amount associated with the portfolio. Cannot be null.
| **currencyCode** | **string** | The currency used for all monetary values for entities under this profile. Cannot be null.
| **policy** | **string** | The budget policy. Set to dateRange to specify a budget for a specific period of time. Set to monthlyRecurring to specify a budget that is automatically renewed at the beginning of each month. Cannot be null. Enum: [ dateRange, monthlyRecurring ]
| **startDate** | **string** | The starting date in YYYYMMDD format to which the budget is applied. Required if policy is set to dateRange. Not specified if policy is set to monthlyRecurring. Note that the starting date for monthlyRecurring is the date when the policy is set.
| **endDate** | **string** | The end date after which the budget is no longer applied. Optional if policy is set to dateRange or monthlyRecurring.
| **inBudget** | **boolean** | Indicates the current budget status of the portfolio. Set to true if the portfolio is in budget, set to false if the portfolio is out of budget.
| **state** | **string** | The current state of the portfolio. Enum: [ enabled, paused, archived ]
"""
body = Utils.convert_body(kwargs.pop('body'))
return self._request(kwargs.pop('path'), data=body, params=kwargs)
| 54.507813 | 292 | 0.682385 | 852 | 6,977 | 5.543427 | 0.174883 | 0.014821 | 0.018632 | 0.027948 | 0.921448 | 0.896888 | 0.866398 | 0.830404 | 0.830404 | 0.830404 | 0 | 0.004415 | 0.220869 | 6,977 | 127 | 293 | 54.937008 | 0.864422 | 0.727533 | 0 | 0.285714 | 0 | 0 | 0.108979 | 0.033585 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.035714 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
56b0a725b955fa954325998aa6fb6b30cf4f204e | 1,212 | py | Python | tests/collective_ops/test_gather.py | Thenerdstation/mpi4jax | 8e2fa86abcf6e775d1acea1b85fe44d15ff57387 | [
"MIT"
] | 122 | 2021-03-27T20:46:28.000Z | 2022-03-31T22:23:45.000Z | tests/collective_ops/test_gather.py | Thenerdstation/mpi4jax | 8e2fa86abcf6e775d1acea1b85fe44d15ff57387 | [
"MIT"
] | 68 | 2020-07-22T08:21:02.000Z | 2021-03-19T09:42:08.000Z | tests/collective_ops/test_gather.py | Thenerdstation/mpi4jax | 8e2fa86abcf6e775d1acea1b85fe44d15ff57387 | [
"MIT"
] | 9 | 2021-03-27T03:46:34.000Z | 2022-02-16T12:44:22.000Z | from mpi4py import MPI
import jax
import jax.numpy as jnp
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()
def test_gather():
from mpi4jax import gather
arr = jnp.ones((3, 2)) * rank
res, _ = gather(arr, root=0)
if rank == 0:
for p in range(size):
assert jnp.array_equal(res[p], jnp.ones((3, 2)) * p)
else:
assert jnp.array_equal(res, arr)
def test_gather_jit():
from mpi4jax import gather
arr = jnp.ones((3, 2)) * rank
res = jax.jit(lambda x: gather(x, root=0)[0])(arr)
if rank == 0:
for p in range(size):
assert jnp.array_equal(res[p], jnp.ones((3, 2)) * p)
else:
assert jnp.array_equal(res, arr)
def test_gather_scalar():
from mpi4jax import gather
arr = rank
res, _ = gather(arr, root=0)
if rank == 0:
assert jnp.array_equal(res, jnp.arange(size))
else:
assert jnp.array_equal(res, arr)
def test_gather_scalar_jit():
from mpi4jax import gather
arr = rank
res = jax.jit(lambda x: gather(x, root=0)[0])(arr)
if rank == 0:
assert jnp.array_equal(res, jnp.arange(size))
else:
assert jnp.array_equal(res, arr)
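# These tests exercise an MPI collective, so they are normally launched under
# an MPI runner; the exact invocation depends on the environment, e.g.:
#   mpirun -n 2 python -m pytest tests/collective_ops/test_gather.py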
| 21.263158 | 64 | 0.60231 | 191 | 1,212 | 3.712042 | 0.198953 | 0.101551 | 0.157969 | 0.214386 | 0.850494 | 0.850494 | 0.842031 | 0.768688 | 0.768688 | 0.719323 | 0 | 0.02593 | 0.268152 | 1,212 | 56 | 65 | 21.642857 | 0.773393 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.1 | false | 0 | 0.175 | 0 | 0.275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7122f8065a5bcc02879fdbe3cdc7a8e316dcdffb | 43,700 | py | Python | image/pav/csi/spec/csi_pb2_grpc.py | albertofaria/pav | 3c68b64d21bc02bc5cdc29084f4688993d58b1ff | [
"MIT"
] | 4 | 2021-12-11T23:30:51.000Z | 2021-12-17T09:06:45.000Z | image/pav/csi/spec/csi_pb2_grpc.py | albertofaria/pav | 3c68b64d21bc02bc5cdc29084f4688993d58b1ff | [
"MIT"
] | null | null | null | image/pav/csi/spec/csi_pb2_grpc.py | albertofaria/pav | 3c68b64d21bc02bc5cdc29084f4688993d58b1ff | [
"MIT"
] | null | null | null | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
from pav.csi.spec import csi_pb2 as pav_dot_csi_dot_spec_dot_csi__pb2
class IdentityStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.GetPluginInfo = channel.unary_unary(
'/csi.v1.Identity/GetPluginInfo',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginInfoRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginInfoResponse.FromString,
)
self.GetPluginCapabilities = channel.unary_unary(
'/csi.v1.Identity/GetPluginCapabilities',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginCapabilitiesRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginCapabilitiesResponse.FromString,
)
self.Probe = channel.unary_unary(
'/csi.v1.Identity/Probe',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ProbeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ProbeResponse.FromString,
)
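# Usage sketch (not part of the generated file; the socket path below is
# hypothetical, though CSI plugins conventionally listen on a UNIX socket):
#   with grpc.insecure_channel('unix:///var/run/csi.sock') as channel:
#       stub = IdentityStub(channel)
#       info = stub.GetPluginInfo(
#           pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginInfoRequest())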
class IdentityServicer(object):
"""Missing associated documentation comment in .proto file."""
def GetPluginInfo(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetPluginCapabilities(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Probe(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_IdentityServicer_to_server(servicer, server):
rpc_method_handlers = {
'GetPluginInfo': grpc.unary_unary_rpc_method_handler(
servicer.GetPluginInfo,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginInfoRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginInfoResponse.SerializeToString,
),
'GetPluginCapabilities': grpc.unary_unary_rpc_method_handler(
servicer.GetPluginCapabilities,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginCapabilitiesRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginCapabilitiesResponse.SerializeToString,
),
'Probe': grpc.unary_unary_rpc_method_handler(
servicer.Probe,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ProbeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ProbeResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'csi.v1.Identity', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
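# Typical wiring for this helper (sketch; MyIdentityServicer is a hypothetical
# subclass of IdentityServicer that overrides the methods above):
#   from concurrent import futures
#   server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
#   add_IdentityServicer_to_server(MyIdentityServicer(), server)
#   server.add_insecure_port('unix:///var/run/csi.sock')
#   server.start()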
# This class is part of an EXPERIMENTAL API.
class Identity(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def GetPluginInfo(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Identity/GetPluginInfo',
pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginInfoRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginInfoResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetPluginCapabilities(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Identity/GetPluginCapabilities',
pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginCapabilitiesRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.GetPluginCapabilitiesResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Probe(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Identity/Probe',
pav_dot_csi_dot_spec_dot_csi__pb2.ProbeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ProbeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class ControllerStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.CreateVolume = channel.unary_unary(
'/csi.v1.Controller/CreateVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.CreateVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.CreateVolumeResponse.FromString,
)
self.DeleteVolume = channel.unary_unary(
'/csi.v1.Controller/DeleteVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.DeleteVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.DeleteVolumeResponse.FromString,
)
self.ControllerPublishVolume = channel.unary_unary(
'/csi.v1.Controller/ControllerPublishVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerPublishVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerPublishVolumeResponse.FromString,
)
self.ControllerUnpublishVolume = channel.unary_unary(
'/csi.v1.Controller/ControllerUnpublishVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerUnpublishVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerUnpublishVolumeResponse.FromString,
)
self.ValidateVolumeCapabilities = channel.unary_unary(
'/csi.v1.Controller/ValidateVolumeCapabilities',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ValidateVolumeCapabilitiesRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ValidateVolumeCapabilitiesResponse.FromString,
)
self.ListVolumes = channel.unary_unary(
'/csi.v1.Controller/ListVolumes',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ListVolumesRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ListVolumesResponse.FromString,
)
self.GetCapacity = channel.unary_unary(
'/csi.v1.Controller/GetCapacity',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetCapacityRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetCapacityResponse.FromString,
)
self.ControllerGetCapabilities = channel.unary_unary(
'/csi.v1.Controller/ControllerGetCapabilities',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetCapabilitiesRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetCapabilitiesResponse.FromString,
)
self.CreateSnapshot = channel.unary_unary(
'/csi.v1.Controller/CreateSnapshot',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.CreateSnapshotRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.CreateSnapshotResponse.FromString,
)
self.DeleteSnapshot = channel.unary_unary(
'/csi.v1.Controller/DeleteSnapshot',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.DeleteSnapshotRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.DeleteSnapshotResponse.FromString,
)
self.ListSnapshots = channel.unary_unary(
'/csi.v1.Controller/ListSnapshots',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ListSnapshotsRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ListSnapshotsResponse.FromString,
)
self.ControllerExpandVolume = channel.unary_unary(
'/csi.v1.Controller/ControllerExpandVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerExpandVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerExpandVolumeResponse.FromString,
)
self.ControllerGetVolume = channel.unary_unary(
'/csi.v1.Controller/ControllerGetVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetVolumeResponse.FromString,
)
class ControllerServicer(object):
"""Missing associated documentation comment in .proto file."""
def CreateVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ControllerPublishVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ControllerUnpublishVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ValidateVolumeCapabilities(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListVolumes(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetCapacity(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ControllerGetCapabilities(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CreateSnapshot(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteSnapshot(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListSnapshots(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ControllerExpandVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ControllerGetVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_ControllerServicer_to_server(servicer, server):
rpc_method_handlers = {
'CreateVolume': grpc.unary_unary_rpc_method_handler(
servicer.CreateVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.CreateVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.CreateVolumeResponse.SerializeToString,
),
'DeleteVolume': grpc.unary_unary_rpc_method_handler(
servicer.DeleteVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.DeleteVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.DeleteVolumeResponse.SerializeToString,
),
'ControllerPublishVolume': grpc.unary_unary_rpc_method_handler(
servicer.ControllerPublishVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerPublishVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerPublishVolumeResponse.SerializeToString,
),
'ControllerUnpublishVolume': grpc.unary_unary_rpc_method_handler(
servicer.ControllerUnpublishVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerUnpublishVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerUnpublishVolumeResponse.SerializeToString,
),
'ValidateVolumeCapabilities': grpc.unary_unary_rpc_method_handler(
servicer.ValidateVolumeCapabilities,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ValidateVolumeCapabilitiesRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ValidateVolumeCapabilitiesResponse.SerializeToString,
),
'ListVolumes': grpc.unary_unary_rpc_method_handler(
servicer.ListVolumes,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ListVolumesRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ListVolumesResponse.SerializeToString,
),
'GetCapacity': grpc.unary_unary_rpc_method_handler(
servicer.GetCapacity,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetCapacityRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.GetCapacityResponse.SerializeToString,
),
'ControllerGetCapabilities': grpc.unary_unary_rpc_method_handler(
servicer.ControllerGetCapabilities,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetCapabilitiesRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetCapabilitiesResponse.SerializeToString,
),
'CreateSnapshot': grpc.unary_unary_rpc_method_handler(
servicer.CreateSnapshot,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.CreateSnapshotRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.CreateSnapshotResponse.SerializeToString,
),
'DeleteSnapshot': grpc.unary_unary_rpc_method_handler(
servicer.DeleteSnapshot,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.DeleteSnapshotRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.DeleteSnapshotResponse.SerializeToString,
),
'ListSnapshots': grpc.unary_unary_rpc_method_handler(
servicer.ListSnapshots,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ListSnapshotsRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ListSnapshotsResponse.SerializeToString,
),
'ControllerExpandVolume': grpc.unary_unary_rpc_method_handler(
servicer.ControllerExpandVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerExpandVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerExpandVolumeResponse.SerializeToString,
),
'ControllerGetVolume': grpc.unary_unary_rpc_method_handler(
servicer.ControllerGetVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetVolumeResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'csi.v1.Controller', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Controller(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def CreateVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/CreateVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.CreateVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.CreateVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def DeleteVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/DeleteVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.DeleteVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.DeleteVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ControllerPublishVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/ControllerPublishVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerPublishVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerPublishVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ControllerUnpublishVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/ControllerUnpublishVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerUnpublishVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerUnpublishVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ValidateVolumeCapabilities(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/ValidateVolumeCapabilities',
pav_dot_csi_dot_spec_dot_csi__pb2.ValidateVolumeCapabilitiesRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ValidateVolumeCapabilitiesResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ListVolumes(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/ListVolumes',
pav_dot_csi_dot_spec_dot_csi__pb2.ListVolumesRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ListVolumesResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetCapacity(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/GetCapacity',
pav_dot_csi_dot_spec_dot_csi__pb2.GetCapacityRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.GetCapacityResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ControllerGetCapabilities(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/ControllerGetCapabilities',
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetCapabilitiesRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetCapabilitiesResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def CreateSnapshot(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/CreateSnapshot',
pav_dot_csi_dot_spec_dot_csi__pb2.CreateSnapshotRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.CreateSnapshotResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def DeleteSnapshot(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/DeleteSnapshot',
pav_dot_csi_dot_spec_dot_csi__pb2.DeleteSnapshotRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.DeleteSnapshotResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ListSnapshots(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/ListSnapshots',
pav_dot_csi_dot_spec_dot_csi__pb2.ListSnapshotsRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ListSnapshotsResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ControllerExpandVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/ControllerExpandVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerExpandVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerExpandVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ControllerGetVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Controller/ControllerGetVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.ControllerGetVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class NodeStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.NodeStageVolume = channel.unary_unary(
'/csi.v1.Node/NodeStageVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeStageVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeStageVolumeResponse.FromString,
)
self.NodeUnstageVolume = channel.unary_unary(
'/csi.v1.Node/NodeUnstageVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnstageVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnstageVolumeResponse.FromString,
)
self.NodePublishVolume = channel.unary_unary(
'/csi.v1.Node/NodePublishVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodePublishVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodePublishVolumeResponse.FromString,
)
self.NodeUnpublishVolume = channel.unary_unary(
'/csi.v1.Node/NodeUnpublishVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnpublishVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnpublishVolumeResponse.FromString,
)
self.NodeGetVolumeStats = channel.unary_unary(
'/csi.v1.Node/NodeGetVolumeStats',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetVolumeStatsRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetVolumeStatsResponse.FromString,
)
self.NodeExpandVolume = channel.unary_unary(
'/csi.v1.Node/NodeExpandVolume',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeExpandVolumeRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeExpandVolumeResponse.FromString,
)
self.NodeGetCapabilities = channel.unary_unary(
'/csi.v1.Node/NodeGetCapabilities',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetCapabilitiesRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetCapabilitiesResponse.FromString,
)
self.NodeGetInfo = channel.unary_unary(
'/csi.v1.Node/NodeGetInfo',
request_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetInfoRequest.SerializeToString,
response_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetInfoResponse.FromString,
)
class NodeServicer(object):
"""Missing associated documentation comment in .proto file."""
def NodeStageVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def NodeUnstageVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def NodePublishVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def NodeUnpublishVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def NodeGetVolumeStats(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def NodeExpandVolume(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def NodeGetCapabilities(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def NodeGetInfo(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_NodeServicer_to_server(servicer, server):
rpc_method_handlers = {
'NodeStageVolume': grpc.unary_unary_rpc_method_handler(
servicer.NodeStageVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeStageVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeStageVolumeResponse.SerializeToString,
),
'NodeUnstageVolume': grpc.unary_unary_rpc_method_handler(
servicer.NodeUnstageVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnstageVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnstageVolumeResponse.SerializeToString,
),
'NodePublishVolume': grpc.unary_unary_rpc_method_handler(
servicer.NodePublishVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodePublishVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodePublishVolumeResponse.SerializeToString,
),
'NodeUnpublishVolume': grpc.unary_unary_rpc_method_handler(
servicer.NodeUnpublishVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnpublishVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnpublishVolumeResponse.SerializeToString,
),
'NodeGetVolumeStats': grpc.unary_unary_rpc_method_handler(
servicer.NodeGetVolumeStats,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetVolumeStatsRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetVolumeStatsResponse.SerializeToString,
),
'NodeExpandVolume': grpc.unary_unary_rpc_method_handler(
servicer.NodeExpandVolume,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeExpandVolumeRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeExpandVolumeResponse.SerializeToString,
),
'NodeGetCapabilities': grpc.unary_unary_rpc_method_handler(
servicer.NodeGetCapabilities,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetCapabilitiesRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetCapabilitiesResponse.SerializeToString,
),
'NodeGetInfo': grpc.unary_unary_rpc_method_handler(
servicer.NodeGetInfo,
request_deserializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetInfoRequest.FromString,
response_serializer=pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetInfoResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'csi.v1.Node', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Node(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def NodeStageVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Node/NodeStageVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.NodeStageVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.NodeStageVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def NodeUnstageVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Node/NodeUnstageVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnstageVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnstageVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def NodePublishVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Node/NodePublishVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.NodePublishVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.NodePublishVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def NodeUnpublishVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Node/NodeUnpublishVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnpublishVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.NodeUnpublishVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def NodeGetVolumeStats(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Node/NodeGetVolumeStats',
pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetVolumeStatsRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetVolumeStatsResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def NodeExpandVolume(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Node/NodeExpandVolume',
pav_dot_csi_dot_spec_dot_csi__pb2.NodeExpandVolumeRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.NodeExpandVolumeResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def NodeGetCapabilities(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Node/NodeGetCapabilities',
pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetCapabilitiesRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetCapabilitiesResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def NodeGetInfo(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/csi.v1.Node/NodeGetInfo',
pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetInfoRequest.SerializeToString,
pav_dot_csi_dot_spec_dot_csi__pb2.NodeGetInfoResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
| 49.546485 | 127 | 0.687849 | 4,160 | 43,700 | 6.817548 | 0.041106 | 0.061352 | 0.046014 | 0.061352 | 0.874123 | 0.874123 | 0.848948 | 0.812665 | 0.811784 | 0.764994 | 0 | 0.005961 | 0.243753 | 43,700 | 881 | 128 | 49.602724 | 0.852215 | 0.052517 | 0 | 0.526596 | 1 | 0 | 0.076444 | 0.042342 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071809 | false | 0 | 0.00266 | 0.031915 | 0.118351 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8586d337663151bbdae3bc7cc9efdde5a0317e12 | 48,058 | py | Python | Bin/actividades.py | mfneirae/GrupLAC-Complete | f4ccefe2553b90015d28df0e8d7730b4bad37d84 | [
"MIT"
] | null | null | null | Bin/actividades.py | mfneirae/GrupLAC-Complete | f4ccefe2553b90015d28df0e8d7730b4bad37d84 | [
"MIT"
] | null | null | null | Bin/actividades.py | mfneirae/GrupLAC-Complete | f4ccefe2553b90015d28df0e8d7730b4bad37d84 | [
"MIT"
] | 1 | 2021-06-10T09:21:18.000Z | 2021-06-10T09:21:18.000Z | #
#
# #############################################################################
# Copyright (c) 2018 Universidad Nacional de Colombia All Rights Reserved.
#
# This work was developed to improve data collection for the
# self-assessment and accreditation processes in the Vicedeanship of
# Academic Affairs of the Engineering Faculty of the Universidad
# Nacional de Colombia, and is licensed under a Creative Commons
# Attribution-NonCommercial-ShareAlike 4.0 International License
# and the MIT License.
#
# by Manuel Embus.
#
# For more information, write to me at jai@mfneirae.com
# or visit my webpage at https://mfneirae.com/
# #############################################################################
#
#
def clc(text):
# Clean a scraped field: strip quotes and newlines, collapse runs of spaces,
# and drop characters outside the whitelist below. Note that ';' is first
# mapped to '|' and '|' is then removed by the whitelist, so semicolons are
# effectively deleted (';' is also the delimiter of the CSV rows built later).
import re
text = re.sub(r'[^A-Za-z0-9:=_?ÁÀÉÈÍÌÓÒÚÙéèáà,éñèíìńúùóò .\-/+]', r'', re.sub(' +', ' ', text.replace('"', "").replace("'", "").strip().replace(";", "|").replace("\r\n", "").replace("\n", "").replace("\r", "")))
if text == ",":
text = "-"
return text
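# e.g. clc('Hola;  "mundo"\r\n') -> 'Hola mundo': quotes and newlines are
# dropped, the ';' is removed, and runs of spaces collapse to one.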
def asesorias_programaextract():
from settings import my_url, coduapa, codhermes, codcolciencias, nombregi, dnilider, my_url, COD_PRODUCTO
import bs4, logging, sys, re, init
global contasesorias_programa
LOG_FILENAME = './Logs/Registros.log'
logging.basicConfig(filename=LOG_FILENAME,level=logging.DEBUG,
format = "%(asctime)s:%(levelname)s:%(message)s")
LEVELS = {'debug': logging.DEBUG,
'info': logging.INFO,
'warning': logging.WARNING,
'error': logging.ERROR,
'critical': logging.CRITICAL}
if len(sys.argv) > 1:
level_name = sys.argv[1]
level = LEVELS.get(level_name, logging.NOTSET)
logging.basicConfig(level=level)
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
uClient = uReq(my_url)
page_html = uClient.read()
uClient.close()
all = 0
a = 0
x = 0
y = 0
page_soup = soup(page_html,"html.parser")
containers = page_soup.findAll("table")
for a in range(0,len(containers)):
buscaasesorias_programa = containers[a].td
#print(buscaasesorias_programa)
try:
if buscaasesorias_programa.text == "Asesorías al Programa Ondas":
all = a
#print(all)
break
except AttributeError:
pass
if all != 0:
containerb = containers[all]
container = containerb.findAll("tr")
for x in range(1, len(container)):
cont = container[x]
info_asesorias_programa = cont.text
tipo = "114"
index1 = info_asesorias_programa.find("- ") + 2
index2 = info_asesorias_programa.find('\n',index1,len(info_asesorias_programa))
nombreart = clc(info_asesorias_programa[index1:index2])
index1 = info_asesorias_programa.find(" en ") + 4
lugar = clc(info_asesorias_programa[index1:index2])
index1 = info_asesorias_programa.find(' desde ',index1,len(info_asesorias_programa)) + 7
index2 = info_asesorias_programa.find(' hasta', index1, len(info_asesorias_programa))
desde = clc(info_asesorias_programa[index1:index2])
anopub = clc(info_asesorias_programa[index1:index1 + 4])
index1 = info_asesorias_programa.find(' hasta ',index1,len(info_asesorias_programa)) + 7
index2 = info_asesorias_programa.find(', \n', index1, len(info_asesorias_programa))
hasta = clc(info_asesorias_programa[index1:index2])
index1 = info_asesorias_programa.find('Nombre de las ferias:',index1,len(info_asesorias_programa)) + 21
index2 = info_asesorias_programa.find('Institución:', index1, len(info_asesorias_programa))
ferias = clc(info_asesorias_programa[index1:index2])
index1 = info_asesorias_programa.find('Institución:',index1,len(info_asesorias_programa)) + 12
index2 = info_asesorias_programa.find('\n', index1, len(info_asesorias_programa))
institucion = clc(info_asesorias_programa[index1:index2])
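# Note: MySQL's REPLACE INTO deletes any existing row with the same primary
# key before inserting, so re-running the scraper upserts rather than
# duplicating records.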
init.REL_GRUPO_PRODUCTO.append( \
"REPLACE INTO `uapa_db`.`REL_GRUPO_PRODUCTO`(`CODGP_PROD`,`CODGP`,`GP_TIPO_PROD`,`Nombre_Producto`,`Lugar`,`Año`,`Idioma`,`Páginas`,`Volumen`,`Editorial`,`Ambito`,`DOI`,`Descripción`,`Instituciones`,`Tipo_Vincula_Institu`,`Autores`) VALUES"
+ "('" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + "'," \
+ tipo + "," \
+ "'" + nombreart + "'," \
+ "null" + "," \
+ anopub + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + institucion + "'," \
+ "null" + "," \
+ "null" \
+ ");\n")
init.REL_GRUPO_PRODUCTO_CSV.append(str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) +";" \
+ tipo +";" \
+ nombreart +";" \
+ "" +";" \
+ anopub +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ institucion +";" \
+ "" +";" \
+ "" +";" \
+ "\n")
init.GP_ACTIVIDADES.append( \
"REPLACE INTO `uapa_db`.`GP_ACTIVIDADES`(`CODGP_PROD_ACT`,`CODGP_PROD`,`Nombre_de_Ferias`,`Fecha_Inicio_Curso`,`Tipo_Orientación`,`Nombre_Estudiante`,`Programa_Académico`,`Divulgacion`,`Valoración`,`Fecha_fin_Curso`,`Finalidad`,`Duración`) VALUES"
+ "('" + str(codcolciencias) + "X" + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "'" + ferias + "'," \
+ "'" + desde + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + hasta + "'," \
+ "null" + "," \
+ "null" \
+ ");\n")
init.GP_ACTIVIDADES_CSV.append(str(codcolciencias) + "X" + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ ferias +";" \
+ desde +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ hasta +";" \
+ "" +";" \
+ "" +";" \
+ "\n")
COD_PRODUCTO += 1
else:
logging.info('Group ' + nombregi + ' has no "Asesorías al Programa Ondas" records')
contasesorias_programa = [COD_PRODUCTO]
def cursos_corta_duracionextract():
from settings import my_url, coduapa, codhermes, codcolciencias, nombregi, dnilider, my_url, COD_PRODUCTO
import bs4, logging, sys, re, init
global contcursos_corta_duracion
LOG_FILENAME = './Logs/Registros.log'
logging.basicConfig(filename=LOG_FILENAME,level=logging.DEBUG,
format = "%(asctime)s:%(levelname)s:%(message)s")
LEVELS = {'debug': logging.DEBUG,
'info': logging.INFO,
'warning': logging.WARNING,
'error': logging.ERROR,
'critical': logging.CRITICAL}
if len(sys.argv) > 1:
level_name = sys.argv[1]
level = LEVELS.get(level_name, logging.NOTSET)
logging.basicConfig(level=level)
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
uClient = uReq(my_url)
page_html = uClient.read()
uClient.close()
all = 0
a = 0
x = 0
y = 0
page_soup = soup(page_html,"html.parser")
containers = page_soup.findAll("table")
for a in range(0,len(containers)):
buscacursos_corta_duracion = containers[a].td
#print(buscacursos_corta_duracion)
try:
if buscacursos_corta_duracion.text == "Curso de Corta Duración Dictados":
all = a
#print(all)
break
except AttributeError:
pass
if all != 0:
containerb = containers[all]
container = containerb.findAll("tr")
for x in range(1, len(container)):
cont = container[x]
info_cursos_corta_duracion = cont.text
index1 = info_cursos_corta_duracion.find("- ") + 2
index2 = info_cursos_corta_duracion.find(':')
tipo = clc(info_cursos_corta_duracion[index1:index2])
# Map the course-type label scraped from the page to its numeric code
if tipo.strip() == "Perfeccionamiento":
tipo = "115"
elif tipo.strip() == "Extensión extracurricular":
tipo = "116"
elif tipo.strip() == "Especialización":
tipo = "122"
elif tipo.strip() == "Otro":
tipo = "118"
else:
logging.critical('Add: ' + tipo + ' to cursos_corta_duracion')
print("ALERT: check the Registros.log file")
index1 = index2 + 2
index2 = info_cursos_corta_duracion.find('\n', index1, len(info_cursos_corta_duracion))
nombreart = clc(info_cursos_corta_duracion[index1:index2])
index1 = index2 + 2
index2 = info_cursos_corta_duracion.find(',', index1, len(info_cursos_corta_duracion))
lugar = clc(info_cursos_corta_duracion[index1:index2])
index1 = index2 + 2
index2 = info_cursos_corta_duracion.find(',', index1, len(info_cursos_corta_duracion))
anopub = clc(info_cursos_corta_duracion[index1:index2])
try:
ano = int(anopub)
except ValueError:
anopub = ""
index1 = info_cursos_corta_duracion.find('Idioma: ',index1,len(info_cursos_corta_duracion)) + 8
index2 = info_cursos_corta_duracion.find(',', index1, len(info_cursos_corta_duracion))
idioma = clc(info_cursos_corta_duracion[index1:index2])
index1 = info_cursos_corta_duracion.find('Medio de divulgación:',index1,len(info_cursos_corta_duracion)) + 21
index2 = info_cursos_corta_duracion.find('\n', index1, len(info_cursos_corta_duracion))
divulgacion = clc(info_cursos_corta_duracion[index1:index2])
index1 = info_cursos_corta_duracion.find('Sitio web:') + 10
index2 = info_cursos_corta_duracion.find(',', index1, len(info_cursos_corta_duracion))
DOI = clc(info_cursos_corta_duracion[index1:index2])
index1 = info_cursos_corta_duracion.find('Participación')
if index1 == -1:
participacion = ""  # field absent on the page; avoids a NameError when the row is built below
else:
index1 = index1 + 13
index2 = info_cursos_corta_duracion.find(',', index1, len(info_cursos_corta_duracion))
participacion = clc(info_cursos_corta_duracion[index1:index2])
index1 = info_cursos_corta_duracion.find('Duración (semanas):') + 19
index2 = info_cursos_corta_duracion.find('Finalidad:', index1, len(info_cursos_corta_duracion))
duracion = clc(info_cursos_corta_duracion[index1:index2])
index1 = info_cursos_corta_duracion.find('Finalidad:') + 10
index2 = info_cursos_corta_duracion.find('\n', index1, len(info_cursos_corta_duracion))
finalidad = clc(info_cursos_corta_duracion[index1:index2])
index1 = info_cursos_corta_duracion.find('Institución financiadora:') + 25
index2 = info_cursos_corta_duracion.find('Autores:', index1, len(info_cursos_corta_duracion))
institucion = clc(info_cursos_corta_duracion[index1:index2])
index1 = info_cursos_corta_duracion.find('Autores:') + 8
index2 = info_cursos_corta_duracion.find('\n', index1, len(info_cursos_corta_duracion))
autores = clc(info_cursos_corta_duracion[index1:index2])
init.REL_GRUPO_PRODUCTO.append( \
"REPLACE INTO `uapa_db`.`REL_GRUPO_PRODUCTO`(`CODGP_PROD`,`CODGP`,`GP_TIPO_PROD`,`Nombre_Producto`,`Lugar`,`Año`,`Idioma`,`Páginas`,`Volumen`,`Editorial`,`Ambito`,`DOI`,`Descripción`,`Instituciones`,`Tipo_Vincula_Institu`,`Autores`) VALUES"
+ "('" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + "'," \
+ tipo + "," \
+ "'" + nombreart + "'," \
+ "'" + lugar + "'," \
+ anopub + "," \
+ "'" + idioma + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + DOI + "'," \
+ "'" + participacion + "'," \
+ "'" + institucion + "'," \
+ "null" + "," \
+ "'" + autores + "'" \
+ ");\n")
init.REL_GRUPO_PRODUCTO_CSV.append(str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) +";" \
+ tipo +";" \
+ nombreart +";" \
+ lugar +";" \
+ anopub +";" \
+ idioma +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ DOI +";" \
+ participacion +";" \
+ institucion +";" \
+ "" +";" \
+ autores +";" \
+ "\n")
init.GP_ACTIVIDADES.append( \
"REPLACE INTO `uapa_db`.`GP_ACTIVIDADES`(`CODGP_PROD_ACT`,`CODGP_PROD`,`Nombre_de_Ferias`,`Fecha_Inicio_Curso`,`Tipo_Orientación`,`Nombre_Estudiante`,`Programa_Académico`,`Divulgacion`,`Valoración`,`Fecha_fin_Curso`,`Finalidad`,`Duración`) VALUES"
+ "('" + str(codcolciencias) + "X" + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + finalidad + "'," \
+ "'" + duracion + "'" \
+ ");\n")
init.GP_ACTIVIDADES_CSV.append(str(codcolciencias) + "X" + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ finalidad +";" \
+ duracion +";" \
+ "\n")
COD_PRODUCTO += 1
else:
logging.info('Group ' + nombregi + ' has no "Curso de Corta Duración Dictados" records')
contcursos_corta_duracion = [COD_PRODUCTO]
def trabajos_dirigidosextract():
from settings import my_url, coduapa, codhermes, codcolciencias, nombregi, dnilider, my_url, COD_PRODUCTO
import bs4, logging, sys, re, init
global conttrabajos_dirigidos
LOG_FILENAME = './Logs/Registros.log'
logging.basicConfig(filename=LOG_FILENAME,level=logging.DEBUG,
format = "%(asctime)s:%(levelname)s:%(message)s")
LEVELS = {'debug': logging.DEBUG,
'info': logging.INFO,
'warning': logging.WARNING,
'error': logging.ERROR,
'critical': logging.CRITICAL}
if len(sys.argv) > 1:
level_name = sys.argv[1]
level = LEVELS.get(level_name, logging.NOTSET)
logging.basicConfig(level=level)
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
uClient = uReq(my_url)
page_html = uClient.read()
uClient.close()
all = 0
a = 0
x = 0
y = 0
page_soup = soup(page_html,"html.parser")
containers = page_soup.findAll("table")
for a in range(0,len(containers)):
buscatrabajos_dirigidos = containers[a].td
#print(buscatrabajos_dirigidos)
try:
if buscatrabajos_dirigidos.text == "Trabajos dirigidos/turorías":
all = a
#print(all)
break
except AttributeError:
pass
if all != 0:
containerb = containers[all]
container = containerb.findAll("tr")
for x in range(1, len(container)):
cont = container[x]
info_trabajos_dirigidos = cont.text
index1 = info_trabajos_dirigidos.find("- ") + 2
index2 = info_trabajos_dirigidos.find(':')
tipo = "117"
index1 = index2 + 2
index2 = info_trabajos_dirigidos.find('\n', index1, len(info_trabajos_dirigidos))
nombreart = clc(info_trabajos_dirigidos[index1:index2])
index1 = info_trabajos_dirigidos.find('Desde', index2, len(info_trabajos_dirigidos)) + 5
index2 = info_trabajos_dirigidos.find('hasta', index1, len(info_trabajos_dirigidos))
desde = clc(info_trabajos_dirigidos[index1:index2])
index1 = info_trabajos_dirigidos.find('hasta', index2, len(info_trabajos_dirigidos)) + 5
index2 = info_trabajos_dirigidos.find(', Tipo de orientación:', index1, len(info_trabajos_dirigidos))
hasta = clc(info_trabajos_dirigidos[index1:index2])
anopub = clc(desde[len(desde)-4:len(desde)])
try:
ano = int(anopub)
except ValueError:
anopub = ""
index1 = info_trabajos_dirigidos.find('Tipo de orientación:',index1,len(info_trabajos_dirigidos)) + 20
index2 = info_trabajos_dirigidos.find('\n', index1, len(info_trabajos_dirigidos))
orientacion = clc(info_trabajos_dirigidos[index1:index2])
index1 = info_trabajos_dirigidos.find('Nombre del estudiante:',index1,len(info_trabajos_dirigidos)) + 22
index2 = info_trabajos_dirigidos.find(',', index1, len(info_trabajos_dirigidos))
estudiante = clc(info_trabajos_dirigidos[index1:index2])
index1 = info_trabajos_dirigidos.find('Programa académico:',index1,len(info_trabajos_dirigidos)) + 19
index2 = info_trabajos_dirigidos.find('\n', index1, len(info_trabajos_dirigidos))
programa = clc(info_trabajos_dirigidos[index1:index2])
index1 = info_trabajos_dirigidos.find('Número de páginas:',index1,len(info_trabajos_dirigidos)) + 18
index2 = info_trabajos_dirigidos.find(',', index1, len(info_trabajos_dirigidos))
paginas = clc(info_trabajos_dirigidos[index1:index2])
index1 = info_trabajos_dirigidos.find('Valoración:',index1,len(info_trabajos_dirigidos)) + 11
index2 = info_trabajos_dirigidos.find('\n', index1, len(info_trabajos_dirigidos))
valoracion = clc(info_trabajos_dirigidos[index1:index2])
index1 = info_trabajos_dirigidos.find('Institución:') + 12
index2 = info_trabajos_dirigidos.find('Autores:', index1, len(info_trabajos_dirigidos))
institucion = clc(info_trabajos_dirigidos[index1:index2])
index1 = info_trabajos_dirigidos.find('Autores:') + 8
index2 = info_trabajos_dirigidos.find('\n', index1, len(info_trabajos_dirigidos))
autores = clc(info_trabajos_dirigidos[index1:index2])
init.REL_GRUPO_PRODUCTO.append( \
"REPLACE INTO `uapa_db`.`REL_GRUPO_PRODUCTO`(`CODGP_PROD`,`CODGP`,`GP_TIPO_PROD`,`Nombre_Producto`,`Lugar`,`Año`,`Idioma`,`Páginas`,`Volumen`,`Editorial`,`Ambito`,`DOI`,`Descripción`,`Instituciones`,`Tipo_Vincula_Institu`,`Autores`) VALUES"
+ "('" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + "'," \
+ tipo + "," \
+ "'" + nombreart + "'," \
+ "null" + "," \
+ anopub + "," \
+ "null" + "," \
+ "'" + paginas + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + institucion + "'," \
+ "null" + "," \
+ "'" + autores + "'" \
+ ");\n")
init.REL_GRUPO_PRODUCTO_CSV.append(str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) +";" \
+ tipo +";" \
+ nombreart +";" \
+ "" +";" \
+ anopub +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ institucion +";" \
+ "" +";" \
+ autores +";" \
+ "\n")
init.GP_ACTIVIDADES.append( \
"REPLACE INTO `uapa_db`.`GP_ACTIVIDADES`(`CODGP_PROD_ACT`,`CODGP_PROD`,`Nombre_de_Ferias`,`Fecha_Inicio_Curso`,`Tipo_Orientación`,`Nombre_Estudiante`,`Programa_Académico`,`Divulgacion`,`Valoración`,`Fecha_fin_Curso`,`Finalidad`,`Duración`) VALUES"
+ "('" + str(codcolciencias) + "X" + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "null" + "," \
+ "null" + "," \
+ "'" + orientacion + "'," \
+ "'" + estudiante + "'," \
+ "'" + programa + "'," \
+ "null" + "," \
+ "'" + valoracion + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null"\
+ ");\n")
init.GP_ACTIVIDADES_CSV.append(str(codcolciencias) + "X" + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ "" +";" \
+ "" +";" \
+ orientacion +";" \
+ estudiante +";" \
+ programa +";" \
+ "" +";" \
+ valoracion +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "\n")
COD_PRODUCTO += 1
else:
logging.info('Group ' + nombregi + ' has no "Trabajos dirigidos/tutorías" records')
conttrabajos_dirigidos = [COD_PRODUCTO]
def jurado_comisionesextract():
from settings import my_url, coduapa, codhermes, codcolciencias, nombregi, dnilider, my_url, COD_PRODUCTO
import bs4, logging, sys, re, init
global contjurado_comisiones
LOG_FILENAME = './Logs/Registros.log'
logging.basicConfig(filename=LOG_FILENAME,level=logging.DEBUG,
format = "%(asctime)s:%(levelname)s:%(message)s")
LEVELS = {'debug': logging.DEBUG,
'info': logging.INFO,
'warning': logging.WARNING,
'error': logging.ERROR,
'critical': logging.CRITICAL}
if len(sys.argv) > 1:
level_name = sys.argv[1]
level = LEVELS.get(level_name, logging.NOTSET)
logging.basicConfig(level=level)
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
uClient = uReq(my_url)
page_html = uClient.read()
uClient.close()
all = 0
a = 0
x = 0
y = 0
page_soup = soup(page_html,"html.parser")
containers = page_soup.findAll("table")
for a in range(0,len(containers)):
buscajurado_comisiones = containers[a].td
#print(buscajurado_comisiones)
try:
if buscajurado_comisiones.text == "Jurado/Comisiones evaluadoras de trabajo de grado":
all = a
#print(all)
break
except AttributeError:
pass
if all != 0:
containerb = containers[all]
container = containerb.findAll("tr")
for x in range(1, len(container)):
cont = container[x]
info_jurado_comisiones = cont.text
index1 = info_jurado_comisiones.find("- ") + 2
index2 = info_jurado_comisiones.find(':')
tipo = clc(info_jurado_comisiones[index1:index2])
if tipo.strip() == "Pregrado":
tipo = "85"
elif tipo.strip() == "Especialización":
tipo = "86"
elif tipo.strip() == "Especialidad Médica":
tipo = "87"
elif tipo.strip() == "Maestría":
tipo = "88"
elif tipo.strip() == "Doctorado":
tipo = "89"
elif tipo.strip() == "Otra":
tipo = "90"
elif tipo.strip() == "Curso de perfeccionamiento/especialización":
tipo = "91"
else:
logging.critical('Add: ' + tipo + ' to jurado_comisiones')
print("ALERT: check the Registros.log file")
index1 = index2 + 2
index2 = info_jurado_comisiones.find('\n', index1, len(info_jurado_comisiones))
nombreart = clc(info_jurado_comisiones[index1:index2])
index1 = index2 + 2
index2 = info_jurado_comisiones.find(',', index1, len(info_jurado_comisiones))
lugar = clc(info_jurado_comisiones[index1:index2])
index1 = index2 + 2
index2 = info_jurado_comisiones.find(',', index1, len(info_jurado_comisiones))
anopub = clc(info_jurado_comisiones[index1:index2])
try:
ano = int(anopub)
except ValueError:
anopub = ""
index1 = info_jurado_comisiones.find('Idioma:', index2, len(info_jurado_comisiones)) + 7
index2 = info_jurado_comisiones.find(',', index1, len(info_jurado_comisiones))
idioma = clc(info_jurado_comisiones[index1:index2])
index1 = info_jurado_comisiones.find('Medio de divulgación:', index2, len(info_jurado_comisiones)) + 21
index2 = info_jurado_comisiones.find('\n', index1, len(info_jurado_comisiones))
divulgacion = clc(info_jurado_comisiones[index1:index2])
index1 = info_jurado_comisiones.find('Sitio web:', index2, len(info_jurado_comisiones)) + 10
index2 = info_jurado_comisiones.find(',', index1, len(info_jurado_comisiones))
DOI = clc(info_jurado_comisiones[index1:index2])
index1 = info_jurado_comisiones.find('Nombre del orientado:',index1,len(info_jurado_comisiones)) + 21
index2 = info_jurado_comisiones.find('\n', index1, len(info_jurado_comisiones))
estudiante = clc(info_jurado_comisiones[index1:index2])
index1 = info_jurado_comisiones.find('Programa académico:',index1,len(info_jurado_comisiones)) + 19
index2 = info_jurado_comisiones.find(',', index1, len(info_jurado_comisiones))
programa = clc(info_jurado_comisiones[index1:index2])
index1 = info_jurado_comisiones.find('Institución:') + 12
index2 = info_jurado_comisiones.find('Autores:', index1, len(info_jurado_comisiones))
institucion = clc(info_jurado_comisiones[index1:index2])
index1 = info_jurado_comisiones.find('Autores:') + 8
index2 = info_jurado_comisiones.find('\n', index1, len(info_jurado_comisiones))
autores = clc(info_jurado_comisiones[index1:index2])
init.REL_GRUPO_PRODUCTO.append( \
"REPLACE INTO `uapa_db`.`REL_GRUPO_PRODUCTO`(`CODGP_PROD`,`CODGP`,`GP_TIPO_PROD`,`Nombre_Producto`,`Lugar`,`Año`,`Idioma`,`Páginas`,`Volumen`,`Editorial`,`Ambito`,`DOI`,`Descripción`,`Instituciones`,`Tipo_Vincula_Institu`,`Autores`) VALUES"
+ "('" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + "'," \
+ tipo + "," \
+ "'" + nombreart + "'," \
+ "'" + lugar + "'," \
+ anopub + "," \
+ "'" + idioma + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + DOI + "'," \
+ "null" + "," \
+ "'" + institucion + "'," \
+ "null" + "," \
+ "'" + autores + "'" \
+ ");\n")
init.REL_GRUPO_PRODUCTO_CSV.append(str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) +";" \
+ tipo +";" \
+ nombreart +";" \
+ lugar +";" \
+ anopub +";" \
+ idioma +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ DOI +";" \
+ "" +";" \
+ institucion +";" \
+ "" +";" \
+ autores +";" \
+ "\n")
init.GP_ACTIVIDADES.append( \
"REPLACE INTO `uapa_db`.`GP_ACTIVIDADES`(`CODGP_PROD_ACT`,`CODGP_PROD`,`Nombre_de_Ferias`,`Fecha_Inicio_Curso`,`Tipo_Orientación`,`Nombre_Estudiante`,`Programa_Académico`,`Divulgacion`,`Valoración`,`Fecha_fin_Curso`,`Finalidad`,`Duración`) VALUES"
+ "('" + str(codcolciencias) + "X" + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + estudiante + "'," \
+ "'" + programa + "'," \
+ "'" + divulgacion + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null"\
+ ");\n")
init.GP_ACTIVIDADES_CSV.append(str(codcolciencias) + "X" + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ estudiante +";" \
+ programa +";" \
+ divulgacion +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "\n")
COD_PRODUCTO += 1
else:
logging.info(' El Grupo: ' + nombregi + ' no tiene Jurados/Comisiones Evaluadoras de Trabajo de Grado Asociadas')
contjurado_comisiones = [COD_PRODUCTO]
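# Scrapes the "Participación en comités de evaluación" table from the group's
# page; the tipo codes 78-84 map the committee type onto GP_TIPO_PROD values.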
def comites_evaluacionextract():
from settings import my_url, coduapa, codhermes, codcolciencias, nombregi, dnilider, COD_PRODUCTO
import bs4, logging, sys, re, init
global contcomites_evaluacion
LOG_FILENAME = './Logs/Registros.log'
logging.basicConfig(filename=LOG_FILENAME,level=logging.DEBUG,
format = "%(asctime)s:%(levelname)s:%(message)s")
LEVELS = {'debug': logging.DEBUG,
'info': logging.INFO,
'warning': logging.WARNING,
'error': logging.ERROR,
'critical': logging.CRITICAL}
if len(sys.argv) > 1:
level_name = sys.argv[1]
level = LEVELS.get(level_name, logging.NOTSET)
logging.basicConfig(level=level)
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
uClient = uReq(my_url)
page_html = uClient.read()
uClient.close()
all = 0
a = 0
x = 0
y = 0
page_soup = soup(page_html,"html.parser")
containers = page_soup.findAll("table")
for a in range(0,len(containers)):
buscacomites_evaluacion = containers[a].td
#print(buscacomites_evaluacion)
try:
if buscacomites_evaluacion.text == "Participación en comités de evaluación":
all = a
#print(all)
break
except AttributeError:
pass
if all != 0:
containerb = containers[all]
container = containerb.findAll("tr")
for x in range(1, len(container)):
cont = container[x]
info_comites_evaluacion = cont.text
index1 = info_comites_evaluacion.find("- ") + 2
index2 = info_comites_evaluacion.find(':')
tipo = clc(info_comites_evaluacion[index1:index2])
if tipo.strip() == "Profesor titular":
tipo = "78"
elif tipo.strip() == "Concurso docente":
tipo = "79"
elif tipo.strip() == "Jefe de cátedra":
tipo = "80"
elif tipo.strip() == "Evaluación de cursos":
tipo = "81"
elif tipo.strip() == "Acreditación de programas":
tipo = "82"
elif tipo.strip() == "Asignación de becas":
tipo = "83"
elif tipo.strip() == "Otra":
tipo = "84"
else:
logging.critical('Añadir: ' + tipo + ' a comites_evaluacion')
print ("ALERTA: Revisar el archivo Registros.log")
index1 = index2 + 2
index2 = info_comites_evaluacion.find('\n', index1, len(info_comites_evaluacion))
nombreart = clc(info_comites_evaluacion[index1:index2])
index1 = index2 + 2
index2 = info_comites_evaluacion.find(',', index1, len(info_comites_evaluacion))
lugar = clc(info_comites_evaluacion[index1:index2])
index1 = index2 + 2
index2 = info_comites_evaluacion.find(',', index1, len(info_comites_evaluacion))
anopub = clc(info_comites_evaluacion[index1:index2])
try:
ano = int(anopub)
except ValueError:
anopub = ""
index1 = info_comites_evaluacion.find('Sitio web:', index2, len(info_comites_evaluacion)) + 10
index2 = info_comites_evaluacion.find('Medio de divulgación:', index1, len(info_comites_evaluacion))
DOI = clc(info_comites_evaluacion[index1:index2])
index1 = info_comites_evaluacion.find('Medio de divulgación:', index2, len(info_comites_evaluacion)) + 21
index2 = info_comites_evaluacion.find('\n', index1, len(info_comites_evaluacion))
divulgacion = clc(info_comites_evaluacion[index1:index2])
index1 = info_comites_evaluacion.find('Institución:') + 12
index2 = info_comites_evaluacion.find('Autores:', index1, len(info_comites_evaluacion))
institucion = clc(info_comites_evaluacion[index1:index2])
index1 = info_comites_evaluacion.find('Autores:') + 8
index2 = info_comites_evaluacion.find('\n', index1, len(info_comites_evaluacion))
autores = clc(info_comites_evaluacion[index1:index2])
init.REL_GRUPO_PRODUCTO.append( \
"REPLACE INTO `uapa_db`.`REL_GRUPO_PRODUCTO`(`CODGP_PROD`,`CODGP`,`GP_TIPO_PROD`,`Nombre_Producto`,`Lugar`,`Año`,`Idioma`,`Páginas`,`Volumen`,`Editorial`,`Ambito`,`DOI`,`Descripción`,`Instituciones`,`Tipo_Vincula_Institu`,`Autores`) VALUES"
+ "('" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + "'," \
+ tipo + "," \
+ "'" + nombreart + "'," \
+ "'" + lugar + "'," \
+ anopub + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + DOI + "'," \
+ "null" + "," \
+ "'" + institucion + "'," \
+ "null" + "," \
+ "'" + autores + "'" \
+ ");\n")
init.REL_GRUPO_PRODUCTO_CSV.append(str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) +";" \
+ tipo +";" \
+ nombreart +";" \
+ lugar +";" \
+ anopub +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ DOI +";" \
+ "" +";" \
+ institucion +";" \
+ "" +";" \
+ autores +";" \
+ "\n")
init.GP_ACTIVIDADES.append( \
"REPLACE INTO `uapa_db`.`GP_ACTIVIDADES`(`CODGP_PROD_ACT`,`CODGP_PROD`,`Nombre_de_Ferias`,`Fecha_Inicio_Curso`,`Tipo_Orientación`,`Nombre_Estudiante`,`Programa_Académico`,`Divulgacion`,`Valoración`,`Fecha_fin_Curso`,`Finalidad`,`Duración`) VALUES"
+ "('" + str(codcolciencias) + "X" + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + divulgacion + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null"\
+ ");\n")
init.GP_ACTIVIDADES_CSV.append(str(codcolciencias) + "X" + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ divulgacion +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "\n")
COD_PRODUCTO += 1
else:
logging.info(' El Grupo: ' + nombregi + ' no tiene Participaciones en Comités de Evaluación Asociadas')
contcomites_evaluacion = [COD_PRODUCTO]
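# Scrapes the "Demás trabajos" table from the group's page; every entry uses
# the fixed product type "70" in REL_GRUPO_PRODUCTO.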
def demas_trabajosextract():
from settings import my_url, coduapa, codhermes, codcolciencias, nombregi, dnilider, COD_PRODUCTO
import bs4, logging, sys, re, init
global contdemas_trabajos
LOG_FILENAME = './Logs/Registros.log'
logging.basicConfig(filename=LOG_FILENAME,level=logging.DEBUG,
format = "%(asctime)s:%(levelname)s:%(message)s")
LEVELS = {'debug': logging.DEBUG,
'info': logging.INFO,
'warning': logging.WARNING,
'error': logging.ERROR,
'critical': logging.CRITICAL}
if len(sys.argv) > 1:
level_name = sys.argv[1]
level = LEVELS.get(level_name, logging.NOTSET)
logging.basicConfig(level=level)
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
uClient = uReq(my_url)
page_html = uClient.read()
uClient.close()
all = 0
a = 0
x = 0
y = 0
page_soup = soup(page_html,"html.parser")
containers = page_soup.findAll("table")
for a in range(0,len(containers)):
buscademas_trabajos = containers[a].td
#print(buscademas_trabajos)
try:
if buscademas_trabajos.text == "Demás trabajos":
all = a
#print(all)
break
except AttributeError:
pass
if all != 0:
containerb = containers[all]
container = containerb.findAll("tr")
for x in range(1, len(container)):
cont = container[x]
info_demas_trabajos = cont.text
index1 = info_demas_trabajos.find("- ") + 2
index2 = info_demas_trabajos.find(':')
tipo = "70"
index1 = index2 + 2
index2 = info_demas_trabajos.find('\n', index1, len(info_demas_trabajos))
nombreart = clc(info_demas_trabajos[index1:index2])
index1 = index2 + 2
index2 = info_demas_trabajos.find(',', index1, len(info_demas_trabajos))
lugar = clc(info_demas_trabajos[index1:index2])
index1 = index2 + 2
index2 = info_demas_trabajos.find(',', index1, len(info_demas_trabajos))
anopub = clc(info_demas_trabajos[index1:index2])
try:
ano = int(anopub)
except ValueError:
anopub = ""
index1 = info_demas_trabajos.find('Idioma:', index2, len(info_demas_trabajos)) + 7
index2 = info_demas_trabajos.find(',', index1, len(info_demas_trabajos))
idioma = clc(info_demas_trabajos[index1:index2])
index1 = info_demas_trabajos.find('Medio de divulgación:', index2, len(info_demas_trabajos)) + 21
index2 = info_demas_trabajos.find('\n', index1, len(info_demas_trabajos))
divulgacion = clc(info_demas_trabajos[index1:index2])
index1 = info_demas_trabajos.find('Autores:') + 8
index2 = info_demas_trabajos.find('\n', index1, len(info_demas_trabajos))
autores = clc(info_demas_trabajos[index1:index2])
init.REL_GRUPO_PRODUCTO.append( \
"REPLACE INTO `uapa_db`.`REL_GRUPO_PRODUCTO`(`CODGP_PROD`,`CODGP`,`GP_TIPO_PROD`,`Nombre_Producto`,`Lugar`,`Año`,`Idioma`,`Páginas`,`Volumen`,`Editorial`,`Ambito`,`DOI`,`Descripción`,`Instituciones`,`Tipo_Vincula_Institu`,`Autores`) VALUES"
+ "('" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + "'," \
+ tipo + "," \
+ "'" + nombreart + "'," \
+ "'" + lugar + "'," \
+ anopub + "," \
+ "'" + idioma + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + autores + "'" \
+ ");\n")
init.REL_GRUPO_PRODUCTO_CSV.append(str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) +";" \
+ tipo +";" \
+ nombreart +";" \
+ lugar +";" \
+ anopub +";" \
+ idioma +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ autores +";" \
+ "\n")
init.GP_ACTIVIDADES.append( \
"REPLACE INTO `uapa_db`.`GP_ACTIVIDADES`(`CODGP_PROD_ACT`,`CODGP_PROD`,`Nombre_de_Ferias`,`Fecha_Inicio_Curso`,`Tipo_Orientación`,`Nombre_Estudiante`,`Programa_Académico`,`Divulgacion`,`Valoración`,`Fecha_fin_Curso`,`Finalidad`,`Duración`) VALUES"
+ "('" + str(codcolciencias) + "X" + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + divulgacion + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null"\
+ ");\n")
init.GP_ACTIVIDADES_CSV.append(str(codcolciencias) + "X" + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ divulgacion +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "\n")
COD_PRODUCTO += 1
else:
logging.info(' El Grupo: ' + nombregi + ' no tiene Demás Trabajos Asociados')
contdemas_trabajos = [COD_PRODUCTO]
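# Scrapes the "Proyectos" table from the group's page; every entry uses the
# fixed product type "119", and the project's start/end dates are stored on
# the matching GP_ACTIVIDADES record.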
def proyectos_grupoextract():
from settings import my_url, coduapa, codhermes, codcolciencias, nombregi, dnilider, COD_PRODUCTO
import bs4, logging, sys, re, init
global contproyectos_grupo
LOG_FILENAME = './Logs/Registros.log'
logging.basicConfig(filename=LOG_FILENAME,level=logging.DEBUG,
format = "%(asctime)s:%(levelname)s:%(message)s")
LEVELS = {'debug': logging.DEBUG,
'info': logging.INFO,
'warning': logging.WARNING,
'error': logging.ERROR,
'critical': logging.CRITICAL}
if len(sys.argv) > 1:
level_name = sys.argv[1]
level = LEVELS.get(level_name, logging.NOTSET)
logging.basicConfig(level=level)
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
uClient = uReq(my_url)
page_html = uClient.read()
uClient.close()
all = 0
a = 0
x = 0
y = 0
page_soup = soup(page_html,"html.parser")
containers = page_soup.findAll("table")
for a in range(0,len(containers)):
buscaproyectos_grupo = containers[a].td
#print(buscaproyectos_grupo)
try:
if buscaproyectos_grupo.text == "Proyectos":
all = a
#print(all)
break
except AttributeError:
pass
if all != 0:
containerb = containers[all]
container = containerb.findAll("tr")
for x in range(1, len(container)):
cont = container[x]
info_proyectos_grupo = cont.text
index1 = info_proyectos_grupo.find("- ") + 2
index2 = info_proyectos_grupo.find(':')
tipo = "119"
index1 = index2 + 2
index2 = info_proyectos_grupo.find('\n', index1, len(info_proyectos_grupo))
nombreart = clc(info_proyectos_grupo[index1:index2])
index1 = index2 + 2
index2 = info_proyectos_grupo.find(' - \n', index1, len(info_proyectos_grupo))
desde = clc(info_proyectos_grupo[index1:index2])
anopub = clc(desde[0:4])
try:
ano = int(anopub)
except ValueError:
anopub = ""
index1 = index2
index2 = len(info_proyectos_grupo)
hasta = clc(info_proyectos_grupo[index1:index2])
if len(hasta) > 10:
hasta = "-"
init.REL_GRUPO_PRODUCTO.append( \
"REPLACE INTO `uapa_db`.`REL_GRUPO_PRODUCTO`(`CODGP_PROD`,`CODGP`,`GP_TIPO_PROD`,`Nombre_Producto`,`Lugar`,`Año`,`Idioma`,`Páginas`,`Volumen`,`Editorial`,`Ambito`,`DOI`,`Descripción`,`Instituciones`,`Tipo_Vincula_Institu`,`Autores`) VALUES"
+ "('" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + "'," \
+ tipo + "," \
+ "'" + nombreart + "'," \
+ "null" + "," \
+ anopub + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" \
+ ");\n")
init.REL_GRUPO_PRODUCTO_CSV.append(str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) +";" \
+ tipo +";" \
+ nombreart +";" \
+ "" +";" \
+ anopub +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "\n")
init.GP_ACTIVIDADES.append( \
"REPLACE INTO `uapa_db`.`GP_ACTIVIDADES`(`CODGP_PROD_ACT`,`CODGP_PROD`,`Nombre_de_Ferias`,`Fecha_Inicio_Curso`,`Tipo_Orientación`,`Nombre_Estudiante`,`Programa_Académico`,`Divulgacion`,`Valoración`,`Fecha_fin_Curso`,`Finalidad`,`Duración`) VALUES"
+ "('" + str(codcolciencias) + "X" + str(COD_PRODUCTO) + "',"\
+ "'" + str(codcolciencias) + str(COD_PRODUCTO) + "',"\
+ "null" + "," \
+ "'" + desde + "'," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "null" + "," \
+ "'" + hasta + "'," \
+ "null" + "," \
+ "null"\
+ ");\n")
init.GP_ACTIVIDADES_CSV.append(str(codcolciencias) + "X" + str(COD_PRODUCTO) +";" \
+ str(codcolciencias) + str(COD_PRODUCTO) +";" \
+ "" +";" \
+ desde +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ "" +";" \
+ hasta +";" \
+ "" +";" \
+ "" +";" \
+ "\n")
COD_PRODUCTO += 1
else:
logging.info(' El Grupo: ' + nombregi + ' no tiene Proyectos Asociados')
contproyectos_grupo = [COD_PRODUCTO]
| 44.663569 | 259 | 0.516813 | 4,367 | 48,058 | 5.473781 | 0.076712 | 0.025435 | 0.034806 | 0.045223 | 0.850025 | 0.80656 | 0.763136 | 0.748829 | 0.745482 | 0.743516 | 0 | 0.017037 | 0.32338 | 48,058 | 1,075 | 260 | 44.705116 | 0.718086 | 0.018186 | 0 | 0.77832 | 0 | 0.013672 | 0.150485 | 0.073508 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007813 | false | 0.007813 | 0.02832 | 0 | 0.037109 | 0.00293 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
85922641fe44e134806cd8d236afb50f926678a9 | 927 | py | Python | dsrp/api/serializers.py | ZurMaD/restoration-old-videos-web | 86b69256a29acd67353d0559f6c1fef5ce807b81 | [
"MIT"
] | 2 | 2021-01-13T02:39:18.000Z | 2021-12-11T04:28:51.000Z | dsrp/api/serializers.py | ZurMaD/restoration-old-videos-web | 86b69256a29acd67353d0559f6c1fef5ce807b81 | [
"MIT"
] | null | null | null | dsrp/api/serializers.py | ZurMaD/restoration-old-videos-web | 86b69256a29acd67353d0559f6c1fef5ce807b81 | [
"MIT"
] | null | null | null | # serializers.py
from rest_framework import serializers
# from api.models import Hero
# from .models import ApiMedicHypertable
# class HeroSerializer(serializers.HyperlinkedModelSerializer):
# class Meta:
# model = Hero
# fields = ('name', 'alias')
# class medichypertableSerializer(serializers.HyperlinkedModelSerializer):
# class Meta:
# model = ApiMedicHypertable
# fields = ('time', 'kit_id','pres_card','frec_resp','temp_corp','caidas')
# class lastmedichypertableSerializer(serializers.HyperlinkedModelSerializer):
# class Meta:
# model = ApiMedicHypertable
# fields = ('time', 'kit_id','pres_card','frec_resp','temp_corp','caidas')
# class lastmedicbykitidhypertableSerializer(serializers.HyperlinkedModelSerializer):
# class Meta:
# model = ApiMedicHypertable
# fields = ('time', 'kit_id','pres_card','frec_resp','temp_corp','caidas') | 37.08 | 85 | 0.704423 | 84 | 927 | 7.619048 | 0.369048 | 0.23125 | 0.2625 | 0.2875 | 0.629688 | 0.55 | 0.55 | 0.55 | 0.55 | 0.55 | 0 | 0 | 0.1726 | 927 | 25 | 86 | 37.08 | 0.83442 | 0.911543 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
85a49607e73c1a804c57f48e19e1df08d473754f | 180 | py | Python | cobrakbase/core/kbaseclassifier/__init__.py | Fxe/cobrakbase | a41c142b0808b4ded16b400167c70b2466eebd85 | [
"MIT"
] | 3 | 2018-11-28T12:48:54.000Z | 2022-02-28T22:20:32.000Z | cobrakbase/core/kbaseclassifier/__init__.py | Fxe/cobrakbase | a41c142b0808b4ded16b400167c70b2466eebd85 | [
"MIT"
] | 2 | 2020-06-26T20:13:16.000Z | 2020-10-27T05:10:34.000Z | cobrakbase/core/kbaseclassifier/__init__.py | Fxe/cobrakbase | a41c142b0808b4ded16b400167c70b2466eebd85 | [
"MIT"
] | 1 | 2020-09-02T17:40:34.000Z | 2020-09-02T17:40:34.000Z | from cobrakbase.core.kbaseclassifier.genomeclassifiertrainingset import GenomeClassifierTrainingSet
from cobrakbase.core.kbaseclassifier.genomeclassifier import GenomeClassifier
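# Optional explicit re-export list for the package (mirrors the imports above):
__all__ = ["GenomeClassifierTrainingSet", "GenomeClassifier"]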
| 60 | 100 | 0.911111 | 14 | 180 | 11.714286 | 0.5 | 0.170732 | 0.219512 | 0.402439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 180 | 2 | 101 | 90 | 0.964706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
a455e018fadef8eb4525acc972c843d02346deb5 | 2,082 | py | Python | src/genie/libs/parser/iosxe/tests/ShowRadiusServerGroupAll/cli/equal/golden_output_1_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 204 | 2018-06-27T00:55:27.000Z | 2022-03-06T21:12:18.000Z | src/genie/libs/parser/iosxe/tests/ShowRadiusServerGroupAll/cli/equal/golden_output_1_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 468 | 2018-06-19T00:33:18.000Z | 2022-03-31T23:23:35.000Z | src/genie/libs/parser/iosxe/tests/ShowRadiusServerGroupAll/cli/equal/golden_output_1_expected.py | balmasea/genieparser | d1e71a96dfb081e0a8591707b9d4872decd5d9d3 | [
"Apache-2.0"
] | 309 | 2019-01-16T20:21:07.000Z | 2022-03-30T12:56:41.000Z | expected_output = {
"radius": {
"memlocks": 1,
"server": {
"10.106.26.213": {
"acct_port": 1813,
"auth_port": 1812,
"auto_test_enabled": False,
"keywrap_enabled": False,
"server_name": "mgmt-rad",
"transactions": {"acct": 0, "authen": 0, "author": 0},
},
"121.0.0.1": {
"acct_port": 1813,
"auth_port": 1812,
"auto_test_enabled": False,
"keywrap_enabled": False,
"server_name": "data-rad",
"transactions": {"acct": 0, "authen": 0, "author": 0},
},
"44AA::1": {
"acct_port": 1813,
"auth_port": 1812,
"auto_test_enabled": False,
"keywrap_enabled": False,
"server_name": "ipv6-rad",
"transactions": {"acct": 0, "authen": 0, "author": 0},
},
},
"sg_unconfigured": False,
"sharecount": 1,
"type": "standard",
},
"radius-1": {
"memlocks": 1,
"server": {
"10.106.26.213": {
"acct_port": 1813,
"auth_port": 1812,
"auto_test_enabled": False,
"keywrap_enabled": False,
"server_name": "mgmt-rad",
"transactions": {"acct": 4, "authen": 0, "author": 0},
}
},
"sg_unconfigured": False,
"sharecount": 1,
"type": "standard",
},
"radius-2": {
"memlocks": 1,
"server": {
"121.0.0.1": {
"acct_port": 1813,
"auth_port": 1812,
"auto_test_enabled": False,
"keywrap_enabled": False,
"server_name": "data-rad",
"transactions": {"acct": 0, "authen": 0, "author": 0},
}
},
"sg_unconfigured": False,
"sharecount": 1,
"type": "standard",
},
}
| 31.074627 | 70 | 0.399135 | 175 | 2,082 | 4.554286 | 0.217143 | 0.150565 | 0.075282 | 0.100376 | 0.942284 | 0.942284 | 0.942284 | 0.942284 | 0.923463 | 0.923463 | 0 | 0.08476 | 0.439001 | 2,082 | 66 | 71 | 31.545455 | 0.597603 | 0 | 0 | 0.712121 | 0 | 0 | 0.341499 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a4657096aca8b39e4a5b6f340d3e74577de1d2c2 | 2,679 | py | Python | tests/test_11.py | NJAPe/advent-of-code-2018 | 9fc4b8c6b32518fc39331465cf7773674081c7df | [
"MIT"
] | 1 | 2018-12-25T13:08:38.000Z | 2018-12-25T13:08:38.000Z | tests/test_11.py | NJAPe/advent-of-code-2018 | 9fc4b8c6b32518fc39331465cf7773674081c7df | [
"MIT"
] | null | null | null | tests/test_11.py | NJAPe/advent-of-code-2018 | 9fc4b8c6b32518fc39331465cf7773674081c7df | [
"MIT"
] | null | null | null | import advent_of_code.puzzle_11 as p11
from nose.tools import assert_equal
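# Reference sketch of the fuel-cell power rule these tests exercise (Advent of
# Code 2018, day 11); p11.calc_power_level is assumed to implement this formula:
#
#     def calc_power_level(x, y, serial):
#         rack_id = x + 10
#         power = (rack_id * y + serial) * rack_id
#         return (power // 100) % 10 - 5  # hundreds digit, shifted down by 5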
def test_calc_power_level_sample_1():
power_level = p11.calc_power_level(3, 5, 8)
assert_equal(power_level, 4, "Wrong power level")
def test_calc_power_level_sample_2():
power_level = p11.calc_power_level(122, 79, 57)
assert_equal(power_level, -5, "Wrong power level")
def test_calc_power_level_sample_3():
power_level = p11.calc_power_level(217, 196, 39)
assert_equal(power_level, 0, "Wrong power level")
def test_calc_power_level_sample_4():
power_level = p11.calc_power_level(101, 153, 71)
assert_equal(power_level, 4, "Wrong power level")
def test_find_largest_3_by_3_sample_1():
max_power = p11.find_largest_3_by_3_area(301, 301, 18)
assert_equal(max_power[0], 33, "Wrong x-position of top-left corner")
assert_equal(max_power[1], 45, "Wrong y-position of top-left corner")
assert_equal(max_power[2], 29, "Wrong max-power value")
def test_find_largest_3_by_3_sample_2():
max_power = p11.find_largest_3_by_3_area(301, 301, 42)
assert_equal(max_power[0], 21, "Wrong x-position of top-left corner")
assert_equal(max_power[1], 61, "Wrong y-position of top-left corner")
assert_equal(max_power[2], 30, "Wrong max-power value")
def test_find_largest_3_by_3_real():
max_power = p11.find_largest_3_by_3_area(301, 301, 8444)
assert_equal(max_power[0], 243, "Wrong x-position of top-left corner")
assert_equal(max_power[1], 68, "Wrong y-position of top-left corner")
assert_equal(max_power[2], 28, "Wrong max-power value")
def test_find_largest_any_sample_1():
max_power = p11.find_largest_square(300, 300, 18)
assert_equal(max_power[0], 90, "Wrong x-position of top-left corner")
assert_equal(max_power[1], 269, "Wrong y-position of top-left corner")
assert_equal(max_power[2], 16, "Wrong square size")
assert_equal(max_power[3], 113, "Wrong max-power value")
def test_find_largest_any_sample_2():
max_power = p11.find_largest_square(300, 300, 42)
assert_equal(max_power[0], 232, "Wrong x-position of top-left corner")
assert_equal(max_power[1], 251, "Wrong y-position of top-left corner")
assert_equal(max_power[2], 12, "Wrong square size")
assert_equal(max_power[3], 119, "Wrong max-power value")
def test_find_largest_any_real():
max_power = p11.find_largest_square(300, 300, 8444)
assert_equal(max_power[0], 236, "Wrong x-position of top-left corner")
assert_equal(max_power[1], 252, "Wrong y-position of top-left corner")
assert_equal(max_power[2], 12, "Wrong square size")
assert_equal(max_power[3], 96, "Wrong max-power value")
| 38.826087 | 85 | 0.736842 | 465 | 2,679 | 3.926882 | 0.167742 | 0.148959 | 0.161008 | 0.21851 | 0.88609 | 0.88609 | 0.733297 | 0.731106 | 0.641292 | 0.562979 | 0 | 0.085989 | 0.14483 | 2,679 | 68 | 86 | 39.397059 | 0.711043 | 0 | 0 | 0.085106 | 0 | 0 | 0.252427 | 0 | 0 | 0 | 0 | 0 | 0.553191 | 1 | 0.212766 | false | 0 | 0.042553 | 0 | 0.255319 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a470bffca995a0396210713e873b5c30b68031b2 | 8,947 | py | Python | src/reassessing_the_ins_and_outs_of_unemployment/match_extract.py | caibengbu/ecma33330_proj | e73c9a6566e1c6ab760736f9a9820f4bcec663dd | [
"MIT"
] | 5 | 2021-04-12T20:28:22.000Z | 2022-03-30T02:51:14.000Z | src/reassessing_the_ins_and_outs_of_unemployment/match_extract.py | caibengbu/ecma33330_proj | e73c9a6566e1c6ab760736f9a9820f4bcec663dd | [
"MIT"
] | 5 | 2021-04-19T02:50:47.000Z | 2021-11-18T12:49:44.000Z | src/reassessing_the_ins_and_outs_of_unemployment/match_extract.py | caibengbu/ecma33330_proj | e73c9a6566e1c6ab760736f9a9820f4bcec663dd | [
"MIT"
] | null | null | null | import pandas as pd
import string
import os
import numpy as np
from .other_utils import months_interval, get_raw_filename, get_extracted_pickle_filename
def match_extract(date,theDir):
print("extracting data from raw (" + date +")")
filename = get_raw_filename(theDir,date)
date = int(date)
hrsersuf_dict = dict(zip(string.ascii_uppercase, range(1,27)))
hrsersuf_dict['-1'] = 0
hrsersuf_dict['Z'] = 25
hrsersuf_dict['Y'] = 26
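# CPS basic monthly files changed their fixed-width layout over the years;
# each branch below holds the (start, end) column spans for one era of files.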
if date <= 197712:
data_dict = {
"hh": (3,8),
"hh1": (8,12),
"hh2": (24,26),
"line": (93,95),
"mis": (1,2),
"age": (96,98),
"race": (99,100),
"sex": (100,101),
"status": (108,109),
"dur": (65,67),
"fweight": (120,132),
"educ": (102,104),
"grade": (104,105),
"mar": (98,99),
"ind": (87,90),
"occu": (90,93)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
df = df.replace('[^0-9]', np.nan, regex=True).astype("float64")
df['hh3'] = df['hh']*1000000 + df['hh1']*100 + df['hh2']
df['educ1'] = df['educ'] - df['grade'] + 1
df = df.drop(['hh','hh1','hh2','educ'], axis=1)
df = df.rename(columns={'hh3':'hh','educ1':'educ'})
elif date <= 198212:
data_dict = {
"hh": (3,15),
"line": (93,95),
"mis": (1,2),
"age": (96,98),
"race": (99,100),
"sex": (100,101),
"status": (108,109),
"dur": (65,67),
"fweight": (120,132),
"educ": (102,104),
"grade": (104,105),
"mar": (98,99),
"ind": (87,90),
"occu": (90,93)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
df = df.replace('[^0-9]', np.nan, regex=True).astype("float64")
df['educ1'] = df['educ'] - df['grade'] + 1
df = df.drop(['educ'], axis=1)
df = df.rename(columns={'educ1':'educ'})
elif date <= 198312:
data_dict = {
"hh": (3,15),
"line": (93,95),
"mis": (1,2),
"age": (96,98),
"race": (99,100),
"sex": (100,101),
"status": (108,109),
"dur": (65,67),
"fweight": (120,132),
"educ": (102,104),
"grade": (104,105),
"mar": (98,99),
"ind": (523,526),
"occu": (526,529)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
df = df.replace('[^0-9]', np.nan, regex=True).astype("float64")
df['educ1'] = df['educ'] - df['grade'] + 1
df = df.drop(['educ'], axis=1)
df = df.rename(columns={'educ1':'educ'})
elif date <= 198812:
data_dict = {
"hh": (3,15),
"line": (540,542),
"mis": (1,2),
"age": (96,98),
"race": (99,100),
"sex": (100,101),
"status": (108,109),
"dur": (65,67),
"fweight": (120,132),
"educ": (102,104),
"grade": (104,105),
"mar": (98,99),
"ind": (523,526),
"occu": (526,529)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
df = df.replace('[^0-9]', np.nan, regex=True).astype("float64")
df['educ1'] = df['educ'] - df['grade'] + 1
df = df.drop(['educ'], axis=1)
df = df.rename(columns={'educ1':'educ'})
elif date <= 199112:
data_dict = {
"hh":(144,156),
"line":(263,265),
"mis":(69,70),
"age":(269,271),
"race":(279,280),
"sex":(274,275),
"status":(347,348),
"dur":(303,305),
"fweight":(397,405),
"lweight":(575,583),
"llind":(583,584),
"educ":(276,278),
"grade":(278,279),
"mar":(271,272),
"ind":(310,312),
"occu":(313,315)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
df = df.replace('[^0-9]', np.nan, regex=True).astype("float64")
df['educ1'] = df['educ'] - df['grade'] + 1
df = df.drop(['educ'], axis=1)
df = df.rename(columns={'educ1':'educ'})
elif date <= 199312:
data_dict = {
"hh":(144,156),
"line":(263,265),
"mis":(69,70),
"age":(269,271),
"race":(279,280),
"sex":(274,275),
"status":(347,348),
"dur":(303,305),
"fweight":(397,405),
"lweight":(575,583),
"llind":(583,584),
"educ":(276,278),
"mar":(271,272),
"ind":(309,312),
"occu":(312,315)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
df = df.replace('[^0-9]', np.nan,regex=True).astype("float64")
elif date <= 199505:
data_dict = {
"gestfips":(92,94),
"hrhhid":(0,12),
"hrsersuf":(74,76),
"line":(146,148),
"mis":(62,64),
"age":(121,123),
"race":(138,140),
"sex":(128,130),
"status":(179,181),
"dur":(406,409),
"fweight":(612,622),
"lweight":(592,602),
"llind":(68,70),
"educ":(136,138),
"mar":(124,126),
"ind":(435,438),
"occu":(438,441)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
z = df.hrsersuf.map(hrsersuf_dict).fillna(0)
df['z'] = pd.to_numeric(df['hrsersuf'],errors='coerce').fillna(0)
df['z'] = df['z'] + z
df = df.replace('[^0-9]', np.nan, regex=True).astype("float64")
df['hh'] = df['hrhhid']*100 + df['z']
df = df.drop(['hrhhid','hrsersuf'],axis=1)
elif date <= 200212:
data_dict = {
"hrhhid":(0,15),
"hrsersuf":(74,76),
"line":(146,148),
"mis":(62,64),
"age":(121,123),
"race":(138,140),
"sex":(128,130),
"status":(179,181),
"dur":(406,409),
"fweight":(612,622),
"lweight":(592,602),
"llind":(68,70),
"educ":(136,138),
"mar":(124,126),
"ind":(435,438),
"occu":(438,441)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
z = df.hrsersuf.map(hrsersuf_dict).fillna(0)
df['z'] = pd.to_numeric(df['hrsersuf'],errors='coerce').fillna(0)
df['z'] = df['z'] + z
df = df.replace('[^0-9]', np.nan, regex=True).astype("float64")
df['hh'] = df['hrhhid']*100 + df['z']
df = df.drop(['hrhhid','hrsersuf'],axis=1)
else:
data_dict = {
"hrhhid":(0,15),
"hrsersuf":(74,76),
"line":(146,148),
"mis":(62,64),
"age":(121,123),
"race":(138,140),
"sex":(128,130),
"status":(179,181),
"dur":(406,409),
"fweight":(612,622),
"lweight":(592,602),
"llind":(68,70),
"educ":(136,138),
"mar":(124,126),
"ind":(855,859),
"occu":(859,863)}
df = pd.read_fwf(filename,colspecs=list(data_dict.values()),header=None)
df.columns = list(data_dict.keys())
z = df.hrsersuf.map(hrsersuf_dict).fillna(0)
df['z'] = pd.to_numeric(df['hrsersuf'],errors='coerce').fillna(0)
df['z'] = df['z'] + z
df = df.replace('[^0-9]', np.nan, regex=True).astype("float64")
df['hh'] = df['hrhhid']*100 + df['z']
df = df.drop(['hrhhid','hrsersuf'],axis=1)
df.to_pickle(get_extracted_pickle_filename(theDir,date))
def extract_all(start_date,end_date,theDir):
dates = months_interval(start_date,end_date)
for date in dates:
filename = get_extracted_pickle_filename(theDir,date)
if os.path.isfile(filename):
print(f"data for {date} is already extracted")
else:
match_extract(date,theDir)
def clean_all_extracted(start_date,end_date,theDir):
dates = months_interval(start_date,end_date)
for date in dates:
filename = get_extracted_pickle_filename(theDir,date)
if os.path.isfile(filename):
os.remove(filename)
| 35.931727 | 89 | 0.461831 | 1,097 | 8,947 | 3.692799 | 0.18505 | 0.05332 | 0.05332 | 0.024438 | 0.835102 | 0.826709 | 0.813626 | 0.807208 | 0.807208 | 0.807208 | 0 | 0.144306 | 0.327708 | 8,947 | 248 | 90 | 36.076613 | 0.529177 | 0 | 0 | 0.822314 | 0 | 0 | 0.116128 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.012397 | false | 0 | 0.020661 | 0 | 0.033058 | 0.008264 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a472f2dfd459ad4d92631add106fdaea22da51a7 | 8,984 | py | Python | parser/fase2/team14/Optimizacion/Asignacion.py | Yosoyfr/tytus | 0df7e656835a93458462e476f7ab858a33baa2e0 | [
"MIT"
] | null | null | null | parser/fase2/team14/Optimizacion/Asignacion.py | Yosoyfr/tytus | 0df7e656835a93458462e476f7ab858a33baa2e0 | [
"MIT"
] | null | null | null | parser/fase2/team14/Optimizacion/Asignacion.py | Yosoyfr/tytus | 0df7e656835a93458462e476f7ab858a33baa2e0 | [
"MIT"
] | null | null | null | from Optimizacion.Instruccion import Instruccion
from Optimizacion.reporteOptimizacion import *
class Asignacion(Instruccion):
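"""A three-address-code assignment (id = valorizq operador valorder);
Optimizar applies the peephole rules below and reports each one applied."""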
def __init__(self,id=None, valorizq=None,operador=None,valorder=None,linea=''):
self.valorder=valorder
self.valorizq=valorizq
self.operador=operador
self.id=id
self.linea=linea
def Optimizar(self):
'Apply the peephole-optimization rules to this assignment and return its optimized form'
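# Peephole rules applied below (anterior -> optimizado):
#   Reglas 8/9:   x = x + 0, x = x - 0  -> statement removed
#   Reglas 10/11: x = x * 1, x = x / 1  -> statement removed
#   Reglas 12/13: x = y + 0, x = y - 0  -> x = y
#   Reglas 14/15: x = y * 1, x = y / 1  -> x = y
#   Regla 16:     x = y * 2             -> x = y + y
#   Reglas 17/18: x = y * 0, x = 0 / y  -> x = 0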
anterior="";
optimizado ="";
if(self.valorder!=None and self.operador!=None):
if (self.valorizq=='['):
pass # stack access: no peephole rule applies
elif(self.valorizq in 'stack'):
pass # stack temporary: no peephole rule applies
else:
pass # plain variable assignment: try the peephole rules below
if(self.id==self.valorizq):
if(self.operador=="+"):
if(self.valorder=="0"):
optimizado="";
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 8', anterior,optimizado,self.linea))
return self.valorizq
elif(self.operador=="-"):
if(self.valorder=="0"):
optimizado="";
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 9', anterior,optimizado,self.linea))
return self.valorizq
elif(self.operador=="*"):
if(self.valorder=="1"):
optimizado="";
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 10', anterior,optimizado,self.linea))
print("anterior",anterior)
return self.valorizq
elif(self.operador=="/"):
if(self.valorder=="1"):
optimizado="";
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 11', anterior,optimizado,self.linea))
return self.valorizq
if(self.id==self.valorder):
if(self.operador=="+"):
if(self.valorizq=="0"):
optimizado="";
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 8', anterior,optimizado,self.linea))
return self.valorder
elif(self.operador=="*"):
if(self.valorizq=="1"):
optimizado="";
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 10', anterior,optimizado,self.linea))
print("anterior",anterior)
return self.valorder
if(self.id != self.valorizq ):
print("entrooooooooo")
if(self.operador=="+"):
if(self.valorder=="0"):
optimizado=self.id + '='+ self.valorizq
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 12', anterior,optimizado,self.linea))
return optimizado
elif(self.operador=="-"):
if(self.valorder=="0"):
optimizado=self.id + '='+ self.valorizq
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 13', anterior,optimizado,self.linea))
return optimizado
elif(self.operador=="*"):
if(self.valorder=="1"):
optimizado=self.id + '='+ self.valorizq
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 14', anterior,optimizado,self.linea))
return optimizado
elif(self.valorder=="2"):
optimizado=self.id + '='+ self.valorizq +" + "+ self.valorizq
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 16', anterior,optimizado,self.linea))
return optimizado
elif(self.valorder=="0"):
optimizado=self.id + '='+ self.valorder
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 17', anterior,optimizado,self.linea))
return optimizado
elif(self.operador=="/"):
if(self.valorder=="1"):
optimizado=self.id + '='+ self.valorizq
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 15', anterior,optimizado,self.linea))
return optimizado
if(self.id != self.valorder):
if(self.operador=="+"):
if(self.valorizq=="0"):
optimizado=self.id + '='+ self.valorder
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 12', anterior,optimizado,self.linea))
return optimizado
elif(self.operador=="*"):
if(self.valorizq=="1"):
optimizado=self.id + '='+ self.valorder
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 14', anterior,optimizado,self.linea))
return optimizado
elif(self.valorizq=="2"):
optimizado=self.id + '='+ self.valorder +" + "+ self.valorder
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 16', anterior,optimizado,self.linea))
return optimizado
elif(self.valorizq=="0"):
optimizado=self.id + '='+ self.valorizq
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 17', anterior,optimizado,self.linea))
return optimizado
elif(self.operador=="/"):
if(self.valorizq=="0"):
optimizado=self.id + '='+ self.valorizq
anterior =self.id + '='+ self.valorizq +" "+self.operador+" "+ self.valorder
repOptimizado.append(reporteOptimizacion('Mirilla','Regla 18', anterior,optimizado,self.linea))
return optimizado
| 59.105263 | 127 | 0.43622 | 640 | 8,984 | 6.117188 | 0.089063 | 0.125671 | 0.081737 | 0.11954 | 0.878161 | 0.863346 | 0.841379 | 0.812516 | 0.803576 | 0.797446 | 0 | 0.009639 | 0.445681 | 8,984 | 151 | 128 | 59.496689 | 0.776506 | 0.006345 | 0 | 0.712 | 0 | 0 | 0.051313 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016 | false | 0 | 0.016 | 0 | 0.176 | 0.032 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8efe47b765bca571f6f0b2fe7cadabf3ffa0e17c | 52,245 | py | Python | homeschool/courses/tests/test_views.py | MOHAMEDELADIB/homeschool | 114aaa4ac4092bb9441ce710877e088eb0bc0dbb | [
"MIT"
] | null | null | null | homeschool/courses/tests/test_views.py | MOHAMEDELADIB/homeschool | 114aaa4ac4092bb9441ce710877e088eb0bc0dbb | [
"MIT"
] | 1 | 2022-03-02T19:44:44.000Z | 2022-03-02T19:44:44.000Z | homeschool/courses/tests/test_views.py | MOHAMEDELADIB/homeschool | 114aaa4ac4092bb9441ce710877e088eb0bc0dbb | [
"MIT"
] | null | null | null | import datetime
from unittest import mock
from django.contrib.messages import get_messages
from freezegun import freeze_time
from homeschool.courses.models import Course, CourseResource, CourseTask, GradedWork
from homeschool.courses.tests.factories import (
CourseFactory,
CourseResourceFactory,
CourseTaskFactory,
GradedWorkFactory,
)
from homeschool.schools.tests.factories import GradeLevelFactory, SchoolYearFactory
from homeschool.students.tests.factories import (
CourseworkFactory,
EnrollmentFactory,
GradeFactory,
)
from homeschool.test import TestCase
class TestCourseCreateView(TestCase):
def test_unauthenticated_access(self):
self.assertLoginRequired("courses:create")
def test_get(self):
user = self.make_user()
SchoolYearFactory(school__admin=user)
with self.login(user):
self.get_check_200("courses:create")
assert self.get_context("create")
def test_school_year_id(self):
"""A school year is fetched from the querystring."""
user = self.make_user()
today = user.get_local_today()
# Use dates in the past so the school year won't be a "current" school year.
school_year = SchoolYearFactory(
school__admin=user,
start_date=today - datetime.timedelta(days=365),
end_date=today - datetime.timedelta(days=1),
)
with self.login(user):
self.get_check_200(
"courses:create", data={"school_year": str(school_year.id)}
)
form = self.get_context("form")
assert form.school_year == school_year
def test_school_year_only_user_school_years(self):
"""A school year from the querystring can only be a user's school years."""
user = self.make_user()
user_school_year = SchoolYearFactory(school__admin=user)
school_year = SchoolYearFactory()
with self.login(user):
self.get_check_200(
"courses:create", data={"school_year": str(school_year.id)}
)
form = self.get_context("form")
assert form.school_year == user_school_year
def test_school_year_from_user(self):
"""A school year is fetched from the user if not provided on the querystring."""
user = self.make_user()
school_year = SchoolYearFactory(school__admin=user)
with self.login(user):
self.get_check_200("courses:create")
form = self.get_context("form")
assert form.school_year == school_year
def test_school_year_id_bogus(self):
"""A malformed school year id in the querystring is ignored."""
user = self.make_user()
school_year = SchoolYearFactory(school__admin=user)
with self.login(user):
self.get_check_200("courses:create", data={"school_year": "bogus"})
form = self.get_context("form")
assert form.school_year == school_year
def test_no_school_year(self):
"""When no school year is provided, the user is redirected to the list page."""
user = self.make_user()
with self.login(user):
response = self.get("courses:create")
self.response_302(response)
assert self.reverse("schools:school_year_list") in response.get("Location")
def test_has_grade_level(self):
"""A grade level put in the querystring is available as context."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school__admin=user)
with self.login(user):
self.get_check_200(
"courses:create", data={"grade_level": str(grade_level.id)}
)
assert self.get_context("grade_level") == grade_level
def test_not_other_grade_level(self):
"""A different user's grade level cannot be in the context."""
user = self.make_user()
SchoolYearFactory(school__admin=user)
grade_level = GradeLevelFactory()
with self.login(user):
self.get_check_200(
"courses:create", data={"grade_level": str(grade_level.id)}
)
assert self.get_context("grade_level") is None
def test_bogus_grade_level(self):
"""A bogus grade level is ignored and not in the context"""
user = self.make_user()
SchoolYearFactory(school__admin=user)
with self.login(user):
self.get_check_200("courses:create", data={"grade_level": "bogus"})
assert self.get_context("grade_level") is None
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
data = {
"name": "Course name",
"wednesday": "on",
"friday": "on",
"grade_levels": str(grade_level.id),
"default_task_duration": 45,
"is_active": False,
}
with self.login(user):
response = self.post("courses:create", data=data)
course = Course.objects.get(grade_levels=grade_level.id)
assert course.name == "Course name"
assert course.days_of_week == Course.WEDNESDAY + Course.FRIDAY
self.response_302(response)
assert self.reverse("courses:detail", pk=course.id) in response.get("Location")
assert not course.is_active
def test_course_copy_fills_form_fields(self):
"""The course to copy fills in the course form."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course_to_copy = CourseFactory(
grade_levels=[grade_level], name="To Copy", default_task_duration=99
)
with self.login(user):
self.get_check_200(
"courses:create", data={"copy_from": str(course_to_copy.id)}
)
form = self.get_context("form")
assert form.initial["name"] == "To Copy"
assert form.initial["default_task_duration"] == 99
def test_course_copy_only_user_courses(self):
"""A user cannot copy another user's course."""
user = self.make_user()
SchoolYearFactory(school=user.school)
course_to_copy = CourseFactory()
with self.login(user):
self.get_check_200(
"courses:create", data={"copy_from": str(course_to_copy.id)}
)
assert self.get_context("course_to_copy") is None
def test_course_copy_tasks_resources(self):
"""Copying a course includes the tasks, graded work, and resources."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course_to_copy = CourseFactory(grade_levels=[grade_level])
CourseTaskFactory(course=course_to_copy)
graded_task = CourseTaskFactory(course=course_to_copy)
GradedWorkFactory(course_task=graded_task)
CourseResourceFactory(course=course_to_copy)
data = {
"name": "Course name",
"wednesday": "on",
"friday": "on",
"grade_levels": str(grade_level.id),
"default_task_duration": 45,
}
url = self.reverse("courses:create")
url += f"?copy_from={course_to_copy.id}"
with self.login(user):
self.post(url, data=data)
assert Course.objects.count() == 2
copied_course = Course.objects.last()
assert copied_course.id != course_to_copy.id
assert CourseTask.objects.filter(course=copied_course).count() == 2
new_graded_task = CourseTask.objects.filter(course=copied_course).last()
assert hasattr(new_graded_task, "graded_work")
assert CourseResource.objects.filter(course=copied_course).count() == 1
class TestCourseDetailView(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
self.assertLoginRequired("courses:detail", pk=course.id)
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
with self.login(user):
self.get_check_200("courses:detail", pk=course.id)
assert list(self.get_context("grade_levels")) == [grade_level]
assert self.get_context("school_year") == grade_level.school_year
assert self.get_context("course_tasks") == []
assert self.get_context("last_task") is None
self.assertInContext("task_details")
def test_grade_level_name_with_task(self):
"""Any grade level specific task has the grade level's name next to it."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
CourseTaskFactory(course=course, grade_level=grade_level)
with self.login(user):
self.get("courses:detail", pk=course.id)
self.assertResponseContains(grade_level.name)
def test_last_task(self):
"""The last task is added to the context."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
CourseTaskFactory(course=course)
last_task = CourseTaskFactory(course=course)
with self.login(user):
self.get("courses:detail", pk=course.id)
assert self.get_context("last_task") == last_task
def test_enrolled_students(self):
"""The enrolled students of the course are in the context."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
enrollment = EnrollmentFactory(grade_level=grade_level)
with self.login(user):
self.get_check_200("courses:detail", pk=course.id)
assert self.get_context("enrolled_students") == [enrollment.student]
def test_course_tasks_context(self):
"""All the task details of an enrolled student are in the context."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
enrollment = EnrollmentFactory(grade_level=grade_level)
work = CourseworkFactory(student=enrollment.student, course_task=task)
grade = GradeFactory(student=enrollment.student, graded_work__course_task=task)
url = self.reverse("courses:detail", pk=course.id) + "?completed_tasks=1"
with self.login(user):
self.get_check_200(url)
assert self.get_context("task_details") == [
{
"number": 1,
"task": task,
"complete": True,
"student_details": [
{
"student": enrollment.student,
"assigned": True,
"coursework": work,
"grade": grade,
"planned_date": None,
}
],
}
]
def test_task_complete_no_student(self):
"""When there are no students, a task defaults to incomplete."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
CourseTaskFactory(course=course)
with self.login(user):
self.get_check_200("courses:detail", pk=course.id)
detail = self.get_context("task_details")[0]
assert not detail["complete"]
def test_task_complete_one_student_coursework(self):
"""When a student has not completed, the task is marked incomplete."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
EnrollmentFactory(grade_level=grade_level)
enrollment = EnrollmentFactory(grade_level=grade_level)
CourseworkFactory(student=enrollment.student, course_task=task)
with self.login(user):
self.get_check_200("courses:detail", pk=course.id)
detail = self.get_context("task_details")[0]
assert not detail["complete"]
def test_task_complete_both_students_done(self):
"""When all students are done with a task, it is marked complete."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
enrollment_1 = EnrollmentFactory(grade_level=grade_level)
CourseworkFactory(student=enrollment_1.student, course_task=task)
enrollment_2 = EnrollmentFactory(grade_level=grade_level)
CourseworkFactory(student=enrollment_2.student, course_task=task)
url = self.reverse("courses:detail", pk=course.id) + "?completed_tasks=1"
with self.login(user):
self.get_check_200(url)
detail = self.get_context("task_details")[0]
assert detail["complete"]
def test_task_complete_grade_specific_task(self):
"""A grade specific task is complete when the students in the grade are done."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
other_grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level, other_grade_level])
task = CourseTaskFactory(course=course, grade_level=grade_level)
enrollment = EnrollmentFactory(grade_level=grade_level)
CourseworkFactory(student=enrollment.student, course_task=task)
EnrollmentFactory(grade_level=other_grade_level)
url = self.reverse("courses:detail", pk=course.id) + "?completed_tasks=1"
with self.login(user):
self.get_check_200(url)
detail = self.get_context("task_details")[0]
assert detail["complete"]
def test_hide_complete_tasks(self):
"""With students enrolled, completed tasks are hidden by default."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
enrollment = EnrollmentFactory(grade_level=grade_level)
CourseworkFactory(student=enrollment.student, course_task=task)
with self.login(user):
self.get_check_200("courses:detail", pk=course.id)
assert not self.get_context("task_details")
@freeze_time("2021-03-10") # Wednesday
def test_no_student_planned_date(self):
"""When no student is enrolled, the tasks have a planned completion date."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
CourseTaskFactory(course=course)
with self.login(user):
self.get_check_200("courses:detail", pk=course.id)
planned_date = self.get_context("task_details")[0]["planned_date"]
assert planned_date == datetime.date.today()
class TestCourseEditView(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
self.assertLoginRequired("courses:edit", pk=course.id)
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
with self.login(user):
self.get_check_200("courses:edit", pk=course.id)
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"name": "New course name",
"wednesday": "on",
"friday": "on",
"grade_levels": str(grade_level.id),
"default_task_duration": 45,
"is_active": False,
}
with self.login(user):
self.post("courses:edit", pk=course.id, data=data)
course.refresh_from_db()
assert course.name == "New course name"
assert course.days_of_week == Course.WEDNESDAY + Course.FRIDAY
assert not course.is_active
class TestBulkDeleteCourseTasks(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
self.assertLoginRequired("courses:task_delete_bulk", pk=course.id)
def test_only_users_course(self):
"""The user can only modify their own course tasks."""
user = self.make_user()
course = CourseFactory()
with self.login(user):
response = self.get("courses:task_delete_bulk", pk=course.id)
self.response_404(response)
def test_ok(self):
"""The view renders ok."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
with self.login(user):
self.get_check_200("courses:task_delete_bulk", pk=course.id)
assert self.get_context("course") == course
assert list(self.get_context("course_tasks")) == [task]
def test_delete(self):
"""The selected tasks are deleted."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
undeleted_task = CourseTaskFactory(course=course)
data = {f"task-{task.id}": f"{task.id}"}
with self.login(user):
response = self.post("courses:task_delete_bulk", pk=course.id, data=data)
self.response_200(response)
assert response.get("HX-Redirect") == self.reverse(
"courses:detail", pk=course.id
)
assert CourseTask.objects.filter(id=undeleted_task.id).count() == 1
assert CourseTask.objects.filter(id=task.id).count() == 0
message = list(get_messages(response.wsgi_request))[0]
assert str(message) == "Deleted 1 tasks."
def test_error_redirect(self):
"""An error will redirect to the same page."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
another_task = CourseTaskFactory()
data = {
f"task-{task.id}": f"{task.id}",
f"task-{another_task.id}": f"{another_task.id}",
}
with self.login(user):
response = self.post("courses:task_delete_bulk", pk=course.id, data=data)
self.response_200(response)
assert response.get("HX-Redirect") == self.reverse(
"courses:task_delete_bulk", pk=course.id
)
message = list(get_messages(response.wsgi_request))[0]
assert (
str(message)
== "Sorry, you do not have permission to delete the selected tasks."
)
class TestCourseDeleteView(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
self.assertLoginRequired("courses:delete", pk=course.id)
def test_other_course(self):
"""A user may not access another user's course."""
user = self.make_user()
course = CourseFactory()
with self.login(user):
response = self.get("courses:delete", pk=course.id)
assert response.status_code == 404
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
GradeFactory(graded_work__course_task=task)
CourseworkFactory(course_task=task)
CourseResourceFactory(course=course)
with self.login(user):
self.get_check_200("courses:delete", pk=course.id)
assert self.get_context("tasks_count") == 1
assert self.get_context("grades_count") == 1
assert self.get_context("coursework_count") == 1
assert self.get_context("course_resources_count") == 1
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
with self.login(user):
response = self.post("courses:delete", pk=course.id)
assert Course.objects.count() == 0
self.response_302(response)
assert response.get("Location") == self.reverse(
"schools:school_year_detail", pk=grade_level.school_year.id
)
class TestCourseCopySelectView(TestCase):
def test_unauthenticated_access(self):
self.assertLoginRequired("courses:copy")
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
with self.login(user):
self.get_check_200("courses:copy")
assert self.get_context("school_years") == [
{
"school_year": grade_level.school_year,
"grade_levels": {grade_level: [course]},
}
]
def test_only_user_courses(self):
"""The copy view only lists the user's courses."""
user = self.make_user()
CourseFactory()
with self.login(user):
self.get_check_200("courses:copy")
assert self.get_context("school_years") == []
class TestCourseTaskCreateView(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
self.assertLoginRequired("courses:task_create", pk=course.id)
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level], default_task_duration=42)
with self.login(user):
self.get_check_200("courses:task_create", pk=course.id)
form = self.get_context("form")
assert form.initial["duration"] == course.default_task_duration
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {"course": str(course.id), "description": "A new task", "duration": "30"}
with self.login(user):
response = self.post("courses:task_create", pk=course.id, data=data)
assert CourseTask.objects.count() == 1
task = CourseTask.objects.get(course=course)
assert task.description == data["description"]
assert task.duration == int(data["duration"])
self.response_302(response)
assert response.get("Location") == self.reverse("courses:detail", pk=course.id)
assert not hasattr(task, "graded_work")
def test_has_previous_task(self):
"""The previous task is in the context if the querystring is present."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
url = self.reverse("courses:task_create", pk=course.id)
url += f"?previous_task={task.id}"
with self.login(user):
self.get(url)
assert self.get_context("previous_task") == task
def test_has_create(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
with self.login(user):
self.get("courses:task_create", pk=course.id)
self.assertContext("create", True)
def test_redirect_next(self):
next_url = "/another/location/"
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"course": str(course.id),
"description": "new description",
"duration": 15,
}
url = self.reverse("courses:task_create", pk=course.id)
url += f"?next={next_url}"
with self.login(user):
response = self.post(url, data=data)
self.response_302(response)
assert next_url in response.get("Location")
def test_has_course(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
with self.login(user):
self.get("courses:task_create", pk=course.id)
self.assertContext("course", course)
def test_has_grade_levels(self):
"""The grade levels for selection match the grades that have the course."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
other_grade_level = GradeLevelFactory(school_year=grade_level.school_year)
course = CourseFactory(grade_levels=[grade_level, other_grade_level])
with self.login(user):
self.get("courses:task_create", pk=course.id)
grade_levels = set(self.get_context("grade_levels"))
assert grade_levels == {grade_level, other_grade_level}
def test_after_task(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {"course": str(course.id), "description": "A new task", "duration": "30"}
task_1 = CourseTaskFactory(course=course)
task_2 = CourseTaskFactory(course=course)
url = self.reverse("courses:task_create", pk=course.id)
url += f"?previous_task={task_1.id}"
with self.login(user):
self.post(url, data=data)
task_3 = CourseTask.objects.get(description="A new task")
assert list(CourseTask.objects.all()) == [task_1, task_3, task_2]
def test_is_graded(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"course": str(course.id),
"description": "A new task",
"duration": "30",
"is_graded": "on",
}
with self.login(user):
self.post("courses:task_create", pk=course.id, data=data)
assert CourseTask.objects.count() == 1
task = CourseTask.objects.get(course=course)
assert task.graded_work is not None
def test_replicates(self):
"""A user that replicates data will create multiple tasks."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"course": str(course.id),
"description": "A new task",
"duration": "30",
"replicate": "on",
"replicate_count": "2",
}
with self.login(user):
self.post("courses:task_create", pk=course.id, data=data)
assert CourseTask.objects.count() == 2
descriptions = list(
CourseTask.objects.filter(course=course).values_list(
"description", flat=True
)
)
assert descriptions == ["A new task", "A new task"]
def test_bad_replicate_count(self):
"""A bad replicate count does no harm."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"course": str(course.id),
"description": "A new task",
"duration": "30",
"replicate": "on",
"replicate_count": "bad",
}
with self.login(user):
self.post("courses:task_create", pk=course.id, data=data)
assert CourseTask.objects.count() == 1
@mock.patch("homeschool.courses.views.schools_constants")
def test_max_allowed_enforced(self, mock_constants):
"""When replicate count is higher than the max, only the max are created."""
mock_constants.MAX_ALLOWED_DAYS = 2
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"course": str(course.id),
"description": "A new task",
"duration": "30",
"replicate": "on",
"replicate_count": "10",
}
with self.login(user):
self.post("courses:task_create", pk=course.id, data=data)
assert CourseTask.objects.count() == 2
def test_replicates_with_autonumber(self):
"""A user who replicates with autonumber will create multiple numbered tasks."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"course": str(course.id),
"description": "A new task",
"duration": "30",
"replicate": "on",
"replicate_count": "2",
"autonumber": "on",
"starting_at": "5",
}
with self.login(user):
self.post("courses:task_create", pk=course.id, data=data)
assert CourseTask.objects.count() == 2
descriptions = list(
CourseTask.objects.filter(course=course).values_list(
"description", flat=True
)
)
assert descriptions == ["A new task 5", "A new task 6"]
def test_replicates_with_autonumber_bad_starting_at(self):
"""A bad starting_at does not replicate."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"course": str(course.id),
"description": "A new task",
"duration": "30",
"replicate": "on",
"replicate_count": "2",
"autonumber": "on",
"starting_at": "boom",
}
with self.login(user):
self.post("courses:task_create", pk=course.id, data=data)
task = CourseTask.objects.get(course=course)
assert task.description == "A new task"
class TestBulkCreateCourseTasks(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
self.assertLoginRequired("courses:task_create_bulk", pk=course.id)
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level], default_task_duration=42)
with self.login(user):
self.get_check_200("courses:task_create_bulk", pk=course.id)
form = self.get_context("formset")[0]
assert form.user == user
assert (
form.get_initial_for_field(form.fields["duration"], "duration")
== course.default_task_duration
)
assert self.get_context("course") == course
assert list(self.get_context("grade_levels")) == [grade_level]
assert self.get_context("extra_forms") == "3"
def test_other_course(self):
"""A user may not bulk create for another user's course."""
user = self.make_user()
course = CourseFactory()
with self.login(user):
response = self.get("courses:task_create_bulk", pk=course.id)
assert response.status_code == 404
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
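        # Django formsets require the ManagementForm fields (form-TOTAL_FORMS,
        # form-INITIAL_FORMS, form-MIN_NUM_FORMS, form-MAX_NUM_FORMS) to be
        # posted alongside the per-form "form-<n>-<field>" keys.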
data = {
"form-TOTAL_FORMS": "1",
"form-INITIAL_FORMS": "0",
"form-MIN_NUM_FORMS": "0",
"form-MAX_NUM_FORMS": "1000",
"form-0-course": str(course.id),
"form-0-description": "A new task",
"form-0-duration": "42",
}
with self.login(user):
response = self.post("courses:task_create_bulk", pk=course.id, data=data)
assert CourseTask.objects.count() == 1
task = CourseTask.objects.get(course=course)
assert task.description == data["form-0-description"]
assert task.duration == int(data["form-0-duration"])
self.response_302(response)
assert response.get("Location") == self.reverse("courses:detail", pk=course.id)
assert not hasattr(task, "graded_work")
def test_after_task(self):
"""Tasks are placed after a specified task."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"form-TOTAL_FORMS": "2",
"form-INITIAL_FORMS": "0",
"form-MIN_NUM_FORMS": "0",
"form-MAX_NUM_FORMS": "1000",
"form-0-course": str(course.id),
"form-0-description": "A new task",
"form-0-duration": "42",
"form-1-course": str(course.id),
"form-1-description": "Another new task",
"form-1-duration": "42",
}
task_1 = CourseTaskFactory(course=course)
task_2 = CourseTaskFactory(course=course)
url = self.reverse("courses:task_create_bulk", pk=course.id)
url += f"?previous_task={task_1.id}"
with self.login(user):
self.post(url, data=data)
task_3 = CourseTask.objects.get(description="A new task")
task_4 = CourseTask.objects.get(description="Another new task")
assert list(CourseTask.objects.all()) == [task_1, task_3, task_4, task_2]
def test_redirect_next(self):
"""After creation, the user returns to the next URL."""
next_url = "/another/location/"
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"form-TOTAL_FORMS": "1",
"form-INITIAL_FORMS": "0",
"form-MIN_NUM_FORMS": "0",
"form-MAX_NUM_FORMS": "1000",
"form-0-course": str(course.id),
"form-0-description": "A new task",
"form-0-duration": "42",
}
url = self.reverse("courses:task_create_bulk", pk=course.id)
url += f"?next={next_url}"
with self.login(user):
response = self.post(url, data=data)
self.response_302(response)
assert next_url in response.get("Location")
def test_has_previous_task(self):
"""When the previous task is in the querystring, it's in the context."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level], default_task_duration=42)
task = CourseTaskFactory(course=course)
url = self.reverse("courses:task_create_bulk", pk=course.id)
url += f"?previous_task={task.id}"
with self.login(user):
self.get_check_200(url)
assert self.get_context("previous_task") == task
class TestGetCourseTaskBulkHx(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
self.assertLoginRequired(
"courses:task_create_bulk_hx", pk=course.id, last_form_number=42
)
def test_other_course(self):
"""A user may not get bulk create forms for another user's course."""
user = self.make_user()
course = CourseFactory()
with self.login(user):
response = self.get(
"courses:task_create_bulk_hx", pk=course.id, last_form_number=2
)
assert response.status_code == 404
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level], default_task_duration=42)
with self.login(user):
self.get_check_200(
"courses:task_create_bulk_hx", pk=course.id, last_form_number=2
)
assert len(self.get_context("forms")) == 3
form = self.get_context("forms")[0]
assert form.user == user
assert (
form.get_initial_for_field(form.fields["duration"], "duration")
== course.default_task_duration
)
assert self.get_context("course") == course
assert list(self.get_context("grade_levels")) == [grade_level]
assert self.get_context("last_form_number") == 2
class TestCourseTaskUpdateView(TestCase):
def test_unauthenticated_access(self):
task = CourseTaskFactory()
self.assertLoginRequired("courses:task_edit", pk=task.id)
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
with self.login(user):
self.get_check_200("courses:task_edit", pk=task.id)
def test_get_other_user(self):
user = self.make_user()
task = CourseTaskFactory()
with self.login(user):
response = self.get("courses:task_edit", pk=task.id)
self.response_404(response)
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(
description="some description", duration=30, course=course
)
data = {
"course": str(task.course.id),
"description": "new description",
"duration": 15,
}
with self.login(user):
response = self.post("courses:task_edit", pk=task.id, data=data)
task.refresh_from_db()
assert task.description == data["description"]
assert task.duration == data["duration"]
self.response_302(response)
def test_has_course(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
with self.login(user):
self.get("courses:task_edit", pk=task.id)
self.assertContext("course", task.course)
def test_has_grade_levels(self):
"""The grade levels for selection match the grades that have the course."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
other_grade_level = GradeLevelFactory(school_year=grade_level.school_year)
course = CourseFactory(grade_levels=[grade_level, other_grade_level])
task = CourseTaskFactory(course=course)
with self.login(user):
self.get("courses:task_edit", pk=task.id)
grade_levels = set(self.get_context("grade_levels"))
assert grade_levels == {grade_level, other_grade_level}
def test_has_previous_task(self):
"""A previous task is in the context when it exists."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
previous_task = CourseTaskFactory(course=course)
task = CourseTaskFactory(course=course)
with self.login(user):
self.get("courses:task_edit", pk=task.id)
assert self.get_context("previous_task") == previous_task
def test_redirect_next(self):
next_url = "/another/location/"
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
data = {
"course": str(task.course.id),
"description": "new description",
"duration": 15,
}
url = self.reverse("courses:task_edit", pk=task.id)
url += f"?next={next_url}"
with self.login(user):
response = self.post(url, data=data)
self.response_302(response)
assert next_url in response.get("Location")
def test_is_graded(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(
description="some description", duration=30, course=course
)
data = {
"course": str(task.course.id),
"description": "new description",
"duration": 15,
"is_graded": "on",
}
with self.login(user):
self.post("courses:task_edit", pk=task.id, data=data)
task.refresh_from_db()
assert task.graded_work is not None
def test_keep_graded(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(
description="some description", duration=30, course=course
)
GradedWorkFactory(course_task=task)
data = {
"course": str(task.course.id),
"description": "new description",
"duration": 15,
"is_graded": "on",
}
with self.login(user):
self.post("courses:task_edit", pk=task.id, data=data)
task.refresh_from_db()
assert task.graded_work is not None
assert GradedWork.objects.count() == 1
def test_remove_graded(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(
description="some description", duration=30, course=course
)
GradedWorkFactory(course_task=task)
data = {
"course": str(task.course.id),
"description": "new description",
"duration": 15,
}
with self.login(user):
self.post("courses:task_edit", pk=task.id, data=data)
task.refresh_from_db()
assert not hasattr(task, "graded_work")
class TestCourseTaskDeleteView(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
task = CourseTaskFactory(course=course)
self.assertLoginRequired("courses:task_delete", course_id=course.id, pk=task.id)
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
with self.login(user):
response = self.post("courses:task_delete", course_id=course.id, pk=task.id)
assert CourseTask.objects.count() == 0
self.response_302(response)
assert response.get("Location") == self.reverse("courses:detail", pk=course.id)
def test_post_other_user(self):
user = self.make_user()
course = CourseFactory()
task = CourseTaskFactory(course=course)
with self.login(user):
response = self.get("courses:task_delete", course_id=course.id, pk=task.id)
self.response_404(response)
def test_redirect_next(self):
"""The delete view redirects to next parameter if present."""
next_url = "/another/location/"
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
url = self.reverse("courses:task_delete", course_id=course.id, pk=task.id)
url += f"?next={next_url}"
with self.login(user):
response = self.post(url)
self.response_302(response)
assert next_url in response.get("Location")
class TestCourseTaskHxDeleteView(TestCase):
def test_unauthenticated_access(self):
task = CourseTaskFactory()
self.assertLoginRequired("courses:task_hx_delete", pk=task.id)
def test_delete(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
task = CourseTaskFactory(course=course)
with self.login(user):
response = self.delete("courses:task_hx_delete", pk=task.id)
assert CourseTask.objects.count() == 0
self.response_200(response)
assert "task_details" in response.context
def test_delete_other_user(self):
"""Another user cannot delete a user's task."""
user = self.make_user()
course = CourseFactory()
task = CourseTaskFactory(course=course)
with self.login(user):
response = self.delete("courses:task_hx_delete", pk=task.id)
assert CourseTask.objects.count() == 1
self.response_404(response)
class TestCourseTaskDown(TestCase):
def test_unauthenticated_access(self):
task = CourseTaskFactory()
self.assertLoginRequired("courses:task_down", pk=task.id)
def test_post(self):
"""A task is moved down."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
first_task = CourseTaskFactory(course=course)
second_task = CourseTaskFactory(course=course)
with self.login(user):
response = self.post("courses:task_down", pk=first_task.id)
assert (
response.get("Location")
== self.reverse("courses:detail", first_task.course.id)
+ f"#task-{first_task.id}"
)
assert list(CourseTask.objects.all()) == [second_task, first_task]
class TestCourseTaskUp(TestCase):
def test_unauthenticated_access(self):
task = CourseTaskFactory()
self.assertLoginRequired("courses:task_up", pk=task.id)
def test_post(self):
"""A task is moved up."""
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
first_task = CourseTaskFactory(course=course)
second_task = CourseTaskFactory(course=course)
with self.login(user):
response = self.post("courses:task_up", pk=second_task.id)
assert (
response.get("Location")
== self.reverse("courses:detail", second_task.course.id)
+ f"#task-{second_task.id}"
)
assert list(CourseTask.objects.all()) == [second_task, first_task]
class TestCourseResourceCreateView(TestCase):
def test_unauthenticated_access(self):
course = CourseFactory()
self.assertLoginRequired("courses:resource_create", pk=course.id)
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
with self.login(user):
self.get_check_200("courses:resource_create", pk=course.id)
assert self.get_context("create")
assert self.get_context("course") == course
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
data = {
"course": str(course.id),
"title": "Charlotte's Web",
"details": "That's some pig.",
}
with self.login(user):
response = self.post("courses:resource_create", pk=course.id, data=data)
assert CourseResource.objects.count() == 1
resource = CourseResource.objects.get(course=course)
assert resource.title == data["title"]
assert resource.details == data["details"]
self.response_302(response)
assert response.get("Location") == self.reverse("courses:detail", pk=course.id)
class TestCourseResourceUpdateView(TestCase):
def test_unauthenticated_access(self):
resource = CourseResourceFactory()
self.assertLoginRequired("courses:resource_edit", pk=resource.id)
def test_get(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
resource = CourseResourceFactory(course__grade_levels=[grade_level])
with self.login(user):
self.get_check_200("courses:resource_edit", pk=resource.id)
assert not self.get_context("create")
assert self.get_context("course") == resource.course
def test_get_other_user(self):
"""A user may not edit another user's resource."""
user = self.make_user()
resource = CourseResourceFactory()
with self.login(user):
response = self.get("courses:resource_edit", pk=resource.id)
self.response_404(response)
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
resource = CourseResourceFactory(course__grade_levels=[grade_level])
data = {
"course": str(resource.course.id),
"title": "Charlotte's Web",
"details": "That's some pig.",
}
with self.login(user):
response = self.post("courses:resource_edit", pk=resource.id, data=data)
assert CourseResource.objects.count() == 1
resource = CourseResource.objects.get(course=resource.course)
assert resource.title == data["title"]
assert resource.details == data["details"]
self.response_302(response)
assert response.get("Location") == self.reverse(
"courses:detail", pk=resource.course.id
)
class TestCourseResourceDeleteView(TestCase):
def test_unauthenticated_access(self):
resource = CourseResourceFactory()
self.assertLoginRequired("courses:resource_delete", pk=resource.id)
def test_post(self):
user = self.make_user()
grade_level = GradeLevelFactory(school_year__school=user.school)
course = CourseFactory(grade_levels=[grade_level])
resource = CourseResourceFactory(course=course)
with self.login(user):
response = self.post("courses:resource_delete", pk=resource.id)
assert CourseResource.objects.count() == 0
self.response_302(response)
assert response.get("Location") == self.reverse("courses:detail", pk=course.id)
def test_post_other_user(self):
"""A user may not delete another user's resource."""
user = self.make_user()
course = CourseFactory()
resource = CourseResourceFactory(course=course)
with self.login(user):
response = self.post("courses:resource_delete", pk=resource.id)
self.response_404(response)
| 37.291221 | 88 | 0.63698 | 6,014 | 52,245 | 5.318257 | 0.050881 | 0.058467 | 0.03039 | 0.04052 | 0.83695 | 0.811281 | 0.780797 | 0.760224 | 0.736743 | 0.716014 | 0 | 0.009039 | 0.250378 | 52,245 | 1,400 | 89 | 37.317857 | 0.807629 | 0.050378 | 0 | 0.701395 | 0 | 0 | 0.110256 | 0.021751 | 0 | 0 | 0 | 0 | 0.138605 | 1 | 0.091163 | false | 0 | 0.008372 | 0 | 0.115349 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
74cff515de28dd75270822268eb8a8c1686f08cf | 14,093 | py | Python | app/tests/test_user_address_model.py | ihor-nahuliak/task-17-aug-2019 | 8bcfc86d54de544db1520fcf7f0055cd5ecf73e5 | [
"MIT"
] | null | null | null | app/tests/test_user_address_model.py | ihor-nahuliak/task-17-aug-2019 | 8bcfc86d54de544db1520fcf7f0055cd5ecf73e5 | [
"MIT"
] | 6 | 2020-06-05T22:33:25.000Z | 2022-02-10T12:18:13.000Z | app/tests/test_user_address_model.py | ihor-nahuliak/task-17-aug-2019 | 8bcfc86d54de544db1520fcf7f0055cd5ecf73e5 | [
"MIT"
] | null | null | null | import unittest
import django.test
from django.db import IntegrityError
from django.core.management import call_command
from app_address.models import UserAddress
class TestCase(django.test.TestCase):
def setUp(self):
super().setUp()
call_command('loaddata', 'initial_data.json', verbosity=0)
def test_model_attributes_list(self):
self.assertTrue(hasattr(UserAddress, 'user'))
self.assertTrue(hasattr(UserAddress, 'name'))
self.assertTrue(hasattr(UserAddress, 'street_address'))
self.assertTrue(hasattr(UserAddress, 'street_address_line2'))
self.assertTrue(hasattr(UserAddress, 'zipcode'))
self.assertTrue(hasattr(UserAddress, 'city'))
self.assertTrue(hasattr(UserAddress, 'state'))
self.assertTrue(hasattr(UserAddress, 'country'))
self.assertTrue(hasattr(UserAddress, 'full_address'))
def test_full_address_1(self):
m = UserAddress(user_id=1,
name='Max',
city='Giventown')
m.save()
self.assertEqual(m.full_address,
'\nNone\nNone Giventown None ')
def test_full_address_2(self):
m = UserAddress(user_id=1,
name='Max Mustermann',
street_address='Randomstreet',
city='Giventown')
m.save()
self.assertEqual(m.full_address,
'Randomstreet\nNone\nNone Giventown None ')
def test_full_address_3(self):
m = UserAddress(user_id=1,
name='Max Mustermann',
street_address='456 Randomstreet',
city='Giventown')
m.save()
self.assertEqual(m.full_address,
'456 Randomstreet\nNone\nNone Giventown None ')
def test_full_address_4(self):
m = UserAddress(user_id=1,
name='Max Mustermann',
street_address='789 Otherstreet',
city='Giventown',
country='NL')
m.save()
self.assertEqual(m.full_address,
'789 Otherstreet\nNone\nNone Giventown None NL')
def test_deduplication_leaves_just_2_unique_records(self):
m = UserAddress(user_id=1,
name='Max',
city='Giventown')
m.save()
m = UserAddress(user_id=1,
name='Max Mustermann',
street_address='Randomstreet',
city='Giventown')
m.save()
m = UserAddress(user_id=1,
name='Max Mustermann',
street_address='456 Randomstreet',
city='Giventown')
m.save()
m = UserAddress(user_id=1,
name='Max Mustermann',
street_address='789 Otherstreet',
city='Giventown',
country='NL')
m.save()
total_count = UserAddress.objects.count()
self.assertEqual(2, total_count)
def test_deduplication_raises_conflict_error_on_different_pk(self):
m = UserAddress(id=1,
user_id=1,
name='Max',
city='Giventown')
m.save()
m = UserAddress(id=2,
user_id=1,
name='Max Mustermann',
street_address='Randomstreet',
city='Giventown')
with self.assertRaises(IntegrityError) as err_ctx:
m.save()
self.assertEqual(err_ctx.exception.args[0], 'address duplicated error')
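    # The test_match__* cases below pin down the de-duplication behaviour:
    # an empty or NULL field appears to match anything; for name,
    # street_address, street_address_line2, city and state, saving a new
    # value that fully contains the stored value (as whole words) merges
    # into the existing row; zipcode and country otherwise merge only on
    # an exact match.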
def test_match__name_not_filled(self):
m1 = UserAddress(user_id=1, name='')
m1.save()
m2 = UserAddress(user_id=1, name='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__name_filled_left(self):
m1 = UserAddress(user_id=1, name='lorem')
m1.save()
m2 = UserAddress(user_id=1, name='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__name_filled_center(self):
m1 = UserAddress(user_id=1, name='ipsum')
m1.save()
m2 = UserAddress(user_id=1, name='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__name_filled_right(self):
m1 = UserAddress(user_id=1, name='dolor')
m1.save()
m2 = UserAddress(user_id=1, name='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__name_filled_fully_duplicated(self):
m1 = UserAddress(user_id=1, name='lorem ipsum dolor')
m1.save()
m2 = UserAddress(user_id=1, name='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__name_filled_different(self):
m1 = UserAddress(user_id=1, name='lorem ipsum sitam')
m1.save()
m2 = UserAddress(user_id=1, name='lorem ipsum dolor')
m2.save()
self.assertNotEqual(m2.id, m1.id)
# def test_match__name_filled_better_before(self):
# # TODO: it takes the longest old value as a better one ???
# m1 = UserAddress(user_id=1, name='lorem ipsum dolor')
# m1.save()
# m2 = UserAddress(user_id=1, name='lorem ipsum')
# m2.save()
#
# self.assertEqual(m2.id, m1.id)
# self.assertEqual(m2.name, 'lorem ipsum dolor')
def test_match__street_address_not_filled(self):
m1 = UserAddress(user_id=1, street_address='')
m1.save()
m2 = UserAddress(user_id=1, street_address='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_filled_left(self):
m1 = UserAddress(user_id=1, street_address='lorem')
m1.save()
m2 = UserAddress(user_id=1, street_address='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_filled_center(self):
m1 = UserAddress(user_id=1, street_address='ipsum')
m1.save()
m2 = UserAddress(user_id=1, street_address='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_filled_right(self):
m1 = UserAddress(user_id=1, street_address='dolor')
m1.save()
m2 = UserAddress(user_id=1, street_address='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_filled_fully_duplicated(self):
m1 = UserAddress(user_id=1, street_address='lorem ipsum dolor')
m1.save()
m2 = UserAddress(user_id=1, street_address='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_filled_different(self):
m1 = UserAddress(user_id=1, street_address='lorem ipsum sitam')
m1.save()
m2 = UserAddress(user_id=1, street_address='lorem ipsum dolor')
m2.save()
self.assertNotEqual(m2.id, m1.id)
def test_match__street_address_line2_not_filled(self):
m1 = UserAddress(user_id=1, street_address_line2='')
m1.save()
m2 = UserAddress(user_id=1, street_address_line2='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_line2_is_null(self):
m1 = UserAddress(user_id=1, street_address_line2=None)
m1.save()
m2 = UserAddress(user_id=1, street_address_line2='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_line2_filled_left(self):
m1 = UserAddress(user_id=1, street_address_line2='lorem')
m1.save()
m2 = UserAddress(user_id=1, street_address_line2='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_line2_filled_center(self):
m1 = UserAddress(user_id=1, street_address_line2='ipsum')
m1.save()
m2 = UserAddress(user_id=1, street_address_line2='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_line2_filled_right(self):
m1 = UserAddress(user_id=1, street_address_line2='dolor')
m1.save()
m2 = UserAddress(user_id=1, street_address_line2='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_line2_filled_fully_duplicated(self):
m1 = UserAddress(user_id=1, street_address_line2='lorem ipsum dolor')
m1.save()
m2 = UserAddress(user_id=1, street_address_line2='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__street_address_line2_filled_different(self):
m1 = UserAddress(user_id=1, street_address_line2='lorem ipsum sitam')
m1.save()
m2 = UserAddress(user_id=1, street_address_line2='lorem ipsum dolor')
m2.save()
self.assertNotEqual(m2.id, m1.id)
def test_match__zipcode_not_filled(self):
m1 = UserAddress(user_id=1, zipcode='')
m1.save()
m2 = UserAddress(user_id=1, zipcode='WC2N 5DU UK')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__zipcode_is_null(self):
m1 = UserAddress(user_id=1, zipcode=None)
m1.save()
m2 = UserAddress(user_id=1, zipcode='WC2N 5DU UK')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__zipcode_filled_left(self):
m1 = UserAddress(user_id=1, zipcode='WC2N')
m1.save()
m2 = UserAddress(user_id=1, zipcode='WC2N 5DU')
m2.save()
        # must be different! they are different zip codes
self.assertNotEqual(m2.id, m1.id)
def test_match__zipcode_filled_right(self):
m1 = UserAddress(user_id=1, zipcode='5DU')
m1.save()
m2 = UserAddress(user_id=1, zipcode='WC2N 5DU')
m2.save()
        # must be different! they are different zip codes
self.assertNotEqual(m2.id, m1.id)
def test_match__zipcode_filled_fully_duplicated(self):
m1 = UserAddress(user_id=1, zipcode='WC2N 5DU')
m1.save()
m2 = UserAddress(user_id=1, zipcode='WC2N 5DU')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__zipcode_filled_different(self):
m1 = UserAddress(user_id=1, zipcode='WC2N ABC')
m1.save()
m2 = UserAddress(user_id=1, zipcode='WC2N DEF')
m2.save()
self.assertNotEqual(m2.id, m1.id)
def test_match__city_not_filled(self):
m1 = UserAddress(user_id=1, city='')
m1.save()
m2 = UserAddress(user_id=1, city='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__city_filled_left(self):
m1 = UserAddress(user_id=1, city='lorem')
m1.save()
m2 = UserAddress(user_id=1, city='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__city_filled_center(self):
m1 = UserAddress(user_id=1, city='ipsum')
m1.save()
m2 = UserAddress(user_id=1, city='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__city_filled_right(self):
m1 = UserAddress(user_id=1, city='dolor')
m1.save()
m2 = UserAddress(user_id=1, city='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__city_filled_fully_duplicated(self):
m1 = UserAddress(user_id=1, city='lorem ipsum dolor')
m1.save()
m2 = UserAddress(user_id=1, city='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__state_not_filled(self):
m1 = UserAddress(user_id=1, state='')
m1.save()
m2 = UserAddress(user_id=1, state='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__state_is_null(self):
m1 = UserAddress(user_id=1, state=None)
m1.save()
m2 = UserAddress(user_id=1, state='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__state_filled_left(self):
m1 = UserAddress(user_id=1, state='lorem')
m1.save()
m2 = UserAddress(user_id=1, state='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__state_filled_center(self):
m1 = UserAddress(user_id=1, state='ipsum')
m1.save()
m2 = UserAddress(user_id=1, state='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__state_filled_right(self):
m1 = UserAddress(user_id=1, state='dolor')
m1.save()
m2 = UserAddress(user_id=1, state='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__state_filled_fully_duplicated(self):
m1 = UserAddress(user_id=1, state='lorem ipsum dolor')
m1.save()
m2 = UserAddress(user_id=1, state='lorem ipsum dolor')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__country_not_filled(self):
m1 = UserAddress(user_id=1, country='')
m1.save()
m2 = UserAddress(user_id=1, country='UK')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__country_filled_fully_duplicated(self):
m1 = UserAddress(user_id=1, country='UK')
m1.save()
m2 = UserAddress(user_id=1, country='UK')
m2.save()
self.assertEqual(m2.id, m1.id)
def test_match__country_filled_different(self):
m1 = UserAddress(user_id=1, country='US')
m1.save()
m2 = UserAddress(user_id=1, country='UK')
m2.save()
self.assertNotEqual(m2.id, m1.id)
if __name__ == '__main__':
unittest.main()
| 32.698376 | 79 | 0.602994 | 1,787 | 14,093 | 4.522104 | 0.067711 | 0.033783 | 0.077961 | 0.196015 | 0.873531 | 0.870437 | 0.859052 | 0.845069 | 0.743967 | 0.689519 | 0 | 0.039171 | 0.280849 | 14,093 | 430 | 80 | 32.774419 | 0.758165 | 0.03037 | 0 | 0.641975 | 0 | 0 | 0.102403 | 0.005201 | 0 | 0 | 0 | 0.002326 | 0.169753 | 1 | 0.145062 | false | 0 | 0.015432 | 0 | 0.16358 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
74e23c83e374910c05a72febec2e34ed1f2a2e36 | 29,404 | py | Python | neural_models/architectures/old_fnet/velocity_update.py | ajupatatero/neurasim | c1d3f8163a7389b06a13e453daa98ad5157d9b2e | [
"MIT"
] | null | null | null | neural_models/architectures/old_fnet/velocity_update.py | ajupatatero/neurasim | c1d3f8163a7389b06a13e453daa98ad5157d9b2e | [
"MIT"
] | null | null | null | neural_models/architectures/old_fnet/velocity_update.py | ajupatatero/neurasim | c1d3f8163a7389b06a13e453daa98ad5157d9b2e | [
"MIT"
] | null | null | null |
import torch
from . import CellType
from . import velocityDivergence
import matplotlib.pyplot as plt
def velocityUpdateNot(pressure, U, flags):
r""" Calculate the pressure gradient and subtract it into (i.e. calculate
U' = U - grad(p)). Some care must be taken with handling boundary conditions.
This function mimics correctVelocity in Manta.
Velocity update is done IN-PLACE.
Arguments:
p (Tensor): scalar pressure field.
U (Tensor): velocity field (size(2) can be 2 or 3, indicating 2D / 3D)
flags (Tensor): input occupancy grid
"""
# Check arguments.
assert U.dim() == 5 and flags.dim() == 5 and pressure.dim() == 5, \
"Dimension mismatch"
assert flags.size(1) == 1, "flags is not scalar"
b = flags.size(0)
d = flags.size(2)
h = flags.size(3)
w = flags.size(4)
is3D = (U.size(1) == 3)
if not is3D:
assert d == 1, "d > 1 for a 2D domain"
        assert U.size(1) == 2, "2D velocity field must have only 2 channels"
assert U.size(0) == b and U.size(2) == d and U.size(3) == h \
and U.size(4) == w, "size mismatch"
assert pressure.is_same_size(flags), "size mismatch"
assert U.is_contiguous() and flags.is_contiguous() and \
pressure.is_contiguous(), "Input is not contiguous"
# First, we build the mask for detecting fluid cells. Borders are left untouched.
# mask_fluid Fluid cells.
# mask_fluid_i Fluid cells with (i-1) neighbour also a fluid.
# mask_fluid_j Fluid cells with (j-1) neighbour also a fluid.
    # mask_fluid_k Fluid cells with (k-1) neighbour also a fluid.
# Second, we detect obstacle cells
# See Bridson p44 for algorithm and boundaries treatment.
if not is3D:
# Current cell is fluid
mask_fluid = flags.narrow(4, 1, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid)
# Current is fluid and neighbour to left or down are fluid
mask_fluid_i = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid))
mask_fluid_j = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeFluid))
# Current cell is fluid and neighbours to left or down are obstacle
mask_fluid_obstacle_im1 = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeEmpty))
mask_fluid_obstacle_jm1 = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeEmpty))
# Current cell is obstacle and not outflow
mask_obstacle = flags.narrow(4, 1, w-2).narrow(3, 1, h-2) \
.eq(CellType.TypeEmpty).__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 1, h-2) \
.ne(CellType.TypeOutflow))
# Current cell is obstacle and neighbours to left or down are fluid
mask_obstacle_fluid_im1 = mask_obstacle.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid))
mask_obstacle_fluid_jm1 = mask_obstacle.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeFluid))
# Current cell is obstacle and neighbours to left or down are not fluid
mask_no_fluid_im1 = mask_obstacle.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeEmpty))
mask_no_fluid_jm1 = mask_obstacle.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeEmpty))
else:
mask_fluid = flags.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid)
mask_fluid_i = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid))
        mask_fluid_j = mask_fluid.__and__ \
            (flags.narrow(4, 1, w-2).narrow(3, 0, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid))
        mask_fluid_k = mask_fluid.__and__ \
            (flags.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 0, d-2).eq(CellType.TypeFluid))
# Cast into float or double tensor and cat into a single mask along chan.
mask_fluid_i_f = mask_fluid_i.type(U.type())
mask_fluid_j_f = mask_fluid_j.type(U.type())
mask_fluid_obstacle_i_f = mask_fluid_obstacle_im1.type(U.type())
mask_fluid_obstacle_j_f = mask_fluid_obstacle_jm1.type(U.type())
mask_obstacle_fluid_i_f = mask_obstacle_fluid_im1.type(U.type())
mask_obstacle_fluid_j_f = mask_obstacle_fluid_jm1.type(U.type())
mask_no_fluid_i_f = mask_no_fluid_im1.type(U.type())
mask_no_fluid_j_f = mask_no_fluid_jm1.type(U.type())
if is3D:
mask_fluid_k_f = mask_fluid_k.type(U.type())
if not is3D:
mask_fluid = torch.cat((mask_fluid_i_f, mask_fluid_j_f), 1).contiguous()
mask_fluid_obstacle = torch.cat((mask_fluid_obstacle_i_f, mask_fluid_obstacle_j_f), 1).contiguous()
mask_obstacle_fluid = torch.cat((mask_obstacle_fluid_i_f, mask_obstacle_fluid_j_f), 1).contiguous()
mask_no_fluid = torch.cat((mask_no_fluid_i_f, mask_no_fluid_j_f), 1).contiguous()
else:
mask_fluid = torch.cat((mask_fluid_i_f, mask_fluid_j_f, mask_fluid_k_f), 1).contiguous()
# pressure tensor.
#Pijk Pressure at (i,j,k) in 3 channels (2 for 2D).
#Pijk_m Pressure at chan 0: (i-1, j, k)
# chan 1: (i, j-1, k)
# chan 2: (i, j, k-1)
if not is3D:
Pijk = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2)
Pijk = Pijk.clone().expand(b, 2, d, h-2, w-2)
Pijk_m = Pijk.clone().expand(b, 2, d, h-2, w-2)
Pijk_m[:,0] = pressure.narrow(4, 0, w-2).narrow(3, 1, h-2).squeeze(1)
Pijk_m[:,1] = pressure.narrow(4, 1, w-2).narrow(3, 0, h-2).squeeze(1)
else:
        Pijk = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2)
        Pijk = Pijk.clone().expand(b, 3, d-2, h-2, w-2)
        Pijk_m = Pijk.clone().expand(b, 3, d-2, h-2, w-2)
        Pijk_m[:,0] = pressure.narrow(4, 0, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).squeeze(1)
        Pijk_m[:,1] = pressure.narrow(4, 1, w-2).narrow(3, 0, h-2).narrow(2, 1, d-2).squeeze(1)
        Pijk_m[:,2] = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 0, d-2).squeeze(1)
# grad(p) = [[ p(i,j,k) - p(i-1,j,k) ]
# [ p(i,j,k) - p(i,j-1,k) ]
# [ p(i,j,k) - p(i,j,k-1) ]]
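    # Worked 1D example: with p = [0, 1, 3] along x, the face updates are
    # u(1) -= p(1) - p(0) = 1 and u(2) -= p(2) - p(1) = 2, which is what
    # drives the corrected field towards zero divergence.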
if not is3D:
# Three cases:
# 1) Cell is fluid and left neighbour is fluid:
# u = u - grad(p)
# 2) Cell is fluid and left neighbour is obstacle
# u = u - p(i,j)
# 3) Cell is obstacle and left neighbour is fluid
# u = u + p(i-1,j)
U[:,:,:,1:(h-1),1:(w-1)] = (mask_fluid * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk - Pijk_m)) + \
mask_fluid_obstacle * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - Pijk) + \
mask_obstacle_fluid * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) + Pijk_m) + \
mask_no_fluid * (0))
else:
        U[:,:,1:(d-1),1:(h-1),1:(w-1)] = mask_fluid * \
            (U.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2) - (Pijk - Pijk_m))
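# Minimal illustrative sketch (added for exposition, not part of the original
# module): apply velocityUpdateNot to a tiny all-fluid 2D grid with a linear
# pressure ramp along x, so every interior x-face receives the same correction
# u_x -= 1. The helper name _demo_velocity_update_2d is ours.
def _demo_velocity_update_2d():
    b, d, h, w = 1, 1, 4, 4
    # Border cells are TypeEmpty (treated as obstacle here), interior is fluid.
    flags = torch.full((b, 1, d, h, w), float(CellType.TypeEmpty))
    flags[:, :, :, 1:h-1, 1:w-1] = float(CellType.TypeFluid)
    # Pressure grows by 1 per cell along x, so grad(p)_x = 1 in the interior.
    pressure = torch.arange(w, dtype=torch.float32) \
        .view(1, 1, 1, 1, w).expand(b, 1, d, h, w).contiguous()
    U = torch.zeros(b, 2, d, h, w)
    velocityUpdateNot(pressure, U, flags)
    # The x-component of every interior fluid face is now -1; y-faces touching
    # the bottom wall also pick up -p because the wall pressure is taken as 0.
    print(U[0, 0, 0])  # u_x on the 2D slice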
def velocityUpdate(pressure, U, flags):
r""" Calculate the pressure gradient and subtract it into (i.e. calculate
U' = U - grad(p)). Some care must be taken with handling boundary conditions.
This function mimics correctVelocity in Manta.
Velocity update is done IN-PLACE.
Arguments:
p (Tensor): scalar pressure field.
U (Tensor): velocity field (size(2) can be 2 or 3, indicating 2D / 3D)
flags (Tensor): input occupancy grid
"""
# Check arguments.
assert U.dim() == 5 and flags.dim() == 5 and pressure.dim() == 5, \
"Dimension mismatch"
assert flags.size(1) == 1, "flags is not scalar"
b = flags.size(0)
d = flags.size(2)
h = flags.size(3)
w = flags.size(4)
is3D = (U.size(1) == 3)
if not is3D:
assert d == 1, "d > 1 for a 2D domain"
        assert U.size(1) == 2, "2D velocity field must have only 2 channels"
assert U.size(0) == b and U.size(2) == d and U.size(3) == h \
and U.size(4) == w, "size mismatch"
assert pressure.is_same_size(flags), "size mismatch"
assert U.is_contiguous(), "U is not contiguous"
assert flags.is_contiguous(), "Flags is not contiguous"
assert pressure.is_contiguous(), "Pressure is not contiguous"
# First, we build the mask for detecting fluid cells. Borders are left untouched.
# mask_fluid Fluid cells.
# mask_fluid_i Fluid cells with (i-1) neighbour also a fluid.
# mask_fluid_j Fluid cells with (j-1) neighbour also a fluid.
    # mask_fluid_k Fluid cells with (k-1) neighbour also a fluid.
# Second, we detect obstacle cells
# See Bridson p44 for algorithm and boundaries treatment.
if not is3D:
# Current cell is fluid
mask_fluid = flags.narrow(4, 1, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid)
# Current cell is inflow
mask_inflow = flags.narrow(4, 1, w-2).narrow(3, 1, h-2).eq(CellType.TypeInflow)
# Current cell is outflow
mask_outflow = flags.narrow(4, 1, w-2).narrow(3, 1, h-2).eq(CellType.TypeOutflow)
# Current is fluid and neighbour to left or down are fluid
mask_fluid_i = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid))
mask_fluid_j = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeFluid))
# Current is inflow and neighbour to left or down are inflow
mask_inflow_i = mask_inflow.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeInflow))
mask_inflow_j = mask_inflow.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeInflow))
        # Current is outflow and neighbour to left or down are outflow
mask_outflow_i = mask_outflow.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeOutflow))
mask_outflow_j = mask_outflow.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeOutflow))
# Current cell is fluid and neighbours to left or down are obstacle
mask_fluid_obstacle_im1 = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeEmpty))
mask_fluid_obstacle_jm1 = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeEmpty))
# Current cell is inflow and neighbours to left or down are obstacle
mask_inflow_obstacle_im1 = mask_inflow.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeEmpty))
mask_inflow_obstacle_jm1 = mask_inflow.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeEmpty))
# Current cell is fluid and neighbours to left or down are Inflows
mask_fluid_inflow_im1 = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeInflow))
mask_fluid_inflow_jm1 = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeInflow))
# Current cell is fluid and neighbours to left or down are Outflows
mask_fluid_outflow_im1 = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeOutflow))
mask_fluid_outflow_jm1 = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeOutflow))
# Current cell is obstacle and not outflow
mask_obstacle = flags.narrow(4, 1, w-2).narrow(3, 1, h-2) \
.eq(CellType.TypeEmpty).__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 1, h-2) \
.ne(CellType.TypeOutflow))
# Current cell is obstacle and neighbours to left or down are fluid
mask_obstacle_fluid_im1 = mask_obstacle.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid))
mask_obstacle_fluid_jm1 = mask_obstacle.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeFluid))
# Current cell is obstacle and neighbours to left or down are inflow
mask_obstacle_inflow_im1 = mask_obstacle.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeInflow))
mask_obstacle_inflow_jm1 = mask_obstacle.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeInflow))
# Current cell is inflow and neighbours to left or down are fluid
mask_inflow_fluid_im1 = mask_inflow.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid))
mask_inflow_fluid_jm1 = mask_inflow.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeFluid))
# Current cell is outflow and neighbours to left or down are fluid
mask_outflow_fluid_im1 = mask_outflow.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid))
mask_outflow_fluid_jm1 = mask_outflow.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeFluid))
# Current cell is outflow and neighbours to left or down are obstacle
mask_outflow_obstacle_im1 = mask_outflow.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeEmpty))
mask_outflow_obstacle_jm1 = mask_outflow.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeEmpty))
        # Current cell is obstacle and neighbours to left or down are empty
mask_no_fluid_im1 = mask_obstacle.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeEmpty))
mask_no_fluid_jm1 = mask_obstacle.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeEmpty))
# Current cell is outflow and neighbours to left or down are not fluid
mask_outflow_no_fluid_im1 = mask_outflow.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).ne(CellType.TypeFluid))
mask_outflow_no_fluid_jm1 = mask_outflow.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).ne(CellType.TypeFluid))
else:
mask_fluid = flags.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid)
mask_fluid_i = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid))
        mask_fluid_j = mask_fluid.__and__ \
            (flags.narrow(4, 1, w-2).narrow(3, 0, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid))
        mask_fluid_k = mask_fluid.__and__ \
            (flags.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 0, d-2).eq(CellType.TypeFluid))
# Cast into float or double tensor and cat into a single mask along chan.
mask_fluid_i_f = mask_fluid_i.type(U.type())
mask_fluid_j_f = mask_fluid_j.type(U.type())
mask_inflow_i_f = mask_inflow_i.type(U.type())
mask_inflow_j_f = mask_inflow_j.type(U.type())
mask_outflow_i_f = mask_outflow_i.type(U.type())
mask_outflow_j_f = mask_outflow_j.type(U.type())
mask_fluid_obstacle_i_f = mask_fluid_obstacle_im1.type(U.type())
mask_fluid_obstacle_j_f = mask_fluid_obstacle_jm1.type(U.type())
mask_fluid_inflow_im1 = mask_fluid_inflow_im1.type(U.type())
mask_fluid_inflow_jm1 = mask_fluid_inflow_jm1.type(U.type())
mask_fluid_outflow_im1 = mask_fluid_outflow_im1.type(U.type())
mask_fluid_outflow_jm1 = mask_fluid_outflow_jm1.type(U.type())
mask_inflow_fluid_im1 = mask_inflow_fluid_im1.type(U.type())
mask_inflow_fluid_jm1 = mask_inflow_fluid_jm1.type(U.type())
mask_inflow_obstacle_im1 = mask_inflow_obstacle_im1.type(U.type())
mask_inflow_obstacle_jm1 = mask_inflow_obstacle_jm1.type(U.type())
mask_outflow_fluid_im1 = mask_outflow_fluid_im1.type(U.type())
mask_outflow_fluid_jm1 = mask_outflow_fluid_jm1.type(U.type())
mask_obstacle_fluid_i_f = mask_obstacle_fluid_im1.type(U.type())
mask_obstacle_fluid_j_f = mask_obstacle_fluid_jm1.type(U.type())
mask_obstacle_inflow_i_f = mask_obstacle_inflow_im1.type(U.type())
mask_obstacle_inflow_j_f = mask_obstacle_inflow_jm1.type(U.type())
mask_no_fluid_i_f = mask_no_fluid_im1.type(U.type())
mask_no_fluid_j_f = mask_no_fluid_jm1.type(U.type())
if is3D:
mask_fluid_k_f = mask_fluid_k.type(U.type())
if not is3D:
mask_fluid = torch.cat((mask_fluid_i_f, mask_fluid_j_f), 1).contiguous()
mask_inflow = torch.cat((mask_inflow_i_f, mask_inflow_j_f), 1).contiguous()
mask_outflow = torch.cat((mask_outflow_i_f, mask_outflow_j_f), 1).contiguous()
mask_fluid_obstacle = torch.cat((mask_fluid_obstacle_i_f, mask_fluid_obstacle_j_f), 1).contiguous()
mask_fluid_inflow = torch.cat((mask_fluid_inflow_im1, mask_fluid_inflow_jm1), 1).contiguous()
mask_fluid_outflow = torch.cat((mask_fluid_outflow_im1, mask_fluid_outflow_jm1), 1).contiguous()
mask_obstacle_fluid = torch.cat((mask_obstacle_fluid_i_f, mask_obstacle_fluid_j_f), 1).contiguous()
mask_obstacle_inflow = torch.cat((mask_obstacle_inflow_i_f, mask_obstacle_inflow_j_f), 1).contiguous()
mask_inflow_fluid = torch.cat((mask_inflow_fluid_im1, mask_inflow_fluid_jm1), 1).contiguous()
mask_outflow_fluid = torch.cat((mask_outflow_fluid_im1, mask_outflow_fluid_jm1), 1).contiguous()
mask_inflow_obstacle = torch.cat((mask_inflow_obstacle_im1, mask_inflow_obstacle_jm1), 1).contiguous()
mask_no_fluid = torch.cat((mask_no_fluid_i_f, mask_no_fluid_j_f), 1).contiguous()
else:
mask_fluid = torch.cat((mask_fluid_i_f, mask_fluid_j_f, mask_fluid_k_f), 1).contiguous()
# pressure tensor.
#Pijk Pressure at (i,j,k) in 3 channels (2 for 2D).
#Pijk_m Pressure at chan 0: (i-1, j, k)
# chan 1: (i, j-1, k)
# chan 2: (i, j, k-1)
#Pijk_p Pressure at chan 0: (i+1, j, k)
# chan 1: (i, j+1, k)
# chan 2: (i, j, k+1)
if not is3D:
Pijk = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2)
Pijk = Pijk.clone().expand(b, 2, d, h-2, w-2)
Pijk_m = Pijk.clone().expand(b, 2, d, h-2, w-2)
Pijk_p = Pijk.clone().expand(b, 2, d, h-2, w-2)
Pijk_m[:,0] = pressure.narrow(4, 0, w-2).narrow(3, 1, h-2).squeeze(1)
Pijk_m[:,1] = pressure.narrow(4, 1, w-2).narrow(3, 0, h-2).squeeze(1)
Pijk_p[:,0] = pressure.narrow(4, 2, w-2).narrow(3, 1, h-2).squeeze(1)
Pijk_p[:,1] = pressure.narrow(4, 1, w-2).narrow(3, 2, h-2).squeeze(1)
else:
        Pijk = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2)
        Pijk = Pijk.clone().expand(b, 3, d-2, h-2, w-2)
        Pijk_m = Pijk.clone().expand(b, 3, d-2, h-2, w-2)
        Pijk_m[:,0] = pressure.narrow(4, 0, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).squeeze(1)
        Pijk_m[:,1] = pressure.narrow(4, 1, w-2).narrow(3, 0, h-2).narrow(2, 1, d-2).squeeze(1)
        Pijk_m[:,2] = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 0, d-2).squeeze(1)
# grad(p) = [[ p(i,j,k) - p(i-1,j,k) ]
# [ p(i,j,k) - p(i,j-1,k) ]
# [ p(i,j,k) - p(i,j,k-1) ]]
if not is3D:
# Three cases:
# 1) Cell is fluid and left neighbour is fluid:
# u = u - grad(p)
# 2) Cell is fluid and left neighbour is obstacle
# u = u - p(i,j)
# 3) Cell is obstacle and left neighbour is fluid
# u = u + p(i-1,j)
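        # NOTE: mask_fluid is overwritten with ones below, so the full
        # gradient update is applied at every interior face regardless of
        # the cell-type masks computed above (apparently a debug override).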
mask_fluid = mask_fluid*0 +1
U[:,:,:,1:(h-1),1:(w-1)] = (mask_fluid * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk - Pijk_m)) + \
mask_fluid_obstacle * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - Pijk) + \
mask_obstacle_fluid * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) + Pijk_m) + \
mask_inflow * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk - Pijk_m)) +
mask_fluid_inflow * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk - Pijk_m)) + \
mask_inflow_fluid * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk - Pijk_m)) + \
mask_obstacle_inflow * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk - Pijk_m)) + \
mask_inflow_obstacle * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk - Pijk_m)) + \
mask_no_fluid * (0))
else:
U[:,:,1:(d-1),1:(h-1),1:(w-1)] = mask_fluid * \
(U.narrow(4, 1, w-1).narrow(3, 1, h-1).narrow(2, 1, d-2) - (Pijk - Pijk_m))
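# Worked 2D example of the case logic above, with made-up numbers: at a face
# where the cell and its left neighbour are both fluid, with p(i,j) = 2.0 and
# p(i-1,j) = 1.5, the x-velocity update is u <- u - (2.0 - 1.5) = u - 0.5;
# a fluid cell with an obstacle to its left subtracts the full p(i,j), and an
# obstacle cell with fluid to its left adds p(i-1,j) back.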
def velocityUpdate_Density(pressure, U, flags, density):
r""" Calculate the pressure gradient and subtract it into (i.e. calculate
U' = U - grad(p)/rho). Some care must be taken with handling boundary conditions.
This function mimics correctVelocity in Manta.
Velocity update is done IN-PLACE.
Arguments:
pressure (Tensor): scalar pressure field.
U (Tensor): velocity field (size(2) can be 2 or 3, indicating 2D / 3D)
flags (Tensor): input occupancy grid
density (Tensor): scalar density field.
"""
# Check arguments.
assert U.dim() == 5 and flags.dim() == 5 and pressure.dim() == 5 and density.dim() == 5, \
"Dimension mismatch"
assert flags.size(1) == 1, "flags is not scalar"
b = flags.size(0)
d = flags.size(2)
h = flags.size(3)
w = flags.size(4)
is3D = (U.size(1) == 3)
if not is3D:
assert d == 1, "d > 1 for a 2D domain"
assert U.size(1) == 2, "2D velocity field must have only 2 channels"
assert U.size(0) == b and U.size(2) == d and U.size(3) == h \
and U.size(4) == w, "size mismatch"
assert pressure.is_same_size(flags), "size mismatch"
assert density.is_same_size(flags), "size mismatch"
assert U.is_contiguous() and flags.is_contiguous() and \
pressure.is_contiguous() and density.is_contiguous(), "Input is not contiguous"
# First, we build the mask for detecting fluid cells. Borders are left untouched.
# mask_fluid Fluid cells.
# mask_fluid_i Fluid cells with (i-1) neighbour also a fluid.
# mask_fluid_j Fluid cells with (j-1) neighbour also a fluid.
# mask_fluid_k Fluid cells with (k-1) neighbour also a fluid.
# Second, we detect obstacle cells
# See Bridson p44 for algorithm and boundaries treatment.
if not is3D:
# Current cell is fluid
mask_fluid = flags.narrow(4, 1, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid)
# Current is fluid and neighbour to left or down are fluid
mask_fluid_i = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid))
mask_fluid_j = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeFluid))
# Current cell is fluid and neighbours to left or down are obstacle
mask_fluid_obstacle_im1 = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeObstacle))
mask_fluid_obstacle_jm1 = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeObstacle))
# Alternative that instead treats empty neighbours as obstacles:
#mask_fluid_obstacle_im1 = mask_fluid.__and__ \
# (flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeEmpty))
#mask_fluid_obstacle_jm1 = mask_fluid.__and__ \
# (flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeEmpty))
mask_obstacle = flags.narrow(4, 1, w-2).narrow(3, 1, h-2) \
.eq(CellType.TypeObstacle).__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 1, h-2) \
.ne(CellType.TypeOutflow))
# Current cell is obstacle and neighbours to left or down are fluid
mask_obstacle_fluid_im1 = mask_obstacle.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeFluid))
mask_obstacle_fluid_jm1 = mask_obstacle.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeFluid))
# Current cell is obstacle and neighbours to left or down are not fluid
mask_no_fluid_im1 = mask_obstacle.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).eq(CellType.TypeEmpty))
mask_no_fluid_jm1 = mask_obstacle.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).eq(CellType.TypeEmpty))
else:
mask_fluid = flags.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid)
mask_fluid_i = mask_fluid.__and__ \
(flags.narrow(4, 0, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid))
mask_fluid_j = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 0, h-2).narrow(2, 1, d-2).eq(CellType.TypeFluid))
mask_fluid_k = mask_fluid.__and__ \
(flags.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 0, d-2).eq(CellType.TypeFluid))
# Cast into float or double tensor and cat into a single mask along chan.
mask_fluid_i_f = mask_fluid_i.type(U.type())
mask_fluid_j_f = mask_fluid_j.type(U.type())
mask_fluid_obstacle_i_f = mask_fluid_obstacle_im1.type(U.type())
mask_fluid_obstacle_j_f = mask_fluid_obstacle_jm1.type(U.type())
mask_obstacle_fluid_i_f = mask_obstacle_fluid_im1.type(U.type())
mask_obstacle_fluid_j_f = mask_obstacle_fluid_jm1.type(U.type())
mask_no_fluid_i_f = mask_no_fluid_im1.type(U.type())
mask_no_fluid_j_f = mask_no_fluid_jm1.type(U.type())
if is3D:
mask_fluid_k_f = mask_fluid_k.type(U.type())
if not is3D:
mask_fluid = torch.cat((mask_fluid_i_f, mask_fluid_j_f), 1).contiguous()
mask_fluid_obstacle = torch.cat((mask_fluid_obstacle_i_f, mask_fluid_obstacle_j_f), 1).contiguous()
mask_obstacle_fluid = torch.cat((mask_obstacle_fluid_i_f, mask_obstacle_fluid_j_f), 1).contiguous()
mask_no_fluid = torch.cat((mask_no_fluid_i_f, mask_no_fluid_j_f), 1).contiguous()
else:
mask_fluid = torch.cat((mask_fluid_i_f, mask_fluid_j_f, mask_fluid_k_f), 1).contiguous()
# pressure tensor.
#Pijk Pressure at (i,j,k) in 3 channels (2 for 2D).
#Pijk_m Pressure at chan 0: (i-1, j, k)
# chan 1: (i, j-1, k)
# chan 2: (i, j, k-1)
#Pijk_p Pressure at chan 0: (i+1, j, k)
# chan 1: (i, j+1, k)
# chan 2: (i, j, k+1)
if not is3D:
Pijk = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2)
Pijk = Pijk.clone().expand(b, 2, d, h-2, w-2)
Pijk_m = Pijk.clone().expand(b, 2, d, h-2, w-2)
Pijk_p = Pijk.clone().expand(b, 2, d, h-2, w-2)
Pijk_m[:,0] = pressure.narrow(4, 0, w-2).narrow(3, 1, h-2).squeeze(1)
Pijk_m[:,1] = pressure.narrow(4, 1, w-2).narrow(3, 0, h-2).squeeze(1)
Pijk_p[:,0] = pressure.narrow(4, 2, w-2).narrow(3, 1, h-2).squeeze(1)
Pijk_p[:,1] = pressure.narrow(4, 1, w-2).narrow(3, 2, h-2).squeeze(1)
Rhoijk = density.narrow(4, 1, w-2).narrow(3, 1, h-2)
Rhoijk = Rhoijk.clone().expand(b, 2, d, h-2, w-2)
Rhoijk_m = Rhoijk.clone().expand(b, 2, d, h-2, w-2)
Rhoijk_p = Rhoijk.clone().expand(b, 2, d, h-2, w-2)
Rhoijk_m[:,0] = density.narrow(4, 0, w-2).narrow(3, 1, h-2).squeeze(1)
Rhoijk_m[:,1] = density.narrow(4, 1, w-2).narrow(3, 0, h-2).squeeze(1)
Rhoijk_p[:,0] = density.narrow(4, 2, w-2).narrow(3, 1, h-2).squeeze(1)
Rhoijk_p[:,1] = density.narrow(4, 1, w-2).narrow(3, 2, h-2).squeeze(1)
#print("Pijk ", Pijk)
#print("Pijk_m ", Pijk_m)
#print("Pijk_p ", Pijk_p)
else:
Pijk = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2)
Pijk = Pijk.clone().expand(b, 3, d-2, h-2, w-2)
Pijk_m = Pijk.clone().expand(b, 3, d-2, h-2, w-2)
Pijk_m[:,0] = pressure.narrow(4, 0, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2).squeeze(1)
Pijk_m[:,1] = pressure.narrow(4, 1, w-2).narrow(3, 0, h-2).narrow(2, 1, d-2).squeeze(1)
Pijk_m[:,2] = pressure.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 0, d-2).squeeze(1)
# grad(p) = [[ p(i,j,k) - p(i-1,j,k) ]
# [ p(i,j,k) - p(i,j-1,k) ]
# [ p(i,j,k) - p(i,j,k-1) ]]
if not is3D:
# Three cases:
# 1) Cell is fluid and left neighbour is fluid:
# u = u - grad(p)
# 2) Cell is fluid and left neighbour is obstacle
# u = u - p(i,j)
# 3) Cell is obstacle and left neighbour is fluid
# u = u + p(i-1,j)
# Alternative fluid/obstacle term:
#   mask_fluid_obstacle * (U.narrow(4, 1, w-2).narrow(3, 1, h-2) - Pijk)
U[:,:,:,1:(h-1),1:(w-1)] = (mask_fluid * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk - Pijk_m)/(1-Rhoijk)) + \
mask_fluid_obstacle * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) - (Pijk_p - Pijk)) + \
mask_obstacle_fluid * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2) + Pijk_m))
else:
U[:,:,1:(d-1),1:(h-1),1:(w-1)] = mask_fluid * \
(U.narrow(4, 1, w-2).narrow(3, 1, h-2).narrow(2, 1, d-2) - (Pijk - Pijk_m))
| 47.733766 | 110 | 0.60536 | 4,923 | 29,404 | 3.403413 | 0.030875 | 0.083796 | 0.045837 | 0.051567 | 0.955297 | 0.947956 | 0.913339 | 0.891793 | 0.885944 | 0.881886 | 0 | 0.054224 | 0.244865 | 29,404 | 615 | 111 | 47.811382 | 0.700369 | 0.212828 | 0 | 0.76584 | 0 | 0 | 0.022176 | 0 | 0 | 0 | 0 | 0 | 0.066116 | 1 | 0.008264 | false | 0 | 0.011019 | 0 | 0.019284 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
776151a6f067b8fc2138a88b8c04bb26eeb7818a | 17,073 | py | Python | kelvin/tests/test_ft_deriv.py | MoleOrbitalHybridAnalyst/kelvin | 99538f8360975e2f80941446d8fbf2e848f74cf9 | [
"MIT"
] | 1 | 2021-08-05T15:53:46.000Z | 2021-08-05T15:53:46.000Z | kelvin/tests/test_ft_deriv.py | MoleOrbitalHybridAnalyst/kelvin | 99538f8360975e2f80941446d8fbf2e848f74cf9 | [
"MIT"
] | null | null | null | kelvin/tests/test_ft_deriv.py | MoleOrbitalHybridAnalyst/kelvin | 99538f8360975e2f80941446d8fbf2e848f74cf9 | [
"MIT"
] | 1 | 2022-01-13T18:41:06.000Z | 2022-01-13T18:41:06.000Z | import unittest
import numpy
from pyscf import gto, scf
from kelvin.ccsd import ccsd
from kelvin.scf_system import SCFSystem
from kelvin.ueg_system import UEGSystem
from kelvin.ueg_scf_system import UEGSCFSystem
from kelvin.pueg_system import PUEGSystem
try:
from lattice.hubbard import Hubbard1D
from kelvin.hubbard_system import HubbardSystem
has_lattice = True
except ImportError:
has_lattice = False
def fd_ESN(m, T, mu, ng, Ecctot, athresh=0.0,
quad='lin', damp=0.0, mi=35, delta=5e-4):
muf = mu + delta
mub = mu - delta
sys = SCFSystem(m, T, muf, orbtype='g')
ccsdT = ccsd(sys, iprint=0, T=T, mu=muf, max_iter=mi, damp=damp,
ngrid=ng, econv=1e-10, athresh=athresh, quad=quad)
Ef, Ecf = ccsdT.run()
sys = SCFSystem(m, T, mub, orbtype='g')
ccsdT = ccsd(sys, iprint=0, T=T, mu=mub, max_iter=mi, damp=damp,
ngrid=ng, econv=1e-10, athresh=athresh, quad=quad)
Eb, Ecb = ccsdT.run()
Nx = -(Ef - Eb)/(2*delta)
Tf = T + delta
Tb = T - delta
sys = SCFSystem(m, Tf, mu, orbtype='g')
ccsdT = ccsd(sys, iprint=0, T=Tf, mu=mu, max_iter=mi, damp=damp,
ngrid=ng, econv=1e-10, athresh=athresh, quad=quad)
Ef, Ecf = ccsdT.run()
sys = SCFSystem(m, Tb, mu, orbtype='g')
ccsdT = ccsd(sys, iprint=0, T=Tb, mu=mu, max_iter=mi, damp=damp,
ngrid=ng, econv=1e-10, athresh=athresh, quad=quad)
Eb, Ecb = ccsdT.run()
Sx = -(Ef - Eb)/(2*delta)
Ex = Ecctot + T*Sx + mu*Nx
return (Ex, Nx, Sx)
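# A note on the finite differences above: assuming the total energy returned
# by ccsd.run() is the grand potential Omega(T, mu), thermodynamics gives
# N = -dOmega/dmu and S = -dOmega/dT, so Nx and Sx are central differences,
# e.g. Sx = -(Omega(T+delta) - Omega(T-delta))/(2*delta), and the internal
# energy is recovered from E = Omega + T*S + mu*N, which is the Ex above.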
class FTDerivTest(unittest.TestCase):
def setUp(self):
self.Bethresh = 1e-5
self.uegthresh = 1e-5
self.hthresh = 1e-6
def test_Be_sto3g_gen(self):
mol = gto.M(
verbose=0,
atom='Be 0 0 0',
basis='sto-3G')
m = scf.RHF(mol)
m.conv_tol = 1e-13
m.scf()
T = 2.0
mu = 0.0
ng = 8
sys = SCFSystem(m, T, mu, orbtype='g')
ccsdT = ccsd(sys, iprint=0, T=T, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10, singles=True)
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
Ex, Nx, Sx = fd_ESN(m, T, mu, ng, Ecctot)
dE = abs((ccsdT.E - Ex)/Ex)
dS = abs((ccsdT.S - Sx)/Sx)
dN = abs((ccsdT.N - Nx)/Nx)
eE = "Expected: {} Actual: {}".format(Ex, ccsdT.E)
eS = "Expected: {} Actual: {}".format(Sx, ccsdT.S)
eN = "Expected: {} Actual: {}".format(Nx, ccsdT.N)
self.assertTrue(dE < self.Bethresh, eE)
self.assertTrue(dS < self.Bethresh, eS)
self.assertTrue(dN < self.Bethresh, eN)
def test_Be_sto3g(self):
mol = gto.M(
verbose=0,
atom='Be 0 0 0',
basis='sto-3G')
m = scf.RHF(mol)
m.conv_tol = 1e-13
m.scf()
T = 2.0
mu = 0.0
ng = 8
sys = SCFSystem(m, T, mu, orbtype='u')
ccsdT = ccsd(sys, iprint=0, T=T, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10, singles=True)
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
Ex, Nx, Sx = fd_ESN(m, T, mu, ng, Ecctot)
dE = abs((ccsdT.E - Ex)/Ex)
dS = abs((ccsdT.S - Sx)/Sx)
dN = abs((ccsdT.N - Nx)/Nx)
eE = "Expected: {} Actual: {}".format(Ex, ccsdT.E)
eS = "Expected: {} Actual: {}".format(Sx, ccsdT.S)
eN = "Expected: {} Actual: {}".format(Nx, ccsdT.N)
self.assertTrue(dE < self.Bethresh, eE)
self.assertTrue(dS < self.Bethresh, eS)
self.assertTrue(dN < self.Bethresh, eN)
def test_Be_sto3g_gen_active(self):
mol = gto.M(
verbose=0,
atom='Be 0 0 0',
basis='sto-3G')
m = scf.RHF(mol)
m.conv_tol = 1e-13
m.scf()
T = 0.02
mu = 0.0
ng = 40
athresh = 1e-20
sys = SCFSystem(m, T, mu, orbtype='g')
ccsdT = ccsd(sys, iprint=0, T=T, mu=mu, max_iter=100, damp=0.3,
ngrid=ng, econv=1e-10, athresh=athresh, singles=True)
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
Ex, Nx, Sx = fd_ESN(m, T, mu, ng, Ecctot, athresh=athresh,
damp=0.3, mi=100, delta=2e-5)
dE = abs((ccsdT.E - Ex)/Ex)
dS = abs((ccsdT.S - Sx)/Sx)
dN = abs((ccsdT.N - Nx)/Nx)
eE = "Expected: {} Actual: {}".format(Ex, ccsdT.E)
eS = "Expected: {} Actual: {}".format(Sx, ccsdT.S)
eN = "Expected: {} Actual: {}".format(Nx, ccsdT.N)
self.assertTrue(dE < self.Bethresh, eE)
self.assertTrue(dS < self.Bethresh, eS)
self.assertTrue(dN < self.Bethresh, eN)
def test_Be_sto3g_active(self):
mol = gto.M(
verbose=0,
atom='Be 0 0 0',
basis='sto-3G')
m = scf.RHF(mol)
m.conv_tol = 1e-13
m.scf()
T = 0.02
mu = 0.0
ng = 40
athresh = 1e-20
sys = SCFSystem(m, T, mu, orbtype='u')
ccsdT = ccsd(sys, iprint=0, T=T, mu=mu, max_iter=100, damp=0.3,
ngrid=ng, econv=1e-10, athresh=athresh, singles=True)
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
Ex, Nx, Sx = fd_ESN(m, T, mu, ng, Ecctot, athresh=athresh,
damp=0.3, mi=100, delta=2e-5)
dE = abs((ccsdT.E - Ex)/Ex)
dS = abs((ccsdT.S - Sx)/Sx)
dN = abs((ccsdT.N - Nx)/Nx)
eE = "Expected: {} Actual: {}".format(Ex, ccsdT.E)
eS = "Expected: {} Actual: {}".format(Sx, ccsdT.S)
eN = "Expected: {} Actual: {}".format(Nx, ccsdT.N)
self.assertTrue(dE < self.Bethresh, eE)
self.assertTrue(dS < self.Bethresh, eS)
self.assertTrue(dN < self.Bethresh, eN)
def test_Be_sto3g_ln(self):
mol = gto.M(
verbose=0,
atom='Be 0 0 0',
basis='sto-3G')
m = scf.RHF(mol)
m.conv_tol = 1e-13
m.scf()
T = 2.0
mu = 0.0
ng = 8
sys = SCFSystem(m, T, mu, orbtype='u')
ccsdT = ccsd(sys, iprint=0, T=T, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10, quad='ln')
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
Ex, Nx, Sx = fd_ESN(m, T, mu, ng, Ecctot, quad='ln')
dE = abs((ccsdT.E - Ex)/Ex)
dS = abs((ccsdT.S - Sx)/Sx)
dN = abs((ccsdT.N - Nx)/Nx)
eE = "Expected: {} Actual: {}".format(Ex, ccsdT.E)
eS = "Expected: {} Actual: {}".format(Sx, ccsdT.S)
eN = "Expected: {} Actual: {}".format(Nx, ccsdT.N)
self.assertTrue(dE < self.Bethresh, eE)
self.assertTrue(dS < self.Bethresh, eS)
self.assertTrue(dN < self.Bethresh, eN)
def test_Be_sto3g_sin(self):
mol = gto.M(
verbose=0,
atom='Be 0 0 0',
basis='sto-3G')
m = scf.RHF(mol)
m.conv_tol = 1e-13
m.scf()
T = 2.0
mu = 0.0
ng = 8
sys = SCFSystem(m, T, mu, orbtype='u')
ccsdT = ccsd(sys, iprint=0, T=T, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10, quad='sin')
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
Ex, Nx, Sx = fd_ESN(m, T, mu, ng, Ecctot, quad='sin')
dE = abs((ccsdT.E - Ex)/Ex)
dS = abs((ccsdT.S - Sx)/Sx)
dN = abs((ccsdT.N - Nx)/Nx)
eE = "Expected: {} Actual: {}".format(Ex, ccsdT.E)
eS = "Expected: {} Actual: {}".format(Sx, ccsdT.S)
eN = "Expected: {} Actual: {}".format(Nx, ccsdT.N)
self.assertTrue(dE < self.Bethresh, eE)
self.assertTrue(dS < self.Bethresh, eS)
self.assertTrue(dN < self.Bethresh, eN)
def test_UEG(self):
T = 0.1
mu = 0.1
L = 2*numpy.pi/numpy.sqrt(1.0)
norb = 7
cut = 1.2
damp = 0.2
mi = 50
ng = 8
ueg = UEGSystem(T, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, T=T, mu=mu, iprint=0,
max_iter=mi, damp=damp, ngrid=ng)
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
E = ccsdT.E
S = ccsdT.S
N = ccsdT.N
delta = 1e-4
muf = mu + delta
mub = mu - delta
ueg = UEGSystem(T, L, cut, mu=muf, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=T, mu=muf, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Ef, Ecf = ccsdT.run()
ueg = UEGSystem(T, L, cut, mu=mub, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=T, mu=mub, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Eb, Ecb = ccsdT.run()
Nx = -(Ef - Eb)/(2*delta)
Tf = T + delta
Tb = T - delta
ueg = UEGSystem(Tf, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=Tf, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Ef, Ecf = ccsdT.run()
ueg = UEGSystem(Tb, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=Tb, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Eb, Ecb = ccsdT.run()
Sx = -(Ef - Eb)/(2*delta)
Ex = Ecctot + T*Sx + mu*Nx
dE = abs(E - Ex)
dS = abs(S - Sx)
dN = abs(N - Nx)
eE = "Expected: {} Actual: {}".format(Ex, E)
eS = "Expected: {} Actual: {}".format(Sx, S)
eN = "Expected: {} Actual: {}".format(Nx, N)
self.assertTrue(dE < self.uegthresh, eE)
self.assertTrue(dS < self.uegthresh, eS)
self.assertTrue(dN < self.uegthresh, eN)
def test_UEG2(self):
T = 0.1
mu = 0.1
L = 2*numpy.pi/numpy.sqrt(1.0)
norb = 7
cut = 1.2
damp = 0.2
mi = 50
ng = 8
ueg = UEGSCFSystem(T, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, T=T, mu=mu, iprint=0,
max_iter=mi, damp=damp, ngrid=ng)
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
E = ccsdT.E
S = ccsdT.S
N = ccsdT.N
delta = 1e-4
muf = mu + delta
mub = mu - delta
ueg = UEGSCFSystem(T, L, cut, mu=muf, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=T, mu=muf, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Ef, Ecf = ccsdT.run()
ueg = UEGSCFSystem(T, L, cut, mu=mub, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=T, mu=mub, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Eb, Ecb = ccsdT.run()
Nx = -(Ef - Eb)/(2*delta)
Tf = T + delta
Tb = T - delta
ueg = UEGSCFSystem(Tf, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=Tf, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Ef, Ecf = ccsdT.run()
ueg = UEGSCFSystem(Tb, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=Tb, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Eb, Ecb = ccsdT.run()
Sx = -(Ef - Eb)/(2*delta)
Ex = Ecctot + T*Sx + mu*Nx
dE = abs(E - Ex)
dS = abs(S - Sx)
dN = abs(N - Nx)
eE = "Expected: {} Actual: {}".format(Ex, E)
eS = "Expected: {} Actual: {}".format(Sx, S)
eN = "Expected: {} Actual: {}".format(Nx, N)
self.assertTrue(dE < self.uegthresh, eE)
self.assertTrue(dS < self.uegthresh, eS)
self.assertTrue(dN < self.uegthresh, eN)
def test_UEG_gen(self):
T = 0.1
mu = 0.1
L = 2*numpy.pi/numpy.sqrt(1.0)
norb = 7
cut = 1.2
damp = 0.2
mi = 50
ng = 8
ueg = UEGSystem(T, L, cut, mu=mu, norb=norb, orbtype='g')
ccsdT = ccsd(ueg, T=T, mu=mu, iprint=0,
max_iter=mi, damp=damp, ngrid=ng)
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
E = ccsdT.E
S = ccsdT.S
N = ccsdT.N
delta = 1e-4
muf = mu + delta
mub = mu - delta
ueg = UEGSystem(T, L, cut, mu=muf, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=T, mu=muf, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Ef, Ecf = ccsdT.run()
ueg = UEGSystem(T, L, cut, mu=mub, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=T, mu=mub, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Eb, Ecb = ccsdT.run()
Nx = -(Ef - Eb)/(2*delta)
Tf = T + delta
Tb = T - delta
ueg = UEGSystem(Tf, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=Tf, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Ef, Ecf = ccsdT.run()
ueg = UEGSystem(Tb, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=Tb, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Eb, Ecb = ccsdT.run()
Sx = -(Ef - Eb)/(2*delta)
Ex = Ecctot + T*Sx + mu*Nx
dE = abs(E - Ex)/Ex
dS = abs(S - Sx)/Sx
dN = abs(N - Nx)/Nx
eE = "Expected: {} Actual: {}".format(Ex, E)
eS = "Expected: {} Actual: {}".format(Sx, S)
eN = "Expected: {} Actual: {}".format(Nx, N)
self.assertTrue(dE < self.uegthresh, eE)
self.assertTrue(dS < self.uegthresh, eS)
self.assertTrue(dN < self.uegthresh, eN)
def test_PUEG(self):
T = 0.1
mu = 0.1
L = 2*numpy.pi/numpy.sqrt(1.0)
norb = 7
cut = 1.2
damp = 0.2
mi = 50
ng = 8
ueg = PUEGSystem(T, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, T=T, mu=mu, iprint=0,
max_iter=mi, damp=damp, ngrid=ng)
Ecctot, Ecc = ccsdT.run()
ccsdT.compute_ESN()
E = ccsdT.E
S = ccsdT.S
N = ccsdT.N
delta = 1e-4
muf = mu + delta
mub = mu - delta
ueg = PUEGSystem(T, L, cut, mu=muf, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=T, mu=muf, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Ef, Ecf = ccsdT.run()
ueg = PUEGSystem(T, L, cut, mu=mub, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=T, mu=mub, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Eb, Ecb = ccsdT.run()
Nx = -(Ef - Eb)/(2*delta)
Tf = T + delta
Tb = T - delta
ueg = PUEGSystem(Tf, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=Tf, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Ef, Ecf = ccsdT.run()
ueg = PUEGSystem(Tb, L, cut, mu=mu, norb=norb)
ccsdT = ccsd(ueg, iprint=0, T=Tb, mu=mu, max_iter=35,
damp=0.0, ngrid=ng, econv=1e-10)
Eb, Ecb = ccsdT.run()
Sx = -(Ef - Eb)/(2*delta)
Ex = Ecctot + T*Sx + mu*Nx
dE = abs((E - Ex)/Ex)
dS = abs((S - Sx)/Sx)
dN = abs((N - Nx)/Nx)
eE = "Expected: {} Actual: {}".format(Ex, E)
eS = "Expected: {} Actual: {}".format(Sx, S)
eN = "Expected: {} Actual: {}".format(Nx, N)
self.assertTrue(dE < self.uegthresh, eE)
self.assertTrue(dS < self.uegthresh, eS)
self.assertTrue(dN < self.uegthresh, eN)
@unittest.skipUnless(has_lattice, "Lattice module cannot be found")
def test_hubbard(self):
T = 0.7
mu = 0.0
U = 1.2
model = Hubbard1D(4, 1.0, U)
Pa = numpy.zeros((4, 4))
Pb = numpy.zeros((4, 4))
Pa[0, 0] = 1.0
Pa[2, 2] = 1.0
Pb[1, 1] = 1.0
Pb[3, 3] = 1.0
sys = HubbardSystem(T, model, Pa=Pa, Pb=Pb, mu=mu)
cc = ccsd(sys, T=T, mu=mu, iprint=0)
Ecctot, Ecc = cc.run()
cc.compute_ESN()
E = cc.E
S = cc.S
N = cc.N
delta = 1e-4
muf = mu + delta
mub = mu - delta
sys = HubbardSystem(T, model, Pa=Pa, Pb=Pb, mu=muf)
cc = ccsd(sys, T=T, mu=muf, iprint=0)
Ef, Ecf = cc.run()
sys = HubbardSystem(T, model, Pa=Pa, Pb=Pb, mu=mub)
cc = ccsd(sys, T=T, mu=mub, iprint=0)
Eb, Ecb = cc.run()
Nx = -(Ef - Eb)/(2*delta)
Tf = T + delta
Tb = T - delta
sys = HubbardSystem(Tf, model, Pa=Pa, Pb=Pb, mu=mu)
cc = ccsd(sys, T=Tf, mu=mu, iprint=0)
Ef, Ecf = cc.run()
sys = HubbardSystem(Tb, model, Pa=Pa, Pb=Pb, mu=mu)
cc = ccsd(sys, T=Tb, mu=mu, iprint=0)
Eb, Ecb = cc.run()
Sx = -(Ef - Eb)/(2*delta)
Ex = Ecctot + T*Sx + mu*Nx
dE = abs((E - Ex)/Ex)
dS = abs((S - Sx)/Sx)
dN = abs((N - Nx)/Nx)
eE = "Expected: {} Actual: {}".format(Ex, E)
eS = "Expected: {} Actual: {}".format(Sx, S)
eN = "Expected: {} Actual: {}".format(Nx, N)
self.assertTrue(dE < self.hthresh, eE)
self.assertTrue(dS < self.hthresh, eS)
self.assertTrue(dN < self.hthresh, eN)
if __name__ == '__main__':
unittest.main()
| 34.283133 | 74 | 0.490951 | 2,556 | 17,073 | 3.242175 | 0.055556 | 0.010136 | 0.079643 | 0.043924 | 0.898998 | 0.891275 | 0.878002 | 0.878002 | 0.864969 | 0.847955 | 0 | 0.039461 | 0.348386 | 17,073 | 497 | 75 | 34.352113 | 0.705438 | 0 | 0 | 0.802198 | 0 | 0 | 0.054941 | 0 | 0 | 0 | 0 | 0 | 0.072527 | 1 | 0.028571 | false | 0 | 0.024176 | 0 | 0.057143 | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
77d43c9335d5f32b26f303c4473bdc35bdbc24f5 | 12,949 | py | Python | rntools/todo/newpkg_utils.py | roadnarrows-robotics/rnr-sdk | aee20c65b49fb3eedf924c5c2ec9f19f4f1a1b29 | [
"MIT"
] | null | null | null | rntools/todo/newpkg_utils.py | roadnarrows-robotics/rnr-sdk | aee20c65b49fb3eedf924c5c2ec9f19f4f1a1b29 | [
"MIT"
] | null | null | null | rntools/todo/newpkg_utils.py | roadnarrows-robotics/rnr-sdk | aee20c65b49fb3eedf924c5c2ec9f19f4f1a1b29 | [
"MIT"
] | null | null | null | import os, re, fileinput
from datetime import datetime as date
# DHP - this should not be a hard coded value :(
pkg_template_dir = "/prj/tools/templates/pkg_template"
def genPkg_helper(pkgdata):
printPkgInfo(pkgdata)
pkgname=re.sub(" ", "_", pkgdata.pkgname_entry.get())
owner=pkgdata.owner_entry.get()
website=pkgdata.website_entry.get()
author=pkgdata.author_entry.get()
email=pkgdata.email_entry.get()
desc=pkgdata.desc_entry.get()
license=pkgdata.license.get()
libs=pkgdata.lib_entry.get()
libdirs=pkgdata.libdirs_entry.get()
cmd = "cp -r " + pkg_template_dir + " " + pkgname
os.system(cmd)
cmd = "rm -rf `find " + pkgname + " -name .svn` "
os.system(cmd)
subdirs = ""
swsubdirs = ""
if pkgdata.sw_cb.get():
subdirs = subdirs + " sw"
else:
cmd = "rm -rf " + pkgname + "/sw"
os.system(cmd)
if pkgdata.fw_cb.get():
subdirs = subdirs + " fw"
else:
cmd = "rm -rf " + pkgname + "/fw"
os.system(cmd)
if pkgdata.hw_cb.get():
subdirs = subdirs
else:
cmd = "rm -rf " + pkgname + "/hw"
os.system(cmd)
if pkgdata.gtest_cb.get():
swsubdirs = swsubdirs + " gtest"
else:
cmd = "rm -rf " + pkgname + "/sw/gtest*"
os.system(cmd)
cmd = "rm -rf " + pkgname + "/include/gtest*"
os.system(cmd)
cmd = "rm -rf " + pkgname + "/docs/doxy/x_gtest.doxy*"
os.system(cmd)
if pkgdata.tinyxml_cb.get():
swsubdirs = swsubdirs + " tinyxml"
else:
cmd = "rm -rf " + pkgname + "/sw/tiny*"
os.system(cmd)
cmd = "rm -rf " + pkgname + "/include/tiny*"
os.system(cmd)
# Rewrite one template file in place, replacing every @KEY@ placeholder.
# fileinput with inplace=True redirects stdout into the file, so the
# trailing-comma print writes each (possibly substituted) line back out.
def substitute(path, subs):
    for line in fileinput.FileInput(path, inplace=True):
        for key, value in subs.items():
            line = line.replace(key, value)
        print line,
base = {"@PKGNAME@": pkgname,
        "@DATE@": str(date.now().year),
        "@OWNER@": owner,
        "@WEBSITE@": website}
full = dict(base, **{"@AUTHOR@": author, "@EMAIL@": email})
# eula and readme
substitute(pkgname+"/EULA.txt", base)
substitute(pkgname+"/README.xml", dict(base, **{"@DESC@": desc}))
# Makefile and examples/Makefile
substitute(pkgname+"/Makefile", dict(full, **{"@SUBDIRS@": subdirs}))
substitute(pkgname+"/examples/Makefile", dict(full, **{"@SUBDIRS@": subdirs}))
# sw/Makefile
if pkgdata.sw_cb.get():
    substitute(pkgname+"/sw/Makefile", dict(full, **{"@SUBDIRS@": swsubdirs}))
# fw/Makefile
if pkgdata.fw_cb.get():
    substitute(pkgname+"/fw/Makefile", dict(full, **{"@SUBDIRS@": swsubdirs}))
# gtest
if pkgdata.gtest_cb.get():
    substitute(pkgname+"/docs/doxy/x_gtest.doxy", dict(full, **{"@SUBDIRS@": swsubdirs}))
# doxy includes, docs/doxy/main.doxy, docs/doxy/page_EULA.doxy, make/utenv.sh
for name in ("/docs/doxy/zModDoxyIncludes.doxy", "/docs/doxy/main.doxy",
             "/docs/doxy/page_EULA.doxy", "/make/utenv.sh"):
    substitute(pkgname+name, dict(full, **{"@SUBDIRS@": swsubdirs}))
# make/Pkg.mk and make/doxy.conf
for name in ("/make/Pkg.mk", "/make/doxy.conf"):
    substitute(pkgname+name, dict(full, **{"@LIBS@": libs, "@LIBDIRS@": libdirs}))
# if making deb packages...
if pkgdata.dpkg_cb.get():
    substitute(pkgname+"/make/deb-dev/control",
               dict(full, **{"@LIBS@": libs, "@LIBDIRS@": libdirs}))
    substitute(pkgname+"/make/deb-src/control", full)
    substitute(pkgname+"/make/deb-doc/control", full)
    for deb in ("deb-dev", "deb-src", "deb-doc"):
        substitute(pkgname+"/make/"+deb+"/prerm", full)
        substitute(pkgname+"/make/"+deb+"/postinst", full)
else: # no deb pkgs
    cmd = "rm -rf " + pkgname + "/make/deb-*"
    os.system(cmd)
cmd = "/prj/tools/eula_update.py rnrestricted " + pkgname
os.system(cmd)
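# A minimal usage sketch for genPkg_helper. It expects a GUI-style object
# whose fields expose .get(); the Field/Stub classes below are hypothetical
# stand-ins that only illustrate the expected shape of pkgdata:
#
# class Field(object):
#     def __init__(self, v): self.v = v
#     def get(self): return self.v
#
# class Stub(object):
#     pkgname_entry = Field("my_pkg"); owner_entry = Field("RoadNarrows")
#     website_entry = Field("http://www.example.com")
#     author_entry = Field("Jane Doe"); email_entry = Field("jane@example.com")
#     desc_entry = Field("demo package"); license = Field("rnrestricted")
#     lib_entry = Field(""); libdirs_entry = Field("")
#     sw_cb = Field(1); fw_cb = Field(0); hw_cb = Field(0)
#     gtest_cb = Field(0); tinyxml_cb = Field(0); dpkg_cb = Field(0)
#
# genPkg_helper(Stub())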
def printPkgInfo(pkgdata):
print " Package Name : " + re.sub(" ", "_", pkgdata.pkgname_entry.get());
print " Owner : " + pkgdata.owner_entry.get() + \
" (" + pkgdata.website_entry.get() + ")"
print " Author : " + pkgdata.author_entry.get() + \
" <" + pkgdata.email_entry.get() + ">"
print " License : " + pkgdata.license.get()
print " Package Features: "
i=0
if pkgdata.sw_cb.get() == 1 :
print " * sw "
i=i+1
if pkgdata.hw_cb.get() == 1 :
print " * hw "
i=i+1
if pkgdata.fw_cb.get() == 1 :
print " * fw "
i=i+1
if pkgdata.dpkg_cb.get() == 1 :
print " * dpkg "
i=i+1
if i==0:
print " None!"
print " 3rd party libs (to be embedded in package): "
i=0
if pkgdata.gtest_cb.get() == 1 :
print " * gtest "
i=i+1
if pkgdata.tinyxml_cb.get() == 1 :
print " * tinyxml "
i=i+1
if i == 0:
print " None!"
| 36.373596 | 78 | 0.579118 | 1,540 | 12,949 | 4.842857 | 0.074675 | 0.154465 | 0.289622 | 0.053097 | 0.860821 | 0.801287 | 0.771252 | 0.744704 | 0.715205 | 0.641727 | 0 | 0.001741 | 0.246119 | 12,949 | 355 | 79 | 36.476056 | 0.762241 | 0.034674 | 0 | 0.72449 | 0 | 0 | 0.161937 | 0.023329 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.006803 | null | null | 0.129252 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
7ad71a6d3786fb5eaba1577be22600e323424d0b | 13,398 | py | Python | loss/losses.py | kirtanp/MAMO-fair | fd0fc39383f11a9e1ec401233b89c2399860fb94 | [
"Apache-2.0"
] | 1 | 2021-08-16T12:42:32.000Z | 2021-08-16T12:42:32.000Z | loss/losses.py | kirtanp/MAMO-fair | fd0fc39383f11a9e1ec401233b89c2399860fb94 | [
"Apache-2.0"
] | null | null | null | loss/losses.py | kirtanp/MAMO-fair | fd0fc39383f11a9e1ec401233b89c2399860fb94 | [
"Apache-2.0"
] | 1 | 2022-02-23T01:09:25.000Z | 2022-02-23T01:09:25.000Z | from loss.loss_class import Loss
import torch
import torch.nn as nn
import torch.nn.functional as F
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
class BCELoss(Loss):
def __init__(self, name='BCELoss', weight_vector=None):
super().__init__(name)
self.weight_vector = weight_vector
def compute_loss(self, y_true, y_pred):
y_true = y_true[:, [0]]
if self.weight_vector is not None:
batch_weights = torch.zeros_like(y_true, device=device)
batch_weights[y_true==0] = self.weight_vector[0]
batch_weights[y_true==1] = self.weight_vector[1]
loss_fn = nn.BCELoss(weight=batch_weights)
else:
loss_fn = nn.BCELoss()
return(loss_fn(y_pred, y_true))
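# Example of the weighting above (a sketch): with weight_vector = [0.3, 0.7],
# every negative sample (y_true == 0) enters the BCE with weight 0.3 and every
# positive sample with weight 0.7, a simple way to rebalance skewed classes.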
class MSELoss(Loss):
def __init__(self, name='MSELoss'):
super().__init__(name)
def compute_loss(self, y_true, y_pred):
y_true = y_true[:, [0]]
loss_fn = nn.MSELoss()
return(loss_fn(y_pred, y_true))
class DPLoss(Loss):
def __init__(self, name='DPLoss', weight_vector=None, \
threshold=0.5, attribute_index=1, reg_lambda=0.1,
reg_type='tanh', reg_beta=0.0, good_value=1):
super().__init__(name)
self.weight_vector = weight_vector
self.threshold = threshold
self.isFairnessLoss = True
self.idx = attribute_index
self.reg_lambda = reg_lambda
self.reg_beta = reg_beta
self.reg_type = reg_type
self.good_value = good_value
def _differentiable_round(self, x):
x = x.float()
return torch.tanh(3*(x - self.threshold))/2 + 0.5
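# The tanh above is a smooth surrogate for the hard step 1[x > threshold]:
# with threshold = 0.5 it maps 0.5 -> 0.5, 0.0 -> ~0.05 and 1.0 -> ~0.95, so
# thresholded rates stay differentiable for gradient-based training.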
def _DP_torch(self, y_true, y_pred, reg):
if(reg=='tanh'):
y_pred = self._differentiable_round(y_pred)
if(self.good_value):
y_pred = y_pred[y_pred > self.threshold]
else:
y_pred = y_pred[y_pred < self.threshold]
elif(reg=='ccr'):
if(self.good_value):
y_pred = y_pred[y_pred > self.threshold]
else:
y_pred = y_pred[y_pred < self.threshold]
elif(reg=='linear'):
y_pred = y_pred
total = y_true.shape[0] + 1e-7
return(torch.sum(y_pred)/total)
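# _DP_torch approximates one group's rate of "good" (or "bad") predictions,
# so compute_loss below penalises the demographic parity gap
# |P(y_hat = good | a = 0) - P(y_hat = good | a = 1)|
# plus the reg_lambda-weighted prediction loss and the reg_beta output penalty.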
def compute_loss(self, y_true, y_pred):
a = y_true[:, self.idx]
y_true = y_true[:, [0]]
y_pred = torch.clamp(y_pred, 1e-7, 1-1e-7)
y_pred_0 = y_pred[a==0]
y_true_0 = y_true[a==0]
y_pred_1 = y_pred[a==1]
y_true_1 = y_true[a==1]
DP_0 = self._DP_torch(y_true_0, y_pred_0, self.reg_type)
DP_1 = self._DP_torch(y_true_1, y_pred_1, self.reg_type)
if self.weight_vector is not None:
batch_weights = torch.zeros_like(y_true, device=device)
batch_weights[y_true==0] = self.weight_vector[0]
batch_weights[y_true==1] = self.weight_vector[1]
loss_fn = nn.BCELoss(weight=batch_weights)
else:
loss_fn = nn.MSELoss()
return( torch.abs((DP_0 - DP_1)) + self.reg_lambda*loss_fn(y_pred, y_true) +\
self.reg_beta*torch.mean(y_pred**2))
class FPRLoss(Loss):
def __init__(self, name='FPRLoss', weight_vector=None, \
threshold=0.5, attribute_index=1, reg_lambda=0.1,
reg_type='tanh'):
super().__init__(name)
self.weight_vector = weight_vector
self.threshold = threshold
self.isFairnessLoss = True
self.idx = attribute_index
self.reg_lambda = reg_lambda
self.reg_type = reg_type
def _differentiable_round(self, x):
x = x.float()
return torch.tanh(3*(x - self.threshold))/2 + 0.5
def _FPR_torch(self, y_true, y_pred, reg):
if(reg=='tanh'):
y_pred = self._differentiable_round(y_pred)
y_pred = y_pred[(y_true==0) & (y_pred > self.threshold)]
elif(reg=='ccr'):
y_pred = y_pred[(y_true==0) & (y_pred > self.threshold)]
elif(reg=='linear'):
y_pred = y_pred
total_negatives = torch.sum(y_true==0) + 1e-7
return(torch.sum(y_pred)/total_negatives)
def compute_loss(self, y_true, y_pred):
a = y_true[:, self.idx]
y_true = y_true[:, [0]]
y_pred = torch.clamp(y_pred, 1e-7, 1-1e-7)
y_pred_0 = y_pred[a==0]
y_true_0 = y_true[a==0]
y_pred_1 = y_pred[a==1]
y_true_1 = y_true[a==1]
FPR_0 = self._FPR_torch(y_true_0, y_pred_0, self.reg_type)
FPR_1 = self._FPR_torch(y_true_1, y_pred_1, self.reg_type)
if self.weight_vector is not None:
batch_weights = torch.zeros_like(y_true, device=device)
batch_weights[y_true==0] = self.weight_vector[0]
batch_weights[y_true==1] = self.weight_vector[1]
loss_fn = nn.BCELoss(weight=batch_weights)
else:
loss_fn = nn.MSELoss()
return( torch.abs((FPR_0 - FPR_1)) + self.reg_lambda*loss_fn(y_pred, y_true))
class FNRLoss(Loss):
def __init__(self, name='FNRLoss', weight_vector=None, \
threshold=0.5, attribute_index=1, reg_lambda=0.1,
reg_type='tanh'):
super().__init__(name)
self.weight_vector = weight_vector
self.threshold = threshold
self.isFairnessLoss = True
self.idx = attribute_index
self.reg_lambda = reg_lambda
self.reg_type = reg_type
def _differentiable_round(self, x):
x = x.float()
return torch.tanh(3*(x - self.threshold))/2 + 0.5
def _FNR_torch(self, y_true, y_pred, reg):
if(reg=='tanh'):
y_pred = self._differentiable_round(y_pred)
y_pred = y_pred[(y_true==1) & (y_pred < self.threshold)]
elif(reg=='ccr'):
y_pred = y_pred[(y_true==1) & (y_pred < self.threshold)]
elif(reg=='linear'):
y_pred = y_pred
total_positives = torch.sum(y_true==1) + 1e-7
return(torch.sum(y_pred)/total_positives)
def compute_loss(self, y_true, y_pred):
a = y_true[:, self.idx]
y_true = y_true[:, [0]]
y_pred = torch.clamp(y_pred, 1e-7, 1-1e-7)
y_pred_0 = y_pred[a==0]
y_true_0 = y_true[a==0]
y_pred_1 = y_pred[a==1]
y_true_1 = y_true[a==1]
FNR_0 = self._FNR_torch(y_true_0, y_pred_0, self.reg_type)
FNR_1 = self._FNR_torch(y_true_1, y_pred_1, self.reg_type)
if self.weight_vector is not None:
batch_weights = torch.zeros_like(y_true, device=device)
batch_weights[y_true==0] = self.weight_vector[0]
batch_weights[y_true==1] = self.weight_vector[1]
loss_fn = nn.BCELoss(weight=batch_weights)
else:
loss_fn = nn.MSELoss()
return(torch.abs((FNR_0 - FNR_1)) + self.reg_lambda*loss_fn(y_pred, y_true) )
class TNRLoss(Loss):
def __init__(self, name='TNRLoss', weight_vector=None, \
threshold=0.5, attribute_index=1, reg_lambda=0.1,
reg_type='tanh'):
super().__init__(name)
self.weight_vector = weight_vector
self.threshold = threshold
self.isFairnessLoss = True
self.idx = attribute_index
self.reg_lambda = reg_lambda
self.reg_type = reg_type
def _differentiable_round(self, x):
x = x.float()
return torch.tanh(5*(x - self.threshold))/2 + 0.5
def _TNR_torch(self, y_true, y_pred, reg):
if(reg=='tanh'):
y_pred = self._differentiable_round(y_pred)
y_pred = y_pred[(y_true==0) & (y_pred < self.threshold)]
elif(reg=='ccr'):
y_pred = y_pred[(y_true==0) & (y_pred < self.threshold)]
elif(reg=='linear'):
y_pred = y_pred
total_negatives = torch.sum(y_true==0) + 1e-7
return(torch.sum(y_pred)/total_negatives)
def compute_loss(self, y_true, y_pred):
a = y_true[:, self.idx]
y_true = y_true[:, [0]]
y_pred = torch.clamp(y_pred, 1e-7, 1-1e-7)
y_pred_0 = y_pred[a==0]
y_true_0 = y_true[a==0]
y_pred_1 = y_pred[a==1]
y_true_1 = y_true[a==1]
TNR_0 = self._TNR_torch(y_true_0, y_pred_0, self.reg_type)
TNR_1 = self._TNR_torch(y_true_1, y_pred_1, self.reg_type)
if self.weight_vector is not None:
batch_weights = torch.zeros_like(y_true, device=device)
batch_weights[y_true==0] = self.weight_vector[0]
batch_weights[y_true==1] = self.weight_vector[1]
loss_fn = nn.BCELoss(weight=batch_weights)
else:
loss_fn = nn.MSELoss()
return(torch.abs((TNR_0 - TNR_1)) + self.reg_lambda*loss_fn(y_pred, y_true))
class TPRLoss(Loss):
def __init__(self, name='TPRLoss', weight_vector=None, \
threshold=0.5, attribute_index=1, reg_lambda=0.1,
reg_type='tanh'):
super().__init__(name)
self.weight_vector = weight_vector
self.threshold = threshold
self.isFairnessLoss = True
self.idx = attribute_index
self.reg_lambda = reg_lambda
self.reg_type = reg_type
def _differentiable_round(self, x):
x = x.float()
return torch.tanh(5*(x - self.threshold))/2 + 0.5
def _TPR_torch(self, y_true, y_pred, reg):
if(reg=='tanh'):
y_pred = self._differentiable_round(y_pred)
y_pred = y_pred[(y_true==1) & (y_pred > self.threshold)]
elif(reg=='ccr'):
y_pred = y_pred[(y_true==1) & (y_pred > self.threshold)]
elif(reg=='linear'):
y_pred = y_pred
total_positives = torch.sum(y_true==1) + 1e-7
return(torch.sum(y_pred)/total_positives)
def compute_loss(self, y_true, y_pred):
a = y_true[:, self.idx]
y_true = y_true[:, [0]]
y_pred = torch.clamp(y_pred, 1e-7, 1-1e-7)
y_pred_0 = y_pred[a==0]
y_true_0 = y_true[a==0]
y_pred_1 = y_pred[a==1]
y_true_1 = y_true[a==1]
TPR_0 = self._TPR_torch(y_true_0, y_pred_0, self.reg_type)
TPR_1 = self._TPR_torch(y_true_1, y_pred_1, self.reg_type)
if self.weight_vector is not None:
batch_weights = torch.zeros_like(y_true, device=device)
batch_weights[y_true==0] = self.weight_vector[0]
batch_weights[y_true==1] = self.weight_vector[1]
loss_fn = nn.BCELoss(weight=batch_weights)
else:
loss_fn = nn.MSELoss()
return(torch.abs((TPR_0 - TPR_1)) + self.reg_lambda*loss_fn(y_pred, y_true))
class CFLoss(Loss):
def __init__(self, name='CounterfactualLoss', sen_attributes_idx=[1], \
reg_lambda=0.1, weight_vector=None, augmentation=False):
super().__init__(name)
self.reg_lambda = reg_lambda
self.needs_model = True
self.sen_attributes_idx = sen_attributes_idx
self.weight_vector = weight_vector
self.augmentation = augmentation
def _logit(self, x):
eps = torch.tensor(1e-7)
x = x.float()
x = torch.clamp(x, eps, 1-eps)
return torch.log(x/(1-x))
def _get_subsets(self, s):
x = len(s)
subset_list = []
for i in range(1, 1 << x):
subset_list.append([s[j] for j in range(x) if (i & (1 << j))])
return(subset_list)
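# Bitmask enumeration of the non-empty subsets, e.g.
# _get_subsets([1, 2]) -> [[1], [2], [1, 2]] (bit j of i selects s[j]).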
def _get_counterfactuals(self, x):
i = self.sen_attributes_idx[0]
x1 = x.clone()
x1[:,[-i]] = 1 - x1[:,[-i]]
return(x1)
def compute_loss(self, y_true, y_pred, x, model):
y_pred_x = self._logit(y_pred)
x_new = self._get_counterfactuals(x)
y_pred_new = model(x_new)
y_pred_new_logit = self._logit(y_pred_new)
# compare the factual and counterfactual predictions on the same (logit) scale
pred_diff = torch.abs(y_pred_x - y_pred_new_logit)
if self.weight_vector is not None:
batch_weights = torch.zeros_like(y_true, device=device)
batch_weights[y_true==0] = self.weight_vector[0]
batch_weights[y_true==1] = self.weight_vector[1]
loss_fn = nn.BCELoss(weight=batch_weights)
else:
loss_fn = nn.MSELoss()
if(self.augmentation):
loss = loss_fn(y_pred, y_true) + loss_fn(y_pred_new, y_true)
else:
loss = loss_fn(y_pred, y_true)
return(self.reg_lambda*torch.mean(pred_diff) + loss)
class addLosses(Loss):
def __init__(self, name='addLosses', loss_list=[], \
loss_weights=[]):
super().__init__(name)
self.loss_list = loss_list
self.loss_weights = loss_weights
self.needs_model = True
def compute_loss(self, y_true, y_pred, x, model):
final_loss = []
for alpha, loss_fn in zip(self.loss_weights, self.loss_list):
if(loss_fn.needs_model):
loss = alpha * loss_fn.compute_loss(y_true, y_pred, x, model)
else:
loss = alpha * loss_fn.compute_loss(y_true, y_pred)
final_loss.append(loss)
final_loss = torch.stack(final_loss)
total_loss = torch.sum(final_loss)
return(total_loss)
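# A minimal usage sketch, assuming the base Loss class defaults needs_model
# to False for the plain losses:
# criterion = addLosses(loss_list=[BCELoss(), DPLoss(attribute_index=1)],
#                       loss_weights=[1.0, 0.5])
# loss = criterion.compute_loss(y_true, y_pred, x, model)
# loss.backward()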
| 34.709845 | 86 | 0.582251 | 1,968 | 13,398 | 3.621443 | 0.058943 | 0.098218 | 0.037042 | 0.035078 | 0.812404 | 0.772555 | 0.763575 | 0.757121 | 0.743791 | 0.739862 | 0 | 0.024163 | 0.295716 | 13,398 | 385 | 87 | 34.8 | 0.731136 | 0 | 0 | 0.708197 | 0 | 0 | 0.012465 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.101639 | false | 0 | 0.013115 | 0 | 0.163934 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7ae9fbd04150abde4c2d8e8a7bd8b531ee815894 | 272 | py | Python | cupy/logic/truth.py | fukuta0614/Chainer | 337fe78e1c27924c1195b8b677a9b2cd3ea68828 | [
"MIT"
] | null | null | null | cupy/logic/truth.py | fukuta0614/Chainer | 337fe78e1c27924c1195b8b677a9b2cd3ea68828 | [
"MIT"
] | 1 | 2016-11-09T06:32:32.000Z | 2016-11-09T10:20:04.000Z | cupy/logic/truth.py | fukuta0614/Chainer | 337fe78e1c27924c1195b8b677a9b2cd3ea68828 | [
"MIT"
] | 1 | 2021-05-27T16:52:11.000Z | 2021-05-27T16:52:11.000Z | def all(a, axis=None, out=None, keepdims=False):
# TODO(okuta): check type
return a.all(axis=axis, out=out, keepdims=keepdims)
def any(a, axis=None, out=None, keepdims=False):
# TODO(okuta): check type
return a.any(axis=axis, out=out, keepdims=keepdims)
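# Both wrappers follow numpy semantics (assuming cupy is imported), e.g.
# any(cupy.array([0, 0, 1])) evaluates to a 0-d True array; axis and
# keepdims behave as in numpy.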
| 30.222222 | 55 | 0.683824 | 44 | 272 | 4.227273 | 0.318182 | 0.053763 | 0.096774 | 0.129032 | 0.903226 | 0.903226 | 0.580645 | 0.580645 | 0.580645 | 0.580645 | 0 | 0 | 0.161765 | 272 | 8 | 56 | 34 | 0.815789 | 0.172794 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
702a1ec9d4d1ed1298b2dcd4b8d8d4a93a82c602 | 814 | py | Python | tests/test_provider_apparentlymart_testing.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | tests/test_provider_apparentlymart_testing.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | tests/test_provider_apparentlymart_testing.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # tests/test_provider_apparentlymart_testing.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:28:33 UTC)
def test_provider_import():
import terrascript.provider.apparentlymart.testing
def test_datasource_import():
from terrascript.data.apparentlymart.testing import testing_assertions
from terrascript.data.apparentlymart.testing import testing_tap
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.apparentlymart.testing
#
# t = terrascript.provider.apparentlymart.testing.testing()
# s = str(t)
#
# assert 'https://github.com/apparentlymart/terraform-provider-testing' in s
# assert '0.0.2' in s
| 30.148148 | 81 | 0.766585 | 103 | 814 | 5.941748 | 0.563107 | 0.205882 | 0.189542 | 0.196078 | 0.323529 | 0.173203 | 0.173203 | 0 | 0 | 0 | 0 | 0.021552 | 0.144963 | 814 | 26 | 82 | 31.307692 | 0.857759 | 0.642506 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038462 | 0.2 | 1 | 0.4 | true | 0 | 1 | 0 | 1.4 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
7076677a5d0312f7eec498f9a32ed1bac24bca1a | 79 | py | Python | braindyn/data/example_data.py | TommyClausner/braindyn | 69ac0ef7b5171b054b2b20da09532f73df4d6548 | [
"MIT"
] | null | null | null | braindyn/data/example_data.py | TommyClausner/braindyn | 69ac0ef7b5171b054b2b20da09532f73df4d6548 | [
"MIT"
] | null | null | null | braindyn/data/example_data.py | TommyClausner/braindyn | 69ac0ef7b5171b054b2b20da09532f73df4d6548 | [
"MIT"
] | null | null | null | import os
def get_path():
return os.path.dirname(os.path.abspath(__file__))
| 26.333333 | 53 | 0.746835 | 13 | 79 | 4.153846 | 0.692308 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113924 | 79 | 3 | 53 | 26.333333 | 0.771429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
70a8ead5d52bf852021cb91f4c5b0b4597ec0483 | 433 | py | Python | Exercicios/Mundo3/ex108/teste.py | mpaullos/cursoemvideo-python | 80732626b6b5471ec7fea6dc01d83931e5cfd8fb | [
"MIT"
] | null | null | null | Exercicios/Mundo3/ex108/teste.py | mpaullos/cursoemvideo-python | 80732626b6b5471ec7fea6dc01d83931e5cfd8fb | [
"MIT"
] | null | null | null | Exercicios/Mundo3/ex108/teste.py | mpaullos/cursoemvideo-python | 80732626b6b5471ec7fea6dc01d83931e5cfd8fb | [
"MIT"
] | null | null | null | from ex108 import moeda
preco = float(input('Enter the price: R$'))
print(f'Half of {moeda.virgula(preco)} is {moeda.virgula(moeda.metade(preco))}')
print('The double of {} is {}'.format(moeda.virgula(preco), moeda.virgula(moeda.dobro(preco))))
print(f'Increasing {moeda.virgula(preco)} by 10% gives {moeda.virgula(moeda.aumenta(preco))}')
print(f'Reducing {moeda.virgula(preco)} by 13% gives {moeda.virgula(moeda.diminuir(preco))}')
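# A hypothetical sketch of the ex108.moeda module this script imports,
# inferred only from the calls above and the printed 10% / 13% rates:
#
# def metade(p): return p / 2
# def dobro(p): return p * 2
# def aumenta(p): return p * 1.10
# def diminuir(p): return p * 0.87
# def virgula(p): return f'R${p:.2f}'.replace('.', ',')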
| 61.857143 | 94 | 0.727483 | 67 | 433 | 4.701493 | 0.402985 | 0.304762 | 0.215873 | 0.180952 | 0.260317 | 0.260317 | 0.260317 | 0.260317 | 0 | 0 | 0 | 0.017588 | 0.080831 | 433 | 6 | 95 | 72.166667 | 0.773869 | 0 | 0 | 0 | 0 | 0.5 | 0.639723 | 0.408776 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.666667 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
5685b8c6d54290c27e165c57ef62ad51c156cff9 | 41,047 | py | Python | duplicate/individual classification (adowaconan).py | adowaconan/Spindle_by_Graphical_Features | 660cad3ba9ca399997a12ebdddaf441445b10f64 | [
"MIT"
] | 1 | 2018-09-28T14:46:05.000Z | 2018-09-28T14:46:05.000Z | duplicate/individual classification (adowaconan).py | nmningmei/Spindle_by_Graphical_Features | 660cad3ba9ca399997a12ebdddaf441445b10f64 | [
"MIT"
] | null | null | null | duplicate/individual classification (adowaconan).py | nmningmei/Spindle_by_Graphical_Features | 660cad3ba9ca399997a12ebdddaf441445b10f64 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Mon May 29 11:14:44 2017
@author: ning
"""
import pandas as pd
import os
import numpy as np
from collections import Counter
try:
function_dir = 'D:\\NING - spindle\\Spindle_by_Graphical_Features'
os.chdir(function_dir)
except:
function_dir = 'C:\\Users\\ning\\OneDrive\\python works\\Spindle_by_Graphical_Features'
os.chdir(function_dir)
import eegPipelineFunctions
try:
file_dir = 'D:\\NING - spindle\\training set\\road_trip\\'
# file_dir = 'D:\\NING - spindle\\training set\\road_trip_more_channels\\'
os.chdir(file_dir)
except:
file_dir = 'C:\\Users\\ning\\Downloads\\road_trip\\'
# file_dir = 'C:\\Users\\ning\\Downloads\\road_trip_more_channels\\'
os.chdir(file_dir)
################################### classifiers #################################
# Fit every classifier on the signal, graph, and combined feature sets, and
# write one cross-validation report csv per (feature set, classifier) pair.
# (n_estimators is passed only where the original script passed it: svm and knn.)
classifiers = [('RF', {}),
               ('xgb', {}),
               ('svm', {'n_estimators': 1}),
               ('logistic', {}),
               ('knn', {'n_estimators': 15})]
for clf_, extra_args in classifiers:
    for compute in ('signal', 'graph', 'combine'):
        results = eegPipelineFunctions.cross_validation_report({}, 0, clf_=clf_,
                                                               file_dir=file_dir,
                                                               compute=compute,
                                                               **extra_args)
        results.to_csv(file_dir + 'individual_%s_feature_%s.csv' % (compute, clf_),
                       index=False)
################################ TPOT ############################################
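# NOTE: everything below this banner is intentionally commented out. It looks
# like an earlier, hand-rolled version of the loops above: TPOT-exported
# pipelines (clfs_graph / clfs_signal) plus manual result bookkeeping, later
# superseded by eegPipelineFunctions.cross_validation_report.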
#from sklearn.pipeline import make_pipeline, make_union
#from sklearn.decomposition import PCA
#from sklearn.neighbors import KNeighborsClassifier
#from sklearn.preprocessing import FunctionTransformer,Normalizer
#from sklearn.naive_bayes import BernoulliNB,GaussianNB
#from copy import copy
#from sklearn.svm import LinearSVC
#from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
#from sklearn.linear_model import LogisticRegression
#from sklearn.tree import DecisionTreeClassifier
#from sklearn.feature_selection import VarianceThreshold,SelectPercentile, f_classif
#clfs_graph = {1.5:make_pipeline(
# make_union(VotingClassifier([("est", GradientBoostingClassifier(max_depth=1,
# max_features=0.2,
# min_samples_leaf=5,
# min_samples_split=2,
# n_estimators=100,
# subsample=0.45))]),
# FunctionTransformer(copy)),
# LogisticRegression(C=0.5)),
# 2.0:LogisticRegression(C=0.1, dual=False),
# 2.5:LinearSVC(C=0.001, loss="hinge", penalty="l2", tol=0.1),
# 3.0:make_pipeline(make_union(
# Normalizer(norm="max"),
# FunctionTransformer(copy)),
# KNeighborsClassifier(n_neighbors=95, p=1)),
# 3.5:LogisticRegression(C=5.0),
# 4.0:make_pipeline(make_union(VotingClassifier([("est", BernoulliNB(alpha=100.0,
# fit_prior=True))]),
# FunctionTransformer(copy)),
# LogisticRegression(C=0.5, penalty="l2")),
# 4.5:LogisticRegression(),
# 5.0:LogisticRegression(C=0.5, dual=False, penalty="l2")}
#
#clfs_signal = {1.5:make_pipeline(VarianceThreshold(threshold=0.5),
# DecisionTreeClassifier(criterion="entropy",
# max_depth=3,
# min_samples_leaf=8,
# min_samples_split=6)),
# 2.0:make_pipeline(make_union(VotingClassifier([("est", GaussianNB())]), FunctionTransformer(copy)),
# LinearSVC(C=15.0, loss="hinge", penalty="l2", tol=0.01)),
# 2.5:LogisticRegression(C=0.01, dual=True),
# 3.0:LinearSVC(dual=True, loss="hinge", penalty="l2", tol=0.001),
# 3.5:LinearSVC(C=1.0, dual=True, loss="hinge", penalty="l2", tol=0.1),
# 4.0:make_pipeline(VarianceThreshold(threshold=0.9),
# make_union(VotingClassifier([("est", GaussianNB())]),
# FunctionTransformer(copy)),
# KNeighborsClassifier(n_neighbors=100, p=1)),
# 4.5:make_pipeline(SelectPercentile(score_func=f_classif, percentile=33),
# LogisticRegression(C=20.0, dual=False)),
# 5.0:make_pipeline(PCA(iterated_power=7, svd_solver="randomized"),GaussianNB())}
#
#signal_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[],
# 'matthews_corrcoef_mean':[],
# 'matthews_corrcoef_std':[]}
#graph_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[],
# 'matthews_corrcoef_mean':[],
# 'matthews_corrcoef_std':[]}
#combine_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[],
# 'matthews_corrcoef_mean':[],
# 'matthews_corrcoef_std':[]}
#for directory_1 in [f for f in os.listdir(file_dir) if ('epoch_length' in f)]:
# sub_dir = file_dir + directory_1 + '\\'
# epoch_length = directory_1.split(' ')[1]
# os.chdir(sub_dir)
# #signal_features_indivisual_results[directory_1],graph_features_indivisual_results[directory_1]={},{}
# #df_cc, df_pli, df_plv, df_signal,df_graph = [],[],[],[],[]
# for sub_fold in os.listdir(sub_dir):
# sub_fold_dir = sub_dir + sub_fold + '\\'
# os.chdir(sub_fold_dir)
# sub = sub_fold[:-4]
# day = sub_fold[4:][-4:]
# print(sub,day,epoch_length)
#
# cc_features, pli_features, plv_features, signal_features = [pd.read_csv(f) for f in os.listdir(sub_fold_dir) if ('csv' in f)]
# #df_cc.append(cc_features)
# #df_pli.append(pli_features)
# #df_plv.append(plv_features)
# label = cc_features['label']
# cc_features = eegPipelineFunctions.get_real_part(cc_features)
# pli_features = eegPipelineFunctions.get_real_part(pli_features)
# plv_features = eegPipelineFunctions.get_real_part(plv_features)
# cc_features.columns = ['cc_'+name for name in cc_features]
# pli_features.columns = ['pli_'+name for name in pli_features]
# plv_features.columns = ['plv_'+name for name in plv_features]
# cc_features = cc_features.drop('cc_label',1)
# pli_features = pli_features.drop('pli_label',1)
# plv_features = plv_features.drop('plv_label',1)
# df_combine = pd.concat([cc_features,pli_features,plv_features],axis=1)
# df_combine['label']=label
# df_two = pd.concat([cc_features, pli_features, plv_features, signal_features],axis=1)
# try:
# signal_temp = eegPipelineFunctions.cross_validation_with_clfs(signal_features,clf_=clfs_signal[float(epoch_length)])
# graph_temp = eegPipelineFunctions.cross_validation_with_clfs(df_combine,clf_=clfs_graph[float(epoch_length)])
# two_temp = eegPipelineFunctions.cross_validation_with_clfs(df_two,clf_=clfs_graph[float(epoch_length)])
# auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores,MCC=signal_temp
# signal_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# signal_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# signal_features_indivisual_results['fpr'].append(fpr)
# signal_features_indivisual_results['tpr'].append(tpr)
# signal_features_indivisual_results['precision'].append(precision)
# signal_features_indivisual_results['recall'].append(recall)
# signal_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# signal_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# signal_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# signal_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# signal_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# signal_features_indivisual_results['matthews_corrcoef_mean'].append(np.nanmean(MCC))
# signal_features_indivisual_results['matthews_corrcoef_std'].append(np.nanstd(MCC))
# signal_features_indivisual_results['subject'].append(sub)
# signal_features_indivisual_results['day'].append(int(day[-1]))
# signal_features_indivisual_results['epoch_length'].append(float(epoch_length))
# print(sub_fold,Counter(label),'signal:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
# auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores,MCC=graph_temp
# graph_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# graph_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# graph_features_indivisual_results['fpr'].append(fpr)
# graph_features_indivisual_results['tpr'].append(tpr)
# graph_features_indivisual_results['precision'].append(precision)
# graph_features_indivisual_results['recall'].append(recall)
# graph_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# graph_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# graph_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# graph_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# graph_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# graph_features_indivisual_results['matthews_corrcoef_mean'].append(np.nanmean(MCC))
# graph_features_indivisual_results['matthews_corrcoef_std'].append(np.nanstd(MCC))
# graph_features_indivisual_results['subject'].append(sub)
# graph_features_indivisual_results['day'].append(int(day[-1]))
# graph_features_indivisual_results['epoch_length'].append(float(epoch_length))
# print(sub_fold,Counter(label),'graph:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
# auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores,MCC=two_temp
# combine_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# combine_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# combine_features_indivisual_results['fpr'].append(fpr)
# combine_features_indivisual_results['tpr'].append(tpr)
# combine_features_indivisual_results['precision'].append(precision)
# combine_features_indivisual_results['recall'].append(recall)
# combine_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# combine_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# combine_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# combine_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# combine_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# combine_features_indivisual_results['matthews_corrcoef_mean'].append(np.nanmean(MCC))
# combine_features_indivisual_results['matthews_corrcoef_std'].append(np.nanstd(MCC))
# combine_features_indivisual_results['subject'].append(sub)
# combine_features_indivisual_results['day'].append(int(day[-1]))
# combine_features_indivisual_results['epoch_length'].append(float(epoch_length))
#            print(sub_fold,Counter(label),'combine:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
# except:
# print(sub_fold,Counter(label),'not enough samples')
#signal_features_indivisual_results = pd.DataFrame(signal_features_indivisual_results)
#graph_features_indivisual_results = pd.DataFrame(graph_features_indivisual_results)
#combine_features_indivisual_results = pd.DataFrame(combine_features_indivisual_results)
#signal_features_indivisual_results.to_csv(file_dir+'individual_signal_feature_TPOT.csv',index=False)
#graph_features_indivisual_results.to_csv(file_dir+'individual_graph_feature_TPOT.csv',index=False)
#combine_features_indivisual_results.to_csv(file_dir+'individual_combine_feature_TPOT.csv',index=False)
#signal_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[]}
#graph_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[]}
#for directory_1 in [f for f in os.listdir(file_dir) if ('epoch_length' in f)]:
# sub_dir = file_dir + directory_1 + '\\'
# epoch_length = directory_1.split(' ')[1]
# os.chdir(sub_dir)
# #signal_features_indivisual_results[directory_1],graph_features_indivisual_results[directory_1]={},{}
# #df_cc, df_pli, df_plv, df_signal,df_graph = [],[],[],[],[]
# for sub_fold in os.listdir(sub_dir):
# sub_fold_dir = sub_dir + sub_fold + '\\'
# os.chdir(sub_fold_dir)
# sub = sub_fold[:-4]
# day = sub_fold[4:][-4:]
# print(sub,day,epoch_length)
#
# cc_features, pli_features, plv_features, signal_features = [pd.read_csv(f) for f in os.listdir(sub_fold_dir) if ('csv' in f)]
# #df_cc.append(cc_features)
# #df_pli.append(pli_features)
# #df_plv.append(plv_features)
# label = cc_features['label']
# cc_features = eegPipelineFunctions.get_real_part(cc_features)
# pli_features = eegPipelineFunctions.get_real_part(pli_features)
# plv_features = eegPipelineFunctions.get_real_part(plv_features)
# cc_features.columns = ['cc_'+name for name in cc_features]
# pli_features.columns = ['pli_'+name for name in pli_features]
# plv_features.columns = ['plv_'+name for name in plv_features]
# cc_features = cc_features.drop('cc_label',1)
# pli_features = pli_features.drop('pli_label',1)
# plv_features = plv_features.drop('plv_label',1)
# df_combine = pd.concat([cc_features,pli_features,plv_features],axis=1)
# df_combine['label']=label
# try:
# signal_temp = eegPipelineFunctions.cross_validation_with_clfs(signal_features,clf_='logistic')
# graph_temp = eegPipelineFunctions.cross_validation_with_clfs(df_combine,clf_='logistic')
# auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores=signal_temp
# signal_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# signal_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# signal_features_indivisual_results['fpr'].append(fpr)
# signal_features_indivisual_results['tpr'].append(tpr)
# signal_features_indivisual_results['precision'].append(precision)
# signal_features_indivisual_results['recall'].append(recall)
# signal_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# signal_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# signal_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# signal_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# signal_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# signal_features_indivisual_results['subject'].append(sub)
# signal_features_indivisual_results['day'].append(int(day[-1]))
# signal_features_indivisual_results['epoch_length'].append(float(epoch_length))
# print(sub_fold,Counter(label),'signal:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
# auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores=graph_temp
# graph_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# graph_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# graph_features_indivisual_results['fpr'].append(fpr)
# graph_features_indivisual_results['tpr'].append(tpr)
# graph_features_indivisual_results['precision'].append(precision)
# graph_features_indivisual_results['recall'].append(recall)
# graph_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# graph_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# graph_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# graph_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# graph_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# graph_features_indivisual_results['subject'].append(sub)
# graph_features_indivisual_results['day'].append(int(day[-1]))
# graph_features_indivisual_results['epoch_length'].append(float(epoch_length))
# print(sub_fold,Counter(label),'graph:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
# except:
# print(sub_fold,Counter(label),'not enough samples')
#signal_features_indivisual_results = pd.DataFrame(signal_features_indivisual_results)
#graph_features_indivisual_results = pd.DataFrame(graph_features_indivisual_results)
#signal_features_indivisual_results.to_csv(file_dir+'individual_signal_feature_regression.csv',index=False)
#graph_features_indivisual_results.to_csv(file_dir+'individual_graph_feature_regression.csv',index=False)
#
#
#signal_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[]}
#graph_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[]}
#combine_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[]}
#for directory_1 in [f for f in os.listdir(file_dir) if ('epoch_length' in f)]:
# sub_dir = file_dir + directory_1 + '\\'
# epoch_length = directory_1.split(' ')[1]
# os.chdir(sub_dir)
# #signal_features_indivisual_results[directory_1],graph_features_indivisual_results[directory_1]={},{}
# #df_cc, df_pli, df_plv, df_signal,df_graph = [],[],[],[],[]
# for sub_fold in os.listdir(sub_dir):
# sub_fold_dir = sub_dir + sub_fold + '\\'
# os.chdir(sub_fold_dir)
# sub = sub_fold[:-4]
# day = sub_fold[4:][-4:]
# print(sub,day,epoch_length)
# cc_features, pli_features, plv_features, signal_features = [pd.read_csv(f) for f in os.listdir(sub_fold_dir) if ('csv' in f)]
# #df_cc.append(cc_features)
# #df_pli.append(pli_features)
# #df_plv.append(plv_features)
# label = cc_features['label']
# cc_features = eegPipelineFunctions.get_real_part(cc_features)
# pli_features = eegPipelineFunctions.get_real_part(pli_features)
# plv_features = eegPipelineFunctions.get_real_part(plv_features)
# cc_features.columns = ['cc_'+name for name in cc_features]
# pli_features.columns = ['pli_'+name for name in pli_features]
# plv_features.columns = ['plv_'+name for name in plv_features]
# cc_features = cc_features.drop('cc_label',1)
# pli_features = pli_features.drop('pli_label',1)
# plv_features = plv_features.drop('plv_label',1)
# df_combine = pd.concat([cc_features,pli_features,plv_features],axis=1)
# df_combine['label']=label
# df_two = pd.concat([cc_features, pli_features, plv_features, signal_features],axis=1)
# try:
# signal_temp = eegPipelineFunctions.cross_validation_with_clfs(signal_features,clf_='RF')
# graph_temp = eegPipelineFunctions.cross_validation_with_clfs(df_combine,clf_='RF')
# two_temp = eegPipelineFunctions.cross_validation_with_clfs(df_two,clf_='RF')
# auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores,MCC=signal_temp
# signal_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# signal_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# signal_features_indivisual_results['fpr'].append(fpr)
# signal_features_indivisual_results['tpr'].append(tpr)
# signal_features_indivisual_results['precision'].append(precision)
# signal_features_indivisual_results['recall'].append(recall)
# signal_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# signal_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# signal_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# signal_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# signal_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# signal_features_indivisual_results['subject'].append(sub)
# signal_features_indivisual_results['day'].append(int(day[-1]))
# signal_features_indivisual_results['epoch_length'].append(float(epoch_length))
#            print(sub_fold,Counter(label),'signal:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
#            auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores,MCC=graph_temp
# graph_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# graph_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# graph_features_indivisual_results['fpr'].append(fpr)
# graph_features_indivisual_results['tpr'].append(tpr)
# graph_features_indivisual_results['precision'].append(precision)
# graph_features_indivisual_results['recall'].append(recall)
# graph_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# graph_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# graph_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# graph_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# graph_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# graph_features_indivisual_results['subject'].append(sub)
# graph_features_indivisual_results['day'].append(int(day[-1]))
# graph_features_indivisual_results['epoch_length'].append(float(epoch_length))
# print(sub_fold,Counter(label),'graph:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
#            auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores,MCC=two_temp
# combine_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# combine_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# combine_features_indivisual_results['fpr'].append(fpr)
# combine_features_indivisual_results['tpr'].append(tpr)
# combine_features_indivisual_results['precision'].append(precision)
# combine_features_indivisual_results['recall'].append(recall)
# combine_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# combine_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# combine_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# combine_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# combine_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# combine_features_indivisual_results['subject'].append(sub)
# combine_features_indivisual_results['day'].append(int(day[-1]))
# combine_features_indivisual_results['epoch_length'].append(float(epoch_length))
#            print(sub_fold,Counter(label),'combine:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
# except:
# print(sub_fold,Counter(label),'not enough samples')
#signal_features_indivisual_results = pd.DataFrame(signal_features_indivisual_results)
#graph_features_indivisual_results = pd.DataFrame(graph_features_indivisual_results)
#combine_features_indivisual_results = pd.DataFrame(combine_features_indivisual_results)
#signal_features_indivisual_results.to_csv(file_dir+'individual_signal_feature_RF.csv',index=False)
#graph_features_indivisual_results.to_csv(file_dir+'individual_graph_feature_RF.csv',index=False)
#combine_features_indivisual_results.to_csv(file_dir+'individual_combine_feature_RF.csv',index=False)
##pickle.dump(signal_features_indivisual_results,open(file_dir+'individual_signal_feature_RF.p','wb'))
##pickle.dump(graph_features_indivisual_results,open(file_dir+'individual_graph_feature_RF.p','wb'))
#
#
#signal_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[]}
#graph_features_indivisual_results = {'subject':[],'day':[],'epoch_length':[],
# 'auc_score_mean':[],'auc_score_std':[],
# 'fpr':[],'tpr':[],
# 'precision':[],'recall':[],
# 'precision_mean':[],'precision_std':[],
# 'recall_mean':[],'recall_std':[],
# 'area_under_precision_recall':[]}
#for directory_1 in [f for f in os.listdir(file_dir) if ('epoch_length' in f)]:
# sub_dir = file_dir + directory_1 + '\\'
# epoch_length = directory_1.split(' ')[1]
# os.chdir(sub_dir)
# #signal_features_indivisual_results[directory_1],graph_features_indivisual_results[directory_1]={},{}
# #df_cc, df_pli, df_plv, df_signal,df_graph = [],[],[],[],[]
# for sub_fold in os.listdir(sub_dir):
# sub_fold_dir = sub_dir + sub_fold + '\\'
# os.chdir(sub_fold_dir)
# sub = sub_fold[:-4]
# day = sub_fold[4:][-4:]
# print(sub,day,epoch_length)
#
# cc_features, pli_features, plv_features, signal_features = [pd.read_csv(f) for f in os.listdir(sub_fold_dir) if ('csv' in f)]
# #df_cc.append(cc_features)
# #df_pli.append(pli_features)
# #df_plv.append(plv_features)
# label = cc_features['label']
# cc_features = eegPipelineFunctions.get_real_part(cc_features)
# pli_features = eegPipelineFunctions.get_real_part(pli_features)
# plv_features = eegPipelineFunctions.get_real_part(plv_features)
# cc_features.columns = ['cc_'+name for name in cc_features]
# pli_features.columns = ['pli_'+name for name in pli_features]
# plv_features.columns = ['plv_'+name for name in plv_features]
# cc_features = cc_features.drop('cc_label',1)
# pli_features = pli_features.drop('pli_label',1)
# plv_features = plv_features.drop('plv_label',1)
# df_combine = pd.concat([cc_features,pli_features,plv_features],axis=1)
# df_combine['label']=label
# try:
# signal_temp = eegPipelineFunctions.cross_validation_with_clfs(signal_features,clf_='svm')
# graph_temp = eegPipelineFunctions.cross_validation_with_clfs(df_combine,clf_='svm')
# auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores=signal_temp
# signal_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# signal_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# signal_features_indivisual_results['fpr'].append(fpr)
# signal_features_indivisual_results['tpr'].append(tpr)
# signal_features_indivisual_results['precision'].append(precision)
# signal_features_indivisual_results['recall'].append(recall)
# signal_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# signal_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# signal_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# signal_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# signal_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# signal_features_indivisual_results['subject'].append(sub)
# signal_features_indivisual_results['day'].append(int(day[-1]))
# signal_features_indivisual_results['epoch_length'].append(float(epoch_length))
# print(sub_fold,Counter(label),'signal:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
# auc_score,fpr,tpr,precision,recall,precision_scores,recall_scores,average_scores=graph_temp
# graph_features_indivisual_results['auc_score_mean'].append(np.nanmean(auc_score))
# graph_features_indivisual_results['auc_score_std'].append(np.std(auc_score))
# graph_features_indivisual_results['fpr'].append(fpr)
# graph_features_indivisual_results['tpr'].append(tpr)
# graph_features_indivisual_results['precision'].append(precision)
# graph_features_indivisual_results['recall'].append(recall)
# graph_features_indivisual_results['precision_mean'].append(np.nanmean(precision_scores))
# graph_features_indivisual_results['precision_std'].append(np.std(precision_scores))
# graph_features_indivisual_results['recall_mean'].append(np.nanmean(recall_scores))
# graph_features_indivisual_results['recall_std'].append(np.std(recall_scores))
# graph_features_indivisual_results['area_under_precision_recall'].append(average_scores)
# graph_features_indivisual_results['subject'].append(sub)
# graph_features_indivisual_results['day'].append(int(day[-1]))
# graph_features_indivisual_results['epoch_length'].append(float(epoch_length))
# print(sub_fold,Counter(label),'graph:%.2f +/-%.2f'%(np.nanmean(auc_score),np.std(auc_score)))
# except:
# print(sub_fold,Counter(label),'not enough samples')
#signal_features_indivisual_results = pd.DataFrame(signal_features_indivisual_results)
#graph_features_indivisual_results = pd.DataFrame(graph_features_indivisual_results)
#signal_features_indivisual_results.to_csv(file_dir+'individual_signal_feature_svm.csv',index=False)
#graph_features_indivisual_results.to_csv(file_dir+'individual_graph_feature_svm.csv',index=False)
| 75.315596 | 146 | 0.64107 | 4,417 | 41,047 | 5.541318 | 0.048223 | 0.18606 | 0.258416 | 0.125388 | 0.937286 | 0.923353 | 0.913262 | 0.901332 | 0.884622 | 0.881517 | 0 | 0.007168 | 0.228519 | 41,047 | 544 | 147 | 75.454044 | 0.76575 | 0.808805 | 0 | 0.41791 | 0 | 0 | 0.12379 | 0.094323 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.074627 | 0 | 0.074627 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
569f02aa68980a3d2ae9a853ff82d9c54ee60622 | 52 | py | Python | server/tests/__init__.py | isu-avista/iot | d87776a28778411f7ea0086ebbbb04a1b69a67a1 | [
"MIT"
] | null | null | null | server/tests/__init__.py | isu-avista/iot | d87776a28778411f7ea0086ebbbb04a1b69a67a1 | [
"MIT"
] | null | null | null | server/tests/__init__.py | isu-avista/iot | d87776a28778411f7ea0086ebbbb04a1b69a67a1 | [
"MIT"
] | null | null | null | from tests import test_server
from tests import api
| 17.333333 | 29 | 0.846154 | 9 | 52 | 4.777778 | 0.666667 | 0.418605 | 0.697674 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 52 | 2 | 30 | 26 | 0.977273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3b5f22344b54f46ba0efe94c2449d9a53f61769b | 95,288 | py | Python | codec/gen_py/fomo3d/__init__.py | tokenchain/fomo3d | beaf777030e17eb2f3b5ebc004df109d4840f222 | [
"MIT"
] | null | null | null | codec/gen_py/fomo3d/__init__.py | tokenchain/fomo3d | beaf777030e17eb2f3b5ebc004df109d4840f222 | [
"MIT"
] | null | null | null | codec/gen_py/fomo3d/__init__.py | tokenchain/fomo3d | beaf777030e17eb2f3b5ebc004df109d4840f222 | [
"MIT"
] | null | null | null | """Generated wrapper for fomo3d Solidity contract."""
# pylint: disable=too-many-arguments
import json
from typing import ( # pylint: disable=unused-import
Any,
List,
Optional,
Tuple,
Union,
)
import time
from eth_utils import to_checksum_address
from mypy_extensions import TypedDict # pylint: disable=unused-import
from hexbytes import HexBytes
from web3 import Web3
from web3.contract import ContractFunction
from web3.datastructures import AttributeDict
from web3.providers.base import BaseProvider
from web3.exceptions import ContractLogicError
from moody.m.bases import ContractMethod, Validator, ContractBase, Signatures
from moody.m.tx_params import TxParams
from moody.libeb import MiliDoS
from moody import Bolors
# Try to import a custom validator class definition; if there isn't one,
# declare one that we can instantiate for the default argument to the
# constructor for fomo3d below.
try:
# both mypy and pylint complain about what we're doing here, but this
# works just fine, so their messages have been disabled here.
from . import ( # type: ignore # pylint: disable=import-self
fomo3dValidator,
)
except ImportError:
class fomo3dValidator( # type: ignore
Validator
):
"""No-op input validator."""
try:
from .middleware import MIDDLEWARE # type: ignore
except ImportError:
pass
class ClaimTransactionTypehashMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the CLAIM_TRANSACTION_TYPEHASH method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("CLAIM_TRANSACTION_TYPEHASH")
def block_call(self, debug:bool=False) -> Union[bytes, str]:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
        return returned
    def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
        """Sign and broadcast the underlying contract method via eth_sendRawTransaction.
        :param _gaswei: gas limit for the transaction
        :param _pricewei: gas price in wei
        :returns: None; the transaction hash (and receipt, if requested) is printed.
        """
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: claim_transaction_typehash")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, claim_transaction_typehash: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, claim_transaction_typehash. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class DomainSeparatorMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the DOMAIN_SEPARATOR method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("DOMAIN_SEPARATOR")
def block_call(self, debug:bool=False) -> Union[bytes, str]:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
        return returned
    def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
        """Sign and broadcast the underlying contract method via eth_sendRawTransaction.
        :param _gaswei: gas limit for the transaction
        :param _pricewei: gas price in wei
        :returns: None; the transaction hash (and receipt, if requested) is printed.
        """
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: domain_separator")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, domain_separator: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, domain_separator. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class Eip712DomainTypehashMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the EIP712_DOMAIN_TYPEHASH method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("EIP712_DOMAIN_TYPEHASH")
def block_call(self, debug:bool=False) -> Union[bytes, str]:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
        return returned
    def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
        """Sign and broadcast the underlying contract method via eth_sendRawTransaction.
        :param _gaswei: gas limit for the transaction
        :param _pricewei: gas price in wei
        :returns: None; the transaction hash (and receipt, if requested) is printed.
        """
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: eip712_domain_typehash")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, eip712_domain_typehash: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, eip712_domain_typehash. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
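# Background sketch (generic EIP-712 construction, assumed rather than read
# from the contract source): the typehash and domain-separator getters above
# are normally combined off-chain into the digest that actually gets signed:
#
#   digest = Web3.keccak(
#       b"\x19\x01"
#       + domain_separator           # bytes32 from DOMAIN_SEPARATOR()
#       + message_struct_hash        # keccak of the ABI-encoded typed struct
#   )
#
# The (v, r, s) values accepted by claim() further below are expected to be a
# signature over such a digest; the exact struct encoding is an assumption.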
class VaultbuyTransactionTypehashMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the VAULTBUY_TRANSACTION_TYPEHASH method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("VAULTBUY_TRANSACTION_TYPEHASH")
def block_call(self, debug:bool=False) -> Union[bytes, str]:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
        return returned
    def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
        """Sign and broadcast the underlying contract method via eth_sendRawTransaction.
        :param _gaswei: gas limit for the transaction
        :param _pricewei: gas price in wei
        :returns: None; the transaction hash (and receipt, if requested) is printed.
        """
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: vaultbuy_transaction_typehash")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, vaultbuy_transaction_typehash: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, vaultbuy_transaction_typehash. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class BnbBuyMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the bnbBuy method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("bnbBuy")
def validate_and_normalize_inputs(self, buy_num: int, team: int, rounds: int)->any:
"""Validate the inputs to the bnbBuy method."""
self.validator.assert_valid(
method_name='bnbBuy',
parameter_name='_buy_num',
argument_value=buy_num,
)
# safeguard against fractional inputs
buy_num = int(buy_num)
self.validator.assert_valid(
method_name='bnbBuy',
parameter_name='_team',
argument_value=team,
)
self.validator.assert_valid(
method_name='bnbBuy',
parameter_name='_rounds',
argument_value=rounds,
)
return (buy_num, team, rounds)
def block_send(self, buy_num: int, team: int, rounds: int,_gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method(buy_num, team, rounds)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: bnb_buy")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, bnb_buy: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, bnb_buy. Reason: Unknown")
def send_transaction(self, buy_num: int, team: int, rounds: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(buy_num, team, rounds) = self.validate_and_normalize_inputs(buy_num, team, rounds)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(buy_num, team, rounds).transact(tx_params.as_dict())
def build_transaction(self, buy_num: int, team: int, rounds: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(buy_num, team, rounds) = self.validate_and_normalize_inputs(buy_num, team, rounds)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(buy_num, team, rounds).buildTransaction(tx_params.as_dict())
def estimate_gas(self, buy_num: int, team: int, rounds: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(buy_num, team, rounds) = self.validate_and_normalize_inputs(buy_num, team, rounds)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(buy_num, team, rounds).estimateGas(tx_params.as_dict())
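# Hypothetical usage sketch (wiring names are illustrative, not part of this
# module): given a MiliDoS session `elib` and a web3 contract handle,
#
#   bnb_buy = BnbBuyMethod(elib, contract_address, contract.functions.bnbBuy, validator)
#   gas = bnb_buy.estimate_gas(buy_num=1, team=0, rounds=1)   # dry-run cost
#   bnb_buy.block_send(1, 0, 1, _gaswei=gas, _pricewei=5_000_000_000,
#                      _valeth=price_in_wei)  # price_in_wei: illustrative payable value
#
# block_send signs with the configured operator key and only prints the hash/
# receipt; use send_transaction() when the HexBytes result is needed.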
class ClaimMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the claim method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("claim")
def validate_and_normalize_inputs(self, account: str, number: int, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str])->any:
"""Validate the inputs to the claim method."""
self.validator.assert_valid(
method_name='claim',
parameter_name='_account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
self.validator.assert_valid(
method_name='claim',
parameter_name='_number',
argument_value=number,
)
# safeguard against fractional inputs
number = int(number)
self.validator.assert_valid(
method_name='claim',
parameter_name='nonce',
argument_value=nonce,
)
# safeguard against fractional inputs
nonce = int(nonce)
self.validator.assert_valid(
method_name='claim',
parameter_name='v',
argument_value=v,
)
self.validator.assert_valid(
method_name='claim',
parameter_name='r',
argument_value=r,
)
self.validator.assert_valid(
method_name='claim',
parameter_name='s',
argument_value=s,
)
return (account, number, nonce, v, r, s)
def block_send(self, account: str, number: int, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str],_gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method(account, number, nonce, v, r, s)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: claim")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, claim: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, claim. Reason: Unknown")
def send_transaction(self, account: str, number: int, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str], tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(account, number, nonce, v, r, s) = self.validate_and_normalize_inputs(account, number, nonce, v, r, s)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, number, nonce, v, r, s).transact(tx_params.as_dict())
def build_transaction(self, account: str, number: int, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str], tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(account, number, nonce, v, r, s) = self.validate_and_normalize_inputs(account, number, nonce, v, r, s)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, number, nonce, v, r, s).buildTransaction(tx_params.as_dict())
def estimate_gas(self, account: str, number: int, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str], tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(account, number, nonce, v, r, s) = self.validate_and_normalize_inputs(account, number, nonce, v, r, s)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account, number, nonce, v, r, s).estimateGas(tx_params.as_dict())
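# Hypothetical helper (an assumption, not part of the generated wrapper):
# claim() takes the signature as separate v/r/s components, while most signers
# hand back a single 65-byte r||s||v blob, which can be split like this:
#
#   def split_signature(sig: bytes):
#       assert len(sig) == 65, "expected r(32) || s(32) || v(1)"
#       r, s, v = sig[:32], sig[32:64], sig[64]
#       return v, r, s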
class EndTimeMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the end_time method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("end_time")
def validate_and_normalize_inputs(self, index_0: int)->any:
"""Validate the inputs to the end_time method."""
self.validator.assert_valid(
method_name='end_time',
parameter_name='index_0',
argument_value=index_0,
)
# safeguard against fractional inputs
index_0 = int(index_0)
return (index_0)
def block_call(self,index_0: int, debug:bool=False) -> int:
_fn = self._underlying_method(index_0)
returned = _fn.call({
'from': self._operate
})
return int(returned)
    def block_send(self, index_0: int,_gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
        """Sign and broadcast the underlying contract method via eth_sendRawTransaction.
        :param _gaswei: gas limit for the transaction
        :param _pricewei: gas price in wei
        :returns: None; the transaction hash (and receipt, if requested) is printed.
        """
_fn = self._underlying_method(index_0)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: end_time")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, end_time: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, end_time. Reason: Unknown")
def send_transaction(self, index_0: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(index_0) = self.validate_and_normalize_inputs(index_0)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0).transact(tx_params.as_dict())
def build_transaction(self, index_0: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(index_0) = self.validate_and_normalize_inputs(index_0)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0).buildTransaction(tx_params.as_dict())
def estimate_gas(self, index_0: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(index_0) = self.validate_and_normalize_inputs(index_0)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0).estimateGas(tx_params.as_dict())
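# Read-only getters such as end_time expose block_call(), which performs a
# plain eth_call and returns the decoded value without broadcasting anything,
# e.g. (name illustrative): round_end = end_time_method.block_call(rounds_index)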
class KeyFinalPriceMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the key_final_price method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("key_final_price")
def block_call(self, debug:bool=False) -> int:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return int(returned)
    def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
        """Sign and broadcast the underlying contract method via eth_sendRawTransaction.
        :param _gaswei: gas limit for the transaction
        :param _pricewei: gas price in wei
        :returns: None; the transaction hash (and receipt, if requested) is printed.
        """
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: key_final_price")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, key_final_price: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, key_final_price. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class KeyIncreasingPriceMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the key_increasing_price method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("key_increasing_price")
def block_call(self, debug:bool=False) -> int:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return int(returned)
def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> int:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: key_increasing_price")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, key_increasing_price: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, key_increasing_price. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class KeyInitPriceMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the key_init_price method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("key_init_price")
def block_call(self, debug:bool=False) -> int:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return int(returned)
def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> int:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: key_init_price")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, key_init_price: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, key_init_price. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class NonceOfMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the nonceOf method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("nonceOf")
def validate_and_normalize_inputs(self, account: str)->any:
"""Validate the inputs to the nonceOf method."""
self.validator.assert_valid(
method_name='nonceOf',
parameter_name='account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
return (account)
def block_call(self,account: str, debug:bool=False) -> int:
_fn = self._underlying_method(account)
returned = _fn.call({
'from': self._operate
})
return int(returned)
def block_send(self, account: str,_gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> int:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method(account)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: nonce_of")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, nonce_of: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, nonce_of. Reason: Unknown")
def send_transaction(self, account: str, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).transact(tx_params.as_dict())
def build_transaction(self, account: str, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).buildTransaction(tx_params.as_dict())
def estimate_gas(self, account: str, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(account) = self.validate_and_normalize_inputs(account)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(account).estimateGas(tx_params.as_dict())
class OwnerMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the owner method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("owner")
def block_call(self, debug:bool=False) -> str:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return str(returned)
def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> str:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: owner")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, owner: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, owner. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class RoundsMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the rounds method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address)
self._underlying_method = contract_function
self.sign = validator.getSignature("rounds")
def block_call(self, debug:bool=False) -> int:
_fn = self._underlying_method()
returned = _fn.call({
'from': self._operate
})
return int(returned)
def block_send(self, _gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> int:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method()
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: rounds")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, rounds: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, rounds. Reason: Unknown")
def send_transaction(self, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().transact(tx_params.as_dict())
def build_transaction(self, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().buildTransaction(tx_params.as_dict())
def estimate_gas(self, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method().estimateGas(tx_params.as_dict())
class SetActionTimeMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the setActionTime method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("setActionTime")
def validate_and_normalize_inputs(self, time: int)->any:
"""Validate the inputs to the setActionTime method."""
self.validator.assert_valid(
method_name='setActionTime',
parameter_name='_time',
argument_value=time,
)
# safeguard against fractional inputs
time = int(time)
return (time)
def block_send(self, _time: int,_gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
"""Build, sign and broadcast the underlying contract method as a raw transaction.
The first parameter is named `_time` (not `time`) so that it does not shadow
the `time` module, which the `time.sleep(self._wait)` call below relies on.
:param _time: the new action time to set
:param _gaswei: gas limit for the transaction
:param _pricewei: gas price in wei
:param _valeth: value to attach in wei (default 0)
:param _debugtx: print the transaction and its result when True
:param _receipList: wait for the transaction receipt when True
"""
_fn = self._underlying_method(_time)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: set_action_time")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, set_action_time: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, set_action_time. Reason: Unknown")
def send_transaction(self, time: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(time) = self.validate_and_normalize_inputs(time)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(time).transact(tx_params.as_dict())
def build_transaction(self, time: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(time) = self.validate_and_normalize_inputs(time)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(time).buildTransaction(tx_params.as_dict())
def estimate_gas(self, time: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(time) = self.validate_and_normalize_inputs(time)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(time).estimateGas(tx_params.as_dict())
class StartTimeMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the start_time method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("start_time")
def validate_and_normalize_inputs(self, index_0: int)->any:
"""Validate the inputs to the start_time method."""
self.validator.assert_valid(
method_name='start_time',
parameter_name='index_0',
argument_value=index_0,
)
# safeguard against fractional inputs
index_0 = int(index_0)
return (index_0)
def block_call(self,index_0: int, debug:bool=False) -> int:
_fn = self._underlying_method(index_0)
returned = _fn.call({
'from': self._operate
})
return int(returned)
def block_send(self, index_0: int,_gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> int:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method(index_0)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: start_time")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, start_time: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, start_time. Reason: Unknown")
def send_transaction(self, index_0: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(index_0) = self.validate_and_normalize_inputs(index_0)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0).transact(tx_params.as_dict())
def build_transaction(self, index_0: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(index_0) = self.validate_and_normalize_inputs(index_0)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0).buildTransaction(tx_params.as_dict())
def estimate_gas(self, index_0: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(index_0) = self.validate_and_normalize_inputs(index_0)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0).estimateGas(tx_params.as_dict())
class TeamMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the team method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("team")
def validate_and_normalize_inputs(self, index_0: str, index_1: int)->any:
"""Validate the inputs to the team method."""
self.validator.assert_valid(
method_name='team',
parameter_name='index_0',
argument_value=index_0,
)
index_0 = self.validate_and_checksum_address(index_0)
self.validator.assert_valid(
method_name='team',
parameter_name='index_1',
argument_value=index_1,
)
# safeguard against fractional inputs
index_1 = int(index_1)
return (index_0, index_1)
def block_call(self,index_0: str, index_1: int, debug:bool=False) -> int:
_fn = self._underlying_method(index_0, index_1)
returned = _fn.call({
'from': self._operate
})
return int(returned)
def block_send(self, index_0: str, index_1: int,_gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> int:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method(index_0, index_1)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: team")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, team: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, team. Reason: Unknown")
def send_transaction(self, index_0: str, index_1: int, tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(index_0, index_1) = self.validate_and_normalize_inputs(index_0, index_1)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0, index_1).transact(tx_params.as_dict())
def build_transaction(self, index_0: str, index_1: int, tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(index_0, index_1) = self.validate_and_normalize_inputs(index_0, index_1)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0, index_1).buildTransaction(tx_params.as_dict())
def estimate_gas(self, index_0: str, index_1: int, tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(index_0, index_1) = self.validate_and_normalize_inputs(index_0, index_1)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(index_0, index_1).estimateGas(tx_params.as_dict())
class VaultBuyMethod(ContractMethod): # pylint: disable=invalid-name
"""Various interfaces to the vaultBuy method."""
def __init__(self, elib: MiliDoS, contract_address: str, contract_function: ContractFunction, validator: Validator=None):
"""Persist instance data."""
super().__init__(elib, contract_address, validator)
self._underlying_method = contract_function
self.sign = validator.getSignature("vaultBuy")
def validate_and_normalize_inputs(self, buy_num: int, team: int, rounds: int, account: str, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str])->any:
"""Validate the inputs to the vaultBuy method."""
self.validator.assert_valid(
method_name='vaultBuy',
parameter_name='_buy_num',
argument_value=buy_num,
)
# safeguard against fractional inputs
buy_num = int(buy_num)
self.validator.assert_valid(
method_name='vaultBuy',
parameter_name='_team',
argument_value=team,
)
self.validator.assert_valid(
method_name='vaultBuy',
parameter_name='_rounds',
argument_value=rounds,
)
# safeguard against fractional inputs
rounds = int(rounds)
self.validator.assert_valid(
method_name='vaultBuy',
parameter_name='_account',
argument_value=account,
)
account = self.validate_and_checksum_address(account)
self.validator.assert_valid(
method_name='vaultBuy',
parameter_name='nonce',
argument_value=nonce,
)
# safeguard against fractional inputs
nonce = int(nonce)
self.validator.assert_valid(
method_name='vaultBuy',
parameter_name='v',
argument_value=v,
)
self.validator.assert_valid(
method_name='vaultBuy',
parameter_name='r',
argument_value=r,
)
self.validator.assert_valid(
method_name='vaultBuy',
parameter_name='s',
argument_value=s,
)
return (buy_num, team, rounds, account, nonce, v, r, s)
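# (v, r, s) are the components of an ECDSA signature over EIP-712 typed data
# (cf. VAULTBUY_TRANSACTION_TYPEHASH and DOMAIN_SEPARATOR in this contract);
# the contract recovers the signer on-chain to authorise the purchase, and
# nonceOf(account) supplies the replay-protection nonce.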
def block_send(self, buy_num: int, team: int, rounds: int, account: str, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str],_gaswei:int,_pricewei:int,_valeth:int=0,_debugtx: bool = False,_receipList: bool = False) -> None:
"""Execute underlying contract method via eth_call.
:param tx_params: transaction parameters
:returns: the return value of the underlying method.
"""
_fn = self._underlying_method(buy_num, team, rounds, account, nonce, v, r, s)
try:
_t = _fn.buildTransaction({
'from': self._operate,
'gas': _gaswei,
'gasPrice': _pricewei
})
_t['nonce'] = self._web3_eth.getTransactionCount(self._operate)
if _valeth > 0:
_t['value'] = _valeth
if _debugtx:
print(f"======== Signing ✅ by {self._operate}")
print(f"======== Transaction ✅ check")
print(_t)
if 'data' in _t:
signed = self._web3_eth.account.sign_transaction(_t)
txHash = self._web3_eth.sendRawTransaction(signed.rawTransaction)
tx_receipt = None
if _receipList is True:
print(f"======== awaiting Confirmation 🚸️ {self.sign}")
tx_receipt = self._web3_eth.waitForTransactionReceipt(txHash)
if _debugtx:
print("======== TX Result ✅")
print(tx_receipt)
print(f"======== TX blockHash ✅")
if tx_receipt is not None:
print(f"{Bolors.OK}{tx_receipt.blockHash.hex()}{Bolors.RESET}")
else:
print(f"{Bolors.WARNING}{txHash.hex()}{Bolors.RESET} - broadcast hash")
if _receipList is False:
time.sleep(self._wait)
except ContractLogicError as er:
print(f"{Bolors.FAIL}Error {er} {Bolors.RESET}: vault_buy")
except ValueError as err:
if "message" in err.args[0]:
message = err.args[0]["message"]
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, vault_buy: {message}")
else:
print(f"{Bolors.FAIL}Error Revert {Bolors.RESET}, vault_buy. Reason: Unknown")
def send_transaction(self, buy_num: int, team: int, rounds: int, account: str, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str], tx_params: Optional[TxParams] = None) -> Union[HexBytes, bytes]:
"""Execute underlying contract method via eth_sendTransaction.
:param tx_params: transaction parameters
"""
(buy_num, team, rounds, account, nonce, v, r, s) = self.validate_and_normalize_inputs(buy_num, team, rounds, account, nonce, v, r, s)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(buy_num, team, rounds, account, nonce, v, r, s).transact(tx_params.as_dict())
def build_transaction(self, buy_num: int, team: int, rounds: int, account: str, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str], tx_params: Optional[TxParams] = None) -> dict:
"""Construct calldata to be used as input to the method."""
(buy_num, team, rounds, account, nonce, v, r, s) = self.validate_and_normalize_inputs(buy_num, team, rounds, account, nonce, v, r, s)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(buy_num, team, rounds, account, nonce, v, r, s).buildTransaction(tx_params.as_dict())
def estimate_gas(self, buy_num: int, team: int, rounds: int, account: str, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str], tx_params: Optional[TxParams] = None) -> int:
"""Estimate gas consumption of method call."""
(buy_num, team, rounds, account, nonce, v, r, s) = self.validate_and_normalize_inputs(buy_num, team, rounds, account, nonce, v, r, s)
tx_params = super().normalize_tx_params(tx_params)
return self._underlying_method(buy_num, team, rounds, account, nonce, v, r, s).estimateGas(tx_params.as_dict())
class SignatureGenerator(Signatures):
"""
Generates the function selector signatures from the fomo3d ABI and makes
them available for lookup by method name.
"""
def __init__(self, abi: any):
super().__init__(abi)
def claim_transaction_typehash(self) -> str:
return self._function_signatures["CLAIM_TRANSACTION_TYPEHASH"]
def domain_separator(self) -> str:
return self._function_signatures["DOMAIN_SEPARATOR"]
def eip712_domain_typehash(self) -> str:
return self._function_signatures["EIP712_DOMAIN_TYPEHASH"]
def vaultbuy_transaction_typehash(self) -> str:
return self._function_signatures["VAULTBUY_TRANSACTION_TYPEHASH"]
def bnb_buy(self) -> str:
return self._function_signatures["bnbBuy"]
def claim(self) -> str:
return self._function_signatures["claim"]
def end_time(self) -> str:
return self._function_signatures["end_time"]
def key_final_price(self) -> str:
return self._function_signatures["key_final_price"]
def key_increasing_price(self) -> str:
return self._function_signatures["key_increasing_price"]
def key_init_price(self) -> str:
return self._function_signatures["key_init_price"]
def nonce_of(self) -> str:
return self._function_signatures["nonceOf"]
def owner(self) -> str:
return self._function_signatures["owner"]
def rounds(self) -> str:
return self._function_signatures["rounds"]
def set_action_time(self) -> str:
return self._function_signatures["setActionTime"]
def start_time(self) -> str:
return self._function_signatures["start_time"]
def team(self) -> str:
return self._function_signatures["team"]
def vault_buy(self) -> str:
return self._function_signatures["vaultBuy"]
# pylint: disable=too-many-public-methods,too-many-instance-attributes
class fomo3d(ContractBase):
"""Wrapper class for fomo3d Solidity contract."""
_fn_claim_transaction_typehash: ClaimTransactionTypehashMethod
"""Constructor-initialized instance of
:class:`ClaimTransactionTypehashMethod`.
"""
_fn_domain_separator: DomainSeparatorMethod
"""Constructor-initialized instance of
:class:`DomainSeparatorMethod`.
"""
_fn_eip712_domain_typehash: Eip712DomainTypehashMethod
"""Constructor-initialized instance of
:class:`Eip712DomainTypehashMethod`.
"""
_fn_vaultbuy_transaction_typehash: VaultbuyTransactionTypehashMethod
"""Constructor-initialized instance of
:class:`VaultbuyTransactionTypehashMethod`.
"""
_fn_bnb_buy: BnbBuyMethod
"""Constructor-initialized instance of
:class:`BnbBuyMethod`.
"""
_fn_claim: ClaimMethod
"""Constructor-initialized instance of
:class:`ClaimMethod`.
"""
_fn_end_time: EndTimeMethod
"""Constructor-initialized instance of
:class:`EndTimeMethod`.
"""
_fn_key_final_price: KeyFinalPriceMethod
"""Constructor-initialized instance of
:class:`KeyFinalPriceMethod`.
"""
_fn_key_increasing_price: KeyIncreasingPriceMethod
"""Constructor-initialized instance of
:class:`KeyIncreasingPriceMethod`.
"""
_fn_key_init_price: KeyInitPriceMethod
"""Constructor-initialized instance of
:class:`KeyInitPriceMethod`.
"""
_fn_nonce_of: NonceOfMethod
"""Constructor-initialized instance of
:class:`NonceOfMethod`.
"""
_fn_owner: OwnerMethod
"""Constructor-initialized instance of
:class:`OwnerMethod`.
"""
_fn_rounds: RoundsMethod
"""Constructor-initialized instance of
:class:`RoundsMethod`.
"""
_fn_set_action_time: SetActionTimeMethod
"""Constructor-initialized instance of
:class:`SetActionTimeMethod`.
"""
_fn_start_time: StartTimeMethod
"""Constructor-initialized instance of
:class:`StartTimeMethod`.
"""
_fn_team: TeamMethod
"""Constructor-initialized instance of
:class:`TeamMethod`.
"""
_fn_vault_buy: VaultBuyMethod
"""Constructor-initialized instance of
:class:`VaultBuyMethod`.
"""
SIGNATURES:SignatureGenerator = None
def __init__(
self,
core_lib: MiliDoS,
contract_address: str,
validator: fomo3dValidator = None,
):
"""Get an instance of wrapper for smart contract.
"""
# pylint: disable=too-many-statements
super().__init__()
self.contract_address = contract_address
web3 = core_lib.w3
if not validator:
validator = fomo3dValidator(web3, contract_address)
# if any middleware was imported, inject it
try:
MIDDLEWARE
except NameError:
pass
else:
try:
for middleware in MIDDLEWARE:
web3.middleware_onion.inject(
middleware['function'], layer=middleware['layer'],
)
except ValueError as value_error:
# only swallow the known "already injected" error; re-raise anything else
if value_error.args == ("You can't add the same un-named instance twice",):
pass
else:
raise
self._web3_eth = web3.eth
functions = self._web3_eth.contract(address=to_checksum_address(contract_address), abi=fomo3d.abi()).functions
signed = SignatureGenerator(fomo3d.abi())
validator.bindSignatures(signed)
self.SIGNATURES = signed
self._fn_claim_transaction_typehash = ClaimTransactionTypehashMethod(core_lib, contract_address, functions.CLAIM_TRANSACTION_TYPEHASH, validator)
self._fn_domain_separator = DomainSeparatorMethod(core_lib, contract_address, functions.DOMAIN_SEPARATOR, validator)
self._fn_eip712_domain_typehash = Eip712DomainTypehashMethod(core_lib, contract_address, functions.EIP712_DOMAIN_TYPEHASH, validator)
self._fn_vaultbuy_transaction_typehash = VaultbuyTransactionTypehashMethod(core_lib, contract_address, functions.VAULTBUY_TRANSACTION_TYPEHASH, validator)
self._fn_bnb_buy = BnbBuyMethod(core_lib, contract_address, functions.bnbBuy, validator)
self._fn_claim = ClaimMethod(core_lib, contract_address, functions.claim, validator)
self._fn_end_time = EndTimeMethod(core_lib, contract_address, functions.end_time, validator)
self._fn_key_final_price = KeyFinalPriceMethod(core_lib, contract_address, functions.key_final_price, validator)
self._fn_key_increasing_price = KeyIncreasingPriceMethod(core_lib, contract_address, functions.key_increasing_price, validator)
self._fn_key_init_price = KeyInitPriceMethod(core_lib, contract_address, functions.key_init_price, validator)
self._fn_nonce_of = NonceOfMethod(core_lib, contract_address, functions.nonceOf, validator)
self._fn_owner = OwnerMethod(core_lib, contract_address, functions.owner, validator)
self._fn_rounds = RoundsMethod(core_lib, contract_address, functions.rounds, validator)
self._fn_set_action_time = SetActionTimeMethod(core_lib, contract_address, functions.setActionTime, validator)
self._fn_start_time = StartTimeMethod(core_lib, contract_address, functions.start_time, validator)
self._fn_team = TeamMethod(core_lib, contract_address, functions.team, validator)
self._fn_vault_buy = VaultBuyMethod(core_lib, contract_address, functions.vaultBuy, validator)
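# Binding pattern: each _fn_* attribute above wraps exactly one ABI entry,
# pairing the generated ContractMethod subclass with its ContractFunction.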
def event_bnb_buy(
self, tx_hash: Union[HexBytes, bytes]
) -> Tuple[AttributeDict]:
"""
Implementation of event bnb_buy in contract fomo3d
Get log entry for BnbBuy event.
:param tx_hash: hash of transaction emitting BnbBuy event
"""
tx_receipt = self._web3_eth.getTransactionReceipt(tx_hash)
return self._web3_eth.contract(address=to_checksum_address(self.contract_address), abi=fomo3d.abi()).events.BnbBuy().processReceipt(tx_receipt)
def event_claim(
self, tx_hash: Union[HexBytes, bytes]
) -> Tuple[AttributeDict]:
"""
Implementation of event claim in contract fomo3d
Get log entry for Claim event.
:param tx_hash: hash of transaction emitting Claim event
"""
tx_receipt = self._web3_eth.getTransactionReceipt(tx_hash)
return self._web3_eth.contract(address=to_checksum_address(self.contract_address), abi=fomo3d.abi()).events.Claim().processReceipt(tx_receipt)
def event_vault_buy(
self, tx_hash: Union[HexBytes, bytes]
) -> Tuple[AttributeDict]:
"""
Implementation of event vault_buy in contract fomo3d
Get log entry for VaultBuy event.
:param tx_hash: hash of transaction emitting VaultBuy event
"""
tx_receipt = self._web3_eth.getTransactionReceipt(tx_hash)
return self._web3_eth.contract(address=to_checksum_address(self.contract_address), abi=fomo3d.abi()).events.VaultBuy().processReceipt(tx_receipt)
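# A minimal usage sketch for the event getters (assumption: `game` is a
# connected fomo3d instance and `tx` is a hash returned by a buy transaction;
# both names are hypothetical):
#
#     for log in game.event_bnb_buy(tx):
#         print(log['args']['account'], log['args']['buy_num'])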
def claim_transaction_typehash(self) -> Union[bytes, str]:
"""
Implementation of claim_transaction_typehash in contract fomo3d.
Read-only call returning the CLAIM_TRANSACTION_TYPEHASH constant.
"""
return self._fn_claim_transaction_typehash.block_call()
def domain_separator(self) -> Union[bytes, str]:
"""
Implementation of domain_separator in contract fomo3d.
Read-only call returning the EIP-712 DOMAIN_SEPARATOR.
"""
return self._fn_domain_separator.block_call()
def eip712_domain_typehash(self) -> Union[bytes, str]:
"""
Implementation of eip712_domain_typehash in contract fomo3d.
Read-only call returning the EIP712_DOMAIN_TYPEHASH constant.
"""
return self._fn_eip712_domain_typehash.block_call()
def vaultbuy_transaction_typehash(self) -> Union[bytes, str]:
"""
Implementation of vaultbuy_transaction_typehash in contract fomo3d.
Read-only call returning the VAULTBUY_TRANSACTION_TYPEHASH constant.
"""
return self._fn_vaultbuy_transaction_typehash.block_call()
def bnb_buy(self, buy_num: int, team: int, rounds: int, wei:int=0) -> None:
"""
Implementation of bnb_buy in contract fomo3d.
Payable transaction: buys `buy_num` keys for `team` in round `rounds`,
attaching `wei` as the transaction value.
"""
return self._fn_bnb_buy.block_send(buy_num, team, rounds, self.call_contract_fee_amount,self.call_contract_fee_price,wei,self.call_contract_debug_flag,self.call_contract_enforce_tx_receipt)
def claim(self, account: str, number: int, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str]) -> None:
"""
Implementation of claim in contract fomo3d.
Nonpayable transaction: claims winnings for `account`, authorised by an
ECDSA signature (nonce, v, r, s).
"""
return self._fn_claim.block_send(account, number, nonce, v, r, s, self.call_contract_fee_amount,self.call_contract_fee_price,0,self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def end_time(self, index_0: int) -> int:
"""
Implementation of end_time in contract fomo3d.
Read-only call returning the end time stored at index `index_0`.
"""
return self._fn_end_time.block_call(index_0)
def key_final_price(self) -> int:
"""
Implementation of key_final_price in contract fomo3d.
Read-only call returning the final key price.
"""
return self._fn_key_final_price.block_call()
def key_increasing_price(self) -> int:
"""
Implementation of key_increasing_price in contract fomo3d.
Read-only call returning the per-key price increment.
"""
return self._fn_key_increasing_price.block_call()
def key_init_price(self) -> int:
"""
Implementation of key_init_price in contract fomo3d.
Read-only call returning the initial key price.
"""
return self._fn_key_init_price.block_call()
def nonce_of(self, account: str) -> int:
"""
Implementation of nonce_of in contract fomo3d.
Read-only call returning the nonce recorded for `account`.
"""
return self._fn_nonce_of.block_call(account)
def owner(self) -> str:
"""
Implementation of owner in contract fomo3d.
Read-only call returning the contract owner address.
"""
return self._fn_owner.block_call()
def rounds(self) -> int:
"""
Implementation of rounds in contract fomo3d.
Read-only call returning the current rounds counter.
"""
return self._fn_rounds.block_call()
def set_action_time(self, time: int) -> None:
"""
Implementation of set_action_time in contract fomo3d.
Nonpayable transaction: sets the action time to `time`.
"""
return self._fn_set_action_time.block_send(time, self.call_contract_fee_amount,self.call_contract_fee_price,0,self.call_contract_debug_flag, self.call_contract_enforce_tx_receipt)
def start_time(self, index_0: int) -> int:
"""
Implementation of start_time in contract fomo3d.
Read-only call returning the start time stored at index `index_0`.
"""
return self._fn_start_time.block_call(index_0)
def team(self, index_0: str, index_1: int) -> int:
"""
Implementation of team in contract fomo3d.
Read-only call returning the team (enum fomo3d.group) for address
`index_0` and index `index_1`.
"""
return self._fn_team.block_call(index_0, index_1)
def vault_buy(self, buy_num: int, team: int, rounds: int, account: str, nonce: int, v: int, r: Union[bytes, str], s: Union[bytes, str], wei:int=0) -> None:
"""
Implementation of vault_buy in contract fomo3d.
Payable transaction: buys `buy_num` keys for `team` in round `rounds` on
behalf of `account`, authorised by an ECDSA signature (nonce, v, r, s).
"""
return self._fn_vault_buy.block_send(buy_num, team, rounds, account, nonce, v, r, s, self.call_contract_fee_amount,self.call_contract_fee_price,wei,self.call_contract_debug_flag,self.call_contract_enforce_tx_receipt)
def CallContractWait(self, t_long:int)-> "fomo3d":
self._fn_claim_transaction_typehash.setWait(t_long)
self._fn_domain_separator.setWait(t_long)
self._fn_eip712_domain_typehash.setWait(t_long)
self._fn_vaultbuy_transaction_typehash.setWait(t_long)
self._fn_bnb_buy.setWait(t_long)
self._fn_claim.setWait(t_long)
self._fn_end_time.setWait(t_long)
self._fn_key_final_price.setWait(t_long)
self._fn_key_increasing_price.setWait(t_long)
self._fn_key_init_price.setWait(t_long)
self._fn_nonce_of.setWait(t_long)
self._fn_owner.setWait(t_long)
self._fn_rounds.setWait(t_long)
self._fn_set_action_time.setWait(t_long)
self._fn_start_time.setWait(t_long)
self._fn_team.setWait(t_long)
self._fn_vault_buy.setWait(t_long)
return self
@staticmethod
def abi():
"""Return the ABI to the underlying contract."""
return json.loads(
'[{"inputs":[{"internalType":"address","name":"_owner","type":"address"}],"stateMutability":"nonpayable","type":"constructor"},{"anonymous":false,"inputs":[{"indexed":false,"internalType":"address","name":"account","type":"address"},{"indexed":false,"internalType":"uint256","name":"bnbvalue","type":"uint256"},{"indexed":false,"internalType":"uint256","name":"buy_num","type":"uint256"},{"indexed":false,"internalType":"enum fomo3d.group","name":"team","type":"uint8"},{"indexed":false,"internalType":"uint256","name":"rounds","type":"uint256"}],"name":"BnbBuy","type":"event"},{"anonymous":false,"inputs":[{"indexed":false,"internalType":"address","name":"account","type":"address"},{"indexed":false,"internalType":"uint256","name":"claimvalue","type":"uint256"}],"name":"Claim","type":"event"},{"anonymous":false,"inputs":[{"indexed":false,"internalType":"address","name":"account","type":"address"},{"indexed":false,"internalType":"uint256","name":"vaultvalue","type":"uint256"},{"indexed":false,"internalType":"uint256","name":"buy_num","type":"uint256"},{"indexed":false,"internalType":"enum fomo3d.group","name":"team","type":"uint8"},{"indexed":false,"internalType":"uint256","name":"rounds","type":"uint256"}],"name":"VaultBuy","type":"event"},{"inputs":[],"name":"CLAIM_TRANSACTION_TYPEHASH","outputs":[{"internalType":"bytes32","name":"","type":"bytes32"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"DOMAIN_SEPARATOR","outputs":[{"internalType":"bytes32","name":"","type":"bytes32"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"EIP712_DOMAIN_TYPEHASH","outputs":[{"internalType":"bytes32","name":"","type":"bytes32"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"VAULTBUY_TRANSACTION_TYPEHASH","outputs":[{"internalType":"bytes32","name":"","type":"bytes32"}],"stateMutability":"view","type":"function"},{"inputs":[{"internalType":"uint256","name":"_buy_num","type":"uint256"},{"internalType":"enum fomo3d.group","name":"_team","type":"uint8"},{"internalType":"uint64","name":"_rounds","type":"uint64"}],"name":"bnbBuy","outputs":[],"stateMutability":"payable","type":"function"},{"inputs":[{"internalType":"address 
payable","name":"_account","type":"address"},{"internalType":"uint256","name":"_number","type":"uint256"},{"internalType":"uint256","name":"nonce","type":"uint256"},{"internalType":"uint8","name":"v","type":"uint8"},{"internalType":"bytes32","name":"r","type":"bytes32"},{"internalType":"bytes32","name":"s","type":"bytes32"}],"name":"claim","outputs":[],"stateMutability":"nonpayable","type":"function"},{"inputs":[{"internalType":"uint256","name":"index_0","type":"uint256"}],"name":"end_time","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"key_final_price","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"key_increasing_price","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"key_init_price","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"stateMutability":"view","type":"function"},{"inputs":[{"internalType":"address","name":"account","type":"address"}],"name":"nonceOf","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"owner","outputs":[{"internalType":"address","name":"","type":"address"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"rounds","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"stateMutability":"view","type":"function"},{"inputs":[{"internalType":"uint256","name":"_time","type":"uint256"}],"name":"setActionTime","outputs":[],"stateMutability":"nonpayable","type":"function"},{"inputs":[{"internalType":"uint256","name":"index_0","type":"uint256"}],"name":"start_time","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"stateMutability":"view","type":"function"},{"inputs":[{"internalType":"address","name":"index_0","type":"address"},{"internalType":"uint256","name":"index_1","type":"uint256"}],"name":"team","outputs":[{"internalType":"enum fomo3d.group","name":"","type":"uint8"}],"stateMutability":"view","type":"function"},{"inputs":[{"internalType":"uint256","name":"_buy_num","type":"uint256"},{"internalType":"enum fomo3d.group","name":"_team","type":"uint8"},{"internalType":"uint256","name":"_rounds","type":"uint256"},{"internalType":"address","name":"_account","type":"address"},{"internalType":"uint256","name":"nonce","type":"uint256"},{"internalType":"uint8","name":"v","type":"uint8"},{"internalType":"bytes32","name":"r","type":"bytes32"},{"internalType":"bytes32","name":"s","type":"bytes32"}],"name":"vaultBuy","outputs":[],"stateMutability":"payable","type":"function"}]' # noqa: E501 (line-too-long)
)
# pylint: disable=too-many-lines
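# A minimal usage sketch for this wrapper (assumptions: `dos` is a configured
# MiliDoS instance with an unlocked operator account and `ADDRESS` is the
# deployed fomo3d contract address; both names are hypothetical placeholders):
#
#     game = fomo3d(dos, ADDRESS)
#     print(game.owner())               # read-only call, no gas spent
#     print(game.key_init_price())      # read-only call
#     game.CallContractWait(3)          # sleep 3s after fire-and-forget sends
#     game.bnb_buy(buy_num=1, team=0, rounds=game.rounds(), wei=10 ** 16)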
| 40.861063 | 4,952 | 0.611536 | 10,488 | 95,288 | 5.315313 | 0.031465 | 0.041617 | 0.035159 | 0.014638 | 0.905268 | 0.856405 | 0.836493 | 0.819004 | 0.800886 | 0.779594 | 0 | 0.007836 | 0.267452 | 95,288 | 2,331 | 4,953 | 40.878593 | 0.789336 | 0.11376 | 0 | 0.72906 | 1 | 0.000728 | 0.176121 | 0.086373 | 0 | 0 | 0 | 0 | 0.016752 | 1 | 0.107065 | false | 0.002185 | 0.013838 | 0.012382 | 0.229425 | 0.14858 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
8e67c8aa811db1226463cf8a01f6ba3e3a9a9a0d | 3,101 | py | Python | tests/test_plugin_loader.py | AverkinSergei/pyexcel-io | a611a69cf7c2fa75f226b7879aba61bcfdaceda1 | [
"BSD-3-Clause"
] | null | null | null | tests/test_plugin_loader.py | AverkinSergei/pyexcel-io | a611a69cf7c2fa75f226b7879aba61bcfdaceda1 | [
"BSD-3-Clause"
] | null | null | null | tests/test_plugin_loader.py | AverkinSergei/pyexcel-io | a611a69cf7c2fa75f226b7879aba61bcfdaceda1 | [
"BSD-3-Clause"
] | 1 | 2019-04-27T04:40:14.000Z | 2019-04-27T04:40:14.000Z | from mock import patch
from nose.tools import eq_
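# These tests patch pkgutil so plugin discovery can be exercised without real
# plugins installed: get_importer(...).toc emulates the table of contents of a
# PyInstaller frozen bundle, while iter_modules emulates a normal environment.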
@patch('pkgutil.get_importer')
def test_load_from_pyinstaller(pkgutil_get_importer):
sample_toc = set(['pyexcel_io', 'pyexcel_xls', 'blah'])
pkgutil_get_importer.return_value.toc = sample_toc
from pyexcel_io.manager import load_from_pyinstaller
module_names = load_from_pyinstaller('pyexcel_', 'path')
expected = ['pyexcel_io', 'pyexcel_xls']
eq_(sorted(list(module_names)), sorted(expected))
@patch('pkgutil.get_importer')
@patch('pkgutil.iter_modules')
@patch('pyexcel_io.manager.pre_register')
def test_load_plugins(pre_register,
pkgutil_iter_modules,
pkgutil_get_importer):
sample_toc = set(['pyexcel_io'])
pkgutil_get_importer.return_value.toc = sample_toc
pkgutil_iter_modules.return_value = [('not used', 'pyexcel_xls', True)]
from pyexcel_io.manager import load_plugins
load_plugins('pyexcel_', '.', ['pyexcel_io'])
plugin_meta = {
'file_types': ['xls', 'xlsx', 'xlsm'],
'submodule': 'xls',
'stream_type': 'binary'
}
module_name = 'pyexcel_xls'
pre_register.assert_called_with(plugin_meta, module_name)
@patch('pkgutil.get_importer')
@patch('pkgutil.iter_modules')
@patch('pyexcel_io.manager.pre_register')
def test_load_plugins_without_pyinstaller(pre_register,
pkgutil_iter_modules,
pkgutil_get_importer):
sample_toc = set()
pkgutil_get_importer.return_value.toc = sample_toc
pkgutil_iter_modules.return_value = [('not used', 'pyexcel_xls', True)]
from pyexcel_io.manager import load_plugins
load_plugins('pyexcel_', '.', ['pyexcel_io'])
plugin_meta = {
'file_types': ['xls', 'xlsx', 'xlsm'],
'submodule': 'xls',
'stream_type': 'binary'
}
module_name = 'pyexcel_xls'
pre_register.assert_called_with(plugin_meta, module_name)
@patch('pkgutil.get_importer')
@patch('pkgutil.iter_modules')
@patch('pyexcel_io.manager.pre_register')
def test_load_plugins_without_any_plugins(pre_register,
pkgutil_iter_modules,
pkgutil_get_importer):
sample_toc = set()
pkgutil_get_importer.return_value.toc = sample_toc
pkgutil_iter_modules.return_value = []
from pyexcel_io.manager import load_plugins
load_plugins('pyexcel_', '.', ['pyexcel_io'])
assert pre_register.called is False
@patch('pkgutil.get_importer')
@patch('pkgutil.iter_modules')
@patch('pyexcel_io.manager.pre_register')
def test_load_plugins_import_error(pre_register,
pkgutil_iter_modules,
pkgutil_get_importer):
sample_toc = set(['test_non_existent_module'])
pkgutil_get_importer.return_value.toc = sample_toc
pkgutil_iter_modules.return_value = [('not used', 'pyexcel_xls', False)]
from pyexcel_io.manager import load_plugins
load_plugins('test_', '.', ['pyexcel_io'])
assert pre_register.called is False
| 38.283951 | 76 | 0.674944 | 370 | 3,101 | 5.232432 | 0.159459 | 0.07438 | 0.139463 | 0.059401 | 0.856405 | 0.856405 | 0.840909 | 0.840909 | 0.759298 | 0.734504 | 0 | 0 | 0.212835 | 3,101 | 80 | 77 | 38.7625 | 0.793118 | 0 | 0 | 0.728571 | 0 | 0 | 0.208965 | 0.047727 | 0 | 0 | 0 | 0 | 0.057143 | 1 | 0.071429 | false | 0 | 0.328571 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
8e80267bfcf960d63e046371b543daabe23e5964 | 125 | py | Python | ravenclaw/num_rows.py | idin/ravenclaw | 17a9b4c92d662f72c114e832f38b39eb73707bc2 | [
"MIT"
] | 1 | 2019-05-04T07:32:21.000Z | 2019-05-04T07:32:21.000Z | ravenclaw/num_rows.py | idin/ravenclaw | 17a9b4c92d662f72c114e832f38b39eb73707bc2 | [
"MIT"
] | null | null | null | ravenclaw/num_rows.py | idin/ravenclaw | 17a9b4c92d662f72c114e832f38b39eb73707bc2 | [
"MIT"
] | 1 | 2020-03-28T21:54:11.000Z | 2020-03-28T21:54:11.000Z |
def nrows(data):
return data.shape[0]
def ncols(data):
return data.shape[1]
# aliases
num_rows = nrows
num_cols = ncols
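# A minimal usage sketch (assumption: any object exposing a 2-tuple `.shape`,
# e.g. a pandas DataFrame or a 2-D numpy array, works here):
#
#     import pandas as pd
#     df = pd.DataFrame({'a': [1, 2, 3], 'b': [4, 5, 6]})
#     assert nrows(df) == 3 and ncols(df) == 2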
| 11.363636 | 21 | 0.712 | 21 | 125 | 4.142857 | 0.571429 | 0.229885 | 0.321839 | 0.436782 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019231 | 0.168 | 125 | 10 | 22 | 12.5 | 0.817308 | 0.056 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
8eb0d46094b71e79b43b04e018cef0c099a0c7f6 | 78 | py | Python | excript/lista2/q6.py | victorers1/anotacoes_curso_python | c4ef56bcfc7e3baa3944fc2962e8217c6d720b0e | [
"MIT"
] | null | null | null | excript/lista2/q6.py | victorers1/anotacoes_curso_python | c4ef56bcfc7e3baa3944fc2962e8217c6d720b0e | [
"MIT"
] | null | null | null | excript/lista2/q6.py | victorers1/anotacoes_curso_python | c4ef56bcfc7e3baa3944fc2962e8217c6d720b0e | [
"MIT"
] | null | null | null | print("--------------------------------------------------") #lol
print(50*'-') | 39 | 64 | 0.192308 | 4 | 78 | 3.75 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026316 | 0.025641 | 78 | 2 | 65 | 39 | 0.171053 | 0.038462 | 0 | 0 | 0 | 0 | 0.68 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
8eba859ac03f0a1d73ade2a74f0785aff1162953 | 18,876 | py | Python | test/waveform_generator_test.py | vanessagraber/bilby | 80ee2d123a913d881f2a790b04e2939c46584d27 | [
"MIT"
] | null | null | null | test/waveform_generator_test.py | vanessagraber/bilby | 80ee2d123a913d881f2a790b04e2939c46584d27 | [
"MIT"
] | null | null | null | test/waveform_generator_test.py | vanessagraber/bilby | 80ee2d123a913d881f2a790b04e2939c46584d27 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
import unittest
import bilby
import numpy as np
import mock
from mock import MagicMock
def dummy_func_array_return_value(frequency_array, amplitude, mu, sigma, ra, dec, geocent_time, psi, **kwargs):
return amplitude + mu + frequency_array + sigma + ra + dec + geocent_time + psi
def dummy_func_dict_return_value(frequency_array, amplitude, mu, sigma, ra, dec, geocent_time, psi, **kwargs):
ht = {'plus': amplitude + mu + frequency_array + sigma + ra + dec + geocent_time + psi,
'cross': amplitude + mu + frequency_array + sigma + ra + dec + geocent_time + psi}
return ht
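# The two dummy functions above stand in for real source models: bilby's
# WaveformGenerator expects a frequency_domain_source_model returning a dict
# of polarisation arrays (here 'plus' and 'cross'); the array-returning
# variant presumably exercises the non-dict code path elsewhere in this file.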
class TestWaveformGeneratorInstantiationWithoutOptionalParameters(unittest.TestCase):
def setUp(self):
self.waveform_generator = \
bilby.gw.waveform_generator.WaveformGenerator(
1, 4096,
frequency_domain_source_model=dummy_func_dict_return_value)
self.simulation_parameters = dict(amplitude=1e-21, mu=100, sigma=1,
ra=1.375,
dec=-1.2108,
geocent_time=1126259642.413,
psi=2.659)
def tearDown(self):
del self.waveform_generator
del self.simulation_parameters
def test_repr(self):
expected = 'WaveformGenerator(duration={}, sampling_frequency={}, start_time={}, ' \
'frequency_domain_source_model={}, time_domain_source_model={}, ' \
'parameter_conversion={}, waveform_arguments={})'\
.format(self.waveform_generator.duration,
self.waveform_generator.sampling_frequency,
self.waveform_generator.start_time,
self.waveform_generator.frequency_domain_source_model.__name__,
self.waveform_generator.time_domain_source_model,
None,
self.waveform_generator.waveform_arguments)
self.assertEqual(expected, repr(self.waveform_generator))
def test_repr_with_time_domain_source_model(self):
self.waveform_generator = \
bilby.gw.waveform_generator.WaveformGenerator(1, 4096,
time_domain_source_model=dummy_func_dict_return_value)
expected = 'WaveformGenerator(duration={}, sampling_frequency={}, start_time={}, ' \
'frequency_domain_source_model={}, time_domain_source_model={}, ' \
'parameter_conversion={}, waveform_arguments={})'\
.format(self.waveform_generator.duration,
self.waveform_generator.sampling_frequency,
self.waveform_generator.start_time,
self.waveform_generator.frequency_domain_source_model,
self.waveform_generator.time_domain_source_model.__name__,
None,
self.waveform_generator.waveform_arguments)
self.assertEqual(expected, repr(self.waveform_generator))
def test_repr_with_param_conversion(self):
def conversion_func():
pass
self.waveform_generator.parameter_conversion = conversion_func
expected = 'WaveformGenerator(duration={}, sampling_frequency={}, start_time={}, ' \
'frequency_domain_source_model={}, time_domain_source_model={}, ' \
'parameter_conversion={}, waveform_arguments={})'\
.format(self.waveform_generator.duration,
self.waveform_generator.sampling_frequency,
self.waveform_generator.start_time,
self.waveform_generator.frequency_domain_source_model.__name__,
self.waveform_generator.time_domain_source_model,
conversion_func.__name__,
self.waveform_generator.waveform_arguments)
self.assertEqual(expected, repr(self.waveform_generator))
def test_duration(self):
self.assertEqual(self.waveform_generator.duration, 1)
def test_sampling_frequency(self):
self.assertEqual(self.waveform_generator.sampling_frequency, 4096)
def test_source_model(self):
self.assertEqual(self.waveform_generator.frequency_domain_source_model, dummy_func_dict_return_value)
def test_frequency_array_type(self):
self.assertIsInstance(self.waveform_generator.frequency_array, np.ndarray)
def test_time_array_type(self):
self.assertIsInstance(self.waveform_generator.time_array, np.ndarray)
def test_source_model_parameters(self):
self.waveform_generator.parameters = self.simulation_parameters.copy()
self.assertListEqual(sorted(list(self.waveform_generator.parameters.keys())),
sorted(list(self.simulation_parameters.keys())))
class TestWaveformArgumentsSetting(unittest.TestCase):
def setUp(self):
self.waveform_generator = \
bilby.gw.waveform_generator.WaveformGenerator(1, 4096,
frequency_domain_source_model=dummy_func_dict_return_value,
waveform_arguments=dict(test='test', arguments='arguments'))
def tearDown(self):
del self.waveform_generator
def test_waveform_arguments_init_setting(self):
self.assertDictEqual(self.waveform_generator.waveform_arguments,
dict(test='test', arguments='arguments'))
class TestSetters(unittest.TestCase):
def setUp(self):
self.waveform_generator = \
bilby.gw.waveform_generator.WaveformGenerator(1, 4096,
frequency_domain_source_model=dummy_func_dict_return_value)
self.simulation_parameters = dict(amplitude=1e-21, mu=100, sigma=1,
ra=1.375,
dec=-1.2108,
geocent_time=1126259642.413,
psi=2.659)
def tearDown(self):
del self.waveform_generator
del self.simulation_parameters
def test_parameter_setter_sets_expected_values_with_expected_keys(self):
self.waveform_generator.parameters = self.simulation_parameters.copy()
for key in self.simulation_parameters:
self.assertEqual(self.waveform_generator.parameters[key], self.simulation_parameters[key])
def test_parameter_setter_none_handling(self):
with self.assertRaises(TypeError):
self.waveform_generator.parameters = None
# self.assertListEqual(sorted(list(self.waveform_generator.parameters.keys())),
# sorted(list(self.simulation_parameters.keys())))
def test_frequency_array_setter(self):
new_frequency_array = np.arange(1, 100)
self.waveform_generator.frequency_array = new_frequency_array
self.assertTrue(np.array_equal(new_frequency_array, self.waveform_generator.frequency_array))
def test_time_array_setter(self):
new_time_array = np.arange(1, 100)
self.waveform_generator.time_array = new_time_array
self.assertTrue(np.array_equal(new_time_array, self.waveform_generator.time_array))
def test_parameters_set_from_frequency_domain_source_model(self):
self.waveform_generator.frequency_domain_source_model = dummy_func_dict_return_value
self.waveform_generator.parameters = self.simulation_parameters.copy()
self.assertListEqual(sorted(list(self.waveform_generator.parameters.keys())),
sorted(list(self.simulation_parameters.keys())))
def test_parameters_set_from_time_domain_source_model(self):
self.waveform_generator.time_domain_source_model = dummy_func_dict_return_value
self.waveform_generator.parameters = self.simulation_parameters.copy()
self.assertListEqual(sorted(list(self.waveform_generator.parameters.keys())),
sorted(list(self.simulation_parameters.keys())))
def test_set_parameter_conversion_at_init(self):
def conversion_func():
pass
self.waveform_generator = \
bilby.gw.waveform_generator.WaveformGenerator(1, 4096,
frequency_domain_source_model=dummy_func_dict_return_value,
parameter_conversion=conversion_func)
self.assertEqual(conversion_func, self.waveform_generator.parameter_conversion)
class TestFrequencyDomainStrainMethod(unittest.TestCase):
def setUp(self):
self.waveform_generator = \
bilby.gw.waveform_generator.WaveformGenerator(duration=1, sampling_frequency=4096,
frequency_domain_source_model=dummy_func_dict_return_value)
self.simulation_parameters = dict(amplitude=1e-2, mu=100, sigma=1,
ra=1.375,
dec=-1.2108,
geocent_time=1126259642.413,
psi=2.659)
def tearDown(self):
del self.waveform_generator
del self.simulation_parameters
def test_parameter_conversion_is_called(self):
self.waveform_generator.parameter_conversion = MagicMock(side_effect=KeyError('test'))
with self.assertRaises(KeyError):
self.waveform_generator.frequency_domain_strain(
parameters=self.simulation_parameters)
def test_frequency_domain_source_model_call(self):
expected = self.waveform_generator.frequency_domain_source_model(self.waveform_generator.frequency_array,
self.simulation_parameters['amplitude'],
self.simulation_parameters['mu'],
self.simulation_parameters['sigma'],
self.simulation_parameters['ra'],
self.simulation_parameters['dec'],
self.simulation_parameters['geocent_time'],
self.simulation_parameters['psi'])
actual = self.waveform_generator.frequency_domain_strain(
parameters=self.simulation_parameters)
self.assertTrue(np.array_equal(expected['plus'], actual['plus']))
self.assertTrue(np.array_equal(expected['cross'], actual['cross']))
def test_time_domain_source_model_call_with_ndarray(self):
self.waveform_generator.frequency_domain_source_model = None
self.waveform_generator.time_domain_source_model = dummy_func_array_return_value
def side_effect(value, value2):
return value
with mock.patch('bilby.core.utils.nfft') as m:
m.side_effect = side_effect
expected = self.waveform_generator.time_domain_strain(
parameters=self.simulation_parameters)
actual = self.waveform_generator.frequency_domain_strain(
parameters=self.simulation_parameters)
self.assertTrue(np.array_equal(expected, actual))
def test_time_domain_source_model_call_with_dict(self):
self.waveform_generator.frequency_domain_source_model = None
self.waveform_generator.time_domain_source_model = dummy_func_dict_return_value
def side_effect(value, value2):
return value, self.waveform_generator.frequency_array
with mock.patch('bilby.core.utils.nfft') as m:
m.side_effect = side_effect
expected = self.waveform_generator.time_domain_strain(
parameters=self.simulation_parameters)
actual = self.waveform_generator.frequency_domain_strain(
parameters=self.simulation_parameters)
self.assertTrue(np.array_equal(expected['plus'], actual['plus']))
self.assertTrue(np.array_equal(expected['cross'], actual['cross']))
def test_no_source_model_given(self):
self.waveform_generator.time_domain_source_model = None
self.waveform_generator.frequency_domain_source_model = None
with self.assertRaises(RuntimeError):
self.waveform_generator.frequency_domain_strain(
parameters=self.simulation_parameters)
def test_key_popping(self):
self.waveform_generator.parameter_conversion = MagicMock(return_value=(dict(amplitude=1e-21, mu=100, sigma=1,
ra=1.375, dec=-1.2108,
geocent_time=1126259642.413,
psi=2.659, c=None, d=None),
['c', 'd']))
try:
self.waveform_generator.frequency_domain_strain(
parameters=self.simulation_parameters)
except RuntimeError:
pass
self.assertListEqual(sorted(self.waveform_generator.parameters.keys()),
sorted(['amplitude', 'mu', 'sigma', 'ra', 'dec', 'geocent_time', 'psi']))
class TestTimeDomainStrainMethod(unittest.TestCase):
def setUp(self):
self.waveform_generator = \
bilby.gw.waveform_generator.WaveformGenerator(1, 4096,
time_domain_source_model=dummy_func_dict_return_value)
self.simulation_parameters = dict(amplitude=1e-21, mu=100, sigma=1,
ra=1.375,
dec=-1.2108,
geocent_time=1126259642.413,
psi=2.659)
def tearDown(self):
del self.waveform_generator
del self.simulation_parameters
def test_parameter_conversion_is_called(self):
self.waveform_generator.parameter_conversion = MagicMock(side_effect=KeyError('test'))
with self.assertRaises(KeyError):
self.waveform_generator.time_domain_strain(
parameters=self.simulation_parameters)
def test_time_domain_source_model_call(self):
expected = self.waveform_generator.time_domain_source_model(self.waveform_generator.time_array,
self.simulation_parameters['amplitude'],
self.simulation_parameters['mu'],
self.simulation_parameters['sigma'],
self.simulation_parameters['ra'],
self.simulation_parameters['dec'],
self.simulation_parameters['geocent_time'],
self.simulation_parameters['psi'])
actual = self.waveform_generator.time_domain_strain(
parameters=self.simulation_parameters)
self.assertTrue(np.array_equal(expected['plus'], actual['plus']))
self.assertTrue(np.array_equal(expected['cross'], actual['cross']))
def test_frequency_domain_source_model_call_with_ndarray(self):
self.waveform_generator.time_domain_source_model = None
self.waveform_generator.frequency_domain_source_model = dummy_func_array_return_value
def side_effect(value, value2):
return value
with mock.patch('bilby.core.utils.infft') as m:
m.side_effect = side_effect
expected = self.waveform_generator.frequency_domain_strain(
parameters=self.simulation_parameters)
actual = self.waveform_generator.time_domain_strain(
parameters=self.simulation_parameters)
self.assertTrue(np.array_equal(expected, actual))
def test_frequency_domain_source_model_call_with_dict(self):
self.waveform_generator.time_domain_source_model = None
self.waveform_generator.frequency_domain_source_model = dummy_func_dict_return_value
def side_effect(value, value2):
return value
with mock.patch('bilby.core.utils.infft') as m:
m.side_effect = side_effect
expected = self.waveform_generator.frequency_domain_strain(
parameters=self.simulation_parameters)
actual = self.waveform_generator.time_domain_strain(
parameters=self.simulation_parameters)
self.assertTrue(np.array_equal(expected['plus'], actual['plus']))
self.assertTrue(np.array_equal(expected['cross'], actual['cross']))
def test_no_source_model_given(self):
self.waveform_generator.time_domain_source_model = None
self.waveform_generator.frequency_domain_source_model = None
with self.assertRaises(RuntimeError):
self.waveform_generator.time_domain_strain(
parameters=self.simulation_parameters)
def test_key_popping(self):
self.waveform_generator.parameter_conversion = MagicMock(return_value=(dict(amplitude=1e-2,
mu=100,
sigma=1,
ra=1.375, dec=-1.2108,
geocent_time=1126259642.413,
psi=2.659, c=None, d=None),
['c', 'd']))
try:
self.waveform_generator.time_domain_strain(
parameters=self.simulation_parameters)
except RuntimeError:
pass
self.assertListEqual(sorted(self.waveform_generator.parameters.keys()),
sorted(['amplitude', 'mu', 'sigma', 'ra', 'dec', 'geocent_time', 'psi']))
if __name__ == '__main__':
unittest.main()
| 52.433333 | 118 | 0.598008 | 1,792 | 18,876 | 5.955915 | 0.071987 | 0.164059 | 0.188888 | 0.070271 | 0.90743 | 0.869577 | 0.860302 | 0.832381 | 0.801368 | 0.786939 | 0 | 0.019446 | 0.32438 | 18,876 | 359 | 119 | 52.579387 | 0.817455 | 0.007788 | 0 | 0.730375 | 0 | 0 | 0.047741 | 0.030065 | 0 | 0 | 0 | 0 | 0.112628 | 1 | 0.16041 | false | 0.013652 | 0.020478 | 0.017065 | 0.21843 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
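A distilled, self-contained sketch of the mock.patch-with-side_effect pattern that the tests in the record above rely on; it uses the standard-library unittest.mock rather than the standalone mock backport the tests import, and the patched target (os.getcwd) is purely illustrative and unrelated to bilby.

import os
from unittest import mock

with mock.patch('os.getcwd') as m:
    m.side_effect = lambda: '/fake/dir'  # every call inside the block is routed through side_effect
    assert os.getcwd() == '/fake/dir'

print(os.getcwd())  # the real function is restored once the context manager exits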
8ed6c9d09d5e62307f8209dc4d444496f16d85e1 | 1,529 | py | Python | example/FedMAX/models.py | wnma3mz/flearn | df3c837bb164ec81736a3a64aaa85f574e7f67fa | [
"Apache-2.0"
] | 6 | 2021-11-11T15:09:28.000Z | 2022-03-16T02:15:06.000Z | example/FedMAX/models.py | wnma3mz/flearn | df3c837bb164ec81736a3a64aaa85f574e7f67fa | [
"Apache-2.0"
] | null | null | null | example/FedMAX/models.py | wnma3mz/flearn | df3c837bb164ec81736a3a64aaa85f574e7f67fa | [
"Apache-2.0"
] | null | null | null | import torch.nn.functional as F
from torch import nn
class LeNet5(nn.Module):
def __init__(self, num_classes):
super(LeNet5, self).__init__()
self.feature_layers = nn.Sequential(
nn.Conv2d(1, 6, 5),
nn.ReLU(),
nn.MaxPool2d(2, 2),
nn.Conv2d(6, 16, 5),
nn.ReLU(),
nn.MaxPool2d(2, 2),
)
        self.fc1 = nn.Linear(256, 120)  # 16 * 4 * 4 features after the conv/pool stack on a 1x28x28 (MNIST-sized) input
self.fc2 = nn.Linear(120, 84)
self.fc3 = nn.Linear(84, num_classes)
def forward(self, x, return_act=False):
x = self.feature_layers(x)
x = x.view(x.size()[0], -1)
x = F.relu(self.fc1(x))
act = F.relu(self.fc2(x))
x = self.fc3(act)
if return_act:
return x, act
return x
class LeNet5Cifar10(nn.Module):
def __init__(self, num_classes):
super(LeNet5Cifar10, self).__init__()
self.feature_layers = nn.Sequential(
nn.Conv2d(3, 6, 5),
nn.ReLU(),
nn.MaxPool2d(2, 2),
nn.Conv2d(6, 16, 5),
nn.ReLU(),
nn.MaxPool2d(2, 2),
)
        self.fc1 = nn.Linear(400, 120)  # 16 * 5 * 5 features after the conv/pool stack on a 3x32x32 (CIFAR-sized) input
self.fc2 = nn.Linear(120, 84)
self.fc3 = nn.Linear(84, num_classes)
def forward(self, x, return_act=False):
x = self.feature_layers(x)
x = x.view(x.size()[0], -1)
x = F.relu(self.fc1(x))
act = F.relu(self.fc2(x))
x = self.fc3(act)
if return_act:
return x, act
return x
| 26.824561 | 45 | 0.509483 | 214 | 1,529 | 3.509346 | 0.214953 | 0.063915 | 0.090546 | 0.047936 | 0.868176 | 0.868176 | 0.868176 | 0.868176 | 0.77763 | 0.65779 | 0 | 0.080241 | 0.34794 | 1,529 | 56 | 46 | 27.303571 | 0.673019 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.041667 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
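A minimal smoke test for the two models in the record above, assuming PyTorch is installed and the classes are in scope; the input shapes are MNIST-like for LeNet5 and CIFAR-10-like for LeNet5Cifar10, matching their fc1 sizes of 256 and 400.

import torch

model = LeNet5(num_classes=10)
x = torch.randn(4, 1, 28, 28)            # batch of 4 single-channel 28x28 images
logits, act = model(x, return_act=True)
print(logits.shape)                      # torch.Size([4, 10])
print(act.shape)                         # torch.Size([4, 84]) -- fc2 activations

cifar_model = LeNet5Cifar10(num_classes=10)
print(cifar_model(torch.randn(4, 3, 32, 32)).shape)  # torch.Size([4, 10])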
d974d87b5a10061fb9633cda0414d49a6a42cc6a | 20,879 | py | Python | accelbyte_py_sdk/api/sessionbrowser/wrappers/_session.py | AccelByte/accelbyte-python-sdk | dcd311fad111c59da828278975340fb92e0f26f7 | [
"MIT"
] | null | null | null | accelbyte_py_sdk/api/sessionbrowser/wrappers/_session.py | AccelByte/accelbyte-python-sdk | dcd311fad111c59da828278975340fb92e0f26f7 | [
"MIT"
] | 1 | 2021-10-13T03:46:58.000Z | 2021-10-13T03:46:58.000Z | accelbyte_py_sdk/api/sessionbrowser/wrappers/_session.py | AccelByte/accelbyte-python-sdk | dcd311fad111c59da828278975340fb92e0f26f7 | [
"MIT"
] | null | null | null | # Copyright (c) 2021 AccelByte Inc. All Rights Reserved.
# This is licensed software from AccelByte Inc, for limitations
# and restrictions contact your company contract manager.
#
# Code generated. DO NOT EDIT!
# template file: justice_py_sdk_codegen/__main__.py
# pylint: disable=duplicate-code
# pylint: disable=line-too-long
# pylint: disable=missing-function-docstring
# pylint: disable=missing-function-docstring
# pylint: disable=missing-module-docstring
# pylint: disable=too-many-arguments
# pylint: disable=too-many-branches
# pylint: disable=too-many-instance-attributes
# pylint: disable=too-many-lines
# pylint: disable=too-many-locals
# pylint: disable=too-many-public-methods
# pylint: disable=too-many-return-statements
# pylint: disable=too-many-statements
# pylint: disable=unused-import
from typing import Any, Dict, List, Optional, Tuple, Union
from ....core import HeaderStr
from ....core import get_namespace as get_services_namespace
from ....core import run_request
from ....core import run_request_async
from ....core import same_doc_as
from ..models import ModelsActiveCustomGameResponse
from ..models import ModelsActiveMatchmakingGameResponse
from ..models import ModelsAddPlayerRequest
from ..models import ModelsAddPlayerResponse
from ..models import ModelsAdminSessionResponse
from ..models import ModelsCountActiveSessionResponse
from ..models import ModelsCreateSessionRequest
from ..models import ModelsJoinGameSessionRequest
from ..models import ModelsRecentPlayerQueryResponse
from ..models import ModelsSessionByUserIDsResponse
from ..models import ModelsSessionQueryResponse
from ..models import ModelsSessionResponse
from ..models import ModelsUpdateSessionRequest
from ..models import ModelsUpdateSettingsRequest
from ..models import ResponseError
from ..models import RestapiErrorResponseV2
from ..operations.session import AddPlayerToSession
from ..operations.session import AdminGetSession
from ..operations.session import CreateSession
from ..operations.session import DeleteSession
from ..operations.session import DeleteSessionLocalDS
from ..operations.session import GetActiveCustomGameSessions
from ..operations.session import GetActiveMatchmakingGameSessions
from ..operations.session import GetRecentPlayer
from ..operations.session import GetSession
from ..operations.session import GetSessionByUserIDs
from ..operations.session import GetTotalActiveSession
from ..operations.session import JoinSession
from ..operations.session import QuerySession
from ..operations.session import RemovePlayerFromSession
from ..operations.session import UpdateSession
from ..operations.session import UpdateSettings
@same_doc_as(AddPlayerToSession)
def add_player_to_session(body: ModelsAddPlayerRequest, session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = AddPlayerToSession.create(
body=body,
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(AddPlayerToSession)
async def add_player_to_session_async(body: ModelsAddPlayerRequest, session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = AddPlayerToSession.create(
body=body,
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(AdminGetSession)
def admin_get_session(session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = AdminGetSession.create(
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(AdminGetSession)
async def admin_get_session_async(session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = AdminGetSession.create(
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(CreateSession)
def create_session(body: ModelsCreateSessionRequest, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = CreateSession.create(
body=body,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(CreateSession)
async def create_session_async(body: ModelsCreateSessionRequest, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = CreateSession.create(
body=body,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(DeleteSession)
def delete_session(session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = DeleteSession.create(
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(DeleteSession)
async def delete_session_async(session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = DeleteSession.create(
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(DeleteSessionLocalDS)
def delete_session_local_ds(session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = DeleteSessionLocalDS.create(
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(DeleteSessionLocalDS)
async def delete_session_local_ds_async(session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = DeleteSessionLocalDS.create(
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetActiveCustomGameSessions)
def get_active_custom_game_sessions(server_region: Optional[str] = None, session_id: Optional[str] = None, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetActiveCustomGameSessions.create(
server_region=server_region,
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetActiveCustomGameSessions)
async def get_active_custom_game_sessions_async(server_region: Optional[str] = None, session_id: Optional[str] = None, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetActiveCustomGameSessions.create(
server_region=server_region,
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetActiveMatchmakingGameSessions)
def get_active_matchmaking_game_sessions(match_id: Optional[str] = None, server_region: Optional[str] = None, session_id: Optional[str] = None, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetActiveMatchmakingGameSessions.create(
match_id=match_id,
server_region=server_region,
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetActiveMatchmakingGameSessions)
async def get_active_matchmaking_game_sessions_async(match_id: Optional[str] = None, server_region: Optional[str] = None, session_id: Optional[str] = None, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetActiveMatchmakingGameSessions.create(
match_id=match_id,
server_region=server_region,
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetRecentPlayer)
def get_recent_player(user_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetRecentPlayer.create(
user_id=user_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetRecentPlayer)
async def get_recent_player_async(user_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetRecentPlayer.create(
user_id=user_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetSession)
def get_session(session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetSession.create(
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetSession)
async def get_session_async(session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetSession.create(
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetSessionByUserIDs)
def get_session_by_user_i_ds(user_ids: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetSessionByUserIDs.create(
user_ids=user_ids,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetSessionByUserIDs)
async def get_session_by_user_i_ds_async(user_ids: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetSessionByUserIDs.create(
user_ids=user_ids,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetTotalActiveSession)
def get_total_active_session(session_type: Optional[str] = None, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetTotalActiveSession.create(
session_type=session_type,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(GetTotalActiveSession)
async def get_total_active_session_async(session_type: Optional[str] = None, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = GetTotalActiveSession.create(
session_type=session_type,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(JoinSession)
def join_session(body: ModelsJoinGameSessionRequest, session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = JoinSession.create(
body=body,
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(JoinSession)
async def join_session_async(body: ModelsJoinGameSessionRequest, session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = JoinSession.create(
body=body,
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(QuerySession)
def query_session(session_type: str, game_mode: Optional[str] = None, game_version: Optional[str] = None, joinable: Optional[str] = None, limit: Optional[int] = None, match_exist: Optional[str] = None, match_id: Optional[str] = None, offset: Optional[int] = None, server_status: Optional[str] = None, user_id: Optional[str] = None, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = QuerySession.create(
session_type=session_type,
game_mode=game_mode,
game_version=game_version,
joinable=joinable,
limit=limit,
match_exist=match_exist,
match_id=match_id,
offset=offset,
server_status=server_status,
user_id=user_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(QuerySession)
async def query_session_async(session_type: str, game_mode: Optional[str] = None, game_version: Optional[str] = None, joinable: Optional[str] = None, limit: Optional[int] = None, match_exist: Optional[str] = None, match_id: Optional[str] = None, offset: Optional[int] = None, server_status: Optional[str] = None, user_id: Optional[str] = None, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = QuerySession.create(
session_type=session_type,
game_mode=game_mode,
game_version=game_version,
joinable=joinable,
limit=limit,
match_exist=match_exist,
match_id=match_id,
offset=offset,
server_status=server_status,
user_id=user_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(RemovePlayerFromSession)
def remove_player_from_session(session_id: str, user_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = RemovePlayerFromSession.create(
session_id=session_id,
user_id=user_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(RemovePlayerFromSession)
async def remove_player_from_session_async(session_id: str, user_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = RemovePlayerFromSession.create(
session_id=session_id,
user_id=user_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(UpdateSession)
def update_session(body: ModelsUpdateSessionRequest, session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = UpdateSession.create(
body=body,
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(UpdateSession)
async def update_session_async(body: ModelsUpdateSessionRequest, session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = UpdateSession.create(
body=body,
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(UpdateSettings)
def update_settings(body: ModelsUpdateSettingsRequest, session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = UpdateSettings.create(
body=body,
session_id=session_id,
namespace=namespace,
)
return run_request(request, additional_headers=x_additional_headers, **kwargs)
@same_doc_as(UpdateSettings)
async def update_settings_async(body: ModelsUpdateSettingsRequest, session_id: str, namespace: Optional[str] = None, x_additional_headers: Optional[Dict[str, str]] = None, **kwargs):
if namespace is None:
namespace, error = get_services_namespace()
if error:
return None, error
request = UpdateSettings.create(
body=body,
session_id=session_id,
namespace=namespace,
)
return await run_request_async(request, additional_headers=x_additional_headers, **kwargs)
| 40.541748 | 442 | 0.725131 | 2,437 | 20,879 | 5.970866 | 0.064834 | 0.112157 | 0.07917 | 0.05278 | 0.829909 | 0.813346 | 0.804412 | 0.801388 | 0.794928 | 0.794928 | 0 | 0.000294 | 0.185833 | 20,879 | 514 | 443 | 40.620623 | 0.855697 | 0.036688 | 0 | 0.759434 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037736 | false | 0 | 0.089623 | 0 | 0.278302 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
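Every wrapper in the record above repeats the same namespace-fallback idiom; the following self-contained sketch isolates that idiom, with get_default_namespace and the wrapper body as stand-ins for the SDK's get_services_namespace and run_request, not actual SDK code.

from typing import Optional, Tuple

def get_default_namespace() -> Tuple[Optional[str], Optional[str]]:
    # stand-in for get_services_namespace(); returns a (namespace, error) pair
    return 'my-namespace', None

def wrapper(namespace: Optional[str] = None):
    if namespace is None:
        namespace, error = get_default_namespace()
        if error:
            return None, error
    return f'would run the request in namespace {namespace}', None

print(wrapper())           # falls back to the configured default namespace
print(wrapper('game-ns'))  # an explicitly passed namespace wins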
8de914540254279ae17efb583ea62edc2ac27f2f | 402 | py | Python | tests/rmdir_test/rmdir_test.py | usnistgov/CDE | 05137888a8ad67b0796814170ba61deef51bec03 | [
"BSD-3-Clause"
] | 4 | 2020-07-28T19:11:07.000Z | 2021-09-24T07:00:39.000Z | tests/rmdir_test/rmdir_test.py | usnistgov/CDE | 05137888a8ad67b0796814170ba61deef51bec03 | [
"BSD-3-Clause"
] | null | null | null | tests/rmdir_test/rmdir_test.py | usnistgov/CDE | 05137888a8ad67b0796814170ba61deef51bec03 | [
"BSD-3-Clause"
] | 2 | 2021-05-13T18:32:27.000Z | 2021-11-15T09:07:33.000Z | import os
if os.path.isdir('/home/pgbovine/guinea-pig-dir/guinea-pig-subdir'):
os.rmdir('/home/pgbovine/guinea-pig-dir/guinea-pig-subdir')
if os.path.isdir('/home/pgbovine/guinea-pig-dir'):
os.rmdir('/home/pgbovine/guinea-pig-dir')
os.mkdir('/home/pgbovine/guinea-pig-dir')
os.mkdir('/home/pgbovine/guinea-pig-dir/guinea-pig-subdir')
os.rmdir('/home/pgbovine/guinea-pig-dir/guinea-pig-subdir')
| 30.923077 | 68 | 0.738806 | 67 | 402 | 4.432836 | 0.19403 | 0.333333 | 0.424242 | 0.494949 | 0.973064 | 0.973064 | 0.973064 | 0.949495 | 0.949495 | 0.781145 | 0 | 0 | 0.047264 | 402 | 12 | 69 | 33.5 | 0.775457 | 0 | 0 | 0.25 | 0 | 0 | 0.685786 | 0.685786 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
a507b978e970f152cbdbf5625655eda0ab69df58 | 4,456 | py | Python | pyaz/acr/scope_map/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/acr/scope_map/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/acr/scope_map/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | 1 | 2022-02-03T09:12:01.000Z | 2022-02-03T09:12:01.000Z | from ... pyaz_utils import _call_az
def create(name, registry, description=None, gateway=None, repository=None, resource_group=None):
'''
Create a scope map for an Azure Container Registry.
Required Parameters:
- name -- The name of the scope map.
- registry -- The name of the container registry. You can configure the default registry name using `az configure --defaults acr=<registry name>`
Optional Parameters:
- description -- Description for the scope map. Maximum 256 characters are allowed.
- gateway -- gateway permissions. Use the format "--gateway GATEWAY [ACTION1 ACTION2 ...]" per flag. Valid actions are {'message/write', 'config/read', 'message/read', 'config/write'}
- repository -- repository permissions. Use the format "--repository REPO [ACTION1 ACTION2 ...]" per flag. Valid actions are {'metadata/read', 'metadata/write', 'content/write', 'content/delete', 'content/read'}
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
'''
return _call_az("az acr scope-map create", locals())
def delete(name, registry, resource_group=None, yes=None):
'''
Delete a scope map for an Azure Container Registry.
Required Parameters:
- name -- The name of the scope map.
- registry -- The name of the container registry. You can configure the default registry name using `az configure --defaults acr=<registry name>`
Optional Parameters:
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
- yes -- Do not prompt for confirmation.
'''
return _call_az("az acr scope-map delete", locals())
def update(name, registry, add_gateway=None, add_repository=None, description=None, remove_gateway=None, remove_repository=None, resource_group=None):
'''
Update a scope map for an Azure Container Registry.
Required Parameters:
- name -- The name of the scope map.
- registry -- The name of the container registry. You can configure the default registry name using `az configure --defaults acr=<registry name>`
Optional Parameters:
- add_gateway -- gateway permissions to be added. Use the format "--add-gateway GATEWAY [ACTION1 ACTION2 ...]" per flag. Valid actions are {'message/write', 'config/read', 'message/read', 'config/write'}
- add_repository -- repository permissions to be added. Use the format "--add-repository REPO [ACTION1 ACTION2 ...]" per flag. Valid actions are {'metadata/read', 'metadata/write', 'content/write', 'content/delete', 'content/read'}
- description -- Description for the scope map. Maximum 256 characters are allowed.
- remove_gateway -- gateway permissions to be removed. Use the format "--remove-gateway GATEWAY [ACTION1 ACTION2 ...]" per flag. Valid actions are {'message/write', 'config/read', 'message/read', 'config/write'}
    - remove_repository -- repository permissions to be removed. Use the format "--remove-repository REPO [ACTION1 ACTION2 ...]" per flag. Valid actions are {'metadata/read', 'metadata/write', 'content/write', 'content/delete', 'content/read'}
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
'''
return _call_az("az acr scope-map update", locals())
def show(name, registry, resource_group=None):
'''
Show details and attributes of a scope map for an Azure Container Registry.
Required Parameters:
- name -- The name of the scope map.
- registry -- The name of the container registry. You can configure the default registry name using `az configure --defaults acr=<registry name>`
Optional Parameters:
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
'''
return _call_az("az acr scope-map show", locals())
def list(registry, resource_group=None):
'''
List all scope maps for an Azure Container Registry.
Required Parameters:
- registry -- The name of the container registry. You can configure the default registry name using `az configure --defaults acr=<registry name>`
Optional Parameters:
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
'''
return _call_az("az acr scope-map list", locals())
| 55.7 | 243 | 0.71544 | 591 | 4,456 | 5.341794 | 0.133672 | 0.061768 | 0.047513 | 0.057016 | 0.841622 | 0.794742 | 0.794742 | 0.772569 | 0.725055 | 0.725055 | 0 | 0.004911 | 0.177513 | 4,456 | 79 | 244 | 56.405063 | 0.85648 | 0.785458 | 0 | 0 | 0 | 0 | 0.148594 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.454545 | false | 0 | 0.090909 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
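A hedged usage sketch for the wrappers in the record above; it assumes the Azure CLI is installed and authenticated, that the named registry and resource group exist, and that repository permissions are passed as a list of flag-formatted strings -- an assumption inferred from the docstrings, not verified against pyaz.

# Hypothetical call; the argument values and the list-of-strings format for
# 'repository' are assumptions, not documented pyaz behavior.
result = create(
    name='my-scope-map',
    registry='myregistry',
    description='read-only access to one repository',
    repository=['myrepo content/read'],
    resource_group='my-rg',
)
print(result)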
eb8ca9d88465220d5573a36f79e52694bc3e4cb5 | 5,247 | py | Python | mermaid/forward_models_wrap.py | HastingsGreer/mermaid | bd13c5fc427eb8cd9054973a8eaaeb302078182d | [
"Apache-2.0"
] | 120 | 2019-10-29T23:53:02.000Z | 2022-03-30T02:59:58.000Z | mermaid/forward_models_wrap.py | AlexanderChristgau/mermaid | ba07883cc3cb5982e4655048a434b4495cb49c6d | [
"Apache-2.0"
] | 10 | 2019-11-05T09:28:35.000Z | 2022-01-09T19:12:51.000Z | mermaid/forward_models_wrap.py | AlexanderChristgau/mermaid | ba07883cc3cb5982e4655048a434b4495cb49c6d | [
"Apache-2.0"
] | 19 | 2019-11-10T13:34:39.000Z | 2022-03-13T20:30:10.000Z |
from __future__ import print_function
from __future__ import absolute_import
import torch
import torch.nn as nn
import numpy as np
class ODEWrapFunc(nn.Module):
"""
    a wrapper for tensor-based torchdiffeq input
"""
def __init__(self, nested_class, has_combined_input=False, pars=None, variables_from_optimizer=None, extra_var=None, dim_info=None):
"""
:param nested_class: the model to be integrated
        :param has_combined_input: True if the model takes a combined input in x (e.g. the EPDiff* equations); otherwise the model takes two separate inputs x and u (e.g. advect*)
:param pars: ParameterDict, settings passed to integrator
:param variables_from_optimizer: allows passing variables (as a dict from the optimizer; e.g., the current iteration)
:param extra_var: extra variable
        :param dim_info: the input x can be a tensor of several variables concatenated along the channel dimension; dim_info is a list giving the dimension of each variable
"""
super(ODEWrapFunc, self).__init__()
self.nested_class = nested_class
"""the model to be integrated"""
self.pars = pars
"""ParameterDict, settings passed to integrator"""
self.variables_from_optimizer = variables_from_optimizer
"""allows passing variables (as a dict from the optimizer; e.g., the current iteration)"""
self.extra_var = extra_var
"""extra variable"""
self.has_combined_input = has_combined_input
"""the model has combined input in x e.g. EPDiff* equation, otherwise, model has individual input e.g. advect* , has x,u two inputs"""
self.dim_info = dim_info
"""the input x can be a tensor concatenated by several variables along channel, dim_info is a list indicates the dim of each variable"""
self.opt_param = None
def set_dim_info(self, dim_info):
self.dim_info = [0] + list(np.cumsum(dim_info))
def set_opt_param(self, opt_param):
self.opt_param = opt_param
def set_debug_mode_on(self):
self.nested_class.debug_mode_on = True
def factor_input(self, y):
x = [y[:, self.dim_info[ind]:self.dim_info[ind + 1], ...] for ind in range(len(self.dim_info)-1)]
if not self.has_combined_input:
u = x[0]
x = x[1:]
else:
u = None
return u, x
@staticmethod
def factor_res(u, res):
if u is not None:
res = torch.cat((torch.zeros_like(u), *res), 1)
else:
res = torch.cat(res, 1)
return res
    def forward(self, t, y):
u, x = self.factor_input(y)
res = self.nested_class.f(t, x, u, pars=self.pars, variables_from_optimizer=self.variables_from_optimizer)
res = self.factor_res(u, res)
return res
class ODEWrapFunc_tuple(nn.Module):
"""
    a wrapper for tuple-based torchdiffeq input
"""
def __init__(self, nested_class, has_combined_input=False, pars=None, variables_from_optimizer=None, extra_var=None, dim_info=None):
"""
:param nested_class: the model to be integrated
        :param has_combined_input: True if the model takes a combined input in x (e.g. the EPDiff* equations); otherwise the model takes two separate inputs x and u (e.g. advect*)
:param pars: ParameterDict, settings passed to integrator
:param variables_from_optimizer: allows passing variables (as a dict from the optimizer; e.g., the current iteration)
:param extra_var: extra variable
        :param dim_info: not used in the tuple version
"""
super(ODEWrapFunc_tuple, self).__init__()
self.nested_class = nested_class
""" the model to be integrated"""
self.pars = pars
"""ParameterDict, settings passed to integrator"""
self.variables_from_optimizer = variables_from_optimizer
""" allows passing variables (as a dict from the optimizer; e.g., the current iteration)"""
self.extra_var = extra_var
"""extra variable"""
self.has_combined_input = has_combined_input
""" the model has combined input in x e.g. EPDiff* equation, otherwise, model has individual input e.g. advect* , has x,u two inputs"""
self.dim_info = dim_info
"""not use in tuple version"""
self.opt_param = None
def set_dim_info(self, dim_info):
self.dim_info = [0] + list(np.cumsum(dim_info))
def set_opt_param(self, opt_param):
self.opt_param = opt_param
def set_debug_mode_on(self):
self.nested_class.debug_mode_on = True
def factor_input(self, y):
if not self.has_combined_input:
u = y[0]
            x = list(y[1:])
        else:
            x = list(y)
            u = None
return u, x
@staticmethod
def factor_res(u, res):
if u is not None:
            zero_grad = torch.zeros_like(u)
            zero_grad.requires_grad = res[0].requires_grad
return (zero_grad, *res)
else:
return tuple(res)
    def forward(self, t, y):
u, x = self.factor_input(y)
res = self.nested_class.f(t, x, u, pars=self.pars, variables_from_optimizer=self.variables_from_optimizer)
res = self.factor_res(u, res)
return res | 40.053435 | 163 | 0.648752 | 750 | 5,247 | 4.336 | 0.145333 | 0.045203 | 0.068881 | 0.02337 | 0.861624 | 0.861624 | 0.861624 | 0.833333 | 0.833333 | 0.833333 | 0 | 0.002829 | 0.259005 | 5,247 | 131 | 164 | 40.053435 | 0.833591 | 0.209072 | 0 | 0.730769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.179487 | false | 0 | 0.064103 | 0 | 0.358974 | 0.012821 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
eb9c9989a876f603324a5ebafaf853c1816af9df | 75 | py | Python | app/eip/views/__init__.py | TennaGraph/TennaGraph | 002998d94300ee67168f1a8164c0e6bc86836e1f | [
"Apache-2.0"
] | 7 | 2018-11-13T17:39:15.000Z | 2019-03-27T04:55:24.000Z | app/eip/views/__init__.py | TennaGraph/TennaGraph | 002998d94300ee67168f1a8164c0e6bc86836e1f | [
"Apache-2.0"
] | 72 | 2018-11-09T14:20:25.000Z | 2020-06-05T19:28:19.000Z | app/eip/views/__init__.py | TennaGraph/TennaGraph | 002998d94300ee67168f1a8164c0e6bc86836e1f | [
"Apache-2.0"
] | 3 | 2018-11-19T19:10:39.000Z | 2019-08-23T20:52:23.000Z | from .eip_api_view import EIPAPIView
from .eips_api_view import EIPsAPIView | 37.5 | 38 | 0.88 | 12 | 75 | 5.166667 | 0.666667 | 0.225806 | 0.419355 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093333 | 75 | 2 | 38 | 37.5 | 0.911765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ebc9a0a6e38a1aa3e80733d45cbebfb27ecef7fe | 118 | py | Python | stable_baselines3/offpac/__init__.py | Darklanx/stable-baselines3 | 084f5ebfc724543de485249b7a872637b29b472a | [
"MIT"
] | null | null | null | stable_baselines3/offpac/__init__.py | Darklanx/stable-baselines3 | 084f5ebfc724543de485249b7a872637b29b472a | [
"MIT"
] | null | null | null | stable_baselines3/offpac/__init__.py | Darklanx/stable-baselines3 | 084f5ebfc724543de485249b7a872637b29b472a | [
"MIT"
] | null | null | null | from stable_baselines3.offpac.offpac import OffPAC
from stable_baselines3.offpac.policies import CnnPolicy, MlpPolicy
| 39.333333 | 66 | 0.881356 | 15 | 118 | 6.8 | 0.533333 | 0.196078 | 0.392157 | 0.509804 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018349 | 0.076271 | 118 | 2 | 67 | 59 | 0.917431 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
cce6cbba9c0de29d6be4e49fe814f8fc72a03ab9 | 2,405 | py | Python | tests/test_journal.py | rafacastillol/ledgeroni | 4a3df2e838a604481dd8b5472cdaba35ec9a4fb6 | [
"MIT"
] | null | null | null | tests/test_journal.py | rafacastillol/ledgeroni | 4a3df2e838a604481dd8b5472cdaba35ec9a4fb6 | [
"MIT"
] | null | null | null | tests/test_journal.py | rafacastillol/ledgeroni | 4a3df2e838a604481dd8b5472cdaba35ec9a4fb6 | [
"MIT"
] | null | null | null | import re
from datetime import datetime
import arrow
from ledgeroni.journal import Journal
from ledgeroni.query import And, Or, Not, RegexQuery, PayeeQuery
from ledgeroni.types import Transaction, Posting, Commodity
def test_add_transaction():
journal = Journal()
trans = Transaction(
date=arrow.get(datetime(2013, 2, 20)),
description='Purchased reddit gold for the year')
trans.add_posting(Posting(
account=('Asset', 'Bitcoin Wallet'),
amounts={None: -10.0}))
trans.add_posting(Posting(
account=('Expense', 'Web Services', 'Reddit'),
amounts=None))
journal.add_transaction(trans)
assert trans in journal.transactions
class TestTransactionsMatching:
def test_filter_pass(self):
journal = Journal()
trans = Transaction(
date=arrow.get(datetime(2013, 2, 20)),
description='Purchased reddit gold for the year')
trans.add_posting(Posting(
account=('Asset', 'Bitcoin Wallet'),
amounts={None: -10.0}))
trans.add_posting(Posting(
account=('Expense', 'Web Services', 'Reddit'),
amounts=None))
journal.add_transaction(trans)
assert trans in journal.transactions_matching(
query=RegexQuery(re.compile('Reddit')))
    def test_filter_fail(self):
journal = Journal()
trans = Transaction(
date=arrow.get(datetime(2013, 2, 20)),
description='Purchased reddit gold for the year')
trans.add_posting(Posting(
account=('Asset', 'Bitcoin Wallet'),
amounts={None: -10.0}))
trans.add_posting(Posting(
account=('Expense', 'Web Services', 'Digg'),
amounts=None))
journal.add_transaction(trans)
assert trans not in journal.transactions_matching(
query=RegexQuery(re.compile('Reddit')))
| 29.691358 | 64 | 0.629106 | 262 | 2,405 | 5.69084 | 0.209924 | 0.042924 | 0.080483 | 0.118042 | 0.85446 | 0.85446 | 0.85446 | 0.85446 | 0.822267 | 0.75721 | 0 | 0.022284 | 0.253638 | 2,405 | 80 | 65 | 30.0625 | 0.808357 | 0 | 0 | 0.836066 | 0 | 0 | 0.133888 | 0 | 0 | 0 | 0 | 0 | 0.065574 | 1 | 0.065574 | false | 0.032787 | 0.098361 | 0 | 0.180328 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
690530afbef551b51c7275aa6c1b0d99447441e6 | 191 | py | Python | cobrame/__init__.py | EdwardCatoiu/cobrame | 64630f6dd78bef3bb5024b1a4b57e61ddbc01892 | [
"MIT"
] | null | null | null | cobrame/__init__.py | EdwardCatoiu/cobrame | 64630f6dd78bef3bb5024b1a4b57e61ddbc01892 | [
"MIT"
] | null | null | null | cobrame/__init__.py | EdwardCatoiu/cobrame | 64630f6dd78bef3bb5024b1a4b57e61ddbc01892 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
from cobrame.core.Components import *
from cobrame.core.MEModel import *
from cobrame.core.MEReactions import *
from cobrame.core.ProcessData import *
| 27.285714 | 38 | 0.827225 | 25 | 191 | 6.12 | 0.4 | 0.261438 | 0.444444 | 0.54902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109948 | 191 | 6 | 39 | 31.833333 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
690dfdafb911e86223acacd50c83f101ba67e568 | 14,618 | py | Python | security/backends/logging/writer.py | druids/django-security | bca889ff1a58378a038a08ca365162d9e3ef3fbf | [
"MIT"
] | 9 | 2019-03-12T12:31:20.000Z | 2021-01-22T13:31:36.000Z | security/backends/logging/writer.py | druids/django-security | bca889ff1a58378a038a08ca365162d9e3ef3fbf | [
"MIT"
] | 28 | 2019-12-05T12:20:49.000Z | 2022-03-25T08:15:10.000Z | security/backends/logging/writer.py | druids/django-security | bca889ff1a58378a038a08ca365162d9e3ef3fbf | [
"MIT"
] | 5 | 2019-07-10T15:29:44.000Z | 2021-02-01T12:50:56.000Z | import logging
from security.backends.writer import BaseBackendWriter
input_request_logger = logging.getLogger('security.input_request')
output_request_logger = logging.getLogger('security.output_request')
command_logger = logging.getLogger('security.command')
celery_logger = logging.getLogger('security.celery')
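# Illustrative note (not part of the original module): the 'extra' dicts passed to
# these loggers become attributes on the emitted LogRecord, so a host project can
# surface them with its own handler/formatter, e.g. (hypothetical configuration):
#
#     handler = logging.StreamHandler()
#     handler.setFormatter(logging.Formatter(
#         '%(message)s input_request_id=%(input_request_id)s'))
#     logging.getLogger('security.input_request').addHandler(handler)
#
# This module configures no handlers itself; it only emits records.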
class LoggingBackendWriter(BaseBackendWriter):
def input_request_started(self, logger):
input_request_logger.info(
'Input request "%(id)s" to "%(host)s" with path "%(path)s" was started',
dict(
id=logger.id,
host=logger.data['host'],
path=logger.data['path'],
),
extra=dict(
input_request_id=logger.id,
input_request_host=logger.data['host'],
input_request_path=logger.data['path'],
input_request_method=logger.data['method'],
input_request_view_slug=logger.data['view_slug'],
input_request_slug=logger.slug,
input_request_is_secure=logger.data['is_secure'],
input_request_ip=logger.data['ip'],
)
)
def input_request_finished(self, logger):
input_request_logger.info(
'Input request "%(id)s" to "%(host)s" with path "%(path)s" was finished',
dict(
id=logger.id,
host=logger.data['host'],
path=logger.data['path'],
),
extra=dict(
input_request_id=logger.id,
input_request_host=logger.data['host'],
input_request_path=logger.data['path'],
input_request_method=logger.data['method'],
input_request_view_slug=logger.data['view_slug'],
input_request_slug=logger.slug,
input_request_is_secure=logger.data['is_secure'],
input_request_ip=logger.data['ip'],
input_request_response_code=logger.data['response_code'],
input_request_start=logger.data['start'],
input_request_stop=logger.data['stop'],
input_request_time=(logger.data['stop'] - logger.data['start']).total_seconds(),
)
)
def input_request_error(self, logger):
input_request_logger.error(
'Input request "%(id)s" to "%(host)s" with path "%(path)s" failed',
dict(
id=logger.id,
host=logger.data['host'],
path=logger.data['path'],
),
extra=dict(
input_request_id=logger.id,
input_request_host=logger.data['host'],
input_request_path=logger.data['path'],
input_request_method=logger.data['method'],
input_request_view_slug=logger.data['view_slug'],
input_request_slug=logger.slug,
input_request_is_secure=logger.data['is_secure'],
input_request_ip=logger.data['ip'],
input_request_start=logger.data['start'],
)
)
def output_request_started(self, logger):
output_request_logger.info(
'Output request "%(id)s" to "%(host)s" with path "%(path)s" was started',
dict(
id=logger.id,
host=logger.data['host'],
path=logger.data['path'],
),
extra=dict(
output_request_id=logger.id,
output_request_host=logger.data['host'],
output_request_path=logger.data['path'],
output_request_method=logger.data['method'],
output_request_slug=logger.slug,
output_request_is_secure=logger.data['is_secure'],
)
)
def output_request_finished(self, logger):
output_request_logger.info(
'Output request "%(id)s" to "%(host)s" with path "%(path)s" was successful',
dict(
id=logger.id,
host=logger.data['host'],
path=logger.data['path'],
),
extra=dict(
output_request_id=logger.id,
output_request_host=logger.data['host'],
output_request_path=logger.data['path'],
output_request_method=logger.data['method'],
output_request_slug=logger.slug,
output_request_is_secure=logger.data['is_secure'],
output_request_start=logger.data['start'],
output_request_stop=logger.data['stop'],
output_request_time=(logger.data['stop'] - logger.data['start']).total_seconds(),
)
)
def output_request_error(self, logger):
output_request_logger.error(
'Output request "%(id)s" to "%(host)s" with path "%(path)s" failed',
dict(
id=logger.id,
host=logger.data['host'],
path=logger.data['path'],
),
extra=dict(
output_request_id=logger.id,
output_request_host=logger.data['host'],
output_request_path=logger.data['path'],
output_request_method=logger.data['method'],
output_request_slug=logger.slug,
output_request_is_secure=logger.data['is_secure'],
output_request_start=logger.data['start'],
output_request_stop=logger.data['stop'],
output_request_time=(logger.data['stop'] - logger.data['start']).total_seconds(),
)
)
def command_started(self, logger):
command_logger.info(
'Command "%(id)s" with name "%(name)s" was started',
dict(
id=logger.id,
name=logger.data['name'],
),
extra=dict(
command_id=logger.id,
command_name=logger.data['name'],
command_is_executed_from_command_line=logger.data['is_executed_from_command_line'],
command_start=logger.data['start'],
)
)
def command_output_updated(self, logger):
"""Output is not stored into log"""
def command_finished(self, logger):
command_logger.info(
'Command "%(id)s" with name "%(name)s" was successful',
dict(
id=logger.id,
name=logger.data['name'],
),
extra=dict(
command_id=logger.id,
command_name=logger.data['name'],
command_is_executed_from_command_line=logger.data['is_executed_from_command_line'],
command_start=logger.data['start'],
command_stop=logger.data['stop'],
command_time=(logger.data['stop'] - logger.data['start']).total_seconds(),
command_is_error=False
)
)
def command_error(self, logger):
command_logger.error(
'Command "%(id)s" with name "%(name)s" failed',
dict(
id=logger.id,
name=logger.data['name'],
),
extra=dict(
command_id=logger.id,
command_name=logger.data['name'],
command_is_executed_from_command_line=logger.data['is_executed_from_command_line'],
command_start=logger.data['start'],
command_stop=logger.data['stop'],
command_time=(logger.data['stop'] - logger.data['start']).total_seconds(),
command_is_error=True
)
)
def celery_task_invocation_started(self, logger):
"""Invocation started means nothing for logging backend"""
def celery_task_invocation_triggered(self, logger):
celery_logger.info(
'Celery task invocation "%(id)s" with celery id "%(celery_task_id)s" and name "%(name)s" was invoked',
dict(
id=logger.id,
celery_task_id=logger.data['celery_task_id'],
name=logger.data['name'],
),
extra=dict(
id=logger.id,
celery_task_invocation_task_id=logger.data['celery_task_id'],
celery_task_invocation_name=logger.data['name'],
celery_task_invocation_queue=logger.data['queue_name'],
celery_task_invocation_start=logger.data['start'],
celery_task_invocation_state='TRIGGERED',
),
)
def celery_task_invocation_ignored(self, logger):
celery_logger.info(
'Celery task invocation "%(id)s" with name "%(name)s" was ignored',
dict(
id=logger.id,
name=logger.data['name'],
),
extra=dict(
id=logger.id,
celery_task_invocation_name=logger.data['name'],
celery_task_invocation_queue=logger.data['queue_name'],
celery_task_invocation_start=logger.data['start'],
celery_task_invocation_stop=logger.data['stop'],
celery_task_invocation_time=(logger.data['stop'] - logger.data['start']).total_seconds(),
celery_task_invocation_state='IGNORED',
),
)
def celery_task_invocation_timeout(self, logger):
celery_logger.warning(
'Celery task "%(id)s" with celery id "%(celery_task_id)s" and name "%(name)s" caused a response timeout',
dict(
id=logger.id,
celery_task_id=logger.data['celery_task_id'],
name=logger.data['name'],
),
extra=dict(
id=logger.id,
celery_task_invocation_task_id=logger.data['celery_task_id'],
celery_task_invocation_name=logger.data['name'],
celery_task_invocation_queue=logger.data['queue_name'],
celery_task_invocation_start=logger.data['start'],
celery_task_invocation_stop=logger.data['stop'],
celery_task_invocation_time=(logger.data['stop'] - logger.data['start']).total_seconds(),
celery_task_invocation_state='TIMEOUT',
),
)
def celery_task_invocation_expired(self, logger):
celery_logger.error(
'Celery task invocation "%(id)s" with celery id "%(celery_task_id)s" and name "%(name)s" was expired',
dict(
id=logger.id,
celery_task_id=logger.data['celery_task_id'] or '',
name=logger.data['name'],
),
extra=dict(
id=logger.id,
celery_task_invocation_task_id=logger.data['celery_task_id'] or '',
celery_task_invocation_name=logger.data['name'],
celery_task_invocation_queue=logger.data['queue_name'],
celery_task_invocation_start=logger.data['start'],
celery_task_invocation_stop=logger.data['stop'],
celery_task_invocation_time=(logger.data['stop'] - logger.data['start']).total_seconds(),
celery_task_invocation_state='EXPIRED',
),
)
def celery_task_run_started(self, logger):
celery_logger.info(
'Celery task "%(id)s" with celery id "%(celery_task_id)s" and name "%(name)s" was started',
dict(
id=logger.id,
celery_task_id=logger.data['celery_task_id'],
name=logger.data['name'],
),
extra=dict(
id=logger.id,
celery_task_name=logger.data['name'],
celery_task_queue=logger.data['queue_name'],
celery_task_start=logger.data['start'],
celery_task_invocation_state='ACTIVE',
celery_task_attempt=logger.data['retries'],
),
)
def celery_task_run_succeeded(self, logger):
celery_logger.info(
'Celery task "%(id)s" with celery id "%(celery_task_id)s" and name "%(name)s" was successful',
dict(
id=logger.id,
celery_task_id=logger.data['celery_task_id'],
name=logger.data['name'],
),
extra=dict(
id=logger.id,
celery_task_name=logger.data['name'],
celery_task_queue=logger.data['queue_name'],
celery_task_start=logger.data['start'],
celery_task_invocation_state='SUCCEEDED',
celery_task_attempt=logger.data['retries'],
celery_task_stop=logger.data['stop'],
celery_task_time=(logger.data['stop'] - logger.data['start']).total_seconds()
)
)
def celery_task_run_failed(self, logger):
celery_logger.error(
'Celery task "%(id)s" with celery id "%(celery_task_id)s" and name "%(name)s" failed',
dict(
id=logger.id,
celery_task_id=logger.data['celery_task_id'],
name=logger.data['name'],
),
extra=dict(
id=logger.id,
celery_task_name=logger.data['name'],
celery_task_queue=logger.data['queue_name'],
celery_task_start=logger.data['start'],
celery_task_invocation_state='FAILED',
celery_task_attempt=logger.data['retries'],
celery_task_stop=logger.data['stop'],
celery_task_time=(logger.data['stop'] - logger.data['start']).total_seconds()
)
)
def celery_task_run_retried(self, logger):
celery_logger.warning(
'Celery task "%(id)s" with celery id "%(celery_task_id)s" and name "%(name)s" was repeated',
dict(
id=logger.id,
celery_task_id=logger.data['celery_task_id'],
name=logger.data['name'],
),
extra=dict(
id=logger.id,
celery_task_name=logger.data['name'],
celery_task_queue=logger.data['queue_name'],
celery_task_start=logger.data['start'],
celery_task_invocation_state='RETRIED',
celery_task_attempt=logger.data['retries'],
celery_task_stop=logger.data['stop'],
celery_task_time=(logger.data['stop'] - logger.data['start']).total_seconds()
)
)
def celery_task_run_output_updated(self, logger):
"""Output is not stored into log"""
| 41.528409 | 117 | 0.551512 | 1,572 | 14,618 | 4.84542 | 0.052799 | 0.181174 | 0.094525 | 0.04595 | 0.883156 | 0.861888 | 0.850597 | 0.838125 | 0.834843 | 0.830379 | 0 | 0 | 0.330483 | 14,618 | 351 | 118 | 41.646724 | 0.778277 | 0.007662 | 0 | 0.718266 | 0 | 0.040248 | 0.157212 | 0.00911 | 0 | 0 | 0 | 0 | 0 | 1 | 0.06192 | false | 0 | 0.006192 | 0 | 0.071207 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
15f39fcd219242b3a25d5f84d21e01056eda87c0 | 10,346 | py | Python | example/burrow_integration/query.py | Insafin/iroha | 5e3c3252b2a62fa887274bdf25547dc264c10c26 | [
"Apache-2.0"
] | 1,467 | 2016-10-25T12:27:19.000Z | 2022-03-28T04:32:05.000Z | example/burrow_integration/query.py | Insafin/iroha | 5e3c3252b2a62fa887274bdf25547dc264c10c26 | [
"Apache-2.0"
] | 2,366 | 2016-10-25T10:07:57.000Z | 2022-03-31T22:03:24.000Z | example/burrow_integration/query.py | Insafin/iroha | 5e3c3252b2a62fa887274bdf25547dc264c10c26 | [
"Apache-2.0"
] | 662 | 2016-10-26T04:41:22.000Z | 2022-03-31T04:15:02.000Z | import os
import binascii
from iroha import IrohaCrypto
from iroha import Iroha, IrohaGrpc
import sys
from Crypto.Hash import keccak
import integration_helpers
if sys.version_info[0] < 3:
raise Exception("Python 3 or a more recent version is required.")
# Here is the information about the environment and the admin account:
IROHA_HOST_ADDR = os.getenv("IROHA_HOST_ADDR", "127.0.0.1")
IROHA_PORT = os.getenv("IROHA_PORT", "50051")
ADMIN_ACCOUNT_ID = os.getenv("ADMIN_ACCOUNT_ID", "admin@test")
ADMIN_PRIVATE_KEY = os.getenv(
"ADMIN_PRIVATE_KEY",
"f101537e319568c765b2cc89698325604991dca57b9716b58016b253506cab70",
)
iroha = Iroha(ADMIN_ACCOUNT_ID)
net = IrohaGrpc("{}:{}".format(IROHA_HOST_ADDR, IROHA_PORT))
test_private_key = IrohaCrypto.private_key()
test_public_key = IrohaCrypto.derive_public_key(test_private_key).decode("utf-8")
@integration_helpers.trace
def create_contract():
bytecode = "608060405234801561001057600080fd5b5073a6abc17819738299b3b2c1ce46d55c74f04e290c6000806101000a81548173ffffffffffffffffffffffffffffffffffffffff021916908373ffffffffffffffffffffffffffffffffffffffff160217905550610ae5806100746000396000f3fe608060405234801561001057600080fd5b50600436106100565760003560e01c80622f5bc01461005b57806359bea24e1461008b57806371061398146100bb578063cd559561146100d9578063d4e804ab146100f7575b600080fd5b61007560048036038101906100709190610736565b610115565b604051610082919061087e565b60405180910390f35b6100a560048036038101906100a09190610736565b610280565b6040516100b2919061087e565b60405180910390f35b6100c36103ec565b6040516100d0919061087e565b60405180910390f35b6100e161054b565b6040516100ee919061087e565b60405180910390f35b6100ff6106aa565b60405161010c9190610863565b60405180910390f35b606060008260405160240161012a91906108a0565b6040516020818303038152906040527e2f5bc0000000000000000000000000000000000000000000000000000000007bffffffffffffffffffffffffffffffffffffffffffffffffffffffff19166020820180517bffffffffffffffffffffffffffffffffffffffffffffffffffffffff8381831617835250505050905060008060008054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16836040516101f0919061084c565b600060405180830381855af49150503d806000811461022b576040519150601f19603f3d011682016040523d82523d6000602084013e610230565b606091505b509150915081610275576040517f08c379a000000000000000000000000000000000000000000000000000000000815260040161026c906108c2565b60405180910390fd5b809350505050919050565b606060008260405160240161029591906108a0565b6040516020818303038152906040527f59bea24e000000000000000000000000000000000000000000000000000000007bffffffffffffffffffffffffffffffffffffffffffffffffffffffff19166020820180517bffffffffffffffffffffffffffffffffffffffffffffffffffffffff8381831617835250505050905060008060008054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff168360405161035c919061084c565b600060405180830381855af49150503d8060008114610397576040519150601f19603f3d011682016040523d82523d6000602084013e61039c565b606091505b5091509150816103e1576040517f08c379a00000000000000000000000000000000000000000000000000000000081526004016103d8906108c2565b60405180910390fd5b809350505050919050565b606060006040516024016040516020818303038152906040527f71061398000000000000000000000000000000000000000000000000000000007bffffffffffffffffffffffffffffffffffffffffffffffffffffffff19166020820180517bffffffffffffffffffffffffffffffffffffffffffffffffffffffff8381831617835250505050905060008060008054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16836040516104bd919061084c565b600060405180830381855af49150503d80600081146104f8576040519150601f19603f3d011682016040523d82523d6000602084013e6104fd565b606091505b509150915081610542576040517f08c379a0000000000000000000000000000000000000000000000000000000008152600401610539906108c2565b60405180910390fd5b80935050505090565b606060006040516024016040516020818303038152906040527fcd559561000000000000000000000000000000000000000000000000000000007bffffffffffffffffffffffffffffffffffffffffffffffffffffffff19166020820180517bffffffffffffffffffffffffffffffffffffffffffffffffffffffff8381831617835250505050905060008060008054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff168360405161061c919061084c565b600060405180830381855af49150503d8060008114610657576040519150601f19603f3d011682016040523d82523d6000602084013e61065c565b606091505b5091509150816106a1576040517f08c379a000000000000000000
0000000000000000000000000000000000000008152600401610698906108c2565b60405180910390fd5b80935050505090565b60008054906101000a900473ffffffffffffffffffffffffffffffffffffffff1681565b60006106e16106dc84610907565b6108e2565b9050828152602081018484840111156106f957600080fd5b6107048482856109ad565b509392505050565b600082601f83011261071d57600080fd5b813561072d8482602086016106ce565b91505092915050565b60006020828403121561074857600080fd5b600082013567ffffffffffffffff81111561076257600080fd5b61076e8482850161070c565b91505092915050565b6107808161097b565b82525050565b600061079182610938565b61079b818561094e565b93506107ab8185602086016109bc565b6107b481610a4f565b840191505092915050565b60006107ca82610938565b6107d4818561095f565b93506107e48185602086016109bc565b80840191505092915050565b60006107fb82610943565b610805818561096a565b93506108158185602086016109bc565b61081e81610a4f565b840191505092915050565b600061083660278361096a565b915061084182610a60565b604082019050919050565b600061085882846107bf565b915081905092915050565b60006020820190506108786000830184610777565b92915050565b600060208201905081810360008301526108988184610786565b905092915050565b600060208201905081810360008301526108ba81846107f0565b905092915050565b600060208201905081810360008301526108db81610829565b9050919050565b60006108ec6108fd565b90506108f882826109ef565b919050565b6000604051905090565b600067ffffffffffffffff82111561092257610921610a20565b5b61092b82610a4f565b9050602081019050919050565b600081519050919050565b600081519050919050565b600082825260208201905092915050565b600081905092915050565b600082825260208201905092915050565b60006109868261098d565b9050919050565b600073ffffffffffffffffffffffffffffffffffffffff82169050919050565b82818337600083830152505050565b60005b838110156109da5780820151818401526020810190506109bf565b838111156109e9576000848401525b50505050565b6109f882610a4f565b810181811067ffffffffffffffff82111715610a1757610a16610a20565b5b80604052505050565b7f4e487b7100000000000000000000000000000000000000000000000000000000600052604160045260246000fd5b6000601f19601f8301169050919050565b7f4572726f722063616c6c696e67207365727669636520636f6e7472616374206660008201527f756e6374696f6e0000000000000000000000000000000000000000000000000060208201525056fea2646970667358221220ec72f3b3f2061603a61400bade159313cfaa93d70a3c7b7f170d62cf2827a91064736f6c63430008040033"
"""Bytecode was generated using remix editor https://remix.ethereum.org/ from file query.sol. """
tx = iroha.transaction(
[iroha.command("CallEngine", caller=ADMIN_ACCOUNT_ID, input=bytecode)]
)
IrohaCrypto.sign_transaction(tx, ADMIN_PRIVATE_KEY)
net.send_tx(tx)
hex_hash = binascii.hexlify(IrohaCrypto.hash(tx))
for status in net.tx_status_stream(tx):
print(status)
return hex_hash
@integration_helpers.trace
def get_peers(address):
params = integration_helpers.get_first_four_bytes_of_keccak(b"getPeers()")
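# The params built here start with the standard Solidity ABI function
# selector: the first four bytes of the keccak-256 hash of the signature.
# A minimal sketch of what the helper presumably computes (an assumption --
# see integration_helpers for the actual implementation):
#   k = keccak.new(digest_bits=256)
#   k.update(b"getPeers()")
#   selector = k.hexdigest()[:8]  # first 4 bytes, hex-encoded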
no_of_param = 0
tx = iroha.transaction(
[
iroha.command(
"CallEngine", caller=ADMIN_ACCOUNT_ID, callee=address, input=params
)
]
)
IrohaCrypto.sign_transaction(tx, ADMIN_PRIVATE_KEY)
response = net.send_tx(tx)
for status in net.tx_status_stream(tx):
print(status)
hex_hash = binascii.hexlify(IrohaCrypto.hash(tx))
return hex_hash
@integration_helpers.trace
def get_block(address):
params = integration_helpers.get_first_four_bytes_of_keccak(b"getBlock(string)")
no_of_param = 1
for x in range(no_of_param):
params = params + integration_helpers.left_padded_address_of_param(
x, no_of_param
)
params = params + integration_helpers.argument_encoding("10") # block height
tx = iroha.transaction(
[
iroha.command(
"CallEngine", caller=ADMIN_ACCOUNT_ID, callee=address, input=params
)
]
)
IrohaCrypto.sign_transaction(tx, ADMIN_PRIVATE_KEY)
response = net.send_tx(tx)
for status in net.tx_status_stream(tx):
print(status)
hex_hash = binascii.hexlify(IrohaCrypto.hash(tx))
return hex_hash
@integration_helpers.trace
def get_roles(address):
params = integration_helpers.get_first_four_bytes_of_keccak(b"getRoles()")
no_of_param = 0
tx = iroha.transaction(
[
iroha.command(
"CallEngine", caller=ADMIN_ACCOUNT_ID, callee=address, input=params
)
]
)
IrohaCrypto.sign_transaction(tx, ADMIN_PRIVATE_KEY)
response = net.send_tx(tx)
for status in net.tx_status_stream(tx):
print(status)
hex_hash = binascii.hexlify(IrohaCrypto.hash(tx))
return hex_hash
@integration_helpers.trace
def get_role_permissions(address):
params = integration_helpers.get_first_four_bytes_of_keccak(
b"getRolePermissions(string)"
)
no_of_param = 1
for x in range(no_of_param):
params = params + integration_helpers.left_padded_address_of_param(
x, no_of_param
)
params = params + integration_helpers.argument_encoding("money_creator") # role id
tx = iroha.transaction(
[
iroha.command(
"CallEngine", caller=ADMIN_ACCOUNT_ID, callee=address, input=params
)
]
)
IrohaCrypto.sign_transaction(tx, ADMIN_PRIVATE_KEY)
response = net.send_tx(tx)
for status in net.tx_status_stream(tx):
print(status)
hex_hash = binascii.hexlify(IrohaCrypto.hash(tx))
return hex_hash
hash = create_contract()
address = integration_helpers.get_engine_receipts_address(hash)
hash = get_peers(address)
integration_helpers.get_engine_receipts_result(hash)
hash = get_block(address)
integration_helpers.get_engine_receipts_result(hash)
hash = get_roles(address)
integration_helpers.get_engine_receipts_result(hash)
hash = get_role_permissions(address)
integration_helpers.get_engine_receipts_result(hash)
print("done")
| 72.34965 | 5,827 | 0.873188 | 578 | 10,346 | 15.33218 | 0.212803 | 0.038592 | 0.021327 | 0.014669 | 0.234597 | 0.234597 | 0.229858 | 0.220831 | 0.210223 | 0.210223 | 0 | 0.474191 | 0.086217 | 10,346 | 142 | 5,828 | 72.859155 | 0.46319 | 0.009376 | 0 | 0.495935 | 0 | 0 | 0.605401 | 0.581453 | 0 | 1 | 0 | 0 | 0 | 1 | 0.04065 | false | 0 | 0.056911 | 0 | 0.138211 | 0.04878 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c6103cf49d2c831a93d6779e88f758ac1a3c16b5 | 2,836 | py | Python | docs/user_guide/half_float_ops.py | NunoEdgarGFlowHub/poptorch | 2e69b81c7c94b522d9f57cc53d31be562f5e3749 | [
"MIT"
] | null | null | null | docs/user_guide/half_float_ops.py | NunoEdgarGFlowHub/poptorch | 2e69b81c7c94b522d9f57cc53d31be562f5e3749 | [
"MIT"
] | null | null | null | docs/user_guide/half_float_ops.py | NunoEdgarGFlowHub/poptorch | 2e69b81c7c94b522d9f57cc53d31be562f5e3749 | [
"MIT"
] | null | null | null | # Copyright (c) 2020 Graphcore Ltd. All rights reserved.
import poptorch
import torch
# pragma pylint: disable=function-redefined
# zero_res_start
## torch.ones and zeros
class Model(torch.nn.Module):
def forward(self, x):
# dtype is ignored, however the type is resolved to be the type of x
return torch.zeros((2, 3, 4), dtype=torch.float32) + x
native_model = Model()
float16_tensor = torch.tensor([1.0], dtype=torch.float16)
float32_tensor = torch.tensor([1.0], dtype=torch.float32)
# The native model always yields a float32 tensor
assert native_model(float16_tensor).dtype == torch.float32
assert native_model(float32_tensor).dtype == torch.float32
# The poptorch model will resolve to the type of x
poptorch_model = poptorch.inferenceModel(native_model)
assert poptorch_model(float16_tensor).dtype == torch.float16
poptorch_model = poptorch.inferenceModel(native_model)
assert poptorch_model(float32_tensor).dtype == torch.float32
# zero_res_end
# rand_res_start
## torch.rand
class Model(torch.nn.Module):
def forward(self, x):
# dtype is ignored, however the type is resolved to be the type of x
return torch.rand((2, 3, 4), dtype=torch.float32) + x
native_model = Model()
float16_tensor = torch.tensor([1.0], dtype=torch.float16)
float32_tensor = torch.tensor([1.0], dtype=torch.float32)
# The native model always yields a float32 tensor
assert native_model(float16_tensor).dtype == torch.float32
assert native_model(float32_tensor).dtype == torch.float32
# The poptorch model will resolve to the type of x
poptorch_model = poptorch.inferenceModel(native_model)
assert poptorch_model(float16_tensor).dtype == torch.float16
poptorch_model = poptorch.inferenceModel(native_model)
assert poptorch_model(float32_tensor).dtype == torch.float32
# rand_res_end
# uniform_res_start
## torch.distributions.uniform.Uniform
class Model(torch.nn.Module):
def forward(self, x):
# dtype is ignored, however the type is resolved to be the type of x
ud = torch.distributions.uniform.Uniform(
torch.tensor([10.0], dtype=torch.float16),
torch.tensor([10.0], dtype=torch.float32))
return ud.sample((10, 10, 1000)) + x
native_model = Model()
float16_tensor = torch.tensor([1.0], dtype=torch.float16)
float32_tensor = torch.tensor([1.0], dtype=torch.float32)
# The native model always yields a float32 tensor
assert native_model(float16_tensor).dtype == torch.float32
assert native_model(float32_tensor).dtype == torch.float32
# The poptorch model will resolve to the type of x
poptorch_model = poptorch.inferenceModel(native_model)
assert poptorch_model(float16_tensor).dtype == torch.float16
poptorch_model = poptorch.inferenceModel(native_model)
assert poptorch_model(float32_tensor).dtype == torch.float16
# uniform_res_end
| 31.511111 | 76 | 0.757405 | 409 | 2,836 | 5.112469 | 0.158924 | 0.105213 | 0.113821 | 0.087996 | 0.846963 | 0.846963 | 0.824008 | 0.824008 | 0.824008 | 0.824008 | 0 | 0.050144 | 0.142102 | 2,836 | 89 | 77 | 31.865169 | 0.809289 | 0.263047 | 0 | 0.780488 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.292683 | 1 | 0.073171 | false | 0 | 0.04878 | 0.04878 | 0.268293 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c65493880d54a670b8e5c242967edaffcfd9f687 | 12,585 | py | Python | webservices/common/models/totals.py | 18F/openFEC | ee7b7368e0934f50c391789fb55444f811c1a2f7 | [
"CC0-1.0"
] | 246 | 2015-01-07T16:59:42.000Z | 2020-01-18T20:35:05.000Z | webservices/common/models/totals.py | 18F/openFEC | ee7b7368e0934f50c391789fb55444f811c1a2f7 | [
"CC0-1.0"
] | 2,532 | 2015-01-02T16:22:46.000Z | 2018-03-08T17:30:53.000Z | webservices/common/models/totals.py | 18F/openFEC | ee7b7368e0934f50c391789fb55444f811c1a2f7 | [
"CC0-1.0"
] | 75 | 2015-02-01T00:46:56.000Z | 2021-02-14T10:51:34.000Z | from .base import db, BaseModel
from webservices import docs
class CommitteeTotals(BaseModel):
__abstract__ = True
committee_id = db.Column(db.String, index=True, doc=docs.COMMITTEE_ID)
cycle = db.Column(db.Integer, primary_key=True, index=True, doc=docs.CYCLE)
offsets_to_operating_expenditures = db.Column(db.Numeric(30, 2))
political_party_committee_contributions = db.Column(db.Numeric(30, 2))
other_disbursements = db.Column(db.Numeric(30, 2))
other_political_committee_contributions = db.Column(db.Numeric(30, 2))
individual_itemized_contributions = db.Column(db.Numeric(30, 2), doc=docs.INDIVIDUAL_ITEMIZED_CONTRIBUTIONS)
individual_unitemized_contributions = db.Column(db.Numeric(30, 2), doc=docs.INDIVIDUAL_UNITEMIZED_CONTRIBUTIONS)
operating_expenditures = db.Column(db.Numeric(30, 2))
disbursements = db.Column(db.Numeric(30, 2), doc=docs.DISBURSEMENTS)
contributions = db.Column(db.Numeric(30, 2), doc=docs.CONTRIBUTIONS)
contribution_refunds = db.Column(db.Numeric(30, 2))
individual_contributions = db.Column(db.Numeric(30, 2))
refunded_individual_contributions = db.Column(db.Numeric(30, 2))
refunded_other_political_committee_contributions = db.Column(db.Numeric(30, 2))
refunded_political_party_committee_contributions = db.Column(db.Numeric(30, 2))
receipts = db.Column(db.Numeric(30, 2))
coverage_start_date = db.Column(db.DateTime(), index=True)
coverage_end_date = db.Column(db.DateTime(), index=True)
net_contributions = db.Column(db.Numeric(30, 2))
net_operating_expenditures = db.Column(db.Numeric(30, 2))
last_report_year = db.Column(db.Integer)
last_report_type_full = db.Column(db.String)
last_beginning_image_number = db.Column(db.BigInteger)
last_cash_on_hand_end_period = db.Column(db.Numeric(30, 2))
last_debts_owed_by_committee = db.Column(db.Numeric(30, 2))
last_debts_owed_to_committee = db.Column(db.Numeric(30, 2))
# Add additional fields and filters to /totals/{committee-type} endpoint (#2631)
committee_name = db.Column(db.String, doc=docs.COMMITTEE_NAME)
committee_type = db.Column(db.String, doc=docs.COMMITTEE_TYPE)
committee_designation = db.Column(db.String, doc=docs.DESIGNATION)
committee_type_full = db.Column(db.String, doc=docs.COMMITTEE_TYPE)
committee_designation_full = db.Column(db.String, doc=docs.DESIGNATION)
party_full = db.Column(db.String, doc=docs.PARTY_FULL)
class CandidateCommitteeTotals(db.Model):
__abstract__ = True
# making this its own model hierarchy until we can figure out
# how to maybe use existing classes while removing primary
# key stuff on cycle
candidate_id = db.Column(db.String, primary_key=True, doc=docs.CANDIDATE_ID)
cycle = db.Column(db.Integer, primary_key=True, index=True, doc=docs.CYCLE)
offsets_to_operating_expenditures = db.Column(db.Numeric(30, 2))
political_party_committee_contributions = db.Column(db.Numeric(30, 2))
other_disbursements = db.Column(db.Numeric(30, 2))
other_political_committee_contributions = db.Column(db.Numeric(30, 2))
individual_itemized_contributions = db.Column(db.Numeric(30, 2), doc=docs.INDIVIDUAL_ITEMIZED_CONTRIBUTIONS)
individual_unitemized_contributions = db.Column(db.Numeric(30, 2), doc=docs.INDIVIDUAL_UNITEMIZED_CONTRIBUTIONS)
disbursements = db.Column(db.Numeric(30, 2), doc=docs.DISBURSEMENTS)
contributions = db.Column(db.Numeric(30, 2), doc=docs.CONTRIBUTIONS)
contribution_refunds = db.Column(db.Numeric(30, 2))
individual_contributions = db.Column(db.Numeric(30, 2))
refunded_individual_contributions = db.Column(db.Numeric(30, 2))
refunded_other_political_committee_contributions = db.Column(db.Numeric(30, 2))
refunded_political_party_committee_contributions = db.Column(db.Numeric(30, 2))
receipts = db.Column(db.Numeric(30, 2))
coverage_start_date = db.Column(db.DateTime(), index=True)
coverage_end_date = db.Column(db.DateTime(), index=True)
operating_expenditures = db.Column(db.Numeric(30, 2))
last_report_year = db.Column(db.Integer)
last_report_type_full = db.Column(db.String)
last_beginning_image_number = db.Column(db.BigInteger)
last_cash_on_hand_end_period = db.Column(db.Numeric(30, 2))
last_debts_owed_by_committee = db.Column(db.Numeric(30, 2))
last_debts_owed_to_committee = db.Column(db.Numeric(30, 2))
class CommitteeTotalsPacPartyBase(CommitteeTotals):
__abstract__ = True
all_loans_received = db.Column(db.Numeric(30, 2))
allocated_federal_election_levin_share = db.Column(db.Numeric(30, 2))
coordinated_expenditures_by_party_committee = db.Column(db.Numeric(30, 2))
fed_candidate_committee_contributions = db.Column(db.Numeric(30, 2))
fed_candidate_contribution_refunds = db.Column(db.Numeric(30, 2))
fed_disbursements = db.Column(db.Numeric(30, 2))
fed_election_activity = db.Column(db.Numeric(30, 2))
fed_operating_expenditures = db.Column(db.Numeric(30, 2))
fed_receipts = db.Column(db.Numeric(30, 2))
independent_expenditures = db.Column(db.Numeric(30, 2))
loan_repayments_made = db.Column(db.Numeric(30, 2))
loan_repayments_received = db.Column(db.Numeric(30, 2))
loans_made = db.Column(db.Numeric(30, 2))
non_allocated_fed_election_activity = db.Column(db.Numeric(30, 2))
total_transfers = db.Column(db.Numeric(30, 2))
other_fed_operating_expenditures = db.Column(db.Numeric(30, 2))
other_fed_receipts = db.Column(db.Numeric(30, 2))
shared_fed_activity = db.Column(db.Numeric(30, 2))
shared_fed_activity_nonfed = db.Column(db.Numeric(30, 2))
shared_fed_operating_expenditures = db.Column(db.Numeric(30, 2))
shared_nonfed_operating_expenditures = db.Column(db.Numeric(30, 2))
transfers_from_affiliated_party = db.Column(db.Numeric(30, 2))
transfers_from_nonfed_account = db.Column(db.Numeric(30, 2))
transfers_from_nonfed_levin = db.Column(db.Numeric(30, 2))
transfers_to_affiliated_committee = db.Column(db.Numeric(30, 2))
cash_on_hand_beginning_period = db.Column(db.Numeric(30, 2))
class CommitteeTotalsPresidential(CommitteeTotals):
__tablename__ = 'ofec_totals_presidential_mv'
candidate_contribution = db.Column(db.Numeric(30, 2))
exempt_legal_accounting_disbursement = db.Column(db.Numeric(30, 2))
federal_funds = db.Column(db.Numeric(30, 2))
fundraising_disbursements = db.Column(db.Numeric(30, 2))
loan_repayments_made = db.Column(db.Numeric(30, 2))
loans_received = db.Column(db.Numeric(30, 2))
loans_received_from_candidate = db.Column(db.Numeric(30, 2))
offsets_to_fundraising_expenditures = db.Column(db.Numeric(30, 2))
offsets_to_legal_accounting = db.Column(db.Numeric(30, 2))
total_offsets_to_operating_expenditures = db.Column(db.Numeric(30, 2))
other_loans_received = db.Column(db.Numeric(30, 2))
other_receipts = db.Column(db.Numeric(30, 2))
repayments_loans_made_by_candidate = db.Column(db.Numeric(30, 2))
repayments_other_loans = db.Column(db.Numeric(30, 2))
transfers_from_affiliated_committee = db.Column(db.Numeric(30, 2))
transfers_to_other_authorized_committee = db.Column(db.Numeric(30, 2))
cash_on_hand_beginning_period = db.Column(db.Numeric(30, 2))
net_operating_expenditures = db.Column('last_net_operating_expenditures', db.Numeric(30, 2))
net_contributions = db.Column('last_net_contributions', db.Numeric(30, 2))
class CandidateCommitteeTotalsPresidential(CandidateCommitteeTotals):
__table_args__ = {'extend_existing': True}
__tablename__ = 'ofec_totals_candidate_committees_mv'
candidate_contribution = db.Column(db.Numeric(30, 2))
exempt_legal_accounting_disbursement = db.Column(db.Numeric(30, 2))
federal_funds = db.Column(db.Numeric(30, 2))
fundraising_disbursements = db.Column(db.Numeric(30, 2))
loan_repayments_made = db.Column(db.Numeric(30, 2))
loans_received = db.Column(db.Numeric(30, 2))
loans_received_from_candidate = db.Column(db.Numeric(30, 2))
offsets_to_fundraising_expenditures = db.Column(db.Numeric(30, 2))
offsets_to_legal_accounting = db.Column(db.Numeric(30, 2))
total_offsets_to_operating_expenditures = db.Column(db.Numeric(30, 2))
other_loans_received = db.Column(db.Numeric(30, 2))
other_receipts = db.Column(db.Numeric(30, 2))
repayments_loans_made_by_candidate = db.Column(db.Numeric(30, 2))
repayments_other_loans = db.Column(db.Numeric(30, 2))
transfers_from_affiliated_committee = db.Column(db.Numeric(30, 2))
transfers_to_other_authorized_committee = db.Column(db.Numeric(30, 2))
#cash_on_hand_beginning_of_period = db.Column(db.Numeric(30, 2))
full_election = db.Column(db.Boolean, primary_key=True)
net_operating_expenditures = db.Column('last_net_operating_expenditures', db.Numeric(30, 2))
net_contributions = db.Column('last_net_contributions', db.Numeric(30, 2))
class CandidateCommitteeTotalsHouseSenate(CandidateCommitteeTotals):
__table_args__ = {'extend_existing': True}
__tablename__ = 'ofec_totals_candidate_committees_mv'
all_other_loans = db.Column('other_loans_received', db.Numeric(30, 2))
candidate_contribution = db.Column(db.Numeric(30, 2))
loan_repayments = db.Column('loan_repayments_made', db.Numeric(30, 2))
loan_repayments_candidate_loans = db.Column('repayments_loans_made_by_candidate', db.Numeric(30, 2))
loan_repayments_other_loans = db.Column('repayments_other_loans', db.Numeric(30, 2))
loans = db.Column('loans_received', db.Numeric(30, 2))
loans_made_by_candidate = db.Column('loans_received_from_candidate', db.Numeric(30, 2))
other_receipts = db.Column(db.Numeric(30, 2))
transfers_from_other_authorized_committee = db.Column('transfers_from_affiliated_committee', db.Numeric(30, 2))
transfers_to_other_authorized_committee = db.Column(db.Numeric(30, 2))
#cash_on_hand_beginning_of_period = db.Column(db.Numeric(30, 2))
full_election = db.Column(db.Boolean, primary_key=True)
net_operating_expenditures = db.Column(db.Numeric(30, 2))
net_contributions = db.Column(db.Numeric(30, 2))
class CommitteeTotalsParty(CommitteeTotalsPacPartyBase):
__tablename__ = 'ofec_totals_parties_mv'
committee_name = db.Column(db.String)
committee_type = db.Column(db.String)
class CommitteeTotalsPac(CommitteeTotalsPacPartyBase):
__tablename__ = 'ofec_totals_pacs_mv'
committee_name = db.Column(db.String)
committee_type = db.Column(db.String)
class CommitteeTotalsPacParty(CommitteeTotalsPacPartyBase):
__tablename__ = 'ofec_totals_pacs_parties_mv'
class CommitteeTotalsHouseSenate(CommitteeTotals):
__tablename__ = 'ofec_totals_house_senate_mv'
all_other_loans = db.Column(db.Numeric(30, 2))
candidate_contribution = db.Column(db.Numeric(30, 2))
loan_repayments = db.Column(db.Numeric(30, 2))
loan_repayments_candidate_loans = db.Column(db.Numeric(30, 2))
loan_repayments_other_loans = db.Column(db.Numeric(30, 2))
loans = db.Column(db.Numeric(30, 2))
loans_made_by_candidate = db.Column(db.Numeric(30, 2))
other_receipts = db.Column(db.Numeric(30, 2))
transfers_from_other_authorized_committee = db.Column(db.Numeric(30, 2))
transfers_to_other_authorized_committee = db.Column(db.Numeric(30, 2))
cash_on_hand_beginning_period = db.Column(db.Numeric(30, 2))
class CommitteeTotalsIEOnly(BaseModel):
__tablename__ = 'ofec_totals_ie_only_mv'
committee_id = db.Column(db.String, index=True, doc=docs.COMMITTEE_ID)
cycle = db.Column(db.Integer, index=True, doc=docs.CYCLE)
coverage_start_date = db.Column(db.DateTime, doc=docs.COVERAGE_START_DATE)
coverage_end_date = db.Column(db.DateTime, doc=docs.COVERAGE_END_DATE)
total_independent_contributions = db.Column(db.Numeric(30, 2))
total_independent_expenditures = db.Column(db.Numeric(30, 2))
class ScheduleAByStateRecipientTotals(BaseModel):
__tablename__ = 'ofec_sched_a_aggregate_state_recipient_totals_mv'
total = db.Column(db.Numeric(30, 2), index=True, doc='The calculated total.')
count = db.Column(db.Integer, index=True, doc='Number of records making up the total.')
cycle = db.Column(db.Integer, index=True, doc=docs.CYCLE)
state = db.Column(db.String, index=True, doc=docs.STATE_GENERIC)
state_full = db.Column(db.String, index=True, doc=docs.STATE_GENERIC)
committee_type = db.Column(db.String, index=True, doc=docs.COMMITTEE_TYPE)
committee_type_full = db.Column(db.String, index=True, doc=docs.COMMITTEE_TYPE)
| 53.101266 | 116 | 0.757648 | 1,783 | 12,585 | 5.067863 | 0.089176 | 0.146082 | 0.170429 | 0.171315 | 0.868747 | 0.84927 | 0.831452 | 0.790726 | 0.721226 | 0.672311 | 0 | 0.035497 | 0.124752 | 12,585 | 236 | 117 | 53.326271 | 0.784839 | 0.026381 | 0 | 0.602094 | 0 | 0 | 0.051535 | 0.038304 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.010471 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
d65fd519701468f625e5213da250d324d1f2a300 | 1,948 | py | Python | tests/test_rhf.py | bzhang25/QM_2017_SSS_Team9_new- | 13af70caeadd75fc539523d04c8b7e5fa68cdc2f | [
"BSD-3-Clause"
] | null | null | null | tests/test_rhf.py | bzhang25/QM_2017_SSS_Team9_new- | 13af70caeadd75fc539523d04c8b7e5fa68cdc2f | [
"BSD-3-Clause"
] | null | null | null | tests/test_rhf.py | bzhang25/QM_2017_SSS_Team9_new- | 13af70caeadd75fc539523d04c8b7e5fa68cdc2f | [
"BSD-3-Clause"
] | null | null | null | """
Testing
"""
import rhf
import pytest
import psi4
import numpy as np
def test_rhf():
"""
This function tests the rhf module without damping or diis
"""
mol = psi4.geometry("""
O
H 1 1.1
H 1 1.1 2 104
symmetry c1
"""
)
bas = 'sto-3g'
options = {'energy_conv': 1.0e-6, 'density_conv': 1.0e-6, 'max_iter': 25,
'diis': 'off', 'damping': 'off', 'nelec': 10}
molecule = rhf.RHF(mol, bas, options)
molecule.get_energy()
psi4.set_options({"scf_type": "pk"})
psi4_energy = psi4.energy("SCF/"+ bas, molecule=mol)
assert np.allclose(molecule.E, psi4_energy)
def test_rhf_damp():
"""
This function tests the rhf module with damping
"""
mol = psi4.geometry("""
O
H 1 1.1
H 1 1.1 2 104
symmetry c1
"""
)
bas = 'cc-pvtz'
options = {'energy_conv': 1.0e-6, 'density_conv': 1.0e-6, 'max_iter': 25,
'diis': 'off', 'nelec': 10, 'damping': 'on', 'damping_start': 5, 'damping_value': 0.2}
molecule = rhf.RHF(mol, bas, options)
molecule.get_energy()
psi4.set_options({"scf_type": "pk"})
psi4_energy = psi4.energy("SCF/"+ bas, molecule=mol)
assert np.allclose(molecule.E, psi4_energy)
def test_rhf_jk():
"""
This function tests the rhf module with damping against psi4's default (JK) SCF energy
"""
mol = psi4.geometry("""
O
H 1 1.1
H 1 1.1 2 104
symmetry c1
"""
)
bas = 'cc-pvtz'
options = {'energy_conv': 1.0e-6, 'density_conv': 1.0e-6, 'max_iter': 25,
'diis': 'off', 'nelec': 10, 'damping': 'on', 'damping_start': 5, 'damping_value': 0.2}
molecule = rhf.RHF(mol, bas, options)
molecule.get_energy()
psi4_energy = psi4.energy("SCF/"+ bas, molecule=mol)
assert np.allclose(molecule.E, psi4_energy)
| 24.049383 | 106 | 0.534908 | 263 | 1,948 | 3.844106 | 0.235741 | 0.023739 | 0.017804 | 0.023739 | 0.890208 | 0.890208 | 0.861523 | 0.861523 | 0.861523 | 0.861523 | 0 | 0.06367 | 0.314682 | 1,948 | 80 | 107 | 24.35 | 0.693633 | 0.083162 | 0 | 0.705882 | 0 | 0 | 0.249279 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 1 | 0.058824 | false | 0 | 0.078431 | 0 | 0.137255 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d6a4c0791f7fff9a2c45e0ef3e8e92451c85b176 | 46 | py | Python | app/blueprints/settings/__init__.py | deb17/moneycare | 0f67142bd63079b473d80e26845341ef2763a283 | [
"MIT"
] | null | null | null | app/blueprints/settings/__init__.py | deb17/moneycare | 0f67142bd63079b473d80e26845341ef2763a283 | [
"MIT"
] | null | null | null | app/blueprints/settings/__init__.py | deb17/moneycare | 0f67142bd63079b473d80e26845341ef2763a283 | [
"MIT"
] | null | null | null | from app.blueprints.settings.routes import bp
| 23 | 45 | 0.847826 | 7 | 46 | 5.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 46 | 1 | 46 | 46 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
d6b91f8725e840e46c4581074652170e3f824203 | 7,923 | py | Python | config.py | mell0/dongerror | a3bf9c5c8f5e31155aa6cb00dc7ba5aeb86662fe | [
"Apache-2.0"
] | 3 | 2019-08-22T14:57:31.000Z | 2019-08-22T19:31:19.000Z | config.py | mell0/dongerror | a3bf9c5c8f5e31155aa6cb00dc7ba5aeb86662fe | [
"Apache-2.0"
] | null | null | null | config.py | mell0/dongerror | a3bf9c5c8f5e31155aa6cb00dc7ba5aeb86662fe | [
"Apache-2.0"
] | null | null | null |
EXCEPTION_DONGERS = {
"BaseException": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"KeyboardInterrupt": b"\xe1\x95\x95\x28 \xd5\x9e \xe1\x97\x9c \xd5\x9e \x29\xe1\x95\x97",
"SystemExit": b"\xe2\x95\xb0\x5b \xc2\xb0 \xe7\x9b\x8a \xc2\xb0 \x5d\xe2\x95\xaf",
"GeneratorExit": b"\xe2\x95\xb0\x5b \xc2\xb0 \xe7\x9b\x8a \xc2\xb0 \x5d\xe2\x95\xaf",
"Exception": b"\x28\xe3\x83\x8e\xc2\xb0 \xcd\x9c\xca\x96 \xcd\xa1\xc2\xb0\x29\xe3\x83\x8e\xef\xb8\xb5\xe2\x94\xbb\xe2\x94\xbb",
"StopIteration": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"StopAsyncIteration": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"ArithmeticError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"FloatingPointError": "\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"OverflowError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"ZeroDivisionError": b"\xe2\x95\xae\x28\xef\xbf\xa3\xcf\x89\xef\xbf\xa3\x3b\x29\xe2\x95\xad",
"AssertionError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"AttributeError": b"\xe0\xbc\xbc\xe3\x81\xa4\xe0\xb2\xa0\xe7\x9b\x8a\xe0\xb2\xa0\xe0\xbc\xbd\xe3\x81\xa4\x20\xe2\x94\x80\x3d\xe2\x89\xa1\xce\xa3\x4f\x29\x29",
"BufferError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"EOFError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"ImportError": b"\xe4\xb9\x81\xe0\xbb\x92\x28 \xcd\xa1\xe2\x97\x95 \xe1\xb4\xa5 \xe2\x97\x95\xcd\xa1 \x29\xe0\xa5\xad\xe3\x84\x8f",
"ModuleNotFoundError": b"\xe2\x95\xb0\x5b \xc2\xb0 \xe7\x9b\x8a \xc2\xb0 \x5d\xe2\x95\xaf",
"LookupError": b"\x28\xe3\x83\x8e\xc2\xb0 \xcd\x9c\xca\x96 \xcd\xa1\xc2\xb0\x29\xe3\x83\x8e\xef\xb8\xb5\xe2\x94\xbb\xe2\x94\xbb",
"IndexError": b"\x28\xe3\x81\xa4\xe2\x97\x89\xe7\x9b\x8a\xe2\x97\x89\x29\xe3\x81\xa4",
"KeyError": b"\xe2\x94\x8f\xe0\xbc\xbc\x20\xe2\x97\x89\x20\xe2\x95\xad\xe2\x95\xae\x20\xe2\x97\x89\xe0\xbc\xbd\xe2\x94\x93",
"MemoryError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"NameError": b"\xe2\x94\xbb\xe2\x94\x81\xe2\x94\xbb\xef\xb8\xb5\xe3\x83\xbd\x28\x60\xd0\x94\xc2\xb4\x29\xef\xbe\x89\xef\xb8\xb5\x20\xe2\x94\xbb\xe2\x94\x81\xe2\x94\xbb",
"UnboundLocalError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"OSError": b"\xe0\xbb\x92\x28\x20\xe1\x93\x80\x20\xe2\x80\xb8\x20\xe1\x93\x82\x20\x29\xe0\xa5\xad",
"BlockingIOError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"ChildProcessError": b"\x28\x20\xcd\xa1\xc2\xb0\x28\x20\xcd\xa1\xc2\xb0\x20\xcd\x9c\xca\x96\x28\x20\xcd\xa1\xc2\xb0\x20\xcd\x9c\xca\x96\x20\xcd\xa1\xc2\xb0\x29\xca\x96\x20\xcd\xa1\xc2\xb0\x29\x20\xcd\xa1\xc2\xb0\x29",
"ConnectionError": b"\xd9\xa9\x28\xe2\x95\xac\xca\x98\xe7\x9b\x8a\xca\x98\xe2\x95\xac\x29\xdb\xb6",
"BrokenPipeError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"ConnectionAbortedError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"ConnectionRefusedError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"ConnectionResetError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"FileExistsError": b"\xe3\x83\xbd\x28\x23\xef\xbe\x9f\xd0\x94\xef\xbe\x9f\x29\xef\xbe\x89\xe2\x94\x8c\xe2\x94\x9b",
"FileNotFoundError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"InterruptedError": b"\x28\x5f\xcc\x85\x5f\xcc\x85\x5f\xcc\x85\x5f\xcc\x85\x5f\xcc\xb2\xcc\x85\xd0\xbc\xcc\xb2\xcc\x85\x61\xcc\xb2\xcc\x85\xd1\x8f\xcc\xb2\xcc\x85\x69\xcc\xb2\xcc\x85\x6a\xc2\xad\xcc\xb2\xcc\x85\x75\xcc\xb2\xcc\x85\x61\xcc\xb2\xcc\x85\x6e\xcc\xb2\xcc\x85\x61\xcc\xb2\xcc\x85\x5f\xcc\x85\x5f\xcc\x85\x5f\xcc\x85\x28\x29\xe0\xb8\x94\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x89\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87\xe0\xb9\x87",
"IsADirectoryError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"NotADirectoryError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"PermissionError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"ProcessLookupError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"TimeoutError": b"\xca\x95\xe0\xbc\xbc\xe2\x9c\x96\xe2\x88\xa7\xe2\x9c\x96\xe0\xbc\xbd\xca\x94",
"ReferenceError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"RuntimeError": b"\xe2\x94\x80\x3d\xe2\x89\xa1\xce\xa3\xe1\x95\x95\x28 \xcd\xa1\xc2\xb0 \xcd\x9c\xca\x96 \xcd\xa1\xc2\xb0\x29\xe1\x95\x97",
"NotImplementedError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"RecursionError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"SyntaxError": b"\x6f\xcd\xa1\xcd\xa1\xcd\xa1\xe2\x95\xae\xe0\xbc\xbc \x2e \xe2\x80\xa2\xcc\x81 \x5f\xca\x96 \xe2\x80\xa2\xcc\x80 \x2e \xe0\xbc\xbd\xe2\x95\xad\x6f\xcd\xa1\xcd\xa1\xcd\xa1",
"IndentationError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"TabError": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"SystemError": b"\xe2\x94\x8c\x28\x20\xcd\x9d\xc2\xb0\x20\xcd\x9c\xca\x96\xcd\xa1\xc2\xb0\x29\x3d\xce\xb5\x2f\xcc\xb5\xcd\x87\xcc\xbf\xcc\xbf\x2f\xe2\x80\x99\xcc\xbf\xe2\x80\x99\xcc\xbf\x20\xcc\xbf",
"TypeError": b"\xe1\x95\x95\x28\xe2\x95\xad\xe0\xb2\xb0\xe2\x95\xad\x20\xcd\x9f\xca\x96\xe2\x95\xae\xe2\x80\xa2\xcc\x81\x29\xe2\x8a\x83\xc2\xa4\x3d\x28\xe2\x80\x94\xe2\x80\x94\xe2\x80\x94\xe2\x80\x94\x2d",
"ValueError": b"\xe1\x83\x9a\x28\xc2\xaf\xe3\x83\xad\xc2\xaf\x22\xe1\x83\x9a\x29",
"UnicodeError": b"\xe1\x95\x99\x28\xe2\x87\x80\xe2\x80\xb8\xe2\x86\xbc\xe2\x81\xa0\xe2\x80\xb6\x29\xe1\x95\x97",
"UnicodeDecodeError": b"\xe1\x95\x99\x28\xe2\x87\x80\xe2\x80\xb8\xe2\x86\xbc\xe2\x81\xa0\xe2\x80\xb6\x29\xe1\x95\x97",
"UnicodeEncodeError": b"\xe1\x95\x99\x28\xe2\x87\x80\xe2\x80\xb8\xe2\x86\xbc\xe2\x81\xa0\xe2\x80\xb6\x29\xe1\x95\x97",
"UnicodeTranslateError": b"\xe1\x95\x99\x28\xe2\x87\x80\xe2\x80\xb8\xe2\x86\xbc\xe2\x81\xa0\xe2\x80\xb6\x29\xe1\x95\x97",
"Warning": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"DeprecationWarning": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"PendingDeprecationWarning": b"\xc2\xaf\x5c\x5f\x28\xe3\x83\x84\x29\x5f\x2f\xc2\xaf",
"RuntimeWarning": b"\xe2\x94\x80\x3d\xe2\x89\xa1\xce\xa3\xe1\x95\x95\x28 \xcd\xa1\xc2\xb0 \xcd\x9c\xca\x96 \xcd\xa1\xc2\xb0\x29\xe1\x95\x97",
"SyntaxWarning": b"\xe2\x80\xa6\x5f\xe3\x80\x86\x28\xef\xbe\x9f\xe2\x96\xbd\xef\xbe\x9f\x2a\x29",
"UserWarning": b"\xe2\x80\xa6\x5f\xe3\x80\x86\x28\xef\xbe\x9f\xe2\x96\xbd\xef\xbe\x9f\x2a\x29",
"FutureWarning": b"\xe2\x80\xa6\x5f\xe3\x80\x86\x28\xef\xbe\x9f\xe2\x96\xbd\xef\xbe\x9f\x2a\x29",
"ImportWarning": b"\xe2\x80\xa6\x5f\xe3\x80\x86\x28\xef\xbe\x9f\xe2\x96\xbd\xef\xbe\x9f\x2a\x29",
"UnicodeWarning": b"\xe2\x80\xa6\x5f\xe3\x80\x86\x28\xef\xbe\x9f\xe2\x96\xbd\xef\xbe\x9f\x2a\x29",
"BytesWarning": b"\xe2\x80\xa6\x5f\xe3\x80\x86\x28\xef\xbe\x9f\xe2\x96\xbd\xef\xbe\x9f\x2a\x29",
"ResourceWarning": b"\xe2\x80\xa6\x5f\xe3\x80\x86\x28\xef\xbe\x9f\xe2\x96\xbd\xef\xbe\x9f\x2a\x29",
} | 118.253731 | 1,300 | 0.704405 | 1,660 | 7,923 | 3.361446 | 0.096386 | 0.084946 | 0.070968 | 0.094624 | 0.646774 | 0.631362 | 0.62491 | 0.615771 | 0.608244 | 0.568996 | 0 | 0.238165 | 0.053515 | 7,923 | 67 | 1,301 | 118.253731 | 0.505934 | 0 | 0 | 0 | 0 | 0.969697 | 0.892086 | 0.727755 | 0 | 1 | 0 | 0 | 0.015152 | 1 | 0 | false | 0 | 0.030303 | 0 | 0.030303 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d6c8f06999e21b4fc3906ca6eedf8c73de959d23 | 46,727 | py | Python | PandaRender.py | galsh17/cartwheel_train | a50abe18cfe8c1f0f24267c3efa8537ecf211e72 | [
"MIT"
] | 32 | 2018-09-04T08:51:08.000Z | 2022-02-22T02:04:38.000Z | PandaRender.py | galsh17/cartwheel_train | a50abe18cfe8c1f0f24267c3efa8537ecf211e72 | [
"MIT"
] | 5 | 2019-05-27T07:54:52.000Z | 2022-01-11T10:14:25.000Z | PandaRender.py | galsh17/cartwheel_train | a50abe18cfe8c1f0f24267c3efa8537ecf211e72 | [
"MIT"
] | 14 | 2018-06-22T15:29:39.000Z | 2021-09-28T12:58:37.000Z | """ The training-render file. (class TrainRenderer)
Update: This is a render-only class; it does not take control of the
mainloop(). You just need to call the function step().
Defines a rendering class and a spinTask (panda3d) which basically
renders 16 cameras at a time and puts them into a CPU queue. This queue
is emptied by calls to step(). More purpose-built step()s may be added.
Update: There are 3 Panda renderers in this file, viz. TrainRenderer,
TestRenderer, NetVLADRenderer. For comments on what each does, check out
those classes. For general usage of the renderers see `test_render.py`
"""
# Panda3d
from direct.showbase.ShowBase import ShowBase
from panda3d.core import *
from direct.stdpy.threading2 import Thread
# Usual Math and Image Processing
import numpy as np
import cv2
from scipy import interpolate
# import caffe
# import tensorflow as tf
# Other System Libs
import os
import argparse
import Queue
import copy
import time
import code
import pickle
# Custom-Misc
import TerminalColors
import CubeMaker
import PathMaker
class TrainRenderer(ShowBase):
renderIndx=0
# Basic Mesh & Camera Setup
def loadAllTextures(self, mesh, basePath, silent=True):
""" Loads texture files for a mesh """
c = 0
for child in mesh.getChildren():
submesh_name = child.get_name()
submesh_texture = basePath + submesh_name[:-5] + 'tex0.jpg'
child.setTexture( self.loader.loadTexture(submesh_texture) )
if silent == False:
print 'Loading texture file : ', submesh_texture
c = c + 1
print self.tcolor.OKGREEN, "Loaded ", c, "textures", self.tcolor.ENDC
def setupMesh(self):
""" Loads the .obj files. Will load mesh sub-divisions separately """
print 'Attempt Loading Mesh Vertices, Faces'
self.cyt = self.loader.loadModel( 'model_l/l6/level_6_0_0.obj' )
self.cyt2 = self.loader.loadModel( 'model_l/l6/level_6_128_0.obj' )
self.low_res = self.loader.loadModel( 'model_l/l3/level_3_0_0.obj' )
print self.tcolor.OKGREEN, 'Done Loading Vertices', self.tcolor.ENDC
print 'Attempt Loading Textures'
self.loadAllTextures( self.cyt, 'model_l/l6/')
self.loadAllTextures( self.cyt2, 'model_l/l6/')
self.loadAllTextures( self.low_res, 'model_l/l3/')
print self.tcolor.OKGREEN, 'Done Loading Textures', self.tcolor.ENDC
def positionMesh(self):
""" WIll have to manually adjust this for ur mesh. I position the
center where I fly my drone and oriented in ENU (East-north-up)
cords for easy alignment of GPS and my cordinates. If your model
is not metric scale will have to adjust for that too"""
self.cyt.setPos( 140,-450, 150 )
self.cyt2.setPos( 140,-450, 150 )
self.low_res.setPos( 140,-450, 150 )
self.cyt.setHpr( 198, -90, 0 )
self.cyt2.setHpr( 198, -90, 0 )
self.low_res.setHpr( 198, -90, 0 )
self.cyt.setScale(172)
self.cyt2.setScale(172)
self.low_res.setScale(172)
def customCamera(self, nameIndx):
lens = self.camLens
lens.setFov(83)
print 'self.customCamera : Set FOV at 83'
my_cam = Camera("cam"+nameIndx, lens)
my_camera = self.scene0.attachNewNode(my_cam)
# my_camera = self.render.attachNewNode(my_cam)
my_camera.setName("camera"+nameIndx)
return my_camera
def customDisplayRegion(self, rows, cols):
rSize = 1.0 / rows
cSize = 1.0 / cols
dr_list = []
for i in range(0,rows):
for j in range(0,cols):
# print i*rSize, (i+1)*rSize, j*cSize, (j+1)*cSize
dr_i = self.win2.makeDisplayRegion(i*rSize, (i+1)*rSize, j*cSize, (j+1)*cSize)
dr_i.setSort(-5)
dr_list.append( dr_i )
return dr_list
## Gives a random 6-dof pose. Need to set params manually here.
## X,Y,Z, Yaw(abt Z-axis), Pitch(abt X-axis), Roll(abt Y-axis)
## @param No : no inputs
def monte_carlo_sample(self):
# mc_X_min etc are set in constructor
X = np.random.uniform(self.mc_X_min,self.mc_X_max)
Y = np.random.uniform(self.mc_Y_min,self.mc_Y_max)
Z = np.random.uniform(self.mc_Z_min,self.mc_Z_max)
yaw = np.random.uniform( self.mc_yaw_min, self.mc_yaw_max)
roll = 0  # np.random.uniform( self.mc_roll_min, self.mc_roll_max)
pitch = 0  # np.random.uniform( self.mc_pitch_min, self.mc_pitch_max)
return X,Y,Z, yaw,roll,pitch
## Annotation-helpers for self.render
def putBoxes(self,X,Y,Z,r=1.,g=0.,b=0., scale=1.0):
cube_x = CubeMaker.CubeMaker().generate()
cube_x.setColor(r,g,b)
cube_x.setScale(scale)
cube_x.reparentTo(self.render)
cube_x.setPos(X,Y,Z)
## Set a cube in 3d env
def putTrainingBox(self,task):
cube = CubeMaker.CubeMaker().generate()
cube.setTransparency(TransparencyAttrib.MAlpha)
cube.setAlphaScale(0.5)
# cube.setScale(10)
# mc_X_min etc are set in constructor
sx = 0.5 * (self.mc_X_max - self.mc_X_min)
sy = 0.5 * (self.mc_Y_max - self.mc_Y_min)
sz = 0.5 * (self.mc_Z_max - self.mc_Z_min)
ax = 0.5 * (self.mc_X_max + self.mc_X_min)
ay = 0.5 * (self.mc_Y_max + self.mc_Y_min)
az = 0.5 * (self.mc_Z_max + self.mc_Z_min)
cube.setSx(sx)
cube.setSy(sy)
cube.setSz(sz)
cube.reparentTo(self.render)
cube.setPos(ax,ay,az)
## Task. This task draw the XYZ axis
def putAxesTask(self,task):
if (task.frame / 10) % 2 == 0:
cube_x = CubeMaker.CubeMaker().generate()
cube_x.setColor(1.0,0.0,0.0)
cube_x.setScale(1)
cube_x.reparentTo(self.render)
cube_x.setPos(task.frame,0,0)
cube_y = CubeMaker.CubeMaker().generate()
cube_y.setColor(0.0,1.0,0.0)
cube_y.setScale(1)
cube_y.reparentTo(self.render)
cube_y.setPos(0,task.frame,0)
cube_z = CubeMaker.CubeMaker().generate()
cube_z.setColor(0.0,0.0,1.0)
cube_z.setScale(1)
cube_z.reparentTo(self.render)
cube_z.setPos(0,0,task.frame)
if task.time > 25:
return None
return task.cont
## Render-n-Learn task
##
## set pose in each camera <br/>
## make note of the poses just set as this will take effect next <br/>
## Retrieve Rendered Data <br/>
## Cut rendered data into individual images. Note the rendered data will be a 4x4 grid of images <br/>
## Put imX into the queue <br/>
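##
## e.g. the 1280x960 render is read back as a column-major 4x4 grid of
## 320x240 tiles, so tile (i, j) is A[240*i:240*(i+1), 320*j:320*(j+1), :]
## (this is the slicing done in the task body below)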
def renderNlearnTask(self, task):
if task.time < 2: #do not do anything for 1st 2 sec
return task.cont
# print randX, randY, randZ
#
## set pose in each camera
# Note: The texture is grided images in a col-major format
poses = np.zeros( (len(self.cameraList), 4), dtype='float32' )
for i in range(len(self.cameraList)):
randX,randY, randZ, randYaw, randPitch, randRoll = self.monte_carlo_sample()
# if i<4 :
# randX = (i) * 30
# else:
# randX = 0
#
# randY = 0#task.frame
# randZ = 80
# randYaw = 0
# randPitch = 0
# randRoll = 0
self.cameraList[i].setPos(randX,randY,randZ)
self.cameraList[i].setHpr(randYaw,-90+randPitch,0+randRoll)
poses[i,0] = randX
poses[i,1] = randY
poses[i,2] = randZ
poses[i,3] = randYaw
# self.putBoxes(randX,randY,randZ, scale=0.3)
#
# if task.frame < 100:
# return task.cont
# else:
# return None
## make note of the poses just set as this will take effect next
if TrainRenderer.renderIndx == 0:
TrainRenderer.renderIndx = TrainRenderer.renderIndx + 1
self.prevPoses = poses
return task.cont
#
## Retrieve Rendered Data
tex = self.win2.getScreenshot()
A = np.array(tex.getRamImageAs("RGB")).reshape(960,1280,3)
# A = np.zeros((960,1280,3))
# A_bgr = cv2.cvtColor(A.astype('uint8'),cv2.COLOR_RGB2BGR)
# cv2.imwrite( str(TrainRenderer.renderIndx)+'.png', A_bgr.astype('uint8') )
# myTexture = self.win2.getTexture()
# print myTexture
# retrieve poses from prev render
texPoses = self.prevPoses
#
## Cut rendered data into individual images. Note the rendered data will be a 4x4 grid of images
#960 rows and 1280 cols (4x4 image-grid)
nRows = 240
nCols = 320
# Iterate over the rendered texture in a col-major format
c=0
if self.q_imStack.qsize() < 150:
for j in range(4): #j is for cols-indx
for i in range(4): #i is for rows-indx
#print i*nRows, j*nCols, (i+1)*nRows, (j+1)*nCols
im = A[i*nRows:(i+1)*nRows,j*nCols:(j+1)*nCols,:]
#imX = im.astype('float32')/255. - .5 # TODO: have a mean image
#imX = (im.astype('float32') - 128.0) /128.
imX = im.astype('float32') #- self.meanImage
## Put imX into the queue
# do not queue up if queue size begins to exceed 150
self.q_imStack.put( imX )
self.q_labelStack.put( texPoses[c,:] )
# fname = '__'+str(poses[c,0]) + '_' + str(poses[c,1]) + '_' + str(poses[c,2]) + '_' + str(poses[c,3]) + '_'
# cv2.imwrite( str(TrainRenderer.renderIndx)+'__'+str(i)+str(j)+fname+'.png', imX.astype('uint8') )
c = c + 1
else:
if self.queue_warning:
print 'q_imStack.qsize() > 150. Queue is filled, not retrieving the rendered data'
#
# Call caffe iteration (reads from q_imStack and q_labelStack)
# Possibly upgrade to TensorFlow
# self.learning_iteration()
# if( TrainRenderer.renderIndx > 50 ):
# return None
#
# Prep for Next Iteration
TrainRenderer.renderIndx = TrainRenderer.renderIndx + 1
self.prevPoses = poses
return task.cont
## Execute 1-step.
##
## This function is to be called from outside to render once. This is a wrapper for app.taskMgr.step()
def step(self, batchsize):
""" One rendering.
This function needs to be called from outside in a loop for continous rendering
Returns 2 variables. One im_batch and another label
"""
# ltimes = int( batchsize/16 ) + 1
# print 'Render ', ltimes, 'times'
# for x in range(ltimes):
        # Note: doing 2 renders back-to-back sometimes fails; not sure exactly why :'(
        # Instead, app.taskMgr.step() is called once in main() and once here. This seems to work OK
# self.taskMgr.step()
# Thread.sleep(0.1)
if self.q_imStack.qsize() < 16*5:
self.taskMgr.step()
# print 'Queues Status (imStack=%d,labelStack=%d)' %(self.q_imStack.qsize(), self.q_labelStack.qsize())
        # TODO: Check validity of batchsize. Also avoid hard-coding the threshold for not retrieving from the queue.
im_batch = np.zeros((batchsize,240,320,3))
label_batch = np.zeros((batchsize,4))
# assert self.q_imStack.qsize() > 16*5
if self.q_imStack.qsize() >= 16*5:
# get a batch out
for i in range(batchsize):
im = self.q_imStack.get() #240x320x3 RGB
y = self.q_labelStack.get()
                # print 'retrieve', i
#remember to z-normalize
im_batch[i,:,:,0] = copy.deepcopy(im[:,:,0])#self.zNormalized( copy.deepcopy(im[:,:,0]) )
im_batch[i,:,:,1] = copy.deepcopy(im[:,:,1])#self.zNormalized( copy.deepcopy(im[:,:,1]) )
im_batch[i,:,:,2] = copy.deepcopy(im[:,:,2])#self.zNormalized( copy.deepcopy(im[:,:,2]) )
label_batch[i,0] = copy.deepcopy( y[0] )
label_batch[i,1] = copy.deepcopy( y[1] )
label_batch[i,2] = copy.deepcopy( y[2] )
label_batch[i,3] = copy.deepcopy( y[3] )
else:
return None, None
f_im = 'im_batch.pickle'
f_lab = 'label_batch.pickle'
print 'Loading : ', f_im, f_lab
with open( f_im, 'rb' ) as handle:
im_batch = pickle.load(handle )
with open( f_lab, 'rb' ) as handle:
label_batch = pickle.load(handle )
print 'Done.@!'
# im_batch = copy.deepcopy( self.X_im_batch )
# # label_batch = copy.deepcopy( self.X_label_batch )
#
r0 = np.random.randint( 0, im_batch.shape[0], batchsize )
# r1 = np.random.randint( 0, im_batch.shape[0], batchsize )
im_batch = im_batch[r0]
label_batch = label_batch[r0]
# Note:
        # What is being done here is a bit of a hack. The main loop in
        # train_tf_decop.py does not allow any if statements, so a few
        # example renders are saved to a pickle file. If the queue is not
        # sufficiently filled, the batch is simply returned from the saved file.
return im_batch, label_batch
def __init__(self, queue_warning=True):
ShowBase.__init__(self)
self.taskMgr.add( self.renderNlearnTask, "renderNlearnTask" ) #changing camera poses
self.taskMgr.add( self.putAxesTask, "putAxesTask" ) #draw co-ordinate axis
self.taskMgr.add( self.putTrainingBox, "putTrainingBox" )
self.queue_warning = queue_warning #suppress the warning of queue full if this var is True
# Set up training area. This is used in monte_carlo_sample() and putTrainingBox()
self.mc_X_max = 300
self.mc_X_min = -300
self.mc_Y_max = 360
self.mc_Y_min = -360
self.mc_Z_max = 120
self.mc_Z_min = 45
self.mc_yaw_max = 60
self.mc_yaw_min = -60
self.mc_roll_max = 5
self.mc_roll_min = -5
self.mc_pitch_max = 5
self.mc_pitch_min = -5
# # Note params
# self.PARAM_TENSORBOARD_PREFIX = TENSORBOARD_PREFIX
# self.PARAM_MODEL_SAVE_PREFIX = MODEL_SAVE_PREFIX
# self.PARAM_MODEL_RESTORE = MODEL_RESTORE
#
# self.PARAM_WRITE_SUMMARY_EVERY = WRITE_SUMMARY_EVERY
# self.PARAM_WRITE_TF_MODEL_EVERY = WRITE_TF_MODEL_EVERY
# Misc Setup
self.render.setAntialias(AntialiasAttrib.MAuto)
self.setFrameRateMeter(True)
self.tcolor = TerminalColors.bcolors()
#
# Set up Mesh (including load, position, orient, scale)
self.setupMesh()
self.positionMesh()
# Custom Render
        # Important Note: self.render displays the low_res mesh and self.scene0 holds the images to retrieve
self.scene0 = NodePath("scene0")
# cytX = copy.deepcopy( cyt )
self.low_res.reparentTo(self.render)
self.cyt.reparentTo(self.scene0)
self.cyt2.reparentTo(self.scene0)
#
# Make Buffering Window
bufferProp = FrameBufferProperties().getDefault()
props = WindowProperties()
props.setSize(1280, 960)
win2 = self.graphicsEngine.makeOutput( pipe=self.pipe, name='wine1',
sort=-1, fb_prop=bufferProp , win_prop=props,
flags=GraphicsPipe.BFRequireWindow)
#flags=GraphicsPipe.BFRefuseWindow)
# self.window = win2#self.win #dr.getWindow()
self.win2 = win2
# self.win2.setupCopyTexture()
# Adopted from : https://www.panda3d.org/forums/viewtopic.php?t=3880
#
# Set Multiple Cameras
self.cameraList = []
for i in range(4*4):
print 'Create camera#', i
self.cameraList.append( self.customCamera( str(i) ) )
# Disable default camera
# dr = self.camNode.getDisplayRegion(0)
# dr.setActive(0)
#
# Set Display Regions (4x4)
dr_list = self.customDisplayRegion(4,4)
#
# Setup each camera
for i in range(len(dr_list)):
dr_list[i].setCamera( self.cameraList[i] )
#
# Set buffered Queues (to hold rendered images and their positions)
# each queue element will be an RGB image of size 240x320x3
self.q_imStack = Queue.Queue()
self.q_labelStack = Queue.Queue()
print self.tcolor.OKGREEN, '\n##########\n'+'Panda3d Renderer Initialization Complete'+'\n##########\n', self.tcolor.ENDC
class TestRenderer(ShowBase):
renderIndx=0
## Basic Mesh & Camera Setup
def loadAllTextures(self, mesh, basePath, silent=True):
""" Loads texture files for a mesh """
c = 0
for child in mesh.getChildren():
submesh_name = child.get_name()
submesh_texture = basePath + submesh_name[:-5] + 'tex0.jpg'
child.setTexture( self.loader.loadTexture(submesh_texture) )
if silent == False:
print 'Loading texture file : ', submesh_texture
c = c + 1
print self.tcolor.OKGREEN, "Loaded ", c, "textures", self.tcolor.ENDC
def setupMesh(self):
""" Loads the .obj files. Will load mesh sub-divisions separately """
        print 'Attempt Loading Mesh Vertices, Faces'
self.cyt = self.loader.loadModel( 'model_l/l6/level_6_0_0.obj' )
self.cyt2 = self.loader.loadModel( 'model_l/l6/level_6_128_0.obj' )
self.low_res = self.loader.loadModel( 'model_l/l0/level_0_0_0.obj' )
print self.tcolor.OKGREEN, 'Done Loading Vertices', self.tcolor.ENDC
print 'Attempt Loading Textures'
self.loadAllTextures( self.cyt, 'model_l/l6/')
self.loadAllTextures( self.cyt2, 'model_l/l6/')
self.loadAllTextures( self.low_res, 'model_l/l0/')
print self.tcolor.OKGREEN, 'Done Loading Textures', self.tcolor.ENDC
def positionMesh(self):
""" WIll have to manually adjust this for ur mesh. I position the
center where I fly my drone and oriented in ENU (East-north-up)
cords for easy alignment of GPS and my cordinates. If your model
is not metric scale will have to adjust for that too"""
self.cyt.setPos( 140,-450, 150 )
self.cyt2.setPos( 140,-450, 150 )
self.low_res.setPos( 140,-450, 150 )
self.cyt.setHpr( 198, -90, 0 )
self.cyt2.setHpr( 198, -90, 0 )
self.low_res.setHpr( 198, -90, 0 )
self.cyt.setScale(172)
self.cyt2.setScale(172)
self.low_res.setScale(172)
def customCamera(self, nameIndx):
lens = self.camLens
lens.setFov(83)
print 'self.customCamera : Set FOV at 83'
my_cam = Camera("cam"+nameIndx, lens)
my_camera = self.scene0.attachNewNode(my_cam)
# my_camera = self.render.attachNewNode(my_cam)
my_camera.setName("camera"+nameIndx)
return my_camera
def customDisplayRegion(self, rows, cols):
rSize = 1.0 / rows
cSize = 1.0 / cols
dr_list = []
for i in range(0,rows):
for j in range(0,cols):
# print i*rSize, (i+1)*rSize, j*cSize, (j+1)*cSize
dr_i = self.win2.makeDisplayRegion(i*rSize, (i+1)*rSize, j*cSize, (j+1)*cSize)
dr_i.setSort(-5)
dr_list.append( dr_i )
return dr_list
def monte_carlo_sample(self):
""" Gives a random 6-dof pose. Need to set params manually here.
X,Y,Z, Yaw(abt Z-axis), Pitch(abt X-axis), Roll(abt Y-axis) """
X = np.random.uniform(-50,50)
Y = np.random.uniform(-100,100)
Z = np.random.uniform(50,100)
yaw = np.random.uniform(-60,60)
roll = np.random.uniform(-5,5)
pitch = np.random.uniform(-5,5)
        return X,Y,Z, yaw,pitch,roll  # fixed return order to match the docstring (yaw, pitch, roll)
## Annotation-helpers for self.render
def putBoxes(self,X,Y,Z,r=1.,g=0.,b=0., scale=1.0):
cube_x = CubeMaker.CubeMaker().generate()
cube_x.setColor(r,g,b)
cube_x.setScale(scale)
cube_x.reparentTo(self.render)
cube_x.setPos(X,Y,Z)
def putAxesTask(self,task):
cube_x = CubeMaker.CubeMaker().generate()
cube_x.setColor(1.0,0.0,0.0)
cube_x.setScale(1)
cube_x.reparentTo(self.render)
cube_x.setPos(task.frame,0,0)
cube_y = CubeMaker.CubeMaker().generate()
cube_y.setColor(0.0,1.0,0.0)
cube_y.setScale(1)
cube_y.reparentTo(self.render)
cube_y.setPos(0,task.frame,0)
cube_z = CubeMaker.CubeMaker().generate()
cube_z.setColor(0.0,0.0,1.0)
cube_z.setScale(1)
cube_z.reparentTo(self.render)
cube_z.setPos(0,0,task.frame)
if task.time > 25:
return None
return task.cont
## Render-n-Learn task.
    ## Sets the camera position (1-cam only) along a spline of your choice (see init).
    ### Renders the image at that point and queues the image and its position (pose)
def renderNtestTask(self, task):
if task.frame < 50: #do not do anything for 50 ticks, as spline's 1st node is at t=50
return task.cont
# print randX, randY, randZ
t = task.frame
if t > self.spl_u.max():
print 'End of Spline, End task'
# fName = 'trace__' + self.pathGen.__name__ + '.npz'
# np.savez( fName, loss=self.loss_ary, gt=self.gt_ary, pred=self.pred_ary )
# print 'PathData File Written : ', fName
# print 'Visualize result : `python tools/analyse_path_trace_subplot.py', fName, '`'
return None
#
# set pose in each camera
        # Note: The texture is gridded images in column-major order
        # TODO : since it is going to be only 1 camera, eliminate this loop to simplify the code
poses = np.zeros( (len(self.cameraList), 4), dtype='float32' )
for i in range(len(self.cameraList)): #here usually # of cams will be 1 (for TestRenderer)
indx = TestRenderer.renderIndx
pt = interpolate.splev( t, self.spl_tck)
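            # splev evaluates the fitted path-spline at parameter t (the frame count),
            # giving pt = [X, Y, Z, Yaw] for the current point along the trajectory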
#randX,randY, randZ, randYaw, randPitch, randRoll = self.monte_carlo_sample()
randX = pt[0]
randY = pt[1]
randZ = pt[2]
randYaw = pt[3]
randPitch = 0
randRoll = 0
self.cameraList[i].setPos(randX,randY,randZ)
self.cameraList[i].setHpr(randYaw,-90+randPitch,0+randRoll)
poses[i,0] = randX
poses[i,1] = randY
poses[i,2] = randZ
poses[i,3] = randYaw
        # make note of the poses just set as they will take effect next frame
if TestRenderer.renderIndx == 0:
TestRenderer.renderIndx = TestRenderer.renderIndx + 1
# self.putBoxes(0,0,0, scale=100)
self.prevPoses = poses
return task.cont
#
        # Retrieve Rendered Data
tex = self.win2.getScreenshot()
# A = np.array(tex.getRamImageAs("RGB")).reshape(960,1280,3) #@#
A = np.array(tex.getRamImageAs("RGB")).reshape(240,320,3)
# A = np.zeros((960,1280,3))
# A_bgr = cv2.cvtColor(A.astype('uint8'),cv2.COLOR_RGB2BGR)
# cv2.imwrite( str(TestRenderer.renderIndx)+'.png', A_bgr.astype('uint8') )
# myTexture = self.win2.getTexture()
# print myTexture
        # retrieve poses from the previous render
texPoses = self.prevPoses
#
        # Cut rendered data into an individual image. Here the render is a single image
        #240 rows and 320 cols (1x1 image-grid)
nRows = 240
nCols = 320
# Iterate over the rendered texture in a col-major format
c=0
        # TODO : Eliminate these two loops as we know there is only 1 display region
#if self.q_imStack.qsize() < 150: #no limit on queue size
# for j in range(1): #j is for cols-indx
# for i in range(1): #i is for rows-indx
i=0
j=0
#print i*nRows, j*nCols, (i+1)*nRows, (j+1)*nCols
im = A[i*nRows:(i+1)*nRows,j*nCols:(j+1)*nCols,:]
#imX = im.astype('float32')/255. - .5 # TODO: have a mean image
#imX = (im.astype('float32') - 128.0) /128.
imX = im.astype('float32') #- self.meanImage
        # Put imX into the queue (no queue-size limit is enforced here, see above)
self.q_imStack.put( imX )
self.q_labelStack.put( texPoses[c,:] )
self.putBoxes( texPoses[c,0], texPoses[c,1], texPoses[c,2] )
# print 'putBoxes', texPoses[c,0], texPoses[c,1], texPoses[c,2]
# fname = '__'+str(poses[c,0]) + '_' + str(poses[c,1]) + '_' + str(poses[c,2]) + '_' + str(poses[c,3]) + '_'
# cv2.imwrite( str(TestRenderer.renderIndx)+'__'+str(i)+str(j)+fname+'.png', imX.astype('uint8') )
c = c + 1
#
# Prep for Next Iteration
TestRenderer.renderIndx = TestRenderer.renderIndx + 1
self.prevPoses = poses
# if( TestRenderer.renderIndx > 5 ):
# return None
return task.cont
def step(self):
self.taskMgr.step()
# print 'Queues Status (imStack=%d,labelStack=%d)' %(self.q_imStack.qsize(), self.q_labelStack.qsize())
# Dequeue 1 elements
if self.q_imStack.qsize() > 2: # Do not dequeue if the queue size is less than 2
im = copy.deepcopy( self.q_imStack.get() ) #240x320x3 RGB
y = copy.deepcopy( self.q_labelStack.get() )
return im, y
else:
return None, None
def __init__(self, pathGen=None ):
ShowBase.__init__(self)
self.taskMgr.add( self.renderNtestTask, "renderNtestTask" ) #changing camera poses
self.taskMgr.add( self.putAxesTask, "putAxesTask" ) #draw co-ordinate axis
# Misc Setup
self.render.setAntialias(AntialiasAttrib.MAuto)
self.setFrameRateMeter(True)
self.tcolor = TerminalColors.bcolors()
#
# Set up Mesh (including load, position, orient, scale)
self.setupMesh()
self.positionMesh()
# Custom Render
# Important Note: self.render displays the low_res and self.scene0 is the images to retrive
self.scene0 = NodePath("scene0")
# cytX = copy.deepcopy( cyt )
self.low_res.reparentTo(self.render)
self.cyt.reparentTo(self.scene0)
self.cyt2.reparentTo(self.scene0)
#
# Make Buffering Window
bufferProp = FrameBufferProperties().getDefault()
props = WindowProperties()
# props.setSize(1280, 960)
props.setSize(320, 240) #@#
win2 = self.graphicsEngine.makeOutput( pipe=self.pipe, name='wine1',
sort=-1, fb_prop=bufferProp , win_prop=props,
flags=GraphicsPipe.BFRequireWindow)
#flags=GraphicsPipe.BFRefuseWindow)
# self.window = win2#self.win #dr.getWindow()
self.win2 = win2
# self.win2.setupCopyTexture()
# Adopted from : https://www.panda3d.org/forums/viewtopic.php?t=3880
#
# Set Multiple Cameras
self.cameraList = []
# for i in range(4*4):
for i in range(1*1): #@#
print 'Create camera#', i
self.cameraList.append( self.customCamera( str(i) ) )
# Disable default camera
# dr = self.camNode.getDisplayRegion(0)
# dr.setActive(0)
#
# Set Display Regions (4x4)
dr_list = self.customDisplayRegion(1,1)
#
# Setup each camera
for i in range(len(dr_list)):
dr_list[i].setCamera( self.cameraList[i] )
#
# Set buffered Queues (to hold rendered images and their positions)
# each queue element will be an RGB image of size 240x320x3
self.q_imStack = Queue.Queue()
self.q_labelStack = Queue.Queue()
#
# Setting up Splines
        # Note: interpolation starts at t=50 (the task waits 50 frames before sampling the spline)
if pathGen is None:
# self.pathGen = PathMaker.PathMaker().path_flat_h
# self.pathGen = PathMaker.PathMaker().path_smallM
# self.pathGen = PathMaker.PathMaker().path_yaw_only
# self.pathGen = PathMaker.PathMaker().path_bigM
# self.pathGen = PathMaker.PathMaker().path_flat_spiral
# self.pathGen = PathMaker.PathMaker().path_helix
# self.pathGen = PathMaker.PathMaker().path_like_real
# self.pathGen = PathMaker.PathMaker().path_like_real2
self.pathGen = PathMaker.PathMaker().path_large_loop
else:
self.pathGen = pathGen
t,X = self.pathGen()
self.spl_tck, self.spl_u = interpolate.splprep(X.T, u=t.T, s=0.0, per=1)
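        # splprep fits an interpolating (s=0.0), periodic (per=1) B-spline through the
        # waypoints X, parameterized by the times t; renderNtestTask later evaluates it
        # with interpolate.splev to move the camera smoothly along this path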
print 'Test Renderer Init Done'
print self.tcolor.OKGREEN, 'Test Renderer Init Done', self.tcolor.ENDC
# Setup NetVLAD Renderer - This renderer is custom made for NetVLAD training
# It renders 16 images at a time. (q, (P1,P2,..P5), (N1,N2,...,N10)).
# i.e. the 1st image is the query im, the next 5 are near it (potential positives),
# and the last 10 are far from it (definite negatives)
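# A sketch of the batch layout this implies (assumed interpretation of the
# sampling in renderNlearnTask below):
#   images[0]     -> query q                 (monte_carlo_sample)
#   images[1:6]   -> P1..P5, near q's pose   (mc_near)
#   images[6:16]  -> N1..N10, far from q     (mc_far)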
class NetVLADRenderer(ShowBase):
renderIndx=0
# Basic Mesh & Camera Setup
def loadAllTextures(self, mesh, basePath, silent=True):
""" Loads texture files for a mesh """
c = 0
for child in mesh.getChildren():
submesh_name = child.get_name()
submesh_texture = basePath + submesh_name[:-5] + 'tex0.jpg'
child.setTexture( self.loader.loadTexture(submesh_texture) )
if silent == False:
print 'Loading texture file : ', submesh_texture
c = c + 1
print self.tcolor.OKGREEN, "Loaded ", c, "textures", self.tcolor.ENDC
def setupMesh(self):
""" Loads the .obj files. Will load mesh sub-divisions separately """
        print 'Attempt Loading Mesh Vertices, Faces'
self.cyt = self.loader.loadModel( 'model_l/l6/level_6_0_0.obj' )
self.cyt2 = self.loader.loadModel( 'model_l/l6/level_6_128_0.obj' )
self.low_res = self.loader.loadModel( 'model_l/l3/level_3_0_0.obj' )
print self.tcolor.OKGREEN, 'Done Loading Vertices', self.tcolor.ENDC
print 'Attempt Loading Textures'
self.loadAllTextures( self.cyt, 'model_l/l6/')
self.loadAllTextures( self.cyt2, 'model_l/l6/')
self.loadAllTextures( self.low_res, 'model_l/l3/')
print self.tcolor.OKGREEN, 'Done Loading Textures', self.tcolor.ENDC
def positionMesh(self):
""" WIll have to manually adjust this for ur mesh. I position the
center where I fly my drone and oriented in ENU (East-north-up)
cords for easy alignment of GPS and my cordinates. If your model
is not metric scale will have to adjust for that too"""
self.cyt.setPos( 140,-450, 150 )
self.cyt2.setPos( 140,-450, 150 )
self.low_res.setPos( 140,-450, 150 )
self.cyt.setHpr( 198, -90, 0 )
self.cyt2.setHpr( 198, -90, 0 )
self.low_res.setHpr( 198, -90, 0 )
self.cyt.setScale(172)
self.cyt2.setScale(172)
self.low_res.setScale(172)
def customCamera(self, nameIndx):
lens = self.camLens
lens.setFov(83)
print 'self.customCamera : Set FOV at 83'
my_cam = Camera("cam"+nameIndx, lens)
my_camera = self.scene0.attachNewNode(my_cam)
# my_camera = self.render.attachNewNode(my_cam)
my_camera.setName("camera"+nameIndx)
return my_camera
def customDisplayRegion(self, rows, cols):
rSize = 1.0 / rows
cSize = 1.0 / cols
dr_list = []
for i in range(0,rows):
for j in range(0,cols):
# print i*rSize, (i+1)*rSize, j*cSize, (j+1)*cSize
dr_i = self.win2.makeDisplayRegion(i*rSize, (i+1)*rSize, j*cSize, (j+1)*cSize)
dr_i.setSort(-5)
dr_list.append( dr_i )
return dr_list
def mc_default( self, cam0X, cam0Y, cam0Z ):
return 0,0,80,0,0,0
def mc_far( self, cam0X, cam0Y, cam0Z ):
rf = np.random.uniform
nZ = rf(self.mc_Z_min,self.mc_Z_max)
fov = 1.3962 #80 degrees
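        # Geometry note (interpretation): at height cam0Z the half-extent of the
        # visible ground footprint is roughly cam0Z*tan(fov/2); sigma is a third of
        # that, so sampling beyond 2*sigma below keeps the new pose largely outside
        # the query camera's view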
sigma = cam0Z * np.tan(fov/2.)/3
nX = rf(self.mc_X_min, cam0X - 2*sigma) if rf(-1,1) > 0 else rf(cam0X + 2*sigma, self.mc_X_max )
nY = rf(self.mc_Y_min, cam0Y - 2*sigma) if rf(-1,1) > 0 else rf(cam0Y + 2*sigma, self.mc_Y_max )
yaw = rf(self.mc_yaw_min, self.mc_yaw_max)
return nX, nY, nZ, yaw, 0 , 0
# Return a random sample near (cam0X,cam0Y,cam0Z)
def mc_near( self, cam0X, cam0Y, cam0Z ):
rf = np.random.uniform
nZ = rf(self.mc_Z_min,self.mc_Z_max)
fov = 1.3962 #80 degrees
sigma = cam0Z * np.tan(fov/2.)/3
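        # Same footprint-derived sigma as in mc_far; sampling within +/- sigma of
        # the query pose keeps the new view overlapping it (a potential positive)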
yaw = rf(self.mc_yaw_min, self.mc_yaw_max)
return rf(cam0X-sigma,cam0X+sigma), rf(cam0Y-sigma,cam0Y+sigma),nZ,yaw,0.,0.
    ## Gives a random 6-dof pose. Need to set params manually here.
    ## X,Y,Z, Yaw(abt Z-axis), Pitch(abt X-axis), Roll(abt Y-axis)
    ## Takes no parameters.
def monte_carlo_sample(self):
# mc_X_min etc are set in constructor
X = np.random.uniform(self.mc_X_min,self.mc_X_max)
Y = np.random.uniform(self.mc_Y_min,self.mc_Y_max)
Z = np.random.uniform(self.mc_Z_min,self.mc_Z_max)
yaw = np.random.uniform( self.mc_yaw_min, self.mc_yaw_max)
roll = 0#np.random.uniform( self.mc_roll_min, self.mc_roll_max)
pitch = 0#np.random.uniform( self.mc_pitch_min, self.mc_pitch_max)
        return X,Y,Z, yaw,pitch,roll  # fixed return order to match the doc comment and callers (yaw, pitch, roll)
## Annotation-helpers for self.render
def putBoxes(self,X,Y,Z,r=1.,g=0.,b=0., scale=1.0):
cube_x = CubeMaker.CubeMaker().generate()
cube_x.setColor(r,g,b)
cube_x.setScale(scale)
cube_x.reparentTo(self.render)
cube_x.setPos(X,Y,Z)
## Set a cube in 3d env
def putTrainingBox(self,task):
cube = CubeMaker.CubeMaker().generate()
cube.setTransparency(TransparencyAttrib.MAlpha)
cube.setAlphaScale(0.5)
# cube.setScale(10)
# mc_X_min etc are set in constructor
sx = 0.5 * (self.mc_X_max - self.mc_X_min)
sy = 0.5 * (self.mc_Y_max - self.mc_Y_min)
sz = 0.5 * (self.mc_Z_max - self.mc_Z_min)
ax = 0.5 * (self.mc_X_max + self.mc_X_min)
ay = 0.5 * (self.mc_Y_max + self.mc_Y_min)
az = 0.5 * (self.mc_Z_max + self.mc_Z_min)
cube.setSx(sx)
cube.setSy(sy)
cube.setSz(sz)
cube.reparentTo(self.render)
cube.setPos(ax,ay,az)
    ## Task. This task draws the XYZ axes
def putAxesTask(self,task):
if (task.frame / 10) % 2 == 0:
cube_x = CubeMaker.CubeMaker().generate()
cube_x.setColor(1.0,0.0,0.0)
cube_x.setScale(1)
cube_x.reparentTo(self.render)
cube_x.setPos(task.frame,0,0)
cube_y = CubeMaker.CubeMaker().generate()
cube_y.setColor(0.0,1.0,0.0)
cube_y.setScale(1)
cube_y.reparentTo(self.render)
cube_y.setPos(0,task.frame,0)
cube_z = CubeMaker.CubeMaker().generate()
cube_z.setColor(0.0,0.0,1.0)
cube_z.setScale(1)
cube_z.reparentTo(self.render)
cube_z.setPos(0,0,task.frame)
if task.time > 25:
return None
return task.cont
## Render-n-Learn task
##
## set pose in each camera <br/>
    ## make note of the poses just set as they will take effect next frame <br/>
    ## Retrieve Rendered Data <br/>
    ## Cut rendered data into individual images. Note the rendered data will be a 4x4 grid of images <br/>
## Put imX into the queue <br/>
def renderNlearnTask(self, task):
if task.time < 2: #do not do anything for 1st 2 sec
return task.cont
# print randX, randY, randZ
#
## set pose in each camera
        # Note: The texture is gridded images in column-major order
poses = np.zeros( (len(self.cameraList), 4), dtype='float32' )
_randX= _randY= _randZ= _randYaw= _randPitch= _randRoll = 0
for i in range(len(self.cameraList)):
if i==0:
_randX,_randY, _randZ, _randYaw, _randPitch, _randRoll = self.monte_carlo_sample()
(randX,randY, randZ, randYaw, randPitch, randRoll) = _randX, _randY, _randZ, _randYaw, _randPitch, _randRoll
elif i>=1 and i<6:
randX,randY, randZ, randYaw, randPitch, randRoll = self.mc_near(_randX, _randY, _randZ )
else:
randX,randY, randZ, randYaw, randPitch, randRoll = self.mc_far(_randX, _randY, _randZ)
self.cameraList[i].setPos(randX,randY,randZ)
self.cameraList[i].setHpr(randYaw,-90+randPitch,0+randRoll)
poses[i,0] = randX
poses[i,1] = randY
poses[i,2] = randZ
poses[i,3] = randYaw
# self.putBoxes(randX,randY,randZ, scale=0.3)
#
# if task.frame < 100:
# return task.cont
# else:
# return None
        ## make note of the poses just set as they will take effect next frame
if NetVLADRenderer.renderIndx == 0:
NetVLADRenderer.renderIndx = NetVLADRenderer.renderIndx + 1
self.prevPoses = poses
return task.cont
#
        ## Retrieve Rendered Data
tex = self.win2.getScreenshot()
A = np.array(tex.getRamImageAs("RGB")).reshape(960,1280,3)
# A = np.zeros((960,1280,3))
# A_bgr = cv2.cvtColor(A.astype('uint8'),cv2.COLOR_RGB2BGR)
# cv2.imwrite( str(TrainRenderer.renderIndx)+'.png', A_bgr.astype('uint8') )
# myTexture = self.win2.getTexture()
# print myTexture
        # retrieve poses from the previous render
texPoses = self.prevPoses
#
        ## Cut rendered data into individual images. Note the rendered data will be a 4x4 grid of images
#960 rows and 1280 cols (4x4 image-grid)
nRows = 240
nCols = 320
# Iterate over the rendered texture in a col-major format
c=0
if self.q_imStack.qsize() < 150:
for j in range(4): #j is for cols-indx
for i in range(4): #i is for rows-indx
#print i*nRows, j*nCols, (i+1)*nRows, (j+1)*nCols
im = A[i*nRows:(i+1)*nRows,j*nCols:(j+1)*nCols,:]
#imX = im.astype('float32')/255. - .5 # TODO: have a mean image
#imX = (im.astype('float32') - 128.0) /128.
imX = im.astype('float32') #- self.meanImage
# print 'Noise Added to renderedIm'
# imX = imX + 10.*np.random.randn( imX.shape[0], imX.shape[1], imX.shape[2] )
## Put imX into the queue
                    # do not queue up if queue size begins to exceed 150
self.q_imStack.put( imX )
self.q_labelStack.put( texPoses[c,:] )
# fname = '__'+str(poses[c,0]) + '_' + str(poses[c,1]) + '_' + str(poses[c,2]) + '_' + str(poses[c,3]) + '_'
# cv2.imwrite( str(TrainRenderer.renderIndx)+'__'+str(i)+str(j)+fname+'.png', imX.astype('uint8') )
c = c + 1
else:
            print 'q_imStack.qsize() > 150. Queue is filled, not retrieving the rendered data'
#
# Call caffe iteration (reads from q_imStack and q_labelStack)
# Possibly upgrade to TensorFlow
# self.learning_iteration()
# if( TrainRenderer.renderIndx > 50 ):
# return None
#
# Prep for Next Iteration
NetVLADRenderer.renderIndx = NetVLADRenderer.renderIndx + 1
self.prevPoses = poses
return task.cont
## Execute 1-step.
##
## This function is to be called from outside to render once. This is a wrapper for app.taskMgr.step()
def step(self, batchsize):
""" One rendering.
This function needs to be called from outside in a loop for continous rendering
Returns 2 variables. One im_batch and another label
"""
# ltimes = int( batchsize/16 ) + 1
# print 'Render ', ltimes, 'times'
# for x in range(ltimes):
        # Note: doing 2 renders back-to-back sometimes fails; not sure exactly why :'(
        # Instead, app.taskMgr.step() is called once in main() and once here. This seems to work OK
# self.taskMgr.step()
# Thread.sleep(0.1)
self.taskMgr.step()
# print 'Queues Status (imStack=%d,labelStack=%d)' %(self.q_imStack.qsize(), self.q_labelStack.qsize())
        # TODO: Check validity of batchsize. Also avoid hard-coding the threshold for not retrieving from the queue.
im_batch = np.zeros((batchsize,240,320,3))
label_batch = np.zeros((batchsize,4))
# assert self.q_imStack.qsize() > 16*5
if self.q_imStack.qsize() >= 16*5:
# get a batch out
for i in range(batchsize):
im = self.q_imStack.get() #240x320x3 RGB
y = self.q_labelStack.get()
                # print 'retrieve', i
#remember to z-normalize
im_batch[i,:,:,0] = copy.deepcopy(im[:,:,0])#self.zNormalized( copy.deepcopy(im[:,:,0]) )
im_batch[i,:,:,1] = copy.deepcopy(im[:,:,1])#self.zNormalized( copy.deepcopy(im[:,:,1]) )
im_batch[i,:,:,2] = copy.deepcopy(im[:,:,2])#self.zNormalized( copy.deepcopy(im[:,:,2]) )
label_batch[i,0] = copy.deepcopy( y[0] )
label_batch[i,1] = copy.deepcopy( y[1] )
label_batch[i,2] = copy.deepcopy( y[2] )
label_batch[i,3] = copy.deepcopy( y[3] )
else:
return None, None
f_im = 'im_batch.pickle'
f_lab = 'label_batch.pickle'
print 'Loading : ', f_im, f_lab
with open( f_im, 'rb' ) as handle:
im_batch = pickle.load(handle )
with open( f_lab, 'rb' ) as handle:
label_batch = pickle.load(handle )
print 'Done.@!'
# im_batch = copy.deepcopy( self.X_im_batch )
# # label_batch = copy.deepcopy( self.X_label_batch )
#
r0 = np.random.randint( 0, im_batch.shape[0], batchsize )
# r1 = np.random.randint( 0, im_batch.shape[0], batchsize )
im_batch = im_batch[r0]
label_batch = label_batch[r0]
# Note:
        # What is being done here is a bit of a hack. The main loop in
        # train_tf_decop.py does not allow any if statements, so a few
        # example renders are saved to a pickle file. If the queue is not
        # sufficiently filled, the batch is simply returned from the saved file.
return im_batch, label_batch
def __init__(self):
ShowBase.__init__(self)
self.taskMgr.add( self.renderNlearnTask, "renderNlearnTask" ) #changing camera poses
self.taskMgr.add( self.putAxesTask, "putAxesTask" ) #draw co-ordinate axis
self.taskMgr.add( self.putTrainingBox, "putTrainingBox" )
# Set up training area. This is used in monte_carlo_sample() and putTrainingBox()
self.mc_X_max = 300
self.mc_X_min = -300
self.mc_Y_max = 360
self.mc_Y_min = -360
self.mc_Z_max = 120
self.mc_Z_min = 45
self.mc_yaw_max = 85
self.mc_yaw_min = -85
self.mc_roll_max = 5
self.mc_roll_min = -5
self.mc_pitch_max = 5
self.mc_pitch_min = -5
# # Note params
# self.PARAM_TENSORBOARD_PREFIX = TENSORBOARD_PREFIX
# self.PARAM_MODEL_SAVE_PREFIX = MODEL_SAVE_PREFIX
# self.PARAM_MODEL_RESTORE = MODEL_RESTORE
#
# self.PARAM_WRITE_SUMMARY_EVERY = WRITE_SUMMARY_EVERY
# self.PARAM_WRITE_TF_MODEL_EVERY = WRITE_TF_MODEL_EVERY
# Misc Setup
self.render.setAntialias(AntialiasAttrib.MAuto)
self.setFrameRateMeter(True)
self.tcolor = TerminalColors.bcolors()
#
# Set up Mesh (including load, position, orient, scale)
self.setupMesh()
self.positionMesh()
# Custom Render
        # Important Note: self.render displays the low_res mesh and self.scene0 holds the images to retrieve
self.scene0 = NodePath("scene0")
# cytX = copy.deepcopy( cyt )
self.low_res.reparentTo(self.render)
self.cyt.reparentTo(self.scene0)
self.cyt2.reparentTo(self.scene0)
#
# Make Buffering Window
bufferProp = FrameBufferProperties().getDefault()
props = WindowProperties()
props.setSize(1280, 960)
win2 = self.graphicsEngine.makeOutput( pipe=self.pipe, name='wine1',
sort=-1, fb_prop=bufferProp , win_prop=props,
flags=GraphicsPipe.BFRequireWindow)
#flags=GraphicsPipe.BFRefuseWindow)
# self.window = win2#self.win #dr.getWindow()
self.win2 = win2
# self.win2.setupCopyTexture()
# Adopted from : https://www.panda3d.org/forums/viewtopic.php?t=3880
#
# Set Multiple Cameras
self.cameraList = []
for i in range(4*4):
print 'Create camera#', i
self.cameraList.append( self.customCamera( str(i) ) )
# Disable default camera
# dr = self.camNode.getDisplayRegion(0)
# dr.setActive(0)
#
# Set Display Regions (4x4)
dr_list = self.customDisplayRegion(4,4)
#
# Setup each camera
for i in range(len(dr_list)):
dr_list[i].setCamera( self.cameraList[i] )
#
# Set buffered Queues (to hold rendered images and their positions)
# each queue element will be an RGB image of size 240x320x3
self.q_imStack = Queue.Queue()
self.q_labelStack = Queue.Queue()
print self.tcolor.OKGREEN, '\n##########\n'+'Panda3d Renderer Initialization Complete'+'\n##########\n', self.tcolor.ENDC
| 33.884699 | 129 | 0.587048 | 6,258 | 46,727 | 4.276127 | 0.10179 | 0.019731 | 0.002354 | 0.007399 | 0.886024 | 0.872422 | 0.857661 | 0.851046 | 0.847683 | 0.845329 | 0 | 0.037244 | 0.298393 | 46,727 | 1,378 | 130 | 33.909289 | 0.779008 | 0.291159 | 0 | 0.858954 | 0 | 0 | 0.052762 | 0.007904 | 0 | 0 | 0 | 0.003628 | 0 | 0 | null | null | 0 | 0.025357 | null | null | 0.055468 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ba52948517c89b86df15d3df00ea580c1bfa05b7 | 14,358 | py | Python | dataset.py | hantek/SelfAttentiveSentEmbed | d3f8b798d195fea59efd129c778ee65d15d20b93 | [
"Apache-2.0"
] | 56 | 2017-10-10T10:54:00.000Z | 2022-01-17T03:16:56.000Z | dataset.py | hantek/SelfAttentiveSentEmbed | d3f8b798d195fea59efd129c778ee65d15d20b93 | [
"Apache-2.0"
] | 3 | 2018-01-11T06:13:27.000Z | 2018-06-26T21:34:47.000Z | dataset.py | hantek/SelfAttentiveSentEmbed | d3f8b798d195fea59efd129c778ee65d15d20b93 | [
"Apache-2.0"
] | 14 | 2017-11-03T09:24:07.000Z | 2021-08-11T07:36:11.000Z | import os
import cPickle
import theano
import numpy
import warnings
import pdb
class SNLI(object):
def __init__(self, batch_size=50, loadall=False,
datapath="/home/hantek/datasets/SNLI_GloVe_converted"):
self.batch_size = batch_size
self.datapath = datapath
data_file = open(self.datapath, 'rb')
cPickle.load(data_file)
cPickle.load(data_file)
self.train_set, self.dev_set, self.test_set = cPickle.load(data_file)
self.weight = cPickle.load(data_file).astype(theano.config.floatX)
if loadall:
self.word2embed = cPickle.load(data_file) # key: word, value: embedding
self.word2num = cPickle.load(data_file) # key: word, value: number
self.num2word = cPickle.load(data_file) # key: number, value: word
data_file.close()
self.train_size = len(self.train_set)
self.dev_size = len(self.dev_set)
self.test_size = len(self.test_set)
self.train_ptr = 0
self.dev_ptr = 0
self.test_ptr = 0
def train_minibatch_generator(self):
while self.train_ptr <= self.train_size - self.batch_size:
self.train_ptr += self.batch_size
minibatch = self.train_set[self.train_ptr - self.batch_size : self.train_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo, longest_premise = \
numpy.max(map(lambda x: (len(x[0]), len(x[1])), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
premises = numpy.zeros((self.batch_size, longest_premise), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
mask_premises = numpy.zeros((self.batch_size, longest_premise), dtype='int32')
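            # Zero-pad each hypothesis/premise to the longest one in this minibatch and
            # build binary masks (1 = real token, 0 = padding) so models can ignore pads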
for i, (h, p, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
premises[i, :len(p)] = p
mask_premises[i, :len(p)] = (1,) * len(p)
truth[i] = t
yield hypos, mask_hypos, premises, mask_premises, truth
else:
self.train_ptr = 0
raise StopIteration
def dev_minibatch_generator(self, ):
while self.dev_ptr <= self.dev_size - self.batch_size:
self.dev_ptr += self.batch_size
minibatch = self.dev_set[self.dev_ptr - self.batch_size : self.dev_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo, longest_premise = \
numpy.max(map(lambda x: (len(x[0]), len(x[1])), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
premises = numpy.zeros((self.batch_size, longest_premise), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
mask_premises = numpy.zeros((self.batch_size, longest_premise), dtype='int32')
for i, (h, p, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
premises[i, :len(p)] = p
mask_premises[i, :len(p)] = (1,) * len(p)
truth[i] = t
yield hypos, mask_hypos, premises, mask_premises, truth
else:
self.dev_ptr = 0
raise StopIteration
def test_minibatch_generator(self, ):
while self.test_ptr <= self.test_size - self.batch_size:
self.test_ptr += self.batch_size
minibatch = self.test_set[self.test_ptr - self.batch_size : self.test_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo, longest_premise = \
numpy.max(map(lambda x: (len(x[0]), len(x[1])), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
premises = numpy.zeros((self.batch_size, longest_premise), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
mask_premises = numpy.zeros((self.batch_size, longest_premise), dtype='int32')
for i, (h, p, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
premises[i, :len(p)] = p
mask_premises[i, :len(p)] = (1,) * len(p)
truth[i] = t
yield hypos, mask_hypos, premises, mask_premises, truth
else:
self.test_ptr = 0
raise StopIteration
class SICK(SNLI):
def __init__(self, batch_size=50, loadall=False, augment=False,
datapath="/Users/johanlin/Datasets/SICK/"):
self.batch_size = batch_size
if augment:
self.datapath = datapath + os.sep + 'SICK_augmented.pkl'
else:
self.datapath = datapath + os.sep + 'SICK.pkl'
super(SICK, self).__init__(batch_size, loadall, self.datapath)
class YELP(object):
def __init__(self, batch_size=50, loadall=False,
datapath="/home/hantek/datasets/NLC_data/yelp/yelp.pkl"):
self.batch_size = batch_size
self.datapath = datapath
data_file = open(self.datapath, 'rb')
cPickle.load(data_file)
cPickle.load(data_file)
self.train_set, self.dev_set, self.test_set = cPickle.load(data_file)
self.weight = cPickle.load(data_file).astype(theano.config.floatX)
if loadall:
self.word2embed = cPickle.load(data_file) # key: word, value: embedding
self.word2num = cPickle.load(data_file) # key: word, value: number
self.num2word = cPickle.load(data_file) # key: number, value: word
data_file.close()
self.train_size = len(self.train_set)
self.dev_size = len(self.dev_set)
self.test_size = len(self.test_set)
self.train_ptr = 0
self.dev_ptr = 0
self.test_ptr = 0
def train_minibatch_generator(self):
while self.train_ptr <= self.train_size - self.batch_size:
self.train_ptr += self.batch_size
minibatch = self.train_set[self.train_ptr - self.batch_size : self.train_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo = numpy.max(map(lambda x: len(x[0]), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
for i, (h, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
truth[i] = t
yield hypos, mask_hypos, truth
else:
self.train_ptr = 0
raise StopIteration
def dev_minibatch_generator(self, ):
while self.dev_ptr <= self.dev_size - self.batch_size:
self.dev_ptr += self.batch_size
minibatch = self.dev_set[self.dev_ptr - self.batch_size : self.dev_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo = numpy.max(map(lambda x: len(x[0]), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
for i, (h, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
truth[i] = t
yield hypos, mask_hypos, truth
else:
self.dev_ptr = 0
raise StopIteration
def test_minibatch_generator(self, ):
while self.test_ptr <= self.test_size - self.batch_size:
self.test_ptr += self.batch_size
minibatch = self.test_set[self.test_ptr - self.batch_size : self.test_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo = numpy.max(map(lambda x: len(x[0]), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
for i, (h, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
truth[i] = t
yield hypos, mask_hypos, truth
else:
self.test_ptr = 0
raise StopIteration
class AGE2(YELP):
def __init__(self, batch_size=50, loadall=False,
datapath="/home/hantek/datasets/NLC_data/age2/age2.pkl"):
super(AGE2, self).__init__(batch_size, loadall, datapath)
class STANFORDSENTIMENTTREEBANK(object):
def __init__(self, batch_size=50, loadext=False, loadhelper=False, wordembed='word2vec',
datapath="/home/hantek/datasets/StanfordSentimentTreebank"):
self.batch_size = batch_size
self.datapath = datapath
save_file = open(self.datapath + os.sep + 'sst_' + wordembed + '.pkl', 'rb')
cPickle.load(save_file)
self.train_set, self.dev_set, self.test_set = cPickle.load(save_file)
self.weight = cPickle.load(save_file).astype(theano.config.floatX)
save_file.close()
if loadext == True:
save_file_ext = open(self.datapath + os.sep + 'sst_' + wordembed + '_ext.pkl', 'rb')
train_set, dev_set, test_set = cPickle.load(save_file_ext)
self.train_set += train_set
self.dev_set += dev_set
self.test_set += test_set
save_file_ext.close()
if loadhelper == True:
helper = open(self.datapath + os.sep + 'sst_' + wordembed + '_helper.pkl', 'rb')
self.word2embed = cPickle.load(helper) # key: word, value: embedding
self.word2num = cPickle.load(helper) # key: word, value: number
self.num2word = cPickle.load(helper) # key: number, value: word
helper.close()
self.train_size = len(self.train_set)
self.dev_size = len(self.dev_set)
self.test_size = len(self.test_set)
self.train_ptr = 0
self.dev_ptr = 0
self.test_ptr = 0
def train_minibatch_generator(self):
while self.train_ptr <= self.train_size - self.batch_size:
self.train_ptr += self.batch_size
minibatch = self.train_set[self.train_ptr - self.batch_size : self.train_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo = numpy.max(map(lambda x: len(x[0]), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
for i, (h, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
truth[i] = t
yield hypos, mask_hypos, truth
else:
self.train_ptr = 0
raise StopIteration
def dev_minibatch_generator(self, ):
while self.dev_ptr <= self.dev_size - self.batch_size:
self.dev_ptr += self.batch_size
minibatch = self.dev_set[self.dev_ptr - self.batch_size : self.dev_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo = numpy.max(map(lambda x: len(x[0]), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
for i, (h, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
truth[i] = t
yield hypos, mask_hypos, truth
else:
self.dev_ptr = 0
raise StopIteration
def test_minibatch_generator(self, ):
while self.test_ptr <= self.test_size - self.batch_size:
self.test_ptr += self.batch_size
minibatch = self.test_set[self.test_ptr - self.batch_size : self.test_ptr]
if len (minibatch) < self.batch_size:
warnings.warn("There will be empty slots in minibatch data.", UserWarning)
longest_hypo = numpy.max(map(lambda x: len(x[0]), minibatch), axis=0)
hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
truth = numpy.zeros((self.batch_size,), dtype='int32')
mask_hypos = numpy.zeros((self.batch_size, longest_hypo), dtype='int32')
for i, (h, t) in enumerate(minibatch):
hypos[i, :len(h)] = h
mask_hypos[i, :len(h)] = (1,) * len(h)
truth[i] = t
yield hypos, mask_hypos, truth
else:
self.test_ptr = 0
raise StopIteration
| 43.246988 | 96 | 0.580582 | 1,826 | 14,358 | 4.374042 | 0.059146 | 0.094654 | 0.126956 | 0.078503 | 0.928884 | 0.907725 | 0.893327 | 0.877426 | 0.857268 | 0.852761 | 0 | 0.014012 | 0.299136 | 14,358 | 331 | 97 | 43.377644 | 0.779688 | 0.016228 | 0 | 0.846442 | 0 | 0 | 0.060007 | 0.014665 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052434 | false | 0 | 0.022472 | 0 | 0.093633 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ba5d88c1b5cc845d1eeeb493b01d6b7653cb09f2 | 6,683 | py | Python | loldib/getratings/models/NA/na_kindred/na_kindred_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_kindred/na_kindred_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_kindred/na_kindred_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | from getratings.models.ratings import Ratings
class NA_Kindred_Jng_Aatrox(Ratings):
pass
class NA_Kindred_Jng_Ahri(Ratings):
pass
class NA_Kindred_Jng_Akali(Ratings):
pass
class NA_Kindred_Jng_Alistar(Ratings):
pass
class NA_Kindred_Jng_Amumu(Ratings):
pass
class NA_Kindred_Jng_Anivia(Ratings):
pass
class NA_Kindred_Jng_Annie(Ratings):
pass
class NA_Kindred_Jng_Ashe(Ratings):
pass
class NA_Kindred_Jng_AurelionSol(Ratings):
pass
class NA_Kindred_Jng_Azir(Ratings):
pass
class NA_Kindred_Jng_Bard(Ratings):
pass
class NA_Kindred_Jng_Blitzcrank(Ratings):
pass
class NA_Kindred_Jng_Brand(Ratings):
pass
class NA_Kindred_Jng_Braum(Ratings):
pass
class NA_Kindred_Jng_Caitlyn(Ratings):
pass
class NA_Kindred_Jng_Camille(Ratings):
pass
class NA_Kindred_Jng_Cassiopeia(Ratings):
pass
class NA_Kindred_Jng_Chogath(Ratings):
pass
class NA_Kindred_Jng_Corki(Ratings):
pass
class NA_Kindred_Jng_Darius(Ratings):
pass
class NA_Kindred_Jng_Diana(Ratings):
pass
class NA_Kindred_Jng_Draven(Ratings):
pass
class NA_Kindred_Jng_DrMundo(Ratings):
pass
class NA_Kindred_Jng_Ekko(Ratings):
pass
class NA_Kindred_Jng_Elise(Ratings):
pass
class NA_Kindred_Jng_Evelynn(Ratings):
pass
class NA_Kindred_Jng_Ezreal(Ratings):
pass
class NA_Kindred_Jng_Fiddlesticks(Ratings):
pass
class NA_Kindred_Jng_Fiora(Ratings):
pass
class NA_Kindred_Jng_Fizz(Ratings):
pass
class NA_Kindred_Jng_Galio(Ratings):
pass
class NA_Kindred_Jng_Gangplank(Ratings):
pass
class NA_Kindred_Jng_Garen(Ratings):
pass
class NA_Kindred_Jng_Gnar(Ratings):
pass
class NA_Kindred_Jng_Gragas(Ratings):
pass
class NA_Kindred_Jng_Graves(Ratings):
pass
class NA_Kindred_Jng_Hecarim(Ratings):
pass
class NA_Kindred_Jng_Heimerdinger(Ratings):
pass
class NA_Kindred_Jng_Illaoi(Ratings):
pass
class NA_Kindred_Jng_Irelia(Ratings):
pass
class NA_Kindred_Jng_Ivern(Ratings):
pass
class NA_Kindred_Jng_Janna(Ratings):
pass
class NA_Kindred_Jng_JarvanIV(Ratings):
pass
class NA_Kindred_Jng_Jax(Ratings):
pass
class NA_Kindred_Jng_Jayce(Ratings):
pass
class NA_Kindred_Jng_Jhin(Ratings):
pass
class NA_Kindred_Jng_Jinx(Ratings):
pass
class NA_Kindred_Jng_Kalista(Ratings):
pass
class NA_Kindred_Jng_Karma(Ratings):
pass
class NA_Kindred_Jng_Karthus(Ratings):
pass
class NA_Kindred_Jng_Kassadin(Ratings):
pass
class NA_Kindred_Jng_Katarina(Ratings):
pass
class NA_Kindred_Jng_Kayle(Ratings):
pass
class NA_Kindred_Jng_Kayn(Ratings):
pass
class NA_Kindred_Jng_Kennen(Ratings):
pass
class NA_Kindred_Jng_Khazix(Ratings):
pass
class NA_Kindred_Jng_Kindred(Ratings):
pass
class NA_Kindred_Jng_Kled(Ratings):
pass
class NA_Kindred_Jng_KogMaw(Ratings):
pass
class NA_Kindred_Jng_Leblanc(Ratings):
pass
class NA_Kindred_Jng_LeeSin(Ratings):
pass
class NA_Kindred_Jng_Leona(Ratings):
pass
class NA_Kindred_Jng_Lissandra(Ratings):
pass
class NA_Kindred_Jng_Lucian(Ratings):
pass
class NA_Kindred_Jng_Lulu(Ratings):
pass
class NA_Kindred_Jng_Lux(Ratings):
pass
class NA_Kindred_Jng_Malphite(Ratings):
pass
class NA_Kindred_Jng_Malzahar(Ratings):
pass
class NA_Kindred_Jng_Maokai(Ratings):
pass
class NA_Kindred_Jng_MasterYi(Ratings):
pass
class NA_Kindred_Jng_MissFortune(Ratings):
pass
class NA_Kindred_Jng_MonkeyKing(Ratings):
pass
class NA_Kindred_Jng_Mordekaiser(Ratings):
pass
class NA_Kindred_Jng_Morgana(Ratings):
pass
class NA_Kindred_Jng_Nami(Ratings):
pass
class NA_Kindred_Jng_Nasus(Ratings):
pass
class NA_Kindred_Jng_Nautilus(Ratings):
pass
class NA_Kindred_Jng_Nidalee(Ratings):
pass
class NA_Kindred_Jng_Nocturne(Ratings):
pass
class NA_Kindred_Jng_Nunu(Ratings):
pass
class NA_Kindred_Jng_Olaf(Ratings):
pass
class NA_Kindred_Jng_Orianna(Ratings):
pass
class NA_Kindred_Jng_Ornn(Ratings):
pass
class NA_Kindred_Jng_Pantheon(Ratings):
pass
class NA_Kindred_Jng_Poppy(Ratings):
pass
class NA_Kindred_Jng_Quinn(Ratings):
pass
class NA_Kindred_Jng_Rakan(Ratings):
pass
class NA_Kindred_Jng_Rammus(Ratings):
pass
class NA_Kindred_Jng_RekSai(Ratings):
pass
class NA_Kindred_Jng_Renekton(Ratings):
pass
class NA_Kindred_Jng_Rengar(Ratings):
pass
class NA_Kindred_Jng_Riven(Ratings):
pass
class NA_Kindred_Jng_Rumble(Ratings):
pass
class NA_Kindred_Jng_Ryze(Ratings):
pass
class NA_Kindred_Jng_Sejuani(Ratings):
pass
class NA_Kindred_Jng_Shaco(Ratings):
pass
class NA_Kindred_Jng_Shen(Ratings):
pass
class NA_Kindred_Jng_Shyvana(Ratings):
pass
class NA_Kindred_Jng_Singed(Ratings):
pass
class NA_Kindred_Jng_Sion(Ratings):
pass
class NA_Kindred_Jng_Sivir(Ratings):
pass
class NA_Kindred_Jng_Skarner(Ratings):
pass
class NA_Kindred_Jng_Sona(Ratings):
pass
class NA_Kindred_Jng_Soraka(Ratings):
pass
class NA_Kindred_Jng_Swain(Ratings):
pass
class NA_Kindred_Jng_Syndra(Ratings):
pass
class NA_Kindred_Jng_TahmKench(Ratings):
pass
class NA_Kindred_Jng_Taliyah(Ratings):
pass
class NA_Kindred_Jng_Talon(Ratings):
pass
class NA_Kindred_Jng_Taric(Ratings):
pass
class NA_Kindred_Jng_Teemo(Ratings):
pass
class NA_Kindred_Jng_Thresh(Ratings):
pass
class NA_Kindred_Jng_Tristana(Ratings):
pass
class NA_Kindred_Jng_Trundle(Ratings):
pass
class NA_Kindred_Jng_Tryndamere(Ratings):
pass
class NA_Kindred_Jng_TwistedFate(Ratings):
pass
class NA_Kindred_Jng_Twitch(Ratings):
pass
class NA_Kindred_Jng_Udyr(Ratings):
pass
class NA_Kindred_Jng_Urgot(Ratings):
pass
class NA_Kindred_Jng_Varus(Ratings):
pass
class NA_Kindred_Jng_Vayne(Ratings):
pass
class NA_Kindred_Jng_Veigar(Ratings):
pass
class NA_Kindred_Jng_Velkoz(Ratings):
pass
class NA_Kindred_Jng_Vi(Ratings):
pass
class NA_Kindred_Jng_Viktor(Ratings):
pass
class NA_Kindred_Jng_Vladimir(Ratings):
pass
class NA_Kindred_Jng_Volibear(Ratings):
pass
class NA_Kindred_Jng_Warwick(Ratings):
pass
class NA_Kindred_Jng_Xayah(Ratings):
pass
class NA_Kindred_Jng_Xerath(Ratings):
pass
class NA_Kindred_Jng_XinZhao(Ratings):
pass
class NA_Kindred_Jng_Yasuo(Ratings):
pass
class NA_Kindred_Jng_Yorick(Ratings):
pass
class NA_Kindred_Jng_Zac(Ratings):
pass
class NA_Kindred_Jng_Zed(Ratings):
pass
class NA_Kindred_Jng_Ziggs(Ratings):
pass
class NA_Kindred_Jng_Zilean(Ratings):
pass
class NA_Kindred_Jng_Zyra(Ratings):
pass
| 16.026379 | 46 | 0.77151 | 972 | 6,683 | 4.878601 | 0.151235 | 0.203712 | 0.407423 | 0.494728 | 0.808941 | 0.808941 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166243 | 6,683 | 416 | 47 | 16.064904 | 0.851041 | 0 | 0 | 0.498195 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.498195 | 0.00361 | 0 | 0.501805 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 8 |
ba8026b1bac460ab742e9001754d9e6a3c61d5eb | 182 | py | Python | app/views/dashboard/sliders/__init__.py | Wern-rm/raton.by | 68f862f2bc0551bf2327e9d6352c0cde93f45301 | [
"MIT"
] | null | null | null | app/views/dashboard/sliders/__init__.py | Wern-rm/raton.by | 68f862f2bc0551bf2327e9d6352c0cde93f45301 | [
"MIT"
] | null | null | null | app/views/dashboard/sliders/__init__.py | Wern-rm/raton.by | 68f862f2bc0551bf2327e9d6352c0cde93f45301 | [
"MIT"
] | null | null | null | from app.views.dashboard.sliders.index import sliders
from app.views.dashboard.sliders.delete import slider_delete
from app.views.dashboard.sliders.activation import slider_activated | 60.666667 | 67 | 0.873626 | 26 | 182 | 6.038462 | 0.423077 | 0.133758 | 0.229299 | 0.401274 | 0.535032 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06044 | 182 | 3 | 67 | 60.666667 | 0.918129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
2404179cf3a7fd76543ccff7a1e25075360ef93a | 9,259 | py | Python | xlist/index.py | ihgazni2/xlist | 3c4cc976dcfabc8bb18022389e67d78ce4bfd659 | [
"MIT"
] | null | null | null | xlist/index.py | ihgazni2/xlist | 3c4cc976dcfabc8bb18022389e67d78ce4bfd659 | [
"MIT"
] | null | null | null | xlist/index.py | ihgazni2/xlist | 3c4cc976dcfabc8bb18022389e67d78ce4bfd659 | [
"MIT"
] | null | null | null |
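# Clamp an index into [0, lngth]: negative indices count from the end
# (Python-style) and out-of-range values saturate at the boundaries, e.g.
# uniform_index(-2, 5) -> 3, uniform_index(-9, 5) -> 0, uniform_index(9, 5) -> 5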
def uniform_index(index,lngth):
if(index<0):
rl = lngth+index
if(rl<0):
index = 0
else:
index = rl
elif(index>=lngth):
index = lngth
else:
index = index
return(index)
def index_fst(ol,value):
return(ol.index(value))
def index_fst_not(ol,value):
lngth = ol.__len__()
for i in range(0,lngth):
if(value == ol[i]):
pass
else:
return(i)
return(None)
def index_lst(ol,value):
length = ol.__len__()
for i in range(length-1,-1,-1):
if(value == ol[i]):
return(i)
else:
pass
return(None)
def index_lst_not(ol,value):
length = ol.__len__()
for i in range(length-1,-1,-1):
if(value == ol[i]):
pass
else:
return(i)
return(None)
def index_which(ol,value,which):
length = ol.__len__()
seq = -1
for i in range(0,length):
if(value == ol[i]):
seq = seq + 1
if(seq == which):
return(i)
else:
pass
else:
pass
return(None)
def index_which_not(ol,value,which):
length = ol.__len__()
seq = -1
for i in range(0,length):
if(value == ol[i]):
pass
else:
seq = seq + 1
if(seq == which):
return(i)
else:
pass
return(None)
def indexes_all(ol,value):
length = ol.__len__()
indexes =[]
for i in range(0,length):
if(value == ol[i]):
indexes.append(i)
else:
pass
return(indexes)
def indexes_all_not(ol,value):
length = ol.__len__()
indexes =[]
for i in range(0,length):
if(value == ol[i]):
pass
else:
indexes.append(i)
return(indexes)
def indexes_some(ol,value,*seqs):
seqs = list(seqs)
length = ol.__len__()
indexes =[]
seq = -1
for i in range(0,length):
if(value == ol[i]):
seq = seq + 1
if(seq in seqs):
indexes.append(i)
else:
pass
else:
pass
return(indexes)
def indexes_some_not(ol,value,*seqs):
seqs = list(seqs)
length = ol.__len__()
indexes =[]
seq = -1
for i in range(0,length):
if(value == ol[i]):
pass
else:
seq = seq + 1
if(seq in seqs):
indexes.append(i)
else:
pass
return(indexes)
def indexes_fst_slice(ol,value):
length = ol.__len__()
begin = None
slice = []
for i in range(0,length):
if(ol[i]==value):
begin = i
break
else:
pass
if(begin == None):
return(None)
else:
slice.append(begin)
for i in range(begin+1,length):
if(ol[i]==value):
slice.append(i)
else:
break
return(slice)
def indexes_fst_not_slice(ol,value):
length = ol.__len__()
begin = None
slice = []
for i in range(0,length):
if(not(ol[i]==value)):
begin = i
break
else:
pass
if(begin == None):
return(None)
else:
slice.append(begin)
for i in range(begin+1,length):
if(not(ol[i]==value)):
slice.append(i)
else:
break
return(slice)
def indexes_lst_slice(ol,value):
length = ol.__len__()
end = None
slice = []
for i in range(length-1,-1,-1):
if(ol[i]==value):
end = i
break
else:
pass
if(end == None):
return(None)
else:
slice.append(end)
for i in range(end-1,-1,-1):
if(ol[i]==value):
slice.append(i)
else:
break
slice.reverse()
return(slice)
def indexes_lst_not_slice(ol,value):
length = ol.__len__()
end = None
slice = []
for i in range(length-1,-1,-1):
if(not(ol[i]==value)):
end = i
break
else:
pass
if(end == None):
return(None)
else:
slice.append(end)
for i in range(end-1,-1,-1):
if(not(ol[i]==value)):
slice.append(i)
else:
break
slice.reverse()
return(slice)
def indexes_which_slice(ol,value,which):  # fixed: `which` was referenced below but missing from the signature
length = ol.__len__()
seq = -1
cursor = 0
begin = None
slice = []
while(cursor < length):
cond1 = (ol[cursor] == value)
cond2 = (begin == None)
if(cond1 & cond2):
begin = cursor
slice.append(cursor)
cursor = cursor + 1
elif(cond1 & (not(cond2))):
slice.append(cursor)
cursor = cursor + 1
elif((not(cond1)) & (not(cond2))):
seq = seq + 1
if(seq == which):
return(slice)
else:
cursor = cursor + 1
begin = None
slice = []
else:
cursor = cursor + 1
if(slice):
seq = seq + 1
else:
pass
if(seq == which):
return(slice)
else:
return([])
def indexes_which_not_slice(ol,value,which):  # fixed: `which` was referenced below but missing from the signature
length = ol.__len__()
seq = -1
cursor = 0
begin = None
slice = []
while(cursor < length):
cond1 = not(ol[cursor] == value)
cond2 = (begin == None)
if(cond1 & cond2):
begin = cursor
slice.append(cursor)
cursor = cursor + 1
elif(cond1 & (not(cond2))):
slice.append(cursor)
cursor = cursor + 1
elif((not(cond1)) & (not(cond2))):
seq = seq + 1
if(seq == which):
return(slice)
else:
cursor = cursor + 1
begin = None
slice = []
else:
cursor = cursor + 1
if(slice):
seq = seq + 1
else:
pass
if(seq == which):
return(slice)
else:
return([])
def indexes_some_slices(ol,value,*seqs):
seqs = list(seqs)
rslt = []
length = ol.__len__()
seq = -1
cursor = 0
begin = None
slice = []
while(cursor < length):
cond1 = (ol[cursor] == value)
cond2 = (begin == None)
if(cond1 & cond2):
begin = cursor
slice.append(cursor)
elif(cond1 & (not(cond2))):
slice.append(cursor)
elif((not(cond1)) & (not(cond2))):
seq = seq + 1
if(seq in seqs):
rslt.append(slice)
else:
pass
begin = None
slice = []
else:
pass
cursor = cursor + 1
if(slice):
seq = seq + 1
if(seq in seqs):
rslt.append(slice)
else:
pass
else:
pass
return(rslt)
def indexes_some_not_slices(ol,value,*seqs):
seqs = list(seqs)
rslt = []
length = ol.__len__()
seq = -1
cursor = 0
begin = None
slice = []
while(cursor < length):
cond1 = not(ol[cursor] == value)
cond2 = (begin == None)
if(cond1 & cond2):
begin = cursor
slice.append(cursor)
elif(cond1 & (not(cond2))):
slice.append(cursor)
elif((not(cond1)) & (not(cond2))):
seq = seq + 1
if(seq in seqs):
rslt.append(slice)
else:
pass
begin = None
slice = []
else:
pass
cursor = cursor + 1
if(slice):
seq = seq + 1
if(seq in seqs):
rslt.append(slice)
else:
pass
else:
pass
return(rslt)
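# Collect every maximal run of consecutive positions where ol[i] == value,
# e.g. indexes_all_slices([7, 7, 0, 7], 7) -> [[0, 1], [3]]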
def indexes_all_slices(ol,value):
rslt = []
length = ol.__len__()
cursor = 0
begin = None
slice = []
while(cursor < length):
cond1 = (ol[cursor] == value)
cond2 = (begin == None)
if(cond1 & cond2):
begin = cursor
slice.append(cursor)
elif(cond1 & (not(cond2))):
slice.append(cursor)
elif((not(cond1)) & (not(cond2))):
rslt.append(slice)
begin = None
slice = []
else:
pass
cursor = cursor + 1
if(slice):
rslt.append(slice)
else:
pass
return(rslt)
def indexes_all_not_slices(ol,value):
rslt = []
length = ol.__len__()
cursor = 0
begin = None
slice = []
while(cursor < length):
cond1 = not(ol[cursor] == value)
cond2 = (begin == None)
if(cond1 & cond2):
begin = cursor
slice.append(cursor)
elif(cond1 & (not(cond2))):
slice.append(cursor)
elif((not(cond1)) & (not(cond2))):
rslt.append(slice)
begin = None
slice = []
else:
pass
cursor = cursor + 1
if(slice):
rslt.append(slice)
else:
pass
return(rslt)
| 21.043182 | 44 | 0.449509 | 1,039 | 9,259 | 3.893167 | 0.041386 | 0.051422 | 0.048949 | 0.04623 | 0.939184 | 0.925834 | 0.904326 | 0.88801 | 0.887268 | 0.870457 | 0 | 0.022256 | 0.427368 | 9,259 | 439 | 45 | 21.091116 | 0.740664 | 0 | 0 | 0.914508 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.054404 | false | 0.080311 | 0 | 0.002591 | 0.054404 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
24205447ad08600d1fb319ed6b91d504d6805bb3 | 18,070 | py | Python | py_stringsimjoin/tests/test_converter_utils.py | guptaarth87/py_stringsimjoin | f8c2d1e380d39a491b887684ae734a5b7cbda24c | [
"BSD-3-Clause"
] | null | null | null | py_stringsimjoin/tests/test_converter_utils.py | guptaarth87/py_stringsimjoin | f8c2d1e380d39a491b887684ae734a5b7cbda24c | [
"BSD-3-Clause"
] | null | null | null | py_stringsimjoin/tests/test_converter_utils.py | guptaarth87/py_stringsimjoin | f8c2d1e380d39a491b887684ae734a5b7cbda24c | [
"BSD-3-Clause"
] | null | null | null | import random
import string
import unittest
from nose.tools import assert_equal, assert_list_equal, raises
import pandas as pd
from py_stringsimjoin.utils.converter import dataframe_column_to_str, \
series_to_str
class DataframeColumnToStrTestCases(unittest.TestCase):
def setUp(self):
float_col = pd.Series(pd.np.random.randn(10)).append(
pd.Series([pd.np.NaN for _ in range(10)], index=range(10, 20)))
float_col_with_int_val = pd.Series(
pd.np.random.randint(1, 100, 10)).append(
pd.Series([pd.np.NaN for _ in range(10)], index=range(10, 20)))
str_col = pd.Series([random.choice(string.ascii_lowercase)
for _ in range(10)]).append(
pd.Series([pd.np.NaN for _ in range(10)], index=range(10, 20)))
int_col = pd.Series(pd.np.random.randint(1, 100, 20))
nan_col = pd.Series([pd.np.NaN for _ in range(20)])
self.dataframe = pd.DataFrame({'float_col': float_col,
'float_col_with_int_val': float_col_with_int_val,
'int_col': int_col,
'str_col': str_col,
'nan_col': nan_col})
def test_str_col(self):
assert_equal(self.dataframe['str_col'].dtype, object)
out_df = dataframe_column_to_str(self.dataframe, 'str_col',
inplace=False, return_col=False)
assert_equal(type(out_df), pd.DataFrame)
assert_equal(out_df['str_col'].dtype, object)
assert_equal(self.dataframe['str_col'].dtype, object)
assert_equal(sum(pd.isnull(self.dataframe['str_col'])),
sum(pd.isnull(out_df['str_col'])))
def test_int_col(self):
assert_equal(self.dataframe['int_col'].dtype, int)
out_df = dataframe_column_to_str(self.dataframe, 'int_col',
inplace=False, return_col=False)
assert_equal(type(out_df), pd.DataFrame)
assert_equal(out_df['int_col'].dtype, object)
assert_equal(self.dataframe['int_col'].dtype, int)
assert_equal(sum(pd.isnull(out_df['int_col'])), 0)
def test_float_col(self):
assert_equal(self.dataframe['float_col'].dtype, float)
out_df = dataframe_column_to_str(self.dataframe, 'float_col',
inplace=False, return_col=False)
assert_equal(type(out_df), pd.DataFrame)
assert_equal(out_df['float_col'].dtype, object)
assert_equal(self.dataframe['float_col'].dtype, float)
assert_equal(sum(pd.isnull(self.dataframe['float_col'])),
sum(pd.isnull(out_df['float_col'])))
def test_float_col_with_int_val(self):
assert_equal(self.dataframe['float_col_with_int_val'].dtype, float)
out_df = dataframe_column_to_str(
self.dataframe, 'float_col_with_int_val',
inplace=False, return_col=False)
assert_equal(type(out_df), pd.DataFrame)
assert_equal(out_df['float_col_with_int_val'].dtype, object)
assert_equal(self.dataframe['float_col_with_int_val'].dtype, float)
assert_equal(sum(pd.isnull(self.dataframe['float_col_with_int_val'])),
sum(pd.isnull(out_df['float_col_with_int_val'])))
for idx, row in self.dataframe.iterrows():
if pd.isnull(row['float_col_with_int_val']):
continue
assert_equal(str(int(row['float_col_with_int_val'])),
out_df.ix[idx]['float_col_with_int_val'])
def test_str_col_with_inplace(self):
assert_equal(self.dataframe['str_col'].dtype, object)
nan_cnt_before = sum(pd.isnull(self.dataframe['str_col']))
flag = dataframe_column_to_str(self.dataframe, 'str_col',
inplace=True, return_col=False)
assert_equal(flag, True)
assert_equal(self.dataframe['str_col'].dtype, object)
nan_cnt_after = sum(pd.isnull(self.dataframe['str_col']))
assert_equal(nan_cnt_before, nan_cnt_after)
def test_str_col_with_return_col(self):
assert_equal(self.dataframe['str_col'].dtype, object)
nan_cnt_before = sum(pd.isnull(self.dataframe['str_col']))
out_series = dataframe_column_to_str(self.dataframe, 'str_col',
inplace=False, return_col=True)
assert_equal(type(out_series), pd.Series)
assert_equal(out_series.dtype, object)
assert_equal(self.dataframe['str_col'].dtype, object)
nan_cnt_after = sum(pd.isnull(out_series))
assert_equal(nan_cnt_before, nan_cnt_after)
def test_int_col_with_inplace(self):
assert_equal(self.dataframe['int_col'].dtype, int)
flag = dataframe_column_to_str(self.dataframe, 'int_col',
inplace=True, return_col=False)
assert_equal(flag, True)
assert_equal(self.dataframe['int_col'].dtype, object)
assert_equal(sum(pd.isnull(self.dataframe['int_col'])), 0)
def test_int_col_with_return_col(self):
assert_equal(self.dataframe['int_col'].dtype, int)
out_series = dataframe_column_to_str(self.dataframe, 'int_col',
inplace=False, return_col=True)
assert_equal(type(out_series), pd.Series)
assert_equal(out_series.dtype, object)
assert_equal(self.dataframe['int_col'].dtype, int)
assert_equal(sum(pd.isnull(out_series)), 0)
def test_float_col_with_inplace(self):
assert_equal(self.dataframe['float_col'].dtype, float)
nan_cnt_before = sum(pd.isnull(self.dataframe['float_col']))
flag = dataframe_column_to_str(self.dataframe, 'float_col',
inplace=True, return_col=False)
assert_equal(flag, True)
assert_equal(self.dataframe['float_col'].dtype, object)
nan_cnt_after = sum(pd.isnull(self.dataframe['float_col']))
assert_equal(nan_cnt_before, nan_cnt_after)
def test_float_col_with_return_col(self):
assert_equal(self.dataframe['float_col'].dtype, float)
nan_cnt_before = sum(pd.isnull(self.dataframe['float_col']))
out_series = dataframe_column_to_str(self.dataframe, 'float_col',
inplace=False, return_col=True)
assert_equal(type(out_series), pd.Series)
assert_equal(out_series.dtype, object)
assert_equal(self.dataframe['float_col'].dtype, float)
nan_cnt_after = sum(pd.isnull(out_series))
assert_equal(nan_cnt_before, nan_cnt_after)
def test_nan_col_with_inplace(self):
assert_equal(self.dataframe['nan_col'].dtype, float)
nan_cnt_before = sum(pd.isnull(self.dataframe['nan_col']))
flag = dataframe_column_to_str(self.dataframe, 'nan_col',
inplace=True, return_col=False)
assert_equal(flag, True)
assert_equal(self.dataframe['nan_col'].dtype, object)
nan_cnt_after = sum(pd.isnull(self.dataframe['nan_col']))
assert_equal(nan_cnt_before, nan_cnt_after)
@raises(AssertionError)
def test_invalid_dataframe(self):
dataframe_column_to_str([], 'test_col')
@raises(AssertionError)
def test_invalid_col_name(self):
dataframe_column_to_str(self.dataframe, 'invalid_col')
@raises(AssertionError)
def test_invalid_inplace_flag(self):
dataframe_column_to_str(self.dataframe, 'str_col', inplace=None)
@raises(AssertionError)
def test_invalid_return_col_flag(self):
dataframe_column_to_str(self.dataframe, 'str_col',
inplace=True, return_col=None)
@raises(AssertionError)
def test_invalid_flag_combination(self):
dataframe_column_to_str(self.dataframe, 'str_col',
inplace=True, return_col=True)
class SeriesToStrTestCases(unittest.TestCase):
def setUp(self):
self.float_col = pd.Series(pd.np.random.randn(10)).append(
pd.Series([pd.np.NaN for _ in range(10)], index=range(10, 20)))
self.float_col_with_int_val = pd.Series(
pd.np.random.randint(1, 100, 10)).append(
pd.Series([pd.np.NaN for _ in range(10)], index=range(10, 20)))
self.str_col = pd.Series([random.choice(string.ascii_lowercase)
for _ in range(10)]).append(
pd.Series([pd.np.NaN for _ in range(10)], index=range(10, 20)))
self.int_col = pd.Series(pd.np.random.randint(1, 100, 20))
self.nan_col = pd.Series([pd.np.NaN for _ in range(20)])
def test_str_col(self):
assert_equal(self.str_col.dtype, object)
out_series = series_to_str(self.str_col, inplace=False)
assert_equal(type(out_series), pd.Series)
assert_equal(out_series.dtype, object)
assert_equal(self.str_col.dtype, object)
assert_equal(sum(pd.isnull(self.str_col)),
sum(pd.isnull(out_series)))
def test_int_col(self):
assert_equal(self.int_col.dtype, int)
out_series = series_to_str(self.int_col, inplace=False)
assert_equal(type(out_series), pd.Series)
assert_equal(out_series.dtype, object)
assert_equal(self.int_col.dtype, int)
assert_equal(sum(pd.isnull(out_series)), 0)
def test_float_col(self):
assert_equal(self.float_col.dtype, float)
out_series = series_to_str(self.float_col, inplace=False)
assert_equal(type(out_series), pd.Series)
assert_equal(out_series.dtype, object)
assert_equal(self.float_col.dtype, float)
assert_equal(sum(pd.isnull(self.float_col)),
sum(pd.isnull(out_series)))
def test_float_col_with_int_val(self):
assert_equal(self.float_col_with_int_val.dtype, float)
out_series = series_to_str(self.float_col_with_int_val, inplace=False)
assert_equal(type(out_series), pd.Series)
assert_equal(out_series.dtype, object)
assert_equal(self.float_col_with_int_val.dtype, float)
assert_equal(sum(pd.isnull(self.float_col_with_int_val)),
sum(pd.isnull(out_series)))
for idx, val in self.float_col_with_int_val.iteritems():
if pd.isnull(val):
continue
assert_equal(str(int(val)), out_series.ix[idx])
def test_str_col_with_inplace(self):
assert_equal(self.str_col.dtype, object)
nan_cnt_before = sum(pd.isnull(self.str_col))
flag = series_to_str(self.str_col, inplace=True)
assert_equal(flag, True)
assert_equal(self.str_col.dtype, object)
nan_cnt_after = sum(pd.isnull(self.str_col))
assert_equal(nan_cnt_before, nan_cnt_after)
def test_int_col_with_inplace(self):
assert_equal(self.int_col.dtype, int)
flag = series_to_str(self.int_col, inplace=True)
assert_equal(flag, True)
assert_equal(self.int_col.dtype, object)
assert_equal(sum(pd.isnull(self.int_col)), 0)
def test_float_col_with_inplace(self):
assert_equal(self.float_col.dtype, float)
nan_cnt_before = sum(pd.isnull(self.float_col))
flag = series_to_str(self.float_col, inplace=True)
assert_equal(flag, True)
assert_equal(self.float_col.dtype, object)
nan_cnt_after = sum(pd.isnull(self.float_col))
assert_equal(nan_cnt_before, nan_cnt_after)
# test the case with a series containing only NaN values. In this case,
# inplace flag will be ignored.
def test_nan_col_with_inplace(self):
assert_equal(self.nan_col.dtype, float)
nan_cnt_before = sum(pd.isnull(self.nan_col))
out_series = series_to_str(self.nan_col, inplace=True)
assert_equal(out_series.dtype, object)
assert_equal(self.nan_col.dtype, float)
nan_cnt_after = sum(pd.isnull(out_series))
assert_equal(nan_cnt_before, nan_cnt_after)
def test_empty_series_with_inplace(self):
empty_series = pd.Series(dtype=int)
assert_equal(empty_series.dtype, int)
out_series = series_to_str(empty_series, inplace=True)
assert_equal(out_series.dtype, object)
assert_equal(empty_series.dtype, int)
assert_equal(len(out_series), 0)
@raises(AssertionError)
def test_invalid_series(self):
series_to_str([])
@raises(AssertionError)
def test_invalid_inplace_flag(self):
series_to_str(self.int_col, inplace=None)
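# Usage sketch (assumption: illustrative only; it mirrors the behaviour the
# tests above assert rather than adding a new test case):
#
#     import pandas as pd
#     from py_stringsimjoin.utils.converter import series_to_str
#     s = pd.Series([1, 2, 3])
#     out = series_to_str(s, inplace=False)  # dtype becomes object: '1', '2', '3'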
| 66.925926 | 107 | 0.454676 | 1,735 | 18,070 | 4.401153 | 0.053026 | 0.135411 | 0.074646 | 0.069146 | 0.914484 | 0.880042 | 0.833683 | 0.79374 | 0.747774 | 0.711367 | 0 | 0.008097 | 0.466906 | 18,070 | 269 | 108 | 67.174721 | 0.784595 | 0.005479 | 0 | 0.568376 | 0 | 0 | 0.03612 | 0.012244 | 0 | 0 | 0 | 0 | 0.431624 | 1 | 0.123932 | false | 0 | 0.025641 | 0 | 0.15812 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
79f38a01f8896addee293aba1ad520e9e595d905 | 2,333 | py | Python | stst/features/features_dependency.py | skywalkerzhang/STS-tookit | 74fbd4d7709e4474ccde3a8e1c260a9ef162a1d9 | [
"MIT"
] | 94 | 2017-02-08T02:47:58.000Z | 2022-02-04T15:05:36.000Z | stst/features/features_dependency.py | saharghannay/Semantic-Texual-Similarity-Toolkits | 7ef271e4e4ca55330b31bce06368274c2ddbe3a9 | [
"MIT"
] | 18 | 2017-02-08T07:29:58.000Z | 2019-07-24T10:00:32.000Z | stst/features/features_dependency.py | saharghannay/Semantic-Texual-Similarity-Toolkits | 7ef271e4e4ca55330b31bce06368274c2ddbe3a9 | [
"MIT"
] | 31 | 2017-04-07T06:59:09.000Z | 2021-12-21T09:23:09.000Z | from stst.modules.features import Feature
from stst import utils
class DependencyGramFeature(Feature):
    def __init__(self, convey='count', **kwargs):
        super(DependencyGramFeature, self).__init__(**kwargs)
self.convey = convey
self.feature_name += '-%s' % (convey)
def extract_information(self, train_instances):
seqs = []
for train_instance in train_instances:
dep_sa, dep_sb = train_instance.get_dependency()
dep_sa = [(dep[1], dep[2]) for dep in dep_sa]
dep_sb = [(dep[1], dep[2]) for dep in dep_sb]
seqs.append(dep_sa)
seqs.append(dep_sb)
self.idf_weight = utils.idf_calculator(seqs)
self.vocab = utils.word2index(self.idf_weight)
def extract(self, train_instance):
dep_sa, dep_sb = train_instance.get_dependency()
dep_sa = [(dep[1], dep[2]) for dep in dep_sa]
dep_sb = [(dep[1], dep[2]) for dep in dep_sb]
features = []
feature, info = utils.sentence_match_features(dep_sa, dep_sb)
features += feature
feature, info = utils.sentence_vectorize_features(dep_sa, dep_sb, self.idf_weight, convey=self.convey)
features += feature
infos = [dep_sa, dep_sb]
return features, infos
class DependencyRelationFeature(Feature):
    def __init__(self, convey='count', **kwargs):
        super(DependencyRelationFeature, self).__init__(**kwargs)
self.convey = convey
self.feature_name = self.feature_name + '-%s' % (convey)
# self.feature_name = self.feature_name + '-%s' % (stopwords)
def extract_information(self, train_instances):
seqs = []
for train_instance in train_instances:
dep_sa, dep_sb = train_instance.get_dependency()
seqs.append(dep_sa)
seqs.append(dep_sb)
self.idf_weight = utils.idf_calculator(seqs)
def extract(self, train_instance):
dep_sa, dep_sb = train_instance.get_dependency()
features = []
feature, info = utils.sentence_match_features(dep_sa, dep_sb)
features += feature
feature, info = utils.sentence_vectorize_features(dep_sa, dep_sb, self.idf_weight, convey=self.convey)
features += feature
infos = [dep_sa, dep_sb]
return features, infos
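# Demo sketch (assumption: this block and its toy data are illustrative, not
# part of the original module; dep[0] is taken to be the relation label).
# Both feature classes above reduce each dependency triple to a
# (governor, dependent) pair before matching, as shown below.
if __name__ == '__main__':
    dep_sa = [('nsubj', 'sits', 'cat'), ('det', 'cat', 'the')]
    pairs = [(dep[1], dep[2]) for dep in dep_sa]
    print(pairs)  # [('sits', 'cat'), ('cat', 'the')]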
| 35.892308 | 110 | 0.64252 | 292 | 2,333 | 4.839041 | 0.160959 | 0.056617 | 0.079264 | 0.084926 | 0.856334 | 0.85138 | 0.85138 | 0.85138 | 0.757254 | 0.69356 | 0 | 0.005143 | 0.249893 | 2,333 | 64 | 111 | 36.453125 | 0.802286 | 0.025289 | 0 | 0.816327 | 0 | 0 | 0.007042 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.122449 | false | 0 | 0.040816 | 0 | 0.244898 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
030317a1d9a328d452bf29bc7a802e29629b1a42 | 197 | py | Python | speaker_encoder/data_objects/__init__.py | ishine/ppg-vc | b59cb9862cf4b82a3bdb589950e25cab85fc9b03 | [
"Apache-2.0"
] | 133 | 2021-06-12T04:44:36.000Z | 2022-03-31T09:42:26.000Z | speaker_encoder/data_objects/__init__.py | zhouyh-jlu/ppg-vc | b59cb9862cf4b82a3bdb589950e25cab85fc9b03 | [
"Apache-2.0"
] | 20 | 2021-06-14T20:15:49.000Z | 2022-02-18T09:05:00.000Z | speaker_encoder/data_objects/__init__.py | zhouyh-jlu/ppg-vc | b59cb9862cf4b82a3bdb589950e25cab85fc9b03 | [
"Apache-2.0"
] | 41 | 2021-06-12T05:43:50.000Z | 2022-03-27T07:56:28.000Z | from speaker_encoder.data_objects.speaker_verification_dataset import SpeakerVerificationDataset
from speaker_encoder.data_objects.speaker_verification_dataset import SpeakerVerificationDataLoader
| 65.666667 | 99 | 0.939086 | 20 | 197 | 8.85 | 0.5 | 0.124294 | 0.20339 | 0.248588 | 0.689266 | 0.689266 | 0.689266 | 0.689266 | 0.689266 | 0 | 0 | 0 | 0.040609 | 197 | 2 | 100 | 98.5 | 0.936508 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
033b3c8ffa91766392eb0198025e8e5861f59718 | 62 | py | Python | login.py | zhangli1758152342/test | d8042c1414d35e61bb376d06caef55ed4aea684c | [
"MIT"
] | null | null | null | login.py | zhangli1758152342/test | d8042c1414d35e61bb376d06caef55ed4aea684c | [
"MIT"
] | null | null | null | login.py | zhangli1758152342/test | d8042c1414d35e61bb376d06caef55ed4aea684c | [
"MIT"
] | null | null | null | num = 10
a = 20
b = 30
m = 0
c= 40
d = 50
e = 60
| 3.875 | 7 | 0.354839 | 14 | 62 | 1.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.464286 | 0.548387 | 62 | 15 | 8 | 4.133333 | 0.321429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
300995c6390af15793fa4a8fd12ec68fb71c4baf | 138 | py | Python | torch_geometric_temporal/data/discrete/__init__.py | LFrancesco/pytorch_geometric_temporal | 0964515a6041ce0cceb12e36ed640df22c046b4d | [
"MIT"
] | 1 | 2020-11-24T13:10:37.000Z | 2020-11-24T13:10:37.000Z | torch_geometric_temporal/data/discrete/__init__.py | mariaast/pytorch_geometric_temporal | 2c99d690cf183e6c9e7ff40d15ba2f8b875c1aaf | [
"MIT"
] | null | null | null | torch_geometric_temporal/data/discrete/__init__.py | mariaast/pytorch_geometric_temporal | 2c99d690cf183e6c9e7ff40d15ba2f8b875c1aaf | [
"MIT"
] | null | null | null | from .static_graph_discrete_signal import StaticGraphDiscreteSignal
from .dynamic_graph_discrete_signal import DynamicGraphDiscreteSignal
| 46 | 69 | 0.927536 | 14 | 138 | 8.714286 | 0.642857 | 0.213115 | 0.311475 | 0.409836 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057971 | 138 | 2 | 70 | 69 | 0.938462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
066382173d5d2e46697c4a8a0cde6dd8e10ae00e | 132 | py | Python | AdventOfCode2015/Day16/Day16.py | MattTitmas/AdventOfCode | 36be4f6bf973f77ff93b08dc69c977bb11951f27 | [
"MIT"
] | null | null | null | AdventOfCode2015/Day16/Day16.py | MattTitmas/AdventOfCode | 36be4f6bf973f77ff93b08dc69c977bb11951f27 | [
"MIT"
] | null | null | null | AdventOfCode2015/Day16/Day16.py | MattTitmas/AdventOfCode | 36be4f6bf973f77ff93b08dc69c977bb11951f27 | [
"MIT"
] | null | null | null | def part1():
return 40
def part2():
return 241
print(f"answer to part1: {part1()}")
print(f"answer to part2: {part2()}")
| 13.2 | 36 | 0.613636 | 20 | 132 | 4.05 | 0.5 | 0.148148 | 0.296296 | 0.345679 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104762 | 0.204545 | 132 | 9 | 37 | 14.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0.396947 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0 | 0.333333 | 0.666667 | 0.333333 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 7 |
06aa2d487fa6759a5ffaef071331dad2ccf8dd94 | 230 | py | Python | models/module_zoo/__init__.py | hardik01shah/TAdaConv | a1174c1c39ee91c6f6fc7a99471160f17d05297e | [
"Apache-2.0"
] | 30 | 2022-01-29T12:39:31.000Z | 2022-03-23T05:58:52.000Z | models/module_zoo/__init__.py | hardik01shah/TAdaConv | a1174c1c39ee91c6f6fc7a99471160f17d05297e | [
"Apache-2.0"
] | 5 | 2022-02-07T02:09:27.000Z | 2022-03-29T10:40:22.000Z | models/module_zoo/__init__.py | hardik01shah/TAdaConv | a1174c1c39ee91c6f6fc7a99471160f17d05297e | [
"Apache-2.0"
] | 6 | 2022-01-30T03:22:22.000Z | 2022-03-31T07:28:32.000Z | #!/usr/bin/env python3
# Copyright (C) Alibaba Group Holding Limited.
from models.module_zoo.heads import *
from models.module_zoo.stems import *
from models.module_zoo.branches import *
from models.module_zoo.ops import * | 32.857143 | 48 | 0.769565 | 34 | 230 | 5.088235 | 0.558824 | 0.231214 | 0.369942 | 0.439306 | 0.433526 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005051 | 0.13913 | 230 | 7 | 49 | 32.857143 | 0.868687 | 0.286957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
06ab0fa00d180d0f2a5fcf709cf907daf46f77ba | 98 | py | Python | eagleos/colors.py | 5qc/EagleOS | c42513e4a317e0bc111183c752ba39de824346c3 | [
"MIT"
] | null | null | null | eagleos/colors.py | 5qc/EagleOS | c42513e4a317e0bc111183c752ba39de824346c3 | [
"MIT"
] | null | null | null | eagleos/colors.py | 5qc/EagleOS | c42513e4a317e0bc111183c752ba39de824346c3 | [
"MIT"
] | null | null | null | def color(r, g, b, msg):
return "\033[38;2;{};{};{}m{} \033[38;2;0;0;0m".format(r, g, b, msg)
| 32.666667 | 72 | 0.5 | 22 | 98 | 2.227273 | 0.636364 | 0.081633 | 0.122449 | 0.244898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180723 | 0.153061 | 98 | 2 | 73 | 49 | 0.409639 | 0 | 0 | 0 | 0 | 0.5 | 0.387755 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
2367ff6a7c22d94c8de11f1cb943a48a9ef48e58 | 28,434 | py | Python | sdk/python/pulumi_aiven/service_user.py | pulumi/pulumi-aiven | 0d330ef43c17ce2d2a77588c1d9754de6c8ca736 | [
"ECL-2.0",
"Apache-2.0"
] | 7 | 2019-11-28T22:30:11.000Z | 2021-12-27T16:40:54.000Z | sdk/python/pulumi_aiven/service_user.py | pulumi/pulumi-aiven | 0d330ef43c17ce2d2a77588c1d9754de6c8ca736 | [
"ECL-2.0",
"Apache-2.0"
] | 97 | 2019-12-17T09:58:57.000Z | 2022-03-31T15:19:02.000Z | sdk/python/pulumi_aiven/service_user.py | pulumi/pulumi-aiven | 0d330ef43c17ce2d2a77588c1d9754de6c8ca736 | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2020-11-24T12:22:38.000Z | 2020-11-24T12:22:38.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = ['ServiceUserArgs', 'ServiceUser']
@pulumi.input_type
class ServiceUserArgs:
def __init__(__self__, *,
project: pulumi.Input[str],
service_name: pulumi.Input[str],
username: pulumi.Input[str],
authentication: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
redis_acl_categories: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_channels: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_commands: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a ServiceUser resource.
:param pulumi.Input[str] project: and `service_name` - (Required) define the project and service the user belongs to. They should be defined
using reference as shown above to set up dependencies correctly.
:param pulumi.Input[str] service_name: Service to link the user to
:param pulumi.Input[str] username: is the actual name of the user account.
:param pulumi.Input[str] authentication: Authentication details
:param pulumi.Input[str] password: Password of the user
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_categories: Redis specific field, defines command category rules.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_channels: Permitted pub/sub channel patterns
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_commands: Redis specific field, defines rules for individual commands.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_keys: Redis specific field, defines key access rules.
"""
pulumi.set(__self__, "project", project)
pulumi.set(__self__, "service_name", service_name)
pulumi.set(__self__, "username", username)
if authentication is not None:
pulumi.set(__self__, "authentication", authentication)
if password is not None:
pulumi.set(__self__, "password", password)
if redis_acl_categories is not None:
pulumi.set(__self__, "redis_acl_categories", redis_acl_categories)
if redis_acl_channels is not None:
pulumi.set(__self__, "redis_acl_channels", redis_acl_channels)
if redis_acl_commands is not None:
pulumi.set(__self__, "redis_acl_commands", redis_acl_commands)
if redis_acl_keys is not None:
pulumi.set(__self__, "redis_acl_keys", redis_acl_keys)
@property
@pulumi.getter
def project(self) -> pulumi.Input[str]:
"""
and `service_name` - (Required) define the project and service the user belongs to. They should be defined
using reference as shown above to set up dependencies correctly.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: pulumi.Input[str]):
pulumi.set(self, "project", value)
@property
@pulumi.getter(name="serviceName")
def service_name(self) -> pulumi.Input[str]:
"""
Service to link the user to
"""
return pulumi.get(self, "service_name")
@service_name.setter
def service_name(self, value: pulumi.Input[str]):
pulumi.set(self, "service_name", value)
@property
@pulumi.getter
def username(self) -> pulumi.Input[str]:
"""
is the actual name of the user account.
"""
return pulumi.get(self, "username")
@username.setter
def username(self, value: pulumi.Input[str]):
pulumi.set(self, "username", value)
@property
@pulumi.getter
def authentication(self) -> Optional[pulumi.Input[str]]:
"""
Authentication details
"""
return pulumi.get(self, "authentication")
@authentication.setter
def authentication(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "authentication", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
Password of the user
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter(name="redisAclCategories")
def redis_acl_categories(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Redis specific field, defines command category rules.
"""
return pulumi.get(self, "redis_acl_categories")
@redis_acl_categories.setter
def redis_acl_categories(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "redis_acl_categories", value)
@property
@pulumi.getter(name="redisAclChannels")
def redis_acl_channels(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Permitted pub/sub channel patterns
"""
return pulumi.get(self, "redis_acl_channels")
@redis_acl_channels.setter
def redis_acl_channels(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "redis_acl_channels", value)
@property
@pulumi.getter(name="redisAclCommands")
def redis_acl_commands(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Redis specific field, defines rules for individual commands.
"""
return pulumi.get(self, "redis_acl_commands")
@redis_acl_commands.setter
def redis_acl_commands(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "redis_acl_commands", value)
@property
@pulumi.getter(name="redisAclKeys")
def redis_acl_keys(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Redis specific field, defines key access rules.
"""
return pulumi.get(self, "redis_acl_keys")
@redis_acl_keys.setter
def redis_acl_keys(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "redis_acl_keys", value)
@pulumi.input_type
class _ServiceUserState:
def __init__(__self__, *,
access_cert: Optional[pulumi.Input[str]] = None,
access_key: Optional[pulumi.Input[str]] = None,
authentication: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None,
redis_acl_categories: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_channels: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_commands: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
service_name: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering ServiceUser resources.
:param pulumi.Input[str] access_cert: is the access certificate of the user (not applicable for all services).
:param pulumi.Input[str] access_key: is the access key of the user (not applicable for all services).
:param pulumi.Input[str] authentication: Authentication details
:param pulumi.Input[str] password: Password of the user
:param pulumi.Input[str] project: and `service_name` - (Required) define the project and service the user belongs to. They should be defined
using reference as shown above to set up dependencies correctly.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_categories: Redis specific field, defines command category rules.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_channels: Permitted pub/sub channel patterns
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_commands: Redis specific field, defines rules for individual commands.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_keys: Redis specific field, defines key access rules.
:param pulumi.Input[str] service_name: Service to link the user to
:param pulumi.Input[str] type: tells whether the user is primary account or regular account.
:param pulumi.Input[str] username: is the actual name of the user account.
"""
if access_cert is not None:
pulumi.set(__self__, "access_cert", access_cert)
if access_key is not None:
pulumi.set(__self__, "access_key", access_key)
if authentication is not None:
pulumi.set(__self__, "authentication", authentication)
if password is not None:
pulumi.set(__self__, "password", password)
if project is not None:
pulumi.set(__self__, "project", project)
if redis_acl_categories is not None:
pulumi.set(__self__, "redis_acl_categories", redis_acl_categories)
if redis_acl_channels is not None:
pulumi.set(__self__, "redis_acl_channels", redis_acl_channels)
if redis_acl_commands is not None:
pulumi.set(__self__, "redis_acl_commands", redis_acl_commands)
if redis_acl_keys is not None:
pulumi.set(__self__, "redis_acl_keys", redis_acl_keys)
if service_name is not None:
pulumi.set(__self__, "service_name", service_name)
if type is not None:
pulumi.set(__self__, "type", type)
if username is not None:
pulumi.set(__self__, "username", username)
@property
@pulumi.getter(name="accessCert")
def access_cert(self) -> Optional[pulumi.Input[str]]:
"""
is the access certificate of the user (not applicable for all services).
"""
return pulumi.get(self, "access_cert")
@access_cert.setter
def access_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "access_cert", value)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> Optional[pulumi.Input[str]]:
"""
is the access key of the user (not applicable for all services).
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter
def authentication(self) -> Optional[pulumi.Input[str]]:
"""
Authentication details
"""
return pulumi.get(self, "authentication")
@authentication.setter
def authentication(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "authentication", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
Password of the user
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
and `service_name` - (Required) define the project and service the user belongs to. They should be defined
using reference as shown above to set up dependencies correctly.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
@property
@pulumi.getter(name="redisAclCategories")
def redis_acl_categories(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Redis specific field, defines command category rules.
"""
return pulumi.get(self, "redis_acl_categories")
@redis_acl_categories.setter
def redis_acl_categories(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "redis_acl_categories", value)
@property
@pulumi.getter(name="redisAclChannels")
def redis_acl_channels(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Permitted pub/sub channel patterns
"""
return pulumi.get(self, "redis_acl_channels")
@redis_acl_channels.setter
def redis_acl_channels(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "redis_acl_channels", value)
@property
@pulumi.getter(name="redisAclCommands")
def redis_acl_commands(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Redis specific field, defines rules for individual commands.
"""
return pulumi.get(self, "redis_acl_commands")
@redis_acl_commands.setter
def redis_acl_commands(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "redis_acl_commands", value)
@property
@pulumi.getter(name="redisAclKeys")
def redis_acl_keys(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Redis specific field, defines key access rules.
"""
return pulumi.get(self, "redis_acl_keys")
@redis_acl_keys.setter
def redis_acl_keys(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "redis_acl_keys", value)
@property
@pulumi.getter(name="serviceName")
def service_name(self) -> Optional[pulumi.Input[str]]:
"""
Service to link the user to
"""
return pulumi.get(self, "service_name")
@service_name.setter
def service_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "service_name", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[str]]:
"""
tells whether the user is primary account or regular account.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "type", value)
@property
@pulumi.getter
def username(self) -> Optional[pulumi.Input[str]]:
"""
is the actual name of the user account.
"""
return pulumi.get(self, "username")
@username.setter
def username(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "username", value)
class ServiceUser(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
authentication: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None,
redis_acl_categories: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_channels: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_commands: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
service_name: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
## # Service User Resource
The Service User resource allows the creation and management of Aiven Service Users.
## Example Usage
```python
import pulumi
import pulumi_aiven as aiven
myserviceuser = aiven.ServiceUser("myserviceuser",
project=aiven_project["myproject"]["project"],
service_name=aiven_service["myservice"]["service_name"],
username="<USERNAME>")
```
> **Note** The service user resource is not supported for Aiven Grafana services.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] authentication: Authentication details
:param pulumi.Input[str] password: Password of the user
:param pulumi.Input[str] project: and `service_name` - (Required) define the project and service the user belongs to. They should be defined
using reference as shown above to set up dependencies correctly.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_categories: Redis specific field, defines command category rules.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_channels: Permitted pub/sub channel patterns
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_commands: Redis specific field, defines rules for individual commands.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_keys: Redis specific field, defines key access rules.
:param pulumi.Input[str] service_name: Service to link the user to
:param pulumi.Input[str] username: is the actual name of the user account.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ServiceUserArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
## # Service User Resource
The Service User resource allows the creation and management of Aiven Service Users.
## Example Usage
```python
import pulumi
import pulumi_aiven as aiven
myserviceuser = aiven.ServiceUser("myserviceuser",
project=aiven_project["myproject"]["project"],
service_name=aiven_service["myservice"]["service_name"],
username="<USERNAME>")
```
> **Note** The service user resource is not supported for Aiven Grafana services.
:param str resource_name: The name of the resource.
:param ServiceUserArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ServiceUserArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
authentication: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None,
redis_acl_categories: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_channels: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_commands: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
service_name: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ServiceUserArgs.__new__(ServiceUserArgs)
__props__.__dict__["authentication"] = authentication
__props__.__dict__["password"] = password
if project is None and not opts.urn:
raise TypeError("Missing required property 'project'")
__props__.__dict__["project"] = project
__props__.__dict__["redis_acl_categories"] = redis_acl_categories
__props__.__dict__["redis_acl_channels"] = redis_acl_channels
__props__.__dict__["redis_acl_commands"] = redis_acl_commands
__props__.__dict__["redis_acl_keys"] = redis_acl_keys
if service_name is None and not opts.urn:
raise TypeError("Missing required property 'service_name'")
__props__.__dict__["service_name"] = service_name
if username is None and not opts.urn:
raise TypeError("Missing required property 'username'")
__props__.__dict__["username"] = username
__props__.__dict__["access_cert"] = None
__props__.__dict__["access_key"] = None
__props__.__dict__["type"] = None
super(ServiceUser, __self__).__init__(
'aiven:index/serviceUser:ServiceUser',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
access_cert: Optional[pulumi.Input[str]] = None,
access_key: Optional[pulumi.Input[str]] = None,
authentication: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None,
redis_acl_categories: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_channels: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_commands: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
redis_acl_keys: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
service_name: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[str]] = None,
username: Optional[pulumi.Input[str]] = None) -> 'ServiceUser':
"""
Get an existing ServiceUser resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] access_cert: is the access certificate of the user (not applicable for all services).
:param pulumi.Input[str] access_key: is the access key of the user (not applicable for all services).
:param pulumi.Input[str] authentication: Authentication details
:param pulumi.Input[str] password: Password of the user
:param pulumi.Input[str] project: and `service_name` - (Required) define the project and service the user belongs to. They should be defined
using reference as shown above to set up dependencies correctly.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_categories: Redis specific field, defines command category rules.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_channels: Permitted pub/sub channel patterns
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_commands: Redis specific field, defines rules for individual commands.
:param pulumi.Input[Sequence[pulumi.Input[str]]] redis_acl_keys: Redis specific field, defines key access rules.
:param pulumi.Input[str] service_name: Service to link the user to
:param pulumi.Input[str] type: tells whether the user is primary account or regular account.
:param pulumi.Input[str] username: is the actual name of the user account.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ServiceUserState.__new__(_ServiceUserState)
__props__.__dict__["access_cert"] = access_cert
__props__.__dict__["access_key"] = access_key
__props__.__dict__["authentication"] = authentication
__props__.__dict__["password"] = password
__props__.__dict__["project"] = project
__props__.__dict__["redis_acl_categories"] = redis_acl_categories
__props__.__dict__["redis_acl_channels"] = redis_acl_channels
__props__.__dict__["redis_acl_commands"] = redis_acl_commands
__props__.__dict__["redis_acl_keys"] = redis_acl_keys
__props__.__dict__["service_name"] = service_name
__props__.__dict__["type"] = type
__props__.__dict__["username"] = username
return ServiceUser(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="accessCert")
def access_cert(self) -> pulumi.Output[str]:
"""
is the access certificate of the user (not applicable for all services).
"""
return pulumi.get(self, "access_cert")
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> pulumi.Output[str]:
"""
is the access key of the user (not applicable for all services).
"""
return pulumi.get(self, "access_key")
@property
@pulumi.getter
def authentication(self) -> pulumi.Output[Optional[str]]:
"""
Authentication details
"""
return pulumi.get(self, "authentication")
@property
@pulumi.getter
def password(self) -> pulumi.Output[str]:
"""
Password of the user
"""
return pulumi.get(self, "password")
@property
@pulumi.getter
def project(self) -> pulumi.Output[str]:
"""
and `service_name` - (Required) define the project and service the user belongs to. They should be defined
using reference as shown above to set up dependencies correctly.
"""
return pulumi.get(self, "project")
@property
@pulumi.getter(name="redisAclCategories")
def redis_acl_categories(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
Redis specific field, defines command category rules.
"""
return pulumi.get(self, "redis_acl_categories")
@property
@pulumi.getter(name="redisAclChannels")
def redis_acl_channels(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
Permitted pub/sub channel patterns
"""
return pulumi.get(self, "redis_acl_channels")
@property
@pulumi.getter(name="redisAclCommands")
def redis_acl_commands(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
Redis specific field, defines rules for individual commands.
"""
return pulumi.get(self, "redis_acl_commands")
@property
@pulumi.getter(name="redisAclKeys")
def redis_acl_keys(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
Redis specific field, defines key access rules.
"""
return pulumi.get(self, "redis_acl_keys")
@property
@pulumi.getter(name="serviceName")
def service_name(self) -> pulumi.Output[str]:
"""
Service to link the user to
"""
return pulumi.get(self, "service_name")
@property
@pulumi.getter
def type(self) -> pulumi.Output[str]:
"""
tells whether the user is primary account or regular account.
"""
return pulumi.get(self, "type")
@property
@pulumi.getter
def username(self) -> pulumi.Output[str]:
"""
is the actual name of the user account.
"""
return pulumi.get(self, "username")
| 43.744615 | 148 | 0.652107 | 3,312 | 28,434 | 5.371377 | 0.058877 | 0.1181 | 0.107813 | 0.073075 | 0.888645 | 0.867847 | 0.848848 | 0.826138 | 0.808263 | 0.791681 | 0 | 0.000046 | 0.241155 | 28,434 | 649 | 149 | 43.812018 | 0.824443 | 0.286453 | 0 | 0.725806 | 1 | 0 | 0.096852 | 0.001865 | 0 | 0 | 0 | 0 | 0 | 1 | 0.163978 | false | 0.064516 | 0.013441 | 0 | 0.276882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
88e8c85067a2558a4c3ee9c41dc8aac8f7d32b6a | 5,071 | py | Python | head/detection/yolo.py | hotcouscous1/TensorBricks | 94dd90dce43a455d9abd7ccf19e1e7f361da7824 | [
"BSD-3-Clause"
] | null | null | null | head/detection/yolo.py | hotcouscous1/TensorBricks | 94dd90dce43a455d9abd7ccf19e1e7f361da7824 | [
"BSD-3-Clause"
] | null | null | null | head/detection/yolo.py | hotcouscous1/TensorBricks | 94dd90dce43a455d9abd7ccf19e1e7f361da7824 | [
"BSD-3-Clause"
] | null | null | null | from layers import *
class Yolo_V3_Predictor(nn.Module):
"""
__version__ = 1.0
__date__ = Mar 7, 2022
paper : https://arxiv.org/abs/1804.02767
'stride' is a stride to the input image.
"""
def __init__(self,
channels: int,
num_anchors: int,
num_classes: int,
stride: int):
num_pred = 4 + 1 + num_classes
pred_channels = num_anchors * num_pred
self.num_anchors, self.num_pred, self.stride = num_anchors, num_pred, stride
super(Yolo_V3_Predictor, self).__init__()
self.conv_pred = nn.Conv2d(channels, pred_channels, 1, bias=True)
def forward(self, f, anchors: Tensor = None):
pred = self.conv_pred(f)
pred = pred.view(f.shape[0], self.num_anchors, self.num_pred, f.shape[2], f.shape[3])
pred = pred.permute(0, 1, 3, 4, 2).contiguous().view(f.shape[0], -1, self.num_pred)
if not self.training:
if anchors.shape[1] != pred.shape[1]:
raise RuntimeError('number of anchors and regressions must be matched')
pred[..., :2] = torch.sigmoid(pred[..., :2]) * self.stride + anchors[..., :2]
pred[..., 2:4] = torch.exp(pred[..., 2:4]) * anchors[..., 2:]
return pred
class Yolo_V3_Head(nn.Module):
"""
__version__ = 1.0
__date__ = Mar 7, 2022
paper : https://arxiv.org/abs/1804.02767
The structure is explained in <2.3. Predictions Across Scales> of the paper.
"""
def __init__(self,
num_levels: int,
channels: list,
num_anchors: list,
num_classes: int,
strides: list):
self.num_levels = num_levels
if len(channels) != num_levels or len(num_anchors) != num_levels:
raise ValueError('make len(channels) == num_levels, and len(num_anchors) == num_levels')
super(Yolo_V3_Head, self).__init__()
self.heads = nn.ModuleList([Yolo_V3_Predictor(channels[i], num_anchors[i], num_classes, strides[i])
for i in range(num_levels)])
def forward(self, features, anchors: List[Tensor] = None):
out = []
for i in range(self.num_levels):
if anchors:
pred = self.heads[i](features[i], anchors[i])
else:
pred = self.heads[i](features[i])
out.append(pred)
out = torch.cat(out, 1)
return out
class Yolo_V4_Predictor(nn.Module):
"""
__version__ = 1.0
__date__ = Mar 7, 2022
'stride' is a stride to the input image.
"""
def __init__(self,
channels: int,
num_anchors: int,
num_classes: int,
stride: int,
Act: nn.Module = None):
num_pred = 4 + 1 + num_classes
pred_channels = num_anchors * num_pred
self.num_anchors, self.num_pred, self.stride = num_anchors, num_pred, stride
super(Yolo_V4_Predictor, self).__init__()
self.conv_pred = Static_ConvLayer(channels, pred_channels, 1, bias=True, batch_norm=False, Act=Act)
def forward(self, f, anchors=None):
pred = self.conv_pred(f)
pred = pred.view(f.shape[0], self.num_anchors, self.num_pred, f.shape[2], f.shape[3])
pred = pred.permute(0, 1, 3, 4, 2).contiguous().view(f.shape[0], -1, self.num_pred)
if not self.training:
if anchors.shape[1] != pred.shape[1]:
raise RuntimeError('number of anchors and regressions must be matched')
pred[..., :2] = (torch.sigmoid(pred[..., :2]) * 2 - 0.5) * self.stride + anchors[..., :2]
pred[..., 2:4] = (torch.sigmoid(pred[..., 2:4]) * 2) ** 2 * anchors[..., 2:]
return pred
class Yolo_V4_Head(nn.Module):
"""
__version__ = 1.0
__date__ = Mar 7, 2022
The structure is based on the implementation;
https://github.com/AlexeyAB/darknet
"""
def __init__(self,
num_levels: int,
channels: list,
num_anchors: list,
num_classes: int,
strides: list,
Act: nn.Module = None):
self.num_levels = num_levels
if len(channels) != num_levels or len(num_anchors) != num_levels:
raise ValueError('make len(channels) == num_levels, and len(num_anchors) == num_levels')
super(Yolo_V4_Head, self).__init__()
self.heads = nn.ModuleList([Yolo_V4_Predictor(channels[i], num_anchors[i], num_classes, strides[i], Act)
for i in range(num_levels)])
def forward(self, features, anchors: List[Tensor] = None):
out = []
for i in range(self.num_levels):
if anchors:
pred = self.heads[i](features[i], anchors[i])
else:
pred = self.heads[i](features[i], None)
out.append(pred)
out = torch.cat(out, 1)
return out
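# Minimal shape-check sketch (assumption: this block is illustrative and not
# part of the original module; the channel sizes, strides and 80-class setting
# are made up, and `layers` is assumed to re-export torch/nn as used above).
if __name__ == '__main__':
    import torch
    head = Yolo_V3_Head(num_levels=3,
                        channels=[256, 512, 1024],
                        num_anchors=[3, 3, 3],
                        num_classes=80,
                        strides=[8, 16, 32])
    head.train()  # in training mode no anchor tensors are required
    feats = [torch.randn(2, c, s, s)
             for c, s in zip([256, 512, 1024], [52, 26, 13])]
    preds = head(feats)
    print(preds.shape)  # torch.Size([2, 10647, 85]); 10647 = 3*(52^2+26^2+13^2)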
| 28.8125 | 112 | 0.555315 | 651 | 5,071 | 4.095238 | 0.168971 | 0.067517 | 0.03901 | 0.024006 | 0.889722 | 0.873218 | 0.809452 | 0.809452 | 0.75994 | 0.75994 | 0 | 0.032342 | 0.317097 | 5,071 | 175 | 113 | 28.977143 | 0.737511 | 0.097022 | 0 | 0.725275 | 0 | 0 | 0.052302 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.087912 | false | 0 | 0.010989 | 0 | 0.186813 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
003ef86e6cac90a12ad888e7af3d7729ed67c30f | 96 | py | Python | bin/mailroom.py | UWPCE-Py310-Fall2020/mailroom-everything-kiljoy001 | 2f35ccb2e02ea48c1b18044670f55cb3dc661d93 | [
"MIT"
] | null | null | null | bin/mailroom.py | UWPCE-Py310-Fall2020/mailroom-everything-kiljoy001 | 2f35ccb2e02ea48c1b18044670f55cb3dc661d93 | [
"MIT"
] | null | null | null | bin/mailroom.py | UWPCE-Py310-Fall2020/mailroom-everything-kiljoy001 | 2f35ccb2e02ea48c1b18044670f55cb3dc661d93 | [
"MIT"
] | null | null | null | import mailroom_stuff.Front_End
if __name__ == '__main__':
mailroom_stuff.Front_End.main()
| 19.2 | 35 | 0.770833 | 13 | 96 | 4.769231 | 0.615385 | 0.419355 | 0.580645 | 0.677419 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 96 | 4 | 36 | 24 | 0.738095 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
cc85a8ed37eb07f1d484f07d64bfd7ea294dcdd8 | 16,779 | py | Python | apsuite/commisslib/bo_opt_injection.py | carneirofc/apsuite | 1bbaa44ec6b89f50201790d6fab05c32729db6e1 | [
"MIT"
] | 1 | 2016-02-25T01:48:49.000Z | 2016-02-25T01:48:49.000Z | apsuite/commisslib/bo_opt_injection.py | carneirofc/apsuite | 1bbaa44ec6b89f50201790d6fab05c32729db6e1 | [
"MIT"
] | 12 | 2015-09-25T12:46:41.000Z | 2022-03-22T12:04:03.000Z | apsuite/commisslib/bo_opt_injection.py | carneirofc/apsuite | 1bbaa44ec6b89f50201790d6fab05c32729db6e1 | [
"MIT"
] | 2 | 2022-02-08T13:12:26.000Z | 2022-03-15T17:38:11.000Z | """Optimize booster injection efficiency by scanning TB and injection magnet settings."""
import time as _time
from epics import PV as _PV
import numpy as _np
import matplotlib.pyplot as plt
from ..optimization import PSO, SimulAnneal
class PSOInjection(PSO):
"""."""
def __init__(self, save=False):
"""."""
self.reference = []
self.niter = 0
self.nr_turns = 0
self.nr_bpm = 0
self.nswarm = 0
self.bpm_idx = 0
self._name_hands = []
self._name_quads = []
self._name_corrs = []
self._name_sept = []
self._name_kckr = []
self.eyes = []
self.hands = []
self.pv_nr_pts_sp = []
self.pv_nr_pts_rb = []
self.pv_buffer_mon = []
self.pv_buffer_reset = []
self.pv_nr_sample = []
self._wait_change = 0
self.f_init = 0
PSO.__init__(self, save=save)
def initialization(self):
"""."""
print('='*50)
d_quad = input(
'TB Quads KL Variation (Default = 1 [1/m]): ')
d_corr = input(
'TB Corrs Kick Variation (Default = 1000 [urad]): ')
d_sept = input('InjSept Variation (Default = 2 [mrad]): ')
d_kckr = input('InjKicker Variation (Default = 2 [mrad]): ')
print('='*50)
        nr_iter = input('Number of Iterations (Default = 10): ')
nr_swarm = input('Swarm Size (Default = 10 + 2 * sqrt(D)): ')
nr_pts = input('Buffer Size (SOFB) (Default = 10): ')
nr_turns = input(
'Number of Turns to Measure Sum Signal (Default = 1): ')
nr_bpm = input(
'Set the last BPM to Read ' +
'(Default = 50, Range = [1,50]): ')
print('='*50)
if not nr_iter:
nr_iter = 10
if not nr_swarm:
nr_swarm = 0
if not nr_pts:
nr_pts = 10
if not nr_bpm:
nr_bpm = 50
if not nr_turns:
nr_turns = 1
if not d_quad:
d_quad = 1
if not d_corr:
d_corr = 1000
if not d_kckr:
d_kckr = 2
if not d_sept:
d_sept = 2
d_quad = float(d_quad)
d_corr = float(d_corr)
d_sept = float(d_sept)
d_kckr = float(d_kckr)
nr_iter = int(nr_iter)
nr_swarm = int(nr_swarm)
nr_pts = int(nr_pts)
nr_turns = int(nr_turns)
nr_bpm = int(nr_bpm)
if not d_quad + d_corr + d_sept + d_kckr:
raise Exception('You have set zero variation for all dimensions!')
self.niter = nr_iter
self.nswarm = nr_swarm
self.nr_turns = nr_turns
self.nr_bpm = nr_bpm
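        # SOFB multi-turn arrays stack 50 BPM readings per turn, so this is
        # the flattened index of the last sample entering the objective.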
self.bpm_idx = self.nr_bpm + 50 * (self.nr_turns - 1)
self._wait_change = 1
self.get_pvs()
print('Waiting for PVs connection...')
print('='*50)
while True:
if self.check_connect():
break
self.pv_nr_pts_sp.value = nr_pts
quad_lim = _np.ones(len(self._name_quads)) * d_quad
corr_lim = _np.ones(len(self._name_corrs)) * d_corr
sept_lim = _np.ones(len(self._name_sept)) * d_sept
kckr_lim = _np.ones(len(self._name_kckr)) * d_kckr
up = _np.concatenate((quad_lim, corr_lim, sept_lim, kckr_lim))
down = -1 * up
self.set_limits(upper=up, lower=down)
print(
            'The script will take about {:.2f} minutes to finish'.format(
(nr_pts*0.5 + self._wait_change) *
self.nswarm * self.niter / 60))
print('='*50)
self.reference = _np.array([h.value for h in self.hands])
self.reset_wait_buffer()
self.init_obj_func()
print('Initial Objective Function {:.5f}'.format(self.f_init))
print('='*50)
def get_pvs(self):
"""."""
# Number of turns to measure Sum Signal
prefix = ''
self.pv_nr_sample = _PV(
prefix + 'BO-Glob:AP-SOFB:TrigNrSamplesPost-SP')
_time.sleep(self._wait_change)
self.pv_nr_sample.value = int(self.nr_turns)
# Diagnostic to calculate Objective Function
self.eyes = _PV(
prefix + 'BO-Glob:AP-SOFB:MTurnSum-Mon', auto_monitor=True)
self._name_quads = [
# 'TB-01:MA-QF1', 'TB-01:MA-QD1',
'TB-02:MA-QF2A', 'TB-02:MA-QD2A',
'TB-02:MA-QF2B', 'TB-02:MA-QD2B',
'TB-03:MA-QF3', 'TB-03:MA-QD3',
'TB-04:MA-QF4', 'TB-04:MA-QD4',
]
self._name_quads = [prefix + q + ':KL-SP' for q in self._name_quads]
self._name_hands.extend(self._name_quads)
self._name_corrs = [
# 'TB-01:MA-CH-1', 'TB-01:MA-CV-1',
# 'TB-01:MA-CH-2', 'TB-01:MA-CV-2',
# 'TB-02:MA-CH-1', 'TB-02:MA-CV-1',
# 'TB-02:MA-CH-2', 'TB-02:MA-CV-2',
'TB-04:MA-CH-1', 'TB-04:MA-CV-1',
# 'TB-04:MA-CH-2', # Magnet transformed to QS
'TB-04:MA-CV-2',
]
self._name_corrs = [prefix + c + ':Kick-SP' for c in self._name_corrs]
self._name_hands.extend(self._name_corrs)
self._name_sept = [prefix + 'TB-04:PM-InjSept:Kick-SP']
self._name_hands.extend(self._name_sept)
self._name_kckr = [prefix + 'BO-01D:PM-InjKckr:Kick-SP']
self._name_hands.extend(self._name_kckr)
# Actuator to change settings
self.hands = [_PV(h) for h in self._name_hands]
self.pv_nr_pts_sp = _PV(prefix + 'BO-Glob:AP-SOFB:SmoothNrPts-SP')
self.pv_nr_pts_rb = _PV(prefix + 'BO-Glob:AP-SOFB:SmoothNrPts-RB')
self.pv_buffer_mon = _PV(prefix + 'BO-Glob:AP-SOFB:BufferCount-Mon')
self.pv_buffer_reset = _PV(prefix + 'BO-Glob:AP-SOFB:SmoothReset-Cmd')
def check_connect(self):
"""."""
        conh = [h.connected for h in self.hands]
        cone = self.eyes.connected
        return cone and all(conh)
def get_change(self, part):
"""."""
return self.reference + self.position[part, :]
def set_change(self, change):
"""."""
for k in range(len(self.hands)):
self.hands[k].value = change[k]
def reset_wait_buffer(self):
"""."""
self.pv_buffer_reset.value = 1
_time.sleep(self._wait_change)
while True:
if self.pv_buffer_mon.value == self.pv_nr_pts_rb.value:
break
def save_bpms_sum(self):
"""."""
with open('BPM_Sum.txt', 'a') as sbpm:
sbpm.write('='*50 + '\n')
_np.savetxt(sbpm, self.eyes.value[:self.bpm_idx], fmt='%+.8e')
def init_obj_func(self):
"""."""
self.f_init = -_np.sum(self.eyes.value[:self.bpm_idx])
def calc_obj_fun(self):
"""."""
f_out = _np.zeros(self.nswarm)
for i in range(self.nswarm):
chg = self.get_change(i)
self.set_change(chg)
# _time.sleep(self._wait_change)
self.reset_wait_buffer()
self.save_bpms_sum()
f_out[i] = _np.sum(self.eyes.value[:self.bpm_idx])
print(
'Particle {:02d}/{:d} | Obj. Func. : {:f}'.format(
i+1, self.nswarm, f_out[i]))
print('='*50)
return - f_out
def run(self):
"""."""
self.start()
self.join()
pos = self.best_positions_history
fig = self.best_figures_history
plt.plot(-fig, '-o')
        plt.xlabel('Number of Iterations')
plt.ylabel('Objective Function')
plt.savefig('obj_fun_PSO.png')
plt.show()
print(
'The Objective Function changed from {:.5f} to {:.5f}'.format(
self.f_init, _np.abs(fig[-1])))
if _np.abs(self.f_init) < _np.abs(fig[-1]):
imp = (_np.abs(fig[-1]/self.f_init) - 1) * 100
            print('The script improved the system by {:.3f} %!'.format(imp))
set_opt = input(
'Do you want to set the best configuration found? (y or n): ')
if set_opt == 'y':
best_setting = self.reference + pos[-1, :]
self.set_change(best_setting)
print('Best configuration found was set to the machine!')
else:
print('Ok... Setting initial reference!')
self.set_change(self.reference)
else:
print('It was not possible to improve the system...')
print('Setting initial reference.')
self.set_change(self.reference)
return pos, fig
class SAInjection(SimulAnneal):
"""."""
def __init__(self, save=False):
"""."""
self.reference = []
self.niter = 0
self.nr_turns = 0
self.nr_bpm = 0
self.bpm_idx = 0
self.temperature = 0
self._name_hands = []
self._name_quads = []
self._name_corrs = []
self._name_sept = []
self._name_kckr = []
self.eyes = []
self.hands = []
self.pv_nr_pts_sp = []
self.pv_nr_pts_rb = []
self.pv_buffer_mon = []
self.pv_buffer_reset = []
self.pv_nr_sample = []
self._wait_change = 0
self.f_init = 0
SimulAnneal.__init__(self, save=save)
def initialization(self):
"""."""
print('='*50)
d_quad = input(
'TB Quads KL Max Delta (Default = 0.5 [1/m]): ')
d_corr = input(
'TB Corrs Kick Max Delta (Default = 500 [urad]): ')
d_sept = input('InjSept Variation (Default = 2 [mrad]): ')
d_kckr = input('Inj Kicker Max Delta (Default = 2 [mrad]): ')
temp = input('Initial Temperature (Default = 0): ')
print('='*50)
        nr_iter = input('Number of Iterations (Default = 100): ')
nr_pts = input('Buffer Size (SOFB) (Default = 10): ')
nr_turns = input(
'Number of Turns to Measure BPM Sum Signal (Default = 1): ')
nr_bpm = input(
'Set the last BPM to Read ' +
'(Default = 50, Range = [1,50]): ')
print('='*50)
if not nr_iter:
nr_iter = 100
if not nr_pts:
nr_pts = 10
if not nr_bpm:
nr_bpm = 50
if not nr_turns:
nr_turns = 1
if not d_quad:
d_quad = 0.5
if not d_corr:
d_corr = 500
if not d_kckr:
d_kckr = 2
if not d_sept:
d_sept = 2
if not temp:
temp = 0
d_quad = float(d_quad)
d_corr = float(d_corr)
d_sept = float(d_sept)
d_kckr = float(d_kckr)
temp = float(temp)
nr_iter = int(nr_iter)
nr_pts = int(nr_pts)
nr_turns = int(nr_turns)
nr_bpm = int(nr_bpm)
if not d_quad + d_corr + d_sept + d_kckr:
raise Exception('You have set zero variation for all dimensions!')
self.niter = nr_iter
self.nr_turns = nr_turns
self.nr_bpm = nr_bpm
self.bpm_idx = self.nr_bpm + 50 * (self.nr_turns - 1)
self.temperature = temp
self._wait_change = 1
self.get_pvs()
print('Waiting for PVs connection...')
print('='*50)
while True:
if self.check_connect():
break
self.pv_nr_pts_sp.value = nr_pts
quad_lim = _np.ones(len(self._name_quads)) * d_quad
corr_lim = _np.ones(len(self._name_corrs)) * d_corr
sept_lim = _np.ones(len(self._name_sept)) * d_sept
kckr_lim = _np.ones(len(self._name_kckr)) * d_kckr
delta = _np.concatenate((quad_lim, corr_lim, sept_lim, kckr_lim))
self.set_deltas(dmax=delta)
print(
'The script will take {:.2f} minutes to be finished'.format(
(nr_pts*0.5 + self._wait_change) * self.niter / 60))
print('='*50)
self.reference = _np.array([h.value for h in self.hands])
self.position = self.reference
self.reset_wait_buffer()
self.init_obj_fun()
print('Initial Objective Function {:.5f}'.format(self.f_init))
print('='*50)
def get_pvs(self):
"""."""
prefix = ''
# Number of turns to measure Sum Signal
self.pv_nr_sample = _PV(
prefix + 'BO-Glob:AP-SOFB:TrigNrSamplesPost-SP')
_time.sleep(self._wait_change)
self.pv_nr_sample.value = int(self.nr_turns)
# Diagnostic to calculate Objective Function
self.eyes = _PV(
prefix + 'BO-Glob:AP-SOFB:MTurnSum-Mon', auto_monitor=True)
self._name_quads = [
# 'TB-01:MA-QF1', 'TB-01:MA-QD1',
# 'TB-02:MA-QF2A', 'TB-02:MA-QD2A',
# 'TB-02:MA-QF2B', 'TB-02:MA-QD2B',
'TB-03:MA-QF3', 'TB-03:MA-QD3',
'TB-04:MA-QF4', 'TB-04:MA-QD4',
]
self._name_quads = [prefix + q + ':KL-SP' for q in self._name_quads]
self._name_hands.extend(self._name_quads)
self._name_corrs = [
# 'TB-01:MA-CH-1', 'TB-01:MA-CV-1',
# 'TB-01:MA-CH-2', 'TB-01:MA-CV-2',
# 'TB-02:MA-CH-1', 'TB-02:MA-CV-1',
# 'TB-02:MA-CH-2', 'TB-02:MA-CV-2',
'TB-04:MA-CH-1', 'TB-04:MA-CV-1',
# 'TB-04:MA-CH-2', # Magnet transformed to QS
'TB-04:MA-CV-2',
]
self._name_corrs = [prefix + c + ':Kick-SP' for c in self._name_corrs]
self._name_hands.extend(self._name_corrs)
self._name_sept = [prefix + 'TB-04:PM-InjSept:Kick-SP']
self._name_hands.extend(self._name_sept)
self._name_kckr = [prefix + 'BO-01D:PM-InjKckr:Kick-SP']
self._name_hands.extend(self._name_kckr)
# Actuator to change settings
self.hands = [_PV(h) for h in self._name_hands]
self.pv_nr_pts_sp = _PV(prefix + 'BO-Glob:AP-SOFB:SmoothNrPts-SP')
self.pv_nr_pts_rb = _PV(prefix + 'BO-Glob:AP-SOFB:SmoothNrPts-RB')
self.pv_buffer_mon = _PV(prefix + 'BO-Glob:AP-SOFB:BufferCount-Mon')
self.pv_buffer_reset = _PV(prefix + 'BO-Glob:AP-SOFB:SmoothReset-Cmd')
def check_connect(self):
"""."""
        conh = [h.connected for h in self.hands]
        cone = self.eyes.connected
        return cone and all(conh)
def get_change(self):
"""."""
return self.position
def set_change(self, change):
"""."""
for k in range(len(self.hands)):
self.hands[k].value = change[k]
def reset_wait_buffer(self):
"""."""
self.pv_buffer_reset.value = 1
_time.sleep(self._wait_change)
while True:
if self.pv_buffer_mon.value == self.pv_nr_pts_rb.value:
break
def init_obj_fun(self):
"""."""
self.f_init = -_np.sum(self.eyes.value[:self.bpm_idx])
def save_bpms_sum(self):
"""."""
with open('BPM_Sum.txt', 'a') as sbpm:
            sbpm.write('='*50 + '\n')
_np.savetxt(sbpm, self.eyes.value[:self.bpm_idx], fmt='%+.8e')
def calc_obj_fun(self):
"""."""
chg = self.get_change()
self.set_change(chg)
# _time.sleep(self._wait_change)
self.reset_wait_buffer()
self.save_bpms_sum()
f_out = _np.sum(self.eyes.value[:self.bpm_idx])
return - f_out
def run(self):
"""."""
self.start()
self.join()
pos = self.best_positions_history
fig = self.best_figures_history
plt.plot(-fig, '-o')
        plt.xlabel('Number of Iterations')
plt.ylabel('Objective Function')
plt.savefig('obj_fun_SA.png')
plt.show()
print(
'The Objective Function changed from {:.5f} to {:.5f}'.format(
self.f_init, _np.abs(fig[-1])))
if _np.abs(self.f_init) < _np.abs(fig[-1]):
imp = (_np.abs(fig[-1]/self.f_init) - 1) * 100
            print(
                'The script improved the system by {:.3f} %!'.format(imp))
set_opt = input(
'Set the best configuration found? (y or n): ')
if set_opt == 'y':
_np.savetxt('initial_reference.txt', self.reference)
# _np.savetxt('initial_obj_fun.txt', self.f_init)
self.set_change(pos[-1, :])
print('Best configuration found was set to the machine!')
else:
print('Ok... Setting initial reference!')
self.set_change(self.reference)
        else:
            print('It was not possible to improve the system...')
            print('Setting initial reference.')
            self.set_change(self.reference)
        return pos, fig
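# Usage sketch (illustrative): running the simulated annealing optimizer end
# to end. Assumes a live EPICS control system exposing the PVs above; kept
# commented out because run() drives real hardware.
# if __name__ == '__main__':
#     sa_opt = SAInjection(save=True)
#     best_positions, best_figures = sa_opt.run()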
| 32.643969 | 79 | 0.530127 | 2,281 | 16,779 | 3.669882 | 0.111355 | 0.049695 | 0.017202 | 0.015769 | 0.8717 | 0.8717 | 0.855692 | 0.837295 | 0.825111 | 0.814598 | 0 | 0.027642 | 0.331605 | 16,779 | 513 | 80 | 32.707602 | 0.71877 | 0.051851 | 0 | 0.790404 | 0 | 0 | 0.164224 | 0.031157 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.012626 | 0 | 0.093434 | 0.078283 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cca81de2f16f0dbfdf119477f9008ec72fa16dc7 | 9,063 | py | Python | app/test/unittest/test_admin.py | michalkoziara/IoT-RESTful-Webservice | ecb0f3e09cded3190f3646e5cd6c913056d94981 | [
"bzip2-1.0.6"
] | 2 | 2021-09-24T02:45:32.000Z | 2021-11-15T09:44:44.000Z | app/test/unittest/test_admin.py | PKramek/IoT-RESTful-Webservice-1 | ecb0f3e09cded3190f3646e5cd6c913056d94981 | [
"bzip2-1.0.6"
] | null | null | null | app/test/unittest/test_admin.py | PKramek/IoT-RESTful-Webservice-1 | ecb0f3e09cded3190f3646e5cd6c913056d94981 | [
"bzip2-1.0.6"
] | 1 | 2021-09-11T11:47:32.000Z | 2021-09-11T11:47:32.000Z | from unittest.mock import patch
import pytest
from app.main.repository.admin_repository import AdminRepository
from app.main.repository.device_group_repository import DeviceGroupRepository
from app.main.repository.user_repository import UserRepository
from app.main.service.admin_service import AdminService
from app.main.util.constants import Constants
def test_create_admin_should_return_success_message_when_valid_parameters(
get_device_group_default_values,
create_device_group):
admin_service_instance = AdminService.get_instance()
device_group_values = get_device_group_default_values()
device_group_values['admin_id'] = None
device_group = create_device_group(device_group_values)
with patch.object(UserRepository,
'get_user_by_email_or_username') as get_user_by_email_or_username_mock:
get_user_by_email_or_username_mock.return_value = None
with patch.object(AdminRepository,
'get_admin_by_email_or_username') as get_admin_by_email_or_username_mock:
get_admin_by_email_or_username_mock.return_value = None
with patch.object(DeviceGroupRepository,
'get_device_group_by_product_key') as get_device_group_by_product_key_mock:
get_device_group_by_product_key_mock.return_value = device_group
with patch.object(AdminRepository, 'save') as save_mock:
save_mock.return_value = True
with patch(
'app.main.service.admin_service.is_password_hash_correct'
) as password_check_mock:
password_check_mock.return_value = True
result = admin_service_instance.create_admin(
'username',
'email',
'password',
'product_key',
device_group.password
)
assert result
assert result == Constants.RESPONSE_MESSAGE_CREATED
@pytest.mark.parametrize("username, email, password, product_key, product_password", [
(None, 'test email', "password", "product key", "product password"),
('test username', None, "password", "product key", "product password"),
('test username', 'test email', None, "product key", "product password"),
('test username', 'test email', "password", None, "product password"),
('test username', 'test email', "password", "product key", None)])
def test_create_user_should_return_bad_request_message_when_no_parameter(
username,
email,
password,
product_key,
product_password):
admin_service_instance = AdminService.get_instance()
result = admin_service_instance.create_admin(username, email, password, product_key, product_password)
assert result
assert result == Constants.RESPONSE_MESSAGE_BAD_REQUEST
def test_create_user_should_return_user_already_exists_message_when_duplicate_user(create_user):
admin_service_instance = AdminService.get_instance()
user = create_user()
with patch.object(UserRepository, 'get_user_by_email_or_username') as get_user_by_email_or_username_mock:
get_user_by_email_or_username_mock.return_value = user
result = admin_service_instance.create_admin(
user.username,
'email',
'password',
'product_key',
'product password'
)
assert result
assert result == Constants.RESPONSE_MESSAGE_USER_ALREADY_EXISTS
def test_create_user_should_return_user_already_exists_message_when_duplicate_admin(create_admin):
admin_service_instance = AdminService.get_instance()
admin = create_admin()
with patch.object(UserRepository, 'get_user_by_email_or_username') as get_user_by_email_or_username_mock:
get_user_by_email_or_username_mock.return_value = None
with patch.object(AdminRepository, 'get_admin_by_email_or_username') as get_admin_by_email_or_username_mock:
get_admin_by_email_or_username_mock.return_value = admin
result = admin_service_instance.create_admin(
admin.username,
'email',
'password',
'product_key',
'product password'
)
assert result
assert result == Constants.RESPONSE_MESSAGE_USER_ALREADY_EXISTS
def test_create_user_should_return_error_message_when_save_failed(
get_device_group_default_values,
create_device_group):
admin_service_instance = AdminService.get_instance()
device_group_values = get_device_group_default_values()
device_group_values['admin_id'] = None
device_group = create_device_group(device_group_values)
with patch.object(UserRepository,
'get_user_by_email_or_username') as get_user_by_email_or_username_mock:
get_user_by_email_or_username_mock.return_value = None
with patch.object(AdminRepository,
'get_admin_by_email_or_username') as get_admin_by_email_or_username_mock:
get_admin_by_email_or_username_mock.return_value = None
with patch.object(DeviceGroupRepository,
'get_device_group_by_product_key') as get_device_group_by_product_key_mock:
get_device_group_by_product_key_mock.return_value = device_group
with patch.object(AdminRepository, 'save') as save_mock:
save_mock.return_value = False
with patch(
'app.main.service.admin_service.is_password_hash_correct'
) as password_check_mock:
password_check_mock.return_value = True
result = admin_service_instance.create_admin(
'username',
'email',
'password',
'product_key',
device_group.password
)
assert result
assert result == Constants.RESPONSE_MESSAGE_ERROR
def test_create_user_should_return_product_key_not_found_message_when_device_group_has_admin(create_device_group):
admin_service_instance = AdminService.get_instance()
device_group = create_device_group()
with patch.object(UserRepository,
'get_user_by_email_or_username') as get_user_by_email_or_username_mock:
get_user_by_email_or_username_mock.return_value = None
with patch.object(AdminRepository,
'get_admin_by_email_or_username') as get_admin_by_email_or_username_mock:
get_admin_by_email_or_username_mock.return_value = None
with patch.object(DeviceGroupRepository,
'get_device_group_by_product_key') as get_device_group_by_product_key_mock:
get_device_group_by_product_key_mock.return_value = device_group
result = admin_service_instance.create_admin(
'username',
'email',
'password',
'product_key',
device_group.password
)
assert result
assert result == Constants.RESPONSE_MESSAGE_PRODUCT_KEY_NOT_FOUND
def test_create_admin_should_return_invalid_credentials_message_when_invalid_device_group_password(
get_device_group_default_values,
create_device_group):
admin_service_instance = AdminService.get_instance()
device_group_values = get_device_group_default_values()
device_group_values['admin_id'] = None
device_group = create_device_group(device_group_values)
with patch.object(UserRepository,
'get_user_by_email_or_username') as get_user_by_email_or_username_mock:
get_user_by_email_or_username_mock.return_value = None
with patch.object(AdminRepository,
'get_admin_by_email_or_username') as get_admin_by_email_or_username_mock:
get_admin_by_email_or_username_mock.return_value = None
with patch.object(DeviceGroupRepository,
'get_device_group_by_product_key') as get_device_group_by_product_key_mock:
get_device_group_by_product_key_mock.return_value = device_group
with patch.object(AdminRepository, 'save') as save_mock:
save_mock.return_value = True
result = admin_service_instance.create_admin(
'username',
'email',
'password',
'product_key',
'not' + device_group.password
)
assert result
assert result == Constants.RESPONSE_MESSAGE_INVALID_CREDENTIALS
if __name__ == '__main__':
    pytest.main([__file__])
| 39.92511 | 116 | 0.667439 | 1,023 | 9,063 | 5.381232 | 0.077224 | 0.099909 | 0.053951 | 0.101907 | 0.873206 | 0.862852 | 0.803633 | 0.767847 | 0.751135 | 0.751135 | 0 | 0 | 0.272978 | 9,063 | 226 | 117 | 40.10177 | 0.835483 | 0 | 0 | 0.70303 | 0 | 0 | 0.123469 | 0.061569 | 0 | 0 | 0 | 0 | 0.084848 | 1 | 0.042424 | false | 0.169697 | 0.042424 | 0 | 0.084848 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
ccd56f43649d9064e10eef90768b5751d787cbc7 | 18,177 | py | Python | unit_tests/web_tests/test_web_user.py | hep-gc/cloud-scheduler-2 | 180d9dc4f8751cf8c8254518e46f83f118187e84 | [
"Apache-2.0"
] | 3 | 2020-03-03T03:25:36.000Z | 2021-12-03T15:31:39.000Z | unit_tests/web_tests/test_web_user.py | hep-gc/cloud-scheduler-2 | 180d9dc4f8751cf8c8254518e46f83f118187e84 | [
"Apache-2.0"
] | 341 | 2017-06-08T17:27:59.000Z | 2022-01-28T19:37:57.000Z | unit_tests/web_tests/test_web_user.py | hep-gc/cloud-scheduler-2 | 180d9dc4f8751cf8c8254518e46f83f118187e84 | [
"Apache-2.0"
] | 3 | 2018-04-25T16:13:20.000Z | 2020-04-15T20:03:46.000Z | if __name__ == "__main__" or __name__ == "test_web_user":
__package__ = 'cloudscheduler.unit_tests.web_tests'
import unittest
import sys
from . import web_test_setup_cleanup as wtsc
from . import web_test_assertions_v2 as wta
from . import web_test_page_objects as pages
from . import web_test_helpers as helpers
class TestWebUserCommon(unittest.TestCase):
"""A class for the user tests that should be repeated in all iterations."""
@classmethod
def setUpClass(cls):
cls.page = pages.UsersPage(cls.driver, cls.gvar['address'])
cls.oversize = cls.gvar['oversize']
def setUp(self):
self.page.get_homepage()
self.page.click_top_nav('Users')
def test_web_user_find(self):
# Finds the users page
pass
def test_web_user_add_with_group(self):
# Adds a user who's in a group
username = self.gvar['user'] + '-wiu5'
group_name = self.gvar['user'] + '-wig3'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.side_button_exists(username))
self.assertTrue(self.page.group_box_checked(group_name))
wta.assertHasAttribute('user', username, 'user_groups', group_name)
def test_web_user_add_without_group(self):
# Adds a user who's not in any groups
username = self.gvar['user'] + '-wiu6'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_add_user()
self.assertTrue(self.page.side_button_exists(username))
wta.assertHasAttribute('user', username, 'user_groups', 'None')
def test_web_user_add_superuser(self):
# Adds a super user
username = self.gvar['user'] + '-wiu7'
group_name = self.gvar['user'] + '-wig3'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.side_button_exists(username))
self.assertTrue(self.page.superuser_box_checked())
wta.assertHasAttribute('user', username, 'is_superuser', '1')
def test_web_user_add_regular_user(self):
# Adds a non-super user
username = self.gvar['user'] + '-wiu8'
group_name = self.gvar['user'] + '-wig3'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.side_button_exists(username))
self.assertFalse(self.page.superuser_box_checked())
wta.assertHasAttribute('user', username, 'is_superuser', '0')
def test_web_user_add_without_username(self):
# Tries to add a user without a username
group_name = self.gvar['user'] + '-wig3'
self.page.click_add_button()
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
def test_web_user_add_with_conflicting_username(self):
# Tries to add a user with a username that's already taken
username = self.gvar['user'] + '-wiu1'
group_name = self.gvar['user'] + '-wig3'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
wta.assertHasAttribute('user', username, 'is_superuser', '0')
def test_web_user_add_name_with_symbols(self):
# Tries to add a user with symbols in their name
username = 'inv@|id-web-te$t'
group_name = self.gvar['user'] + '-wig1'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_add_name_with_two_dashes(self):
# Tries to add a user with two dashes in their name
username = 'invalid--web--test'
group_name = self.gvar['user'] + '-wig1'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_add_name_with_uppercase(self):
# Tries to add a user with uppercase letters in their name
username = 'INVALID-WEB-TEST'
group_name = self.gvar['user'] + '-wig1'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_add_name_with_starting_ending_dash(self):
# Tries to add a user with starting and ending dashes in their name
username = '-invalid-web-test-'
group_name = self.gvar['user'] + '-wig1'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_add_name_too_long(self):
# Tries to add a user with a name that is too long for the database
username = self.oversize['varchar_32']
group_name = self.gvar['user'] + '-wig1'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'])
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_add_without_password(self):
# Tries to add a user without a password
username = self.gvar['user'] + '-wiu9'
group_name = self.gvar['user'] + '-wig3'
self.page.click_add_button()
self.page.type_username(username)
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
wta.assertNotExists('user', username)
def test_web_user_add_password_mismatched(self):
# Tries to add a user with a non-matching "confirm password"
username = self.gvar['user'] + '-wiu9'
group_name = self.gvar['user'] + '-wig3'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(self.gvar['user_secret'], 'incorrect_password')
self.page.click_superuser_checkbox()
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
wta.assertNotExists('user', username)
def test_web_user_add_password_too_short(self):
# Tries to add a user with a password that's too short
username = self.gvar['user'] + '-wiu9'
group_name = self.gvar['user'] + '-wig3'
password = 'Aa1'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(password)
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_add_password_without_uppercase(self):
# Tries to add a user with a password without uppercase letters
username = self.gvar['user'] + '-wiu9'
group_name = self.gvar['user'] + '-wig3'
password = 'abcd1234'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(password)
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_add_password_without_lowercase(self):
# Tries to add a user with a password without lowercase letters
username = self.gvar['user'] + '-wiu9'
group_name = self.gvar['user'] + '-wig3'
password = 'ABCD1234'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(password)
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_add_password_without_numbers(self):
# Tries to add a user with a password without numbers
username = self.gvar['user'] + '-wiu9'
group_name = self.gvar['user'] + '-wig3'
password = 'ABCDabcd'
self.page.click_add_button()
self.page.type_username(username)
self.page.type_password(password)
self.page.click_group_checkbox(group_name)
self.page.click_add_user()
self.assertTrue(self.page.error_message_displayed())
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_update_password(self):
# Changes a user's password
username = self.gvar['user'] + '-wiu4'
self.page.click_side_button(username)
self.page.type_password(username + '-password')
self.page.click_update_user()
self.assertFalse(self.page.error_message_displayed())
def test_web_user_update_password_mismatched(self):
# Tries to change a user's password with a non-matching "confirm password"
username = self.gvar['user'] + '-wiu4'
self.page.click_side_button(username)
self.page.type_password(username + '-password', 'incorrect_password')
self.page.click_update_user()
self.assertTrue(self.page.error_message_displayed())
def test_web_user_update_password_too_short(self):
# Tries to change a user's password to one that's too short
username = self.gvar['user'] + '-wiu4'
self.page.click_side_button(username)
self.page.type_password('Aa1')
self.page.click_update_user()
self.assertTrue(self.page.error_message_displayed())
def test_web_user_update_password_without_uppercase(self):
# Tries to change a user's password to one without uppercase letters
username = self.gvar['user'] + '-wiu4'
self.page.click_side_button(username)
self.page.type_password('abcd1234')
self.page.click_update_user()
self.assertTrue(self.page.error_message_displayed())
def test_web_user_update_password_without_lowercase(self):
# Tries to change a user's password to one without lowercase letters
username = self.gvar['user'] + '-wiu4'
self.page.click_side_button(username)
self.page.type_password('ABCD1234')
self.page.click_update_user()
self.assertTrue(self.page.error_message_displayed())
def test_web_user_update_password_without_numbers(self):
# Tries to change a user's password to one without numbers
username = self.gvar['user'] + '-wiu4'
self.page.click_side_button(username)
self.page.type_password('ABCDabcd')
self.page.click_update_user()
self.assertTrue(self.page.error_message_displayed())
@unittest.skip("No current infrastructure to test this.")
def test_web_user_update_cert_cn(self):
# Changes a user's certificate common name
pass
@unittest.skip("No current infrastructure to test this.")
def test_web_user_update_cert_cn_too_long(self):
# Tries to change a user's certificate common name to one that's too long for the database
pass
def test_web_user_update_superuser(self):
# Changes a regular user to a super user
username = self.gvar['user'] + '-wiu4'
self.page.click_side_button(username)
self.page.click_superuser_checkbox()
self.page.click_update_user()
self.assertTrue(self.page.superuser_box_checked())
wta.assertHasAttribute('user', username, 'is_superuser', '1')
def test_web_user_update_group_add(self):
# Adds a user to a group
username = self.gvar['user'] + '-wiu4'
group_name = self.gvar['user'] + '-wig3'
self.page.click_side_button(username)
self.page.click_group_checkbox(group_name)
self.page.click_update_user()
self.assertTrue(self.page.group_box_checked(group_name))
wta.assertHasAttribute('user', username, 'user_groups', group_name)
def test_web_user_update_group_remove(self):
# Removes a user from a group
username = self.gvar['user'] + '-wiu4'
group_name = self.gvar['user'] + '-wig1'
self.page.click_side_button(username)
self.page.click_group_checkbox(group_name)
self.page.click_update_user()
self.assertFalse(self.page.group_box_checked(group_name))
wta.assertHasNotAttribute('user', username, 'user_groups', group_name)
def test_web_user_delete(self):
# Deletes a user
username = self.gvar['user'] + '-wiu3'
self.page.click_side_button(username)
self.page.click_delete_button()
self.page.click_delete_modal()
self.assertFalse(self.page.side_button_exists(username))
wta.assertNotExists('user', username)
def test_web_user_delete_cancel(self):
# Tries to delete a user but clicks cancel
username = self.gvar['user'] + '-wiu1'
self.page.click_side_button(username)
self.page.click_delete_button()
self.page.click_delete_cancel()
self.assertTrue(self.page.side_button_exists(username))
wta.assertExists('user', username)
@classmethod
def tearDownClass(cls):
wtsc.cleanup(cls)
class TestWebUserSuperUserFirefox(TestWebUserCommon):
"""A class to test user operations via the web interface, in Firefox, with a super user."""
@classmethod
def setUpClass(cls):
try:
wtsc.setup(cls, 2, ['users'], browser='firefox')
super(TestWebUserSuperUserFirefox, cls).setUpClass()
print("\nUser Tests:")
except:
print("Error in test setup")
super(TestWebUserSuperUserFirefox, cls).tearDownClass()
raise
class TestWebUserSuperUserChromium(TestWebUserCommon):
"""A class to test user operations via the web interface, in Chromium, with a super user."""
@classmethod
def setUpClass(cls):
try:
wtsc.setup(cls, 2, ['users'], browser='chromium')
super(TestWebUserSuperUserChromium, cls).setUpClass()
print("\nUser Tests (Chromium):")
except:
print("Error in test setup")
super(TestWebUserSuperUserChromium, cls).tearDownClass()
raise
class TestWebUserSuperUserOpera(TestWebUserCommon):
"""A class to test user operations via the web interface, in Opera, with a super user."""
@classmethod
def setUpClass(cls):
try:
wtsc.setup(cls, 2, ['users'], browser='opera')
super(TestWebUserSuperUserOpera, cls).setUpClass()
print("\nUser Tests (Opera):")
except:
print("Error in test setup")
super(TestWebUserSuperUserOpera, cls).tearDownClass()
raise
class TestWebUserSuperUserChrome(TestWebUserCommon):
"""A class to test user operations via the web interface, in Chrome, with a super user."""
@classmethod
def setUpClass(cls):
try:
wtsc.setup(cls, 2, ['users'], browser='chrome')
super(TestWebUserSuperUserChrome, cls).setUpClass()
print("\nUser Tests (Chrome):")
except:
print("Error in test setup")
super(TestWebUserSuperUserChrome, cls).tearDownClass()
raise
if __name__ == "__main__":
runner = unittest.TextTestRunner(verbosity=2)
tests = [ TestWebUserSuperUserFirefox,
TestWebUserSuperUserChromium,
TestWebUserSuperUserOpera,
TestWebUserSuperUserChrome ]
suite = helpers.parse_command_line_arguments(sys.argv, tests, False)
runner.run(suite)
| 42.769412 | 98 | 0.674864 | 2,280 | 18,177 | 5.130263 | 0.083772 | 0.114901 | 0.100026 | 0.046508 | 0.846456 | 0.811747 | 0.785928 | 0.754125 | 0.733094 | 0.716509 | 0 | 0.004923 | 0.217803 | 18,177 | 424 | 99 | 42.870283 | 0.817766 | 0.102932 | 0 | 0.696793 | 0 | 0 | 0.075446 | 0.002154 | 0 | 0 | 0 | 0 | 0.177843 | 1 | 0.110787 | false | 0.119534 | 0.017493 | 0 | 0.142857 | 0.023324 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
9dc461aa0d987b80c609fdb7b51effa078d206d1 | 154 | py | Python | app/child_id.py | jieggii/giving-tuesday-bot | f27d143d2f24b81c9121ae0852d3f73a5897b165 | [
"MIT"
] | 1 | 2021-11-18T04:27:19.000Z | 2021-11-18T04:27:19.000Z | app/child_id.py | jieggii/giving-tuesday-bot | f27d143d2f24b81c9121ae0852d3f73a5897b165 | [
"MIT"
] | null | null | null | app/child_id.py | jieggii/giving-tuesday-bot | f27d143d2f24b81c9121ae0852d3f73a5897b165 | [
"MIT"
] | null | null | null | N = 100
def prettify_child_id(child_id: int) -> int:
    return child_id + N
def parse_pretty_child_id(pretty_child_id: int) -> int:
    return pretty_child_id - N
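# Example: with N = 100, prettify_child_id(7) == 107 and
# parse_pretty_child_id(107) == 7.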
| 15.4 | 48 | 0.74026 | 27 | 154 | 3.814815 | 0.37037 | 0.407767 | 0.378641 | 0.31068 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024 | 0.188312 | 154 | 9 | 49 | 17.111111 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.4 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
9de9910ccda9d4b5f9fcf4821a5acc020858996f | 26,461 | py | Python | pytorch_privacy/analysis/pld_accountant.py | MJHutchinson/PytorchPrivacy | b8084914a00b2047054f79d8339609bcdfb9d026 | [
"Apache-2.0"
] | 2 | 2020-01-06T00:54:54.000Z | 2020-05-03T14:55:39.000Z | pytorch_privacy/analysis/pld_accountant.py | MJHutchinson/PytorchPrivacy | b8084914a00b2047054f79d8339609bcdfb9d026 | [
"Apache-2.0"
] | null | null | null | pytorch_privacy/analysis/pld_accountant.py | MJHutchinson/PytorchPrivacy | b8084914a00b2047054f79d8339609bcdfb9d026 | [
"Apache-2.0"
] | 1 | 2019-10-23T00:15:19.000Z | 2019-10-23T00:15:19.000Z | """
Paper: https://arxiv.org/abs/1906.03049
Referenced code from: https://github.com/DPBayes/PLD-Accountant
"""
import os
import pickle
import logging
import numpy as np
from functools import lru_cache
from pytorch_privacy.analysis.utils import grab_pickled_accountant_results
logger = logging.getLogger()
@lru_cache(maxsize=100)
def get_FF1_add_remove(sigma, q, nx, L):
""" For a single Gaussian Mechanism, compute the FFT approximation points for
the privacy loss distribution, under the addition/removal adjacency definition.
:param sigma: The effective noise applied
:param q: The sample probability
:param nx: The number of approximation points
:param L: The clipping length of the approximation
:return: The FFT approximation points
"""
    saved_result_flag, result, fp = False, None, None
    try:
        filename = f"add_remove_{sigma}_{q}_{nx}_{L}.p"
        saved_result_flag, result, fp = grab_pickled_accountant_results(filename)
    except (AttributeError, EOFError, ImportError, IndexError, pickle.UnpicklingError):
        logger.error("Error reading accountant Pickle!")
if saved_result_flag:
return result
# Evaluate the PLD distribution,
# This is the case of substitution relation (subsection 5.1)
if q == 1.0:
q = 1 - 1E-5
half = int(nx / 2)
dx = 2.0 * L / nx # discretisation interval \Delta x
x = np.linspace(-L, L - dx, nx, dtype=np.complex128) # grid for the numerical integration
ii = int(np.floor(float(nx * (L + np.log(1 - q)) / (2 * L))))
# Evaluate the PLD distribution,
# The case of remove/add relation (Subsection 5.1)
Linvx = (sigma ** 2) * np.log((np.exp(x[ii + 1:]) - (1 - q)) / q) + 0.5
ALinvx = (1 / np.sqrt(2 * np.pi * sigma ** 2)) * ((1 - q) * np.exp(-Linvx * Linvx / (2 * sigma ** 2)) +
q * np.exp(-(Linvx - 1) * (Linvx - 1) / (2 * sigma ** 2)))
dLinvx = (sigma ** 2 * np.exp(x[ii + 1:])) / (np.exp(x[ii + 1:]) - (1 - q))
fx = np.zeros(nx)
fx[ii + 1:] = np.real(ALinvx * dLinvx)
# Flip fx, i.e. fx <- D(fx), the matrix D = [0 I;I 0]
temp = np.copy(fx[half:])
fx[half:] = np.copy(fx[:half])
fx[:half] = temp
FF1 = np.fft.fft(fx * dx)
    if fp is not None:
        try:
            os.makedirs(os.path.dirname(fp), exist_ok=True)
            with open(fp, 'wb+') as dump:
                pickle.dump(FF1, dump)
        except (FileNotFoundError, pickle.PickleError, pickle.PicklingError):
            logger.error("Error with saving accountant pickle")
return FF1
@lru_cache(maxsize=100)
def get_FF1_substitution(sigma, q, nx, L):
""" For a single Gaussian Mechanism, compute the FFT approximation points for
the privacy loss distribution, under the substitution adjacency definition.
:param sigma: The effective noise applied
:param q: The sample probability
:param nx: The number of approximation points
:param L: The clipping length of the approximation
:return: The FFT approximation points
"""
    saved_result_flag, result, fp = False, None, None
    try:
        filename = f"sub_{sigma}_{q}_{nx}_{L}.p"
        saved_result_flag, result, fp = grab_pickled_accountant_results(filename)
    except (AttributeError, EOFError, ImportError, IndexError, pickle.UnpicklingError):
        logger.error("Error reading accountant Pickle!")
if saved_result_flag:
return result
# Evaluate the PLD distribution,
# This is the case of substitution relation (subsection 5.2)
half = int(nx / 2)
dx = 2.0 * L / nx # discretisation interval \Delta x
x = np.linspace(-L, L - dx, nx, dtype=np.complex128) # grid for the numerical integration
c = q * np.exp(-1 / (2 * sigma ** 2))
ey = np.exp(x)
term1 = (-(1 - q) * (1 - ey) + np.sqrt((1 - q) ** 2 * (1 - ey) ** 2 + 4 * c ** 2 * ey)) / (2 * c)
term1 = np.maximum(term1, 1e-16)
Linvx = (sigma ** 2) * np.log(term1)
sq = np.sqrt((1 - q) ** 2 * (1 - ey) ** 2 + 4 * c ** 2 * ey)
nom1 = 4 * c ** 2 * ey - 2 * (1 - q) ** 2 * ey * (1 - ey)
term1 = nom1 / (2 * sq)
nom2 = term1 + (1 - q) * ey
nom2 = nom2 * (sq + (1 - q) * (1 - ey))
dLinvx = sigma ** 2 * nom2 / (4 * c ** 2 * ey)
ALinvx = (1 / np.sqrt(2 * np.pi * sigma ** 2)) * ((1 - q) * np.exp(-Linvx * Linvx / (2 * sigma ** 2)) +
q * np.exp(-(Linvx - 1) * (Linvx - 1) / (2 * sigma ** 2)))
fx = np.real(ALinvx * dLinvx)
# Flip fx, i.e. fx <- D(fx), the matrix D = [0 I;I 0]
temp = np.copy(fx[half:])
fx[half:] = np.copy(fx[:half])
fx[:half] = temp
FF1 = np.fft.fft(fx * dx)
    if fp is not None:
        try:
            os.makedirs(os.path.dirname(fp), exist_ok=True)
            with open(fp, 'wb+') as dump:
                pickle.dump(FF1, dump)
        except (FileNotFoundError, pickle.PickleError, pickle.PicklingError):
            logger.error("Error with saving accountant pickle")
return FF1
def get_delta_add_remove(effective_z_t, q_t, target_eps=1.0, nx=1E6, L=20.0, F_prod=None):
""" Computes the approximation of the exact privacy as per https://arxiv.org/abs/1906.03049
for a fixed epsilon, for a list of given mechanisms applied. Considers neighbouring sets
as those with the addition/removal property.
:param effective_z_t: 1D numpy array of the effective noises applied in the mechanisms.
:param q_t: 1D numpy array of the selection probabilities of the data used in the mechanisms.
:param target_eps: The target epsilon to aim for.
:param nx: The number of discretisation points to use.
:param L: The range truncation parameter
:param F_prod: Specify a previous F_prod for computing online accountancy. If none, assume not
online.
    :return: ((epsilon, delta) privacy bound, accumulated F_prod for online reuse).
"""
nx = int(nx)
half = int(nx / 2)
dx = 2.0 * L / nx # discretisation interval \Delta x
x = np.linspace(-L, L - dx, nx, dtype=np.complex128) # grid for the numerical integration
if F_prod is None:
F_prod = np.ones(x.size)
ncomp = effective_z_t.size
if (q_t.size != ncomp):
logger.error('The arrays for q and sigma are of different size!')
        return (float('inf'), float('inf')), F_prod
for ij in range(ncomp):
# Change from original: cache results for speedup on similar requests
sigma = float(effective_z_t[ij])
q = float(q_t[ij])
FF1 = get_FF1_add_remove(sigma, q, nx, L)
# Compute the DFT
F_prod = F_prod * FF1
# first jj for which 1-exp(target_eps-x)>0,
# i.e. start of the integral domain
jj = int(np.floor(float(nx * (L + target_eps) / (2 * L))))
# Compute the inverse DFT
cfx = np.fft.ifft((F_prod / dx))
# Flip again, i.e. cfx <- D(cfx), D = [0 I;I 0]
temp = np.copy(cfx[half:])
cfx[half:] = cfx[:half]
cfx[:half] = temp
# Evaluate \delta(target_eps) and \delta'(target_eps)
exp_e = 1 - np.exp(target_eps - x)
integrand = exp_e * cfx
sum_int = np.sum(integrand[jj + 1:])
delta = sum_int * dx
# print('Unbounded DP-delta after ' + str(int(ncomp)) + ' compositions defined by sigma and q arrays:' + str(
# np.real(delta)) + ' (epsilon=' + str(target_eps) + ')')
# Change from original: return signature consistent across methods to give epsilon and delta.
return (target_eps, np.real(delta)), F_prod
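# Example (parameters are illustrative): delta at eps = 1.0 after 1000
# subsampled Gaussian steps with effective noise 2.0 and sampling ratio 0.01;
# the _substitution variants below take the same arguments:
# (eps, delta), _ = get_delta_add_remove(np.full(1000, 2.0), np.full(1000, 0.01),
#                                        target_eps=1.0)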
def get_delta_substitution(effective_z_t, q_t, target_eps=1.0, nx=1E6, L=20.0, F_prod=None):
""" Computes the approximation of the exact privacy as per https://arxiv.org/abs/1906.03049
for a fixed epsilon, for a list of given mechanisms applied. Considers neighbouring sets
as those with the substitution property.
:param effective_z_t: 1D numpy array of the effective noises applied in the mechanisms.
:param q_t: 1D numpy array of the selection probabilities of the data used in the mechanisms.
:param target_eps: The target epsilon to aim for.
:param nx: The number of discretisation points to use.
:param L: The range truncation parameter
:param F_prod: Specify a previous F_prod for computing online accountancy. If none, assume not
online.
    :return: ((epsilon, delta) privacy bound, accumulated F_prod for online reuse).
"""
nx = int(nx)
half = int(nx / 2)
dx = 2.0 * L / nx # discretisation interval \Delta x
x = np.linspace(-L, L - dx, nx, dtype=np.complex128) # grid for the numerical integration
if F_prod is None:
F_prod = np.ones(x.size)
ncomp = effective_z_t.size
if (q_t.size != ncomp):
logger.error('The arrays for q and sigma are of different size!')
        return (float('inf'), float('inf')), F_prod
for ij in range(ncomp):
# Change from original: cache results for speedup on similar requests
sigma = effective_z_t[ij]
q = q_t[ij]
FF1 = get_FF1_substitution(sigma, q, nx, L)
F_prod = F_prod * FF1
# first jj for which 1-exp(target_eps-x)>0,
# i.e. start of the integral domain
jj = int(np.floor(float(nx * (L + np.real(target_eps)) / (2 * L))))
# Compute the inverse DFT
cfx = np.fft.ifft((F_prod / dx))
# Flip again, i.e. cfx <- D(cfx), D = [0 I;I 0]
temp = np.copy(cfx[half:])
cfx[half:] = cfx[:half]
cfx[:half] = temp
# Evaluate \delta(target_eps) and \delta'(target_eps)
exp_e = 1 - np.exp(target_eps - x)
integrand = exp_e * cfx
sum_int = np.sum(integrand[jj + 1:])
delta = sum_int * dx
# print('Bounded DP-delta after ' + str(int(ncomp)) + ' compositions defined by sigma and q arrays:' + str(np.real(delta)) + ' (epsilon=' + str(target_eps) + ')')
# Change from original: return signature consistent across methods to give epsilon and delta.
return (target_eps, np.real(delta)), F_prod
def get_eps_add_remove(effective_z_t, q_t, target_delta=1e-6, nx=1E6, L=20.0, F_prod=None):
""" Computes the approximation of the exact privacy as per https://arxiv.org/abs/1906.03049
for a fixed epsilon, for a list of given mechanisms applied. Considers neighbouring sets
as those with the addition/removal property.
:param effective_z_t: 1D numpy array of the effective noises applied in the mechanisms.
:param q_t: 1D numpy array of the selection probabilities of the data used in the mechanisms.
:param target_delta: The target delta to aim for.
:param nx: The number of discretisation points to use.
:param L: The range truncation parameter
:param F_prod: Specify a previous F_prod for computing online accountancy. If none, assume not
online.
    :return: ((epsilon, delta) privacy bound, accumulated F_prod for online reuse).
"""
nx = int(nx)
half = int(nx / 2)
tol_newton = 1e-10 # set this to, e.g., 0.01*target_delta
dx = 2.0 * L / nx # discretisation interval \Delta x
x = np.linspace(-L, L - dx, nx, dtype=np.complex128) # grid for the numerical integration
if F_prod is None:
F_prod = np.ones(x.size)
ncomp = effective_z_t.size
if (q_t.size != ncomp):
logger.error('The arrays for q and sigma are of different size!')
        return (float('inf'), float('inf')), F_prod
for ij in range(ncomp):
# Change from original: cache results for speedup on similar requests
sigma = float(effective_z_t[ij])
q = float(q_t[ij])
# this isn't doing the right thing!!
FF1 = get_FF1_add_remove(sigma, q, nx, L)
# Compute the DFT
F_prod = F_prod * FF1
# Initial value \epsilon_0
eps_0 = 0
# first jj for which 1-exp(eps_0-x)>0,
# i.e. start of the integral domain
jj = int(np.floor(float(nx * (L + eps_0) / (2 * L))))
# Compute the inverse DFT
cfx = np.fft.ifft((F_prod / dx))
# Flip again, i.e. cfx <- D(cfx), D = [0 I;I 0]
temp = np.copy(cfx[half:])
cfx[half:] = cfx[:half]
cfx[:half] = temp
# Evaluate \delta(eps_0) and \delta'(eps_0)
exp_e = 1 - np.exp(eps_0 - x)
dexp_e = -np.exp(eps_0 - x)
integrand = exp_e * cfx
integrand2 = dexp_e * cfx
sum_int = np.sum(integrand[jj + 1:])
sum_int2 = np.sum(integrand2[jj + 1:])
delta_temp = sum_int * dx
derivative = sum_int2 * dx
# Here tol is the stopping criterion for Newton's iteration
# e.g., 0.1*delta value or 0.01*delta value (relative error small enough)
while np.abs(delta_temp - target_delta) > tol_newton:
# print('Residual of the Newton iteration: ' + str(np.abs(delta_temp - target_delta)))
# Update epsilon
eps_0 = eps_0 - (delta_temp - target_delta) / derivative
if (eps_0 < -L or eps_0 > L):
break
# Integrands and integral domain
exp_e = 1 - np.exp(eps_0 - x)
dexp_e = -np.exp(eps_0 - x)
# first kk for which 1-exp(eps_0-x)>0,
# i.e. start of the integral domain
kk = int(np.floor(float(nx * (L + np.real(eps_0)) / (2 * L))))
integrand = exp_e * cfx
sum_int = np.sum(integrand[kk + 1:])
delta_temp = sum_int * dx
# Evaluate \delta(eps_0) and \delta'(eps_0)
integrand = exp_e * cfx
integrand2 = dexp_e * cfx
sum_int = np.sum(integrand[kk + 1:])
sum_int2 = np.sum(integrand2[kk + 1:])
delta_temp = sum_int * dx
derivative = sum_int2 * dx
if (np.real(eps_0) < -L or np.real(eps_0) > L):
logger.error('Epsilon out of [-L,L] window, please check the parameters.')
return (float('inf'), float('inf')), F_prod
else:
# print('Unbounded DP-epsilon after ' + str(int(ncomp)) + ' compositions defined by sigma and q arrays: ' + str(np.real(eps_0)) + ' (delta=' + str(target_delta) + ')')
return (np.real(eps_0), target_delta), F_prod
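# Example (parameters are illustrative): the eps-direction counterpart of the
# get_delta_add_remove example above, solving for epsilon at delta = 1e-5:
# (eps, delta), _ = get_eps_add_remove(np.full(1000, 2.0), np.full(1000, 0.01),
#                                      target_delta=1e-5)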
def get_eps_substitution(effective_z_t, q_t, target_delta=1e-6, nx=1E6, L=20.0, F_prod=None):
""" Computes the approximation of the exact privacy as per https://arxiv.org/abs/1906.03049
for a fixed epsilon, for a list of given mechanisms applied. Considers neighbouring sets
as those with the substitution property.
:param effective_z_t: 1D numpy array of the effective noises applied in the mechanisms.
:param q_t: 1D numpy array of the selection probabilities of the data used in the mechanisms.
:param target_delta: The target delta to aim for.
:param nx: The number of discretisation points to use.
:param L: The range truncation parameter
:param F_prod: Specify a previous F_prod for computing online accountancy. If none, assume not
online.
    :return: ((epsilon, delta) privacy bound, accumulated F_prod for online reuse).
"""
nx = int(nx)
half = int(nx / 2)
tol_newton = 1e-10 # set this to, e.g., 0.01*target_delta
dx = 2.0 * L / nx # discretisation interval \Delta x
x = np.linspace(-L, L - dx, nx, dtype=np.complex128) # grid for the numerical integration
# Initial value \epsilon_0
eps_0 = 0
if F_prod is None:
F_prod = np.ones(x.size)
ncomp = effective_z_t.size
if (q_t.size != ncomp):
logger.error('The arrays for q and sigma are of different size!')
        return (float('inf'), float('inf')), F_prod
for ij in range(ncomp):
sigma = effective_z_t[ij]
q = q_t[ij]
FF1 = get_FF1_substitution(sigma, q, nx, L)
F_prod = F_prod * FF1
# first jj for which 1-exp(eps_0-x)>0,
# i.e. start of the integral domain
jj = int(np.floor(float(nx * (L + np.real(eps_0)) / (2 * L))))
# Compute the inverse DFT
cfx = np.fft.ifft((F_prod / dx))
# Flip again, i.e. cfx <- D(cfx), D = [0 I;I 0]
temp = np.copy(cfx[half:])
cfx[half:] = cfx[:half]
cfx[:half] = temp
# Evaluate \delta(eps_0) and \delta'(eps_0)
exp_e = 1 - np.exp(eps_0 - x)
dexp_e = -np.exp(eps_0 - x)
integrand = exp_e * cfx
integrand2 = dexp_e * cfx
sum_int = np.sum(integrand[jj + 1:])
sum_int2 = np.sum(integrand2[jj + 1:])
delta_temp = sum_int * dx
derivative = sum_int2 * dx
# Here tol is the stopping criterion for Newton's iteration
# e.g., 0.1*delta value or 0.01*delta value (relative error small enough)
while np.abs(delta_temp - target_delta) > tol_newton:
# print('Residual of the Newton iteration: ' + str(np.abs(delta_temp - target_delta)))
# Update epsilon
eps_0 = eps_0 - (delta_temp - target_delta) / derivative
if (eps_0 < -L or eps_0 > L):
break
# Integrands and integral domain
exp_e = 1 - np.exp(eps_0 - x)
dexp_e = -np.exp(eps_0 - x)
# first kk for which 1-exp(eps_0-x)>0,
# i.e. start of the integral domain
kk = int(np.floor(float(nx * (L + np.real(eps_0)) / (2 * L))))
integrand = exp_e * cfx
sum_int = np.sum(integrand[kk + 1:])
delta_temp = sum_int * dx
# Evaluate \delta(eps_0) and \delta'(eps_0)
integrand = exp_e * cfx
integrand2 = dexp_e * cfx
sum_int = np.sum(integrand[kk + 1:])
sum_int2 = np.sum(integrand2[kk + 1:])
delta_temp = sum_int * dx
derivative = sum_int2 * dx
if (np.real(eps_0) < -L or np.real(eps_0) > L):
logger.error('Epsilon out of [-L,L] window, please check the parameters.')
return (float('inf'), float('inf')), F_prod
else:
# print('Bounded DP-epsilon after ' + str(int(ncomp)) + ' compositions defined by sigma and q arrays: ' + str(
# np.real(eps_0)) + ' (delta=' + str(target_delta) + ')')
return (np.real(eps_0), target_delta), F_prod
def get_eps_add_remove_fixed_params(target_delta=1e-6, sigma=2.0, q=0.01, ncomp=1E4, nx=1E6, L=20.0):
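    """ Like get_eps_add_remove, but for ncomp identical compositions of a
    single subsampled Gaussian mechanism with fixed sigma and q, under the
    addition/removal adjacency definition.
    :param target_delta: The target delta to aim for.
    :param sigma: The effective noise applied.
    :param q: The sample probability.
    :param ncomp: The number of compositions.
    :param nx: The number of discretisation points to use.
    :param L: The range truncation parameter
    :return: The epsilon value, or float('inf') on failure.
    """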
nx = int(nx)
tol_newton = 1e-10 # set this to, e.g., 0.01*target_delta
dx = 2.0 * L / nx # discretisation interval \Delta x
x = np.linspace(-L, L - dx, nx, dtype=np.complex128) # grid for the numerical integration
# first ii for which x(ii+1)>log(1-q),
# i.e. start of the integral domain
ii = int(np.floor(float(nx * (L + np.log(1 - q)) / (2 * L))))
# Initial value \epsilon_0
eps_0 = 0
# Evaluate the PLD distribution,
# The case of remove/add relation (Subsection 5.1)
    ey = np.exp(x[ii + 1:])
    Linvx = (sigma ** 2) * np.log((ey - (1 - q)) / q) + 0.5
    ALinvx = (1 / np.sqrt(2 * np.pi * sigma ** 2)) * ((1 - q) * np.exp(-Linvx * Linvx / (2 * sigma ** 2)) +
                                                      q * np.exp(-(Linvx - 1) * (Linvx - 1) / (2 * sigma ** 2)))
    dLinvx = (sigma ** 2) * ey / (ey - (1 - q))
    fx = np.zeros(nx)
    fx[ii + 1:] = np.real(ALinvx * dLinvx)
half = int(nx / 2)
# Flip fx, i.e. fx <- D(fx), the matrix D = [0 I;I 0]
temp = np.copy(fx[half:])
fx[half:] = np.copy(fx[:half])
fx[:half] = temp
# Compute the DFT
FF1 = np.fft.fft(fx * dx)
# Find first jj for which 1-exp(eps_0-x)>0,
# i.e. start of the integral domain
jj = int(np.floor(float(nx * (L + eps_0) / (2 * L))))
# Compute the inverse DFT
cfx = np.fft.ifft((FF1 ** ncomp / dx))
# Flip again, i.e. cfx <- D(cfx), D = [0 I;I 0]
temp = np.copy(cfx[half:])
cfx[half:] = cfx[:half]
cfx[:half] = temp
# Evaluate \delta(eps_0) and \delta'(eps_0)
exp_e = 1 - np.exp(eps_0 - x)
dexp_e = -np.exp(eps_0 - x)
integrand = exp_e * cfx
integrand2 = dexp_e * cfx
sum_int = np.sum(integrand[jj + 1:])
sum_int2 = np.sum(integrand2[jj + 1:])
delta_temp = sum_int * dx
derivative = sum_int2 * dx
# Here tol is the stopping criterion for Newton's iteration
# e.g., 0.1*delta value or 0.01*delta value (relative error small enough)
while np.abs(delta_temp - target_delta) > tol_newton:
# print('Residual of the Newton iteration: ' + str(np.abs(delta_temp - target_delta)))
# Update epsilon
eps_0 = eps_0 - (delta_temp - target_delta) / derivative
if (eps_0 < -L or eps_0 > L):
break
# Integrands and integral domain
exp_e = 1 - np.exp(eps_0 - x)
dexp_e = -np.exp(eps_0 - x)
# Find first kk for which 1-exp(eps_0-x)>0,
# i.e. start of the integral domain
kk = int(np.floor(float(nx * (L + np.real(eps_0)) / (2 * L))))
integrand = exp_e * cfx
sum_int = np.sum(integrand[kk + 1:])
delta_temp = sum_int * dx
# Evaluate \delta(eps_0) and \delta'(eps_0)
integrand = exp_e * cfx
integrand2 = dexp_e * cfx
sum_int = np.sum(integrand[kk + 1:])
sum_int2 = np.sum(integrand2[kk + 1:])
delta_temp = sum_int * dx
derivative = sum_int2 * dx
if (np.real(eps_0) < -L or np.real(eps_0) > L):
print('Error: epsilon out of [-L,L] window, please check the parameters.')
return float('inf'), target_delta
else:
print(
'Add-Remove DP-epsilon after ' + str(int(ncomp)) + ' compositions:' + str(np.real(eps_0)) + ' (delta=' + str(
target_delta) + ')')
return np.real(eps_0)
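# Example (parameters are illustrative): epsilon at delta = 1e-5 for 1e4
# compositions of a single subsampled Gaussian mechanism with sigma = 2.0
# and q = 0.01:
# eps = get_eps_add_remove_fixed_params(target_delta=1e-5, sigma=2.0, q=0.01,
#                                       ncomp=1E4)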
def compute_privacy_loss_from_ledger(ledger, target_eps=None, target_delta=None, adjacency_definition='add_remove',
nx=1E6, L=50.0):
""" Compute the privacy loss of the queries entered into a ledger
using the Gaussian Mechanism utilising an approximation to the true privacy bound
(https://arxiv.org/abs/1906.03049) with one of epsilon or delta fixed.
:param ledger: The ledger of queries to compute for
:param target_eps: A target epsilon to aim for.
:param target_delta: A target delta to aim for.
:param adjacency_definition: The definition of adjacent datasets to use. Can be
'add_remove' or 'substitution'
:param nx: The number of discretisation points to use.
:param L: The range truncation parameter
:return: epsilon, delta
"""
effective_z_t = []
q_t = []
for sample in ledger:
# note this specific effective z calculation allows for different scale factors to be applied!
effective_z = sum([
(q.noise_stddev / q.l2_norm_bound) ** -2 for q in sample.queries
]) ** -0.5
effective_z_t.append(effective_z)
q_t.append(sample.selection_probability)
effective_z_t = np.array(effective_z_t)
q_t = np.array(q_t)
    if target_delta is None and target_eps is None:
        raise ValueError(
            "One of target_eps and target_delta must be provided (both were None).")
    if target_eps is not None and target_delta is not None:
        raise ValueError(
            "Exactly one of target_eps and target_delta must be None (both were given).")
    if adjacency_definition == 'add_remove':
if target_eps is not None:
privacy_bound, _ = get_delta_add_remove(effective_z_t, q_t, target_eps=target_eps, nx=nx, L=L)
return privacy_bound
else:
privacy_bound, _ = get_eps_add_remove(effective_z_t, q_t, target_delta=target_delta, nx=nx, L=L)
return privacy_bound
    elif adjacency_definition == 'substitution':
if target_eps is not None:
privacy_bound, _ = get_delta_substitution(effective_z_t, q_t, target_eps=target_eps, nx=nx, L=L)
return privacy_bound
else:
privacy_bound, _ = get_eps_substitution(effective_z_t, q_t, target_delta=target_delta, nx=nx, L=L)
return privacy_bound
raise ValueError('adjacency_definition must be one of "substitution" or "add_remove".')
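# Usage sketch: the ledger record layout below is an assumption that mirrors
# the attribute names accessed above (sample.queries, query.noise_stddev,
# query.l2_norm_bound, sample.selection_probability); it is not the real
# ledger API.
# import collections
# Query = collections.namedtuple('Query', ['noise_stddev', 'l2_norm_bound'])
# Sample = collections.namedtuple('Sample', ['queries', 'selection_probability'])
# ledger = [Sample(queries=[Query(2.0, 1.0)], selection_probability=0.01)] * 1000
# eps, delta = compute_privacy_loss_from_ledger(ledger, target_delta=1e-5)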
def compute_online_privacy_from_ledger(ledger, F_prod,
target_delta=None, target_eps=None,
adjacency_definition='add_remove', nx=1E6, L=50.0):
""" Compute new PLD privacy in an online fashion, to speed up computation.
:param ledger: The ledger of queries to compute for. An incremental ledger,
NOT the whole ledger.
:param target_eps: A target epsilon to aim for.
:param target_delta: A target delta to aim for.
:param adjacency_definition: The definition of adjacent datasets to use. Can be
'add_remove' or 'substitution'
:param nx: The number of discretisation points to use.
:param L: The range truncation parameter
:return:
"""
if ledger == [] and F_prod is None:
return tuple([None, None]), None
effective_z_t = []
q_t = []
for sample in ledger:
# note this specific effective z calculation allows for different scale factors to be applied!
effective_z = sum([
(q.noise_stddev / q.l2_norm_bound) ** -2 for q in sample.queries
]) ** -0.5
effective_z_t.append(effective_z)
q_t.append(sample.selection_probability)
effective_z_t = np.array(effective_z_t)
q_t = np.array(q_t)
if target_delta is None and target_eps is None:
raise ValueError(
"One of the target values must not be None")
if target_eps is not None and target_delta is not None:
raise ValueError(
"Exactly one out of eps and delta must be None. (None is).")
    if adjacency_definition == 'add_remove':
if target_eps is not None:
privacy_bound, F_prod = get_delta_add_remove(effective_z_t, q_t, target_eps=target_eps, nx=nx, L=L,
F_prod=F_prod)
return privacy_bound, F_prod
else:
privacy_bound, F_prod = get_eps_add_remove(effective_z_t, q_t, target_delta=target_delta, nx=nx, L=L,
F_prod=F_prod)
return privacy_bound, F_prod
    elif adjacency_definition == 'substitution':
if target_eps is not None:
privacy_bound, F_prod = get_delta_substitution(effective_z_t, q_t, target_eps=target_eps, nx=nx, L=L,
F_prod=F_prod)
return privacy_bound, F_prod
else:
privacy_bound, F_prod = get_eps_substitution(effective_z_t, q_t, target_delta=target_delta, nx=nx, L=L,
F_prod=F_prod)
return privacy_bound, F_prod
raise ValueError('adjacency_definition must be one of "substitution" or "add_remove".')
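# Incremental usage sketch (same hypothetical ledger structure as above):
# feed only the new ledger entries each round and carry F_prod forward.
#
#     F_prod = None
#     for increment in ledger_increments:
#         privacy_bound, F_prod = compute_online_privacy_from_ledger(
#             increment, F_prod, target_delta=1e-5)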
| 37.008392 | 175 | 0.613318 | 4,076 | 26,461 | 3.848871 | 0.079244 | 0.017338 | 0.022438 | 0.010709 | 0.935173 | 0.931094 | 0.926186 | 0.914903 | 0.912991 | 0.906489 | 0 | 0.026747 | 0.268093 | 26,461 | 714 | 176 | 37.060224 | 0.783291 | 0.360266 | 0 | 0.835135 | 0 | 0 | 0.063759 | 0.003589 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024324 | false | 0 | 0.021622 | 0 | 0.113514 | 0.005405 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d1d5edb9990e6c328757242f8d2982681fce39fd | 42,450 | py | Python | sdk/python/pulumi_sumologic/cse_aggregation_rule.py | pulumi/pulumi-sumologic | 962fa056ee4b96e61a200e7bf2308bfad723c3af | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2021-10-13T03:50:41.000Z | 2021-10-13T03:50:41.000Z | sdk/python/pulumi_sumologic/cse_aggregation_rule.py | pulumi/pulumi-sumologic | 962fa056ee4b96e61a200e7bf2308bfad723c3af | [
"ECL-2.0",
"Apache-2.0"
] | 28 | 2021-05-21T11:00:45.000Z | 2022-03-31T15:47:13.000Z | sdk/python/pulumi_sumologic/cse_aggregation_rule.py | pulumi/pulumi-sumologic | 962fa056ee4b96e61a200e7bf2308bfad723c3af | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['CseAggregationRuleArgs', 'CseAggregationRule']
@pulumi.input_type
class CseAggregationRuleArgs:
def __init__(__self__, *,
aggregation_functions: pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleAggregationFunctionArgs']]],
description_expression: pulumi.Input[str],
enabled: pulumi.Input[bool],
entity_selectors: pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleEntitySelectorArgs']]],
match_expression: pulumi.Input[str],
name_expression: pulumi.Input[str],
severity_mapping: pulumi.Input['CseAggregationRuleSeverityMappingArgs'],
trigger_expression: pulumi.Input[str],
window_size: pulumi.Input[str],
group_by_entity: Optional[pulumi.Input[bool]] = None,
group_by_fields: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
is_prototype: Optional[pulumi.Input[bool]] = None,
name: Optional[pulumi.Input[str]] = None,
summary_expression: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a CseAggregationRule resource.
:param pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleAggregationFunctionArgs']]] aggregation_functions: One or more named aggregation functions
:param pulumi.Input[str] description_expression: The description of the generated Signals
:param pulumi.Input[bool] enabled: Whether the rule should generate Signals
:param pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleEntitySelectorArgs']]] entity_selectors: The entities to generate Signals on
:param pulumi.Input[str] match_expression: The expression for which records to match on
:param pulumi.Input[str] name_expression: The name of the generated Signals
:param pulumi.Input['CseAggregationRuleSeverityMappingArgs'] severity_mapping: The configuration of how the severity of the Signals should be mapped from the Records
:param pulumi.Input[str] trigger_expression: The expression to determine whether a Signal should be created based on the aggregation results
:param pulumi.Input[str] window_size: How long of a window to aggregate records for. Current acceptable values are T05M, T10M, T30M, T60M, T24H, T12H, or T05D.
:param pulumi.Input[bool] group_by_entity: Whether to group records by the specified entity fields
:param pulumi.Input[Sequence[pulumi.Input[str]]] group_by_fields: A list of fields to group records by
:param pulumi.Input[bool] is_prototype: Whether the generated Signals should be prototype Signals
:param pulumi.Input[str] name: The name of the Rule
:param pulumi.Input[str] summary_expression: The summary of the generated Signals
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: The tags of the generated Signals
"""
pulumi.set(__self__, "aggregation_functions", aggregation_functions)
pulumi.set(__self__, "description_expression", description_expression)
pulumi.set(__self__, "enabled", enabled)
pulumi.set(__self__, "entity_selectors", entity_selectors)
pulumi.set(__self__, "match_expression", match_expression)
pulumi.set(__self__, "name_expression", name_expression)
pulumi.set(__self__, "severity_mapping", severity_mapping)
pulumi.set(__self__, "trigger_expression", trigger_expression)
pulumi.set(__self__, "window_size", window_size)
if group_by_entity is not None:
pulumi.set(__self__, "group_by_entity", group_by_entity)
if group_by_fields is not None:
pulumi.set(__self__, "group_by_fields", group_by_fields)
if is_prototype is not None:
pulumi.set(__self__, "is_prototype", is_prototype)
if name is not None:
pulumi.set(__self__, "name", name)
if summary_expression is not None:
pulumi.set(__self__, "summary_expression", summary_expression)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="aggregationFunctions")
def aggregation_functions(self) -> pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleAggregationFunctionArgs']]]:
"""
One or more named aggregation functions
"""
return pulumi.get(self, "aggregation_functions")
@aggregation_functions.setter
def aggregation_functions(self, value: pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleAggregationFunctionArgs']]]):
pulumi.set(self, "aggregation_functions", value)
@property
@pulumi.getter(name="descriptionExpression")
def description_expression(self) -> pulumi.Input[str]:
"""
The description of the generated Signals
"""
return pulumi.get(self, "description_expression")
@description_expression.setter
def description_expression(self, value: pulumi.Input[str]):
pulumi.set(self, "description_expression", value)
@property
@pulumi.getter
def enabled(self) -> pulumi.Input[bool]:
"""
Whether the rule should generate Signals
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: pulumi.Input[bool]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter(name="entitySelectors")
def entity_selectors(self) -> pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleEntitySelectorArgs']]]:
"""
The entities to generate Signals on
"""
return pulumi.get(self, "entity_selectors")
@entity_selectors.setter
def entity_selectors(self, value: pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleEntitySelectorArgs']]]):
pulumi.set(self, "entity_selectors", value)
@property
@pulumi.getter(name="matchExpression")
def match_expression(self) -> pulumi.Input[str]:
"""
The expression for which records to match on
"""
return pulumi.get(self, "match_expression")
@match_expression.setter
def match_expression(self, value: pulumi.Input[str]):
pulumi.set(self, "match_expression", value)
@property
@pulumi.getter(name="nameExpression")
def name_expression(self) -> pulumi.Input[str]:
"""
The name of the generated Signals
"""
return pulumi.get(self, "name_expression")
@name_expression.setter
def name_expression(self, value: pulumi.Input[str]):
pulumi.set(self, "name_expression", value)
@property
@pulumi.getter(name="severityMapping")
def severity_mapping(self) -> pulumi.Input['CseAggregationRuleSeverityMappingArgs']:
"""
The configuration of how the severity of the Signals should be mapped from the Records
"""
return pulumi.get(self, "severity_mapping")
@severity_mapping.setter
def severity_mapping(self, value: pulumi.Input['CseAggregationRuleSeverityMappingArgs']):
pulumi.set(self, "severity_mapping", value)
@property
@pulumi.getter(name="triggerExpression")
def trigger_expression(self) -> pulumi.Input[str]:
"""
The expression to determine whether a Signal should be created based on the aggregation results
"""
return pulumi.get(self, "trigger_expression")
@trigger_expression.setter
def trigger_expression(self, value: pulumi.Input[str]):
pulumi.set(self, "trigger_expression", value)
@property
@pulumi.getter(name="windowSize")
def window_size(self) -> pulumi.Input[str]:
"""
How long of a window to aggregate records for. Current acceptable values are T05M, T10M, T30M, T60M, T24H, T12H, or T05D.
"""
return pulumi.get(self, "window_size")
@window_size.setter
def window_size(self, value: pulumi.Input[str]):
pulumi.set(self, "window_size", value)
@property
@pulumi.getter(name="groupByEntity")
def group_by_entity(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to group records by the specified entity fields
"""
return pulumi.get(self, "group_by_entity")
@group_by_entity.setter
def group_by_entity(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "group_by_entity", value)
@property
@pulumi.getter(name="groupByFields")
def group_by_fields(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of fields to group records by
"""
return pulumi.get(self, "group_by_fields")
@group_by_fields.setter
def group_by_fields(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "group_by_fields", value)
@property
@pulumi.getter(name="isPrototype")
def is_prototype(self) -> Optional[pulumi.Input[bool]]:
"""
Whether the generated Signals should be prototype Signals
"""
return pulumi.get(self, "is_prototype")
@is_prototype.setter
def is_prototype(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_prototype", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the Rule
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="summaryExpression")
def summary_expression(self) -> Optional[pulumi.Input[str]]:
"""
The summary of the generated Signals
"""
return pulumi.get(self, "summary_expression")
@summary_expression.setter
def summary_expression(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "summary_expression", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The tags of the generated Signals
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _CseAggregationRuleState:
def __init__(__self__, *,
aggregation_functions: Optional[pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleAggregationFunctionArgs']]]] = None,
description_expression: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
entity_selectors: Optional[pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleEntitySelectorArgs']]]] = None,
group_by_entity: Optional[pulumi.Input[bool]] = None,
group_by_fields: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
is_prototype: Optional[pulumi.Input[bool]] = None,
match_expression: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
name_expression: Optional[pulumi.Input[str]] = None,
severity_mapping: Optional[pulumi.Input['CseAggregationRuleSeverityMappingArgs']] = None,
summary_expression: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
trigger_expression: Optional[pulumi.Input[str]] = None,
window_size: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering CseAggregationRule resources.
:param pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleAggregationFunctionArgs']]] aggregation_functions: One or more named aggregation functions
:param pulumi.Input[str] description_expression: The description of the generated Signals
:param pulumi.Input[bool] enabled: Whether the rule should generate Signals
:param pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleEntitySelectorArgs']]] entity_selectors: The entities to generate Signals on
:param pulumi.Input[bool] group_by_entity: Whether to group records by the specified entity fields
:param pulumi.Input[Sequence[pulumi.Input[str]]] group_by_fields: A list of fields to group records by
:param pulumi.Input[bool] is_prototype: Whether the generated Signals should be prototype Signals
:param pulumi.Input[str] match_expression: The expression for which records to match on
:param pulumi.Input[str] name: The name of the Rule
:param pulumi.Input[str] name_expression: The name of the generated Signals
:param pulumi.Input['CseAggregationRuleSeverityMappingArgs'] severity_mapping: The configuration of how the severity of the Signals should be mapped from the Records
:param pulumi.Input[str] summary_expression: The summary of the generated Signals
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: The tags of the generated Signals
:param pulumi.Input[str] trigger_expression: The expression to determine whether a Signal should be created based on the aggregation results
:param pulumi.Input[str] window_size: How long of a window to aggregate records for. Current acceptable values are T05M, T10M, T30M, T60M, T24H, T12H, or T05D.
"""
if aggregation_functions is not None:
pulumi.set(__self__, "aggregation_functions", aggregation_functions)
if description_expression is not None:
pulumi.set(__self__, "description_expression", description_expression)
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if entity_selectors is not None:
pulumi.set(__self__, "entity_selectors", entity_selectors)
if group_by_entity is not None:
pulumi.set(__self__, "group_by_entity", group_by_entity)
if group_by_fields is not None:
pulumi.set(__self__, "group_by_fields", group_by_fields)
if is_prototype is not None:
pulumi.set(__self__, "is_prototype", is_prototype)
if match_expression is not None:
pulumi.set(__self__, "match_expression", match_expression)
if name is not None:
pulumi.set(__self__, "name", name)
if name_expression is not None:
pulumi.set(__self__, "name_expression", name_expression)
if severity_mapping is not None:
pulumi.set(__self__, "severity_mapping", severity_mapping)
if summary_expression is not None:
pulumi.set(__self__, "summary_expression", summary_expression)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if trigger_expression is not None:
pulumi.set(__self__, "trigger_expression", trigger_expression)
if window_size is not None:
pulumi.set(__self__, "window_size", window_size)
@property
@pulumi.getter(name="aggregationFunctions")
def aggregation_functions(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleAggregationFunctionArgs']]]]:
"""
One or more named aggregation functions
"""
return pulumi.get(self, "aggregation_functions")
@aggregation_functions.setter
def aggregation_functions(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleAggregationFunctionArgs']]]]):
pulumi.set(self, "aggregation_functions", value)
@property
@pulumi.getter(name="descriptionExpression")
def description_expression(self) -> Optional[pulumi.Input[str]]:
"""
The description of the generated Signals
"""
return pulumi.get(self, "description_expression")
@description_expression.setter
def description_expression(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description_expression", value)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether the rule should generate Signals
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter(name="entitySelectors")
def entity_selectors(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleEntitySelectorArgs']]]]:
"""
The entities to generate Signals on
"""
return pulumi.get(self, "entity_selectors")
@entity_selectors.setter
def entity_selectors(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['CseAggregationRuleEntitySelectorArgs']]]]):
pulumi.set(self, "entity_selectors", value)
@property
@pulumi.getter(name="groupByEntity")
def group_by_entity(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to group records by the specified entity fields
"""
return pulumi.get(self, "group_by_entity")
@group_by_entity.setter
def group_by_entity(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "group_by_entity", value)
@property
@pulumi.getter(name="groupByFields")
def group_by_fields(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of fields to group records by
"""
return pulumi.get(self, "group_by_fields")
@group_by_fields.setter
def group_by_fields(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "group_by_fields", value)
@property
@pulumi.getter(name="isPrototype")
def is_prototype(self) -> Optional[pulumi.Input[bool]]:
"""
Whether the generated Signals should be prototype Signals
"""
return pulumi.get(self, "is_prototype")
@is_prototype.setter
def is_prototype(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_prototype", value)
@property
@pulumi.getter(name="matchExpression")
def match_expression(self) -> Optional[pulumi.Input[str]]:
"""
The expression for which records to match on
"""
return pulumi.get(self, "match_expression")
@match_expression.setter
def match_expression(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "match_expression", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the Rule
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="nameExpression")
def name_expression(self) -> Optional[pulumi.Input[str]]:
"""
The name of the generated Signals
"""
return pulumi.get(self, "name_expression")
@name_expression.setter
def name_expression(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name_expression", value)
@property
@pulumi.getter(name="severityMapping")
def severity_mapping(self) -> Optional[pulumi.Input['CseAggregationRuleSeverityMappingArgs']]:
"""
The configuration of how the severity of the Signals should be mapped from the Records
"""
return pulumi.get(self, "severity_mapping")
@severity_mapping.setter
def severity_mapping(self, value: Optional[pulumi.Input['CseAggregationRuleSeverityMappingArgs']]):
pulumi.set(self, "severity_mapping", value)
@property
@pulumi.getter(name="summaryExpression")
def summary_expression(self) -> Optional[pulumi.Input[str]]:
"""
The summary of the generated Signals
"""
return pulumi.get(self, "summary_expression")
@summary_expression.setter
def summary_expression(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "summary_expression", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The tags of the generated Signals
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="triggerExpression")
def trigger_expression(self) -> Optional[pulumi.Input[str]]:
"""
The expression to determine whether a Signal should be created based on the aggregation results
"""
return pulumi.get(self, "trigger_expression")
@trigger_expression.setter
def trigger_expression(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "trigger_expression", value)
@property
@pulumi.getter(name="windowSize")
def window_size(self) -> Optional[pulumi.Input[str]]:
"""
How long of a window to aggregate records for. Current acceptable values are T05M, T10M, T30M, T60M, T24H, T12H, or T05D.
"""
return pulumi.get(self, "window_size")
@window_size.setter
def window_size(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "window_size", value)
class CseAggregationRule(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
aggregation_functions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleAggregationFunctionArgs']]]]] = None,
description_expression: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
entity_selectors: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleEntitySelectorArgs']]]]] = None,
group_by_entity: Optional[pulumi.Input[bool]] = None,
group_by_fields: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
is_prototype: Optional[pulumi.Input[bool]] = None,
match_expression: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
name_expression: Optional[pulumi.Input[str]] = None,
severity_mapping: Optional[pulumi.Input[pulumi.InputType['CseAggregationRuleSeverityMappingArgs']]] = None,
summary_expression: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
trigger_expression: Optional[pulumi.Input[str]] = None,
window_size: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Provides a Sumo Logic CSE [Aggregation Rule](https://help.sumologic.com/Cloud_SIEM_Enterprise/CSE_Rules/09_Write_an_Aggregation_Rule).
## Example Usage
```python
import pulumi
import pulumi_sumologic as sumologic
aggregation_rule = sumologic.CseAggregationRule("aggregationRule",
aggregation_functions=[sumologic.CseAggregationRuleAggregationFunctionArgs(
arguments=["metadata_deviceEventId"],
function="count_distinct",
name="distinct_eventid_count",
)],
description_expression="Signal description",
enabled=True,
entity_selectors=[sumologic.CseAggregationRuleEntitySelectorArgs(
entity_type="_ip",
expression="srcDevice_ip",
)],
group_by_entity=True,
group_by_fields=["dstDevice_hostname"],
is_prototype=False,
match_expression="objectType = \"Network\"",
name_expression="Signal name",
severity_mapping=sumologic.CseAggregationRuleSeverityMappingArgs(
default=5,
type="constant",
),
summary_expression="Signal summary",
tags=["_mitreAttackTactic:TA0009"],
trigger_expression="distinct_eventid_count > 5",
window_size="T30M")
```
## Import
        Aggregation Rules can be imported using the field id, e.g.
```sh
$ pulumi import sumologic:index/cseAggregationRule:CseAggregationRule aggregation_rule id
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleAggregationFunctionArgs']]]] aggregation_functions: One or more named aggregation functions
:param pulumi.Input[str] description_expression: The description of the generated Signals
:param pulumi.Input[bool] enabled: Whether the rule should generate Signals
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleEntitySelectorArgs']]]] entity_selectors: The entities to generate Signals on
:param pulumi.Input[bool] group_by_entity: Whether to group records by the specified entity fields
:param pulumi.Input[Sequence[pulumi.Input[str]]] group_by_fields: A list of fields to group records by
:param pulumi.Input[bool] is_prototype: Whether the generated Signals should be prototype Signals
:param pulumi.Input[str] match_expression: The expression for which records to match on
:param pulumi.Input[str] name: The name of the Rule
:param pulumi.Input[str] name_expression: The name of the generated Signals
:param pulumi.Input[pulumi.InputType['CseAggregationRuleSeverityMappingArgs']] severity_mapping: The configuration of how the severity of the Signals should be mapped from the Records
:param pulumi.Input[str] summary_expression: The summary of the generated Signals
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: The tags of the generated Signals
:param pulumi.Input[str] trigger_expression: The expression to determine whether a Signal should be created based on the aggregation results
:param pulumi.Input[str] window_size: How long of a window to aggregate records for. Current acceptable values are T05M, T10M, T30M, T60M, T24H, T12H, or T05D.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: CseAggregationRuleArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides a Sumo Logic CSE [Aggregation Rule](https://help.sumologic.com/Cloud_SIEM_Enterprise/CSE_Rules/09_Write_an_Aggregation_Rule).
## Example Usage
```python
import pulumi
import pulumi_sumologic as sumologic
aggregation_rule = sumologic.CseAggregationRule("aggregationRule",
aggregation_functions=[sumologic.CseAggregationRuleAggregationFunctionArgs(
arguments=["metadata_deviceEventId"],
function="count_distinct",
name="distinct_eventid_count",
)],
description_expression="Signal description",
enabled=True,
entity_selectors=[sumologic.CseAggregationRuleEntitySelectorArgs(
entity_type="_ip",
expression="srcDevice_ip",
)],
group_by_entity=True,
group_by_fields=["dstDevice_hostname"],
is_prototype=False,
match_expression="objectType = \"Network\"",
name_expression="Signal name",
severity_mapping=sumologic.CseAggregationRuleSeverityMappingArgs(
default=5,
type="constant",
),
summary_expression="Signal summary",
tags=["_mitreAttackTactic:TA0009"],
trigger_expression="distinct_eventid_count > 5",
window_size="T30M")
```
## Import
        Aggregation Rules can be imported using the field id, e.g.
```sh
$ pulumi import sumologic:index/cseAggregationRule:CseAggregationRule aggregation_rule id
```
:param str resource_name: The name of the resource.
:param CseAggregationRuleArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(CseAggregationRuleArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
aggregation_functions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleAggregationFunctionArgs']]]]] = None,
description_expression: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
entity_selectors: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleEntitySelectorArgs']]]]] = None,
group_by_entity: Optional[pulumi.Input[bool]] = None,
group_by_fields: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
is_prototype: Optional[pulumi.Input[bool]] = None,
match_expression: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
name_expression: Optional[pulumi.Input[str]] = None,
severity_mapping: Optional[pulumi.Input[pulumi.InputType['CseAggregationRuleSeverityMappingArgs']]] = None,
summary_expression: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
trigger_expression: Optional[pulumi.Input[str]] = None,
window_size: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = CseAggregationRuleArgs.__new__(CseAggregationRuleArgs)
if aggregation_functions is None and not opts.urn:
raise TypeError("Missing required property 'aggregation_functions'")
__props__.__dict__["aggregation_functions"] = aggregation_functions
if description_expression is None and not opts.urn:
raise TypeError("Missing required property 'description_expression'")
__props__.__dict__["description_expression"] = description_expression
if enabled is None and not opts.urn:
raise TypeError("Missing required property 'enabled'")
__props__.__dict__["enabled"] = enabled
if entity_selectors is None and not opts.urn:
raise TypeError("Missing required property 'entity_selectors'")
__props__.__dict__["entity_selectors"] = entity_selectors
__props__.__dict__["group_by_entity"] = group_by_entity
__props__.__dict__["group_by_fields"] = group_by_fields
__props__.__dict__["is_prototype"] = is_prototype
if match_expression is None and not opts.urn:
raise TypeError("Missing required property 'match_expression'")
__props__.__dict__["match_expression"] = match_expression
__props__.__dict__["name"] = name
if name_expression is None and not opts.urn:
raise TypeError("Missing required property 'name_expression'")
__props__.__dict__["name_expression"] = name_expression
if severity_mapping is None and not opts.urn:
raise TypeError("Missing required property 'severity_mapping'")
__props__.__dict__["severity_mapping"] = severity_mapping
__props__.__dict__["summary_expression"] = summary_expression
__props__.__dict__["tags"] = tags
if trigger_expression is None and not opts.urn:
raise TypeError("Missing required property 'trigger_expression'")
__props__.__dict__["trigger_expression"] = trigger_expression
if window_size is None and not opts.urn:
raise TypeError("Missing required property 'window_size'")
__props__.__dict__["window_size"] = window_size
super(CseAggregationRule, __self__).__init__(
'sumologic:index/cseAggregationRule:CseAggregationRule',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
aggregation_functions: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleAggregationFunctionArgs']]]]] = None,
description_expression: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
entity_selectors: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleEntitySelectorArgs']]]]] = None,
group_by_entity: Optional[pulumi.Input[bool]] = None,
group_by_fields: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
is_prototype: Optional[pulumi.Input[bool]] = None,
match_expression: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
name_expression: Optional[pulumi.Input[str]] = None,
severity_mapping: Optional[pulumi.Input[pulumi.InputType['CseAggregationRuleSeverityMappingArgs']]] = None,
summary_expression: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
trigger_expression: Optional[pulumi.Input[str]] = None,
window_size: Optional[pulumi.Input[str]] = None) -> 'CseAggregationRule':
"""
Get an existing CseAggregationRule resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleAggregationFunctionArgs']]]] aggregation_functions: One or more named aggregation functions
:param pulumi.Input[str] description_expression: The description of the generated Signals
:param pulumi.Input[bool] enabled: Whether the rule should generate Signals
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['CseAggregationRuleEntitySelectorArgs']]]] entity_selectors: The entities to generate Signals on
:param pulumi.Input[bool] group_by_entity: Whether to group records by the specified entity fields
:param pulumi.Input[Sequence[pulumi.Input[str]]] group_by_fields: A list of fields to group records by
:param pulumi.Input[bool] is_prototype: Whether the generated Signals should be prototype Signals
:param pulumi.Input[str] match_expression: The expression for which records to match on
:param pulumi.Input[str] name: The name of the Rule
:param pulumi.Input[str] name_expression: The name of the generated Signals
:param pulumi.Input[pulumi.InputType['CseAggregationRuleSeverityMappingArgs']] severity_mapping: The configuration of how the severity of the Signals should be mapped from the Records
:param pulumi.Input[str] summary_expression: The summary of the generated Signals
:param pulumi.Input[Sequence[pulumi.Input[str]]] tags: The tags of the generated Signals
:param pulumi.Input[str] trigger_expression: The expression to determine whether a Signal should be created based on the aggregation results
:param pulumi.Input[str] window_size: How long of a window to aggregate records for. Current acceptable values are T05M, T10M, T30M, T60M, T24H, T12H, or T05D.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _CseAggregationRuleState.__new__(_CseAggregationRuleState)
__props__.__dict__["aggregation_functions"] = aggregation_functions
__props__.__dict__["description_expression"] = description_expression
__props__.__dict__["enabled"] = enabled
__props__.__dict__["entity_selectors"] = entity_selectors
__props__.__dict__["group_by_entity"] = group_by_entity
__props__.__dict__["group_by_fields"] = group_by_fields
__props__.__dict__["is_prototype"] = is_prototype
__props__.__dict__["match_expression"] = match_expression
__props__.__dict__["name"] = name
__props__.__dict__["name_expression"] = name_expression
__props__.__dict__["severity_mapping"] = severity_mapping
__props__.__dict__["summary_expression"] = summary_expression
__props__.__dict__["tags"] = tags
__props__.__dict__["trigger_expression"] = trigger_expression
__props__.__dict__["window_size"] = window_size
return CseAggregationRule(resource_name, opts=opts, __props__=__props__)
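    # Usage sketch (hypothetical resource name and provider ID):
    #   existing = CseAggregationRule.get("imported-rule", id="1234567890abcdef")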
@property
@pulumi.getter(name="aggregationFunctions")
def aggregation_functions(self) -> pulumi.Output[Sequence['outputs.CseAggregationRuleAggregationFunction']]:
"""
One or more named aggregation functions
"""
return pulumi.get(self, "aggregation_functions")
@property
@pulumi.getter(name="descriptionExpression")
def description_expression(self) -> pulumi.Output[str]:
"""
The description of the generated Signals
"""
return pulumi.get(self, "description_expression")
@property
@pulumi.getter
def enabled(self) -> pulumi.Output[bool]:
"""
Whether the rule should generate Signals
"""
return pulumi.get(self, "enabled")
@property
@pulumi.getter(name="entitySelectors")
def entity_selectors(self) -> pulumi.Output[Sequence['outputs.CseAggregationRuleEntitySelector']]:
"""
The entities to generate Signals on
"""
return pulumi.get(self, "entity_selectors")
@property
@pulumi.getter(name="groupByEntity")
def group_by_entity(self) -> pulumi.Output[Optional[bool]]:
"""
Whether to group records by the specified entity fields
"""
return pulumi.get(self, "group_by_entity")
@property
@pulumi.getter(name="groupByFields")
def group_by_fields(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
A list of fields to group records by
"""
return pulumi.get(self, "group_by_fields")
@property
@pulumi.getter(name="isPrototype")
def is_prototype(self) -> pulumi.Output[Optional[bool]]:
"""
Whether the generated Signals should be prototype Signals
"""
return pulumi.get(self, "is_prototype")
@property
@pulumi.getter(name="matchExpression")
def match_expression(self) -> pulumi.Output[str]:
"""
The expression for which records to match on
"""
return pulumi.get(self, "match_expression")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The name of the Rule
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="nameExpression")
def name_expression(self) -> pulumi.Output[str]:
"""
The name of the generated Signals
"""
return pulumi.get(self, "name_expression")
@property
@pulumi.getter(name="severityMapping")
def severity_mapping(self) -> pulumi.Output['outputs.CseAggregationRuleSeverityMapping']:
"""
The configuration of how the severity of the Signals should be mapped from the Records
"""
return pulumi.get(self, "severity_mapping")
@property
@pulumi.getter(name="summaryExpression")
def summary_expression(self) -> pulumi.Output[Optional[str]]:
"""
The summary of the generated Signals
"""
return pulumi.get(self, "summary_expression")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
The tags of the generated Signals
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter(name="triggerExpression")
def trigger_expression(self) -> pulumi.Output[str]:
"""
The expression to determine whether a Signal should be created based on the aggregation results
"""
return pulumi.get(self, "trigger_expression")
@property
@pulumi.getter(name="windowSize")
def window_size(self) -> pulumi.Output[str]:
"""
How long of a window to aggregate records for. Current acceptable values are T05M, T10M, T30M, T60M, T24H, T12H, or T05D.
"""
return pulumi.get(self, "window_size")
| 47.219132 | 191 | 0.671143 | 4,620 | 42,450 | 5.94632 | 0.052597 | 0.100502 | 0.060644 | 0.047321 | 0.916715 | 0.905613 | 0.879659 | 0.845989 | 0.828116 | 0.817487 | 0 | 0.003639 | 0.229706 | 42,450 | 898 | 192 | 47.271715 | 0.836509 | 0.296113 | 0 | 0.733598 | 1 | 0 | 0.153821 | 0.062441 | 0 | 0 | 0 | 0 | 0 | 1 | 0.163022 | false | 0.001988 | 0.013917 | 0 | 0.274354 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ae71dfee9135d63d4a78086ff844e1d33bcdafdd | 17,181 | py | Python | tests/test_api_network.py | skylli/devicehive-python | c61da550443b2d59938076b8394ac3e1eadc5fba | [
"Apache-2.0"
] | 28 | 2015-07-26T10:39:33.000Z | 2021-04-16T11:33:50.000Z | tests/test_api_network.py | skylli/devicehive-python | c61da550443b2d59938076b8394ac3e1eadc5fba | [
"Apache-2.0"
] | 24 | 2015-05-27T08:55:32.000Z | 2020-12-23T06:03:07.000Z | tests/test_api_network.py | skylli/devicehive-python | c61da550443b2d59938076b8394ac3e1eadc5fba | [
"Apache-2.0"
] | 21 | 2015-09-29T20:17:40.000Z | 2021-02-12T05:02:27.000Z | # Copyright (C) 2018 DataArt
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# =============================================================================
from devicehive import NetworkError, ApiResponseError, SubscriptionError
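# Each test below builds `handle_*` callbacks and hands them to `test.run`,
# which (in this harness) presumably opens a DeviceHive connection, invokes
# `handle_connect`, and dispatches incoming command/notification events to
# the matching handlers until `handler.disconnect()` is called.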
def test_save(test):
test.only_admin_implementation()
device_hive_api = test.device_hive_api()
name = test.generate_id('n-s', test.NETWORK_ENTITY)
description = '%s-description' % name
network = device_hive_api.create_network(name, description)
name = test.generate_id('n-s', test.NETWORK_ENTITY)
description = '%s-description' % name
network.name = name
network.description = description
network.save()
network_1 = device_hive_api.get_network(network.id)
network.remove()
try:
network.save()
assert False
except NetworkError:
pass
try:
network_1.save()
assert False
except ApiResponseError as api_response_error:
assert api_response_error.code == 404
def test_remove(test):
test.only_admin_implementation()
device_hive_api = test.device_hive_api()
name = test.generate_id('n-r', test.NETWORK_ENTITY)
description = '%s-description' % name
network = device_hive_api.create_network(name, description)
network_1 = device_hive_api.get_network(network.id)
network.remove()
assert not network.id
assert not network.name
assert not network.description
try:
network.remove()
assert False
except NetworkError:
pass
try:
network_1.remove()
assert False
except ApiResponseError as api_response_error:
assert api_response_error.code == 404
# ==========================================================================
name = test.generate_id('n-r', test.NETWORK_ENTITY)
description = '%s-description' % name
network = device_hive_api.create_network(name, description)
device_id = test.generate_id('n-r', test.DEVICE_ENTITY)
device_hive_api.put_device(device_id, network_id=network.id)
try:
network.remove()
assert False
except ApiResponseError as api_response_error:
assert api_response_error.code == 400
device = device_hive_api.get_device(device_id)
assert device.id == device_id
network.remove(force=True)
try:
device_hive_api.get_device(device_id)
assert False
except ApiResponseError as api_response_error:
assert api_response_error.code == 404
def test_subscribe_insert_commands(test):
test.only_admin_implementation()
def init_data(handler):
device_id = test.generate_id('n-s-i-c', test.DEVICE_ENTITY)
network_name = test.generate_id('n-s-i-c', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
command_names = ['%s-name-%s' % (device_id, i) for i in range(2)]
device = handler.api.put_device(device_id, network_id=network.id)
return device, network, command_names, []
def send_data(handler, device, command_names):
for command_name in command_names:
command = device.send_command(command_name)
handler.data['command_ids'].append(command.id)
def set_handler_data(handler, device, network, command_names, command_ids):
handler.data['device'] = device
handler.data['network'] = network
handler.data['command_names'] = command_names
handler.data['command_ids'] = command_ids
def handle_connect(handler):
device, network, command_names, command_ids = init_data(handler)
set_handler_data(handler, device, network, command_names, command_ids)
send_data(handler, device, command_names)
handler.data['subscription'] = network.subscribe_insert_commands()
def handle_command_insert(handler, command):
assert command.id in handler.data['command_ids']
handler.data['command_ids'].remove(command.id)
if handler.data['command_ids']:
return
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_command_insert)
def handle_connect(handler):
device, network, command_names, command_ids = init_data(handler)
command_name = command_names[:1]
set_handler_data(handler, device, network, command_names, command_ids)
send_data(handler, device, command_name)
handler.data['subscription'] = network.subscribe_insert_commands(
names=command_name)
def handle_command_insert(handler, command):
assert command.id == handler.data['command_ids'][0]
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_command_insert)
def handle_connect(handler):
network_name = test.generate_id('n-s-i-c', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
device_id = test.generate_id('n-s-i-c', test.DEVICE_ENTITY)
device = handler.api.put_device(device_id, network_id=network.id)
command_name = '%s-name-1' % device_id
command = device.send_command(command_name)
set_handler_data(handler, device, network, [command_name], [command.id])
handler.data['subscription'] = network.subscribe_insert_commands()
def handle_command_insert(handler, command):
assert command.id == handler.data['command_ids'][0]
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_command_insert)
def handle_connect(handler):
network_name = test.generate_id('n-s-i-c', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
network_1 = handler.api.get_network(network.id)
network.remove()
try:
network_1.subscribe_insert_commands()
assert False
except ApiResponseError as api_response_error:
assert api_response_error.code == 404
test.run(handle_connect)
def test_unsubscribe_insert_commands(test):
test.only_admin_implementation()
def handle_connect(handler):
name = test.generate_id('n-u-i-c', test.NETWORK_ENTITY)
description = '%s-description' % name
network = handler.api.create_network(name, description)
subscription = network.subscribe_insert_commands()
subscription.remove()
try:
subscription.remove()
assert False
except SubscriptionError:
pass
network.remove()
test.run(handle_connect)
def test_subscribe_update_commands(test):
test.only_admin_implementation()
def init_data(handler):
device_id = test.generate_id('n-s-u-c', test.DEVICE_ENTITY)
network_name = test.generate_id('n-s-u-c', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
command_names = ['%s-name-%s' % (device_id, i) for i in range(2)]
device = handler.api.put_device(device_id, network_id=network.id)
return device, network, command_names, []
def send_data(handler, device, command_names):
for command_name in command_names:
command = device.send_command(command_name)
handler.data['command_ids'].append(command.id)
command.status = 'status'
command.save()
def set_handler_data(handler, device, network, command_names, command_ids):
handler.data['device'] = device
handler.data['network'] = network
handler.data['command_names'] = command_names
handler.data['command_ids'] = command_ids
def handle_connect(handler):
device, network, command_names, command_ids = init_data(handler)
set_handler_data(handler, device, network, command_names, command_ids)
send_data(handler, device, command_names)
handler.data['subscription'] = network.subscribe_update_commands()
def handle_command_update(handler, command):
assert command.id in handler.data['command_ids']
handler.data['command_ids'].remove(command.id)
if handler.data['command_ids']:
return
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_command_update=handle_command_update)
def handle_connect(handler):
device, network, command_names, command_ids = init_data(handler)
command_name = command_names[:1]
set_handler_data(handler, device, network, command_names, command_ids)
send_data(handler, device, command_name)
handler.data['subscription'] = network.subscribe_update_commands(
names=command_name)
def handle_command_update(handler, command):
assert command.id == handler.data['command_ids'][0]
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_command_update=handle_command_update)
def handle_connect(handler):
network_name = test.generate_id('n-s-u-c', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
device_id = test.generate_id('n-s-u-c', test.DEVICE_ENTITY)
device = handler.api.put_device(device_id, network_id=network.id)
command_name = '%s-name-1' % device_id
command = device.send_command(command_name)
command.status = 'status'
command.save()
set_handler_data(handler, device, network, [command_name], [command.id])
handler.data['subscription'] = network.subscribe_update_commands()
def handle_command_update(handler, command):
assert command.id == handler.data['command_ids'][0]
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_command_update=handle_command_update)
def handle_connect(handler):
network_name = test.generate_id('n-s-u-c', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
network_1 = handler.api.get_network(network.id)
network.remove()
try:
network_1.subscribe_update_commands()
assert False
except ApiResponseError as api_response_error:
assert api_response_error.code == 404
test.run(handle_connect)
def test_unsubscribe_update_commands(test):
test.only_admin_implementation()
def handle_connect(handler):
name = test.generate_id('n-u-u-c', test.NETWORK_ENTITY)
description = '%s-description' % name
network = handler.api.create_network(name, description)
subscription = network.subscribe_update_commands()
subscription.remove()
try:
subscription.remove()
assert False
except SubscriptionError:
pass
network.remove()
test.run(handle_connect)
def test_subscribe_notifications(test):
test.only_admin_implementation()
def init_data(handler):
device_id = test.generate_id('n-s-n', test.DEVICE_ENTITY)
network_name = test.generate_id('n-s-n', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
notification_names = ['%s-name-%s' % (device_id, i) for i in range(2)]
device = handler.api.put_device(device_id, network_id=network.id)
return device, network, notification_names, []
def send_data(handler, device, notification_names):
for notification_name in notification_names:
notification = device.send_notification(notification_name)
handler.data['notification_ids'].append(notification.id)
def set_handler_data(handler, device, network, notification_names,
notification_ids):
handler.data['device'] = device
handler.data['network'] = network
handler.data['notification_names'] = notification_names
handler.data['notification_ids'] = notification_ids
def handle_connect(handler):
device, network, notification_names, notification_ids = init_data(
handler)
set_handler_data(handler, device, network, notification_names,
notification_ids)
send_data(handler, device, notification_names)
handler.data['subscription'] = network.subscribe_notifications()
def handle_notification(handler, notification):
assert notification.id in handler.data['notification_ids']
handler.data['notification_ids'].remove(notification.id)
if handler.data['notification_ids']:
return
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_notification=handle_notification)
def handle_connect(handler):
device, network, notification_names, notification_ids = init_data(
handler)
notification_name = notification_names[:1]
set_handler_data(handler, device, network, notification_names,
notification_ids)
send_data(handler, device, notification_name)
handler.data['subscription'] = network.subscribe_notifications(
names=notification_name)
def handle_notification(handler, notification):
assert notification.id == handler.data['notification_ids'][0]
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_notification=handle_notification)
def handle_connect(handler):
network_name = test.generate_id('n-s-n', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
device_id = test.generate_id('n-s-n', test.DEVICE_ENTITY)
device = handler.api.put_device(device_id, network_id=network.id)
notification_name = '%s-name-1' % device_id
notification = device.send_notification(notification_name)
set_handler_data(handler, device, network, [notification_name],
[notification.id])
handler.data['subscription'] = network.subscribe_notifications()
def handle_notification(handler, notification):
assert notification.id == handler.data['notification_ids'][0]
handler.data['subscription'].remove()
handler.data['device'].remove()
handler.data['network'].remove()
handler.disconnect()
test.run(handle_connect, handle_notification=handle_notification)
def handle_connect(handler):
network_name = test.generate_id('n-s-n', test.NETWORK_ENTITY)
description = '%s-description' % network_name
network = handler.api.create_network(network_name, description)
network_1 = handler.api.get_network(network.id)
network.remove()
try:
network_1.subscribe_notifications()
assert False
except ApiResponseError as api_response_error:
assert api_response_error.code == 404
test.run(handle_connect)
def test_unsubscribe_notifications(test):
test.only_admin_implementation()
def handle_connect(handler):
name = test.generate_id('n-u-n', test.NETWORK_ENTITY)
description = '%s-description' % name
network = handler.api.create_network(name, description)
subscription = network.subscribe_notifications()
subscription.remove()
try:
subscription.remove()
assert False
except SubscriptionError:
pass
network.remove()
test.run(handle_connect)
| 38.959184 | 80 | 0.675339 | 1,990 | 17,181 | 5.593467 | 0.065327 | 0.077082 | 0.036654 | 0.030995 | 0.900009 | 0.893541 | 0.870721 | 0.858144 | 0.840985 | 0.834606 | 0 | 0.004 | 0.214248 | 17,181 | 440 | 81 | 39.047727 | 0.820519 | 0.040801 | 0 | 0.845272 | 0 | 0 | 0.067055 | 0 | 0 | 0 | 0 | 0 | 0.091691 | 1 | 0.117479 | false | 0.014327 | 0.002865 | 0 | 0.137536 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ae767aa83868054f37cbcb563a4eaf05782d1983 | 139 | py | Python | model_wrangler/__init__.py | bmcmenamin/model_wrangler | c5471cc106d475c50bf26791b913f2d556a1de0a | [
"MIT"
] | null | null | null | model_wrangler/__init__.py | bmcmenamin/model_wrangler | c5471cc106d475c50bf26791b913f2d556a1de0a | [
"MIT"
] | null | null | null | model_wrangler/__init__.py | bmcmenamin/model_wrangler | c5471cc106d475c50bf26791b913f2d556a1de0a | [
"MIT"
] | 1 | 2018-01-23T23:26:15.000Z | 2018-01-23T23:26:15.000Z | import model_wrangler.architecture
import model_wrangler.dataset_managers
import model_wrangler.model_wrangler
import model_wrangler.model
| 27.8 | 38 | 0.913669 | 18 | 139 | 6.722222 | 0.333333 | 0.53719 | 0.628099 | 0.396694 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057554 | 139 | 4 | 39 | 34.75 | 0.923664 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
8828538591820bf22276768b90b552ac089f1341 | 19,984 | py | Python | app/tests/v2/test_auth.py | calebrotich10/store-manager-api-v2 | 16dff84823e77218f1135c99f0592f113fddee84 | [
"MIT"
] | null | null | null | app/tests/v2/test_auth.py | calebrotich10/store-manager-api-v2 | 16dff84823e77218f1135c99f0592f113fddee84 | [
"MIT"
] | null | null | null | app/tests/v2/test_auth.py | calebrotich10/store-manager-api-v2 | 16dff84823e77218f1135c99f0592f113fddee84 | [
"MIT"
] | 1 | 2018-11-04T18:09:38.000Z | 2018-11-04T18:09:38.000Z | """Module contains tests to endpoints that
are used for user registration and authentication
"""
import json
from . import base_test
from . import common_functions
from app.api.v2 import database
class TestAuth(base_test.TestBaseClass):
""" Class contains tests for auth endpoints """
def test_missing_token(self):
"""Test GET /products - when token is missing"""
self.register_test_admin_account()
token = ""
response = self.app_test_client.get(
'{}/products'.format(self.BASE_URL),
headers=dict(Authorization=token),
content_type='application/json'
)
self.assertEqual(response.status_code, 401)
self.assertEqual(common_functions.convert_response_to_json(
response)["Message"], "You need to login")
def test_missing_token_user(self):
"""Test GET /products - when user who generated token is missing"""
query = """DELETE FROM users"""
database.insert_to_db(query)
response = self.app_test_client.post('{}/products'.format(
self.BASE_URL), json=self.PRODUCT, headers=dict(Authorization=self.token),
content_type='application/json')
self.assertEqual(response.status_code, 406)
self.assertEqual(common_functions.convert_response_to_json(
response)["message"], "The token is invalid since it is not associated to any account")
def test_invalid_token(self):
"""Test GET /products - when token is missing"""
token = "sample_invalid-token-afskdghkfhwkedaf-ksfakjfwey"
response = self.app_test_client.get(
'{}/products'.format(self.BASE_URL),
headers=dict(Authorization=token),
content_type='application/json'
)
self.assertEqual(response.status_code, 403)
self.assertEqual(common_functions.convert_response_to_json(
response)["Message"], "The token is either expired or wrong")
def test_add_new_user(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "test_add_new_user@gmail.com",
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['user']['email'], "test_add_new_user@gmail.com")
self.assertEqual(res.status_code, 202)
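    # Hypothetical helper, not part of the original suite: every signup test
    # below repeats the same POST boilerplate, which a method like this could
    # centralize (it reuses the client and token set up by the base class).
    def _post_signup(self, payload, path="api/v2/auth/signup"):
        res = self.app_test_client.post(path,
            json=payload,
            headers=dict(Authorization=self.token),
            content_type='application/json')
        return res, json.loads(res.data.decode())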
def test_add_new_user_no_data(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Missing required credentials")
self.assertEqual(res.status_code, 400)
def test_add_new_user_missing_data(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "",
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
print(data)
self.assertEqual(data['message'], "Please supply a valid email")
self.assertEqual(res.status_code, 400)
def test_add_new_user_missing_email(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Please supply an email to be able to register an attendant")
self.assertEqual(res.status_code, 400)
def test_add_new_user_missing_password(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "test_add_new_user@gmail.com"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Please supply a password to be able to register an attendant")
self.assertEqual(res.status_code, 400)
def test_add_new_user_invalid_email(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "test_add_new_user",
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
print(data)
self.assertEqual(data['message'], "Please supply a valid email")
self.assertEqual(res.status_code, 400)
def test_add_new_user_email_not_string(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": 2,
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
print(data)
self.assertEqual(data['message'], "Email should be a string")
self.assertEqual(res.status_code, 400)
def test_add_new_user_password_not_string(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "add_user@gmail.com",
"password": 2
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
print(data)
self.assertEqual(data['message'], "Password should be a string")
self.assertEqual(res.status_code, 400)
def test_add_new_user_no_digit_password(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"password": "No#digit"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Password must have a digit")
self.assertEqual(res.status_code, 400)
def test_add_new_user_short_password(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"password": "Shor#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Password must be long than 6 characters or less than 12")
self.assertEqual(res.status_code, 400)
def test_add_new_user_no_special_ch_password(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"password": "NoSplCh12"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
print(data)
self.assertEqual(data['message'], "Password must have a special charater")
self.assertEqual(res.status_code, 400)
def test_add_new_user_no_upper_case_password(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"role": "Admin",
"password": "noupper12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Password must have an upper case character")
self.assertEqual(res.status_code, 400)
def test_add_new_user_no_lower_case_password(self):
res = self.app_test_client.post("api/v2/auth/signup",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"role": "Admin",
"password": "NOLOWER12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Password must have a lower case character")
self.assertEqual(res.status_code, 400)
def test_add_new_user_existing(self):
"""Test POST /auth/signup"""
response = self.register_test_admin_account()
data = json.loads(response.data.decode())
self.assertEqual(data['message'], "Record already exists in the database")
self.assertEqual(response.status_code, 400)
def test_login_existing_user(self):
resp = self.app_test_client.post("api/v2/auth/login",
json={
"email": "user@gmail.com",
"password": "Password12#"
},
headers={
"Content-Type": "application/json"
})
self.assertTrue(common_functions.convert_response_to_json(
resp)['token'])
        self.assertEqual(common_functions.convert_response_to_json(
            resp)['message'], "You are successfully logged in!")
self.assertEqual(resp.status_code, 200)
def test_login_no_credentials(self):
resp = self.app_test_client.post("api/v2/auth/login",
json={
},
headers={
"Content-Type": "application/json"
})
        self.assertEqual(common_functions.convert_response_to_json(
            resp)['message'], "Kindly enter your credentials")
self.assertEqual(resp.status_code, 400)
def test_login_no_email(self):
resp = self.app_test_client.post("api/v2/auth/login",
json={
"password": "Password12#"
},
headers={
"Content-Type": "application/json"
})
        self.assertEqual(common_functions.convert_response_to_json(
            resp)['message'], "Kindly provide an email address to log in")
self.assertEqual(resp.status_code, 400)
def test_login_no_password(self):
resp = self.app_test_client.post("api/v2/auth/login",
json={
"email": "user@gmail.com"
},
headers={
"Content-Type": "application/json"
})
        self.assertEqual(common_functions.convert_response_to_json(
            resp)['message'], "Kindly provide a password to log in")
self.assertEqual(resp.status_code, 400)
def test_login_email_not_string(self):
resp = self.app_test_client.post("api/v2/auth/login",
json={
"email": 1,
"password": "Password12#"
},
headers={
"Content-Type": "application/json"
})
        self.assertEqual(common_functions.convert_response_to_json(
            resp)['message'], "E-mail should be a string")
self.assertEqual(resp.status_code, 406)
def test_login_password_not_string(self):
resp = self.app_test_client.post("api/v2/auth/login",
json={
"email": "user@gmail.com",
"password": 1
},
headers={
"Content-Type": "application/json"
})
        self.assertEqual(common_functions.convert_response_to_json(
            resp)['message'], "Password should be a string")
self.assertEqual(resp.status_code, 406)
def test_login_wrong_password(self):
resp = self.app_test_client.post("api/v2/auth/login",
json={
"email": "user@gmail.com",
"password": "neverexpecteduser"
},
headers={
"Content-Type": "application/json"
})
        self.assertEqual(common_functions.convert_response_to_json(
            resp)['message'], "Wrong credentials provided")
self.assertEqual(resp.status_code, 403)
def test_login_non_existant_user(self):
resp = self.app_test_client.post("api/v2/auth/login",
json={
"email": "non_matching_credentials_user_1018@gmail.com",
"password": "neverexpecteduser"
},
headers={
"Content-Type": "application/json"
})
self.assertEqual(common_functions.convert_response_to_json(
resp)['message'], "Try again. E-mail or password is incorrect!")
def test_abort_if_user_is_not_admin(self):
self.register_test_attendant_account()
token = self.login_test_attendant()
response = self.app_test_client.post('{}/products'.format(
self.BASE_URL), json={
'product_id': 1, 'product_name': "Hammer", 'product_price': 200, 'category': 200
}, headers=dict(Authorization=token),
content_type='application/json')
        self.assertEqual(common_functions.convert_response_to_json(
            response)['message'], "Unauthorized. This action is not for you")
self.assertEqual(response.status_code, 401)
def test_logout(self):
response = self.app_test_client.post('{}/auth/logout'.format(
self.BASE_URL), headers=dict(Authorization=self.token),
content_type='application/json')
self.assertEqual(response.status_code, 200)
self.assertEqual(common_functions.convert_response_to_json(
response)['message'], "User Logged out successfully"
)
def test_add_new_admin_user(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "test_add_new_user@gmail.com",
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['user']['email'], "test_add_new_user@gmail.com")
self.assertEqual(res.status_code, 202)
def test_add_new_admin_user_no_data(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Missing required credentials")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_missing_data(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "",
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
print(data)
self.assertEqual(data['message'], "Please supply a valid email")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_missing_email(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Please supply an email to be able to register an admin")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_missing_password(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "test_add_new_user@gmail.com"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Please supply a password to be able to register an admin")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_invalid_email(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "test_add_new_user",
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Please supply a valid email")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_email_not_string(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": 2,
"password": "Password12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Email should be a string")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_password_not_string(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "add_user@gmail.com",
"password": 2
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
print(data)
self.assertEqual(data['message'], "Password should be a string")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_no_digit_password(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"password": "No#digit"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Password must have a digit")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_short_password(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"password": "Shor#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Password must be long than 6 characters or less than 12")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_no_special_ch_password(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"password": "NoSplCh12"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
print(data)
self.assertEqual(data['message'], "Password must have a special charater")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_no_upper_case_password(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"role": "Admin",
"password": "noupper12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Password must have an upper case character")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_no_lower_case_password(self):
res = self.app_test_client.post("api/v2/auth/signup/admin",
json={
"email": "test_add_new_user_invalid_email@gmail.com",
"role": "Admin",
"password": "NOLOWER12#"
},
headers=dict(Authorization=self.token),
content_type='application/json')
data = json.loads(res.data.decode())
self.assertEqual(data['message'], "Password must have a lower case character")
self.assertEqual(res.status_code, 400)
def test_add_new_admin_user_existing(self):
"""Test POST /auth/signup/admin"""
response = self.register_test_admin_account()
data = json.loads(response.data.decode())
self.assertEqual(data['message'], "Record already exists in the database")
self.assertEqual(response.status_code, 400) | 35.685714 | 105 | 0.631956 | 2,400 | 19,984 | 5.045417 | 0.0725 | 0.090429 | 0.037988 | 0.054753 | 0.918821 | 0.91114 | 0.900487 | 0.893715 | 0.887026 | 0.877694 | 0 | 0.014251 | 0.241543 | 19,984 | 560 | 106 | 35.685714 | 0.784654 | 0.016613 | 0 | 0.724832 | 0 | 0 | 0.238746 | 0.049758 | 0 | 0 | 0 | 0 | 0.183445 | 1 | 0.091723 | false | 0.138702 | 0.008949 | 0 | 0.102908 | 0.017897 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
4e85e3187bbe03d2752bc5ea94b1d787c653638a | 12,746 | py | Python | chess/piece.py | lucasvalentim/chess-cli | f77828cca92d1d23a8c614c07072b3ce72cfbadd | [
"MIT"
] | null | null | null | chess/piece.py | lucasvalentim/chess-cli | f77828cca92d1d23a8c614c07072b3ce72cfbadd | [
"MIT"
] | null | null | null | chess/piece.py | lucasvalentim/chess-cli | f77828cca92d1d23a8c614c07072b3ce72cfbadd | [
"MIT"
] | null | null | null | from chess.constants import *
class Piece(object):
    def __init__(self, piece_value):
        if piece_value in PIECES:
            self.__value = piece_value
            if piece_value in WHITE_PIECES:
                self.__color = WHITE
            else:
                self.__color = BLACK
        else:
            # Previously an unknown value left __value/__color unset, which
            # surfaced later as an AttributeError; fail fast instead.
            raise ValueError('Unknown piece value: %r' % piece_value)
@property
def value(self):
return self.__value
@property
def color(self):
return self.__color
def squares_available(self, board, coordinate):
squares_available = []
if self.value == WHITE_PAWN:
if coordinate.rank != RANK_8:
if board.get(board.neighboring_squares_upper(coordinate)[0]) == EMPTY:
squares_available.append(board.neighboring_squares_upper(coordinate)[0])
if coordinate.rank == RANK_2 and board.get(board.neighboring_squares_upper(coordinate)[1]) == EMPTY:
squares_available.append(board.neighboring_squares_upper(coordinate)[1])
if coordinate.file != FILE_A and board.get(
board.neighboring_squares_upper_left(coordinate)[0]) in BLACK_PIECES:
squares_available.append(board.neighboring_squares_upper_left(coordinate)[0])
if coordinate.file != FILE_H and board.get(
board.neighboring_squares_upper_right(coordinate)[0]) in BLACK_PIECES:
squares_available.append(board.neighboring_squares_upper_right(coordinate)[0])
elif self.value == BLACK_PAWN:
if coordinate.rank != RANK_1:
if board.get(board.neighboring_squares_lower(coordinate)[0]) == EMPTY:
squares_available.append(board.neighboring_squares_lower(coordinate)[0])
if coordinate.rank == RANK_7 and board.get(board.neighboring_squares_lower(coordinate)[1]) == EMPTY:
squares_available.append(board.neighboring_squares_lower(coordinate)[1])
if coordinate.file != FILE_A and board.get(
board.neighboring_squares_lower_left(coordinate)[0]) in WHITE_PIECES:
squares_available.append(board.neighboring_squares_lower_left(coordinate)[0])
if coordinate.file != FILE_H and board.get(
board.neighboring_squares_lower_right(coordinate)[0]) in WHITE_PIECES:
squares_available.append(board.neighboring_squares_lower_right(coordinate)[0])
elif self.value == WHITE_KNIGHT:
for square_coordinate in board.neighboring_squares_knight_radius(coordinate):
if board.get(square_coordinate) in [EMPTY] + BLACK_PIECES:
squares_available.append(square_coordinate)
elif self.value == BLACK_KNIGHT:
for square_coordinate in board.neighboring_squares_knight_radius(coordinate):
if board.get(square_coordinate) in [EMPTY] + WHITE_PIECES:
squares_available.append(square_coordinate)
elif self.value == WHITE_BISHOP:
for squares_coordinates in [board.neighboring_squares_upper_left(coordinate),
board.neighboring_squares_lower_right(coordinate),
board.neighboring_squares_upper_right(coordinate),
board.neighboring_squares_lower_left(coordinate)]:
for square_coordinate in squares_coordinates:
if board.get(square_coordinate) == EMPTY:
squares_available.append(square_coordinate)
elif board.get(square_coordinate) in BLACK_PIECES:
squares_available.append(square_coordinate)
break
else:
break
elif self.value == BLACK_BISHOP:
for squares_coordinates in [board.neighboring_squares_upper_left(coordinate),
board.neighboring_squares_lower_right(coordinate),
board.neighboring_squares_upper_right(coordinate),
board.neighboring_squares_lower_left(coordinate)]:
for square_coordinate in squares_coordinates:
if board.get(square_coordinate) == EMPTY:
squares_available.append(square_coordinate)
elif board.get(square_coordinate) in WHITE_PIECES:
squares_available.append(square_coordinate)
break
else:
break
elif self.value == WHITE_ROOK:
for squares_coordinates in [board.neighboring_squares_upper(coordinate),
board.neighboring_squares_lower(coordinate),
board.neighboring_squares_left(coordinate),
board.neighboring_squares_right(coordinate)]:
for square_coordinate in squares_coordinates:
if board.get(square_coordinate) == EMPTY:
squares_available.append(square_coordinate)
elif board.get(square_coordinate) in BLACK_PIECES:
squares_available.append(square_coordinate)
break
else:
break
elif self.value == BLACK_ROOK:
for squares_coordinates in [board.neighboring_squares_upper(coordinate),
board.neighboring_squares_lower(coordinate),
board.neighboring_squares_left(coordinate),
board.neighboring_squares_right(coordinate)]:
for square_coordinate in squares_coordinates:
if board.get(square_coordinate) == EMPTY:
squares_available.append(square_coordinate)
elif board.get(square_coordinate) in WHITE_PIECES:
squares_available.append(square_coordinate)
break
else:
break
elif self.value == WHITE_QUEEN:
for squares_coordinates in [board.neighboring_squares_upper(coordinate),
board.neighboring_squares_lower(coordinate),
board.neighboring_squares_left(coordinate),
board.neighboring_squares_right(coordinate),
board.neighboring_squares_upper_left(coordinate),
board.neighboring_squares_lower_right(coordinate),
board.neighboring_squares_upper_right(coordinate),
board.neighboring_squares_lower_left(coordinate)]:
for square_coordinate in squares_coordinates:
if board.get(square_coordinate) == EMPTY:
squares_available.append(square_coordinate)
elif board.get(square_coordinate) in BLACK_PIECES:
squares_available.append(square_coordinate)
break
else:
break
elif self.value == BLACK_QUEEN:
for squares_coordinates in [board.neighboring_squares_upper(coordinate),
board.neighboring_squares_lower(coordinate),
board.neighboring_squares_left(coordinate),
board.neighboring_squares_right(coordinate),
board.neighboring_squares_upper_left(coordinate),
board.neighboring_squares_lower_right(coordinate),
board.neighboring_squares_upper_right(coordinate),
board.neighboring_squares_lower_left(coordinate)]:
for square_coordinate in squares_coordinates:
if board.get(square_coordinate) == EMPTY:
squares_available.append(square_coordinate)
elif board.get(square_coordinate) in WHITE_PIECES:
squares_available.append(square_coordinate)
break
else:
break
elif self.value == WHITE_KING:
if coordinate.rank != RANK_8:
if board.get(board.neighboring_squares_upper(coordinate)[0]) in [EMPTY] + BLACK_PIECES:
squares_available.append(board.neighboring_squares_upper(coordinate)[0])
if coordinate.file != FILE_A and board.get(
board.neighboring_squares_upper_left(coordinate)[0]) in [EMPTY] + BLACK_PIECES:
squares_available.append(board.neighboring_squares_upper_left(coordinate)[0])
if coordinate.file != FILE_H and board.get(
board.neighboring_squares_upper_right(coordinate)[0]) in [EMPTY] + BLACK_PIECES:
squares_available.append(board.neighboring_squares_upper_right(coordinate)[0])
if coordinate.rank != RANK_1:
if board.get(board.neighboring_squares_lower(coordinate)[0]) in [EMPTY] + BLACK_PIECES:
squares_available.append(board.neighboring_squares_lower(coordinate)[0])
if coordinate.file != FILE_A and board.get(
board.neighboring_squares_lower_left(coordinate)[0]) in [EMPTY] + BLACK_PIECES:
squares_available.append(board.neighboring_squares_lower_left(coordinate)[0])
if coordinate.file != FILE_H and board.get(
board.neighboring_squares_lower_right(coordinate)[0]) in [EMPTY] + BLACK_PIECES:
squares_available.append(board.neighboring_squares_lower_right(coordinate)[0])
if coordinate.file != FILE_A and board.get(
board.neighboring_squares_left(coordinate)[0]) in [EMPTY] + BLACK_PIECES:
squares_available.append(board.neighboring_squares_left(coordinate)[0])
if coordinate.file != FILE_H and board.get(
board.neighboring_squares_right(coordinate)[0]) in [EMPTY] + BLACK_PIECES:
squares_available.append(board.neighboring_squares_right(coordinate)[0])
elif self.value == BLACK_KING:
if coordinate.rank != RANK_8:
if board.get(board.neighboring_squares_upper(coordinate)[0]) in [EMPTY] + WHITE_PIECES:
squares_available.append(board.neighboring_squares_upper(coordinate)[0])
if coordinate.file != FILE_A and board.get(
board.neighboring_squares_upper_left(coordinate)[0]) in [EMPTY] + WHITE_PIECES:
squares_available.append(board.neighboring_squares_upper_left(coordinate)[0])
if coordinate.file != FILE_H and board.get(
board.neighboring_squares_upper_right(coordinate)[0]) in [EMPTY] + WHITE_PIECES:
squares_available.append(board.neighboring_squares_upper_right(coordinate)[0])
if coordinate.rank != RANK_1:
if board.get(board.neighboring_squares_lower(coordinate)[0]) in [EMPTY] + WHITE_PIECES:
squares_available.append(board.neighboring_squares_lower(coordinate)[0])
if coordinate.file != FILE_A and board.get(
board.neighboring_squares_lower_left(coordinate)[0]) in [EMPTY] + WHITE_PIECES:
squares_available.append(board.neighboring_squares_lower_left(coordinate)[0])
if coordinate.file != FILE_H and board.get(
board.neighboring_squares_lower_right(coordinate)[0]) in [EMPTY] + WHITE_PIECES:
squares_available.append(board.neighboring_squares_lower_right(coordinate)[0])
if coordinate.file != FILE_A and board.get(
board.neighboring_squares_left(coordinate)[0]) in [EMPTY] + WHITE_PIECES:
squares_available.append(board.neighboring_squares_left(coordinate)[0])
if coordinate.file != FILE_H and board.get(
board.neighboring_squares_right(coordinate)[0]) in [EMPTY] + WHITE_PIECES:
squares_available.append(board.neighboring_squares_right(coordinate)[0])
return squares_available
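# Minimal usage sketch (hypothetical: the Board/Coordinate constructors and
# the square name below are assumptions about the rest of this package, not
# part of this module):
#
#     from chess.board import Board
#     from chess.coordinate import Coordinate
#
#     board = Board()                                  # starting position
#     moves = Piece(WHITE_PAWN).squares_available(board, Coordinate('e2'))
#     # a white pawn on its home rank should get the one- and two-square
#     # advances plus any occupied capture diagonals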
| 53.108333 | 120 | 0.588655 | 1,221 | 12,746 | 5.835381 | 0.044226 | 0.18414 | 0.264702 | 0.125754 | 0.959579 | 0.958456 | 0.947649 | 0.930246 | 0.930246 | 0.894316 | 0 | 0.006663 | 0.340577 | 12,746 | 239 | 121 | 53.330544 | 0.841047 | 0 | 0 | 0.704301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021505 | false | 0 | 0.005376 | 0.010753 | 0.048387 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
4e9578c45c779e77fba9186779c22ddd8c9d770b | 3,184 | py | Python | mpf/tests/test_SwitchPlayer.py | Wolfmarsh/mpf | ad71f381ce8a0e65f28958e51cf8a8b38a6154fb | [
"MIT"
] | null | null | null | mpf/tests/test_SwitchPlayer.py | Wolfmarsh/mpf | ad71f381ce8a0e65f28958e51cf8a8b38a6154fb | [
"MIT"
] | null | null | null | mpf/tests/test_SwitchPlayer.py | Wolfmarsh/mpf | ad71f381ce8a0e65f28958e51cf8a8b38a6154fb | [
"MIT"
] | null | null | null | from mpf.tests.MpfTestCase import MpfTestCase
class TestSwitchPlayer(MpfTestCase):
def get_config_file(self):
return 'config.yaml'
def get_machine_path(self):
return 'tests/machine_files/switch_player/'
def setUp(self):
self.machine_config_patches['mpf']['plugins'] = ['mpf.plugins.switch_player.SwitchPlayer']
super().setUp()
def _sw_handler(self):
self.hits += 1
def test_switch_player(self):
self.hits = 0
self.machine.switch_controller.add_switch_handler("s_test3", self._sw_handler)
self.post_event("test_start")
self.assertEqual(False, self.machine.switch_controller.is_active("s_test1"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test2"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test3"))
self.assertEqual(0, self.hits)
self.advance_time_and_run(0.1)
self.assertEqual(True, self.machine.switch_controller.is_active("s_test1"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test2"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test3"))
self.advance_time_and_run(0.6)
self.assertEqual(True, self.machine.switch_controller.is_active("s_test1"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test2"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test3"))
self.assertEqual(1, self.hits)
self.advance_time_and_run(0.1)
self.assertEqual(False, self.machine.switch_controller.is_active("s_test1"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test2"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test3"))
self.assertEqual(1, self.hits)
self.advance_time_and_run(1)
self.assertEqual(False, self.machine.switch_controller.is_active("s_test1"))
self.assertEqual(True, self.machine.switch_controller.is_active("s_test2"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test3"))
self.assertEqual(1, self.hits)
self.advance_time_and_run(1)
self.assertEqual(False, self.machine.switch_controller.is_active("s_test1"))
self.assertEqual(True, self.machine.switch_controller.is_active("s_test2"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test3"))
self.assertEqual(2, self.hits)
self.advance_time_and_run(0.1)
self.assertEqual(False, self.machine.switch_controller.is_active("s_test1"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test2"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test3"))
self.assertEqual(2, self.hits)
self.advance_time_and_run(1)
self.assertEqual(False, self.machine.switch_controller.is_active("s_test1"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test2"))
self.assertEqual(False, self.machine.switch_controller.is_active("s_test3"))
self.assertEqual(3, self.hits)
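    # Hypothetical helper, not part of the original test: the four-line
    # assertion block above repeats for every time step, so each step could
    # be expressed as one call, e.g. self._assert_states(True, False, False, 1).
    def _assert_states(self, test1, test2, test3, hits):
        self.assertEqual(test1, self.machine.switch_controller.is_active("s_test1"))
        self.assertEqual(test2, self.machine.switch_controller.is_active("s_test2"))
        self.assertEqual(test3, self.machine.switch_controller.is_active("s_test3"))
        self.assertEqual(hits, self.hits)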
| 53.066667 | 98 | 0.720791 | 421 | 3,184 | 5.180523 | 0.12114 | 0.213205 | 0.194865 | 0.309491 | 0.813847 | 0.813847 | 0.805594 | 0.805594 | 0.805594 | 0.805594 | 0 | 0.016741 | 0.155779 | 3,184 | 59 | 99 | 53.966102 | 0.794643 | 0 | 0 | 0.660377 | 0 | 0 | 0.087312 | 0.022613 | 0 | 0 | 0 | 0 | 0.584906 | 1 | 0.09434 | false | 0 | 0.018868 | 0.037736 | 0.169811 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
1171ab549c349db28d0fc47d9845482e702bda7c | 17,351 | py | Python | src/test/Python/UpdateObject.py | Termpey/xlr-rally-plugin | 681d1be1bd40f5d81d2b57259ba7e7cf79360b33 | [
"MIT"
] | null | null | null | src/test/Python/UpdateObject.py | Termpey/xlr-rally-plugin | 681d1be1bd40f5d81d2b57259ba7e7cf79360b33 | [
"MIT"
] | null | null | null | src/test/Python/UpdateObject.py | Termpey/xlr-rally-plugin | 681d1be1bd40f5d81d2b57259ba7e7cf79360b33 | [
"MIT"
] | 1 | 2020-01-23T19:25:16.000Z | 2020-01-23T19:25:16.000Z | #
# Copyright 2019 XEBIALABS
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
import ssl, httplib, urllib, json
class UpdateObject:
baseURL = '/slm/webservice/v2.0/'
def __init__(self, userAndPass):
self.userAndPass = userAndPass
#
# Feature Manipulation
#
def basicFeature(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context())
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'portfolioitem/feature?fetch=FormattedID&query=(FormattedID%20%3D%20F20420)'
conn.request('GET', curURL, "", headers)
fQResp = conn.getresponse()
fQJson = json.loads(fQResp.read())
print("****Feature Query****\n")
print(fQJson)
print("\n\n")
fRef = fQJson.get('QueryResult').get('Results')[0].get('_ref')
curURL = UpdateObject.baseURL + 'batch'
data = """{"Batch": [{"Entry": {"Path": "/porfolioitem/feature/%s", "Method": "POST", "Body": {"feature": {"State": "On Hold", "Description": "Testing Plugin Script", "Notes": "Testing Plugin Script", "c_AcceptancCriteria": "Testing Plugin Script", "PlannedStartDate": "2020-01-30", "PlannedEndDate": "2020-12-30"}}}}]}"""%fRef
headers = {'Authorization' : 'Basic %s', 'ZSESSIONID' : '%s'%(self.userAndPass, 'API KEY')}
conn.request('POST', curURL, data, headers)
request = conn.getresponse()
print("****Feature Batch Update****\n")
print(request.read())
print("\n\n")
return ""
#
# Set Milestone on Feature
#
def milestoneFeature(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'portfolioitem/feature?fetch=FormattedID&query=(FormattedID%20%3D%20F20420)'
conn.request('GET', curURL, "", headers)
fQResp = conn.getresponse()
fQJson = json.loads(fQResp.read())
print("****Feature Query****\n")
print(fQJson)
print("\n\n")
fRef = fQJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'milestone?fetch=Name&query=(Name%20%3D%20\"2020%20Thu%2010/1\")'
conn.request('GET', curURL, "", headers)
milReq = conn.getresponse()
milJson = json.loads(milReq.read())
print("****Milestone Query****\n")
print(milJson)
print("\n\n")
milRef = milJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'batch'
        data = """{"Batch": [{"Entry": {"Path": "/portfolioitem/feature/%s", "Method": "POST", "Body": {"feature": {"Milestones": {"Milestone": "%s"}}}}}]}"""%(fRef, milRef)
        headers = {'Authorization' : 'Basic %s' % self.userAndPass, 'ZSESSIONID' : 'API KEY'}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****Feature Manipulation****\n")
print(request.read())
print("\n\n")
return ""
#
# Set Team to Tech Feature
#
def teamFeature(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'portfolioitem/feature?fetch=FormattedID&query=(FormattedID%20%3D%20F20420)'
conn.request('GET', curURL, "", headers)
fQResp = conn.getresponse()
fQJson = json.loads(fQResp.read())
print("****Feature Query****\n")
print(fQJson)
print("\n\n")
fRef = fQJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'project?fetch=Name&query=(Name%20%3D%20\"Tech\")'
conn.request('GET', curURL, "", headers)
teamReq = conn.getresponse()
teamJson = json.loads(teamReq.read())
print("****Project/Team Query****\n")
print(teamJson)
print("\n\n")
teamRef = teamJson.get('QueryResult').get('Results')[0].get('_ref')
curURL = UpdateObject.baseURL + 'batch'
data = """{"Batch":["Entry":{"Path":"/portfolioitem/feature/%s", "Method": 'POST", "Body": {"feature": {"Project": "%s"}}}]}"""%(fRef, teamRef)
headers = {'Authorization' : 'Basic %s', 'ZSESSIONID' : '%s'%(self.userAndPass, 'API KEY')}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****Feature Manipulation****\n")
print(request.read())
print("\n\n")
return ""
#
# Set Owner on Feature
#
def ownerFeature(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'portfolioitem/feature?fetch=FormattedID&query=(FormattedID%20%3D%20F20420)'
conn.request('GET', curURL, "", headers)
fQResp = conn.getresponse()
fQJson = json.loads(fQResp.read())
print("****Feature Query****\n")
print(fQJson)
print("\n\n")
fRef = fQJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'user?fetch=Name&query=((FirstName%20%3D%20Connor)%20AND%20(LastName%20%3D%20Trempe))'
conn.request('GET', curURL, "", headers)
ownReq = conn.getresponse()
ownJson = json.loads(ownReq.read())
print("****User Query****\n")
print(ownJson)
print("\n\n")
ownRef = ownJson.get('QueryResult').get('Results')[0].get('_ref')
curURL = UpdateObject.baseURL + 'batch'
data = """{"Batch":["Entry":{"Path":"/portfolioitem/feature/%s", "Method": 'POST", "Body": {"feature": {"Owner": {"_ref": "%s"}}}}]}"""%(fRef, ownRef)
headers = {'Authorization' : 'Basic %s', 'ZSESSIONID' : '%s'%(self.userAndPass, 'API KEY')}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****Feature Manipulation****\n")
print(request.read())
print("\n\n")
return ""
#
# User Story Manipulation
#
def basicUserStoryManipulation(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'hierarchicalrequirement?fetch=FormattedID&query=(FormattedID%20%3D%20US138725)'
conn.request('GET', curURL, "", headers)
usQResp = conn.getresponse()
usQJson = json.loads(usQResp.read())
print("****User Story Query****\n")
print(usQJson)
print("\n\n")
usRef = usQJson.get('QueryResult').get('Results')[0].get('_ref')
curURL = UpdateObject.baseURL + 'batch'
data = """{"Batch":["Entry":{"Path":"/hierarchicalrequirement/%s", "Method": 'POST", "Body": {"hierarchicalrequirement": {"Description": "Plugin Script Testing", "Notes": "Plugin Script Testing", "c_AcceptanceCriteria": "Plugin Script Testing"}}}]}"""%usRef
headers = {'Authorization' : 'Basic %s', 'ZSESSIONID' : '%s'%(self.userAndPass, 'API KEY')}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****User Story Manipulation****\n")
print(request.read())
print("\n\n")
return ""
#
# Searching for and Adding Iteration to User Story
#
def iterationUserStory(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'hierarchicalrequirement?fetch=FormattedID&query=(FormattedID%20%3D%20US138725)'
conn.request('GET', curURL, "", headers)
usQResp = conn.getresponse()
usQJson = json.loads(usQResp.read())
print("****User Story Query****\n")
print(usQJson)
print("\n\n")
usRef = usQJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'iteration?fetch=Name&query=((Project.Name%20%3D%20Tech)%20AND%20(Name%20%3D%20\"2020%20Sprint%201:%201/1%20-%201/14\"))'
conn.request('GET', curURL, "", headers)
iterReq = conn.getresponse()
iterJson = json.loads(iterReq.read())
print("****Iteration Query****\n")
print(iterJson)
print("\n\n")
iterRef = iterJson.get('QueryResult').get('Results')[0].get('_ref')
curURL = UpdateObject.baseURL + 'batch'
data = """{"Batch":["Entry":{"Path":"/hierarchicalrequirement/%s", "Method": 'POST", "Body": {"hierarchicalrequirement": {"Iteration": "%s"}}}]}"""%(usRef, iterRef)
headers = {'Authorization' : 'Basic %s', 'ZSESSIONID' : '%s'%(self.userAndPass, 'API KEY')}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****User Story Manipulation****\n")
print(request.read())
print("\n\n")
return ""
#
# Add Milestone to User Story
#
def milestoneUserStory(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'hierarchicalrequirement?fetch=FormattedID&query=(FormattedID%20%3D%20US138725)'
conn.request('GET', curURL, "", headers)
usQResp = conn.getresponse()
usQJson = json.loads(usQResp.read())
print("****User Story Query****\n")
print(usQJson)
print("\n\n")
usRef = usQJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'milestone?fetch=Name&query=(Name%20%3D%20\"2020%20Thu%2010/1\")'
conn.request('GET', curURL, "", headers)
milReq = conn.getresponse()
milJson = json.loads(milReq.read())
print("****Milestone Query****\n")
print(milJson)
print("\n\n")
milRef = milJson.get('QueryResult').get('Results')[0].get('_ref')
curURL = UpdateObject.baseURL + 'batch'
data = """{"Batch":["Entry":{"Path":"/hierarchicalrequirement/%s", "Method": 'POST", "Body": {"hierarchicalrequirement": {"Milestones": {"Milestone": "%s"}}}}]}"""%(usRef, milRef)
headers = {'Authorization' : 'Basic %s', 'ZSESSIONID' : '%s'%(self.userAndPass, 'API KEY')}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****User Story Manipulation****\n")
print(request.read())
print("\n\n")
return ""
#
# Adding Team By Title
#
def teamUserStory(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'hierarchicalrequirement?fetch=FormattedID&query=(FormattedID%20%3D%20US138725)'
conn.request('GET', curURL, "", headers)
usQResp = conn.getresponse()
usQJson = json.loads(usQResp.read())
print("****User Story Query****\n")
print(usQJson)
print("\n\n")
usRef = usQJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'project?fetch=Name&query=(Name%20%3D%20\"Release%20Management%20Team\")'
conn.request('GET', curURL, "", headers)
ownReq = conn.getresponse()
ownJson = json.loads(ownReq.read())
print("****Project/Team Query****\n")
print(ownJson)
print("\n\n")
ownRef = ownJson.get('QueryResult').get('Results')[0].get('_ref')
curURL = UpdateObject.baseURL + 'batch'
data = """{"Batch":["Entry":{"Path":"/hierarchicalrequirement/%s", "Method": 'POST", "Body": {"hierarchicalrequirement": {"Project": "%s"}}}]}"""%(usRef, ownRef)
headers = {'Authorization' : 'Basic %s', 'ZSESSIONID' : '%s'%(self.userAndPass, 'API KEY')}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****User Story Manipulation****\n")
print(request.read())
print("\n\n")
return ""
#
# Add State by Project and Title
#
def stateUserStory(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'hierarchicalrequirement?fetch=FormattedID&query=(FormattedID%20%3D%20US138725)'
conn.request('GET', curURL, "", headers)
usQResp = conn.getresponse()
usQJson = json.loads(usQResp.read())
print("****User Story Query****\n")
print(usQJson)
print("\n\n")
usRef = usQJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'flowstate?fetch=Name&query=((Name%20%3D%20Accepted)%20AND%20(Project.Name%20%3D%20\"Release%20Management%20Team\"))'
conn.request('GET', curURL, "", headers)
stsReq = conn.getresponse()
stsJson = json.loads(stsReq.read())
print("****State(Flow State) Query****\n")
print(stsJson)
print("\n\n")
stsRef = stsJson.get('QueryResult').get('Results')[0].get('_ref')
curURL = UpdateObject.baseURL + 'batch'
data = """{"Batch":["Entry":{"Path":"/hierarchicalrequirement/%s", "Method": 'POST", "Body": {"hierarchicalrequirement": {"FlowState": {"_ref": "%s"}}}}]}"""%(usRef, stsRef)
headers = {'Authorization' : 'Basic %s', 'ZSESSIONID' : '%s'%(self.userAndPass, 'API KEY')}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****User Story Manipulation****\n")
print(request.read())
print("\n\n")
return ""
#
# Add Owner by Name User Story
#
def ownerUserStory(self):
conn = httplib.HTTPSConnection("rally1.rallydev.com","443",context=ssl._create_unverified_context()) #Re-establis Connection after PUT/POST/DELETE
headers = {'Authorization' : 'Basic %s' % self.userAndPass}
curURL = UpdateObject.baseURL + 'hierarchicalrequirement?fetch=FormattedID&query=(FormattedID%20%3D%20US138725)'
conn.request('GET', curURL, "", headers)
usQResp = conn.getresponse()
usQJson = json.loads(usQResp.read())
print("****User Story Query****\n")
print(usQJson)
print("\n\n")
usRef = usQJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'user?fetch=FirstName,LastName&query=((FirstName%20%3D%20Connor)%20AND%20(LastName%20%3D%20Trempe))'
conn.request('GET', curURL, "", headers)
ownReq = conn.getresponse()
ownJson = json.loads(ownReq.read())
print("****User Query****\n")
print(ownJson)
print("\n\n")
ownRef = ownJson.get('QueryResult').get('Results')[0].get('_ref')
        curURL = UpdateObject.baseURL + 'batch'
        data = """{"Batch": [{"Entry": {"Path": "/hierarchicalrequirement/%s", "Method": "POST", "Body": {"hierarchicalrequirement": {"Owner": {"_ref": "%s"}}}}}]}"""%(usRef, ownRef)
        headers = {'Authorization' : 'Basic %s' % self.userAndPass, 'ZSESSIONID' : 'API KEY'}
conn.request("POST", curURL, data, headers)
request = conn.getresponse()
print("****User Story Manipulation****\n")
print(request.read())
print("\n\n")
return "" | 34.702 | 462 | 0.608092 | 1,850 | 17,351 | 5.672432 | 0.133514 | 0.050696 | 0.018677 | 0.049552 | 0.789308 | 0.782828 | 0.777968 | 0.773966 | 0.773966 | 0.770821 | 0 | 0.022829 | 0.212322 | 17,351 | 500 | 463 | 34.702 | 0.745006 | 0.099706 | 0 | 0.817844 | 0 | 0.063197 | 0.341039 | 0.139793 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040892 | false | 0.081784 | 0.003717 | 0 | 0.089219 | 0.315985 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |