import sys
from convnet import *
import numpy as np
import conv_cpu
test_gemm = True
def DivUp(a, b):
    # Ceiling division; use // so the result stays an integer.
    return (a + b - 1) // b
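# A standalone sanity check (illustrative only; the values below are arbitrary)
# of the ceiling-division idiom that DivUp relies on:

```python
# (a + b - 1) // b equals ceil(a / b) for positive integers a and b.
a, b = 12, 5
result = (a + b - 1) // b
assert result == 3  # ceil(12 / 5) == 3
```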
def TestConvUp(images_shape, conv_desc):
    filters_shape = (conv_desc.num_output_channels, conv_desc.kernel_size_x, conv_desc.kernel_size_y, conv_desc.num_input_channels)
    output_shape = cm.GetOutputShape4D(images_shape, conv_desc)
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    filters = np.random.randn(filters_shape[0], filters_shape[1] * filters_shape[2] * filters_shape[3]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    filters_gpu = cm.CUDAMatrix(filters, shape=filters_shape)
    output_gpu = cm.empty(output_shape)
    if test_gemm:
        cc_gemm.convUp(images_gpu, filters_gpu, output_gpu, conv_desc)
    else:
        cc.convUp(images_gpu, filters_gpu, output_gpu, conv_desc)
    output_cpu = conv_cpu.ConvUp(images, filters, images_shape, cm.GetConvDescTuple(conv_desc))
    diff = Diff(output_cpu, output_gpu.asarray())
    images_gpu.free_device_memory()
    filters_gpu.free_device_memory()
    output_gpu.free_device_memory()
    return diff
def TestConvDown(images_shape, conv_desc):
    deriv_shape = cm.GetOutputShape4D(images_shape, conv_desc)
    filters_shape = (conv_desc.num_output_channels, conv_desc.kernel_size_x, conv_desc.kernel_size_y, conv_desc.num_input_channels)
    derivs = np.random.randn(deriv_shape[0], deriv_shape[1] * deriv_shape[2] * deriv_shape[3]).astype(np.float32)
    filters = np.random.randn(filters_shape[0], filters_shape[1] * filters_shape[2] * filters_shape[3]).astype(np.float32)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=deriv_shape)
    filters_gpu = cm.CUDAMatrix(filters, shape=filters_shape)
    images_gpu = cm.empty(images_shape)
    if test_gemm:
        cc_gemm.convDown(derivs_gpu, filters_gpu, images_gpu, conv_desc)
    else:
        cc.convDown(derivs_gpu, filters_gpu, images_gpu, conv_desc)
    images_cpu = conv_cpu.ConvDown(derivs, filters, images_shape, cm.GetConvDescTuple(conv_desc))
    diff = Diff(images_cpu, images_gpu.asarray())
    images_gpu.free_device_memory()
    filters_gpu.free_device_memory()
    derivs_gpu.free_device_memory()
    return diff
def TestConvOutp(images_shape, conv_desc, partial_sum_y=0, partial_sum_x=0):
    filters_shape = (conv_desc.num_output_channels, conv_desc.kernel_size_x, conv_desc.kernel_size_y, conv_desc.num_input_channels)
    deriv_shape = cm.GetOutputShape4D(images_shape, conv_desc)
    batch_size, num_modules_x, num_modules_y, num_output_channels = deriv_shape
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    derivs = np.random.randn(deriv_shape[0], deriv_shape[1] * deriv_shape[2] * deriv_shape[3]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    filters_gpu = cm.empty(filters_shape)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=deriv_shape)
    if test_gemm:
        cc_gemm.convOutp(images_gpu, derivs_gpu, filters_gpu, conv_desc)
    else:
        if partial_sum_x == 0:
            partial_sum_x = num_modules_x
        if partial_sum_y == 0:
            partial_sum_y = num_modules_y
        partial_sum_locs_y = DivUp(num_modules_y, partial_sum_y)
        partial_sum_locs_x = DivUp(num_modules_x, partial_sum_x)
        filters_temp_gpu = cm.empty((filters_shape[0], filters_shape[1], filters_shape[2], filters_shape[3] * partial_sum_locs_x * partial_sum_locs_y))
        cc.convOutp(images_gpu, derivs_gpu, filters_gpu, conv_desc, partialSumY=partial_sum_y, partialSumX=partial_sum_x, temp=filters_temp_gpu)
    filters_cpu, filters_temp_cpu = conv_cpu.ConvOutp(images, derivs, images_shape, cm.GetConvDescTuple(conv_desc), partial_sum_y=partial_sum_y, partial_sum_x=partial_sum_x)
    diff1 = Diff(filters_gpu.asarray(), filters_cpu)
    if test_gemm:
        diff2 = 0
    else:
        diff2 = Diff(filters_temp_gpu.asarray(), filters_temp_cpu)
        filters_temp_gpu.free_device_memory()
    images_gpu.free_device_memory()
    filters_gpu.free_device_memory()
    derivs_gpu.free_device_memory()
    return diff1, diff2
def TestMaxPool(images_shape, conv_desc):
    output_shape = cm.GetOutputShape4D(images_shape, conv_desc)
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    output_gpu = cm.empty(output_shape)
    if test_gemm:
        cc_gemm.MaxPool(images_gpu, output_gpu, conv_desc)
    else:
        cc.MaxPool(images_gpu, output_gpu, conv_desc)
    output_cpu = conv_cpu.MaxPool(images, images_shape, cm.GetConvDescTuple(conv_desc))
    diff = Diff(output_cpu, output_gpu.asarray())
    images_gpu.free_device_memory()
    output_gpu.free_device_memory()
    return diff
def TestMaxPoolUndo(images_shape, conv_desc):
    deriv_shape = cm.GetOutputShape4D(images_shape, conv_desc)
    images = np.random.rand(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    derivs = np.random.randn(deriv_shape[0], deriv_shape[1] * deriv_shape[2] * deriv_shape[3]).astype(np.float32)
    maxes = conv_cpu.MaxPool(images, images_shape, cm.GetConvDescTuple(conv_desc))
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=deriv_shape)
    maxes_gpu = cm.CUDAMatrix(maxes, shape=deriv_shape)
    targets_gpu = cm.empty(images_shape)
    if test_gemm:
        cc_gemm.MaxPoolUndo(images_gpu, derivs_gpu, maxes_gpu, targets_gpu, conv_desc)
    else:
        cc.MaxPoolUndo(images_gpu, derivs_gpu, maxes_gpu, targets_gpu, conv_desc)
    output_cpu = conv_cpu.MaxPoolUndo(images, maxes, derivs, images_shape, deriv_shape, cm.GetConvDescTuple(conv_desc))
    diff = Diff(output_cpu, targets_gpu.asarray())
    images_gpu.free_device_memory()
    derivs_gpu.free_device_memory()
    maxes_gpu.free_device_memory()
    targets_gpu.free_device_memory()
    return diff
def TestMaxPoolRprop(images_shape, conv_desc):
    maxes_shape = cm.GetOutputShape4D(images_shape, conv_desc)
    images = np.random.rand(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    R_images = np.random.rand(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    maxes = conv_cpu.MaxPool(images, images_shape, cm.GetConvDescTuple2(conv_desc))
    images_gpu = cm.CUDAMatrix(images)
    R_images_gpu = cm.CUDAMatrix(R_images)
    maxes_gpu = cm.CUDAMatrix(maxes)
    targets_gpu = cm.empty(maxes.shape)
    images_gpu.set_shape4d(images_shape)
    R_images_gpu.set_shape4d(images_shape)
    maxes_gpu.set_shape4d(maxes_shape)
    targets_gpu.set_shape4d(maxes_shape)
    assert test_gemm
    cc_gemm.MaxPoolRprop(images_gpu, R_images_gpu, maxes_gpu, targets_gpu, conv_desc)
    output_cpu = conv_cpu.MaxPoolRprop(images, R_images, maxes, images_shape, cm.GetConvDescTuple2(conv_desc))
    diff = Diff(output_cpu, targets_gpu.asarray())
    images_gpu.free_device_memory()
    R_images_gpu.free_device_memory()
    maxes_gpu.free_device_memory()
    targets_gpu.free_device_memory()
    return diff
def TestAvgPool(images_shape, conv_desc):
    output_shape = cm.GetOutputShape4D(images_shape, conv_desc)
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    output_gpu = cm.empty(output_shape)
    if test_gemm:
        cc_gemm.AvgPool(images_gpu, output_gpu, conv_desc)
    else:
        cc.AvgPool(images_gpu, output_gpu, conv_desc)
    output_cpu = conv_cpu.AvgPool(images, images_shape, cm.GetConvDescTuple(conv_desc))
    diff = Diff(output_cpu, output_gpu.asarray())
    images_gpu.free_device_memory()
    output_gpu.free_device_memory()
    return diff
def TestAvgPoolUndo(images_shape, conv_desc):
    deriv_shape = cm.GetOutputShape4D(images_shape, conv_desc)
    derivs = np.random.randn(deriv_shape[0], deriv_shape[1] * deriv_shape[2] * deriv_shape[3]).astype(np.float32)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=deriv_shape)
    targets_gpu = cm.empty(images_shape)
    if test_gemm:
        cc_gemm.AvgPoolUndo(derivs_gpu, targets_gpu, conv_desc)
    else:
        cc.AvgPoolUndo(derivs_gpu, targets_gpu, conv_desc)
    output_cpu = conv_cpu.AvgPoolUndo(derivs, images_shape, cm.GetConvDescTuple(conv_desc))
    diff = Diff(output_cpu, targets_gpu.asarray())
    derivs_gpu.free_device_memory()
    targets_gpu.free_device_memory()
    return diff
def TestResponseNormCrossMap(images_shape, numF, add_scale, pow_scale, blocked):
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    targets_gpu = cm.empty(images_shape)
    if test_gemm:
        cc_gemm.ResponseNormCrossMap(images_gpu, targets_gpu, numF, add_scale, pow_scale, blocked)
    else:
        cc.ResponseNormCrossMap(images_gpu, targets_gpu, numF, add_scale, pow_scale, blocked)
    output_cpu = conv_cpu.ResponseNormCrossMap(images, images_shape, numF, add_scale, pow_scale, blocked)
    diff = Diff(output_cpu, targets_gpu.asarray())
    images_gpu.free_device_memory()
    targets_gpu.free_device_memory()
    return diff
def TestResponseNormCrossMapUndo(images_shape, numF, add_scale, pow_scale, blocked):
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    derivs = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=images_shape)
    targets_gpu = cm.empty(images_shape)
    if test_gemm:
        cc_gemm.ResponseNormCrossMapUndo(derivs_gpu, images_gpu, targets_gpu, numF, add_scale, pow_scale, blocked)
    else:
        acts_gpu = cm.empty(images_shape)
        cc.ResponseNormCrossMap(images_gpu, acts_gpu, numF, add_scale, pow_scale, blocked)
        cc.ResponseNormCrossMapUndo(derivs_gpu, images_gpu, acts_gpu, targets_gpu, numF, add_scale, pow_scale, blocked)
        acts_gpu.free_device_memory()
    output_cpu = conv_cpu.ResponseNormCrossMapUndo(derivs, images, images_shape, numF, add_scale, pow_scale, blocked)
    diff = Diff(output_cpu, targets_gpu.asarray())
    images_gpu.free_device_memory()
    derivs_gpu.free_device_memory()  # Was missing; avoids leaking the derivs buffer.
    targets_gpu.free_device_memory()
    return diff
def TestResponseNormCrossMapRprop(images_shape, numF, add_scale, pow_scale, blocked):
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    R_images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images)
    R_images_gpu = cm.CUDAMatrix(R_images)
    targets_gpu = cm.empty((images_shape[0], images_shape[1] * images_shape[2] * images_shape[3]))
    images_gpu.set_shape4d(images_shape)
    R_images_gpu.set_shape4d(images_shape)
    targets_gpu.set_shape4d(images_shape)
    if test_gemm:
        cc_gemm.ResponseNormCrossMapRprop(images_gpu, R_images_gpu, targets_gpu, numF, add_scale, pow_scale, blocked)
    else:
        raise RuntimeError('Not implemented')
    output_cpu = conv_cpu.ResponseNormCrossMapRprop(images, R_images, images_shape, numF, add_scale, pow_scale, blocked)
    diff = Diff(output_cpu, targets_gpu.asarray())
    images_gpu.free_device_memory()
    R_images_gpu.free_device_memory()
    targets_gpu.free_device_memory()
    return diff
def TestConvUp3D(images_shape, conv_desc):
    filters_shape = (conv_desc.num_output_channels, conv_desc.kernel_size_x, conv_desc.kernel_size_y, conv_desc.num_input_channels * conv_desc.kernel_size_t)
    output_shape = cm.GetOutputShape5D(images_shape, conv_desc)
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3] * images_shape[4]).astype(np.float32)
    filters = np.random.randn(filters_shape[0], filters_shape[1] * filters_shape[2] * filters_shape[3]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    filters_gpu = cm.CUDAMatrix(filters, shape=filters_shape)
    output_gpu = cm.empty(output_shape)
    assert test_gemm
    cc_gemm.convUp3D(images_gpu, filters_gpu, output_gpu, conv_desc)
    output_cpu = conv_cpu.ConvUp3D(images, filters, images_shape, cm.GetConvDescTuple3D(conv_desc))
    diff = Diff(output_cpu, output_gpu.asarray())
    images_gpu.free_device_memory()
    filters_gpu.free_device_memory()
    output_gpu.free_device_memory()
    return diff
def TestConvDown3D(images_shape, conv_desc):
    filters_shape = (conv_desc.num_output_channels, conv_desc.kernel_size_x, conv_desc.kernel_size_y, conv_desc.num_input_channels * conv_desc.kernel_size_t)
    deriv_shape = cm.GetOutputShape5D(images_shape, conv_desc)
    derivs = np.random.randn(deriv_shape[0], deriv_shape[1] * deriv_shape[2] * deriv_shape[3] * deriv_shape[4]).astype(np.float32)
    filters = np.random.randn(filters_shape[0], filters_shape[1] * filters_shape[2] * filters_shape[3]).astype(np.float32)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=deriv_shape)
    filters_gpu = cm.CUDAMatrix(filters, shape=filters_shape)
    images_gpu = cm.empty(images_shape)
    assert test_gemm
    cc_gemm.convDown3D(derivs_gpu, filters_gpu, images_gpu, conv_desc)
    images_cpu = conv_cpu.ConvDown3D(derivs, filters, images_shape, cm.GetConvDescTuple3D(conv_desc))
    diff = Diff(images_cpu, images_gpu.asarray())
    images_gpu.free_device_memory()
    filters_gpu.free_device_memory()
    derivs_gpu.free_device_memory()
    return diff
def TestConvOutp3D(images_shape, conv_desc, partial_sum_y=0, partial_sum_x=0):
    filters_shape = (conv_desc.num_output_channels, conv_desc.kernel_size_x, conv_desc.kernel_size_y, conv_desc.num_input_channels * conv_desc.kernel_size_t)
    deriv_shape = cm.GetOutputShape5D(images_shape, conv_desc)
    batch_size, num_modules_x, num_modules_y, num_output_channels, num_modules_t = deriv_shape
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3] * images_shape[4]).astype(np.float32)
    derivs = np.random.randn(deriv_shape[0], deriv_shape[1] * deriv_shape[2] * deriv_shape[3] * deriv_shape[4]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    filters_gpu = cm.empty(filters_shape)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=deriv_shape)
    cc_gemm.convOutp3D(images_gpu, derivs_gpu, filters_gpu, conv_desc)
    filters_cpu = conv_cpu.ConvOutp3D(images, derivs, images_shape, cm.GetConvDescTuple3D(conv_desc))
    diff = Diff(filters_gpu.asarray(), filters_cpu)
    images_gpu.free_device_memory()
    filters_gpu.free_device_memory()
    derivs_gpu.free_device_memory()
    return diff
def TestMaxPool3D(images_shape, conv_desc):
    output_shape = cm.GetOutputShape5D(images_shape, conv_desc)
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3] * images_shape[4]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    output_gpu = cm.empty(output_shape)
    assert test_gemm
    cc_gemm.MaxPool3D(images_gpu, output_gpu, conv_desc)
    output_cpu = conv_cpu.MaxPool3D(images, images_shape, cm.GetConvDescTuple3D(conv_desc))
    diff = Diff(output_cpu, output_gpu.asarray())
    images_gpu.free_device_memory()
    output_gpu.free_device_memory()
    return diff
def TestMaxPool3DUndo(images_shape, conv_desc):
    deriv_shape = cm.GetOutputShape5D(images_shape, conv_desc)
    images = np.random.rand(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3] * images_shape[4]).astype(np.float32)
    derivs = np.random.randn(deriv_shape[0], deriv_shape[1] * deriv_shape[2] * deriv_shape[3] * deriv_shape[4]).astype(np.float32)
    maxes = conv_cpu.MaxPool3D(images, images_shape, cm.GetConvDescTuple3D(conv_desc))
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=deriv_shape)
    maxes_gpu = cm.CUDAMatrix(maxes, shape=deriv_shape)
    targets_gpu = cm.empty(images_shape)
    assert test_gemm
    cc_gemm.MaxPool3DUndo(images_gpu, derivs_gpu, maxes_gpu, targets_gpu, conv_desc)
    output_cpu = conv_cpu.MaxPool3DUndo(images, maxes, derivs, images_shape, deriv_shape, cm.GetConvDescTuple3D(conv_desc))
    diff = Diff(output_cpu, targets_gpu.asarray())
    images_gpu.free_device_memory()
    derivs_gpu.free_device_memory()
    maxes_gpu.free_device_memory()
    targets_gpu.free_device_memory()
    return diff
def TestAvgPool3D(images_shape, conv_desc):
    output_shape = cm.GetOutputShape5D(images_shape, conv_desc)
    images = np.random.randn(images_shape[0], images_shape[1] * images_shape[2] * images_shape[3] * images_shape[4]).astype(np.float32)
    images_gpu = cm.CUDAMatrix(images, shape=images_shape)
    output_gpu = cm.empty(output_shape)
    assert test_gemm
    cc_gemm.AvgPool3D(images_gpu, output_gpu, conv_desc)
    output_cpu = conv_cpu.AvgPool3D(images, images_shape, cm.GetConvDescTuple3D(conv_desc))
    diff = Diff(output_cpu, output_gpu.asarray())
    images_gpu.free_device_memory()
    output_gpu.free_device_memory()
    return diff
def TestAvgPool3DUndo(images_shape, conv_desc):
    deriv_shape = cm.GetOutputShape5D(images_shape, conv_desc)
    derivs = np.random.randn(deriv_shape[0], deriv_shape[1] * deriv_shape[2] * deriv_shape[3] * deriv_shape[4]).astype(np.float32)
    derivs_gpu = cm.CUDAMatrix(derivs, shape=deriv_shape)
    targets_gpu = cm.empty(images_shape)
    assert test_gemm
    cc_gemm.AvgPool3DUndo(derivs_gpu, targets_gpu, conv_desc)
    output_cpu = conv_cpu.AvgPool3DUndo(derivs, images_shape, cm.GetConvDescTuple3D(conv_desc))
    diff = Diff(output_cpu, targets_gpu.asarray())
    derivs_gpu.free_device_memory()
    targets_gpu.free_device_memory()
    return diff
def Diff(a, b):
    # Max absolute difference, scaled by the mean magnitude of a + b.
    scale = np.abs(a + b).mean()
    diff = np.abs(a - b).max() / scale
    return diff
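# Illustrative standalone check of the relative-error metric computed by Diff:
# identical arrays score exactly 0 (numpy is re-imported so this snippet
# stands on its own; the variable names are arbitrary):

```python
import numpy as np

x = np.ones(4, dtype=np.float32)
y = x.copy()
scale = np.abs(x + y).mean()            # mean |x + y| = 2.0
rel_err = np.abs(x - y).max() / scale   # 0.0 / 2.0
assert rel_err == 0.0
```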
def Check(diff, tol=1e-4):
    if diff < tol:
        result = 'PASSED'
    else:
        result = 'FAILED'
    print(diff, result)
def Test2D():
    batch_size = 128
    image_size_x = 12
    image_size_y = 12
    num_input_channels = 32
    sizeF = 8
    add_scale = 0.005
    pow_scale = 0.75
    blocked = False
    num_output_channels = 64
    kernel_size_y = 3
    kernel_size_x = 3
    stride_y = 2
    stride_x = 2
    padding_y = 1
    padding_x = 1
    partial_sum = 0
    images_shape = (batch_size, image_size_x, image_size_y, num_input_channels)
    conv_desc = cm.GetConvDesc(num_input_channels, num_output_channels,
                               kernel_size_y, kernel_size_x, stride_y,
                               stride_x, padding_y, padding_x)
    pool_desc = cm.GetConvDesc(num_input_channels, num_input_channels,
                               kernel_size_y, kernel_size_x, stride_y,
                               stride_x, padding_y, padding_x)
    print('ConvUp')
    Check(TestConvUp(images_shape, conv_desc))
    print('ConvDown')
    Check(TestConvDown(images_shape, conv_desc))
    print('ConvOutp')
    d1, d2 = TestConvOutp(images_shape, conv_desc, partial_sum_y=partial_sum, partial_sum_x=partial_sum)
    Check(d1)
    print('MaxPool')
    Check(TestMaxPool(images_shape, pool_desc))
    print('AvgPool')
    Check(TestAvgPool(images_shape, pool_desc))
    print('MaxPoolUndo')
    Check(TestMaxPoolUndo(images_shape, pool_desc))
    print('AvgPoolUndo')
    Check(TestAvgPoolUndo(images_shape, pool_desc))
    print('MaxPoolRprop')
    Check(TestMaxPoolRprop(images_shape, pool_desc))
    print('ResponseNormCrossMap')
    Check(TestResponseNormCrossMap(images_shape, sizeF, add_scale, pow_scale, blocked))
    print('ResponseNormCrossMapUndo')
    Check(TestResponseNormCrossMapUndo(images_shape, sizeF, add_scale, pow_scale, blocked))
    print('ResponseNormCrossMapRprop')
    Check(TestResponseNormCrossMapRprop(images_shape, sizeF, add_scale, pow_scale, blocked))
def Test3D():
    batch_size = 128
    image_size_x = 32
    image_size_y = 24
    image_size_t = 12
    num_input_channels = 3
    num_output_channels = 64
    kernel_size_y = 7
    kernel_size_x = 7
    kernel_size_t = 3
    stride_y = 2
    stride_x = 2
    stride_t = 2
    padding_y = 1
    padding_x = 1
    padding_t = 0
    images_shape = (batch_size, image_size_x, image_size_y, num_input_channels, image_size_t)
    conv_desc = cm.GetConvDesc(num_input_channels, num_output_channels,
                               kernel_size_y, kernel_size_x,
                               stride_y, stride_x,
                               padding_y, padding_x, kernel_size_t, stride_t, padding_t)
    pool_desc = cm.GetConvDesc(num_input_channels, num_input_channels,
                               kernel_size_y, kernel_size_x,
                               stride_y, stride_x,
                               padding_y, padding_x, kernel_size_t, stride_t, padding_t)
    print('ConvUp')
    Check(TestConvUp3D(images_shape, conv_desc))
    print('ConvDown')
    Check(TestConvDown3D(images_shape, conv_desc))
    print('ConvOutp')
    Check(TestConvOutp3D(images_shape, conv_desc))
    print('MaxPool')
    Check(TestMaxPool3D(images_shape, pool_desc))
    print('MaxPoolUndo')
    Check(TestMaxPool3DUndo(images_shape, pool_desc))
    print('AvgPool')
    Check(TestAvgPool3D(images_shape, pool_desc))
    print('AvgPoolUndo')
    Check(TestAvgPool3DUndo(images_shape, pool_desc))
def main():
    print("Testing 2D convolutions")
    Test2D()
    print("Testing 3D convolutions")
    Test3D()

if __name__ == '__main__':
    board = LockGPU()
    print('Using board', board)
    main()
    FreeGPU(board)
"""
] | null | null | null | """
Generated code; do not edit
Update by running yarn update-yarn-versions
Note that we don't support Yarn 2 yet, see
https://github.com/bazelbuild/rules_nodejs/issues/1599
"""
# @unsorted-dict-items
YARN_VERSIONS = {
"0.16.0": ("yarn-v0.16.0.tar.gz", "yarn-v0.16.0", "cd1d7eeb8eb2518441d99c914e5fd18b68e2759743d212dfd8f00574a1de6da8"),
"0.16.1": ("yarn-v0.16.1.tar.gz", "yarn-v0.16.1", "73be27c34ef1dd4217fec23cdfb6b800cd995e9079d4c4764724ef98e98fec07"),
"0.17.0": ("yarn-v0.17.0.tar.gz", "yarn-v0.17.0", "bb87332c23baec5680e13c9afa858d851276eca27e33e215a84338fb4acb0026"),
"0.17.2": ("yarn-v0.17.2.tar.gz", "yarn-v0.17.2", "0e0ff23581920c27b276c320bbcbcd998b7dbb9e0f91aa91cbcd241644df25e0"),
"0.17.3": ("yarn-v0.17.3.tar.gz", "yarn-v0.17.3", "883df435f68ce47c93c2e4be27acbd4122ae52ef4d334f75104c0c5e187a9173"),
"0.17.4": ("yarn-v0.17.4.tar.gz", "yarn-v0.17.4", "1bcd0371f6b35ae70be663beffc34a6d63c85ab23bf338e34c489b32e369fa9a"),
"0.17.5": ("yarn-v0.17.5.tar.gz", "yarn-v0.17.5", "3dcd8718080aba7a14b78e1090db1abfbd154039a15cba803bb1578016250288"),
"0.17.6": ("yarn-v0.17.6.tar.gz", "yarn-v0.17.6", "54c4d615fe388a2ccdaf3998911447ff5ea11ff19c1dd45558fed40f31cac8e4"),
"0.17.7": ("yarn-v0.17.7.tar.gz", "yarn-v0.17.7", "899eddf77fea4b313c12d3d079523f4e7735159fc5da6a30b995d5e77f3e5948"),
"0.17.8": ("yarn-v0.17.8.tar.gz", "yarn-v0.17.8", "b54e762e2a54f1fb23c6b0f9c239c3791aae05aface5ea0d6498f2a7979b541c"),
"0.17.9": ("yarn-v0.17.9.tar.gz", "yarn-v0.17.9", "6846f46d6a500dca8f4490f80da62898a9162f94cdb7486c2e86787092d2fd8d"),
"0.17.10": ("yarn-v0.17.10.tar.gz", "yarn-v0.17.10", "592140a9d387a892935ca49ee93a8207b95073e2b732693987420dd1a7606672"),
"0.18.0": ("yarn-v0.18.0.tar.gz", "yarn-v0.18.0", "8fb1843d2a1283972ff5fead9d6c9f9002de793ecec6dfec7abec908403ecd19"),
"0.18.1": ("yarn-v0.18.1.tar.gz", "yarn-v0.18.1", "7d16699c8690ef145e1732004266fb82a32b0c06210a43c624986d100537b5a8"),
"0.18.2": ("yarn-v0.18.2.tar.gz", "yarn-v0.18.2", "ca5917410b548cdf208ee928e68d6b4a18727f4e4cb4656c06b47b8ac64e3ee6"),
"0.19.0": ("yarn-v0.19.0.tar.gz", "yarn-v0.19.0", "f7ce03b06b7499153176f2650111ab2b6b95bc8f819d146da5bc60c71b9bfdef"),
"0.19.1": ("yarn-v0.19.1.tar.gz", "yarn-v0.19.1", "751e1c0becbb2c3275f61d79ad8c4fc336e7c44c72d5296b5342a6f468526d7d"),
"0.20.0": ("yarn-v0.20.0.tar.gz", "yarn-v0.20.0", "4622f3c7a2fdf0dac3ef38b49eb636b813734e06a5a688fb2545df857792ebe3"),
"0.20.3": ("yarn-v0.20.3.tar.gz", "yarn-v0.20.3", "e7d052aba18716616213a602d66528eda7a2bdda7962fc23644ce53e74b1e1d5"),
"0.20.4": ("yarn-v0.20.4.tar.gz", "yarn-v0.20.4", "5a7848796379b4ae59deb9f6e3129eb1d026ef3eac603786388aa7976ae4ab55"),
"0.21.0": ("yarn-v0.21.0.tar.gz", "yarn-v0.21.0", "373c4a8cd44347edf9b5b7fe3bed8b816cf46e97e1430bce1bd0ef89a8673042"),
"0.21.1": ("yarn-v0.21.1.tar.gz", "yarn-v0.21.1", "20efb92efc66631b10f4e6ab9b1e8b6773d6729baa5ade963030fd66badf3bb6"),
"0.21.2": ("yarn-v0.21.2.tar.gz", "yarn-v0.21.2", "1ccd5676112dd1aa99759cde942f9c2e9ec22c15977f910d8d298210deb6797e"),
"0.21.3": ("yarn-v0.21.3.tar.gz", "yarn-v0.21.3", "0946a4d1abc106c131b700cc31e5c3aa5f2205eb3bb9d17411f8115354a97d5d"),
"0.22.0": ("yarn-v0.22.0.tar.gz", "yarn-v0.22.0", "e295042279b644f2bc3ea3407a2b2fb417a200d35590b0ec535422d21cf19a09"),
"0.23.0": ("yarn-v0.23.0.tar.gz", "yarn-v0.23.0", "8cbd733e6bf253e0f5e2e5db7dbd165311150754b33df7c00bcaec175a3b33c2"),
"0.23.2": ("yarn-v0.23.2.tar.gz", "yarn-v0.23.2", "2e4f3c5eb0bddad10fdc08a300ab43fe0f626544893deb9e07a4497e998cb82f"),
"0.23.3": ("yarn-v0.23.3.tar.gz", "yarn-v0.23.3", "9f7569b9b89bbe4c3c0bbd8917f551ec26935802668b6e6139ea45db67e3a314"),
"0.23.4": ("yarn-v0.23.4.tar.gz", "yarn-v0.23.4", "bab03e63593295969a3ec95c08a476c80eb821e6ea787829a1ac4b4b1c2298d7"),
"0.24.0": ("yarn-v0.24.0.tar.gz", "yarn-v0.24.0", "5100ce3711391905981e1b5a5a66dd191a0199da71da42e94d96b73517fa4c20"),
"0.24.1": ("yarn-v0.24.1.tar.gz", "yarn-v0.24.1", "9f2450ab745f04fa1cb768b5f3c925fa95c500174033fd2162440b94e8b97b0a"),
"0.24.2": ("yarn-v0.24.2.tar.gz", "yarn-v0.24.2", "6d1afbb0abd01f2b8d1bfd37da0666c670d09fc7cabad61f5db30f41c5c6363c"),
"0.24.3": ("yarn-v0.24.3.tar.gz", "yarn-v0.24.3", "0bcb35837d089fe3ae66448284971518f2a6c8e3257209e26a9496c9b8d5b353"),
"0.24.4": ("yarn-v0.24.4.tar.gz", "yarn-v0.24.4", "90ab424615f7f24a2e2895f33b90bdd4be0e93ebd56083adff4fafa718a75e68"),
"0.24.5": ("yarn-v0.24.5.tar.gz", "yarn-v0.24.5", "a7492431eedee0203faeac64e75f484e4911ab707c96c5bee4c7b97bf19c102c"),
"0.24.6": ("yarn-v0.24.6.tar.gz", "yarn-v0.24.6", "c375ab86d4ca0b46addc5b462280bd42ddefb114ee0025d38044bb4610cd625d"),
"0.25.1": ("yarn-v0.25.1.tar.gz", "yarn-v0.25.1", "b7da6188b842c6914d78600020e34be4feb9afb1e0050ecc5c9cad911684a2ae"),
"0.25.2": ("yarn-v0.25.2.tar.gz", "yarn-v0.25.2", "d1484fb8d1b28fec377c3379696d579dbf79c041acb240cd0dd7a5e041d7d829"),
"0.25.3": ("yarn-v0.25.3.tar.gz", "yarn-v0.25.3", "bc1f6c8de231bad14f5617d000e5db1da5cfaa55f9c055a4ce231b672bc98481"),
"0.25.4": ("yarn-v0.25.4.tar.gz", "yarn-v0.25.4", "ba8d588199067a3b44844c5a781ed96eb41af5eae19d23f6c21f32348a60a5d0"),
"0.26.1": ("yarn-v0.26.1.tar.gz", "yarn-v0.26.1", "b0f0810b3b6b529bd9250030d5b58be76a6e60a3bc70e4f3f03c3c45aa743d41"),
"0.27.0": ("yarn-v0.27.0.tar.gz", "yarn-v0.27.0", "17c3e249e7861e7e2e5d3a26a4e3ffa931255e1f08b6669448ae87e871d8d0ea"),
"0.27.1": ("yarn-v0.27.1.tar.gz", "yarn-v0.27.1", "a72cb7b86444fdf8e58f5a46d8b134d067791f19f9af381697afc3120bb0c816"),
"0.27.2": ("yarn-v0.27.2.tar.gz", "yarn-v0.27.2", "29c96fd079d08b78b05cc6d8996428a0d4a333468e22e9247d828cf227cf5a85"),
"0.27.3": ("yarn-v0.27.3.tar.gz", "yarn-v0.27.3", "0685bad9e0857eddb401f5c18d38a01e0337e06508bda245ee803d915a932639"),
"0.27.4": ("yarn-v0.27.4.tar.gz", "yarn-v0.27.4", "da4545662b73e2c87442feb885d5ccd670684809a4a8aed0b8fac1a2f63d12ea"),
"0.27.5": ("yarn-v0.27.5.tar.gz", "yarn-v0.27.5", "f0f3510246ee74eb660ea06930dcded7b684eac2593aa979a7add84b72517968"),
"0.28.1": ("yarn-v0.28.1.tar.gz", "yarn-v0.28.1", "280d67a59135bf79b5b8498b581dd19c82604ffd80a2eb44e29ff7c9d743fb16"),
"0.28.4": ("yarn-v0.28.4.tar.gz", "yarn-v0.28.4", "057ef781107bb5d3e7a2a655d75054fbeb265a249a905375bc25bec10d42b31f"),
"1.0.0": ("yarn-v1.0.0.tar.gz", "yarn-v1.0.0", "0f3d47e35f391507edda1c87a3014b86c2eb32aaec00d0a4b1e7413bec63787d"),
"1.0.1": ("yarn-v1.0.1.tar.gz", "yarn-v1.0.1", "6b00b5e0a7074a512d39d2d91ba6262dde911d452617939ca4be4a700dd77cf1"),
"1.0.2": ("yarn-v1.0.2.tar.gz", "yarn-v1.0.2", "8a31f8fa50ab6d5f8852025fb0ea4a50f2f8b82792f060fa99de0acc370b0698"),
"1.1.0": ("yarn-v1.1.0.tar.gz", "yarn-v1.1.0", "171c1f9ee93c488c0d774ac6e9c72649047c3f896277d88d0f805266519430f3"),
"1.2.0": ("yarn-v1.2.0.tar.gz", "yarn-v1.2.0", "533cf428a5a354d8393864d31451478a850bb7c173d8d756553898041963c949"),
"1.2.1": ("yarn-v1.2.1.tar.gz", "yarn-v1.2.1", "f8ed07675c3a0b866e11a02af5c15d2f34c3aa261ab1501943ecee328786c959"),
"1.3.2": ("yarn-v1.3.2.tar.gz", "yarn-v1.3.2", "6cfe82e530ef0837212f13e45c1565ba53f5199eec2527b85ecbcd88bf26821d"),
"1.4.0": ("yarn-v1.4.0.tar.gz", "yarn-v1.4.0", "5ebff618b0213e1ded88ea759faa355c0dbeacfa2a9e6736ebe1a1671c28bd8d"),
"1.5.1": ("yarn-v1.5.1.tar.gz", "yarn-v1.5.1", "cd31657232cf48d57fdbff55f38bfa058d2fb4950450bd34af72dac796af4de1"),
"1.6.0": ("yarn-v1.6.0.tar.gz", "yarn-v1.6.0", "a57b2fdb2bfeeb083d45a883bc29af94d5e83a21c25f3fc001c295938e988509"),
"1.7.0": ("yarn-v1.7.0.tar.gz", "yarn-v1.7.0", "e7720ee346b2bd7ec32b7e04517643c38648f5022c7981168321ba1636f2dca3"),
"1.8.0": ("yarn-v1.8.0.tar.gz", "yarn-v1.8.0", "3d8dc87cae99f7547b82026f818b3a14f0393cfa09337bb9adfb446d50a527a7"),
"1.9.1": ("yarn-v1.9.1.tar.gz", "yarn-v1.9.1", "974840b0dda99faf697fcea582737718ba7c52d34b5f1fe20f8a29bacfd762b5"),
"1.9.2": ("yarn-v1.9.2.tar.gz", "yarn-v1.9.2", "3ad69cc7f68159a562c676e21998eb21b44138cae7e8fe0749a7d620cf940204"),
"1.9.4": ("yarn-v1.9.4.tar.gz", "yarn-v1.9.4", "7667eb715077b4bad8e2a832e7084e0e6f1ba54d7280dc573c8f7031a7fb093e"),
"1.10.0": ("yarn-v1.10.0.tar.gz", "yarn-v1.10.0", "83277bd505c7f4009c13077266020c97298727de7edf67af5ca66eccae9d4632"),
"1.10.1": ("yarn-v1.10.1.tar.gz", "yarn-v1.10.1", "97bf147cb28229e66e4e3c5733a93c851bbcb0f10fbc72696ed011774f4c6f1b"),
"1.11.0": ("yarn-v1.11.0.tar.gz", "yarn-v1.11.0", "97f1f1456686764a581fdebc061a79a64429ce7518a9ff8722facd4e86874e34"),
"1.11.1": ("yarn-v1.11.1.tar.gz", "yarn-v1.11.1", "03acd60fd8e2c111989e017244596bf9b4d8c20e6d4404155fe142007023cabb"),
"1.12.0": ("yarn-v1.12.0.tar.gz", "yarn-v1.12.0", "90c40e80743eebbce63d83c539f3ca97e24b30eb2d98b9858e37e13fbe8fafdc"),
"1.12.1": ("yarn-v1.12.1.tar.gz", "yarn-v1.12.1", "09bea8f4ec41e9079fa03093d3b2db7ac5c5331852236d63815f8df42b3ba88d"),
"1.12.3": ("yarn-v1.12.3.tar.gz", "yarn-v1.12.3", "02cd4b589ec22c4bdbd2bc5ebbfd99c5e99b07242ad68a539cb37896b93a24f2"),
"1.13.0": ("yarn-v1.13.0.tar.gz", "yarn-v1.13.0", "125d40ebf621ebb08e3f66a618bd2cc5cd77fa317a312900a1ab4360ed38bf14"),
"1.14.0": ("yarn-v1.14.0.tar.gz", "yarn-v1.14.0", "2d38fc0700f106762f72f0aeebcec0e227f1e94bd10488d179ca1596053ab700"),
"1.15.0": ("yarn-v1.15.0.tar.gz", "yarn-v1.15.0", "d2f2c6e11a8686f66d7a37b438470bf032f15cefc843f9301e47a52e5817454b"),
"1.15.1": ("yarn-v1.15.1.tar.gz", "yarn-v1.15.1", "c63f402bcbf7499f015fe50f05f8b8d5f63fab08a10e4f78eb440baf54aac167"),
"1.15.2": ("yarn-v1.15.2.tar.gz", "yarn-v1.15.2", "c4feca9ba5d6bf1e820e8828609d3de733edf0e4722d17ed7ce493ed39f61abd"),
"1.16.0": ("yarn-v1.16.0.tar.gz", "yarn-v1.16.0", "df202627d9a70cf09ef2fb11cb298cb619db1b958590959d6f6e571b50656029"),
"1.17.0": ("yarn-v1.17.0.tar.gz", "yarn-v1.17.0", "c7ec0f1a2028c0f9a21d27fa1a689b5730d13ddcd3a145f3a2db50ebf98e65cc"),
"1.17.1": ("yarn-v1.17.1.tar.gz", "yarn-v1.17.1", "6e952e47b49e663017b1f67b42017a4a5fe477a09e242fa1414d782ec9b84259"),
"1.17.2": ("yarn-v1.17.2.tar.gz", "yarn-v1.17.2", "1cb4eb5b30adcb995198e4ff95f344d3404116b1d2bd77323a6f22dd52596fd7"),
"1.17.3": ("yarn-v1.17.3.tar.gz", "yarn-v1.17.3", "e3835194409f1b3afa1c62ca82f561f1c29d26580c9e220c36866317e043c6f3"),
"1.18.0": ("yarn-v1.18.0.tar.gz", "yarn-v1.18.0", "2f8d93c217ecca06eb720794c8d1484a67d37bdb58ab761108b5c651d59d3fc6"),
"1.19.0": ("yarn-v1.19.0.tar.gz", "yarn-v1.19.0", "6bbdaab9c31eedbe7b53cbcde2be06b8c926f139bd0f7c00fccad406016e8934"),
"1.19.1": ("yarn-v1.19.1.tar.gz", "yarn-v1.19.1", "34293da6266f2aae9690d59c2d764056053ff7eebc56b80b8df05010c3da9343"),
"1.19.2": ("yarn-v1.19.2.tar.gz", "yarn-v1.19.2", "2ed90e6aaf3988df5c75b6829b7c523754453a0b7134a9d0bf11161f927eae25"),
"1.21.0": ("yarn-v1.21.0.tar.gz", "yarn-v1.21.0", "dd17d4e5bc560aa28140038a31fa50603ef76b710fee44e5ec5efbea7ad24c61"),
"1.21.1": ("yarn-v1.21.1.tar.gz", "yarn-v1.21.1", "d1d9f4a0f16f5ed484e814afeb98f39b82d4728c6c8beaafb5abc99c02db6674"),
"1.22.0": ("yarn-v1.22.0.tar.gz", "yarn-v1.22.0", "de8871c4e2822cba80d58c2e72366fb78567ec56e873493c9ca0cca76c60f9a5"),
"1.22.1": ("yarn-v1.22.1.tar.gz", "yarn-v1.22.1", "3af905904932078faa8f485d97c928416b30a86dd09dcd76e746a55c7f533b72"),
"1.22.4": ("yarn-v1.22.4.tar.gz", "yarn-v1.22.4", "bc5316aa110b2f564a71a3d6e235be55b98714660870c5b6b2d2d3f12587fb58"),
"1.22.5": ("yarn-v1.22.5.tar.gz", "yarn-v1.22.5", "c664fb4692e4dfea750a37a533780834b40198c00cef4bbc5e8c14abab2ac141"),
"1.22.10": ("yarn-v1.22.10.tar.gz", "yarn-v1.22.10", "7e433d4a77e2c79e6a7ae4866782608a8e8bcad3ec6783580577c59538381a6e"),
"1.22.11": ("yarn-v1.22.11.tar.gz", "yarn-v1.22.11", "2c320de14a6014f62d29c34fec78fdbb0bc71c9ccba48ed0668de452c1f5fe6c"),
"1.22.12": ("yarn-v1.22.12.tar.gz", "yarn-v1.22.12", "1bfc6a45ed9788193907d93bfd2070ed805357733d785ae56176705c0ec4cfc1"),
"1.22.13": ("yarn-v1.22.13.tar.gz", "yarn-v1.22.13", "92b312f0f159c63bbc4ff5f553da8d9b2ffd6886a53c7d9a678c50e2cf4ed321"),
"1.22.14": ("yarn-v1.22.14.tar.gz", "yarn-v1.22.14", "9b0c53cbca3f3cf7cdee35c74db538d54a6abca1c63ef0db10288e165184615d"),
"1.22.15": ("yarn-v1.22.15.tar.gz", "yarn-v1.22.15", "0c2841b9423f0fb9657ae6b18873f39551396ec242bfb882b11bed9e4648235e"),
"1.22.16": ("yarn-v1.22.16.tar.gz", "yarn-v1.22.16", "c0369d6a9aeb4f3b86095c6e6f64de7a7555a888e03260c3f02727636e1f1693"),
"1.22.17": ("yarn-v1.22.17.tar.gz", "yarn-v1.22.17", "267982c61119a055ba2b23d9cf90b02d3d16c202c03cb0c3a53b9633eae37249"),
"1.22.18": ("yarn-v1.22.18.tar.gz", "yarn-v1.22.18", "816e5c073b3d35936a398d1fe769ebbcd517298e3510b649e8fc67cd3a62e113"),
}
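Each entry in the table above maps a Yarn release to its tarball name, extracted directory, and pinned SHA-256 checksum. A minimal sketch of how such a table could be used to verify a downloaded archive — the table name `YARN_VERSIONS` and the helper are illustrative assumptions, not bindings from this file:

```python
import hashlib

# Hypothetical name for the version table above; only one entry is
# reproduced here so the snippet runs standalone.
YARN_VERSIONS = {
    "1.22.18": ("yarn-v1.22.18.tar.gz", "yarn-v1.22.18",
                "816e5c073b3d35936a398d1fe769ebbcd517298e3510b649e8fc67cd3a62e113"),
}

def verify_archive(version, data):
    """Return True when `data` matches the pinned SHA-256 for `version`."""
    filename, dirname, expected_sha = YARN_VERSIONS[version]
    return hashlib.sha256(data).hexdigest() == expected_sha
```

A mismatching payload is rejected rather than raising, so callers can decide how to report a corrupted download.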
# File: blurr/text/modeling/all.py (repo: warner-benjamin/blurr, license: Apache-2.0)
from ..utils import *
from ...utils import *
from .core import *
from .language_modeling import *
from .question_answering import *
from .token_classification import *
from .seq2seq.core import *
from .seq2seq.summarization import *
from .seq2seq.translation import *
# File: moha/posthf/pt/__init__.py (repo: ZhaoYilin/moha, license: MIT)
from __future__ import division, print_function
from __future__ import absolute_import
from moha.posthf.pt.auxiliary import *
from moha.posthf.pt.mp2 import *
from moha.posthf.pt.mp3 import *
# File: stla_lite/grants/admin.py (repo: 401ode/stla-lite, license: MIT)
from django.contrib import admin
from .models import GrantAward, GrantAwardTask
@admin.register(GrantAward, GrantAwardTask)
class GrantAdmin(admin.ModelAdmin):
    pass

# File: class5-notebook/5-numpy_functions.py (repo: cce-bigdataintro-1160/spring2019, license: MIT)
#!/usr/bin/env python3
import numpy as np
def pretty_print(name, to_print):
print(f'{name}:')
print(f'{to_print}\n\n')
random_int_array = np.random.randint(1, 100, 20)
pretty_print('random_int_array', random_int_array)
pretty_print('np.mean(random_int_array)', np.mean(random_int_array))
pretty_print('np.min(random_int_array)', np.min(random_int_array))
pretty_print('np.max(random_int_array)', np.max(random_int_array))
pretty_print('np.std(random_int_array)', np.std(random_int_array))
pretty_print('np.mean(random_int_array)', random_int_array.mean())
pretty_print('np.min(random_int_array)', random_int_array.min())
pretty_print('np.max(random_int_array)', random_int_array.max())
pretty_print('np.std(random_int_array)', random_int_array.std())
reshaped_random_int_matrix = random_int_array.reshape(5, 4)
pretty_print('np.mean(reshaped_random_int_matrix)', np.mean(reshaped_random_int_matrix))
pretty_print('np.min(reshaped_random_int_matrix)', np.min(reshaped_random_int_matrix))
pretty_print('np.max(reshaped_random_int_matrix)', np.max(reshaped_random_int_matrix))
pretty_print('np.std(reshaped_random_int_matrix)', np.std(reshaped_random_int_matrix))
pretty_print('np.mean(reshaped_random_int_matrix)', np.mean(reshaped_random_int_matrix[1]))
pretty_print('np.min(reshaped_random_int_matrix)', np.min(reshaped_random_int_matrix[1]))
pretty_print('np.max(reshaped_random_int_matrix)', np.max(reshaped_random_int_matrix[1]))
pretty_print('np.std(reshaped_random_int_matrix)', np.std(reshaped_random_int_matrix[1]))
# File: services/backend/images/tests/test_image_viewset.py (repo: patpio/drf_images_api, license: Beerware)
import pytest
from rest_framework import status
from rest_framework.test import force_authenticate
from images.views import ImageViewSet
@pytest.mark.views
def test_user_can_see_own_image(db, api_rf, create_test_user, create_test_image, remove_test_data):
view = ImageViewSet.as_view({'get': 'list'})
user = create_test_user
request = api_rf.get('/api/v1/images/')
force_authenticate(request, user)
response = view(request)
assert response.status_code == status.HTTP_200_OK
assert response.data[0].get('id') == create_test_image.pk
assert response.data[0].get('name') == 'test_file.jpg'
assert response.data[0].get('author') == create_test_user.pk
@pytest.mark.views
def test_user_can_see_own_image_with_link(db, api_rf, create_test_user, create_test_image, remove_test_data):
view = ImageViewSet.as_view({'get': 'list'})
user = create_test_user
user.tier.link_flag = True
request = api_rf.get('/api/v1/images/')
force_authenticate(request, user)
response = view(request)
assert response.status_code == status.HTTP_200_OK
assert response.data[0].get('id') == create_test_image.pk
assert response.data[0].get('url') == "http://testserver/media/images/test_file.jpg"
assert response.data[0].get('name') == 'test_file.jpg'
assert response.data[0].get('author') == create_test_user.pk
# File: Packs/IntegrationsAndIncidentsHealthCheck/Scripts/IntegrationsCheck_Widget_IntegrationsErrorsInfo/test_data/constants.py (repo: diCagri/content, license: MIT)
FAILED_TABLE = '''[{"brand": "Active Directory Query v2", "category": "Data Enrichment & Threat Intelligence",
"information": "Failed to access LDAP server. Please validate the server host and port are configured correctly (85)",
"instance": "Active Directory Query v2_instance_1"},
{"brand": "BigFix", "category": "Vulnerability Management",
"information": "Invalid URL 'xd7x91xd7x94xd7xa0/api/help': No schema supplied. Perhaps you meant http://בהנ/api/help? (85)",
"instance": "BigFix_instance_1"}, {"brand": "Tanium Threat Response", "category": "Endpoint",
"information": "Error in Tanium Threat Response Integration: Invalid URL 'sfgdfg/api/v2/session/login': No schema supplied. Perhaps you meant http://sfgdfg/api/v2/session/login? (85)",
"instance": "Tanium Threat Response_instance_1"},
{"category": "Forensics & Malware Analysis",
"instance": "Threat Grid_instance_1", "brand": "Threat Grid"},
{"instance": "VirusTotal_instance_1", "brand": "VirusTotal",
"category": "Data Enrichment & Threat Intelligence",
"information": "403 Forbidden - The API key is not valid (85)"},
{"brand": "remoteaccess", "category": "Endpoint",
"information": "ssh: handshake failed: ssh: unable to authenticate, attempted methods [none], no supported methods remain",
"instance": "remoteaccess_instance_1"}]'''
FAILED_TABLE_EXPECTED = {'data': [{'Brand': 'Active Directory Query v2',
'Category': 'Data Enrichment & Threat Intelligence',
'Information': 'Failed to access LDAP server. Please validate the '
'server host and port are configured correctly (85)',
'Instance': 'Active Directory Query v2_instance_1'},
{'Brand': 'BigFix',
'Category': 'Vulnerability Management',
'Information': "Invalid URL 'xd7x91xd7x94xd7xa0/api/help': No "
'schema supplied. Perhaps you meant '
'http://בהנ/api/help? (85)',
'Instance': 'BigFix_instance_1'},
{'Brand': 'Tanium Threat Response',
'Category': 'Endpoint',
'Information': 'Error in Tanium Threat Response Integration: '
"Invalid URL 'sfgdfg/api/v2/session/login': No "
'schema supplied. Perhaps you meant '
'http://sfgdfg/api/v2/session/login? (85)',
'Instance': 'Tanium Threat Response_instance_1'},
{'Brand': 'Threat Grid',
'Category': 'Forensics & Malware Analysis',
'Information': None,
'Instance': 'Threat Grid_instance_1'},
{'Brand': 'VirusTotal',
'Category': 'Data Enrichment & Threat Intelligence',
'Information': '403 Forbidden - The API key is not valid (85)',
'Instance': 'VirusTotal_instance_1'},
{'Brand': 'remoteaccess',
'Category': 'Endpoint',
'Information': 'ssh: handshake failed: ssh: unable to authenticate, '
'attempted methods [none], no supported methods '
'remain',
'Instance': 'remoteaccess_instance_1'}],
'total': 6}
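FAILED_TABLE is raw JSON, while FAILED_TABLE_EXPECTED holds the same rows with capitalized keys, `None` for absent fields, and a `total` count. A sketch of that reshaping — the helper name `reshape_failed_table` is an assumption; the widget script under test may implement it differently:

```python
import json

def reshape_failed_table(raw_json):
    """Capitalize each per-row key, fill missing fields with None,
    and wrap the rows together with a total count."""
    rows = json.loads(raw_json)
    keys = ("brand", "category", "information", "instance")
    data = [{key.capitalize(): row.get(key) for key in keys} for row in rows]
    return {"data": data, "total": len(data)}
```

Note how `row.get(key)` yields `None` for the Threat Grid entry, matching the `'Information': None` value in the expected table above.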
# File: mmmbgknow/utils/csv.py (repo: mood-mapping-muppets/repo, license: Apache-2.0)
def read_csv_map(filename):
import csv
with open(filename, mode='r') as infile:
reader = csv.reader(infile)
next(reader)
return {row[0]: row[1] for row in reader}
def read_csv_map_set(filename):
import csv
res = {}
with open(filename, mode='r') as infile:
reader = csv.reader(infile)
next(reader)
for row in reader:
res.setdefault(row[0], set()).add(row[1])
return res
def read_csv_map_map_set(filename):
import csv
res = {}
with open(filename, mode='r') as infile:
reader = csv.reader(infile)
next(reader)
for row in reader:
res.setdefault(row[0], {}).setdefault(row[1], set()).add(row[2])
return res
def read_csv_set(filename):
import csv
with open(filename, mode='r') as infile:
reader = csv.reader(infile)
next(reader)
return {row[0] for row in reader}
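The three map-building helpers above share one `setdefault` chaining pattern: index by the first column, accumulate later columns. A standalone sketch of that pattern over an in-memory CSV (names here are illustrative, and the function is re-declared so the snippet runs on its own):

```python
import csv
import io

def rows_to_map_set(text):
    """Group column-1 values by column 0, skipping the header row,
    using the same setdefault pattern as read_csv_map_set above."""
    reader = csv.reader(io.StringIO(text))
    next(reader)  # discard the header, as the helpers above do
    res = {}
    for row in reader:
        res.setdefault(row[0], set()).add(row[1])
    return res
```

Using a set as the accumulator deduplicates repeated (key, value) pairs for free, which is why these helpers return sets rather than lists.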
# File: carbondesign/tests/test_select_html.py (repo: dozymoe/django-carbondesign, license: MIT)
# pylint:disable=missing-module-docstring,missing-class-docstring,missing-function-docstring
from django import forms
#-
from .base import compare_template, SimpleTestCase
class DummyForm(forms.Form):
select = forms.ChoiceField(
required=False,
choices=(
('solong', "A much longer option that is worth having around "
"to check how text flows"),
("Category 1", (
('option1', "Option 1"),
('option2', "Option 2"),
)),
("Category 2", (
('option1', "Option 1"),
('option2', "Option 2"),
)),
))
select_help = forms.ChoiceField(
required=False,
choices=(
('solong', "A much longer option that is worth having around "
"to check how text flows"),
("Category 1", (
('option1', "Option 1"),
('option2', "Option 2"),
)),
("Category 2", (
('option1', "Option 1"),
('option2', "Option 2"),
)),
),
help_text="Optional helper text.")
class SelectTestHtml(SimpleTestCase):
maxDiff = None
def test_default(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select label="Select label" %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select">
<label for="id_select" class="bx--label">
Select label
</label>
<div class="bx--select-input__wrapper">
<select class="bx--select-input" id="id_select" name="select">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)
def test_default_disabled(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select label="Select label" disabled=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--disabled">
<label for="id_select" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input__wrapper" >
<select class="bx--select-input" id="id_select" name="select" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)
def test_inline(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select mode="inline" label="Select label" %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline">
<label for="id_select" class="bx--label">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" >
<select class="bx--select-input" id="id_select" name="select">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)
def test_inline_disabled(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select mode="inline" label="Select label" disabled=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline bx--select--disabled">
<label for="id_select" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" >
<select class="bx--select-input" id="id_select" name="select" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)
def test_help(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select_help label="Select label" %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select">
<label for="id_select_help" class="bx--label">
Select label
</label>
<div class="bx--select-input__wrapper" >
<select class="bx--select-input" id="id_select_help" name="select_help">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
<div id="hint-id_select_help" class="bx--form__helper-text">
Optional helper text.
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)
def test_help_disabled(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select_help label="Select label" disabled=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--disabled">
<label for="id_select_help" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input__wrapper" >
<select class="bx--select-input" id="id_select_help" name="select_help" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
<div id="hint-id_select_help" class="bx--form__helper-text">
Optional helper text.
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)
def test_inline_help(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select_help mode="inline" label="Select label" %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline">
<label for="id_select_help" class="bx--label">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" >
<select class="bx--select-input" id="id_select_help" name="select_help">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
</div>
<div id="hint-id_select_help" class="bx--form__helper-text">
Optional helper text.
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_inline_help_disabled(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select_help mode="inline" label="Select label" disabled=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline bx--select--disabled">
<label for="id_select_help" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" >
<select class="bx--select-input" id="id_select_help" name="select_help" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
</div>
<div id="hint-id_select_help" class="bx--form__helper-text">
Optional helper text.
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_inline_invalid(self):
form = DummyForm(data={'select': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select mode="inline" label="Select label" %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline bx--select--invalid">
<label for="id_select" class="bx--label">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select" name="select">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__invalid-icon" width="16" height="16"
viewBox="0 0 16 16" aria-hidden="true">
<path d="M8,1C4.2,1,1,4.2,1,8s3.2,7,7,7s7-3.1,7-7S11.9,1,8,1z M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2 c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z"></path>
<path d="M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8 c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z" data-icon-path="inner-path" opacity="0"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_inline_invalid_disabled(self):
form = DummyForm(data={'select': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select mode="inline" label="Select label" disabled=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline bx--select--invalid bx--select--disabled">
<label for="id_select" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select" name="select" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_inline_help_invalid(self):
form = DummyForm(data={'select_help': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select_help mode="inline" label="Select label" %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline bx--select--invalid">
<label for="id_select_help" class="bx--label">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select_help" name="select_help">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__invalid-icon" width="16" height="16"
viewBox="0 0 16 16" aria-hidden="true">
<path d="M8,1C4.2,1,1,4.2,1,8s3.2,7,7,7s7-3.1,7-7S11.9,1,8,1z M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2 c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z"></path>
<path d="M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8 c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z" data-icon-path="inner-path" opacity="0"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
<div id="hint-id_select_help" class="bx--form__helper-text">
Optional helper text.
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_inline_help_invalid_disabled(self):
form = DummyForm(data={'select_help': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select_help mode="inline" label="Select label" disabled=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline bx--select--invalid bx--select--disabled">
<label for="id_select_help" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select_help" name="select_help" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
<div id="hint-id_select_help" class="bx--form__helper-text">
Optional helper text.
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_inline_invalid_light(self):
form = DummyForm(data={'select': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select mode="inline" label="Select label" light=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline bx--select--invalid bx--select--light">
<label for="id_select" class="bx--label">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select" name="select">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__invalid-icon" width="16" height="16"
viewBox="0 0 16 16" aria-hidden="true">
<path d="M8,1C4.2,1,1,4.2,1,8s3.2,7,7,7s7-3.1,7-7S11.9,1,8,1z M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2 c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z"></path>
<path d="M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8 c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z" data-icon-path="inner-path" opacity="0"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_inline_invalid_disabled_light(self):
form = DummyForm(data={'select': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select mode="inline" label="Select label" disabled=True light=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--inline bx--select--invalid bx--select--disabled bx--select--light">
<label for="id_select" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input--inline__wrapper">
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select" name="select" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_default_invalid(self):
form = DummyForm(data={'select': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select label="Select label" %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--invalid">
<label for="id_select" class="bx--label">
Select label
</label>
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select" name="select">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__invalid-icon" width="16" height="16"
viewBox="0 0 16 16" aria-hidden="true">
<path d="M8,1C4.2,1,1,4.2,1,8s3.2,7,7,7s7-3.1,7-7S11.9,1,8,1z M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2 c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z"></path>
<path d="M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8 c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z" data-icon-path="inner-path" opacity="0"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_default_invalid_disabled(self):
form = DummyForm(data={'select': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select label="Select label" disabled=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--invalid bx--select--disabled">
<label for="id_select" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select" name="select" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_default_light(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select label="Select label" light=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--light">
<label for="id_select" class="bx--label">
Select label
</label>
<div class="bx--select-input__wrapper" >
<select class="bx--select-input" id="id_select" name="select">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_default_disabled_light(self):
form = DummyForm(data={})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select label="Select label" disabled=True light=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--disabled bx--select--light">
<label for="id_select" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input__wrapper">
<select class="bx--select-input" id="id_select" name="select" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_default_invalid_light(self):
form = DummyForm(data={'select': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select label="Select label" light=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--invalid bx--select--light">
<label for="id_select" class="bx--label">
Select label
</label>
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select" name="select">
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__invalid-icon" width="16" height="16"
viewBox="0 0 16 16" aria-hidden="true">
<path d="M8,1C4.2,1,1,4.2,1,8s3.2,7,7,7s7-3.1,7-7S11.9,1,8,1z M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2 c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z"></path>
<path d="M7.5,4h1v5h-1C7.5,9,7.5,4,7.5,4z M8,12.2c-0.4,0-0.8-0.4-0.8-0.8s0.3-0.8,0.8-0.8 c0.4,0,0.8,0.4,0.8,0.8S8.4,12.2,8,12.2z" data-icon-path="inner-path" opacity="0"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)

def test_default_invalid_disabled_light(self):
form = DummyForm(data={'select': 'a'})
context = {'form': form}
template = """
{% load carbondesign %}
{% Select form.select label="Select label" disabled=True light=True %}
"""
expected = """
<div class="bx--form-item">
<div class="bx--select bx--select--invalid bx--select--disabled bx--select--light">
<label for="id_select" class="bx--label bx--label--disabled">
Select label
</label>
<div class="bx--select-input__wrapper" data-invalid>
<select class="bx--select-input" id="id_select" name="select" disabled>
<option class="bx--select-option" value="">Choose an option</option>
<option class="bx--select-option" value="solong">A much longer option that is worth having around to check how text flows</option>
<optgroup class="bx--select-optgroup" label="Category 1">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
<optgroup class="bx--select-optgroup" label="Category 2">
<option class="bx--select-option" value="option1">Option 1</option>
<option class="bx--select-option" value="option2">Option 2</option>
</optgroup>
</select>
<svg focusable="false" preserveAspectRatio="xMidYMid meet"
xmlns="http://www.w3.org/2000/svg" fill="currentColor"
class="bx--select__arrow" width="16" height="16" viewBox="0 0 16 16"
aria-hidden="true">
<path d="M8 11L3 6 3.7 5.3 8 9.6 12.3 5.3 13 6z"></path>
</svg>
</div>
<div class="bx--form-requirement">
<div class="bx--form-requirement__title">Select a valid choice.</div>
<p class="bx--form-requirement__supplement">a is not one of the available choices.</p>
</div>
</div>
</div>
"""
rendered = compare_template(template, expected, context)
self.assertEqual(*rendered)
# Source: prog/command2.py (repo: cunningr/cli-builder, license: MIT)
import prog.common as common
logger = logging.getLogger('main.{}'.format(__name__))
class Command2:
def __init__(self, args):
# Run setup tasks
self.args = args
self.attribute1 = None
self.attribute2 = None
self._setup_func1()
# Run the command with args
self.run(args)
@staticmethod
def add_args(_key, _subparsers):
_args = _subparsers.add_parser(_key, help='use prog command1 -h for help')
_args.add_argument('--arg1', help='value for agr1')
_args.add_argument('--arg2', help='value for agr2')
_sub_command2 = _args.add_subparsers(help='sub-command help')
_sub_command2 = SubCommand2.add_args('sub2', _sub_command2)
_sub_command2.set_defaults(func=SubCommand2)
return _args
def run(self, args):
logger.info('Executing command2 with --arg1: {}'.format(args.arg1))
logger.info('--arg2: {}'.format(self.attribute2))
logger.debug('*** This is a DEBUG level log ***')
def _setup_func1(self):
self.attribute1 = common.some_common_var
if self.args.arg2:
self.attribute2 = self.args.arg2
else:
self.attribute2 = 'NOT_SET'
class SubCommand2:
def __init__(self, args):
# Run setup tasks
self.args = args
self.attribute1 = None
self.attribute2 = None
self._setup_func1()
# Run the command with args
self.run(args)
@staticmethod
def add_args(_key, _subparsers):
_args = _subparsers.add_parser(_key, help='use prog command1 -h for help')
_args.add_argument('--arg1', required=True, help='value for agr1')
_args.add_argument('--arg2', help='value for agr2')
return _args
def run(self, args):
logger.info('Executing sub_command2 with --arg1: {}'.format(args.arg1))
logger.info('--arg2: {}'.format(self.attribute2))
logger.debug('*** This is a DEBUG level log ***')
def _setup_func1(self):
self.attribute1 = common.some_common_var
if self.args.arg2:
self.attribute2 = self.args.arg2
else:
self.attribute2 = 'NOT_SET'
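For context, a minimal self-contained sketch of the dispatch pattern used above: each command class registers its own subparser via a static `add_args()` and binds itself as the handler with `set_defaults(func=...)`, so instantiating the class runs the command. The `DemoCommand` class, the `demo` key, and `main()` here are illustrative assumptions, not names from the original repo.

```python
import argparse


class DemoCommand:
    """Illustrative command class for this sketch (not from the repo)."""

    def __init__(self, args):
        # Instantiating the class is what "runs" the command,
        # mirroring Command2/SubCommand2 above.
        self.result = 'ran with arg1={}'.format(args.arg1)

    @staticmethod
    def add_args(_key, _subparsers):
        _args = _subparsers.add_parser(_key, help='demo help')
        _args.add_argument('--arg1', help='value for arg1')
        return _args


def main(argv):
    parser = argparse.ArgumentParser(prog='prog')
    subparsers = parser.add_subparsers(help='command help')
    demo = DemoCommand.add_args('demo', subparsers)
    # Bind the class itself as the handler for this sub-command.
    demo.set_defaults(func=DemoCommand)
    args = parser.parse_args(argv)
    return args.func(args)


cmd = main(['demo', '--arg1', 'hello'])
print(cmd.result)  # ran with arg1=hello
```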
# Source: my_pilz_sandbox/scripts/seq.py (repo: ct2034/my_pilz_sandbox, license: BSD-3-Clause)
from geometry_msgs.msg import Pose, Point, PoseArray, Quaternion
import math
import numpy as np
from pilz_robot_programming import *
import random
import rospy
import time
__REQUIRED_API_VERSION__ = "1" # API version
SLOW_VEL_SCALE = .1
ACC_SCALE = .1
GRIPPER_POSE_CLOSED = 0.001
GRIPPER_POSE_OPEN = 0.029
# trying circ command
def sequence(r):
r.move(Ptp(goal=Pose(position=Point(0.0, 0.0, 1.1), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
r.move(Ptp(goal=Pose(position=Point(0.0, 0.0, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
print("prepared.")
seq = Sequence()
seq.append(Ptp(goal=Pose(position=Point(0.0, 0, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
seq.append(Ptp(goal=Pose(position=Point(0.2, 0, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE),
blend_radius=0.099)
seq.append(Ptp(goal=Pose(position=Point(0.2, 0.2, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE),
blend_radius=0.099)
seq.append(Ptp(goal=Pose(position=Point(0, 0.2, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
print("start sequence 1")
r.move(seq)
print("end sequence 1")
r.move(Ptp(goal=Pose(position=Point(0.0, 0.0, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
print("prepared.")
seq = Sequence()
seq.append(Ptp(goal=Pose(position=Point(0.0, 0, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
seq.append(Ptp(goal=Pose(position=Point(0.2, 0, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE),
blend_radius=0.099)
seq.append(Ptp(goal=Pose(position=Point(0.2, 0.2, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE),
blend_radius=0.099)
seq.append(Gripper(goal=GRIPPER_POSE_CLOSED))
seq.append(Ptp(goal=Pose(position=Point(0, 0.2, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
seq.append(Gripper(goal=GRIPPER_POSE_OPEN))
seq.append(Gripper(goal=GRIPPER_POSE_CLOSED))
print("start sequence 2")
r.move(seq)
print("end sequence 2")
seq = Sequence()
seq.append(Gripper(goal=GRIPPER_POSE_CLOSED))
seq.append(Ptp(goal=Pose(position=Point(0, 0.2, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
seq.append(Ptp(goal=Pose(position=Point(0.2, 0.2, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE),
blend_radius=0.19)
seq.append(Gripper(goal=GRIPPER_POSE_OPEN))
seq.append(Ptp(goal=Pose(position=Point(0.2, 0, .9), orientation=Quaternion(0,0,0,1)),
vel_scale=SLOW_VEL_SCALE,
acc_scale=ACC_SCALE))
seq.append(Gripper(goal=GRIPPER_POSE_CLOSED))
print("start sequence 3")
r.move(seq)
print("end sequence 3")
if __name__ == "__main__":
# init a rosnode
rospy.init_node('robot_program_node')
# initialisation
r = Robot(__REQUIRED_API_VERSION__) # instance of the robot
# start the main program
sequence(r)
| 35.675926 | 92 | 0.61796 | 563 | 3,853 | 4.015986 | 0.129663 | 0.038921 | 0.160991 | 0.026537 | 0.803184 | 0.803184 | 0.77134 | 0.77134 | 0.77134 | 0.754091 | 0 | 0.053467 | 0.247599 | 3,853 | 107 | 93 | 36.009346 | 0.726457 | 0.032961 | 0 | 0.702381 | 0 | 0 | 0.0363 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011905 | false | 0 | 0.083333 | 0 | 0.095238 | 0.095238 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9d967d3fc0b978b3b18b3cec8fd4da248a03215c | 3,690 | py | Python | anonport.py | Anon6372098/Anon-Port-Scanner | 7592fa7927f377c9f09161fe742c249a95370568 | [
"Apache-2.0"
] | 3 | 2018-12-28T15:31:42.000Z | 2020-03-07T01:35:32.000Z | anonport.py | Anon6372098/Anon-Port-Scanner | 7592fa7927f377c9f09161fe742c249a95370568 | [
"Apache-2.0"
] | null | null | null | anonport.py | Anon6372098/Anon-Port-Scanner | 7592fa7927f377c9f09161fe742c249a95370568 | [
"Apache-2.0"
] | 1 | 2021-01-15T03:36:54.000Z | 2021-01-15T03:36:54.000Z | import marshal
exec(marshal.loads('''c\x00\x00\x00\x00\x00\x00\x00\x00\x05\x00\x00\x00@\x00\x00\x00s\x19\x01\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x02\x00Z\x02\x00e\x02\x00GHe\x03\x00e\x00\x00j\x04\x00\x83\x01\x00d\x03\x00k\x00\x00rR\x00d\x04\x00GHd\x05\x00GHe\x00\x00j\x05\x00d\x06\x00\x83\x01\x00\x01n\x00\x00e\x00\x00j\x04\x00d\x07\x00\x19Z\x06\x00e\x00\x00j\x04\x00d\x08\x00\x19Z\x07\x00e\x00\x00j\x04\x00d\t\x00\x19Z\x08\x00x\x99\x00e\t\x00e\n\x00e\x07\x00\x83\x01\x00e\n\x00e\x08\x00\x83\x01\x00\x83\x02\x00D]|\x00Z\x0b\x00yg\x00d\n\x00Ge\x0b\x00GHe\x01\x00j\x01\x00e\x01\x00j\x0c\x00e\x01\x00j\r\x00\x83\x02\x00Z\x0e\x00e\x0e\x00j\x0f\x00d\x0b\x00\x83\x01\x00\x01e\x0e\x00j\x10\x00e\x01\x00j\x11\x00e\x06\x00\x83\x01\x00e\n\x00e\x0b\x00\x83\x01\x00f\x02\x00\x83\x01\x00\x01d\x0c\x00Ge\x0b\x00GHe\x0e\x00j\x12\x00\x83\x00\x00\x01Wq\x95\x00\x01\x01\x01d\r\x00GHq\x95\x00Xq\x95\x00Wd\x01\x00S(\x0e\x00\x00\x00i\xff\xff\xff\xffNs\xff\x05\x00\x00\n ### ## ## ####### ## ## ######## ####### ######## ######## \n ## ## ### ## ## ## ### ## ## ## ## ## ## ## ## \n ## ## #### ## ## ## #### ## ## ## ## ## ## ## ## \n## ## ## ## ## ## ## ## ## ## ######## ## ## ######## ## \n######### ## #### ## ## ## #### ## ## ## ## ## ## \n## ## ## ### ## ## ## ### ## ## ## ## ## ## \n## ## ## ## ####### ## ## ## ####### ## ## ##\n\n ###### ###### ### ## ## ## ## ######## ######## \n## ## ## ## ## ## ### ## ### ## ## ## ## \n## ## ## ## #### ## #### ## ## ## ## \n ###### ## ## ## ## ## ## ## ## ## ###### ######## \n ## ## ######### ## #### ## #### ## ## ## \n## ## ## ## ## ## ## ### ## ### ## ## ## \n ###### ###### ## ## ## ## ## ## ######## ## ##\n \n\t\t+=================================================+\n\t\t+ Creator : Anon6372098 +\n\t\t+ Contact : anon6372098.id@gmail.com + \n\t\t+ Team : D4RK SYST3M F41LUR3 S33K3R +\n\t\t+ Homepage : https://www.dsfs-indo.zone.id/ + \n\t\t+ Thanks to: All Member of DSFS Official +\n\t\t+ GitHub : https://github.com/Anon6372098/ + 
\n\t\t+=================================================+\ni\x04\x00\x00\x00sD\x00\x00\x00\n Penggunaan : python anonport.py <website> <awal port> <akhir port>s>\x00\x00\x00\n Contoh : python anonport.py www.dsfs-indo.zone.id 1 8000i\x00\x00\x00\x00i\x01\x00\x00\x00i\x02\x00\x00\x00i\x03\x00\x00\x00s\x12\x00\x00\x00\n[#] Memindai porti\x05\x00\x00\x00s&\x00\x00\x00\n[+] Ditemukan port yang terbuka :) : s\'\x00\x00\x00\n[-] Tak ditemukan port yang terbuka :((\x13\x00\x00\x00t\x03\x00\x00\x00syst\x06\x00\x00\x00sockett\x04\x00\x00\x00anont\x03\x00\x00\x00lent\x04\x00\x00\x00argvt\x04\x00\x00\x00exitt\x08\x00\x00\x00web_anont\t\x00\x00\x00awal_anont\n\x00\x00\x00akhir_anont\x05\x00\x00\x00ranget\x03\x00\x00\x00intt\x04\x00\x00\x00portt\x07\x00\x00\x00AF_INETt\x0b\x00\x00\x00SOCK_STREAMt\x01\x00\x00\x00st\n\x00\x00\x00settimeoutt\x07\x00\x00\x00connectt\r\x00\x00\x00gethostbynamet\x05\x00\x00\x00close(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\x06\x00\x00\x00<seni>t\x08\x00\x00\x00<module>\x06\x00\x00\x00s&\x00\x00\x00\x18\x1c\x06\x01\x05\x04\x15\x01\x05\x01\x05\x01\x10\x02\r\x01\r\x01\r\x02"\x01\x03\x01\t\x01\x18\x01\r\x01"\x01\t\x01\x0e\x01\x03\x01''')) | 1,845 | 3,675 | 0.493225 | 524 | 3,690 | 3.46374 | 0.240458 | 0.224793 | 0.133884 | 0.099174 | 0.177961 | 0.117355 | 0.040771 | 0.040771 | 0.027548 | 0.027548 | 0 | 0.262139 | 0.201897 | 3,690 | 2 | 3,675 | 1,845 | 0.35416 | 0 | 0 | 0 | 0 | 0.5 | 0.98835 | 0.548903 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 11 |
9de1ae6f2101fe57732b31dbed16e5641e9a0007 | 1,735 | py | Python | scripts/inheritanceSuperclasses.py | rmtew/livecoding | 9c5619c9653d4cd83977fc1f3aae51da004f1e8b | [
"BSD-3-Clause"
] | null | null | null | scripts/inheritanceSuperclasses.py | rmtew/livecoding | 9c5619c9653d4cd83977fc1f3aae51da004f1e8b | [
"BSD-3-Clause"
] | null | null | null | scripts/inheritanceSuperclasses.py | rmtew/livecoding | 9c5619c9653d4cd83977fc1f3aae51da004f1e8b | [
"BSD-3-Clause"
] | null | null | null | # Purpose: Implement base classes in order to test inheritance.
class OldStyleBase:
""" OldStyleBase doc string. """
def __init__(self, *args, **kwargs):
""" OldStyleBase __init__ doc string """
self.args = args
self.kwargs = kwargs
def Func(self, *args, **kwargs):
""" OldStyleBase Func doc string """
self.args = args
self.kwargs = kwargs
def Func_Arguments1(self, arg1, kwarg1=False, *args, **kwargs):
return (arg1, kwarg1, args, kwargs)
def Func_Arguments2(self, arg1, kwarg1=True, *args, **kwargs):
return (arg1, kwarg1, args, kwargs)
class NewStyleBase(object):
""" NewStyleBase doc string. """
def __init__(self, *args, **kwargs):
""" NewStyleBase __init__ doc string """
self.args = args
self.kwargs = kwargs
def Func(self, *args, **kwargs):
""" NewStyleBase Func doc string """
self.args = args
self.kwargs = kwargs
def Func_Arguments1(self, arg1, kwarg1=False, *args, **kwargs):
return (arg1, kwarg1, args, kwargs)
def Func_Arguments2(self, arg1, kwarg1=True, *args, **kwargs):
return (arg1, kwarg1, args, kwargs)
class OldStyle(OldStyleBase):
def __init__(self, *args, **kwargs):
OldStyleBase.__init__(self, *args, **kwargs)
def Func(self, *args, **kwargs):
OldStyleBase.Func(self, *args, **kwargs)
class NewStyle(NewStyleBase):
def __init__(self, *args, **kwargs):
NewStyleBase.__init__(self, *args, **kwargs)
def Func(self, *args, **kwargs):
NewStyleBase.Func(self, *args, **kwargs)
def FuncSuper(self, *args, **kwargs):
        super(NewStyle, self).Func(*args, **kwargs)
| 28.442623 | 67 | 0.614409 | 194 | 1,735 | 5.309278 | 0.164948 | 0.213592 | 0.190291 | 0.12233 | 0.763107 | 0.763107 | 0.763107 | 0.617476 | 0.563107 | 0.487379 | 0 | 0.015244 | 0.243804 | 1,735 | 60 | 68 | 28.916667 | 0.769817 | 0.139481 | 0 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.382353 | false | 0 | 0 | 0.117647 | 0.617647 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
3b03dc76c21e48ca31f953257eef3b7feeafc920 | 9,339 | py | Python | tests/unit_tests/test_properties/test_transformers/test_SubstituteCalls/test_argmin.py | samysweb/dnnv | 58fb95b7300914d9da28eed86c39eca473b1aaef | [
"MIT"
] | 5 | 2022-01-28T20:30:34.000Z | 2022-03-17T09:26:52.000Z | tests/unit_tests/test_properties/test_transformers/test_SubstituteCalls/test_argmin.py | samysweb/dnnv | 58fb95b7300914d9da28eed86c39eca473b1aaef | [
"MIT"
] | 9 | 2022-01-27T03:50:28.000Z | 2022-02-08T18:42:17.000Z | tests/unit_tests/test_properties/test_transformers/test_SubstituteCalls/test_argmin.py | samysweb/dnnv | 58fb95b7300914d9da28eed86c39eca473b1aaef | [
"MIT"
] | 2 | 2022-02-03T17:32:43.000Z | 2022-03-24T16:38:49.000Z | import numpy as np
import pytest
from dnnv.nn.utils import TensorDetails
from dnnv.properties.expressions import *
from dnnv.properties.transformers import SubstituteCalls
def test_argmin_symbol():
expr = Constant(np.argmin)(Symbol("x"))
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr.is_equivalent(expr)
def test_argmin_non_concrete_network():
expr = Constant(np.argmin)(Network("N")(Symbol("x")))
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr.is_equivalent(expr)
def test_argmin_constant():
expr = Constant(np.argmin)(Constant(np.array([3, 2, 5, 1, 4])))
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr is Constant(3)
def test_argmin_concrete_network():
fake_network = lambda x: x
fake_network.input_details = (TensorDetails((1, 5), np.float32),)
fake_network.input_shape = ((1, 5),)
fake_network.output_details = (TensorDetails((1, 3), np.float32),)
fake_network.output_shape = ((1, 3),)
expr = Constant(np.argmin)(Network("N")(Symbol("x")))
expr.concretize(N=fake_network)
new_expr = SubstituteCalls().visit(expr)
expected_expr = IfThenElse(
And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 1)]
),
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 2)]
),
),
Constant(0),
IfThenElse(
And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 1)], Network("N")(Symbol("x"))[(0, 2)]
)
),
Constant(1),
Constant(2),
),
)
assert new_expr is not expr
assert new_expr.is_equivalent(expected_expr)
def test_argmin_equal_too_many_args():
expr = Constant(np.argmin)(Symbol("x"), Symbol("a")) == Symbol("y")
with pytest.raises(RuntimeError, match="Too many arguments for argcmp"):
_ = SubstituteCalls().visit(expr)
def test_argmin_symbol_equal_symbol():
expr = Constant(np.argmin)(Symbol("x")) == Symbol("y")
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr.is_equivalent(expr)
def test_argmin_symbol_equal_constant():
expr = Constant(np.argmin)(Symbol("x")) == Constant(0)
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr.is_equivalent(expr)
def test_argmin_constant_equal_constant():
expr = Constant(np.argmin)(Constant(np.array([2, 1, 5, 3, 4]))) == Constant(0)
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr is Constant(False)
expr = Constant(np.argmin)(Constant(np.array([2, 1, 5, 3, 4]))) == Constant(1)
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr is Constant(True)
expr = Constant(4) == Constant(np.argmin)(Constant(np.array([2, 1, 5, 3, 4])))
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr is Constant(False)
expr = Constant(1) == Constant(np.argmin)(Constant(np.array([2, 1, 5, 3, 4])))
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr is Constant(True)
def test_argmin_concrete_network_equal_constant():
fake_network = lambda x: x
fake_network.input_details = (TensorDetails((1, 5), np.float32),)
fake_network.input_shape = ((1, 5),)
fake_network.output_details = (TensorDetails((1, 3), np.float32),)
fake_network.output_shape = ((1, 3),)
expr = Constant(np.argmin)(Network("N")(Symbol("x"))) == Constant(0)
expr.concretize(N=fake_network)
new_expr = SubstituteCalls().visit(expr)
expected_expr = And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 1)]
),
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 2)]
),
)
assert new_expr is not expr
assert new_expr.is_equivalent(expected_expr)
def test_argmin_concrete_network_equal_symbol():
fake_network = lambda x: x
fake_network.input_details = (TensorDetails((1, 5), np.float32),)
fake_network.input_shape = ((1, 5),)
fake_network.output_details = (TensorDetails((1, 3), np.float32),)
fake_network.output_shape = ((1, 3),)
expr = Constant(np.argmin)(Network("N")(Symbol("x"))) == Symbol("y")
expr.concretize(N=fake_network)
new_expr = SubstituteCalls().visit(expr)
expected_expr = And(
Implies(
And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 1)]
),
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 2)]
),
),
Equal(Symbol("y"), Constant(0)),
),
Implies(
And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 1)], Network("N")(Symbol("x"))[(0, 0)]
),
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 1)], Network("N")(Symbol("x"))[(0, 2)]
),
),
Equal(Symbol("y"), Constant(1)),
),
Implies(
And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 2)], Network("N")(Symbol("x"))[(0, 0)]
),
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 2)], Network("N")(Symbol("x"))[(0, 1)]
),
),
Equal(Symbol("y"), Constant(2)),
),
)
assert new_expr is not expr
assert new_expr.is_equivalent(expected_expr)
def test_argmin_symbol_notequal_symbol():
expr = Constant(np.argmin)(Symbol("x")) != Symbol("y")
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr.is_equivalent(expr)
def test_argmin_symbol_notequal_constant():
expr = Constant(np.argmin)(Symbol("x")) != Constant(0)
new_expr = SubstituteCalls().visit(expr)
assert new_expr is not expr
assert new_expr.is_equivalent(expr)
def test_argmin_constant_notequal_constant():
expr = Constant(np.argmin)(Constant(np.array([2, 1, 5, 3, 4]))) != Constant(0)
new_expr = SubstituteCalls().visit(expr).propagate_constants()
assert new_expr is not expr
assert new_expr is Constant(True)
expr = Constant(np.argmin)(Constant(np.array([2, 1, 5, 3, 4]))) != Constant(1)
new_expr = SubstituteCalls().visit(expr).propagate_constants()
assert new_expr is not expr
assert new_expr is Constant(False)
def test_argmin_concrete_network_notequal_constant():
fake_network = lambda x: x
fake_network.input_details = (TensorDetails((1, 5), np.float32),)
fake_network.input_shape = ((1, 5),)
fake_network.output_details = (TensorDetails((1, 3), np.float32),)
fake_network.output_shape = ((1, 3),)
expr = Constant(np.argmin)(Network("N")(Symbol("x"))) != Constant(0)
expr.concretize(N=fake_network)
new_expr = SubstituteCalls().visit(expr)
expected_expr = Or(
Not(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 1)]
)
),
Not(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 2)]
)
),
)
assert new_expr is not expr
assert new_expr.is_equivalent(expected_expr)
def test_argmin_concrete_network_notequal_symbol():
fake_network = lambda x: x
fake_network.input_details = (TensorDetails((1, 5), np.float32),)
fake_network.input_shape = ((1, 5),)
fake_network.output_details = (TensorDetails((1, 3), np.float32),)
fake_network.output_shape = ((1, 3),)
expr = Constant(np.argmin)(Network("N")(Symbol("x"))) != Symbol("y")
expr.concretize(N=fake_network)
new_expr = SubstituteCalls().visit(expr)
expected_expr = Or(
And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 1)]
),
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 0)], Network("N")(Symbol("x"))[(0, 2)]
),
Not(Equal(Symbol("y"), Constant(0))),
),
And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 1)], Network("N")(Symbol("x"))[(0, 0)]
),
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 1)], Network("N")(Symbol("x"))[(0, 2)]
),
Not(Equal(Symbol("y"), Constant(1))),
),
And(
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 2)], Network("N")(Symbol("x"))[(0, 0)]
),
LessThanOrEqual(
Network("N")(Symbol("x"))[(0, 2)], Network("N")(Symbol("x"))[(0, 1)]
),
Not(Equal(Symbol("y"), Constant(2))),
),
)
assert new_expr is not expr
assert new_expr.is_equivalent(expected_expr)
| 33.96 | 88 | 0.578542 | 1,156 | 9,339 | 4.511246 | 0.060554 | 0.072483 | 0.118121 | 0.126558 | 0.944583 | 0.938447 | 0.909684 | 0.894727 | 0.874784 | 0.874784 | 0 | 0.027344 | 0.252061 | 9,339 | 274 | 89 | 34.083942 | 0.719256 | 0 | 0 | 0.742358 | 0 | 0 | 0.014456 | 0 | 0 | 0 | 0 | 0 | 0.157205 | 1 | 0.065502 | false | 0 | 0.021834 | 0 | 0.087336 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d186b96e5ec3afe064277100d40f87c8830c689d | 5,137 | py | Python | timeline/assets/thumbnail.py | simonmysun/praxis | f38c8ea9fba557b347b577068159a77a1b018218 | [
"MIT"
] | 1 | 2020-08-07T12:25:10.000Z | 2020-08-07T12:25:10.000Z | timeline/assets/thumbnail.py | simonmysun/praxis | f38c8ea9fba557b347b577068159a77a1b018218 | [
"MIT"
] | null | null | null | timeline/assets/thumbnail.py | simonmysun/praxis | f38c8ea9fba557b347b577068159a77a1b018218 | [
"MIT"
] | 2 | 2016-08-16T17:44:59.000Z | 2016-10-20T08:06:46.000Z | from PIL import Image
sizes = [(400, 400)]
files = ["/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_170420.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_170405.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_170219.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_170208.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_170152.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_170000.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_163045.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_162938.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_162815.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_162753.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_162031.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_161944.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_161653.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_161245.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_161229.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_160521.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_160254.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_160136.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_155720.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_155504.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_155138.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_155132.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_154659.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_154650.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_153208.jpg", 
"/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_153154.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_143559.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_143551.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/PANO_20161106_140443.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_140604.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_140545.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_140152.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_140141.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_140133.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_135515-EFFECTS.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_135515.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_135249.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_135242.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_134901.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_134853.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_134846.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_134311.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_133833.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_133828.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_133816.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_133810.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/PANO_20161106_133602.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_133129.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_132016.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_132006.jpg", 
"/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_131944.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_131640.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_130415.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_125550.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_124848.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_124840.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_124535.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_124338.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_122708.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_122555.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_122533.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_114928.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_114724.jpg", "/home/mysun/Downloads/xxx/Takeout/timeline/assets/IMG_20161106_114220.jpg"]
for image in files:
for size in sizes:
im = Image.open(image)
im.thumbnail(size)
im.save("%s.thumbnail.jpg" % image)
| 467 | 4,946 | 0.822075 | 734 | 5,137 | 5.579019 | 0.122616 | 0.140659 | 0.281319 | 0.328205 | 0.8779 | 0.8779 | 0.8779 | 0.8779 | 0.8779 | 0.864957 | 0 | 0.179753 | 0.023165 | 5,137 | 10 | 4,947 | 513.7 | 0.636309 | 0 | 0 | 0 | 0 | 0 | 0.914542 | 0.911427 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
d1a940724aaab91e406edf5fb0aee181fec76e2f | 5,617 | py | Python | tests/test_floating_ip.py | cloudscale-ch/cloudscale-cli | 29e7d0c3820b5903a6b509b9edef3063f388fd7c | [
"MIT"
] | 7 | 2020-07-18T07:15:58.000Z | 2020-12-10T13:25:08.000Z | tests/test_floating_ip.py | cloudscale-ch/cloudscale-cli | 29e7d0c3820b5903a6b509b9edef3063f388fd7c | [
"MIT"
] | 18 | 2020-08-17T22:12:34.000Z | 2021-05-17T14:59:07.000Z | tests/test_floating_ip.py | cloudscale-ch/cloudscale-cli | 29e7d0c3820b5903a6b509b9edef3063f388fd7c | [
"MIT"
] | null | null | null | from cloudscale import CLOUDSCALE_API_URL
from cloudscale_cli.cli import cli
import responses
import click
from click.testing import CliRunner
FLOATING_IP_RESP = {
"href": "https://api.cloudscale.ch/v1/floating-ips/192.0.2.123",
"created_at": "2019-05-29T13:18:42.505197Z",
"network": "192.0.2.123/32",
"ip_version": 4,
"server": {
"href": "https://api.cloudscale.ch/v1/servers/47cec963-fcd2-482f-bdb6-24461b2d47b1",
"uuid": "47cec963-fcd2-482f-bdb6-24461b2d47b1",
"name": "db-master"
},
"region": {
"slug": "lpg"
},
"next_hop": "198.51.100.1",
"reverse_ptr": "192.0.2.123.cust.cloudscale.ch",
"tags": {}
}
@responses.activate
def test_floating_ip_get_all():
network_id = "192.0.2.123"
responses.add(
responses.GET,
CLOUDSCALE_API_URL + '/floating-ips',
json=[FLOATING_IP_RESP],
status=200)
responses.add(
responses.GET,
CLOUDSCALE_API_URL + '/floating-ips',
json={},
status=500)
runner = CliRunner()
result = runner.invoke(cli, [
'-a',
'token',
'floating-ip',
'list',
])
assert result.exit_code == 0
result = runner.invoke(cli, [
'-a',
'token',
'floating-ip',
'list',
])
assert result.exit_code > 0
@responses.activate
def test_floating_ip_get_by_uuid():
network_id = "192.0.2.123"
responses.add(
responses.GET,
CLOUDSCALE_API_URL + '/floating-ips/' + network_id,
json=FLOATING_IP_RESP,
status=200)
responses.add(
responses.GET,
CLOUDSCALE_API_URL + '/floating-ips/' + network_id,
json={},
status=500)
runner = CliRunner()
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'show',
network_id,
])
assert result.exit_code == 0
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'show',
network_id,
])
assert result.exit_code > 0
@responses.activate
def test_floating_ip_delete():
network_id = "192.0.2.123"
responses.add(
responses.GET,
CLOUDSCALE_API_URL + '/floating-ips/' + network_id,
json=FLOATING_IP_RESP,
status=200)
responses.add(
responses.GET,
CLOUDSCALE_API_URL + '/floating-ips/unknown',
json=FLOATING_IP_RESP,
status=200)
responses.add(
responses.DELETE,
CLOUDSCALE_API_URL + '/floating-ips/' + network_id,
status=204)
responses.add(
responses.DELETE,
CLOUDSCALE_API_URL + '/floating-ips/unknown',
json={
"detail": "Not found."
},
status=404)
runner = CliRunner()
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'delete',
network_id,
])
assert result.exit_code == 1
runner = CliRunner()
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'delete',
network_id,
'--force',
])
assert result.exit_code == 0
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'delete',
'--force',
'unknown',
])
assert result.exit_code > 0
@responses.activate
def test_floating_ip_create():
ip_version = 4
server_uuid = "47cec963-fcd2-482f-bdb6-24461b2d47b1"
responses.add(
responses.POST,
CLOUDSCALE_API_URL + '/floating-ips',
json=FLOATING_IP_RESP,
status=201)
responses.add(
responses.POST,
CLOUDSCALE_API_URL + '/floating-ips',
json=FLOATING_IP_RESP,
status=500)
runner = CliRunner()
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'create',
'--ip-version',
ip_version,
'--server-uuid',
server_uuid,
])
assert result.exit_code == 0
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'create',
'--ip-version',
ip_version,
'--server-uuid',
server_uuid,
])
assert result.exit_code > 0
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'create',
'--ip-version',
6,
'--server-uuid',
server_uuid,
])
assert result.exit_code > 0
@responses.activate
def test_floating_ip_update():
network_id = "192.0.2.123"
reverse_ptr = "192.0.2.123.cust.cloudscale.ch"
responses.add(
responses.PATCH,
CLOUDSCALE_API_URL + '/floating-ips/' + network_id,
json=FLOATING_IP_RESP,
status=204)
responses.add(
responses.GET,
CLOUDSCALE_API_URL + '/floating-ips/' + network_id,
json=FLOATING_IP_RESP,
status=200)
responses.add(
responses.PATCH,
CLOUDSCALE_API_URL + '/floating-ips/' + network_id,
json={},
status=500)
runner = CliRunner()
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'update',
'--reverse-ptr',
reverse_ptr,
network_id,
])
assert result.exit_code == 0
result = runner.invoke(cli, [
'-a', 'token',
'floating-ip',
'update',
'--reverse-ptr',
reverse_ptr,
network_id,
])
assert result.exit_code > 0
def test_floating_ip_missing_api_key():
runner = CliRunner()
result = runner.invoke(cli, [
'floating-ip',
'list',
])
assert result.exit_code == 1
| 24.211207 | 92 | 0.553142 | 620 | 5,617 | 4.827419 | 0.154839 | 0.093552 | 0.074841 | 0.104243 | 0.877381 | 0.854995 | 0.789175 | 0.755429 | 0.755429 | 0.678917 | 0 | 0.05348 | 0.304255 | 5,617 | 231 | 93 | 24.316017 | 0.712385 | 0 | 0 | 0.820277 | 0 | 0.009217 | 0.191205 | 0.035784 | 0 | 0 | 0 | 0 | 0.059908 | 1 | 0.02765 | false | 0 | 0.023041 | 0 | 0.050691 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d1afc6d5b41bdb819664b3077cf3ead620fec1c4 | 163 | py | Python | ganslate/data/__init__.py | ibro45/a | a90d92eaf041331cd3397f788cb60884cb0e176b | [
"BSD-3-Clause"
] | 17 | 2021-09-07T15:23:04.000Z | 2022-01-28T15:46:54.000Z | ganslate/data/__init__.py | ibro45/a | a90d92eaf041331cd3397f788cb60884cb0e176b | [
"BSD-3-Clause"
] | 18 | 2021-09-08T12:31:39.000Z | 2021-12-13T15:26:01.000Z | ganslate/data/__init__.py | ibro45/a | a90d92eaf041331cd3397f788cb60884cb0e176b | [
"BSD-3-Clause"
] | 2 | 2021-11-10T11:23:00.000Z | 2022-02-10T07:57:20.000Z | from .unpaired_image_dataset import UnpairedImageDataset, UnpairedImageDatasetConfig
from .paired_image_dataset import PairedImageDataset, PairedImageDatasetConfig | 81.5 | 84 | 0.920245 | 14 | 163 | 10.428571 | 0.714286 | 0.164384 | 0.246575 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055215 | 163 | 2 | 85 | 81.5 | 0.948052 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d1ceb253214177f8cf3ac0fb7358524644d77fbb | 10,927 | py | Python | PoisDenoiser/nnLayers/gradChecker.py | AndreiDavydov/Poisson_Denoiser | a0b8f3dce8282b8e50d44cacb7bdc4fc6d4abc22 | [
"MIT"
] | 4 | 2019-12-24T10:54:40.000Z | 2021-12-27T14:07:06.000Z | PoisDenoiser/nnLayers/gradChecker.py | AndreiDavydov/Poisson_Denoiser | a0b8f3dce8282b8e50d44cacb7bdc4fc6d4abc22 | [
"MIT"
] | null | null | null | PoisDenoiser/nnLayers/gradChecker.py | AndreiDavydov/Poisson_Denoiser | a0b8f3dce8282b8e50d44cacb7bdc4fc6d4abc22 | [
"MIT"
] | 1 | 2020-09-28T06:04:12.000Z | 2020-09-28T06:04:12.000Z | import torch as th
from PoisDenoiser.nnLayers.functional import *
def poisLikelihoodFunc_gradCheck(noise_param=0.01, epsilon=1e-7, dtype='torch.DoubleTensor'):
x = th.rand((4, 1, 5,5)).type(dtype)
z = th.rand((4, 1, 5,5)).type(dtype)
z = th.distributions.Poisson(z/noise_param).sample()
grad_output = th.randn(x.size(0)).type_as(x)
# x grad
x_numgrad = th.zeros_like(x).view(-1)
perturb = th.zeros_like(x).view(-1)
cost = lambda input: cost_poisLikelihoodFunc(input, z, grad_output)
for k in range(0,x.numel()):
cur_x = float(x.view(-1)[k])
perturb[k] = epsilon if epsilon < cur_x else cur_x/2
loss1 = cost(x.view(-1).add( perturb).view(x.size()))
loss2 = cost(x.view(-1).add(-perturb).view(x.size()))
x_numgrad[k] = (loss1-loss2)/(2*perturb[k])
perturb[k] = 0
x_numgrad = x_numgrad.view(x.size())
# compute errors
x.requires_grad = True
z.requires_grad = False
y = poisLikelihoodFunc(x,z)
y.backward(grad_output)
err_x = th.norm(x.grad.data.view(-1) - x_numgrad.view(-1))/\
th.norm(x.grad.data.view(-1) + x_numgrad.view(-1))
return err_x, x.grad.data, x_numgrad
def condFunc_gradCheck(noise_param=0.01, epsilon=1e-7, dtype='torch.DoubleTensor'):
x = th.rand((4, 1, 5,5)).type(dtype)
z = th.rand((4, 1, 5,5)).type(dtype)
z = th.distributions.Poisson(z/noise_param).sample()
grad_output = th.rand(x.size(0)).type_as(x)
# x grad
x_numgrad = th.zeros_like(x).view(-1)
perturb = th.zeros_like(x).view(-1)
cost = lambda input: cost_condFunc(input, z, grad_output)
for k in range(0,x.numel()):
cur_x = x.view(-1)[k]
perturb[k] = epsilon if epsilon < cur_x else cur_x/2
loss1 = cost(x.view(-1).add( perturb).view(x.size()))
loss2 = cost(x.view(-1).add(-perturb).view(x.size()))
x_numgrad[k] = (loss1-loss2)/(2*perturb[k])
perturb[k] = 0
x_numgrad = x_numgrad.view(x.size())
# compute errors
x.requires_grad = True
z.requires_grad = False
y = condFunc(x,z)
y.backward(grad_output)
err_x = th.norm(x.grad.data.view(-1) - x_numgrad.view(-1))/\
th.norm(x.grad.data.view(-1) + x_numgrad.view(-1))
return err_x, x.grad.data, x_numgrad
def dCond_dAlphaFunc_gradCheck(noise_param=0.01, epsilon=1e-7, dtype='torch.DoubleTensor'):
x = th.rand((4, 3, 5,5)).type(dtype)
z = th.rand((4, 3, 5,5)).type(dtype)
z = th.distributions.Poisson(z/noise_param).sample()
alpha = th.rand(x.size(0), 1,1,1).type_as(x)*100
grad_output = th.rand(x.size(0)).type_as(x)
# x grad
x_numgrad = th.zeros_like(x).view(-1)
perturb = th.zeros_like(x).view(-1)
cost = lambda input: cost_dCond_dAlphaFunc(input, z, alpha, grad_output)
for k in range(0,x.numel()):
cur_x = x.view(-1)[k]
perturb[k] = epsilon if epsilon < cur_x else cur_x/2
loss1 = cost(x.view(-1).add( perturb).view(x.size()))
loss2 = cost(x.view(-1).add(-perturb).view(x.size()))
x_numgrad[k] = (loss1-loss2)/(2*perturb[k])
perturb[k] = 0
x_numgrad = x_numgrad.view(x.size())
# alpha grad
alpha_numgrad = th.zeros_like(alpha).view(-1)
perturb = th.zeros_like(alpha).view(-1)
cost = lambda input: cost_dCond_dAlphaFunc(x, z, input, grad_output)
for k in range(0,alpha.numel()):
cur_alpha = alpha.view(-1)[k]
perturb[k] = epsilon if epsilon < cur_alpha else cur_alpha/2
loss1 = cost(alpha.view(-1).add( perturb).view(alpha.size()))
loss2 = cost(alpha.view(-1).add(-perturb).view(alpha.size()))
alpha_numgrad[k] = (loss1-loss2)/(2*perturb[k])
perturb[k] = 0
alpha_numgrad = alpha_numgrad.view(alpha.size())
# compute errors
x.requires_grad = True
z.requires_grad = False
alpha.requires_grad = True
y = dCond_dAlphaFunc(x,z,alpha)
y.backward(grad_output)
err_x = th.norm(x.grad.data.view(-1) - x_numgrad.view(-1))/\
th.norm(x.grad.data.view(-1) + x_numgrad.view(-1))
err_alpha = th.norm(alpha.grad.data.view(-1) - alpha_numgrad.view(-1))/\
th.norm(alpha.grad.data.view(-1) + alpha_numgrad.view(-1))
return err_x, x.grad.data, x_numgrad,\
err_alpha, alpha.grad.data, alpha_numgrad
def projFunc_gradCheck(noise_param=0.01, epsilon=1e-4, dtype='torch.DoubleTensor'):
x = th.rand((4, 3, 5,5)).type(dtype)
z = th.rand((4, 3, 5,5)).type(dtype)
z = th.distributions.Poisson(z/noise_param).sample()
alpha = th.rand(x.size(0), 1,1,1).type_as(x)*100
grad_output = th.randn_like(x).type_as(x)
# x grad
x_numgrad = th.zeros_like(x).view(-1)
perturb = th.zeros_like(x).view(-1)
cost = lambda input: cost_projFunc(input, z, alpha, grad_output)
for k in range(0,x.numel()):
cur_x = x.view(-1)[k]
perturb[k] = epsilon if epsilon < cur_x else cur_x/2
loss1 = cost(x.view(-1).add( perturb).view(x.size()))
loss2 = cost(x.view(-1).add(-perturb).view(x.size()))
x_numgrad[k] = (loss1-loss2)/(2*perturb[k])
perturb[k] = 0
x_numgrad = x_numgrad.view(x.size())
# alpha grad
alpha_numgrad = th.zeros_like(alpha).view(-1)
perturb = th.zeros_like(alpha).view(-1)
cost = lambda input: cost_projFunc(x, z, input, grad_output)
for k in range(0,alpha.numel()):
cur_alpha = alpha.view(-1)[k]
perturb[k] = epsilon if epsilon < cur_alpha else cur_alpha/2
loss1 = cost(alpha.view(-1).add( perturb).view(alpha.size()))
loss2 = cost(alpha.view(-1).add(-perturb).view(alpha.size()))
alpha_numgrad[k] = (loss1-loss2)/(2*perturb[k])
perturb[k] = 0
alpha_numgrad = alpha_numgrad.view(alpha.size())
# compute errors
x.requires_grad = True
z.requires_grad = False
alpha.requires_grad = True
y = projFunc(x,z,alpha)
y.backward(grad_output)
err_x = th.norm(x.grad.data.view(-1) - x_numgrad.view(-1))/\
th.norm(x.grad.data.view(-1) + x_numgrad.view(-1))
err_alpha = th.norm(alpha.grad.data.view(-1) - alpha_numgrad.view(-1))/\
th.norm(alpha.grad.data.view(-1) + alpha_numgrad.view(-1))
return err_x, x.grad.data, x_numgrad,\
err_alpha, alpha.grad.data, alpha_numgrad
def alphaFunc_gradCheck(noise_param=0.01, epsilon=1e-7, dtype='torch.DoubleTensor'):
x = th.rand((4, 1, 5,5)).type(dtype)
z = th.rand((4, 1, 5,5)).type(dtype)
z = th.distributions.Poisson(z/noise_param).sample()
x = z.clone() + 1e-8
grad_output = th.rand((x.size(0), 1,1,1)).type_as(x)
    # x grad
x_numgrad = th.zeros_like(x).view(-1)
perturb = th.zeros_like(x).view(-1)
cost = lambda input: cost_alphaFunc(input, z, grad_output)
for k in range(0,x.numel()):
cur_x = x.view(-1)[k]
perturb[k] = epsilon if epsilon < cur_x else cur_x/2
loss1 = cost(x.view(-1).add( perturb).view(x.size()))
loss2 = cost(x.view(-1).add(-perturb).view(x.size()))
x_numgrad[k] = (loss1-loss2)/(2*perturb[k])
perturb[k] = 0
x_numgrad = x_numgrad.view(x.size())
# compute errors
x.requires_grad = True
z.requires_grad = False
y = alphaFunc(x,z)
y.backward(grad_output)
err_x = th.norm(x.grad.data.view(-1) - x_numgrad.view(-1))/\
th.norm(x.grad.data.view(-1) + x_numgrad.view(-1))
return err_x, x.grad.data, x_numgrad
def poisProx_gradCheck(noise_param=0.01, epsilon=1e-7, dtype='torch.DoubleTensor'):
x = th.rand((4, 1, 2,2)).type(dtype)
z = th.rand((4, 1, 2,2)).type(dtype)
z = th.distributions.Poisson(z/noise_param).sample()
x[2:4] = z[2:4].clone()
grad_output = th.ones_like(x).type_as(x)
# x grad
x_numgrad = th.zeros_like(x).view(-1)
perturb = th.zeros_like(x).view(-1)
cost = lambda input: cost_poisProx(input, z, grad_output)
for k in range(0,x.numel()):
cur_x = float(x.view(-1)[k])
perturb[k] = epsilon if epsilon < cur_x else cur_x/2
loss1 = cost(x.view(-1).add( perturb).view(x.size()))
loss2 = cost(x.view(-1).add(-perturb).view(x.size()))
x_numgrad[k] = (loss1-loss2)/(2*perturb[k])
if th.abs(perturb[k]) < 1e-10:
            # both x and z are 0 here, so the two losses are equal; set the gradient to 1
x_numgrad[k] = 1
perturb[k] = 0
x_numgrad = x_numgrad.view(x.size())
# compute errors
x.requires_grad = True
z.requires_grad = False
y = poisProx(x,z)
y.backward(grad_output)
err_x = th.norm(x.grad.data.view(-1) - x_numgrad.view(-1))/\
(th.norm(x.grad.data.view(-1) + x_numgrad.view(-1)))
return err_x, x.grad.data, x_numgrad
from PoisDenoiser.networks.PoisNet.net import PoisNet
def poisNet_gradCheck(noise_param=0.01, epsilon=1e-7, dtype='torch.DoubleTensor'):
x = th.rand((4, 1, 5,5)).type(dtype).cuda(0)
z = th.rand((4, 1, 5,5)).type(dtype)
z = th.distributions.Poisson(z/noise_param).sample().cuda(0)
model = PoisNet(stages=5, output_features=64).double().cuda(0)
grad_output = th.randn_like(x).type_as(x).cuda(0)
# x grad
x_numgrad = th.zeros_like(x).view(-1)
perturb = th.zeros_like(x).view(-1)
cost = lambda input: cost_poisNet(input, z, model, grad_output)
for k in range(0,x.numel()):
cur_x = float(x.view(-1)[k])
perturb[k] = epsilon if epsilon < cur_x else cur_x/2
loss1 = cost(x.view(-1).add( perturb).view(x.size()))
loss2 = cost(x.view(-1).add(-perturb).view(x.size()))
x_numgrad[k] = (loss1-loss2)/(2*perturb[k])
perturb[k] = 0
x_numgrad = x_numgrad.view(x.size()).cuda(0)
# compute errors
x.requires_grad = True
z.requires_grad = False
y = model(x,z)
y.backward(grad_output)
err_x = th.norm(x.grad.data.view(-1) - x_numgrad.view(-1))/\
th.norm(x.grad.data.view(-1) + x_numgrad.view(-1))
return err_x, x.grad.data, x_numgrad
def cost_poisLikelihoodFunc(x,z, weights):
out = poisLikelihoodFunc(x,z)
return out.mul(weights).sum()
def cost_condFunc(x,z, weights):
out = condFunc(x,z)
return out.mul(weights).sum()
def cost_dCond_dAlphaFunc(x,z,alpha, weights):
out = dCond_dAlphaFunc(x,z,alpha)
return out.mul(weights).sum()
def cost_projFunc(x,z,alpha, weights):
out = projFunc(x,z,alpha)
return out.mul(weights).sum()
def cost_alphaFunc(x,z, weights):
out = alphaFunc(x,z)
return out.mul(weights).sum()
def cost_poisProx(x,z, weights):
out = poisProx(x,z)
return out.mul(weights).sum()
def cost_poisNet(x,z,model, weights):
out = model(x,z)
return out.mul(weights).sum()
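The checkers above all follow the same recipe: perturb one element at a time, take a central difference, and compare against autograd via the relative error ||g_analytic - g_num|| / ||g_analytic + g_num||. A dependency-free sketch of that recipe, using plain Python lists instead of tensors and a quadratic toy cost so the exact gradient is known:

```python
import math

def central_diff(f, x, eps=1e-6):
    # numeric gradient via central differences, one element at a time,
    # mirroring the per-element loop used in the gradient checkers above
    grads = []
    for k in range(len(x)):
        up = list(x); up[k] += eps
        dn = list(x); dn[k] -= eps
        grads.append((f(up) - f(dn)) / (2 * eps))
    return grads

def relative_error(analytic, numeric):
    # the err_x formula above: ||a - n|| / ||a + n||
    diff = math.sqrt(sum((a - n) ** 2 for a, n in zip(analytic, numeric)))
    norm = math.sqrt(sum((a + n) ** 2 for a, n in zip(analytic, numeric)))
    return diff / norm

x = [1.0, 2.0, 3.0]
numeric = central_diff(lambda v: sum(t ** 2 for t in v), x)  # cost = sum(x_i^2)
analytic = [2.0 * t for t in x]                              # exact gradient: 2x
print(relative_error(analytic, numeric) < 1e-6)
```

A small relative error (well below 1e-6 in double precision) means the analytic backward pass agrees with the numeric one; the `err_x` values returned by the checkers above are read the same way.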
| 31.309456 | 93 | 0.604558 | 1,774 | 10,927 | 3.598647 | 0.055806 | 0.06344 | 0.032895 | 0.042293 | 0.898496 | 0.886122 | 0.886122 | 0.873434 | 0.868734 | 0.843985 | 0 | 0.033459 | 0.223209 | 10,927 | 348 | 94 | 31.399425 | 0.718662 | 0.020042 | 0 | 0.747748 | 0 | 0 | 0.011788 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.063063 | false | 0 | 0.013514 | 0 | 0.13964 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d1e7e5e6c21706789bb79badac98910e6d494f4c | 9,689 | py | Python | tests/test_account.py | sanketsaurav/qds-sdk-py | be9aefbd571afbdb4d38c5a3d9ac80a4a024f5e5 | [
"Apache-2.0"
] | 42 | 2015-01-21T17:34:58.000Z | 2021-12-13T15:08:42.000Z | tests/test_account.py | sanketsaurav/qds-sdk-py | be9aefbd571afbdb4d38c5a3d9ac80a4a024f5e5 | [
"Apache-2.0"
] | 157 | 2015-01-18T00:14:38.000Z | 2021-07-16T08:20:51.000Z | tests/test_account.py | sanketsaurav/qds-sdk-py | be9aefbd571afbdb4d38c5a3d9ac80a4a024f5e5 | [
"Apache-2.0"
] | 123 | 2015-01-14T10:38:11.000Z | 2021-06-23T20:16:21.000Z | from __future__ import print_function
import sys
import os
if sys.version_info > (2, 7, 0):
import unittest
else:
import unittest2 as unittest
from mock import Mock
sys.path.append(os.path.join(os.path.dirname(__file__), '../bin'))
import qds
from qds_sdk.connection import Connection
from test_base import print_command
from test_base import QdsCliTestCase
class TestAccountCreate(QdsCliTestCase):
def test_all(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--location', 's3://bucket/path',
'--storage-access-key', 'dummy',
'--storage-secret-key', 'dummy',
'--compute-access-key', 'dummy',
'--compute-secret-key', 'dummy',
'--previous-account-plan', 'true',
'--aws-region', 'us-east-1']
print_command()
Connection._api_call = Mock(return_value={})
qds.main()
Connection._api_call.assert_called_with("POST", "account", {'account': {
'name': 'new_account',
'acc_key': 'dummy',
'level': 'free',
'compute_type': 'CUSTOMER_MANAGED',
'aws_region': 'us-east-1',
'storage_type': 'CUSTOMER_MANAGED',
'CacheQuotaSizeInGB': '25',
'secret': 'dummy',
'use_previous_account_plan': 'true',
'compute_secret_key': 'dummy',
'compute_access_key': 'dummy',
'defloc': 's3://bucket/path'}})
def test_no_name(self):
sys.argv = ['qds.py', 'account', 'create',
'--location', 's3://bucket/path',
'--storage-access-key', 'dummy',
'--storage-secret-key', 'dummy',
'--compute-access-key', 'dummy',
'--compute-secret-key', 'dummy',
'--previous-account-plan', 'true',
'--aws-region', 'us-east-1']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
def test_no_storage_acc_key(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--location', 's3://bucket/path',
'--storage-secret-key', 'dummy',
'--compute-access-key', 'dummy',
'--compute-secret-key', 'dummy',
'--previous-account-plan', 'true',
'--aws-region', 'us-east-1']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
def test_no_storage_secret_key(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--location', 's3://bucket/path',
'--storage-access-key', 'dummy',
'--compute-access-key', 'dummy',
'--compute-secret-key', 'dummy',
'--previous-account-plan', 'true',
'--aws-region', 'us-east-1']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
def test_no_aws_region(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--location', 's3://bucket/path',
'--storage-access-key', 'dummy',
'--storage-secret-key', 'dummy',
'--compute-access-key', 'dummy',
'--compute-secret-key', 'dummy',
'--previous-account-plan', 'true']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
def test_invalid_region(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--location', 's3://bucket/path',
'--storage-access-key', 'dummy',
'--storage-secret-key', 'dummy',
'--compute-access-key', 'dummy',
'--compute-secret-key', 'dummy',
'--previous-account-plan', 'true',
'--aws-region', 'non-existent']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
def test_no_compute_acc_key(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--location', 's3://bucket/path',
'--storage-access-key', 'dummy',
'--storage-secret-key', 'dummy',
'--compute-secret-key', 'dummy',
'--previous-account-plan', 'true',
'--aws-region', 'us-east-1']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
def test_no_compute_secret_key(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--location', 's3://bucket/path',
'--storage-access-key', 'dummy',
'--storage-secret-key', 'dummy',
'--compute-access-key', 'dummy',
'--previous-account-plan', 'true',
'--aws-region', 'us-east-1']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
def test_no_location(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--storage-access-key', 'dummy',
'--storage-secret-key', 'dummy',
'--compute-access-key', 'dummy',
'--compute-secret-key', 'dummy',
'--previous-account-plan', 'true',
'--aws-region', 'us-east-1']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
def test_default_previous_account_plan(self):
sys.argv = ['qds.py', 'account', 'create',
'--name', 'new_account',
'--location', 's3://bucket/path',
'--storage-access-key', 'dummy',
'--storage-secret-key', 'dummy',
'--compute-access-key', 'dummy',
'--compute-secret-key', 'dummy',
'--aws-region', 'us-east-1']
print_command()
Connection._api_call = Mock(return_value={})
qds.main()
Connection._api_call.assert_called_with("POST", "account", {'account': {
'name': 'new_account',
'acc_key': 'dummy',
'level': 'free',
'compute_type': 'CUSTOMER_MANAGED',
'aws_region': 'us-east-1',
'storage_type': 'CUSTOMER_MANAGED',
'CacheQuotaSizeInGB': '25',
'secret': 'dummy',
'use_previous_account_plan': 'false',
'compute_secret_key': 'dummy',
'compute_access_key': 'dummy',
'defloc': 's3://bucket/path'}})
class TestAccountBranding(QdsCliTestCase):
def test_logo(self):
sys.argv = ['qds.py', 'account', 'branding',
'--account-id', '4',
'--logo-uri', 'https://www.xyz.com/image.jpg']
print_command()
Connection._api_call = Mock(return_value={})
qds.main()
Connection._api_call.assert_called_with("PUT", "accounts/branding", {'logo': {
'logo_uri' : 'https://www.xyz.com/image.jpg'},
'account_id' : '4'})
def test_link(self):
sys.argv = ['qds.py', 'account', 'branding',
'--account-id', '4',
'--link-url', 'https://www.xyz.com',
'--link-label', 'Documentation']
print_command()
Connection._api_call = Mock(return_value={})
qds.main()
Connection._api_call.assert_called_with("PUT", "accounts/branding", {'link': {
'link_url' : 'https://www.xyz.com',
'link_label' : 'Documentation'},
'account_id' : '4'})
def test_logo_link(self):
sys.argv = ['qds.py', 'account', 'branding',
'--account-id', '4',
'--logo-uri', 'https://www.xyz.com/image.jpg',
'--link-url', 'https://www.xyz.com',
'--link-label', 'Documentation']
print_command()
Connection._api_call = Mock(return_value={})
qds.main()
Connection._api_call.assert_called_with("PUT", "accounts/branding", {'logo': {
'logo_uri' : 'https://www.xyz.com/image.jpg'},
'link': {'link_url' : 'https://www.xyz.com',
'link_label' : 'Documentation'},
'account_id' : '4'})
def test_without_account_id(self):
sys.argv = ['qds.py', 'account', 'branding',
'--logo-uri', 'https://www.xyz.com/image.jpg',
'--link-url', 'https://www.xyz.com',
'--link-label', 'Documentation']
print_command()
Connection._api_call = Mock(return_value={})
with self.assertRaises(SystemExit):
qds.main()
if __name__ == '__main__':
unittest.main()
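The tests above all share one pattern: stub out `Connection._api_call` with a `Mock`, drive `qds.main()` through `sys.argv`, then assert on the recorded call. A minimal, self-contained sketch of that Mock pattern (the `api_call` name and payload here are illustrative, not part of the SDK):

```python
from unittest.mock import Mock

# stand-in for Connection._api_call, returning an empty API response
api_call = Mock(return_value={})

# the code under test would issue the request; we simulate one call
api_call("POST", "account", {"account": {"name": "new_account"}})

# the assertion style used by the tests above: method, endpoint, payload
api_call.assert_called_with("POST", "account", {"account": {"name": "new_account"}})
print(api_call.call_count)
```

`assert_called_with` raises `AssertionError` on any mismatch in the positional arguments, which is why the tests above can compare whole request dictionaries in one line.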
| 40.539749 | 86 | 0.491898 | 930 | 9,689 | 4.930108 | 0.112903 | 0.073282 | 0.061069 | 0.042748 | 0.89313 | 0.889422 | 0.889422 | 0.876336 | 0.876336 | 0.876336 | 0 | 0.005444 | 0.336464 | 9,689 | 238 | 87 | 40.710084 | 0.707731 | 0 | 0 | 0.821101 | 0 | 0 | 0.322634 | 0.026525 | 0 | 0 | 0 | 0 | 0.06422 | 1 | 0.06422 | false | 0 | 0.045872 | 0 | 0.119266 | 0.073395 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ae082f534eaf001379a3dd003f5f1094360e1455 | 173 | py | Python | quart_cdi/utils.py | WatanukiRasadar/quart_cdi | d7ac1ed2314453e0bcfbdb841c755070ed2dcaf6 | [
"MIT"
] | null | null | null | quart_cdi/utils.py | WatanukiRasadar/quart_cdi | d7ac1ed2314453e0bcfbdb841c755070ed2dcaf6 | [
"MIT"
] | null | null | null | quart_cdi/utils.py | WatanukiRasadar/quart_cdi | d7ac1ed2314453e0bcfbdb841c755070ed2dcaf6 | [
"MIT"
] | null | null | null | from importlib import import_module
def require(code_path: str):
module_path, var_name = code_path.split(':')
return getattr(import_module(module_path), var_name)
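`require` resolves a `"module.path:attribute"` string to the attribute object itself. A runnable example against a standard-library target:

```python
from importlib import import_module

def require(code_path: str):
    # split "package.module:attr" and pull attr off the imported module
    module_path, var_name = code_path.split(':')
    return getattr(import_module(module_path), var_name)

sqrt = require('math:sqrt')
print(sqrt(9.0))  # prints 3.0
```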
| 24.714286 | 56 | 0.763006 | 25 | 173 | 4.96 | 0.56 | 0.193548 | 0.209677 | 0.274194 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138728 | 173 | 6 | 57 | 28.833333 | 0.832215 | 0 | 0 | 0 | 0 | 0 | 0.00578 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ae18aaf878cd5c824d500589ea348e6305f090b0 | 165 | py | Python | dataio/transformation/__init__.py | DragonRoar/deep-radiomics-glioma | 178cd2f7239a644741ed70848a67e752831b038b | [
"Apache-2.0"
] | 1 | 2022-01-25T08:20:57.000Z | 2022-01-25T08:20:57.000Z | dataio/transformation/__init__.py | DragonRoar/deep-radiomics-glioma | 178cd2f7239a644741ed70848a67e752831b038b | [
"Apache-2.0"
] | 1 | 2022-02-21T10:02:04.000Z | 2022-02-21T10:02:04.000Z | dataio/transformation/__init__.py | DragonRoar/deep-radiomics-glioma | 178cd2f7239a644741ed70848a67e752831b038b | [
"Apache-2.0"
] | 2 | 2021-06-18T04:31:10.000Z | 2022-03-24T05:09:39.000Z | from .transforms import RandomIntensityShiftScale
from .transforms import RandomHorizontalFlip
from .transforms import ToTensor
from .transforms import RandomRotate
| 33 | 49 | 0.878788 | 16 | 165 | 9.0625 | 0.4375 | 0.386207 | 0.551724 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09697 | 165 | 4 | 50 | 41.25 | 0.973154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ae20ade097dc4a4226174d588188575a9e66672d | 171 | py | Python | tests/data/python37.py | quic0/black | 8b340e210271a8108995fd479c55dbc0a34466bd | [
"MIT"
] | null | null | null | tests/data/python37.py | quic0/black | 8b340e210271a8108995fd479c55dbc0a34466bd | [
"MIT"
] | null | null | null | tests/data/python37.py | quic0/black | 8b340e210271a8108995fd479c55dbc0a34466bd | [
"MIT"
] | null | null | null | #!/usr/bin/env python3.7
def f():
return (i*2 async for i in arange(42))
# output
#!/usr/bin/env python3.7
def f():
return (i * 2 async for i in arange(42))
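This fixture exercises a Python 3.7 change: an asynchronous generator expression may appear in a plain `def`, not only inside a coroutine. A runnable sketch of what `f` produces, with a hypothetical `arange` async generator and a consumer:

```python
import asyncio

async def arange(n):
    # minimal async stand-in for the arange used in the fixture
    for i in range(n):
        yield i

def f(n):
    # an async generator expression returned from a regular function (3.7+)
    return (i * 2 async for i in arange(n))

async def collect(n):
    return [v async for v in f(n)]

print(asyncio.run(collect(5)))
```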
| 12.214286 | 44 | 0.596491 | 33 | 171 | 3.090909 | 0.484848 | 0.117647 | 0.176471 | 0.313725 | 0.941176 | 0.941176 | 0.941176 | 0.941176 | 0.941176 | 0.941176 | 0 | 0.076336 | 0.233918 | 171 | 13 | 45 | 13.153846 | 0.70229 | 0.309942 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 12 |
8819e021db4bf4d389e37cee1900ba8063577d9a | 14,554 | py | Python | SiameseFingerprint/inception_nn_model.py | TuongL94/MasterThesis | 25cc6e4d43d49777f28ac31ed3a5a0c6c7d90bf9 | [
"Apache-2.0"
] | 1 | 2019-11-15T03:24:18.000Z | 2019-11-15T03:24:18.000Z | SiameseFingerprint/inception_nn_model.py | TuongL94/MasterThesis | 25cc6e4d43d49777f28ac31ed3a5a0c6c7d90bf9 | [
"Apache-2.0"
] | null | null | null | SiameseFingerprint/inception_nn_model.py | TuongL94/MasterThesis | 25cc6e4d43d49777f28ac31ed3a5a0c6c7d90bf9 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Fri Mar 2 09:57:15 2018
@author: Tuong Lam
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf
def inception_a_block(input, training):
conv1_1 = tf.layers.conv2d(
inputs = input,
filters = 32,
kernel_size = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv1_1")
conv1_1 = tf.layers.batch_normalization(
conv1_1,
training = training,
name = "batch_norm_1_1",
reuse = tf.AUTO_REUSE)
conv1_1 = tf.nn.leaky_relu(
conv1_1)
conv2_1 = tf.layers.conv2d(
inputs = input,
filters = 32,
kernel_size = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv2_1")
conv2_1 = tf.layers.batch_normalization(
conv2_1,
training = training,
name = "batch_norm_2_1",
reuse = tf.AUTO_REUSE)
conv2_1 = tf.nn.leaky_relu(
conv2_1)
conv3_1 = tf.layers.conv2d(
inputs = input,
filters = 32,
kernel_size = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv3_1")
conv3_1 = tf.layers.batch_normalization(
conv3_1,
training = training,
name = "batch_norm_3_1",
reuse = tf.AUTO_REUSE)
conv3_1 = tf.nn.leaky_relu(
conv3_1)
max1 = tf.layers.max_pooling2d(inputs = input,
pool_size = [3,3],
strides = 1,
padding = "same")
conv4_1 = tf.layers.conv2d(
inputs = max1,
filters = 32,
kernel_size = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv4_1")
conv4_1 = tf.layers.batch_normalization(
conv4_1,
training = training,
name = "batch_norm_4_1",
reuse = tf.AUTO_REUSE)
conv4_1 = tf.nn.leaky_relu(
conv4_1)
conv2_2 = tf.layers.conv2d(
inputs = conv2_1,
filters = 32,
kernel_size = [3,3],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv2_2")
conv2_2 = tf.layers.batch_normalization(
conv2_2,
training = training,
name = "batch_norm_2_2",
reuse = tf.AUTO_REUSE)
conv2_2 = tf.nn.leaky_relu(
conv2_2)
conv3_2 = tf.layers.conv2d(
inputs = conv3_1,
filters = 32,
kernel_size = [3,3],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv3_2")
conv3_2 = tf.layers.batch_normalization(
conv3_2,
training = training,
name = "batch_norm_3_2",
reuse = tf.AUTO_REUSE)
conv3_2 = tf.nn.leaky_relu(
conv3_2)
conv3_3 = tf.layers.conv2d(
inputs = conv3_2,
filters = 32,
kernel_size = [3,3],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv3_3")
conv3_3 = tf.layers.batch_normalization(
conv3_3,
training = training,
name = "batch_norm_3_3",
reuse = tf.AUTO_REUSE)
conv3_3 = tf.nn.leaky_relu(
conv3_3)
output = tf.concat([conv1_1,conv4_1,conv2_2,conv3_3],axis=3)
return output
def inception_b_block(input, training):
conv1_1 = tf.layers.conv2d(
inputs = input,
filters = 32,
kernel_size = [1,1],
strides = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv1_1")
conv1_1 = tf.layers.batch_normalization(
conv1_1,
training = training,
name = "batch_norm_1_1",
reuse = tf.AUTO_REUSE)
conv1_1 = tf.nn.leaky_relu(
conv1_1)
conv2_1 = tf.layers.conv2d(
inputs = input,
filters = 32,
kernel_size = [1,1],
strides = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv2_1")
conv2_1 = tf.layers.batch_normalization(
conv2_1,
training = training,
name = "batch_norm_2_1",
reuse = tf.AUTO_REUSE)
conv2_1 = tf.nn.leaky_relu(
conv2_1)
conv3_1 = tf.layers.conv2d(
inputs = input,
filters = 32,
kernel_size = [1,1],
strides = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv3_1")
conv3_1 = tf.layers.batch_normalization(
conv3_1,
training = training,
name = "batch_norm_3_1",
reuse = tf.AUTO_REUSE)
conv3_1 = tf.nn.leaky_relu(
conv3_1)
conv1_2 = tf.layers.conv2d(
inputs = conv1_1,
filters = 64,
kernel_size = [3,3],
strides = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv1_2")
conv1_2 = tf.layers.batch_normalization(
conv1_2,
training = training,
name = "batch_norm_1_2",
reuse = tf.AUTO_REUSE)
conv1_2 = tf.nn.leaky_relu(
conv1_2)
conv2_2 = tf.layers.conv2d(
inputs = conv2_1,
filters = 64,
kernel_size = [3,3],
strides = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv2_2")
conv2_2 = tf.layers.batch_normalization(
conv2_2,
training = training,
name = "batch_norm_2_2",
reuse = tf.AUTO_REUSE)
conv2_2 = tf.nn.leaky_relu(
conv2_2)
conv3_2 = tf.layers.conv2d(
inputs = conv3_1,
filters = 64,
kernel_size = [3,3],
strides = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv3_2")
conv3_2 = tf.layers.batch_normalization(
conv3_2,
training = training,
name = "batch_norm_3_2",
reuse = tf.AUTO_REUSE)
conv3_2 = tf.nn.leaky_relu(
conv3_2)
output = tf.concat([conv1_2,conv2_2,conv3_2],axis=3)
return output
def reduction_1_block(input, training):
max1_1 = tf.layers.max_pooling2d(inputs = input,
pool_size = [3,3],
strides = 2)
conv1_1 = tf.layers.conv2d(
inputs = input,
filters = 64,
kernel_size = [3,3],
strides = [2,2],
padding = "valid",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv1_1")
conv1_1 = tf.layers.batch_normalization(
conv1_1,
training = training,
name = "batch_norm_1_1",
reuse = tf.AUTO_REUSE)
conv1_1 = tf.nn.leaky_relu(
conv1_1)
conv2_1 = tf.layers.conv2d(
inputs = input,
filters = 64,
kernel_size = [1,1],
strides = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv2_1")
conv2_1 = tf.layers.batch_normalization(
conv2_1,
training = training,
name = "batch_norm_2_1",
reuse = tf.AUTO_REUSE)
conv2_1 = tf.nn.leaky_relu(
conv2_1)
conv2_2 = tf.layers.conv2d(
inputs = conv2_1,
filters = 64,
kernel_size = [3,3],
strides = [1,1],
padding = "same",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv2_2")
conv2_2 = tf.layers.batch_normalization(
conv2_2,
training = training,
name = "batch_norm_2_2",
reuse = tf.AUTO_REUSE)
conv2_2 = tf.nn.leaky_relu(
conv2_2)
conv3_2 = tf.layers.conv2d(
inputs = conv2_2,
filters = 64,
kernel_size = [3,3],
strides = [2,2],
padding = "valid",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv3_2")
conv3_2 = tf.layers.batch_normalization(
conv3_2,
training = training,
name = "batch_norm_3_2",
reuse = tf.AUTO_REUSE)
conv3_2 = tf.nn.leaky_relu(
conv3_2)
output = tf.concat([conv1_1,max1_1,conv3_2],axis=3)
return output
def stem(input, training):
output = tf.layers.batch_normalization(
input,
training = training,
name = "batch_norm_1",
reuse = tf.AUTO_REUSE)
# Convolutional layer 1
output = tf.layers.conv2d(
inputs = output,
filters = 16,
kernel_size = [7,7],
strides = [1,1],
padding = "valid",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv1")
output = tf.layers.batch_normalization(
output,
training = training,
name = "batch_norm_2",
reuse = tf.AUTO_REUSE)
output = tf.nn.leaky_relu(
output)
output = tf.layers.dropout(
output,
rate = 0.5,
training = training,
seed = 1)
# Pooling layer 1
output = tf.layers.max_pooling2d(inputs = output,
pool_size = [2,2],
strides = 2)
# Convolutional Layer 2
output = tf.layers.conv2d(
inputs = output,
filters = 16,
kernel_size = [5,5],
strides = [1,1],
padding = "valid",
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="conv2")
output = tf.layers.batch_normalization(
output,
training = training,
name = "batch_norm_3",
reuse = tf.AUTO_REUSE)
output = tf.nn.leaky_relu(
output)
output = tf.layers.dropout(
output,
rate = 0.5,
training = training,
seed = 2)
# Pooling layer 2
output = tf.layers.max_pooling2d(
inputs = output,
pool_size = [2,2],
strides = 2)
# # Convolutional Layer 3
# output = tf.layers.conv2d(
# inputs = output,
# filters = 32,
# kernel_size = [3,3],
# strides = [1,1],
# padding = "same",
# reuse = tf.AUTO_REUSE,
# kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
# name="conv3")
#
# output = tf.layers.batch_normalization(
# output,
# training = training,
# name = "batch_norm_4",
# reuse = tf.AUTO_REUSE)
#
# output = tf.nn.leaky_relu(
# output)
#
# output = tf.layers.dropout(
# output,
# rate = 0.5,
# training = training,
# seed = 3)
# # Convolutional Layer 4
# output = tf.layers.conv2d(
# inputs = output,
# filters = 64,
# kernel_size = [3,3],
# strides = [1,1],
# padding = "same",
# reuse = tf.AUTO_REUSE,
# kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
# name="conv4")
#
# output = tf.layers.batch_normalization(
# output,
# training = training,
# name = "batch_norm_5",
# reuse = tf.AUTO_REUSE)
#
# output = tf.nn.leaky_relu(
# output)
#
# output = tf.layers.dropout(
# output,
# rate = 0.5,
# training = training,
# seed = 4)
return output
def inference(input, training = True):
output = stem(input, training)
with tf.variable_scope("inception_1"):
output = inception_a_block(output, training)
with tf.variable_scope("inception_2"):
output = inception_a_block(output, training)
with tf.variable_scope("inception_3"):
output = inception_a_block(output, training)
with tf.variable_scope("reduction_1"):
output = reduction_1_block(output, training)
with tf.variable_scope("inception_4"):
output = inception_b_block(output, training)
output = tf.layers.max_pooling2d(inputs = output,
pool_size = [2,2],
strides = 2)
output = tf.layers.flatten(output)
output = tf.layers.dense(
output,
512,
reuse = tf.AUTO_REUSE,
kernel_regularizer = tf.contrib.layers.l2_regularizer(0.3),
name="dense")
output = tf.nn.leaky_relu(
output)
output = tf.nn.l2_normalize(
output,
axis=1)
return output | 28.205426 | 72 | 0.507489 | 1,603 | 14,554 | 4.366812 | 0.059888 | 0.061714 | 0.069143 | 0.100571 | 0.910143 | 0.855857 | 0.831857 | 0.814286 | 0.802143 | 0.802143 | 0 | 0.062148 | 0.389721 | 14,554 | 516 | 73 | 28.205426 | 0.725963 | 0.101553 | 0 | 0.821138 | 0 | 0 | 0.042038 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.01355 | false | 0 | 0.01084 | 0 | 0.03794 | 0.00271 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
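The shape bookkeeping through `inference` can be checked by hand. Assuming a 256x256 input (the size used by the accompanying CLI examples), only the VALID convolutions and poolings shrink the spatial extent, and each concat on axis 3 sums the branch filter counts — a sketch:

```python
def conv_valid(n, k, s=1):
    # spatial size after a VALID conv or pool with window k, stride s
    return (n - k) // s + 1

n = 256                    # assumed input height/width
n = conv_valid(n, 7)       # stem conv1, 7x7 valid          -> 250
n = conv_valid(n, 2, 2)    # stem pool1, 2x2 stride 2       -> 125
n = conv_valid(n, 5)       # stem conv2, 5x5 valid          -> 121
n = conv_valid(n, 2, 2)    # stem pool2                     -> 60
# the three inception_a blocks use "same" padding: size unchanged
n = conv_valid(n, 3, 2)    # reduction_1, 3x3 stride 2      -> 29
n = conv_valid(n, 2, 2)    # final pool before flatten      -> 14

ch = 32 * 4                # inception_a concat: four 32-filter branches   -> 128
ch = 64 + ch + 64          # reduction_1: conv(64) + pass-through pool + conv(64) -> 256
ch = 64 * 3                # inception_b concat: three 64-filter branches  -> 192
print(n, ch)
```

So the dense layer sees a flattened 14 x 14 x 192 volume before projecting to the 512-dimensional embedding.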
8837ce0bb66ea3ce0a35c1a229f6a3f6a40d6d94 | 3,109 | py | Python | cli_examples.py | stuffgora/tf-2-keras-enet-with-watercolor | 3d46306c3f79dfda34a92e4c6fbe104752711490 | [
"Apache-2.0"
] | null | null | null | cli_examples.py | stuffgora/tf-2-keras-enet-with-watercolor | 3d46306c3f79dfda34a92e4c6fbe104752711490 | [
"Apache-2.0"
] | null | null | null | cli_examples.py | stuffgora/tf-2-keras-enet-with-watercolor | 3d46306c3f79dfda34a92e4c6fbe104752711490 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
# Train loop
# for i in {0..50}
# do
#     let eps=20+$i*20
#     let i_ep=$i*20
#     python enet_keras_wc_train_val_pred.py --dataset_name camvid --model_name enet_wc_before_encoder_256x256 --image_height 256 --image_width 256 --num_classes 12 --batch_size 10 --val_batch_size 20 --repeat_train_ds 0 --steps 100 --wc_in_encoder 0 --train_flow 1 --predict_flow 0 --eval_flow 0 --epochs "$eps" --initial_epoch "$i_ep"
# done
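The commented loop above trains in 20-epoch increments, resuming each run from the previous one (`--initial_epoch` trails `--epochs` by 20). A dry-run sketch of the same arithmetic, with the flag list abbreviated for readability, is:

```shell
# Dry run: echo the command for each resume step instead of executing it.
# Remove "echo" (and restore the full flag list) to actually train.
for i in 0 1 2; do
    eps=$((20 + i * 20))    # epochs to train up to in this run
    i_ep=$((i * 20))        # epoch to resume from
    echo python enet_keras_wc_train_val_pred.py --epochs "$eps" --initial_epoch "$i_ep"
done
```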
# Predict {batch_size} images from test dataset with enet_wc_before_encoder_256x256
python enet_keras_wc_train_val_pred.py --dataset_name camvid --model_name enet_wc_before_encoder_256x256 --image_height 256 --image_width 256 --num_classes 12 --batch_size 10 --val_batch_size 20 --repeat_train_ds 0 --steps 100 --wc_in_encoder 0 --train_flow 0 --predict_flow 1 --eval_flow 0 --epochs 50
# Predict {batch_size} images from test dataset with enet_no_wc_256x256
python enet_keras_wc_train_val_pred.py --dataset_name camvid --model_name enet_no_wc_256x256 --image_height 256 --image_width 256 --num_classes 12 --batch_size 10 --val_batch_size 20 --repeat_train_ds 0 --steps 100 --train_flow 0 --predict_flow 1 --eval_flow 0 --epochs 50
# Predict {batch_size} images from test dataset with enet_wc_in_encoder_1_256x256
python enet_keras_wc_train_val_pred.py --dataset_name camvid --model_name enet_wc_in_encoder_1_256x256 --image_height 256 --image_width 256 --num_classes 12 --batch_size 10 --val_batch_size 20 --repeat_train_ds 0 --steps 100 --wc_in_encoder 1 --train_flow 0 --predict_flow 1 --eval_flow 0 --epochs 50
#1
python enet_keras_wc_train_val_pred.py --dataset_name camvid --model_name enet_wc_before_encoder_256x256_test_as_val --image_height 256 --image_width 256 --num_classes 12 --batch_size 10 --val_batch_size 20 --repeat_train_ds 0 --use_test_as_val_ds 1 --concat_ds 1 --steps 100 --wc_in_encoder 0 --train_flow 1 --predict_flow 0 --eval_flow 0 --epochs 50
#2
python enet_keras_wc_train_val_pred.py --dataset_name camvid --model_name enet_wc_before_encoder_256x256_test_as_val --image_height 256 --image_width 256 --num_classes 12 --batch_size 10 --val_batch_size 20 --repeat_train_ds 0 --use_test_as_val_ds 1 --concat_ds 1 --steps 300 --epochs 700 --initial_epoch 51 --wc_in_encoder 0 --train_flow 1 --predict_flow 0 --eval_flow 0
#3 loop : 10 runs for 200 epochs
python enet_keras_wc_train_val_pred.py --dataset_name camvid --model_name enet_wc_before_encoder_256x256_test_as_val --image_height 256 --image_width 256 --num_classes 12 --batch_size 10 --val_batch_size 20 --repeat_train_ds 0 --use_test_as_val_ds 1 --concat_ds 1 --steps 200 --epochs "$eps" --initial_epoch "$i_ep" --wc_in_encoder 0 --train_flow 1 --predict_flow 0 --eval_flow 0
#4
python enet_keras_wc_train_val_pred.py --dataset_name camvid --model_name enet_wc_before_encoder_256x256_test_as_val --image_height 256 --image_width 256 --num_classes 12 --batch_size 10 --val_batch_size 20 --repeat_train_ds 0 --use_test_as_val_ds 1 --concat_ds 1 --steps 200 --epochs 700 --initial_epoch 650 --wc_in_encoder 0 --train_flow 1 --predict_flow 0 --eval_flow 0
# ---------------------------------------------------------------------------
# python-opcua/opcua/server/standard_address_space/standard_address_space_part12.py
# repo: ssriblo/ionic-smarthome-test-1 (license: MIT)
# ---------------------------------------------------------------------------
# -*- coding: utf-8 -*-
"""
DO NOT EDIT THIS FILE!
It is automatically generated from opcfoundation.org schemas.
Date:2020-06-19 17:31:10.199368
"""
import datetime
from dateutil.tz import tzutc
from opcua import ua
from opcua.ua import NodeId, QualifiedName, NumericNodeId, StringNodeId, GuidNodeId
from opcua.ua import NodeClass, LocalizedText
def create_standard_address_space_Part12(server):
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12522, 0)
    node.BrowseName = QualifiedName('TrustListType', 0)
    node.NodeClass = NodeClass.ObjectType
    node.ParentNodeId = NumericNodeId(11575, 0)
    node.ReferenceTypeId = NumericNodeId(45, 0)
    attrs = ua.ObjectTypeAttributes()
    attrs.DisplayName = LocalizedText("TrustListType")
    attrs.IsAbstract = False
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12522, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12542, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12522, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(19296, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12522, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12543, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12522, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12546, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12522, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12548, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12522, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12550, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(45, 0)
    ref.SourceNodeId = NumericNodeId(12522, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(11575, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12542, 0)
    node.BrowseName = QualifiedName('LastUpdateTime', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12522, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("LastUpdateTime")
    attrs.DataType = NumericNodeId(294, 0)
    attrs.ValueRank = -1
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(12542, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12542, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12542, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12522, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(19296, 0)
    node.BrowseName = QualifiedName('UpdateFrequency', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12522, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("UpdateFrequency")
    attrs.DataType = NumericNodeId(290, 0)
    attrs.ValueRank = -1
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(19296, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(19296, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(80, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(19296, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12522, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12543, 0)
    node.BrowseName = QualifiedName('OpenWithMasks', 0)
    node.NodeClass = NodeClass.Method
    node.ParentNodeId = NumericNodeId(12522, 0)
    node.ReferenceTypeId = NumericNodeId(47, 0)
    attrs = ua.MethodAttributes()
    attrs.DisplayName = LocalizedText("OpenWithMasks")
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12543, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12544, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12543, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12545, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12543, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12543, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12522, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12544, 0)
    node.BrowseName = QualifiedName('InputArguments', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12543, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("InputArguments")
    attrs.DataType = NumericNodeId(296, 0)
    value = []
    extobj = ua.Argument()
    extobj.Name = 'Masks'
    extobj.DataType = NumericNodeId(7, 0)
    extobj.ValueRank = -1
    value.append(extobj)
    attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
    attrs.ValueRank = 1
    attrs.ArrayDimensions = [0]
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(12544, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12544, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12544, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12543, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12545, 0)
    node.BrowseName = QualifiedName('OutputArguments', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12543, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("OutputArguments")
    attrs.DataType = NumericNodeId(296, 0)
    value = []
    extobj = ua.Argument()
    extobj.Name = 'FileHandle'
    extobj.DataType = NumericNodeId(7, 0)
    extobj.ValueRank = -1
    value.append(extobj)
    attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
    attrs.ValueRank = 1
    attrs.ArrayDimensions = [0]
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(12545, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12545, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12545, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12543, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12546, 0)
    node.BrowseName = QualifiedName('CloseAndUpdate', 0)
    node.NodeClass = NodeClass.Method
    node.ParentNodeId = NumericNodeId(12522, 0)
    node.ReferenceTypeId = NumericNodeId(47, 0)
    attrs = ua.MethodAttributes()
    attrs.DisplayName = LocalizedText("CloseAndUpdate")
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12546, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12705, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12546, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12547, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12546, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(80, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12546, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12522, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12705, 0)
    node.BrowseName = QualifiedName('InputArguments', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12546, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("InputArguments")
    attrs.DataType = NumericNodeId(296, 0)
    value = []
    extobj = ua.Argument()
    extobj.Name = 'FileHandle'
    extobj.DataType = NumericNodeId(7, 0)
    extobj.ValueRank = -1
    value.append(extobj)
    attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
    attrs.ValueRank = 1
    attrs.ArrayDimensions = [0]
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(12705, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12705, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12705, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12546, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12547, 0)
    node.BrowseName = QualifiedName('OutputArguments', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12546, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("OutputArguments")
    attrs.DataType = NumericNodeId(296, 0)
    value = []
    extobj = ua.Argument()
    extobj.Name = 'ApplyChangesRequired'
    extobj.DataType = NumericNodeId(1, 0)
    extobj.ValueRank = -1
    value.append(extobj)
    attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
    attrs.ValueRank = 1
    attrs.ArrayDimensions = [0]
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(12547, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12547, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12547, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12546, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12548, 0)
    node.BrowseName = QualifiedName('AddCertificate', 0)
    node.NodeClass = NodeClass.Method
    node.ParentNodeId = NumericNodeId(12522, 0)
    node.ReferenceTypeId = NumericNodeId(47, 0)
    attrs = ua.MethodAttributes()
    attrs.DisplayName = LocalizedText("AddCertificate")
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12548, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12549, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12548, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(80, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12548, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12522, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12549, 0)
    node.BrowseName = QualifiedName('InputArguments', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12548, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("InputArguments")
    attrs.DataType = NumericNodeId(296, 0)
    value = []
    extobj = ua.Argument()
    extobj.Name = 'Certificate'
    extobj.DataType = NumericNodeId(15, 0)
    extobj.ValueRank = -1
    value.append(extobj)
    extobj = ua.Argument()
    extobj.Name = 'IsTrustedCertificate'
    extobj.DataType = NumericNodeId(1, 0)
    extobj.ValueRank = -1
    value.append(extobj)
    attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
    attrs.ValueRank = 1
    attrs.ArrayDimensions = [0]
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(12549, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12549, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12549, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12548, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12550, 0)
    node.BrowseName = QualifiedName('RemoveCertificate', 0)
    node.NodeClass = NodeClass.Method
    node.ParentNodeId = NumericNodeId(12522, 0)
    node.ReferenceTypeId = NumericNodeId(47, 0)
    attrs = ua.MethodAttributes()
    attrs.DisplayName = LocalizedText("RemoveCertificate")
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12550, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12551, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12550, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(80, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12550, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12522, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12551, 0)
    node.BrowseName = QualifiedName('InputArguments', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12550, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("InputArguments")
    attrs.DataType = NumericNodeId(296, 0)
    value = []
    extobj = ua.Argument()
    extobj.Name = 'Thumbprint'
    extobj.DataType = NumericNodeId(12, 0)
    extobj.ValueRank = -1
    value.append(extobj)
    extobj = ua.Argument()
    extobj.Name = 'IsTrustedCertificate'
    extobj.DataType = NumericNodeId(1, 0)
    extobj.ValueRank = -1
    value.append(extobj)
    attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
    attrs.ValueRank = 1
    attrs.ArrayDimensions = [0]
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(12551, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12551, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12551, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12550, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12552, 0)
    node.BrowseName = QualifiedName('TrustListMasks', 0)
    node.NodeClass = NodeClass.DataType
    node.ParentNodeId = NumericNodeId(29, 0)
    node.ReferenceTypeId = NumericNodeId(45, 0)
    attrs = ua.DataTypeAttributes()
    attrs.DisplayName = LocalizedText("TrustListMasks")
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12552, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12553, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(45, 0)
    ref.SourceNodeId = NumericNodeId(12552, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(29, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12553, 0)
    node.BrowseName = QualifiedName('EnumValues', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(12552, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("EnumValues")
    attrs.DataType = NumericNodeId(7594, 0)
    value = []
    extobj = ua.EnumValueType()
    extobj.Value = 0
    extobj.DisplayName.Text = 'None'
    value.append(extobj)
    extobj = ua.EnumValueType()
    extobj.Value = 1
    extobj.DisplayName.Text = 'TrustedCertificates'
    value.append(extobj)
    extobj = ua.EnumValueType()
    extobj.Value = 2
    extobj.DisplayName.Text = 'TrustedCrls'
    value.append(extobj)
    extobj = ua.EnumValueType()
    extobj.Value = 4
    extobj.DisplayName.Text = 'IssuerCertificates'
    value.append(extobj)
    extobj = ua.EnumValueType()
    extobj.Value = 8
    extobj.DisplayName.Text = 'IssuerCrls'
    value.append(extobj)
    extobj = ua.EnumValueType()
    extobj.Value = 15
    extobj.DisplayName.Text = 'All'
    value.append(extobj)
    attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
    attrs.ValueRank = 1
    attrs.ArrayDimensions = [0]
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(12553, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(12553, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12553, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(12552, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12554, 0)
    node.BrowseName = QualifiedName('TrustListDataType', 0)
    node.NodeClass = NodeClass.DataType
    node.ParentNodeId = NumericNodeId(22, 0)
    node.ReferenceTypeId = NumericNodeId(45, 0)
    attrs = ua.DataTypeAttributes()
    attrs.DisplayName = LocalizedText("TrustListDataType")
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(45, 0)
    ref.SourceNodeId = NumericNodeId(12554, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(22, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(19297, 0)
    node.BrowseName = QualifiedName('TrustListOutOfDateAlarmType', 0)
    node.NodeClass = NodeClass.ObjectType
    node.ParentNodeId = NumericNodeId(11753, 0)
    node.ReferenceTypeId = NumericNodeId(45, 0)
    attrs = ua.ObjectTypeAttributes()
    attrs.DisplayName = LocalizedText("TrustListOutOfDateAlarmType")
    attrs.IsAbstract = False
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(19297, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(19446, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(19297, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(19447, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(19297, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(19448, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(45, 0)
    ref.SourceNodeId = NumericNodeId(19297, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(11753, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(19446, 0)
    node.BrowseName = QualifiedName('TrustListId', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(19297, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("TrustListId")
    attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
    attrs.ValueRank = -1
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(19446, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(19446, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(19446, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(19297, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(19447, 0)
    node.BrowseName = QualifiedName('LastUpdateTime', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(19297, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("LastUpdateTime")
    attrs.DataType = NumericNodeId(294, 0)
    attrs.ValueRank = -1
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(19447, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(19447, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(19447, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(19297, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(19448, 0)
    node.BrowseName = QualifiedName('UpdateFrequency', 0)
    node.NodeClass = NodeClass.Variable
    node.ParentNodeId = NumericNodeId(19297, 0)
    node.ReferenceTypeId = NumericNodeId(46, 0)
    node.TypeDefinition = NumericNodeId(68, 0)
    attrs = ua.VariableAttributes()
    attrs.DisplayName = LocalizedText("UpdateFrequency")
    attrs.DataType = NumericNodeId(290, 0)
    attrs.ValueRank = -1
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(40, 0)
    ref.SourceNodeId = NumericNodeId(19448, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(68, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(37, 0)
    ref.SourceNodeId = NumericNodeId(19448, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(78, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(19448, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(19297, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(12555, 0)
    node.BrowseName = QualifiedName('CertificateGroupType', 0)
    node.NodeClass = NodeClass.ObjectType
    node.ParentNodeId = NumericNodeId(58, 0)
    node.ReferenceTypeId = NumericNodeId(45, 0)
    attrs = ua.ObjectTypeAttributes()
    attrs.DisplayName = LocalizedText("CertificateGroupType")
    attrs.IsAbstract = False
    node.NodeAttributes = attrs
    server.add_nodes([node])
    refs = []
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12555, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(13599, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(46, 0)
    ref.SourceNodeId = NumericNodeId(12555, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(13631, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12555, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(19450, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = True
    ref.ReferenceTypeId = NumericNodeId(47, 0)
    ref.SourceNodeId = NumericNodeId(12555, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(20143, 0)
    refs.append(ref)
    ref = ua.AddReferencesItem()
    ref.IsForward = False
    ref.ReferenceTypeId = NumericNodeId(45, 0)
    ref.SourceNodeId = NumericNodeId(12555, 0)
    ref.TargetNodeClass = NodeClass.DataType
    ref.TargetNodeId = NumericNodeId(58, 0)
    refs.append(ref)
    server.add_references(refs)
    node = ua.AddNodesItem()
    node.RequestedNewNodeId = NumericNodeId(13599, 0)
    node.BrowseName = QualifiedName('TrustList', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12555, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12522, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("TrustList")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13600, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13601, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13602, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13603, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13605, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13608, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13610, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13613, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13615, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13618, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13620, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13621, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12522, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13599, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13600, 0)
node.BrowseName = QualifiedName('Size', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Size")
attrs.DataType = NumericNodeId(9, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13600, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13600, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13600, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13601, 0)
node.BrowseName = QualifiedName('Writable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Writable")
attrs.DataType = NumericNodeId(1, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13601, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13601, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13601, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13602, 0)
node.BrowseName = QualifiedName('UserWritable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("UserWritable")
attrs.DataType = NumericNodeId(1, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13602, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13602, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13602, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13603, 0)
node.BrowseName = QualifiedName('OpenCount', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OpenCount")
attrs.DataType = NumericNodeId(5, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13603, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13603, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13603, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13605, 0)
node.BrowseName = QualifiedName('Open', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Open")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13605, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13606, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13605, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13607, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13605, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13605, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13606, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13605, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Mode'
extobj.DataType = NumericNodeId(3, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13606, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13606, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13606, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13605, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13607, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13605, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13607, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13607, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13607, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13605, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13608, 0)
node.BrowseName = QualifiedName('Close', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Close")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13608, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13609, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13608, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13608, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13609, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13608, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13609, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13609, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13609, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13608, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13610, 0)
node.BrowseName = QualifiedName('Read', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Read")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13610, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13611, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13610, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13612, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13610, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13610, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13611, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13610, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Length'
extobj.DataType = NumericNodeId(6, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13611, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13611, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13611, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13610, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13612, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13610, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13612, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13612, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13612, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13610, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13613, 0)
node.BrowseName = QualifiedName('Write', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Write")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13613, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13614, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13613, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13613, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13614, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13613, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13614, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13614, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13614, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13613, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13615, 0)
node.BrowseName = QualifiedName('GetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("GetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13615, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13616, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13615, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13617, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13615, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13615, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13616, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13615, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13616, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13616, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13616, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13615, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13617, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13615, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13617, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13617, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13617, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13615, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13618, 0)
node.BrowseName = QualifiedName('SetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("SetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13618, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13619, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13618, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13618, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13619, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13618, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13619, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13619, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13619, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13618, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13620, 0)
node.BrowseName = QualifiedName('LastUpdateTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastUpdateTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13620, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13620, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13620, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13621, 0)
node.BrowseName = QualifiedName('OpenWithMasks', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13599, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("OpenWithMasks")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13621, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13622, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13621, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13623, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13621, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13621, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13599, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13622, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13621, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Masks'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13622, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13622, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13622, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13621, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13623, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13621, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13623, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13623, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13623, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13621, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13631, 0)
node.BrowseName = QualifiedName('CertificateTypes', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12555, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateTypes")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13631, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13631, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13631, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19450, 0)
node.BrowseName = QualifiedName('CertificateExpired', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12555, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(13225, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("CertificateExpired")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19451, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19452, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19453, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19454, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19455, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19456, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19458, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19459, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19460, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19461, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19464, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19465, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19466, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19467, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19476, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19478, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19480, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19482, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19483, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19484, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19485, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19487, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19505, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19509, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19518, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20101, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20138, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20139, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20141, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20142, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13225, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19450, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19451, 0)
node.BrowseName = QualifiedName('EventId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("EventId")
attrs.DataType = ua.NodeId(ua.ObjectIds.ByteString)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19451, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19451, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19451, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19452, 0)
node.BrowseName = QualifiedName('EventType', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("EventType")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19452, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19452, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19452, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19453, 0)
node.BrowseName = QualifiedName('SourceNode', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceNode")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19453, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19453, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19453, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19454, 0)
node.BrowseName = QualifiedName('SourceName', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceName")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19454, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19454, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19454, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19455, 0)
node.BrowseName = QualifiedName('Time', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Time")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19455, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19455, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19455, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19456, 0)
node.BrowseName = QualifiedName('ReceiveTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ReceiveTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19456, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19456, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19456, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19458, 0)
node.BrowseName = QualifiedName('Message', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Message")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19458, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19458, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19458, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19459, 0)
node.BrowseName = QualifiedName('Severity', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Severity")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19459, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19459, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19459, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19460, 0)
node.BrowseName = QualifiedName('ConditionClassId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ConditionClassId")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19460, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19460, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19460, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19461, 0)
node.BrowseName = QualifiedName('ConditionClassName', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ConditionClassName")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19461, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19461, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19461, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19464, 0)
node.BrowseName = QualifiedName('ConditionName', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ConditionName")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19464, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19464, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19464, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19465, 0)
node.BrowseName = QualifiedName('BranchId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("BranchId")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19465, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19465, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19465, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19466, 0)
node.BrowseName = QualifiedName('Retain', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Retain")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19466, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19466, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19466, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19467, 0)
node.BrowseName = QualifiedName('EnabledState', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(8995, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("EnabledState")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19467, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19468, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19467, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(8995, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19467, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19467, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19468, 0)
node.BrowseName = QualifiedName('Id', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19467, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Id")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19468, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19468, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19468, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19467, 0)
refs.append(ref)
server.add_references(refs)
# Quality: ConditionVariableType (i=9002) component holding a StatusCode
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19476, 0)
node.BrowseName = QualifiedName('Quality', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(9002, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Quality")
attrs.DataType = ua.NodeId(ua.ObjectIds.StatusCode)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19476, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19477, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19476, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(9002, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19476, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19476, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19477, 0)
node.BrowseName = QualifiedName('SourceTimestamp', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19476, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceTimestamp")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19477, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19477, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19477, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19476, 0)
refs.append(ref)
server.add_references(refs)
# LastSeverity: ConditionVariableType (i=9002) component holding a UInt16
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19478, 0)
node.BrowseName = QualifiedName('LastSeverity', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(9002, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastSeverity")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19478, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19479, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19478, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(9002, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19478, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19478, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19479, 0)
node.BrowseName = QualifiedName('SourceTimestamp', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19478, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceTimestamp")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19479, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19479, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19479, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19478, 0)
refs.append(ref)
server.add_references(refs)
# Comment: ConditionVariableType (i=9002) component holding a LocalizedText
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19480, 0)
node.BrowseName = QualifiedName('Comment', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(9002, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Comment")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19480, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19481, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19480, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(9002, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19480, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19480, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19481, 0)
node.BrowseName = QualifiedName('SourceTimestamp', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19480, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceTimestamp")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19481, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19481, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19481, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19480, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19482, 0)
node.BrowseName = QualifiedName('ClientUserId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ClientUserId")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19482, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19482, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19482, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
# Disable method of parent node i=19450, with an AlwaysGeneratesEvent (i=3065) reference
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19483, 0)
node.BrowseName = QualifiedName('Disable', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Disable")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(3065, 0)
ref.SourceNodeId = NumericNodeId(19483, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2803, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19483, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19483, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19484, 0)
node.BrowseName = QualifiedName('Enable', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Enable")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(3065, 0)
ref.SourceNodeId = NumericNodeId(19484, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2803, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19484, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19484, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
# AddComment method of parent node i=19450
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19485, 0)
node.BrowseName = QualifiedName('AddComment', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("AddComment")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19485, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19486, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(3065, 0)
ref.SourceNodeId = NumericNodeId(19485, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2829, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19485, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19485, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
# InputArguments property of AddComment: an array of Argument (i=296) extension objects
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19486, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19485, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'EventId'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
extobj.Description.Text = 'The identifier for the event to comment.'
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Comment'
extobj.DataType = NumericNodeId(21, 0)
extobj.ValueRank = -1
extobj.Description.Text = 'The comment to add to the condition.'
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19486, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19486, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19486, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19485, 0)
refs.append(ref)
server.add_references(refs)
# AckedState: TwoStateVariableType (i=8995) component of parent node i=19450
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19487, 0)
node.BrowseName = QualifiedName('AckedState', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(8995, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("AckedState")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19487, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19488, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19487, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(8995, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19487, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19487, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19488, 0)
node.BrowseName = QualifiedName('Id', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19487, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Id")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19488, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19488, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19488, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19487, 0)
refs.append(ref)
server.add_references(refs)
# Acknowledge method of parent node i=19450
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19505, 0)
node.BrowseName = QualifiedName('Acknowledge', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Acknowledge")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19505, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19506, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(3065, 0)
ref.SourceNodeId = NumericNodeId(19505, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(8944, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19505, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19505, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
# InputArguments property of Acknowledge: an array of Argument (i=296) extension objects
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19506, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19505, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'EventId'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
extobj.Description.Text = 'The identifier for the event to comment.'
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Comment'
extobj.DataType = NumericNodeId(21, 0)
extobj.ValueRank = -1
extobj.Description.Text = 'The comment to add to the condition.'
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19506, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19506, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19506, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19505, 0)
refs.append(ref)
server.add_references(refs)
# ActiveState: TwoStateVariableType (i=8995) component of parent node i=19450
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19509, 0)
node.BrowseName = QualifiedName('ActiveState', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(8995, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ActiveState")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19509, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19510, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19509, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(8995, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19509, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(19509, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19510, 0)
node.BrowseName = QualifiedName('Id', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19509, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Id")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19510, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19510, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19510, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19509, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(19518, 0)
node.BrowseName = QualifiedName('InputNode', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputNode")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(19518, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(19518, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(19518, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
# Boolean properties of parent node i=19450 follow (SuppressedOrShelved, then
# NormalState, ExpirationDate, CertificateType, Certificate)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20101, 0)
node.BrowseName = QualifiedName('SuppressedOrShelved', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SuppressedOrShelved")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20101, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20101, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20101, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20138, 0)
node.BrowseName = QualifiedName('NormalState', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("NormalState")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20138, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20138, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20138, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20139, 0)
node.BrowseName = QualifiedName('ExpirationDate', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ExpirationDate")
attrs.DataType = ua.NodeId(ua.ObjectIds.DateTime)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20139, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20139, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20139, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20141, 0)
node.BrowseName = QualifiedName('CertificateType', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateType")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20141, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20141, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20141, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20142, 0)
node.BrowseName = QualifiedName('Certificate', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(19450, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Certificate")
attrs.DataType = ua.NodeId(ua.ObjectIds.ByteString)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20142, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20142, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20142, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19450, 0)
refs.append(ref)
server.add_references(refs)
# TrustListOutOfDate condition object (i=20143) under node i=12555, followed by
# its standard event/condition property and component nodes (EventId, EventType,
# SourceNode, SourceName, Time, ReceiveTime, Message, Severity, EnabledState, ...).
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20143, 0)
node.BrowseName = QualifiedName('TrustListOutOfDate', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12555, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(19297, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("TrustListOutOfDate")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20144, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20145, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20146, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20147, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20148, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20149, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20151, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20152, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20153, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20154, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20157, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20158, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20159, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20160, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20169, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20171, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20173, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20175, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20176, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20177, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20178, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20180, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20198, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20202, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20211, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20249, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20286, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20287, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20288, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20289, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(19297, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20143, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
server.add_references(refs)
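# ---------------------------------------------------------------------------
# Editorial sketch (not part of the generated nodeset): every property node in
# this file gets the same three references -- HasTypeDefinition (i=40) to
# PropertyType (i=68), HasModellingRule (i=37) to Mandatory (i=78), and an
# inverse HasProperty (i=46) back to its parent.  The hypothetical, standalone
# helper below shows how those triples could be table-driven; it uses a simple
# stand-in dataclass rather than the real ua.AddReferencesItem, so it does not
# alter the expanded form the generator emits above.
from dataclasses import dataclass

@dataclass
class _RefSketch:
    source: int       # numeric id of the new node
    target: int       # numeric id the reference points at
    ref_type: int     # reference type id (40, 37, or 46 in the pattern above)
    forward: bool = True

def _property_refs(node_id, parent_id, type_def=68, rule=78):
    # Mirrors the repeated pattern: type definition, modelling rule,
    # then the inverse link back to the parent node.
    return [
        _RefSketch(node_id, type_def, 40),
        _RefSketch(node_id, rule, 37),
        _RefSketch(node_id, parent_id, 46, forward=False),
    ]
# ---------------------------------------------------------------------------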
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20144, 0)
node.BrowseName = QualifiedName('EventId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("EventId")
attrs.DataType = ua.NodeId(ua.ObjectIds.ByteString)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20144, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20144, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20144, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20145, 0)
node.BrowseName = QualifiedName('EventType', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("EventType")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20145, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20145, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20145, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20146, 0)
node.BrowseName = QualifiedName('SourceNode', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceNode")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20146, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20146, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20146, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20147, 0)
node.BrowseName = QualifiedName('SourceName', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceName")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20147, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20147, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20147, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20148, 0)
node.BrowseName = QualifiedName('Time', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Time")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20148, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20148, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20148, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20149, 0)
node.BrowseName = QualifiedName('ReceiveTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ReceiveTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20149, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20149, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20149, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20151, 0)
node.BrowseName = QualifiedName('Message', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Message")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20151, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20151, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20151, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20152, 0)
node.BrowseName = QualifiedName('Severity', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Severity")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20152, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20152, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20152, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20153, 0)
node.BrowseName = QualifiedName('ConditionClassId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ConditionClassId")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20153, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20153, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20153, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20154, 0)
node.BrowseName = QualifiedName('ConditionClassName', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ConditionClassName")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20154, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20154, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20154, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20157, 0)
node.BrowseName = QualifiedName('ConditionName', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ConditionName")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20157, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20157, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20157, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20158, 0)
node.BrowseName = QualifiedName('BranchId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("BranchId")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20158, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20158, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20158, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20159, 0)
node.BrowseName = QualifiedName('Retain', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Retain")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20159, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20159, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20159, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20160, 0)
node.BrowseName = QualifiedName('EnabledState', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(8995, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("EnabledState")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20160, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20161, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20160, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(8995, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20160, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20160, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20161, 0)
node.BrowseName = QualifiedName('Id', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20160, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Id")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20161, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20161, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20161, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20160, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20169, 0)
node.BrowseName = QualifiedName('Quality', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(9002, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Quality")
attrs.DataType = ua.NodeId(ua.ObjectIds.StatusCode)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20169, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20170, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20169, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(9002, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20169, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20169, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20170, 0)
node.BrowseName = QualifiedName('SourceTimestamp', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20169, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceTimestamp")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20170, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20170, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20170, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20169, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20171, 0)
node.BrowseName = QualifiedName('LastSeverity', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(9002, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastSeverity")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20171, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20172, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20171, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(9002, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20171, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20171, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20172, 0)
node.BrowseName = QualifiedName('SourceTimestamp', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20171, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceTimestamp")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20172, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20172, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20172, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20171, 0)
refs.append(ref)
server.add_references(refs)
# Variable: Comment (i=20173), property of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20173, 0)
node.BrowseName = QualifiedName('Comment', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(9002, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Comment")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20173, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20174, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20173, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(9002, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20173, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20173, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: SourceTimestamp (i=20174), property of Comment
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20174, 0)
node.BrowseName = QualifiedName('SourceTimestamp', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20173, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SourceTimestamp")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20174, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20174, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20174, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20173, 0)
refs.append(ref)
server.add_references(refs)
# Variable: ClientUserId (i=20175), property of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20175, 0)
node.BrowseName = QualifiedName('ClientUserId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ClientUserId")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20175, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20175, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20175, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Method: Disable (i=20176)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20176, 0)
node.BrowseName = QualifiedName('Disable', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Disable")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(3065, 0)
ref.SourceNodeId = NumericNodeId(20176, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2803, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20176, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20176, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Method: Enable (i=20177)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20177, 0)
node.BrowseName = QualifiedName('Enable', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Enable")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(3065, 0)
ref.SourceNodeId = NumericNodeId(20177, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2803, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20177, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20177, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Method: AddComment (i=20178)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20178, 0)
node.BrowseName = QualifiedName('AddComment', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("AddComment")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20178, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20179, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(3065, 0)
ref.SourceNodeId = NumericNodeId(20178, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2829, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20178, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20178, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: InputArguments (i=20179) for the AddComment method
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20179, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20178, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'EventId'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
extobj.Description.Text = 'The identifier for the event to comment.'
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Comment'
extobj.DataType = NumericNodeId(21, 0)
extobj.ValueRank = -1
extobj.Description.Text = 'The comment to add to the condition.'
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20179, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20179, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20179, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20178, 0)
refs.append(ref)
server.add_references(refs)
# Variable: AckedState (i=20180), TwoStateVariableType component of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20180, 0)
node.BrowseName = QualifiedName('AckedState', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(8995, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("AckedState")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20180, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20181, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20180, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(8995, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20180, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20180, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: Id (i=20181), property of AckedState
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20181, 0)
node.BrowseName = QualifiedName('Id', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20180, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Id")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20181, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20181, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20181, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20180, 0)
refs.append(ref)
server.add_references(refs)
# Method: Acknowledge (i=20198)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20198, 0)
node.BrowseName = QualifiedName('Acknowledge', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Acknowledge")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20198, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20199, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(3065, 0)
ref.SourceNodeId = NumericNodeId(20198, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(8944, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20198, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20198, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: InputArguments (i=20199) for the Acknowledge method
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20199, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20198, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'EventId'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
extobj.Description.Text = 'The identifier for the event to comment.'
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Comment'
extobj.DataType = NumericNodeId(21, 0)
extobj.ValueRank = -1
extobj.Description.Text = 'The comment to add to the condition.'
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20199, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20199, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20199, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20198, 0)
refs.append(ref)
server.add_references(refs)
# Variable: ActiveState (i=20202), TwoStateVariableType component of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20202, 0)
node.BrowseName = QualifiedName('ActiveState', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(8995, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ActiveState")
attrs.DataType = ua.NodeId(ua.ObjectIds.LocalizedText)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20202, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20203, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20202, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(8995, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20202, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(20202, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: Id (i=20203), property of ActiveState
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20203, 0)
node.BrowseName = QualifiedName('Id', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20202, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Id")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20203, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20203, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20203, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20202, 0)
refs.append(ref)
server.add_references(refs)
# Variable: InputNode (i=20211), property of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20211, 0)
node.BrowseName = QualifiedName('InputNode', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputNode")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20211, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20211, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20211, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: SuppressedOrShelved (i=20249), property of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20249, 0)
node.BrowseName = QualifiedName('SuppressedOrShelved', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SuppressedOrShelved")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20249, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20249, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20249, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: NormalState (i=20286), property of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20286, 0)
node.BrowseName = QualifiedName('NormalState', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("NormalState")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20286, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20286, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20286, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: TrustListId (i=20287), property of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20287, 0)
node.BrowseName = QualifiedName('TrustListId', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("TrustListId")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20287, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20287, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20287, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: LastUpdateTime (i=20288), property of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20288, 0)
node.BrowseName = QualifiedName('LastUpdateTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastUpdateTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20288, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20288, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20288, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# Variable: UpdateFrequency (i=20289), property of i=20143
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(20289, 0)
node.BrowseName = QualifiedName('UpdateFrequency', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(20143, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("UpdateFrequency")
attrs.DataType = NumericNodeId(290, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(20289, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(20289, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(20289, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(20143, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: CertificateGroupFolderType (i=13813)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13813, 0)
node.BrowseName = QualifiedName('CertificateGroupFolderType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(61, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("CertificateGroupFolderType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13813, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13814, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13813, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13848, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13813, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13882, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(35, 0)
ref.SourceNodeId = NumericNodeId(13813, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13916, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(13813, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(61, 0)
refs.append(ref)
server.add_references(refs)
# Object: DefaultApplicationGroup (i=13814), component of CertificateGroupFolderType
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13814, 0)
node.BrowseName = QualifiedName('DefaultApplicationGroup', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13813, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12555, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("DefaultApplicationGroup")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13814, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13814, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13847, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13814, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13814, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13814, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13813, 0)
refs.append(ref)
server.add_references(refs)
# Object: TrustList (i=13815), component of DefaultApplicationGroup
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13815, 0)
node.BrowseName = QualifiedName('TrustList', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13814, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12522, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("TrustList")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13816, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13817, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13818, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13819, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13821, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13824, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13826, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13829, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13831, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13834, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13836, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13837, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12522, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13815, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13814, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13816, 0)
node.BrowseName = QualifiedName('Size', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Size")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt64)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13816, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13816, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13816, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
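# Note: every mandatory property node in this section (Size above, and
# Writable, UserWritable, OpenCount, InputArguments, ... below) receives the
# same three references: a forward HasTypeDefinition (i=40) to PropertyType
# (i=68), a forward HasModellingRule (i=37) to Mandatory (i=78), and an
# inverse HasProperty (i=46) back to its parent. The helper below is an
# illustrative sketch only -- it is not part of this generated file or of the
# opcua library -- showing how that repeated pattern could be tabulated.
def _mandatory_property_refs(node_num, parent_num):
    # Returns (IsForward, ReferenceTypeId, SourceNodeId, TargetNodeId)
    # tuples mirroring the three AddReferencesItem objects built above
    # for each mandatory property.
    HAS_TYPE_DEFINITION, HAS_MODELLING_RULE, HAS_PROPERTY = 40, 37, 46
    PROPERTY_TYPE, MANDATORY = 68, 78
    return [
        (True, HAS_TYPE_DEFINITION, node_num, PROPERTY_TYPE),
        (True, HAS_MODELLING_RULE, node_num, MANDATORY),
        (False, HAS_PROPERTY, node_num, parent_num),
    ]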
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13817, 0)
node.BrowseName = QualifiedName('Writable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Writable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13817, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13817, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13817, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13818, 0)
node.BrowseName = QualifiedName('UserWritable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("UserWritable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13818, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13818, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13818, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13819, 0)
node.BrowseName = QualifiedName('OpenCount', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OpenCount")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13819, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13819, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13819, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13821, 0)
node.BrowseName = QualifiedName('Open', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Open")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13821, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13822, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13821, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13823, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13821, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13821, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13822, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13821, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Mode'
extobj.DataType = NumericNodeId(3, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13822, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13822, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13822, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13821, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13823, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13821, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13823, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13823, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13823, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13821, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13824, 0)
node.BrowseName = QualifiedName('Close', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Close")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13824, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13825, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13824, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13824, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13825, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13824, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13825, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13825, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13825, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13824, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13826, 0)
node.BrowseName = QualifiedName('Read', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Read")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13826, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13827, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13826, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13828, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13826, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13826, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13827, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13826, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Length'
extobj.DataType = NumericNodeId(6, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13827, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13827, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13827, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13826, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13828, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13826, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13828, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13828, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13828, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13826, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13829, 0)
node.BrowseName = QualifiedName('Write', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Write")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13829, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13830, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13829, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13829, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13830, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13829, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13830, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13830, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13830, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13829, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13831, 0)
node.BrowseName = QualifiedName('GetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("GetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13831, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13832, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13831, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13833, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13831, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13831, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13832, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13831, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13832, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13832, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13832, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13831, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13833, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13831, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13833, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13833, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13833, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13831, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13834, 0)
node.BrowseName = QualifiedName('SetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("SetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13834, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13835, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13834, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13834, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13835, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13834, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13835, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13835, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13835, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13834, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13836, 0)
node.BrowseName = QualifiedName('LastUpdateTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastUpdateTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13836, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13836, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13836, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13837, 0)
node.BrowseName = QualifiedName('OpenWithMasks', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13815, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("OpenWithMasks")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13837, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13838, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13837, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13839, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13837, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13837, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13815, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13838, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13837, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Masks'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13838, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13838, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13838, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13837, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13839, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13837, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13839, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13839, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13839, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13837, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13847, 0)
node.BrowseName = QualifiedName('CertificateTypes', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13814, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateTypes")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13847, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13847, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13847, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13814, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13848, 0)
node.BrowseName = QualifiedName('DefaultHttpsGroup', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13813, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12555, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("DefaultHttpsGroup")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13848, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13848, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13881, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13848, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13848, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13848, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13813, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13849, 0)
node.BrowseName = QualifiedName('TrustList', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13848, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12522, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("TrustList")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13850, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13851, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13852, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13853, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13855, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13858, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13860, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13863, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13865, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13868, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13870, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13871, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12522, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13849, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13848, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13850, 0)
node.BrowseName = QualifiedName('Size', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Size")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt64)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13850, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13850, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13850, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13851, 0)
node.BrowseName = QualifiedName('Writable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Writable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13851, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13851, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13851, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13852, 0)
node.BrowseName = QualifiedName('UserWritable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("UserWritable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13852, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13852, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13852, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13853, 0)
node.BrowseName = QualifiedName('OpenCount', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OpenCount")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13853, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13853, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13853, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13855, 0)
node.BrowseName = QualifiedName('Open', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Open")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13855, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13856, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13855, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13857, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13855, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13855, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13856, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13855, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Mode'
extobj.DataType = NumericNodeId(3, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13856, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13856, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13856, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13855, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13857, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13855, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13857, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13857, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13857, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13855, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13858, 0)
node.BrowseName = QualifiedName('Close', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Close")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13858, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13859, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13858, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13858, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13859, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13858, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13859, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13859, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13859, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13858, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13860, 0)
node.BrowseName = QualifiedName('Read', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Read")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13860, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13861, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13860, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13862, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13860, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13860, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13861, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13860, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Length'
extobj.DataType = NumericNodeId(6, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13861, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13861, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13861, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13860, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13862, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13860, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13862, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13862, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13862, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13860, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13863, 0)
node.BrowseName = QualifiedName('Write', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Write")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13863, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13864, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13863, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13863, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13864, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13863, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13864, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13864, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13864, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13863, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13865, 0)
node.BrowseName = QualifiedName('GetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("GetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13865, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13866, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13865, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13867, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13865, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13865, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13866, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13865, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13866, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13866, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13866, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13865, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13867, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13865, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13867, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13867, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13867, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13865, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13868, 0)
node.BrowseName = QualifiedName('SetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("SetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13868, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13869, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13868, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13868, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13869, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13868, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13869, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13869, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13869, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13868, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13870, 0)
node.BrowseName = QualifiedName('LastUpdateTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastUpdateTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13870, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13870, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13870, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13871, 0)
node.BrowseName = QualifiedName('OpenWithMasks', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13849, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("OpenWithMasks")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13871, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13872, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13871, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13873, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13871, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13871, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13849, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13872, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13871, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Masks'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13872, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13872, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13872, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13871, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13873, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13871, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13873, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13873, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13873, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13871, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13881, 0)
node.BrowseName = QualifiedName('CertificateTypes', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13848, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateTypes")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13881, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13881, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13881, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13848, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13882, 0)
node.BrowseName = QualifiedName('DefaultUserTokenGroup', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13813, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12555, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("DefaultUserTokenGroup")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13882, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13882, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13915, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13882, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13882, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13882, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13813, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13883, 0)
node.BrowseName = QualifiedName('TrustList', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13882, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12522, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("TrustList")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13884, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13885, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13886, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13887, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13889, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13892, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13894, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13897, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13899, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13902, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13904, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13905, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12522, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13883, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13882, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13884, 0)
node.BrowseName = QualifiedName('Size', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Size")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt64)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13884, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13884, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13884, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13885, 0)
node.BrowseName = QualifiedName('Writable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Writable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13885, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13885, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13885, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13886, 0)
node.BrowseName = QualifiedName('UserWritable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("UserWritable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13886, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13886, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13886, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13887, 0)
node.BrowseName = QualifiedName('OpenCount', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OpenCount")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13887, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13887, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13887, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13889, 0)
node.BrowseName = QualifiedName('Open', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Open")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13889, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13890, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13889, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13891, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13889, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13889, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13890, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13889, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Mode'
extobj.DataType = NumericNodeId(3, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13890, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13890, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13890, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13889, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13891, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13889, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13891, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13891, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13891, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13889, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13892, 0)
node.BrowseName = QualifiedName('Close', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Close")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13892, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13893, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13892, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13892, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13893, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13892, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13893, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13893, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13893, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13892, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13894, 0)
node.BrowseName = QualifiedName('Read', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Read")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13894, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13895, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13894, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13896, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13894, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13894, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13895, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13894, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Length'
extobj.DataType = NumericNodeId(6, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13895, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13895, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13895, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13894, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13896, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13894, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13896, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13896, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13896, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13894, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13897, 0)
node.BrowseName = QualifiedName('Write', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Write")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13897, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13898, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13897, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13897, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13898, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13897, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13898, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13898, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13898, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13897, 0)
refs.append(ref)
server.add_references(refs)
# Method 'GetPosition' (i=13899) with InputArguments/OutputArguments properties
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13899, 0)
node.BrowseName = QualifiedName('GetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("GetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13899, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13900, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13899, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13901, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13899, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13899, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13900, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13899, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13900, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13900, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13900, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13899, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13901, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13899, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13901, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13901, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13901, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13899, 0)
refs.append(ref)
server.add_references(refs)
# Method 'SetPosition' (i=13902), taking FileHandle and Position input arguments
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13902, 0)
node.BrowseName = QualifiedName('SetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("SetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13902, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13903, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13902, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13902, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13903, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13902, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13903, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13903, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13903, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13902, 0)
refs.append(ref)
server.add_references(refs)
# Property 'LastUpdateTime' (i=13904) of object i=13883, DataType UtcTime (i=294)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13904, 0)
node.BrowseName = QualifiedName('LastUpdateTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastUpdateTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13904, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13904, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13904, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
# Method 'OpenWithMasks' (i=13905): Masks (UInt32) in, FileHandle (UInt32) out
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13905, 0)
node.BrowseName = QualifiedName('OpenWithMasks', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13883, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("OpenWithMasks")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13905, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13906, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13905, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13907, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13905, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13905, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13883, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13906, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13905, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Masks'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13906, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13906, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13906, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13905, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13907, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13905, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13907, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13907, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13907, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13905, 0)
refs.append(ref)
server.add_references(refs)
# Property 'CertificateTypes' (i=13915) on certificate-group object i=13882, an array of NodeIds
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13915, 0)
node.BrowseName = QualifiedName('CertificateTypes', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13882, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateTypes")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13915, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13915, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13915, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13882, 0)
refs.append(ref)
server.add_references(refs)
# Placeholder object '<AdditionalGroup>' (i=13916) of CertificateGroupType (i=12555), organized under i=13813
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13916, 0)
node.BrowseName = QualifiedName('<AdditionalGroup>', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13813, 0)
node.ReferenceTypeId = NumericNodeId(35, 0)
node.TypeDefinition = NumericNodeId(12555, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("<AdditionalGroup>")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13916, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13916, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13949, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13916, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13916, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(11508, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(35, 0)
ref.SourceNodeId = NumericNodeId(13916, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13813, 0)
refs.append(ref)
server.add_references(refs)
# Object 'TrustList' (i=13917) of TrustListType (i=12522); its file properties and methods follow
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13917, 0)
node.BrowseName = QualifiedName('TrustList', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13916, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12522, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("TrustList")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13918, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13919, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13920, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13921, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13923, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13926, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13928, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13931, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13933, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13936, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13938, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13939, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12522, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13917, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13916, 0)
refs.append(ref)
server.add_references(refs)
# File properties of the TrustList (i=13917): Size, Writable, UserWritable, OpenCount
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13918, 0)
node.BrowseName = QualifiedName('Size', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Size")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt64)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13918, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13918, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13918, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13919, 0)
node.BrowseName = QualifiedName('Writable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Writable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13919, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13919, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13919, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13920, 0)
node.BrowseName = QualifiedName('UserWritable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("UserWritable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13920, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13920, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13920, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13921, 0)
node.BrowseName = QualifiedName('OpenCount', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OpenCount")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13921, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13921, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13921, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
# Method 'Open' (i=13923) on the TrustList object i=13917: Mode (Byte) in, FileHandle out
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13923, 0)
node.BrowseName = QualifiedName('Open', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Open")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13923, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13924, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13923, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13925, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13923, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13923, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13924, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13923, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Mode'
extobj.DataType = NumericNodeId(3, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13924, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13924, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13924, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13923, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13925, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13923, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13925, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13925, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13925, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13923, 0)
refs.append(ref)
server.add_references(refs)
# Method 'Close' (i=13926), taking the FileHandle input argument
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13926, 0)
node.BrowseName = QualifiedName('Close', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Close")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13926, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13927, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13926, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13926, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13927, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13926, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13927, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13927, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13927, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13926, 0)
refs.append(ref)
server.add_references(refs)
# Method: Read (i=13928)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13928, 0)
node.BrowseName = QualifiedName('Read', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Read")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13928, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13929, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13928, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13930, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13928, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13928, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
# Property: InputArguments of Read (i=13929)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13929, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13928, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Length'
extobj.DataType = NumericNodeId(6, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13929, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13929, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13929, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13928, 0)
refs.append(ref)
server.add_references(refs)
# Property: OutputArguments of Read (i=13930)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13930, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13928, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13930, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13930, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13930, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13928, 0)
refs.append(ref)
server.add_references(refs)
# Method: Write (i=13931)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13931, 0)
node.BrowseName = QualifiedName('Write', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Write")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13931, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13932, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13931, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13931, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
# Property: InputArguments of Write (i=13932)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13932, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13931, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13932, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13932, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13932, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13931, 0)
refs.append(ref)
server.add_references(refs)
# Method: GetPosition (i=13933)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13933, 0)
node.BrowseName = QualifiedName('GetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("GetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13933, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13934, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13933, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13935, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13933, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13933, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
# Property: InputArguments of GetPosition (i=13934)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13934, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13933, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13934, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13934, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13934, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13933, 0)
refs.append(ref)
server.add_references(refs)
# Property: OutputArguments of GetPosition (i=13935)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13935, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13933, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13935, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13935, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13935, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13933, 0)
refs.append(ref)
server.add_references(refs)
# Method: SetPosition (i=13936)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13936, 0)
node.BrowseName = QualifiedName('SetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("SetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13936, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13937, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13936, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13936, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
# Property: InputArguments of SetPosition (i=13937)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13937, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13936, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13937, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13937, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13937, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13936, 0)
refs.append(ref)
server.add_references(refs)
# Property: LastUpdateTime (i=13938)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13938, 0)
node.BrowseName = QualifiedName('LastUpdateTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastUpdateTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13938, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13938, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13938, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
# Method: OpenWithMasks (i=13939)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13939, 0)
node.BrowseName = QualifiedName('OpenWithMasks', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13917, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("OpenWithMasks")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13939, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13940, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13939, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13941, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13939, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13939, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13917, 0)
refs.append(ref)
server.add_references(refs)
# Property: InputArguments of OpenWithMasks (i=13940)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13940, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13939, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Masks'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13940, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13940, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13940, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13939, 0)
refs.append(ref)
server.add_references(refs)
# Property: OutputArguments of OpenWithMasks (i=13941)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13941, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13939, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13941, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13941, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13941, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13939, 0)
refs.append(ref)
server.add_references(refs)
# Property: CertificateTypes (i=13949), an array of NodeIds
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13949, 0)
node.BrowseName = QualifiedName('CertificateTypes', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13916, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateTypes")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13949, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13949, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13949, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13916, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: CertificateType (i=12556), abstract base for certificate types
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12556, 0)
node.BrowseName = QualifiedName('CertificateType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(58, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("CertificateType")
attrs.IsAbstract = True
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(12556, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(58, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: ApplicationCertificateType (i=12557), abstract subtype of CertificateType
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12557, 0)
node.BrowseName = QualifiedName('ApplicationCertificateType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(12556, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("ApplicationCertificateType")
attrs.IsAbstract = True
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(12557, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12556, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: HttpsCertificateType (i=12558)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12558, 0)
node.BrowseName = QualifiedName('HttpsCertificateType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(12556, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("HttpsCertificateType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(12558, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12556, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: UserCredentialCertificateType (i=15181)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(15181, 0)
node.BrowseName = QualifiedName('UserCredentialCertificateType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(12556, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("UserCredentialCertificateType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(15181, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12556, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: RsaMinApplicationCertificateType (i=12559)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12559, 0)
node.BrowseName = QualifiedName('RsaMinApplicationCertificateType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(12557, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("RsaMinApplicationCertificateType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(12559, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12557, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: RsaSha256ApplicationCertificateType (i=12560)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12560, 0)
node.BrowseName = QualifiedName('RsaSha256ApplicationCertificateType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(12557, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("RsaSha256ApplicationCertificateType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(12560, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12557, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: TrustListUpdatedAuditEventType (i=12561)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12561, 0)
node.BrowseName = QualifiedName('TrustListUpdatedAuditEventType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(2127, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("TrustListUpdatedAuditEventType")
attrs.IsAbstract = True
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(12561, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2127, 0)
refs.append(ref)
server.add_references(refs)
# ObjectType: ServerConfigurationType (i=12581) and its component/property references
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12581, 0)
node.BrowseName = QualifiedName('ServerConfigurationType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(58, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("ServerConfigurationType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13950, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12708, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12583, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12584, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12585, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12616, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12734, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12731, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12775, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(12581, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(58, 0)
refs.append(ref)
server.add_references(refs)
# Object: CertificateGroups (i=13950) under ServerConfigurationType
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13950, 0)
node.BrowseName = QualifiedName('CertificateGroups', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(13813, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("CertificateGroups")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13950, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13951, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13950, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13813, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13950, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13950, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
# Object: DefaultApplicationGroup (i=13951) under CertificateGroups
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13951, 0)
node.BrowseName = QualifiedName('DefaultApplicationGroup', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13950, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12555, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("DefaultApplicationGroup")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13951, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13951, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13984, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13951, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12555, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13951, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13951, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13950, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13952, 0)
node.BrowseName = QualifiedName('TrustList', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(13951, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(12522, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("TrustList")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13953, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13954, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13955, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13956, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13958, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13961, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13963, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13966, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13968, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13971, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13973, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13974, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12522, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13952, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13951, 0)
refs.append(ref)
server.add_references(refs)
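The fifteen `AddReferencesItem` blocks just added for the TrustList object (node 13952) all follow one shape: set the reference-type id, target id, and direction, then append. A hedged, standalone sketch of that pattern as a data table expanded in a loop (`Ref` and `expand_refs` are hypothetical names, not part of this generated file; the row data is copied from the blocks above):

```python
from collections import namedtuple

# One row per AddReferencesItem: (reference_type_id, target_node_id, is_forward).
Ref = namedtuple("Ref", "source ref_type target is_forward")

def expand_refs(source, rows):
    """Expand (ref_type, target, forward) rows into Ref tuples for one source node."""
    return [Ref(source, rt, tgt, fwd) for rt, tgt, fwd in rows]

# The TrustList object (13952) references, as data (46=HasProperty,
# 47=HasComponent, 40=HasTypeDefinition, 37=HasModellingRule):
trustlist_rows = [
    (46, 13953, True), (46, 13954, True), (46, 13955, True), (46, 13956, True),
    (47, 13958, True), (47, 13961, True), (47, 13963, True), (47, 13966, True),
    (47, 13968, True), (47, 13971, True), (46, 13973, True), (47, 13974, True),
    (40, 12522, True), (37, 78, True), (47, 13951, False),
]
refs = expand_refs(13952, trustlist_rows)
```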
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13953, 0)
node.BrowseName = QualifiedName('Size', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Size")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt64)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13953, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13953, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13953, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13954, 0)
node.BrowseName = QualifiedName('Writable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("Writable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13954, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13954, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13954, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13955, 0)
node.BrowseName = QualifiedName('UserWritable', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("UserWritable")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13955, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13955, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13955, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13956, 0)
node.BrowseName = QualifiedName('OpenCount', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OpenCount")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt16)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13956, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13956, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13956, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13958, 0)
node.BrowseName = QualifiedName('Open', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Open")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13958, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13959, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13958, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13960, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13958, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13958, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13959, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13958, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Mode'
extobj.DataType = NumericNodeId(3, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13959, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13959, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13959, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13958, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13960, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13958, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13960, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13960, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13960, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13958, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13961, 0)
node.BrowseName = QualifiedName('Close', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Close")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13961, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13962, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13961, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13961, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13962, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13961, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13962, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13962, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13962, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13961, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13963, 0)
node.BrowseName = QualifiedName('Read', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Read")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13963, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13964, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13963, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13965, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13963, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13963, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13964, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13963, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Length'
extobj.DataType = NumericNodeId(6, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13964, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13964, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13964, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13963, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13965, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13963, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13965, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13965, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13965, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13963, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13966, 0)
node.BrowseName = QualifiedName('Write', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("Write")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13966, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13967, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13966, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13966, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13967, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13966, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Data'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13967, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13967, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13967, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13966, 0)
refs.append(ref)
server.add_references(refs)
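Each `InputArguments`/`OutputArguments` block above builds a list of `ua.Argument` extension objects whose `DataType` is a ns=0 NodeId and whose `ValueRank` is -1 (scalar). A standalone restatement of the Read and Write method signatures as plain data (`scalar_arg` and the table names are illustrative, not part of this generated file; the ids follow the OPC UA built-in type assignments: 7=UInt32, 6=Int32, 15=ByteString):

```python
# File-method signatures from the blocks above, as (name, ns=0 DataType id) pairs.
READ_INPUTS = [("FileHandle", 7), ("Length", 6)]
READ_OUTPUTS = [("Data", 15)]
WRITE_INPUTS = [("FileHandle", 7), ("Data", 15)]

def scalar_arg(name, datatype_id):
    """Mirror of the repeated pattern: a scalar argument (ValueRank = -1)."""
    return {"Name": name, "DataType": datatype_id, "ValueRank": -1}

write_args = [scalar_arg(n, t) for n, t in WRITE_INPUTS]
```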
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13968, 0)
node.BrowseName = QualifiedName('GetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("GetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13968, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13969, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13968, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13970, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13968, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13968, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13969, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13968, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13969, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13969, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13969, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13968, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13970, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13968, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13970, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13970, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13970, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13968, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13971, 0)
node.BrowseName = QualifiedName('SetPosition', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("SetPosition")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13971, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13972, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13971, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13971, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13972, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13971, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Position'
extobj.DataType = NumericNodeId(9, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13972, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13972, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13972, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13971, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13973, 0)
node.BrowseName = QualifiedName('LastUpdateTime', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("LastUpdateTime")
attrs.DataType = NumericNodeId(294, 0)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13973, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13973, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13973, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13974, 0)
node.BrowseName = QualifiedName('OpenWithMasks', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(13952, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("OpenWithMasks")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13974, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13975, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13974, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13976, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13974, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(13974, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13952, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13975, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13974, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Masks'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13975, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13975, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13975, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13974, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13976, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13974, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'FileHandle'
extobj.DataType = NumericNodeId(7, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13976, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13976, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13976, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13974, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13984, 0)
node.BrowseName = QualifiedName('CertificateTypes', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(13951, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateTypes")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13984, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13984, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13984, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13951, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12708, 0)
node.BrowseName = QualifiedName('ServerCapabilities', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ServerCapabilities")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12708, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12708, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12708, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12583, 0)
node.BrowseName = QualifiedName('SupportedPrivateKeyFormats', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("SupportedPrivateKeyFormats")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12583, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12583, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12583, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12584, 0)
node.BrowseName = QualifiedName('MaxTrustListSize', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("MaxTrustListSize")
attrs.DataType = ua.NodeId(ua.ObjectIds.UInt32)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12584, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12584, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12584, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12585, 0)
node.BrowseName = QualifiedName('MulticastDnsEnabled', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("MulticastDnsEnabled")
attrs.DataType = ua.NodeId(ua.ObjectIds.Boolean)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12585, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12585, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12585, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12616, 0)
node.BrowseName = QualifiedName('UpdateCertificate', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("UpdateCertificate")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12616, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12617, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12616, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12618, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12616, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12616, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12617, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12616, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'CertificateGroupId'
extobj.DataType = NumericNodeId(17, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'CertificateTypeId'
extobj.DataType = NumericNodeId(17, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Certificate'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'IssuerCertificates'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = 1
extobj.ArrayDimensions = [0]
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'PrivateKeyFormat'
extobj.DataType = NumericNodeId(12, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'PrivateKey'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12617, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12617, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12617, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12616, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12618, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12616, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'ApplyChangesRequired'
extobj.DataType = NumericNodeId(1, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12618, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12618, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12618, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12616, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12734, 0)
node.BrowseName = QualifiedName('ApplyChanges', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("ApplyChanges")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12734, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12734, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12731, 0)
node.BrowseName = QualifiedName('CreateSigningRequest', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("CreateSigningRequest")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12731, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12732, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12731, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12733, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12731, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12731, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12732, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12731, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'CertificateGroupId'
extobj.DataType = NumericNodeId(17, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'CertificateTypeId'
extobj.DataType = NumericNodeId(17, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'SubjectName'
extobj.DataType = NumericNodeId(12, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'RegeneratePrivateKey'
extobj.DataType = NumericNodeId(1, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'Nonce'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12732, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12732, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12732, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12731, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12733, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12731, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'CertificateRequest'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12733, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12733, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12733, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12731, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12775, 0)
node.BrowseName = QualifiedName('GetRejectedList', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(12581, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("GetRejectedList")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12775, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12776, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12775, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(12775, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12581, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12776, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12775, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'Certificates'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = 1
extobj.ArrayDimensions = [0]
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12776, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(12776, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12776, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12775, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12620, 0)
node.BrowseName = QualifiedName('CertificateUpdatedAuditEventType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(2127, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("CertificateUpdatedAuditEventType")
attrs.IsAbstract = True
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12620, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13735, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(12620, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(13736, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(12620, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2127, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13735, 0)
node.BrowseName = QualifiedName('CertificateGroup', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12620, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateGroup")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13735, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13735, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13735, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12620, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(13736, 0)
node.BrowseName = QualifiedName('CertificateType', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(12620, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("CertificateType")
attrs.DataType = ua.NodeId(ua.ObjectIds.NodeId)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(13736, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(13736, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(13736, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12620, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17496, 0)
node.BrowseName = QualifiedName('KeyCredentialConfigurationFolderType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(61, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("KeyCredentialConfigurationFolderType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(17496, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17511, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(17496, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17522, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(17496, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(61, 0)
refs.append(ref)
server.add_references(refs)
# '<ServiceName>' is an instance-declaration placeholder from the spec, not a
# literal browse name: its modelling rule (id 11508, OptionalPlaceholder) is
# added in the reference block below.
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17511, 0)
node.BrowseName = QualifiedName('<ServiceName>', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(17496, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(18001, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("<ServiceName>")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17511, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17512, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17511, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17513, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(17511, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18001, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(17511, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(11508, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(17511, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17496, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17512, 0)
node.BrowseName = QualifiedName('ResourceUri', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(17511, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ResourceUri")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(17512, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(17512, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17512, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17511, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17513, 0)
node.BrowseName = QualifiedName('ProfileUri', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(17511, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ProfileUri")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(17513, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(17513, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17513, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17511, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17522, 0)
node.BrowseName = QualifiedName('CreateCredential', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(17496, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("CreateCredential")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17522, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17523, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17522, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17524, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(17522, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(17522, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17496, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17523, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(17522, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'ResourceUri'
extobj.DataType = NumericNodeId(12, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'ProfileUri'
extobj.DataType = NumericNodeId(12, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'EndpointUrls'
extobj.DataType = NumericNodeId(12, 0)
extobj.ValueRank = 1
extobj.ArrayDimensions = [0]
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(17523, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(17523, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17523, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17522, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17524, 0)
node.BrowseName = QualifiedName('OutputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(17522, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("OutputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'CredentialNodeId'
extobj.DataType = NumericNodeId(17, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(17524, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(17524, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17524, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17522, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18155, 0)
node.BrowseName = QualifiedName('KeyCredentialConfiguration', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12637, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(17496, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("KeyCredentialConfiguration")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(18155, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12637, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18155, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17496, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18001, 0)
node.BrowseName = QualifiedName('KeyCredentialConfigurationType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(58, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("KeyCredentialConfigurationType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18001, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18069, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18001, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18165, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18001, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18004, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18001, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18005, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(18001, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18006, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(18001, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18008, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(18001, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(58, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18069, 0)
node.BrowseName = QualifiedName('ResourceUri', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(18001, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ResourceUri")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18069, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18069, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18069, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18001, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18165, 0)
node.BrowseName = QualifiedName('ProfileUri', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(18001, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ProfileUri")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18165, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18165, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18165, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18001, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18004, 0)
node.BrowseName = QualifiedName('EndpointUrls', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(18001, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("EndpointUrls")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18004, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18004, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18004, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18001, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18005, 0)
node.BrowseName = QualifiedName('ServiceStatus', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(18001, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ServiceStatus")
attrs.DataType = ua.NodeId(ua.ObjectIds.StatusCode)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18005, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18005, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18005, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18001, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18006, 0)
node.BrowseName = QualifiedName('UpdateCredential', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(18001, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("UpdateCredential")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18006, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18007, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18006, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(18006, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18001, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18007, 0)
node.BrowseName = QualifiedName('InputArguments', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(18006, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("InputArguments")
attrs.DataType = NumericNodeId(296, 0)
value = []
extobj = ua.Argument()
extobj.Name = 'CredentialId'
extobj.DataType = NumericNodeId(12, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'CredentialSecret'
extobj.DataType = NumericNodeId(15, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'CertificateThumbprint'
extobj.DataType = NumericNodeId(12, 0)
extobj.ValueRank = -1
value.append(extobj)
extobj = ua.Argument()
extobj.Name = 'SecurityPolicyUri'
extobj.DataType = NumericNodeId(12, 0)
extobj.ValueRank = -1
value.append(extobj)
attrs.Value = ua.Variant(value, ua.VariantType.ExtensionObject)
attrs.ValueRank = 1
attrs.ArrayDimensions = [0]
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18007, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18007, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18007, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18006, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18008, 0)
node.BrowseName = QualifiedName('DeleteCredential', 0)
node.NodeClass = NodeClass.Method
node.ParentNodeId = NumericNodeId(18001, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
attrs = ua.MethodAttributes()
attrs.DisplayName = LocalizedText("DeleteCredential")
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18008, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(80, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(18008, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18001, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18011, 0)
node.BrowseName = QualifiedName('KeyCredentialAuditEventType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(2127, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("KeyCredentialAuditEventType")
attrs.IsAbstract = True
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18011, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18028, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(18011, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(2127, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18028, 0)
node.BrowseName = QualifiedName('ResourceUri', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(18011, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ResourceUri")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18028, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18028, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18028, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18011, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18029, 0)
node.BrowseName = QualifiedName('KeyCredentialUpdatedAuditEventType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(18011, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("KeyCredentialUpdatedAuditEventType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(18029, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18011, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18047, 0)
node.BrowseName = QualifiedName('KeyCredentialDeletedAuditEventType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(18011, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("KeyCredentialDeletedAuditEventType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(18047, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18011, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17732, 0)
node.BrowseName = QualifiedName('AuthorizationServices', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12637, 0)
node.ReferenceTypeId = NumericNodeId(47, 0)
node.TypeDefinition = NumericNodeId(61, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("AuthorizationServices")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(47, 0)
ref.SourceNodeId = NumericNodeId(17732, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12637, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(17732, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(61, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17852, 0)
node.BrowseName = QualifiedName('AuthorizationServiceConfigurationType', 0)
node.NodeClass = NodeClass.ObjectType
node.ParentNodeId = NumericNodeId(58, 0)
node.ReferenceTypeId = NumericNodeId(45, 0)
attrs = ua.ObjectTypeAttributes()
attrs.DisplayName = LocalizedText("AuthorizationServiceConfigurationType")
attrs.IsAbstract = False
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17852, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18072, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17852, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17860, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17852, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(18073, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(45, 0)
ref.SourceNodeId = NumericNodeId(17852, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(58, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18072, 0)
node.BrowseName = QualifiedName('ServiceUri', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(17852, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ServiceUri")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18072, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18072, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18072, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17852, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(17860, 0)
node.BrowseName = QualifiedName('ServiceCertificate', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(17852, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("ServiceCertificate")
attrs.DataType = ua.NodeId(ua.ObjectIds.ByteString)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(17860, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(17860, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(17860, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17852, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(18073, 0)
node.BrowseName = QualifiedName('IssuerEndpointUrl', 0)
node.NodeClass = NodeClass.Variable
node.ParentNodeId = NumericNodeId(17852, 0)
node.ReferenceTypeId = NumericNodeId(46, 0)
node.TypeDefinition = NumericNodeId(68, 0)
attrs = ua.VariableAttributes()
attrs.DisplayName = LocalizedText("IssuerEndpointUrl")
attrs.DataType = ua.NodeId(ua.ObjectIds.String)
attrs.ValueRank = -1
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(18073, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(68, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(37, 0)
ref.SourceNodeId = NumericNodeId(18073, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(78, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(46, 0)
ref.SourceNodeId = NumericNodeId(18073, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(17852, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12680, 0)
node.BrowseName = QualifiedName('Default Binary', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12554, 0)
node.ReferenceTypeId = NumericNodeId(38, 0)
node.TypeDefinition = NumericNodeId(76, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("Default Binary")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(38, 0)
ref.SourceNodeId = NumericNodeId(12680, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12554, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(39, 0)
ref.SourceNodeId = NumericNodeId(12680, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12681, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12680, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(76, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(12676, 0)
node.BrowseName = QualifiedName('Default XML', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12554, 0)
node.ReferenceTypeId = NumericNodeId(38, 0)
node.TypeDefinition = NumericNodeId(76, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("Default XML")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(38, 0)
ref.SourceNodeId = NumericNodeId(12676, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12554, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(39, 0)
ref.SourceNodeId = NumericNodeId(12676, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12677, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(12676, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(76, 0)
refs.append(ref)
server.add_references(refs)
node = ua.AddNodesItem()
node.RequestedNewNodeId = NumericNodeId(15044, 0)
node.BrowseName = QualifiedName('Default JSON', 0)
node.NodeClass = NodeClass.Object
node.ParentNodeId = NumericNodeId(12554, 0)
node.ReferenceTypeId = NumericNodeId(38, 0)
node.TypeDefinition = NumericNodeId(76, 0)
attrs = ua.ObjectAttributes()
attrs.DisplayName = LocalizedText("Default JSON")
attrs.EventNotifier = 0
node.NodeAttributes = attrs
server.add_nodes([node])
refs = []
ref = ua.AddReferencesItem()
ref.IsForward = False
ref.ReferenceTypeId = NumericNodeId(38, 0)
ref.SourceNodeId = NumericNodeId(15044, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(12554, 0)
refs.append(ref)
ref = ua.AddReferencesItem()
ref.IsForward = True
ref.ReferenceTypeId = NumericNodeId(40, 0)
ref.SourceNodeId = NumericNodeId(15044, 0)
ref.TargetNodeClass = NodeClass.DataType
ref.TargetNodeId = NumericNodeId(76, 0)
refs.append(ref)
server.add_references(refs)
| 37.116283 | 83 | 0.704241 | 51,636 | 496,022 | 6.753021 | 0.011523 | 0.02542 | 0.069906 | 0.079438 | 0.979154 | 0.978718 | 0.978383 | 0.977471 | 0.975122 | 0.974821 | 0 | 0.054798 | 0.189504 | 496,022 | 13,363 | 84 | 37.11906 | 0.812561 | 0.000282 | 0 | 0.944419 | 1 | 0 | 0.019059 | 0.002454 | 0 | 0 | 0 | 0 | 0 | 1 | 0.000077 | false | 0 | 0.000383 | 0 | 0.00046 | 0.000153 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# test/testsArithmeticExpressions/testMax.py
# (repo: mouton5000/DiscreteEventApplicationEditor, MIT license)
from triggerExpressions import Evaluation
from unittest import TestCase
from math import pi, sqrt
from arithmeticExpressions import ALitteral, Max, UndefinedLitteral, SelfLitteral
from database import Variable
class TestMax(TestCase):
@classmethod
def setUpClass(cls):
import grammar.grammars
grammar.grammars.compileGrammars()
def setUp(self):
self.eval1 = Evaluation()
self.eval2 = Evaluation()
self.eval2[Variable('X')] = 1
self.eval2[Variable('T')] = 'abc'
self.eval2[Variable('Z')] = 12.0
def test_integers_max_with_empty_evaluation(self):
a1 = ALitteral(10)
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), 20)
def test_integers_max_with_non_empty_evaluation(self):
a1 = ALitteral(10)
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 20)
def test_strings_max_with_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), 'def')
def test_strings_max_with_non_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 'def')
def test_floats_max_with_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), pi)
def test_floats_max_with_non_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), pi)
def test_integer_string_max_with_empty_evaluation(self):
a1 = ALitteral(10)
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), 'def')
def test_integer_string_max_with_non_empty_evaluation(self):
a1 = ALitteral(10)
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 'def')
def test_string_integer_max_with_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), 'abc')
def test_string_integer_max_with_non_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 'abc')
def test_integer_float_max_with_empty_evaluation(self):
a1 = ALitteral(10)
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), 10)
def test_integer_float_max_with_non_empty_evaluation(self):
a1 = ALitteral(10)
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 10)
def test_float_integer_max_with_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), 20)
def test_float_integer_max_with_non_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 20)
def test_string_float_max_with_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), 'abc')
def test_string_float_max_with_non_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 'abc')
def test_float_string_max_with_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1), 'def')
def test_float_string_max_with_non_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 'def')
def test_integer_undefined_max_with_empty_evaluation(self):
a1 = ALitteral(10)
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1)
def test_integer_undefined_max_with_non_empty_evaluation(self):
a1 = ALitteral(10)
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2)
def test_undefined_integer_max_with_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = ALitteral(20)
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1)
def test_undefined_integer_max_with_non_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = ALitteral(20)
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2)
def test_string_undefined_max_with_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1)
def test_string_undefined_max_with_non_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2)
def test_undefined_string_max_with_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = ALitteral('def')
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1)
def test_undefined_string_max_with_non_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = ALitteral('def')
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2)
def test_float_undefined_max_with_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1)
def test_float_undefined_max_with_non_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2)
def test_undefined_float_max_with_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1)
def test_undefined_float_max_with_non_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2)
def test_undefined_undefined_max_with_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1)
def test_undefined_undefined_max_with_non_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2)
def test_integer_evaluated_variable_max(self):
a1 = ALitteral(10)
a2 = ALitteral(Variable('X'))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 10)
def test_evaluated_variable_integer_max(self):
a1 = ALitteral(Variable('X'))
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 20)
def test_string_evaluated_variable_max(self):
a1 = ALitteral('abc')
a2 = ALitteral(Variable('X'))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 'abc')
def test_evaluated_variable_string_max(self):
a1 = ALitteral(Variable('X'))
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 'def')
def test_float_evaluated_variable_max(self):
a1 = ALitteral(pi)
a2 = ALitteral(Variable('X'))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), pi)
def test_evaluated_variable_float_max(self):
a1 = ALitteral(Variable('X'))
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), sqrt(2))
def test_evaluated_variable_evaluated_variable_max(self):
a1 = ALitteral(Variable('X'))
a2 = ALitteral(Variable('X'))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2), 1)
    def test_evaluated_variable_undefined_max(self):
        a1 = ALitteral(Variable('X'))
        a2 = UndefinedLitteral()
        expr = Max(a1, a2)
        with self.assertRaises(TypeError):
            expr.value(self.eval2)
    def test_undefined_evaluated_variable_max(self):
        a1 = UndefinedLitteral()
        a2 = ALitteral(Variable('X'))
        expr = Max(a1, a2)
        with self.assertRaises(TypeError):
            expr.value(self.eval2)
def test_integer_unevaluated_variable_max(self):
a1 = ALitteral(10)
a2 = ALitteral(Variable('Y'))
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
def test_unevaluated_variable_integer_max(self):
a1 = ALitteral(Variable('Y'))
a2 = ALitteral(20)
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
def test_string_unevaluated_variable_max(self):
a1 = ALitteral('abc')
a2 = ALitteral(Variable('Y'))
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
def test_unevaluated_variable_string_max(self):
a1 = ALitteral(Variable('Y'))
a2 = ALitteral('def')
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
def test_float_unevaluated_variable_max(self):
a1 = ALitteral(pi)
a2 = ALitteral(Variable('Y'))
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
def test_unevaluated_variable_float_max(self):
a1 = ALitteral(Variable('Y'))
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
def test_unevaluated_variable_unevaluated_variable_max(self):
a1 = ALitteral(Variable('Y'))
a2 = ALitteral(Variable('Y'))
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
def test_unevaluated_variable_evaluated_variable_max(self):
a1 = ALitteral(Variable('Y'))
a2 = ALitteral(Variable('X'))
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
def test_evaluated_variable_unevaluated_variable_max(self):
a1 = ALitteral(Variable('X'))
a2 = ALitteral(Variable('Y'))
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2)
    def test_unevaluated_variable_undefined_max(self):
        a1 = ALitteral(Variable('Y'))
        a2 = UndefinedLitteral()
        expr = Max(a1, a2)
        with self.assertRaises(ValueError):
            expr.value(self.eval2)
    def test_undefined_unevaluated_variable_max(self):
        a1 = UndefinedLitteral()
        a2 = ALitteral(Variable('Y'))
        expr = Max(a1, a2)
        with self.assertRaises(ValueError):
            expr.value(self.eval2)
def test_integer_self_litteral_max_with_empty_evaluation(self):
a1 = ALitteral(10)
a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1, 1), 10)
def test_integer_self_litteral_max_with_non_empty_evaluation(self):
a1 = ALitteral(10)
a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 1), 10)
def test_self_litteral_integer_max_with_empty_evaluation(self):
a1 = SelfLitteral()
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1, 1), 20)
def test_self_litteral_integer_max_with_non_empty_evaluation(self):
a1 = SelfLitteral()
a2 = ALitteral(20)
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 1), 20)
def test_string_self_litteral_max_with_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1, 1), 'abc')
def test_string_self_litteral_max_with_non_empty_evaluation(self):
a1 = ALitteral('abc')
a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 1), 'abc')
def test_self_litteral_string_max_with_empty_evaluation(self):
a1 = SelfLitteral()
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1, 1), 'def')
def test_self_litteral_string_max_with_non_empty_evaluation(self):
a1 = SelfLitteral()
a2 = ALitteral('def')
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 1), 'def')
def test_float_self_litteral_max_with_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1, 1), pi)
def test_float_self_litteral_max_with_non_empty_evaluation(self):
a1 = ALitteral(pi)
a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 1), pi)
def test_self_litteral_float_max_with_empty_evaluation(self):
a1 = SelfLitteral()
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1, 1), sqrt(2))
def test_self_litteral_float_max_with_non_empty_evaluation(self):
a1 = SelfLitteral()
a2 = ALitteral(sqrt(2))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 1), sqrt(2))
def test_self_litteral_self_litteral_max_with_empty_evaluation(self):
a1 = SelfLitteral()
a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval1, 1), 1)
def test_self_litteral_self_litteral_max_with_non_empty_evaluation(self):
a1 = SelfLitteral()
a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 1), 1)
def test_self_litteral_undefined_max_with_empty_evaluation(self):
a1 = SelfLitteral()
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1, 1)
def test_self_litteral_undefined_max_with_non_empty_evaluation(self):
a1 = SelfLitteral()
a2 = UndefinedLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2, 1)
def test_undefined_self_litteral_max_with_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = SelfLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval1, 1)
def test_undefined_self_litteral_max_with_non_empty_evaluation(self):
a1 = UndefinedLitteral()
a2 = SelfLitteral()
expr = Max(a1, a2)
with self.assertRaises(TypeError):
expr.value(self.eval2, 1)
def test_self_litteral_evaluated_variable_max(self):
        a1 = SelfLitteral()
        a2 = ALitteral(Variable('X'))
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 2), 2)
def test_evaluated_variable_self_litteral_max(self):
        a1 = ALitteral(Variable('X'))
        a2 = SelfLitteral()
expr = Max(a1, a2)
self.assertEqual(expr.value(self.eval2, 2), 2)
def test_self_litteral_unevaluated_variable_max(self):
a1 = SelfLitteral()
a2 = ALitteral(Variable('Y'))
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2, 1)
def test_unevaluated_variable_self_litteral_max(self):
a1 = ALitteral(Variable('Y'))
a2 = SelfLitteral()
expr = Max(a1, a2)
with self.assertRaises(ValueError):
expr.value(self.eval2, 1)
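# Taken together, the tests above pin down the comparison semantics Max relies
# on under Python 2: numbers compare below strings (so max(10, 'def') is 'def'),
# an undefined operand raises TypeError, a variable missing from the Evaluation
# raises ValueError, and the ValueError cases win even when the other operand is
# undefined. A self-contained sketch of those rules (Var, UNDEFINED, _rank and
# sketch_max are illustrative stand-ins, not the project's real classes; the
# SelfLitteral extra-argument cases are omitted):

```python
class Var(object):
    """Stand-in for database.Variable: a name to resolve in an evaluation."""
    def __init__(self, name):
        self.name = name

UNDEFINED = object()  # stand-in for UndefinedLitteral()

def _rank(value):
    # Python 2 orders every number below every string; make that explicit so
    # the sketch behaves identically on Python 3 too.
    if isinstance(value, (int, float)):
        return (0, value)
    if isinstance(value, str):
        return (1, value)
    raise TypeError('unsupported operand: %r' % (value,))

def sketch_max(a, b, evaluation):
    # Unbound variables are reported first (ValueError), even when the other
    # operand is undefined; only then do undefined operands raise TypeError.
    def lookup(x):
        if isinstance(x, Var):
            if x.name not in evaluation:
                raise ValueError('unevaluated variable %s' % x.name)
            return evaluation[x.name]
        return x
    va, vb = lookup(a), lookup(b)
    if va is UNDEFINED or vb is UNDEFINED:
        raise TypeError('undefined operand')
    return vb if _rank(va) < _rank(vb) else va
```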
# --- full-watershed-per-pour-point.py ---
# (repo: jacquealope/full-watershed-per-pour-point; licenses: Python-2.0, OLDAP-2.7)
] | null | null | null | ##-------------------------------------------------------
## Scripted in 2018 for the Bureau of Water Quality and Planning, for the batch delineation of individual watersheds.
##
## LICENSE: Feel free to use, share, or change this script. It was a lot of work to put together, and I know Nevada is not
## the only state (and we are not the only people) that needs the ability to create an individual watershed for hundreds
## or thousands of pour points. No need to give credit, as I am sharing this without any restrictions.
##
## PYTHON VERSION PARTICULARS:
## This script was made for Python 2.7.x, but a conversion to 3.x should be fairly easy, as this was written cleanly
## enough to do so; the differences will surface once you try to run it in 3.x, and you can fix them as they come up.
##
## My poor description of this script (LOL)... full outline below this...
## This script will create an individual FULL watershed for as many pour points as you have.
## The Watershed tool itself works through a raster format, so if you ran more than one pour point along a segment of a
## stream, they would end up "erased" downstream and you'd get weird-shaped watersheds like horseshoes rather than the
## full normal watershed shape. This script gives you accurate watershed coverage for EACH pour point, as well as
## accurate area calculations, and you won't have to run each individual pour point separately. Hope this explanation
## helps... LOL! If not, the outline is below.
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
## NOTES!
##
## NOTE: This script has some hard-coded values for Nevada HUC numbers; you can change these easily.
## The loops are the only places where they are hard-coded. The "walks" will gather your HUC numbers regardless of what
## they are and put those values in a list, and each loop will then look for its value in that list.
## Just change the values at the top of each loop, then comment out or copy/paste loop blocks as needed for more or fewer HUCs.
##
## NOTE: the walks are finding raster types of GRID and only those with _fd or _fa endings, which is the
## USGS naming convention for the files you need to complete this.
##
## _fd = Flow direction
## _fa = Flow accumulation
##
## NOTE: Also the final coordinate system is NAD 83 Z11N so if you need to convert to something else you will need the
## coordinate system parameters and the transformation if not from NAD83.
## These projection variables are at the bottom of the script after the last loop.
##
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
# SCRIPT OUTLINE
# 1. Gather paths and create variables
# 2. Check out Spatial Analyst License
# 3. Set environments
# 4. Create File Geodatabase for project
# 5. Set workspace to new File Geodatabase
# 6. Split pour point shapefile by HUC4 using the USGS polygon
# 7. Create list variables to hold file paths
# 8. “Walk” the network paths provided to find the necessary DEMs and Feature Classes we need and store them in lists.
# 9. Create an empty polygon Feature Class to append all the watersheds to
# 10. Add a field to store the acreage value
# 11. Create variable for all the loops to use
# 12. Loop each HUC4 pour point Feature Class created:
# a. Create loop specific variables
# b. Identify HUC4 being run (by variables) and print
# c. IF this HUC4 has pour points (if not, exit loop)
# i. Grab a count of number of points in the specified HUC4
# ii. Find the Flow accumulation DEM, if exists (if not, exit loop)
# iii. If the FA dem exists, find the Flow Direction DEM, if exists (if not, exit loop)
# iv. If the FD dem exists, select the first row in that HUC pour point Feature Class
# 1. Create a point Feature Class out of that single point in the project FGDB
# 2. Run the Snap Pour Point tool on the new Feature Class to create a PourPoint Raster in the project FGDB
# 3. Delete the single Pour Point Feature Class it created
# 4. Run the watershed tool on the Pour Point Raster
# 5. Save the Watershed Raster in the project FGDB
# 6. Convert the Watershed to Polygon, saving it in the project FGDB
# 7. Delete the Raster Watershed
# 8. Append the poly watershed to empty Feature Class created before the loop
# 9. Delete the polygon watershed
# 10. Repeat until all Points done for this HUC4
# d. Delete the HUC4 point Feature Class the loop just finished with
# e. Move to the next loop for the next HUC4 until all HUC4’s are completed.
# 13. Reproject the completed Watershed polygon Feature Class to UTM
# 14. Calculate acreage
# 15. Delete the non-UTM watershed
# 16. Remove the “dangly bits” on the watersheds (anything less than an acre)
# 17. Reproject the original Pour Point Shapefile for the state to the project FGDB
# 18. Done!
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
## INSTRUCTIONS
# 1. Close out of everything ESRI (if you had it open, check the Task Manager too!)
# 2. Gather locations for the script:
# a. Folder directory for your project
# b. What you want to call your File Geodatabase (FGDB) (no need to create one as it does so here)
# c. Where the original pour point shapefile is (in the project folder is ideal)
# d. Where the USGS HUC4 Shapefile is
# e. The upper directory of the DEM’s (from the USGS)
# 3. Open the IDLE
# a. Open start menu and type IDLE and it will come up, press enter
# 4. Open the script:
# a. Name: FINAL_watershed_script.py
# b. NOTE: Do NOT just double-click on the script, you must open it in the IDLE
# i. You can also right-click > Edit with IDLE
# ii. NOTE: Do not use ArcPro or a version of Python that is not 2.7.x; the syntax is a bit different, though not too bad, so you can edit this to conform to ArcPro in the future.
# 5. Change green text (indicates a string) areas in the script noted by a comment that says “change”, you can use the find tool to flag these
# a. If wanting to use a Find tool: best to use a program like notepad++, the IDLE one sucks
# b. Comments start with a #.
# All the areas to be changed are at the top of script:
# #change to reflect project folder location
# projectLocation = r"L:\GIS_Bureaus\GIS_BWQP\GIS_Projects\BIO_PredictiveModelWatershed\2017"
# #change to reflect project name and year
# fgdb_name = "WatershedDeliniation2017.gdb"
# #Change this to reflect your original pourPoints file, I have it here already in my project folder location
# pourPointsOrig = projectLocation + "\\2017PointData.shp"
# #change the usgsHUC4 to reflect the HUC4 shapefile or feature class location
# usgsHUC4 = r"L:/GIS_Bureaus/GIS_BWQP/GIS_Data/WatershedData/HUC4_usgs.shp"
# #This is the upper folder level of the DEM directory
# #change the variable here if the DEM location changes
# demDirectory = r"L:\GIS_Bureaus\GIS_BWQP\GIS_Data\WatershedData\DEMs"
# 6. Once you have changed the file paths, save the script (File>Save OR ctrl+S)
# 7. To run the script: press F5 or under the menu Run> Run Module
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
##---------------------------------------------------------------------------------------------------------------------------
#Import Modules
import arcpy
import os
import time
print "==================================================================================================================="
print "==================================================================================================================="
print "Script started at " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "==================================================================================================================="
print "==================================================================================================================="
#change to reflect project folder location
projectLocation = r"L:\GIS_Bureaus\GIS_BWQP\GIS_Projects\BIO_PredictiveModelWatershed\2017"
#change to reflect project name and year
fgdb_name = "WatershedDeliniation2017.gdb"
#Change this to reflect your original pourPoints file, I have it here already in my project folder location
pourPointsOrig = projectLocation + "\\2017PointData.shp"
#change the usgsHUC4 to reflect the HUC4 shapefile or feature class location
usgsHUC4 = r"L:/GIS_Bureaus/GIS_BWQP/GIS_Data/WatershedData/HUC4_usgs.shp"
#This is the upper folder level of the DEM directory
#change the variable here if the DEM location changes
demDirectory = r"L:\GIS_Bureaus\GIS_BWQP\GIS_Data\WatershedData\DEMs"
#Checkout Spatial Analyst extension
arcpy.AddMessage("Checking license... ")
if arcpy.CheckExtension("Spatial") == "Available":
arcpy.CheckOutExtension("Spatial")
arcpy.AddMessage("Spatial Analyst license checked out... ")
print "Got a Spatial Analyst License!"
else:
    arcpy.AddMessage("Spatial Analyst license needed... ")
    # LicenseError is not defined anywhere in this script, so raise a concrete exception
    raise RuntimeError("Spatial Analyst license is unavailable")
#Set environments
arcpy.env.overwriteOutput = True
arcpy.env.XYResolution = "0.00001 Meters"
arcpy.env.XYTolerance = "0.0001 Meters"
#Creates File GDB for the watersheds
arcpy.CreateFileGDB_management(projectLocation, fgdb_name)
print "created File Geodatabase"
arcpy.env.workspace = projectLocation + "\\" + fgdb_name
wrkSpace = projectLocation + "\\" + fgdb_name
print "set wrkSpace Variable to newly created File Geodatabase"
#create individual pour point feature classes by the HUC4 value
arcpy.Split_analysis(pourPointsOrig, usgsHUC4, "HUC4", wrkSpace, "")
#---------------------------------------------------------------------------------------------------------------------#
#"Walking" through the project locations specified at the top of the script to find the files the script needs, then placing them in lists we can pull from.
#If files come up "not found" when they should exist, un-comment (remove the #) the print line at the bottom of each walk (ex: print "flow direction list is " + str(flowDirList)).
#Un-commenting that line shows you the contents of the list; if it prints "flow direction list is []" with nothing in the brackets, perhaps a file name or a path was wrong.
#this is just a placeholder for this variable
flowDirList = []
flowDir = r""
#This "walk" finds the FLOW DIRECTION DEM that we need
#I have the DEM directory at the top level and it will search through the sub folders.
walk = arcpy.da.Walk(demDirectory, type="GRID")
for dirpath, dirnames, filenames in walk:
for filename in filenames:
#Change the filename variable in quotes below
if filename.endswith("_fd"):
flowDir = dirpath + "\\" + filename
flowDirList.append(flowDir)
flowDirList.sort()
#print "flow direction list is " + str(flowDirList)
print "flow direction raster list created"
#this is just a placeholder for this variable
flowAcList = []
flowAc = r""
#This "walk" finds the FLOW ACCUMULATION DEM that we need
#I have the DEM directory at the top level and it will search through the sub folders.
walk = arcpy.da.Walk(demDirectory, type="GRID")
for dirpath, dirnames, filenames in walk:
for filename in filenames:
#Change the filename variable in quotes below
if filename.endswith("_fa"):
flowAc = dirpath + "\\" + filename
flowAcList.append(flowAc)
flowAcList.sort()
#print "flow accumulation list is " + str(flowAcList)
print "flow accumulation raster list created"
pourPointFCList = []
featureClass = ""
#This "walk" finds the pour point feature classes that we just made using the split tool and creates a list to store them
walk = arcpy.da.Walk(wrkSpace, type="POINT")
for dirpath, dirnames, filenames in walk:
for filename in filenames:
if filename.startswith('r'):
featureClass = dirpath + "\\" + filename
pourPointFCList.append(featureClass)
pourPointFCList.sort()
print "pour point FC list created"
#print "my pour point list by HUC4 is "+ str(pourPointFCList)
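# All three arcpy.da.Walk scans above share one shape: recurse a tree, keep the
# entries whose name passes a test, then sort. A plain-os sketch of that pattern
# (collect_paths is an invented helper) can be handy for checking what the walks
# should find without ArcGIS; note that Esri GRID rasters are folders on disk,
# so this sketch checks directory names as well as file names:

```python
import os

def collect_paths(top, name_test):
    """Recurse `top` and return sorted paths whose basename passes name_test."""
    found = []
    for dirpath, dirnames, filenames in os.walk(top):
        # GRID rasters appear as directories, ordinary files as filenames.
        for name in filenames + dirnames:
            if name_test(name):
                found.append(os.path.join(dirpath, name))
    found.sort()
    return found

# e.g. flow-direction candidates:
#   collect_paths(demDirectory, lambda n: n.endswith("_fd"))
```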
#---------------------------------------------------------------------------------------------------------------------#
#create empty feature class for the final merging of watersheds
outName = "AllWatersheds"
schemaType = "NO_TEST"
fieldMappings = ""
subtype = ""
print "Creating Feature Class to merge all watersheds into..."
# the FC (templateFC) called out in this tool is a template FC to use for the new one that will have all the necessary fields we need
#Not sure why everything is in NAD83 Albers, but it is, so I am making the final merged FC Albers as well. Since it is NAD83, a transformation is not needed to reproject to UTM.
templateFC = r"L:\GIS_Bureaus\GIS_BWQP\GIS_Projects\BIO_PredictiveModelWatershed\PythonScripts\PythonTemplateFCs.gdb\TemplateWatershed"
spatial_reference_ALBERS = "PROJCS['NAD_1983_Albers',GEOGCS['GCS_North_American_1983',DATUM['D_North_American_1983',SPHEROID['GRS_1980',6378137.0,298.257222101]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Albers'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',-96.0],PARAMETER['Standard_Parallel_1',29.5],PARAMETER['Standard_Parallel_2',45.5],PARAMETER['Latitude_Of_Origin',23.0],UNIT['Meter',1.0]];-16901100 -6972200 10000;-100000 10000;-100000 10000;0.001;0.001;0.001;IsHighPrecision"
arcpy.CreateFeatureclass_management(wrkSpace, outName, "POLYGON", templateFC, "DISABLED", "DISABLED", spatial_reference_ALBERS, config_keyword="", spatial_grid_1="0", spatial_grid_2="0", spatial_grid_3="0")
print "Feature class " + outName + " created"
emptyFC = wrkSpace + "\\" + outName
print "New Feature Class is " + emptyFC
fieldName = "DA_acresUS"
#adds acres column into new FC
arcpy.AddField_management (emptyFC, fieldName, "DOUBLE")
#---------------------------------------------------------------------------------------------------------------------#
#loop time!
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
print "LOOP TIME! YEAH! This may take a long while.... be patient"
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
#loop variables
pourPointPoly = ""
pourPointRaster = ""
pointSelection = ""
#these variables will be repeated before each HUC4 loop
item = wrkSpace + str("\\r1501")
demFA = demDirectory + str("\\r1501\\r1501_fa")
demFD = demDirectory + str("\\r1501\\r1501_fd")
pourPointPolyOutput = wrkSpace + "\\p1501"
#loop for HUC4 1501
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
print "HUC4 1501 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
if item in pourPointFCList:
print "pour point feature class list contains HUC 1501..."+ str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "-------------------------------------------------------------"
    print "In HUC4 1501, there are " + countPourPoint.getOutput(0) + " points"
print "-------------------------------------------------------------"
updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
if demFA in flowAcList:
print "Flow Accumulation DEM found..." + str(demFA)
if demFD in flowDirList:
print "Flow Direction DEM found..." + str(demFD)
for row in updateRows:
#SQL condition variable where it loops for every FID in the dataset
where_clause = "OBJECTID_1 = {0}".format(row[0])
#the feature class name variable for the output
out_feature_class = wrkSpace + "\\ws" + str(row[0])
#Use select by condition to create individual feature class for each pour point row for this HUC
pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
print "feature class created for..." + str(pointSelection)
#create the pour point raster
pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
#Delete pour point feature class as it's no longer needed
arcpy.Delete_management(pointSelection)
print "pour point feature class deleted..." + str(pointSelection)
#Use watershed method in spatial analyst to create watershed delineation for each pour point, then save as raster
watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
watershed.save(projectLocation + "\\ws" + str(row[0]))
print "watershed created..." + str(watershed)
#Delete pour point feature class as it's no longer needed
arcpy.Delete_management(pourPointRaster)
print "pour point raster deleted..." + str(pourPointRaster)
#Convert the raster of the watershed into a polygon of the watershed
wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
print "converted raster watershed to polygon"
#Delete the raster watershed file as it's no longer needed
arcpy.Delete_management(watershed)
print "deleted raster watershed..." + str(watershed)
#merge all 1604 HUC watersheds
arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
print "appended..." + str(wsPoly)
#deleting the Albers FC
arcpy.Delete_management(wsPoly)
print "Deleted the watershed FC " + str(wsPoly)
updateRows.updateRow(row)
print "------------------------ready for the next row in 1501!------------------------"
del updateRows
del row
else:
print "Flow Direction DEM r1501_fD not found"
else:
print "Flow Accumulation DEM r1501_fa not found"
#deleting the split pour point feature class because we don't need it anymore
arcpy.Delete_management(item)
print "Deleted the split pour point Feature Class for 1501"
else:
print "1501 pour point not found"
print "...only be worried if you know you have points here..."
print "done with 1501 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
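# Inside each per-row loop above, the where clause and output names are all derived
# from the row's OBJECTID_1. A self-contained sketch of that derivation; the helper
# name is illustrative and the script does not define or call it:

```python
def row_outputs(wrk_space, project_location, oid):
    """Return (where_clause, out_feature_class, pour_point_raster) for one OBJECTID_1."""
    where_clause = "OBJECTID_1 = {0}".format(oid)                  # selects this row only
    out_feature_class = wrk_space + "\\ws" + str(oid)              # per-point feature class
    pour_point_raster = project_location + "\\ppr" + str(oid)      # snapped pour point raster
    return where_clause, out_feature_class, pour_point_raster
```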
#---------------------------------------------------------------------------------------------------------------------#
#these variables are re-assigned before each HUC4 block
item = wrkSpace + "\\r1503"
demFA = demDirectory + "\\r1503\\r1503_fa"
demFD = demDirectory + "\\r1503\\r1503_fd"
pourPointPolyOutput = wrkSpace + "\\p1503"
#processing block for HUC4 1503
print "-------------------------------------------------------------"
print "HUC4 1503 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
if item in pourPointFCList:
    print "pour point feature class list contains HUC 1503..." + str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "In HUC4 1503, there are " + countPourPoint.getOutput(0) + " points"
    updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
    if demFA in flowAcList:
        print "Flow Accumulation DEM found..." + str(demFA)
        if demFD in flowDirList:
            print "Flow Direction DEM found..." + str(demFD)
            for row in updateRows:
                #SQL where clause selecting the current pour point by OBJECTID_1
                where_clause = "OBJECTID_1 = {0}".format(row[0])
                #output feature class name for this pour point
                out_feature_class = wrkSpace + "\\ws" + str(row[0])
                #Select writes an individual feature class for this pour point row
                pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
                print "feature class created for..." + str(pointSelection)
                #create the pour point raster
                pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
                pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
                print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
                #delete the pour point feature class; it's no longer needed
                arcpy.Delete_management(pointSelection)
                print "pour point feature class deleted..." + str(pointSelection)
                #Watershed (Spatial Analyst) delineates the watershed for this pour point; save it as a raster
                watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
                watershed.save(projectLocation + "\\ws" + str(row[0]))
                print "watershed created..." + str(watershed)
                #delete the pour point raster; it's no longer needed
                arcpy.Delete_management(pourPointRaster)
                print "pour point raster deleted..." + str(pourPointRaster)
                #convert the watershed raster into a watershed polygon
                wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
                print "converted raster watershed to polygon"
                #delete the watershed raster; it's no longer needed
                arcpy.Delete_management(watershed)
                print "deleted raster watershed..." + str(watershed)
                #append this watershed polygon to the merged output feature class
                arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
                print "appended..." + str(wsPoly)
                #delete the per-row watershed polygon FC
                arcpy.Delete_management(wsPoly)
                print "Deleted the watershed FC " + str(wsPoly)
                #no fields were changed, so this commits the row unchanged
                updateRows.updateRow(row)
                print "------------------------ready for the next row in 1503!------------------------"
            del updateRows
            del row
        else:
            print "Flow Direction DEM r1503_fd not found"
    else:
        print "Flow Accumulation DEM r1503_fa not found"
    #delete the split pour point feature class; it's no longer needed
    arcpy.Delete_management(item)
    print "Deleted the split pour point Feature Class for 1503"
else:
    print "1503 pour point not found"
    print "...only be worried if you know you have points here..."
print "done with 1503 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
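# Conceptually, the arcpy.sa.Watershed call in each block labels every cell of the
# flow-direction grid that drains to the snapped pour point. A toy, pure-Python
# sketch of that D8 idea; this is illustrative only, assumes the ESRI D8 direction
# codes, and is not a reimplementation of how arcpy computes it:

```python
# ESRI D8 direction codes -> (row, col) offset of the downstream neighbor
D8 = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
      16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

def watershed_cells(flow_dir, pour):
    """Return the set of cells (r, c) that drain to the pour cell.

    flow_dir: 2-D list of D8 codes; pour: (row, col) of the pour point.
    """
    rows, cols = len(flow_dir), len(flow_dir[0])
    # invert the drainage graph: downstream cell -> list of upstream cells
    upstream = {}
    for r in range(rows):
        for c in range(cols):
            off = D8.get(flow_dir[r][c])
            if off is None:
                continue  # outlet / undefined direction: drains nowhere in-grid
            dr, dc = r + off[0], c + off[1]
            if 0 <= dr < rows and 0 <= dc < cols:
                upstream.setdefault((dr, dc), []).append((r, c))
    # walk upstream from the pour point, collecting every contributing cell
    seen = {pour}
    stack = [pour]
    while stack:
        cell = stack.pop()
        for up in upstream.get(cell, []):
            if up not in seen:
                seen.add(up)
                stack.append(up)
    return seen
```

# On a 2x2 grid where every other cell drains to (1, 1), the watershed of (1, 1)
# is the whole grid, while a cell nothing drains to is its own watershed.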
#---------------------------------------------------------------------------------------------------------------------#
#these variables are re-assigned before each HUC4 block
item = wrkSpace + "\\r1601"
demFA = demDirectory + "\\r1601\\r1601_fa"
demFD = demDirectory + "\\r1601\\r1601_fd"
pourPointPolyOutput = wrkSpace + "\\p1601"
#processing block for HUC4 1601
print "-------------------------------------------------------------"
print "HUC4 1601 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
if item in pourPointFCList:
    print "pour point feature class list contains HUC 1601..." + str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "In HUC4 1601, there are " + countPourPoint.getOutput(0) + " points"
    updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
    if demFA in flowAcList:
        print "Flow Accumulation DEM found..." + str(demFA)
        if demFD in flowDirList:
            print "Flow Direction DEM found..." + str(demFD)
            for row in updateRows:
                #SQL where clause selecting the current pour point by OBJECTID_1
                where_clause = "OBJECTID_1 = {0}".format(row[0])
                #output feature class name for this pour point
                out_feature_class = wrkSpace + "\\ws" + str(row[0])
                #Select writes an individual feature class for this pour point row
                pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
                print "feature class created for..." + str(pointSelection)
                #create the pour point raster
                pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
                pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
                print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
                #delete the pour point feature class; it's no longer needed
                arcpy.Delete_management(pointSelection)
                print "pour point feature class deleted..." + str(pointSelection)
                #Watershed (Spatial Analyst) delineates the watershed for this pour point; save it as a raster
                watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
                watershed.save(projectLocation + "\\ws" + str(row[0]))
                print "watershed created..." + str(watershed)
                #delete the pour point raster; it's no longer needed
                arcpy.Delete_management(pourPointRaster)
                print "pour point raster deleted..." + str(pourPointRaster)
                #convert the watershed raster into a watershed polygon
                wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
                print "converted raster watershed to polygon"
                #delete the watershed raster; it's no longer needed
                arcpy.Delete_management(watershed)
                print "deleted raster watershed..." + str(watershed)
                #append this watershed polygon to the merged output feature class
                arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
                print "appended..." + str(wsPoly)
                #delete the per-row watershed polygon FC
                arcpy.Delete_management(wsPoly)
                print "Deleted the watershed FC " + str(wsPoly)
                #no fields were changed, so this commits the row unchanged
                updateRows.updateRow(row)
                print "------------------------ready for the next row in 1601!------------------------"
            del updateRows
            del row
        else:
            print "Flow Direction DEM r1601_fd not found"
    else:
        print "Flow Accumulation DEM r1601_fa not found"
    #delete the split pour point feature class; it's no longer needed
    arcpy.Delete_management(item)
    print "Deleted the split pour point Feature Class for 1601"
else:
    print "1601 pour point not found"
    print "...only be worried if you know you have points here..."
print "done with 1601 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables are re-assigned before each HUC4 block
item = wrkSpace + "\\r1602"
demFA = demDirectory + "\\r1602\\r1602_fa"
demFD = demDirectory + "\\r1602\\r1602_fd"
pourPointPolyOutput = wrkSpace + "\\p1602"
#processing block for HUC4 1602
print "-------------------------------------------------------------"
print "HUC4 1602 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
if item in pourPointFCList:
    print "pour point feature class list contains HUC 1602..." + str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "In HUC4 1602, there are " + countPourPoint.getOutput(0) + " points"
    updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
    if demFA in flowAcList:
        print "Flow Accumulation DEM found..." + str(demFA)
        if demFD in flowDirList:
            print "Flow Direction DEM found..." + str(demFD)
            for row in updateRows:
                #SQL where clause selecting the current pour point by OBJECTID_1
                where_clause = "OBJECTID_1 = {0}".format(row[0])
                #output feature class name for this pour point
                out_feature_class = wrkSpace + "\\ws" + str(row[0])
                #Select writes an individual feature class for this pour point row
                pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
                print "feature class created for..." + str(pointSelection)
                #create the pour point raster
                pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
                pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
                print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
                #delete the pour point feature class; it's no longer needed
                arcpy.Delete_management(pointSelection)
                print "pour point feature class deleted..." + str(pointSelection)
                #Watershed (Spatial Analyst) delineates the watershed for this pour point; save it as a raster
                watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
                watershed.save(projectLocation + "\\ws" + str(row[0]))
                print "watershed created..." + str(watershed)
                #delete the pour point raster; it's no longer needed
                arcpy.Delete_management(pourPointRaster)
                print "pour point raster deleted..." + str(pourPointRaster)
                #convert the watershed raster into a watershed polygon
                wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
                print "converted raster watershed to polygon"
                #delete the watershed raster; it's no longer needed
                arcpy.Delete_management(watershed)
                print "deleted raster watershed..." + str(watershed)
                #append this watershed polygon to the merged output feature class
                arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
                print "appended..." + str(wsPoly)
                #delete the per-row watershed polygon FC
                arcpy.Delete_management(wsPoly)
                print "Deleted the watershed FC " + str(wsPoly)
                #no fields were changed, so this commits the row unchanged
                updateRows.updateRow(row)
                print "------------------------ready for the next row in 1602!------------------------"
            del updateRows
            del row
        else:
            print "Flow Direction DEM r1602_fd not found"
    else:
        print "Flow Accumulation DEM r1602_fa not found"
    #delete the split pour point feature class; it's no longer needed
    arcpy.Delete_management(item)
    print "Deleted the split pour point Feature Class for 1602"
else:
    print "1602 pour point not found"
    print "...only be worried if you know you have points here..."
print "done with 1602 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables are re-assigned before each HUC4 block
item = wrkSpace + "\\r1603"
demFA = demDirectory + "\\r1603\\r1603_fa"
demFD = demDirectory + "\\r1603\\r1603_fd"
pourPointPolyOutput = wrkSpace + "\\p1603"
#processing block for HUC4 1603
print "-------------------------------------------------------------"
print "HUC4 1603 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
if item in pourPointFCList:
    print "pour point feature class list contains HUC 1603..." + str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "In HUC4 1603, there are " + countPourPoint.getOutput(0) + " points"
    updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
    if demFA in flowAcList:
        print "Flow Accumulation DEM found..." + str(demFA)
        if demFD in flowDirList:
            print "Flow Direction DEM found..." + str(demFD)
            for row in updateRows:
                #SQL where clause selecting the current pour point by OBJECTID_1
                where_clause = "OBJECTID_1 = {0}".format(row[0])
                #output feature class name for this pour point
                out_feature_class = wrkSpace + "\\ws" + str(row[0])
                #Select writes an individual feature class for this pour point row
                pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
                print "feature class created for..." + str(pointSelection)
                #create the pour point raster
                pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
                pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
                print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
                #delete the pour point feature class; it's no longer needed
                arcpy.Delete_management(pointSelection)
                print "pour point feature class deleted..." + str(pointSelection)
                #Watershed (Spatial Analyst) delineates the watershed for this pour point; save it as a raster
                watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
                watershed.save(projectLocation + "\\ws" + str(row[0]))
                print "watershed created..." + str(watershed)
                #delete the pour point raster; it's no longer needed
                arcpy.Delete_management(pourPointRaster)
                print "pour point raster deleted..." + str(pourPointRaster)
                #convert the watershed raster into a watershed polygon
                wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
                print "converted raster watershed to polygon"
                #delete the watershed raster; it's no longer needed
                arcpy.Delete_management(watershed)
                print "deleted raster watershed..." + str(watershed)
                #append this watershed polygon to the merged output feature class
                arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
                print "appended..." + str(wsPoly)
                #delete the per-row watershed polygon FC
                arcpy.Delete_management(wsPoly)
                print "Deleted the watershed FC " + str(wsPoly)
                #no fields were changed, so this commits the row unchanged
                updateRows.updateRow(row)
                print "------------------------ready for the next row in 1603!------------------------"
            del updateRows
            del row
        else:
            print "Flow Direction DEM r1603_fd not found"
    else:
        print "Flow Accumulation DEM r1603_fa not found"
    #delete the split pour point feature class; it's no longer needed
    arcpy.Delete_management(item)
    print "Deleted the split pour point Feature Class for 1603"
else:
    print "1603 pour point not found"
    print "...only be worried if you know you have points here..."
print "done with 1603 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables are re-assigned before each HUC4 block
item = wrkSpace + "\\r1604"
demFA = demDirectory + "\\r1604\\r1604_fa"
demFD = demDirectory + "\\r1604\\r1604_fd"
pourPointPolyOutput = wrkSpace + "\\p1604"
#processing block for HUC4 1604
print "-------------------------------------------------------------"
print "HUC4 1604 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
if item in pourPointFCList:
    print "pour point feature class list contains HUC 1604..." + str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "In HUC4 1604, there are " + countPourPoint.getOutput(0) + " points"
    updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
    if demFA in flowAcList:
        print "Flow Accumulation DEM found..." + str(demFA)
        if demFD in flowDirList:
            print "Flow Direction DEM found..." + str(demFD)
            for row in updateRows:
                #SQL where clause selecting the current pour point by OBJECTID_1
                where_clause = "OBJECTID_1 = {0}".format(row[0])
                #output feature class name for this pour point
                out_feature_class = wrkSpace + "\\ws" + str(row[0])
                #Select writes an individual feature class for this pour point row
                pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
                print "feature class created for..." + str(pointSelection)
                #create the pour point raster
                pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
                pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
                print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
                #delete the pour point feature class; it's no longer needed
                arcpy.Delete_management(pointSelection)
                print "pour point feature class deleted..." + str(pointSelection)
                #Watershed (Spatial Analyst) delineates the watershed for this pour point; save it as a raster
                watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
                watershed.save(projectLocation + "\\ws" + str(row[0]))
                print "watershed created..." + str(watershed)
                #delete the pour point raster; it's no longer needed
                arcpy.Delete_management(pourPointRaster)
                print "pour point raster deleted..." + str(pourPointRaster)
                #convert the watershed raster into a watershed polygon
                wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
                print "converted raster watershed to polygon"
                #delete the watershed raster; it's no longer needed
                arcpy.Delete_management(watershed)
                print "deleted raster watershed..." + str(watershed)
                #append this watershed polygon to the merged output feature class
                arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
                print "appended..." + str(wsPoly)
                #delete the per-row watershed polygon FC
                arcpy.Delete_management(wsPoly)
                print "Deleted the watershed FC " + str(wsPoly)
                #no fields were changed, so this commits the row unchanged
                updateRows.updateRow(row)
                print "------------------------ready for the next row in 1604!------------------------"
            del updateRows
            del row
        else:
            print "Flow Direction DEM r1604_fd not found"
    else:
        print "Flow Accumulation DEM r1604_fa not found"
    #delete the split pour point feature class; it's no longer needed
    arcpy.Delete_management(item)
    print "Deleted the split pour point Feature Class for 1604"
else:
    print "1604 pour point not found"
    print "...only be worried if you know you have points here..."
print "done with 1604 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables are re-assigned before each HUC4 block
item = wrkSpace + "\\r1605"
demFA = demDirectory + "\\r1605\\r1605_fa"
demFD = demDirectory + "\\r1605\\r1605_fd"
pourPointPolyOutput = wrkSpace + "\\p1605"
#processing block for HUC4 1605
print "-------------------------------------------------------------"
print "HUC4 1605 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
if item in pourPointFCList:
    print "pour point feature class list contains HUC 1605..." + str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "In HUC4 1605, there are " + countPourPoint.getOutput(0) + " points"
    updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
    if demFA in flowAcList:
        print "Flow Accumulation DEM found..." + str(demFA)
        if demFD in flowDirList:
            print "Flow Direction DEM found..." + str(demFD)
            for row in updateRows:
                #SQL where clause selecting the current pour point by OBJECTID_1
                where_clause = "OBJECTID_1 = {0}".format(row[0])
                #output feature class name for this pour point
                out_feature_class = wrkSpace + "\\ws" + str(row[0])
                #Select writes an individual feature class for this pour point row
                pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
                print "feature class created for..." + str(pointSelection)
                #create the pour point raster
                pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
                pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
                print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
                #delete the pour point feature class; it's no longer needed
                arcpy.Delete_management(pointSelection)
                print "pour point feature class deleted..." + str(pointSelection)
                #Watershed (Spatial Analyst) delineates the watershed for this pour point; save it as a raster
                watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
                watershed.save(projectLocation + "\\ws" + str(row[0]))
                print "watershed created..." + str(watershed)
                #delete the pour point raster; it's no longer needed
                arcpy.Delete_management(pourPointRaster)
                print "pour point raster deleted..." + str(pourPointRaster)
                #convert the watershed raster into a watershed polygon
                wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
                print "converted raster watershed to polygon"
                #delete the watershed raster; it's no longer needed
                arcpy.Delete_management(watershed)
                print "deleted raster watershed..." + str(watershed)
                #append this watershed polygon to the merged output feature class
                arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
                print "appended..." + str(wsPoly)
                #delete the per-row watershed polygon FC
                arcpy.Delete_management(wsPoly)
                print "Deleted the watershed FC " + str(wsPoly)
                #no fields were changed, so this commits the row unchanged
                updateRows.updateRow(row)
                print "------------------------ready for the next row in 1605!------------------------"
            del updateRows
            del row
        else:
            print "Flow Direction DEM r1605_fd not found"
    else:
        print "Flow Accumulation DEM r1605_fa not found"
    #delete the split pour point feature class; it's no longer needed
    arcpy.Delete_management(item)
    print "Deleted the split pour point Feature Class for 1605"
else:
    print "1605 pour point not found"
    print "...only be worried if you know you have points here..."
print "done with 1605 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables are re-assigned before each HUC4 block
item = wrkSpace + "\\r1606a"
demFA = demDirectory + "\\r1606a\\r1606a_fa"
demFD = demDirectory + "\\r1606a\\r1606a_fd"
pourPointPolyOutput = wrkSpace + "\\p1606a"
#processing block for HUC4 1606a
print "-------------------------------------------------------------"
print "HUC4 1606a started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
if item in pourPointFCList:
    print "pour point feature class list contains HUC 1606a..." + str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "In HUC4 1606a, there are " + countPourPoint.getOutput(0) + " points"
    updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
    if demFA in flowAcList:
        print "Flow Accumulation DEM found..." + str(demFA)
        if demFD in flowDirList:
            print "Flow Direction DEM found..." + str(demFD)
            for row in updateRows:
                #SQL where clause selecting the current pour point by OBJECTID_1
                where_clause = "OBJECTID_1 = {0}".format(row[0])
                #output feature class name for this pour point
                out_feature_class = wrkSpace + "\\ws" + str(row[0])
                #Select writes an individual feature class for this pour point row
                pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
                print "feature class created for..." + str(pointSelection)
                #create the pour point raster
                pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
                pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
                print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
                #delete the pour point feature class; it's no longer needed
                arcpy.Delete_management(pointSelection)
                print "pour point feature class deleted..." + str(pointSelection)
                #Watershed (Spatial Analyst) delineates the watershed for this pour point; save it as a raster
                watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
                watershed.save(projectLocation + "\\ws" + str(row[0]))
                print "watershed created..." + str(watershed)
                #delete the pour point raster; it's no longer needed
                arcpy.Delete_management(pourPointRaster)
                print "pour point raster deleted..." + str(pourPointRaster)
                #convert the watershed raster into a watershed polygon
                wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
                print "converted raster watershed to polygon"
                #delete the watershed raster; it's no longer needed
                arcpy.Delete_management(watershed)
                print "deleted raster watershed..." + str(watershed)
                #append this watershed polygon to the merged output feature class
                arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
                print "appended..." + str(wsPoly)
                #delete the per-row watershed polygon FC
                arcpy.Delete_management(wsPoly)
                print "Deleted the watershed FC " + str(wsPoly)
                #no fields were changed, so this commits the row unchanged
                updateRows.updateRow(row)
                print "------------------------ready for the next row in 1606a!------------------------"
            del updateRows
            del row
        else:
            print "Flow Direction DEM r1606a_fd not found"
    else:
        print "Flow Accumulation DEM r1606a_fa not found"
    #delete the split pour point feature class; it's no longer needed
    arcpy.Delete_management(item)
    print "Deleted the split pour point Feature Class for 1606a"
else:
    print "1606a pour point not found"
    print "...only be worried if you know you have points here..."
print "done with 1606a at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables are re-assigned before each HUC4 block
item = wrkSpace + "\\r1606b"
demFA = demDirectory + "\\r1606b\\r1606b_fa"
demFD = demDirectory + "\\r1606b\\r1606b_fd"
pourPointPolyOutput = wrkSpace + "\\p1606b"
#processing block for HUC4 1606b
print "-------------------------------------------------------------"
print "HUC4 1606b started " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
if item in pourPointFCList:
    print "pour point feature class list contains HUC 1606b..." + str(item)
    countPourPoint = arcpy.GetCount_management(item)
    print "In HUC4 1606b, there are " + countPourPoint.getOutput(0) + " points"
    updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
    if demFA in flowAcList:
        print "Flow Accumulation DEM found..." + str(demFA)
        if demFD in flowDirList:
            print "Flow Direction DEM found..." + str(demFD)
            for row in updateRows:
                #SQL where clause selecting the current pour point by OBJECTID_1
                where_clause = "OBJECTID_1 = {0}".format(row[0])
                #output feature class name for this pour point
                out_feature_class = wrkSpace + "\\ws" + str(row[0])
                #Select writes an individual feature class for this pour point row
                pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
                print "feature class created for..." + str(pointSelection)
                #create the pour point raster
                pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
                pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
                print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
                #delete the pour point feature class; it's no longer needed
                arcpy.Delete_management(pointSelection)
                print "pour point feature class deleted..." + str(pointSelection)
                #Watershed (Spatial Analyst) delineates the watershed for this pour point; save it as a raster
                watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
                watershed.save(projectLocation + "\\ws" + str(row[0]))
                print "watershed created..." + str(watershed)
                #delete the pour point raster; it's no longer needed
                arcpy.Delete_management(pourPointRaster)
                print "pour point raster deleted..." + str(pourPointRaster)
                #convert the watershed raster into a watershed polygon
                wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
                print "converted raster watershed to polygon"
                #delete the watershed raster; it's no longer needed
                arcpy.Delete_management(watershed)
                print "deleted raster watershed..." + str(watershed)
                #append this watershed polygon to the merged output feature class
                arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
                print "appended..." + str(wsPoly)
                #delete the per-row watershed polygon FC
                arcpy.Delete_management(wsPoly)
                print "Deleted the watershed FC " + str(wsPoly)
                #no fields were changed, so this commits the row unchanged
                updateRows.updateRow(row)
                print "------------------------ready for the next row in 1606b!------------------------"
            del updateRows
            del row
        else:
            print "Flow Direction DEM r1606b_fd not found"
    else:
        print "Flow Accumulation DEM r1606b_fa not found"
    #delete the split pour point feature class; it's no longer needed
    arcpy.Delete_management(item)
    print "Deleted the split pour point Feature Class for 1606b"
else:
    print "1606b pour point not found"
    print "...only be worried if you know you have points here..."
print "done with 1606b at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving onto the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables will be repeated before each HUC4 loop
item = wrkSpace + str("\\r1704")
demFA = demDirectory + str("\\r1704\\r1704_fa")
demFD = demDirectory + str("\\r1704\\r1704_fd")
pourPointPolyOutput = wrkSpace + "\\p1704"
#loop for HUC4 1704
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
print "HUC4 1704 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
if item in pourPointFCList:
print "pour point feature class list contains HUC 1704..." + str(item)
countPourPoint = arcpy.GetCount_management(item)
print "-------------------------------------------------------------"
print "In HUC4 1704, there are " + countPourPoint.getOutput(0) + " points"
print "-------------------------------------------------------------"
updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
if demFA in flowAcList:
print "Flow Accumulation DEM found..." + str(demFA)
if demFD in flowDirList:
print "Flow Direction DEM found..." + str(demFD)
for row in updateRows:
#SQL condition variable where it loops for every FID in the dataset
where_clause = "OBJECTID_1 = {0}".format(row[0])
#the feature class name variable for the output
out_feature_class = wrkSpace + "\\ws" + str(row[0])
#Use select by condition to create individual feature class for each pour point row for this HUC
pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
print "feature class created for..." + str(pointSelection)
#create the pour point raster
pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
#Delete pour point feature class as it's no longer needed
arcpy.Delete_management(pointSelection)
print "pour point feature class deleted..." + str(pointSelection)
#Use watershed method in spatial analyst to create watershed delineation for each pour point, then save as raster
watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
watershed.save(projectLocation + "\\ws" + str(row[0]))
print "watershed created..." + str(watershed)
#Delete the pour point raster as it's no longer needed
arcpy.Delete_management(pourPointRaster)
print "pour point raster deleted..." + str(pourPointRaster)
#Convert the raster of the watershed into a polygon of the watershed
wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
print "converted raster watershed to polygon"
#Delete the raster watershed file as it's no longer needed
arcpy.Delete_management(watershed)
print "deleted raster watershed..." + str(watershed)
#merge all 1704 HUC watersheds
arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
print "appended..." + str(wsPoly)
#deleting the Albers FC
arcpy.Delete_management(wsPoly)
print "Deleted the watershed FC " + str(wsPoly)
updateRows.updateRow(row)
print "------------------------ready for the next row in 1704!------------------------"
del updateRows
del row
else:
print "Flow Direction DEM r1704_fd not found"
else:
print "Flow Accumulation DEM r1704_fa not found"
#deleting the split pour point feature class because we don't need it anymore
arcpy.Delete_management(item)
print "Deleted the split pour point Feature Class for 1704"
else:
print "1704 pour point not found"
print "...only worry if you know you have points here..."
print "done with 1704 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving on to the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables will be repeated before each HUC4 loop
item = wrkSpace + str("\\r1705")
demFA = demDirectory + str("\\r1705\\r1705_fa")
demFD = demDirectory + str("\\r1705\\r1705_fd")
pourPointPolyOutput = wrkSpace + "\\p1705"
#loop for HUC4 1705
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
print "HUC4 1705 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
if item in pourPointFCList:
print "pour point feature class list contains HUC 1705..." + str(item)
countPourPoint = arcpy.GetCount_management(item)
print "-------------------------------------------------------------"
print "In HUC4 1705, there are " + countPourPoint.getOutput(0) + " points"
print "-------------------------------------------------------------"
updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
if demFA in flowAcList:
print "Flow Accumulation DEM found..." + str(demFA)
if demFD in flowDirList:
print "Flow Direction DEM found..." + str(demFD)
for row in updateRows:
#SQL condition variable where it loops for every FID in the dataset
where_clause = "OBJECTID_1 = {0}".format(row[0])
#the feature class name variable for the output
out_feature_class = wrkSpace + "\\ws" + str(row[0])
#Use select by condition to create individual feature class for each pour point row for this HUC
pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
print "feature class created for..." + str(pointSelection)
#create the pour point raster
pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
#Delete pour point feature class as it's no longer needed
arcpy.Delete_management(pointSelection)
print "pour point feature class deleted..." + str(pointSelection)
#Use watershed method in spatial analyst to create watershed delineation for each pour point, then save as raster
watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
watershed.save(projectLocation + "\\ws" + str(row[0]))
print "watershed created..." + str(watershed)
#Delete the pour point raster as it's no longer needed
arcpy.Delete_management(pourPointRaster)
print "pour point raster deleted..." + str(pourPointRaster)
#Convert the raster of the watershed into a polygon of the watershed
wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
print "converted raster watershed to polygon"
#Delete the raster watershed file as it's no longer needed
arcpy.Delete_management(watershed)
print "deleted raster watershed..." + str(watershed)
#merge all 1705 HUC watersheds
arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
print "appended..." + str(wsPoly)
#deleting the Albers FC
arcpy.Delete_management(wsPoly)
print "Deleted the watershed FC " + str(wsPoly)
updateRows.updateRow(row)
print "------------------------ready for the next row in 1705!------------------------"
del updateRows
del row
else:
print "Flow Direction DEM r1705_fd not found"
else:
print "Flow Accumulation DEM r1705_fa not found"
#deleting the split pour point feature class because we don't need it anymore
arcpy.Delete_management(item)
print "Deleted the split pour point Feature Class for 1705"
else:
print "1705 pour point not found"
print "...only worry if you know you have points here..."
print "done with 1705 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving on to the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables will be repeated before each HUC4 loop
item = wrkSpace + str("\\r1712")
demFA = demDirectory + str("\\r1712\\r1712_fa")
demFD = demDirectory + str("\\r1712\\r1712_fd")
pourPointPolyOutput = wrkSpace + "\\p1712"
#loop for HUC4 1712
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
print "HUC4 1712 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
if item in pourPointFCList:
print "pour point feature class list contains HUC 1712..." + str(item)
countPourPoint = arcpy.GetCount_management(item)
print "-------------------------------------------------------------"
print "In HUC4 1712, there are " + countPourPoint.getOutput(0) + " points"
print "-------------------------------------------------------------"
updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
if demFA in flowAcList:
print "Flow Accumulation DEM found..." + str(demFA)
if demFD in flowDirList:
print "Flow Direction DEM found..." + str(demFD)
for row in updateRows:
#SQL condition variable where it loops for every FID in the dataset
where_clause = "OBJECTID_1 = {0}".format(row[0])
#the feature class name variable for the output
out_feature_class = wrkSpace + "\\ws" + str(row[0])
#Use select by condition to create individual feature class for each pour point row for this HUC
pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
print "feature class created for..." + str(pointSelection)
#create the pour point raster
pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
#Delete pour point feature class as it's no longer needed
arcpy.Delete_management(pointSelection)
print "pour point feature class deleted..." + str(pointSelection)
#Use watershed method in spatial analyst to create watershed delineation for each pour point, then save as raster
watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
watershed.save(projectLocation + "\\ws" + str(row[0]))
print "watershed created..." + str(watershed)
#Delete the pour point raster as it's no longer needed
arcpy.Delete_management(pourPointRaster)
print "pour point raster deleted..." + str(pourPointRaster)
#Convert the raster of the watershed into a polygon of the watershed
wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
print "converted raster watershed to polygon"
#Delete the raster watershed file as it's no longer needed
arcpy.Delete_management(watershed)
print "deleted raster watershed..." + str(watershed)
#merge all 1712 HUC watersheds
arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
print "appended..." + str(wsPoly)
#deleting the Albers FC
arcpy.Delete_management(wsPoly)
print "Deleted the watershed FC " + str(wsPoly)
updateRows.updateRow(row)
print "------------------------ready for the next row in 1712!------------------------"
del updateRows
del row
else:
print "Flow Direction DEM r1712_fd not found"
else:
print "Flow Accumulation DEM r1712_fa not found"
#deleting the split pour point feature class because we don't need it anymore
arcpy.Delete_management(item)
print "Deleted the split pour point Feature Class for 1712"
else:
print "1712 pour point not found"
print "...only worry if you know you have points here..."
print "done with 1712 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving on to the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables will be repeated before each HUC4 loop
item = wrkSpace + str("\\r1808")
demFA = demDirectory + str("\\r1808\\r1808_fa")
demFD = demDirectory + str("\\r1808\\r1808_fd")
pourPointPolyOutput = wrkSpace + "\\p1808"
#loop for HUC4 1808
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
print "HUC4 1808 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
if item in pourPointFCList:
print "pour point feature class list contains HUC 1808..." + str(item)
countPourPoint = arcpy.GetCount_management(item)
print "-------------------------------------------------------------"
print "In HUC4 1808, there are " + countPourPoint.getOutput(0) + " points"
print "-------------------------------------------------------------"
updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
if demFA in flowAcList:
print "Flow Accumulation DEM found..." + str(demFA)
if demFD in flowDirList:
print "Flow Direction DEM found..." + str(demFD)
for row in updateRows:
#SQL condition variable where it loops for every FID in the dataset
where_clause = "OBJECTID_1 = {0}".format(row[0])
#the feature class name variable for the output
out_feature_class = wrkSpace + "\\ws" + str(row[0])
#Use select by condition to create individual feature class for each pour point row for this HUC
pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
print "feature class created for..." + str(pointSelection)
#create the pour point raster
pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
#Delete pour point feature class as it's no longer needed
arcpy.Delete_management(pointSelection)
print "pour point feature class deleted..." + str(pointSelection)
#Use watershed method in spatial analyst to create watershed delineation for each pour point, then save as raster
watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
watershed.save(projectLocation + "\\ws" + str(row[0]))
print "watershed created..." + str(watershed)
#Delete the pour point raster as it's no longer needed
arcpy.Delete_management(pourPointRaster)
print "pour point raster deleted..." + str(pourPointRaster)
#Convert the raster of the watershed into a polygon of the watershed
wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
print "converted raster watershed to polygon"
#Delete the raster watershed file as it's no longer needed
arcpy.Delete_management(watershed)
print "deleted raster watershed..." + str(watershed)
#merge all 1808 HUC watersheds
arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
print "appended..." + str(wsPoly)
#deleting the Albers FC
arcpy.Delete_management(wsPoly)
print "Deleted the watershed FC " + str(wsPoly)
updateRows.updateRow(row)
print "------------------------ready for the next row in 1808!------------------------"
del updateRows
del row
else:
print "Flow Direction DEM r1808_fd not found"
else:
print "Flow Accumulation DEM r1808_fa not found"
#deleting the split pour point feature class because we don't need it anymore
arcpy.Delete_management(item)
print "Deleted the split pour point Feature Class for 1808"
else:
print "1808 pour point not found"
print "...only worry if you know you have points here..."
print "done with 1808 at " + time.strftime('%A, %d %b %Y %H:%M:%S') + ", moving on to the next one!"
#---------------------------------------------------------------------------------------------------------------------#
#these variables will be repeated before each HUC4 loop
item = wrkSpace + str("\\r1809")
demFA = demDirectory + str("\\r1809\\r1809_fa")
demFD = demDirectory + str("\\r1809\\r1809_fd")
pourPointPolyOutput = wrkSpace + "\\p1809"
#loop for HUC4 1809
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
print "HUC4 1809 started " + time.strftime('%A, %d %b %Y %H:%M:%S')
if item in pourPointFCList:
print "pour point feature class list contains HUC 1809..." + str(item)
countPourPoint = arcpy.GetCount_management(item)
print "-------------------------------------------------------------"
print "In HUC4 1809, there are " + countPourPoint.getOutput(0) + " points"
print "-------------------------------------------------------------"
updateRows = arcpy.da.UpdateCursor(item, ["OBJECTID_1"])
if demFA in flowAcList:
print "Flow Accumulation DEM found..." + str(demFA)
if demFD in flowDirList:
print "Flow Direction DEM found..." + str(demFD)
for row in updateRows:
#SQL condition variable where it loops for every FID in the dataset
where_clause = "OBJECTID_1 = {0}".format(row[0])
#the feature class name variable for the output
out_feature_class = wrkSpace + "\\ws" + str(row[0])
#Use select by condition to create individual feature class for each pour point row for this HUC
pointSelection = arcpy.Select_analysis(item, out_feature_class, where_clause)
print "feature class created for..." + str(pointSelection)
#create the pour point raster
pourPointRasterOutput = projectLocation + "\\ppr" + str(row[0])
pourPointRaster = arcpy.gp.SnapPourPoint_sa(pointSelection, demFA, pourPointRasterOutput, "60", "OBJECTID_1")
print "Snap Pour Point tool run for HUC4 points..." + str(pourPointRaster)
#Delete pour point feature class as it's no longer needed
arcpy.Delete_management(pointSelection)
print "pour point feature class deleted..." + str(pointSelection)
#Use watershed method in spatial analyst to create watershed delineation for each pour point, then save as raster
watershed = arcpy.sa.Watershed(demFD, pourPointRaster, "VALUE")
watershed.save(projectLocation + "\\ws" + str(row[0]))
print "watershed created..." + str(watershed)
#Delete the pour point raster as it's no longer needed
arcpy.Delete_management(pourPointRaster)
print "pour point raster deleted..." + str(pourPointRaster)
#Convert the raster of the watershed into a polygon of the watershed
wsPoly = arcpy.RasterToPolygon_conversion(watershed, wrkSpace + "\\ws" + str(row[0]), "SIMPLIFY", "VALUE")
print "converted raster watershed to polygon"
#Delete the raster watershed file as it's no longer needed
arcpy.Delete_management(watershed)
print "deleted raster watershed..." + str(watershed)
#Add a drainage area field to calculate drainage area
arcpy.AddField_management(wsPoly, "DA_acresUS", "DOUBLE")
#Calculate drainage area
arcpy.CalculateField_management(wsPoly, "DA_acresUS", "!SHAPE.AREA@ACRES!", "PYTHON", "")
print "calculated acres"
#merge all 1809 HUC watersheds
arcpy.Append_management(wsPoly, emptyFC, schemaType, fieldMappings, subtype)
print "appended..." + str(wsPoly)
#deleting the Albers FC
arcpy.Delete_management(wsPoly)
print "Deleted the watershed FC " + str(wsPoly)
updateRows.updateRow(row)
print "------------------------ready for the next row in 1809!------------------------"
del updateRows
del row
else:
print "Flow Direction DEM r1809_fd not found"
else:
print "Flow Accumulation DEM r1809_fa not found"
#deleting the split pour point feature class because we don't need it anymore
arcpy.Delete_management(item)
print "Deleted the split pour point Feature Class for 1809"
else:
print "1809 pour point not found"
print "...only worry if you know you have points here..."
print "done with 1809 at " + time.strftime('%A, %d %b %Y %H:%M:%S')
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
print "This was the last HUC4, cleaning up and reprojecting next..."
print "-------------------------------------------------------------"
print "-------------------------------------------------------------"
#---------------------------------------------------------------------------------------------------------------------#
#Reprojecting the merged FC to UTM
print "reprojecting the feature classes to UTM"
spatialRef = arcpy.SpatialReference('NAD 1983 UTM Zone 11N')
albersFC = emptyFC
utmFC = emptyFC + "_UTM"
#reprojecting to UTM
arcpy.Project_management(albersFC, utmFC, spatialRef)
print "reprojected Albers watershed to UTM"
#Calculate drainage area
arcpy.CalculateField_management(utmFC, "DA_acresUS", "!SHAPE.AREA@ACRES!", "PYTHON", "")
print "calculated acres"
#deleting the Albers FC
arcpy.Delete_management(albersFC)
print "Deleted the Albers watershed FC"
#removing all the dangly bits on the watersheds
print "deleting dangly bits on watersheds...."
with arcpy.da.UpdateCursor(utmFC, fieldName) as cursor:
for row in cursor:
if row[0] < 1:
cursor.deleteRow()
print "deleted dangly bits! Now your watersheds are soooo pretty!"
#Reprojecting the orig pour point shapefile to UTM
print "reprojecting the orig pour point shapefile to UTM"
albersFC = pourPointsOrig
utmFC = wrkSpace + "\\PourPoints_UTM"
#reprojecting to UTM
arcpy.Project_management(albersFC, utmFC, spatialRef)
print "reprojected Albers pour point to UTM"
print "The pour points are now in the " + fgdb_name + " File Geodatabase"
#done!
print "==================================================================================================================="
print "==================================================================================================================="
print "Done! Completed at " + time.strftime('%A, %d %b %Y %H:%M:%S') + " Now get back to work!!!"
print "==================================================================================================================="
print "==================================================================================================================="
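The six HUC4 blocks above repeat the same delineation steps with only the HUC code changing. As a sketch (not the original author's code; the `huc_paths` helper and the sample workspace paths are invented for illustration), the setup variables redefined before each block could be derived from a list of codes and the shared steps driven by one loop:

```python
# Sketch only: derive the per-HUC4 paths that each block above
# redefines by hand from the HUC code itself.
def huc_paths(wrk_space, dem_directory, huc):
    """Return the pour-point FC, flow-accumulation, flow-direction and
    pour-point-polygon paths for one HUC4 code."""
    base = "\\r" + huc
    return {
        "item": wrk_space + base,
        "demFA": dem_directory + base + base + "_fa",
        "demFD": dem_directory + base + base + "_fd",
        "pourPointPolyOutput": wrk_space + "\\p" + huc,
    }

for huc in ["1606b", "1704", "1705", "1712", "1808", "1809"]:
    paths = huc_paths("C:\\work", "C:\\dem", huc)
    # ...run the shared steps (count, snap pour point, watershed,
    # raster-to-polygon, append) once here using paths["item"], etc.
```

Each per-HUC block would then exist once inside the loop body instead of being copied six times.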
| 54.285807 | 561 | 0.552905 | 8,964 | 83,763 | 5.12104 | 0.074297 | 0.036271 | 0.031021 | 0.0398 | 0.770809 | 0.765799 | 0.761486 | 0.757913 | 0.742882 | 0.727328 | 0 | 0.023399 | 0.256629 | 83,763 | 1,543 | 562 | 54.285807 | 0.713829 | 0.286332 | 0 | 0.697072 | 0 | 0.001126 | 0.34363 | 0.112888 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.003378 | null | null | 0.440315 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
ee46c2079b10e2f7e1c8f195654a58fd6f742542 | 272 | py | Python | utils/checks.py | LastAeon77/CSbot | 6e45b0ea3867a2c17f919ea08a683ae5bf479ca0 | [
"MIT"
] | null | null | null | utils/checks.py | LastAeon77/CSbot | 6e45b0ea3867a2c17f919ea08a683ae5bf479ca0 | [
"MIT"
] | null | null | null | utils/checks.py | LastAeon77/CSbot | 6e45b0ea3867a2c17f919ea08a683ae5bf479ca0 | [
"MIT"
] | null | null | null | async def owner_check(ctx):
return ctx.author.id == ctx.bot.config["discord"]["owner"]
async def member_check(ctx):
return (
ctx.author.id == ctx.bot.config["Allowed_Users"]["member1"]
or ctx.author.id == ctx.bot.config["discord"]["owner"]
)
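Both checks compare `ctx.author.id` against ids stored in the bot config. A minimal stand-alone exercise of that logic, using invented `SimpleNamespace` stubs rather than real discord.py objects:

```python
import asyncio
from types import SimpleNamespace

async def owner_check(ctx):
    # Same comparison as above, run against a stubbed ctx.
    return ctx.author.id == ctx.bot.config["discord"]["owner"]

config = {"discord": {"owner": 1}, "Allowed_Users": {"member1": 2}}

def make_ctx(author_id):
    """Build a stub ctx exposing only the attributes the checks read."""
    return SimpleNamespace(author=SimpleNamespace(id=author_id),
                           bot=SimpleNamespace(config=config))

print(asyncio.run(owner_check(make_ctx(1))))   # True: matches the owner id
print(asyncio.run(owner_check(make_ctx(99))))  # False: unknown id
```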
| 27.2 | 67 | 0.632353 | 38 | 272 | 4.447368 | 0.421053 | 0.159763 | 0.195266 | 0.248521 | 0.715976 | 0.715976 | 0.715976 | 0.715976 | 0.43787 | 0 | 0 | 0.004505 | 0.183824 | 272 | 9 | 68 | 30.222222 | 0.756757 | 0 | 0 | 0 | 0 | 0 | 0.161765 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ee685f99b798617c41f0e9b16464b0a61c85f90d | 7,090 | py | Python | ClickCount/crud.py | khashayarghamati/ClickCount | 5748e22cccad02f75a498da761018db78368e007 | [
"MIT"
] | null | null | null | ClickCount/crud.py | khashayarghamati/ClickCount | 5748e22cccad02f75a498da761018db78368e007 | [
"MIT"
] | null | null | null | ClickCount/crud.py | khashayarghamati/ClickCount | 5748e22cccad02f75a498da761018db78368e007 | [
"MIT"
] | null | null | null | from django.utils import timezone
from .forms import (ImageClickCountForm, UrlClickCountForm,
ButtonClickCountForm, UrlMonitoringForm)
from .models import (ImageClickCount, ButtonClickCount,
UrlClickCount, UrlMonitoring)
class CRUD(object):
@staticmethod
def update(data):
return Update(data)
@staticmethod
def create(data):
return Create(data)
class Update(object):
def __init__(self, data):
self.data = data
def updateButton(self):
name = self.data['name']
description = self.data['description']
uid = self.data['uid']
ip = self.data['ip']
session = self.data['session']
try:
instance = ButtonClickCount.objects.get(buttonName=name)
except ButtonClickCount.DoesNotExist:
return {
'message':
'record didn\'t exist. Check your "state" key and values'
}
count = instance.count + 1
form = ButtonClickCountForm({
'buttonName': name,
'buttonDescription': description,
'count': count,
'timeStamp': timezone.datetime.now(),
'uID': uid,
'userIP': ip,
'userSession': session
}, instance=instance)
if form.is_valid():
form.save()
return {'message': 'clicked save!'}
else:
return {'message': form.errors}
def updateURL(self):
url = self.data['url']
description = self.data['description']
uid = self.data['uid']
ip = self.data['ip']
session = self.data['session']
try:
instance = UrlClickCount.objects.get(url=url)
except UrlClickCount.DoesNotExist:
return {
'message':
'record didn\'t exist. Check your "state" key and values'
}
count = instance.count + 1
form = UrlClickCountForm({
'url': url,
'urlDescription': description,
'count': count,
'timeStamp': timezone.datetime.now(),
'uID': uid,
'userIP': ip,
'userSession': session
}, instance=instance)
if form.is_valid():
form.save()
return {'message': 'clicked save!'}
else:
return {'message': form.errors}
def updateImage(self):
name = self.data['name']
url = self.data['url']
description = self.data['description']
uid = self.data['uid']
ip = self.data['ip']
session = self.data['session']
try:
instance = ImageClickCount.objects.get(imageName=name,
imageURL=url)
except ImageClickCount.DoesNotExist:
return {
'message':
'record didn\'t exist. Check your "state" key and values'
}
count = instance.count + 1
form = ImageClickCountForm({
'imageURL': url,
'imageDescription': description,
'imageName': name,
'count': count,
'timeStamp': timezone.datetime.now(),
'uID': uid,
'userIP': ip,
'userSession': session
}, instance=instance)
if form.is_valid():
form.save()
return {'message': 'clicked save!'}
else:
return {'message': form.errors}
def updateUrlMonitoring(self):
url = self.data['url']
uid = self.data['uid']
ip = self.data['ip']
session = self.data['session']
closeTime = self.data['close']
try:
instance = UrlMonitoring.objects.get(url=url, isClose=False)
except UrlMonitoring.DoesNotExist:
return {
'message':
'record didn\'t exist. Check your \'state\' key and values'
}
form = UrlMonitoringForm({
'url': url,
'closeTime': closeTime,
'openTime': instance.openTime,
'timeStamp': timezone.datetime.now(),
'uID': uid,
'userIP': ip,
'userSession': session,
'isClose': True
}, instance=instance)
if form.is_valid():
form.save()
return {'message': 'user\'s visited time updated'}
else:
return {'message': form.errors}
class Create(object):
def __init__(self, data):
self.data = data
def createImage(self):
name = self.data['name']
url = self.data['url']
description = self.data['description']
uid = self.data['uid']
ip = self.data['ip']
session = self.data['session']
form = ImageClickCountForm({
'imageURL': url,
'imageDescription': description,
'imageName': name,
'count': 1,
'uID': uid,
'userIP': ip,
'userSession': session
})
if form.is_valid():
form.save()
return {'message': 'clicked save!'}
else:
return {'message': form.errors}
def createButton(self):
name = self.data['name']
description = self.data['description']
uid = self.data['uid']
ip = self.data['ip']
session = self.data['session']
form = ButtonClickCountForm({
'buttonName': name,
'buttonDescription': description,
'count': 1,
'uID': uid,
'userIP': ip,
'userSession': session
})
if form.is_valid():
form.save()
return {'message': 'clicked save!'}
else:
return {'message': form.errors}
def createURL(self):
url = self.data['url']
description = self.data['description']
uid = self.data['uid']
ip = self.data['ip']
session = self.data['session']
form = UrlClickCountForm({
'url': url,
'urlDescription': description,
'count': 1,
'uID': uid,
'userIP': ip,
'userSession': session
})
if form.is_valid():
form.save()
return {'message': 'clicked save!'}
else:
return {'message': form.errors}
def createUrlMonitoring(self):
url = self.data['url']
uid = self.data['uid']
ip = self.data['ip']
session = self.data['session']
openTime = self.data['open']
form = UrlMonitoringForm({
'url': url,
'closeTime': None,
'openTime': openTime,
'timeStamp': timezone.datetime.now(),
'uID': uid,
'userIP': ip,
'userSession': session
})
if form.is_valid():
form.save()
return {'message': 'user\'s visited time saved'}
else:
return {'message': form.errors}
| 27.587549 | 79 | 0.498307 | 625 | 7,090 | 5.6272 | 0.1472 | 0.104635 | 0.025021 | 0.031845 | 0.807506 | 0.77168 | 0.77168 | 0.721638 | 0.721638 | 0.661359 | 0 | 0.001358 | 0.37701 | 7,090 | 256 | 80 | 27.695313 | 0.794883 | 0 | 0 | 0.811321 | 0 | 0 | 0.132299 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056604 | false | 0 | 0.014151 | 0.009434 | 0.188679 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ee732ad6612393786f386dec5b059f16eb78cecb | 55 | py | Python | example/b.py | noahnu/pypnp | 008e14b3cc77837ef8c8d85497c23be1e298a3a9 | [
"MIT"
] | null | null | null | example/b.py | noahnu/pypnp | 008e14b3cc77837ef8c8d85497c23be1e298a3a9 | [
"MIT"
] | 5 | 2021-11-16T14:06:56.000Z | 2021-11-16T14:07:09.000Z | example/b.py | noahnu/pypnp | 008e14b3cc77837ef8c8d85497c23be1e298a3a9 | [
"MIT"
] | null | null | null | from c_1 import func
def func_b():
return func()
| 9.166667 | 20 | 0.654545 | 10 | 55 | 3.4 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 0.254545 | 55 | 5 | 21 | 11 | 0.804878 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
c9c410e4cdfb3abe46fe7eb3cc60dd89d9aa812f | 79 | py | Python | exp/variable_with_same_name_as_module.py | nicolasessisbreton/fython | 988f5a94cee8b16b0000501a22239195c73424a1 | [
"Apache-2.0"
] | 41 | 2016-01-21T05:14:45.000Z | 2021-11-24T20:37:21.000Z | exp/variable_with_same_name_as_module.py | nicolasessisbreton/fython | 988f5a94cee8b16b0000501a22239195c73424a1 | [
"Apache-2.0"
] | 5 | 2016-01-21T05:36:37.000Z | 2016-08-22T19:26:51.000Z | exp/variable_with_same_name_as_module.py | nicolasessisbreton/fython | 988f5a94cee8b16b0000501a22239195c73424a1 | [
"Apache-2.0"
] | 3 | 2016-01-23T04:03:44.000Z | 2016-08-21T15:58:38.000Z | variable_with_same_name_as_module = 1
print(variable_with_same_name_as_module) | 26.333333 | 40 | 0.911392 | 14 | 79 | 4.428571 | 0.571429 | 0.387097 | 0.516129 | 0.645161 | 0.903226 | 0.903226 | 0 | 0 | 0 | 0 | 0 | 0.013333 | 0.050633 | 79 | 3 | 40 | 26.333333 | 0.813333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
4ed4fb34f896943bff086ad3548e6f1b3f32c7d2 | 2,086 | py | Python | train_cmd_default.py | CheungBH/yolov3-channel-and-layer-pruning | 457f81386cbc54ace0ad677581e383c516305ba9 | [
"Apache-2.0"
] | null | null | null | train_cmd_default.py | CheungBH/yolov3-channel-and-layer-pruning | 457f81386cbc54ace0ad677581e383c516305ba9 | [
"Apache-2.0"
] | null | null | null | train_cmd_default.py | CheungBH/yolov3-channel-and-layer-pruning | 457f81386cbc54ace0ad677581e383c516305ba9 | [
"Apache-2.0"
] | null | null | null | #-*-coding:utf-8-*-
cmds = [
# "python train.py --wdir black_origin --cfg cfg/yolov3-1cls.cfg --data data/swim_enhanced/enhanced.data --weights weights/black_origin/last.pt --batch-size 16 --epochs 300",
# "python train.py --wdir black_sE-3 --cfg cfg/yolov3-1cls.cfg --data data/swim_enhanced/enhanced.data --weights weights/black_sE-3/last.pt --batch-size 16 --epochs 300 -sr --s 0.001 --prune 1",
# "python train.py --wdir black_s2E-3 --cfg cfg/yolov3-1cls.cfg --data data/swim_enhanced/enhanced.data --weights weights/black_s2E-3/last.pt --batch-size 16 --epochs 300 -sr --s 0.002 --prune 1",
# "python train.py --wdir black_s5E-3 --cfg cfg/yolov3-1cls.cfg --data data/swim_enhanced/enhanced.data --weights weights/darknet53.conv.74 --batch-size 16 --epochs 300 -sr --s 0.005 --prune 1",
# "python train.py --wdir black_s3E-3 --cfg cfg/yolov3-1cls.cfg --data data/swim_enhanced/enhanced.data --weights weights/darknet53.conv.74 --batch-size 16 --epochs 300 -sr --s 0.003 --prune 1",
# "python train.py --wdir black_s2E-3_45*0.01 --cfg cfg/yolov3-1cls.cfg --data data/swim_enhanced/enhanced.data --weights weights/black_s2E-3_45*0.01/last.pt --batch-size 16 --epochs 300 -sr --s 0.002 --prune 1",
# "python train.py --wdir black_s1E-3_60*0.01 --cfg cfg/yolov3-1cls.cfg --data data/swim_enhanced/enhanced.data --weights weights/darknet53.conv.74 --batch-size 16 --epochs 300 -sr --s 0.001 --prune 1",
#"python train.py --wdir celiling_origin --cfg cfg/yolov3-1cls.cfg --data data/ceiling_cam/ceiling.data --weights weights/darknet53.conv.74 --batch-size 16 --epochs 100",
# "python train.py --wdir ceiling0507_lrE-4 --cfg cfg/yolov3-1cls.cfg --data data/ceiling.data --weights weights/darknet53.conv.74 --batch-size 16 --epochs 100",
"python train.py --wdir black_nopre --cfg cfg/yolov3-1cls.cfg --data data/swim_enhanced/enhanced.data --batch-size 16 --epochs 300",
"python train.py --wdir gray_nopre --cfg cfg/yolov3-1cls.cfg --data data/swim_gray/gray.data --batch-size 16 --epochs 300",
]
import os
for cmd in cmds:
os.system(cmd)
| 104.3 | 216 | 0.711409 | 357 | 2,086 | 4.078431 | 0.168067 | 0.083104 | 0.098214 | 0.128434 | 0.923764 | 0.901786 | 0.880495 | 0.857143 | 0.807005 | 0.700549 | 0 | 0.089325 | 0.119847 | 2,086 | 19 | 217 | 109.789474 | 0.703704 | 0.817833 | 0 | 0 | 0 | 0.285714 | 0.672973 | 0.151351 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
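The `train_cmd_default.py` record above loops over shell commands with `os.system`, which silently ignores non-zero exit codes. A minimal sketch of the same sequential-runner pattern using `subprocess` instead, stopping at the first failure (the helper name and behavior are illustrative, not part of the original repository):

```python
import shlex
import subprocess

def run_all(cmds):
    """Run each command in order; collect exit codes, stopping at the first failure."""
    codes = []
    for cmd in cmds:
        # shlex.split keeps quoted arguments intact, unlike a naive str.split
        result = subprocess.run(shlex.split(cmd))
        codes.append(result.returncode)
        if result.returncode != 0:
            break  # os.system in the original would have kept going
    return codes
```

On a POSIX system, `run_all(["true", "false", "true"])` returns `[0, 1]`: the third command never runs because the second exits non-zero.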
14ddf3a1835e841b5aa1ad1f98ff227309a6e9e9 | 34,141 | py | Python | tests/system/use_cases/test_use_cases.py | dimensigon/dimensigon | 079d7c91a66e10f13510d89844fbadb27e005b40 | [
"Apache-2.0"
] | 2 | 2020-11-20T10:27:14.000Z | 2021-02-21T13:57:56.000Z | tests/system/use_cases/test_use_cases.py | dimensigon/dimensigon | 079d7c91a66e10f13510d89844fbadb27e005b40 | [
"Apache-2.0"
] | null | null | null | tests/system/use_cases/test_use_cases.py | dimensigon/dimensigon | 079d7c91a66e10f13510d89844fbadb27e005b40 | [
"Apache-2.0"
] | null | null | null | from flask import url_for
from dimensigon.domain.entities import Server, Gate
from dimensigon.domain.entities.route import Route
from dimensigon.web import db
from tests.base import TestDimensigonBase
class TestRoutes(TestDimensigonBase):
def test_routes_get(self):
s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='server2')
g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
dns=s2.name)
Route(s2, g2, cost=0)
s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='server3')
g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
dns=s3.name)
Route(s3, g3, cost=0)
s4 = Server(id='123e4567-e89b-12d3-a456-426655440004', name='server4')
g4 = Gate(id='123e4567-e89b-12d3-a456-426655440014', server=s4, port=5001,
dns=s4.name)
        Route(s4, proxy_server=s2, cost=1)
db.session.add_all([s2, s3, s4])
db.session.commit()
response = self.client.get(url_for('api_1_0.routes', _external=False),
headers=self.auth.header)
data = response.get_json()
self.assertDictEqual({'server': {'id': self.s1.id, 'name': self.s1.name},
'route_list': [
dict(destination_id=s2.id,
gate_id=g2.id,
proxy_server_id=None, cost=0),
dict(destination_id=s3.id,
gate_id=g3.id,
proxy_server_id=None, cost=0),
dict(destination_id=s4.id,
gate_id=None,
proxy_server_id=s2.id, cost=1)]}, data)
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch(self, mocked_ping, mocked_thread):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='server1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='server2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
# dns=s2.name)
# Route(s2, gate=g2, cost=0)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='server3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
# dns=s3.name)
# Route(s3, gate=g3, cost=0)
# s4 = Server(id='123e4567-e89b-12d3-a456-426655440004', name='server4')
# g4 = Gate(id='123e4567-e89b-12d3-a456-426655440014', server=s4, port=5001,
# dns=s4.name)
# Route(s4, proxy_server=s2, cost=1)
# db.session.add_all([s1, s2, s3, s4])
#
# mocked_ping.return_value = (None, None)
# access_token = create_access_token(identity='test')
# response = self.client.patch(url_for('api_1_0.routes'),
# headers={"Authorization": f"Bearer {access_token}"},
# json={"server_id": '123e4567-e89b-12d3-a456-426655440002',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440004',
# gate_id=None,
# proxy_server_id=None,
# cost=None)]})
#
# # s = Server.query.get('123e4567-e89b-12d3-a456-426655440001')
# # self.assertEqual(None, s.route.gate)
# # self.assertEqual(None, s.route.proxy_server)
# # self.assertEqual(None, s.route.cost)
#
# # s = Server.query.get('123e4567-e89b-12d3-a456-426655440002')
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# # s = Server.query.get('123e4567-e89b-12d3-a456-426655440003')
# self.assertEqual(g3, s3.route.gate)
# self.assertEqual(None, s3.route.proxy_server)
# self.assertEqual(0, s3.route.cost)
#
# # s = Server.query.get('123e4567-e89b-12d3-a456-426655440004')
# self.assertEqual(None, s4.route.gate)
# self.assertEqual(None, s4.route.proxy_server)
# self.assertEqual(None, s4.route.cost)
#
# self.assertEqual(1, mocked_thread.Thread.call_count)
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.update_route_table_cost')
# @responses.activate
# def test_routes_post(self, mocked_utr, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='server1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
#
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='server2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5001,
# dns=s2.name)
# Route(destination=s2, cost=0)
# db.session.add_all([s1, s2])
#
# mocked_utr.return_value = {s1: TempRoute(proxy_server=s1, gate=g1, cost=0)}
#
# resp = self.client.post(url_for('api_1_0.routes'),
# json={'discover_new_neighbours': True, 'check_current_neighbours': True},
# headers=self.headers)
#
# mocked_utr.assert_called_once_with(discover_new_neighbours=True, check_current_neighbours=True)
#
# args, kwargs = mocked_threading.Thread.call_args
#
# self.assertTupleEqual((self.app, s2, 'api_1_0.routes'), kwargs['args'])
#
# self.assertDictEqual({'server_id': '123e4567-e89b-12d3-a456-426655440001',
# 'route_list': [{'destination_id': '123e4567-e89b-12d3-a456-426655440001',
# 'proxy_server_id': '123e4567-e89b-12d3-a456-426655440001',
# 'gate_id': '123e4567-e89b-12d3-a456-426655440011',
# 'cost': 0}]}, kwargs['kwargs']['json'])
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.check_host')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch_scenario1(self, mocked_ping, mocked_check_host, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='node1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='node2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
# dns=s2.name)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='node3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
# dns=s3.name)
#
# Route(s2, gate=g2, cost=0)
# db.session.add_all([s1, s2, s3])
# db.session.commit()
#
# def ping(server, *args, **kwargs):
# if str(server.id) == '123e4567-e89b-12d3-a456-426655440001':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440002':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440003':
# return 0, None
#
# mocked_ping.side_effect = ping
#
# def check_host(host, port, *args, **kwargs):
# if host == 'node1':
# return True
# elif host == 'node2':
# return True
# elif host == 'node3':
# return True
#
# mocked_check_host.side_effect = check_host
#
# response = self.client.patch(url_for('api_1_0.routes'),
# headers=self.headers,
# json={"server_id": '123e4567-e89b-12d3-a456-426655440003',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440001',
# gate_id='123e4567-e89b-12d3-a456-426655440011',
# proxy_server_id=None,
# cost=0),
# dict(destination_id='123e4567-e89b-12d3-a456-426655440002',
# gate_id='123e4567-e89b-12d3-a456-426655440012',
# proxy_server_id=None,
# cost=0)
# ]})
#
# self.assertIsNone(s1.route)
#
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# self.assertEqual(g3, s3.route.gate)
# self.assertEqual(None, s3.route.proxy_server)
# self.assertEqual(0, s3.route.cost)
#
# mocked_threading.Thread.assert_called_once()
# args, kwargs = mocked_threading.Thread.call_args
# kwargs.pop('target')
# self.assertDictEqual(dict(args=(self.app, s2, 'api_1_0.routes'),
# kwargs={'json': {'server_id': str(s1.id),
# 'route_list': [
# {'destination_id': str(s3.id),
# 'proxy_server_id': None,
# 'gate_id': str(g3.id),
# 'cost': 0}]},
# 'headers': self.headers}), kwargs)
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.check_host')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch_scenario2(self, mocked_ping, mocked_check_host, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='node1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='node2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
# dns=s2.name)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='node3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
# dns=s3.name)
#
# Route(s2, gate=g2, cost=0)
# db.session.add_all([s1, s2, s3])
# db.session.commit()
#
# def ping(server, *args, **kwargs):
# if str(server.id) == '123e4567-e89b-12d3-a456-426655440001':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440002':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440003':
# return None, None
# else:
# raise
#
# mocked_ping.side_effect = ping
#
# def check_host(host, port, *args, **kwargs):
# if host == 'node1':
# return True
# elif host == 'node2':
# return True
# elif host == 'node3':
# return False
# else:
# raise
#
# mocked_check_host.side_effect = check_host
#
# response = self.client.patch(url_for('api_1_0.routes'),
# headers=self.headers,
# json={"server_id": '123e4567-e89b-12d3-a456-426655440002',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440003',
# gate_id='123e4567-e89b-12d3-a456-426655440013',
# proxy_server_id=None,
# cost=0),
# dict(destination_id='123e4567-e89b-12d3-a456-426655440002',
# gate_id='123e4567-e89b-12d3-a456-426655440012',
# proxy_server_id=None,
# cost=0)
# ]})
#
# self.assertIsNone(s1.route)
#
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# self.assertEqual(None, s3.route.gate)
# self.assertEqual(s2, s3.route.proxy_server)
# self.assertEqual(1, s3.route.cost)
#
# mocked_threading.Thread.assert_not_called()
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.check_host')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch_scenario3(self, mocked_ping, mocked_check_host, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='node1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='node2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
# dns=s2.name)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='node3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
# dns=s3.name)
#
# Route(s2, gate=g2, cost=0)
# Route(s3, proxy_server=s2, cost=1)
# db.session.add_all([s1, s2, s3])
# db.session.commit()
#
# def ping(server, *args, **kwargs):
# if str(server.id) == '123e4567-e89b-12d3-a456-426655440001':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440002':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440003':
# return 1, None
#
# mocked_ping.side_effect = ping
#
# def check_host(host, port, *args, **kwargs):
# if host == 'node1':
# return True
# elif host == 'node2':
# return True
# elif host == 'node3':
# return True
#
# mocked_check_host.side_effect = check_host
#
# response = self.client.patch(url_for('api_1_0.routes'),
# headers=self.headers,
# json={"server_id": '123e4567-e89b-12d3-a456-426655440003',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440001',
# gate_id='123e4567-e89b-12d3-a456-426655440011',
# proxy_server_id=None,
# cost=0)
# ]})
#
# self.assertIsNone(s1.route)
#
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# self.assertEqual(g3, s3.route.gate)
# self.assertEqual(None, s3.route.proxy_server)
# self.assertEqual(0, s3.route.cost)
#
# mocked_threading.Thread.assert_called_once()
# args, kwargs = mocked_threading.Thread.call_args
# kwargs.pop('target')
# self.assertDictEqual(dict(args=(self.app, s2, 'api_1_0.routes'),
# kwargs={'json': {'server_id': str(s1.id),
# 'route_list': [
# {'destination_id': str(s3.id),
# 'proxy_server_id': None,
# 'gate_id': str(g3.id),
# 'cost': 0}]},
# 'headers': self.headers}), kwargs)
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.check_host')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch_scenario4(self, mocked_ping, mocked_check_host, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='node1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='node2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
# dns=s2.name)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='node3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
# dns=s3.name)
#
# db.session.add_all([s1, s2, s3])
# db.session.commit()
#
# def ping(server, *args, **kwargs):
# if str(server.id) == '123e4567-e89b-12d3-a456-426655440001':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440002':
# return None, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440003':
# return None, None
#
# mocked_ping.side_effect = ping
#
# def check_host(host, port, *args, **kwargs):
# if host == 'node1':
# return True
# elif host == 'node2':
# return True
# elif host == 'node3':
# return False
#
# mocked_check_host.side_effect = check_host
#
# response = self.client.patch(url_for('api_1_0.routes'),
# headers=self.headers,
# json={"server_id": '123e4567-e89b-12d3-a456-426655440002',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440001',
# gate_id='123e4567-e89b-12d3-a456-426655440011',
# proxy_server_id=None,
# cost=0),
# dict(destination_id='123e4567-e89b-12d3-a456-426655440003',
# gate_id='123e4567-e89b-12d3-a456-426655440013',
# proxy_server_id=None,
# cost=0)
# ]})
#
# self.assertIsNone(s1.route)
#
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# self.assertEqual(None, s3.route.gate)
# self.assertEqual(s2, s3.route.proxy_server)
# self.assertEqual(1, s3.route.cost)
#
# mocked_threading.Thread.assert_not_called()
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.check_host')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch_scenario5(self, mocked_ping, mocked_check_host, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='node1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='node2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
# dns=s2.name)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='node3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
# dns=s3.name)
#
# Route(s2, gate=g2, cost=0)
# Route(s3, gate=g3, cost=0)
# db.session.add_all([s1, s2, s3])
# db.session.commit()
#
# def ping(server, *args, **kwargs):
# if str(server.id) == '123e4567-e89b-12d3-a456-426655440001':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440002':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440003':
# return 0, None
#
# mocked_ping.side_effect = ping
#
# def check_host(host, port, *args, **kwargs):
# if host == 'node1':
# return True
# elif host == 'node2':
# return True
# elif host == 'node3':
# return True
#
# mocked_check_host.side_effect = check_host
#
# response = self.client.patch(url_for('api_1_0.routes'),
# headers=self.headers,
# json={"server_id": '123e4567-e89b-12d3-a456-426655440002',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440003',
# gate_id=None,
# proxy_server_id=None,
# cost=None)
# ]})
#
# self.assertIsNone(s1.route)
#
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# self.assertEqual(g3, s3.route.gate)
# self.assertEqual(None, s3.route.proxy_server)
# self.assertEqual(0, s3.route.cost)
#
# mocked_threading.Thread.assert_not_called()
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.check_host')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch_scenario6(self, mocked_ping, mocked_check_host, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='node1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='node2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
# dns=s2.name)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='node3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
# dns=s3.name)
#
# Route(s2, gate=g2, cost=0)
# Route(s3, gate=g3, cost=0)
# db.session.add_all([s1, s2, s3])
# db.session.commit()
#
# def ping(server, *args, **kwargs):
# if str(server.id) == '123e4567-e89b-12d3-a456-426655440001':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440002':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440003':
# return None, None
#
# mocked_ping.side_effect = ping
#
# def check_host(host, port, *args, **kwargs):
# if host == 'node1':
# return True
# elif host == 'node2':
# return True
# elif host == 'node3':
# return False
#
# mocked_check_host.side_effect = check_host
#
# response = self.client.patch(url_for('api_1_0.routes'),
# headers=self.headers,
# json={"server_id": '123e4567-e89b-12d3-a456-426655440002',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440003',
# gate_id=None,
# proxy_server_id=None,
# cost=None)
# ]})
#
# self.assertIsNone(s1.route)
#
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# self.assertEqual(None, s3.route.gate)
# self.assertEqual(None, s3.route.proxy_server)
# self.assertEqual(None, s3.route.cost)
#
# mocked_threading.Thread.assert_not_called()
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.check_host')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch_scenario7(self, mocked_ping, mocked_check_host, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='node1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5001,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='node2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5002,
# dns=s2.name)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='node3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5003,
# dns=s3.name)
#
# Route(s2, proxy_server=None, gate=g2, cost=0)
# Route(s3, proxy_server=s2, gate=None, cost=1)
# db.session.add_all([s1, s2, s3])
# db.session.commit()
#
# def ping(server, *args, **kwargs):
# if str(server.id) == '123e4567-e89b-12d3-a456-426655440001':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440002':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440003':
# return None, None
#
# mocked_ping.side_effect = ping
#
# def check_host(host, port, *args, **kwargs):
# if host == 'node1':
# return True
# elif host == 'node2':
# return True
# elif host == 'node3':
# return False
#
# mocked_check_host.side_effect = check_host
#
# response = self.client.patch(url_for('api_1_0.routes'),
# headers=self.headers,
# json={"server_id": '123e4567-e89b-12d3-a456-426655440002',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440003',
# gate_id=None,
# proxy_server_id=None,
# cost=None)
# ]})
#
# self.assertEqual(0, mocked_ping.call_count)
# self.assertEqual(0, mocked_check_host.call_count)
# self.assertIsNone(s1.route)
#
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# self.assertEqual(None, s3.route.gate)
# self.assertEqual(None, s3.route.proxy_server)
# self.assertEqual(None, s3.route.cost)
#
# mocked_threading.Thread.assert_not_called()
#
# @patch('dimensigon.web.api_1_0.urls.use_cases.threading')
# @patch('dimensigon.web.api_1_0.urls.use_cases.check_host')
# @patch('dimensigon.web.api_1_0.urls.use_cases.ping_server')
# def test_routes_patch_scenario8(self, mocked_ping, mocked_check_host, mocked_threading):
# s1 = Server(id='123e4567-e89b-12d3-a456-426655440001', name='node1', me=True)
# g1 = Gate(id='123e4567-e89b-12d3-a456-426655440011', server=s1, port=5000,
# dns=s1.name)
# s2 = Server(id='123e4567-e89b-12d3-a456-426655440002', name='node2')
# g2 = Gate(id='123e4567-e89b-12d3-a456-426655440012', server=s2, port=5000,
# dns=s2.name)
# s3 = Server(id='123e4567-e89b-12d3-a456-426655440003', name='node3')
# g3 = Gate(id='123e4567-e89b-12d3-a456-426655440013', server=s3, port=5000,
# dns=s3.name)
#
# s4 = Server(id='123e4567-e89b-12d3-a456-426655440004', name='node4')
# g4 = Gate(id='123e4567-e89b-12d3-a456-426655440014', server=s4, port=5000,
# dns=s4.name)
#
# Route(s2, proxy_server=None, gate=g2, cost=0)
# Route(s3, proxy_server=s2, gate=None, cost=1)
# Route(s4, proxy_server=s2, gate=None, cost=1)
# db.session.add_all([s1, s2, s3, s4])
# db.session.commit()
#
# def ping(server, *args, **kwargs):
# if str(server.id) == '123e4567-e89b-12d3-a456-426655440001':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440002':
# return 0, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440003':
# return 1, None
# elif str(server.id) == '123e4567-e89b-12d3-a456-426655440004':
# return None, None
# else:
# raise
#
# mocked_ping.side_effect = ping
#
# def check_host(host, port, *args, **kwargs):
# if host == 'node1':
# return True
# elif host == 'node2':
# return True
# elif host == 'node3':
# return False
# elif host == 'node4':
# return False
# else:
# raise
#
# mocked_check_host.side_effect = check_host
#
# response = self.client.patch(url_for('api_1_0.routes'),
# headers=self.headers,
# json={"server_id": '123e4567-e89b-12d3-a456-426655440002',
# "route_list": [
# dict(destination_id='123e4567-e89b-12d3-a456-426655440004',
# gate_id=None,
# proxy_server_id=s3,
# cost=1),
#
# ]})
#
# self.assertEqual(0, mocked_ping.call_count)
# self.assertEqual(0, mocked_check_host.call_count)
# self.assertIsNone(s1.route)
#
# self.assertEqual(g2, s2.route.gate)
# self.assertEqual(None, s2.route.proxy_server)
# self.assertEqual(0, s2.route.cost)
#
# self.assertEqual(g3, s3.route.gate)
# self.assertEqual(None, s3.route.proxy_server)
# self.assertEqual(0, s3.route.cost)
#
# self.assertEqual(None, s4.route.gate)
# self.assertEqual(s3, s4.route.proxy_server)
# self.assertEqual(1, s4.route.cost)
#
# mocked_threading.Thread.assert_called_once()
# args, kwargs = mocked_threading.Thread.call_args
# kwargs.pop('target')
# self.assertDictEqual(dict(args=(self.app, s2, 'api_1_0.routes'),
# kwargs={'json': {'server_id': str(s1.id),
# 'route_list': [
# {'destination_id': str(s3.id),
# 'proxy_server_id': None,
# 'gate_id': str(g3.id),
# 'cost': 0},
# {'destination_id': str(s4.id),
# 'proxy_server_id': str(s3.id),
# 'gate_id': None,
# 'cost': 1}
# ]},
# 'headers': self.headers}), kwargs)
| 49.986823 | 108 | 0.498902 | 3,552 | 34,141 | 4.65991 | 0.041385 | 0.093523 | 0.124698 | 0.155872 | 0.935053 | 0.926474 | 0.915116 | 0.904966 | 0.888231 | 0.882612 | 0 | 0.19419 | 0.370874 | 34,141 | 682 | 109 | 50.060117 | 0.576423 | 0.832049 | 0 | 0.057143 | 0 | 0 | 0.053687 | 0.042478 | 0 | 0 | 0 | 0 | 0.028571 | 1 | 0.028571 | false | 0 | 0.142857 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
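The commented-out route scenarios in the record above all rely on the same mocking idiom: a plain function assigned to a mock's `side_effect` so each call returns a canned answer keyed on its arguments. A minimal self-contained illustration of that dispatch pattern (the host names and return table are generic stand-ins, not the project's API):

```python
from unittest.mock import MagicMock

def check_host(host, port, *args, **kwargs):
    # Dispatch a canned reachability answer per host, mirroring the
    # per-node if/elif tables used in the test scenarios
    return {"node1": True, "node2": True, "node3": False}.get(host)

mocked_check_host = MagicMock(side_effect=check_host)

assert mocked_check_host("node2", 5002) is True
assert mocked_check_host("node3", 5003) is False
assert mocked_check_host.call_count == 2
```

Because `side_effect` is called with the mock's arguments, the test can simulate different network conditions per node without patching the mock between calls.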
14ff9cac3996d2613116806e44cd8a1a0239f048 | 977,803 | py | Python | intersight/api/virtualization_api.py | CiscoDevNet/intersight-python | 04b721f37c3044646a91c185c7259edfb991557a | [
"Apache-2.0"
] | 5 | 2021-12-16T15:13:32.000Z | 2022-03-29T16:09:54.000Z | intersight/api/virtualization_api.py | CiscoDevNet/intersight-python | 04b721f37c3044646a91c185c7259edfb991557a | [
"Apache-2.0"
] | 4 | 2022-01-25T19:05:51.000Z | 2022-03-29T20:18:37.000Z | intersight/api/virtualization_api.py | CiscoDevNet/intersight-python | 04b721f37c3044646a91c185c7259edfb991557a | [
"Apache-2.0"
] | 2 | 2020-07-07T15:01:08.000Z | 2022-01-31T04:27:35.000Z | """
Cisco Intersight
Cisco Intersight is a management platform delivered as a service with embedded analytics for your Cisco and 3rd party IT infrastructure. This platform offers an intelligent level of management that enables IT organizations to analyze, simplify, and automate their environments in more advanced ways than the prior generations of tools. Cisco Intersight provides an integrated and intuitive management experience for resources in the traditional data center as well as at the edge. With flexible deployment options to address complex security needs, getting started with Intersight is quick and easy. Cisco Intersight has deep integration with Cisco UCS and HyperFlex systems allowing for remote deployment, configuration, and ongoing maintenance. The model-based deployment works for a single system in a remote location or hundreds of systems in a data center and enables rapid, standardized configuration and deployment. It also streamlines maintaining those systems whether you are working with small or very large configurations. The Intersight OpenAPI document defines the complete set of properties that are returned in the HTTP response. From that perspective, a client can expect that no additional properties are returned, unless these properties are explicitly defined in the OpenAPI document. However, when a client uses an older version of the Intersight OpenAPI document, the server may send additional properties because the software is more recent than the client. In that case, the client may receive properties that it does not know about. Some generated SDKs perform a strict validation of the HTTP response body against the OpenAPI document. # noqa: E501
The version of the OpenAPI document: 1.0.9-4950
Contact: intersight@cisco.com
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
import sys # noqa: F401
from intersight.api_client import ApiClient, Endpoint as _Endpoint
from intersight.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from intersight.model.error import Error
from intersight.model.virtualization_cisco_hypervisor_manager import VirtualizationCiscoHypervisorManager
from intersight.model.virtualization_cisco_hypervisor_manager_response import VirtualizationCiscoHypervisorManagerResponse
from intersight.model.virtualization_esxi_console import VirtualizationEsxiConsole
from intersight.model.virtualization_esxi_console_response import VirtualizationEsxiConsoleResponse
from intersight.model.virtualization_host import VirtualizationHost
from intersight.model.virtualization_host_response import VirtualizationHostResponse
from intersight.model.virtualization_iwe_cluster import VirtualizationIweCluster
from intersight.model.virtualization_iwe_cluster_response import VirtualizationIweClusterResponse
from intersight.model.virtualization_iwe_datacenter import VirtualizationIweDatacenter
from intersight.model.virtualization_iwe_datacenter_response import VirtualizationIweDatacenterResponse
from intersight.model.virtualization_iwe_dv_uplink import VirtualizationIweDvUplink
from intersight.model.virtualization_iwe_dv_uplink_response import VirtualizationIweDvUplinkResponse
from intersight.model.virtualization_iwe_dvswitch import VirtualizationIweDvswitch
from intersight.model.virtualization_iwe_dvswitch_response import VirtualizationIweDvswitchResponse
from intersight.model.virtualization_iwe_host import VirtualizationIweHost
from intersight.model.virtualization_iwe_host_interface import VirtualizationIweHostInterface
from intersight.model.virtualization_iwe_host_interface_response import VirtualizationIweHostInterfaceResponse
from intersight.model.virtualization_iwe_host_response import VirtualizationIweHostResponse
from intersight.model.virtualization_iwe_host_vswitch import VirtualizationIweHostVswitch
from intersight.model.virtualization_iwe_host_vswitch_response import VirtualizationIweHostVswitchResponse
from intersight.model.virtualization_iwe_network import VirtualizationIweNetwork
from intersight.model.virtualization_iwe_network_response import VirtualizationIweNetworkResponse
from intersight.model.virtualization_iwe_virtual_disk import VirtualizationIweVirtualDisk
from intersight.model.virtualization_iwe_virtual_disk_response import VirtualizationIweVirtualDiskResponse
from intersight.model.virtualization_iwe_virtual_machine import VirtualizationIweVirtualMachine
from intersight.model.virtualization_iwe_virtual_machine_network_interface import VirtualizationIweVirtualMachineNetworkInterface
from intersight.model.virtualization_iwe_virtual_machine_network_interface_response import VirtualizationIweVirtualMachineNetworkInterfaceResponse
from intersight.model.virtualization_iwe_virtual_machine_response import VirtualizationIweVirtualMachineResponse
from intersight.model.virtualization_virtual_disk import VirtualizationVirtualDisk
from intersight.model.virtualization_virtual_disk_response import VirtualizationVirtualDiskResponse
from intersight.model.virtualization_virtual_machine import VirtualizationVirtualMachine
from intersight.model.virtualization_virtual_machine_response import VirtualizationVirtualMachineResponse
from intersight.model.virtualization_virtual_network import VirtualizationVirtualNetwork
from intersight.model.virtualization_virtual_network_response import VirtualizationVirtualNetworkResponse
from intersight.model.virtualization_vmware_cluster import VirtualizationVmwareCluster
from intersight.model.virtualization_vmware_cluster_response import VirtualizationVmwareClusterResponse
from intersight.model.virtualization_vmware_datacenter import VirtualizationVmwareDatacenter
from intersight.model.virtualization_vmware_datacenter_response import VirtualizationVmwareDatacenterResponse
from intersight.model.virtualization_vmware_datastore import VirtualizationVmwareDatastore
from intersight.model.virtualization_vmware_datastore_cluster import VirtualizationVmwareDatastoreCluster
from intersight.model.virtualization_vmware_datastore_cluster_response import VirtualizationVmwareDatastoreClusterResponse
from intersight.model.virtualization_vmware_datastore_response import VirtualizationVmwareDatastoreResponse
from intersight.model.virtualization_vmware_distributed_network import VirtualizationVmwareDistributedNetwork
from intersight.model.virtualization_vmware_distributed_network_response import VirtualizationVmwareDistributedNetworkResponse
from intersight.model.virtualization_vmware_distributed_switch import VirtualizationVmwareDistributedSwitch
from intersight.model.virtualization_vmware_distributed_switch_response import VirtualizationVmwareDistributedSwitchResponse
from intersight.model.virtualization_vmware_folder import VirtualizationVmwareFolder
from intersight.model.virtualization_vmware_folder_response import VirtualizationVmwareFolderResponse
from intersight.model.virtualization_vmware_host import VirtualizationVmwareHost
from intersight.model.virtualization_vmware_host_response import VirtualizationVmwareHostResponse
from intersight.model.virtualization_vmware_kernel_network import VirtualizationVmwareKernelNetwork
from intersight.model.virtualization_vmware_kernel_network_response import VirtualizationVmwareKernelNetworkResponse
from intersight.model.virtualization_vmware_network import VirtualizationVmwareNetwork
from intersight.model.virtualization_vmware_network_response import VirtualizationVmwareNetworkResponse
from intersight.model.virtualization_vmware_physical_network_interface import VirtualizationVmwarePhysicalNetworkInterface
from intersight.model.virtualization_vmware_physical_network_interface_response import VirtualizationVmwarePhysicalNetworkInterfaceResponse
from intersight.model.virtualization_vmware_uplink_port import VirtualizationVmwareUplinkPort
from intersight.model.virtualization_vmware_uplink_port_response import VirtualizationVmwareUplinkPortResponse
from intersight.model.virtualization_vmware_vcenter import VirtualizationVmwareVcenter
from intersight.model.virtualization_vmware_vcenter_response import VirtualizationVmwareVcenterResponse
from intersight.model.virtualization_vmware_virtual_disk import VirtualizationVmwareVirtualDisk
from intersight.model.virtualization_vmware_virtual_disk_response import VirtualizationVmwareVirtualDiskResponse
from intersight.model.virtualization_vmware_virtual_machine import VirtualizationVmwareVirtualMachine
from intersight.model.virtualization_vmware_virtual_machine_response import VirtualizationVmwareVirtualMachineResponse
from intersight.model.virtualization_vmware_virtual_machine_snapshot import VirtualizationVmwareVirtualMachineSnapshot
from intersight.model.virtualization_vmware_virtual_machine_snapshot_response import VirtualizationVmwareVirtualMachineSnapshotResponse
from intersight.model.virtualization_vmware_virtual_network_interface import VirtualizationVmwareVirtualNetworkInterface
from intersight.model.virtualization_vmware_virtual_network_interface_response import VirtualizationVmwareVirtualNetworkInterfaceResponse
from intersight.model.virtualization_vmware_virtual_switch import VirtualizationVmwareVirtualSwitch
from intersight.model.virtualization_vmware_virtual_switch_response import VirtualizationVmwareVirtualSwitchResponse
class VirtualizationApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def __create_virtualization_cisco_hypervisor_manager(
self,
virtualization_cisco_hypervisor_manager,
**kwargs
):
"""Create a 'virtualization.CiscoHypervisorManager' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_virtualization_cisco_hypervisor_manager(virtualization_cisco_hypervisor_manager, async_req=True)
>>> result = thread.get()
Args:
virtualization_cisco_hypervisor_manager (VirtualizationCiscoHypervisorManager): The 'virtualization.CiscoHypervisorManager' resource to create.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
if_none_match (str): For methods that apply server-side changes, If-None-Match used with the * value can be used to create a resource that is not known to exist, guaranteeing that no other resource was created beforehand, which would otherwise lose the data of the previous PUT. The request will be processed only if the eventually existing resource's ETag doesn't match any of the values listed. Otherwise, the status code 412 (Precondition Failed) is used. The asterisk is a special value representing any resource. It is only useful when creating a resource, usually with PUT, to check whether another resource with the same identity has already been created. The comparison with the stored ETag uses the weak comparison algorithm, meaning two resources are considered identical if their content is equivalent - they don't have to be identical byte for byte. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationCiscoHypervisorManager
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['virtualization_cisco_hypervisor_manager'] = \
virtualization_cisco_hypervisor_manager
return self.call_with_http_info(**kwargs)
self.create_virtualization_cisco_hypervisor_manager = _Endpoint(
settings={
'response_type': (VirtualizationCiscoHypervisorManager,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/CiscoHypervisorManagers',
'operation_id': 'create_virtualization_cisco_hypervisor_manager',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'virtualization_cisco_hypervisor_manager',
'if_match',
'if_none_match',
],
'required': [
'virtualization_cisco_hypervisor_manager',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'virtualization_cisco_hypervisor_manager':
(VirtualizationCiscoHypervisorManager,),
'if_match':
(str,),
'if_none_match':
(str,),
},
'attribute_map': {
'if_match': 'If-Match',
'if_none_match': 'If-None-Match',
},
'location_map': {
'virtualization_cisco_hypervisor_manager': 'body',
'if_match': 'header',
'if_none_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__create_virtualization_cisco_hypervisor_manager
)
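# Hypothetical usage sketch (not part of the generated client; kept commented out
# so importing the module has no side effects). It illustrates the synchronous and
# asynchronous calling conventions documented in the docstring above; the default
# ApiClient and the empty payload are illustrative assumptions, since real use
# requires a configured authentication scheme.
#
#   from intersight.api.virtualization_api import VirtualizationApi
#   api = VirtualizationApi()  # assumes a default ApiClient with auth already configured
#   manager = VirtualizationCiscoHypervisorManager()
#   # Synchronous call: blocks and returns the created resource.
#   created = api.create_virtualization_cisco_hypervisor_manager(manager)
#   # Asynchronous call: returns a thread-like object; .get() blocks for the result.
#   thread = api.create_virtualization_cisco_hypervisor_manager(manager, async_req=True)
#   created = thread.get()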
def __create_virtualization_esxi_console(
self,
virtualization_esxi_console,
**kwargs
):
"""Create a 'virtualization.EsxiConsole' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_virtualization_esxi_console(virtualization_esxi_console, async_req=True)
>>> result = thread.get()
Args:
virtualization_esxi_console (VirtualizationEsxiConsole): The 'virtualization.EsxiConsole' resource to create.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
if_none_match (str): For methods that apply server-side changes, If-None-Match used with the * value can be used to create a resource that is not known to exist, guaranteeing that no other resource was created beforehand, which would otherwise lose the data of the previous PUT. The request will be processed only if the eventually existing resource's ETag doesn't match any of the values listed. Otherwise, the status code 412 (Precondition Failed) is used. The asterisk is a special value representing any resource. It is only useful when creating a resource, usually with PUT, to check whether another resource with the same identity has already been created. The comparison with the stored ETag uses the weak comparison algorithm, meaning two resources are considered identical if their content is equivalent - they don't have to be identical byte for byte. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationEsxiConsole
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['virtualization_esxi_console'] = \
virtualization_esxi_console
return self.call_with_http_info(**kwargs)
self.create_virtualization_esxi_console = _Endpoint(
settings={
'response_type': (VirtualizationEsxiConsole,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/EsxiConsoles',
'operation_id': 'create_virtualization_esxi_console',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'virtualization_esxi_console',
'if_match',
'if_none_match',
],
'required': [
'virtualization_esxi_console',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'virtualization_esxi_console':
(VirtualizationEsxiConsole,),
'if_match':
(str,),
'if_none_match':
(str,),
},
'attribute_map': {
'if_match': 'If-Match',
'if_none_match': 'If-None-Match',
},
'location_map': {
'virtualization_esxi_console': 'body',
'if_match': 'header',
'if_none_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__create_virtualization_esxi_console
)
def __create_virtualization_iwe_datacenter(
self,
virtualization_iwe_datacenter,
**kwargs
):
"""Create a 'virtualization.IweDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_virtualization_iwe_datacenter(virtualization_iwe_datacenter, async_req=True)
>>> result = thread.get()
Args:
virtualization_iwe_datacenter (VirtualizationIweDatacenter): The 'virtualization.IweDatacenter' resource to create.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
if_none_match (str): For methods that apply server-side changes, If-None-Match used with the * value can be used to create a resource that is not known to exist, guaranteeing that no other resource was created beforehand, which would otherwise lose the data of the previous PUT. The request will be processed only if the eventually existing resource's ETag doesn't match any of the values listed. Otherwise, the status code 412 (Precondition Failed) is used. The asterisk is a special value representing any resource. It is only useful when creating a resource, usually with PUT, to check whether another resource with the same identity has already been created. The comparison with the stored ETag uses the weak comparison algorithm, meaning two resources are considered identical if their content is equivalent - they don't have to be identical byte for byte. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweDatacenter
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['virtualization_iwe_datacenter'] = \
virtualization_iwe_datacenter
return self.call_with_http_info(**kwargs)
self.create_virtualization_iwe_datacenter = _Endpoint(
settings={
'response_type': (VirtualizationIweDatacenter,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDatacenters',
'operation_id': 'create_virtualization_iwe_datacenter',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'virtualization_iwe_datacenter',
'if_match',
'if_none_match',
],
'required': [
'virtualization_iwe_datacenter',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'virtualization_iwe_datacenter':
(VirtualizationIweDatacenter,),
'if_match':
(str,),
'if_none_match':
(str,),
},
'attribute_map': {
'if_match': 'If-Match',
'if_none_match': 'If-None-Match',
},
'location_map': {
'virtualization_iwe_datacenter': 'body',
'if_match': 'header',
'if_none_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__create_virtualization_iwe_datacenter
)
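# Hypothetical sketch (commented out, not part of the generated client) of the
# conditional-create pattern described in the if_none_match docstrings above:
# passing If-None-Match with the asterisk value asks the server to create the
# resource only if no resource with the same identity already exists, returning
# 412 (Precondition Failed) otherwise. The `api` instance is an assumption from
# the earlier sketch.
#
#   dc = VirtualizationIweDatacenter()
#   created = api.create_virtualization_iwe_datacenter(dc, if_none_match='*')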
def __create_virtualization_virtual_disk(
self,
virtualization_virtual_disk,
**kwargs
):
"""Create a 'virtualization.VirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_virtualization_virtual_disk(virtualization_virtual_disk, async_req=True)
>>> result = thread.get()
Args:
virtualization_virtual_disk (VirtualizationVirtualDisk): The 'virtualization.VirtualDisk' resource to create.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
if_none_match (str): For methods that apply server-side changes, If-None-Match used with the * value can be used to create a resource that is not known to exist, guaranteeing that no other resource was created beforehand, which would otherwise lose the data of the previous PUT. The request will be processed only if the eventually existing resource's ETag doesn't match any of the values listed. Otherwise, the status code 412 (Precondition Failed) is used. The asterisk is a special value representing any resource. It is only useful when creating a resource, usually with PUT, to check whether another resource with the same identity has already been created. The comparison with the stored ETag uses the weak comparison algorithm, meaning two resources are considered identical if their content is equivalent - they don't have to be identical byte for byte. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['virtualization_virtual_disk'] = \
virtualization_virtual_disk
return self.call_with_http_info(**kwargs)
self.create_virtualization_virtual_disk = _Endpoint(
settings={
'response_type': (VirtualizationVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualDisks',
'operation_id': 'create_virtualization_virtual_disk',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'virtualization_virtual_disk',
'if_match',
'if_none_match',
],
'required': [
'virtualization_virtual_disk',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'virtualization_virtual_disk':
(VirtualizationVirtualDisk,),
'if_match':
(str,),
'if_none_match':
(str,),
},
'attribute_map': {
'if_match': 'If-Match',
'if_none_match': 'If-None-Match',
},
'location_map': {
'virtualization_virtual_disk': 'body',
'if_match': 'header',
'if_none_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__create_virtualization_virtual_disk
)
def __create_virtualization_virtual_machine(
self,
virtualization_virtual_machine,
**kwargs
):
"""Create a 'virtualization.VirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_virtualization_virtual_machine(virtualization_virtual_machine, async_req=True)
>>> result = thread.get()
Args:
virtualization_virtual_machine (VirtualizationVirtualMachine): The 'virtualization.VirtualMachine' resource to create.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
if_none_match (str): For methods that apply server-side changes, If-None-Match used with the * value can be used to create a resource that is not known to exist, guaranteeing that no other resource was created beforehand, which would otherwise lose the data of the previous PUT. The request will be processed only if the eventually existing resource's ETag doesn't match any of the values listed. Otherwise, the status code 412 (Precondition Failed) is used. The asterisk is a special value representing any resource. It is only useful when creating a resource, usually with PUT, to check whether another resource with the same identity has already been created. The comparison with the stored ETag uses the weak comparison algorithm, meaning two resources are considered identical if their content is equivalent - they don't have to be identical byte for byte. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['virtualization_virtual_machine'] = \
virtualization_virtual_machine
return self.call_with_http_info(**kwargs)
self.create_virtualization_virtual_machine = _Endpoint(
settings={
'response_type': (VirtualizationVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualMachines',
'operation_id': 'create_virtualization_virtual_machine',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'virtualization_virtual_machine',
'if_match',
'if_none_match',
],
'required': [
'virtualization_virtual_machine',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'virtualization_virtual_machine':
(VirtualizationVirtualMachine,),
'if_match':
(str,),
'if_none_match':
(str,),
},
'attribute_map': {
'if_match': 'If-Match',
'if_none_match': 'If-None-Match',
},
'location_map': {
'virtualization_virtual_machine': 'body',
'if_match': 'header',
'if_none_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__create_virtualization_virtual_machine
)
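# Hypothetical sketch (commented out, not part of the generated client) of the
# lost-update protection described in the if_match docstrings above: fetch the
# resource first, then send the write with If-Match set to its ModTime so the
# server rejects the request with 412 if the resource changed in between. The
# `api` instance, the Moid value, and the `get_virtualization_virtual_machine_by_moid`
# getter are assumptions for illustration.
#
#   vm = api.get_virtualization_virtual_machine_by_moid(moid)
#   updated = api.create_virtualization_virtual_machine(vm, if_match=str(vm.mod_time))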
def __create_virtualization_virtual_network(
self,
virtualization_virtual_network,
**kwargs
):
"""Create a 'virtualization.VirtualNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_virtualization_virtual_network(virtualization_virtual_network, async_req=True)
>>> result = thread.get()
Args:
virtualization_virtual_network (VirtualizationVirtualNetwork): The 'virtualization.VirtualNetwork' resource to create.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
if_none_match (str): For methods that apply server-side changes, If-None-Match used with the * value can be used to create a resource that is not known to exist, guaranteeing that no other resource was created beforehand, which would otherwise lose the data of the previous PUT. The request will be processed only if the eventually existing resource's ETag doesn't match any of the values listed. Otherwise, the status code 412 (Precondition Failed) is used. The asterisk is a special value representing any resource. It is only useful when creating a resource, usually with PUT, to check whether another resource with the same identity has already been created. The comparison with the stored ETag uses the weak comparison algorithm, meaning two resources are considered identical if their content is equivalent - they don't have to be identical byte for byte. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['virtualization_virtual_network'] = \
virtualization_virtual_network
return self.call_with_http_info(**kwargs)
self.create_virtualization_virtual_network = _Endpoint(
settings={
'response_type': (VirtualizationVirtualNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualNetworks',
'operation_id': 'create_virtualization_virtual_network',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'virtualization_virtual_network',
'if_match',
'if_none_match',
],
'required': [
'virtualization_virtual_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'virtualization_virtual_network':
(VirtualizationVirtualNetwork,),
'if_match':
(str,),
'if_none_match':
(str,),
},
'attribute_map': {
'if_match': 'If-Match',
'if_none_match': 'If-None-Match',
},
'location_map': {
'virtualization_virtual_network': 'body',
'if_match': 'header',
'if_none_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__create_virtualization_virtual_network
)
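# The If-Match/ModTime contract described in the create docstring above can be
# sketched with an in-memory store. This is an illustrative simulation of the
# server-side check, not SDK code; the store shape, property names, and the
# integer ModTime are assumptions (the real service uses timestamp strings).

```python
class PreconditionFailed(Exception):
    """Stands in for an HTTP 412 (Precondition Failed) response."""


def apply_update(store, moid, new_props, if_match):
    resource = store[moid]
    # The client echoes back the ModTime it read via GET. A mismatch means
    # another writer modified the resource in the meantime (lost update).
    if if_match != resource["ModTime"]:
        raise PreconditionFailed(moid)
    resource.update(new_props)
    resource["ModTime"] += 1  # simplistic stand-in for a server timestamp
    return resource


store = {"net-1": {"Name": "vlan-a", "ModTime": 1}}
apply_update(store, "net-1", {"Name": "vlan-b"}, if_match=1)  # succeeds
try:
    # A second writer still holding ModTime=1 is rejected.
    apply_update(store, "net-1", {"Name": "vlan-c"}, if_match=1)
    stale_rejected = False
except PreconditionFailed:
    stale_rejected = True
```

# The first writer's update lands and bumps ModTime; the second writer's
# stale If-Match value triggers the 412-style rejection.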
def __delete_virtualization_iwe_cluster(
self,
moid,
**kwargs
):
"""Delete a 'virtualization.IweCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_virtualization_iwe_cluster(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.delete_virtualization_iwe_cluster = _Endpoint(
settings={
'response_type': None,
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweClusters/{Moid}',
'operation_id': 'delete_virtualization_iwe_cluster',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__delete_virtualization_iwe_cluster
)
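# The endpoint wrappers above all follow one pattern: fill the optional
# control kwargs with defaults via kwargs.get, inject the required argument,
# and delegate to call_with_http_info. A compact, illustrative equivalent of
# that defaulting step (helper name and dict shape are invented; this is not
# the SDK's own code):

```python
_CONTROL_DEFAULTS = {
    "async_req": False,
    "_return_http_data_only": True,
    "_preload_content": True,
    "_request_timeout": None,
    "_check_input_type": True,
    "_check_return_type": True,
    "_host_index": None,
}


def merge_call_kwargs(moid, **kwargs):
    # Caller-supplied kwargs win; anything missing falls back to a default,
    # and the required positional argument is injected last.
    return {**_CONTROL_DEFAULTS, **kwargs, "moid": moid}


call = merge_call_kwargs("abc123", async_req=True)
```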
def __delete_virtualization_iwe_datacenter(
self,
moid,
**kwargs
):
"""Delete a 'virtualization.IweDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_virtualization_iwe_datacenter(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.delete_virtualization_iwe_datacenter = _Endpoint(
settings={
'response_type': None,
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDatacenters/{Moid}',
'operation_id': 'delete_virtualization_iwe_datacenter',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__delete_virtualization_iwe_datacenter
)
def __delete_virtualization_iwe_virtual_machine_network_interface(
self,
moid,
**kwargs
):
"""Delete a 'virtualization.IweVirtualMachineNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_virtualization_iwe_virtual_machine_network_interface(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.delete_virtualization_iwe_virtual_machine_network_interface = _Endpoint(
settings={
'response_type': None,
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualMachineNetworkInterfaces/{Moid}',
'operation_id': 'delete_virtualization_iwe_virtual_machine_network_interface',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__delete_virtualization_iwe_virtual_machine_network_interface
)
def __delete_virtualization_virtual_disk(
self,
moid,
**kwargs
):
"""Delete a 'virtualization.VirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_virtualization_virtual_disk(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.delete_virtualization_virtual_disk = _Endpoint(
settings={
'response_type': None,
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualDisks/{Moid}',
'operation_id': 'delete_virtualization_virtual_disk',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__delete_virtualization_virtual_disk
)
def __delete_virtualization_virtual_machine(
self,
moid,
**kwargs
):
"""Delete a 'virtualization.VirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_virtualization_virtual_machine(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.delete_virtualization_virtual_machine = _Endpoint(
settings={
'response_type': None,
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualMachines/{Moid}',
'operation_id': 'delete_virtualization_virtual_machine',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__delete_virtualization_virtual_machine
)
def __delete_virtualization_virtual_network(
self,
moid,
**kwargs
):
"""Delete a 'virtualization.VirtualNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_virtualization_virtual_network(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.delete_virtualization_virtual_network = _Endpoint(
settings={
'response_type': None,
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualNetworks/{Moid}',
'operation_id': 'delete_virtualization_virtual_network',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__delete_virtualization_virtual_network
)
def __get_virtualization_cisco_hypervisor_manager_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.CiscoHypervisorManager' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_cisco_hypervisor_manager_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationCiscoHypervisorManager
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_cisco_hypervisor_manager_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationCiscoHypervisorManager,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/CiscoHypervisorManagers/{Moid}',
'operation_id': 'get_virtualization_cisco_hypervisor_manager_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_cisco_hypervisor_manager_by_moid
)
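# The _request_timeout docstrings above accept either a single number (total
# request timeout) or a (connection, read) pair. A small sketch of that
# normalization; the helper name and returned dict shape are invented for
# illustration and are not part of the SDK:

```python
def normalize_request_timeout(value):
    """Normalize a _request_timeout value: None, a single number,
    or a (connection, read) pair of timeouts."""
    if value is None:
        return {"total": None}
    if isinstance(value, (int, float)):
        # A bare number is interpreted as the total request timeout.
        return {"total": float(value)}
    connect, read = value  # a (connection, read) pair
    return {"connect": float(connect), "read": float(read)}


total_only = normalize_request_timeout(30)
per_phase = normalize_request_timeout((3, 27))
```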
def __get_virtualization_cisco_hypervisor_manager_list(
self,
**kwargs
):
"""Read a 'virtualization.CiscoHypervisorManager' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_cisco_hypervisor_manager_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specifies additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specifies one or more transformation operations to perform aggregation on the resources. The transformations are processed in order, with the output of each transformation used as the input for the next. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are applied consecutively. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies that the service should return the count of the matching resources instead of the resources themselves. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times each key has been used across all documents, and the tag values that have been assigned to each tag key. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationCiscoHypervisorManagerResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_cisco_hypervisor_manager_list = _Endpoint(
settings={
'response_type': (VirtualizationCiscoHypervisorManagerResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/CiscoHypervisorManagers',
'operation_id': 'get_virtualization_cisco_hypervisor_manager_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_cisco_hypervisor_manager_list
)
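# The attribute_map/location_map tables above rename each python-level kwarg
# to its query-parameter wire name ($filter, $top, and so on) before the
# request is sent. A sketch of that assembly step; the map values are copied
# from the endpoint definition above, while the helper itself is illustrative,
# not the SDK's implementation:

```python
from urllib.parse import urlencode

ATTRIBUTE_MAP = {
    "filter": "$filter", "orderby": "$orderby", "top": "$top",
    "skip": "$skip", "select": "$select", "expand": "$expand",
    "apply": "$apply", "count": "$count", "inlinecount": "$inlinecount",
    "at": "at", "tags": "tags",
}


def build_query(**params):
    # Rename each kwarg to its wire name, then percent-encode the result.
    wire = {ATTRIBUTE_MAP[k]: v for k, v in params.items()}
    return urlencode(wire)


qs = build_query(filter="Name eq 'vlan-a'", top=10, skip=0)
```

# Note that urlencode percent-encodes the leading "$" of the OData-style
# parameter names as "%24".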
def __get_virtualization_esxi_console_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.EsxiConsole' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_esxi_console_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationEsxiConsole
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_esxi_console_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationEsxiConsole,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/EsxiConsoles/{Moid}',
'operation_id': 'get_virtualization_esxi_console_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_esxi_console_by_moid
)
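# The $apply parameter documented in the list endpoints above describes a
# "groupby" transformation: split instances into subsets by a grouping
# property, aggregate each subset, and concatenate the per-group results.
# A minimal in-memory sketch; the count aggregate, sample records, and result
# shape are chosen for illustration only:

```python
from collections import defaultdict


def group_by_count(rows, key):
    groups = defaultdict(int)
    for row in rows:
        groups[row[key]] += 1
    # One result instance per group, carrying the grouping property's value.
    return [{key: k, "count": n} for k, n in groups.items()]


vms = [{"PowerState": "on"}, {"PowerState": "off"}, {"PowerState": "on"}]
summary = group_by_count(vms, "PowerState")
```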
def __get_virtualization_esxi_console_list(
self,
**kwargs
):
"""Read a 'virtualization.EsxiConsole' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_esxi_console_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationEsxiConsoleResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_esxi_console_list = _Endpoint(
settings={
'response_type': (VirtualizationEsxiConsoleResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/EsxiConsoles',
'operation_id': 'get_virtualization_esxi_console_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_esxi_console_list
)
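# Usage sketch (hypothetical filter values; assumes an authenticated
# ApiClient behind `api`). The query parameters map to the OData-style
# options listed in the docstring above:
#
#     response = api.get_virtualization_esxi_console_list(
#         filter="Name eq 'esx-console-01'",  # sent as $filter
#         top=10,                             # sent as $top; at most 10 resources
#         inlinecount="allpages",             # sent as $inlinecount; include total match count
#     )
#     # `response` is a VirtualizationEsxiConsoleResponse model; its results
#     # attribute (when populated) holds the matching EsxiConsole instances.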
def __get_virtualization_host_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.Host' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_host_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_host_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/Hosts/{Moid}',
'operation_id': 'get_virtualization_host_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_host_by_moid
)
def __get_virtualization_host_list(
self,
**kwargs
):
"""Read a list of 'virtualization.Host' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_host_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationHostResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_host_list = _Endpoint(
settings={
'response_type': (VirtualizationHostResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/Hosts',
'operation_id': 'get_virtualization_host_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_host_list
)
def __get_virtualization_iwe_cluster_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_cluster_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_cluster_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweClusters/{Moid}',
'operation_id': 'get_virtualization_iwe_cluster_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_cluster_by_moid
)
def __get_virtualization_iwe_cluster_list(
self,
**kwargs
):
"""Read a list of 'virtualization.IweCluster' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_cluster_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweClusterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_cluster_list = _Endpoint(
settings={
'response_type': (VirtualizationIweClusterResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweClusters',
'operation_id': 'get_virtualization_iwe_cluster_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_cluster_list
)
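# Usage sketch (hypothetical; assumes an authenticated ApiClient behind `api`):
#
#     resp = api.get_virtualization_iwe_cluster_list(count=True)
#     # With count=True (sent as $count), the service is asked to return only
#     # the number of matching clusters instead of the resources themselves.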
def __get_virtualization_iwe_datacenter_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_datacenter_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweDatacenter
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_datacenter_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweDatacenter,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDatacenters/{Moid}',
'operation_id': 'get_virtualization_iwe_datacenter_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_datacenter_by_moid
)
def __get_virtualization_iwe_datacenter_list(
self,
**kwargs
):
"""Read a list of 'virtualization.IweDatacenter' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_datacenter_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweDatacenterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_datacenter_list = _Endpoint(
settings={
'response_type': (VirtualizationIweDatacenterResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDatacenters',
'operation_id': 'get_virtualization_iwe_datacenter_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_datacenter_list
)
def __get_virtualization_iwe_dv_uplink_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweDvUplink' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_dv_uplink_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweDvUplink
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_dv_uplink_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweDvUplink,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDvUplinks/{Moid}',
'operation_id': 'get_virtualization_iwe_dv_uplink_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_dv_uplink_by_moid
)
def __get_virtualization_iwe_dv_uplink_list(
self,
**kwargs
):
"""Read a 'virtualization.IweDvUplink' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_dv_uplink_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweDvUplinkResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_dv_uplink_list = _Endpoint(
settings={
'response_type': (VirtualizationIweDvUplinkResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDvUplinks',
'operation_id': 'get_virtualization_iwe_dv_uplink_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_dv_uplink_list
)
def __get_virtualization_iwe_dvswitch_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweDvswitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_dvswitch_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweDvswitch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_dvswitch_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweDvswitch,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDvswitches/{Moid}',
'operation_id': 'get_virtualization_iwe_dvswitch_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_dvswitch_by_moid
)
def __get_virtualization_iwe_dvswitch_list(
self,
**kwargs
):
"""Read a 'virtualization.IweDvswitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_dvswitch_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweDvswitchResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_dvswitch_list = _Endpoint(
settings={
'response_type': (VirtualizationIweDvswitchResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDvswitches',
'operation_id': 'get_virtualization_iwe_dvswitch_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_dvswitch_list
)
def __get_virtualization_iwe_host_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweHost' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_host_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_host_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweHosts/{Moid}',
'operation_id': 'get_virtualization_iwe_host_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_host_by_moid
)
def __get_virtualization_iwe_host_interface_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweHostInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_host_interface_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweHostInterface
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_host_interface_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweHostInterface,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweHostInterfaces/{Moid}',
'operation_id': 'get_virtualization_iwe_host_interface_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_host_interface_by_moid
)
def __get_virtualization_iwe_host_interface_list(
self,
**kwargs
):
"""Read a list of 'virtualization.IweHostInterface' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_host_interface_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data without the HTTP
status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweHostInterfaceResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_host_interface_list = _Endpoint(
settings={
'response_type': (VirtualizationIweHostInterfaceResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweHostInterfaces',
'operation_id': 'get_virtualization_iwe_host_interface_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_host_interface_list
)
def __get_virtualization_iwe_host_list(
self,
**kwargs
):
"""Read a list of 'virtualization.IweHost' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_host_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data without the HTTP
status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweHostResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_host_list = _Endpoint(
settings={
'response_type': (VirtualizationIweHostResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweHosts',
'operation_id': 'get_virtualization_iwe_host_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_host_list
)
def __get_virtualization_iwe_host_vswitch_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweHostVswitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_host_vswitch_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data without the HTTP
status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweHostVswitch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_host_vswitch_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweHostVswitch,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweHostVswitches/{Moid}',
'operation_id': 'get_virtualization_iwe_host_vswitch_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_host_vswitch_by_moid
)
def __get_virtualization_iwe_host_vswitch_list(
self,
**kwargs
):
"""Read a list of 'virtualization.IweHostVswitch' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_host_vswitch_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data without the HTTP
status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweHostVswitchResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_host_vswitch_list = _Endpoint(
settings={
'response_type': (VirtualizationIweHostVswitchResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweHostVswitches',
'operation_id': 'get_virtualization_iwe_host_vswitch_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_host_vswitch_list
)
def __get_virtualization_iwe_network_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_network_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data without the HTTP
status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_network_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweNetworks/{Moid}',
'operation_id': 'get_virtualization_iwe_network_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_network_by_moid
)
def __get_virtualization_iwe_network_list(
self,
**kwargs
):
"""Read a list of 'virtualization.IweNetwork' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_network_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweNetworkResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs.setdefault('async_req', False)
kwargs.setdefault('_return_http_data_only', True)
kwargs.setdefault('_preload_content', True)
kwargs.setdefault('_request_timeout', None)
kwargs.setdefault('_check_input_type', True)
kwargs.setdefault('_check_return_type', True)
kwargs.setdefault('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_network_list = _Endpoint(
settings={
'response_type': (VirtualizationIweNetworkResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweNetworks',
'operation_id': 'get_virtualization_iwe_network_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_network_list
)
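# The list endpoints above all accept the OData-style $top/$skip window
# parameters. As a rough sketch (the `fetch` callable below is a stand-in
# for illustration, not part of this SDK), paging through a collection
# looks like:

```python
def paginate(fetch, page_size=100):
    """Yield successive pages by advancing the skip offset.

    `fetch` is any callable accepting top/skip keyword arguments and
    returning a list of results; a short page signals the end.
    """
    skip = 0
    while True:
        page = fetch(top=page_size, skip=skip)
        if not page:
            break
        yield page
        if len(page) < page_size:
            break
        skip += page_size

# Stand-in fetcher simulating a 250-item collection.
def _fake_fetch(top, skip):
    items = list(range(250))
    return items[skip:skip + top]

pages = list(paginate(_fake_fetch, page_size=100))
```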
def __get_virtualization_iwe_virtual_disk_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweVirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_virtual_disk_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs.setdefault('async_req', False)
kwargs.setdefault('_return_http_data_only', True)
kwargs.setdefault('_preload_content', True)
kwargs.setdefault('_request_timeout', None)
kwargs.setdefault('_check_input_type', True)
kwargs.setdefault('_check_return_type', True)
kwargs.setdefault('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_virtual_disk_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualDisks/{Moid}',
'operation_id': 'get_virtualization_iwe_virtual_disk_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_virtual_disk_by_moid
)
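# Each wrapper above seeds `kwargs` with call-option defaults before
# delegating to `call_with_http_info`. A minimal sketch of that pattern
# (the names below mirror the options in the docstrings; `apply_defaults`
# itself is illustrative, not SDK API):

```python
DEFAULTS = {
    'async_req': False,
    '_return_http_data_only': True,
    '_preload_content': True,
    '_request_timeout': None,
    '_check_input_type': True,
    '_check_return_type': True,
}

def apply_defaults(kwargs):
    """Fill in missing call options without clobbering caller-supplied ones."""
    for key, value in DEFAULTS.items():
        kwargs.setdefault(key, value)
    return kwargs

# Caller overrides survive; everything else falls back to the default.
opts = apply_defaults({'async_req': True})
```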
def __get_virtualization_iwe_virtual_disk_list(
self,
**kwargs
):
"""Read a 'virtualization.IweVirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_virtual_disk_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order, with the output from one transformation used as input for the next. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are applied consecutively, i.e. the result of each transformation is the input to the next. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and: 1. splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter; 2. applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality; 3. ensures that the instances in the result set contain all grouping properties with the correct values for the group; 4. concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies that the service should return the count of the matching resources instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter requests a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times each key has been used across all documents, and the tag values that have been assigned to each tag key. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweVirtualDiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs.setdefault('async_req', False)
kwargs.setdefault('_return_http_data_only', True)
kwargs.setdefault('_preload_content', True)
kwargs.setdefault('_request_timeout', None)
kwargs.setdefault('_check_input_type', True)
kwargs.setdefault('_check_return_type', True)
kwargs.setdefault('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_virtual_disk_list = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualDiskResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualDisks',
'operation_id': 'get_virtualization_iwe_virtual_disk_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_virtual_disk_list
)
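# The `attribute_map` entries above translate Python-friendly parameter
# names into the `$`-prefixed OData query keys sent on the wire. A hedged
# sketch of that renaming step (simplified; the real `_Endpoint` machinery
# also handles locations, types, and validation):

```python
# Subset of the attribute_map used by the list endpoints above.
ATTRIBUTE_MAP = {
    'filter': '$filter',
    'orderby': '$orderby',
    'top': '$top',
    'skip': '$skip',
}

def to_query_params(params):
    """Rename Python parameter names to their wire-format query keys."""
    return {ATTRIBUTE_MAP[name]: value for name, value in params.items()}

query = to_query_params({'filter': "Name eq 'vm0'", 'top': 10})
```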
def __get_virtualization_iwe_virtual_machine_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweVirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_virtual_machine_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs.setdefault('async_req', False)
kwargs.setdefault('_return_http_data_only', True)
kwargs.setdefault('_preload_content', True)
kwargs.setdefault('_request_timeout', None)
kwargs.setdefault('_check_input_type', True)
kwargs.setdefault('_check_return_type', True)
kwargs.setdefault('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_virtual_machine_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualMachines/{Moid}',
'operation_id': 'get_virtualization_iwe_virtual_machine_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_virtual_machine_by_moid
)
def __get_virtualization_iwe_virtual_machine_list(
self,
**kwargs
):
"""Read a 'virtualization.IweVirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_virtual_machine_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order, with the output from one transformation used as input for the next. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are applied consecutively, i.e. the result of each transformation is the input to the next. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and: 1. splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter; 2. applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality; 3. ensures that the instances in the result set contain all grouping properties with the correct values for the group; 4. concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies that the service should return the count of the matching resources instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter requests a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times each key has been used across all documents, and the tag values that have been assigned to each tag key. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweVirtualMachineResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs.setdefault('async_req', False)
kwargs.setdefault('_return_http_data_only', True)
kwargs.setdefault('_preload_content', True)
kwargs.setdefault('_request_timeout', None)
kwargs.setdefault('_check_input_type', True)
kwargs.setdefault('_check_return_type', True)
kwargs.setdefault('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_virtual_machine_list = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualMachineResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualMachines',
'operation_id': 'get_virtualization_iwe_virtual_machine_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_virtual_machine_list
)
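# When `async_req=True`, these methods return a thread-like object whose
# `.get()` yields the result, as the docstrings above describe. A rough
# emulation of that call shape using only the standard library (this is a
# sketch, not the SDK's actual implementation):

```python
from concurrent.futures import ThreadPoolExecutor

class AsyncResult:
    """Wraps a future so callers use thread.get(), matching the docstrings."""
    def __init__(self, future):
        self._future = future

    def get(self):
        # Block until the background call completes and return its result.
        return self._future.result()

_executor = ThreadPoolExecutor(max_workers=2)

def call(fn, *args, async_req=False, **kwargs):
    """Run fn synchronously, or in a worker thread when async_req is True."""
    if async_req:
        return AsyncResult(_executor.submit(fn, *args, **kwargs))
    return fn(*args, **kwargs)

thread = call(lambda x: x * 2, 21, async_req=True)
result = thread.get()
```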
def __get_virtualization_iwe_virtual_machine_network_interface_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.IweVirtualMachineNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_virtual_machine_network_interface_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweVirtualMachineNetworkInterface
If the method is called asynchronously, returns the request
thread.
"""
kwargs.setdefault('async_req', False)
kwargs.setdefault('_return_http_data_only', True)
kwargs.setdefault('_preload_content', True)
kwargs.setdefault('_request_timeout', None)
kwargs.setdefault('_check_input_type', True)
kwargs.setdefault('_check_return_type', True)
kwargs.setdefault('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_virtual_machine_network_interface_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualMachineNetworkInterface,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualMachineNetworkInterfaces/{Moid}',
'operation_id': 'get_virtualization_iwe_virtual_machine_network_interface_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_virtual_machine_network_interface_by_moid
)
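# The `allowed_values` maps above constrain `inlinecount` to "allpages" or
# "none". A small sketch of the kind of enum check this implies (the
# `check_enum` helper is illustrative, not the SDK's actual validator):

```python
# Mirrors the allowed_values structure used by the list endpoints above.
ALLOWED_VALUES = {
    ('inlinecount',): {"ALLPAGES": "allpages", "NONE": "none"},
}

def check_enum(param, value):
    """Raise ValueError if value is not an allowed enum value for param."""
    allowed = ALLOWED_VALUES[(param,)].values()
    if value not in allowed:
        raise ValueError(
            f"Invalid value {value!r} for {param}; must be one of {sorted(allowed)}"
        )
    return value

ok = check_enum('inlinecount', 'none')
```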
def __get_virtualization_iwe_virtual_machine_network_interface_list(
self,
**kwargs
):
"""Read a 'virtualization.IweVirtualMachineNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_iwe_virtual_machine_network_interface_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order, with the output from one transformation used as input for the next. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are applied consecutively, i.e. the result of each transformation is the input to the next. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and: 1. splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter; 2. applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality; 3. ensures that the instances in the result set contain all grouping properties with the correct values for the group; 4. concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies that the service should return the count of the matching resources instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter requests a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times each key has been used across all documents, and the tag values that have been assigned to each tag key. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
VirtualizationIweVirtualMachineNetworkInterfaceResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_iwe_virtual_machine_network_interface_list = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualMachineNetworkInterfaceResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualMachineNetworkInterfaces',
'operation_id': 'get_virtualization_iwe_virtual_machine_network_interface_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_iwe_virtual_machine_network_interface_list
)
def __get_virtualization_virtual_disk_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_virtual_disk_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_virtual_disk_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualDisks/{Moid}',
'operation_id': 'get_virtualization_virtual_disk_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_virtual_disk_by_moid
)
def __get_virtualization_virtual_disk_list(
self,
**kwargs
):
"""Read a 'virtualization.VirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_virtual_disk_list(async_req=True)
>>> result = thread.get()
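As a self-contained sketch (hypothetical helper, not SDK code) of how the Pythonic keyword names below are translated into the OData query options the server expects; the 'attribute_map' values mirror the mapping defined for this endpoint:
```python
from urllib.parse import urlencode

# Minimal illustration of the kwarg-to-OData renaming this client
# performs via attribute_map before encoding the query string.
attribute_map = {'filter': '$filter', 'orderby': '$orderby',
                 'top': '$top', 'skip': '$skip'}

def build_query(**params):
    # Rename each Pythonic kwarg to its "$"-prefixed OData option,
    # then percent-encode the result into a query string.
    return urlencode({attribute_map[k]: v for k, v in params.items()})

qs = build_query(filter="Name eq 'vm-01'", top=10)
# qs == "%24filter=Name+eq+%27vm-01%27&%24top=10"
```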
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualDiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_virtual_disk_list = _Endpoint(
settings={
'response_type': (VirtualizationVirtualDiskResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualDisks',
'operation_id': 'get_virtualization_virtual_disk_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_virtual_disk_list
)
def __get_virtualization_virtual_machine_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_virtual_machine_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_virtual_machine_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualMachines/{Moid}',
'operation_id': 'get_virtualization_virtual_machine_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_virtual_machine_by_moid
)
def __get_virtualization_virtual_machine_list(
self,
**kwargs
):
"""Read a 'virtualization.VirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_virtual_machine_list(async_req=True)
>>> result = thread.get()
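As a hypothetical illustration of the \"$apply\" pipeline described below (property names are invented for the example, not taken from this API's schema), transformations are slash-separated and each stage's output feeds the next:
```python
# Hypothetical "$apply" value: filter first, then group the surviving
# instances by PowerState and count each group.
apply_expr = "filter(PowerState eq 'poweredOn')/groupby((PowerState),aggregate($count as Count))"

# Stages are applied left to right; splitting on "/" recovers them in order.
stages = apply_expr.split("/")
# stages[0] == "filter(PowerState eq 'poweredOn')"
# stages[1] == "groupby((PowerState),aggregate($count as Count))"
```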
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualMachineResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_virtual_machine_list = _Endpoint(
settings={
'response_type': (VirtualizationVirtualMachineResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualMachines',
'operation_id': 'get_virtualization_virtual_machine_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_virtual_machine_list
)
def __get_virtualization_virtual_network_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VirtualNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_virtual_network_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_virtual_network_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVirtualNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualNetworks/{Moid}',
'operation_id': 'get_virtualization_virtual_network_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_virtual_network_by_moid
)
def __get_virtualization_virtual_network_list(
self,
**kwargs
):
"""Read a 'virtualization.VirtualNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_virtual_network_list(async_req=True)
>>> result = thread.get()
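A minimal sketch, assuming the '_request_timeout' semantics documented below (a single number is the total timeout, a tuple is (connection, read) timeouts); the helper name is invented for illustration:
```python
# Hypothetical helper mapping the documented _request_timeout forms onto
# the keyword arguments understood by urllib3's Timeout class.
def to_urllib3_timeout_args(timeout):
    if isinstance(timeout, tuple):
        connect, read = timeout    # pair: separate connection/read limits
        return {"connect": connect, "read": read}
    return {"total": timeout}      # single number: total request timeout

to_urllib3_timeout_args(5.0)         # {"total": 5.0}
to_urllib3_timeout_args((3.05, 27))  # {"connect": 3.05, "read": 27}
```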
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualNetworkResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_virtual_network_list = _Endpoint(
settings={
'response_type': (VirtualizationVirtualNetworkResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualNetworks',
'operation_id': 'get_virtualization_virtual_network_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_virtual_network_list
)
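# Usage sketch (illustrative only, not part of the generated client): how a
# caller might combine the OData query options documented above. The filter
# expression and property names are hypothetical; `api_client` is assumed to
# be an already-authenticated ApiClient instance.
#
#     api = VirtualizationApi(api_client)
#     resp = api.get_virtualization_virtual_network_list(
#         filter="Name eq 'prod-net'",  # sent as the $filter query option
#         top=10,                       # $top: maximum resources per page
#         skip=0,                       # $skip: offset, for paging
#         orderby="Name",               # $orderby: sort key
#     )
#     for net in resp.results:
#         print(net.moid)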
def __get_virtualization_vmware_cluster_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_cluster_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_cluster_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareClusters/{Moid}',
'operation_id': 'get_virtualization_vmware_cluster_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_cluster_by_moid
)
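# Usage sketch (illustrative only): fetching a single resource by Moid, and
# the asynchronous pattern shown in the docstrings above. "<moid>" stands in
# for a real Moid string and is purely a placeholder.
#
#     api = VirtualizationApi(api_client)
#     cluster = api.get_virtualization_vmware_cluster_by_moid("<moid>")
#
#     # Asynchronous variant: returns a thread-like handle; .get() blocks
#     # until the response is available.
#     thread = api.get_virtualization_vmware_cluster_by_moid(
#         "<moid>", async_req=True
#     )
#     cluster = thread.get()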
def __get_virtualization_vmware_cluster_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_cluster_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareClusterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_cluster_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareClusterResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareClusters',
'operation_id': 'get_virtualization_vmware_cluster_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_cluster_list
)
def __get_virtualization_vmware_datacenter_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_datacenter_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatacenter
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_datacenter_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatacenter,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatacenters/{Moid}',
'operation_id': 'get_virtualization_vmware_datacenter_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_datacenter_by_moid
)
def __get_virtualization_vmware_datacenter_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_datacenter_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatacenterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_datacenter_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatacenterResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatacenters',
'operation_id': 'get_virtualization_vmware_datacenter_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_datacenter_list
)
def __get_virtualization_vmware_datastore_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareDatastore' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_datastore_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatastore
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_datastore_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatastore,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatastores/{Moid}',
'operation_id': 'get_virtualization_vmware_datastore_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_datastore_by_moid
)
def __get_virtualization_vmware_datastore_cluster_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareDatastoreCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_datastore_cluster_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatastoreCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_datastore_cluster_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatastoreCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatastoreClusters/{Moid}',
'operation_id': 'get_virtualization_vmware_datastore_cluster_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_datastore_cluster_by_moid
)
def __get_virtualization_vmware_datastore_cluster_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareDatastoreCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_datastore_cluster_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatastoreClusterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_datastore_cluster_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatastoreClusterResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatastoreClusters',
'operation_id': 'get_virtualization_vmware_datastore_cluster_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_datastore_cluster_list
)
def __get_virtualization_vmware_datastore_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareDatastore' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_datastore_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers, and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines which properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order, with the output of each transformation used as the input to the next. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and: 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies that the service should return the count of the matching resources instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for the resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers, and boolean values (true or false), or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times each key has been used across all documents, and the tag values that have been assigned to each tag key. [optional]
_return_http_data_only (bool): return the response data only, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatastoreResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_datastore_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatastoreResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatastores',
'operation_id': 'get_virtualization_vmware_datastore_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_datastore_list
)
def __get_virtualization_vmware_distributed_network_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareDistributedNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_distributed_network_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDistributedNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_distributed_network_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDistributedNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDistributedNetworks/{Moid}',
'operation_id': 'get_virtualization_vmware_distributed_network_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_distributed_network_by_moid
)
def __get_virtualization_vmware_distributed_network_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareDistributedNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_distributed_network_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers, and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines which properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order, with the output of each transformation used as the input to the next. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and: 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies that the service should return the count of the matching resources instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for the resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers, and boolean values (true or false), or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times each key has been used across all documents, and the tag values that have been assigned to each tag key. [optional]
_return_http_data_only (bool): return the response data only, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDistributedNetworkResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_distributed_network_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDistributedNetworkResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDistributedNetworks',
'operation_id': 'get_virtualization_vmware_distributed_network_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_distributed_network_list
)
def __get_virtualization_vmware_distributed_switch_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareDistributedSwitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_distributed_switch_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDistributedSwitch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_distributed_switch_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDistributedSwitch,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDistributedSwitches/{Moid}',
'operation_id': 'get_virtualization_vmware_distributed_switch_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_distributed_switch_by_moid
)
def __get_virtualization_vmware_distributed_switch_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareDistributedSwitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_distributed_switch_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers, and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines which properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order, with the output of each transformation used as the input to the next. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and: 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies that the service should return the count of the matching resources instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for the resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers, and boolean values (true or false), or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times each key has been used across all documents, and the tag values that have been assigned to each tag key. [optional]
_return_http_data_only (bool): return the response data only, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDistributedSwitchResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_distributed_switch_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDistributedSwitchResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDistributedSwitches',
'operation_id': 'get_virtualization_vmware_distributed_switch_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_distributed_switch_list
)
def __get_virtualization_vmware_folder_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareFolder' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_folder_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareFolder
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_folder_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareFolder,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareFolders/{Moid}',
'operation_id': 'get_virtualization_vmware_folder_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_folder_by_moid
)
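# Usage sketch (illustrative): reading a single folder by its Moid, which is
# substituted into the {Moid} path segment of the endpoint. `folder_moid` is
# a placeholder for a real resource identifier.
#
#     folder = api.get_virtualization_vmware_folder_by_moid(folder_moid)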
def __get_virtualization_vmware_folder_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareFolder' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_folder_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareFolderResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_folder_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareFolderResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareFolders',
'operation_id': 'get_virtualization_vmware_folder_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_folder_list
)
def __get_virtualization_vmware_host_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareHost' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_host_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_host_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareHosts/{Moid}',
'operation_id': 'get_virtualization_vmware_host_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_host_by_moid
)
def __get_virtualization_vmware_host_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareHost' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_host_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareHostResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_host_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareHostResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareHosts',
'operation_id': 'get_virtualization_vmware_host_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_host_list
)
def __get_virtualization_vmware_kernel_network_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareKernelNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_kernel_network_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareKernelNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_kernel_network_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareKernelNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareKernelNetworks/{Moid}',
'operation_id': 'get_virtualization_vmware_kernel_network_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_kernel_network_by_moid
)
def __get_virtualization_vmware_kernel_network_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareKernelNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_kernel_network_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareKernelNetworkResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_kernel_network_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareKernelNetworkResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareKernelNetworks',
'operation_id': 'get_virtualization_vmware_kernel_network_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_kernel_network_list
)
def __get_virtualization_vmware_network_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_network_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_network_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareNetwork,),
'auth': [
    'cookieAuth',
    'http_signature',
    'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareNetworks/{Moid}',
'operation_id': 'get_virtualization_vmware_network_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_network_by_moid
)
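# Hypothetical usage sketch (not part of the generated client; the Moid value
# below is illustrative only, and an authenticated `api_client` is assumed):
#
#     api = VirtualizationApi(api_client)
#     # Synchronous call returns the model object directly.
#     net = api.get_virtualization_vmware_network_by_moid("5f8e1c2d3a4b5c6d7e8f9a0b")
#     # With async_req=True the call returns a thread-like handle instead.
#     thread = api.get_virtualization_vmware_network_by_moid(
#         "5f8e1c2d3a4b5c6d7e8f9a0b", async_req=True)
#     net = thread.get()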
def __get_virtualization_vmware_network_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_network_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
    without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
    will be returned without reading/decoding response data.
    Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
    number is provided, it is the total request timeout. It can also
    be a pair (tuple) of (connection, read) timeouts.
    Default is None.
_check_input_type (bool): specifies if type checking
    should be done on the data sent to the server.
    Default is True.
_check_return_type (bool): specifies if type checking
    should be done on the data received from the server.
    Default is True.
_host_index (int/None): specifies the index of the server
    to use.
    Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareNetworkResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_network_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareNetworkResponse,),
'auth': [
    'cookieAuth',
    'http_signature',
    'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareNetworks',
'operation_id': 'get_virtualization_vmware_network_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_network_list
)
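# Hypothetical usage sketch (not part of the generated client): server-side
# filtering and paging via the OData-style query parameters. The filter
# expression and the `.results` attribute of the response model are
# assumptions based on the response type's generated shape.
#
#     api = VirtualizationApi(api_client)
#     page = api.get_virtualization_vmware_network_list(
#         filter="Name eq 'vm-network-1'",  # mapped to $filter
#         top=10,                           # mapped to $top
#         skip=0)                           # mapped to $skip
#     for net in page.results:
#         print(net.moid)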
def __get_virtualization_vmware_physical_network_interface_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwarePhysicalNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_physical_network_interface_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
    without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
    will be returned without reading/decoding response data.
    Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
    number is provided, it is the total request timeout. It can also
    be a pair (tuple) of (connection, read) timeouts.
    Default is None.
_check_input_type (bool): specifies if type checking
    should be done on the data sent to the server.
    Default is True.
_check_return_type (bool): specifies if type checking
    should be done on the data received from the server.
    Default is True.
_host_index (int/None): specifies the index of the server
    to use.
    Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwarePhysicalNetworkInterface
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_physical_network_interface_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwarePhysicalNetworkInterface,),
'auth': [
    'cookieAuth',
    'http_signature',
    'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwarePhysicalNetworkInterfaces/{Moid}',
'operation_id': 'get_virtualization_vmware_physical_network_interface_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_physical_network_interface_by_moid
)
def __get_virtualization_vmware_physical_network_interface_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwarePhysicalNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_physical_network_interface_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
    without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
    will be returned without reading/decoding response data.
    Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
    number is provided, it is the total request timeout. It can also
    be a pair (tuple) of (connection, read) timeouts.
    Default is None.
_check_input_type (bool): specifies if type checking
    should be done on the data sent to the server.
    Default is True.
_check_return_type (bool): specifies if type checking
    should be done on the data received from the server.
    Default is True.
_host_index (int/None): specifies the index of the server
    to use.
    Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwarePhysicalNetworkInterfaceResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_physical_network_interface_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwarePhysicalNetworkInterfaceResponse,),
'auth': [
    'cookieAuth',
    'http_signature',
    'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwarePhysicalNetworkInterfaces',
'operation_id': 'get_virtualization_vmware_physical_network_interface_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_physical_network_interface_list
)
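# Hypothetical usage sketch (not part of the generated client): the common
# underscore-prefixed keyword options accepted by every endpoint, shown with
# a (connection, read) timeout pair and raw-response handling.
#
#     raw = api.get_virtualization_vmware_physical_network_interface_list(
#         _request_timeout=(3.0, 30.0),  # (connection, read) seconds
#         _preload_content=False)        # raw urllib3.HTTPResponse, not a model
#     body = raw.data                    # undecoded response bytes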
def __get_virtualization_vmware_uplink_port_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareUplinkPort' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_uplink_port_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
    without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
    will be returned without reading/decoding response data.
    Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
    number is provided, it is the total request timeout. It can also
    be a pair (tuple) of (connection, read) timeouts.
    Default is None.
_check_input_type (bool): specifies if type checking
    should be done on the data sent to the server.
    Default is True.
_check_return_type (bool): specifies if type checking
    should be done on the data received from the server.
    Default is True.
_host_index (int/None): specifies the index of the server
    to use.
    Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareUplinkPort
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_uplink_port_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareUplinkPort,),
'auth': [
    'cookieAuth',
    'http_signature',
    'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareUplinkPorts/{Moid}',
'operation_id': 'get_virtualization_vmware_uplink_port_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_uplink_port_by_moid
)
def __get_virtualization_vmware_uplink_port_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareUplinkPort' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_uplink_port_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
    without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
    will be returned without reading/decoding response data.
    Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
    number is provided, it is the total request timeout. It can also
    be a pair (tuple) of (connection, read) timeouts.
    Default is None.
_check_input_type (bool): specifies if type checking
    should be done on the data sent to the server.
    Default is True.
_check_return_type (bool): specifies if type checking
    should be done on the data received from the server.
    Default is True.
_host_index (int/None): specifies the index of the server
    to use.
    Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareUplinkPortResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_uplink_port_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareUplinkPortResponse,),
'auth': [
    'cookieAuth',
    'http_signature',
    'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareUplinkPorts',
'operation_id': 'get_virtualization_vmware_uplink_port_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_uplink_port_list
)
def __get_virtualization_vmware_vcenter_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareVcenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_vcenter_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
    without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
    will be returned without reading/decoding response data.
    Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
    number is provided, it is the total request timeout. It can also
    be a pair (tuple) of (connection, read) timeouts.
    Default is None.
_check_input_type (bool): specifies if type checking
    should be done on the data sent to the server.
    Default is True.
_check_return_type (bool): specifies if type checking
    should be done on the data received from the server.
    Default is True.
_host_index (int/None): specifies the index of the server
    to use.
    Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVcenter
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_vcenter_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVcenter,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVcenters/{Moid}',
'operation_id': 'get_virtualization_vmware_vcenter_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_vcenter_by_moid
)
def __get_virtualization_vmware_vcenter_list(
self,
**kwargs
):
"""Read a list of 'virtualization.VmwareVcenter' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_vcenter_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The "$apply" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are "aggregate" and "groupby". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to "$filter", but "at" is specifically used to filter versioning information properties for resources to return. A URI with an "at" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVcenterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_vcenter_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVcenterResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVcenters',
'operation_id': 'get_virtualization_vmware_vcenter_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_vcenter_list
)
def __get_virtualization_vmware_virtual_disk_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareVirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_disk_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_disk_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualDisks/{Moid}',
'operation_id': 'get_virtualization_vmware_virtual_disk_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_disk_by_moid
)
def __get_virtualization_vmware_virtual_disk_list(
self,
**kwargs
):
"""Read a list of 'virtualization.VmwareVirtualDisk' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_disk_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The "$apply" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are "aggregate" and "groupby". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to "$filter", but "at" is specifically used to filter versioning information properties for resources to return. A URI with an "at" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualDiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_disk_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualDiskResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualDisks',
'operation_id': 'get_virtualization_vmware_virtual_disk_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_disk_list
)
def __get_virtualization_vmware_virtual_machine_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareVirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_machine_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_machine_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualMachines/{Moid}',
'operation_id': 'get_virtualization_vmware_virtual_machine_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_machine_by_moid
)
def __get_virtualization_vmware_virtual_machine_list(
self,
**kwargs
):
"""Read a list of 'virtualization.VmwareVirtualMachine' resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_machine_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false). [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources. [optional]
top (int): Specifies the maximum number of resources to return in the response. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The "$apply" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are "aggregate" and "groupby". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to "$filter", but "at" is specifically used to filter versioning information properties for resources to return. A URI with an "at" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualMachineResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_machine_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualMachineResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualMachines',
'operation_id': 'get_virtualization_vmware_virtual_machine_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_machine_list
)
def __get_virtualization_vmware_virtual_machine_snapshot_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareVirtualMachineSnapshot' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_machine_snapshot_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualMachineSnapshot
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_machine_snapshot_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualMachineSnapshot,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualMachineSnapshots/{Moid}',
'operation_id': 'get_virtualization_vmware_virtual_machine_snapshot_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_machine_snapshot_by_moid
)
def __get_virtualization_vmware_virtual_machine_snapshot_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareVirtualMachineSnapshot' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_machine_snapshot_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualMachineSnapshotResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_machine_snapshot_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualMachineSnapshotResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualMachineSnapshots',
'operation_id': 'get_virtualization_vmware_virtual_machine_snapshot_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_machine_snapshot_list
)
def __get_virtualization_vmware_virtual_network_interface_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareVirtualNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_network_interface_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualNetworkInterface
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_network_interface_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualNetworkInterface,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualNetworkInterfaces/{Moid}',
'operation_id': 'get_virtualization_vmware_virtual_network_interface_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_network_interface_by_moid
)
def __get_virtualization_vmware_virtual_network_interface_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareVirtualNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_network_interface_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualNetworkInterfaceResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_network_interface_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualNetworkInterfaceResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualNetworkInterfaces',
'operation_id': 'get_virtualization_vmware_virtual_network_interface_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_network_interface_list
)
def __get_virtualization_vmware_virtual_switch_by_moid(
self,
moid,
**kwargs
):
"""Read a 'virtualization.VmwareVirtualSwitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_switch_by_moid(moid, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualSwitch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_switch_by_moid = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualSwitch,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualSwitches/{Moid}',
'operation_id': 'get_virtualization_vmware_virtual_switch_by_moid',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'moid',
],
'required': [
'moid',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
},
'attribute_map': {
'moid': 'Moid',
},
'location_map': {
'moid': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_switch_by_moid
)
def __get_virtualization_vmware_virtual_switch_list(
self,
**kwargs
):
"""Read a 'virtualization.VmwareVirtualSwitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_virtualization_vmware_virtual_switch_list(async_req=True)
>>> result = thread.get()
Keyword Args:
filter (str): Filter criteria for the resources to return. A URI with a $filter query option identifies a subset of the entries from the Collection of Entries. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the $filter option. The expression language that is used in $filter queries supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false).. [optional] if omitted the server will use the default value of ""
orderby (str): Determines what properties are used to sort the collection of resources.. [optional]
top (int): Specifies the maximum number of resources to return in the response.. [optional] if omitted the server will use the default value of 100
skip (int): Specifies the number of resources to skip in the response.. [optional] if omitted the server will use the default value of 0
select (str): Specifies a subset of properties to return.. [optional] if omitted the server will use the default value of ""
expand (str): Specify additional attributes or related resources to return in addition to the primary resources.. [optional]
apply (str): Specify one or more transformation operations to perform aggregation on the resources. The transformations are processed in order with the output from a transformation being used as input for the subsequent transformation. The \"$apply\" query takes a sequence of set transformations, separated by forward slashes to express that they are consecutively applied, i.e. the result of each transformation is the input to the next transformation. Supported aggregation methods are \"aggregate\" and \"groupby\". The **aggregate** transformation takes a comma-separated list of one or more aggregate expressions as parameters and returns a result set with a single instance, representing the aggregated value for all instances in the input set. The **groupby** transformation takes one or two parameters and 1. Splits the initial set into subsets where all instances in a subset have the same values for the grouping properties specified in the first parameter, 2. Applies set transformations to each subset according to the second parameter, resulting in a new set of potentially different structure and cardinality, 3. Ensures that the instances in the result set contain all grouping properties with the correct values for the group, 4. Concatenates the intermediate result sets into one result set. A groupby transformation affects the structure of the result set.. [optional]
count (bool): The $count query specifies the service should return the count of the matching resources, instead of returning the resources.. [optional]
inlinecount (str): The $inlinecount query option allows clients to request an inline count of the matching resources included with the resources in the response.. [optional] if omitted the server will use the default value of "allpages"
at (str): Similar to \"$filter\", but \"at\" is specifically used to filter versioning information properties for resources to return. A URI with an \"at\" Query Option identifies a subset of the Entries from the Collection of Entries identified by the Resource Path section of the URI. The subset is determined by selecting only the Entries that satisfy the predicate expression specified by the query option. The expression language that is used in at operators supports references to properties and literals. The literal values can be strings enclosed in single quotes, numbers and boolean values (true or false) or any of the additional literal representations shown in the Abstract Type System section.. [optional]
tags (str): The 'tags' parameter is used to request a summary of the Tag utilization for this resource. When the 'tags' parameter is specified, the response provides a list of tag keys, the number of times the key has been used across all documents, and the tag values that have been assigned to the tag key.. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualSwitchResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_virtualization_vmware_virtual_switch_list = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualSwitchResponse,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualSwitches',
'operation_id': 'get_virtualization_vmware_virtual_switch_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'filter',
'orderby',
'top',
'skip',
'select',
'expand',
'apply',
'count',
'inlinecount',
'at',
'tags',
],
'required': [],
'nullable': [
],
'enum': [
'inlinecount',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('inlinecount',): {
"ALLPAGES": "allpages",
"NONE": "none"
},
},
'openapi_types': {
'filter':
(str,),
'orderby':
(str,),
'top':
(int,),
'skip':
(int,),
'select':
(str,),
'expand':
(str,),
'apply':
(str,),
'count':
(bool,),
'inlinecount':
(str,),
'at':
(str,),
'tags':
(str,),
},
'attribute_map': {
'filter': '$filter',
'orderby': '$orderby',
'top': '$top',
'skip': '$skip',
'select': '$select',
'expand': '$expand',
'apply': '$apply',
'count': '$count',
'inlinecount': '$inlinecount',
'at': 'at',
'tags': 'tags',
},
'location_map': {
'filter': 'query',
'orderby': 'query',
'top': 'query',
'skip': 'query',
'select': 'query',
'expand': 'query',
'apply': 'query',
'count': 'query',
'inlinecount': 'query',
'at': 'query',
'tags': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'text/csv',
'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
],
'content_type': [],
},
api_client=api_client,
callable=__get_virtualization_vmware_virtual_switch_list
)
def __patch_virtualization_cisco_hypervisor_manager(
self,
moid,
virtualization_cisco_hypervisor_manager,
**kwargs
):
"""Update a 'virtualization.CiscoHypervisorManager' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_cisco_hypervisor_manager(moid, virtualization_cisco_hypervisor_manager, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_cisco_hypervisor_manager (VirtualizationCiscoHypervisorManager): The 'virtualization.CiscoHypervisorManager' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request.. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationCiscoHypervisorManager
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_cisco_hypervisor_manager'] = \
virtualization_cisco_hypervisor_manager
return self.call_with_http_info(**kwargs)
self.patch_virtualization_cisco_hypervisor_manager = _Endpoint(
settings={
'response_type': (VirtualizationCiscoHypervisorManager,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/CiscoHypervisorManagers/{Moid}',
'operation_id': 'patch_virtualization_cisco_hypervisor_manager',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_cisco_hypervisor_manager',
'if_match',
],
'required': [
'moid',
'virtualization_cisco_hypervisor_manager',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_cisco_hypervisor_manager':
(VirtualizationCiscoHypervisorManager,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_cisco_hypervisor_manager': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_cisco_hypervisor_manager
)
def __patch_virtualization_esxi_console(
self,
moid,
virtualization_esxi_console,
**kwargs
):
"""Update a 'virtualization.EsxiConsole' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_esxi_console(moid, virtualization_esxi_console, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_esxi_console (VirtualizationEsxiConsole): The 'virtualization.EsxiConsole' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It ensures that the modification a client uploads does not override another change made since the resource was originally fetched. If the precondition cannot be fulfilled, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header should be set to the value of the resource's ModTime property, which indicates when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The client then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET response. [optional]
_return_http_data_only (bool): return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be performed on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be performed on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously. Default is False.
Returns:
VirtualizationEsxiConsole
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_esxi_console'] = \
virtualization_esxi_console
return self.call_with_http_info(**kwargs)
self.patch_virtualization_esxi_console = _Endpoint(
settings={
'response_type': (VirtualizationEsxiConsole,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/EsxiConsoles/{Moid}',
'operation_id': 'patch_virtualization_esxi_console',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_esxi_console',
'if_match',
],
'required': [
'moid',
'virtualization_esxi_console',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_esxi_console':
(VirtualizationEsxiConsole,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_esxi_console': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_esxi_console
)
def __patch_virtualization_host(
self,
moid,
virtualization_host,
**kwargs
):
"""Update a 'virtualization.Host' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_host(moid, virtualization_host, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_host (VirtualizationHost): The 'virtualization.Host' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It ensures that the modification a client uploads does not override another change made since the resource was originally fetched. If the precondition cannot be fulfilled, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header should be set to the value of the resource's ModTime property, which indicates when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The client then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET response. [optional]
_return_http_data_only (bool): return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be performed on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be performed on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously. Default is False.
Returns:
VirtualizationHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_host'] = \
virtualization_host
return self.call_with_http_info(**kwargs)
self.patch_virtualization_host = _Endpoint(
settings={
'response_type': (VirtualizationHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/Hosts/{Moid}',
'operation_id': 'patch_virtualization_host',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_host',
'if_match',
],
'required': [
'moid',
'virtualization_host',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_host':
(VirtualizationHost,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_host': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_host
)
def __patch_virtualization_iwe_cluster(
self,
moid,
virtualization_iwe_cluster,
**kwargs
):
"""Update a 'virtualization.IweCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_iwe_cluster(moid, virtualization_iwe_cluster, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_cluster (VirtualizationIweCluster): The 'virtualization.IweCluster' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It ensures that the modification a client uploads does not override another change made since the resource was originally fetched. If the precondition cannot be fulfilled, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header should be set to the value of the resource's ModTime property, which indicates when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The client then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET response. [optional]
_return_http_data_only (bool): return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be performed on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be performed on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously. Default is False.
Returns:
VirtualizationIweCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_cluster'] = \
virtualization_iwe_cluster
return self.call_with_http_info(**kwargs)
self.patch_virtualization_iwe_cluster = _Endpoint(
settings={
'response_type': (VirtualizationIweCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweClusters/{Moid}',
'operation_id': 'patch_virtualization_iwe_cluster',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_cluster',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_cluster',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_cluster':
(VirtualizationIweCluster,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_cluster': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_iwe_cluster
)
def __patch_virtualization_iwe_datacenter(
self,
moid,
virtualization_iwe_datacenter,
**kwargs
):
"""Update a 'virtualization.IweDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_iwe_datacenter(moid, virtualization_iwe_datacenter, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_datacenter (VirtualizationIweDatacenter): The 'virtualization.IweDatacenter' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It ensures that the modification a client uploads does not override another change made since the resource was originally fetched. If the precondition cannot be fulfilled, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header should be set to the value of the resource's ModTime property, which indicates when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The client then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET response. [optional]
_return_http_data_only (bool): return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be performed on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be performed on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously. Default is False.
Returns:
VirtualizationIweDatacenter
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_datacenter'] = \
virtualization_iwe_datacenter
return self.call_with_http_info(**kwargs)
self.patch_virtualization_iwe_datacenter = _Endpoint(
settings={
'response_type': (VirtualizationIweDatacenter,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDatacenters/{Moid}',
'operation_id': 'patch_virtualization_iwe_datacenter',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_datacenter',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_datacenter',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_datacenter':
(VirtualizationIweDatacenter,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_datacenter': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_iwe_datacenter
)
def __patch_virtualization_iwe_host(
self,
moid,
virtualization_iwe_host,
**kwargs
):
"""Update a 'virtualization.IweHost' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_iwe_host(moid, virtualization_iwe_host, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_host (VirtualizationIweHost): The 'virtualization.IweHost' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It ensures that the modification a client uploads does not override another change made since the resource was originally fetched. If the precondition cannot be fulfilled, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header should be set to the value of the resource's ModTime property, which indicates when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The client then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET response. [optional]
_return_http_data_only (bool): return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be performed on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be performed on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously. Default is False.
Returns:
VirtualizationIweHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_host'] = \
virtualization_iwe_host
return self.call_with_http_info(**kwargs)
self.patch_virtualization_iwe_host = _Endpoint(
settings={
'response_type': (VirtualizationIweHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweHosts/{Moid}',
'operation_id': 'patch_virtualization_iwe_host',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_host',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_host',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_host':
(VirtualizationIweHost,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_host': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_iwe_host
)
def __patch_virtualization_iwe_network(
self,
moid,
virtualization_iwe_network,
**kwargs
):
"""Update a 'virtualization.IweNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_iwe_network(moid, virtualization_iwe_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_network (VirtualizationIweNetwork): The 'virtualization.IweNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It ensures that the modification a client uploads does not override another change made since the resource was originally fetched. If the precondition cannot be fulfilled, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header should be set to the value of the resource's ModTime property, which indicates when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The client then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET response. [optional]
_return_http_data_only (bool): return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be performed on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be performed on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously. Default is False.
Returns:
VirtualizationIweNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_network'] = \
virtualization_iwe_network
return self.call_with_http_info(**kwargs)
self.patch_virtualization_iwe_network = _Endpoint(
settings={
'response_type': (VirtualizationIweNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweNetworks/{Moid}',
'operation_id': 'patch_virtualization_iwe_network',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_network',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_network':
(VirtualizationIweNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_iwe_network
)
def __patch_virtualization_iwe_virtual_disk(
self,
moid,
virtualization_iwe_virtual_disk,
**kwargs
):
"""Update a 'virtualization.IweVirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_iwe_virtual_disk(moid, virtualization_iwe_virtual_disk, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_virtual_disk (VirtualizationIweVirtualDisk): The 'virtualization.IweVirtualDisk' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It ensures that the modification a client uploads does not override another change made since the resource was originally fetched. If the precondition cannot be fulfilled, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header should be set to the value of the resource's ModTime property, which indicates when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The client then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET response. [optional]
_return_http_data_only (bool): return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be performed on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be performed on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously. Default is False.
Returns:
VirtualizationIweVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_virtual_disk'] = \
virtualization_iwe_virtual_disk
return self.call_with_http_info(**kwargs)
self.patch_virtualization_iwe_virtual_disk = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualDisks/{Moid}',
'operation_id': 'patch_virtualization_iwe_virtual_disk',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_virtual_disk',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_virtual_disk',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_virtual_disk':
(VirtualizationIweVirtualDisk,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_virtual_disk': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_iwe_virtual_disk
)
def __patch_virtualization_iwe_virtual_machine(
self,
moid,
virtualization_iwe_virtual_machine,
**kwargs
):
"""Update a 'virtualization.IweVirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_iwe_virtual_machine(moid, virtualization_iwe_virtual_machine, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_virtual_machine (VirtualizationIweVirtualMachine): The 'virtualization.IweVirtualMachine' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It ensures that the modification a client uploads does not override another change made since the resource was originally fetched. If the precondition cannot be fulfilled, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header should be set to the value of the resource's ModTime property, which indicates when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The client then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET response. [optional]
_return_http_data_only (bool): return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a single
number is provided, it is used as the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be performed on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be performed on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously. Default is False.
Returns:
VirtualizationIweVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_virtual_machine'] = \
virtualization_iwe_virtual_machine
return self.call_with_http_info(**kwargs)
self.patch_virtualization_iwe_virtual_machine = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualMachines/{Moid}',
'operation_id': 'patch_virtualization_iwe_virtual_machine',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_virtual_machine',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_virtual_machine',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_virtual_machine':
(VirtualizationIweVirtualMachine,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_virtual_machine': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_iwe_virtual_machine
)
def __patch_virtualization_virtual_disk(
self,
moid,
virtualization_virtual_disk,
**kwargs
):
"""Update a 'virtualization.VirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_virtual_disk(moid, virtualization_virtual_disk, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_virtual_disk (VirtualizationVirtualDisk): The 'virtualization.VirtualDisk' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_virtual_disk'] = \
virtualization_virtual_disk
return self.call_with_http_info(**kwargs)
self.patch_virtualization_virtual_disk = _Endpoint(
settings={
'response_type': (VirtualizationVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualDisks/{Moid}',
'operation_id': 'patch_virtualization_virtual_disk',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_virtual_disk',
'if_match',
],
'required': [
'moid',
'virtualization_virtual_disk',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_virtual_disk':
(VirtualizationVirtualDisk,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_virtual_disk': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_virtual_disk
)
def __patch_virtualization_virtual_machine(
self,
moid,
virtualization_virtual_machine,
**kwargs
):
"""Update a 'virtualization.VirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_virtual_machine(moid, virtualization_virtual_machine, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_virtual_machine (VirtualizationVirtualMachine): The 'virtualization.VirtualMachine' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_virtual_machine'] = \
virtualization_virtual_machine
return self.call_with_http_info(**kwargs)
self.patch_virtualization_virtual_machine = _Endpoint(
settings={
'response_type': (VirtualizationVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualMachines/{Moid}',
'operation_id': 'patch_virtualization_virtual_machine',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_virtual_machine',
'if_match',
],
'required': [
'moid',
'virtualization_virtual_machine',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_virtual_machine':
(VirtualizationVirtualMachine,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_virtual_machine': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_virtual_machine
)
def __patch_virtualization_virtual_network(
self,
moid,
virtualization_virtual_network,
**kwargs
):
"""Update a 'virtualization.VirtualNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_virtual_network(moid, virtualization_virtual_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_virtual_network (VirtualizationVirtualNetwork): The 'virtualization.VirtualNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_virtual_network'] = \
virtualization_virtual_network
return self.call_with_http_info(**kwargs)
self.patch_virtualization_virtual_network = _Endpoint(
settings={
'response_type': (VirtualizationVirtualNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualNetworks/{Moid}',
'operation_id': 'patch_virtualization_virtual_network',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_virtual_network',
'if_match',
],
'required': [
'moid',
'virtualization_virtual_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_virtual_network':
(VirtualizationVirtualNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_virtual_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_virtual_network
)
def __patch_virtualization_vmware_cluster(
self,
moid,
virtualization_vmware_cluster,
**kwargs
):
"""Update a 'virtualization.VmwareCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_cluster(moid, virtualization_vmware_cluster, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_cluster (VirtualizationVmwareCluster): The 'virtualization.VmwareCluster' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_cluster'] = \
virtualization_vmware_cluster
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_cluster = _Endpoint(
settings={
'response_type': (VirtualizationVmwareCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareClusters/{Moid}',
'operation_id': 'patch_virtualization_vmware_cluster',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_cluster',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_cluster',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_cluster':
(VirtualizationVmwareCluster,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_cluster': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_cluster
)
def __patch_virtualization_vmware_datacenter(
self,
moid,
virtualization_vmware_datacenter,
**kwargs
):
"""Update a 'virtualization.VmwareDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_datacenter(moid, virtualization_vmware_datacenter, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_datacenter (VirtualizationVmwareDatacenter): The 'virtualization.VmwareDatacenter' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatacenter
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_datacenter'] = \
virtualization_vmware_datacenter
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_datacenter = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatacenter,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatacenters/{Moid}',
'operation_id': 'patch_virtualization_vmware_datacenter',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_datacenter',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_datacenter',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_datacenter':
(VirtualizationVmwareDatacenter,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_datacenter': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_datacenter
)
def __patch_virtualization_vmware_datastore(
self,
moid,
virtualization_vmware_datastore,
**kwargs
):
"""Update a 'virtualization.VmwareDatastore' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_datastore(moid, virtualization_vmware_datastore, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_datastore (VirtualizationVmwareDatastore): The 'virtualization.VmwareDatastore' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatastore
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_datastore'] = \
virtualization_vmware_datastore
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_datastore = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatastore,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatastores/{Moid}',
'operation_id': 'patch_virtualization_vmware_datastore',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_datastore',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_datastore',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_datastore':
(VirtualizationVmwareDatastore,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_datastore': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_datastore
)
def __patch_virtualization_vmware_datastore_cluster(
self,
moid,
virtualization_vmware_datastore_cluster,
**kwargs
):
"""Update a 'virtualization.VmwareDatastoreCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_datastore_cluster(moid, virtualization_vmware_datastore_cluster, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_datastore_cluster (VirtualizationVmwareDatastoreCluster): The 'virtualization.VmwareDatastoreCluster' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload will not override another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatastoreCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_datastore_cluster'] = \
virtualization_vmware_datastore_cluster
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_datastore_cluster = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatastoreCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatastoreClusters/{Moid}',
'operation_id': 'patch_virtualization_vmware_datastore_cluster',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_datastore_cluster',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_datastore_cluster',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_datastore_cluster':
(VirtualizationVmwareDatastoreCluster,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_datastore_cluster': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_datastore_cluster
)
def __patch_virtualization_vmware_distributed_network(
self,
moid,
virtualization_vmware_distributed_network,
**kwargs
):
"""Update a 'virtualization.VmwareDistributedNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_distributed_network(moid, virtualization_vmware_distributed_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_distributed_network (VirtualizationVmwareDistributedNetwork): The 'virtualization.VmwareDistributedNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been done since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDistributedNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_distributed_network'] = \
virtualization_vmware_distributed_network
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_distributed_network = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDistributedNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDistributedNetworks/{Moid}',
'operation_id': 'patch_virtualization_vmware_distributed_network',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_distributed_network',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_distributed_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_distributed_network':
(VirtualizationVmwareDistributedNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_distributed_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_distributed_network
)
def __patch_virtualization_vmware_distributed_switch(
self,
moid,
virtualization_vmware_distributed_switch,
**kwargs
):
"""Update a 'virtualization.VmwareDistributedSwitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_distributed_switch(moid, virtualization_vmware_distributed_switch, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_distributed_switch (VirtualizationVmwareDistributedSwitch): The 'virtualization.VmwareDistributedSwitch' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been done since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDistributedSwitch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_distributed_switch'] = \
virtualization_vmware_distributed_switch
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_distributed_switch = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDistributedSwitch,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDistributedSwitches/{Moid}',
'operation_id': 'patch_virtualization_vmware_distributed_switch',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_distributed_switch',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_distributed_switch',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_distributed_switch':
(VirtualizationVmwareDistributedSwitch,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_distributed_switch': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_distributed_switch
)
def __patch_virtualization_vmware_folder(
self,
moid,
virtualization_vmware_folder,
**kwargs
):
"""Update a 'virtualization.VmwareFolder' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_folder(moid, virtualization_vmware_folder, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_folder (VirtualizationVmwareFolder): The 'virtualization.VmwareFolder' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been done since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareFolder
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_folder'] = \
virtualization_vmware_folder
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_folder = _Endpoint(
settings={
'response_type': (VirtualizationVmwareFolder,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareFolders/{Moid}',
'operation_id': 'patch_virtualization_vmware_folder',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_folder',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_folder',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_folder':
(VirtualizationVmwareFolder,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_folder': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_folder
)
def __patch_virtualization_vmware_host(
self,
moid,
virtualization_vmware_host,
**kwargs
):
"""Update a 'virtualization.VmwareHost' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_host(moid, virtualization_vmware_host, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_host (VirtualizationVmwareHost): The 'virtualization.VmwareHost' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been done since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_host'] = \
virtualization_vmware_host
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_host = _Endpoint(
settings={
'response_type': (VirtualizationVmwareHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareHosts/{Moid}',
'operation_id': 'patch_virtualization_vmware_host',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_host',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_host',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_host':
(VirtualizationVmwareHost,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_host': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_host
)
def __patch_virtualization_vmware_kernel_network(
self,
moid,
virtualization_vmware_kernel_network,
**kwargs
):
"""Update a 'virtualization.VmwareKernelNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_kernel_network(moid, virtualization_vmware_kernel_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_kernel_network (VirtualizationVmwareKernelNetwork): The 'virtualization.VmwareKernelNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been done since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareKernelNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_kernel_network'] = \
virtualization_vmware_kernel_network
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_kernel_network = _Endpoint(
settings={
'response_type': (VirtualizationVmwareKernelNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareKernelNetworks/{Moid}',
'operation_id': 'patch_virtualization_vmware_kernel_network',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_kernel_network',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_kernel_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_kernel_network':
(VirtualizationVmwareKernelNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_kernel_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_kernel_network
)
def __patch_virtualization_vmware_network(
self,
moid,
virtualization_vmware_network,
**kwargs
):
"""Update a 'virtualization.VmwareNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_network(moid, virtualization_vmware_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_network (VirtualizationVmwareNetwork): The 'virtualization.VmwareNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been done since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_network'] = \
virtualization_vmware_network
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_network = _Endpoint(
settings={
'response_type': (VirtualizationVmwareNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareNetworks/{Moid}',
'operation_id': 'patch_virtualization_vmware_network',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_network',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_network':
(VirtualizationVmwareNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_network
)
def __patch_virtualization_vmware_physical_network_interface(
self,
moid,
virtualization_vmware_physical_network_interface,
**kwargs
):
"""Update a 'virtualization.VmwarePhysicalNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_physical_network_interface(moid, virtualization_vmware_physical_network_interface, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_physical_network_interface (VirtualizationVmwarePhysicalNetworkInterface): The 'virtualization.VmwarePhysicalNetworkInterface' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been done since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwarePhysicalNetworkInterface
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_physical_network_interface'] = \
virtualization_vmware_physical_network_interface
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_physical_network_interface = _Endpoint(
settings={
'response_type': (VirtualizationVmwarePhysicalNetworkInterface,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwarePhysicalNetworkInterfaces/{Moid}',
'operation_id': 'patch_virtualization_vmware_physical_network_interface',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_physical_network_interface',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_physical_network_interface',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_physical_network_interface':
(VirtualizationVmwarePhysicalNetworkInterface,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_physical_network_interface': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_physical_network_interface
)
def __patch_virtualization_vmware_uplink_port(
self,
moid,
virtualization_vmware_uplink_port,
**kwargs
):
"""Update a 'virtualization.VmwareUplinkPort' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_uplink_port(moid, virtualization_vmware_uplink_port, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_uplink_port (VirtualizationVmwareUplinkPort): The 'virtualization.VmwareUplinkPort' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check if the modification of a resource that the user wants to upload will not override another change that has been done since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareUplinkPort
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_uplink_port'] = \
virtualization_vmware_uplink_port
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_uplink_port = _Endpoint(
settings={
'response_type': (VirtualizationVmwareUplinkPort,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareUplinkPorts/{Moid}',
'operation_id': 'patch_virtualization_vmware_uplink_port',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_uplink_port',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_uplink_port',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_uplink_port':
(VirtualizationVmwareUplinkPort,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_uplink_port': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_uplink_port
)
def __patch_virtualization_vmware_virtual_disk(
self,
moid,
virtualization_vmware_virtual_disk,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_virtual_disk(moid, virtualization_vmware_virtual_disk, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_disk (VirtualizationVmwareVirtualDisk): The 'virtualization.VmwareVirtualDisk' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload does not override a change made since the original resource was fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property; ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return only the response data,
without the HTTP status code and headers.
Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_disk'] = \
virtualization_vmware_virtual_disk
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_virtual_disk = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualDisks/{Moid}',
'operation_id': 'patch_virtualization_vmware_virtual_disk',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_disk',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_disk',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_disk':
(VirtualizationVmwareVirtualDisk,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_disk': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_virtual_disk
)
def __patch_virtualization_vmware_virtual_machine(
self,
moid,
virtualization_vmware_virtual_machine,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_virtual_machine(moid, virtualization_vmware_virtual_machine, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_machine (VirtualizationVmwareVirtualMachine): The 'virtualization.VmwareVirtualMachine' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload does not override a change made since the original resource was fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property; ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return only the response data,
without the HTTP status code and headers.
Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_machine'] = \
virtualization_vmware_virtual_machine
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_virtual_machine = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualMachines/{Moid}',
'operation_id': 'patch_virtualization_vmware_virtual_machine',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_machine',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_machine',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_machine':
(VirtualizationVmwareVirtualMachine,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_machine': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_virtual_machine
)
def __patch_virtualization_vmware_virtual_machine_snapshot(
self,
moid,
virtualization_vmware_virtual_machine_snapshot,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualMachineSnapshot' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_virtual_machine_snapshot(moid, virtualization_vmware_virtual_machine_snapshot, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_machine_snapshot (VirtualizationVmwareVirtualMachineSnapshot): The 'virtualization.VmwareVirtualMachineSnapshot' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload does not override a change made since the original resource was fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property; ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return only the response data,
without the HTTP status code and headers.
Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualMachineSnapshot
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_machine_snapshot'] = \
virtualization_vmware_virtual_machine_snapshot
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_virtual_machine_snapshot = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualMachineSnapshot,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualMachineSnapshots/{Moid}',
'operation_id': 'patch_virtualization_vmware_virtual_machine_snapshot',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_machine_snapshot',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_machine_snapshot',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_machine_snapshot':
(VirtualizationVmwareVirtualMachineSnapshot,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_machine_snapshot': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_virtual_machine_snapshot
)
def __patch_virtualization_vmware_virtual_network_interface(
self,
moid,
virtualization_vmware_virtual_network_interface,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_virtual_network_interface(moid, virtualization_vmware_virtual_network_interface, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_network_interface (VirtualizationVmwareVirtualNetworkInterface): The 'virtualization.VmwareVirtualNetworkInterface' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload does not override a change made since the original resource was fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property; ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return only the response data,
without the HTTP status code and headers.
Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualNetworkInterface
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_network_interface'] = \
virtualization_vmware_virtual_network_interface
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_virtual_network_interface = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualNetworkInterface,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualNetworkInterfaces/{Moid}',
'operation_id': 'patch_virtualization_vmware_virtual_network_interface',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_network_interface',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_network_interface',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_network_interface':
(VirtualizationVmwareVirtualNetworkInterface,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_network_interface': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_virtual_network_interface
)
def __patch_virtualization_vmware_virtual_switch(
self,
moid,
virtualization_vmware_virtual_switch,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualSwitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_virtualization_vmware_virtual_switch(moid, virtualization_vmware_virtual_switch, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_switch (VirtualizationVmwareVirtualSwitch): The 'virtualization.VmwareVirtualSwitch' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload does not override a change made since the original resource was fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property; ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return only the response data,
without the HTTP status code and headers.
Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualSwitch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_switch'] = \
virtualization_vmware_virtual_switch
return self.call_with_http_info(**kwargs)
self.patch_virtualization_vmware_virtual_switch = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualSwitch,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualSwitches/{Moid}',
'operation_id': 'patch_virtualization_vmware_virtual_switch',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_switch',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_switch',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_switch':
(VirtualizationVmwareVirtualSwitch,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_switch': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__patch_virtualization_vmware_virtual_switch
)
def __update_virtualization_cisco_hypervisor_manager(
self,
moid,
virtualization_cisco_hypervisor_manager,
**kwargs
):
"""Update a 'virtualization.CiscoHypervisorManager' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_cisco_hypervisor_manager(moid, virtualization_cisco_hypervisor_manager, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_cisco_hypervisor_manager (VirtualizationCiscoHypervisorManager): The 'virtualization.CiscoHypervisorManager' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload does not override a change made since the original resource was fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property; ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return only the response data,
without the HTTP status code and headers.
Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationCiscoHypervisorManager
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_cisco_hypervisor_manager'] = \
virtualization_cisco_hypervisor_manager
return self.call_with_http_info(**kwargs)
self.update_virtualization_cisco_hypervisor_manager = _Endpoint(
settings={
'response_type': (VirtualizationCiscoHypervisorManager,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/CiscoHypervisorManagers/{Moid}',
'operation_id': 'update_virtualization_cisco_hypervisor_manager',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_cisco_hypervisor_manager',
'if_match',
],
'required': [
'moid',
'virtualization_cisco_hypervisor_manager',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_cisco_hypervisor_manager':
(VirtualizationCiscoHypervisorManager,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_cisco_hypervisor_manager': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_cisco_hypervisor_manager
)
def __update_virtualization_esxi_console(
self,
moid,
virtualization_esxi_console,
**kwargs
):
"""Update a 'virtualization.EsxiConsole' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_esxi_console(moid, virtualization_esxi_console, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_esxi_console (VirtualizationEsxiConsole): The 'virtualization.EsxiConsole' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification a user wants to upload does not override a change made since the original resource was fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property; ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return only the response data,
without the HTTP status code and headers.
Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationEsxiConsole
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_esxi_console'] = \
virtualization_esxi_console
return self.call_with_http_info(**kwargs)
self.update_virtualization_esxi_console = _Endpoint(
settings={
'response_type': (VirtualizationEsxiConsole,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/EsxiConsoles/{Moid}',
'operation_id': 'update_virtualization_esxi_console',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_esxi_console',
'if_match',
],
'required': [
'moid',
'virtualization_esxi_console',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_esxi_console':
(VirtualizationEsxiConsole,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_esxi_console': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_esxi_console
)
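# Each generated wrapper above normalizes its keyword arguments before
# delegating to call_with_http_info. The shared defaulting idiom can be
# sketched standalone as follows (apply_request_defaults is an illustrative
# name, not part of this SDK):

```python
def apply_request_defaults(**kwargs):
    """Mirror the kwargs defaulting used by the generated endpoint wrappers."""
    defaults = {
        'async_req': False,              # synchronous HTTP request by default
        '_return_http_data_only': True,  # return body only, not status/headers
        '_preload_content': True,        # read/decode the response body
        '_request_timeout': None,        # no per-request timeout override
        '_check_input_type': True,       # type-check data sent to the server
        '_check_return_type': True,      # type-check data received back
    }
    for key, value in defaults.items():
        kwargs[key] = kwargs.get(key, value)
    # _host_index has no default value: None means "read from the configuration".
    kwargs['_host_index'] = kwargs.get('_host_index')
    return kwargs


resolved = apply_request_defaults(async_req=True, _request_timeout=(3.0, 27.0))
```

# With async_req=True the real wrappers return a request thread whose .get()
# yields the deserialized model, exactly as the docstrings above describe.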
def __update_virtualization_host(
self,
moid,
virtualization_host,
**kwargs
):
"""Update a 'virtualization.Host' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_host(moid, virtualization_host, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_host (VirtualizationHost): The 'virtualization.Host' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check that the modification of a resource that the user wants to upload will not overwrite another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return only the response
data, without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_host'] = \
virtualization_host
return self.call_with_http_info(**kwargs)
self.update_virtualization_host = _Endpoint(
settings={
'response_type': (VirtualizationHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/Hosts/{Moid}',
'operation_id': 'update_virtualization_host',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_host',
'if_match',
],
'required': [
'moid',
'virtualization_host',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_host':
(VirtualizationHost,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_host': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_host
)
def __update_virtualization_iwe_cluster(
self,
moid,
virtualization_iwe_cluster,
**kwargs
):
"""Update a 'virtualization.IweCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_iwe_cluster(moid, virtualization_iwe_cluster, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_cluster (VirtualizationIweCluster): The 'virtualization.IweCluster' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check that the modification of a resource that the user wants to upload will not overwrite another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return only the response
data, without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_cluster'] = \
virtualization_iwe_cluster
return self.call_with_http_info(**kwargs)
self.update_virtualization_iwe_cluster = _Endpoint(
settings={
'response_type': (VirtualizationIweCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweClusters/{Moid}',
'operation_id': 'update_virtualization_iwe_cluster',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_cluster',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_cluster',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_cluster':
(VirtualizationIweCluster,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_cluster': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_iwe_cluster
)
def __update_virtualization_iwe_datacenter(
self,
moid,
virtualization_iwe_datacenter,
**kwargs
):
"""Update a 'virtualization.IweDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_iwe_datacenter(moid, virtualization_iwe_datacenter, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_datacenter (VirtualizationIweDatacenter): The 'virtualization.IweDatacenter' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check that the modification of a resource that the user wants to upload will not overwrite another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return only the response
data, without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweDatacenter
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_datacenter'] = \
virtualization_iwe_datacenter
return self.call_with_http_info(**kwargs)
self.update_virtualization_iwe_datacenter = _Endpoint(
settings={
'response_type': (VirtualizationIweDatacenter,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweDatacenters/{Moid}',
'operation_id': 'update_virtualization_iwe_datacenter',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_datacenter',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_datacenter',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_datacenter':
(VirtualizationIweDatacenter,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_datacenter': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_iwe_datacenter
)
def __update_virtualization_iwe_host(
self,
moid,
virtualization_iwe_host,
**kwargs
):
"""Update a 'virtualization.IweHost' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_iwe_host(moid, virtualization_iwe_host, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_host (VirtualizationIweHost): The 'virtualization.IweHost' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check that the modification of a resource that the user wants to upload will not overwrite another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return only the response
data, without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_host'] = \
virtualization_iwe_host
return self.call_with_http_info(**kwargs)
self.update_virtualization_iwe_host = _Endpoint(
settings={
'response_type': (VirtualizationIweHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweHosts/{Moid}',
'operation_id': 'update_virtualization_iwe_host',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_host',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_host',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_host':
(VirtualizationIweHost,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_host': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_iwe_host
)
def __update_virtualization_iwe_network(
self,
moid,
virtualization_iwe_network,
**kwargs
):
"""Update a 'virtualization.IweNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_iwe_network(moid, virtualization_iwe_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_network (VirtualizationIweNetwork): The 'virtualization.IweNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check that the modification of a resource that the user wants to upload will not overwrite another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return only the response
data, without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_network'] = \
virtualization_iwe_network
return self.call_with_http_info(**kwargs)
self.update_virtualization_iwe_network = _Endpoint(
settings={
'response_type': (VirtualizationIweNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweNetworks/{Moid}',
'operation_id': 'update_virtualization_iwe_network',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_network',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_network':
(VirtualizationIweNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_iwe_network
)
def __update_virtualization_iwe_virtual_disk(
self,
moid,
virtualization_iwe_virtual_disk,
**kwargs
):
"""Update a 'virtualization.IweVirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_iwe_virtual_disk(moid, virtualization_iwe_virtual_disk, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_virtual_disk (VirtualizationIweVirtualDisk): The 'virtualization.IweVirtualDisk' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check that the modification of a resource that the user wants to upload will not overwrite another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return only the response
data, without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_virtual_disk'] = \
virtualization_iwe_virtual_disk
return self.call_with_http_info(**kwargs)
self.update_virtualization_iwe_virtual_disk = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualDisks/{Moid}',
'operation_id': 'update_virtualization_iwe_virtual_disk',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_virtual_disk',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_virtual_disk',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_virtual_disk':
(VirtualizationIweVirtualDisk,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_virtual_disk': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_iwe_virtual_disk
)
def __update_virtualization_iwe_virtual_machine(
self,
moid,
virtualization_iwe_virtual_machine,
**kwargs
):
"""Update a 'virtualization.IweVirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_iwe_virtual_machine(moid, virtualization_iwe_virtual_machine, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_iwe_virtual_machine (VirtualizationIweVirtualMachine): The 'virtualization.IweVirtualMachine' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check that the modification of a resource that the user wants to upload will not overwrite another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return only the response
data, without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationIweVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_iwe_virtual_machine'] = \
virtualization_iwe_virtual_machine
return self.call_with_http_info(**kwargs)
self.update_virtualization_iwe_virtual_machine = _Endpoint(
settings={
'response_type': (VirtualizationIweVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/IweVirtualMachines/{Moid}',
'operation_id': 'update_virtualization_iwe_virtual_machine',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_iwe_virtual_machine',
'if_match',
],
'required': [
'moid',
'virtualization_iwe_virtual_machine',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_iwe_virtual_machine':
(VirtualizationIweVirtualMachine,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_iwe_virtual_machine': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_iwe_virtual_machine
)
def __update_virtualization_virtual_disk(
self,
moid,
virtualization_virtual_disk,
**kwargs
):
"""Update a 'virtualization.VirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_virtual_disk(moid, virtualization_virtual_disk, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_virtual_disk (VirtualizationVirtualDisk): The 'virtualization.VirtualDisk' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It can check that the modification of a resource that the user wants to upload will not overwrite another change made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return only the response
data, without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_virtual_disk'] = \
virtualization_virtual_disk
return self.call_with_http_info(**kwargs)
self.update_virtualization_virtual_disk = _Endpoint(
settings={
'response_type': (VirtualizationVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualDisks/{Moid}',
'operation_id': 'update_virtualization_virtual_disk',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_virtual_disk',
'if_match',
],
'required': [
'moid',
'virtualization_virtual_disk',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_virtual_disk':
(VirtualizationVirtualDisk,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_virtual_disk': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_virtual_disk
)
def __update_virtualization_virtual_machine(
self,
moid,
virtualization_virtual_machine,
**kwargs
):
"""Update a 'virtualization.VirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_virtual_machine(moid, virtualization_virtual_machine, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_virtual_machine (VirtualizationVirtualMachine): The 'virtualization.VirtualMachine' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change that has been made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_virtual_machine'] = \
virtualization_virtual_machine
return self.call_with_http_info(**kwargs)
self.update_virtualization_virtual_machine = _Endpoint(
settings={
'response_type': (VirtualizationVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualMachines/{Moid}',
'operation_id': 'update_virtualization_virtual_machine',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_virtual_machine',
'if_match',
],
'required': [
'moid',
'virtualization_virtual_machine',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_virtual_machine':
(VirtualizationVirtualMachine,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_virtual_machine': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_virtual_machine
)
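The If-Match flow described in the docstrings above (GET the resource, read its ModTime, then POST/PUT with If-Match set to that ModTime) can be sketched against a toy in-memory store. The store, its method names, and the `vm-1` identifier are illustrative stand-ins for the REST service, not part of this SDK:

```python
# Minimal sketch of the lost-update protection the If-Match header provides.
# ToyStore stands in for the REST service; all names here are illustrative.

class PreconditionFailed(Exception):
    """Raised in place of an HTTP 412 (Precondition Failed) response."""

class ToyStore:
    def __init__(self):
        self._resources = {}

    def get(self, moid):
        # Return a copy including the current ModTime, like a GET response.
        return dict(self._resources[moid])

    def put(self, moid, body, if_match=None):
        current = self._resources.get(moid)
        if current is not None and if_match != current['ModTime']:
            raise PreconditionFailed()  # someone else changed it first
        body = dict(body)
        body['ModTime'] = current['ModTime'] + 1 if current else 0
        self._resources[moid] = body
        return dict(body)

store = ToyStore()
store._resources['vm-1'] = {'Name': 'vm-a', 'ModTime': 0}

# Client A and client B both read the resource (both see ModTime 0).
a = store.get('vm-1')
b = store.get('vm-1')

# A updates first, echoing the ModTime it read into If-Match.
store.put('vm-1', {'Name': 'vm-a-renamed'}, if_match=a['ModTime'])

# B's stale update now fails with 412 instead of silently overwriting A's.
try:
    store.put('vm-1', {'Name': 'vm-b-renamed'}, if_match=b['ModTime'])
    raced = False
except PreconditionFailed:
    raced = True
```

Without If-Match, B's write would have clobbered A's rename; with it, B gets a 412 and must re-fetch before retrying.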
def __update_virtualization_virtual_network(
self,
moid,
virtualization_virtual_network,
**kwargs
):
"""Update a 'virtualization.VirtualNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_virtual_network(moid, virtualization_virtual_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_virtual_network (VirtualizationVirtualNetwork): The 'virtualization.VirtualNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change that has been made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVirtualNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_virtual_network'] = \
virtualization_virtual_network
return self.call_with_http_info(**kwargs)
self.update_virtualization_virtual_network = _Endpoint(
settings={
'response_type': (VirtualizationVirtualNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VirtualNetworks/{Moid}',
'operation_id': 'update_virtualization_virtual_network',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_virtual_network',
'if_match',
],
'required': [
'moid',
'virtualization_virtual_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_virtual_network':
(VirtualizationVirtualNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_virtual_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_virtual_network
)
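With async_req=True, each call in the docstring examples returns a handle whose .get() blocks for the result. The sketch below reproduces that pattern with the standard library's ThreadPool; the `fetch` function and its return value are stand-ins for the generated API call, not the SDK's internals:

```python
from multiprocessing.pool import ThreadPool

# Stand-in for an API method: with async_req=True a generated client submits
# the HTTP call to a worker pool and hands back the AsyncResult immediately.
def fetch(moid):
    return {'Moid': moid, 'ObjectType': 'virtualization.VirtualNetwork'}

pool = ThreadPool(4)
thread = pool.apply_async(fetch, ('net-1',))  # returns at once
result = thread.get()                         # blocks until the call finishes
pool.close()
pool.join()
```

This mirrors the `thread = api.update_...(..., async_req=True)` / `result = thread.get()` usage shown in each docstring.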
def __update_virtualization_vmware_cluster(
self,
moid,
virtualization_vmware_cluster,
**kwargs
):
"""Update a 'virtualization.VmwareCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_cluster(moid, virtualization_vmware_cluster, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_cluster (VirtualizationVmwareCluster): The 'virtualization.VmwareCluster' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change that has been made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_cluster'] = \
virtualization_vmware_cluster
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_cluster = _Endpoint(
settings={
'response_type': (VirtualizationVmwareCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareClusters/{Moid}',
'operation_id': 'update_virtualization_vmware_cluster',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_cluster',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_cluster',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_cluster':
(VirtualizationVmwareCluster,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_cluster': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_cluster
)
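The content_type lists above also advertise 'application/json-patch+json', i.e. an RFC 6902 JSON Patch body: a list of operations rather than a full resource document. A minimal applier covering only the 'replace' operation, with illustrative field names, might look like this:

```python
# Minimal applier for the 'replace' operation of RFC 6902 JSON Patch, the
# format behind the 'application/json-patch+json' content type. Handles
# top-level paths only (e.g. '/Name'); a real applier supports nested paths
# and the add/remove/move/copy/test operations as well.
def apply_replace_patch(doc, patch):
    doc = dict(doc)  # leave the caller's document untouched
    for op in patch:
        assert op['op'] == 'replace'  # only 'replace' in this sketch
        key = op['path'].lstrip('/')
        doc[key] = op['value']
    return doc

cluster = {'Name': 'cluster-a', 'HypervisorType': 'ESXi'}
patched = apply_replace_patch(
    cluster,
    [{'op': 'replace', 'path': '/Name', 'value': 'cluster-b'}],
)
```

Sending such a patch body updates only the named properties, whereas an 'application/json' body carries the whole resource.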
def __update_virtualization_vmware_datacenter(
self,
moid,
virtualization_vmware_datacenter,
**kwargs
):
"""Update a 'virtualization.VmwareDatacenter' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_datacenter(moid, virtualization_vmware_datacenter, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_datacenter (VirtualizationVmwareDatacenter): The 'virtualization.VmwareDatacenter' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change that has been made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatacenter
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_datacenter'] = \
virtualization_vmware_datacenter
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_datacenter = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatacenter,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatacenters/{Moid}',
'operation_id': 'update_virtualization_vmware_datacenter',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_datacenter',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_datacenter',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_datacenter':
(VirtualizationVmwareDatacenter,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_datacenter': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_datacenter
)
def __update_virtualization_vmware_datastore(
self,
moid,
virtualization_vmware_datastore,
**kwargs
):
"""Update a 'virtualization.VmwareDatastore' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_datastore(moid, virtualization_vmware_datastore, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_datastore (VirtualizationVmwareDatastore): The 'virtualization.VmwareDatastore' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change that has been made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatastore
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_datastore'] = \
virtualization_vmware_datastore
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_datastore = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatastore,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatastores/{Moid}',
'operation_id': 'update_virtualization_vmware_datastore',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_datastore',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_datastore',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_datastore':
(VirtualizationVmwareDatastore,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_datastore': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_datastore
)
def __update_virtualization_vmware_datastore_cluster(
self,
moid,
virtualization_vmware_datastore_cluster,
**kwargs
):
"""Update a 'virtualization.VmwareDatastoreCluster' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_datastore_cluster(moid, virtualization_vmware_datastore_cluster, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_datastore_cluster (VirtualizationVmwareDatastoreCluster): The 'virtualization.VmwareDatastoreCluster' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change that has been made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDatastoreCluster
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_datastore_cluster'] = \
virtualization_vmware_datastore_cluster
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_datastore_cluster = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDatastoreCluster,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDatastoreClusters/{Moid}',
'operation_id': 'update_virtualization_vmware_datastore_cluster',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_datastore_cluster',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_datastore_cluster',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_datastore_cluster':
(VirtualizationVmwareDatastoreCluster,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_datastore_cluster': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_datastore_cluster
)
def __update_virtualization_vmware_distributed_network(
self,
moid,
virtualization_vmware_distributed_network,
**kwargs
):
"""Update a 'virtualization.VmwareDistributedNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_distributed_network(moid, virtualization_vmware_distributed_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_distributed_network (VirtualizationVmwareDistributedNetwork): The 'virtualization.VmwareDistributedNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks that the modification of a resource that the user wants to upload will not override another change that has been made since the original resource was fetched. If the request cannot be fulfilled, the 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, the If-Match header must be set to the value of the resource's ModTime property, after which no lost update problem should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property of the resource as obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it will be the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDistributedNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_distributed_network'] = \
virtualization_vmware_distributed_network
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_distributed_network = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDistributedNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDistributedNetworks/{Moid}',
'operation_id': 'update_virtualization_vmware_distributed_network',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_distributed_network',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_distributed_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_distributed_network':
(VirtualizationVmwareDistributedNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_distributed_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_distributed_network
)
def __update_virtualization_vmware_distributed_switch(
self,
moid,
virtualization_vmware_distributed_switch,
**kwargs
):
"""Update a 'virtualization.VmwareDistributedSwitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_distributed_switch(moid, virtualization_vmware_distributed_switch, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_distributed_switch (VirtualizationVmwareDistributedSwitch): The 'virtualization.VmwareDistributedSwitch' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks whether a modification the user wants to upload would override another change made since the original resource was fetched. If the precondition fails, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, set the If-Match header to the value of the resource ModTime property, after which no lost update should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data without the
HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is treated as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareDistributedSwitch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_distributed_switch'] = \
virtualization_vmware_distributed_switch
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_distributed_switch = _Endpoint(
settings={
'response_type': (VirtualizationVmwareDistributedSwitch,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareDistributedSwitches/{Moid}',
'operation_id': 'update_virtualization_vmware_distributed_switch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_distributed_switch',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_distributed_switch',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_distributed_switch':
(VirtualizationVmwareDistributedSwitch,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_distributed_switch': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_distributed_switch
)
def __update_virtualization_vmware_folder(
self,
moid,
virtualization_vmware_folder,
**kwargs
):
"""Update a 'virtualization.VmwareFolder' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_folder(moid, virtualization_vmware_folder, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_folder (VirtualizationVmwareFolder): The 'virtualization.VmwareFolder' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks whether a modification the user wants to upload would override another change made since the original resource was fetched. If the precondition fails, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, set the If-Match header to the value of the resource ModTime property, after which no lost update should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data without the
HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is treated as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareFolder
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_folder'] = \
virtualization_vmware_folder
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_folder = _Endpoint(
settings={
'response_type': (VirtualizationVmwareFolder,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareFolders/{Moid}',
'operation_id': 'update_virtualization_vmware_folder',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_folder',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_folder',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_folder':
(VirtualizationVmwareFolder,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_folder': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_folder
)
def __update_virtualization_vmware_host(
self,
moid,
virtualization_vmware_host,
**kwargs
):
"""Update a 'virtualization.VmwareHost' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_host(moid, virtualization_vmware_host, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_host (VirtualizationVmwareHost): The 'virtualization.VmwareHost' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks whether a modification the user wants to upload would override another change made since the original resource was fetched. If the precondition fails, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, set the If-Match header to the value of the resource ModTime property, after which no lost update should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data without the
HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is treated as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareHost
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_host'] = \
virtualization_vmware_host
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_host = _Endpoint(
settings={
'response_type': (VirtualizationVmwareHost,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareHosts/{Moid}',
'operation_id': 'update_virtualization_vmware_host',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_host',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_host',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_host':
(VirtualizationVmwareHost,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_host': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_host
)
def __update_virtualization_vmware_kernel_network(
self,
moid,
virtualization_vmware_kernel_network,
**kwargs
):
"""Update a 'virtualization.VmwareKernelNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_kernel_network(moid, virtualization_vmware_kernel_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_kernel_network (VirtualizationVmwareKernelNetwork): The 'virtualization.VmwareKernelNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks whether a modification the user wants to upload would override another change made since the original resource was fetched. If the precondition fails, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, set the If-Match header to the value of the resource ModTime property, after which no lost update should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data without the
HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is treated as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareKernelNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_kernel_network'] = \
virtualization_vmware_kernel_network
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_kernel_network = _Endpoint(
settings={
'response_type': (VirtualizationVmwareKernelNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareKernelNetworks/{Moid}',
'operation_id': 'update_virtualization_vmware_kernel_network',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_kernel_network',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_kernel_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_kernel_network':
(VirtualizationVmwareKernelNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_kernel_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_kernel_network
)
def __update_virtualization_vmware_network(
self,
moid,
virtualization_vmware_network,
**kwargs
):
"""Update a 'virtualization.VmwareNetwork' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_network(moid, virtualization_vmware_network, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_network (VirtualizationVmwareNetwork): The 'virtualization.VmwareNetwork' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks whether a modification the user wants to upload would override another change made since the original resource was fetched. If the precondition fails, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, set the If-Match header to the value of the resource ModTime property, after which no lost update should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data without the
HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is treated as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareNetwork
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_network'] = \
virtualization_vmware_network
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_network = _Endpoint(
settings={
'response_type': (VirtualizationVmwareNetwork,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareNetworks/{Moid}',
'operation_id': 'update_virtualization_vmware_network',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_network',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_network',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_network':
(VirtualizationVmwareNetwork,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_network': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_network
)
def __update_virtualization_vmware_physical_network_interface(
self,
moid,
virtualization_vmware_physical_network_interface,
**kwargs
):
"""Update a 'virtualization.VmwarePhysicalNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_physical_network_interface(moid, virtualization_vmware_physical_network_interface, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_physical_network_interface (VirtualizationVmwarePhysicalNetworkInterface): The 'virtualization.VmwarePhysicalNetworkInterface' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks whether a modification the user wants to upload would override another change made since the original resource was fetched. If the precondition fails, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, set the If-Match header to the value of the resource ModTime property, after which no lost update should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data without the
HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is treated as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwarePhysicalNetworkInterface
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_physical_network_interface'] = \
virtualization_vmware_physical_network_interface
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_physical_network_interface = _Endpoint(
settings={
'response_type': (VirtualizationVmwarePhysicalNetworkInterface,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwarePhysicalNetworkInterfaces/{Moid}',
'operation_id': 'update_virtualization_vmware_physical_network_interface',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_physical_network_interface',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_physical_network_interface',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_physical_network_interface':
(VirtualizationVmwarePhysicalNetworkInterface,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_physical_network_interface': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_physical_network_interface
)
def __update_virtualization_vmware_uplink_port(
self,
moid,
virtualization_vmware_uplink_port,
**kwargs
):
"""Update a 'virtualization.VmwareUplinkPort' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_uplink_port(moid, virtualization_vmware_uplink_port, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_uplink_port (VirtualizationVmwareUplinkPort): The 'virtualization.VmwareUplinkPort' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks whether a modification the user wants to upload would override another change made since the original resource was fetched. If the precondition fails, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, set the If-Match header to the value of the resource ModTime property, after which no lost update should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data without the
HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is treated as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareUplinkPort
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_uplink_port'] = \
virtualization_vmware_uplink_port
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_uplink_port = _Endpoint(
settings={
'response_type': (VirtualizationVmwareUplinkPort,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareUplinkPorts/{Moid}',
'operation_id': 'update_virtualization_vmware_uplink_port',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_uplink_port',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_uplink_port',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_uplink_port':
(VirtualizationVmwareUplinkPort,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_uplink_port': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_uplink_port
)
def __update_virtualization_vmware_virtual_disk(
self,
moid,
virtualization_vmware_virtual_disk,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualDisk' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_virtual_disk(moid, virtualization_vmware_virtual_disk, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_disk (VirtualizationVmwareVirtualDisk): The 'virtualization.VmwareVirtualDisk' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, and in particular for PUT, If-Match can be used to prevent the lost update problem. It checks whether a modification the user wants to upload would override another change made since the original resource was fetched. If the precondition fails, a 412 (Precondition Failed) response is returned. When modifying a resource using POST or PUT, set the If-Match header to the value of the resource ModTime property, after which no lost update should occur. For example, a client sends a GET request to obtain a resource, which includes the ModTime property. The ModTime indicates the last time the resource was created or modified. The client then sends a POST or PUT request with the If-Match header set to the ModTime property obtained in the GET request. [optional]
_return_http_data_only (bool): return the response data without the
HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is treated as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualDisk
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_disk'] = \
virtualization_vmware_virtual_disk
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_virtual_disk = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualDisk,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualDisks/{Moid}',
'operation_id': 'update_virtualization_vmware_virtual_disk',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_disk',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_disk',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_disk':
(VirtualizationVmwareVirtualDisk,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_disk': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_virtual_disk
)
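# The If-Match flow described in the docstring above, as a commented usage
# sketch (illustrative only; assumes an authenticated api_client, and the
# get_*_by_moid accessor and disk_size property names are assumptions not
# verified against this module):
#
#   api = VirtualizationApi(api_client)
#   disk = api.get_virtualization_vmware_virtual_disk_by_moid(moid)
#   disk.disk_size = '100G'  # hypothetical property change
#   api.update_virtualization_vmware_virtual_disk(
#       moid, disk, if_match=str(disk.mod_time))
#   # A stale ModTime in If-Match yields 412 Precondition Failed instead
#   # of silently overwriting a concurrent change (lost update).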
def __update_virtualization_vmware_virtual_machine(
self,
moid,
virtualization_vmware_virtual_machine,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualMachine' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_virtual_machine(moid, virtualization_vmware_virtual_machine, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_machine (VirtualizationVmwareVirtualMachine): The 'virtualization.VmwareVirtualMachine' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, in particular PUT, If-Match can be used to prevent the lost update problem: it verifies that the modification a client uploads does not override a change made since the resource was originally fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource via POST or PUT, set the If-Match header to the resource's ModTime property, which records when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, including its ModTime, and then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualMachine
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_machine'] = \
virtualization_vmware_virtual_machine
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_virtual_machine = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualMachine,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualMachines/{Moid}',
'operation_id': 'update_virtualization_vmware_virtual_machine',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_machine',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_machine',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_machine':
(VirtualizationVmwareVirtualMachine,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_machine': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_virtual_machine
)
def __update_virtualization_vmware_virtual_machine_snapshot(
self,
moid,
virtualization_vmware_virtual_machine_snapshot,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualMachineSnapshot' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_virtual_machine_snapshot(moid, virtualization_vmware_virtual_machine_snapshot, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_machine_snapshot (VirtualizationVmwareVirtualMachineSnapshot): The 'virtualization.VmwareVirtualMachineSnapshot' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, in particular PUT, If-Match can be used to prevent the lost update problem: it verifies that the modification a client uploads does not override a change made since the resource was originally fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource via POST or PUT, set the If-Match header to the resource's ModTime property, which records when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, including its ModTime, and then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualMachineSnapshot
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_machine_snapshot'] = \
virtualization_vmware_virtual_machine_snapshot
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_virtual_machine_snapshot = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualMachineSnapshot,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualMachineSnapshots/{Moid}',
'operation_id': 'update_virtualization_vmware_virtual_machine_snapshot',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_machine_snapshot',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_machine_snapshot',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_machine_snapshot':
(VirtualizationVmwareVirtualMachineSnapshot,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_machine_snapshot': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_virtual_machine_snapshot
)
def __update_virtualization_vmware_virtual_network_interface(
self,
moid,
virtualization_vmware_virtual_network_interface,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualNetworkInterface' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_virtual_network_interface(moid, virtualization_vmware_virtual_network_interface, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_network_interface (VirtualizationVmwareVirtualNetworkInterface): The 'virtualization.VmwareVirtualNetworkInterface' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, in particular PUT, If-Match can be used to prevent the lost update problem: it verifies that the modification a client uploads does not override a change made since the resource was originally fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource via POST or PUT, set the If-Match header to the resource's ModTime property, which records when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, including its ModTime, and then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualNetworkInterface
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_network_interface'] = \
virtualization_vmware_virtual_network_interface
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_virtual_network_interface = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualNetworkInterface,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualNetworkInterfaces/{Moid}',
'operation_id': 'update_virtualization_vmware_virtual_network_interface',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_network_interface',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_network_interface',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_network_interface':
(VirtualizationVmwareVirtualNetworkInterface,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_network_interface': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_virtual_network_interface
)
def __update_virtualization_vmware_virtual_switch(
self,
moid,
virtualization_vmware_virtual_switch,
**kwargs
):
"""Update a 'virtualization.VmwareVirtualSwitch' resource. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_virtualization_vmware_virtual_switch(moid, virtualization_vmware_virtual_switch, async_req=True)
>>> result = thread.get()
Args:
moid (str): The unique Moid identifier of a resource instance.
virtualization_vmware_virtual_switch (VirtualizationVmwareVirtualSwitch): The 'virtualization.VmwareVirtualSwitch' resource to update.
Keyword Args:
if_match (str): For methods that apply server-side changes, in particular PUT, If-Match can be used to prevent the lost update problem: it verifies that the modification a client uploads does not override a change made since the resource was originally fetched. If the precondition fails, the 412 (Precondition Failed) response is returned. When modifying a resource via POST or PUT, set the If-Match header to the resource's ModTime property, which records when the resource was last created or modified. For example, a client sends a GET request to obtain a resource, including its ModTime, and then sends a POST or PUT request with the If-Match header set to the ModTime value obtained in the GET request. [optional]
_return_http_data_only (bool): if True, return the response data only,
without the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
VirtualizationVmwareVirtualSwitch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['moid'] = \
moid
kwargs['virtualization_vmware_virtual_switch'] = \
virtualization_vmware_virtual_switch
return self.call_with_http_info(**kwargs)
self.update_virtualization_vmware_virtual_switch = _Endpoint(
settings={
'response_type': (VirtualizationVmwareVirtualSwitch,),
'auth': [
'cookieAuth',
'http_signature',
'oAuth2'
],
'endpoint_path': '/api/v1/virtualization/VmwareVirtualSwitches/{Moid}',
'operation_id': 'update_virtualization_vmware_virtual_switch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'moid',
'virtualization_vmware_virtual_switch',
'if_match',
],
'required': [
'moid',
'virtualization_vmware_virtual_switch',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'moid':
(str,),
'virtualization_vmware_virtual_switch':
(VirtualizationVmwareVirtualSwitch,),
'if_match':
(str,),
},
'attribute_map': {
'moid': 'Moid',
'if_match': 'If-Match',
},
'location_map': {
'moid': 'path',
'virtualization_vmware_virtual_switch': 'body',
'if_match': 'header',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/json-patch+json'
]
},
api_client=api_client,
callable=__update_virtualization_vmware_virtual_switch
)
# Copyright (c) 2016-2017, QIIME 2 development team.
#
# Distributed under the terms of the Modified BSD License.
#
# The full license is in the file LICENSE, distributed with this software.
# ----------------------------------------------------------------------------
import unittest
import io
import os
import tempfile
import glob
import collections
import skbio
import numpy as np
import numpy.testing as npt
from biom.table import Table
import pandas as pd
import qiime2
from q2_diversity import (beta, beta_phylogenetic, bioenv,
beta_group_significance, beta_correlation)
from q2_diversity._beta._visualizer import (_get_distance_boxplot_data,
_metadata_distance)
class BetaDiversityTests(unittest.TestCase):
def test_beta(self):
t = Table(np.array([[0, 1, 3], [1, 1, 2]]),
['O1', 'O2'],
['S1', 'S2', 'S3'])
actual = beta(table=t, metric='braycurtis')
# expected computed with scipy.spatial.distance.braycurtis
expected = skbio.DistanceMatrix([[0.0000000, 0.3333333, 0.6666667],
[0.3333333, 0.0000000, 0.4285714],
[0.6666667, 0.4285714, 0.0000000]],
ids=['S1', 'S2', 'S3'])
self.assertEqual(actual.ids, expected.ids)
for id1 in actual.ids:
for id2 in actual.ids:
npt.assert_almost_equal(actual[id1, id2], expected[id1, id2])
def test_beta_phylo_metric(self):
t = Table(np.array([[0, 1, 3], [1, 1, 2]]),
['O1', 'O2'],
['S1', 'S2', 'S3'])
with self.assertRaises(ValueError):
beta(table=t, metric='unweighted_unifrac')
def test_beta_unknown_metric(self):
t = Table(np.array([[0, 1, 3], [1, 1, 2]]),
['O1', 'O2'],
['S1', 'S2', 'S3'])
with self.assertRaises(ValueError):
beta(table=t, metric='not-a-metric')
def test_beta_phylogenetic(self):
t = Table(np.array([[0, 1, 3], [1, 1, 2]]),
['O1', 'O2'],
['S1', 'S2', 'S3'])
tree = skbio.TreeNode.read(io.StringIO(
'((O1:0.25, O2:0.50):0.25, O3:0.75)root;'))
actual = beta_phylogenetic(
table=t, phylogeny=tree, metric='unweighted_unifrac')
# expected computed with skbio.diversity.beta_diversity
expected = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['S1', 'S2', 'S3'])
self.assertEqual(actual.ids, expected.ids)
for id1 in actual.ids:
for id2 in actual.ids:
npt.assert_almost_equal(actual[id1, id2], expected[id1, id2])
def test_beta_phylogenetic_non_phylo_metric(self):
t = Table(np.array([[0, 1, 3], [1, 1, 2]]),
['O1', 'O2'],
['S1', 'S2', 'S3'])
tree = skbio.TreeNode.read(io.StringIO(
'((O1:0.25, O2:0.50):0.25, O3:0.75)root;'))
with self.assertRaises(ValueError):
beta_phylogenetic(table=t, phylogeny=tree, metric='braycurtis')
def test_beta_phylogenetic_unknown_metric(self):
t = Table(np.array([[0, 1, 3], [1, 1, 2]]),
['O1', 'O2'],
['S1', 'S2', 'S3'])
tree = skbio.TreeNode.read(io.StringIO(
'((O1:0.25, O2:0.50):0.25, O3:0.75)root;'))
with self.assertRaises(ValueError):
beta_phylogenetic(table=t, phylogeny=tree, metric='not-a-metric')
def test_beta_phylogenetic_skbio_error_rewriting(self):
t = Table(np.array([[0, 1, 3], [1, 1, 2]]),
['O1', 'O2'],
['S1', 'S2', 'S3'])
tree = skbio.TreeNode.read(io.StringIO(
'((O1:0.25):0.25, O3:0.75)root;'))
# Verify through regex that there is a ``feature_ids`` substring
# followed by a ``phylogeny``
with self.assertRaisesRegex(skbio.tree.MissingNodeError,
'feature_ids.*phylogeny'):
beta_phylogenetic(table=t, phylogeny=tree,
metric='weighted_unifrac')
class BioenvTests(unittest.TestCase):
def test_bioenv(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.Metadata(
pd.DataFrame([['1.0', 'a'], ['2.0', 'b'], ['3.0', 'c']],
index=['sample1', 'sample2', 'sample3'],
columns=['metadata1', 'metadata2']))
with tempfile.TemporaryDirectory() as output_dir:
bioenv(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
self.assertTrue('metadata1' in open(index_fp).read())
self.assertTrue('not numerical' in open(index_fp).read())
self.assertTrue('<strong>metadata2' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
def test_bioenv_exclude_missing_data(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.Metadata(
pd.DataFrame([['1.0', '2.0'], ['2.0', ''], ['3.0', '42.0']],
index=['sample1', 'sample2', 'sample3'],
columns=['metadata1', 'metadata2']))
with tempfile.TemporaryDirectory() as output_dir:
bioenv(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
self.assertTrue('metadata1' in open(index_fp).read())
self.assertTrue('metadata2' in open(index_fp).read())
self.assertTrue('Warning' in open(index_fp).read())
self.assertTrue('contained 3 samples' in open(index_fp).read())
self.assertTrue('2 samples' in open(index_fp).read())
def test_bioenv_extra_metadata(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.Metadata(
pd.DataFrame([['1.0', 'a'], ['2.0', 'b'], ['3.0', 'c'],
['4.0', 'd']],
index=['sample1', 'sample2', 'sample3', 'sample4'],
columns=['metadata1', 'metadata2']))
with tempfile.TemporaryDirectory() as output_dir:
bioenv(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
self.assertTrue('metadata1' in open(index_fp).read())
self.assertTrue('not numerical' in open(index_fp).read())
self.assertTrue('<strong>metadata2' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
def test_bioenv_zero_variance_column(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.Metadata(
pd.DataFrame([['1.0', '2.0'], ['2.0', '2.0'], ['3.0', '2.0']],
index=['sample1', 'sample2', 'sample3'],
columns=['metadata1', 'metadata2']))
with tempfile.TemporaryDirectory() as output_dir:
bioenv(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue('metadata1' in open(index_fp).read())
self.assertTrue('no variance' in open(index_fp).read())
self.assertTrue('<strong>metadata2' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
class BetaGroupSignificanceTests(unittest.TestCase):
def test_permanova(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series(['a', 'b', 'b'], name='a or b',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
# all expected boxplots are generated
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'a-boxplots.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'a-boxplots.png')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'b-boxplots.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'b-boxplots.png')))
# no extra boxplots are generated
self.assertEqual(len(glob.glob('%s/*-boxplots.pdf' % output_dir)),
2)
self.assertEqual(len(glob.glob('%s/*-boxplots.png' % output_dir)),
2)
self.assertTrue('PERMANOVA results' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
self.assertFalse('Pairwise permanova' in open(index_fp).read())
def test_anosim(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series(['a', 'b', 'b'], name='a or b',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md, method='anosim',
permutations=42)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
# all expected boxplots are generated
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'a-boxplots.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'a-boxplots.png')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'b-boxplots.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'b-boxplots.png')))
# no extra boxplots are generated
self.assertEqual(len(glob.glob('%s/*-boxplots.pdf' % output_dir)),
2)
self.assertEqual(len(glob.glob('%s/*-boxplots.png' % output_dir)),
2)
self.assertTrue('ANOSIM results' in open(index_fp).read())
self.assertTrue('<td>42</td>' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
self.assertFalse('Pairwise anosim' in open(index_fp).read())
def test_permanova_pairwise(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series(['a', 'b', 'b'], name='a or b',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md, pairwise=True)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
# all expected boxplots are generated
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'a-boxplots.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'a-boxplots.png')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'b-boxplots.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'b-boxplots.png')))
# no extra boxplots are generated
self.assertEqual(len(glob.glob('%s/*-boxplots.pdf' % output_dir)),
2)
self.assertEqual(len(glob.glob('%s/*-boxplots.png' % output_dir)),
2)
self.assertTrue('PERMANOVA results' in open(index_fp).read())
self.assertTrue('Pairwise permanova' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
def test_anosim_pairwise(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series(['a', 'b', 'b'], name='a or b',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md, method='anosim',
permutations=42, pairwise=True)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
# all expected boxplots are generated
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'a-boxplots.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'a-boxplots.png')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'b-boxplots.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir, 'b-boxplots.png')))
# no extra boxplots are generated
self.assertEqual(len(glob.glob('%s/*-boxplots.pdf' % output_dir)),
2)
self.assertEqual(len(glob.glob('%s/*-boxplots.png' % output_dir)),
2)
self.assertTrue('ANOSIM results' in open(index_fp).read())
self.assertTrue('<td>42</td>' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
self.assertTrue('Pairwise anosim' in open(index_fp).read())
def test_alt_permutations(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series(['a', 'b', 'b'], name='a or b',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md, permutations=42)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue('<td>42</td>' in open(index_fp).read())
def test_invalid_method(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series(['a', 'b', 'b'], name='a or b',
index=['sample1', 'sample2', 'sample3']))
with self.assertRaises(ValueError):
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md, method='bad!')
def test_filtered_samples_numeric_metadata(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25, 0.66],
[0.25, 0.00, 0.00, 0.66],
[0.25, 0.00, 0.00, 0.66],
[0.66, 0.66, 0.66, 0.00]],
ids=['sample1', 'sample2', 'sample3',
'sample4'])
md = qiime2.MetadataCategory(
pd.Series(['1.0', '2.0', '2.0', ''], name='a or b',
index=['sample1', 'sample2', 'sample3', 'sample4']))
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue('Warning' in open(index_fp).read())
def test_filtered_samples_str_metadata(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25, 0.66],
[0.25, 0.00, 0.00, 0.66],
[0.25, 0.00, 0.00, 0.66],
[0.66, 0.66, 0.66, 0.00]],
ids=['sample1', 'sample2', 'sample3',
'sample4'])
md = qiime2.MetadataCategory(
pd.Series(['a', 'b', 'b', ''], name='a or b',
index=['sample1', 'sample2', 'sample3', 'sample4']))
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue('Warning' in open(index_fp).read())
def test_extra_metadata(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series(['a', 'b', 'b', 'c'], name='a or b',
index=['sample1', 'sample2', 'sample3', 'sample4']))
with tempfile.TemporaryDirectory() as output_dir:
beta_group_significance(output_dir, dm, md, permutations=42)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue('<td>2</td>' in open(index_fp).read())
def test_get_distance_boxplot_data_two_groups(self):
dm = skbio.DistanceMatrix([[0.00, 0.12, 0.13, 0.14, 0.15],
[0.12, 0.00, 0.22, 0.23, 0.24],
[0.13, 0.22, 0.00, 0.31, 0.32],
[0.14, 0.23, 0.31, 0.00, 0.44],
[0.15, 0.24, 0.32, 0.44, 0.00]],
ids=['s1', 's2', 's3', 's4', 's5'])
groupings = collections.OrderedDict(
[('g1', ['s1', 's2']), ('g2', ['s3', 's4', 's5'])])
obs = _get_distance_boxplot_data(dm, 'g1', groupings)
exp_data = [[0.12], [0.13, 0.14, 0.15, 0.22, 0.23, 0.24]]
exp_labels = ['g1 (n=1)', 'g2 (n=6)']
self.assertEqual(obs[0], exp_data)
self.assertEqual(obs[1], exp_labels)
def test_get_distance_boxplot_data_within_always_first(self):
dm = skbio.DistanceMatrix([[0.00, 0.12, 0.13, 0.14, 0.15],
[0.12, 0.00, 0.22, 0.23, 0.24],
[0.13, 0.22, 0.00, 0.31, 0.32],
[0.14, 0.23, 0.31, 0.00, 0.44],
[0.15, 0.24, 0.32, 0.44, 0.00]],
ids=['s1', 's2', 's3', 's4', 's5'])
groupings = collections.OrderedDict(
[('g2', ['s3', 's4', 's5']), ('g1', ['s1', 's2'])])
obs = _get_distance_boxplot_data(dm, 'g1', groupings)
exp_data = [[0.12], [0.13, 0.14, 0.15, 0.22, 0.23, 0.24]]
exp_labels = ['g1 (n=1)', 'g2 (n=6)']
self.assertEqual(obs[0], exp_data)
self.assertEqual(obs[1], exp_labels)
def test_get_distance_boxplot_data_three_groups(self):
dm = skbio.DistanceMatrix([[0.00, 0.12, 0.13, 0.14, 0.15],
[0.12, 0.00, 0.22, 0.23, 0.24],
[0.13, 0.22, 0.00, 0.31, 0.32],
[0.14, 0.23, 0.31, 0.00, 0.44],
[0.15, 0.24, 0.32, 0.44, 0.00]],
ids=['s1', 's2', 's3', 's4', 's5'])
groupings = collections.OrderedDict(
[('g1', ['s1', 's2']), ('g2', ['s3', 's5']), ('g3', ['s4'])])
obs = _get_distance_boxplot_data(dm, 'g1', groupings)
exp_data = [[0.12], [0.13, 0.15, 0.22, 0.24], [0.14, 0.23]]
exp_labels = ['g1 (n=1)', 'g2 (n=4)', 'g3 (n=2)']
self.assertEqual(obs[0], exp_data)
self.assertEqual(obs[1], exp_labels)
def test_get_distance_boxplot_data_between_order_retained(self):
dm = skbio.DistanceMatrix([[0.00, 0.12, 0.13, 0.14, 0.15],
[0.12, 0.00, 0.22, 0.23, 0.24],
[0.13, 0.22, 0.00, 0.31, 0.32],
[0.14, 0.23, 0.31, 0.00, 0.44],
[0.15, 0.24, 0.32, 0.44, 0.00]],
ids=['s1', 's2', 's3', 's4', 's5'])
groupings = collections.OrderedDict(
[('g1', ['s1', 's2']), ('g3', ['s4']), ('g2', ['s3', 's5'])])
obs = _get_distance_boxplot_data(dm, 'g1', groupings)
exp_data = [[0.12], [0.14, 0.23], [0.13, 0.15, 0.22, 0.24]]
exp_labels = ['g1 (n=1)', 'g3 (n=2)', 'g2 (n=4)']
self.assertEqual(obs[0], exp_data)
self.assertEqual(obs[1], exp_labels)
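The boxplot-data tests above all check the same contract: for a focal group, the helper returns the within-group distances first, then the between-group distances to every other group in grouping order, with labels reporting the number of distances. A minimal stdlib sketch of that behavior (a hypothetical re-implementation, using a plain dict of pairwise distances instead of a skbio.DistanceMatrix):

```python
from itertools import combinations

def distance_boxplot_data(dist, group, groupings):
    """Sketch: collect within-group distances for ``group`` first, then
    between-group distances to each other group in ``groupings`` order.

    ``dist`` maps frozenset({id1, id2}) -> distance; ``groupings`` is an
    ordered mapping of group name -> list of sample ids.
    """
    members = groupings[group]
    # Within-group distances always come first, regardless of key order.
    within = [dist[frozenset(p)] for p in combinations(members, 2)]
    data = [within]
    labels = ['%s (n=%d)' % (group, len(within))]
    for other, other_members in groupings.items():
        if other == group:
            continue
        between = [dist[frozenset((a, b))]
                   for a in members for b in other_members]
        data.append(between)
        labels.append('%s (n=%d)' % (other, len(between)))
    return data, labels
```

Run against the two-group fixture above, this yields `[[0.12], [0.13, 0.14, 0.15, 0.22, 0.23, 0.24]]` with labels `['g1 (n=1)', 'g2 (n=6)']`, matching the expected values in the tests.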
class BetaCorrelationTests(unittest.TestCase):

    def test_metadata_distance_int(self):
        md = pd.Series([1, 2, 3], name='number',
                       index=['sample1', 'sample2', 'sample3'])
        exp = skbio.DistanceMatrix([[0, 1, 2],
                                    [1, 0, 1],
                                    [2, 1, 0]],
                                   ids=['sample1', 'sample2', 'sample3'])
        obs = _metadata_distance(md)
        self.assertEqual(exp, obs)

    def test_metadata_distance_float(self):
        md = pd.Series([1.5, 2.0, 3.0], name='number',
                       index=['sample1', 'sample2', 'sample3'])
        exp = skbio.DistanceMatrix([[0.0, 0.5, 1.5],
                                    [0.5, 0.0, 1.0],
                                    [1.5, 1.0, 0.0]],
                                   ids=['sample1', 'sample2', 'sample3'])
        obs = _metadata_distance(md)
        self.assertEqual(exp, obs)

    def test_metadata_distance_one_sample(self):
        md = pd.Series([1.5], name='number',
                       index=['sample1'])
        exp = skbio.DistanceMatrix([[0.0]],
                                   ids=['sample1'])
        obs = _metadata_distance(md)
        self.assertEqual(exp, obs)
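The `_metadata_distance` tests above pin down a simple rule: the distance between two samples is the absolute difference of their numeric metadata values. A hedged stand-in using plain nested lists (the real function returns a `skbio.DistanceMatrix`):

```python
def metadata_distance(values):
    """Sketch: build a pairwise distance matrix from a 1-D numeric
    metadata column, where d(i, j) = |values[i] - values[j]|.
    Hypothetical re-implementation of _metadata_distance for illustration.
    """
    n = len(values)
    return [[abs(values[i] - values[j]) for j in range(n)]
            for i in range(n)]
```

For the float fixture `[1.5, 2.0, 3.0]` this reproduces the expected matrix `[[0.0, 0.5, 1.5], [0.5, 0.0, 1.0], [1.5, 1.0, 0.0]]`, and a single sample degenerates to `[[0.0]]` as in the one-sample test.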
def test_basic(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series([1, 2, 3], name='number',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
beta_correlation(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
# all expected plots are generated
self.assertTrue(os.path.exists(
os.path.join(output_dir,
'beta-correlation-scatter.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir,
'beta-correlation-scatter.png')))
self.assertTrue('Mantel test results' in open(index_fp).read())
self.assertTrue('Spearman rho' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
def test_warning_on_extra_metadata(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series([1, 2, 3, 4], name='number',
index=['sample1', 'sample2', 'sample3', 'sample4']))
with tempfile.TemporaryDirectory() as output_dir:
beta_correlation(output_dir, dm, md)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
# all expected plots are generated
self.assertTrue(os.path.exists(
os.path.join(output_dir,
'beta-correlation-scatter.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir,
'beta-correlation-scatter.png')))
self.assertTrue('Mantel test results' in open(index_fp).read())
self.assertTrue('Spearman rho' in open(index_fp).read())
self.assertTrue('Warning' in open(index_fp).read())
def test_error_on_missing_metadata(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series([1, 2], name='number',
index=['sample1', 'sample2']))
with tempfile.TemporaryDirectory() as output_dir:
with self.assertRaisesRegex(ValueError, 'no data: sample3'):
beta_correlation(output_dir, dm, md)
def test_error_on_nan_metadata(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series([1.0, 2.0, ''], name='number',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
with self.assertRaisesRegex(ValueError, 'no data: sample3'):
beta_correlation(output_dir, dm, md)
def test_error_on_non_numeric_metadata(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series([1.0, 2.0, 'hello-world'], name='number',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
with self.assertRaisesRegex(ValueError, 'Non-numeric data was'):
beta_correlation(output_dir, dm, md)
def test_basic_pearson(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series([1, 2, 3], name='number',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
beta_correlation(output_dir, dm, md, method='pearson')
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
# all expected plots are generated
self.assertTrue(os.path.exists(
os.path.join(output_dir,
'beta-correlation-scatter.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir,
'beta-correlation-scatter.png')))
self.assertTrue('Mantel test results' in open(index_fp).read())
self.assertTrue('Pearson r' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
def test_basic_alt_permutations(self):
dm = skbio.DistanceMatrix([[0.00, 0.25, 0.25],
[0.25, 0.00, 0.00],
[0.25, 0.00, 0.00]],
ids=['sample1', 'sample2', 'sample3'])
md = qiime2.MetadataCategory(
pd.Series([1, 2, 3], name='number',
index=['sample1', 'sample2', 'sample3']))
with tempfile.TemporaryDirectory() as output_dir:
beta_correlation(output_dir, dm, md, permutations=42)
index_fp = os.path.join(output_dir, 'index.html')
self.assertTrue(os.path.exists(index_fp))
# all expected plots are generated
self.assertTrue(os.path.exists(
os.path.join(output_dir,
'beta-correlation-scatter.pdf')))
self.assertTrue(os.path.exists(
os.path.join(output_dir,
'beta-correlation-scatter.png')))
self.assertTrue('Mantel test results' in open(index_fp).read())
self.assertTrue('<td>42</td>' in open(index_fp).read())
self.assertTrue('Spearman rho' in open(index_fp).read())
self.assertFalse('Warning' in open(index_fp).read())
if __name__ == "__main__":
    unittest.main()
| 49.209857 | 78 | 0.485478 | 3,558 | 30,953 | 4.117763 | 0.06914 | 0.026005 | 0.027848 | 0.042591 | 0.888404 | 0.880827 | 0.874548 | 0.862535 | 0.844652 | 0.839738 | 0 | 0.081541 | 0.356961 | 30,953 | 628 | 79 | 49.288217 | 0.654542 | 0.030369 | 0 | 0.768797 | 0 | 0.005639 | 0.106606 | 0.008203 | 0 | 0 | 0 | 0 | 0.216165 | 1 | 0.06391 | false | 0 | 0.026316 | 0 | 0.097744 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
eeefa5e8e9037cbe73081d252e4311e6d8864778 | 22,808 | py | Python | src/generator/AutoRest.Python.Tests/Expected/AcceptanceTests/BodyString/autorestswaggerbatservice/operations/string_operations.py | fhoering/autorest | b36c77ebb6a5c92aca72eea0894a683506af5817 | [
"MIT"
] | null | null | null | src/generator/AutoRest.Python.Tests/Expected/AcceptanceTests/BodyString/autorestswaggerbatservice/operations/string_operations.py | fhoering/autorest | b36c77ebb6a5c92aca72eea0894a683506af5817 | [
"MIT"
] | null | null | null | src/generator/AutoRest.Python.Tests/Expected/AcceptanceTests/BodyString/autorestswaggerbatservice/operations/string_operations.py | fhoering/autorest | b36c77ebb6a5c92aca72eea0894a683506af5817 | [
"MIT"
] | null | null | null | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
from msrest.pipeline import ClientRawResponse
from .. import models
class StringOperations(object):
    """StringOperations operations.

    :param client: Client for service requests.
    :param config: Configuration of service client.
    :param serializer: An object model serializer.
    :param deserializer: An object model deserializer.
    """

    def __init__(self, client, config, serializer, deserializer):
        self._client = client
        self._serialize = serializer
        self._deserialize = deserializer

        self.config = config
    def get_null(
            self, custom_headers=None, raw=False, **operation_config):
        """Get null string value value.

        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :rtype: str
        :rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
         if raw=true
        :raises:
         :class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
        """
        # Construct URL
        url = '/string/null'

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct and send request
        request = self._client.get(url, query_parameters)
        response = self._client.send(request, header_parameters, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('str', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
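Every generated operation in this file ends with the same `raw`/deserialized convention: with `raw=True` the caller gets both the deserialized body and the transport response; otherwise only the body. A simplified sketch of that tail logic, with a hypothetical `RawResponse` standing in for msrest's `ClientRawResponse`:

```python
class RawResponse:
    """Hypothetical stand-in for msrest.pipeline.ClientRawResponse."""
    def __init__(self, output, response):
        self.output = output        # deserialized body
        self.response = response    # underlying transport response

def finish(deserialized, response, raw):
    # Mirrors the tail of each generated operation: wrap both values
    # when raw=True, otherwise return only the deserialized body.
    if raw:
        return RawResponse(deserialized, response)
    return deserialized
```

This is purely illustrative of the control flow; the generated methods delegate the actual wrapping to msrest.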
def put_null(
self, string_body=None, custom_headers=None, raw=False, **operation_config):
"""Set string value null.
:param string_body: Possible values include: ''
:type string_body: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: None
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/null'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct body
if string_body is not None:
body_content = self._serialize.body(string_body, 'str')
else:
body_content = None
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def get_empty(
self, custom_headers=None, raw=False, **operation_config):
"""Get empty string value value ''.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: str
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/empty'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('str', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def put_empty(
self, string_body, custom_headers=None, raw=False, **operation_config):
"""Set string value empty ''.
:param string_body: Possible values include: ''
:type string_body: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: None
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/empty'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct body
body_content = self._serialize.body(string_body, 'str')
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def get_mbcs(
self, custom_headers=None, raw=False, **operation_config):
"""Get mbcs string value
'啊齄丂狛狜隣郎隣兀﨩ˊ▇█〞〡¦℡㈱‐ー﹡﹢﹫、〓ⅰⅹ⒈€㈠㈩ⅠⅫ! ̄ぁんァヶΑ︴АЯаяāɡㄅㄩ─╋︵﹄︻︱︳︴ⅰⅹɑɡ〇〾⿻⺁䜣€
'.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: str
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/mbcs'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('str', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def put_mbcs(
self, string_body, custom_headers=None, raw=False, **operation_config):
"""Set string value mbcs
'啊齄丂狛狜隣郎隣兀﨩ˊ▇█〞〡¦℡㈱‐ー﹡﹢﹫、〓ⅰⅹ⒈€㈠㈩ⅠⅫ! ̄ぁんァヶΑ︴АЯаяāɡㄅㄩ─╋︵﹄︻︱︳︴ⅰⅹɑɡ〇〾⿻⺁䜣€
'.
:param string_body: Possible values include:
'啊齄丂狛狜隣郎隣兀﨩ˊ▇█〞〡¦℡㈱‐ー﹡﹢﹫、〓ⅰⅹ⒈€㈠㈩ⅠⅫ! ̄ぁんァヶΑ︴АЯаяāɡㄅㄩ─╋︵﹄︻︱︳︴ⅰⅹɑɡ〇〾⿻⺁䜣€
'
:type string_body: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: None
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/mbcs'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct body
body_content = self._serialize.body(string_body, 'str')
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def get_whitespace(
self, custom_headers=None, raw=False, **operation_config):
"""Get string value with leading and trailing whitespace
'<tab><space><space>Now is the time for all good men to come to the
aid of their country<tab><space><space>'.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: str
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/whitespace'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('str', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def put_whitespace(
self, string_body, custom_headers=None, raw=False, **operation_config):
"""Set String value with leading and trailing whitespace
'<tab><space><space>Now is the time for all good men to come to the
aid of their country<tab><space><space>'.
:param string_body: Possible values include: ' Now is the time for
all good men to come to the aid of their country '
:type string_body: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: None
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/whitespace'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct body
body_content = self._serialize.body(string_body, 'str')
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def get_not_provided(
self, custom_headers=None, raw=False, **operation_config):
"""Get String value when no string value is sent in response payload.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: str
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/notProvided'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('str', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def get_base64_encoded(
self, custom_headers=None, raw=False, **operation_config):
"""Get value that is base64 encoded.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: bytes
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/base64Encoding'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('base64', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def get_base64_url_encoded(
self, custom_headers=None, raw=False, **operation_config):
"""Get value that is base64url encoded.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: bytes
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/base64UrlEncoding'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('base64', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def put_base64_url_encoded(
self, string_body, custom_headers=None, raw=False, **operation_config):
"""Put value that is base64url encoded.
:param string_body:
:type string_body: bytes
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: None
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/base64UrlEncoding'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct body
body_content = self._serialize.body(string_body, 'base64')
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def get_null_base64_url_encoded(
self, custom_headers=None, raw=False, **operation_config):
"""Get null value that is expected to be base64url encoded.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: bytes
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
:raises:
:class:`ErrorException<Fixtures.AcceptanceTestsBodyString.models.ErrorException>`
"""
# Construct URL
url = '/string/nullBase64UrlEncoding'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('base64', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
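The `base64UrlEncoding` operations above delegate (de)serialization to msrest's `'base64'` format. As the endpoint names suggest, base64url is the URL-safe base64 alphabet, typically transmitted without padding; a stdlib sketch of that encoding (an assumption about the wire format for illustration, not msrest's internals):

```python
import base64

def b64url_encode(data: bytes) -> str:
    # URL-safe alphabet ('-' and '_' instead of '+' and '/'), padding stripped.
    return base64.urlsafe_b64encode(data).decode('ascii').rstrip('=')

def b64url_decode(text: str) -> bytes:
    # Re-pad to a multiple of four characters before decoding.
    padding = '=' * (-len(text) % 4)
    return base64.urlsafe_b64decode(text + padding)
```

Round-tripping arbitrary bytes through these two helpers returns the original input, and the encoded form never contains `=` padding.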
| 36.376396 | 90 | 0.647273 | 2,343 | 22,808 | 6.200598 | 0.075971 | 0.046531 | 0.041162 | 0.02891 | 0.933645 | 0.933645 | 0.927175 | 0.927175 | 0.924353 | 0.917883 | 0 | 0.006673 | 0.264074 | 22,808 | 626 | 91 | 36.434505 | 0.849926 | 0.386926 | 0 | 0.883065 | 0 | 0 | 0.0672 | 0.00808 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056452 | false | 0 | 0.008065 | 0 | 0.153226 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
01863d02fa5eb044534542d19acc6fa884b3dd32 | 12,468 | py | Python | broker/tasks/pipelines.py | cloud-gov/domain-broker | d1161cecd971202cb10449a29fedb83ef3cd83b0 | [
"CC0-1.0"
] | 5 | 2020-04-28T01:40:43.000Z | 2021-03-27T21:34:14.000Z | broker/tasks/pipelines.py | cloud-gov/external-domain-broker | 7136ac0c10f5950315b77eddca60512129d4078d | [
"CC0-1.0"
] | 124 | 2020-04-23T20:24:15.000Z | 2022-02-28T16:59:34.000Z | broker/tasks/pipelines.py | cloud-gov/domain-broker | d1161cecd971202cb10449a29fedb83ef3cd83b0 | [
"CC0-1.0"
] | 1 | 2020-06-10T02:43:21.000Z | 2020-06-10T02:43:21.000Z | import logging
from broker.tasks import alb, cloudfront, update_operations, iam, letsencrypt, route53
from broker.tasks.huey import huey
logger = logging.getLogger(__name__)
def queue_all_alb_provision_tasks_for_operation(operation_id: int, correlation_id: str):
if correlation_id is None:
raise RuntimeError("correlation_id must be set")
if operation_id is None:
raise RuntimeError("operation_id must be set")
correlation = {"correlation_id": correlation_id}
task_pipeline = (
letsencrypt.create_user.s(operation_id, **correlation)
.then(letsencrypt.generate_private_key, operation_id, **correlation)
.then(letsencrypt.initiate_challenges, operation_id, **correlation)
.then(route53.create_TXT_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(letsencrypt.answer_challenges, operation_id, **correlation)
.then(letsencrypt.retrieve_certificate, operation_id, **correlation)
.then(iam.upload_server_certificate, operation_id, **correlation)
.then(alb.select_alb, operation_id, **correlation)
.then(alb.add_certificate_to_alb, operation_id, **correlation)
.then(route53.create_ALIAS_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(update_operations.provision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_alb_deprovision_tasks_for_operation(
operation_id: int, correlation_id: str
):
if correlation_id is None:
raise RuntimeError("correlation_id must be set")
if operation_id is None:
raise RuntimeError("operation_id must be set")
correlation = {"correlation_id": correlation_id}
task_pipeline = (
update_operations.cancel_pending_provisioning.s(operation_id, **correlation)
.then(route53.remove_ALIAS_records, operation_id, **correlation)
.then(route53.remove_TXT_records, operation_id, **correlation)
.then(alb.remove_certificate_from_alb, operation_id, **correlation)
.then(iam.delete_server_certificate, operation_id, **correlation)
.then(update_operations.deprovision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_migration_deprovision_tasks_for_operation(
operation_id: int, correlation_id: str
):
if correlation_id is None:
raise RuntimeError("correlation_id must be set")
if operation_id is None:
raise RuntimeError("operation_id must be set")
correlation = {"correlation_id": correlation_id}
task_pipeline = update_operations.deprovision.s(operation_id, **correlation)
huey.enqueue(task_pipeline)
def queue_all_cdn_provision_tasks_for_operation(operation_id: int, correlation_id: str):
if correlation_id is None:
raise RuntimeError("correlation_id must be set")
if operation_id is None:
raise RuntimeError("operation_id must be set")
correlation = {"correlation_id": correlation_id}
task_pipeline = (
letsencrypt.create_user.s(operation_id, **correlation)
.then(letsencrypt.generate_private_key, operation_id, **correlation)
.then(letsencrypt.initiate_challenges, operation_id, **correlation)
.then(route53.create_TXT_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(letsencrypt.answer_challenges, operation_id, **correlation)
.then(letsencrypt.retrieve_certificate, operation_id, **correlation)
.then(iam.upload_server_certificate, operation_id, **correlation)
.then(cloudfront.create_distribution, operation_id, **correlation)
.then(cloudfront.wait_for_distribution, operation_id, **correlation)
.then(route53.create_ALIAS_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(update_operations.provision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_cdn_deprovision_tasks_for_operation(
operation_id: int, correlation_id: str
):
if correlation_id is None:
raise RuntimeError("correlation_id must be set")
if operation_id is None:
raise RuntimeError("operation_id must be set")
correlation = {"correlation_id": correlation_id}
task_pipeline = (
update_operations.cancel_pending_provisioning.s(operation_id, **correlation)
.then(route53.remove_ALIAS_records, operation_id, **correlation)
.then(route53.remove_TXT_records, operation_id, **correlation)
.then(cloudfront.disable_distribution, operation_id, **correlation)
.then(cloudfront.wait_for_distribution_disabled, operation_id, **correlation)
.then(cloudfront.delete_distribution, operation_id=operation_id, **correlation)
.then(iam.delete_server_certificate, operation_id, **correlation)
.then(update_operations.deprovision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_alb_renewal_tasks_for_operation(operation_id, **kwargs):
correlation = {"correlation_id": "Renewal"}
task_pipeline = (
letsencrypt.generate_private_key.s(operation_id, **correlation)
.then(letsencrypt.initiate_challenges, operation_id, **correlation)
.then(route53.create_TXT_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(letsencrypt.answer_challenges, operation_id, **correlation)
.then(letsencrypt.retrieve_certificate, operation_id, **correlation)
.then(iam.upload_server_certificate, operation_id, **correlation)
.then(alb.select_alb, operation_id, **correlation)
.then(alb.add_certificate_to_alb, operation_id, **correlation)
.then(route53.create_ALIAS_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(alb.remove_certificate_from_previous_alb, operation_id, **correlation)
.then(iam.delete_previous_server_certificate, operation_id, **correlation)
.then(update_operations.provision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_cdn_renewal_tasks_for_operation(operation_id, **kwargs):
correlation = {"correlation_id": "Renewal"}
task_pipeline = (
letsencrypt.generate_private_key.s(operation_id, **correlation)
.then(letsencrypt.initiate_challenges, operation_id, **correlation)
.then(route53.create_TXT_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(letsencrypt.answer_challenges, operation_id, **correlation)
.then(letsencrypt.retrieve_certificate, operation_id, **correlation)
.then(iam.upload_server_certificate, operation_id, **correlation)
.then(cloudfront.update_certificate, operation_id, **correlation)
.then(iam.delete_previous_server_certificate, operation_id, **correlation)
.then(update_operations.provision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_cdn_update_tasks_for_operation(operation_id, correlation_id):
correlation = {"correlation_id": correlation_id}
task_pipeline = (
letsencrypt.generate_private_key.s(operation_id, **correlation)
.then(letsencrypt.initiate_challenges, operation_id, **correlation)
.then(route53.create_TXT_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(letsencrypt.answer_challenges, operation_id, **correlation)
.then(letsencrypt.retrieve_certificate, operation_id, **correlation)
.then(iam.upload_server_certificate, operation_id, **correlation)
.then(cloudfront.update_distribution, operation_id, **correlation)
.then(cloudfront.wait_for_distribution, operation_id, **correlation)
.then(route53.create_ALIAS_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(iam.delete_previous_server_certificate, operation_id, **correlation)
.then(update_operations.update_complete, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_alb_update_tasks_for_operation(operation_id, correlation_id):
correlation = {"correlation_id": correlation_id}
task_pipeline = (
letsencrypt.generate_private_key.s(operation_id, **correlation)
.then(letsencrypt.initiate_challenges, operation_id, **correlation)
.then(route53.create_TXT_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(letsencrypt.answer_challenges, operation_id, **correlation)
.then(letsencrypt.retrieve_certificate, operation_id, **correlation)
.then(iam.upload_server_certificate, operation_id, **correlation)
.then(alb.select_alb, operation_id, **correlation)
.then(alb.add_certificate_to_alb, operation_id, **correlation)
.then(route53.create_ALIAS_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(alb.remove_certificate_from_previous_alb, operation_id, **correlation)
.then(iam.delete_previous_server_certificate, operation_id, **correlation)
.then(update_operations.provision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_cdn_broker_migration_tasks_for_operation(operation_id, correlation_id):
correlation = {"correlation_id": correlation_id}
task_pipeline = (
cloudfront.remove_s3_bucket_from_cdn_broker_instance.s(
operation_id, **correlation
)
.then(cloudfront.add_logging_to_bucket, operation_id, **correlation)
.then(letsencrypt.create_user, operation_id, **correlation)
.then(letsencrypt.generate_private_key, operation_id, **correlation)
.then(letsencrypt.initiate_challenges, operation_id, **correlation)
.then(route53.create_ALIAS_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(route53.create_TXT_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(letsencrypt.answer_challenges, operation_id, **correlation)
.then(letsencrypt.retrieve_certificate, operation_id, **correlation)
.then(iam.upload_server_certificate, operation_id, **correlation)
.then(cloudfront.update_certificate, operation_id, **correlation)
.then(iam.delete_previous_server_certificate, operation_id, **correlation)
.then(update_operations.provision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
def queue_all_domain_broker_migration_tasks_for_operation(operation_id, correlation_id):
correlation = {"correlation_id": correlation_id}
task_pipeline = (
letsencrypt.create_user.s(operation_id, **correlation)
.then(letsencrypt.generate_private_key, operation_id, **correlation)
.then(letsencrypt.initiate_challenges, operation_id, **correlation)
# creating ALIAS records here is probably not necessary, but belt + suspenders
.then(route53.create_ALIAS_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(route53.create_TXT_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(letsencrypt.answer_challenges, operation_id, **correlation)
.then(letsencrypt.retrieve_certificate, operation_id, **correlation)
.then(iam.upload_server_certificate, operation_id, **correlation)
.then(alb.select_alb, operation_id, **correlation)
.then(alb.add_certificate_to_alb, operation_id, **correlation)
.then(route53.create_ALIAS_records, operation_id, **correlation)
.then(route53.wait_for_changes, operation_id, **correlation)
.then(alb.remove_certificate_from_previous_alb, operation_id, **correlation)
.then(iam.delete_previous_server_certificate, operation_id, **correlation)
.then(update_operations.provision, operation_id, **correlation)
)
huey.enqueue(task_pipeline)
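Every pipeline above is built the same way: a huey task signature started with `.s()` and extended with `.then()`, then handed to `huey.enqueue()`. A minimal self-contained sketch of that chaining pattern (a hypothetical `Signature` stand-in, not huey itself) shows how each step receives the same `operation_id` and correlation kwargs:

```python
class Signature:
    """Hypothetical stand-in for a huey task signature chain."""
    def __init__(self, fn, *args, **kwargs):
        self.steps = [(fn, args, kwargs)]

    def then(self, fn, *args, **kwargs):
        # append the next task; huey pipelines chain the same way
        self.steps.append((fn, args, kwargs))
        return self

    def run(self):
        # a real huey worker would execute these asynchronously, in order
        return [fn(*args, **kwargs) for fn, args, kwargs in self.steps]


def create_user(operation_id, correlation_id=None):
    return ("create_user", operation_id, correlation_id)


def generate_private_key(operation_id, correlation_id=None):
    return ("generate_private_key", operation_id, correlation_id)


correlation = {"correlation_id": "abc-123"}
pipeline = (
    Signature(create_user, 42, **correlation)
    .then(generate_private_key, 42, **correlation)
)
results = pipeline.run()
# every step ran in pipeline order with the same operation_id and correlation_id
```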
# Source: ovhcli/modules/webhosting/__init__.py (akram/ovh-cli; license BSD-3-Clause)
# -*- coding: utf8 -*-
import click
@click.group(short_help="Manage your WebHosting services.")
def webhosting():
"""Manage and configure your WebHosting products."""
# Source: pyfc4/plugins/pcdm/models.py (RockefellerArchiveCenter/pyfc4; license MIT)
# pyfc4 plugin: pcdm.models
import copy
import time
# import pyfc4 base models
from pyfc4 import models as _models
# logging
import logging
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
'''
Implementation of PCDM in LDP:
https://docs.google.com/document/d/1RI8aX8XQEk-30-Ht-DaPF5nz_VtI1-eqxUuDvF3nhv0/edit#
'''
class PCDMCollection(_models.BasicContainer):
'''
Class to represent PCDM Collections in LDP.
----------------------------------------------------------------------------------
URI Template -- Resource Identified
----------------------------------------------------------------------------------
/collections/{id} -- A Collection
/collections/{id}/members/ -- Membership container for the parent Collection
/collections/{id}/members/{proxy_obj_id} -- Proxy for the member Collection or Object
/collections/{id}/related/ -- Related object container for the parent Collection
/collections/{id}/related/{proxy_obj_id} -- Proxy for the related Object
----------------------------------------------------------------------------------
When a PCDMCollection is created, the following child resources are automatically created:
- /members
- /related
Args:
repo (Repository): instance of Repository class
uri (rdflib.term.URIRef, str): input URI
response (requests.models.Response): defaults None, but if passed, populate self.data, self.headers, self.status_code
'''
def __init__(self, repo, uri=None, response=None):
# fire parent Container init()
super().__init__(repo, uri=uri, response=response)
# members, related
self.members = self.get_members()
self._orig_members = copy.deepcopy(self.members)
self.related = self.get_related()
self._orig_related = copy.deepcopy(self.related)
def _post_create(self, auto_refresh=False):
'''
resource.create() hook
For PCDM Collections, after creation also create the /members and /related child containers
'''
# set PCDM triple as Collection
self.add_triple(self.rdf.prefixes.rdf.type, self.rdf.prefixes.pcdm.Collection)
self.update(auto_refresh=auto_refresh)
# create /members child resource
members_child = PCDMMembersContainer(
self.repo,
'%s/members' % self.uri_as_string(),
membershipResource=self.uri,
hasMemberRelation=self.rdf.prefixes.pcdm.hasMember,
insertedContentRelation=self.rdf.prefixes.ore.proxyFor)
members_child.create(specify_uri=True)
# create /related child resource
related_child = PCDMRelatedContainer(
self.repo,
'%s/related' % self.uri_as_string(),
membershipResource=self.uri,
hasMemberRelation=self.rdf.prefixes.ore.aggregates,
insertedContentRelation=self.rdf.prefixes.ore.proxyFor)
related_child.create(specify_uri=True)
def get_members(self):
'''
get pcdm:hasMember for this resource
'''
if self.exists and hasattr(self.rdf.triples, 'pcdm') and hasattr(self.rdf.triples.pcdm, 'hasMember'):
members = [ self.repo.parse_uri(uri) for uri in self.rdf.triples.pcdm.hasMember ]
# return
return members
else:
return []
def get_related(self):
'''
get ore:aggregates for this resource
'''
if self.exists and hasattr(self.rdf.triples, 'ore') and hasattr(self.rdf.triples.ore, 'aggregates'):
related = [ self.repo.parse_uri(uri) for uri in self.rdf.triples.ore.aggregates ]
# return
return related
else:
return []
def _post_update(self):
'''
resource.update() hook
'''
self.update_pcdm_relationship()
def _post_refresh(self):
'''
resource.refresh() hook
'''
self.update_pcdm_relationship()
def update_pcdm_relationship(self):
'''
sync pcdm:hasMember and ore:aggregates changes by creating/deleting proxy objects
'''
logger.debug("updating PCDM relationships")
# determine member diff
member_diff = {
'new':set(self.members) - set(self._orig_members),
'removed':set(self._orig_members) - set(self.members)
}
logger.debug(member_diff)
# create proxy objects for added members
for resource_uri in member_diff['new']:
proxy_obj = PCDMProxyObject(self.repo, uri="%s/members" % (self.uri), proxyForURI=resource_uri)
proxy_obj.create()
# remove proxy objects for removed members
for resource_uri in member_diff['removed']:
proxy_obj = self.repo.get_resource(resource_uri)
proxy_obj.delete(remove_tombstone=True)
# determine related diff
related_diff = {
'new':set(self.related) - set(self._orig_related),
'removed':set(self._orig_related) - set(self.related)
}
logger.debug(related_diff)
# create proxy objects for added related resources
for resource_uri in related_diff['new']:
proxy_obj = PCDMProxyObject(self.repo, uri="%s/related" % (self.uri), proxyForURI=resource_uri)
proxy_obj.create()
# remove proxy objects for removed related resources
for resource_uri in related_diff['removed']:
proxy_obj = self.repo.get_resource(resource_uri)
proxy_obj.delete(remove_tombstone=True)
class PCDMObject(_models.BasicContainer):
'''
Class to represent PCDM Objects in LDP.
----------------------------------------------------------------------------------
URI Template -- Resource Identified
----------------------------------------------------------------------------------
/objects/{id} -- An Object
/objects/{id}/files/ -- Container for component Files of the Object
/objects/{id}/files/{binary_id} -- A component File
/objects/{id}/files/{binary_id}/fcr:metadata -- Technical metadata about the File
/objects/{id}/members/ -- Membership container for the parent Object
/objects/{id}/members/{proxy_obj_id} -- Proxy for the member Object
/objects/{id}/related/ -- Related object container for the parent Object
/objects/{id}/related/{proxy_obj_id} -- Proxy for the related Object
/objects/{id}/associated/ -- Container for associated Files
----------------------------------------------------------------------------------
When a PCDMObject is created, the following child resources are automatically created:
- /files
- /members
- /related
- /associated
Args:
repo (Repository): instance of Repository class
uri (rdflib.term.URIRef,str): input URI
response (requests.models.Response): defaults None, but if passed, populate self.data, self.headers, self.status_code
'''
def __init__(self, repo, uri=None, response=None, retrieve_pcdm_links=True):
# fire parent Container init()
super().__init__(repo, uri=uri, response=response)
# members, related
self.members = self.get_members(retrieve=retrieve_pcdm_links)
self._orig_members = copy.deepcopy(self.members)
self.files = self.get_files(retrieve=retrieve_pcdm_links)
self._orig_files = copy.deepcopy(self.files)
self.associated = self.get_associated(retrieve=retrieve_pcdm_links)
self._orig_associated = copy.deepcopy(self.associated)
self.related = self.get_related(retrieve=retrieve_pcdm_links)
self._orig_related = copy.deepcopy(self.related)
def _post_create(self, auto_refresh=False):
'''
resource.create() hook
'''
# set PCDM triple as Object
self.add_triple(self.rdf.prefixes.rdf.type, self.rdf.prefixes.pcdm.Object)
self.update(auto_refresh=auto_refresh)
# create /files child resource
files_child = PCDMFilesContainer(
self.repo,
'%s/files' % self.uri_as_string(),
membershipResource=self.uri,
hasMemberRelation=self.rdf.prefixes.pcdm.hasFile)
files_child.create(specify_uri=True)
# create /members child resource
members_child = PCDMMembersContainer(
self.repo,
'%s/members' % self.uri_as_string(),
membershipResource=self.uri,
hasMemberRelation=self.rdf.prefixes.pcdm.hasMember,
insertedContentRelation=self.rdf.prefixes.ore.proxyFor)
members_child.create(specify_uri=True)
# create /related child resource
related_child = PCDMRelatedContainer(
self.repo,
'%s/related' % self.uri_as_string(),
membershipResource=self.uri,
hasMemberRelation=self.rdf.prefixes.ore.aggregates,
insertedContentRelation=self.rdf.prefixes.ore.proxyFor)
related_child.create(specify_uri=True)
# create /associated child resource
associated_child = PCDMAssociatedContainer(
self.repo,
'%s/associated' % self.uri_as_string(),
membershipResource=self.uri,
hasMemberRelation=self.rdf.prefixes.pcdm.hasRelatedFile)
associated_child.create(specify_uri=True)
def get_members(self, retrieve=False):
'''
get pcdm:hasMember for this resource
Args:
retrieve (bool): if True, issue .refresh() on resource thereby confirming existence and retrieving payload
'''
if self.exists and hasattr(self.rdf.triples, 'pcdm') and hasattr(self.rdf.triples.pcdm, 'hasMember'):
members = [ self.repo.parse_uri(uri) for uri in self.rdf.triples.pcdm.hasMember ]
# return
return members
else:
return []
def get_files(self, retrieve=False):
'''
get pcdm:hasFile for this resource
Args:
retrieve (bool): if True, issue .refresh() on resource thereby confirming existence and retrieving payload
'''
if self.exists and hasattr(self.rdf.triples, 'pcdm') and hasattr(self.rdf.triples.pcdm, 'hasFile'):
files = [ self.repo.parse_uri(uri) for uri in self.rdf.triples.pcdm.hasFile ]
# return
return files
else:
return []
def get_associated(self, retrieve=False):
'''
get pcdm:hasRelatedFile for this resource
Args:
retrieve (bool): if True, issue .refresh() on resource thereby confirming existence and retrieving payload
'''
if self.exists and hasattr(self.rdf.triples, 'pcdm') and hasattr(self.rdf.triples.pcdm, 'hasRelatedFile'):
files = [ self.repo.parse_uri(uri) for uri in self.rdf.triples.pcdm.hasRelatedFile ]
# return
return files
else:
return []
def get_related(self, retrieve=False):
'''
get ore:aggregates for this resource
Args:
retrieve (bool): if True, issue .refresh() on resource thereby confirming existence and retrieving payload
'''
if self.exists and hasattr(self.rdf.triples, 'ore') and hasattr(self.rdf.triples.ore, 'aggregates'):
related = [ self.repo.parse_uri(uri) for uri in self.rdf.triples.ore.aggregates ]
# return
return related
else:
return []
def _post_update(self):
'''
resource.update() hook
'''
self.update_pcdm_relationship()
def _post_refresh(self):
'''
resource.refresh() hook
'''
self.update_pcdm_relationship()
def update_pcdm_relationship(self):
'''
sync pcdm:hasMember and ore:aggregates changes by creating/deleting proxy objects
'''
logger.debug("updating PCDM relationships")
# determine member diff
member_diff = {
'new':set(self.members) - set(self._orig_members),
'removed':set(self._orig_members) - set(self.members)
}
logger.debug(member_diff)
# create proxy objects for added members
for resource_uri in member_diff['new']:
proxy_obj = PCDMProxyObject(self.repo, uri="%s/members" % (self.uri), proxyForURI=resource_uri)
proxy_obj.create()
# remove proxy objects for removed members
for resource_uri in member_diff['removed']:
proxy_obj = self.repo.get_resource(resource_uri)
proxy_obj.delete(remove_tombstone=True)
# determine related diff
related_diff = {
'new':set(self.related) - set(self._orig_related),
'removed':set(self._orig_related) - set(self.related)
}
logger.debug(related_diff)
# create proxy objects for added related resources
for resource_uri in related_diff['new']:
proxy_obj = PCDMProxyObject(self.repo, uri="%s/related" % (self.uri), proxyForURI=resource_uri)
proxy_obj.create()
# remove proxy objects for removed related resources
for resource_uri in related_diff['removed']:
proxy_obj = self.repo.get_resource(resource_uri)
proxy_obj.delete(remove_tombstone=True)
class PCDMFile(_models.NonRDFSource):
'''
Extends NonRDFSource(Binary).
Inherits:
NonRDFSource
Args:
repo (Repository): instance of Repository class
uri (rdflib.term.URIRef,str): input URI
response (requests.models.Response): defaults None, but if passed, populate self.data, self.headers, self.status_code
binary_data: optional, file data, accepts file-like object, raw data, or URL
binary_mimetype: optional, mimetype for provided data
'''
def __init__(self, repo, uri=None, response=None, binary_data=None, binary_mimetype=None):
# fire parent Resource init()
super().__init__(repo, uri=uri, response=response, binary_data=binary_data, binary_mimetype=binary_mimetype)
def _post_create(self, auto_refresh=False):
'''
resource.create() hook
For PCDM File
'''
# set PCDM triple as Collection
self.add_triple(self.rdf.prefixes.rdf.type, self.rdf.prefixes.pcdm.File)
self.update(auto_refresh=auto_refresh)
class PCDMProxyObject(_models.BasicContainer):
'''
Class to represent PCDM Proxy Objects in PCDM/LDP
Args:
repo (Repository): instance of Repository class
uri (rdflib.term.URIRef,str): input URI
response (requests.models.Response): defaults None, but if passed, populate self.data, self.headers, self.status_code
proxyForURI (rdflib.term.URIRef,str): URI of resource this resource is a proxy for, sets ore:proxyFor triple
proxyInURI (rdflib.term.URIRef,str): URI of resource this resource is a proxy in, sets ore:proxyIn triple
'''
def __init__(self, repo, uri=None, response=None, proxyForURI=None, proxyInURI=None):
# fire parent Container init()
super().__init__(repo, uri=uri, response=response)
self.proxyForURI = proxyForURI
self.proxyInURI = proxyInURI
def _post_create(self, auto_refresh=False):
'''
resource.create() hook
'''
# set rdf type
self.add_triple(self.rdf.prefixes.rdf.type, self.rdf.prefixes.ore.Proxy)
# set triple for what this resource is a proxy for
if self.proxyForURI:
self.add_triple(self.rdf.prefixes.ore.proxyFor, self.proxyForURI)
# if proxyIn set, add triple
if self.proxyInURI:
self.add_triple(self.rdf.prefixes.ore.proxyIn, self.proxyInURI)
# update
self.update(auto_refresh=auto_refresh)
class PCDMFilesContainer(_models.DirectContainer):
'''
Class to represent Files under a PCDM Object
Inherits:
DirectContainer
Args:
repo (Repository): instance of Repository class
uri (rdflib.term.URIRef,str): input URI
response (requests.models.Response): defaults None, but if passed, populate self.data, self.headers, self.status_code
membershipResource (rdflib.term): resource that will accumulate triples as children are added
hasMemberRelation (rdflib.term): predicate that will be used when pointing from URI in ldp:membershipResource to ldp:insertedContentRelation
'''
def __init__(self,
repo,
uri=None,
response=None,
membershipResource=None,
hasMemberRelation=None):
# fire parent DirectContainer init()
super().__init__(
repo,
uri=uri,
response=response,
membershipResource=membershipResource,
hasMemberRelation=hasMemberRelation)
class PCDMMembersContainer(_models.IndirectContainer):
'''
Class to represent Related under a PCDM Collection or Object
Inherits:
IndirectContainer
Args:
repo (Repository): instance of Repository class
uri (rdflib.term.URIRef,str): input URI
response (requests.models.Response): defaults None, but if passed, populate self.data, self.headers, self.status_code
membershipResource (rdflib.term): resource that will accumulate triples as children are added
hasMemberRelation (rdflib.term): predicate that will be used when pointing from URI in ldp:membershipResource to ldp:insertedContentRelation
insertedContentRelation (rdflib.term): destination for ldp:hasMemberRelation from ldp:membershipResource
'''
def __init__(self,
repo,
uri=None,
response=None,
membershipResource=None,
hasMemberRelation=None,
insertedContentRelation=None):
# fire parent Container init()
super().__init__(
repo,
uri=uri,
response=response,
membershipResource=membershipResource,
hasMemberRelation=hasMemberRelation,
insertedContentRelation=insertedContentRelation)
class PCDMRelatedContainer(_models.IndirectContainer):
'''
Class to represent Related under a PCDM Collection or Object
Inherits:
IndirectContainer
Args:
repo (Repository): instance of Repository class
uri (rdflib.term.URIRef,str): input URI
response (requests.models.Response): defaults None, but if passed, populate self.data, self.headers, self.status_code
membershipResource (rdflib.term): resource that will accumulate triples as children are added
hasMemberRelation (rdflib.term): predicate that will be used when pointing from URI in ldp:membershipResource to ldp:insertedContentRelation
insertedContentRelation (rdflib.term): destination for ldp:hasMemberRelation from ldp:membershipResource
'''
def __init__(self,
repo,
uri=None,
response=None,
membershipResource=None,
hasMemberRelation=None,
insertedContentRelation=None):
# fire parent Container init()
super().__init__(
repo,
uri=uri,
response=response,
membershipResource=membershipResource,
hasMemberRelation=hasMemberRelation,
insertedContentRelation=insertedContentRelation)
class PCDMAssociatedContainer(_models.DirectContainer):
'''
Class to represent Associated under a PCDM Object
Inherits:
DirectContainer
Args:
repo (Repository): instance of Repository class
uri (rdflib.term.URIRef,str): input URI
response (requests.models.Response): defaults None, but if passed, populate self.data, self.headers, self.status_code
membershipResource (rdflib.term): resource that will accumulate triples as children are added
hasMemberRelation (rdflib.term): predicate that will be used when pointing from URI in ldp:membershipResource to ldp:insertedContentRelation
'''
def __init__(self,
repo,
uri=None,
response=None,
membershipResource=None,
hasMemberRelation=None):
# fire parent DirectContainer init()
super().__init__(
repo,
uri=uri,
response=response,
membershipResource=membershipResource,
hasMemberRelation=hasMemberRelation)
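The heart of `update_pcdm_relationship` is a pair of set differences between the current and original membership lists, which decide which proxy objects to create and which to delete. A small self-contained sketch of that diff logic (plain URI strings instead of repository resources):

```python
def membership_diff(current, original):
    # mirrors the 'new'/'removed' sets computed in update_pcdm_relationship
    return {
        "new": set(current) - set(original),
        "removed": set(original) - set(current),
    }


diff = membership_diff(
    current=["/objects/a", "/objects/c"],
    original=["/objects/a", "/objects/b"],
)
# "/objects/c" would get a new proxy object under /members;
# the proxy for "/objects/b" would be deleted (with its tombstone)
```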
# Source: audio/fourier/__init__.py (joshbarrass/spyctrum; license MIT)
from spyctrum.audio.fourier.chunking import get_chunk
from spyctrum.audio.fourier.chunking import ALIGN_CENTRAL, ALIGN_START, ALIGN_END
# Source: gempy/torch_/__init__.py (bhartl/generative-models; license MIT)
from .encoder import Encoder
from .encoder import ConvEncoder
from .decoder import Decoder
from .decoder import ConvTDecoder
# Source: editor/lib/juma/core/guid.py (RazielSun/juma-editor; license MIT)
import uuid
def generateGUID():
# return str( uuid.uuid1() )
return uuid.uuid1().hex
| 14.666667 | 29 | 0.693182 | 12 | 88 | 5.083333 | 0.666667 | 0.295082 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 0.159091 | 88 | 5 | 30 | 17.6 | 0.797297 | 0.295455 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 9 |
1128d43ea3d46f1ba5950ccde0d1bf650acfa1d4 | 171 | py | Python | github_code/mmdetection3d/mmdet3d/ops/interpolate/__init__.py | Fawkes7/mmdetection3d | 7d5a339c699a73b3c8d65de5e48c7407f8640b23 | [
"Apache-2.0"
] | 1 | 2020-11-20T23:10:34.000Z | 2020-11-20T23:10:34.000Z | github_code/mmdetection3d/mmdet3d/ops/interpolate/__init__.py | Fawkes7/mmdetection3d | 7d5a339c699a73b3c8d65de5e48c7407f8640b23 | [
"Apache-2.0"
] | null | null | null | github_code/mmdetection3d/mmdet3d/ops/interpolate/__init__.py | Fawkes7/mmdetection3d | 7d5a339c699a73b3c8d65de5e48c7407f8640b23 | [
"Apache-2.0"
] | null | null | null | from .three_interpolate import three_interpolate, trilinear_devoxelize
from .three_nn import three_nn
__all__ = ['three_nn', 'three_interpolate', 'trilinear_devoxelize']
| 34.2 | 70 | 0.830409 | 21 | 171 | 6.190476 | 0.380952 | 0.369231 | 0.384615 | 0.538462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087719 | 171 | 4 | 71 | 42.75 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.263158 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1156b99ee65e74f679da32482f8b1de4ba0d84e6 | 10,475 | py | Python | Will_OS_Files/WillGames/Baseball/PrintBoxScore.py | andersonwillsam/Will_OS | c490cc39a856aa96594a25359df344b795da711b | [
"Unlicense"
] | null | null | null | Will_OS_Files/WillGames/Baseball/PrintBoxScore.py | andersonwillsam/Will_OS | c490cc39a856aa96594a25359df344b795da711b | [
"Unlicense"
] | null | null | null | Will_OS_Files/WillGames/Baseball/PrintBoxScore.py | andersonwillsam/Will_OS | c490cc39a856aa96594a25359df344b795da711b | [
"Unlicense"
] | null | null | null | import time
def wait_short2():
time.sleep(.5) # default .5
def PrintBoxScore(teams, batters, pitchers_used):
wait_short2()
print("Batting")
wait_short2()
print("")
wait_short2()
###########################################################
# Box score - Away batting
print(teams["away"].upper(), end="")
for y in range(25 - len(teams["away"])):
print(" ", end="")
print("AB R H RBI HR BB SO")
for x in batters["away"]:
# Player name
wait_short2()
print(x[0] + " ", end="")
# Print correct amount of spaces
for y in range(25 - len(str(x[0]))):
print(" ", end="")
# First column
print("\033[1;93;40m" + str(x[2]) + "\033[0m", end="")
# Columns 2-6
for z in range(3, 8):
if len(str(x[z])) > 1:
print(" ", end="")
else:
print(" ", end="")
if x[z] > 0:
print("\033[1;93;40m" + str(x[z]) + "\033[0m", end="")
else:
print(str(x[z]), end="")
# Last column
        if len(str(x[8])) > 1:
print(" ", end="")
else:
print(" ", end="")
if x[8] > 0:
print("\033[1;93;40m" + str(x[8]) + "\033[0m")
else:
print(str(x[8]))
# Add up away batting totals
away_total = [0, 0, 0, 0, 0, 0, 0]
for x in range(0, 9):
away_total[0] = away_total[0] + batters["away"][x][2] # AB
away_total[1] = away_total[1] + batters["away"][x][3] # R
away_total[2] = away_total[2] + batters["away"][x][4] # H
away_total[3] = away_total[3] + batters["away"][x][5] # RBI
away_total[4] = away_total[4] + batters["away"][x][6] # HR
away_total[5] = away_total[5] + batters["away"][x][7] # BB
away_total[6] = away_total[6] + batters["away"][x][8] # SO
wait_short2()
print("Totals: " + str(away_total[0]), end="")
# Totals, Columns 1-6
for z in range(1, 6):
if len(str(away_total[z])) > 1:
print(" ", end="")
else:
print(" ", end="")
if away_total[z] > 0:
print("\033[1;93;40m" + str(away_total[z]) + "\033[0m", end="")
else:
print(str(away_total[z]), end="")
# Totals, Column 7
if len(str(away_total[6])) > 1:
print(" ", end="")
else:
print(" ", end="")
if away_total[6] > 0:
print("\033[1;93;40m" + str(away_total[6]) + "\033[0m")
else:
print(str(away_total[6]))
print("")
###########################################################
# Box score - Home batting
print(teams["home"].upper(), end="")
for y in range(25 - len(teams["home"])):
print(" ", end="")
print("AB R H RBI HR BB SO")
for x in batters["home"]:
# Player name
wait_short2()
print(x[0] + " ", end="")
# Print correct amount of spaces
for y in range(25 - len(str(x[0]))):
print(" ", end="")
# First column
print("\033[1;93;40m" + str(x[2]) + "\033[0m", end="")
# Columns 2-6
for z in range(3, 8):
if len(str(x[z])) > 1:
print(" ", end="")
else:
print(" ", end="")
if x[z] > 0:
print("\033[1;93;40m" + str(x[z]) + "\033[0m", end="")
else:
print(str(x[z]), end="")
# Last column
        if len(str(x[8])) > 1:
print(" ", end="")
else:
print(" ", end="")
if x[8] > 0:
print("\033[1;93;40m" + str(x[8]) + "\033[0m")
else:
print(str(x[8]))
# Add up home batting totals
home_total = [0, 0, 0, 0, 0, 0, 0]
for x in range(0, 9):
home_total[0] = home_total[0] + batters["home"][x][2] # AB
home_total[1] = home_total[1] + batters["home"][x][3] # R
home_total[2] = home_total[2] + batters["home"][x][4] # H
home_total[3] = home_total[3] + batters["home"][x][5] # RBI
home_total[4] = home_total[4] + batters["home"][x][6] # HR
home_total[5] = home_total[5] + batters["home"][x][7] # BB
home_total[6] = home_total[6] + batters["home"][x][8] # SO
wait_short2()
print("Totals: " + str(home_total[0]), end="")
# Totals, columns 1-6
for z in range(1, 6):
if len(str(home_total[z])) > 1:
print(" ", end="")
else:
print(" ", end="")
if home_total[z] > 0:
print("\033[1;93;40m" + str(home_total[z]) + "\033[0m", end="")
else:
print(str(home_total[z]), end="")
# Totals, column 7
if len(str(home_total[6])) > 1:
print(" ", end="")
else:
print(" ", end="")
if home_total[6] > 0:
print("\033[1;93;40m" + str(home_total[6]) + "\033[0m")
else:
print(str(home_total[6]))
print("")
wait_short2()
print("Pitching")
wait_short2()
print("")
    wait_short2()
###########################################################
# Box score - Away pitching
print(teams["away"].upper(), end="")
for y in range(25 - len(teams["away"])):
print(" ", end="")
print("IP R H ER HR BB SO")
wait_short2()
for x in pitchers_used["away"]:
# Player name
print(x[0] + " ", end="")
# Print correct amount of spaces
for y in range(23 - len(str(x[0]))):
print(" ", end="")
# Column 1
if len(str(round(x[2], 1))) == 1:
print(" ", end="")
print("\033[1;93;40m" + str(round(x[2], 1)) + "\033[0m", end="")
# Columns 2-6
for z in range(3, 8):
if len(str(x[z])) > 1:
print(" ", end="")
else:
print(" ", end="")
            if x[z] > 0:
                print("\033[1;93;40m" + str(x[z]) + "\033[0m", end="")
            else:
                print(str(x[z]), end="")
# Last column
        if len(str(x[8])) > 1:
print(" ", end="")
else:
print(" ", end="")
if x[8] > 0:
print("\033[1;93;40m" + str(x[8]) + "\033[0m")
else:
print(str(x[8]))
wait_short2()
# Add up away pitching totals
away_total = [0, 0, 0, 0, 0, 0, 0]
for x in range(0, len(pitchers_used["away"])):
away_total[0] = away_total[0] + pitchers_used["away"][x][2] # IP
away_total[1] = away_total[1] + pitchers_used["away"][x][3] # R
away_total[2] = away_total[2] + pitchers_used["away"][x][4] # H
away_total[3] = away_total[3] + pitchers_used["away"][x][5] # ER
away_total[4] = away_total[4] + pitchers_used["away"][x][6] # HR
away_total[5] = away_total[5] + pitchers_used["away"][x][7] # BB
away_total[6] = away_total[6] + pitchers_used["away"][x][8] # SO
wait_short2()
print("Totals: " + str(round(away_total[0], 1)), end="")
# Totals, columns 2-6
for z in range(1, 6):
if len(str(away_total[z])) > 1:
print(" ", end="")
else:
print(" ", end="")
if away_total[z] > 0:
print("\033[1;93;40m" + str(away_total[z]) + "\033[0m", end="")
else:
print(str(away_total[z]), end="")
# Totals, column 7
if len(str(away_total[6])) > 1:
print(" ", end="")
else:
print(" ", end="")
if away_total[6] > 0:
print("\033[1;93;40m" + str(away_total[6]) + "\033[0m")
else:
print(str(away_total[6]))
print("")
wait_short2()
###########################################################
# Box score - Home pitching
print(teams["home"].upper(), end="")
for y in range(25 - len(teams["home"])):
print(" ", end="")
print("IP R H ER HR BB SO")
wait_short2()
for x in pitchers_used["home"]:
# Player name
print(x[0] + " ", end="")
# Print correct amount of spaces
for y in range(23 - len(str(x[0]))):
print(" ", end="")
# Column 1
if len(str(round(x[2], 1))) == 1:
print(" ", end="")
print("\033[1;93;40m" + str(round(x[2], 1)) + "\033[0m", end="")
# Columns 2-6
for z in range(3, 8):
if len(str(x[z])) > 1:
print(" ", end="")
else:
print(" ", end="")
            if x[z] > 0:
                print("\033[1;93;40m" + str(x[z]) + "\033[0m", end="")
            else:
                print(str(x[z]), end="")
# Last column
        if len(str(x[8])) > 1:
print(" ", end="")
else:
print(" ", end="")
if x[8] > 0:
print("\033[1;93;40m" + str(x[8]) + "\033[0m")
else:
print(str(x[8]))
wait_short2()
# Add up home pitching totals
home_total = [0, 0, 0, 0, 0, 0, 0]
for x in range(0, len(pitchers_used["home"])):
home_total[0] = home_total[0] + pitchers_used["home"][x][2] # IP
home_total[1] = home_total[1] + pitchers_used["home"][x][3] # R
home_total[2] = home_total[2] + pitchers_used["home"][x][4] # H
home_total[3] = home_total[3] + pitchers_used["home"][x][5] # ER
home_total[4] = home_total[4] + pitchers_used["home"][x][6] # HR
home_total[5] = home_total[5] + pitchers_used["home"][x][7] # BB
home_total[6] = home_total[6] + pitchers_used["home"][x][8] # SO
wait_short2()
print("Totals: " + str(round(home_total[0], 1)), end="")
# Totals, columns 2-6
for z in range(1, 6):
if len(str(home_total[z])) > 1:
print(" ", end="")
else:
print(" ", end="")
if home_total[z] > 0:
print("\033[1;93;40m" + str(home_total[z]) + "\033[0m", end="")
else:
print(str(home_total[z]), end="")
# Totals, column 7
if len(str(home_total[6])) > 1:
print(" ", end="")
else:
print(" ", end="")
if home_total[6] > 0:
print("\033[1;93;40m" + str(home_total[6]) + "\033[0m")
else:
print(str(home_total[6]))
print("")
wait_short2()
print("")
wait_short2()
wait_short2()
print("") | 30.100575 | 76 | 0.44105 | 1,426 | 10,475 | 3.14446 | 0.04979 | 0.096343 | 0.064228 | 0.049063 | 0.868421 | 0.862177 | 0.808653 | 0.808653 | 0.79215 | 0.761374 | 0 | 0.076734 | 0.339379 | 10,475 | 348 | 77 | 30.100575 | 0.571243 | 0.072554 | 0 | 0.82 | 0 | 0 | 0.093584 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.008 | false | 0 | 0.004 | 0 | 0.012 | 0.416 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
fecfff12c49ce731dd6b9fcdeb29b1c82ffa020a | 46,146 | py | Python | tests/test_mmlib.py | Devyani-Divs/mirrormanager2 | c56bfdb363608c5ea701ce8f089ab1b09abc4de9 | [
"MIT"
] | 1 | 2015-11-08T08:56:33.000Z | 2015-11-08T08:56:33.000Z | tests/test_mmlib.py | Devyani-Divs/mirrormanager2 | c56bfdb363608c5ea701ce8f089ab1b09abc4de9 | [
"MIT"
] | null | null | null | tests/test_mmlib.py | Devyani-Divs/mirrormanager2 | c56bfdb363608c5ea701ce8f089ab1b09abc4de9 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
'''
mirrormanager2 tests.
'''
__requires__ = ['SQLAlchemy >= 0.7']
import pkg_resources
import unittest
import sys
import os
sys.path.insert(0, os.path.join(os.path.dirname(
os.path.abspath(__file__)), '..'))
import mirrormanager2.lib
import tests
class MMLibtests(tests.Modeltests):
""" Collection tests. """
def test_query_directories(self):
""" Test the query_directories function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.query_directories(self.session)
self.assertEqual(len(results), 0)
tests.create_base_items(self.session)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_directory(self.session)
tests.create_category(self.session)
tests.create_hostcategory(self.session)
tests.create_hostcategoryurl(self.session)
tests.create_categorydirectory(self.session)
results = mirrormanager2.lib.query_directories(self.session)
self.assertEqual(len(results), 12)
def test_get_site(self):
""" Test the get_site function of mirrormanager2.lib. """
tests.create_site(self.session)
results = mirrormanager2.lib.get_site(self.session, 0)
self.assertEqual(results, None)
results = mirrormanager2.lib.get_site(self.session, 1)
self.assertEqual(results.name, 'test-mirror')
self.assertEqual(results.private, False)
self.assertEqual(results.created_by, 'pingou')
results = mirrormanager2.lib.get_site(self.session, 2)
self.assertEqual(results.name, 'test-mirror2')
self.assertEqual(results.private, False)
self.assertEqual(results.created_by, 'kevin')
results = mirrormanager2.lib.get_site(self.session, 3)
self.assertEqual(results.name, 'test-mirror_private')
self.assertEqual(results.private, True)
self.assertEqual(results.created_by, 'skvidal')
def test_get_site_by_name(self):
""" Test the get_site_by_name function of mirrormanager2.lib. """
tests.create_site(self.session)
results = mirrormanager2.lib.get_site_by_name(self.session, 'foo')
self.assertEqual(results, None)
results = mirrormanager2.lib.get_site_by_name(
self.session, 'test-mirror')
self.assertEqual(results.name, 'test-mirror')
self.assertEqual(results.private, False)
self.assertEqual(results.created_by, 'pingou')
results = mirrormanager2.lib.get_site_by_name(
self.session, 'test-mirror2')
self.assertEqual(results.name, 'test-mirror2')
self.assertEqual(results.private, False)
self.assertEqual(results.created_by, 'kevin')
def test_get_all_sites(self):
""" Test the get_all_sites function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_all_sites(self.session)
self.assertEqual(results, [])
tests.create_site(self.session)
results = mirrormanager2.lib.get_all_sites(self.session)
self.assertEqual(len(results), 3)
self.assertEqual(results[0].name, 'test-mirror')
self.assertEqual(results[1].name, 'test-mirror2')
self.assertEqual(results[2].name, 'test-mirror_private')
def test_get_siteadmin(self):
""" Test the get_siteadmin function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_siteadmin(self.session, 1)
self.assertEqual(results, None)
tests.create_site(self.session)
results = mirrormanager2.lib.get_siteadmin(self.session, 1)
self.assertEqual(results, None)
tests.create_site_admin(self.session)
results = mirrormanager2.lib.get_siteadmin(self.session, 1)
self.assertEqual(results.site.name, 'test-mirror')
self.assertEqual(results.username, 'ralph')
results = mirrormanager2.lib.get_siteadmin(self.session, 4)
self.assertEqual(results.site.name, 'test-mirror2')
self.assertEqual(results.username, 'pingou')
def test_get_host(self):
""" Test the get_host function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_host(self.session, 1)
self.assertEqual(results, None)
tests.create_site(self.session)
tests.create_hosts(self.session)
results = mirrormanager2.lib.get_host(self.session, 1)
self.assertEqual(results.name, 'mirror.localhost')
self.assertEqual(results.country, 'US')
results = mirrormanager2.lib.get_host(self.session, 2)
self.assertEqual(results.name, 'mirror2.localhost')
self.assertEqual(results.country, 'FR')
results = mirrormanager2.lib.get_host(self.session, 3)
self.assertEqual(results.name, 'private.localhost')
self.assertEqual(results.country, 'NL')
def test_get_hosts(self):
""" Test the get_hosts function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_hosts(self.session)
self.assertEqual(results, [])
tests.create_site(self.session)
tests.create_hosts(self.session)
results = mirrormanager2.lib.get_hosts(self.session)
self.assertEqual(len(results), 3)
self.assertEqual(results[0].name, 'mirror.localhost')
self.assertEqual(results[0].country, 'US')
self.assertEqual(results[1].name, 'mirror2.localhost')
self.assertEqual(results[1].country, 'FR')
self.assertEqual(results[2].name, 'private.localhost')
self.assertEqual(results[2].country, 'NL')
def test_get_host_acl_ip(self):
""" Test the get_host_acl_ip function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_host_acl_ip(self.session, 1)
self.assertEqual(results, None)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_hostaclip(self.session)
results = mirrormanager2.lib.get_host_acl_ip(self.session, 1)
self.assertEqual(results.host.name, 'mirror.localhost')
self.assertEqual(results.host.country, 'US')
results = mirrormanager2.lib.get_host_acl_ip(self.session, 2)
self.assertEqual(results.host.name, 'mirror2.localhost')
self.assertEqual(results.host.country, 'FR')
def test_get_host_netblock(self):
""" Test the get_host_netblock function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_host_netblock(self.session, 1)
self.assertEqual(results, None)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_hostnetblock(self.session)
results = mirrormanager2.lib.get_host_netblock(self.session, 1)
self.assertEqual(results.host.name, 'private.localhost')
self.assertEqual(results.host.country, 'NL')
results = mirrormanager2.lib.get_host_netblock(self.session, 2)
self.assertEqual(results, None)
def test_get_host_peer_asn(self):
""" Test the get_host_peer_asn function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_host_peer_asn(self.session, 1)
self.assertEqual(results, None)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_hostpeerasn(self.session)
results = mirrormanager2.lib.get_host_peer_asn(self.session, 1)
self.assertEqual(results.host.name, 'private.localhost')
self.assertEqual(results.host.country, 'NL')
results = mirrormanager2.lib.get_host_peer_asn(self.session, 2)
self.assertEqual(results, None)
def test_get_host_country(self):
""" Test the get_host_country function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_host_country(self.session, 1)
self.assertEqual(results, None)
tests.create_base_items(self.session)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_hostcountry(self.session)
results = mirrormanager2.lib.get_host_country(self.session, 1)
self.assertEqual(results.host.name, 'mirror.localhost')
self.assertEqual(results.host.country, 'US')
results = mirrormanager2.lib.get_host_country(self.session, 2)
self.assertEqual(results.host.name, 'mirror2.localhost')
self.assertEqual(results.host.country, 'FR')
results = mirrormanager2.lib.get_host_country(self.session, 3)
self.assertEqual(results, None)
def test_get_host_category(self):
""" Test the get_host_category function of mirrormanager2.lib. """
results = mirrormanager2.lib.get_host_category(self.session, 1)
self.assertEqual(results, None)
tests.create_base_items(self.session)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_directory(self.session)
tests.create_category(self.session)
tests.create_hostcategory(self.session)
results = mirrormanager2.lib.get_host_category(self.session, 1)
self.assertEqual(results.host.name, 'mirror.localhost')
self.assertEqual(results.host.country, 'US')
results = mirrormanager2.lib.get_host_category(self.session, 2)
self.assertEqual(results.host.name, 'mirror.localhost')
self.assertEqual(results.host.country, 'US')
results = mirrormanager2.lib.get_host_category(self.session, 3)
self.assertEqual(results.host.name, 'mirror2.localhost')
self.assertEqual(results.host.country, 'FR')
results = mirrormanager2.lib.get_host_category(self.session, 4)
self.assertEqual(results.host.name, 'mirror2.localhost')
self.assertEqual(results.host.country, 'FR')
results = mirrormanager2.lib.get_host_category(self.session, 5)
self.assertEqual(results, None)
def test_get_host_category_by_hostid_category(self):
""" Test the get_host_category_by_hostid_category function of
mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_host_category_by_hostid_category(
self.session, 1, 'Fedora Linux')
self.assertEqual(results, [])
tests.create_base_items(self.session)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_directory(self.session)
tests.create_category(self.session)
tests.create_hostcategory(self.session)
results = mirrormanager2.lib.get_host_category_by_hostid_category(
self.session, 1, 'Fedora Linux')
self.assertEqual(len(results), 1)
self.assertEqual(results[0].host.name, 'mirror.localhost')
self.assertEqual(results[0].host.country, 'US')
results = mirrormanager2.lib.get_host_category_by_hostid_category(
self.session, 2, 'Fedora Linux')
self.assertEqual(len(results), 1)
self.assertEqual(results[0].host.name, 'mirror2.localhost')
self.assertEqual(results[0].host.country, 'FR')
results = mirrormanager2.lib.get_host_category_by_hostid_category(
self.session, 3, 'Fedora Linux')
self.assertEqual(results, [])
def test_get_host_category_url_by_id(self):
""" Test the get_host_category_url_by_id function of
mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_host_category_url_by_id(
self.session, 1)
self.assertEqual(results, None)
tests.create_base_items(self.session)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_directory(self.session)
tests.create_category(self.session)
tests.create_hostcategory(self.session)
tests.create_hostcategoryurl(self.session)
for i in range(4):
results = mirrormanager2.lib.get_host_category_url_by_id(
self.session, i+1)
self.assertEqual(
results.host_category.host.name, 'mirror.localhost')
self.assertEqual(
results.host_category.host.country, 'US')
results = mirrormanager2.lib.get_host_category_url_by_id(
self.session, 5)
self.assertEqual(results, None)
def test_get_host_category_url(self):
""" Test the get_host_category_url function of
mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_host_category_url(self.session)
self.assertEqual(results, [])
tests.create_base_items(self.session)
tests.create_site(self.session)
tests.create_hosts(self.session)
tests.create_directory(self.session)
tests.create_category(self.session)
tests.create_hostcategory(self.session)
tests.create_hostcategoryurl(self.session)
results = mirrormanager2.lib.get_host_category_url(self.session)
self.assertEqual(len(results), 4)
for i in range(4):
self.assertEqual(
results[i].host_category.host.name, 'mirror.localhost')
self.assertEqual(
results[i].host_category.host.country, 'US')
def test_get_country_by_name(self):
""" Test the get_country_by_name function of
mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_country_by_name(self.session, 'FR')
self.assertEqual(results, None)
tests.create_base_items(self.session)
for i in ['FR', 'US']:
results = mirrormanager2.lib.get_country_by_name(
self.session, i)
self.assertEqual(results.code, i)
results = mirrormanager2.lib.get_country_by_name(self.session, 'BE')
self.assertEqual(results, None)
def test_get_country_continent_redirect(self):
""" Test the get_country_continent_redirect function of
mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_country_continent_redirect(
self.session)
self.assertEqual(results, [])
tests.create_base_items(self.session)
results = mirrormanager2.lib.get_country_continent_redirect(
self.session)
self.assertEqual(len(results), 3)
self.assertEqual(results[0].country, 'IL')
self.assertEqual(results[0].continent, 'EU')
self.assertEqual(results[1].country, 'AM')
self.assertEqual(results[1].continent, 'EU')
self.assertEqual(results[2].country, 'JO')
self.assertEqual(results[2].continent, 'EU')
def test_get_user_by_username(self):
""" Test the get_user_by_username function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_user_by_username(
self.session, 'pingou')
self.assertEqual(results, None)
tests.create_base_items(self.session)
results = mirrormanager2.lib.get_user_by_username(
self.session, 'pingou')
self.assertEqual(results.user_name, 'pingou')
self.assertEqual(results.email_address, 'pingou@fp.o')
results = mirrormanager2.lib.get_user_by_username(
self.session, 'ralph')
self.assertEqual(results.user_name, 'ralph')
self.assertEqual(results.email_address, 'ralph@fp.o')
results = mirrormanager2.lib.get_user_by_username(
self.session, 'foo')
self.assertEqual(results, None)
def test_get_user_by_email(self):
""" Test the get_user_by_email function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_user_by_email(
self.session, 'pingou@fp.o')
self.assertEqual(results, None)
tests.create_base_items(self.session)
results = mirrormanager2.lib.get_user_by_email(
self.session, 'pingou@fp.o')
self.assertEqual(results.user_name, 'pingou')
self.assertEqual(results.email_address, 'pingou@fp.o')
results = mirrormanager2.lib.get_user_by_email(
self.session, 'ralph@fp.o')
self.assertEqual(results.user_name, 'ralph')
self.assertEqual(results.email_address, 'ralph@fp.o')
results = mirrormanager2.lib.get_user_by_email(
self.session, 'foo')
self.assertEqual(results, None)
def test_get_user_by_token(self):
""" Test the get_user_by_token function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_user_by_token(
self.session, 'bar')
self.assertEqual(results, None)
tests.create_base_items(self.session)
results = mirrormanager2.lib.get_user_by_token(
self.session, 'bar')
self.assertEqual(results.user_name, 'shaiton')
self.assertEqual(results.email_address, 'shaiton@fp.o')
results = mirrormanager2.lib.get_user_by_token(
self.session, 'foo')
self.assertEqual(results, None)
def test_get_session_by_visitkey(self):
""" Test the get_session_by_visitkey function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_session_by_visitkey(
self.session, 'foo')
self.assertEqual(results, None)
tests.create_base_items(self.session)
results = mirrormanager2.lib.get_session_by_visitkey(
self.session, 'foo')
self.assertEqual(results.user.user_name, 'pingou')
self.assertEqual(results.user.email_address, 'pingou@fp.o')
self.assertEqual(results.user_ip, '127.0.0.1')
results = mirrormanager2.lib.get_session_by_visitkey(
self.session, 'bar')
self.assertEqual(results, None)
def test_get_version_by_name_version(self):
""" Test the get_version_by_name_version function of
mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_version_by_name_version(
self.session, 'Fedora', '21')
self.assertEqual(results, None)
tests.create_base_items(self.session)
tests.create_version(self.session)
results = mirrormanager2.lib.get_version_by_name_version(
self.session, 'Fedora', 21)
self.assertEqual(results.product.name, 'Fedora')
self.assertEqual(results.name, '21')
results = mirrormanager2.lib.get_version_by_name_version(
self.session, 'Fedora', '21-alpha')
self.assertEqual(results.product.name, 'Fedora')
self.assertEqual(results.name, '21-alpha')
self.assertEqual(results.is_test, True)
results = mirrormanager2.lib.get_session_by_visitkey(
self.session, 'bar')
self.assertEqual(results, None)
def test_get_versions(self):
""" Test the get_versions function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_versions(self.session)
self.assertEqual(results, [])
tests.create_base_items(self.session)
tests.create_version(self.session)
results = mirrormanager2.lib.get_versions(self.session)
self.assertEqual(len(results), 6)
self.assertEqual(results[0].product.name, 'Fedora')
self.assertEqual(results[0].name, '20')
self.assertEqual(results[1].product.name, 'Fedora')
self.assertEqual(results[1].name, '21-alpha')
self.assertEqual(results[2].product.name, 'Fedora')
self.assertEqual(results[2].name, '21')
def test_get_arch_by_name(self):
""" Test the get_arch_by_name function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_arch_by_name(self.session, 'i386')
self.assertEqual(results, None)
tests.create_base_items(self.session)
results = mirrormanager2.lib.get_arch_by_name(self.session, 'i386')
self.assertEqual(results.name, 'i386')
results = mirrormanager2.lib.get_arch_by_name(self.session, 'i686')
self.assertEqual(results, None)
def test_get_categories(self):
""" Test the get_categories function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_categories(self.session)
self.assertEqual(results, [])
tests.create_base_items(self.session)
tests.create_directory(self.session)
tests.create_category(self.session)
results = mirrormanager2.lib.get_categories(self.session)
self.assertEqual(len(results), 2)
self.assertEqual(results[0].name, 'Fedora Linux')
self.assertEqual(results[0].product.name, 'Fedora')
self.assertEqual(results[1].name, 'Fedora EPEL')
self.assertEqual(results[1].product.name, 'EPEL')
def test_get_category_by_name(self):
""" Test the get_category_by_name function of mirrormanager2.lib.
"""
results = mirrormanager2.lib.get_category_by_name(
self.session, 'Fedora EPEL')
self.assertEqual(results, None)
tests.create_base_items(self.session)
tests.create_directory(self.session)
tests.create_category(self.session)
results = mirrormanager2.lib.get_category_by_name(
self.session, 'Fedora EPEL')
self.assertEqual(results.name, 'Fedora EPEL')
self.assertEqual(results.product.name, 'EPEL')
results = mirrormanager2.lib.get_category_by_name(
self.session, 'Fedora Linux')
self.assertEqual(results.name, 'Fedora Linux')
self.assertEqual(results.product.name, 'Fedora')
results = mirrormanager2.lib.get_category_by_name(
self.session, 'foo')
self.assertEqual(results, None)
def test_get_category_directory(self):
""" Test the get_category_directory function of mirrormanager2.lib.
"""
        results = mirrormanager2.lib.get_category_directory(self.session)
        self.assertEqual(results, [])
        tests.create_base_items(self.session)
        tests.create_directory(self.session)
        tests.create_category(self.session)
        tests.create_categorydirectory(self.session)
        results = mirrormanager2.lib.get_category_directory(self.session)
        self.assertEqual(len(results), 4)
        self.assertEqual(
            results[0].category.name, 'Fedora Linux')
        self.assertEqual(
            results[0].directory.name, 'pub/fedora/linux/releases')
        self.assertEqual(
            results[1].category.name, 'Fedora EPEL')
        self.assertEqual(
            results[1].directory.name, 'pub/epel')

    def test_get_product_by_name(self):
        """ Test the get_product_by_name function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_product_by_name(
            self.session, 'Fedora')
        self.assertEqual(results, None)
        tests.create_base_items(self.session)
        results = mirrormanager2.lib.get_product_by_name(
            self.session, 'Fedora')
        self.assertEqual(results.name, 'Fedora')
        results = mirrormanager2.lib.get_product_by_name(
            self.session, 'EPEL')
        self.assertEqual(results.name, 'EPEL')
        results = mirrormanager2.lib.get_product_by_name(
            self.session, 'foo')
        self.assertEqual(results, None)

    def test_get_products(self):
        """ Test the get_products function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_products(self.session)
        self.assertEqual(results, [])
        tests.create_base_items(self.session)
        results = mirrormanager2.lib.get_products(self.session)
        self.assertEqual(len(results), 2)
        self.assertEqual(
            results[0].name, 'EPEL')
        self.assertEqual(
            results[1].name, 'Fedora')

    def test_get_repo_prefix_arch(self):
        """ Test the get_repo_prefix_arch function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_repo_prefix_arch(
            self.session, 'updates-testing-f20', 'x86_64')
        self.assertEqual(results, None)
        tests.create_base_items(self.session)
        tests.create_version(self.session)
        tests.create_directory(self.session)
        tests.create_category(self.session)
        tests.create_repository(self.session)
        results = mirrormanager2.lib.get_repo_prefix_arch(
            self.session, 'updates-testing-f20', 'x86_64')
        self.assertEqual(
            results.name, 'pub/fedora/linux/updates/testing/20/x86_64')
        results = mirrormanager2.lib.get_repo_prefix_arch(
            self.session, 'updates-testing-f21', 'x86_64')
        self.assertEqual(
            results.name, 'pub/fedora/linux/updates/testing/21/x86_64')
        results = mirrormanager2.lib.get_repo_prefix_arch(
            self.session, 'updates-testing-f20', 'i386')
        self.assertEqual(results, None)

    def test_get_repo_by_name(self):
        """ Test the get_repo_by_name function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_repo_by_name(
            self.session, 'pub/fedora/linux/updates/testing/19/x86_64')
        self.assertEqual(results, None)
        tests.create_base_items(self.session)
        tests.create_version(self.session)
        tests.create_directory(self.session)
        tests.create_category(self.session)
        tests.create_repository(self.session)
        results = mirrormanager2.lib.get_repo_by_name(
            self.session, 'pub/fedora/linux/updates/testing/19/x86_64')
        self.assertEqual(
            results.name, 'pub/fedora/linux/updates/testing/19/x86_64')
        results = mirrormanager2.lib.get_repo_by_name(
            self.session, 'pub/fedora/linux/updates/testing/20/x86_64')
        self.assertEqual(
            results.name, 'pub/fedora/linux/updates/testing/20/x86_64')
        results = mirrormanager2.lib.get_repo_by_name(
            self.session, 'pub/fedora/linux/updates/testing/19/i386')
        self.assertEqual(results, None)

    def test_get_repo_by_dir(self):
        """ Test the get_repo_by_dir function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_repo_by_dir(
            self.session, 'pub/fedora/linux/updates/testing/21/x86_64')
        self.assertEqual(results, [])
        tests.create_base_items(self.session)
        tests.create_version(self.session)
        tests.create_directory(self.session)
        tests.create_category(self.session)
        tests.create_repository(self.session)
        results = mirrormanager2.lib.get_repo_by_dir(
            self.session, 'pub/fedora/linux/updates/testing/21/x86_64')
        self.assertEqual(len(results), 1)
        self.assertEqual(
            results[0].name, 'pub/fedora/linux/updates/testing/21/x86_64')
        self.assertEqual(results[0].arch.name, 'x86_64')

    def test_get_repositories(self):
        """ Test the get_repositories function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_repositories(self.session)
        self.assertEqual(results, [])
        tests.create_base_items(self.session)
        tests.create_version(self.session)
        tests.create_directory(self.session)
        tests.create_category(self.session)
        tests.create_repository(self.session)
        results = mirrormanager2.lib.get_repositories(self.session)
        self.assertEqual(len(results), 3)
        self.assertEqual(
            results[0].name, 'pub/fedora/linux/updates/testing/19/x86_64')
        self.assertEqual(results[0].arch.name, 'x86_64')
        self.assertEqual(
            results[1].name, 'pub/fedora/linux/updates/testing/20/x86_64')
        self.assertEqual(results[1].arch.name, 'x86_64')
        self.assertEqual(
            results[2].name, 'pub/fedora/linux/updates/testing/21/x86_64')
        self.assertEqual(results[2].arch.name, 'x86_64')

    def test_get_reporedirect(self):
        """ Test the get_reporedirect function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_reporedirect(self.session)
        self.assertEqual(results, [])
        tests.create_repositoryredirect(self.session)
        results = mirrormanager2.lib.get_reporedirect(self.session)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].from_repo, 'fedora-rawhide')
        self.assertEqual(results[0].to_repo, 'rawhide')
        self.assertEqual(results[1].from_repo, 'fedora-install-rawhide')
        self.assertEqual(results[1].to_repo, 'rawhide')
        self.assertEqual(results[2].from_repo, 'epel-6.0')
        self.assertEqual(results[2].to_repo, 'epel-6')

    def test_get_arches(self):
        """ Test the get_arches function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_arches(self.session)
        self.assertEqual(results, [])
        tests.create_base_items(self.session)
        results = mirrormanager2.lib.get_arches(self.session)
        self.assertEqual(len(results), 4)
        self.assertEqual(results[0].name, 'i386')
        self.assertEqual(results[1].name, 'ppc')
        self.assertEqual(results[2].name, 'source')
        self.assertEqual(results[3].name, 'x86_64')

    def test_add_admin_to_site(self):
        """ Test the add_admin_to_site function of mirrormanager2.lib.
        """
        tests.create_base_items(self.session)
        tests.create_site(self.session)
        site = mirrormanager2.lib.get_site(self.session, 1)
        results = mirrormanager2.lib.add_admin_to_site(
            self.session, site, 'pingou')
        self.assertEqual(results, 'pingou added as an admin')
        results = mirrormanager2.lib.add_admin_to_site(
            self.session, site, 'pingou')
        self.assertEqual(results, 'pingou was already listed as an admin')

    def test_get_locations(self):
        """ Test the get_locations function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_locations(self.session)
        self.assertEqual(results, [])
        tests.create_location(self.session)
        results = mirrormanager2.lib.get_locations(self.session)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'foo')
        self.assertEqual(results[1].name, 'bar')
        self.assertEqual(results[2].name, 'foobar')

    def test_get_netblock_country(self):
        """ Test the get_netblock_country function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_netblock_country(self.session)
        self.assertEqual(results, [])
        tests.create_netblockcountry(self.session)
        results = mirrormanager2.lib.get_netblock_country(self.session)
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].netblock, '127.0.0.0/24')
        self.assertEqual(results[0].country, 'AU')

    def test_get_mirrors(self):
        """ Test the get_mirrors function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_mirrors(self.session)
        self.assertEqual(results, [])
        tests.create_base_items(self.session)
        tests.create_site(self.session)
        tests.create_hosts(self.session)
        tests.create_directory(self.session)
        tests.create_category(self.session)
        tests.create_hostcategory(self.session)
        tests.create_hostcategoryurl(self.session)
        tests.create_categorydirectory(self.session)
        tests.create_netblockcountry(self.session)
        results = mirrormanager2.lib.get_mirrors(self.session)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(self.session, private=True)
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].name, 'private.localhost')
        results = mirrormanager2.lib.get_mirrors(self.session, internet2=True)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, internet2_clients=True)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, internet2_clients=False)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, asn_clients=True)
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, asn_clients=False)
        self.assertEqual(len(results), 2)
        self.assertEqual(results[0].name, 'private.localhost')
        self.assertEqual(results[1].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, admin_active=False)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, admin_active=True)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, user_active=False)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, user_active=True)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, host_category_url_private=True)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, host_category_url_private=False)
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, last_crawl_duration=True)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, last_crawl_duration=False)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, last_crawled=True)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, last_crawled=False)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, last_checked_in=True)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, last_checked_in=False)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, site_private=True)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, site_private=False)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, site_user_active=False)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, site_user_active=True)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, site_admin_active=False)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, site_admin_active=True)
        self.assertEqual(len(results), 3)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'private.localhost')
        self.assertEqual(results[2].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, up2date=True)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, up2date=False)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, version_id=1)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, version_id=3)
        tests.create_version(self.session)
        tests.create_repository(self.session)
        results = mirrormanager2.lib.get_mirrors(
            self.session, version_id=1)
        self.assertEqual(len(results), 2)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, version_id=3)
        self.assertEqual(len(results), 2)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'mirror.localhost')
        results = mirrormanager2.lib.get_mirrors(
            self.session, arch_id=1)
        self.assertEqual(len(results), 0)
        results = mirrormanager2.lib.get_mirrors(
            self.session, arch_id=3)
        self.assertEqual(len(results), 2)
        self.assertEqual(results[0].name, 'mirror2.localhost')
        self.assertEqual(results[1].name, 'mirror.localhost')

    def test_get_user_sites(self):
        """ Test the get_user_sites function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_user_sites(self.session, 'pingou')
        self.assertEqual(results, [])
        self.test_add_admin_to_site()
        results = mirrormanager2.lib.get_user_sites(self.session, 'pingou')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].name, 'test-mirror')

    def test_id_generator(self):
        """ Test the id_generator function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.id_generator(size=5, chars=['a'])
        self.assertEqual(results, 'aaaaa')
        results = mirrormanager2.lib.id_generator(size=5, chars=['1'])
        self.assertEqual(results, '11111')

    def test_get_directory_by_name(self):
        """ Test the get_directory_by_name function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_directory_by_name(
            self.session, 'pub/epel')
        self.assertEqual(results, None)
        tests.create_directory(self.session)
        results = mirrormanager2.lib.get_directory_by_name(
            self.session, 'pub/epel')
        self.assertEqual(results.name, 'pub/epel')
        self.assertEqual(results.readable, True)
        results = mirrormanager2.lib.get_directory_by_name(
            self.session, 'pub/fedora/linux/extras')
        self.assertEqual(results.name, 'pub/fedora/linux/extras')
        self.assertEqual(results.readable, True)
        results = mirrormanager2.lib.get_directory_by_name(
            self.session, 'pub/fedora/linux/updates/testing/19/x86_64')
        self.assertEqual(
            results.name, 'pub/fedora/linux/updates/testing/19/x86_64')
        self.assertEqual(results.readable, True)

    def test_get_directories(self):
        """ Test the get_directories function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_directories(self.session)
        self.assertEqual(results, [])
        tests.create_directory(self.session)
        results = mirrormanager2.lib.get_directories(self.session)
        self.assertEqual(len(results), 9)
        self.assertEqual(results[0].name, 'pub/fedora/linux/releases')
        self.assertEqual(results[1].name, 'pub/fedora/linux/extras')
        self.assertEqual(results[2].name, 'pub/epel')

    def test_get_directory_by_id(self):
        """ Test the get_directory_by_id function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_directory_by_id(
            self.session, 1)
        self.assertEqual(results, None)
        tests.create_directory(self.session)
        results = mirrormanager2.lib.get_directory_by_id(self.session, 3)
        self.assertEqual(results.name, 'pub/epel')
        self.assertEqual(results.readable, True)
        results = mirrormanager2.lib.get_directory_by_id(self.session, 2)
        self.assertEqual(results.name, 'pub/fedora/linux/extras')
        self.assertEqual(results.readable, True)
        results = mirrormanager2.lib.get_directory_by_id(self.session, 4)
        self.assertEqual(
            results.name, 'pub/fedora/linux/releases/20')
        self.assertEqual(results.readable, True)

    def test_get_hostcategorydir_by_hostcategoryid_and_path(self):
        """ Test the get_hostcategorydir_by_hostcategoryid_and_path
        function of mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_hostcategorydir_by_hostcategoryid_and_path(
            self.session, 2, 'pub/fedora/linux/releases/21')
        self.assertEqual(results, [])
        tests.create_base_items(self.session)
        tests.create_site(self.session)
        tests.create_hosts(self.session)
        tests.create_directory(self.session)
        tests.create_category(self.session)
        tests.create_hostcategory(self.session)
        tests.create_hostcategorydir(self.session)
        results = mirrormanager2.lib.get_hostcategorydir_by_hostcategoryid_and_path(
            self.session, 3, 'pub/fedora/linux/releases/21')
        self.assertEqual(len(results), 1)
        self.assertEqual(
            results[0].directory.name, 'pub/fedora/linux/releases/21')
        self.assertEqual(
            results[0].host_category.category.name, 'Fedora Linux')

    def test_get_directory_exclusive_host(self):
        """ Test the get_directory_exclusive_host function of
        mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_directory_exclusive_host(
            self.session)
        self.assertEqual(results, [])
        tests.create_base_items(self.session)
        tests.create_site(self.session)
        tests.create_hosts(self.session)
        tests.create_directory(self.session)
        tests.create_directoryexclusivehost(self.session)
        results = mirrormanager2.lib.get_directory_exclusive_host(
            self.session)
        self.assertEqual(len(results), 2)
        self.assertEqual(
            results[0].dname, 'pub/fedora/linux/releases/20')
        self.assertEqual(
            results[0].host_id, 1)
        self.assertEqual(
            results[1].dname, 'pub/fedora/linux/releases/21')
        self.assertEqual(
            results[1].host_id, 3)

    def test_get_file_detail(self):
        """ Test the get_file_detail function of
        mirrormanager2.lib.
        """
        results = mirrormanager2.lib.get_file_detail(
            self.session, 'repomd.xml', 7)
        self.assertEqual(results, None)
        tests.create_directory(self.session)
        tests.create_filedetail(self.session)
        results = mirrormanager2.lib.get_file_detail(
            self.session, 'repomd.xml', 7)
        self.assertEqual(results.md5, 'foo_md5')
        self.assertEqual(
            results.directory.name,
            'pub/fedora/linux/updates/testing/19/x86_64')
        results = mirrormanager2.lib.get_file_detail(
            self.session, 'repomd.xml', 7, md5='foo_md5')
        self.assertEqual(results.md5, 'foo_md5')
        self.assertEqual(
            results.directory.name,
            'pub/fedora/linux/updates/testing/19/x86_64')
        results = mirrormanager2.lib.get_file_detail(
            self.session, 'repomd.xml', 7, sha1='foo_sha1')
        self.assertEqual(results.md5, 'foo_md5')
        self.assertEqual(
            results.directory.name,
            'pub/fedora/linux/updates/testing/19/x86_64')
        results = mirrormanager2.lib.get_file_detail(
            self.session, 'repomd.xml', 7, sha256='foo_sha256')
        self.assertEqual(results.md5, 'foo_md5')
        self.assertEqual(
            results.directory.name,
            'pub/fedora/linux/updates/testing/19/x86_64')
        results = mirrormanager2.lib.get_file_detail(
            self.session, 'repomd.xml', 7, sha512='foo_sha512')
        self.assertEqual(results.md5, 'foo_md5')
        self.assertEqual(
            results.directory.name,
            'pub/fedora/linux/updates/testing/19/x86_64')
        results = mirrormanager2.lib.get_file_detail(
            self.session, 'repomd.xml', 7, size=2973)
        self.assertEqual(results, None)
        results = mirrormanager2.lib.get_file_detail(
            self.session, 'repomd.xml', 7, timestamp=1357758825)
        self.assertEqual(results.md5, 'foo_md5')
        self.assertEqual(
            results.directory.name,
            'pub/fedora/linux/updates/testing/19/x86_64')


if __name__ == '__main__':
    SUITE = unittest.TestLoader().loadTestsFromTestCase(MMLibtests)
    unittest.TextTestRunner(verbosity=10).run(SUITE)
| 40.267016 | 84 | 0.665561 | 5,331 | 46,146 | 5.584881 | 0.03958 | 0.167769 | 0.20616 | 0.151446 | 0.932657 | 0.89205 | 0.852282 | 0.806603 | 0.771034 | 0.703591 | 0 | 0.021444 | 0.220864 | 46,146 | 1,145 | 85 | 40.302183 | 0.806642 | 0.067893 | 0 | 0.693683 | 0 | 0 | 0.084062 | 0.029462 | 0 | 0 | 0 | 0 | 0.396901 | 1 | 0.056019 | false | 0 | 0.007151 | 0 | 0.064362 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
28c27caeb1344b4e346cb868f2b6d37f9b1d75d6 | 816 | py | Python | tests/components/accordion/test_accordion.py | Crown-Commercial-Service/govuk-frontend-jinja | ddbe208a976ffa4ca330881c506c5200dfa69851 | [
"MIT"
] | 7 | 2019-09-25T13:59:35.000Z | 2021-06-30T11:13:22.000Z | tests/components/accordion/test_accordion.py | Crown-Commercial-Service/govuk-frontend-jinja | ddbe208a976ffa4ca330881c506c5200dfa69851 | [
"MIT"
] | 23 | 2019-08-20T10:52:49.000Z | 2021-06-02T14:21:16.000Z | tests/components/accordion/test_accordion.py | Crown-Commercial-Service/govuk-frontend-jinja | ddbe208a976ffa4ca330881c506c5200dfa69851 | [
"MIT"
] | 6 | 2019-08-29T14:02:25.000Z | 2021-04-10T20:20:23.000Z | import pytest

pytestmark = pytest.mark.skipif(
    pytest.helpers.govuk_frontend_version_info() < (2, 5),
    reason="requires govuk-frontend >= 2.5",
)


def test_accordion(env, template, expected, similar):
    template = env.from_string(template)
    assert similar(template.render(), expected)


def test_accordion_with_additional_descriptions(env, template, expected, similar):
    template = env.from_string(template)
    assert similar(template.render(), expected)


def test_accordion_with_one_section_open(env, template, expected, similar):
    template = env.from_string(template)
    assert similar(template.render(), expected)


def test_accordion_with_all_sections_already_open(env, template, expected, similar):
    template = env.from_string(template)
    assert similar(template.render(), expected)
| 30.222222 | 84 | 0.76348 | 100 | 816 | 6 | 0.34 | 0.2 | 0.106667 | 0.173333 | 0.713333 | 0.713333 | 0.713333 | 0.713333 | 0.713333 | 0.713333 | 0 | 0.005642 | 0.131127 | 816 | 26 | 85 | 31.384615 | 0.840621 | 0 | 0 | 0.470588 | 0 | 0 | 0.036765 | 0 | 0 | 0 | 0 | 0 | 0.235294 | 1 | 0.235294 | false | 0 | 0.058824 | 0 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e90520c8cc6fadc2369ac05d83be8ca507547994 | 7,158 | py | Python | datasets/task_sampler.py | alessiabertugli/FUSION | 6d55ccca32da76be05450a8bb27bcb21ac3acc6a | [
"Apache-2.0"
] | 13 | 2020-11-18T15:15:31.000Z | 2021-12-18T03:07:03.000Z | datasets/task_sampler.py | alessiabertugli/FUSION | 6d55ccca32da76be05450a8bb27bcb21ac3acc6a | [
"Apache-2.0"
] | 1 | 2021-02-02T00:06:21.000Z | 2021-02-02T14:07:13.000Z | datasets/task_sampler.py | alessiabertugli/FUSION-ME | 6d55ccca32da76be05450a8bb27bcb21ac3acc6a | [
"Apache-2.0"
] | 1 | 2020-12-24T13:04:54.000Z | 2020-12-24T13:04:54.000Z | import copy
import logging
import numpy as np
import torch
logger = logging.getLogger("experiment")


class SamplerFactory:
    def __init__(self):
        pass

    @staticmethod
    def get_sampler(dataset, tasks, trainset, testset=None):
        if "omni" in dataset:
            return OmniglotSampler(tasks, trainset, testset)
        elif "imagenet" in dataset:
            return ImagenetSampler(tasks, trainset, testset)


class OmniglotSampler:
    # Class to sample tasks
    def __init__(self, tasks, trainset, testset):
        self.tasks = tasks
        self.task_sampler = SampleOmni(trainset, testset)
        self.task_sampler.add_complete_iterator(list(range(0, int(len(self.tasks)))))

    def get_complete_iterator(self):
        return self.task_sampler.complete_iterator

    def sample_task(self, t, train=True):
        return self.task_sampler.get(t, train)


class ImagenetSampler:
    # Class to sample tasks
    def __init__(self, tasks, trainset, testset):
        self.tasks = tasks
        self.task_sampler = SampleImagenet(trainset, testset)
        self.task_sampler.add_complete_iterator(list(range(0, int(len(self.tasks)))))

    def get_complete_iterator(self):
        return self.task_sampler.complete_iterator

    def sample_task(self, t, train=True):
        return self.task_sampler.get(t, train)

    def sample_tasks(self, t, train=False):
        # assert(false)
        dataset = self.task_sampler.get_task_trainset(t, train)
        train_iterator = torch.utils.data.DataLoader(
            dataset, batch_size=1, shuffle=True, num_workers=0)
        return train_iterator


class SampleOmni:
    def __init__(self, trainset, testset):
        self.task_iterators = []
        self.trainset = trainset
        self.testset = testset
        self.iterators = {}
        self.test_iterators = {}

    def add_complete_iterator(self, tasks):
        dataset = self.get_task_trainset(tasks, True)
        # dataset = self.get_task_testset(tasks)
        train_iterator = torch.utils.data.DataLoader(
            dataset, batch_size=10, shuffle=True, num_workers=0)
        self.complete_iterator = train_iterator
        logger.info("Len of complete iterator = %d", len(self.complete_iterator) * 256)
        train_iterator2 = torch.utils.data.DataLoader(
            dataset, batch_size=1, shuffle=True, num_workers=0)
        self.another_complete_iterator = train_iterator2

    def add_task_iterator(self, task, train):
        dataset = self.get_task_trainset([task], train)
        train_iterator = torch.utils.data.DataLoader(
            dataset, batch_size=1, shuffle=True, num_workers=0)
        self.iterators[task] = train_iterator
        print("Task %d has been added to the list" % task)
        return train_iterator

    def get(self, tasks, train):
        if train:
            for task in tasks:
                if task in self.iterators:
                    return self.iterators[task]
                else:
                    return self.add_task_iterator(task, True)
        else:
            for task in tasks:
                if task in self.test_iterators:
                    return self.test_iterators[task]
                else:
                    return self.add_task_iterator(task, False)

    def get_task_trainset(self, tasks, train):
        if train:
            set = copy.deepcopy(self.trainset)
        else:
            set = copy.deepcopy(self.testset)
        # class labels -> set.targets
        class_labels = np.asarray(set.targets)
        indices = np.zeros_like(class_labels)
        for a in tasks:
            indices = indices + (class_labels == a).astype(int)
        indices = np.nonzero(indices)
        set.data = [set.data[i] for i in indices[0]]
        set.targets = [set.targets[i] for i in indices[0]]
        set.data2 = []
        set.targets2 = []
        return set

    def filter_upto(self, task):
        trainset = copy.deepcopy(self.trainset)
        trainset.data = trainset.data[trainset.data['target'] <= task]
        return trainset


class SampleImagenet:
    def __init__(self, trainset, testset):
        self.task_iterators = []
        self.trainset = trainset
        self.testset = testset
        self.iterators = {}
        self.test_iterators = {}

    def add_complete_iterator(self, tasks):
        dataset = self.get_task_trainset(tasks, True)
        # dataset = self.get_task_testset(tasks)
        train_iterator = torch.utils.data.DataLoader(
            dataset, batch_size=10, shuffle=True, num_workers=0)
        self.complete_iterator = train_iterator
        logger.info("Len of complete iterator = %d", len(self.complete_iterator) * 256)
        train_iterator2 = torch.utils.data.DataLoader(
            dataset, batch_size=1, shuffle=True, num_workers=0)
        self.another_complete_iterator = train_iterator2

    def add_task_iterator(self, task, train):
        dataset = self.get_task_trainset([task], train)
        train_iterator = torch.utils.data.DataLoader(
            dataset, batch_size=1, shuffle=True, num_workers=0)
        self.iterators[task] = train_iterator
        print("Task %d has been added to the list" % task)
        return train_iterator

    def get(self, tasks, train):
        if train:
            for task in tasks:
                if task in self.iterators:
                    return self.iterators[task]
                else:
                    return self.add_task_iterator(task, True)
        else:
            for task in tasks:
                if task in self.test_iterators:
                    return self.test_iterators[task]
                else:
                    return self.add_task_iterator(task, False)

    def get_task_trainset(self, tasks, train):
        if train:
            set = copy.deepcopy(self.trainset)
        else:
            set = copy.deepcopy(self.testset)
        # class labels -> set.targets
        class_labels = np.asarray(set.targets)
        indices = np.zeros_like(class_labels)
        for a in tasks:
            indices = indices + (class_labels == a).astype(int)
        indices = np.nonzero(indices)
        set.data = [set.data[i] for i in indices[0]]
        set.targets = [set.targets[i] for i in indices[0]]
        set.data2 = []
        set.targets2 = []
        return set

    def filter_upto(self, task):
        trainset = copy.deepcopy(self.trainset)
        trainset.data = trainset.data[trainset.data['target'] <= task]
        return trainset
| 33.764151 | 87 | 0.563426 | 777 | 7,158 | 5.019305 | 0.118404 | 0.065641 | 0.034615 | 0.043077 | 0.878205 | 0.878205 | 0.878205 | 0.878205 | 0.878205 | 0.878205 | 0 | 0.007755 | 0.351495 | 7,158 | 211 | 88 | 33.924171 | 0.8324 | 0.026683 | 0 | 0.857143 | 0 | 0 | 0.022995 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.136364 | false | 0.006494 | 0.025974 | 0.025974 | 0.331169 | 0.012987 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e907f1bc2f85c8f6d0aa8a2c23f6a2557fc99e5a | 4,362 | py | Python | app/test/integration/test_integration_halosqs_config_helper.py | ashmastaflash/halo-sqs-connector | e4334d98af5fd74bbe8155e5efe467f5022d1c5d | [
"BSD-2-Clause"
] | 1 | 2020-12-20T04:32:33.000Z | 2020-12-20T04:32:33.000Z | app/test/integration/test_integration_halosqs_config_helper.py | ashmastaflash/halo-sqs-connector | e4334d98af5fd74bbe8155e5efe467f5022d1c5d | [
"BSD-2-Clause"
] | 1 | 2019-06-05T21:53:01.000Z | 2019-06-05T23:33:45.000Z | app/test/integration/test_integration_halosqs_config_helper.py | cloudpassage/halo-sqs-connector | e4334d98af5fd74bbe8155e5efe467f5022d1c5d | [
"BSD-2-Clause"
] | null | null | null | import imp
import os
import pytest
import sys
module_name = 'halosqs'
here_dir = os.path.dirname(os.path.abspath(__file__))
module_path = os.path.join(here_dir, '../../')
sys.path.append(module_path)
fp, pathname, description = imp.find_module(module_name)
halosqs = imp.load_module(module_name, fp, pathname, description)


class TestIntegrationConfigHelper(object):
    def test_integration_config_helper_instantiate_send_events(
            self, monkeypatch):
        monkeypatch.setenv("HALO_API_KEY", "abc123")
        monkeypatch.setenv("HALO_API_SECRET_KEY", "def456")
        monkeypatch.setenv("AWS_ACCESS_KEY_ID", "abc123")
        monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "abc123")
        monkeypatch.setenv("AWS_DEFAULT_REGION", "abc123")
        monkeypatch.setenv("SQS_QUEUE_URL", "abc123")
        monkeypatch.setenv("APPLICATION_MODE", "send")
        monkeypatch.setenv("HALO_MODULE", "events")
        monkeypatch.setenv("START_TIME", "2018-01-01")
        assert halosqs.ConfigHelper()

    def test_integration_config_helper_instantiate_send_scans(
            self, monkeypatch):
        monkeypatch.setenv("HALO_API_KEY", "abc123")
        monkeypatch.setenv("HALO_API_SECRET_KEY", "def456")
        monkeypatch.setenv("AWS_ACCESS_KEY_ID", "abc123")
        monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "abc123")
        monkeypatch.setenv("AWS_DEFAULT_REGION", "abc123")
        monkeypatch.setenv("SQS_QUEUE_URL", "abc123")
        monkeypatch.setenv("APPLICATION_MODE", "send")
        monkeypatch.setenv("HALO_MODULE", "events")
        monkeypatch.setenv("START_TIME", "2018-01-01")
        monkeypatch.setenv("SCAN_FILTER", "module:fim;status:completed_with_errors")  # NOQA
        cfg_helper = halosqs.ConfigHelper()
        assert cfg_helper
        assert cfg_helper.search_params == {
            "module": "fim", "status": "completed_with_errors"}

    def test_integration_config_helper_instantiate_send_scans_fail(self, monkeypatch):  # NOQA
        monkeypatch.setenv("HALO_API_KEY", "abc123")
        monkeypatch.setenv("HALO_API_SECRET_KEY", "def456")
        monkeypatch.setenv("AWS_ACCESS_KEY_ID", "abc123")
        monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "abc123")
        monkeypatch.setenv("AWS_DEFAULT_REGION", "abc123")
        monkeypatch.setenv("SQS_QUEUE_URL", "abc123")
        monkeypatch.setenv("APPLICATION_MODE", "send")
        monkeypatch.setenv("HALO_MODULE", "events")
        monkeypatch.setenv("START_TIME", "2018-01-01")
        monkeypatch.setenv("SCAN_FILTER", "module:../etc/passwd")
        with pytest.raises(ValueError):
            assert halosqs.ConfigHelper()

    def test_integration_config_helper_instantiate_send_fail(
            self, monkeypatch):
        monkeypatch.setenv("HALO_API_KEY", "abc123")
        monkeypatch.setenv("HALO_API_SECRET_KEY", "def456")
        monkeypatch.setenv("AWS_ACCESS_KEY_ID", "abc123")
        monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "abc123")
        monkeypatch.setenv("AWS_DEFAULT_REGION", "abc123")
        monkeypatch.setenv("SQS_QUEUE_URL", "abc123")
        monkeypatch.setenv("APPLICATION_MODE", "send")
        with pytest.raises(ValueError):
            assert halosqs.ConfigHelper()

    def test_integration_config_helper_instantiate_receive(
            self, monkeypatch):
        monkeypatch.setenv("AWS_ACCESS_KEY_ID", "abc123")
        monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "abc123")
        monkeypatch.setenv("AWS_DEFAULT_REGION", "abc123")
        monkeypatch.setenv("SQS_QUEUE_URL", "abc123")
        monkeypatch.setenv("APPLICATION_MODE", "receive")
        monkeypatch.setenv("HALO_MODULE", "events")
        assert halosqs.ConfigHelper()

    def test_integration_config_helper_instantiate_receive_fail(
            self, monkeypatch):
        monkeypatch.setenv("AWS_ACCESS_KEY_ID", "abc123")
        monkeypatch.setenv("APPLICATION_MODE", "receive")
        monkeypatch.setenv("HALO_MODULE", "events")
        monkeypatch.delenv("APPLICATION_MODE")
        with pytest.raises(ValueError):
            assert halosqs.ConfigHelper()
# File: TrafficFlowClassification/train/__init__.py (repo: wmn7/Traffic-Classification, license: MIT)
'''
@Author: WANG Maonan
@Date: 2021-01-07 17:03:06
@Description:
@LastEditTime: 2021-01-07 17:03:07
'''
# File: src/__init__.py (repo: KentWangYQ/timingwheel, license: MIT)
from . import delay_queue, timer, timer_task_entry, timer_task_list, timing_wheel
__all__ = ['delay_queue', 'timer', 'timer_task_entry', 'timer_task_list', 'timing_wheel']
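Here `__all__` both re-exports the submodules and pins down exactly what `from src import *` exposes. A small sketch of that mechanism using a synthetic module (the names are illustrative, not part of the timingwheel API):

```python
import sys
import types

# Build a throwaway module whose __all__ exposes only one of its two names.
demo = types.ModuleType("demo_pkg")
exec("visible = 1\nhidden = 2\n__all__ = ['visible']", demo.__dict__)
sys.modules["demo_pkg"] = demo  # register so the import machinery can find it

namespace = {}
exec("from demo_pkg import *", namespace)
assert "visible" in namespace and "hidden" not in namespace
```

Without `__all__`, star-import would instead pull in every public (non-underscore) top-level name, so listing the submodules explicitly keeps the package's wildcard surface stable.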
# File: pyaz/storage/blob/copy/__init__.py (repo: py-az-cli/py-az-cli, license: MIT)
'''
Manage blob copy operations. Use `az storage blob show` to check the status of the blobs.
'''
from .... pyaz_utils import _call_az
def start(destination_blob, destination_container, account_key=None, account_name=None, auth_mode=None, connection_string=None, destination_if_match=None, destination_if_modified_since=None, destination_if_none_match=None, destination_if_unmodified_since=None, destination_lease_id=None, destination_tags_condition=None, metadata=None, rehydrate_priority=None, requires_sync=None, sas_token=None, source_account_key=None, source_account_name=None, source_blob=None, source_container=None, source_if_match=None, source_if_modified_since=None, source_if_none_match=None, source_if_unmodified_since=None, source_lease_id=None, source_path=None, source_sas=None, source_share=None, source_snapshot=None, source_tags_condition=None, source_uri=None, tags=None, tier=None, timeout=None):
'''
Copy a blob asynchronously. Use `az storage blob show` to check the status of the blobs.
Required Parameters:
    - destination_blob -- Name of the destination blob. If it exists, it will be overwritten.
- destination_container -- The container name.
Optional Parameters:
- account_key -- Storage account key. Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_KEY
- account_name -- Storage account name. Related environment variable: AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either storage account key or a SAS token. If neither are present, the command will try to query the storage account key using the authenticated Azure account. If a large number of storage commands are executed the API quota may be hit
- auth_mode -- The mode in which to run the command. "login" mode will directly use your login credentials for the authentication. The legacy "key" mode will attempt to query for an account key if no authentication parameters for the account are provided. Environment variable: AZURE_STORAGE_AUTH_MODE
- connection_string -- Storage account connection string. Environment variable: AZURE_STORAGE_CONNECTION_STRING
- destination_if_match -- An ETag value, or the wildcard character (*). Specify this header to perform the operation only if the resource's ETag matches the value specified.
- destination_if_modified_since -- Commence only if modified since supplied UTC datetime (Y-m-d'T'H:M'Z')
- destination_if_none_match -- An ETag value, or the wildcard character (*). Specify this header to perform the operation only if the resource's ETag does not match the value specified. Specify the wildcard character (*) to perform the operation only if the resource does not exist, and fail the operation if it does exist.
- destination_if_unmodified_since -- Commence only if unmodified since supplied UTC datetime (Y-m-d'T'H:M'Z')
    - destination_lease_id -- The lease ID specified for this header must match the lease ID of the destination blob. If the request does not include the lease ID or it is not valid, the operation fails with status code 412 (Precondition Failed).
- destination_tags_condition -- Specify a SQL where clause on blob tags to operate only on blobs with a matching value.
- metadata -- Metadata in space-separated key=value pairs. This overwrites any existing metadata.
- rehydrate_priority -- Indicate the priority with which to rehydrate an archived blob.
- requires_sync -- Enforce that the service will not return a response until the copy is complete.
- sas_token -- A Shared Access Signature (SAS). Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_SAS_TOKEN
- source_account_key -- The storage account key of the source blob.
- source_account_name -- The storage account name of the source blob.
- source_blob -- The blob name for the source storage account.
- source_container -- The container name for the source storage account.
- source_if_match -- An ETag value, or the wildcard character (*). Specify this header to perform the operation only if the resource's ETag matches the value specified.
- source_if_modified_since -- Commence only if modified since supplied UTC datetime (Y-m-d'T'H:M'Z')
- source_if_none_match -- An ETag value, or the wildcard character (*). Specify this header to perform the operation only if the resource's ETag does not match the value specified. Specify the wildcard character (*) to perform the operation only if the resource does not exist, and fail the operation if it does exist.
- source_if_unmodified_since -- Commence only if unmodified since supplied UTC datetime (Y-m-d'T'H:M'Z')
- source_lease_id -- Specify this to perform the Copy Blob operation only if the lease ID given matches the active lease ID of the source blob.
- source_path -- The file path for the source storage account.
- source_sas -- The shared access signature for the source storage account.
- source_share -- The share name for the source storage account.
- source_snapshot -- The blob snapshot for the source storage account.
- source_tags_condition -- Specify a SQL where clause on blob tags to operate only on blobs with a matching value.
- source_uri -- None
- tags -- space-separated tags: key[=value] [key[=value] ...]. Use '' to clear existing tags.
- tier -- The tier value to set the blob to. For page blob, the tier correlates to the size of the blob and number of allowed IOPS. Possible values are P10, P15, P20, P30, P4, P40, P50, P6, P60, P70, P80 and this is only applicable to page blobs on premium storage accounts; For block blob, possible values are Archive, Cool and Hot. This is only applicable to block blobs on standard storage accounts.
- timeout -- Request timeout in seconds. Applies to each call to the service.
'''
return _call_az("az storage blob copy start", locals())
def cancel(copy_id, destination_blob, destination_container, account_key=None, account_name=None, auth_mode=None, connection_string=None, lease_id=None, sas_token=None, timeout=None):
'''
Required Parameters:
- copy_id -- Copy identifier provided in the copy.id of the original copy_blob operation.
    - destination_blob -- Name of the destination blob. If it exists, it will be overwritten.
- destination_container -- The container name.
Optional Parameters:
- account_key -- Storage account key. Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_KEY
- account_name -- Storage account name. Related environment variable: AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either storage account key or a SAS token. If neither are present, the command will try to query the storage account key using the authenticated Azure account. If a large number of storage commands are executed the API quota may be hit
- auth_mode -- The mode in which to run the command. "login" mode will directly use your login credentials for the authentication. The legacy "key" mode will attempt to query for an account key if no authentication parameters for the account are provided. Environment variable: AZURE_STORAGE_AUTH_MODE
- connection_string -- Storage account connection string. Environment variable: AZURE_STORAGE_CONNECTION_STRING
- lease_id -- Required if the destination blob has an active infinite lease.
- sas_token -- A Shared Access Signature (SAS). Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_SAS_TOKEN
- timeout -- Request timeout in seconds. Applies to each call to the service.
'''
return _call_az("az storage blob copy cancel", locals())
def start_batch(account_key=None, account_name=None, auth_mode=None, connection_string=None, destination_container=None, destination_path=None, dryrun=None, pattern=None, sas_token=None, source_account_key=None, source_account_name=None, source_client=None, source_container=None, source_sas=None, source_share=None, source_uri=None):
'''
Copy multiple blobs to a blob container. Use `az storage blob show` to check the status of the blobs.
Optional Parameters:
- account_key -- Storage account key. Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_KEY
- account_name -- Storage account name. Related environment variable: AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either storage account key or a SAS token. If neither are present, the command will try to query the storage account key using the authenticated Azure account. If a large number of storage commands are executed the API quota may be hit
- auth_mode -- The mode in which to run the command. "login" mode will directly use your login credentials for the authentication. The legacy "key" mode will attempt to query for an account key if no authentication parameters for the account are provided. Environment variable: AZURE_STORAGE_AUTH_MODE
- connection_string -- Storage account connection string. Environment variable: AZURE_STORAGE_CONNECTION_STRING
- destination_container -- The container name.
- destination_path -- The destination path that will be prepended to the blob name.
- dryrun -- None
- pattern -- None
- sas_token -- A Shared Access Signature (SAS). Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_SAS_TOKEN
- source_account_key -- None
- source_account_name -- None
- source_client -- ==SUPPRESS==
- source_container -- None
- source_sas -- None
- source_share -- None
- source_uri -- None
'''
return _call_az("az storage blob copy start-batch", locals())
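Each wrapper above ends with `_call_az("az storage blob copy ...", locals())`: calling `locals()` at that point snapshots every named parameter, so the helper can translate the non-`None` keyword arguments into CLI flags without listing them twice. A hypothetical sketch of that pattern (`build_flags` and `start_sketch` are illustrative; the real `_call_az` implementation is not shown in this module):

```python
def build_flags(params):
    """Hypothetical helper: turn a locals() snapshot into CLI-style flags,
    skipping parameters that were left as None."""
    flags = []
    for name, value in sorted(params.items()):
        if value is not None:
            flags.append("--%s %s" % (name.replace("_", "-"), value))
    return " ".join(flags)


def start_sketch(destination_blob, destination_container, timeout=None):
    # Mirrors the wrapper pattern above: locals() captures every parameter here.
    return "az storage blob copy start " + build_flags(locals())
```

Because `locals()` is taken before any other local variables are created, the snapshot contains exactly the function's signature, which is what makes the one-liner `return _call_az(..., locals())` style safe in these wrappers.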
# File: functions/event_functions.py (repo: mtasa-typescript/mtasa-wiki-dump, license: MIT)
# Autogenerated file. ANY CHANGES WILL BE OVERWRITTEN
from to_python.core.types import FunctionType, \
FunctionArgument, \
FunctionArgumentValues, \
FunctionReturnTypes, \
FunctionSignature, \
FunctionDoc, \
FunctionData, \
CompoundFunctionData
DUMP_PARTIAL = [
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='addEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='allowRemoteTrigger',
argument_type=FunctionType(
names=['bool'],
is_optional=True,
),
default_value='false',
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function allows you to register a custom event. Custom events function exactly like the built-in events. See event system for more information on the event system.' ,
arguments={
"eventName": """The name of the event you wish to create. """,
"allowRemoteTrigger": """A boolean specifying whether this event can be called remotely using triggerClientEvent / triggerServerEvent or not. """
},
result='returns true if the event was added successfully, false if the event was already added.' ,
),
url='addEvent',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='addEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='allowRemoteTrigger',
argument_type=FunctionType(
names=['bool'],
is_optional=True,
),
default_value='false',
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function allows you to register a custom event. Custom events function exactly like the built-in events. See event system for more information on the event system.' ,
arguments={
"eventName": """The name of the event you wish to create. """,
"allowRemoteTrigger": """A boolean specifying whether this event can be called remotely using triggerClientEvent / triggerServerEvent or not. """
},
result='returns true if the event was added successfully, false if the event was already added.' ,
),
url='addEvent',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='addEventHandler',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='attachedTo',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='handlerFunction',
argument_type=FunctionType(
names=['function'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='propagate',
argument_type=FunctionType(
names=['bool'],
is_optional=True,
),
default_value='true',
)
],
[
FunctionArgument(
name='priority',
argument_type=FunctionType(
names=['string'],
is_optional=True,
),
default_value='"normal"',
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function will add an event handler. An event handler is a function that will be called when the event its attached to is triggered. See event system for more information on how the event system works.\nEvent handlers are functions that are called when a particular event happens. Each event specifies a specific set of variables that are passed to the event handler and can be read by your function. The following global variables are available for use in handler functions:\n*source: the element that triggered the event\n*this: the element that the event handler is attached to\n*sourceResource: the resource that triggered the event\n*sourceResourceRoot: the root element of the resource that triggered the event\n*client: the client that triggered the event using triggerServerEvent. Not set if the event was not triggered from a client.\n*eventName: the name of the event which triggered the handler function.\nIt is important to remember that events pass up and down the element tree. An event triggered on the root element is triggered on every element in the tree. An event triggered on any other element is triggered on its ancestors (its parent element and its parents parent etc) and its children, grandchildren and great-grandchildren. You can use the propagate argument to specify if you wish your handler to receive events that have propagated up or down the tree.\nThe order in which event handlers are triggered is undefined, you should not rely on one event handler being executed before another.\nEach function closure can only be added once to each event. On the second attempt to add the function closure to the same event a warning will be emitted to the debug console and the call to addEventHandler will fail.' ,
arguments={
                        "eventName": """The name of the event you want to attach the handler function to. Note: The maximum allowed length is 100 ASCII characters (that is, English letters and numerals). """,
"attachedTo": """The element you wish to attach the handler to. The handler will only be called when the event it is attached to is triggered for this element, or one of its children. Often, this can be the root element (meaning the handler will be called when the event is triggered for any element). """,
                        "handlerFunction": """The handler function you wish to call when the event is triggered. This function will be passed all of the event's parameters as arguments, but it isn't required that it takes all of them. """,
"propagate": """A boolean representing whether the handler will be triggered if the event was propagated down or up the element tree (starting from the source), and not triggered directly on attachedTo (that is, handlers attached with this argument set to false will only be triggered if source == this). In GUI events you will probably want to set this to false. """,
"priority": """A string representing the trigger order priority relative to other event handlers of the same name. Possible values are: """,
"high": """ """,
"normal": """ """,
"low": """''It is also possible to add finer priority control by appending a positive or negative number to the priority string. For example (in priority order for reference): "high+4" "high" "high-1" "normal-6" "normal-7" "low+1" "low" "low-1"'' """
},
result='returns true if the event handler was attached successfully. returns false if the specified event could not be found or any parameters were invalid.' ,
),
url='addEventHandler',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='addEventHandler',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='attachedTo',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='handlerFunction',
argument_type=FunctionType(
names=['function'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='propagate',
argument_type=FunctionType(
names=['bool'],
is_optional=True,
),
default_value='true',
)
],
[
FunctionArgument(
name='priority',
argument_type=FunctionType(
names=['string'],
is_optional=True,
),
default_value='"normal"',
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function will add an event handler. An event handler is a function that will be called when the event its attached to is triggered. See event system for more information on how the event system works.\nEvent handlers are functions that are called when a particular event happens. Each event specifies a specific set of variables that are passed to the event handler and can be read by your function. The following global variables are available for use in handler functions:\n*source: the element that triggered the event\n*this: the element that the event handler is attached to\n*sourceResource: the resource that triggered the event\n*sourceResourceRoot: the root element of the resource that triggered the event\n*client: the client that triggered the event using triggerServerEvent. Not set if the event was not triggered from a client.\n*eventName: the name of the event which triggered the handler function.\nIt is important to remember that events pass up and down the element tree. An event triggered on the root element is triggered on every element in the tree. An event triggered on any other element is triggered on its ancestors (its parent element and its parents parent etc) and its children, grandchildren and great-grandchildren. You can use the propagate argument to specify if you wish your handler to receive events that have propagated up or down the tree.\nThe order in which event handlers are triggered is undefined, you should not rely on one event handler being executed before another.\nEach function closure can only be added once to each event. On the second attempt to add the function closure to the same event a warning will be emitted to the debug console and the call to addEventHandler will fail.' ,
arguments={
                        "eventName": """The name of the event you want to attach the handler function to. Note: The maximum allowed length is 100 ASCII characters (that is, English letters and numerals). """,
"attachedTo": """The element you wish to attach the handler to. The handler will only be called when the event it is attached to is triggered for this element, or one of its children. Often, this can be the root element (meaning the handler will be called when the event is triggered for any element). """,
                        "handlerFunction": """The handler function you wish to call when the event is triggered. This function will be passed all of the event's parameters as arguments, but it isn't required that it takes all of them. """,
"propagate": """A boolean representing whether the handler will be triggered if the event was propagated down or up the element tree (starting from the source), and not triggered directly on attachedTo (that is, handlers attached with this argument set to false will only be triggered if source == this). In GUI events you will probably want to set this to false. """,
"priority": """A string representing the trigger order priority relative to other event handlers of the same name. Possible values are: """,
"high": """ """,
"normal": """ """,
"low": """''It is also possible to add finer priority control by appending a positive or negative number to the priority string. For example (in priority order for reference): "high+4" "high" "high-1" "normal-6" "normal-7" "low+1" "low" "low-1"'' """
},
result='returns true if the event handler was attached successfully. returns false if the specified event could not be found or any parameters were invalid.' ,
),
url='addEventHandler',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='cancelEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='cancel',
argument_type=FunctionType(
names=['bool'],
is_optional=True,
),
default_value='true',
)
],
[
FunctionArgument(
name='reason',
argument_type=FunctionType(
names=['string'],
is_optional=True,
),
default_value='""',
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function is used to stop the automatic internal handling of events, for example this can be used to prevent an item being given to a player when they walk over a pickup, by canceling the onPickupUse event.\ncancelEvent does not have an effect on all events, see the individual events pages for information on what happens when the event is canceled. cancelEvent does not stop further event handlers from being called, as the order of event handlers being called is undefined in many cases. Instead, you can see if the currently active event has been cancelled using wasEventCancelled.\nThe use of cancelEvent outside of an event handler has no effect.\nIf you implement your own custom events and want to handle them being cancelled, you should call wasEventCancelled to check after your call to triggerEvent.' ,
arguments={
},
result='' ,
),
url='cancelEvent',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='cancelEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function is used to stop the automatic internal handling of events, for example this can be used to prevent an item being given to a player when they walk over a pickup, by canceling the onPickupUse event.\ncancelEvent does not have an effect on all events, see the individual events pages for information on what happens when the event is canceled. cancelEvent does not stop further event handlers from being called, as the order of event handlers being called is undefined in many cases. Instead, you can see if the currently active event has been cancelled using wasEventCancelled.\nThe use of cancelEvent outside of an event handler has no effect.\nIf you implement your own custom events and want to handle them being cancelled, you should call wasEventCancelled to check after your call to triggerEvent.' ,
arguments={
},
result='' ,
),
url='cancelEvent',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='cancelLatentEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePlayer',
argument_type=FunctionType(
names=['player'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='handle',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='Stops a latent event from completing' ,
arguments={
"thePlayer": """The player who is receiving the event. """,
                        "handle": """A handle previously obtained from getLatentEventHandles. """
},
result='' ,
),
url='cancelLatentEvent',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='cancelLatentEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='handle',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='Stops a latent event from completing' ,
arguments={
                        "handle": """A handle previously obtained from getLatentEventHandles. """
},
result='' ,
),
url='cancelLatentEvent',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='getCancelReason',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['string'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='Gets the reason for cancelling an event.' ,
arguments={
},
result='returns the reason that was given with cancelevent' ,
),
url='getCancelReason',
)
],
client=[
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='getEventHandlers',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['table'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='attachedTo',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function gets the attached functions from the event and attached element from current lua script.' ,
arguments={
"eventName": """The name of the event. For example ( onPlayerWasted ). """,
"attachedTo": """The element attached to. """
},
result='returns table with attached functions, empty table otherwise.' ,
),
url='getEventHandlers',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='getEventHandlers',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['table'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='attachedTo',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function gets the attached functions from the event and attached element from current lua script.' ,
arguments={
"eventName": """The name of the event. For example ( onPlayerWasted ). """,
"attachedTo": """The element attached to. """
},
result='returns table with attached functions, empty table otherwise.' ,
),
url='getEventHandlers',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='getLatentEventHandles',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['table'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePlayer',
argument_type=FunctionType(
names=['player'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='Gets the currently queued latent events. The last one in the table is always the latest event queued. Each returned handle can be used with getLatentEventStatus or cancelLatentEvent' ,
arguments={
"thePlayer": """The player who is receiving the events. """
},
result='' ,
),
url='getLatentEventHandles',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='getLatentEventHandles',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['table'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='Gets the currently queued latent events. The last one in the table is always the latest event queued. Each returned handle can be used with getLatentEventStatus or cancelLatentEvent' ,
arguments={
},
result='' ,
),
url='getLatentEventHandles',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='getLatentEventStatus',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['table'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='thePlayer',
argument_type=FunctionType(
names=['player'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='handle',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='Gets the status of one queued latent event.' ,
arguments={
"thePlayer": """The player who is receiving the event. """,
"handle": """A handle previously obtained from getLatentEventHandles. """
},
result='' ,
),
url='getLatentEventStatus',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='getLatentEventStatus',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['table'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='handle',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='Gets the status of one queued latent event.' ,
arguments={
"handle": """A handle previously obtained from getLatentEventHandles. """
},
result='' ,
),
url='getLatentEventStatus',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='removeEventHandler',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='attachedTo',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='functionVar',
argument_type=FunctionType(
names=['function'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function removes a handler function from an event, so that the function is no longer called when the event is triggered. See event system for more information on how the event system works.' ,
arguments={
"eventName": """The name of the event you want to detach the handler function from. """,
"attachedTo": """The element the handler was attached to. """,
"functionVar": """The handler function that was attached. """
},
result='returns true if the event handler was removed successfully. returns false if the specified event handler could not be found or invalid parameters were passed.' ,
),
url='removeEventHandler',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='removeEventHandler',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='attachedTo',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='functionVar',
argument_type=FunctionType(
names=['function'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function removes a handler function from an event, so that the function is no longer called when the event is triggered. See event system for more information on how the event system works.' ,
arguments={
"eventName": """The name of the event you want to detach the handler function from. """,
"attachedTo": """The element the handler was attached to. """,
"functionVar": """The handler function that was attached. """
},
result='returns true if the event handler was removed successfully. returns false if the specified event handler could not be found or invalid parameters were passed.' ,
),
url='removeEventHandler',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='triggerClientEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='sendTo',
argument_type=FunctionType(
names=['table', 'element'],
is_optional=True,
),
default_value='getRootElement()',
)
],
[
FunctionArgument(
name='name',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='sourceElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='arguments',
argument_type=None,
default_value=None,
)
]
],
variable_length=True,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function triggers an event previously registered on a client. This is the primary means of passing information between the server and the client. Clients have a similar triggerServerEvent function that can do the reverse. You can treat this function as if it were an asynchronous function call, using triggerServerEvent to pass back any returned information if necessary.\nAlmost any data types can be passed as expected, including elements and complex nested tables. Non-element MTA data types like xmlNodes or resource pointers cannot be passed, as they do not necessarily have a valid representation on the client.\nEvents are sent reliably, so clients will receive them, but there may be (but shouldn\'t be) a significant delay before they are received. You should take this into account when using them.\nKeep in mind the bandwidth issues when using events - don\'t pass a large list of arguments unless you really need to. It is marginally more efficient to pass one large event than two smaller ones.' ,
arguments={
"name": """The name of the event to trigger client side. You should register this event with addEvent and add at least one event handler using addEventHandler. """,
"sourceElement": """The element that is the Event system#Event handlers|source of the event. """,
"sendTo": """The event will be sent to all players that are children of the specified element. By default this is the root element, and hence the event is sent to all players. If you specify a single player it will just be sent to that player. This argument can also be a table of player elements. """,
"arguments...": """A list of arguments to trigger with the event. You can pass any lua data type (except functions). You can also pass elements. """
},
result='returns true if the event trigger has been sent, false if invalid arguments were specified.' ,
),
url='triggerClientEvent',
)
],
client=[
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='triggerEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='baseElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='argument1',
argument_type=FunctionType(
names=['var'],
is_optional=True,
),
default_value=None,
)
]
],
variable_length=True,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function will trigger a named event on a specific element in the element tree. See event system for more information on how the event system works.\nYou can use the value returned from this function to determine if the event was cancelled by one of the event handlers. You should determine what your response (if any) should be based on the event\'s purpose. Generally, cancelling an event should prevent any further code being run that is dependent on whatever triggered that event. For example, if you have an onFlagCapture event, cancelling it would be expected to prevent the flag from being captured. Similarly, if you have onPlayerKill as an event you trigger, cancelling it would be expected to prevent the player being killed from dying, or at least to prevent the player from getting a score for it.' ,
arguments={
"eventName": """The name of the event you wish to trigger """,
"baseElement": """The element you wish to trigger the event on. See event system for information on how this works. """,
"argument1": """The first argument that the event handler expects should be added after the baseElement variable. """,
"NOTE": """This function can have more than one of these arguments specified, once for each argument the event handler is expecting. """
},
result='* returns nil if the arguments are invalid or the event could not be found.\n* returns true if the event was triggered successfully, and was not cancelled using cancelEvent.\n* returns false if the event was triggered successfully, and was cancelled using cancelEvent.' ,
),
url='triggerEvent',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='triggerEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='eventName',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='baseElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='argument1',
argument_type=FunctionType(
names=['var'],
is_optional=True,
),
default_value=None,
)
]
],
variable_length=True,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function will trigger a named event on a specific element in the element tree. See event system for more information on how the event system works.\nYou can use the value returned from this function to determine if the event was cancelled by one of the event handlers. You should determine what your response (if any) should be based on the event\'s purpose. Generally, cancelling an event should prevent any further code being run that is dependent on whatever triggered that event. For example, if you have an onFlagCapture event, cancelling it would be expected to prevent the flag from being captured. Similarly, if you have onPlayerKill as an event you trigger, cancelling it would be expected to prevent the player being killed from dying, or at least to prevent the player from getting a score for it.' ,
arguments={
"eventName": """The name of the event you wish to trigger """,
"baseElement": """The element you wish to trigger the event on. See event system for information on how this works. """,
"argument1": """The first argument that the event handler expects should be added after the baseElement variable. """,
"NOTE": """This function can have more than one of these arguments specified, once for each argument the event handler is expecting. """
},
result='* returns nil if the arguments are invalid or the event could not be found.\n* returns true if the event was triggered successfully, and was not cancelled using cancelEvent.\n* returns false if the event was triggered successfully, and was cancelled using cancelEvent.' ,
),
url='triggerEvent',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='triggerLatentClientEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='sendTo',
argument_type=FunctionType(
names=['table', 'element'],
is_optional=True,
),
default_value='getRootElement()',
)
],
[
FunctionArgument(
name='name',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='bandwidth',
argument_type=FunctionType(
names=['int'],
is_optional=True,
),
default_value='50000',
)
],
[
FunctionArgument(
name='persist',
argument_type=FunctionType(
names=['bool'],
is_optional=True,
),
default_value='false',
)
],
[
FunctionArgument(
name='theElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='arguments',
argument_type=None,
default_value=None,
)
]
],
variable_length=True,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function is the same as triggerClientEvent except the transmission rate of the data contained in the arguments can be limited\nand other network traffic is not blocked while the data is being transferred.' ,
arguments={
"name": """The name of the event to trigger client side. You should register this event with addEvent and add at least one event handler using addEventHandler. """,
"theElement": """The element that is the Event system#Event handlers|source of the event. This could be another player, or if this isn't relevant, use the root element. """,
"sendTo": """The event will be sent to all players that are children of the specified element. By default this is the root element, and hence the event is sent to all players. If you specify a single player it will just be sent to that player. This argument can also be a table of player elements. """,
"bandwidth": """The bytes per second rate to send the data contained in the arguments. """,
"persist": """A bool indicating whether the transmission should be allowed to continue even after the resource that triggered it has since stopped. """,
"arguments...": """A list of arguments to trigger with the event. You can pass any Lua data type (except functions). You can also pass elements. The total amount of data should not exceed 100MB. """
},
result='returns true if the event trigger has been sent, false if invalid arguments were specified.' ,
),
url='triggerLatentClientEvent',
)
],
client=[
],
),
CompoundFunctionData(
server=[
],
client=[
FunctionData(
signature=FunctionSignature(
name='triggerLatentServerEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='event',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='bandwidth',
argument_type=FunctionType(
names=['int'],
is_optional=True,
),
default_value='5000',
)
],
[
FunctionArgument(
name='persist',
argument_type=FunctionType(
names=['bool'],
is_optional=True,
),
default_value='false',
)
],
[
FunctionArgument(
name='theElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='arguments',
argument_type=None,
default_value=None,
)
]
],
variable_length=True,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function is the same as triggerServerEvent except the transmission rate of the data contained in the arguments can be limited and other network traffic is not blocked while the data is being transferred.' ,
arguments={
"event": """The name of the event to trigger server-side. You should register this event with addEvent and add at least one event handler using addEventHandler. """,
"theElement": """The element that is the Event system#Event handlers|source of the event. This could be another player, or if this isn't relevant, use the root element. """,
"bandwidth": """The bytes per second rate to send the data contained in the arguments. """,
"persist": """A bool indicating whether the transmission should be allowed to continue even after the resource that triggered it has since stopped. """,
"arguments...": """A list of arguments to trigger with the event. You can pass any Lua data type (except functions). You can also pass elements. The total amount of data should not exceed 100MB. """
},
result='returns true if the event trigger has been sent, false if invalid arguments were specified.' ,
),
url='triggerLatentServerEvent',
)
],
),
CompoundFunctionData(
server=[
],
client=[
FunctionData(
signature=FunctionSignature(
name='triggerServerEvent',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='event',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='theElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='arguments',
argument_type=None,
default_value=None,
)
]
],
variable_length=True,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function triggers an event previously registered on the server. This is the primary means of passing information between the client and the server. Servers have a similar triggerClientEvent function that can do the reverse. You can treat this function as if it were an asynchronous function call, using triggerClientEvent to pass back any returned information if necessary.\nAlmost any data types can be passed as expected, including elements and complex nested tables. Non-element MTA data types like xmlNodes or resource pointers cannot be passed, as they do not necessarily have a valid representation on the client. Elements of the Vector or Matrix classes cannot be passed!\nEvents are sent reliably, so the server will receive them, but there may be (but shouldn\'t be) a significant delay before they are received. You should take this into account when using them.\nKeep in mind the bandwidth issues when using events - don\'t pass a large list of arguments unless you really need to. It is marginally more efficient to pass one large event than two smaller ones.' ,
arguments={
"event": """The name of the event to trigger server-side. You should register this event with addEvent and add at least one event handler using addEventHandler. """,
"theElement": """The element that is the Event system#Event handlers|source of the event. """,
"arguments...": """A list of arguments to trigger with the event. You can pass any lua data type (except functions). You can also pass elements. """
},
result='returns true if the event trigger has been sent, false if invalid arguments were specified or a client side element was a parameter.' ,
),
url='triggerServerEvent',
)
],
),
CompoundFunctionData(
server=[
FunctionData(
signature=FunctionSignature(
name='wasEventCancelled',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function checks if the last completed event was cancelled. This is mainly useful for custom events created by scripts.\nEvents can be cancelled using cancelEvent; this indicates that the resource which triggered the event should do whatever it can to reverse any changes made by whatever caused the event. See triggerEvent for a more detailed explanation of this.' ,
arguments={
},
result='returns true if the event was cancelled, false if it wasn\'t cancelled or the event doesn\'t exist.' ,
),
url='wasEventCancelled',
)
],
client=[
FunctionData(
signature=FunctionSignature(
name='wasEventCancelled',
return_types=FunctionReturnTypes(
return_types=[
FunctionType(
names=['bool'],
is_optional=False,
)
],
variable_length=False,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
generic_types=[
],
),
docs=FunctionDoc(
description='This function checks if the last completed event was cancelled. This is mainly useful for custom events created by scripts.\nEvents can be cancelled using cancelEvent; this indicates that the resource which triggered the event should do whatever it can to reverse any changes made by whatever caused the event. See triggerEvent for a more detailed explanation of this.' ,
arguments={
},
result='returns true if the event was cancelled, false if it wasn\'t cancelled or the event doesn\'t exist.' ,
),
url='wasEventCancelled',
)
],
)
]
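The records above follow a nested schema: each CompoundFunctionData holds per-side FunctionData entries, each combining a FunctionSignature with a FunctionDoc, and arguments are stored as a list of one-element groups. As a minimal, self-contained sketch of how such records could be modeled and queried — the dataclass definitions below are illustrative assumptions, not the project's actual class definitions:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative re-declarations of the record types used above;
# the real definitions live elsewhere and may differ (assumption).
@dataclass
class FunctionType:
    names: List[str]
    is_optional: bool = False

@dataclass
class FunctionArgument:
    name: str
    argument_type: Optional[FunctionType] = None
    default_value: Optional[str] = None

@dataclass
class FunctionDoc:
    description: str = ''
    arguments: dict = field(default_factory=dict)
    result: str = ''

def argument_names(arguments):
    # `arguments` is a list of one-element groups, as in the records above.
    return [arg.name for group in arguments for arg in group]

groups = [
    [FunctionArgument(name='eventName', argument_type=FunctionType(names=['string']))],
    [FunctionArgument(name='attachedTo', argument_type=FunctionType(names=['element']))],
]
print(argument_names(groups))  # ['eventName', 'attachedTo']
```

Flattening the nested argument groups this way is how a consumer could, for example, render a signature line from one of these records.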
import unittest
from pero_ocr.error_summary import ErrorsSummary
class SingleReferenceTests(unittest.TestCase):
    def test_empty_ref_match(self):
        ref = ''
        hyp = ''
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.nb_errors, 0)
        self.assertEqual(summary.nb_subs, 0)
        self.assertEqual(summary.nb_inss, 0)
        self.assertEqual(summary.nb_dels, 0)
        self.assertEqual(summary.ref_len, 0)
        self.assertEqual(summary.nb_lines_summarized, 1)

    def test_match(self):
        ref = 'ab'
        hyp = 'ab'
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.nb_errors, 0)
        self.assertEqual(summary.nb_subs, 0)
        self.assertEqual(summary.nb_inss, 0)
        self.assertEqual(summary.nb_dels, 0)
        self.assertEqual(summary.ref_len, 2)
        self.assertEqual(summary.nb_lines_summarized, 1)

    def test_deletion(self):
        ref = 'ab'
        hyp = 'a'
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.nb_errors, 1)
        self.assertEqual(summary.nb_subs, 0)
        self.assertEqual(summary.nb_inss, 0)
        self.assertEqual(summary.nb_dels, 1)
        self.assertEqual(summary.ref_len, 2)
        self.assertEqual(summary.nb_lines_summarized, 1)

    def test_insertion(self):
        ref = 'ab'
        hyp = 'abc'
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.nb_errors, 1)
        self.assertEqual(summary.nb_subs, 0)
        self.assertEqual(summary.nb_inss, 1)
        self.assertEqual(summary.nb_dels, 0)
        self.assertEqual(summary.ref_len, 2)
        self.assertEqual(summary.nb_lines_summarized, 1)

    def test_substitution(self):
        ref = 'ab'
        hyp = 'ac'
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.nb_errors, 1)
        self.assertEqual(summary.nb_subs, 1)
        self.assertEqual(summary.nb_inss, 0)
        self.assertEqual(summary.nb_dels, 0)
        self.assertEqual(summary.ref_len, 2)
        self.assertEqual(summary.nb_lines_summarized, 1)

class ErrorAggregationTests(unittest.TestCase):
    def test_single_summary(self):
        partial_1 = ErrorsSummary.from_lists(list('abcd'), list('abb'))
        summary = ErrorsSummary.aggregate([partial_1])
        self.assertEqual(summary.nb_errors, 2)
        self.assertEqual(summary.nb_subs, 1)
        self.assertEqual(summary.nb_inss, 0)
        self.assertEqual(summary.nb_dels, 1)
        self.assertEqual(summary.ref_len, 4)
        self.assertEqual(summary.nb_lines_summarized, 1)
        self.assertEqual(summary.confusions['a']['a'], 1)
        self.assertEqual(summary.confusions['b']['a'], 0)
        self.assertEqual(summary.confusions['b']['b'], 1)
        extra_b_correctly_matched_to_c = (
            summary.confusions['c']['b'] == 1 and
            summary.confusions['d'][None] == 1
        )
        extra_b_correctly_matched_to_d = (
            summary.confusions['c'][None] == 1 and
            summary.confusions['d']['b'] == 1
        )
        self.assertTrue(extra_b_correctly_matched_to_c or extra_b_correctly_matched_to_d)
        self.assertEqual(sum(sum(k.values()) for k in summary.confusions.values()), len('abcd'))
        self.assertEqual(summary.ending_errors.pure_deletions, 0)
        self.assertEqual(summary.ending_errors.mixed_deletions, 1)
        self.assertEqual(summary.ending_errors.correct, 0)

    def test_one_erroneous_one_perfect(self):
        partial_1 = ErrorsSummary.from_lists(list('abcd'), list('abb'))
        partial_2 = ErrorsSummary.from_lists(list('ab'), list('ab'))
        summary = ErrorsSummary.aggregate([partial_1, partial_2])
        self.assertEqual(summary.nb_errors, 2)
        self.assertEqual(summary.nb_subs, 1)
        self.assertEqual(summary.nb_inss, 0)
        self.assertEqual(summary.nb_dels, 1)
        self.assertEqual(summary.ref_len, 6)
        self.assertEqual(summary.nb_lines_summarized, 2)
        self.assertEqual(summary.confusions['a']['a'], 2)
        self.assertEqual(summary.confusions['b']['a'], 0)
        self.assertEqual(summary.confusions['b']['b'], 2)
        extra_b_correctly_matched_to_c = (
            summary.confusions['c']['b'] == 1 and
            summary.confusions['d'][None] == 1
        )
        extra_b_correctly_matched_to_d = (
            summary.confusions['c'][None] == 1 and
            summary.confusions['d']['b'] == 1
        )
        self.assertTrue(extra_b_correctly_matched_to_c or extra_b_correctly_matched_to_d)
        self.assertEqual(sum(sum(k.values()) for k in summary.confusions.values()), len('abcd') + len('ab'))
        self.assertEqual(summary.ending_errors.pure_deletions, 0)
        self.assertEqual(summary.ending_errors.mixed_deletions, 1)
        self.assertEqual(summary.ending_errors.correct, 1)

    def test_one_substituting_one_perfect(self):
        partial_1 = ErrorsSummary.from_lists(list('abcd'), list('abxy'))
        partial_2 = ErrorsSummary.from_lists(list('ab'), list('ab'))
        summary = ErrorsSummary.aggregate([partial_1, partial_2])
        self.assertEqual(summary.nb_errors, 2)
        self.assertEqual(summary.nb_subs, 2)
        self.assertEqual(summary.nb_inss, 0)
        self.assertEqual(summary.nb_dels, 0)
        self.assertEqual(summary.ref_len, 6)
        self.assertEqual(summary.nb_lines_summarized, 2)
        self.assertEqual(summary.confusions['a']['a'], 2)
        self.assertEqual(summary.confusions['b']['b'], 2)
        self.assertEqual(summary.confusions['c']['x'], 1)
        self.assertEqual(summary.confusions['d']['y'], 1)
        self.assertEqual(sum(sum(k.values()) for k in summary.confusions.values()), len('abcd') + len('ab'))
        self.assertEqual(summary.ending_errors.pure_deletions, 0)
        self.assertEqual(summary.ending_errors.mixed_deletions, 0)
        self.assertEqual(summary.ending_errors.pure_substitutions, 1)
        self.assertEqual(summary.ending_errors.correct, 1)

    def test_two_summaries_end_deleting(self):
        partial_1 = ErrorsSummary.from_lists(list('abcd'), list('abb'))
        partial_2 = ErrorsSummary.from_lists(list('abc'), list('ab'))
        summary = ErrorsSummary.aggregate([partial_1, partial_2])
        self.assertEqual(summary.nb_errors, 3)
        self.assertEqual(summary.nb_subs, 1)
        self.assertEqual(summary.nb_inss, 0)
        self.assertEqual(summary.nb_dels, 2)
        self.assertEqual(summary.ref_len, 7)
        self.assertEqual(summary.nb_lines_summarized, 2)
        self.assertEqual(summary.confusions['a']['a'], 2)
        self.assertEqual(summary.confusions['b']['b'], 2)
        extra_b_correctly_matched_to_c = (
            summary.confusions['c'][None] == 1 and
            summary.confusions['c']['b'] == 1 and
            summary.confusions['d'][None] == 1
        )
        extra_b_correctly_matched_to_d = (
            summary.confusions['c'][None] == 2 and
            summary.confusions['d']['b'] == 1
        )
        self.assertTrue(extra_b_correctly_matched_to_c or extra_b_correctly_matched_to_d)
        self.assertEqual(sum(sum(k.values()) for k in summary.confusions.values()), len('abcd') + len('abc'))
        self.assertEqual(summary.ending_errors.pure_deletions, 1)
        self.assertEqual(summary.ending_errors.mixed_deletions, 1)
        self.assertEqual(summary.ending_errors.correct, 0)

    def test_two_summaries_end_inserting(self):
        partial_1 = ErrorsSummary.from_lists(list('ab'), list('acd'))
        partial_2 = ErrorsSummary.from_lists(list('ab'), list('abc'))
        summary = ErrorsSummary.aggregate([partial_1, partial_2])
        self.assertEqual(summary.nb_errors, 3)
        self.assertEqual(summary.nb_subs, 1)
        self.assertEqual(summary.nb_inss, 2)
        self.assertEqual(summary.nb_dels, 0)
        self.assertEqual(summary.ref_len, 4)
        self.assertEqual(summary.nb_lines_summarized, 2)
        self.assertEqual(summary.confusions['a']['a'], 2)
        self.assertEqual(summary.confusions['b']['b'], 1)
        unmatched_b_correctly_matched_to_c = (
            summary.confusions['b']['c'] == 1 and
            summary.confusions[None]['d'] == 1
        )
        unmatched_b_correctly_matched_to_d = (
            summary.confusions[None]['c'] == 1 and
            summary.confusions['b']['d'] == 1
        )
        self.assertTrue(unmatched_b_correctly_matched_to_c or unmatched_b_correctly_matched_to_d)
        self.assertEqual(sum(sum(k.values()) for k in summary.confusions.values()), len('acd') + len('abc'))
        self.assertEqual(summary.ending_errors.pure_deletions, 0)
        self.assertEqual(summary.ending_errors.mixed_deletions, 0)
        self.assertEqual(summary.ending_errors.pure_insertions, 1)
        self.assertEqual(summary.ending_errors.mixed_insertions, 1)
        self.assertEqual(summary.ending_errors.correct, 0)

class DetailedSummaryTests(unittest.TestCase):
    def test_empty_ref_match(self):
        ref = ''
        hyp = ''
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.confusions, {})

    def test_match(self):
        ref = 'ab'
        hyp = 'ab'
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.confusions['a']['a'], 1)
        self.assertEqual(summary.confusions['b']['b'], 1)

    def test_substitution(self):
        ref = 'ab'
        hyp = 'aa'
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.confusions['a']['a'], 1)
        self.assertEqual(summary.confusions['b']['b'], 0)
        self.assertEqual(summary.confusions['b']['a'], 1)

    def test_insertion(self):
        ref = 'ab'
        hyp = 'abc'
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.confusions['a']['a'], 1)
        self.assertEqual(summary.confusions['b']['b'], 1)
        self.assertEqual(summary.confusions[None]['c'], 1)

    def test_deletion(self):
        ref = 'ab'
        hyp = 'a'
        summary = ErrorsSummary.from_lists(list(ref), list(hyp))
        self.assertEqual(summary.confusions['a']['a'], 1)
        self.assertEqual(summary.confusions['b']['b'], 0)
        self.assertEqual(summary.confusions['b'][None], 1)

class BoundaryErrorsTests(unittest.TestCase):
def test_empty_ref_match(self):
ref = ''
hyp = ''
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.correct, 1)
def test_match(self):
ref = 'ab'
hyp = 'ab'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.correct, 1)
def test_substitution_only(self):
ref = 'ab'
hyp = 'cd'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.correct, 0)
self.assertEqual(summary.ending_errors.pure_substitutions, 1)
def test_end_substitution(self):
ref = 'ab'
hyp = 'ac'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.correct, 0)
self.assertEqual(summary.ending_errors.pure_substitutions, 1)
def test_start_substitution(self):
ref = 'ab'
hyp = 'xb'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.correct, 1)
self.assertEqual(summary.ending_errors.pure_substitutions, 0)
def test_end_deletion(self):
ref = 'ab'
hyp = 'a'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 1)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.correct, 0)
def test_end_mixed_deletion(self):
ref = 'abcd'
hyp = 'abb'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 1)
self.assertEqual(summary.ending_errors.correct, 0)
def test_longer_end_deletion(self):
ref = 'abc'
hyp = 'a'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 1)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.correct, 0)
def test_end_insertion(self):
ref = 'a'
hyp = 'ab'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.pure_insertions, 1)
self.assertEqual(summary.ending_errors.mixed_insertions, 0)
self.assertEqual(summary.ending_errors.correct, 0)
def test_end_mixed_insertion(self):
ref = 'ab'
hyp = 'acd'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.pure_insertions, 0)
self.assertEqual(summary.ending_errors.mixed_insertions, 1)
self.assertEqual(summary.ending_errors.correct, 0)
def test_start_deletion(self):
ref = 'ab'
hyp = 'b'
summary = ErrorsSummary.from_lists(list(ref), list(hyp))
self.assertEqual(summary.ending_errors.pure_deletions, 0)
self.assertEqual(summary.ending_errors.mixed_deletions, 0)
self.assertEqual(summary.ending_errors.correct, 1)
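The boundary tests above constrain how the *end* of an alignment is categorised: a trailing match is `correct`, a trailing deletion is pure or mixed depending on whether substitutions are tangled into it, and likewise for insertions. A minimal sketch of one classification scheme consistent with these assertions, assuming a `SequenceMatcher`-style alignment (the actual `ErrorsSummary` logic is not shown in this excerpt and may align differently):

```python
# Hedged sketch (not the actual ErrorsSummary implementation): classify
# the ending of an alignment between ref and hyp. With difflib's
# get_opcodes() there is at most one non-'equal' opcode after the last
# matching block, so the final opcode determines the ending category.
from difflib import SequenceMatcher

def ending_category(ref, hyp):
    opcodes = SequenceMatcher(None, ref, hyp).get_opcodes()
    if not opcodes:  # both sequences empty: counted as correct
        return 'correct'
    op, i1, i2, j1, j2 = opcodes[-1]
    if op == 'equal':
        return 'correct'
    if op == 'delete':
        return 'pure_deletion'
    if op == 'insert':
        return 'pure_insertion'
    # 'replace': equal-length spans are pure substitutions; a longer
    # ref span implies deletions mixed in, a longer hyp span implies
    # extra insertions.
    if i2 - i1 == j2 - j1:
        return 'pure_substitution'
    return 'mixed_deletion' if i2 - i1 > j2 - j1 else 'mixed_insertion'
```

Note that a leading error (e.g. `'ab'` vs `'xb'` or `'ab'` vs `'b'`) leaves the final opcode as `'equal'`, so it still classifies as `correct`, matching `test_start_substitution` and `test_start_deletion`.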
# File: src/docs/application-layer-protocol/tests/__init__.py
# Repo: fujiawei-dev/protocols-notes (MIT)
'''
Date: 2021.11.15 13:58
Description: Omit
LastEditors: Rustle Karl
LastEditTime: 2021.11.15 13:58
'''
# File: QLed.py
# Repo: jazzycamel/QLed (MIT)
from colorsys import rgb_to_hls, hls_to_rgb
import six
if six.PY3:
from PyQt5.QtWidgets import QApplication, QWidget, QGridLayout, QSizePolicy, QStyleOption
from PyQt5.QtGui import QPainter
from PyQt5.QtCore import pyqtSignal, Qt, QSize, QTimer, QByteArray, QRectF, pyqtProperty
from PyQt5.QtSvg import QSvgRenderer
else:
from PyQt4.QtGui import QApplication, QWidget, QPainter, QGridLayout, QSizePolicy, QStyleOption
from PyQt4.QtCore import pyqtSignal, Qt, QSize, QTimer, QByteArray, QRectF, pyqtProperty
from PyQt4.QtSvg import QSvgRenderer
class QLed(QWidget):
Circle = 1
Round = 2
Square = 3
Triangle = 4
Red = 1
Green = 2
Yellow = 3
Grey = 4
Orange = 5
Purple = 6
Blue = 7
shapes={
Circle:"""
<svg height="50.000000px" id="svg9493" width="50.000000px" xmlns="http://www.w3.org/2000/svg">
<defs id="defs9495">
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient6650" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient id="linearGradient6494">
<stop id="stop6496" offset="0.0000000" style="stop-color:%s;stop-opacity:1.0000000;"/>
<stop id="stop6498" offset="1.0000000" style="stop-color:%s;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient6648" x1="23.213980" x2="23.201290" xlink:href="#linearGradient6494" y1="42.754631" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient6646" x1="23.349695" x2="23.440580" xlink:href="#linearGradient5756" y1="42.767944" y2="43.710873"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient6644" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient id="linearGradient6506">
<stop id="stop6508" offset="0.0000000" style="stop-color:#ffffff;stop-opacity:0.0000000;"/>
<stop id="stop6510" offset="1.0000000" style="stop-color:#ffffff;stop-opacity:0.87450981;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7498" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient id="linearGradient7464">
<stop id="stop7466" offset="0.0000000" style="stop-color:#00039a;stop-opacity:1.0000000;"/>
<stop id="stop7468" offset="1.0000000" style="stop-color:#afa5ff;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7496" x1="23.213980" x2="23.201290" xlink:href="#linearGradient7464" y1="42.754631" y2="43.892632"/>
<linearGradient id="linearGradient5756">
<stop id="stop5758" offset="0.0000000" style="stop-color:#828282;stop-opacity:1.0000000;"/>
<stop id="stop5760" offset="1.0000000" style="stop-color:#929292;stop-opacity:0.35294119;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9321" x1="22.935030" x2="23.662106" xlink:href="#linearGradient5756" y1="42.699776" y2="43.892632"/>
<linearGradient id="linearGradient5742">
<stop id="stop5744" offset="0.0000000" style="stop-color:#adadad;stop-opacity:1.0000000;"/>
<stop id="stop5746" offset="1.0000000" style="stop-color:#f0f0f0;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7492" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9527" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9529" x1="22.935030" x2="23.662106" xlink:href="#linearGradient5756" y1="42.699776" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9531" x1="23.213980" x2="23.201290" xlink:href="#linearGradient7464" y1="42.754631" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9533" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
</defs>
<g id="layer1">
<g id="g9447" style="overflow:visible" transform="matrix(31.25000,0.000000,0.000000,31.25000,-625.0232,-1325.000)">
<path d="M 24.000001,43.200001 C 24.000001,43.641601 23.641601,44.000001 23.200001,44.000001 C 22.758401,44.000001 22.400001,43.641601 22.400001,43.200001 C 22.400001,42.758401 22.758401,42.400001 23.200001,42.400001 C 23.641601,42.400001 24.000001,42.758401 24.000001,43.200001 z " id="path6596" style="fill:url(#linearGradient6644);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible" transform="translate(-2.399258,-1.000000e-6)"/>
<path d="M 23.906358,43.296204 C 23.906358,43.625433 23.639158,43.892633 23.309929,43.892633 C 22.980700,43.892633 22.713500,43.625433 22.713500,43.296204 C 22.713500,42.966975 22.980700,42.699774 23.309929,42.699774 C 23.639158,42.699774 23.906358,42.966975 23.906358,43.296204 z " id="path6598" style="fill:url(#linearGradient6646);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible" transform="matrix(1.082474,0.000000,0.000000,1.082474,-4.431649,-3.667015)"/>
<path d="M 23.906358,43.296204 C 23.906358,43.625433 23.639158,43.892633 23.309929,43.892633 C 22.980700,43.892633 22.713500,43.625433 22.713500,43.296204 C 22.713500,42.966975 22.980700,42.699774 23.309929,42.699774 C 23.639158,42.699774 23.906358,42.966975 23.906358,43.296204 z " id="path6600" style="fill:url(#linearGradient6648);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible" transform="matrix(0.969072,0.000000,0.000000,0.969072,-1.788256,1.242861)"/>
<path d="M 23.906358,43.296204 C 23.906358,43.625433 23.639158,43.892633 23.309929,43.892633 C 22.980700,43.892633 22.713500,43.625433 22.713500,43.296204 C 22.713500,42.966975 22.980700,42.699774 23.309929,42.699774 C 23.639158,42.699774 23.906358,42.966975 23.906358,43.296204 z " id="path6602" style="fill:url(#linearGradient6650);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible" transform="matrix(0.773196,0.000000,0.000000,0.597938,2.776856,17.11876)"/>
</g>
</g>
</svg>
""",
Round:"""
<svg height="50.000000px" id="svg9493" width="100.00000px" xmlns="http://www.w3.org/2000/svg">
<defs id="defs9495">
<linearGradient gradientTransform="matrix(0.928127,0.000000,0.000000,0.639013,13.55634,12.87587)" gradientUnits="userSpaceOnUse" id="linearGradient13424" x1="21.593750" x2="21.593750" xlink:href="#linearGradient6506" y1="47.917328" y2="46.774261"/>
<linearGradient id="linearGradient6494">
<stop id="stop6496" offset="0.0000000" style="stop-color:%s;stop-opacity:1.0000000;"/>
<stop id="stop6498" offset="1.0000000" style="stop-color:%s;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientTransform="translate(12.00000,-4.000002)" gradientUnits="userSpaceOnUse" id="linearGradient13427" x1="21.591305" x2="21.593750" xlink:href="#linearGradient6494" y1="46.617390" y2="47.781250"/>
<linearGradient gradientTransform="translate(12.00000,-4.000002)" gradientUnits="userSpaceOnUse" id="linearGradient13430" x1="21.408695" x2="21.834784" xlink:href="#linearGradient5756" y1="46.556522" y2="47.843750"/>
<linearGradient gradientTransform="translate(12.00000,-4.000002)" gradientUnits="userSpaceOnUse" id="linearGradient13433" x1="21.594427" x2="21.600000" xlink:href="#linearGradient5742" y1="46.376728" y2="48.000000"/>
<linearGradient gradientTransform="matrix(0.928127,0.000000,0.000000,0.639013,21.55634,15.27587)" gradientUnits="userSpaceOnUse" id="linearGradient13472" x1="21.593750" x2="21.593750" xlink:href="#linearGradient6506" y1="47.917328" y2="46.774261"/>
<linearGradient gradientTransform="translate(20.00000,-1.600002)" gradientUnits="userSpaceOnUse" id="linearGradient13475" x1="21.591305" x2="21.593750" xlink:href="#linearGradient9163" y1="46.617390" y2="47.781250"/>
<linearGradient gradientTransform="translate(20.00000,-1.600002)" gradientUnits="userSpaceOnUse" id="linearGradient13478" x1="21.408695" x2="21.834784" xlink:href="#linearGradient5756" y1="46.556522" y2="47.843750"/>
<linearGradient gradientTransform="translate(20.00000,-1.600002)" gradientUnits="userSpaceOnUse" id="linearGradient13481" x1="21.594427" x2="21.600000" xlink:href="#linearGradient5742" y1="46.376728" y2="48.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9199" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient id="linearGradient9163">
<stop id="stop9165" offset="0.0000000" style="stop-color:#000000;stop-opacity:1.0000000;"/>
<stop id="stop9167" offset="1.0000000" style="stop-color:#8c8c8c;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9197" x1="23.213980" x2="23.201290" xlink:href="#linearGradient9163" y1="42.754631" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9195" x1="23.349695" x2="23.440580" xlink:href="#linearGradient5756" y1="42.767944" y2="43.710873"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9193" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient id="linearGradient6506">
<stop id="stop6508" offset="0.0000000" style="stop-color:#ffffff;stop-opacity:0.0000000;"/>
<stop id="stop6510" offset="1.0000000" style="stop-color:#ffffff;stop-opacity:0.87450981;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7498" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient id="linearGradient7464">
<stop id="stop7466" offset="0.0000000" style="stop-color:#00039a;stop-opacity:1.0000000;"/>
<stop id="stop7468" offset="1.0000000" style="stop-color:#afa5ff;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7496" x1="23.213980" x2="23.201290" xlink:href="#linearGradient7464" y1="42.754631" y2="43.892632"/>
<linearGradient id="linearGradient5756">
<stop id="stop5758" offset="0.0000000" style="stop-color:#828282;stop-opacity:1.0000000;"/>
<stop id="stop5760" offset="1.0000000" style="stop-color:#929292;stop-opacity:0.35294119;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9321" x1="22.935030" x2="23.662106" xlink:href="#linearGradient5756" y1="42.699776" y2="43.892632"/>
<linearGradient id="linearGradient5742">
<stop id="stop5744" offset="0.0000000" style="stop-color:#adadad;stop-opacity:1.0000000;"/>
<stop id="stop5746" offset="1.0000000" style="stop-color:#f0f0f0;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7492" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9527" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9529" x1="22.935030" x2="23.662106" xlink:href="#linearGradient5756" y1="42.699776" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9531" x1="23.213980" x2="23.201290" xlink:href="#linearGradient7464" y1="42.754631" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9533" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient gradientTransform="matrix(24.16238,0.000000,0.000000,18.68556,-538.2464,-790.0391)" gradientUnits="userSpaceOnUse" id="linearGradient1336" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient gradientTransform="matrix(30.28350,0.000000,0.000000,30.28350,-680.9062,-1286.161)" gradientUnits="userSpaceOnUse" id="linearGradient1339" x1="23.213980" x2="23.201290" xlink:href="#linearGradient9163" y1="42.754631" y2="43.892632"/>
<linearGradient gradientTransform="matrix(33.82731,0.000000,0.000000,33.82731,-763.5122,-1439.594)" gradientUnits="userSpaceOnUse" id="linearGradient1342" x1="23.349695" x2="23.440580" xlink:href="#linearGradient5756" y1="42.767944" y2="43.710873"/>
<linearGradient gradientTransform="matrix(31.25000,0.000000,0.000000,31.25000,-700.0000,-1325.000)" gradientUnits="userSpaceOnUse" id="linearGradient1345" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
</defs>
<g id="layer1">
<g id="g13543" style="overflow:visible" transform="matrix(31.25000,0.000000,0.000000,31.25000,-999.9999,-1325.000)">
<path d="M 32.799998,42.400000 L 34.399998,42.400000 C 34.843198,42.400000 35.199998,42.756800 35.199998,43.200000 C 35.199998,43.643200 34.843198,44.000000 34.399998,44.000000 L 32.799998,44.000000 C 32.356798,44.000000 31.999998,43.643200 31.999998,43.200000 C 31.999998,42.756800 32.356798,42.400000 32.799998,42.400000 z " id="path13335" style="fill:url(#linearGradient13433);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000"/>
<path d="M 32.812498,42.562498 C 32.447387,42.562498 32.156248,42.829606 32.156248,43.187498 C 32.156248,43.545390 32.454607,43.843750 32.812498,43.843748 L 34.406248,43.843748 C 34.764141,43.843748 35.031248,43.552611 35.031248,43.187498 C 35.031248,42.822387 34.771358,42.562498 34.406248,42.562498 L 32.812498,42.562498 z " id="path13337" style="fill:url(#linearGradient13430);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible"/>
<path d="M 32.812498,42.624998 C 32.485887,42.624998 32.218748,42.871665 32.218748,43.187498 C 32.218748,43.503332 32.496667,43.781249 32.812498,43.781248 L 34.406248,43.781248 C 34.722082,43.781248 34.968748,43.514111 34.968748,43.187498 C 34.968748,42.860887 34.732858,42.624998 34.406248,42.624998 L 32.812498,42.624998 z " id="path13339" style="fill:url(#linearGradient13427);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible"/>
<path d="M 32.872983,42.669849 C 32.569847,42.669849 32.321908,42.827473 32.321908,43.029294 C 32.321908,43.231116 32.579852,43.408709 32.872983,43.408708 L 34.352185,43.408708 C 34.645320,43.408708 34.874257,43.238004 34.874257,43.029294 C 34.874257,42.820585 34.655321,42.669849 34.352185,42.669849 L 32.872983,42.669849 z " id="path13341" style="fill:url(#linearGradient13424);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible"/>
</g>
</g>
</svg>
""",
Square:"""
<svg height="50.000000px" id="svg9493" width="50.000000px" xmlns="http://www.w3.org/2000/svg">
<defs id="defs9495">
<linearGradient gradientTransform="matrix(0.388435,0.000000,0.000000,0.618097,2.806900,2.626330)" gradientUnits="userSpaceOnUse" id="linearGradient31681" x1="21.593750" x2="21.593750" xlink:href="#linearGradient6506" y1="47.917328" y2="46.774261"/>
<linearGradient id="linearGradient6494">
<stop id="stop6496" offset="0.0000000" style="stop-color:%s;stop-opacity:1.0000000;"/>
<stop id="stop6498" offset="1.0000000" style="stop-color:%s;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient31704" x1="18.390625" x2="18.390625" xlink:href="#linearGradient6494" y1="43.400002" y2="44.593750"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient31624" x1="17.728125" x2="19.031250" xlink:href="#linearGradient5756" y1="43.337502" y2="44.656250"/>
<linearGradient gradientTransform="matrix(0.500000,0.000000,0.000000,1.000000,-3.600000,-8.800000)" gradientUnits="userSpaceOnUse" id="linearGradient31686" x1="29.600000" x2="29.600000" xlink:href="#linearGradient5742" y1="39.991302" y2="41.599998"/>
<linearGradient gradientTransform="matrix(0.388435,0.000000,0.000000,0.618097,7.606900,5.026330)" gradientUnits="userSpaceOnUse" id="linearGradient31649" x1="21.593750" x2="21.593750" xlink:href="#linearGradient6506" y1="47.917328" y2="46.774261"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient31710" x1="18.390625" x2="18.390625" xlink:href="#linearGradient9163" y1="43.400002" y2="44.593750"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient31570" x1="17.728125" x2="19.031250" xlink:href="#linearGradient5756" y1="43.337502" y2="44.656250"/>
<linearGradient gradientTransform="matrix(0.500000,0.000000,0.000000,1.000000,1.200000,-6.400000)" gradientUnits="userSpaceOnUse" id="linearGradient31654" x1="29.600000" x2="29.600000" xlink:href="#linearGradient5742" y1="39.991302" y2="41.599998"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9199" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient id="linearGradient9163">
<stop id="stop9165" offset="0.0000000" style="stop-color:#000000;stop-opacity:1.0000000;"/>
<stop id="stop9167" offset="1.0000000" style="stop-color:#8c8c8c;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9197" x1="23.213980" x2="23.201290" xlink:href="#linearGradient9163" y1="42.754631" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9195" x1="23.349695" x2="23.440580" xlink:href="#linearGradient5756" y1="42.767944" y2="43.710873"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9193" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient id="linearGradient6506">
<stop id="stop6508" offset="0.0000000" style="stop-color:#ffffff;stop-opacity:0.0000000;"/>
<stop id="stop6510" offset="1.0000000" style="stop-color:#ffffff;stop-opacity:0.87450981;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7498" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient id="linearGradient7464">
<stop id="stop7466" offset="0.0000000" style="stop-color:#00039a;stop-opacity:1.0000000;"/>
<stop id="stop7468" offset="1.0000000" style="stop-color:#afa5ff;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7496" x1="23.213980" x2="23.201290" xlink:href="#linearGradient7464" y1="42.754631" y2="43.892632"/>
<linearGradient id="linearGradient5756">
<stop id="stop5758" offset="0.0000000" style="stop-color:#828282;stop-opacity:1.0000000;"/>
<stop id="stop5760" offset="1.0000000" style="stop-color:#929292;stop-opacity:0.35294119;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9321" x1="22.935030" x2="23.662106" xlink:href="#linearGradient5756" y1="42.699776" y2="43.892632"/>
<linearGradient id="linearGradient5742">
<stop id="stop5744" offset="0.0000000" style="stop-color:#adadad;stop-opacity:1.0000000;"/>
<stop id="stop5746" offset="1.0000000" style="stop-color:#f0f0f0;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7492" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9527" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9529" x1="22.935030" x2="23.662106" xlink:href="#linearGradient5756" y1="42.699776" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9531" x1="23.213980" x2="23.201290" xlink:href="#linearGradient7464" y1="42.754631" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9533" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient gradientTransform="matrix(24.16238,0.000000,0.000000,18.68556,-538.2464,-790.0391)" gradientUnits="userSpaceOnUse" id="linearGradient1336" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient gradientTransform="matrix(30.28350,0.000000,0.000000,30.28350,-680.9062,-1286.161)" gradientUnits="userSpaceOnUse" id="linearGradient1339" x1="23.213980" x2="23.201290" xlink:href="#linearGradient9163" y1="42.754631" y2="43.892632"/>
<linearGradient gradientTransform="matrix(33.82731,0.000000,0.000000,33.82731,-763.5122,-1439.594)" gradientUnits="userSpaceOnUse" id="linearGradient1342" x1="23.349695" x2="23.440580" xlink:href="#linearGradient5756" y1="42.767944" y2="43.710873"/>
<linearGradient gradientTransform="matrix(31.25000,0.000000,0.000000,31.25000,-700.0000,-1325.000)" gradientUnits="userSpaceOnUse" id="linearGradient1345" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
</defs>
<g id="layer1">
<g id="g31718" style="overflow:visible" transform="matrix(31.25000,0.000000,0.000000,31.25000,-325.0000,-975.0000)">
<path d="M 10.400000,31.200000 L 12.000000,31.200000 L 12.000000,32.800000 L 10.400000,32.800000 L 10.400000,31.200000 z " id="path31614" style="fill:url(#linearGradient31686);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000"/>
<path d="M 17.750000,43.343750 L 17.750000,44.656250 L 19.031250,44.656250 L 19.031250,43.343750 L 17.750000,43.343750 z " id="path31616" style="fill:url(#linearGradient31624);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible" transform="translate(-7.190625,-12.00000)"/>
<path d="M 17.812500,43.406250 L 17.812500,44.593750 L 18.968750,44.593750 L 18.968750,43.406250 L 17.812500,43.406250 z " id="path31618" style="fill:url(#linearGradient31704);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible" transform="translate(-7.190625,-12.00000)"/>
<path d="M 10.891195,31.445120 C 10.630356,31.445967 10.660563,31.393294 10.660563,31.792800 C 10.660563,31.988016 10.768517,32.159796 10.891195,32.159795 L 11.510263,32.159795 C 11.632945,32.159795 11.728757,31.994678 11.728757,31.792800 C 11.728757,31.389990 11.754584,31.441761 11.510263,31.445120 L 10.891195,31.445120 z " id="path31620" sodipodi:nodetypes="csccscc" style="fill:url(#linearGradient31681);fill-opacity:1.0000000;stroke:none;stroke-width:0.80000001;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible"/>
</g>
</g>
</svg>
""",
Triangle:"""
<svg height="50.000000px" id="svg9493" width="50.000000px" xmlns="http://www.w3.org/2000/svg" >
<defs id="defs9495">
<linearGradient gradientTransform="matrix(0.389994,0.000000,0.000000,0.403942,4.557010,29.83582)" gradientUnits="userSpaceOnUse" id="linearGradient28861" x1="23.187498" x2="23.187498" xlink:href="#linearGradient6506" y1="28.449617" y2="26.670279"/>
<linearGradient id="linearGradient6494">
<stop id="stop6496" offset="0.0000000" style="stop-color:%s;stop-opacity:1.0000000;"/>
<stop id="stop6498" offset="1.0000000" style="stop-color:%s;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientTransform="translate(-9.587500,13.60000)" gradientUnits="userSpaceOnUse" id="linearGradient28864" x1="23.181250" x2="23.187500" xlink:href="#linearGradient6494" y1="26.793751" y2="27.843750"/>
<linearGradient gradientTransform="translate(-9.587500,13.60000)" gradientUnits="userSpaceOnUse" id="linearGradient28867" x1="22.762501" x2="23.812500" xlink:href="#linearGradient5756" y1="26.687500" y2="27.906250"/>
<linearGradient gradientTransform="translate(-9.600000,13.60000)" gradientUnits="userSpaceOnUse" id="linearGradient28870" x1="23.187500" x2="23.200001" xlink:href="#linearGradient5742" y1="26.400000" y2="28.000000"/>
<linearGradient gradientTransform="matrix(0.389994,0.000000,0.000000,0.403942,9.357010,32.23582)" gradientUnits="userSpaceOnUse" id="linearGradient28801" x1="23.187498" x2="23.187498" xlink:href="#linearGradient6506" y1="28.449617" y2="26.670279"/>
<linearGradient gradientTransform="translate(-4.787500,16.00000)" gradientUnits="userSpaceOnUse" id="linearGradient28804" x1="23.181250" x2="23.187500" xlink:href="#linearGradient9163" y1="26.793751" y2="27.843750"/>
<linearGradient gradientTransform="translate(-4.787500,16.00000)" gradientUnits="userSpaceOnUse" id="linearGradient28807" x1="22.762501" x2="23.812500" xlink:href="#linearGradient5756" y1="26.687500" y2="27.906250"/>
<linearGradient gradientTransform="translate(-4.800000,16.00000)" gradientUnits="userSpaceOnUse" id="linearGradient28810" x1="23.187500" x2="23.200001" xlink:href="#linearGradient5742" y1="26.400000" y2="28.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9199" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient id="linearGradient9163">
<stop id="stop9165" offset="0.0000000" style="stop-color:#000000;stop-opacity:1.0000000;"/>
<stop id="stop9167" offset="1.0000000" style="stop-color:#8c8c8c;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9197" x1="23.213980" x2="23.201290" xlink:href="#linearGradient9163" y1="42.754631" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9195" x1="23.349695" x2="23.440580" xlink:href="#linearGradient5756" y1="42.767944" y2="43.710873"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9193" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient id="linearGradient6506">
<stop id="stop6508" offset="0.0000000" style="stop-color:#ffffff;stop-opacity:0.0000000;"/>
<stop id="stop6510" offset="1.0000000" style="stop-color:#ffffff;stop-opacity:0.87450981;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7498" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient id="linearGradient7464">
<stop id="stop7466" offset="0.0000000" style="stop-color:#00039a;stop-opacity:1.0000000;"/>
<stop id="stop7468" offset="1.0000000" style="stop-color:#afa5ff;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7496" x1="23.213980" x2="23.201290" xlink:href="#linearGradient7464" y1="42.754631" y2="43.892632"/>
<linearGradient id="linearGradient5756">
<stop id="stop5758" offset="0.0000000" style="stop-color:#828282;stop-opacity:1.0000000;"/>
<stop id="stop5760" offset="1.0000000" style="stop-color:#929292;stop-opacity:0.35294119;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9321" x1="22.935030" x2="23.662106" xlink:href="#linearGradient5756" y1="42.699776" y2="43.892632"/>
<linearGradient id="linearGradient5742">
<stop id="stop5744" offset="0.0000000" style="stop-color:#adadad;stop-opacity:1.0000000;"/>
<stop id="stop5746" offset="1.0000000" style="stop-color:#f0f0f0;stop-opacity:1.0000000;"/>
</linearGradient>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient7492" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9527" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9529" x1="22.935030" x2="23.662106" xlink:href="#linearGradient5756" y1="42.699776" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9531" x1="23.213980" x2="23.201290" xlink:href="#linearGradient7464" y1="42.754631" y2="43.892632"/>
<linearGradient gradientUnits="userSpaceOnUse" id="linearGradient9533" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient gradientTransform="matrix(24.16238,0.000000,0.000000,18.68556,-538.2464,-790.0391)" gradientUnits="userSpaceOnUse" id="linearGradient1336" x1="23.402565" x2="23.389874" xlink:href="#linearGradient6506" y1="44.066776" y2="42.883698"/>
<linearGradient gradientTransform="matrix(30.28350,0.000000,0.000000,30.28350,-680.9062,-1286.161)" gradientUnits="userSpaceOnUse" id="linearGradient1339" x1="23.213980" x2="23.201290" xlink:href="#linearGradient9163" y1="42.754631" y2="43.892632"/>
<linearGradient gradientTransform="matrix(33.82731,0.000000,0.000000,33.82731,-763.5122,-1439.594)" gradientUnits="userSpaceOnUse" id="linearGradient1342" x1="23.349695" x2="23.440580" xlink:href="#linearGradient5756" y1="42.767944" y2="43.710873"/>
<linearGradient gradientTransform="matrix(31.25000,0.000000,0.000000,31.25000,-700.0000,-1325.000)" gradientUnits="userSpaceOnUse" id="linearGradient1345" x1="23.193102" x2="23.200001" xlink:href="#linearGradient5742" y1="42.429230" y2="44.000000"/>
</defs>
<g id="layer1">
<g id="g28884" style="overflow:visible" transform="matrix(31.25000,0.000000,0.000000,31.25000,-400.0000,-1250.000)">
<path d="M 14.400000,41.600000 L 12.800000,41.600000 L 13.600000,40.000000 L 14.400000,41.600000 z " id="path28664" sodipodi:nodetypes="cccc" style="fill:url(#linearGradient28870);fill-opacity:1.0000000;stroke:none;stroke-width:0.064000003;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000"/>
<path d="M 13.600000,40.256250 L 12.975000,41.506250 L 14.225000,41.506250 L 13.600000,40.256250 z " id="path28666" style="fill:url(#linearGradient28867);fill-opacity:1.0000000;stroke:none;stroke-width:0.064000003;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible"/>
<path d="M 13.600000,40.381250 L 13.068750,41.443750 L 14.131250,41.443750 L 13.600000,40.381250 z " id="path28668" style="fill:url(#linearGradient28864);fill-opacity:1.0000000;stroke:none;stroke-width:0.064000003;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible"/>
<path d="M 13.575621,40.552906 C 13.555816,40.559679 13.538695,40.572979 13.526872,40.590776 C 13.522451,40.594595 13.518372,40.598819 13.514685,40.603399 L 13.307500,41.032587 C 13.299161,41.047990 13.294953,41.065424 13.295313,41.083080 C 13.296850,41.096430 13.300996,41.109315 13.307500,41.120950 C 13.310377,41.129925 13.314481,41.138427 13.319688,41.146196 C 13.323375,41.150775 13.327454,41.155000 13.331875,41.158819 C 13.339376,41.164212 13.347584,41.168462 13.356250,41.171442 C 13.367483,41.178179 13.379923,41.182474 13.392812,41.184066 L 13.807180,41.184066 C 13.835802,41.183428 13.862639,41.169530 13.880304,41.146196 C 13.884725,41.142377 13.888804,41.138152 13.892491,41.133573 C 13.898995,41.121938 13.903142,41.109053 13.904679,41.095703 C 13.905039,41.078047 13.900831,41.060614 13.892491,41.045211 C 13.892751,41.041007 13.892751,41.036791 13.892491,41.032587 L 13.685307,40.603399 C 13.681620,40.598819 13.677541,40.594595 13.673120,40.590776 C 13.650701,40.559305 13.612491,40.544463 13.575621,40.552906 z " id="path28670" style="fill:url(#linearGradient28861);fill-opacity:1.0000000;stroke:none;stroke-width:0.064000003;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4.0000000;stroke-opacity:1.0000000;overflow:visible"/>
</g>
</g>
</svg>
"""
}
    colours = {Red: (0xCF, 0x00, 0x00),
               Green: (0x0f, 0x69, 0x00),
               Yellow: (0xd2, 0xcd, 0x00),
               Grey: (0x5a, 0x5a, 0x5a),
               Orange: (0xda, 0x46, 0x15),
               Purple: (0x87, 0x00, 0x83),
               Blue: (0x00, 0x03, 0x9a)}

    clicked = pyqtSignal()
    pressed = pyqtSignal(bool)
    def __init__(self, parent=None, **kwargs):
        # Defaults must exist before QWidget.__init__ so that keyword
        # arguments can set the properties below.
        self.m_value = False
        self.m_onColour = QLed.Red
        self.m_offColour = QLed.Grey
        self.m_shape = QLed.Circle
        self.m_clickable = False
        QWidget.__init__(self, parent, **kwargs)
        self._pressed = False
        self.renderer = QSvgRenderer()
    def value(self): return self.m_value
    def setValue(self, value):
        self.m_value = value
        self.update()
    value = pyqtProperty(bool, value, setValue)

    def onColour(self): return self.m_onColour
    def setOnColour(self, newColour):
        self.m_onColour = newColour
        self.update()
    onColour = pyqtProperty(int, onColour, setOnColour)

    def offColour(self): return self.m_offColour
    def setOffColour(self, newColour):
        self.m_offColour = newColour
        self.update()
    offColour = pyqtProperty(int, offColour, setOffColour)

    def shape(self): return self.m_shape
    def setShape(self, newShape):
        self.m_shape = newShape
        self.update()
    shape = pyqtProperty(int, shape, setShape)

    def clickable(self): return self.m_clickable
    def setClickable(self, newClickability):
        self.m_clickable = newClickability
    clickable = pyqtProperty(bool, clickable, setClickable)
    def sizeHint(self):
        if self.m_shape == QLed.Triangle: return QSize(64, 48)
        elif self.m_shape == QLed.Round: return QSize(96, 48)
        return QSize(48, 48)

    def adjust(self, r, g, b):
        # Lighten the base colour by scaling its HLS lightness by 1.5.
        def normalise(x): return x / 255.0
        def denormalise(x): return int(x * 255.0)
        (h, l, s) = rgb_to_hls(normalise(r), normalise(g), normalise(b))
        (nr, ng, nb) = hls_to_rgb(h, l * 1.5, s)
        return (denormalise(nr), denormalise(ng), denormalise(nb))
    def paintEvent(self, event):
        option = QStyleOption()
        option.initFrom(self)
        h = option.rect.height()
        w = option.rect.width()
        if self.m_shape in (QLed.Triangle, QLed.Round):
            aspect = (4 / 3.0) if self.m_shape == QLed.Triangle else 2.0
            ah = w / aspect
            aw = w
            if ah > h:
                ah = h
                aw = h * aspect
            x = abs(aw - w) / 2.0
            y = abs(ah - h) / 2.0
            bounds = QRectF(x, y, aw, ah)
        else:
            size = min(w, h)
            x = abs(size - w) / 2.0
            y = abs(size - h) / 2.0
            bounds = QRectF(x, y, size, size)
        painter = QPainter(self)
        painter.setRenderHint(QPainter.Antialiasing, True)
        (dark_r, dark_g, dark_b) = self.colours[self.m_onColour if self.m_value else self.m_offColour]
        dark_str = "rgb(%d,%d,%d)" % (dark_r, dark_g, dark_b)
        light_str = "rgb(%d,%d,%d)" % self.adjust(dark_r, dark_g, dark_b)
        if six.PY3:
            __xml = (self.shapes[self.m_shape] % (dark_str, light_str)).encode('utf8')
            self.renderer.load(QByteArray(__xml))
        else:
            self.renderer.load(QByteArray(self.shapes[self.m_shape] % (dark_str, light_str)))
        self.renderer.render(painter, bounds)
    def mousePressEvent(self, event):
        self._pressed = True
        QWidget.mousePressEvent(self, event)

    def mouseReleaseEvent(self, event):
        if self._pressed:
            self._pressed = False
            if self.m_clickable:
                self.toggleValue()
                self.pressed.emit(self.m_value)
            else:
                self.pressed.emit(True)
            self.clicked.emit()
        QWidget.mouseReleaseEvent(self, event)

    def toggleValue(self):
        self.m_value = not self.m_value
        self.update()
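
# The adjust() method above lightens an LED's base colour by scaling its HLS
# lightness by 1.5. A standalone sketch of that transform (a hypothetical
# helper, not used by QLed itself; clamping is added here so the scaled
# lightness cannot leave the valid [0, 1] range):
def lighten(r, g, b, factor=1.5):
    """Return (r, g, b) with HLS lightness scaled by `factor`, clamped."""
    from colorsys import rgb_to_hls, hls_to_rgb
    (h, l, s) = rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    (nr, ng, nb) = hls_to_rgb(h, min(l * factor, 1.0), s)
    return tuple(int(x * 255.0) for x in (nr, ng, nb))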
if __name__ == "__main__":
    from sys import argv, exit

    class Test(QWidget):
        def __init__(self, parent=None):
            QWidget.__init__(self, parent)
            self.setWindowTitle("QLed Test")
            _l = QGridLayout()
            self.setLayout(_l)
            self.leds = []
            for row, shape in enumerate(QLed.shapes.keys()):
                for col, colour in enumerate(QLed.colours.keys()):
                    if colour == QLed.Grey: continue
                    led = QLed(self, onColour=colour, shape=shape)
                    _l.addWidget(led, row, col, Qt.AlignCenter)
                    self.leds.append(led)
            self.toggleLeds()

        def toggleLeds(self):
            for led in self.leds: led.toggleValue()
            QTimer.singleShot(1000, self.toggleLeds)

    a = QApplication(argv)
    t = Test()
    t.show()
    t.raise_()
    exit(a.exec_())
# Legal-move generation for Black (sente).
import re
import Bboard
import Bboardbak
import Wboard
import Wboardbak
import board
import oute
# Check that Black's own king is not left in check after making the move.
def kaihimore(sfen):
    mae = sfen[0:2]      # origin square, or 'X*' for a drop
    ushiro = sfen[2:4]   # destination square
    nari = sfen[4:5]     # '+' when the move promotes
    if mae[1:2] == '*':
        # Drop: place the piece letter directly on the destination.
        exec('Bboard.b{}="{}"'.format(ushiro, mae[0:1]))
    else:
        exec('Bboard.b{}=Bboard.b{}'.format(ushiro, mae))
        if nari == '+':
            exec("Bboard.b{}= '+'+Bboard.b{}".format(ushiro, ushiro))
        exec("Bboard.b{}=''".format(mae))
    exec("Wboard.w{}=''".format(ushiro))
    oute.boute()
    Bboardbak.yobidashi()
    Wboardbak.yobidashi()
    board.synth()
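
# Every sliding-piece block in move1 below spells out one ray square by
# square. As a compact illustration of the same idea (a hypothetical helper,
# not wired into move1), a rook or bishop ray can be walked with a single
# loop over file/rank offsets. Files are '1'-'9' and ranks 'a'-'i', matching
# the move strings used throughout this module.
def ray_squares(square, dfile, drank, occupied):
    """Yield empty squares from `square` along (dfile, drank) until blocked.

    `occupied` is a set of square names such as {'2b', '5e'}; the first
    occupied square stops the walk and is not yielded.
    """
    f = int(square[0])
    r = ord(square[1]) - ord('a')
    while True:
        f += dfile
        r += drank
        if not (1 <= f <= 9 and 0 <= r <= 8):
            return
        nxt = '%d%s' % (f, chr(ord('a') + r))
        if nxt in occupied:
            return
        yield nxt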
# Legal-move generation code follows.
def move1():
global depth1
Bboardbak.kioku()
Wboardbak.kioku()
board.synth()
depth1 = []
if Bboard.b1a !='':
if re.match(r'[GK+]',Bboard.b1a)and Bboard.b2a=='':
moves = '1a2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b1a)and Bboard.b1b=='':
moves = '1a1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b1a)and Bboard.b2b=='':
moves = '1a2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b1a)and Bboard.b2a=='':
moves = '1a2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b1a)and Bboard.b1b=='':
moves = '1a1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b1a)and Bboard.b2b=='':
moves = '1a2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b1c==''\
and board.s1b=='':
moves = '1a1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b1c==''\
and board.s1b=='':
moves = '1a1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b1d==''\
and board.s1b+board.s1c=='':
moves = '1a1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b1d==''\
and board.s1b+board.s1c=='':
moves = '1a1d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b1e==''\
and board.s1b+board.s1c+board.s1d=='':
moves = '1a1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b1e==''\
and board.s1b+board.s1c+board.s1d=='':
moves = '1a1e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b1f==''\
and board.s1b+board.s1c+board.s1d+board.s1e=='':
moves = '1a1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b1f==''\
and board.s1b+board.s1c+board.s1d+board.s1e=='':
moves = '1a1f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b1g==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f=='':
moves = '1a1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b1g==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f=='':
moves = '1a1g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b1h==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1a1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b1h==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1a1h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b1i==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1a1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b1i==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1a1i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b3a==''\
and board.s2a=='':
moves = '1a3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b3a==''\
and board.s2a=='':
moves = '1a3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b4a==''\
and board.s2a+board.s3a=='':
moves = '1a4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b4a==''\
and board.s2a+board.s3a=='':
moves = '1a4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b5a==''\
and board.s2a+board.s3a+board.s4a=='':
moves = '1a5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b5a==''\
and board.s2a+board.s3a+board.s4a=='':
moves = '1a5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b6a==''\
and board.s2a+board.s3a+board.s4a+board.s5a=='':
moves = '1a6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b6a==''\
and board.s2a+board.s3a+board.s4a+board.s5a=='':
moves = '1a6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b7a==''\
and board.s2a+board.s3a+board.s4a+board.s5a+board.s6a=='':
moves = '1a7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b7a==''\
and board.s2a+board.s3a+board.s4a+board.s5a+board.s6a=='':
moves = '1a7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b8a==''\
and board.s2a+board.s3a+board.s4a+board.s5a+board.s6a+board.s7a=='':
moves = '1a8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b8a==''\
and board.s2a+board.s3a+board.s4a+board.s5a+board.s6a+board.s7a=='':
moves = '1a8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b1a)and Bboard.b9a==''\
and board.s2a+board.s3a+board.s4a+board.s5a+board.s6a+board.s7a+board.s8a=='':
moves = '1a9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1a)and Bboard.b9a==''\
and board.s2a+board.s3a+board.s4a+board.s5a+board.s6a+board.s7a+board.s8a=='':
moves = '1a9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1a)and Bboard.b3c==''\
and board.s2b=='':
moves = '1a3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1a)and Bboard.b4d==''\
and board.s2b+board.s3c=='':
moves = '1a4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1a)and Bboard.b5e==''\
and board.s2b+board.s3c+board.s4d=='':
moves = '1a5e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1a)and Bboard.b6f==''\
and board.s2b+board.s3c+board.s4d+board.s5e=='':
moves = '1a6f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1a)and Bboard.b7g==''\
and board.s2b+board.s3c+board.s4d+board.s5e+board.s6f=='':
moves = '1a7g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1a)and Bboard.b8h==''\
and board.s2b+board.s3c+board.s4d+board.s5e+board.s6f+board.s7g=='':
moves = '1a8h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1a)and Bboard.b9i==''\
and board.s2b+board.s3c+board.s4d+board.s5e+board.s6f+board.s7g+board.s8h=='':
moves = '1a9i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b1a)and Bboard.b3c==''\
and board.s2b=='':
moves = '1a3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b1a)and Bboard.b4d==''\
and board.s2b+board.s3c=='':
moves = '1a4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b1a)and Bboard.b5e==''\
and board.s2b+board.s3c+board.s4d=='':
moves = '1a5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b1a)and Bboard.b6f==''\
and board.s2b+board.s3c+board.s4d+board.s5e=='':
moves = '1a6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b1a)and Bboard.b7g==''\
and board.s2b+board.s3c+board.s4d+board.s5e+board.s6f=='':
moves = '1a7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b1a)and Bboard.b8h==''\
and board.s2b+board.s3c+board.s4d+board.s5e+board.s6f+board.s7g=='':
moves = '1a8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b1a)and Bboard.b9i==''\
and board.s2b+board.s3c+board.s4d+board.s5e+board.s6f+board.s7g+board.s8h=='':
moves = '1a9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b2a !='':
if re.match(r'[GK+]',Bboard.b2a)and Bboard.b1a=='':
moves = '2a1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b2a)and Bboard.b3a=='':
moves = '2a3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b2a)and Bboard.b2b=='':
moves = '2a2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b2a)and Bboard.b1b=='':
moves = '2a1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b2a)and Bboard.b3b=='':
moves = '2a3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b2a)and Bboard.b1a=='':
moves = '2a1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b2a)and Bboard.b3a=='':
moves = '2a3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b2a)and Bboard.b2b=='':
moves = '2a2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b2a)and Bboard.b1b=='':
moves = '2a1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b2a)and Bboard.b3b=='':
moves = '2a3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b2c==''\
and board.s2b=='':
moves = '2a2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b2c==''\
and board.s2b=='':
moves = '2a2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b2d==''\
and board.s2b+board.s2c=='':
moves = '2a2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b2d==''\
and board.s2b+board.s2c=='':
moves = '2a2d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b2e==''\
and board.s2b+board.s2c+board.s2d=='':
moves = '2a2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b2e==''\
and board.s2b+board.s2c+board.s2d=='':
moves = '2a2e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b2f==''\
and board.s2b+board.s2c+board.s2d+board.s2e=='':
moves = '2a2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b2f==''\
and board.s2b+board.s2c+board.s2d+board.s2e=='':
moves = '2a2f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b2g==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f=='':
moves = '2a2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b2g==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f=='':
moves = '2a2g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b2h==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2a2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b2h==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2a2h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b2i==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2a2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b2i==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2a2i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b4a==''\
and board.s3a=='':
moves = '2a4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b4a==''\
and board.s3a=='':
moves = '2a4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b5a==''\
and board.s3a+board.s4a=='':
moves = '2a5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b5a==''\
and board.s3a+board.s4a=='':
moves = '2a5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b6a==''\
and board.s3a+board.s4a+board.s5a=='':
moves = '2a6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b6a==''\
and board.s3a+board.s4a+board.s5a=='':
moves = '2a6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b7a==''\
and board.s3a+board.s4a+board.s5a+board.s6a=='':
moves = '2a7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b7a==''\
and board.s3a+board.s4a+board.s5a+board.s6a=='':
moves = '2a7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b8a==''\
and board.s3a+board.s4a+board.s5a+board.s6a+board.s7a=='':
moves = '2a8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b8a==''\
and board.s3a+board.s4a+board.s5a+board.s6a+board.s7a=='':
moves = '2a8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2a)and Bboard.b9a==''\
and board.s3a+board.s4a+board.s5a+board.s6a+board.s7a+board.s8a=='':
moves = '2a9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2a)and Bboard.b9a==''\
and board.s3a+board.s4a+board.s5a+board.s6a+board.s7a+board.s8a=='':
moves = '2a9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2a)and Bboard.b4c==''\
and board.s3b=='':
moves = '2a4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2a)and Bboard.b5d==''\
and board.s3b+board.s4c=='':
moves = '2a5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2a)and Bboard.b6e==''\
and board.s3b+board.s4c+board.s5d=='':
moves = '2a6e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2a)and Bboard.b7f==''\
and board.s3b+board.s4c+board.s5d+board.s6e=='':
moves = '2a7f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2a)and Bboard.b8g==''\
and board.s3b+board.s4c+board.s5d+board.s6e+board.s7f=='':
moves = '2a8g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2a)and Bboard.b9h==''\
and board.s3b+board.s4c+board.s5d+board.s6e+board.s7f+board.s8g=='':
moves = '2a9h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b2a)and Bboard.b4c==''\
and board.s3b=='':
moves = '2a4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b2a)and Bboard.b5d==''\
and board.s3b+board.s4c=='':
moves = '2a5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b2a)and Bboard.b6e==''\
and board.s3b+board.s4c+board.s5d=='':
moves = '2a6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b2a)and Bboard.b7f==''\
and board.s3b+board.s4c+board.s5d+board.s6e=='':
moves = '2a7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b2a)and Bboard.b8g==''\
and board.s3b+board.s4c+board.s5d+board.s6e+board.s7f=='':
moves = '2a8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b2a)and Bboard.b9h==''\
and board.s3b+board.s4c+board.s5d+board.s6e+board.s7f+board.s8g=='':
moves = '2a9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b3a !='':
if re.match(r'[GK+]',Bboard.b3a)and Bboard.b2a=='':
moves = '3a2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b3a)and Bboard.b4a=='':
moves = '3a4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b3a)and Bboard.b3b=='':
moves = '3a3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b3a)and Bboard.b2b=='':
moves = '3a2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b3a)and Bboard.b4b=='':
moves = '3a4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b3a)and Bboard.b2a=='':
moves = '3a2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b3a)and Bboard.b4a=='':
moves = '3a4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b3a)and Bboard.b3b=='':
moves = '3a3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b3a)and Bboard.b2b=='':
moves = '3a2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b3a)and Bboard.b4b=='':
moves = '3a4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b3c==''\
and board.s3b=='':
moves = '3a3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b3c==''\
and board.s3b=='':
moves = '3a3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b3d==''\
and board.s3b+board.s3c=='':
moves = '3a3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b3d==''\
and board.s3b+board.s3c=='':
moves = '3a3d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b3e==''\
and board.s3b+board.s3c+board.s3d=='':
moves = '3a3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b3e==''\
and board.s3b+board.s3c+board.s3d=='':
moves = '3a3e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b3f==''\
and board.s3b+board.s3c+board.s3d+board.s3e=='':
moves = '3a3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b3f==''\
and board.s3b+board.s3c+board.s3d+board.s3e=='':
moves = '3a3f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b3g==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f=='':
moves = '3a3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b3g==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f=='':
moves = '3a3g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b3h==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3a3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b3h==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3a3h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b3i==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3a3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b3i==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3a3i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b1a==''\
and board.s2a=='':
moves = '3a1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b1a==''\
and board.s2a=='':
moves = '3a1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b5a==''\
and board.s4a=='':
moves = '3a5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b5a==''\
and board.s4a=='':
moves = '3a5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b6a==''\
and board.s4a+board.s5a=='':
moves = '3a6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b6a==''\
and board.s4a+board.s5a=='':
moves = '3a6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b7a==''\
and board.s4a+board.s5a+board.s6a=='':
moves = '3a7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b7a==''\
and board.s4a+board.s5a+board.s6a=='':
moves = '3a7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b8a==''\
and board.s4a+board.s5a+board.s6a+board.s7a=='':
moves = '3a8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b8a==''\
and board.s4a+board.s5a+board.s6a+board.s7a=='':
moves = '3a8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b3a)and Bboard.b9a==''\
and board.s4a+board.s5a+board.s6a+board.s7a+board.s8a=='':
moves = '3a9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3a)and Bboard.b9a==''\
and board.s4a+board.s5a+board.s6a+board.s7a+board.s8a=='':
moves = '3a9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3a)and Bboard.b1c==''\
and board.s2b=='':
moves = '3a1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3a)and Bboard.b5c==''\
and board.s4b=='':
moves = '3a5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3a)and Bboard.b6d==''\
and board.s4b+board.s5c=='':
moves = '3a6d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3a)and Bboard.b7e==''\
and board.s4b+board.s5c+board.s6d=='':
moves = '3a7e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3a)and Bboard.b8f==''\
and board.s4b+board.s5c+board.s6d+board.s7e=='':
moves = '3a8f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3a)and Bboard.b9g==''\
and board.s4b+board.s5c+board.s6d+board.s7e+board.s8f=='':
moves = '3a9g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b3a)and Bboard.b1c==''\
and board.s2b=='':
moves = '3a1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b3a)and Bboard.b5c==''\
and board.s4b=='':
moves = '3a5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b3a)and Bboard.b6d==''\
and board.s4b+board.s5c=='':
moves = '3a6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b3a)and Bboard.b7e==''\
and board.s4b+board.s5c+board.s6d=='':
moves = '3a7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3a) and Bboard.b8f==''\
and board.s4b+board.s5c+board.s6d+board.s7e=='':
moves = '3a8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3a) and Bboard.b9g==''\
and board.s4b+board.s5c+board.s6d+board.s7e+board.s8f=='':
moves = '3a9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
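The sliding-piece branches above spell out each chain of intermediate squares (`board.s4b+board.s5c+…==''`) by hand. A small generator can compute that chain for any rook or bishop line; this is a hedged sketch with a hypothetical name (`squares_between` is not part of this module), shown only to document the pattern:

```python
def squares_between(file_from, rank_from, file_to, rank_to):
    """Yield the squares strictly between two squares on a shared rank,
    file, or diagonal, as (file, rank) pairs, e.g. (5, 'a')."""
    # step direction along files (1..9) and ranks ('a'..'i')
    df = (file_to > file_from) - (file_to < file_from)
    dr = (ord(rank_to) > ord(rank_from)) - (ord(rank_to) < ord(rank_from))
    f, r = file_from + df, chr(ord(rank_from) + dr)
    while (f, r) != (file_to, rank_to):
        yield (f, r)
        f, r = f + df, chr(ord(r) + dr)
```

For example, the 3a→7e bishop line above corresponds to `squares_between(3, 'a', 7, 'e')`, which yields 4b, 5c, 6d — the same squares the hand-written condition tests.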
if Bboard.b4a !='':
if re.match(r'[GK+]',Bboard.b4a)and Bboard.b3a=='':
moves = '4a3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b4a)and Bboard.b5a=='':
moves = '4a5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b4a)and Bboard.b4b=='':
moves = '4a4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b4a)and Bboard.b3b=='':
moves = '4a3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b4a)and Bboard.b5b=='':
moves = '4a5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b4a)and Bboard.b3a=='':
moves = '4a3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b4a)and Bboard.b5a=='':
moves = '4a5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b4a)and Bboard.b4b=='':
moves = '4a4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b4a)and Bboard.b3b=='':
moves = '4a3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b4a)and Bboard.b5b=='':
moves = '4a5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b4c==''\
and board.s4b=='':
moves = '4a4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b4c==''\
and board.s4b=='':
moves = '4a4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b4d==''\
and board.s4b+board.s4c=='':
moves = '4a4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b4d==''\
and board.s4b+board.s4c=='':
moves = '4a4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b4e==''\
and board.s4b+board.s4c+board.s4d=='':
moves = '4a4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b4e==''\
and board.s4b+board.s4c+board.s4d=='':
moves = '4a4e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b4f==''\
and board.s4b+board.s4c+board.s4d+board.s4e=='':
moves = '4a4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b4f==''\
and board.s4b+board.s4c+board.s4d+board.s4e=='':
moves = '4a4f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b4g==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f=='':
moves = '4a4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b4g==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f=='':
moves = '4a4g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b4h==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4a4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b4h==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4a4h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b4i==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4a4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b4i==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4a4i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b1a==''\
and board.s2a+board.s3a=='':
moves = '4a1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b1a==''\
and board.s2a+board.s3a=='':
moves = '4a1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b2a==''\
and board.s3a=='':
moves = '4a2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b2a==''\
and board.s3a=='':
moves = '4a2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b6a==''\
and board.s5a=='':
moves = '4a6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b6a==''\
and board.s5a=='':
moves = '4a6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b7a==''\
and board.s5a+board.s6a=='':
moves = '4a7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b7a==''\
and board.s5a+board.s6a=='':
moves = '4a7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b8a==''\
and board.s5a+board.s6a+board.s7a=='':
moves = '4a8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b8a==''\
and board.s5a+board.s6a+board.s7a=='':
moves = '4a8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4a) and Bboard.b9a==''\
and board.s5a+board.s6a+board.s7a+board.s8a=='':
moves = '4a9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4a)and Bboard.b9a==''\
and board.s5a+board.s6a+board.s7a+board.s8a=='':
moves = '4a9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4a) and Bboard.b6c==''\
and board.s5b=='':
moves = '4a6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4a) and Bboard.b7d==''\
and board.s5b+board.s6c=='':
moves = '4a7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4a) and Bboard.b8e==''\
and board.s5b+board.s6c+board.s7d=='':
moves = '4a8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4a) and Bboard.b9f==''\
and board.s5b+board.s6c+board.s7d+board.s8e=='':
moves = '4a9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4a)and Bboard.b6c==''\
and board.s5b=='':
moves = '4a6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4a)and Bboard.b7d==''\
and board.s5b+board.s6c=='':
moves = '4a7d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4a)and Bboard.b8e==''\
and board.s5b+board.s6c+board.s7d=='':
moves = '4a8e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4a)and Bboard.b9f==''\
and board.s5b+board.s6c+board.s7d+board.s8e=='':
moves = '4a9f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4a)and Bboard.b1d==''\
and board.s2c+board.s3b=='':
moves = '4a1d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4a)and Bboard.b2c==''\
and board.s3b=='':
moves = '4a2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4a) and Bboard.b1d==''\
and board.s2c+board.s3b=='':
moves = '4a1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4a) and Bboard.b2c==''\
and board.s3b=='':
moves = '4a2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b5a !='':
if re.match(r'[GK+]',Bboard.b5a)and Bboard.b4a=='':
moves = '5a4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b5a)and Bboard.b6a=='':
moves = '5a6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b5a)and Bboard.b5b=='':
moves = '5a5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b5a)and Bboard.b4b=='':
moves = '5a4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b5a)and Bboard.b6b=='':
moves = '5a6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b5a)and Bboard.b4a=='':
moves = '5a4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b5a)and Bboard.b6a=='':
moves = '5a6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b5a)and Bboard.b5b=='':
moves = '5a5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b5a)and Bboard.b4b=='':
moves = '5a4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b5a)and Bboard.b6b=='':
moves = '5a6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b5c==''\
and board.s5b=='':
moves = '5a5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b5c==''\
and board.s5b=='':
moves = '5a5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b5d==''\
and board.s5b+board.s5c=='':
moves = '5a5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b5d==''\
and board.s5b+board.s5c=='':
moves = '5a5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b5e==''\
and board.s5b+board.s5c+board.s5d=='':
moves = '5a5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b5e==''\
and board.s5b+board.s5c+board.s5d=='':
moves = '5a5e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b5f==''\
and board.s5b+board.s5c+board.s5d+board.s5e=='':
moves = '5a5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b5f==''\
and board.s5b+board.s5c+board.s5d+board.s5e=='':
moves = '5a5f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b5g==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f=='':
moves = '5a5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b5g==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f=='':
moves = '5a5g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b5h==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5a5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b5h==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5a5h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b5i==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5a5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b5i==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5a5i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b1a==''\
and board.s2a+board.s3a+board.s4a=='':
moves = '5a1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b1a==''\
and board.s2a+board.s3a+board.s4a=='':
moves = '5a1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b2a==''\
and board.s3a+board.s4a=='':
moves = '5a2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b2a==''\
and board.s3a+board.s4a=='':
moves = '5a2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b3a==''\
and board.s4a=='':
moves = '5a3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b3a==''\
and board.s4a=='':
moves = '5a3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b7a==''\
and board.s6a=='':
moves = '5a7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b7a==''\
and board.s6a=='':
moves = '5a7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b8a==''\
and board.s6a+board.s7a=='':
moves = '5a8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b8a==''\
and board.s6a+board.s7a=='':
moves = '5a8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5a) and Bboard.b9a==''\
and board.s6a+board.s7a+board.s8a=='':
moves = '5a9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5a)and Bboard.b9a==''\
and board.s6a+board.s7a+board.s8a=='':
moves = '5a9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5a) and Bboard.b7c==''\
and board.s6b=='':
moves = '5a7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5a) and Bboard.b8d==''\
and board.s6b+board.s7c=='':
moves = '5a8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5a) and Bboard.b9e==''\
and board.s6b+board.s7c+board.s8d=='':
moves = '5a9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5a)and Bboard.b7c==''\
and board.s6b=='':
moves = '5a7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5a)and Bboard.b8d==''\
and board.s6b+board.s7c=='':
moves = '5a8d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5a)and Bboard.b9e==''\
and board.s6b+board.s7c+board.s8d=='':
moves = '5a9e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5a)and Bboard.b2d==''\
and board.s3c+board.s4b=='':
moves = '5a2d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5a)and Bboard.b3c==''\
and board.s4b=='':
moves = '5a3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5a) and Bboard.b2d==''\
and board.s3c+board.s4b=='':
moves = '5a2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5a) and Bboard.b3c==''\
and board.s4b=='':
moves = '5a3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5a) and Bboard.b1e==''\
and board.s4b+board.s3c+board.s2d=='':
moves = '5a1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5a)and Bboard.b1e==''\
and board.s4b+board.s3c+board.s2d=='':
moves = '5a1e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b6a !='':
if re.match(r'[GK+]',Bboard.b6a)and Bboard.b5a=='':
moves = '6a5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b6a)and Bboard.b7a=='':
moves = '6a7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b6a)and Bboard.b6b=='':
moves = '6a6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b6a)and Bboard.b5b=='':
moves = '6a5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b6a)and Bboard.b7b=='':
moves = '6a7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b6a)and Bboard.b5a=='':
moves = '6a5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b6a)and Bboard.b7a=='':
moves = '6a7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b6a)and Bboard.b6b=='':
moves = '6a6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b6a)and Bboard.b5b=='':
moves = '6a5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b6a)and Bboard.b7b=='':
moves = '6a7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b6c==''\
and board.s6b=='':
moves = '6a6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b6c==''\
and board.s6b=='':
moves = '6a6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b6d==''\
and board.s6b+board.s6c=='':
moves = '6a6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b6d==''\
and board.s6b+board.s6c=='':
moves = '6a6d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b6e==''\
and board.s6b+board.s6c+board.s6d=='':
moves = '6a6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b6e==''\
and board.s6b+board.s6c+board.s6d=='':
moves = '6a6e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b6f==''\
and board.s6b+board.s6c+board.s6d+board.s6e=='':
moves = '6a6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b6f==''\
and board.s6b+board.s6c+board.s6d+board.s6e=='':
moves = '6a6f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b6g==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f=='':
moves = '6a6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b6g==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f=='':
moves = '6a6g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b6h==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6a6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b6h==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6a6h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b6i==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6a6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b6i==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6a6i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b9a==''\
and board.s8a+board.s7a=='':
moves = '6a9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b9a==''\
and board.s8a+board.s7a=='':
moves = '6a9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b8a==''\
and board.s7a=='':
moves = '6a8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b8a==''\
and board.s7a=='':
moves = '6a8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b4a==''\
and board.s5a=='':
moves = '6a4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b4a==''\
and board.s5a=='':
moves = '6a4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b3a==''\
and board.s5a+board.s4a=='':
moves = '6a3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b3a==''\
and board.s5a+board.s4a=='':
moves = '6a3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b2a==''\
and board.s5a+board.s4a+board.s3a=='':
moves = '6a2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b2a==''\
and board.s5a+board.s4a+board.s3a=='':
moves = '6a2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6a) and Bboard.b1a==''\
and board.s5a+board.s4a+board.s3a+board.s2a=='':
moves = '6a1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6a)and Bboard.b1a==''\
and board.s5a+board.s4a+board.s3a+board.s2a=='':
moves = '6a1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6a) and Bboard.b4c==''\
and board.s5b=='':
moves = '6a4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6a) and Bboard.b3d==''\
and board.s5b+board.s4c=='':
moves = '6a3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6a) and Bboard.b2e==''\
and board.s5b+board.s4c+board.s3d=='':
moves = '6a2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6a) and Bboard.b1f==''\
and board.s5b+board.s4c+board.s3d+board.s2e=='':
moves = '6a1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6a)and Bboard.b4c==''\
and board.s5b=='':
moves = '6a4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6a)and Bboard.b3d==''\
and board.s5b+board.s4c=='':
moves = '6a3d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6a)and Bboard.b2e==''\
and board.s5b+board.s4c+board.s3d=='':
moves = '6a2e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6a)and Bboard.b1f==''\
and board.s5b+board.s4c+board.s3d+board.s2e=='':
moves = '6a1f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6a)and Bboard.b9d==''\
and board.s8c+board.s7b=='':
moves = '6a9d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6a)and Bboard.b8c==''\
and board.s7b=='':
moves = '6a8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6a) and Bboard.b9d==''\
and board.s8c+board.s7b=='':
moves = '6a9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6a) and Bboard.b8c==''\
and board.s7b=='':
moves = '6a8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
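The paired `R`/`+R` and `B`/`+B` branches above all encode one rule: a move by an unpromoted rook or bishop from the back rank always takes the `+` promotion suffix, while an already-promoted piece moves without it. A minimal sketch of that suffix rule (`usi_move` is a hypothetical helper, assuming moves from rank 'a' where promotion is always available, matching the always-promote choice the branches make):

```python
def usi_move(piece, src, dst):
    """Build a USI-style move string; append '+' only when the moving
    piece is an unpromoted major piece (promotion assumed available)."""
    promote = piece in ('R', 'B')
    return src + dst + ('+' if promote else '')

# '6a6i' from a promoted rook stays plain; from a plain rook it promotes.
```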
if Bboard.b7a !='':
if re.match(r'[GK+]',Bboard.b7a)and Bboard.b6a=='':
moves = '7a6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b7a)and Bboard.b8a=='':
moves = '7a8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b7a)and Bboard.b7b=='':
moves = '7a7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b7a)and Bboard.b6b=='':
moves = '7a6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b7a)and Bboard.b8b=='':
moves = '7a8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b7a)and Bboard.b6a=='':
moves = '7a6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b7a)and Bboard.b8a=='':
moves = '7a8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b7a)and Bboard.b7b=='':
moves = '7a7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b7a)and Bboard.b6b=='':
moves = '7a6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b7a)and Bboard.b8b=='':
moves = '7a8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b7c==''\
and board.s7b=='':
moves = '7a7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b7c==''\
and board.s7b=='':
moves = '7a7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b7d==''\
and board.s7b+board.s7c=='':
moves = '7a7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b7d==''\
and board.s7b+board.s7c=='':
moves = '7a7d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b7e==''\
and board.s7b+board.s7c+board.s7d=='':
moves = '7a7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b7e==''\
and board.s7b+board.s7c+board.s7d=='':
moves = '7a7e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b7f==''\
and board.s7b+board.s7c+board.s7d+board.s7e=='':
moves = '7a7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b7f==''\
and board.s7b+board.s7c+board.s7d+board.s7e=='':
moves = '7a7f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b7g==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f=='':
moves = '7a7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b7g==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f=='':
moves = '7a7g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b7h==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7a7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b7h==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7a7h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b7i==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7a7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b7i==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7a7i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b9a==''\
and board.s8a=='':
moves = '7a9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b9a==''\
and board.s8a=='':
moves = '7a9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b5a==''\
and board.s6a=='':
moves = '7a5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b5a==''\
and board.s6a=='':
moves = '7a5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b4a==''\
and board.s6a+board.s5a=='':
moves = '7a4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b4a==''\
and board.s6a+board.s5a=='':
moves = '7a4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b3a==''\
and board.s6a+board.s5a+board.s4a=='':
moves = '7a3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b3a==''\
and board.s6a+board.s5a+board.s4a=='':
moves = '7a3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b2a==''\
and board.s6a+board.s5a+board.s4a+board.s3a=='':
moves = '7a2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b2a==''\
and board.s6a+board.s5a+board.s4a+board.s3a=='':
moves = '7a2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7a) and Bboard.b1a==''\
and board.s6a+board.s5a+board.s4a+board.s3a+board.s2a=='':
moves = '7a1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7a)and Bboard.b1a==''\
and board.s6a+board.s5a+board.s4a+board.s3a+board.s2a=='':
moves = '7a1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7a)and Bboard.b9c==''\
and board.s8b=='':
moves = '7a9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7a)and Bboard.b5c==''\
and board.s6b=='':
moves = '7a5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7a)and Bboard.b4d==''\
and board.s6b+board.s5c=='':
moves = '7a4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b7a) and Bboard.b3e==''\
and board.s6b+board.s5c+board.s4d=='':
moves = '7a3e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b7a) and Bboard.b2f==''\
and board.s6b+board.s5c+board.s4d+board.s3e=='':
moves = '7a2f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7a)and Bboard.b1g==''\
and board.s6b+board.s5c+board.s4d+board.s3e+board.s2f=='':
moves = '7a1g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7a) and Bboard.b9c==''\
and board.s8b=='':
moves = '7a9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7a) and Bboard.b5c==''\
and board.s6b=='':
moves = '7a5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7a) and Bboard.b4d==''\
and board.s6b+board.s5c=='':
moves = '7a4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7a) and Bboard.b3e==''\
and board.s6b+board.s5c+board.s4d=='':
moves = '7a3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7a) and Bboard.b2f==''\
and board.s6b+board.s5c+board.s4d+board.s3e=='':
moves = '7a2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7a) and Bboard.b1g==''\
and board.s6b+board.s5c+board.s4d+board.s3e+board.s2f=='':
moves = '7a1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b8a !='':
if re.match(r'[GK+]',Bboard.b8a)and Bboard.b7a=='':
moves = '8a7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b8a)and Bboard.b9a=='':
moves = '8a9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b8a)and Bboard.b8b=='':
moves = '8a8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b8a)and Bboard.b7b=='':
moves = '8a7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b8a)and Bboard.b9b=='':
moves = '8a9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b8a)and Bboard.b7a=='':
moves = '8a7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b8a)and Bboard.b9a=='':
moves = '8a9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b8a)and Bboard.b8b=='':
moves = '8a8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b8a)and Bboard.b7b=='':
moves = '8a7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b8a)and Bboard.b9b=='':
moves = '8a9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a) and Bboard.b8c==''\
and board.s8b=='':
moves = '8a8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b8c==''\
and board.s8b=='':
moves = '8a8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a) and Bboard.b8d==''\
and board.s8b+board.s8c=='':
moves = '8a8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b8d==''\
and board.s8b+board.s8c=='':
moves = '8a8d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a) and Bboard.b8e==''\
and board.s8b+board.s8c+board.s8d=='':
moves = '8a8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b8e==''\
and board.s8b+board.s8c+board.s8d=='':
moves = '8a8e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a) and Bboard.b8f==''\
and board.s8b+board.s8c+board.s8d+board.s8e=='':
moves = '8a8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b8f==''\
and board.s8b+board.s8c+board.s8d+board.s8e=='':
moves = '8a8f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b8g==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f=='':
moves = '8a8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b8g==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f=='':
moves = '8a8g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b8h==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8a8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b8h==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8a8h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b8i==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8a8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b8i==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8a8i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b6a==''\
and board.s7a=='':
moves = '8a6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b6a==''\
and board.s7a=='':
moves = '8a6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b5a==''\
and board.s7a+board.s6a=='':
moves = '8a5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b5a==''\
and board.s7a+board.s6a=='':
moves = '8a5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b4a==''\
and board.s7a+board.s6a+board.s5a=='':
moves = '8a4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b4a==''\
and board.s7a+board.s6a+board.s5a=='':
moves = '8a4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b3a==''\
and board.s7a+board.s6a+board.s5a+board.s4a=='':
moves = '8a3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b3a==''\
and board.s7a+board.s6a+board.s5a+board.s4a=='':
moves = '8a3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b2a==''\
and board.s7a+board.s6a+board.s5a+board.s4a+board.s3a=='':
moves = '8a2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b2a==''\
and board.s7a+board.s6a+board.s5a+board.s4a+board.s3a=='':
moves = '8a2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8a)and Bboard.b1a==''\
and board.s7a+board.s6a+board.s5a+board.s4a+board.s3a+board.s2a=='':
moves = '8a1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8a)and Bboard.b1a==''\
and board.s7a+board.s6a+board.s5a+board.s4a+board.s3a+board.s2a=='':
moves = '8a1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8a)and Bboard.b6c==''\
and board.s7b=='':
moves = '8a6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8a)and Bboard.b5d==''\
and board.s7b+board.s6c=='':
moves = '8a5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8a)and Bboard.b4e==''\
and board.s7b+board.s6c+board.s5d=='':
moves = '8a4e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8a)and Bboard.b3f==''\
and board.s7b+board.s6c+board.s5d+board.s4e=='':
moves = '8a3f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8a)and Bboard.b2g==''\
and board.s7b+board.s6c+board.s5d+board.s4e+board.s3f=='':
moves = '8a2g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8a)and Bboard.b1h==''\
and board.s7b+board.s6c+board.s5d+board.s4e+board.s3f+board.s2g=='':
moves = '8a1h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8a)and Bboard.b6c==''\
and board.s7b=='':
moves = '8a6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8a)and Bboard.b5d==''\
and board.s7b+board.s6c=='':
moves = '8a5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8a)and Bboard.b4e==''\
and board.s7b+board.s6c+board.s5d=='':
moves = '8a4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8a)and Bboard.b3f==''\
and board.s7b+board.s6c+board.s5d+board.s4e=='':
moves = '8a3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8a)and Bboard.b2g==''\
and board.s7b+board.s6c+board.s5d+board.s4e+board.s3f=='':
moves = '8a2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8a)and Bboard.b1h==''\
and board.s7b+board.s6c+board.s5d+board.s4e+board.s3f+board.s2g=='':
moves = '8a1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b9a !='':
if re.match(r'[GK+]',Bboard.b9a)and Bboard.b8a=='':
moves = '9a8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b9a)and Bboard.b9b=='':
moves = '9a9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b9a)and Bboard.b8b=='':
moves = '9a8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b9a)and Bboard.b8a=='':
moves = '9a8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b9a)and Bboard.b9b=='':
moves = '9a9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b9a)and Bboard.b8b=='':
moves = '9a8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b9c==''\
and board.s9b=='':
moves = '9a9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b9c==''\
and board.s9b=='':
moves = '9a9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b9d==''\
and board.s9b+board.s9c=='':
moves = '9a9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b9d==''\
and board.s9b+board.s9c=='':
moves = '9a9d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b9e==''\
and board.s9b+board.s9c+board.s9d=='':
moves = '9a9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b9e==''\
and board.s9b+board.s9c+board.s9d=='':
moves = '9a9e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b9f==''\
and board.s9b+board.s9c+board.s9d+board.s9e=='':
moves = '9a9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b9f==''\
and board.s9b+board.s9c+board.s9d+board.s9e=='':
moves = '9a9f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b9g==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f=='':
moves = '9a9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b9g==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f=='':
moves = '9a9g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b9h==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9a9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b9h==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9a9h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b9i==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9a9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b9i==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9a9i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b7a==''\
and board.s8a=='':
moves = '9a7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b7a==''\
and board.s8a=='':
moves = '9a7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b6a==''\
and board.s8a+board.s7a=='':
moves = '9a6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b6a==''\
and board.s8a+board.s7a=='':
moves = '9a6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b5a==''\
and board.s8a+board.s7a+board.s6a=='':
moves = '9a5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b5a==''\
and board.s8a+board.s7a+board.s6a=='':
moves = '9a5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b4a==''\
and board.s8a+board.s7a+board.s6a+board.s5a=='':
moves = '9a4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b4a==''\
and board.s8a+board.s7a+board.s6a+board.s5a=='':
moves = '9a4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b3a==''\
and board.s8a+board.s7a+board.s6a+board.s5a+board.s4a=='':
moves = '9a3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b3a==''\
and board.s8a+board.s7a+board.s6a+board.s5a+board.s4a=='':
moves = '9a3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b2a==''\
and board.s8a+board.s7a+board.s6a+board.s5a+board.s4a+board.s3a=='':
moves = '9a2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b2a==''\
and board.s8a+board.s7a+board.s6a+board.s5a+board.s4a+board.s3a=='':
moves = '9a2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9a)and Bboard.b1a==''\
and board.s8a+board.s7a+board.s6a+board.s5a+board.s4a+board.s3a+board.s2a=='':
moves = '9a1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9a)and Bboard.b1a==''\
and board.s8a+board.s7a+board.s6a+board.s5a+board.s4a+board.s3a+board.s2a=='':
moves = '9a1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9a)and Bboard.b7c==''\
and board.s8b=='':
moves = '9a7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9a)and Bboard.b6d==''\
and board.s8b+board.s7c=='':
moves = '9a6d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9a)and Bboard.b5e==''\
and board.s8b+board.s7c+board.s6d=='':
moves = '9a5e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9a)and Bboard.b4f==''\
and board.s8b+board.s7c+board.s6d+board.s5e=='':
moves = '9a4f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9a)and Bboard.b3g==''\
and board.s8b+board.s7c+board.s6d+board.s5e+board.s4f=='':
moves = '9a3g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9a)and Bboard.b2h==''\
and board.s8b+board.s7c+board.s6d+board.s5e+board.s4f+board.s3g=='':
moves = '9a2h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9a)and Bboard.b1i==''\
and board.s8b+board.s7c+board.s6d+board.s5e+board.s4f+board.s3g+board.s2h=='':
moves = '9a1i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9a)and Bboard.b7c==''\
and board.s8b=='':
moves = '9a7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9a)and Bboard.b6d==''\
and board.s8b+board.s7c=='':
moves = '9a6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9a)and Bboard.b5e==''\
and board.s8b+board.s7c+board.s6d=='':
moves = '9a5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9a)and Bboard.b4f==''\
and board.s8b+board.s7c+board.s6d+board.s5e=='':
moves = '9a4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9a)and Bboard.b3g==''\
and board.s8b+board.s7c+board.s6d+board.s5e+board.s4f=='':
moves = '9a3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9a)and Bboard.b2h==''\
and board.s8b+board.s7c+board.s6d+board.s5e+board.s4f+board.s3g=='':
moves = '9a2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9a)and Bboard.b1i==''\
and board.s8b+board.s7c+board.s6d+board.s5e+board.s4f+board.s3g+board.s2h=='':
moves = '9a1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b1b !='':
if re.match(r'[SGK+]',Bboard.b1b)and Bboard.b1a=='':
moves = '1b1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]',Bboard.b1b)and Bboard.b2a=='':
moves = '1b2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b1b)and Bboard.b2b=='':
moves = '1b2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b1b)and Bboard.b1c=='':
moves = '1b1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b1b)and Bboard.b2c=='':
moves = '1b2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]',Bboard.b1b)and Bboard.b1a=='':
moves = '1b1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b1b)and Bboard.b2a=='':
moves = '1b2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b1b)and Bboard.b2b=='':
moves = '1b2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b1b)and Bboard.b1c=='':
moves = '1b1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b1b)and Bboard.b2c=='':
moves = '1b2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b1d==''\
and board.s1c=='':
moves = '1b1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b1d==''\
and board.s1c=='':
moves = '1b1d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b1e==''\
and board.s1c+board.s1d=='':
moves = '1b1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b1e==''\
and board.s1c+board.s1d=='':
moves = '1b1e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b1f==''\
and board.s1c+board.s1d+board.s1e=='':
moves = '1b1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b1f==''\
and board.s1c+board.s1d+board.s1e=='':
moves = '1b1f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b1g==''\
and board.s1c+board.s1d+board.s1e+board.s1f=='':
moves = '1b1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b1g==''\
and board.s1c+board.s1d+board.s1e+board.s1f=='':
moves = '1b1g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b1h==''\
and board.s1c+board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1b1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b1h==''\
and board.s1c+board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1b1h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b1i==''\
and board.s1c+board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1b1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b1i==''\
and board.s1c+board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1b1i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b3b==''\
and board.s2b=='':
moves = '1b3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b3b==''\
and board.s2b=='':
moves = '1b3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b4b==''\
and board.s2b+board.s3b=='':
moves = '1b4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b4b==''\
and board.s2b+board.s3b=='':
moves = '1b4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b5b==''\
and board.s2b+board.s3b+board.s4b=='':
moves = '1b5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b5b==''\
and board.s2b+board.s3b+board.s4b=='':
moves = '1b5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b6b==''\
and board.s2b+board.s3b+board.s4b+board.s5b=='':
moves = '1b6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b6b==''\
and board.s2b+board.s3b+board.s4b+board.s5b=='':
moves = '1b6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b7b==''\
and board.s2b+board.s3b+board.s4b+board.s5b+board.s6b=='':
moves = '1b7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b7b==''\
and board.s2b+board.s3b+board.s4b+board.s5b+board.s6b=='':
moves = '1b7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b8b==''\
and board.s2b+board.s3b+board.s4b+board.s5b+board.s6b+board.s7b=='':
moves = '1b8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b8b==''\
and board.s2b+board.s3b+board.s4b+board.s5b+board.s6b+board.s7b=='':
moves = '1b8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1b)and Bboard.b9b==''\
and board.s2b+board.s3b+board.s4b+board.s5b+board.s6b+board.s7b+board.s8b=='':
moves = '1b9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b1b)and Bboard.b9b==''\
and board.s2b+board.s3b+board.s4b+board.s5b+board.s6b+board.s7b+board.s8b=='':
moves = '1b9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1b)and Bboard.b3d==''\
and board.s2c=='':
moves = '1b3d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1b)and Bboard.b4e==''\
and board.s2c+board.s3d=='':
moves = '1b4e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1b)and Bboard.b5f==''\
and board.s2c+board.s3d+board.s4e=='':
moves = '1b5f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1b)and Bboard.b6g==''\
and board.s2c+board.s3d+board.s4e+board.s5f=='':
moves = '1b6g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1b)and Bboard.b7h==''\
and board.s2c+board.s3d+board.s4e+board.s5f+board.s6g=='':
moves = '1b7h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1b)and Bboard.b8i==''\
and board.s2c+board.s3d+board.s4e+board.s5f+board.s6g+board.s7h=='':
moves = '1b8i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1b)and Bboard.b3d==''\
and board.s2c=='':
moves = '1b3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1b)and Bboard.b4e==''\
and board.s2c+board.s3d=='':
moves = '1b4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1b)and Bboard.b5f==''\
and board.s2c+board.s3d+board.s4e=='':
moves = '1b5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1b)and Bboard.b6g==''\
and board.s2c+board.s3d+board.s4e+board.s5f=='':
moves = '1b6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1b)and Bboard.b7h==''\
and board.s2c+board.s3d+board.s4e+board.s5f+board.s6g=='':
moves = '1b7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1b)and Bboard.b8i==''\
and board.s2c+board.s3d+board.s4e+board.s5f+board.s6g+board.s7h=='':
moves = '1b8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b2b !='':
if re.match(r'[SGK+]',Bboard.b2b)and Bboard.b2a=='':
moves = '2b2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]',Bboard.b2b)and Bboard.b1a=='':
moves = '2b1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]',Bboard.b2b)and Bboard.b3a=='':
moves = '2b3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b2b)and Bboard.b1b=='':
moves = '2b1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b2b)and Bboard.b3b=='':
moves = '2b3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b2b)and Bboard.b2c=='':
moves = '2b2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b2b)and Bboard.b1c=='':
moves = '2b1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b2b)and Bboard.b3c=='':
moves = '2b3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]',Bboard.b2b)and Bboard.b2a=='':
moves = '2b2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b2b)and Bboard.b1a=='':
moves = '2b1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b2b)and Bboard.b3a=='':
moves = '2b3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b2b)and Bboard.b1b=='':
moves = '2b1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b2b)and Bboard.b3b=='':
moves = '2b3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b2b)and Bboard.b2c=='':
moves = '2b2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b2b)and Bboard.b1c=='':
moves = '2b1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b2b)and Bboard.b3c=='':
moves = '2b3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b2d==''\
and board.s2c=='':
moves = '2b2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b2d==''\
and board.s2c=='':
moves = '2b2d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b2e==''\
and board.s2c+board.s2d=='':
moves = '2b2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b2e==''\
and board.s2c+board.s2d=='':
moves = '2b2e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b2f==''\
and board.s2c+board.s2d+board.s2e=='':
moves = '2b2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b2f==''\
and board.s2c+board.s2d+board.s2e=='':
moves = '2b2f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b2g==''\
and board.s2c+board.s2d+board.s2e+board.s2f=='':
moves = '2b2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b2g==''\
and board.s2c+board.s2d+board.s2e+board.s2f=='':
moves = '2b2g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b2h==''\
and board.s2c+board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2b2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b2h==''\
and board.s2c+board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2b2h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b2i==''\
and board.s2c+board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2b2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b2i==''\
and board.s2c+board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2b2i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b4b==''\
and board.s3b=='':
moves = '2b4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b4b==''\
and board.s3b=='':
moves = '2b4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b5b==''\
and board.s3b+board.s4b=='':
moves = '2b5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b5b==''\
and board.s3b+board.s4b=='':
moves = '2b5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b6b==''\
and board.s3b+board.s4b+board.s5b=='':
moves = '2b6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b6b==''\
and board.s3b+board.s4b+board.s5b=='':
moves = '2b6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b7b==''\
and board.s3b+board.s4b+board.s5b+board.s6b=='':
moves = '2b7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b7b==''\
and board.s3b+board.s4b+board.s5b+board.s6b=='':
moves = '2b7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b8b==''\
and board.s3b+board.s4b+board.s5b+board.s6b+board.s7b=='':
moves = '2b8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b8b==''\
and board.s3b+board.s4b+board.s5b+board.s6b+board.s7b=='':
moves = '2b8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2b)and Bboard.b9b==''\
and board.s3b+board.s4b+board.s5b+board.s6b+board.s7b+board.s8b=='':
moves = '2b9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2b)and Bboard.b9b==''\
and board.s3b+board.s4b+board.s5b+board.s6b+board.s7b+board.s8b=='':
moves = '2b9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2b)and Bboard.b4d==''\
and board.s3c=='':
moves = '2b4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2b)and Bboard.b5e==''\
and board.s3c+board.s4d=='':
moves = '2b5e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2b)and Bboard.b6f==''\
and board.s3c+board.s4d+board.s5e=='':
moves = '2b6f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2b)and Bboard.b7g==''\
and board.s3c+board.s4d+board.s5e+board.s6f=='':
moves = '2b7g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2b)and Bboard.b8h==''\
and board.s3c+board.s4d+board.s5e+board.s6f+board.s7g=='':
moves = '2b8h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2b)and Bboard.b9i==''\
and board.s3c+board.s4d+board.s5e+board.s6f+board.s7g+board.s8h=='':
moves = '2b9i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2b)and Bboard.b4d==''\
and board.s3c=='':
moves = '2b4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2b)and Bboard.b5e==''\
and board.s3c+board.s4d=='':
moves = '2b5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2b)and Bboard.b6f==''\
and board.s3c+board.s4d+board.s5e=='':
moves = '2b6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2b)and Bboard.b7g==''\
and board.s3c+board.s4d+board.s5e+board.s6f=='':
moves = '2b7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2b)and Bboard.b8h==''\
and board.s3c+board.s4d+board.s5e+board.s6f+board.s7g=='':
moves = '2b8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2b)and Bboard.b9i==''\
and board.s3c+board.s4d+board.s5e+board.s6f+board.s7g+board.s8h=='':
moves = '2b9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b3b !='':
if re.match(r'[SGK+]',Bboard.b3b)and Bboard.b3a=='':
moves = '3b3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]',Bboard.b3b)and Bboard.b2a=='':
moves = '3b2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]',Bboard.b3b)and Bboard.b4a=='':
moves = '3b4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b3b)and Bboard.b2b=='':
moves = '3b2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b3b)and Bboard.b4b=='':
moves = '3b4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]',Bboard.b3b)and Bboard.b3c=='':
moves = '3b3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b3b)and Bboard.b2c=='':
moves = '3b2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b3b)and Bboard.b4c=='':
moves = '3b4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]',Bboard.b3b)and Bboard.b3a=='':
moves = '3b3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b3b)and Bboard.b2a=='':
moves = '3b2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b3b)and Bboard.b4a=='':
moves = '3b4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b3b)and Bboard.b2b=='':
moves = '3b2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b3b)and Bboard.b4b=='':
moves = '3b4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R',Bboard.b3b)and Bboard.b3c=='':
moves = '3b3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b3b)and Bboard.b2c=='':
moves = '3b2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]',Bboard.b3b)and Bboard.b4c=='':
moves = '3b4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b3d==''\
and board.s3c=='':
moves = '3b3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b3d==''\
and board.s3c=='':
moves = '3b3d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b3e==''\
and board.s3c+board.s3d=='':
moves = '3b3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b3e==''\
and board.s3c+board.s3d=='':
moves = '3b3e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b3f==''\
and board.s3c+board.s3d+board.s3e=='':
moves = '3b3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b3f==''\
and board.s3c+board.s3d+board.s3e=='':
moves = '3b3f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b3g==''\
and board.s3c+board.s3d+board.s3e+board.s3f=='':
moves = '3b3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b3g==''\
and board.s3c+board.s3d+board.s3e+board.s3f=='':
moves = '3b3g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b3h==''\
and board.s3c+board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3b3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b3h==''\
and board.s3c+board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3b3h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b3i==''\
and board.s3c+board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3b3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b3i==''\
and board.s3c+board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3b3i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b1b==''\
and board.s2b=='':
moves = '3b1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b1b==''\
and board.s2b=='':
moves = '3b1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b5b==''\
and board.s4b=='':
moves = '3b5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b5b==''\
and board.s4b=='':
moves = '3b5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b6b==''\
and board.s4b+board.s5b=='':
moves = '3b6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b6b==''\
and board.s4b+board.s5b=='':
moves = '3b6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b7b==''\
and board.s4b+board.s5b+board.s6b=='':
moves = '3b7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b7b==''\
and board.s4b+board.s5b+board.s6b=='':
moves = '3b7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b8b==''\
and board.s4b+board.s5b+board.s6b+board.s7b=='':
moves = '3b8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b8b==''\
and board.s4b+board.s5b+board.s6b+board.s7b=='':
moves = '3b8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3b)and Bboard.b9b==''\
and board.s4b+board.s5b+board.s6b+board.s7b+board.s8b=='':
moves = '3b9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3b)and Bboard.b9b==''\
and board.s4b+board.s5b+board.s6b+board.s7b+board.s8b=='':
moves = '3b9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3b)and Bboard.b5d==''\
and board.s4c=='':
moves = '3b5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3b)and Bboard.b6e==''\
and board.s4c+board.s5d=='':
moves = '3b6e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3b)and Bboard.b7f==''\
and board.s4c+board.s5d+board.s6e=='':
moves = '3b7f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3b)and Bboard.b8g==''\
and board.s4c+board.s5d+board.s6e+board.s7f=='':
moves = '3b8g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3b)and Bboard.b9h==''\
and board.s4c+board.s5d+board.s6e+board.s7f+board.s8g=='':
moves = '3b9h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3b)and Bboard.b5d==''\
and board.s4c=='':
moves = '3b5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3b)and Bboard.b6e==''\
and board.s4c+board.s5d=='':
moves = '3b6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3b)and Bboard.b7f==''\
and board.s4c+board.s5d+board.s6e=='':
moves = '3b7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3b)and Bboard.b8g==''\
and board.s4c+board.s5d+board.s6e+board.s7f=='':
moves = '3b8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3b)and Bboard.b9h==''\
and board.s4c+board.s5d+board.s6e+board.s7f+board.s8g=='':
moves = '3b9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3b)and Bboard.b1d==''\
and board.s2c=='':
moves = '3b1d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3b)and Bboard.b1d==''\
and board.s2c=='':
moves = '3b1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 4b: try each candidate move with kaihimore(moves); keep it only when oute.oute == 0 (the king is not left in check).
if Bboard.b4b !='':
if re.match(r'[SGK+]', Bboard.b4b)and Bboard.b4a=='':
moves = '4b4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b4b)and Bboard.b3a=='':
moves = '4b3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b4b)and Bboard.b5a=='':
moves = '4b5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4b)and Bboard.b3b=='':
moves = '4b3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4b)and Bboard.b5b=='':
moves = '4b5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4b)and Bboard.b4c=='':
moves = '4b4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b4b)and Bboard.b3c=='':
moves = '4b3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b4b)and Bboard.b5c=='':
moves = '4b5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b4b)and Bboard.b4a=='':
moves = '4b4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4b)and Bboard.b3a=='':
moves = '4b3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4b)and Bboard.b5a=='':
moves = '4b5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b3b=='':
moves = '4b3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b5b=='':
moves = '4b5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b4c=='':
moves = '4b4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4b)and Bboard.b3c=='':
moves = '4b3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4b)and Bboard.b5c=='':
moves = '4b5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b4d==''\
and board.s4c=='':
moves = '4b4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b4d==''\
and board.s4c=='':
moves = '4b4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b4e==''\
and board.s4c+board.s4d=='':
moves = '4b4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b4e==''\
and board.s4c+board.s4d=='':
moves = '4b4e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b4f==''\
and board.s4c+board.s4d+board.s4e=='':
moves = '4b4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b4f==''\
and board.s4c+board.s4d+board.s4e=='':
moves = '4b4f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b4g==''\
and board.s4c+board.s4d+board.s4e+board.s4f=='':
moves = '4b4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b4g==''\
and board.s4c+board.s4d+board.s4e+board.s4f=='':
moves = '4b4g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b4h==''\
and board.s4c+board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4b4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b4h==''\
and board.s4c+board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4b4h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b4i==''\
and board.s4c+board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4b4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b4i==''\
and board.s4c+board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4b4i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b1b==''\
and board.s2b+board.s3b=='':
moves = '4b1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b1b==''\
and board.s2b+board.s3b=='':
moves = '4b1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b2b==''\
and board.s3b=='':
moves = '4b2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b2b==''\
and board.s3b=='':
moves = '4b2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b6b==''\
and board.s5b=='':
moves = '4b6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b6b==''\
and board.s5b=='':
moves = '4b6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b7b==''\
and board.s5b+board.s6b=='':
moves = '4b7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b7b==''\
and board.s5b+board.s6b=='':
moves = '4b7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b8b==''\
and board.s5b+board.s6b+board.s7b=='':
moves = '4b8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b8b==''\
and board.s5b+board.s6b+board.s7b=='':
moves = '4b8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4b)and Bboard.b9b==''\
and board.s5b+board.s6b+board.s7b+board.s8b=='':
moves = '4b9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4b)and Bboard.b9b==''\
and board.s5b+board.s6b+board.s7b+board.s8b=='':
moves = '4b9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4b)and Bboard.b6d==''\
and board.s5c=='':
moves = '4b6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4b)and Bboard.b7e==''\
and board.s5c+board.s6d=='':
moves = '4b7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4b)and Bboard.b8f==''\
and board.s5c+board.s6d+board.s7e=='':
moves = '4b8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4b)and Bboard.b9g==''\
and board.s5c+board.s6d+board.s7e+board.s8f=='':
moves = '4b9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4b)and Bboard.b6d==''\
and board.s5c=='':
moves = '4b6d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4b)and Bboard.b7e==''\
and board.s5c+board.s6d=='':
moves = '4b7e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4b)and Bboard.b8f==''\
and board.s5c+board.s6d+board.s7e=='':
moves = '4b8f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4b)and Bboard.b9g==''\
and board.s5c+board.s6d+board.s7e+board.s8f=='':
moves = '4b9g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4b)and Bboard.b1e==''\
and board.s2d+board.s3c=='':
moves = '4b1e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4b)and Bboard.b2d==''\
and board.s3c=='':
moves = '4b2d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4b)and Bboard.b1e==''\
and board.s2d+board.s3c=='':
moves = '4b1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4b)and Bboard.b2d==''\
and board.s3c=='':
moves = '4b2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 5b: try each candidate move with kaihimore(moves); keep it only when oute.oute == 0 (the king is not left in check).
if Bboard.b5b !='':
if re.match(r'[SGK+]', Bboard.b5b)and Bboard.b5a=='':
moves = '5b5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b5b)and Bboard.b4a=='':
moves = '5b4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b5b)and Bboard.b6a=='':
moves = '5b6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5b)and Bboard.b4b=='':
moves = '5b4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5b)and Bboard.b6b=='':
moves = '5b6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5b)and Bboard.b5c=='':
moves = '5b5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b5b)and Bboard.b4c=='':
moves = '5b4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b5b)and Bboard.b6c=='':
moves = '5b6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b5b)and Bboard.b5a=='':
moves = '5b5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5b)and Bboard.b4a=='':
moves = '5b4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5b)and Bboard.b6a=='':
moves = '5b6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b4b=='':
moves = '5b4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b6b=='':
moves = '5b6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b5c=='':
moves = '5b5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5b)and Bboard.b4c=='':
moves = '5b4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5b)and Bboard.b6c=='':
moves = '5b6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b5d==''\
and board.s5c=='':
moves = '5b5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b5d==''\
and board.s5c=='':
moves = '5b5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b5e==''\
and board.s5c+board.s5d=='':
moves = '5b5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b5e==''\
and board.s5c+board.s5d=='':
moves = '5b5e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b5f==''\
and board.s5c+board.s5d+board.s5e=='':
moves = '5b5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b5f==''\
and board.s5c+board.s5d+board.s5e=='':
moves = '5b5f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b5g==''\
and board.s5c+board.s5d+board.s5e+board.s5f=='':
moves = '5b5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b5g==''\
and board.s5c+board.s5d+board.s5e+board.s5f=='':
moves = '5b5g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b5h==''\
and board.s5c+board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5b5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b5h==''\
and board.s5c+board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5b5h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b5i==''\
and board.s5c+board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5b5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b5i==''\
and board.s5c+board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5b5i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b1b==''\
and board.s2b+board.s3b+board.s4b=='':
moves = '5b1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b1b==''\
and board.s2b+board.s3b+board.s4b=='':
moves = '5b1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b2b==''\
and board.s3b+board.s4b=='':
moves = '5b2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b2b==''\
and board.s3b+board.s4b=='':
moves = '5b2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b3b==''\
and board.s4b=='':
moves = '5b3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b3b==''\
and board.s4b=='':
moves = '5b3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b7b==''\
and board.s6b=='':
moves = '5b7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b7b==''\
and board.s6b=='':
moves = '5b7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b8b==''\
and board.s6b+board.s7b=='':
moves = '5b8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b8b==''\
and board.s6b+board.s7b=='':
moves = '5b8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5b)and Bboard.b9b==''\
and board.s6b+board.s7b+board.s8b=='':
moves = '5b9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5b)and Bboard.b9b==''\
and board.s6b+board.s7b+board.s8b=='':
moves = '5b9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5b)and Bboard.b7d==''\
and board.s6c=='':
moves = '5b7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5b)and Bboard.b8e==''\
and board.s6c+board.s7d=='':
moves = '5b8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5b)and Bboard.b9f==''\
and board.s6c+board.s7d+board.s8e=='':
moves = '5b9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5b)and Bboard.b7d==''\
and board.s6c=='':
moves = '5b7d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5b)and Bboard.b8e==''\
and board.s6c+board.s7d=='':
moves = '5b8e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5b)and Bboard.b9f==''\
and board.s6c+board.s7d+board.s8e=='':
moves = '5b9f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5b)and Bboard.b2e==''\
and board.s3d+board.s4c=='':
moves = '5b2e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5b)and Bboard.b3d==''\
and board.s4c=='':
moves = '5b3d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5b)and Bboard.b2e==''\
and board.s3d+board.s4c=='':
moves = '5b2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5b)and Bboard.b3d==''\
and board.s4c=='':
moves = '5b3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5b)and Bboard.b1f==''\
and board.s4c+board.s3d+board.s2e=='':
moves = '5b1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5b)and Bboard.b1f==''\
and board.s4c+board.s3d+board.s2e=='':
moves = '5b1f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 6b: try each candidate move with kaihimore(moves); keep it only when oute.oute == 0 (the king is not left in check).
if Bboard.b6b !='':
if re.match(r'[SGK+]', Bboard.b6b)and Bboard.b6a=='':
moves = '6b6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b6b)and Bboard.b5a=='':
moves = '6b5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b6b)and Bboard.b7a=='':
moves = '6b7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6b)and Bboard.b5b=='':
moves = '6b5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6b)and Bboard.b7b=='':
moves = '6b7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6b)and Bboard.b6c=='':
moves = '6b6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b6b)and Bboard.b5c=='':
moves = '6b5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b6b)and Bboard.b7c=='':
moves = '6b7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b6b)and Bboard.b6a=='':
moves = '6b6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6b)and Bboard.b5a=='':
moves = '6b5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6b)and Bboard.b7a=='':
moves = '6b7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b5b=='':
moves = '6b5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b7b=='':
moves = '6b7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b6c=='':
moves = '6b6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6b)and Bboard.b5c=='':
moves = '6b5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6b)and Bboard.b7c=='':
moves = '6b7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b6d==''\
and board.s6c=='':
moves = '6b6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b6d==''\
and board.s6c=='':
moves = '6b6d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b6e==''\
and board.s6c+board.s6d=='':
moves = '6b6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b6e==''\
and board.s6c+board.s6d=='':
moves = '6b6e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b6f==''\
and board.s6c+board.s6d+board.s6e=='':
moves = '6b6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b6f==''\
and board.s6c+board.s6d+board.s6e=='':
moves = '6b6f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b6g==''\
and board.s6c+board.s6d+board.s6e+board.s6f=='':
moves = '6b6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b6g==''\
and board.s6c+board.s6d+board.s6e+board.s6f=='':
moves = '6b6g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b6h==''\
and board.s6c+board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6b6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b6h==''\
and board.s6c+board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6b6h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b6i==''\
and board.s6c+board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6b6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b6i==''\
and board.s6c+board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6b6i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b9b==''\
and board.s8b+board.s7b=='':
moves = '6b9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b9b==''\
and board.s8b+board.s7b=='':
moves = '6b9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b8b==''\
and board.s7b=='':
moves = '6b8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b8b==''\
and board.s7b=='':
moves = '6b8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b4b==''\
and board.s5b=='':
moves = '6b4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b4b==''\
and board.s5b=='':
moves = '6b4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b3b==''\
and board.s5b+board.s4b=='':
moves = '6b3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b3b==''\
and board.s5b+board.s4b=='':
moves = '6b3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b2b==''\
and board.s5b+board.s4b+board.s3b=='':
moves = '6b2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b2b==''\
and board.s5b+board.s4b+board.s3b=='':
moves = '6b2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6b)and Bboard.b1b==''\
and board.s5b+board.s4b+board.s3b+board.s2b=='':
moves = '6b1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6b)and Bboard.b1b==''\
and board.s5b+board.s4b+board.s3b+board.s2b=='':
moves = '6b1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6b)and Bboard.b4d==''\
and board.s5c=='':
moves = '6b4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6b)and Bboard.b3e==''\
and board.s5c+board.s4d=='':
moves = '6b3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6b)and Bboard.b2f==''\
and board.s5c+board.s4d+board.s3e=='':
moves = '6b2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6b)and Bboard.b1g==''\
and board.s5c+board.s4d+board.s3e+board.s2f=='':
moves = '6b1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6b)and Bboard.b4d==''\
and board.s5c=='':
moves = '6b4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6b)and Bboard.b3e==''\
and board.s5c+board.s4d=='':
moves = '6b3e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6b)and Bboard.b2f==''\
and board.s5c+board.s4d+board.s3e=='':
moves = '6b2f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6b)and Bboard.b1g==''\
and board.s5c+board.s4d+board.s3e+board.s2f=='':
moves = '6b1g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6b)and Bboard.b9e==''\
and board.s8d+board.s7c=='':
moves = '6b9e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6b)and Bboard.b8d==''\
and board.s7c=='':
moves = '6b8d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6b)and Bboard.b9e==''\
and board.s8d+board.s7c=='':
moves = '6b9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6b)and Bboard.b8d==''\
and board.s7c=='':
moves = '6b8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 7b: try each candidate move with kaihimore(moves); keep it only when oute.oute == 0 (the king is not left in check).
if Bboard.b7b !='':
if re.match(r'[SGK+]', Bboard.b7b)and Bboard.b7a=='':
moves = '7b7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b7b)and Bboard.b6a=='':
moves = '7b6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b7b)and Bboard.b8a=='':
moves = '7b8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7b)and Bboard.b6b=='':
moves = '7b6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7b)and Bboard.b8b=='':
moves = '7b8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7b)and Bboard.b7c=='':
moves = '7b7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b7b)and Bboard.b6c=='':
moves = '7b6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b7b)and Bboard.b8c=='':
moves = '7b8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b7b)and Bboard.b7a=='':
moves = '7b7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7b)and Bboard.b6a=='':
moves = '7b6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7b)and Bboard.b8a=='':
moves = '7b8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b6b=='':
moves = '7b6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b8b=='':
moves = '7b8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b7c=='':
moves = '7b7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7b)and Bboard.b6c=='':
moves = '7b6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7b)and Bboard.b8c=='':
moves = '7b8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7b)and Bboard.b7d==''\
and board.s7c=='':
moves = '7b7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b7d==''\
and board.s7c=='':
moves = '7b7d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7b)and Bboard.b7e==''\
and board.s7c+board.s7d=='':
moves = '7b7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b7e==''\
and board.s7c+board.s7d=='':
moves = '7b7e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7b)and Bboard.b7f==''\
and board.s7c+board.s7d+board.s7e=='':
moves = '7b7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b7f==''\
and board.s7c+board.s7d+board.s7e=='':
moves = '7b7f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7b)and Bboard.b7g==''\
and board.s7c+board.s7d+board.s7e+board.s7f=='':
moves = '7b7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b7g==''\
and board.s7c+board.s7d+board.s7e+board.s7f=='':
moves = '7b7g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7b)and Bboard.b7h==''\
and board.s7c+board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7b7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b7h==''\
and board.s7c+board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7b7h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7b)and Bboard.b7i==''\
and board.s7c+board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7b7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b7i==''\
and board.s7c+board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7b7i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7b)and Bboard.b9b==''\
and board.s8b=='':
moves = '7b9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b9b==''\
and board.s8b=='':
moves = '7b9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7b)and Bboard.b5b==''\
and board.s6b=='':
moves = '7b5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b5b==''\
and board.s6b=='':
moves = '7b5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b7b)and Bboard.b4b==''\
and board.s6b+board.s5b=='':
moves = '7b4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b4b==''\
and board.s6b+board.s5b=='':
moves = '7b4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b7b)and Bboard.b3b==''\
and board.s6b+board.s5b+board.s4b=='':
moves = '7b3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b3b==''\
and board.s6b+board.s5b+board.s4b=='':
moves = '7b3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b7b)and Bboard.b2b==''\
and board.s6b+board.s5b+board.s4b+board.s3b=='':
moves = '7b2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b2b==''\
and board.s6b+board.s5b+board.s4b+board.s3b=='':
moves = '7b2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b7b)and Bboard.b1b==''\
and board.s6b+board.s5b+board.s4b+board.s3b+board.s2b=='':
moves = '7b1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7b)and Bboard.b1b==''\
and board.s6b+board.s5b+board.s4b+board.s3b+board.s2b=='':
moves = '7b1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7b)and Bboard.b5d==''\
and board.s6c=='':
moves = '7b5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7b)and Bboard.b4e==''\
and board.s6c+board.s5d=='':
moves = '7b4e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7b)and Bboard.b3f==''\
and board.s6c+board.s5d+board.s4e=='':
moves = '7b3f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7b)and Bboard.b2g==''\
and board.s6c+board.s5d+board.s4e+board.s3f=='':
moves = '7b2g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7b)and Bboard.b1h==''\
and board.s6c+board.s5d+board.s4e+board.s3f+board.s2g=='':
moves = '7b1h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b7b)and Bboard.b5d==''\
and board.s6c=='':
moves = '7b5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b7b)and Bboard.b4e==''\
and board.s6c+board.s5d=='':
moves = '7b4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b7b)and Bboard.b3f==''\
and board.s6c+board.s5d+board.s4e=='':
moves = '7b3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b7b)and Bboard.b2g==''\
and board.s6c+board.s5d+board.s4e+board.s3f=='':
moves = '7b2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b7b)and Bboard.b1h==''\
and board.s6c+board.s5d+board.s4e+board.s3f+board.s2g=='':
moves = '7b1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7b)and Bboard.b9d==''\
and board.s8c=='':
moves = '7b9d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b7b)and Bboard.b9d==''\
and board.s8c=='':
moves = '7b9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b8b != '':
    # Moves for the piece on 8b.  Each entry: piece pattern, destination,
    # squares that must be empty in between (empty tuple for one-step moves),
    # and the USI move string (a trailing '+' promotes).  The 8b-8d slides
    # originally tested board.s8e, a generation typo for board.s8c.
    moves_8b = [
        (r'[SGK+]',      '8a', (),                                   '8b8a'),
        (r'[SGK+]',      '7a', (),                                   '8b7a'),
        (r'[SGK+]',      '9a', (),                                   '8b9a'),
        (r'[GK+]',       '7b', (),                                   '8b7b'),
        (r'[GK+]',       '9b', (),                                   '8b9b'),
        (r'[GK+]',       '8c', (),                                   '8b8c'),
        (r'\+R|\+B|S|K', '7c', (),                                   '8b7c'),
        (r'\+R|\+B|S|K', '9c', (),                                   '8b9c'),
        (r'[PLSR]',      '8a', (),                                   '8b8a+'),
        (r'[BS]',        '7a', (),                                   '8b7a+'),
        (r'[BS]',        '9a', (),                                   '8b9a+'),
        (r'R',           '7b', (),                                   '8b7b+'),
        (r'R',           '9b', (),                                   '8b9b+'),
        (r'R',           '8c', (),                                   '8b8c+'),
        (r'[BS]',        '7c', (),                                   '8b7c+'),
        (r'[BS]',        '9c', (),                                   '8b9c+'),
        (r'\+R', '8d', ('8c',),                                      '8b8d'),
        (r'R',   '8d', ('8c',),                                      '8b8d+'),
        (r'\+R', '8e', ('8c', '8d'),                                 '8b8e'),
        (r'R',   '8e', ('8c', '8d'),                                 '8b8e+'),
        (r'\+R', '8f', ('8c', '8d', '8e'),                           '8b8f'),
        (r'R',   '8f', ('8c', '8d', '8e'),                           '8b8f+'),
        (r'\+R', '8g', ('8c', '8d', '8e', '8f'),                     '8b8g'),
        (r'R',   '8g', ('8c', '8d', '8e', '8f'),                     '8b8g+'),
        (r'\+R', '8h', ('8c', '8d', '8e', '8f', '8g'),               '8b8h'),
        (r'R',   '8h', ('8c', '8d', '8e', '8f', '8g'),               '8b8h+'),
        (r'\+R', '8i', ('8c', '8d', '8e', '8f', '8g', '8h'),         '8b8i'),
        (r'R',   '8i', ('8c', '8d', '8e', '8f', '8g', '8h'),         '8b8i+'),
        (r'\+R', '6b', ('7b',),                                      '8b6b'),
        (r'R',   '6b', ('7b',),                                      '8b6b+'),
        (r'\+R', '5b', ('7b', '6b'),                                 '8b5b'),
        (r'R',   '5b', ('7b', '6b'),                                 '8b5b+'),
        (r'\+R', '4b', ('7b', '6b', '5b'),                           '8b4b'),
        (r'R',   '4b', ('7b', '6b', '5b'),                           '8b4b+'),
        (r'\+R', '3b', ('7b', '6b', '5b', '4b'),                     '8b3b'),
        (r'R',   '3b', ('7b', '6b', '5b', '4b'),                     '8b3b+'),
        (r'\+R', '2b', ('7b', '6b', '5b', '4b', '3b'),               '8b2b'),
        (r'R',   '2b', ('7b', '6b', '5b', '4b', '3b'),               '8b2b+'),
        (r'\+R', '1b', ('7b', '6b', '5b', '4b', '3b', '2b'),         '8b1b'),
        (r'R',   '1b', ('7b', '6b', '5b', '4b', '3b', '2b'),         '8b1b+'),
        (r'B',   '6d', ('7c',),                                      '8b6d+'),
        (r'B',   '5e', ('7c', '6d'),                                 '8b5e+'),
        (r'B',   '4f', ('7c', '6d', '5e'),                           '8b4f+'),
        (r'B',   '3g', ('7c', '6d', '5e', '4f'),                     '8b3g+'),
        (r'B',   '2h', ('7c', '6d', '5e', '4f', '3g'),               '8b2h+'),
        (r'B',   '1i', ('7c', '6d', '5e', '4f', '3g', '2h'),         '8b1i+'),
        (r'\+B', '6d', ('7c',),                                      '8b6d'),
        (r'\+B', '5e', ('7c', '6d'),                                 '8b5e'),
        (r'\+B', '4f', ('7c', '6d', '5e'),                           '8b4f'),
        (r'\+B', '3g', ('7c', '6d', '5e', '4f'),                     '8b3g'),
        (r'\+B', '2h', ('7c', '6d', '5e', '4f', '3g'),               '8b2h'),
        (r'\+B', '1i', ('7c', '6d', '5e', '4f', '3g', '2h'),         '8b1i'),
    ]
    for pattern, dest, empties, mv in moves_8b:
        if re.match(pattern, Bboard.b8b) and getattr(Bboard, 'b' + dest) == '' \
                and all(getattr(board, 's' + sq) == '' for sq in empties):
            moves = mv
            kaihimore(moves)
            if oute.oute == 0:
                depth1.append(moves)
if Bboard.b9b != '':
    # Moves for the piece on 9b, in the same table form.  The 9b-9d slides
    # originally tested board.s9e, a generation typo for board.s9c.
    moves_9b = [
        (r'[SGK+]',      '9a', (),                                   '9b9a'),
        (r'[SGK+]',      '8a', (),                                   '9b8a'),
        (r'[GK+]',       '8b', (),                                   '9b8b'),
        (r'[GK+]',       '9c', (),                                   '9b9c'),
        (r'\+R|\+B|S|K', '8c', (),                                   '9b8c'),
        (r'[PLSR]',      '9a', (),                                   '9b9a+'),
        (r'[BS]',        '8a', (),                                   '9b8a+'),
        (r'R',           '8b', (),                                   '9b8b+'),
        (r'R',           '9c', (),                                   '9b9c+'),
        (r'[BS]',        '8c', (),                                   '9b8c+'),
        (r'\+R', '9d', ('9c',),                                      '9b9d'),
        (r'R',   '9d', ('9c',),                                      '9b9d+'),
        (r'\+R', '9e', ('9c', '9d'),                                 '9b9e'),
        (r'R',   '9e', ('9c', '9d'),                                 '9b9e+'),
        (r'\+R', '9f', ('9c', '9d', '9e'),                           '9b9f'),
        (r'R',   '9f', ('9c', '9d', '9e'),                           '9b9f+'),
        (r'\+R', '9g', ('9c', '9d', '9e', '9f'),                     '9b9g'),
        (r'R',   '9g', ('9c', '9d', '9e', '9f'),                     '9b9g+'),
        (r'\+R', '9h', ('9c', '9d', '9e', '9f', '9g'),               '9b9h'),
        (r'R',   '9h', ('9c', '9d', '9e', '9f', '9g'),               '9b9h+'),
        (r'\+R', '9i', ('9c', '9d', '9e', '9f', '9g', '9h'),         '9b9i'),
        (r'R',   '9i', ('9c', '9d', '9e', '9f', '9g', '9h'),         '9b9i+'),
        (r'\+R', '7b', ('8b',),                                      '9b7b'),
        (r'R',   '7b', ('8b',),                                      '9b7b+'),
        (r'\+R', '6b', ('8b', '7b'),                                 '9b6b'),
        (r'R',   '6b', ('8b', '7b'),                                 '9b6b+'),
        (r'\+R', '5b', ('8b', '7b', '6b'),                           '9b5b'),
        (r'R',   '5b', ('8b', '7b', '6b'),                           '9b5b+'),
        (r'\+R', '4b', ('8b', '7b', '6b', '5b'),                     '9b4b'),
        (r'R',   '4b', ('8b', '7b', '6b', '5b'),                     '9b4b+'),
        (r'\+R', '3b', ('8b', '7b', '6b', '5b', '4b'),               '9b3b'),
        (r'R',   '3b', ('8b', '7b', '6b', '5b', '4b'),               '9b3b+'),
        (r'\+R', '2b', ('8b', '7b', '6b', '5b', '4b', '3b'),         '9b2b'),
        (r'R',   '2b', ('8b', '7b', '6b', '5b', '4b', '3b'),         '9b2b+'),
        (r'\+R', '1b', ('8b', '7b', '6b', '5b', '4b', '3b', '2b'),   '9b1b'),
        (r'R',   '1b', ('8b', '7b', '6b', '5b', '4b', '3b', '2b'),   '9b1b+'),
        (r'B',   '7d', ('8c',),                                      '9b7d+'),
        (r'B',   '6e', ('8c', '7d'),                                 '9b6e+'),
        (r'B',   '5f', ('8c', '7d', '6e'),                           '9b5f+'),
        (r'B',   '4g', ('8c', '7d', '6e', '5f'),                     '9b4g+'),
        (r'B',   '3h', ('8c', '7d', '6e', '5f', '4g'),               '9b3h+'),
        (r'B',   '2i', ('8c', '7d', '6e', '5f', '4g', '3h'),         '9b2i+'),
        (r'\+B', '7d', ('8c',),                                      '9b7d'),
        (r'\+B', '6e', ('8c', '7d'),                                 '9b6e'),
        (r'\+B', '5f', ('8c', '7d', '6e'),                           '9b5f'),
        (r'\+B', '4g', ('8c', '7d', '6e', '5f'),                     '9b4g'),
        (r'\+B', '3h', ('8c', '7d', '6e', '5f', '4g'),               '9b3h'),
        (r'\+B', '2i', ('8c', '7d', '6e', '5f', '4g', '3h'),         '9b2i'),
    ]
    for pattern, dest, empties, mv in moves_9b:
        if re.match(pattern, Bboard.b9b) and getattr(Bboard, 'b' + dest) == '' \
                and all(getattr(board, 's' + sq) == '' for sq in empties):
            moves = mv
            kaihimore(moves)
            if oute.oute == 0:
                depth1.append(moves)
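# The hand-expanded emptiness chains above (board.s7c+board.s7d=='' and so
# on) can be derived mechanically from the source and destination squares.
# A minimal, hypothetical helper, not referenced anywhere in this file,
# illustrating how the in-between squares of any straight or diagonal move
# could be computed instead of hand-written:

```python
FILES = '123456789'   # USI file coordinates (9x9 shogi board)
RANKS = 'abcdefghi'   # USI rank coordinates

def squares_between(src, dest):
    """Return the squares strictly between src and dest for a straight or
    diagonal line, e.g. ('7b', '7e') -> ['7c', '7d']; raise otherwise."""
    f0, r0 = FILES.index(src[0]), RANKS.index(src[1])
    f1, r1 = FILES.index(dest[0]), RANKS.index(dest[1])
    df, dr = f1 - f0, r1 - r0
    if not (df == 0 or dr == 0 or abs(df) == abs(dr)):
        raise ValueError('not a straight or diagonal line: %s -> %s' % (src, dest))
    steps = max(abs(df), abs(dr))
    sf = (df > 0) - (df < 0)  # sign of the file step
    sr = (dr > 0) - (dr < 0)  # sign of the rank step
    return [FILES[f0 + i * sf] + RANKS[r0 + i * sr] for i in range(1, steps)]
```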
if Bboard.b1c != '':
    # Moves for the piece on 1c, in the same table form (knight jumps and
    # the lance's forward slide included).
    moves_1c = [
        (r'[SGK+]',      '1b', (),                                   '1c1b'),
        (r'[SGK+]',      '2b', (),                                   '1c2b'),
        (r'[GK+]',       '2c', (),                                   '1c2c'),
        (r'[GK+]',       '1d', (),                                   '1c1d'),
        (r'\+R|\+B|S|K', '2d', (),                                   '1c2d'),
        (r'[PLSR]',      '1b', (),                                   '1c1b+'),
        (r'[BS]',        '2b', (),                                   '1c2b+'),
        (r'R',           '2c', (),                                   '1c2c+'),
        (r'R',           '1d', (),                                   '1c1d+'),
        (r'[BS]',        '2d', (),                                   '1c2d+'),
        (r'N',           '2a', (),                                   '1c2a+'),
        (r'\+R', '1a', ('1b',),                                      '1c1a'),
        (r'R|L', '1a', ('1b',),                                      '1c1a+'),
        (r'\+R', '1e', ('1d',),                                      '1c1e'),
        (r'R',   '1e', ('1d',),                                      '1c1e+'),
        (r'\+R', '1f', ('1d', '1e'),                                 '1c1f'),
        (r'R',   '1f', ('1d', '1e'),                                 '1c1f+'),
        (r'\+R', '1g', ('1d', '1e', '1f'),                           '1c1g'),
        (r'R',   '1g', ('1d', '1e', '1f'),                           '1c1g+'),
        (r'\+R', '1h', ('1d', '1e', '1f', '1g'),                     '1c1h'),
        (r'R',   '1h', ('1d', '1e', '1f', '1g'),                     '1c1h+'),
        (r'\+R', '1i', ('1d', '1e', '1f', '1g', '1h'),               '1c1i'),
        (r'R',   '1i', ('1d', '1e', '1f', '1g', '1h'),               '1c1i+'),
        (r'\+R', '3c', ('2c',),                                      '1c3c'),
        (r'R',   '3c', ('2c',),                                      '1c3c+'),
        (r'\+R', '4c', ('2c', '3c'),                                 '1c4c'),
        (r'R',   '4c', ('2c', '3c'),                                 '1c4c+'),
        (r'\+R', '5c', ('2c', '3c', '4c'),                           '1c5c'),
        (r'R',   '5c', ('2c', '3c', '4c'),                           '1c5c+'),
        (r'\+R', '6c', ('2c', '3c', '4c', '5c'),                     '1c6c'),
        (r'R',   '6c', ('2c', '3c', '4c', '5c'),                     '1c6c+'),
        (r'\+R', '7c', ('2c', '3c', '4c', '5c', '6c'),               '1c7c'),
        (r'R',   '7c', ('2c', '3c', '4c', '5c', '6c'),               '1c7c+'),
        (r'\+R', '8c', ('2c', '3c', '4c', '5c', '6c', '7c'),         '1c8c'),
        (r'R',   '8c', ('2c', '3c', '4c', '5c', '6c', '7c'),         '1c8c+'),
        (r'\+R', '9c', ('2c', '3c', '4c', '5c', '6c', '7c', '8c'),   '1c9c'),
        (r'R',   '9c', ('2c', '3c', '4c', '5c', '6c', '7c', '8c'),   '1c9c+'),
        (r'B',   '3a', ('2b',),                                      '1c3a+'),
        (r'B',   '3e', ('2d',),                                      '1c3e+'),
        (r'B',   '4f', ('2d', '3e'),                                 '1c4f+'),
        (r'B',   '5g', ('2d', '3e', '4f'),                           '1c5g+'),
        (r'B',   '6h', ('2d', '3e', '4f', '5g'),                     '1c6h+'),
        (r'B',   '7i', ('2d', '3e', '4f', '5g', '6h'),               '1c7i+'),
        (r'\+B', '3a', ('2b',),                                      '1c3a'),
        (r'\+B', '3e', ('2d',),                                      '1c3e'),
        (r'\+B', '4f', ('2d', '3e'),                                 '1c4f'),
        (r'\+B', '5g', ('2d', '3e', '4f'),                           '1c5g'),
        (r'\+B', '6h', ('2d', '3e', '4f', '5g'),                     '1c6h'),
        (r'\+B', '7i', ('2d', '3e', '4f', '5g', '6h'),               '1c7i'),
    ]
    for pattern, dest, empties, mv in moves_1c:
        if re.match(pattern, Bboard.b1c) and getattr(Bboard, 'b' + dest) == '' \
                and all(getattr(board, 's' + sq) == '' for sq in empties):
            moves = mv
            kaihimore(moves)
            if oute.oute == 0:
                depth1.append(moves)
if Bboard.b2c !='':
if re.match(r'[SGK+]', Bboard.b2c)and Bboard.b2b=='':
moves = '2c2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b2c)and Bboard.b1b=='':
moves = '2c1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b2c)and Bboard.b3b=='':
moves = '2c3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b2c)and Bboard.b1c=='':
moves = '2c1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b2c)and Bboard.b3c=='':
moves = '2c3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b2c)and Bboard.b2d=='':
moves = '2c2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b2c)and Bboard.b1d=='':
moves = '2c1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b2c)and Bboard.b3d=='':
moves = '2c3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b2c)and Bboard.b2b=='':
moves = '2c2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b2c)and Bboard.b1b=='':
moves = '2c1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b2c)and Bboard.b3b=='':
moves = '2c3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b1c=='':
moves = '2c1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b3c=='':
moves = '2c3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b2d=='':
moves = '2c2d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b2c)and Bboard.b1d=='':
moves = '2c1d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b2c)and Bboard.b3d=='':
moves = '2c3d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2c)and Bboard.b1a=='':
moves = '2c1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2c)and Bboard.b3a=='':
moves = '2c3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b2a==''\
and board.s2b=='':
moves = '2c2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2c)and Bboard.b2a==''\
and board.s2b=='':
moves = '2c2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b2e==''\
and board.s2d=='':
moves = '2c2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b2e==''\
and board.s2d=='':
moves = '2c2e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b2f==''\
and board.s2d+board.s2e=='':
moves = '2c2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b2f==''\
and board.s2d+board.s2e=='':
moves = '2c2f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b2g==''\
and board.s2d+board.s2e+board.s2f=='':
moves = '2c2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b2g==''\
and board.s2d+board.s2e+board.s2f=='':
moves = '2c2g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b2h==''\
and board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2c2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b2h==''\
and board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2c2h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b2i==''\
and board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2c2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b2i==''\
and board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2c2i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b4c==''\
and board.s3c=='':
moves = '2c4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b4c==''\
and board.s3c=='':
moves = '2c4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b5c==''\
and board.s3c+board.s4c=='':
moves = '2c5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b5c==''\
and board.s3c+board.s4c=='':
moves = '2c5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b6c==''\
and board.s3c+board.s4c+board.s5c=='':
moves = '2c6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b6c==''\
and board.s3c+board.s4c+board.s5c=='':
moves = '2c6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b7c==''\
and board.s3c+board.s4c+board.s5c+board.s6c=='':
moves = '2c7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b7c==''\
and board.s3c+board.s4c+board.s5c+board.s6c=='':
moves = '2c7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+R', Bboard.b2c)and Bboard.b8c==''\
and board.s3c+board.s4c+board.s5c+board.s6c+board.s7c=='':
moves = '2c8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b8c==''\
and board.s3c+board.s4c+board.s5c+board.s6c+board.s7c=='':
moves = '2c8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2c)and Bboard.b9c==''\
and board.s3c+board.s4c+board.s5c+board.s6c+board.s7c+board.s8c=='':
moves = '2c9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b2c)and Bboard.b9c==''\
and board.s3c+board.s4c+board.s5c+board.s6c+board.s7c+board.s8c=='':
moves = '2c9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2c)and Bboard.b4e==''\
and board.s3d=='':
moves = '2c4e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2c)and Bboard.b5f==''\
and board.s3d+board.s4e=='':
moves = '2c5f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2c)and Bboard.b6g==''\
and board.s3d+board.s4e+board.s5f=='':
moves = '2c6g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2c)and Bboard.b7h==''\
and board.s3d+board.s4e+board.s5f+board.s6g=='':
moves = '2c7h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2c)and Bboard.b8i==''\
and board.s3d+board.s4e+board.s5f+board.s6g+board.s7h=='':
moves = '2c8i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2c)and Bboard.b4e==''\
and board.s3d=='':
moves = '2c4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2c)and Bboard.b5f==''\
and board.s3d+board.s4e=='':
moves = '2c5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2c)and Bboard.b6g==''\
and board.s3d+board.s4e+board.s5f=='':
moves = '2c6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2c)and Bboard.b7h==''\
and board.s3d+board.s4e+board.s5f+board.s6g=='':
moves = '2c7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2c)and Bboard.b8i==''\
and board.s3d+board.s4e+board.s5f+board.s6g+board.s7h=='':
moves = '2c8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2c)and Bboard.b4a==''\
and board.s3b=='':
moves = '2c4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2c)and Bboard.b4a==''\
and board.s3b=='':
moves = '2c4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b3c !='':
if re.match(r'[SGK+]', Bboard.b3c)and Bboard.b3b=='':
moves = '3c3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b3c)and Bboard.b2b=='':
moves = '3c2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b3c)and Bboard.b4b=='':
moves = '3c4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b3c)and Bboard.b2c=='':
moves = '3c2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b3c)and Bboard.b4c=='':
moves = '3c4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b3c)and Bboard.b3d=='':
moves = '3c3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b3c)and Bboard.b2d=='':
moves = '3c2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b3c)and Bboard.b4d=='':
moves = '3c4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b3c)and Bboard.b3b=='':
moves = '3c3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b3c)and Bboard.b2b=='':
moves = '3c2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b3c)and Bboard.b4b=='':
moves = '3c4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b2c=='':
moves = '3c2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b4c=='':
moves = '3c4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b3d=='':
moves = '3c3d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b3c)and Bboard.b2d=='':
moves = '3c2d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b3c)and Bboard.b4d=='':
moves = '3c4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3c)and Bboard.b2a=='':
moves = '3c2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3c)and Bboard.b4a=='':
moves = '3c4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b3a==''\
and board.s3b=='':
moves = '3c3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3c)and Bboard.b3a==''\
and board.s3b=='':
moves = '3c3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b3e==''\
and board.s3d=='':
moves = '3c3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b3e==''\
and board.s3d=='':
moves = '3c3e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b3f==''\
and board.s3d+board.s3e=='':
moves = '3c3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b3f==''\
and board.s3d+board.s3e=='':
moves = '3c3f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b3g==''\
and board.s3d+board.s3e+board.s3f=='':
moves = '3c3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b3g==''\
and board.s3d+board.s3e+board.s3f=='':
moves = '3c3g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b3h==''\
and board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3c3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b3h==''\
and board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3c3h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b3i==''\
and board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3c3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b3i==''\
and board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3c3i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b1c==''\
and board.s2c=='':
moves = '3c1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b1c==''\
and board.s2c=='':
moves = '3c1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b5c==''\
and board.s4c=='':
moves = '3c5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b5c==''\
and board.s4c=='':
moves = '3c5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b6c==''\
and board.s4c+board.s5c=='':
moves = '3c6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b6c==''\
and board.s4c+board.s5c=='':
moves = '3c6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b7c==''\
and board.s4c+board.s5c+board.s6c=='':
moves = '3c7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b7c==''\
and board.s4c+board.s5c+board.s6c=='':
moves = '3c7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b8c==''\
and board.s4c+board.s5c+board.s6c+board.s7c=='':
moves = '3c8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b8c==''\
and board.s4c+board.s5c+board.s6c+board.s7c=='':
moves = '3c8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3c)and Bboard.b9c==''\
and board.s4c+board.s5c+board.s6c+board.s7c+board.s8c=='':
moves = '3c9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b3c)and Bboard.b9c==''\
and board.s4c+board.s5c+board.s6c+board.s7c+board.s8c=='':
moves = '3c9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3c)and Bboard.b1a==''\
and board.s2b=='':
moves = '3c1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3c)and Bboard.b5e==''\
and board.s4d=='':
moves = '3c5e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3c)and Bboard.b6f==''\
and board.s4d+board.s5e=='':
moves = '3c6f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3c)and Bboard.b7g==''\
and board.s4d+board.s5e+board.s6f=='':
moves = '3c7g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3c)and Bboard.b8h==''\
and board.s4d+board.s5e+board.s6f+board.s7g=='':
moves = '3c8h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3c)and Bboard.b9i==''\
and board.s4d+board.s5e+board.s6f+board.s7g+board.s8h=='':
moves = '3c9i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3c)and Bboard.b1a==''\
and board.s2b=='':
moves = '3c1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3c)and Bboard.b5e==''\
and board.s4d=='':
moves = '3c5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3c)and Bboard.b6f==''\
and board.s4d+board.s5e=='':
moves = '3c6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3c)and Bboard.b7g==''\
and board.s4d+board.s5e+board.s6f=='':
moves = '3c7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3c)and Bboard.b8h==''\
and board.s4d+board.s5e+board.s6f+board.s7g=='':
moves = '3c8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3c)and Bboard.b9i==''\
and board.s4d+board.s5e+board.s6f+board.s7g+board.s8h=='':
moves = '3c9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3c)and Bboard.b5a==''\
and board.s4b=='':
moves = '3c5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3c)and Bboard.b1e==''\
and board.s2d=='':
moves = '3c1e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3c)and Bboard.b5a==''\
and board.s4b=='':
moves = '3c5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3c)and Bboard.b1e==''\
and board.s2d=='':
moves = '3c1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b4c !='':
if re.match(r'[SGK+]', Bboard.b4c)and Bboard.b4b=='':
moves = '4c4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b4c)and Bboard.b3b=='':
moves = '4c3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b4c)and Bboard.b5b=='':
moves = '4c5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4c)and Bboard.b3c=='':
moves = '4c3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4c)and Bboard.b5c=='':
moves = '4c5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4c)and Bboard.b4d=='':
moves = '4c4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b4c)and Bboard.b3d=='':
moves = '4c3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b4c)and Bboard.b5d=='':
moves = '4c5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b4c)and Bboard.b4b=='':
moves = '4c4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4c)and Bboard.b3b=='':
moves = '4c3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4c)and Bboard.b5b=='':
moves = '4c5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b3c=='':
moves = '4c3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b5c=='':
moves = '4c5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b4d=='':
moves = '4c4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4c)and Bboard.b3d=='':
moves = '4c3d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4c)and Bboard.b5d=='':
moves = '4c5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4c)and Bboard.b3a=='':
moves = '4c3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4c)and Bboard.b5a=='':
moves = '4c5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b4a==''\
and board.s4b=='':
moves = '4c4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4c)and Bboard.b4a==''\
and board.s4b=='':
moves = '4c4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b4e==''\
and board.s4d=='':
moves = '4c4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b4e==''\
and board.s4d=='':
moves = '4c4e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b4f==''\
and board.s4d+board.s4e=='':
moves = '4c4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b4f==''\
and board.s4d+board.s4e=='':
moves = '4c4f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b4g==''\
and board.s4d+board.s4e+board.s4f=='':
moves = '4c4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b4g==''\
and board.s4d+board.s4e+board.s4f=='':
moves = '4c4g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b4h==''\
and board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4c4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b4h==''\
and board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4c4h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b4i==''\
and board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4c4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b4i==''\
and board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4c4i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b1c==''\
and board.s2c+board.s3c=='':
moves = '4c1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b1c==''\
and board.s2c+board.s3c=='':
moves = '4c1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b2c==''\
and board.s3c=='':
moves = '4c2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b2c==''\
and board.s3c=='':
moves = '4c2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b6c==''\
and board.s5c=='':
moves = '4c6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b6c==''\
and board.s5c=='':
moves = '4c6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b7c==''\
and board.s5c+board.s6c=='':
moves = '4c7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b7c==''\
and board.s5c+board.s6c=='':
moves = '4c7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b8c==''\
and board.s5c+board.s6c+board.s7c=='':
moves = '4c8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b8c==''\
and board.s5c+board.s6c+board.s7c=='':
moves = '4c8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4c)and Bboard.b9c==''\
and board.s5c+board.s6c+board.s7c+board.s8c=='':
moves = '4c9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b4c)and Bboard.b9c==''\
and board.s5c+board.s6c+board.s7c+board.s8c=='':
moves = '4c9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4c)and Bboard.b6e==''\
and board.s5d=='':
moves = '4c6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4c)and Bboard.b7f==''\
and board.s5d+board.s6e=='':
moves = '4c7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4c)and Bboard.b8g==''\
and board.s5d+board.s6e+board.s7f=='':
moves = '4c8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4c)and Bboard.b9h==''\
and board.s5d+board.s6e+board.s7f+board.s8g=='':
moves = '4c9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4c)and Bboard.b6e==''\
and board.s5d=='':
moves = '4c6e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4c)and Bboard.b7f==''\
and board.s5d+board.s6e=='':
moves = '4c7f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4c)and Bboard.b8g==''\
and board.s5d+board.s6e+board.s7f=='':
moves = '4c8g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4c)and Bboard.b9h==''\
and board.s5d+board.s6e+board.s7f+board.s8g=='':
moves = '4c9h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4c)and Bboard.b1f==''\
and board.s2e+board.s3d=='':
moves = '4c1f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4c)and Bboard.b2e==''\
and board.s3d=='':
moves = '4c2e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4c)and Bboard.b1f==''\
and board.s2e+board.s3d=='':
moves = '4c1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4c)and Bboard.b2e==''\
and board.s3d=='':
moves = '4c2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4c)and Bboard.b2a==''\
and board.s3b=='':
moves = '4c2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4c)and Bboard.b2a==''\
and board.s3b=='':
moves = '4c2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4c)and Bboard.b6a==''\
and board.s5b=='':
moves = '4c6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4c)and Bboard.b6a==''\
and board.s5b=='':
moves = '4c6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b5c !='':
if re.match(r'[SGK+]', Bboard.b5c)and Bboard.b5b=='':
moves = '5c5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b5c)and Bboard.b4b=='':
moves = '5c4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b5c)and Bboard.b6b=='':
moves = '5c6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5c)and Bboard.b4c=='':
moves = '5c4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5c)and Bboard.b6c=='':
moves = '5c6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5c)and Bboard.b5d=='':
moves = '5c5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b5c)and Bboard.b4d=='':
moves = '5c4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b5c)and Bboard.b6d=='':
moves = '5c6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b5c)and Bboard.b5b=='':
moves = '5c5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5c)and Bboard.b4b=='':
moves = '5c4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5c)and Bboard.b6b=='':
moves = '5c6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b4c=='':
moves = '5c4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b6c=='':
moves = '5c6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b5d=='':
moves = '5c5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5c)and Bboard.b4d=='':
moves = '5c4d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5c)and Bboard.b6d=='':
moves = '5c6d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5c)and Bboard.b4a=='':
moves = '5c4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5c)and Bboard.b6a=='':
moves = '5c6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b5a==''\
and board.s5b=='':
moves = '5c5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5c)and Bboard.b5a==''\
and board.s5b=='':
moves = '5c5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b5e==''\
and board.s5d=='':
moves = '5c5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b5e==''\
and board.s5d=='':
moves = '5c5e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b5f==''\
and board.s5d+board.s5e=='':
moves = '5c5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b5f==''\
and board.s5d+board.s5e=='':
moves = '5c5f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b5g==''\
and board.s5d+board.s5e+board.s5f=='':
moves = '5c5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b5g==''\
and board.s5d+board.s5e+board.s5f=='':
moves = '5c5g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b5h==''\
and board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5c5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b5h==''\
and board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5c5h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b5i==''\
and board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5c5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b5i==''\
and board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5c5i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b1c==''\
and board.s2c+board.s3c+board.s4c=='':
moves = '5c1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b1c==''\
and board.s2c+board.s3c+board.s4c=='':
moves = '5c1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b2c==''\
and board.s3c+board.s4c=='':
moves = '5c2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b2c==''\
and board.s3c+board.s4c=='':
moves = '5c2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b3c==''\
and board.s4c=='':
moves = '5c3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b3c==''\
and board.s4c=='':
moves = '5c3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b7c==''\
and board.s6c=='':
moves = '5c7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b7c==''\
and board.s6c=='':
moves = '5c7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b8c==''\
and board.s6c+board.s7c=='':
moves = '5c8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b8c==''\
and board.s6c+board.s7c=='':
moves = '5c8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5c)and Bboard.b9c==''\
and board.s6c+board.s7c+board.s8c=='':
moves = '5c9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b5c)and Bboard.b9c==''\
and board.s6c+board.s7c+board.s8c=='':
moves = '5c9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5c)and Bboard.b7e==''\
and board.s6d=='':
moves = '5c7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5c)and Bboard.b8f==''\
and board.s6d+board.s7e=='':
moves = '5c8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5c)and Bboard.b9g==''\
and board.s6d+board.s7e+board.s8f=='':
moves = '5c9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5c)and Bboard.b7e==''\
and board.s6d=='':
moves = '5c7e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5c)and Bboard.b8f==''\
and board.s6d+board.s7e=='':
moves = '5c8f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5c)and Bboard.b9g==''\
and board.s6d+board.s7e+board.s8f=='':
moves = '5c9g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5c)and Bboard.b2f==''\
and board.s3e+board.s4d=='':
moves = '5c2f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5c)and Bboard.b3e==''\
and board.s4d=='':
moves = '5c3e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5c)and Bboard.b2f==''\
and board.s3e+board.s4d=='':
moves = '5c2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5c)and Bboard.b3e==''\
and board.s4d=='':
moves = '5c3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5c)and Bboard.b1g==''\
and board.s4d+board.s3e+board.s2f=='':
moves = '5c1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5c)and Bboard.b1g==''\
and board.s4d+board.s3e+board.s2f=='':
moves = '5c1g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5c)and Bboard.b3a==''\
and board.s4b=='':
moves = '5c3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5c)and Bboard.b3a==''\
and board.s4b=='':
moves = '5c3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5c)and Bboard.b7a==''\
and board.s6b=='':
moves = '5c7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5c)and Bboard.b7a==''\
and board.s6b=='':
moves = '5c7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b6c !='':
if re.match(r'[SGK+]', Bboard.b6c)and Bboard.b6b=='':
moves = '6c6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b6c)and Bboard.b5b=='':
moves = '6c5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b6c)and Bboard.b7b=='':
moves = '6c7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6c)and Bboard.b5c=='':
moves = '6c5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6c)and Bboard.b7c=='':
moves = '6c7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6c)and Bboard.b6d=='':
moves = '6c6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b6c)and Bboard.b5d=='':
moves = '6c5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b6c)and Bboard.b7d=='':
moves = '6c7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b6c)and Bboard.b6b=='':
moves = '6c6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6c)and Bboard.b5b=='':
moves = '6c5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6c)and Bboard.b7b=='':
moves = '6c7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b5c=='':
moves = '6c5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b7c=='':
moves = '6c7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b6d=='':
moves = '6c6d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6c)and Bboard.b5d=='':
moves = '6c5d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6c)and Bboard.b7d=='':
moves = '6c7d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6c)and Bboard.b5a=='':
moves = '6c5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6c)and Bboard.b7a=='':
moves = '6c7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b6a==''\
and board.s6b=='':
moves = '6c6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6c)and Bboard.b6a==''\
and board.s6b=='':
moves = '6c6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b6e==''\
and board.s6d=='':
moves = '6c6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b6e==''\
and board.s6d=='':
moves = '6c6e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b6f==''\
and board.s6d+board.s6e=='':
moves = '6c6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b6f==''\
and board.s6d+board.s6e=='':
moves = '6c6f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b6g==''\
and board.s6d+board.s6e+board.s6f=='':
moves = '6c6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b6g==''\
and board.s6d+board.s6e+board.s6f=='':
moves = '6c6g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b6h==''\
and board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6c6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b6h==''\
and board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6c6h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b6i==''\
and board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6c6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b6i==''\
and board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6c6i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b9c==''\
and board.s8c+board.s7c=='':
moves = '6c9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b9c==''\
and board.s8c+board.s7c=='':
moves = '6c9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b8c==''\
and board.s7c=='':
moves = '6c8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b8c==''\
and board.s7c=='':
moves = '6c8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b4c==''\
and board.s5c=='':
moves = '6c4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b4c==''\
and board.s5c=='':
moves = '6c4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b3c==''\
and board.s5c+board.s4c=='':
moves = '6c3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b3c==''\
and board.s5c+board.s4c=='':
moves = '6c3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b2c==''\
and board.s5c+board.s4c+board.s3c=='':
moves = '6c2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b2c==''\
and board.s5c+board.s4c+board.s3c=='':
moves = '6c2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6c)and Bboard.b1c==''\
and board.s5c+board.s4c+board.s3c+board.s2c=='':
moves = '6c1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b6c)and Bboard.b1c==''\
and board.s5c+board.s4c+board.s3c+board.s2c=='':
moves = '6c1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6c)and Bboard.b4e==''\
and board.s5d=='':
moves = '6c4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6c)and Bboard.b3f==''\
and board.s5d+board.s4e=='':
moves = '6c3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6c)and Bboard.b2g==''\
and board.s5d+board.s4e+board.s3f=='':
moves = '6c2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6c)and Bboard.b1h==''\
and board.s5d+board.s4e+board.s3f+board.s2g=='':
moves = '6c1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6c)and Bboard.b4e==''\
and board.s5d=='':
moves = '6c4e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6c)and Bboard.b3f==''\
and board.s5d+board.s4e=='':
moves = '6c3f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6c)and Bboard.b2g==''\
and board.s5d+board.s4e+board.s3f=='':
moves = '6c2g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6c)and Bboard.b1h==''\
and board.s5d+board.s4e+board.s3f+board.s2g=='':
moves = '6c1h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6c)and Bboard.b9f==''\
and board.s8e+board.s7d=='':
moves = '6c9f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6c)and Bboard.b8e==''\
and board.s7d=='':
moves = '6c8e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6c)and Bboard.b9f==''\
and board.s8e+board.s7d=='':
moves = '6c9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6c)and Bboard.b8e==''\
and board.s7d=='':
moves = '6c8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6c)and Bboard.b8a==''\
and board.s7b=='':
moves = '6c8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6c)and Bboard.b8a==''\
and board.s7b=='':
moves = '6c8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6c)and Bboard.b4a==''\
and board.s5b=='':
moves = '6c4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6c)and Bboard.b4a==''\
and board.s5b=='':
moves = '6c4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 7c: try each pseudo-legal destination, replay it with kaihimore(),
# and record the move in depth1 only if it resolves the check (oute.oute == 0).
if Bboard.b7c !='':
if re.match(r'[SGK+]', Bboard.b7c)and Bboard.b7b=='':
moves = '7c7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b7c)and Bboard.b6b=='':
moves = '7c6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b7c)and Bboard.b8b=='':
moves = '7c8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7c)and Bboard.b6c=='':
moves = '7c6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7c)and Bboard.b8c=='':
moves = '7c8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7c)and Bboard.b7d=='':
moves = '7c7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b7c)and Bboard.b6d=='':
moves = '7c6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b7c)and Bboard.b8d=='':
moves = '7c8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b7c)and Bboard.b7b=='':
moves = '7c7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7c)and Bboard.b6b=='':
moves = '7c6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7c)and Bboard.b8b=='':
moves = '7c8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b6c=='':
moves = '7c6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b8c=='':
moves = '7c8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b7d=='':
moves = '7c7d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7c)and Bboard.b6d=='':
moves = '7c6d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7c)and Bboard.b8d=='':
moves = '7c8d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7c)and Bboard.b6a=='':
moves = '7c6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7c)and Bboard.b8a=='':
moves = '7c8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b7a==''\
and board.s7b=='':
moves = '7c7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7c)and Bboard.b7a==''\
and board.s7b=='':
moves = '7c7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b7e==''\
and board.s7d=='':
moves = '7c7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b7e==''\
and board.s7d=='':
moves = '7c7e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b7f==''\
and board.s7d+board.s7e=='':
moves = '7c7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b7f==''\
and board.s7d+board.s7e=='':
moves = '7c7f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b7g==''\
and board.s7d+board.s7e+board.s7f=='':
moves = '7c7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b7g==''\
and board.s7d+board.s7e+board.s7f=='':
moves = '7c7g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b7h==''\
and board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7c7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b7h==''\
and board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7c7h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b7i==''\
and board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7c7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b7i==''\
and board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7c7i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b9c==''\
and board.s8c=='':
moves = '7c9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b9c==''\
and board.s8c=='':
moves = '7c9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b5c==''\
and board.s6c=='':
moves = '7c5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b5c==''\
and board.s6c=='':
moves = '7c5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b4c==''\
and board.s6c+board.s5c=='':
moves = '7c4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b4c==''\
and board.s6c+board.s5c=='':
moves = '7c4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b3c==''\
and board.s6c+board.s5c+board.s4c=='':
moves = '7c3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b3c==''\
and board.s6c+board.s5c+board.s4c=='':
moves = '7c3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b2c==''\
and board.s6c+board.s5c+board.s4c+board.s3c=='':
moves = '7c2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b2c==''\
and board.s6c+board.s5c+board.s4c+board.s3c=='':
moves = '7c2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7c)and Bboard.b1c==''\
and board.s6c+board.s5c+board.s4c+board.s3c+board.s2c=='':
moves = '7c1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b7c)and Bboard.b1c==''\
and board.s6c+board.s5c+board.s4c+board.s3c+board.s2c=='':
moves = '7c1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7c)and Bboard.b9a==''\
and board.s8b=='':
moves = '7c9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7c)and Bboard.b5e==''\
and board.s6d=='':
moves = '7c5e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7c)and Bboard.b4f==''\
and board.s6d+board.s5e=='':
moves = '7c4f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7c)and Bboard.b3g==''\
and board.s6d+board.s5e+board.s4f=='':
moves = '7c3g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7c)and Bboard.b2h==''\
and board.s6d+board.s5e+board.s4f+board.s3g=='':
moves = '7c2h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7c)and Bboard.b1i==''\
and board.s6d+board.s5e+board.s4f+board.s3g+board.s2h=='':
moves = '7c1i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7c)and Bboard.b9a==''\
and board.s8b=='':
moves = '7c9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7c)and Bboard.b5e==''\
and board.s6d=='':
moves = '7c5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7c)and Bboard.b4f==''\
and board.s6d+board.s5e=='':
moves = '7c4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7c)and Bboard.b3g==''\
and board.s6d+board.s5e+board.s4f=='':
moves = '7c3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7c)and Bboard.b2h==''\
and board.s6d+board.s5e+board.s4f+board.s3g=='':
moves = '7c2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7c)and Bboard.b1i==''\
and board.s6d+board.s5e+board.s4f+board.s3g+board.s2h=='':
moves = '7c1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7c)and Bboard.b5a==''\
and board.s6b=='':
moves = '7c5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7c)and Bboard.b9e==''\
and board.s8d=='':
moves = '7c9e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7c)and Bboard.b5a==''\
and board.s6b=='':
moves = '7c5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7c)and Bboard.b9e==''\
and board.s8d=='':
moves = '7c9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 8c: same pattern as above — enumerate destinations, validate each
# candidate with kaihimore(), keep it in depth1 only if oute.oute == 0.
if Bboard.b8c !='':
if re.match(r'[SGK+]', Bboard.b8c)and Bboard.b8b=='':
moves = '8c8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b8c)and Bboard.b7b=='':
moves = '8c7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b8c)and Bboard.b9b=='':
moves = '8c9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b8c)and Bboard.b7c=='':
moves = '8c7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b8c)and Bboard.b9c=='':
moves = '8c9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b8c)and Bboard.b8d=='':
moves = '8c8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b8c)and Bboard.b7d=='':
moves = '8c7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b8c)and Bboard.b9d=='':
moves = '8c9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b8c)and Bboard.b8b=='':
moves = '8c8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b8c)and Bboard.b7b=='':
moves = '8c7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b8c)and Bboard.b9b=='':
moves = '8c9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b7c=='':
moves = '8c7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b9c=='':
moves = '8c9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b8d=='':
moves = '8c8d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b8c)and Bboard.b7d=='':
moves = '8c7d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b8c)and Bboard.b9d=='':
moves = '8c9d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8c)and Bboard.b7a=='':
moves = '8c7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8c)and Bboard.b9a=='':
moves = '8c9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b8a==''\
and board.s8b=='':
moves = '8c8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8c)and Bboard.b8a==''\
and board.s8b=='':
moves = '8c8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b8e==''\
and board.s8d=='':
moves = '8c8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b8e==''\
and board.s8d=='':
moves = '8c8e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b8f==''\
and board.s8d+board.s8e=='':
moves = '8c8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b8f==''\
and board.s8d+board.s8e=='':
moves = '8c8f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b8g==''\
and board.s8d+board.s8e+board.s8f=='':
moves = '8c8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b8g==''\
and board.s8d+board.s8e+board.s8f=='':
moves = '8c8g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b8h==''\
and board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8c8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b8h==''\
and board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8c8h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b8i==''\
and board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8c8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b8i==''\
and board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8c8i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b6c==''\
and board.s7c=='':
moves = '8c6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b6c==''\
and board.s7c=='':
moves = '8c6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b5c==''\
and board.s7c+board.s6c=='':
moves = '8c5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b5c==''\
and board.s7c+board.s6c=='':
moves = '8c5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b4c==''\
and board.s7c+board.s6c+board.s5c=='':
moves = '8c4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b4c==''\
and board.s7c+board.s6c+board.s5c=='':
moves = '8c4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b3c==''\
and board.s7c+board.s6c+board.s5c+board.s4c=='':
moves = '8c3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b3c==''\
and board.s7c+board.s6c+board.s5c+board.s4c=='':
moves = '8c3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b2c==''\
and board.s7c+board.s6c+board.s5c+board.s4c+board.s3c=='':
moves = '8c2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b2c==''\
and board.s7c+board.s6c+board.s5c+board.s4c+board.s3c=='':
moves = '8c2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8c)and Bboard.b1c==''\
and board.s7c+board.s6c+board.s5c+board.s4c+board.s3c+board.s2c=='':
moves = '8c1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b8c)and Bboard.b1c==''\
and board.s7c+board.s6c+board.s5c+board.s4c+board.s3c+board.s2c=='':
moves = '8c1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8c)and Bboard.b6e==''\
and board.s7d=='':
moves = '8c6e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8c)and Bboard.b5f==''\
and board.s7d+board.s6e=='':
moves = '8c5f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8c)and Bboard.b4g==''\
and board.s7d+board.s6e+board.s5f=='':
moves = '8c4g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8c)and Bboard.b3h==''\
and board.s7d+board.s6e+board.s5f+board.s4g=='':
moves = '8c3h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8c)and Bboard.b2i==''\
and board.s7d+board.s6e+board.s5f+board.s4g+board.s3h=='':
moves = '8c2i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8c)and Bboard.b6e==''\
and board.s7d=='':
moves = '8c6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8c)and Bboard.b5f==''\
and board.s7d+board.s6e=='':
moves = '8c5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8c)and Bboard.b4g==''\
and board.s7d+board.s6e+board.s5f=='':
moves = '8c4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8c)and Bboard.b3h==''\
and board.s7d+board.s6e+board.s5f+board.s4g=='':
moves = '8c3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8c)and Bboard.b2i==''\
and board.s7d+board.s6e+board.s5f+board.s4g+board.s3h=='':
moves = '8c2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8c)and Bboard.b6a==''\
and board.s7b=='':
moves = '8c6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8c)and Bboard.b6a==''\
and board.s7b=='':
moves = '8c6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 9c (edge file, so no moves toward a 10th file): enumerate
# destinations, validate with kaihimore(), keep in depth1 if oute.oute == 0.
if Bboard.b9c !='':
if re.match(r'[SGK+]', Bboard.b9c)and Bboard.b9b=='':
moves = '9c9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b9c)and Bboard.b8b=='':
moves = '9c8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b9c)and Bboard.b8c=='':
moves = '9c8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b9c)and Bboard.b9d=='':
moves = '9c9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b9c)and Bboard.b8d=='':
moves = '9c8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b9c)and Bboard.b9b=='':
moves = '9c9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b9c)and Bboard.b8b=='':
moves = '9c8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b8c=='':
moves = '9c8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b9d=='':
moves = '9c9d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b9c)and Bboard.b8d=='':
moves = '9c8d+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b9c)and Bboard.b8a=='':
moves = '9c8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b9a==''\
and board.s9b=='':
moves = '9c9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9c)and Bboard.b9a==''\
and board.s9b=='':
moves = '9c9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b9e==''\
and board.s9d=='':
moves = '9c9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b9e==''\
and board.s9d=='':
moves = '9c9e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b9f==''\
and board.s9d+board.s9e=='':
moves = '9c9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b9f==''\
and board.s9d+board.s9e=='':
moves = '9c9f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b9g==''\
and board.s9d+board.s9e+board.s9f=='':
moves = '9c9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b9g==''\
and board.s9d+board.s9e+board.s9f=='':
moves = '9c9g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b9h==''\
and board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9c9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b9h==''\
and board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9c9h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b9i==''\
and board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9c9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b9i==''\
and board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9c9i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b7c==''\
and board.s8c=='':
moves = '9c7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b7c==''\
and board.s8c=='':
moves = '9c7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b6c==''\
and board.s8c+board.s7c=='':
moves = '9c6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b6c==''\
and board.s8c+board.s7c=='':
moves = '9c6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b5c==''\
and board.s8c+board.s7c+board.s6c=='':
moves = '9c5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b5c==''\
and board.s8c+board.s7c+board.s6c=='':
moves = '9c5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b4c==''\
and board.s8c+board.s7c+board.s6c+board.s5c=='':
moves = '9c4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b4c==''\
and board.s8c+board.s7c+board.s6c+board.s5c=='':
moves = '9c4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b3c==''\
and board.s8c+board.s7c+board.s6c+board.s5c+board.s4c=='':
moves = '9c3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b3c==''\
and board.s8c+board.s7c+board.s6c+board.s5c+board.s4c=='':
moves = '9c3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b2c==''\
and board.s8c+board.s7c+board.s6c+board.s5c+board.s4c+board.s3c=='':
moves = '9c2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b2c==''\
and board.s8c+board.s7c+board.s6c+board.s5c+board.s4c+board.s3c=='':
moves = '9c2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9c)and Bboard.b1c==''\
and board.s8c+board.s7c+board.s6c+board.s5c+board.s4c+board.s3c+board.s2c=='':
moves = '9c1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('R', Bboard.b9c)and Bboard.b1c==''\
and board.s8c+board.s7c+board.s6c+board.s5c+board.s4c+board.s3c+board.s2c=='':
moves = '9c1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9c)and Bboard.b7a==''\
and board.s8b=='':
moves = '9c7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9c)and Bboard.b7e==''\
and board.s8d=='':
moves = '9c7e+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9c)and Bboard.b6f==''\
and board.s8d+board.s7e=='':
moves = '9c6f+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9c)and Bboard.b5g==''\
and board.s8d+board.s7e+board.s6f=='':
moves = '9c5g+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9c)and Bboard.b4h==''\
and board.s8d+board.s7e+board.s6f+board.s5g=='':
moves = '9c4h+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9c)and Bboard.b3i==''\
and board.s8d+board.s7e+board.s6f+board.s5g+board.s4h=='':
moves = '9c3i+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9c)and Bboard.b7a==''\
and board.s8b=='':
moves = '9c7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9c)and Bboard.b7e==''\
and board.s8d=='':
moves = '9c7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9c)and Bboard.b6f==''\
and board.s8d+board.s7e=='':
moves = '9c6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9c)and Bboard.b5g==''\
and board.s8d+board.s7e+board.s6f=='':
moves = '9c5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9c)and Bboard.b4h==''\
and board.s8d+board.s7e+board.s6f+board.s5g=='':
moves = '9c4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9c)and Bboard.b3i==''\
and board.s8d+board.s7e+board.s6f+board.s5g+board.s4h=='':
moves = '9c3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 1d (edge file): enumerate destinations, validate with kaihimore(),
# keep the move in depth1 only if it resolves the check (oute.oute == 0).
if Bboard.b1d !='':
if re.match(r'[LSGK+]', Bboard.b1d)and Bboard.b1c=='':
moves = '1d1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b1d)and Bboard.b2c=='':
moves = '1d2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b1d)and Bboard.b2d=='':
moves = '1d2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b1d)and Bboard.b1e=='':
moves = '1d1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b1d)and Bboard.b2e=='':
moves = '1d2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b1d)and Bboard.b1c=='':
moves = '1d1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b1d)and Bboard.b2c=='':
moves = '1d2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b1d)and Bboard.b2b=='':
moves = '1d2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1d)and Bboard.b1a==''\
and board.s1b+board.s1c=='':
moves = '1d1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1d)and Bboard.b1a==''\
and board.s1b+board.s1c=='':
moves = '1d1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1d)and Bboard.b1b==''\
and board.s1c=='':
moves = '1d1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1d)and Bboard.b1b==''\
and board.s1c=='':
moves = '1d1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b1f==''\
and board.s1e=='':
moves = '1d1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b1g==''\
and board.s1e+board.s1f=='':
moves = '1d1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b1h==''\
and board.s1e+board.s1f+board.s1g=='':
moves = '1d1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b1i==''\
and board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1d1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b3d==''\
and board.s2d=='':
moves = '1d3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b4d==''\
and board.s2d+board.s3d=='':
moves = '1d4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b5d==''\
and board.s2d+board.s3d+board.s4d=='':
moves = '1d5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b6d==''\
and board.s2d+board.s3d+board.s4d+board.s5d=='':
moves = '1d6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b7d==''\
and board.s2d+board.s3d+board.s4d+board.s5d+board.s6d=='':
moves = '1d7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b8d==''\
and board.s2d+board.s3d+board.s4d+board.s5d+board.s6d+board.s7d=='':
moves = '1d8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1d)and Bboard.b9d==''\
and board.s2d+board.s3d+board.s4d+board.s5d+board.s6d+board.s7d+board.s8d=='':
moves = '1d9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1d)and Bboard.b3f==''\
and board.s2e=='':
moves = '1d3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1d)and Bboard.b4g==''\
and board.s2e+board.s3f=='':
moves = '1d4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1d)and Bboard.b5h==''\
and board.s2e+board.s3f+board.s4g=='':
moves = '1d5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1d)and Bboard.b6i==''\
and board.s2e+board.s3f+board.s4g+board.s5h=='':
moves = '1d6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1d)and Bboard.b4a==''\
and board.s3b+board.s2c=='':
moves = '1d4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1d)and Bboard.b3b==''\
and board.s2c=='':
moves = '1d3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1d) and Bboard.b4a==''\
and board.s3b+board.s2c=='':
moves = '1d4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1d) and Bboard.b3b==''\
and board.s2c=='':
moves = '1d3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b2d !='':
if re.match(r'[LSGK+]', Bboard.b2d)and Bboard.b2c=='':
moves = '2d2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b2d)and Bboard.b1c=='':
moves = '2d1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b2d)and Bboard.b3c=='':
moves = '2d3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b2d)and Bboard.b1d=='':
moves = '2d1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b2d)and Bboard.b3d=='':
moves = '2d3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b2d)and Bboard.b2e=='':
moves = '2d2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b2d)and Bboard.b1e=='':
moves = '2d1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b2d)and Bboard.b3e=='':
moves = '2d3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b2d)and Bboard.b2c=='':
moves = '2d2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b2d)and Bboard.b1c=='':
moves = '2d1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b2d)and Bboard.b3c=='':
moves = '2d3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2d)and Bboard.b1b=='':
moves = '2d1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2d)and Bboard.b3b=='':
moves = '2d3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2d) and Bboard.b2a==''\
and board.s2b+board.s2c=='':
moves = '2d2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2d)and Bboard.b2a==''\
and board.s2b+board.s2c=='':
moves = '2d2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2d) and Bboard.b2b==''\
and board.s2c=='':
moves = '2d2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2d)and Bboard.b2b==''\
and board.s2c=='':
moves = '2d2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b2f==''\
and board.s2e=='':
moves = '2d2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b2g==''\
and board.s2e+board.s2f=='':
moves = '2d2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b2h==''\
and board.s2e+board.s2f+board.s2g=='':
moves = '2d2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b2i==''\
and board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2d2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b4d==''\
and board.s3d=='':
moves = '2d4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b5d==''\
and board.s3d+board.s4d=='':
moves = '2d5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b6d==''\
and board.s3d+board.s4d+board.s5d=='':
moves = '2d6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b7d==''\
and board.s3d+board.s4d+board.s5d+board.s6d=='':
moves = '2d7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b8d==''\
and board.s3d+board.s4d+board.s5d+board.s6d+board.s7d=='':
moves = '2d8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2d)and Bboard.b9d==''\
and board.s3d+board.s4d+board.s5d+board.s6d+board.s7d+board.s8d=='':
moves = '2d9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2d)and Bboard.b4f==''\
and board.s3e=='':
moves = '2d4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2d)and Bboard.b5g==''\
and board.s3e+board.s4f=='':
moves = '2d5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2d)and Bboard.b6h==''\
and board.s3e+board.s4f+board.s5g=='':
moves = '2d6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2d)and Bboard.b7i==''\
and board.s3e+board.s4f+board.s5g+board.s6h=='':
moves = '2d7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2d)and Bboard.b5a==''\
and board.s4b+board.s3c=='':
moves = '2d5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2d)and Bboard.b4b==''\
and board.s3c=='':
moves = '2d4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2d) and Bboard.b5a==''\
and board.s4b+board.s3c=='':
moves = '2d5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2d) and Bboard.b4b==''\
and board.s3c=='':
moves = '2d4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
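Every rook-slide block above repeats one test: the destination square on Bboard must be empty and every friendly square (board.sXY) strictly between source and destination must be empty. As a hedged sketch (the helper name and the `get_b`/`get_s` accessors are assumptions standing in for `getattr(Bboard, 'b2d')`-style lookups, not part of this program), that pattern can be written once:

```python
# Sketch of the repeated rank-slide test.  `get_b` and `get_s` are assumed
# stand-ins for attribute lookups such as getattr(Bboard, 'b2d') and
# getattr(board, 's3d'); a square holds '' when empty.
def rank_slide_clear(src_file, dst_file, get_b, get_s, rank='d'):
    """True when a rook on (src_file, rank) may slide to (dst_file, rank):
    the destination is empty and every square strictly between the two
    files is empty."""
    step = 1 if dst_file > src_file else -1
    if get_b('b%d%s' % (dst_file, rank)) != '':
        return False
    return all(get_s('s%d%s' % (f, rank)) == ''
               for f in range(src_file + step, dst_file, step))
```

With empty dictionaries standing in for the two boards, `rank_slide_clear(2, 9, ...)` reproduces the condition guarding the `'2d9d'` move above.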
if Bboard.b3d !='':
if re.match(r'[LSGK+]', Bboard.b3d)and Bboard.b3c=='':
moves = '3d3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b3d)and Bboard.b2c=='':
moves = '3d2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b3d)and Bboard.b4c=='':
moves = '3d4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b3d)and Bboard.b2d=='':
moves = '3d2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b3d)and Bboard.b4d=='':
moves = '3d4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b3d)and Bboard.b3e=='':
moves = '3d3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b3d)and Bboard.b2e=='':
moves = '3d2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b3d)and Bboard.b4e=='':
moves = '3d4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b3d)and Bboard.b3c=='':
moves = '3d3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b3d)and Bboard.b2c=='':
moves = '3d2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b3d)and Bboard.b4c=='':
moves = '3d4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3d)and Bboard.b2b=='':
moves = '3d2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3d)and Bboard.b4b=='':
moves = '3d4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3d) and Bboard.b3a==''\
and board.s3b+board.s3c=='':
moves = '3d3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3d)and Bboard.b3a==''\
and board.s3b+board.s3c=='':
moves = '3d3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3d) and Bboard.b3b==''\
and board.s3c=='':
moves = '3d3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3d)and Bboard.b3b==''\
and board.s3c=='':
moves = '3d3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b3f==''\
and board.s3e=='':
moves = '3d3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b3g==''\
and board.s3e+board.s3f=='':
moves = '3d3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b3h==''\
and board.s3e+board.s3f+board.s3g=='':
moves = '3d3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b3i==''\
and board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3d3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b1d==''\
and board.s2d=='':
moves = '3d1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b5d==''\
and board.s4d=='':
moves = '3d5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b6d==''\
and board.s4d+board.s5d=='':
moves = '3d6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b7d==''\
and board.s4d+board.s5d+board.s6d=='':
moves = '3d7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b8d==''\
and board.s4d+board.s5d+board.s6d+board.s7d=='':
moves = '3d8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3d)and Bboard.b9d==''\
and board.s4d+board.s5d+board.s6d+board.s7d+board.s8d=='':
moves = '3d9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3d)and Bboard.b1b==''\
and board.s2c=='':
moves = '3d1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3d) and Bboard.b1b==''\
and board.s2c=='':
moves = '3d1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3d)and Bboard.b5f==''\
and board.s4e=='':
moves = '3d5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3d)and Bboard.b6g==''\
and board.s4e+board.s5f=='':
moves = '3d6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3d)and Bboard.b7h==''\
and board.s4e+board.s5f+board.s6g=='':
moves = '3d7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3d)and Bboard.b8i==''\
and board.s4e+board.s5f+board.s6g+board.s7h=='':
moves = '3d8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3d)and Bboard.b6a==''\
and board.s5b+board.s4c=='':
moves = '3d6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3d)and Bboard.b5b==''\
and board.s4c=='':
moves = '3d5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3d) and Bboard.b6a==''\
and board.s5b+board.s4c=='':
moves = '3d6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3d) and Bboard.b5b==''\
and board.s4c=='':
moves = '3d5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3d)and Bboard.b1f==''\
and board.s2e=='':
moves = '3d1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b4d !='':
if re.match(r'[LSGK+]', Bboard.b4d)and Bboard.b4c=='':
moves = '4d4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b4d)and Bboard.b3c=='':
moves = '4d3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b4d)and Bboard.b5c=='':
moves = '4d5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4d)and Bboard.b3d=='':
moves = '4d3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4d)and Bboard.b5d=='':
moves = '4d5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b4d)and Bboard.b4e=='':
moves = '4d4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b4d)and Bboard.b3e=='':
moves = '4d3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b4d)and Bboard.b5e=='':
moves = '4d5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b4d)and Bboard.b4c=='':
moves = '4d4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4d)and Bboard.b3c=='':
moves = '4d3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b4d)and Bboard.b5c=='':
moves = '4d5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4d)and Bboard.b3b=='':
moves = '4d3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4d)and Bboard.b5b=='':
moves = '4d5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4d) and Bboard.b4a==''\
and board.s4b+board.s4c=='':
moves = '4d4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4d)and Bboard.b4a==''\
and board.s4b+board.s4c=='':
moves = '4d4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4d) and Bboard.b4b==''\
and board.s4c=='':
moves = '4d4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4d)and Bboard.b4b==''\
and board.s4c=='':
moves = '4d4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b4f==''\
and board.s4e=='':
moves = '4d4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b4g==''\
and board.s4e+board.s4f=='':
moves = '4d4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b4h==''\
and board.s4e+board.s4f+board.s4g=='':
moves = '4d4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b4i==''\
and board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4d4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b1d==''\
and board.s2d+board.s3d=='':
moves = '4d1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b2d==''\
and board.s3d=='':
moves = '4d2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b6d==''\
and board.s5d=='':
moves = '4d6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b7d==''\
and board.s5d+board.s6d=='':
moves = '4d7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b8d==''\
and board.s5d+board.s6d+board.s7d=='':
moves = '4d8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4d)and Bboard.b9d==''\
and board.s5d+board.s6d+board.s7d+board.s8d=='':
moves = '4d9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4d)and Bboard.b1a==''\
and board.s2b+board.s3c=='':
moves = '4d1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4d)and Bboard.b2b==''\
and board.s3c=='':
moves = '4d2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4d) and Bboard.b1a==''\
and board.s2b+board.s3c=='':
moves = '4d1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4d) and Bboard.b2b==''\
and board.s3c=='':
moves = '4d2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4d)and Bboard.b6f==''\
and board.s5e=='':
moves = '4d6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4d)and Bboard.b7g==''\
and board.s5e+board.s6f=='':
moves = '4d7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4d)and Bboard.b8h==''\
and board.s5e+board.s6f+board.s7g=='':
moves = '4d8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4d)and Bboard.b9i==''\
and board.s5e+board.s6f+board.s7g+board.s8h=='':
moves = '4d9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4d)and Bboard.b7a==''\
and board.s6b+board.s5c=='':
moves = '4d7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4d)and Bboard.b6b==''\
and board.s5c=='':
moves = '4d6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4d) and Bboard.b7a==''\
and board.s6b+board.s5c=='':
moves = '4d7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4d) and Bboard.b6b==''\
and board.s5c=='':
moves = '4d6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4d)and Bboard.b2f==''\
and board.s3e=='':
moves = '4d2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4d)and Bboard.b1g==''\
and board.s3e+board.s2f=='':
moves = '4d1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
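The bishop-slide blocks follow the same shape as the rook slides, only along a diagonal: destination empty on Bboard, every intermediate diagonal square empty on board. A hedged sketch (helper name, `RANKS` table, and the `get_b`/`get_s` accessors are illustrative assumptions, not used elsewhere in this file):

```python
# Sketch of the repeated diagonal-slide test.  Assumes src and dst lie on
# one diagonal; `get_b`/`get_s` stand in for getattr lookups on the two
# board objects, with '' meaning an empty square.
RANKS = 'abcdefghi'

def diag_slide_clear(src, dst, get_b, get_s):
    """True when a bishop may slide from square `src` to `dst`
    (e.g. '4d' -> '8h'): the destination is empty and every intermediate
    diagonal square is empty."""
    f0, r0 = int(src[0]), RANKS.index(src[1])
    f1, r1 = int(dst[0]), RANKS.index(dst[1])
    df = 1 if f1 > f0 else -1
    dr = 1 if r1 > r0 else -1
    if get_b('b' + dst) != '':
        return False
    f, r = f0 + df, r0 + dr
    while (f, r) != (f1, r1):
        if get_s('s%d%s' % (f, RANKS[r])) != '':
            return False
        f, r = f + df, r + dr
    return True
```

For example, `diag_slide_clear('4d', '8h', ...)` walks s5e, s6f, s7g, matching the blockers checked before the `'4d8h'` move above.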
if Bboard.b5d !='':
if re.match(r'[LSGK+]', Bboard.b5d)and Bboard.b5c=='':
moves = '5d5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b5d)and Bboard.b4c=='':
moves = '5d4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b5d)and Bboard.b6c=='':
moves = '5d6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5d)and Bboard.b4d=='':
moves = '5d4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5d)and Bboard.b6d=='':
moves = '5d6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b5d)and Bboard.b5e=='':
moves = '5d5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b5d)and Bboard.b4e=='':
moves = '5d4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b5d)and Bboard.b6e=='':
moves = '5d6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b5d)and Bboard.b5c=='':
moves = '5d5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5d)and Bboard.b4c=='':
moves = '5d4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b5d)and Bboard.b6c=='':
moves = '5d6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5d)and Bboard.b4b=='':
moves = '5d4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5d)and Bboard.b6b=='':
moves = '5d6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5d) and Bboard.b5a==''\
and board.s5b+board.s5c=='':
moves = '5d5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5d)and Bboard.b5a==''\
and board.s5b+board.s5c=='':
moves = '5d5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5d) and Bboard.b5b==''\
and board.s5c=='':
moves = '5d5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5d)and Bboard.b5b==''\
and board.s5c=='':
moves = '5d5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b5f==''\
and board.s5e=='':
moves = '5d5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b5g==''\
and board.s5e+board.s5f=='':
moves = '5d5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b5h==''\
and board.s5e+board.s5f+board.s5g=='':
moves = '5d5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b5i==''\
and board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5d5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b1d==''\
and board.s2d+board.s3d+board.s4d=='':
moves = '5d1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b2d==''\
and board.s3d+board.s4d=='':
moves = '5d2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b3d==''\
and board.s4d=='':
moves = '5d3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b7d==''\
and board.s6d=='':
moves = '5d7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b8d==''\
and board.s6d+board.s7d=='':
moves = '5d8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5d)and Bboard.b9d==''\
and board.s6d+board.s7d+board.s8d=='':
moves = '5d9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5d)and Bboard.b2a==''\
and board.s3b+board.s4c=='':
moves = '5d2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5d)and Bboard.b3b==''\
and board.s4c=='':
moves = '5d3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5d) and Bboard.b2a==''\
and board.s3b+board.s4c=='':
moves = '5d2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5d) and Bboard.b3b==''\
and board.s4c=='':
moves = '5d3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5d)and Bboard.b7f==''\
and board.s6e=='':
moves = '5d7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5d)and Bboard.b8g==''\
and board.s6e+board.s7f=='':
moves = '5d8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5d)and Bboard.b9h==''\
and board.s6e+board.s7f+board.s8g=='':
moves = '5d9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5d)and Bboard.b8a==''\
and board.s7b+board.s6c=='':
moves = '5d8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5d)and Bboard.b7b==''\
and board.s6c=='':
moves = '5d7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5d) and Bboard.b8a==''\
and board.s7b+board.s6c=='':
moves = '5d8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5d) and Bboard.b7b==''\
and board.s6c=='':
moves = '5d7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5d)and Bboard.b3f==''\
and board.s4e=='':
moves = '5d3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5d)and Bboard.b2g==''\
and board.s4e+board.s3f=='':
moves = '5d2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
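The move strings appended above ('5d2a+', '5d7f', ...) always encode source square, destination square, and an optional '+' promotion suffix. A small parser (a hypothetical helper for illustration only; nothing in this file calls it) makes that encoding explicit:

```python
# Hypothetical helper: split a move string such as '5d2a+' into its
# source square, destination square, and promotion flag.
def parse_move(m):
    promote = m.endswith('+')
    core = m[:-1] if promote else m
    return core[:2], core[2:], promote
```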
if Bboard.b6d !='':
if re.match(r'[LSGK+]', Bboard.b6d)and Bboard.b6c=='':
moves = '6d6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b6d)and Bboard.b5c=='':
moves = '6d5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b6d)and Bboard.b7c=='':
moves = '6d7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6d)and Bboard.b5d=='':
moves = '6d5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6d)and Bboard.b7d=='':
moves = '6d7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b6d)and Bboard.b6e=='':
moves = '6d6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b6d)and Bboard.b5e=='':
moves = '6d5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b6d)and Bboard.b7e=='':
moves = '6d7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b6d)and Bboard.b6c=='':
moves = '6d6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6d)and Bboard.b5c=='':
moves = '6d5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b6d)and Bboard.b7c=='':
moves = '6d7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6d)and Bboard.b5b=='':
moves = '6d5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6d)and Bboard.b7b=='':
moves = '6d7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6d) and Bboard.b6a==''\
and board.s6b+board.s6c=='':
moves = '6d6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6d)and Bboard.b6a==''\
and board.s6b+board.s6c=='':
moves = '6d6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6d) and Bboard.b6b==''\
and board.s6c=='':
moves = '6d6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6d)and Bboard.b6b==''\
and board.s6c=='':
moves = '6d6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b6f==''\
and board.s6e=='':
moves = '6d6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b6g==''\
and board.s6e+board.s6f=='':
moves = '6d6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b6h==''\
and board.s6e+board.s6f+board.s6g=='':
moves = '6d6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b6i==''\
and board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6d6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b9d==''\
and board.s8d+board.s7d=='':
moves = '6d9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b8d==''\
and board.s7d=='':
moves = '6d8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b4d==''\
and board.s5d=='':
moves = '6d4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b3d==''\
and board.s5d+board.s4d=='':
moves = '6d3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b2d==''\
and board.s5d+board.s4d+board.s3d=='':
moves = '6d2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6d)and Bboard.b1d==''\
and board.s5d+board.s4d+board.s3d+board.s2d=='':
moves = '6d1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6d)and Bboard.b9a==''\
and board.s8b+board.s7c=='':
moves = '6d9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6d)and Bboard.b8b==''\
and board.s7c=='':
moves = '6d8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6d) and Bboard.b9a==''\
and board.s8b+board.s7c=='':
moves = '6d9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6d) and Bboard.b8b==''\
and board.s7c=='':
moves = '6d8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6d)and Bboard.b4f==''\
and board.s5e=='':
moves = '6d4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6d)and Bboard.b3g==''\
and board.s5e+board.s4f=='':
moves = '6d3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6d)and Bboard.b2h==''\
and board.s5e+board.s4f+board.s3g=='':
moves = '6d2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6d)and Bboard.b1i==''\
and board.s5e+board.s4f+board.s3g+board.s2h=='':
moves = '6d1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6d)and Bboard.b3a==''\
and board.s4b+board.s5c=='':
moves = '6d3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6d)and Bboard.b4b==''\
and board.s5c=='':
moves = '6d4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6d) and Bboard.b3a==''\
and board.s4b+board.s5c=='':
moves = '6d3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6d) and Bboard.b4b==''\
and board.s5c=='':
moves = '6d4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6d)and Bboard.b8f==''\
and board.s7e=='':
moves = '6d8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6d)and Bboard.b9g==''\
and board.s7e+board.s8f=='':
moves = '6d9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b7d !='':
if re.match(r'[LSGK+]', Bboard.b7d)and Bboard.b7c=='':
moves = '7d7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b7d)and Bboard.b6c=='':
moves = '7d6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b7d)and Bboard.b8c=='':
moves = '7d8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7d)and Bboard.b6d=='':
moves = '7d6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7d)and Bboard.b8d=='':
moves = '7d8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b7d)and Bboard.b7e=='':
moves = '7d7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b7d)and Bboard.b6e=='':
moves = '7d6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|S|K',Bboard.b7d)and Bboard.b8e=='':
moves = '7d8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b7d)and Bboard.b7c=='':
moves = '7d7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7d)and Bboard.b6c=='':
moves = '7d6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b7d)and Bboard.b8c=='':
moves = '7d8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7d)and Bboard.b6b=='':
moves = '7d6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7d)and Bboard.b8b=='':
moves = '7d8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7d) and Bboard.b7a==''\
and board.s7b+board.s7c=='':
moves = '7d7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7d)and Bboard.b7a==''\
and board.s7b+board.s7c=='':
moves = '7d7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7d) and Bboard.b7b==''\
and board.s7c=='':
moves = '7d7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7d)and Bboard.b7b==''\
and board.s7c=='':
moves = '7d7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b7f==''\
and board.s7e=='':
moves = '7d7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b7g==''\
and board.s7e+board.s7f=='':
moves = '7d7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b7h==''\
and board.s7e+board.s7f+board.s7g=='':
moves = '7d7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b7i==''\
and board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7d7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b9d==''\
and board.s8d=='':
moves = '7d9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b5d==''\
and board.s6d=='':
moves = '7d5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b4d==''\
and board.s6d+board.s5d=='':
moves = '7d4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b3d==''\
and board.s6d+board.s5d+board.s4d=='':
moves = '7d3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b2d==''\
and board.s6d+board.s5d+board.s4d+board.s3d=='':
moves = '7d2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7d)and Bboard.b1d==''\
and board.s6d+board.s5d+board.s4d+board.s3d+board.s2d=='':
moves = '7d1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7d)and Bboard.b9b==''\
and board.s8c=='':
moves = '7d9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7d) and Bboard.b9b==''\
and board.s8c=='':
moves = '7d9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7d)and Bboard.b5f==''\
and board.s6e=='':
moves = '7d5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7d)and Bboard.b4g==''\
and board.s6e+board.s5f=='':
moves = '7d4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7d)and Bboard.b3h==''\
and board.s6e+board.s5f+board.s4g=='':
moves = '7d3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7d)and Bboard.b2i==''\
and board.s6e+board.s5f+board.s4g+board.s3h=='':
moves = '7d2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7d)and Bboard.b4a==''\
and board.s5b+board.s6c=='':
moves = '7d4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7d)and Bboard.b5b==''\
and board.s6c=='':
moves = '7d5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7d) and Bboard.b4a==''\
and board.s5b+board.s6c=='':
moves = '7d4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7d) and Bboard.b5b==''\
and board.s6c=='':
moves = '7d5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7d)and Bboard.b9f==''\
and board.s8e=='':
moves = '7d9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b8d != '':
if re.match(r'[LSGK+]', Bboard.b8d)and Bboard.b8c=='':
moves = '8d8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b8d)and Bboard.b7c=='':
moves = '8d7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b8d)and Bboard.b9c=='':
moves = '8d9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b8d)and Bboard.b7d=='':
moves = '8d7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b8d)and Bboard.b9d=='':
moves = '8d9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b8d)and Bboard.b8e=='':
moves = '8d8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K', Bboard.b8d) and Bboard.b7e=='':
moves = '8d7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K', Bboard.b8d) and Bboard.b9e=='':
moves = '8d9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b8d)and Bboard.b8c=='':
moves = '8d8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b8d)and Bboard.b7c=='':
moves = '8d7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b8d)and Bboard.b9c=='':
moves = '8d9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8d)and Bboard.b7b=='':
moves = '8d7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8d)and Bboard.b9b=='':
moves = '8d9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8d) and Bboard.b8a==''\
and board.s8b+board.s8c=='':
moves = '8d8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8d)and Bboard.b8a==''\
and board.s8b+board.s8c=='':
moves = '8d8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8d) and Bboard.b8b==''\
and board.s8c=='':
moves = '8d8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8d)and Bboard.b8b==''\
and board.s8c=='':
moves = '8d8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b8f==''\
and board.s8e=='':
moves = '8d8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b8g==''\
and board.s8e+board.s8f=='':
moves = '8d8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b8h==''\
and board.s8e+board.s8f+board.s8g=='':
moves = '8d8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b8i==''\
and board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8d8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b6d==''\
and board.s7d=='':
moves = '8d6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b5d==''\
and board.s7d+board.s6d=='':
moves = '8d5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b4d==''\
and board.s7d+board.s6d+board.s5d=='':
moves = '8d4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b3d==''\
and board.s7d+board.s6d+board.s5d+board.s4d=='':
moves = '8d3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b2d==''\
and board.s7d+board.s6d+board.s5d+board.s4d+board.s3d=='':
moves = '8d2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8d)and Bboard.b1d==''\
and board.s7d+board.s6d+board.s5d+board.s4d+board.s3d+board.s2d=='':
moves = '8d1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8d)and Bboard.b6f==''\
and board.s7e=='':
moves = '8d6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8d)and Bboard.b5g==''\
and board.s7e+board.s6f=='':
moves = '8d5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8d)and Bboard.b4h==''\
and board.s7e+board.s6f+board.s5g=='':
moves = '8d4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8d)and Bboard.b3i==''\
and board.s7e+board.s6f+board.s5g+board.s4h=='':
moves = '8d3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8d)and Bboard.b5a==''\
and board.s6b+board.s7c=='':
moves = '8d5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8d)and Bboard.b6b==''\
and board.s7c=='':
moves = '8d6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8d) and Bboard.b5a==''\
and board.s6b+board.s7c=='':
moves = '8d5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8d) and Bboard.b6b==''\
and board.s7c=='':
moves = '8d6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b9d !='':
if re.match(r'[LSGK+]', Bboard.b9d)and Bboard.b9c=='':
moves = '9d9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGK+]', Bboard.b9d)and Bboard.b8c=='':
moves = '9d8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b9d)and Bboard.b8d=='':
moves = '9d8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GK+]', Bboard.b9d)and Bboard.b9e=='':
moves = '9d9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K', Bboard.b9d) and Bboard.b8e=='':
moves = '9d8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[PLSR]', Bboard.b9d)and Bboard.b9c=='':
moves = '9d9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[BS]', Bboard.b9d)and Bboard.b8c=='':
moves = '9d8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b9d)and Bboard.b8b=='':
moves = '9d8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9d) and Bboard.b9a==''\
and board.s9b+board.s9c=='':
moves = '9d9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9d)and Bboard.b9a==''\
and board.s9b+board.s9c=='':
moves = '9d9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9d) and Bboard.b9b==''\
and board.s9c=='':
moves = '9d9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9d)and Bboard.b9b==''\
and board.s9c=='':
moves = '9d9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b9f==''\
and board.s9e=='':
moves = '9d9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b9g==''\
and board.s9e+board.s9f=='':
moves = '9d9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b9h==''\
and board.s9e+board.s9f+board.s9g=='':
moves = '9d9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b9i==''\
and board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9d9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b7d==''\
and board.s8d=='':
moves = '9d7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b6d==''\
and board.s8d+board.s7d=='':
moves = '9d6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b5d==''\
and board.s8d+board.s7d+board.s6d=='':
moves = '9d5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b4d==''\
and board.s8d+board.s7d+board.s6d+board.s5d=='':
moves = '9d4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b3d==''\
and board.s8d+board.s7d+board.s6d+board.s5d+board.s4d=='':
moves = '9d3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b2d==''\
and board.s8d+board.s7d+board.s6d+board.s5d+board.s4d+board.s3d=='':
moves = '9d2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9d)and Bboard.b1d==''\
and board.s8d+board.s7d+board.s6d+board.s5d+board.s4d+board.s3d+board.s2d=='':
moves = '9d1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9d)and Bboard.b7f==''\
and board.s8e=='':
moves = '9d7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9d)and Bboard.b6g==''\
and board.s8e+board.s7f=='':
moves = '9d6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9d)and Bboard.b5h==''\
and board.s8e+board.s7f+board.s6g=='':
moves = '9d5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9d)and Bboard.b4i==''\
and board.s8e+board.s7f+board.s6g+board.s5h=='':
moves = '9d4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9d)and Bboard.b6a==''\
and board.s7b+board.s8c=='':
moves = '9d6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9d)and Bboard.b7b==''\
and board.s8c=='':
moves = '9d7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9d) and Bboard.b6a==''\
and board.s7b+board.s8c=='':
moves = '9d6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9d) and Bboard.b7b==''\
and board.s8c=='':
moves = '9d7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b1e !='':
if re.match(r'[PLSGRK+]', Bboard.b1e)and Bboard.b1d=='':
moves = '1e1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b1e)and Bboard.b2d=='':
moves = '1e2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1e)and Bboard.b2e=='':
moves = '1e2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1e)and Bboard.b1f=='':
moves = '1e1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b1e)and Bboard.b2f=='':
moves = '1e2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b1e)and Bboard.b2c=='':
moves = '1e2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b1e)and Bboard.b2c=='':
moves = '1e2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1e) and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d=='':
moves = '1e1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1e)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d=='':
moves = '1e1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1e) and Bboard.b1b==''\
and board.s1c+board.s1d=='':
moves = '1e1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1e)and Bboard.b1b==''\
and board.s1c+board.s1d=='':
moves = '1e1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b1e)and Bboard.b1c==''\
and board.s1d=='':
moves = '1e1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1e)and Bboard.b1c==''\
and board.s1d=='':
moves = '1e1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b1g==''\
and board.s1f=='':
moves = '1e1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b1h==''\
and board.s1f+board.s1g=='':
moves = '1e1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b1i==''\
and board.s1f+board.s1g+board.s1h=='':
moves = '1e1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b3e==''\
and board.s2e=='':
moves = '1e3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b4e==''\
and board.s2e+board.s3e=='':
moves = '1e4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b5e==''\
and board.s2e+board.s3e+board.s4e=='':
moves = '1e5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b6e==''\
and board.s2e+board.s3e+board.s4e+board.s5e=='':
moves = '1e6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b7e==''\
and board.s2e+board.s3e+board.s4e+board.s5e+board.s6e=='':
moves = '1e7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b8e==''\
and board.s2e+board.s3e+board.s4e+board.s5e+board.s6e+board.s7e=='':
moves = '1e8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1e)and Bboard.b9e==''\
and board.s2e+board.s3e+board.s4e+board.s5e+board.s6e+board.s7e+board.s8e=='':
moves = '1e9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1e)and Bboard.b3g==''\
and board.s2f=='':
moves = '1e3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1e)and Bboard.b4h==''\
and board.s2f+board.s3g=='':
moves = '1e4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1e)and Bboard.b5i==''\
and board.s2f+board.s3g+board.s4h=='':
moves = '1e5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1e)and Bboard.b5a==''\
and board.s4b+board.s3c+board.s2d=='':
moves = '1e5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1e)and Bboard.b4b==''\
and board.s3c+board.s2d=='':
moves = '1e4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1e)and Bboard.b3c==''\
and board.s2d=='':
moves = '1e3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1e) and Bboard.b5a==''\
and board.s4b+board.s3c+board.s2d=='':
moves = '1e5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1e) and Bboard.b4b==''\
and board.s3c+board.s2d=='':
moves = '1e4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1e) and Bboard.b3c==''\
and board.s2d=='':
moves = '1e3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b2e !='':
if re.match(r'[PLSGRK+]', Bboard.b2e)and Bboard.b2d=='':
moves = '2e2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2e)and Bboard.b1d=='':
moves = '2e1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2e)and Bboard.b3d=='':
moves = '2e3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2e)and Bboard.b1e=='':
moves = '2e1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2e)and Bboard.b3e=='':
moves = '2e3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2e)and Bboard.b2f=='':
moves = '2e2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b2e)and Bboard.b1f=='':
moves = '2e1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b2e)and Bboard.b3f=='':
moves = '2e3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2e)and Bboard.b1c=='':
moves = '2e1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2e)and Bboard.b3c=='':
moves = '2e3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2e)and Bboard.b1c=='':
moves = '2e1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2e)and Bboard.b3c=='':
moves = '2e3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2e) and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d=='':
moves = '2e2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2e)and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d=='':
moves = '2e2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2e) and Bboard.b2b==''\
and board.s2c+board.s2d=='':
moves = '2e2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2e)and Bboard.b2b==''\
and board.s2c+board.s2d=='':
moves = '2e2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b2e)and Bboard.b2c==''\
and board.s2d=='':
moves = '2e2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2e)and Bboard.b2c==''\
and board.s2d=='':
moves = '2e2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b2g==''\
and board.s2f=='':
moves = '2e2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b2h==''\
and board.s2f+board.s2g=='':
moves = '2e2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b2i==''\
and board.s2f+board.s2g+board.s2h=='':
moves = '2e2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b4e==''\
and board.s3e=='':
moves = '2e4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b5e==''\
and board.s3e+board.s4e=='':
moves = '2e5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b6e==''\
and board.s3e+board.s4e+board.s5e=='':
moves = '2e6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b7e==''\
and board.s3e+board.s4e+board.s5e+board.s6e=='':
moves = '2e7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b8e==''\
and board.s3e+board.s4e+board.s5e+board.s6e+board.s7e=='':
moves = '2e8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2e)and Bboard.b9e==''\
and board.s3e+board.s4e+board.s5e+board.s6e+board.s7e+board.s8e=='':
moves = '2e9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2e)and Bboard.b4g==''\
and board.s3f=='':
moves = '2e4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2e)and Bboard.b5h==''\
and board.s3f+board.s4g=='':
moves = '2e5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2e)and Bboard.b6i==''\
and board.s3f+board.s4g+board.s5h=='':
moves = '2e6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2e)and Bboard.b6a==''\
and board.s5b+board.s4c+board.s3d=='':
moves = '2e6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2e)and Bboard.b5b==''\
and board.s4c+board.s3d=='':
moves = '2e5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2e)and Bboard.b4c==''\
and board.s3d=='':
moves = '2e4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2e) and Bboard.b6a==''\
and board.s5b+board.s4c+board.s3d=='':
moves = '2e6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2e) and Bboard.b5b==''\
and board.s4c+board.s3d=='':
moves = '2e5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2e) and Bboard.b4c==''\
and board.s3d=='':
moves = '2e4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b3e !='':
if re.match(r'[PLSGRK+]', Bboard.b3e)and Bboard.b3d=='':
moves = '3e3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3e)and Bboard.b2d=='':
moves = '3e2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3e)and Bboard.b4d=='':
moves = '3e4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3e)and Bboard.b2e=='':
moves = '3e2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3e)and Bboard.b4e=='':
moves = '3e4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3e)and Bboard.b3f=='':
moves = '3e3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b3e)and Bboard.b2f=='':
moves = '3e2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b3e)and Bboard.b4f=='':
moves = '3e4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3e)and Bboard.b2c=='':
moves = '3e2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3e)and Bboard.b4c=='':
moves = '3e4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3e)and Bboard.b2c=='':
moves = '3e2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3e)and Bboard.b4c=='':
moves = '3e4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3e) and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d=='':
moves = '3e3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3e)and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d=='':
moves = '3e3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3e) and Bboard.b3b==''\
and board.s3c+board.s3d=='':
moves = '3e3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3e)and Bboard.b3b==''\
and board.s3c+board.s3d=='':
moves = '3e3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b3e)and Bboard.b3c==''\
and board.s3d=='':
moves = '3e3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3e)and Bboard.b3c==''\
and board.s3d=='':
moves = '3e3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b3g==''\
and board.s3f=='':
moves = '3e3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b3h==''\
and board.s3f+board.s3g=='':
moves = '3e3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b3i==''\
and board.s3f+board.s3g+board.s3h=='':
moves = '3e3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b1e==''\
and board.s2e=='':
moves = '3e1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b5e==''\
and board.s4e=='':
moves = '3e5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b6e==''\
and board.s4e+board.s5e=='':
moves = '3e6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b7e==''\
and board.s4e+board.s5e+board.s6e=='':
moves = '3e7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b8e==''\
and board.s4e+board.s5e+board.s6e+board.s7e=='':
moves = '3e8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3e)and Bboard.b9e==''\
and board.s4e+board.s5e+board.s6e+board.s7e+board.s8e=='':
moves = '3e9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3e)and Bboard.b1c==''\
and board.s2d=='':
moves = '3e1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3e) and Bboard.b1c==''\
and board.s2d=='':
moves = '3e1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3e)and Bboard.b5g==''\
and board.s4f=='':
moves = '3e5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3e)and Bboard.b6h==''\
and board.s4f+board.s5g=='':
moves = '3e6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3e)and Bboard.b7i==''\
and board.s4f+board.s5g+board.s6h=='':
moves = '3e7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3e)and Bboard.b7a==''\
and board.s6b+board.s5c+board.s4d=='':
moves = '3e7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3e)and Bboard.b6b==''\
and board.s5c+board.s4d=='':
moves = '3e6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3e)and Bboard.b5c==''\
and board.s4d=='':
moves = '3e5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3e) and Bboard.b7a==''\
and board.s6b+board.s5c+board.s4d=='':
moves = '3e7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3e) and Bboard.b6b==''\
and board.s5c+board.s4d=='':
moves = '3e6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3e) and Bboard.b5c==''\
and board.s4d=='':
moves = '3e5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3e)and Bboard.b1g==''\
and board.s2f=='':
moves = '3e1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b4e !='':
if re.match(r'[PLSGRK+]', Bboard.b4e)and Bboard.b4d=='':
moves = '4e4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4e)and Bboard.b3d=='':
moves = '4e3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4e)and Bboard.b5d=='':
moves = '4e5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4e)and Bboard.b3e=='':
moves = '4e3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4e)and Bboard.b5e=='':
moves = '4e5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4e)and Bboard.b4f=='':
moves = '4e4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b4e)and Bboard.b3f=='':
moves = '4e3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b4e)and Bboard.b5f=='':
moves = '4e5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4e)and Bboard.b3c=='':
moves = '4e3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4e)and Bboard.b5c=='':
moves = '4e5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4e)and Bboard.b3c=='':
moves = '4e3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4e)and Bboard.b5c=='':
moves = '4e5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4e) and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d=='':
moves = '4e4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4e)and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d=='':
moves = '4e4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4e) and Bboard.b4b==''\
and board.s4c+board.s4d=='':
moves = '4e4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4e)and Bboard.b4b==''\
and board.s4c+board.s4d=='':
moves = '4e4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b4e)and Bboard.b4c==''\
and board.s4d=='':
moves = '4e4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4e)and Bboard.b4c==''\
and board.s4d=='':
moves = '4e4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b4g==''\
and board.s4f=='':
moves = '4e4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b4h==''\
and board.s4f+board.s4g=='':
moves = '4e4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b4i==''\
and board.s4f+board.s4g+board.s4h=='':
moves = '4e4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b1e==''\
and board.s2e+board.s3e=='':
moves = '4e1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b2e==''\
and board.s3e=='':
moves = '4e2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b6e==''\
and board.s5e=='':
moves = '4e6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b7e==''\
and board.s5e+board.s6e=='':
moves = '4e7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b8e==''\
and board.s5e+board.s6e+board.s7e=='':
moves = '4e8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4e)and Bboard.b9e==''\
and board.s5e+board.s6e+board.s7e+board.s8e=='':
moves = '4e9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4e)and Bboard.b1b==''\
and board.s2c+board.s3d=='':
moves = '4e1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4e)and Bboard.b2c==''\
and board.s3d=='':
moves = '4e2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4e) and Bboard.b1b==''\
and board.s2c+board.s3d=='':
moves = '4e1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b4e)and Bboard.b2c==''\
and board.s3d=='':
moves = '4e2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4e)and Bboard.b6g==''\
and board.s5f=='':
moves = '4e6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4e)and Bboard.b7h==''\
and board.s5f+board.s6g=='':
moves = '4e7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4e)and Bboard.b8i==''\
and board.s5f+board.s6g+board.s7h=='':
moves = '4e8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4e)and Bboard.b8a==''\
and board.s7b+board.s6c+board.s5d=='':
moves = '4e8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4e)and Bboard.b7b==''\
and board.s6c+board.s5d=='':
moves = '4e7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b4e)and Bboard.b6c==''\
and board.s5d=='':
moves = '4e6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4e)and Bboard.b8a==''\
and board.s7b+board.s6c+board.s5d=='':
moves = '4e8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4e)and Bboard.b7b==''\
and board.s6c+board.s5d=='':
moves = '4e7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b4e)and Bboard.b6c==''\
and board.s5d=='':
moves = '4e6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4e)and Bboard.b2g==''\
and board.s3f=='':
moves = '4e2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4e)and Bboard.b1h==''\
and board.s3f+board.s2g=='':
moves = '4e1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b5e !='':
if re.match(r'[PLSGRK+]', Bboard.b5e)and Bboard.b5d=='':
moves = '5e5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5e)and Bboard.b4d=='':
moves = '5e4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5e)and Bboard.b6d=='':
moves = '5e6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5e)and Bboard.b4e=='':
moves = '5e4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5e)and Bboard.b6e=='':
moves = '5e6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5e)and Bboard.b5f=='':
moves = '5e5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b5e)and Bboard.b4f=='':
moves = '5e4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b5e)and Bboard.b6f=='':
moves = '5e6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5e)and Bboard.b4c=='':
moves = '5e4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5e)and Bboard.b6c=='':
moves = '5e6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5e)and Bboard.b4c=='':
moves = '5e4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5e)and Bboard.b6c=='':
moves = '5e6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5e)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d=='':
moves = '5e5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5e)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d=='':
moves = '5e5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5e)and Bboard.b5b==''\
and board.s5c+board.s5d=='':
moves = '5e5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5e)and Bboard.b5b==''\
and board.s5c+board.s5d=='':
moves = '5e5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b5e)and Bboard.b5c==''\
and board.s5d=='':
moves = '5e5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5e)and Bboard.b5c==''\
and board.s5d=='':
moves = '5e5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b5g==''\
and board.s5f=='':
moves = '5e5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b5h==''\
and board.s5f+board.s5g=='':
moves = '5e5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b5i==''\
and board.s5f+board.s5g+board.s5h=='':
moves = '5e5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b1e==''\
and board.s2e+board.s3e+board.s4e=='':
moves = '5e1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b2e==''\
and board.s3e+board.s4e=='':
moves = '5e2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b3e==''\
and board.s4e=='':
moves = '5e3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b7e==''\
and board.s6e=='':
moves = '5e7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b8e==''\
and board.s6e+board.s7e=='':
moves = '5e8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5e)and Bboard.b9e==''\
and board.s6e+board.s7e+board.s8e=='':
moves = '5e9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5e)and Bboard.b1a==''\
and board.s2b+board.s3c+board.s4d=='':
moves = '5e1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5e)and Bboard.b2b==''\
and board.s3c+board.s4d=='':
moves = '5e2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5e)and Bboard.b3c==''\
and board.s4d=='':
moves = '5e3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5e)and Bboard.b1a==''\
and board.s2b+board.s3c+board.s4d=='':
moves = '5e1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5e)and Bboard.b2b==''\
and board.s3c+board.s4d=='':
moves = '5e2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b5e)and Bboard.b3c==''\
and board.s4d=='':
moves = '5e3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5e)and Bboard.b7g==''\
and board.s6f=='':
moves = '5e7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5e)and Bboard.b8h==''\
and board.s6f+board.s7g=='':
moves = '5e8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5e)and Bboard.b9i==''\
and board.s6f+board.s7g+board.s8h=='':
moves = '5e9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5e)and Bboard.b9a==''\
and board.s8b+board.s7c+board.s6d=='':
moves = '5e9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5e)and Bboard.b8b==''\
and board.s7c+board.s6d=='':
moves = '5e8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b5e)and Bboard.b7c==''\
and board.s6d=='':
moves = '5e7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5e)and Bboard.b9a==''\
and board.s8b+board.s7c+board.s6d=='':
moves = '5e9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5e)and Bboard.b8b==''\
and board.s7c+board.s6d=='':
moves = '5e8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b5e)and Bboard.b7c==''\
and board.s6d=='':
moves = '5e7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5e)and Bboard.b3g==''\
and board.s4f=='':
moves = '5e3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5e)and Bboard.b2h==''\
and board.s4f+board.s3g=='':
moves = '5e2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5e)and Bboard.b1i==''\
and board.s4f+board.s3g+board.s2h=='':
moves = '5e1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
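# --- Illustrative aside -------------------------------------------------
# The blocks above hand-enumerate every sliding ray (rook/lance files and
# ranks, bishop diagonals) square by square.  The same blocking rule can
# be expressed data-driven: walk each ray one step at a time and stop at
# the first occupied square.  This is only a sketch under an ASSUMED board
# representation (a dict mapping '5e'-style squares to piece strings, with
# '' meaning empty); it is NOT wired into the Bboard/board objects above.
def slide_moves(board, file, rank, deltas):
    """Yield destination squares reachable along each (dfile, drank) ray."""
    files, ranks = '123456789', 'abcdefghi'
    for df, dr in deltas:
        f, r = files.index(file) + df, ranks.index(rank) + dr
        while 0 <= f < 9 and 0 <= r < 9:
            sq = files[f] + ranks[r]
            yield sq                 # square is reachable (move or capture)
            if board.get(sq, ''):
                break                # ray is blocked beyond this square
            f, r = f + df, r + dr
# Example: a rook ray from 5e toward rank a on an empty board yields
# 5d, 5c, 5b, 5a; with a piece on 5c the ray stops after 5c.
# ------------------------------------------------------------------------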
if Bboard.b6e !='':
if re.match(r'[PLSGRK+]', Bboard.b6e)and Bboard.b6d=='':
moves = '6e6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6e)and Bboard.b5d=='':
moves = '6e5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6e)and Bboard.b7d=='':
moves = '6e7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6e)and Bboard.b5e=='':
moves = '6e5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6e)and Bboard.b7e=='':
moves = '6e7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6e)and Bboard.b6f=='':
moves = '6e6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b6e)and Bboard.b5f=='':
moves = '6e5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b6e)and Bboard.b7f=='':
moves = '6e7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6e)and Bboard.b5c=='':
moves = '6e5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6e)and Bboard.b7c=='':
moves = '6e7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6e)and Bboard.b5c=='':
moves = '6e5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6e)and Bboard.b7c=='':
moves = '6e7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6e)and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d=='':
moves = '6e6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6e)and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d=='':
moves = '6e6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6e)and Bboard.b6b==''\
and board.s6c+board.s6d=='':
moves = '6e6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6e)and Bboard.b6b==''\
and board.s6c+board.s6d=='':
moves = '6e6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b6e)and Bboard.b6c==''\
and board.s6d=='':
moves = '6e6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6e)and Bboard.b6c==''\
and board.s6d=='':
moves = '6e6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b6g==''\
and board.s6f=='':
moves = '6e6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b6h==''\
and board.s6f+board.s6g=='':
moves = '6e6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b6i==''\
and board.s6f+board.s6g+board.s6h=='':
moves = '6e6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b9e==''\
and board.s8e+board.s7e=='':
moves = '6e9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b8e==''\
and board.s7e=='':
moves = '6e8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b4e==''\
and board.s5e=='':
moves = '6e4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b3e==''\
and board.s5e+board.s4e=='':
moves = '6e3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b2e==''\
and board.s5e+board.s4e+board.s3e=='':
moves = '6e2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6e)and Bboard.b1e==''\
and board.s5e+board.s4e+board.s3e+board.s2e=='':
moves = '6e1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6e)and Bboard.b9b==''\
and board.s8c+board.s7d=='':
moves = '6e9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6e)and Bboard.b8c==''\
and board.s7d=='':
moves = '6e8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6e)and Bboard.b9b==''\
and board.s8c+board.s7d=='':
moves = '6e9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b6e)and Bboard.b8c==''\
and board.s7d=='':
moves = '6e8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6e)and Bboard.b4g==''\
and board.s5f=='':
moves = '6e4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6e)and Bboard.b3h==''\
and board.s5f+board.s4g=='':
moves = '6e3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6e)and Bboard.b2i==''\
and board.s5f+board.s4g+board.s3h=='':
moves = '6e2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6e)and Bboard.b2a==''\
and board.s3b+board.s4c+board.s5d=='':
moves = '6e2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6e)and Bboard.b3b==''\
and board.s4c+board.s5d=='':
moves = '6e3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b6e)and Bboard.b4c==''\
and board.s5d=='':
moves = '6e4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6e)and Bboard.b2a==''\
and board.s3b+board.s4c+board.s5d=='':
moves = '6e2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6e)and Bboard.b3b==''\
and board.s4c+board.s5d=='':
moves = '6e3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b6e)and Bboard.b4c==''\
and board.s5d=='':
moves = '6e4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6e)and Bboard.b8g==''\
and board.s7f=='':
moves = '6e8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6e)and Bboard.b9h==''\
and board.s7f+board.s8g=='':
moves = '6e9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b7e !='':
if re.match(r'[PLSGRK+]', Bboard.b7e)and Bboard.b7d=='':
moves = '7e7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7e)and Bboard.b6d=='':
moves = '7e6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7e)and Bboard.b8d=='':
moves = '7e8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7e)and Bboard.b6e=='':
moves = '7e6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7e)and Bboard.b8e=='':
moves = '7e8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7e)and Bboard.b7f=='':
moves = '7e7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b7e)and Bboard.b6f=='':
moves = '7e6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b7e)and Bboard.b8f=='':
moves = '7e8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7e)and Bboard.b6c=='':
moves = '7e6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7e)and Bboard.b8c=='':
moves = '7e8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7e)and Bboard.b6c=='':
moves = '7e6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7e)and Bboard.b8c=='':
moves = '7e8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7e)and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d=='':
moves = '7e7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7e)and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d=='':
moves = '7e7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7e)and Bboard.b7b==''\
and board.s7c+board.s7d=='':
moves = '7e7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7e)and Bboard.b7b==''\
and board.s7c+board.s7d=='':
moves = '7e7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b7e)and Bboard.b7c==''\
and board.s7d=='':
moves = '7e7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7e)and Bboard.b7c==''\
and board.s7d=='':
moves = '7e7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b7g==''\
and board.s7f=='':
moves = '7e7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b7h==''\
and board.s7f+board.s7g=='':
moves = '7e7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b7i==''\
and board.s7f+board.s7g+board.s7h=='':
moves = '7e7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b9e==''\
and board.s8e=='':
moves = '7e9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b5e==''\
and board.s6e=='':
moves = '7e5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b4e==''\
and board.s6e+board.s5e=='':
moves = '7e4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b3e==''\
and board.s6e+board.s5e+board.s4e=='':
moves = '7e3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b2e==''\
and board.s6e+board.s5e+board.s4e+board.s3e=='':
moves = '7e2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7e)and Bboard.b1e==''\
and board.s6e+board.s5e+board.s4e+board.s3e+board.s2e=='':
moves = '7e1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7e)and Bboard.b9c==''\
and board.s8d=='':
moves = '7e9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b7e)and Bboard.b9c==''\
and board.s8d=='':
moves = '7e9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7e)and Bboard.b5g==''\
and board.s6f=='':
moves = '7e5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7e)and Bboard.b4h==''\
and board.s6f+board.s5g=='':
moves = '7e4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7e)and Bboard.b3i==''\
and board.s6f+board.s5g+board.s4h=='':
moves = '7e3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7e)and Bboard.b3a==''\
and board.s4b+board.s5c+board.s6d=='':
moves = '7e3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7e)and Bboard.b4b==''\
and board.s5c+board.s6d=='':
moves = '7e4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7e)and Bboard.b5c==''\
and board.s6d=='':
moves = '7e5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7e)and Bboard.b3a==''\
and board.s4b+board.s5c+board.s6d=='':
moves = '7e3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7e)and Bboard.b4b==''\
and board.s5c+board.s6d=='':
moves = '7e4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b7e)and Bboard.b5c==''\
and board.s6d=='':
moves = '7e5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7e)and Bboard.b9g==''\
and board.s8f=='':
moves = '7e9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b8e !='':
if re.match(r'[PLSGRK+]', Bboard.b8e)and Bboard.b8d=='':
moves = '8e8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8e)and Bboard.b7d=='':
moves = '8e7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8e)and Bboard.b9d=='':
moves = '8e9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8e)and Bboard.b7e=='':
moves = '8e7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8e)and Bboard.b9e=='':
moves = '8e9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8e)and Bboard.b8f=='':
moves = '8e8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b8e)and Bboard.b7f=='':
moves = '8e7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b8e)and Bboard.b9f=='':
moves = '8e9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8e)and Bboard.b7c=='':
moves = '8e7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8e)and Bboard.b9c=='':
moves = '8e9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8e)and Bboard.b7c=='':
moves = '8e7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8e)and Bboard.b9c=='':
moves = '8e9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8e)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d=='':
moves = '8e8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8e)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d=='':
moves = '8e8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8e)and Bboard.b8b==''\
and board.s8c+board.s8d=='':
moves = '8e8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8e)and Bboard.b8b==''\
and board.s8c+board.s8d=='':
moves = '8e8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b8e)and Bboard.b8c==''\
and board.s8d=='':
moves = '8e8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8e)and Bboard.b8c==''\
and board.s8d=='':
moves = '8e8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b8g==''\
and board.s8f=='':
moves = '8e8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b8h==''\
and board.s8f+board.s8g=='':
moves = '8e8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b8i==''\
and board.s8f+board.s8g+board.s8h=='':
moves = '8e8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b6e==''\
and board.s7e=='':
moves = '8e6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b5e==''\
and board.s7e+board.s6e=='':
moves = '8e5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b4e==''\
and board.s7e+board.s6e+board.s5e=='':
moves = '8e4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b3e==''\
and board.s7e+board.s6e+board.s5e+board.s4e=='':
moves = '8e3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b2e==''\
and board.s7e+board.s6e+board.s5e+board.s4e+board.s3e=='':
moves = '8e2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8e)and Bboard.b1e==''\
and board.s7e+board.s6e+board.s5e+board.s4e+board.s3e+board.s2e=='':
moves = '8e1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8e)and Bboard.b6g==''\
and board.s7f=='':
moves = '8e6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8e)and Bboard.b5h==''\
and board.s7f+board.s6g=='':
moves = '8e5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8e)and Bboard.b4i==''\
and board.s7f+board.s6g+board.s5h=='':
moves = '8e4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8e)and Bboard.b4a==''\
and board.s5b+board.s6c+board.s7d=='':
moves = '8e4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8e)and Bboard.b5b==''\
and board.s6c+board.s7d=='':
moves = '8e5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8e)and Bboard.b6c==''\
and board.s7d=='':
moves = '8e6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8e)and Bboard.b4a==''\
and board.s5b+board.s6c+board.s7d=='':
moves = '8e4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8e)and Bboard.b5b==''\
and board.s6c+board.s7d=='':
moves = '8e5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b8e)and Bboard.b6c==''\
and board.s7d=='':
moves = '8e6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b9e !='':
if re.match(r'[PLSGRK+]', Bboard.b9e)and Bboard.b9d=='':
moves = '9e9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b9e)and Bboard.b8d=='':
moves = '9e8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9e)and Bboard.b8e=='':
moves = '9e8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9e)and Bboard.b9f=='':
moves = '9e9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b9e)and Bboard.b8f=='':
moves = '9e8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b9e)and Bboard.b8c=='':
moves = '9e8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b9e)and Bboard.b8c=='':
moves = '9e8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9e)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d=='':
moves = '9e9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9e)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d=='':
moves = '9e9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9e)and Bboard.b9b==''\
and board.s9c+board.s9d=='':
moves = '9e9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9e)and Bboard.b9b==''\
and board.s9c+board.s9d=='':
moves = '9e9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b9e)and Bboard.b9c==''\
and board.s9d=='':
moves = '9e9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9e)and Bboard.b9c==''\
and board.s9d=='':
moves = '9e9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b9g==''\
and board.s9f=='':
moves = '9e9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b9h==''\
and board.s9f+board.s9g=='':
moves = '9e9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b9i==''\
and board.s9f+board.s9g+board.s9h=='':
moves = '9e9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b7e==''\
and board.s8e=='':
moves = '9e7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b6e==''\
and board.s8e+board.s7e=='':
moves = '9e6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b5e==''\
and board.s8e+board.s7e+board.s6e=='':
moves = '9e5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b4e==''\
and board.s8e+board.s7e+board.s6e+board.s5e=='':
moves = '9e4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b3e==''\
and board.s8e+board.s7e+board.s6e+board.s5e+board.s4e=='':
moves = '9e3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b2e==''\
and board.s8e+board.s7e+board.s6e+board.s5e+board.s4e+board.s3e=='':
moves = '9e2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9e)and Bboard.b1e==''\
and board.s8e+board.s7e+board.s6e+board.s5e+board.s4e+board.s3e+board.s2e=='':
moves = '9e1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9e)and Bboard.b7g==''\
and board.s8f=='':
moves = '9e7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9e)and Bboard.b6h==''\
and board.s8f+board.s7g=='':
moves = '9e6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9e)and Bboard.b5i==''\
and board.s8f+board.s7g+board.s6h=='':
moves = '9e5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9e)and Bboard.b5a==''\
and board.s6b+board.s7c+board.s8d=='':
moves = '9e5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9e)and Bboard.b6b==''\
and board.s7c+board.s8d=='':
moves = '9e6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9e)and Bboard.b7c==''\
and board.s8d=='':
moves = '9e7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9e)and Bboard.b5a==''\
and board.s6b+board.s7c+board.s8d=='':
moves = '9e5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9e)and Bboard.b6b==''\
and board.s7c+board.s8d=='':
moves = '9e6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B',Bboard.b9e)and Bboard.b7c==''\
and board.s8d=='':
moves = '9e7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b1f !='':
if re.match(r'[PLSGRK+]', Bboard.b1f)and Bboard.b1e=='':
moves = '1f1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b1f)and Bboard.b2e=='':
moves = '1f2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1f)and Bboard.b2f=='':
moves = '1f2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1f)and Bboard.b1g=='':
moves = '1f1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b1f)and Bboard.b2g=='':
moves = '1f2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b1f)and Bboard.b2d=='':
moves = '1f2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1f)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d+board.s1e=='':
moves = '1f1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1f)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d+board.s1e=='':
moves = '1f1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1f)and Bboard.b1b==''\
and board.s1c+board.s1d+board.s1e=='':
moves = '1f1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1f)and Bboard.b1b==''\
and board.s1c+board.s1d+board.s1e=='':
moves = '1f1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b1f)and Bboard.b1c==''\
and board.s1d+board.s1e=='':
moves = '1f1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1f)and Bboard.b1c==''\
and board.s1d+board.s1e=='':
moves = '1f1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1f)and Bboard.b1d==''\
and board.s1e=='':
moves = '1f1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b1h==''\
and board.s1g=='':
moves = '1f1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b1i==''\
and board.s1g+board.s1h=='':
moves = '1f1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b3f==''\
and board.s2f=='':
moves = '1f3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b4f==''\
and board.s2f+board.s3f=='':
moves = '1f4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b5f==''\
and board.s2f+board.s3f+board.s4f=='':
moves = '1f5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b6f==''\
and board.s2f+board.s3f+board.s4f+board.s5f=='':
moves = '1f6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b7f==''\
and board.s2f+board.s3f+board.s4f+board.s5f+board.s6f=='':
moves = '1f7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b8f==''\
and board.s2f+board.s3f+board.s4f+board.s5f+board.s6f+board.s7f=='':
moves = '1f8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1f)and Bboard.b9f==''\
and board.s2f+board.s3f+board.s4f+board.s5f+board.s6f+board.s7f+board.s8f=='':
moves = '1f9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1f)and Bboard.b3d==''\
and board.s2e=='':
moves = '1f3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1f) and Bboard.b4c == ''\
and board.s2e+board.s3d=='':
moves = '1f4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1f) and Bboard.b5b == ''\
and board.s2e+board.s3d+board.s4c=='':
moves = '1f5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1f) and Bboard.b6a == ''\
and board.s2e+board.s3d+board.s4c+board.s5b=='':
moves = '1f6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1f) and Bboard.b4i == ''\
and board.s3h+board.s2g=='':
moves = '1f4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1f) and Bboard.b3h == ''\
and board.s2g=='':
moves = '1f3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b1f)and Bboard.b4c==''\
and board.s2e+board.s3d=='':
moves = '1f4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b1f)and Bboard.b5b==''\
and board.s2e+board.s3d+board.s4c=='':
moves = '1f5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b1f)and Bboard.b6a==''\
and board.s2e+board.s3d+board.s4c+board.s5b=='':
moves = '1f6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 2f: generate evasion moves (kept only when oute.oute == 0 after kaihimore).
if Bboard.b2f != '':
if re.match(r'[PLSGRK+]', Bboard.b2f)and Bboard.b2e=='':
moves = '2f2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2f)and Bboard.b1e=='':
moves = '2f1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2f)and Bboard.b3e=='':
moves = '2f3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2f)and Bboard.b1f=='':
moves = '2f1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2f)and Bboard.b3f=='':
moves = '2f3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2f)and Bboard.b2g=='':
moves = '2f2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b2f)and Bboard.b1g=='':
moves = '2f1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b2f)and Bboard.b3g=='':
moves = '2f3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2f)and Bboard.b1d=='':
moves = '2f1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2f)and Bboard.b3d=='':
moves = '2f3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2f) and Bboard.b2a == ''\
and board.s2b+board.s2c+board.s2d+board.s2e=='':
moves = '2f2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2f)and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d+board.s2e=='':
moves = '2f2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2f) and Bboard.b2b == ''\
and board.s2c+board.s2d+board.s2e=='':
moves = '2f2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2f)and Bboard.b2b==''\
and board.s2c+board.s2d+board.s2e=='':
moves = '2f2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b2f)and Bboard.b2c==''\
and board.s2d+board.s2e=='':
moves = '2f2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2f)and Bboard.b2c==''\
and board.s2d+board.s2e=='':
moves = '2f2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2f)and Bboard.b2d==''\
and board.s2e=='':
moves = '2f2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2f)and Bboard.b2h==''\
and board.s2g=='':
moves = '2f2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2f)and Bboard.b2i==''\
and board.s2g+board.s2h=='':
moves = '2f2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2f)and Bboard.b4f==''\
and board.s3f=='':
moves = '2f4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2f)and Bboard.b5f==''\
and board.s3f+board.s4f=='':
moves = '2f5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2f)and Bboard.b6f==''\
and board.s3f+board.s4f+board.s5f=='':
moves = '2f6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2f)and Bboard.b7f==''\
and board.s3f+board.s4f+board.s5f+board.s6f=='':
moves = '2f7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2f)and Bboard.b8f==''\
and board.s3f+board.s4f+board.s5f+board.s6f+board.s7f=='':
moves = '2f8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2f)and Bboard.b9f==''\
and board.s3f+board.s4f+board.s5f+board.s6f+board.s7f+board.s8f=='':
moves = '2f9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2f)and Bboard.b4d==''\
and board.s3e=='':
moves = '2f4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2f) and Bboard.b5c == ''\
and board.s3e+board.s4d=='':
moves = '2f5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2f) and Bboard.b6b == ''\
and board.s3e+board.s4d+board.s5c=='':
moves = '2f6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2f) and Bboard.b7a == ''\
and board.s3e+board.s4d+board.s5c+board.s6b=='':
moves = '2f7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2f)and Bboard.b5i==''\
and board.s4h+board.s3g=='':
moves = '2f5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2f)and Bboard.b4h==''\
and board.s3g=='':
moves = '2f4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b2f)and Bboard.b5c==''\
and board.s3e+board.s4d=='':
moves = '2f5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b2f)and Bboard.b6b==''\
and board.s3e+board.s4d+board.s5c=='':
moves = '2f6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b2f)and Bboard.b7a==''\
and board.s3e+board.s4d+board.s5c+board.s6b=='':
moves = '2f7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 3f: generate evasion moves (kept only when oute.oute == 0 after kaihimore).
if Bboard.b3f != '':
if re.match(r'[PLSGRK+]', Bboard.b3f)and Bboard.b3e=='':
moves = '3f3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3f)and Bboard.b2e=='':
moves = '3f2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3f)and Bboard.b4e=='':
moves = '3f4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3f)and Bboard.b2f=='':
moves = '3f2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3f)and Bboard.b4f=='':
moves = '3f4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3f)and Bboard.b3g=='':
moves = '3f3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b3f)and Bboard.b2g=='':
moves = '3f2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b3f)and Bboard.b4g=='':
moves = '3f4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3f)and Bboard.b2d=='':
moves = '3f2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3f)and Bboard.b4d=='':
moves = '3f4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3f) and Bboard.b3a == ''\
and board.s3b+board.s3c+board.s3d+board.s3e=='':
moves = '3f3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3f)and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d+board.s3e=='':
moves = '3f3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3f) and Bboard.b3b == ''\
and board.s3c+board.s3d+board.s3e=='':
moves = '3f3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3f)and Bboard.b3b==''\
and board.s3c+board.s3d+board.s3e=='':
moves = '3f3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b3f)and Bboard.b3c==''\
and board.s3d+board.s3e=='':
moves = '3f3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3f)and Bboard.b3c==''\
and board.s3d+board.s3e=='':
moves = '3f3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3f)and Bboard.b3d==''\
and board.s3e=='':
moves = '3f3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3f)and Bboard.b3h==''\
and board.s3g=='':
moves = '3f3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3f)and Bboard.b3i==''\
and board.s3g+board.s3h=='':
moves = '3f3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3f)and Bboard.b1f==''\
and board.s2f=='':
moves = '3f1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3f)and Bboard.b5f==''\
and board.s4f=='':
moves = '3f5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3f)and Bboard.b6f==''\
and board.s4f+board.s5f=='':
moves = '3f6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3f)and Bboard.b7f==''\
and board.s4f+board.s5f+board.s6f=='':
moves = '3f7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3f)and Bboard.b8f==''\
and board.s4f+board.s5f+board.s6f+board.s7f=='':
moves = '3f8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3f)and Bboard.b9f==''\
and board.s4f+board.s5f+board.s6f+board.s7f+board.s8f=='':
moves = '3f9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3f)and Bboard.b1h==''\
and board.s2g=='':
moves = '3f1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3f)and Bboard.b5d==''\
and board.s4e=='':
moves = '3f5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3f) and Bboard.b6c == ''\
and board.s4e+board.s5d=='':
moves = '3f6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3f) and Bboard.b7b == ''\
and board.s4e+board.s5d+board.s6c=='':
moves = '3f7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3f) and Bboard.b8a == ''\
and board.s4e+board.s5d+board.s6c+board.s7b=='':
moves = '3f8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3f)and Bboard.b6i==''\
and board.s5h+board.s4g=='':
moves = '3f6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3f)and Bboard.b5h==''\
and board.s4g=='':
moves = '3f5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3f)and Bboard.b1d==''\
and board.s2e=='':
moves = '3f1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b3f)and Bboard.b6c==''\
and board.s4e+board.s5d=='':
moves = '3f6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b3f)and Bboard.b7b==''\
and board.s4e+board.s5d+board.s6c=='':
moves = '3f7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b3f)and Bboard.b8a==''\
and board.s4e+board.s5d+board.s6c+board.s7b=='':
moves = '3f8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 4f: generate evasion moves (kept only when oute.oute == 0 after kaihimore).
if Bboard.b4f != '':
if re.match(r'[PLSGRK+]', Bboard.b4f)and Bboard.b4e=='':
moves = '4f4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4f)and Bboard.b3e=='':
moves = '4f3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4f)and Bboard.b5e=='':
moves = '4f5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4f)and Bboard.b3f=='':
moves = '4f3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4f)and Bboard.b5f=='':
moves = '4f5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4f)and Bboard.b4g=='':
moves = '4f4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b4f)and Bboard.b3g=='':
moves = '4f3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b4f)and Bboard.b5g=='':
moves = '4f5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4f)and Bboard.b3d=='':
moves = '4f3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4f)and Bboard.b5d=='':
moves = '4f5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4f) and Bboard.b4a == ''\
and board.s4b+board.s4c+board.s4d+board.s4e=='':
moves = '4f4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4f)and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d+board.s4e=='':
moves = '4f4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4f) and Bboard.b4b == ''\
and board.s4c+board.s4d+board.s4e=='':
moves = '4f4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4f)and Bboard.b4b==''\
and board.s4c+board.s4d+board.s4e=='':
moves = '4f4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b4f)and Bboard.b4c==''\
and board.s4d+board.s4e=='':
moves = '4f4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4f)and Bboard.b4c==''\
and board.s4d+board.s4e=='':
moves = '4f4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4f)and Bboard.b4d==''\
and board.s4e=='':
moves = '4f4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4f)and Bboard.b4h==''\
and board.s4g=='':
moves = '4f4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4f)and Bboard.b4i==''\
and board.s4g+board.s4h=='':
moves = '4f4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4f)and Bboard.b1f==''\
and board.s2f+board.s3f=='':
moves = '4f1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4f)and Bboard.b2f==''\
and board.s3f=='':
moves = '4f2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4f)and Bboard.b6f==''\
and board.s5f=='':
moves = '4f6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4f)and Bboard.b7f==''\
and board.s5f+board.s6f=='':
moves = '4f7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4f)and Bboard.b8f==''\
and board.s5f+board.s6f+board.s7f=='':
moves = '4f8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4f)and Bboard.b9f==''\
and board.s5f+board.s6f+board.s7f+board.s8f=='':
moves = '4f9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4f)and Bboard.b1i==''\
and board.s2h+board.s3g=='':
moves = '4f1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4f)and Bboard.b2h==''\
and board.s3g=='':
moves = '4f2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4f)and Bboard.b6d==''\
and board.s5e=='':
moves = '4f6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4f) and Bboard.b7c == ''\
and board.s5e+board.s6d=='':
moves = '4f7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4f) and Bboard.b8b == ''\
and board.s5e+board.s6d+board.s7c=='':
moves = '4f8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4f) and Bboard.b9a == ''\
and board.s5e+board.s6d+board.s7c+board.s8b=='':
moves = '4f9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4f)and Bboard.b7i==''\
and board.s6h+board.s5g=='':
moves = '4f7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4f)and Bboard.b6h==''\
and board.s5g=='':
moves = '4f6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4f)and Bboard.b2d==''\
and board.s3e=='':
moves = '4f2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4f) and Bboard.b1c == ''\
and board.s3e+board.s2d=='':
moves = '4f1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4f)and Bboard.b7c==''\
and board.s5e+board.s6d=='':
moves = '4f7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4f)and Bboard.b8b==''\
and board.s5e+board.s6d+board.s7c=='':
moves = '4f8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4f)and Bboard.b1c==''\
and board.s3e+board.s2d=='':
moves = '4f1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4f)and Bboard.b9a==''\
and board.s5e+board.s6d+board.s7c+board.s8b=='':
moves = '4f9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 5f: generate evasion moves (kept only when oute.oute == 0 after kaihimore).
if Bboard.b5f != '':
if re.match(r'[PLSGRK+]', Bboard.b5f)and Bboard.b5e=='':
moves = '5f5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5f)and Bboard.b4e=='':
moves = '5f4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5f)and Bboard.b6e=='':
moves = '5f6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5f)and Bboard.b4f=='':
moves = '5f4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5f)and Bboard.b6f=='':
moves = '5f6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5f)and Bboard.b5g=='':
moves = '5f5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b5f)and Bboard.b4g=='':
moves = '5f4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b5f)and Bboard.b6g=='':
moves = '5f6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5f)and Bboard.b4d=='':
moves = '5f4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5f)and Bboard.b6d=='':
moves = '5f6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5f) and Bboard.b5a == ''\
and board.s5b+board.s5c+board.s5d+board.s5e=='':
moves = '5f5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5f)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d+board.s5e=='':
moves = '5f5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5f) and Bboard.b5b == ''\
and board.s5c+board.s5d+board.s5e=='':
moves = '5f5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5f)and Bboard.b5b==''\
and board.s5c+board.s5d+board.s5e=='':
moves = '5f5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b5f)and Bboard.b5c==''\
and board.s5d+board.s5e=='':
moves = '5f5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5f)and Bboard.b5c==''\
and board.s5d+board.s5e=='':
moves = '5f5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5f)and Bboard.b5d==''\
and board.s5e=='':
moves = '5f5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5f)and Bboard.b5h==''\
and board.s5g=='':
moves = '5f5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5f)and Bboard.b5i==''\
and board.s5g+board.s5h=='':
moves = '5f5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5f)and Bboard.b1f==''\
and board.s2f+board.s3f+board.s4f=='':
moves = '5f1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5f)and Bboard.b2f==''\
and board.s3f+board.s4f=='':
moves = '5f2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5f)and Bboard.b3f==''\
and board.s4f=='':
moves = '5f3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5f)and Bboard.b7f==''\
and board.s6f=='':
moves = '5f7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5f)and Bboard.b8f==''\
and board.s6f+board.s7f=='':
moves = '5f8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5f)and Bboard.b9f==''\
and board.s6f+board.s7f+board.s8f=='':
moves = '5f9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5f)and Bboard.b2i==''\
and board.s3h+board.s4g=='':
moves = '5f2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B',Bboard.b5f)and Bboard.b3h==''\
and board.s4g=='':
moves = '5f3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5f)and Bboard.b7d==''\
and board.s6e=='':
moves = '5f7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5f) and Bboard.b8c == ''\
and board.s6e+board.s7d=='':
moves = '5f8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5f) and Bboard.b9b == ''\
and board.s6e+board.s7d+board.s8c=='':
moves = '5f9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5f)and Bboard.b8i==''\
and board.s7h+board.s6g=='':
moves = '5f8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B',Bboard.b5f)and Bboard.b7h==''\
and board.s6g=='':
moves = '5f7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5f)and Bboard.b3d==''\
and board.s4e=='':
moves = '5f3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5f) and Bboard.b2c == ''\
and board.s4e+board.s3d=='':
moves = '5f2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5f)and Bboard.b8c==''\
and board.s6e+board.s7d=='':
moves = '5f8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5f)and Bboard.b2c==''\
and board.s4e+board.s3d=='':
moves = '5f2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 6f: generate evasion moves (kept only when oute.oute == 0 after kaihimore).
if Bboard.b6f != '':
if re.match(r'[PLSGRK+]', Bboard.b6f)and Bboard.b6e=='':
moves = '6f6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6f)and Bboard.b5e=='':
moves = '6f5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6f)and Bboard.b7e=='':
moves = '6f7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6f)and Bboard.b5f=='':
moves = '6f5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6f)and Bboard.b7f=='':
moves = '6f7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6f)and Bboard.b6g=='':
moves = '6f6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b6f)and Bboard.b5g=='':
moves = '6f5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b6f)and Bboard.b7g=='':
moves = '6f7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6f)and Bboard.b5d=='':
moves = '6f5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6f)and Bboard.b7d=='':
moves = '6f7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6f) and Bboard.b6a == ''\
and board.s6b+board.s6c+board.s6d+board.s6e=='':
moves = '6f6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6f)and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d+board.s6e=='':
moves = '6f6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6f) and Bboard.b6b == ''\
and board.s6c+board.s6d+board.s6e=='':
moves = '6f6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6f)and Bboard.b6b==''\
and board.s6c+board.s6d+board.s6e=='':
moves = '6f6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b6f)and Bboard.b6c==''\
and board.s6d+board.s6e=='':
moves = '6f6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6f)and Bboard.b6c==''\
and board.s6d+board.s6e=='':
moves = '6f6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6f)and Bboard.b6d==''\
and board.s6e=='':
moves = '6f6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6f)and Bboard.b6h==''\
and board.s6g=='':
moves = '6f6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6f)and Bboard.b6i==''\
and board.s6g+board.s6h=='':
moves = '6f6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6f)and Bboard.b9f==''\
and board.s8f+board.s7f=='':
moves = '6f9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6f)and Bboard.b8f==''\
and board.s7f=='':
moves = '6f8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6f)and Bboard.b4f==''\
and board.s5f=='':
moves = '6f4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6f)and Bboard.b3f==''\
and board.s5f+board.s4f=='':
moves = '6f3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6f)and Bboard.b2f==''\
and board.s5f+board.s4f+board.s3f=='':
moves = '6f2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6f)and Bboard.b1f==''\
and board.s5f+board.s4f+board.s3f+board.s2f=='':
moves = '6f1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6f)and Bboard.b9i==''\
and board.s8h+board.s7g=='':
moves = '6f9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6f)and Bboard.b8h==''\
and board.s7g=='':
moves = '6f8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6f)and Bboard.b4d==''\
and board.s5e=='':
moves = '6f4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6f) and Bboard.b3c == ''\
and board.s5e+board.s4d=='':
moves = '6f3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6f) and Bboard.b2b == ''\
and board.s5e+board.s4d+board.s3c=='':
moves = '6f2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6f) and Bboard.b1a == ''\
and board.s5e+board.s4d+board.s3c+board.s2b=='':
moves = '6f1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6f)and Bboard.b3i==''\
and board.s4h+board.s5g=='':
moves = '6f3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6f)and Bboard.b4h==''\
and board.s5g=='':
moves = '6f4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6f)and Bboard.b8d==''\
and board.s7e=='':
moves = '6f8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6f) and Bboard.b9c == ''\
and board.s7e+board.s8d=='':
moves = '6f9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6f)and Bboard.b3c==''\
and board.s5e+board.s4d=='':
moves = '6f3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6f)and Bboard.b2b==''\
and board.s5e+board.s4d+board.s3c=='':
moves = '6f2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6f)and Bboard.b9c==''\
and board.s7e+board.s8d=='':
moves = '6f9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6f)and Bboard.b1a==''\
and board.s5e+board.s4d+board.s3c+board.s2b=='':
moves = '6f1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 7f: generate evasion moves (kept only when oute.oute == 0 after kaihimore).
if Bboard.b7f != '':
if re.match(r'[PLSGRK+]', Bboard.b7f)and Bboard.b7e=='':
moves = '7f7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7f)and Bboard.b6e=='':
moves = '7f6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7f)and Bboard.b8e=='':
moves = '7f8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7f)and Bboard.b6f=='':
moves = '7f6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7f)and Bboard.b8f=='':
moves = '7f8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7f)and Bboard.b7g=='':
moves = '7f7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b7f)and Bboard.b6g=='':
moves = '7f6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b7f)and Bboard.b8g=='':
moves = '7f8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7f)and Bboard.b6d=='':
moves = '7f6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7f)and Bboard.b8d=='':
moves = '7f8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7f) and Bboard.b7a == ''\
and board.s7b+board.s7c+board.s7d+board.s7e=='':
moves = '7f7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7f)and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d+board.s7e=='':
moves = '7f7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7f)and Bboard.b7b==''\
and board.s7c+board.s7d+board.s7e=='':
moves = '7f7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7f)and Bboard.b7b==''\
and board.s7c+board.s7d+board.s7e=='':
moves = '7f7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b7f)and Bboard.b7c==''\
and board.s7d+board.s7e=='':
moves = '7f7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7f)and Bboard.b7c==''\
and board.s7d+board.s7e=='':
moves = '7f7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7f)and Bboard.b7d==''\
and board.s7e=='':
moves = '7f7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7f)and Bboard.b7h==''\
and board.s7g=='':
moves = '7f7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7f)and Bboard.b7i==''\
and board.s7g+board.s7h=='':
moves = '7f7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7f)and Bboard.b9f==''\
and board.s8f=='':
moves = '7f9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7f)and Bboard.b5f==''\
and board.s6f=='':
moves = '7f5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7f)and Bboard.b4f==''\
and board.s6f+board.s5f=='':
moves = '7f4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7f)and Bboard.b3f==''\
and board.s6f+board.s5f+board.s4f=='':
moves = '7f3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7f)and Bboard.b2f==''\
and board.s6f+board.s5f+board.s4f+board.s3f=='':
moves = '7f2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7f)and Bboard.b1f==''\
and board.s6f+board.s5f+board.s4f+board.s3f+board.s2f=='':
moves = '7f1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7f)and Bboard.b9h==''\
and board.s8g=='':
moves = '7f9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7f)and Bboard.b5d==''\
and board.s6e=='':
moves = '7f5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7f)and Bboard.b4c==''\
and board.s6e+board.s5d=='':
moves = '7f4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7f)and Bboard.b3b==''\
and board.s6e+board.s5d+board.s4c=='':
moves = '7f3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7f)and Bboard.b2a==''\
and board.s6e+board.s5d+board.s4c+board.s3b=='':
moves = '7f2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7f)and Bboard.b4i==''\
and board.s5h+board.s6g=='':
moves = '7f4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7f)and Bboard.b5h==''\
and board.s6g=='':
moves = '7f5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7f)and Bboard.b9d==''\
and board.s8e=='':
moves = '7f9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b7f)and Bboard.b4c==''\
and board.s6e+board.s5d=='':
moves = '7f4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b7f)and Bboard.b3b==''\
and board.s6e+board.s5d+board.s4c=='':
moves = '7f3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b7f)and Bboard.b2a==''\
and board.s6e+board.s5d+board.s4c+board.s3b=='':
moves = '7f2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b8f !='':
if re.match(r'[PLSGRK+]', Bboard.b8f)and Bboard.b8e=='':
moves = '8f8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8f)and Bboard.b7e=='':
moves = '8f7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8f)and Bboard.b9e=='':
moves = '8f9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8f)and Bboard.b7f=='':
moves = '8f7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8f)and Bboard.b9f=='':
moves = '8f9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8f)and Bboard.b8g=='':
moves = '8f8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b8f)and Bboard.b7g=='':
moves = '8f7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b8f)and Bboard.b9g=='':
moves = '8f9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8f)and Bboard.b7d=='':
moves = '8f7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8f)and Bboard.b9d=='':
moves = '8f9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8f)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d+board.s8e=='':
moves = '8f8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8f)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d+board.s8e=='':
moves = '8f8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8f)and Bboard.b8b==''\
and board.s8c+board.s8d+board.s8e=='':
moves = '8f8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8f)and Bboard.b8b==''\
and board.s8c+board.s8d+board.s8e=='':
moves = '8f8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b8f)and Bboard.b8c==''\
and board.s8d+board.s8e=='':
moves = '8f8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8f)and Bboard.b8c==''\
and board.s8d+board.s8e=='':
moves = '8f8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8f)and Bboard.b8d==''\
and board.s8e=='':
moves = '8f8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8f)and Bboard.b8h==''\
and board.s8g=='':
moves = '8f8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8f)and Bboard.b8i==''\
and board.s8g+board.s8h=='':
moves = '8f8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8f)and Bboard.b6f==''\
and board.s7f=='':
moves = '8f6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8f)and Bboard.b5f==''\
and board.s7f+board.s6f=='':
moves = '8f5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8f)and Bboard.b4f==''\
and board.s7f+board.s6f+board.s5f=='':
moves = '8f4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8f)and Bboard.b3f==''\
and board.s7f+board.s6f+board.s5f+board.s4f=='':
moves = '8f3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8f)and Bboard.b2f==''\
and board.s7f+board.s6f+board.s5f+board.s4f+board.s3f=='':
moves = '8f2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8f)and Bboard.b1f==''\
and board.s7f+board.s6f+board.s5f+board.s4f+board.s3f+board.s2f=='':
moves = '8f1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8f)and Bboard.b6d==''\
and board.s7e=='':
moves = '8f6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8f)and Bboard.b5c==''\
and board.s7e+board.s6d=='':
moves = '8f5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8f)and Bboard.b4b==''\
and board.s7e+board.s6d+board.s5c=='':
moves = '8f4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8f)and Bboard.b3a==''\
and board.s7e+board.s6d+board.s5c+board.s4b=='':
moves = '8f3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8f)and Bboard.b5i==''\
and board.s6h+board.s7g=='':
moves = '8f5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8f)and Bboard.b6h==''\
and board.s7g=='':
moves = '8f6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b8f)and Bboard.b5c==''\
and board.s7e+board.s6d=='':
moves = '8f5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b8f)and Bboard.b4b==''\
and board.s7e+board.s6d+board.s5c=='':
moves = '8f4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b8f)and Bboard.b3a==''\
and board.s7e+board.s6d+board.s5c+board.s4b=='':
moves = '8f3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b9f !='':
if re.match(r'[PLSGRK+]', Bboard.b9f)and Bboard.b9e=='':
moves = '9f9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b9f)and Bboard.b8e=='':
moves = '9f8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9f)and Bboard.b8f=='':
moves = '9f8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9f)and Bboard.b9g=='':
moves = '9f9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b9f)and Bboard.b8g=='':
moves = '9f8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b9f)and Bboard.b8d=='':
moves = '9f8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9f)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d+board.s9e=='':
moves = '9f9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9f)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d+board.s9e=='':
moves = '9f9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9f)and Bboard.b9b==''\
and board.s9c+board.s9d+board.s9e=='':
moves = '9f9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9f)and Bboard.b9b==''\
and board.s9c+board.s9d+board.s9e=='':
moves = '9f9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b9f)and Bboard.b9c==''\
and board.s9d+board.s9e=='':
moves = '9f9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9f)and Bboard.b9c==''\
and board.s9d+board.s9e=='':
moves = '9f9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9f)and Bboard.b9d==''\
and board.s9e=='':
moves = '9f9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b9h==''\
and board.s9g=='':
moves = '9f9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b9i==''\
and board.s9g+board.s9h=='':
moves = '9f9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b7f==''\
and board.s8f=='':
moves = '9f7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b6f==''\
and board.s8f+board.s7f=='':
moves = '9f6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b5f==''\
and board.s8f+board.s7f+board.s6f=='':
moves = '9f5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b4f==''\
and board.s8f+board.s7f+board.s6f+board.s5f=='':
moves = '9f4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b3f==''\
and board.s8f+board.s7f+board.s6f+board.s5f+board.s4f=='':
moves = '9f3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b2f==''\
and board.s8f+board.s7f+board.s6f+board.s5f+board.s4f+board.s3f=='':
moves = '9f2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9f)and Bboard.b1f==''\
and board.s8f+board.s7f+board.s6f+board.s5f+board.s4f+board.s3f+board.s2f=='':
moves = '9f1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9f)and Bboard.b7d==''\
and board.s8e=='':
moves = '9f7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9f)and Bboard.b6c==''\
and board.s8e+board.s7d=='':
moves = '9f6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9f)and Bboard.b5b==''\
and board.s8e+board.s7d+board.s6c=='':
moves = '9f5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9f)and Bboard.b4a==''\
and board.s8e+board.s7d+board.s6c+board.s5b=='':
moves = '9f4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9f)and Bboard.b6i==''\
and board.s7h+board.s8g=='':
moves = '9f6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9f)and Bboard.b7h==''\
and board.s8g=='':
moves = '9f7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b9f)and Bboard.b6c==''\
and board.s8e+board.s7d=='':
moves = '9f6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b9f)and Bboard.b5b==''\
and board.s8e+board.s7d+board.s6c=='':
moves = '9f5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b9f)and Bboard.b4a==''\
and board.s8e+board.s7d+board.s6c+board.s5b=='':
moves = '9f4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b1g !='':
if re.match(r'[PLSGRK+]', Bboard.b1g)and Bboard.b1f=='':
moves = '1g1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b1g)and Bboard.b2f=='':
moves = '1g2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1g)and Bboard.b2g=='':
moves = '1g2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1g)and Bboard.b1h=='':
moves = '1g1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b1g)and Bboard.b2h=='':
moves = '1g2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b1g)and Bboard.b2e=='':
moves = '1g2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1g)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f=='':
moves = '1g1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1g)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f=='':
moves = '1g1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1g)and Bboard.b1b==''\
and board.s1c+board.s1d+board.s1e+board.s1f=='':
moves = '1g1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1g)and Bboard.b1b==''\
and board.s1c+board.s1d+board.s1e+board.s1f=='':
moves = '1g1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b1g)and Bboard.b1c==''\
and board.s1d+board.s1e+board.s1f=='':
moves = '1g1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1g)and Bboard.b1c==''\
and board.s1d+board.s1e+board.s1f=='':
moves = '1g1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1g)and Bboard.b1d==''\
and board.s1e+board.s1f=='':
moves = '1g1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1g)and Bboard.b1e==''\
and board.s1f=='':
moves = '1g1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1g)and Bboard.b1i==''\
and board.s1h=='':
moves = '1g1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1g)and Bboard.b3g==''\
and board.s2g=='':
moves = '1g3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1g)and Bboard.b4g==''\
and board.s2g+board.s3g=='':
moves = '1g4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1g)and Bboard.b5g==''\
and board.s2g+board.s3g+board.s4g=='':
moves = '1g5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1g)and Bboard.b6g==''\
and board.s2g+board.s3g+board.s4g+board.s5g=='':
moves = '1g6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1g)and Bboard.b7g==''\
and board.s2g+board.s3g+board.s4g+board.s5g+board.s6g=='':
moves = '1g7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1g)and Bboard.b8g==''\
and board.s2g+board.s3g+board.s4g+board.s5g+board.s6g+board.s7g=='':
moves = '1g8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1g)and Bboard.b9g==''\
and board.s2g+board.s3g+board.s4g+board.s5g+board.s6g+board.s7g+board.s8g=='':
moves = '1g9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1g)and Bboard.b5c==''\
and board.s2f+board.s3e+board.s4d=='':
moves = '1g5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1g)and Bboard.b6b==''\
and board.s2f+board.s3e+board.s4d+board.s5c=='':
moves = '1g6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1g)and Bboard.b7a==''\
and board.s2f+board.s3e+board.s4d+board.s5c+board.s6b=='':
moves = '1g7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1g)and Bboard.b3i==''\
and board.s2h=='':
moves = '1g3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1g)and Bboard.b3e==''\
and board.s2f=='':
moves = '1g3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1g)and Bboard.b4d==''\
and board.s2f+board.s3e=='':
moves = '1g4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1g)and Bboard.b5c==''\
and board.s2f+board.s3e+board.s4d=='':
moves = '1g5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1g)and Bboard.b6b==''\
and board.s2f+board.s3e+board.s4d+board.s5c=='':
moves = '1g6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1g)and Bboard.b7a==''\
and board.s2f+board.s3e+board.s4d+board.s5c+board.s6b=='':
moves = '1g7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b2g !='':
if re.match(r'[PLSGRK+]', Bboard.b2g)and Bboard.b2f=='':
moves = '2g2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2g)and Bboard.b1f=='':
moves = '2g1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2g)and Bboard.b3f=='':
moves = '2g3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2g)and Bboard.b1g=='':
moves = '2g1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2g)and Bboard.b3g=='':
moves = '2g3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2g)and Bboard.b2h=='':
moves = '2g2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b2g)and Bboard.b1h=='':
moves = '2g1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b2g)and Bboard.b3h=='':
moves = '2g3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2g)and Bboard.b1e=='':
moves = '2g1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2g)and Bboard.b3e=='':
moves = '2g3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2g)and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f=='':
moves = '2g2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2g)and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f=='':
moves = '2g2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2g)and Bboard.b2b==''\
and board.s2c+board.s2d+board.s2e+board.s2f=='':
moves = '2g2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2g)and Bboard.b2b==''\
and board.s2c+board.s2d+board.s2e+board.s2f=='':
moves = '2g2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b2g)and Bboard.b2c==''\
and board.s2d+board.s2e+board.s2f=='':
moves = '2g2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2g)and Bboard.b2c==''\
and board.s2d+board.s2e+board.s2f=='':
moves = '2g2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2g)and Bboard.b2d==''\
and board.s2e+board.s2f=='':
moves = '2g2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2g)and Bboard.b2e==''\
and board.s2f=='':
moves = '2g2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2g)and Bboard.b2i==''\
and board.s2h=='':
moves = '2g2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2g)and Bboard.b4g==''\
and board.s3g=='':
moves = '2g4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2g)and Bboard.b5g==''\
and board.s3g+board.s4g=='':
moves = '2g5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2g)and Bboard.b6g==''\
and board.s3g+board.s4g+board.s5g=='':
moves = '2g6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2g)and Bboard.b7g==''\
and board.s3g+board.s4g+board.s5g+board.s6g=='':
moves = '2g7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2g)and Bboard.b8g==''\
and board.s3g+board.s4g+board.s5g+board.s6g+board.s7g=='':
moves = '2g8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2g)and Bboard.b9g==''\
and board.s3g+board.s4g+board.s5g+board.s6g+board.s7g+board.s8g=='':
moves = '2g9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2g)and Bboard.b6c==''\
and board.s3f+board.s4e+board.s5d=='':
moves = '2g6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2g)and Bboard.b7b==''\
and board.s3f+board.s4e+board.s5d+board.s6c=='':
moves = '2g7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2g)and Bboard.b8a==''\
and board.s3f+board.s4e+board.s5d+board.s6c+board.s7b=='':
moves = '2g8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2g)and Bboard.b4e==''\
and board.s3f=='':
moves = '2g4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2g)and Bboard.b5d==''\
and board.s3f+board.s4e=='':
moves = '2g5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2g)and Bboard.b6c==''\
and board.s3f+board.s4e+board.s5d=='':
moves = '2g6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2g)and Bboard.b7b==''\
and board.s3f+board.s4e+board.s5d+board.s6c=='':
moves = '2g7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2g)and Bboard.b8a==''\
and board.s3f+board.s4e+board.s5d+board.s6c+board.s7b=='':
moves = '2g8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2g)and Bboard.b4i==''\
and board.s3h=='':
moves = '2g4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b3g !='':
if re.match(r'[PLSGRK+]', Bboard.b3g)and Bboard.b3f=='':
moves = '3g3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3g)and Bboard.b2f=='':
moves = '3g2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3g)and Bboard.b4f=='':
moves = '3g4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3g)and Bboard.b2g=='':
moves = '3g2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3g)and Bboard.b4g=='':
moves = '3g4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3g)and Bboard.b3h=='':
moves = '3g3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b3g)and Bboard.b2h=='':
moves = '3g2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b3g)and Bboard.b4h=='':
moves = '3g4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3g)and Bboard.b2e=='':
moves = '3g2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3g)and Bboard.b4e=='':
moves = '3g4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3g)and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f=='':
moves = '3g3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3g)and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f=='':
moves = '3g3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3g)and Bboard.b3b==''\
and board.s3c+board.s3d+board.s3e+board.s3f=='':
moves = '3g3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3g)and Bboard.b3b==''\
and board.s3c+board.s3d+board.s3e+board.s3f=='':
moves = '3g3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b3g)and Bboard.b3c==''\
and board.s3d+board.s3e+board.s3f=='':
moves = '3g3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3g)and Bboard.b3c==''\
and board.s3d+board.s3e+board.s3f=='':
moves = '3g3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3g)and Bboard.b3d==''\
and board.s3e+board.s3f=='':
moves = '3g3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3g)and Bboard.b3e==''\
and board.s3f=='':
moves = '3g3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3g)and Bboard.b3i==''\
and board.s3h=='':
moves = '3g3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3g)and Bboard.b1g==''\
and board.s2g=='':
moves = '3g1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3g)and Bboard.b5g==''\
and board.s4g=='':
moves = '3g5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3g)and Bboard.b6g==''\
and board.s4g+board.s5g=='':
moves = '3g6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3g)and Bboard.b7g==''\
and board.s4g+board.s5g+board.s6g=='':
moves = '3g7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3g)and Bboard.b8g==''\
and board.s4g+board.s5g+board.s6g+board.s7g=='':
moves = '3g8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3g)and Bboard.b9g==''\
and board.s4g+board.s5g+board.s6g+board.s7g+board.s8g=='':
moves = '3g9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3g)and Bboard.b7c==''\
and board.s4f+board.s5e+board.s6d=='':
moves = '3g7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3g)and Bboard.b8b==''\
and board.s4f+board.s5e+board.s6d+board.s7c=='':
moves = '3g8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3g)and Bboard.b9a==''\
and board.s4f+board.s5e+board.s6d+board.s7c+board.s8b=='':
moves = '3g9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3g)and Bboard.b1i==''\
and board.s2h=='':
moves = '3g1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3g)and Bboard.b5e==''\
and board.s4f=='':
moves = '3g5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3g)and Bboard.b6d==''\
and board.s4f+board.s5e=='':
moves = '3g6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3g)and Bboard.b7c==''\
and board.s4f+board.s5e+board.s6d=='':
moves = '3g7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3g)and Bboard.b8b==''\
and board.s4f+board.s5e+board.s6d+board.s7c=='':
moves = '3g8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3g)and Bboard.b9a==''\
and board.s4f+board.s5e+board.s6d+board.s7c+board.s8b=='':
moves = '3g9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3g)and Bboard.b5i==''\
and board.s4h=='':
moves = '3g5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3g)and Bboard.b1e==''\
and board.s2f=='':
moves = '3g1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b4g !='':
if re.match(r'[PLSGRK+]', Bboard.b4g)and Bboard.b4f=='':
moves = '4g4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4g)and Bboard.b3f=='':
moves = '4g3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4g)and Bboard.b5f=='':
moves = '4g5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4g)and Bboard.b3g=='':
moves = '4g3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4g)and Bboard.b5g=='':
moves = '4g5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4g)and Bboard.b4h=='':
moves = '4g4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b4g)and Bboard.b3h=='':
moves = '4g3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b4g)and Bboard.b5h=='':
moves = '4g5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4g)and Bboard.b3e=='':
moves = '4g3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4g)and Bboard.b5e=='':
moves = '4g5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4g)and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f=='':
moves = '4g4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4g)and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f=='':
moves = '4g4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4g)and Bboard.b4b==''\
and board.s4c+board.s4d+board.s4e+board.s4f=='':
moves = '4g4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4g)and Bboard.b4b==''\
and board.s4c+board.s4d+board.s4e+board.s4f=='':
moves = '4g4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b4g)and Bboard.b4c==''\
and board.s4d+board.s4e+board.s4f=='':
moves = '4g4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4g)and Bboard.b4c==''\
and board.s4d+board.s4e+board.s4f=='':
moves = '4g4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4g)and Bboard.b4d==''\
and board.s4e+board.s4f=='':
moves = '4g4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4g)and Bboard.b4e==''\
and board.s4f=='':
moves = '4g4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4g)and Bboard.b4i==''\
and board.s4h=='':
moves = '4g4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4g)and Bboard.b1g==''\
and board.s2g+board.s3g=='':
moves = '4g1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4g)and Bboard.b2g==''\
and board.s3g=='':
moves = '4g2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4g)and Bboard.b6g==''\
and board.s5g=='':
moves = '4g6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4g)and Bboard.b7g==''\
and board.s5g+board.s6g=='':
moves = '4g7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4g)and Bboard.b8g==''\
and board.s5g+board.s6g+board.s7g=='':
moves = '4g8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4g)and Bboard.b9g==''\
and board.s5g+board.s6g+board.s7g+board.s8g=='':
moves = '4g9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4g)and Bboard.b6e==''\
and board.s5f=='':
moves = '4g6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4g)and Bboard.b7d==''\
and board.s5f+board.s6e=='':
moves = '4g7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4g)and Bboard.b8c==''\
and board.s5f+board.s6e+board.s7d=='':
moves = '4g8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4g)and Bboard.b9b==''\
and board.s5f+board.s6e+board.s7d+board.s8c=='':
moves = '4g9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4g)and Bboard.b8c==''\
and board.s5f+board.s6e+board.s7d=='':
moves = '4g8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4g)and Bboard.b9b==''\
and board.s5f+board.s6e+board.s7d+board.s8c=='':
moves = '4g9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4g)and Bboard.b1d==''\
and board.s2e+board.s3f=='':
moves = '4g1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4g)and Bboard.b2e==''\
and board.s3f=='':
moves = '4g2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4g)and Bboard.b2i==''\
and board.s3h=='':
moves = '4g2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4g)and Bboard.b6i==''\
and board.s5h=='':
moves = '4g6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 5g: try each pseudo-legal move; keep it only if kaihimore() leaves no check (oute.oute == 0)
if Bboard.b5g !='':
if re.match(r'[PLSGRK+]', Bboard.b5g)and Bboard.b5f=='':
moves = '5g5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5g)and Bboard.b4f=='':
moves = '5g4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5g)and Bboard.b6f=='':
moves = '5g6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5g)and Bboard.b4g=='':
moves = '5g4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5g)and Bboard.b6g=='':
moves = '5g6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5g)and Bboard.b5h=='':
moves = '5g5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b5g)and Bboard.b4h=='':
moves = '5g4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b5g)and Bboard.b6h=='':
moves = '5g6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5g)and Bboard.b4e=='':
moves = '5g4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5g)and Bboard.b6e=='':
moves = '5g6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5g)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f=='':
moves = '5g5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5g)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f=='':
moves = '5g5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5g)and Bboard.b5b==''\
and board.s5c+board.s5d+board.s5e+board.s5f=='':
moves = '5g5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5g)and Bboard.b5b==''\
and board.s5c+board.s5d+board.s5e+board.s5f=='':
moves = '5g5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b5g)and Bboard.b5c==''\
and board.s5d+board.s5e+board.s5f=='':
moves = '5g5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5g)and Bboard.b5c==''\
and board.s5d+board.s5e+board.s5f=='':
moves = '5g5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5g)and Bboard.b5d==''\
and board.s5e+board.s5f=='':
moves = '5g5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5g)and Bboard.b5e==''\
and board.s5f=='':
moves = '5g5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5g)and Bboard.b5i==''\
and board.s5h=='':
moves = '5g5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5g)and Bboard.b1g==''\
and board.s2g+board.s3g+board.s4g=='':
moves = '5g1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5g)and Bboard.b2g==''\
and board.s3g+board.s4g=='':
moves = '5g2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5g)and Bboard.b3g==''\
and board.s4g=='':
moves = '5g3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5g)and Bboard.b7g==''\
and board.s6g=='':
moves = '5g7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5g)and Bboard.b8g==''\
and board.s6g+board.s7g=='':
moves = '5g8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5g)and Bboard.b9g==''\
and board.s6g+board.s7g+board.s8g=='':
moves = '5g9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5g)and Bboard.b7e==''\
and board.s6f=='':
moves = '5g7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5g)and Bboard.b8d==''\
and board.s6f+board.s7e=='':
moves = '5g8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5g)and Bboard.b9c==''\
and board.s6f+board.s7e+board.s8d=='':
moves = '5g9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5g)and Bboard.b9c==''\
and board.s6f+board.s7e+board.s8d=='':
moves = '5g9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5g)and Bboard.b2d==''\
and board.s3e+board.s4f=='':
moves = '5g2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5g)and Bboard.b3e==''\
and board.s4f=='':
moves = '5g3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b5g)and Bboard.b1c==''\
and board.s4f+board.s3e+board.s2d=='':
moves = '5g1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b5g)and Bboard.b1c==''\
and board.s4f+board.s3e+board.s2d=='':
moves = '5g1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5g)and Bboard.b3i==''\
and board.s4h=='':
moves = '5g3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5g)and Bboard.b7i==''\
and board.s6h=='':
moves = '5g7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 6g: try each pseudo-legal move; keep it only if kaihimore() leaves no check (oute.oute == 0)
if Bboard.b6g !='':
if re.match(r'[PLSGRK+]', Bboard.b6g)and Bboard.b6f=='':
moves = '6g6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6g)and Bboard.b5f=='':
moves = '6g5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6g)and Bboard.b7f=='':
moves = '6g7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6g)and Bboard.b5g=='':
moves = '6g5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6g)and Bboard.b7g=='':
moves = '6g7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6g)and Bboard.b6h=='':
moves = '6g6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b6g)and Bboard.b5h=='':
moves = '6g5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b6g)and Bboard.b7h=='':
moves = '6g7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6g)and Bboard.b5e=='':
moves = '6g5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6g)and Bboard.b7e=='':
moves = '6g7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6g)and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f=='':
moves = '6g6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6g)and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f=='':
moves = '6g6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6g)and Bboard.b6b==''\
and board.s6c+board.s6d+board.s6e+board.s6f=='':
moves = '6g6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6g)and Bboard.b6b==''\
and board.s6c+board.s6d+board.s6e+board.s6f=='':
moves = '6g6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b6g)and Bboard.b6c==''\
and board.s6d+board.s6e+board.s6f=='':
moves = '6g6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6g)and Bboard.b6c==''\
and board.s6d+board.s6e+board.s6f=='':
moves = '6g6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6g)and Bboard.b6d==''\
and board.s6e+board.s6f=='':
moves = '6g6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6g)and Bboard.b6e==''\
and board.s6f=='':
moves = '6g6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6g)and Bboard.b6i==''\
and board.s6h=='':
moves = '6g6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6g)and Bboard.b9g==''\
and board.s8g+board.s7g=='':
moves = '6g9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6g)and Bboard.b8g==''\
and board.s7g=='':
moves = '6g8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6g)and Bboard.b4g==''\
and board.s5g=='':
moves = '6g4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6g)and Bboard.b3g==''\
and board.s5g+board.s4g=='':
moves = '6g3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6g)and Bboard.b2g==''\
and board.s5g+board.s4g+board.s3g=='':
moves = '6g2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6g)and Bboard.b1g==''\
and board.s5g+board.s4g+board.s3g+board.s2g=='':
moves = '6g1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6g)and Bboard.b4e==''\
and board.s5f=='':
moves = '6g4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6g)and Bboard.b3d==''\
and board.s5f+board.s4e=='':
moves = '6g3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6g)and Bboard.b2c==''\
and board.s5f+board.s4e+board.s3d=='':
moves = '6g2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6g)and Bboard.b1b==''\
and board.s5f+board.s4e+board.s3d+board.s2c=='':
moves = '6g1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6g)and Bboard.b2c==''\
and board.s5f+board.s4e+board.s3d=='':
moves = '6g2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6g)and Bboard.b1b==''\
and board.s5f+board.s4e+board.s3d+board.s2c=='':
moves = '6g1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6g)and Bboard.b9d==''\
and board.s8e+board.s7f=='':
moves = '6g9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6g)and Bboard.b8e==''\
and board.s7f=='':
moves = '6g8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6g)and Bboard.b8i==''\
and board.s7h=='':
moves = '6g8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6g)and Bboard.b4i==''\
and board.s5h=='':
moves = '6g4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 7g: try each pseudo-legal move; keep it only if kaihimore() leaves no check (oute.oute == 0)
if Bboard.b7g !='':
if re.match(r'[PLSGRK+]', Bboard.b7g)and Bboard.b7f=='':
moves = '7g7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7g)and Bboard.b6f=='':
moves = '7g6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7g)and Bboard.b8f=='':
moves = '7g8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7g)and Bboard.b6g=='':
moves = '7g6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7g)and Bboard.b8g=='':
moves = '7g8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7g)and Bboard.b7h=='':
moves = '7g7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b7g)and Bboard.b6h=='':
moves = '7g6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b7g)and Bboard.b8h=='':
moves = '7g8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7g)and Bboard.b6e=='':
moves = '7g6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7g)and Bboard.b8e=='':
moves = '7g8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7g)and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f=='':
moves = '7g7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7g)and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f=='':
moves = '7g7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7g)and Bboard.b7b==''\
and board.s7c+board.s7d+board.s7e+board.s7f=='':
moves = '7g7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7g)and Bboard.b7b==''\
and board.s7c+board.s7d+board.s7e+board.s7f=='':
moves = '7g7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b7g)and Bboard.b7c==''\
and board.s7d+board.s7e+board.s7f=='':
moves = '7g7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7g)and Bboard.b7c==''\
and board.s7d+board.s7e+board.s7f=='':
moves = '7g7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7g)and Bboard.b7d==''\
and board.s7e+board.s7f=='':
moves = '7g7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7g)and Bboard.b7e==''\
and board.s7f=='':
moves = '7g7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7g)and Bboard.b7i==''\
and board.s7h=='':
moves = '7g7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7g)and Bboard.b9g==''\
and board.s8g=='':
moves = '7g9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7g)and Bboard.b5g==''\
and board.s6g=='':
moves = '7g5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7g)and Bboard.b4g==''\
and board.s6g+board.s5g=='':
moves = '7g4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7g)and Bboard.b3g==''\
and board.s6g+board.s5g+board.s4g=='':
moves = '7g3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7g)and Bboard.b2g==''\
and board.s6g+board.s5g+board.s4g+board.s3g=='':
moves = '7g2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7g)and Bboard.b1g==''\
and board.s6g+board.s5g+board.s4g+board.s3g+board.s2g=='':
moves = '7g1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7g)and Bboard.b3c==''\
and board.s6f+board.s5e+board.s4d=='':
moves = '7g3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7g)and Bboard.b2b==''\
and board.s6f+board.s5e+board.s4d+board.s3c=='':
moves = '7g2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7g)and Bboard.b1a==''\
and board.s6f+board.s5e+board.s4d+board.s3c+board.s2b=='':
moves = '7g1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7g)and Bboard.b9i==''\
and board.s8h=='':
moves = '7g9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7g)and Bboard.b5e==''\
and board.s6f=='':
moves = '7g5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7g)and Bboard.b4d==''\
and board.s6f+board.s5e=='':
moves = '7g4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7g)and Bboard.b3c==''\
and board.s6f+board.s5e+board.s4d=='':
moves = '7g3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7g)and Bboard.b2b==''\
and board.s6f+board.s5e+board.s4d+board.s3c=='':
moves = '7g2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7g)and Bboard.b1a==''\
and board.s6f+board.s5e+board.s4d+board.s3c+board.s2b=='':
moves = '7g1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7g)and Bboard.b5i==''\
and board.s6h=='':
moves = '7g5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7g)and Bboard.b9e==''\
and board.s8f=='':
moves = '7g9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 8g: try each pseudo-legal move; keep it only if kaihimore() leaves no check (oute.oute == 0)
if Bboard.b8g !='':
if re.match(r'[PLSGRK+]', Bboard.b8g)and Bboard.b8f=='':
moves = '8g8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8g)and Bboard.b7f=='':
moves = '8g7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8g)and Bboard.b9f=='':
moves = '8g9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8g)and Bboard.b7g=='':
moves = '8g7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8g)and Bboard.b9g=='':
moves = '8g9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8g)and Bboard.b8h=='':
moves = '8g8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b8g)and Bboard.b7h=='':
moves = '8g7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b8g)and Bboard.b9h=='':
moves = '8g9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8g)and Bboard.b7e=='':
moves = '8g7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8g)and Bboard.b9e=='':
moves = '8g9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8g)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f=='':
moves = '8g8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8g)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f=='':
moves = '8g8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8g)and Bboard.b8b==''\
and board.s8c+board.s8d+board.s8e+board.s8f=='':
moves = '8g8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8g)and Bboard.b8b==''\
and board.s8c+board.s8d+board.s8e+board.s8f=='':
moves = '8g8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b8g)and Bboard.b8c==''\
and board.s8d+board.s8e+board.s8f=='':
moves = '8g8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8g)and Bboard.b8c==''\
and board.s8d+board.s8e+board.s8f=='':
moves = '8g8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8g)and Bboard.b8d==''\
and board.s8e+board.s8f=='':
moves = '8g8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8g)and Bboard.b8e==''\
and board.s8f=='':
moves = '8g8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8g)and Bboard.b8i==''\
and board.s8h=='':
moves = '8g8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8g)and Bboard.b6g==''\
and board.s7g=='':
moves = '8g6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8g)and Bboard.b5g==''\
and board.s7g+board.s6g=='':
moves = '8g5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8g)and Bboard.b4g==''\
and board.s7g+board.s6g+board.s5g=='':
moves = '8g4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8g)and Bboard.b3g==''\
and board.s7g+board.s6g+board.s5g+board.s4g=='':
moves = '8g3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8g)and Bboard.b2g==''\
and board.s7g+board.s6g+board.s5g+board.s4g+board.s3g=='':
moves = '8g2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8g)and Bboard.b1g==''\
and board.s7g+board.s6g+board.s5g+board.s4g+board.s3g+board.s2g=='':
moves = '8g1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8g)and Bboard.b4c==''\
and board.s7f+board.s6e+board.s5d=='':
moves = '8g4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8g)and Bboard.b3b==''\
and board.s7f+board.s6e+board.s5d+board.s4c=='':
moves = '8g3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8g)and Bboard.b2a==''\
and board.s7f+board.s6e+board.s5d+board.s4c+board.s3b=='':
moves = '8g2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8g)and Bboard.b6e==''\
and board.s7f=='':
moves = '8g6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8g)and Bboard.b5d==''\
and board.s7f+board.s6e=='':
moves = '8g5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8g)and Bboard.b4c==''\
and board.s7f+board.s6e+board.s5d=='':
moves = '8g4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8g)and Bboard.b3b==''\
and board.s7f+board.s6e+board.s5d+board.s4c=='':
moves = '8g3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8g)and Bboard.b2a==''\
and board.s7f+board.s6e+board.s5d+board.s4c+board.s3b=='':
moves = '8g2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8g)and Bboard.b6i==''\
and board.s7h=='':
moves = '8g6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 9g: try each pseudo-legal move; keep it only if kaihimore() leaves no check (oute.oute == 0)
if Bboard.b9g !='':
if re.match(r'[PLSGRK+]', Bboard.b9g)and Bboard.b9f=='':
moves = '9g9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b9g)and Bboard.b8f=='':
moves = '9g8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9g)and Bboard.b8g=='':
moves = '9g8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9g)and Bboard.b9h=='':
moves = '9g9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b9g)and Bboard.b8h=='':
moves = '9g8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b9g)and Bboard.b8e=='':
moves = '9g8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9g)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f=='':
moves = '9g9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9g)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f=='':
moves = '9g9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9g)and Bboard.b9b==''\
and board.s9c+board.s9d+board.s9e+board.s9f=='':
moves = '9g9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9g)and Bboard.b9b==''\
and board.s9c+board.s9d+board.s9e+board.s9f=='':
moves = '9g9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b9g)and Bboard.b9c==''\
and board.s9d+board.s9e+board.s9f=='':
moves = '9g9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9g)and Bboard.b9c==''\
and board.s9d+board.s9e+board.s9f=='':
moves = '9g9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9g)and Bboard.b9d==''\
and board.s9e+board.s9f=='':
moves = '9g9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9g)and Bboard.b9e==''\
and board.s9f=='':
moves = '9g9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9g)and Bboard.b9i==''\
and board.s9h=='':
moves = '9g9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9g)and Bboard.b7g==''\
and board.s8g=='':
moves = '9g7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9g)and Bboard.b6g==''\
and board.s8g+board.s7g=='':
moves = '9g6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9g)and Bboard.b5g==''\
and board.s8g+board.s7g+board.s6g=='':
moves = '9g5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9g)and Bboard.b4g==''\
and board.s8g+board.s7g+board.s6g+board.s5g=='':
moves = '9g4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9g)and Bboard.b3g==''\
and board.s8g+board.s7g+board.s6g+board.s5g+board.s4g=='':
moves = '9g3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9g)and Bboard.b2g==''\
and board.s8g+board.s7g+board.s6g+board.s5g+board.s4g+board.s3g=='':
moves = '9g2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9g)and Bboard.b1g==''\
and board.s8g+board.s7g+board.s6g+board.s5g+board.s4g+board.s3g+board.s2g=='':
moves = '9g1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9g)and Bboard.b5c==''\
and board.s8f+board.s7e+board.s6d=='':
moves = '9g5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9g)and Bboard.b4b==''\
and board.s8f+board.s7e+board.s6d+board.s5c=='':
moves = '9g4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9g)and Bboard.b3a==''\
and board.s8f+board.s7e+board.s6d+board.s5c+board.s4b=='':
moves = '9g3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9g)and Bboard.b7i==''\
and board.s8h=='':
moves = '9g7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9g)and Bboard.b7e==''\
and board.s8f=='':
moves = '9g7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9g)and Bboard.b6d==''\
and board.s8f+board.s7e=='':
moves = '9g6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9g)and Bboard.b5c==''\
and board.s8f+board.s7e+board.s6d=='':
moves = '9g5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9g)and Bboard.b4b==''\
and board.s8f+board.s7e+board.s6d+board.s5c=='':
moves = '9g4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9g)and Bboard.b3a==''\
and board.s8f+board.s7e+board.s6d+board.s5c+board.s4b=='':
moves = '9g3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Piece on 1h: try each pseudo-legal move; keep it only if kaihimore() leaves no check (oute.oute == 0)
if Bboard.b1h !='':
if re.match(r'[PLSGRK+]', Bboard.b1h)and Bboard.b1g=='':
moves = '1h1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b1h)and Bboard.b2g=='':
moves = '1h2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1h)and Bboard.b2h=='':
moves = '1h2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1h)and Bboard.b1i=='':
moves = '1h1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b1h)and Bboard.b2i=='':
moves = '1h2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b1h)and Bboard.b2f=='':
moves = '1h2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1h)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1h1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1h)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1h1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1h)and Bboard.b1b==''\
and board.s1c+board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1h1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1h)and Bboard.b1b==''\
and board.s1c+board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1h1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b1h)and Bboard.b1c==''\
and board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1h1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1h)and Bboard.b1c==''\
and board.s1d+board.s1e+board.s1f+board.s1g=='':
moves = '1h1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1h)and Bboard.b1d==''\
and board.s1e+board.s1f+board.s1g=='':
moves = '1h1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1h)and Bboard.b1e==''\
and board.s1f+board.s1g=='':
moves = '1h1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1h)and Bboard.b1f==''\
and board.s1g=='':
moves = '1h1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1h)and Bboard.b3h==''\
and board.s2h=='':
moves = '1h3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1h)and Bboard.b4h==''\
and board.s2h+board.s3h=='':
moves = '1h4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1h)and Bboard.b5h==''\
and board.s2h+board.s3h+board.s4h=='':
moves = '1h5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1h)and Bboard.b6h==''\
and board.s2h+board.s3h+board.s4h+board.s5h=='':
moves = '1h6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1h)and Bboard.b7h==''\
and board.s2h+board.s3h+board.s4h+board.s5h+board.s6h=='':
moves = '1h7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1h)and Bboard.b8h==''\
and board.s2h+board.s3h+board.s4h+board.s5h+board.s6h+board.s7h=='':
moves = '1h8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1h)and Bboard.b9h==''\
and board.s2h+board.s3h+board.s4h+board.s5h+board.s6h+board.s7h+board.s8h=='':
moves = '1h9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1h)and Bboard.b6c==''\
and board.s2g+board.s3f+board.s4e+board.s5d=='':
moves = '1h6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1h)and Bboard.b7b==''\
and board.s2g+board.s3f+board.s4e+board.s5d+board.s6c=='':
moves = '1h7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1h)and Bboard.b8a==''\
and board.s2g+board.s3f+board.s4e+board.s5d+board.s6c+board.s7b=='':
moves = '1h8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1h)and Bboard.b3f==''\
and board.s2g=='':
moves = '1h3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1h)and Bboard.b4e==''\
and board.s2g+board.s3f=='':
moves = '1h4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1h)and Bboard.b5d==''\
and board.s2g+board.s3f+board.s4e=='':
moves = '1h5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1h)and Bboard.b6c==''\
and board.s2g+board.s3f+board.s4e+board.s5d=='':
moves = '1h6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1h)and Bboard.b7b==''\
and board.s2g+board.s3f+board.s4e+board.s5d+board.s6c=='':
moves = '1h7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1h)and Bboard.b8a==''\
and board.s2g+board.s3f+board.s4e+board.s5d+board.s6c+board.s7b=='':
moves = '1h8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
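The per-square blocks above all repeat one pattern: a regex on the piece standing on the origin square, an empty destination, and an empty run of intermediate squares. As a hedged sketch only (it uses a toy `squares` dict and a hypothetical `slide_moves` helper, not the real `Bboard`/`board` objects), the rook-, lance-, and bishop-style scans could be table-driven:

```python
import re

def slide_moves(squares, frm, step, piece_re, promote=False):
    """Hypothetical helper: walk from `frm` in direction `step`
    ((dfile, drank) on the 9x9 board), emitting USI-style moves until
    the board edge or an occupied square stops the slide."""
    files, ranks = '123456789', 'abcdefghi'
    if not re.match(piece_re, squares.get(frm, '')):
        return []                      # wrong piece on the origin square
    f, r = files.index(frm[0]), ranks.index(frm[1])
    out = []
    while True:
        f += step[0]
        r += step[1]
        if not (0 <= f < 9 and 0 <= r < 9):
            break                      # ran off the board
        to = files[f] + ranks[r]
        if squares.get(to, ''):
            break                      # blocked by any piece
        out.append(frm + to + ('+' if promote else ''))
    return out

# e.g. a rook on 1h sliding up the 1-file until a pawn on 1e blocks it:
# slide_moves({'1h': 'R', '1e': 'P'}, '1h', (0, -1), r'\+R|R|L')
# → ['1h1g', '1h1f']
```

Note the sketch deliberately omits the promotion-zone logic handled above (where an unpromoted R/L reaching ranks a-c is emitted with a forced `+`); a full replacement would gate `promote` on the destination rank.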
if Bboard.b2h !='':
if re.match(r'[PLSGRK+]', Bboard.b2h)and Bboard.b2g=='':
moves = '2h2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2h)and Bboard.b1g=='':
moves = '2h1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2h)and Bboard.b3g=='':
moves = '2h3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2h)and Bboard.b1h=='':
moves = '2h1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2h)and Bboard.b3h=='':
moves = '2h3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2h)and Bboard.b2i=='':
moves = '2h2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b2h)and Bboard.b1i=='':
moves = '2h1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b2h)and Bboard.b3i=='':
moves = '2h3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2h)and Bboard.b1f=='':
moves = '2h1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2h)and Bboard.b3f=='':
moves = '2h3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2h)and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2h2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2h)and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2h2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2h)and Bboard.b2b==''\
and board.s2c+board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2h2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2h)and Bboard.b2b==''\
and board.s2c+board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2h2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b2h)and Bboard.b2c==''\
and board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2h2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2h)and Bboard.b2c==''\
and board.s2d+board.s2e+board.s2f+board.s2g=='':
moves = '2h2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2h)and Bboard.b2d==''\
and board.s2e+board.s2f+board.s2g=='':
moves = '2h2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2h)and Bboard.b2e==''\
and board.s2f+board.s2g=='':
moves = '2h2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2h)and Bboard.b2f==''\
and board.s2g=='':
moves = '2h2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2h)and Bboard.b4h==''\
and board.s3h=='':
moves = '2h4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2h)and Bboard.b5h==''\
and board.s3h+board.s4h=='':
moves = '2h5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2h)and Bboard.b6h==''\
and board.s3h+board.s4h+board.s5h=='':
moves = '2h6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2h)and Bboard.b7h==''\
and board.s3h+board.s4h+board.s5h+board.s6h=='':
moves = '2h7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2h)and Bboard.b8h==''\
and board.s3h+board.s4h+board.s5h+board.s6h+board.s7h=='':
moves = '2h8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2h)and Bboard.b9h==''\
and board.s3h+board.s4h+board.s5h+board.s6h+board.s7h+board.s8h=='':
moves = '2h9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2h)and Bboard.b7c==''\
and board.s3g+board.s4f+board.s5e+board.s6d=='':
moves = '2h7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2h)and Bboard.b8b==''\
and board.s3g+board.s4f+board.s5e+board.s6d+board.s7c=='':
moves = '2h8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2h)and Bboard.b9a==''\
and board.s3g+board.s4f+board.s5e+board.s6d+board.s7c+board.s8b=='':
moves = '2h9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2h)and Bboard.b4f==''\
and board.s3g=='':
moves = '2h4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2h)and Bboard.b5e==''\
and board.s3g+board.s4f=='':
moves = '2h5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2h)and Bboard.b6d==''\
and board.s3g+board.s4f+board.s5e=='':
moves = '2h6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2h)and Bboard.b7c==''\
and board.s3g+board.s4f+board.s5e+board.s6d=='':
moves = '2h7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2h)and Bboard.b8b==''\
and board.s3g+board.s4f+board.s5e+board.s6d+board.s7c=='':
moves = '2h8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2h)and Bboard.b9a==''\
and board.s3g+board.s4f+board.s5e+board.s6d+board.s7c+board.s8b=='':
moves = '2h9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b3h !='':
if re.match(r'[PLSGRK+]', Bboard.b3h)and Bboard.b3g=='':
moves = '3h3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3h)and Bboard.b2g=='':
moves = '3h2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3h)and Bboard.b4g=='':
moves = '3h4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3h)and Bboard.b2h=='':
moves = '3h2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3h)and Bboard.b4h=='':
moves = '3h4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3h)and Bboard.b3i=='':
moves = '3h3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b3h)and Bboard.b2i=='':
moves = '3h2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b3h)and Bboard.b4i=='':
moves = '3h4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3h)and Bboard.b2f=='':
moves = '3h2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3h)and Bboard.b4f=='':
moves = '3h4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3h)and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3h3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3h)and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3h3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3h)and Bboard.b3b==''\
and board.s3c+board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3h3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3h)and Bboard.b3b==''\
and board.s3c+board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3h3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b3h)and Bboard.b3c==''\
and board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3h3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3h)and Bboard.b3c==''\
and board.s3d+board.s3e+board.s3f+board.s3g=='':
moves = '3h3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3h)and Bboard.b3d==''\
and board.s3e+board.s3f+board.s3g=='':
moves = '3h3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3h)and Bboard.b3e==''\
and board.s3f+board.s3g=='':
moves = '3h3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3h)and Bboard.b3f==''\
and board.s3g=='':
moves = '3h3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3h)and Bboard.b1h==''\
and board.s2h=='':
moves = '3h1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3h)and Bboard.b5h==''\
and board.s4h=='':
moves = '3h5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3h)and Bboard.b6h==''\
and board.s4h+board.s5h=='':
moves = '3h6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3h)and Bboard.b7h==''\
and board.s4h+board.s5h+board.s6h=='':
moves = '3h7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3h)and Bboard.b8h==''\
and board.s4h+board.s5h+board.s6h+board.s7h=='':
moves = '3h8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3h)and Bboard.b9h==''\
and board.s4h+board.s5h+board.s6h+board.s7h+board.s8h=='':
moves = '3h9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3h)and Bboard.b8c==''\
and board.s4g+board.s5f+board.s6e+board.s7d=='':
moves = '3h8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3h)and Bboard.b9b==''\
and board.s4g+board.s5f+board.s6e+board.s7d+board.s8c=='':
moves = '3h9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3h)and Bboard.b5f==''\
and board.s4g=='':
moves = '3h5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3h)and Bboard.b6e==''\
and board.s4g+board.s5f=='':
moves = '3h6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3h)and Bboard.b7d==''\
and board.s4g+board.s5f+board.s6e=='':
moves = '3h7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3h)and Bboard.b8c==''\
and board.s4g+board.s5f+board.s6e+board.s7d=='':
moves = '3h8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3h)and Bboard.b9b==''\
and board.s4g+board.s5f+board.s6e+board.s7d+board.s8c=='':
moves = '3h9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3h)and Bboard.b1f==''\
and board.s2g=='':
moves = '3h1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b4h !='':
if re.match(r'[PLSGRK+]', Bboard.b4h)and Bboard.b4g=='':
moves = '4h4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4h)and Bboard.b3g=='':
moves = '4h3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4h)and Bboard.b5g=='':
moves = '4h5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4h)and Bboard.b3h=='':
moves = '4h3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4h)and Bboard.b5h=='':
moves = '4h5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4h)and Bboard.b4i=='':
moves = '4h4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b4h)and Bboard.b3i=='':
moves = '4h3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b4h)and Bboard.b5i=='':
moves = '4h5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4h)and Bboard.b3f=='':
moves = '4h3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4h)and Bboard.b5f=='':
moves = '4h5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4h)and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4h4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4h)and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4h4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4h)and Bboard.b4b==''\
and board.s4c+board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4h4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4h)and Bboard.b4b==''\
and board.s4c+board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4h4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b4h)and Bboard.b4c==''\
and board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4h4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4h)and Bboard.b4c==''\
and board.s4d+board.s4e+board.s4f+board.s4g=='':
moves = '4h4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4h)and Bboard.b4d==''\
and board.s4e+board.s4f+board.s4g=='':
moves = '4h4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4h)and Bboard.b4e==''\
and board.s4f+board.s4g=='':
moves = '4h4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4h)and Bboard.b4f==''\
and board.s4g=='':
moves = '4h4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4h)and Bboard.b1h==''\
and board.s2h+board.s3h=='':
moves = '4h1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4h)and Bboard.b2h==''\
and board.s3h=='':
moves = '4h2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4h)and Bboard.b6h==''\
and board.s5h=='':
moves = '4h6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4h)and Bboard.b7h==''\
and board.s5h+board.s6h=='':
moves = '4h7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4h)and Bboard.b8h==''\
and board.s5h+board.s6h+board.s7h=='':
moves = '4h8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4h)and Bboard.b9h==''\
and board.s5h+board.s6h+board.s7h+board.s8h=='':
moves = '4h9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4h)and Bboard.b6f==''\
and board.s5g=='':
moves = '4h6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4h)and Bboard.b7e==''\
and board.s5g+board.s6f=='':
moves = '4h7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4h)and Bboard.b8d==''\
and board.s5g+board.s6f+board.s7e=='':
moves = '4h8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b4h)and Bboard.b9c==''\
and board.s5g+board.s6f+board.s7e+board.s8d=='':
moves = '4h9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b4h)and Bboard.b9c==''\
and board.s5g+board.s6f+board.s7e+board.s8d=='':
moves = '4h9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4h)and Bboard.b1e==''\
and board.s2f+board.s3g=='':
moves = '4h1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4h)and Bboard.b2f==''\
and board.s3g=='':
moves = '4h2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b5h !='':
if re.match(r'[PLSGRK+]', Bboard.b5h)and Bboard.b5g=='':
moves = '5h5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5h)and Bboard.b4g=='':
moves = '5h4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5h)and Bboard.b6g=='':
moves = '5h6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5h)and Bboard.b4h=='':
moves = '5h4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5h)and Bboard.b6h=='':
moves = '5h6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5h)and Bboard.b5i=='':
moves = '5h5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b5h)and Bboard.b4i=='':
moves = '5h4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b5h)and Bboard.b6i=='':
moves = '5h6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5h)and Bboard.b4f=='':
moves = '5h4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5h)and Bboard.b6f=='':
moves = '5h6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5h)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5h5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5h)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5h5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5h)and Bboard.b5b==''\
and board.s5c+board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5h5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5h)and Bboard.b5b==''\
and board.s5c+board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5h5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b5h)and Bboard.b5c==''\
and board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5h5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5h)and Bboard.b5c==''\
and board.s5d+board.s5e+board.s5f+board.s5g=='':
moves = '5h5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5h)and Bboard.b5d==''\
and board.s5e+board.s5f+board.s5g=='':
moves = '5h5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5h)and Bboard.b5e==''\
and board.s5f+board.s5g=='':
moves = '5h5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5h)and Bboard.b5f==''\
and board.s5g=='':
moves = '5h5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5h)and Bboard.b1h==''\
and board.s2h+board.s3h+board.s4h=='':
moves = '5h1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5h)and Bboard.b2h==''\
and board.s3h+board.s4h=='':
moves = '5h2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5h)and Bboard.b3h==''\
and board.s4h=='':
moves = '5h3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5h)and Bboard.b7h==''\
and board.s6h=='':
moves = '5h7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5h)and Bboard.b8h==''\
and board.s6h+board.s7h=='':
moves = '5h8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5h)and Bboard.b9h==''\
and board.s6h+board.s7h+board.s8h=='':
moves = '5h9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5h)and Bboard.b7f==''\
and board.s6g=='':
moves = '5h7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5h)and Bboard.b8e==''\
and board.s6g+board.s7f=='':
moves = '5h8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5h)and Bboard.b9d==''\
and board.s6g+board.s7f+board.s8e=='':
moves = '5h9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5h)and Bboard.b2e==''\
and board.s3f+board.s4g=='':
moves = '5h2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5h)and Bboard.b3f==''\
and board.s4g=='':
moves = '5h3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5h)and Bboard.b1d==''\
and board.s4g+board.s3f+board.s2e=='':
moves = '5h1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.b6h !='':
if re.match(r'[PLSGRK+]', Bboard.b6h)and Bboard.b6g=='':
moves = '6h6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6h)and Bboard.b5g=='':
moves = '6h5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6h)and Bboard.b7g=='':
moves = '6h7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6h)and Bboard.b5h=='':
moves = '6h5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6h)and Bboard.b7h=='':
moves = '6h7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6h)and Bboard.b6i=='':
moves = '6h6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b6h)and Bboard.b5i=='':
moves = '6h5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b6h)and Bboard.b7i=='':
moves = '6h7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6h)and Bboard.b5f=='':
moves = '6h5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6h)and Bboard.b7f=='':
moves = '6h7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6h)and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6h6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6h)and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6h6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6h)and Bboard.b6b==''\
and board.s6c+board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6h6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6h)and Bboard.b6b==''\
and board.s6c+board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6h6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b6h)and Bboard.b6c==''\
and board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6h6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6h)and Bboard.b6c==''\
and board.s6d+board.s6e+board.s6f+board.s6g=='':
moves = '6h6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6h)and Bboard.b6d==''\
and board.s6e+board.s6f+board.s6g=='':
moves = '6h6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6h)and Bboard.b6e==''\
and board.s6f+board.s6g=='':
moves = '6h6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6h)and Bboard.b6f==''\
and board.s6g=='':
moves = '6h6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6h)and Bboard.b9h==''\
and board.s8h+board.s7h=='':
moves = '6h9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6h)and Bboard.b8h==''\
and board.s7h=='':
moves = '6h8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6h)and Bboard.b4h==''\
and board.s5h=='':
moves = '6h4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6h)and Bboard.b3h==''\
and board.s5h+board.s4h=='':
moves = '6h3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6h)and Bboard.b2h==''\
and board.s5h+board.s4h+board.s3h=='':
moves = '6h2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6h)and Bboard.b1h==''\
and board.s5h+board.s4h+board.s3h+board.s2h=='':
moves = '6h1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6h)and Bboard.b4f==''\
and board.s5g=='':
moves = '6h4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6h)and Bboard.b3e==''\
and board.s5g+board.s4f=='':
moves = '6h3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6h)and Bboard.b2d==''\
and board.s5g+board.s4f+board.s3e=='':
moves = '6h2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b6h)and Bboard.b1c==''\
and board.s5g+board.s4f+board.s3e+board.s2d=='':
moves = '6h1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B', Bboard.b6h)and Bboard.b1c==''\
and board.s5g+board.s4f+board.s3e+board.s2d=='':
moves = '6h1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6h)and Bboard.b9e==''\
and board.s8f+board.s7g=='':
moves = '6h9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6h)and Bboard.b8f==''\
and board.s7g=='':
moves = '6h8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
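Every candidate in these blocks also runs the same three-line legality idiom: call `kaihimore(moves)`, test `oute.oute == 0`, then append to `depth1`. A hedged sketch of how that idiom could be factored out, with injected stand-ins for `kaihimore`, the `oute` check, and `depth1` (all parameter names here are hypothetical):

```python
def keep_legal(candidates, try_move, in_check, out):
    """Hypothetical stand-in for the repeated three-line idiom:
    `try_move` plays the candidate (like kaihimore), `in_check`
    reports whether the king is still attacked (like oute.oute != 0),
    and legal moves are appended to `out` (like depth1)."""
    for mv in candidates:
        try_move(mv)
        if not in_check():
            out.append(mv)
    return out

# e.g. with a toy rule that only moves ending in '+' leave the king safe:
# state = {}
# keep_legal(['6h6g', '6h6a+'],
#            lambda m: state.update(last=m),
#            lambda: not state['last'].endswith('+'),
#            [])
# → ['6h6a+']
```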
if Bboard.b7h !='':
if re.match(r'[PLSGRK+]', Bboard.b7h)and Bboard.b7g=='':
moves = '7h7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7h)and Bboard.b6g=='':
moves = '7h6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7h)and Bboard.b8g=='':
moves = '7h8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7h)and Bboard.b6h=='':
moves = '7h6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7h)and Bboard.b8h=='':
moves = '7h8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7h)and Bboard.b7i=='':
moves = '7h7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b7h)and Bboard.b6i=='':
moves = '7h6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b7h)and Bboard.b8i=='':
moves = '7h8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7h)and Bboard.b6f=='':
moves = '7h6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7h)and Bboard.b8f=='':
moves = '7h8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7h)and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7h7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7h)and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7h7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7h)and Bboard.b7b==''\
and board.s7c+board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7h7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7h)and Bboard.b7b==''\
and board.s7c+board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7h7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b7h)and Bboard.b7c==''\
and board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7h7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7h)and Bboard.b7c==''\
and board.s7d+board.s7e+board.s7f+board.s7g=='':
moves = '7h7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7h)and Bboard.b7d==''\
and board.s7e+board.s7f+board.s7g=='':
moves = '7h7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7h)and Bboard.b7e==''\
and board.s7f+board.s7g=='':
moves = '7h7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7h)and Bboard.b7f==''\
and board.s7g=='':
moves = '7h7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7h)and Bboard.b9h==''\
and board.s8h=='':
moves = '7h9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7h)and Bboard.b5h==''\
and board.s6h=='':
moves = '7h5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7h)and Bboard.b4h==''\
and board.s6h+board.s5h=='':
moves = '7h4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7h)and Bboard.b3h==''\
and board.s6h+board.s5h+board.s4h=='':
moves = '7h3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7h)and Bboard.b2h==''\
and board.s6h+board.s5h+board.s4h+board.s3h=='':
moves = '7h2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7h)and Bboard.b1h==''\
and board.s6h+board.s5h+board.s4h+board.s3h+board.s2h=='':
moves = '7h1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7h)and Bboard.b2c==''\
and board.s6g+board.s5f+board.s4e+board.s3d=='':
moves = '7h2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7h)and Bboard.b1b==''\
and board.s6g+board.s5f+board.s4e+board.s3d+board.s2c=='':
moves = '7h1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7h)and Bboard.b5f==''\
and board.s6g=='':
moves = '7h5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7h)and Bboard.b4e==''\
and board.s6g+board.s5f=='':
moves = '7h4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7h)and Bboard.b3d==''\
and board.s6g+board.s5f+board.s4e=='':
moves = '7h3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7h)and Bboard.b2c==''\
and board.s6g+board.s5f+board.s4e+board.s3d=='':
moves = '7h2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7h)and Bboard.b1b==''\
and board.s6g+board.s5f+board.s4e+board.s3d+board.s2c=='':
moves = '7h1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7h)and Bboard.b9f==''\
and board.s8g=='':
moves = '7h9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Candidate moves for the piece on 8h: each move is verified with kaihimore()
# and appended to depth1 only if the king is left out of check (oute.oute == 0).
if Bboard.b8h !='':
if re.match(r'[PLSGRK+]', Bboard.b8h)and Bboard.b8g=='':
moves = '8h8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8h)and Bboard.b7g=='':
moves = '8h7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8h)and Bboard.b9g=='':
moves = '8h9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8h)and Bboard.b7h=='':
moves = '8h7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8h)and Bboard.b9h=='':
moves = '8h9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8h)and Bboard.b8i=='':
moves = '8h8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b8h)and Bboard.b7i=='':
moves = '8h7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b8h)and Bboard.b9i=='':
moves = '8h9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8h)and Bboard.b7f=='':
moves = '8h7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8h)and Bboard.b9f=='':
moves = '8h9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8h)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8h8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8h)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8h8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8h)and Bboard.b8b==''\
and board.s8c+board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8h8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8h)and Bboard.b8b==''\
and board.s8c+board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8h8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b8h)and Bboard.b8c==''\
and board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8h8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8h)and Bboard.b8c==''\
and board.s8d+board.s8e+board.s8f+board.s8g=='':
moves = '8h8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8h)and Bboard.b8d==''\
and board.s8e+board.s8f+board.s8g=='':
moves = '8h8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8h)and Bboard.b8e==''\
and board.s8f+board.s8g=='':
moves = '8h8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8h)and Bboard.b8f==''\
and board.s8g=='':
moves = '8h8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8h)and Bboard.b6h==''\
and board.s7h=='':
moves = '8h6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8h)and Bboard.b5h==''\
and board.s7h+board.s6h=='':
moves = '8h5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8h)and Bboard.b4h==''\
and board.s7h+board.s6h+board.s5h=='':
moves = '8h4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8h)and Bboard.b3h==''\
and board.s7h+board.s6h+board.s5h+board.s4h=='':
moves = '8h3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8h)and Bboard.b2h==''\
and board.s7h+board.s6h+board.s5h+board.s4h+board.s3h=='':
moves = '8h2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8h)and Bboard.b1h==''\
and board.s7h+board.s6h+board.s5h+board.s4h+board.s3h+board.s2h=='':
moves = '8h1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8h)and Bboard.b3c==''\
and board.s7g+board.s6f+board.s5e+board.s4d=='':
moves = '8h3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8h)and Bboard.b2b==''\
and board.s7g+board.s6f+board.s5e+board.s4d+board.s3c=='':
moves = '8h2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8h)and Bboard.b1a==''\
and board.s7g+board.s6f+board.s5e+board.s4d+board.s3c+board.s2b=='':
moves = '8h1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8h)and Bboard.b6f==''\
and board.s7g=='':
moves = '8h6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8h)and Bboard.b5e==''\
and board.s7g+board.s6f=='':
moves = '8h5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8h)and Bboard.b4d==''\
and board.s7g+board.s6f+board.s5e=='':
moves = '8h4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8h)and Bboard.b3c==''\
and board.s7g+board.s6f+board.s5e+board.s4d=='':
moves = '8h3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8h)and Bboard.b2b==''\
and board.s7g+board.s6f+board.s5e+board.s4d+board.s3c=='':
moves = '8h2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8h)and Bboard.b1a==''\
and board.s7g+board.s6f+board.s5e+board.s4d+board.s3c+board.s2b=='':
moves = '8h1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Candidate moves for the piece on 9h
if Bboard.b9h !='':
if re.match(r'[PLSGRK+]', Bboard.b9h)and Bboard.b9g=='':
moves = '9h9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b9h)and Bboard.b8g=='':
moves = '9h8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9h)and Bboard.b8h=='':
moves = '9h8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9h)and Bboard.b9i=='':
moves = '9h9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|\+B|B|S|K',Bboard.b9h)and Bboard.b8i=='':
moves = '9h8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b9h)and Bboard.b8f=='':
moves = '9h8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9h)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9h9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9h)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9h9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9h)and Bboard.b9b==''\
and board.s9c+board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9h9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9h)and Bboard.b9b==''\
and board.s9c+board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9h9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b9h)and Bboard.b9c==''\
and board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9h9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9h)and Bboard.b9c==''\
and board.s9d+board.s9e+board.s9f+board.s9g=='':
moves = '9h9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9h)and Bboard.b9d==''\
and board.s9e+board.s9f+board.s9g=='':
moves = '9h9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9h)and Bboard.b9e==''\
and board.s9f+board.s9g=='':
moves = '9h9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9h)and Bboard.b9f==''\
and board.s9g=='':
moves = '9h9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9h)and Bboard.b7h==''\
and board.s8h=='':
moves = '9h7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9h)and Bboard.b6h==''\
and board.s8h+board.s7h=='':
moves = '9h6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9h)and Bboard.b5h==''\
and board.s8h+board.s7h+board.s6h=='':
moves = '9h5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9h)and Bboard.b4h==''\
and board.s8h+board.s7h+board.s6h+board.s5h=='':
moves = '9h4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9h)and Bboard.b3h==''\
and board.s8h+board.s7h+board.s6h+board.s5h+board.s4h=='':
moves = '9h3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9h)and Bboard.b2h==''\
and board.s8h+board.s7h+board.s6h+board.s5h+board.s4h+board.s3h=='':
moves = '9h2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9h)and Bboard.b1h==''\
and board.s8h+board.s7h+board.s6h+board.s5h+board.s4h+board.s3h+board.s2h=='':
moves = '9h1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9h)and Bboard.b4c==''\
and board.s8g+board.s7f+board.s6e+board.s5d=='':
moves = '9h4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9h)and Bboard.b3b==''\
and board.s8g+board.s7f+board.s6e+board.s5d+board.s4c=='':
moves = '9h3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9h)and Bboard.b2a==''\
and board.s8g+board.s7f+board.s6e+board.s5d+board.s4c+board.s3b=='':
moves = '9h2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9h)and Bboard.b7f==''\
and board.s8g=='':
moves = '9h7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9h)and Bboard.b6e==''\
and board.s8g+board.s7f=='':
moves = '9h6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9h)and Bboard.b5d==''\
and board.s8g+board.s7f+board.s6e=='':
moves = '9h5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9h)and Bboard.b4c==''\
and board.s8g+board.s7f+board.s6e+board.s5d=='':
moves = '9h4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9h)and Bboard.b3b==''\
and board.s8g+board.s7f+board.s6e+board.s5d+board.s4c=='':
moves = '9h3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9h)and Bboard.b2a==''\
and board.s8g+board.s7f+board.s6e+board.s5d+board.s4c+board.s3b=='':
moves = '9h2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Candidate moves for the piece on 1i
if Bboard.b1i !='':
if re.match(r'[PLSGRK+]', Bboard.b1i)and Bboard.b1h=='':
moves = '1i1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b1i)and Bboard.b2h=='':
moves = '1i2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b1i)and Bboard.b2i=='':
moves = '1i2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b1i)and Bboard.b2g=='':
moves = '1i2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1i)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1i1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1i)and Bboard.b1a==''\
and board.s1b+board.s1c+board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1i1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b1i)and Bboard.b1b==''\
and board.s1c+board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1i1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1i)and Bboard.b1b==''\
and board.s1c+board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1i1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b1i)and Bboard.b1c==''\
and board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1i1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b1i)and Bboard.b1c==''\
and board.s1d+board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1i1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1i)and Bboard.b1d==''\
and board.s1e+board.s1f+board.s1g+board.s1h=='':
moves = '1i1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1i)and Bboard.b1e==''\
and board.s1f+board.s1g+board.s1h=='':
moves = '1i1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1i)and Bboard.b1f==''\
and board.s1g+board.s1h=='':
moves = '1i1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b1i)and Bboard.b1g==''\
and board.s1h=='':
moves = '1i1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1i)and Bboard.b3i==''\
and board.s2i=='':
moves = '1i3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1i)and Bboard.b4i==''\
and board.s2i+board.s3i=='':
moves = '1i4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1i)and Bboard.b5i==''\
and board.s2i+board.s3i+board.s4i=='':
moves = '1i5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1i)and Bboard.b6i==''\
and board.s2i+board.s3i+board.s4i+board.s5i=='':
moves = '1i6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1i)and Bboard.b7i==''\
and board.s2i+board.s3i+board.s4i+board.s5i+board.s6i=='':
moves = '1i7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1i)and Bboard.b8i==''\
and board.s2i+board.s3i+board.s4i+board.s5i+board.s6i+board.s7i=='':
moves = '1i8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b1i)and Bboard.b9i==''\
and board.s2i+board.s3i+board.s4i+board.s5i+board.s6i+board.s7i+board.s8i=='':
moves = '1i9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1i)and Bboard.b7c==''\
and board.s2h+board.s3g+board.s4f+board.s5e+board.s6d=='':
moves = '1i7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1i)and Bboard.b8b==''\
and board.s2h+board.s3g+board.s4f+board.s5e+board.s6d+board.s7c=='':
moves = '1i8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b1i)and Bboard.b9a==''\
and board.s2h+board.s3g+board.s4f+board.s5e+board.s6d+board.s7c+board.s8b=='':
moves = '1i9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1i)and Bboard.b3g==''\
and board.s2h=='':
moves = '1i3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1i)and Bboard.b4f==''\
and board.s2h+board.s3g=='':
moves = '1i4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1i)and Bboard.b5e==''\
and board.s2h+board.s3g+board.s4f=='':
moves = '1i5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b1i)and Bboard.b6d==''\
and board.s2h+board.s3g+board.s4f+board.s5e=='':
moves = '1i6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1i)and Bboard.b7c==''\
and board.s2h+board.s3g+board.s4f+board.s5e+board.s6d=='':
moves = '1i7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1i)and Bboard.b8b==''\
and board.s2h+board.s3g+board.s4f+board.s5e+board.s6d+board.s7c=='':
moves = '1i8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b1i)and Bboard.b9a==''\
and board.s2h+board.s3g+board.s4f+board.s5e+board.s6d+board.s7c+board.s8b=='':
moves = '1i9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Candidate moves for the piece on 2i
if Bboard.b2i !='':
if re.match(r'[PLSGRK+]', Bboard.b2i)and Bboard.b2h=='':
moves = '2i2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2i)and Bboard.b1h=='':
moves = '2i1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b2i)and Bboard.b3h=='':
moves = '2i3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2i)and Bboard.b1i=='':
moves = '2i1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b2i)and Bboard.b3i=='':
moves = '2i3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2i)and Bboard.b1g=='':
moves = '2i1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b2i)and Bboard.b3g=='':
moves = '2i3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2i)and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2i2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2i)and Bboard.b2a==''\
and board.s2b+board.s2c+board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2i2a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b2i)and Bboard.b2b==''\
and board.s2c+board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2i2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2i)and Bboard.b2b==''\
and board.s2c+board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2i2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b2i)and Bboard.b2c==''\
and board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2i2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b2i)and Bboard.b2c==''\
and board.s2d+board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2i2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2i)and Bboard.b2d==''\
and board.s2e+board.s2f+board.s2g+board.s2h=='':
moves = '2i2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2i)and Bboard.b2e==''\
and board.s2f+board.s2g+board.s2h=='':
moves = '2i2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2i)and Bboard.b2f==''\
and board.s2g+board.s2h=='':
moves = '2i2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b2i)and Bboard.b2g==''\
and board.s2h=='':
moves = '2i2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2i)and Bboard.b4i==''\
and board.s3i=='':
moves = '2i4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2i)and Bboard.b5i==''\
and board.s3i+board.s4i=='':
moves = '2i5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2i)and Bboard.b6i==''\
and board.s3i+board.s4i+board.s5i=='':
moves = '2i6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2i)and Bboard.b7i==''\
and board.s3i+board.s4i+board.s5i+board.s6i=='':
moves = '2i7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2i)and Bboard.b8i==''\
and board.s3i+board.s4i+board.s5i+board.s6i+board.s7i=='':
moves = '2i8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b2i)and Bboard.b9i==''\
and board.s3i+board.s4i+board.s5i+board.s6i+board.s7i+board.s8i=='':
moves = '2i9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2i)and Bboard.b8c==''\
and board.s3h+board.s4g+board.s5f+board.s6e+board.s7d=='':
moves = '2i8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b2i)and Bboard.b9b==''\
and board.s3h+board.s4g+board.s5f+board.s6e+board.s7d+board.s8c=='':
moves = '2i9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2i)and Bboard.b4g==''\
and board.s3h=='':
moves = '2i4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2i)and Bboard.b5f==''\
and board.s3h+board.s4g=='':
moves = '2i5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2i)and Bboard.b6e==''\
and board.s3h+board.s4g+board.s5f=='':
moves = '2i6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b2i)and Bboard.b7d==''\
and board.s3h+board.s4g+board.s5f+board.s6e=='':
moves = '2i7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2i)and Bboard.b8c==''\
and board.s3h+board.s4g+board.s5f+board.s6e+board.s7d=='':
moves = '2i8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b2i)and Bboard.b9b==''\
and board.s3h+board.s4g+board.s5f+board.s6e+board.s7d+board.s8c=='':
moves = '2i9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Candidate moves for the piece on 3i
if Bboard.b3i !='':
if re.match(r'[PLSGRK+]', Bboard.b3i)and Bboard.b3h=='':
moves = '3i3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3i)and Bboard.b2h=='':
moves = '3i2h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b3i)and Bboard.b4h=='':
moves = '3i4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3i)and Bboard.b2i=='':
moves = '3i2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b3i)and Bboard.b4i=='':
moves = '3i4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3i)and Bboard.b2g=='':
moves = '3i2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b3i)and Bboard.b4g=='':
moves = '3i4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3i)and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3i3a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3i)and Bboard.b3a==''\
and board.s3b+board.s3c+board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3i3a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b3i)and Bboard.b3b==''\
and board.s3c+board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3i3b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3i)and Bboard.b3b==''\
and board.s3c+board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3i3b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b3i)and Bboard.b3c==''\
and board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3i3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b3i)and Bboard.b3c==''\
and board.s3d+board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3i3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3i)and Bboard.b3d==''\
and board.s3e+board.s3f+board.s3g+board.s3h=='':
moves = '3i3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3i)and Bboard.b3e==''\
and board.s3f+board.s3g+board.s3h=='':
moves = '3i3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3i)and Bboard.b3f==''\
and board.s3g+board.s3h=='':
moves = '3i3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b3i)and Bboard.b3g==''\
and board.s3h=='':
moves = '3i3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3i)and Bboard.b1i==''\
and board.s2i=='':
moves = '3i1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3i)and Bboard.b5i==''\
and board.s4i=='':
moves = '3i5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3i)and Bboard.b6i==''\
and board.s4i+board.s5i=='':
moves = '3i6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3i)and Bboard.b7i==''\
and board.s4i+board.s5i+board.s6i=='':
moves = '3i7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3i)and Bboard.b8i==''\
and board.s4i+board.s5i+board.s6i+board.s7i=='':
moves = '3i8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b3i)and Bboard.b9i==''\
and board.s4i+board.s5i+board.s6i+board.s7i+board.s8i=='':
moves = '3i9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b3i)and Bboard.b9c==''\
and board.s4h+board.s5g+board.s6f+board.s7e+board.s8d=='':
moves = '3i9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3i)and Bboard.b1g==''\
and board.s2h=='':
moves = '3i1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3i)and Bboard.b5g==''\
and board.s4h=='':
moves = '3i5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3i)and Bboard.b6f==''\
and board.s4h+board.s5g=='':
moves = '3i6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3i)and Bboard.b7e==''\
and board.s4h+board.s5g+board.s6f=='':
moves = '3i7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b3i)and Bboard.b8d==''\
and board.s4h+board.s5g+board.s6f+board.s7e=='':
moves = '3i8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b3i)and Bboard.b9c==''\
and board.s4h+board.s5g+board.s6f+board.s7e+board.s8d=='':
moves = '3i9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Candidate moves for the piece on 4i
if Bboard.b4i !='':
if re.match(r'[PLSGRK+]', Bboard.b4i)and Bboard.b4h=='':
moves = '4i4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4i)and Bboard.b3h=='':
moves = '4i3h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b4i)and Bboard.b5h=='':
moves = '4i5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4i)and Bboard.b3i=='':
moves = '4i3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b4i)and Bboard.b5i=='':
moves = '4i5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4i)and Bboard.b3g=='':
moves = '4i3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b4i)and Bboard.b5g=='':
moves = '4i5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4i)and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4i4a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4i)and Bboard.b4a==''\
and board.s4b+board.s4c+board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4i4a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b4i)and Bboard.b4b==''\
and board.s4c+board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4i4b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4i)and Bboard.b4b==''\
and board.s4c+board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4i4b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b4i)and Bboard.b4c==''\
and board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4i4c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b4i)and Bboard.b4c==''\
and board.s4d+board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4i4c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4i)and Bboard.b4d==''\
and board.s4e+board.s4f+board.s4g+board.s4h=='':
moves = '4i4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4i)and Bboard.b4e==''\
and board.s4f+board.s4g+board.s4h=='':
moves = '4i4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4i)and Bboard.b4f==''\
and board.s4g+board.s4h=='':
moves = '4i4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b4i)and Bboard.b4g==''\
and board.s4h=='':
moves = '4i4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4i)and Bboard.b1i==''\
and board.s2i+board.s3i=='':
moves = '4i1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4i)and Bboard.b2i==''\
and board.s3i=='':
moves = '4i2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4i)and Bboard.b6i==''\
and board.s5i=='':
moves = '4i6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4i)and Bboard.b7i==''\
and board.s5i+board.s6i=='':
moves = '4i7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4i)and Bboard.b8i==''\
and board.s5i+board.s6i+board.s7i=='':
moves = '4i8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b4i)and Bboard.b9i==''\
and board.s5i+board.s6i+board.s7i+board.s8i=='':
moves = '4i9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4i)and Bboard.b6g==''\
and board.s5h=='':
moves = '4i6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4i)and Bboard.b7f==''\
and board.s5h+board.s6g=='':
moves = '4i7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4i)and Bboard.b8e==''\
and board.s5h+board.s6g+board.s7f=='':
moves = '4i8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4i)and Bboard.b9d==''\
and board.s5h+board.s6g+board.s7f+board.s8e=='':
moves = '4i9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4i)and Bboard.b1f==''\
and board.s2g+board.s3h=='':
moves = '4i1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b4i)and Bboard.b2g==''\
and board.s3h=='':
moves = '4i2g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Candidate moves for the piece on 5i
if Bboard.b5i !='':
if re.match(r'[PLSGRK+]', Bboard.b5i)and Bboard.b5h=='':
moves = '5i5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5i)and Bboard.b4h=='':
moves = '5i4h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b5i)and Bboard.b6h=='':
moves = '5i6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5i)and Bboard.b4i=='':
moves = '5i4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b5i)and Bboard.b6i=='':
moves = '5i6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5i)and Bboard.b4g=='':
moves = '5i4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b5i)and Bboard.b6g=='':
moves = '5i6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5i)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5i5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5i)and Bboard.b5a==''\
and board.s5b+board.s5c+board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5i5a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b5i)and Bboard.b5b==''\
and board.s5c+board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5i5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5i)and Bboard.b5b==''\
and board.s5c+board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5i5b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b5i)and Bboard.b5c==''\
and board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5i5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b5i)and Bboard.b5c==''\
and board.s5d+board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5i5c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5i)and Bboard.b5d==''\
and board.s5e+board.s5f+board.s5g+board.s5h=='':
moves = '5i5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5i)and Bboard.b5e==''\
and board.s5f+board.s5g+board.s5h=='':
moves = '5i5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5i)and Bboard.b5f==''\
and board.s5g+board.s5h=='':
moves = '5i5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b5i)and Bboard.b5g==''\
and board.s5h=='':
moves = '5i5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5i)and Bboard.b1i==''\
and board.s2i+board.s3i+board.s4i=='':
moves = '5i1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5i)and Bboard.b2i==''\
and board.s3i+board.s4i=='':
moves = '5i2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5i)and Bboard.b3i==''\
and board.s4i=='':
moves = '5i3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5i)and Bboard.b7i==''\
and board.s6i=='':
moves = '5i7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5i)and Bboard.b8i==''\
and board.s6i+board.s7i=='':
moves = '5i8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b5i)and Bboard.b9i==''\
and board.s6i+board.s7i+board.s8i=='':
moves = '5i9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5i)and Bboard.b7g==''\
and board.s6h=='':
moves = '5i7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5i)and Bboard.b8f==''\
and board.s6h+board.s7g=='':
moves = '5i8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5i)and Bboard.b9e==''\
and board.s6h+board.s7g+board.s8f=='':
moves = '5i9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5i)and Bboard.b2f==''\
and board.s3g+board.s4h=='':
moves = '5i2f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5i)and Bboard.b3g==''\
and board.s4h=='':
moves = '5i3g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b5i)and Bboard.b1e==''\
and board.s4h+board.s3g+board.s2f=='':
moves = '5i1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Moves for the piece on 6i.
if Bboard.b6i != '':
if re.match(r'[PLSGRK+]', Bboard.b6i)and Bboard.b6h=='':
moves = '6i6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6i)and Bboard.b5h=='':
moves = '6i5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b6i)and Bboard.b7h=='':
moves = '6i7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6i)and Bboard.b5i=='':
moves = '6i5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b6i)and Bboard.b7i=='':
moves = '6i7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6i)and Bboard.b5g=='':
moves = '6i5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b6i)and Bboard.b7g=='':
moves = '6i7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6i) and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6i6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6i)and Bboard.b6a==''\
and board.s6b+board.s6c+board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6i6a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b6i) and Bboard.b6b==''\
and board.s6c+board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6i6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6i)and Bboard.b6b==''\
and board.s6c+board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6i6b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b6i)and Bboard.b6c==''\
and board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6i6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b6i)and Bboard.b6c==''\
and board.s6d+board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6i6c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6i)and Bboard.b6d==''\
and board.s6e+board.s6f+board.s6g+board.s6h=='':
moves = '6i6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6i)and Bboard.b6e==''\
and board.s6f+board.s6g+board.s6h=='':
moves = '6i6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6i)and Bboard.b6f==''\
and board.s6g+board.s6h=='':
moves = '6i6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b6i)and Bboard.b6g==''\
and board.s6h=='':
moves = '6i6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6i)and Bboard.b9i==''\
and board.s8i+board.s7i=='':
moves = '6i9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6i)and Bboard.b8i==''\
and board.s7i=='':
moves = '6i8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6i)and Bboard.b4i==''\
and board.s5i=='':
moves = '6i4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6i)and Bboard.b3i==''\
and board.s5i+board.s4i=='':
moves = '6i3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6i)and Bboard.b2i==''\
and board.s5i+board.s4i+board.s3i=='':
moves = '6i2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b6i)and Bboard.b1i==''\
and board.s5i+board.s4i+board.s3i+board.s2i=='':
moves = '6i1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6i)and Bboard.b4g==''\
and board.s5h=='':
moves = '6i4g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6i)and Bboard.b3f==''\
and board.s5h+board.s4g=='':
moves = '6i3f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6i)and Bboard.b2e==''\
and board.s5h+board.s4g+board.s3f=='':
moves = '6i2e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6i)and Bboard.b1d==''\
and board.s5h+board.s4g+board.s3f+board.s2e=='':
moves = '6i1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6i)and Bboard.b9f==''\
and board.s8g+board.s7h=='':
moves = '6i9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b6i)and Bboard.b8g==''\
and board.s7h=='':
moves = '6i8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Moves for the piece on 7i.
if Bboard.b7i != '':
if re.match(r'[PLSGRK+]', Bboard.b7i)and Bboard.b7h=='':
moves = '7i7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7i)and Bboard.b6h=='':
moves = '7i6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b7i)and Bboard.b8h=='':
moves = '7i8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7i)and Bboard.b6i=='':
moves = '7i6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b7i)and Bboard.b8i=='':
moves = '7i8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7i)and Bboard.b6g=='':
moves = '7i6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b7i)and Bboard.b8g=='':
moves = '7i8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7i) and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7i7a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7i)and Bboard.b7a==''\
and board.s7b+board.s7c+board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7i7a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b7i) and Bboard.b7b==''\
and board.s7c+board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7i7b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7i)and Bboard.b7b==''\
and board.s7c+board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7i7b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b7i)and Bboard.b7c==''\
and board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7i7c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b7i)and Bboard.b7c==''\
and board.s7d+board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7i7c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7i)and Bboard.b7d==''\
and board.s7e+board.s7f+board.s7g+board.s7h=='':
moves = '7i7d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7i)and Bboard.b7e==''\
and board.s7f+board.s7g+board.s7h=='':
moves = '7i7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7i)and Bboard.b7f==''\
and board.s7g+board.s7h=='':
moves = '7i7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b7i)and Bboard.b7g==''\
and board.s7h=='':
moves = '7i7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7i)and Bboard.b9i==''\
and board.s8i=='':
moves = '7i9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7i)and Bboard.b5i==''\
and board.s6i=='':
moves = '7i5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7i)and Bboard.b4i==''\
and board.s6i+board.s5i=='':
moves = '7i4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7i)and Bboard.b3i==''\
and board.s6i+board.s5i+board.s4i=='':
moves = '7i3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7i)and Bboard.b2i==''\
and board.s6i+board.s5i+board.s4i+board.s3i=='':
moves = '7i2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b7i)and Bboard.b1i==''\
and board.s6i+board.s5i+board.s4i+board.s3i+board.s2i=='':
moves = '7i1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b7i)and Bboard.b1c==''\
and board.s6h+board.s5g+board.s4f+board.s3e+board.s2d=='':
moves = '7i1c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7i)and Bboard.b9g==''\
and board.s8h=='':
moves = '7i9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7i)and Bboard.b5g==''\
and board.s6h=='':
moves = '7i5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7i)and Bboard.b4f==''\
and board.s6h+board.s5g=='':
moves = '7i4f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7i) and Bboard.b3e==''\
and board.s6h+board.s5g+board.s4f=='':
moves = '7i3e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b7i) and Bboard.b2d==''\
and board.s6h+board.s5g+board.s4f+board.s3e=='':
moves = '7i2d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b7i) and Bboard.b1c==''\
and board.s6h+board.s5g+board.s4f+board.s3e+board.s2d=='':
moves = '7i1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Moves for the piece on 8i.
if Bboard.b8i != '':
if re.match(r'[PLSGRK+]', Bboard.b8i)and Bboard.b8h=='':
moves = '8i8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8i)and Bboard.b7h=='':
moves = '8i7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b8i)and Bboard.b9h=='':
moves = '8i9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8i)and Bboard.b7i=='':
moves = '8i7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b8i)and Bboard.b9i=='':
moves = '8i9i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8i)and Bboard.b7g=='':
moves = '8i7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b8i)and Bboard.b9g=='':
moves = '8i9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8i) and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8i8a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8i)and Bboard.b8a==''\
and board.s8b+board.s8c+board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8i8a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b8i) and Bboard.b8b==''\
and board.s8c+board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8i8b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8i)and Bboard.b8b==''\
and board.s8c+board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8i8b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b8i)and Bboard.b8c==''\
and board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8i8c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b8i)and Bboard.b8c==''\
and board.s8d+board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8i8c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8i)and Bboard.b8d==''\
and board.s8e+board.s8f+board.s8g+board.s8h=='':
moves = '8i8d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8i)and Bboard.b8e==''\
and board.s8f+board.s8g+board.s8h=='':
moves = '8i8e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8i)and Bboard.b8f==''\
and board.s8g+board.s8h=='':
moves = '8i8f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b8i)and Bboard.b8g==''\
and board.s8h=='':
moves = '8i8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8i)and Bboard.b6i==''\
and board.s7i=='':
moves = '8i6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8i)and Bboard.b5i==''\
and board.s7i+board.s6i=='':
moves = '8i5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8i)and Bboard.b4i==''\
and board.s7i+board.s6i+board.s5i=='':
moves = '8i4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8i)and Bboard.b3i==''\
and board.s7i+board.s6i+board.s5i+board.s4i=='':
moves = '8i3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8i)and Bboard.b2i==''\
and board.s7i+board.s6i+board.s5i+board.s4i+board.s3i=='':
moves = '8i2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b8i)and Bboard.b1i==''\
and board.s7i+board.s6i+board.s5i+board.s4i+board.s3i+board.s2i=='':
moves = '8i1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8i)and Bboard.b2c==''\
and board.s7h+board.s6g+board.s5f+board.s4e+board.s3d=='':
moves = '8i2c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b8i)and Bboard.b1b==''\
and board.s7h+board.s6g+board.s5f+board.s4e+board.s3d+board.s2c=='':
moves = '8i1b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8i)and Bboard.b6g==''\
and board.s7h=='':
moves = '8i6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8i)and Bboard.b5f==''\
and board.s7h+board.s6g=='':
moves = '8i5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8i)and Bboard.b4e==''\
and board.s7h+board.s6g+board.s5f=='':
moves = '8i4e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b8i)and Bboard.b3d==''\
and board.s7h+board.s6g+board.s5f+board.s4e=='':
moves = '8i3d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8i) and Bboard.b2c==''\
and board.s7h+board.s6g+board.s5f+board.s4e+board.s3d=='':
moves = '8i2c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b8i) and Bboard.b1b==''\
and board.s7h+board.s6g+board.s5f+board.s4e+board.s3d+board.s2c=='':
moves = '8i1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Moves for the piece on 9i.
if Bboard.b9i != '':
if re.match(r'[PLSGRK+]', Bboard.b9i)and Bboard.b9h=='':
moves = '9i9h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[SGBK+]', Bboard.b9i)and Bboard.b8h=='':
moves = '9i8h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'[GRK+]', Bboard.b9i)and Bboard.b8i=='':
moves = '9i8i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('N', Bboard.b9i)and Bboard.b8g=='':
moves = '9i8g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9i) and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9i9a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9i)and Bboard.b9a==''\
and board.s9b+board.s9c+board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9i9a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R', Bboard.b9i) and Bboard.b9b==''\
and board.s9c+board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9i9b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9i)and Bboard.b9b==''\
and board.s9c+board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9i9b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|L', Bboard.b9i)and Bboard.b9c==''\
and board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9i9c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'R|L', Bboard.b9i)and Bboard.b9c==''\
and board.s9d+board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9i9c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9i)and Bboard.b9d==''\
and board.s9e+board.s9f+board.s9g+board.s9h=='':
moves = '9i9d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9i)and Bboard.b9e==''\
and board.s9f+board.s9g+board.s9h=='':
moves = '9i9e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9i)and Bboard.b9f==''\
and board.s9g+board.s9h=='':
moves = '9i9f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R|L', Bboard.b9i)and Bboard.b9g==''\
and board.s9h=='':
moves = '9i9g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9i)and Bboard.b7i==''\
and board.s8i=='':
moves = '9i7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9i)and Bboard.b6i==''\
and board.s8i+board.s7i=='':
moves = '9i6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9i)and Bboard.b5i==''\
and board.s8i+board.s7i+board.s6i=='':
moves = '9i5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9i)and Bboard.b4i==''\
and board.s8i+board.s7i+board.s6i+board.s5i=='':
moves = '9i4i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9i)and Bboard.b3i==''\
and board.s8i+board.s7i+board.s6i+board.s5i+board.s4i=='':
moves = '9i3i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9i)and Bboard.b2i==''\
and board.s8i+board.s7i+board.s6i+board.s5i+board.s4i+board.s3i=='':
moves = '9i2i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+R|R', Bboard.b9i)and Bboard.b1i==''\
and board.s8i+board.s7i+board.s6i+board.s5i+board.s4i+board.s3i+board.s2i=='':
moves = '9i1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9i)and Bboard.b3c==''\
and board.s8h+board.s7g+board.s6f+board.s5e+board.s4d=='':
moves = '9i3c+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9i)and Bboard.b2b==''\
and board.s8h+board.s7g+board.s6f+board.s5e+board.s4d+board.s3c=='':
moves = '9i2b+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('B',Bboard.b9i)and Bboard.b1a==''\
and board.s8h+board.s7g+board.s6f+board.s5e+board.s4d+board.s3c+board.s2b=='':
moves = '9i1a+'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9i)and Bboard.b7g==''\
and board.s8h=='':
moves = '9i7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9i)and Bboard.b6f==''\
and board.s8h+board.s7g=='':
moves = '9i6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9i)and Bboard.b5e==''\
and board.s8h+board.s7g+board.s6f=='':
moves = '9i5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B|B', Bboard.b9i)and Bboard.b4d==''\
and board.s8h+board.s7g+board.s6f+board.s5e=='':
moves = '9i4d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9i) and Bboard.b3c==''\
and board.s8h+board.s7g+board.s6f+board.s5e+board.s4d=='':
moves = '9i3c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match(r'\+B', Bboard.b9i) and Bboard.b2b==''\
and board.s8h+board.s7g+board.s6f+board.s5e+board.s4d+board.s3c=='':
moves = '9i2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if re.match('\+B', Bboard.b9i)and Bboard.b1a==''\
and board.s8h+board.s7g+board.s6f+board.s5e+board.s4d+board.s3c+board.s2b=='':
moves = '9i1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
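# NOTE: every candidate above repeats the same three lines. A small helper
# (a sketch only; it assumes the existing kaihimore(), oute and depth1
# globals keep their current meaning) would reduce each candidate to one
# call:
#
#     def try_move(moves):
#         kaihimore(moves)
#         if oute.oute == 0:
#             depth1.append(moves)
#
#     try_move('9i1a')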
# Drop moves: pieces in hand may be dropped on empty squares. Pawn drops
# are barred from files already holding an unpromoted friendly pawn (nifu);
# pawns and lances cannot be dropped on rank a, knights on ranks a and b.
if board.s1a == '':
if Bboard.S>0:
moves = 'S*1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s1b =='':
if Bboard.P>0 and (Bboard.b1b !='P' and Bboard.b1c !='P' and Bboard.b1d !='P' and Bboard.b1e !='P' and Bboard.b1f !='P' and Bboard.b1g !='P' and Bboard.b1h !='P' and Bboard.b1i !='P'):
moves = 'P*1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s1c =='':
if Bboard.P>0 and (Bboard.b1b !='P' and Bboard.b1c !='P' and Bboard.b1d !='P' and Bboard.b1e !='P' and Bboard.b1f !='P' and Bboard.b1g !='P' and Bboard.b1h !='P' and Bboard.b1i !='P'):
moves = 'P*1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s1d =='':
if Bboard.P>0 and (Bboard.b1b !='P' and Bboard.b1c !='P' and Bboard.b1d !='P' and Bboard.b1e !='P' and Bboard.b1f !='P' and Bboard.b1g !='P' and Bboard.b1h !='P' and Bboard.b1i !='P'):
moves = 'P*1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s1e =='':
if Bboard.P>0 and (Bboard.b1b !='P' and Bboard.b1c !='P' and Bboard.b1d !='P' and Bboard.b1e !='P' and Bboard.b1f !='P' and Bboard.b1g !='P' and Bboard.b1h !='P' and Bboard.b1i !='P'):
moves = 'P*1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s1f =='':
if Bboard.P>0 and (Bboard.b1b !='P' and Bboard.b1c !='P' and Bboard.b1d !='P' and Bboard.b1e !='P' and Bboard.b1f !='P' and Bboard.b1g !='P' and Bboard.b1h !='P' and Bboard.b1i !='P'):
moves = 'P*1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s1g =='':
if Bboard.P>0 and (Bboard.b1b !='P' and Bboard.b1c !='P' and Bboard.b1d !='P' and Bboard.b1e !='P' and Bboard.b1f !='P' and Bboard.b1g !='P' and Bboard.b1h !='P' and Bboard.b1i !='P'):
moves = 'P*1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s1h =='':
if Bboard.P>0 and (Bboard.b1b !='P' and Bboard.b1c !='P' and Bboard.b1d !='P' and Bboard.b1e !='P' and Bboard.b1f !='P' and Bboard.b1g !='P' and Bboard.b1h !='P' and Bboard.b1i !='P'):
moves = 'P*1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s1i =='':
if Bboard.P>0 and (Bboard.b1b !='P' and Bboard.b1c !='P' and Bboard.b1d !='P' and Bboard.b1e !='P' and Bboard.b1f !='P' and Bboard.b1g !='P' and Bboard.b1h !='P' and Bboard.b1i !='P'):
moves = 'P*1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*1i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
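# NOTE: the drop blocks repeat once per square and per piece type. A
# data-driven sketch (hypothetical try_drop() helper; assumes hand counts
# are the Bboard.P/L/N/S/G/B/R counters used above) could iterate instead:
#
#     for piece in ('P', 'L', 'N', 'S', 'G', 'B', 'R'):
#         if getattr(Bboard, piece) > 0:
#             try_drop(piece + '*' + square)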
if board.s2a =='':
if Bboard.S>0:
moves = 'S*2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*2a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s2b =='':
if Bboard.P>0 and (Bboard.b2b !='P' and Bboard.b2c !='P' and Bboard.b2d !='P' and Bboard.b2e !='P' and Bboard.b2f !='P' and Bboard.b2g !='P' and Bboard.b2h !='P' and Bboard.b2i !='P'):
moves = 'P*2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*2b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
# Drop-move generation for the remaining empty squares, collapsed from the
# unrolled per-square code into equivalent loops. The original square order
# (file 2 ranks c-i, then files 3, 4, 9, 8 ranks a-i, then file 7 ranks a-d;
# the rest of file 7 continues below) and the per-square piece order
# (P, L, N, S, G, B, R) are preserved, so depth1 is filled identically.
for f, ranks in (('2', 'cdefghi'), ('3', 'abcdefghi'), ('4', 'abcdefghi'),
                 ('9', 'abcdefghi'), ('8', 'abcdefghi'), ('7', 'abcd')):
    for r in ranks:
        if getattr(board, 's' + f + r) != '':
            continue
        # Pawn drops: illegal on rank 'a' and on a file that already holds
        # an unpromoted black pawn (nifu rule, checked over ranks b-i).
        if r != 'a' and Bboard.P > 0 and all(
                getattr(Bboard, 'b' + f + r2) != 'P' for r2 in 'bcdefghi'):
            moves = 'P*' + f + r
            kaihimore(moves)
            if oute.oute == 0:
                depth1.append(moves)
        # Lance drops: illegal on rank 'a' (the lance could never move again).
        if r != 'a' and Bboard.L > 0:
            moves = 'L*' + f + r
            kaihimore(moves)
            if oute.oute == 0:
                depth1.append(moves)
        # Knight drops: illegal on ranks 'a' and 'b' for the same reason.
        if r not in ('a', 'b') and Bboard.N > 0:
            moves = 'N*' + f + r
            kaihimore(moves)
            if oute.oute == 0:
                depth1.append(moves)
        # Silver, gold, bishop and rook may be dropped on any empty square.
        for piece, count in (('S', Bboard.S), ('G', Bboard.G),
                             ('B', Bboard.B), ('R', Bboard.R)):
            if count > 0:
                moves = piece + '*' + f + r
                kaihimore(moves)
                if oute.oute == 0:
                    depth1.append(moves)
if board.s7e =='':
if Bboard.P>0 and (Bboard.b7b !='P' and Bboard.b7c !='P' and Bboard.b7d !='P' and Bboard.b7e !='P' and Bboard.b7f !='P' and Bboard.b7g !='P' and Bboard.b7h !='P' and Bboard.b7i !='P'):
moves = 'P*7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*7e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s7f =='':
if Bboard.P>0 and (Bboard.b7b !='P' and Bboard.b7c !='P' and Bboard.b7d !='P' and Bboard.b7e !='P' and Bboard.b7f !='P' and Bboard.b7g !='P' and Bboard.b7h !='P' and Bboard.b7i !='P'):
moves = 'P*7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*7f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s7g =='':
if Bboard.P>0 and (Bboard.b7b !='P' and Bboard.b7c !='P' and Bboard.b7d !='P' and Bboard.b7e !='P' and Bboard.b7f !='P' and Bboard.b7g !='P' and Bboard.b7h !='P' and Bboard.b7i !='P'):
moves = 'P*7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*7g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s7h =='':
if Bboard.P>0 and (Bboard.b7b !='P' and Bboard.b7c !='P' and Bboard.b7d !='P' and Bboard.b7e !='P' and Bboard.b7f !='P' and Bboard.b7g !='P' and Bboard.b7h !='P' and Bboard.b7i !='P'):
moves = 'P*7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*7h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s7i =='':
if Bboard.P>0 and (Bboard.b7b !='P' and Bboard.b7c !='P' and Bboard.b7d !='P' and Bboard.b7e !='P' and Bboard.b7f !='P' and Bboard.b7g !='P' and Bboard.b7h !='P' and Bboard.b7i !='P'):
moves = 'P*7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*7i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6a =='':
if Bboard.S>0:
moves = 'S*6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6b =='':
if Bboard.P>0 and (Bboard.b6b !='P' and Bboard.b6c !='P' and Bboard.b6d !='P' and Bboard.b6e !='P' and Bboard.b6f !='P' and Bboard.b6g !='P' and Bboard.b6h !='P' and Bboard.b6i !='P'):
moves = 'P*6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6c =='':
if Bboard.P>0 and (Bboard.b6b !='P' and Bboard.b6c !='P' and Bboard.b6d !='P' and Bboard.b6e !='P' and Bboard.b6f !='P' and Bboard.b6g !='P' and Bboard.b6h !='P' and Bboard.b6i !='P'):
moves = 'P*6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6d =='':
if Bboard.P>0 and (Bboard.b6b !='P' and Bboard.b6c !='P' and Bboard.b6d !='P' and Bboard.b6e !='P' and Bboard.b6f !='P' and Bboard.b6g !='P' and Bboard.b6h !='P' and Bboard.b6i !='P'):
moves = 'P*6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6e =='':
if Bboard.P>0 and (Bboard.b6b !='P' and Bboard.b6c !='P' and Bboard.b6d !='P' and Bboard.b6e !='P' and Bboard.b6f !='P' and Bboard.b6g !='P' and Bboard.b6h !='P' and Bboard.b6i !='P'):
moves = 'P*6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6f =='':
if Bboard.P>0 and (Bboard.b6b !='P' and Bboard.b6c !='P' and Bboard.b6d !='P' and Bboard.b6e !='P' and Bboard.b6f !='P' and Bboard.b6g !='P' and Bboard.b6h !='P' and Bboard.b6i !='P'):
moves = 'P*6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6g =='':
if Bboard.P>0 and (Bboard.b6b !='P' and Bboard.b6c !='P' and Bboard.b6d !='P' and Bboard.b6e !='P' and Bboard.b6f !='P' and Bboard.b6g !='P' and Bboard.b6h !='P' and Bboard.b6i !='P'):
moves = 'P*6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6h =='':
if Bboard.P>0 and (Bboard.b6b !='P' and Bboard.b6c !='P' and Bboard.b6d !='P' and Bboard.b6e !='P' and Bboard.b6f !='P' and Bboard.b6g !='P' and Bboard.b6h !='P' and Bboard.b6i !='P'):
moves = 'P*6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s6i =='':
if Bboard.P>0 and (Bboard.b6b !='P' and Bboard.b6c !='P' and Bboard.b6d !='P' and Bboard.b6e !='P' and Bboard.b6f !='P' and Bboard.b6g !='P' and Bboard.b6h !='P' and Bboard.b6i !='P'):
moves = 'P*6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*6i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5a =='':
if Bboard.S>0:
moves = 'S*5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5a'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5b =='':
if Bboard.P>0 and (Bboard.b5b !='P' and Bboard.b5c !='P' and Bboard.b5d !='P' and Bboard.b5e !='P' and Bboard.b5f !='P' and Bboard.b5g !='P' and Bboard.b5h !='P' and Bboard.b5i !='P'):
moves = 'P*5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5b'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5c =='':
if Bboard.P>0 and (Bboard.b5b !='P' and Bboard.b5c !='P' and Bboard.b5d !='P' and Bboard.b5e !='P' and Bboard.b5f !='P' and Bboard.b5g !='P' and Bboard.b5h !='P' and Bboard.b5i !='P'):
moves = 'P*5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5c'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5d =='':
if Bboard.P>0 and (Bboard.b5b !='P' and Bboard.b5c !='P' and Bboard.b5d !='P' and Bboard.b5e !='P' and Bboard.b5f !='P' and Bboard.b5g !='P' and Bboard.b5h !='P' and Bboard.b5i !='P'):
moves = 'P*5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5d'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5e =='':
if Bboard.P>0 and (Bboard.b5b !='P' and Bboard.b5c !='P' and Bboard.b5d !='P' and Bboard.b5e !='P' and Bboard.b5f !='P' and Bboard.b5g !='P' and Bboard.b5h !='P' and Bboard.b5i !='P'):
moves = 'P*5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5e'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5f =='':
if Bboard.P>0 and (Bboard.b5b !='P' and Bboard.b5c !='P' and Bboard.b5d !='P' and Bboard.b5e !='P' and Bboard.b5f !='P' and Bboard.b5g !='P' and Bboard.b5h !='P' and Bboard.b5i !='P'):
moves = 'P*5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5f'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5g =='':
if Bboard.P>0 and (Bboard.b5b !='P' and Bboard.b5c !='P' and Bboard.b5d !='P' and Bboard.b5e !='P' and Bboard.b5f !='P' and Bboard.b5g !='P' and Bboard.b5h !='P' and Bboard.b5i !='P'):
moves = 'P*5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5g'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5h =='':
if Bboard.P>0 and (Bboard.b5b !='P' and Bboard.b5c !='P' and Bboard.b5d !='P' and Bboard.b5e !='P' and Bboard.b5f !='P' and Bboard.b5g !='P' and Bboard.b5h !='P' and Bboard.b5i !='P'):
moves = 'P*5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5h'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if board.s5i =='':
if Bboard.P>0 and (Bboard.b5b !='P' and Bboard.b5c !='P' and Bboard.b5d !='P' and Bboard.b5e !='P' and Bboard.b5f !='P' and Bboard.b5g !='P' and Bboard.b5h !='P' and Bboard.b5i !='P'):
moves = 'P*5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.L>0:
moves = 'L*5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.N>0:
moves = 'N*5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.S>0:
moves = 'S*5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.G>0:
moves = 'G*5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.B>0:
moves = 'B*5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
if Bboard.R>0:
moves = 'R*5i'
kaihimore(moves)
if oute.oute == 0:
depth1.append(moves)
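The repeated pawn-drop guard above implements shogi's nifu rule: a pawn may only be dropped on a file that contains none of our own unpromoted pawns. In isolation, and with hypothetical names (the engine above tracks squares as individual attributes instead of a list), the check can be sketched as:

```python
# Minimal sketch of the nifu (double-pawn) guard. 'P' marks one of our
# unpromoted pawns; lowercase or other letters are other pieces.
def pawn_drop_allowed(file_squares):
    """file_squares: the pieces currently on one file, e.g. ['', 'P', '', 'R']."""
    return all(sq != 'P' for sq in file_squares)

print(pawn_drop_allowed(['', '', 'p', 'R']))  # True: only an enemy pawn 'p'
print(pawn_drop_allowed(['', 'P', '', '']))   # False: our pawn already there
```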
# --- phablytics/constants/__init__.py (hacktoolkit/phablytics, MIT) ---
# Phablytics Imports
from phablytics.constants.general import * # noqa
from phablytics.constants.phab import * # noqa
from phablytics.constants.ui import * # noqa
# --- django_grapesjs/tests/__init__.py (TheLazzziest/django_grapesjs, MIT) ---
from .test_form_fields import *
from .test_model_fields import *
from .utils import *
from .views import *
# --- testhub/testsuites/annotations/test_cvat_suites.py (banrieen/PerfBoard, MIT) ---
# coding=UTF-8
""" annotation
# 数据集标注基础测试集
# VERSION: 0.0.1
# EDITOR: thomas
# TIMER: 2021-03-11
"""
import locust.stats
locust.stats.CONSOLE_STATS_INTERVAL_SEC = 3
from locust import TaskSet, task, between, User
from locust.contrib.fasthttp import FastHttpUser
from locust import events
import logging
import json
import os
import yaml
import pdb
TEST_CONF = os.path.join(os.path.abspath(os.path.dirname(os.path.abspath(__file__)) + os.path.sep ), "datas.yaml")
def read_test_datas(conf_file=TEST_CONF):
    with open(conf_file, 'r') as cf:
        stream = cf.read()
    conf = yaml.safe_load(stream)
    return conf

# Load the shared test data once at import time; the original assigned an
# empty dict here and never called read_test_datas(), so every lookup failed.
TEST_DATAS = read_test_datas()
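The suite indexes the loaded YAML by endpoint name, each with a `path` and a `datas` payload, plus shared `header` and `ACCOUNT` entries. A hypothetical minimal structure that `read_test_datas()` is expected to return (key names taken from how the tasks below index it; paths and credentials are illustrative only):

```python
# Hypothetical minimal shape of the dict produced from datas.yaml.
SAMPLE_TEST_DATAS = {
    "RESTFULAPI": {
        "homepage": "/",
        "header": {"Content-Type": "application/json"},
        "login": {"path": "/api/v1/auth/login"},
        "logout": {"path": "/api/v1/auth/logout"},
        # Each endpoint entry carries the URL path and the request payload.
        "create_task": {"path": "/api/v1/tasks", "datas": {"name": "demo"}},
    },
    "ACCOUNT": {"admin": {"username": "admin", "password": "secret"}},
}

# Each task reads <endpoint>["path"] and serializes <endpoint>["datas"].
print(SAMPLE_TEST_DATAS["RESTFULAPI"]["create_task"]["path"])  # /api/v1/tasks
```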
@events.quitting.add_listener
def _(environment, **kw):
    """Turn the aggregate statistics into a CI-friendly exit code when locust quits."""
    if environment.stats.total.fail_ratio > 0.001:
        logging.error("Test failed due to failure ratio > 0.1%")
        environment.process_exit_code = 1
    elif environment.stats.total.avg_response_time > 200:
        logging.error("Test failed due to average response time > 200 ms")
        environment.process_exit_code = 2
    elif environment.stats.total.get_response_time_percentile(0.99) > 800:
        logging.error("Test failed due to 99th percentile response time > 800 ms")
        environment.process_exit_code = 3
    else:
        environment.process_exit_code = 0
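The quitting listener above maps three aggregate statistics to an exit code, checked in priority order. The same decision logic, lifted into a pure function so the thresholds can be unit-tested without running locust:

```python
# Pure version of the exit-code gate used by the quitting listener
# (same thresholds: 0.1% failures, 200 ms average, 800 ms p99).
def exit_code(fail_ratio, avg_ms, p99_ms):
    if fail_ratio > 0.001:
        return 1
    if avg_ms > 200:
        return 2
    if p99_ms > 800:
        return 3
    return 0

print(exit_code(0.0, 150, 700))  # 0: all gates pass
print(exit_code(0.0, 150, 900))  # 3: p99 too slow
```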
# (endpoint key in datas.yaml, HTTP verb) pairs, one per original @task(1)
# method, in the original order.
ENDPOINT_TASKS = [
    ('redirect_cvat', 'get'),
    ('projects_list', 'get'),
    ('create_task', 'post'),
    ('upload_task_label', 'get'),
    ('upload_task_data', 'get'),
    ('get_task_status', 'get'),
    ('get_task_list', 'get'),
    ('auto_annotations', 'post'),
    ('open_task', 'get'),
    ('open_job', 'get'),
    ('open_job_meta', 'get'),
    ('open_job_logs', 'post'),
    ('open_job_data', 'get'),
    ('open_job_annotations', 'get'),
    ('save_job', 'get'),
    ('save_job_create_annotations', 'patch'),
    ('save_job_update_annotations', 'patch'),
    ('save_job_delete_annotations', 'patch'),
    ('save_job_logs', 'post'),
    ('save_job_loader', 'post'),
    ('reback_task', 'post'),
    ('upload_anotaion', 'post'),
    ('push_to_ai_platform', 'get'),
]


def _make_endpoint_task(name, method):
    """Build one equal-weight locust task for the given endpoint."""
    def endpoint_task(taskset):
        taskset.exercise_endpoint(name, method)
    endpoint_task.__name__ = 'test_' + name
    return endpoint_task


class BasicalActions(TaskSet):
    """
    Data annotation test suite
    1. Basic-operation requests of a single user running 10 concurrent tasks
    2. Response time and latency of a single user continuously annotating 1000+ images
    3. Basic operations with 100 concurrent users
    4. 20 concurrent users uploading/dumping 500 MB files (dataset import/export)
    5. Storage space consumption and reclamation
    6. Memory consumption and overall system responsiveness
    7. Response latency on the intranet and over the Internet
    """
    # The original file repeated one near-identical @task(1) method per
    # endpoint; the generated task list below keeps the same endpoints,
    # HTTP verbs, weights and order.
    tasks = [_make_endpoint_task(name, method) for name, method in ENDPOINT_TASKS]

    def on_start(self):
        """Open the home page and log in when this TaskSet starts."""
        self.testdatas = TEST_DATAS
        self.client.get(self.testdatas["RESTFULAPI"]["homepage"])
        with self.client.post(self.testdatas["RESTFULAPI"]["login"]["path"],
                              headers=self.testdatas["RESTFULAPI"]["Header"],
                              data=json.dumps(self.testdatas["ACCOUNT"]["admin"]),
                              catch_response=True) as response:
            if response.status_code == 200:
                self.testdatas["token"] = response.json()["token"]
                response.success()

    def on_stop(self):
        """Log out when this TaskSet stops."""
        with self.client.get(self.testdatas["RESTFULAPI"]["logout"]["path"],
                             catch_response=True) as response:
            if response.status_code == 200:
                rst = json.loads(response.text, strict=False)
                if rst.get('success') == '200':
                    response.success()

    def exercise_endpoint(self, name, method):
        """Call one REST endpoint described in datas.yaml and validate the response."""
        self.testdatas = TEST_DATAS
        api = self.testdatas["RESTFULAPI"][name]
        payload = json.dumps(api["datas"])
        kwargs = {'headers': self.testdatas["RESTFULAPI"]["header"],
                  'catch_response': True}
        # GET requests send the payload as query parameters, the rest as a body.
        kwargs['params' if method == 'get' else 'data'] = payload
        with getattr(self.client, method)(api["path"], **kwargs) as response:
            if response.status_code == 200:
                response.success()
            elif response.status_code == 401:
                response.failure("account error")
            else:
                response.raise_for_status()
headers=self.testdatas["RESTFULAPI"]["header"],
params=json.dumps(self.testdatas["RESTFULAPI"]["push_to_ai_platform"]["datas"])) as response:
if response.status_code == 200:
response.success()
elif response.status_code == 401:
print("account error")
else:
response.raise_for_status()
@task(1)
def test_dump_anotation(self):
self.testdatas = TEST_DATAS
with self.client.get(path=self.testdatas["RESTFULAPI"]["dump_anotation"]["path"],
headers=self.testdatas["RESTFULAPI"]["header"],
params=json.dumps(self.testdatas["RESTFULAPI"]["dump_anotation"]["datas"])) as response:
if response.status_code == 200:
response.success()
elif response.status_code == 401:
print("account error")
else:
response.raise_for_status()
@task(1)
def test_delect_task(self):
self.testdatas = TEST_DATAS
with self.client.delete(path=self.testdatas["RESTFULAPI"]["delect_task"]["path"],
headers=self.testdatas["RESTFULAPI"]["header"],
data=json.dumps(self.testdatas["RESTFULAPI"]["delect_task"]["datas"])) as response:
if response.status_code == 200:
response.success()
elif response.status_code == 401:
print("account error")
else:
response.raise_for_status()
class AnnotationUser(FastHttpUser):
global TEST_DATAS
sock = None
wait_time = between(0.5, 5)
TEST_DATAS = read_test_datas(conf_file=TEST_CONF)
tasks = [BasicalActions]
if __name__ == "__main__":
cmd = 'locust -f locust_demo.py'
os.system(cmd)
# Run in cmd
# locust -f ./testhub/testsuites/annotations_cvat/test_cvat_actions.py --conf ./testhub/testsuites/annotations_cvat/host.conf
# Source: code/hierarchical/model_h.py from SqWei17/hierarachical_ec2vae (MIT License)
import torch
from torch import nn
from torch.nn import functional as F
from torch.distributions import Normal

__all__ = ['Decoder8to4', 'Decoder4to2']


class Decoder8to4(nn.Module):
    def __init__(self, hidden_dims,
                 z1p_dim, z1r_dim,
                 z2p_dim, z2r_dim,
                 n_step,
                 k=1000):
        super(Decoder8to4, self).__init__()
        self.grucell_0 = nn.GRUCell(z1p_dim + z2p_dim,
                                    hidden_dims)
        self.grucell_1 = nn.GRUCell(
            z1r_dim + z2r_dim, hidden_dims)
        self.linear_init_0 = nn.Linear(z1p_dim, hidden_dims)
        self.linear_init_1 = nn.Linear(z1r_dim, hidden_dims)
        self.linear_out_0 = nn.Linear(hidden_dims, z2p_dim)
        self.linear_out_1 = nn.Linear(hidden_dims, z2r_dim)
        self.linear_out_0_ = nn.Linear(hidden_dims, z2p_dim)
        self.linear_out_1_ = nn.Linear(hidden_dims, z2r_dim)
        self.n_step = n_step
        self.hidden_dims = hidden_dims
        self.z1p_dim = z1p_dim
        self.z1r_dim = z1r_dim
        self.z2p_dim = z2p_dim
        self.z2r_dim = z2r_dim
        self.eps = 1
        self.samplep = None
        self.sampler = None
        self.iteration = 0
        self.k = torch.FloatTensor([k])

    def final_decoder84p(self, z):
        out = torch.zeros((z.size(0), self.z2p_dim))
        out[:, -1] = 1.
        x, hx = [], [None]
        t = torch.tanh(self.linear_init_0(z))
        hx = t
        if torch.cuda.is_available():
            out = out.cuda()
        for i in range(self.n_step):
            out = torch.cat([out, z], 1)
            hx = self.grucell_0(out, hx)
            out = self.linear_out_0(hx)
            x.append(out)
            if self.training:
                p = torch.rand(1).item()
                if p < self.eps:
                    out = out
                    # out = self.samplep[:, i, :]
                    # out = out.squeeze(1)
                else:
                    out = out
            else:
                out = out
        return torch.stack(x, 1)

    def final_decoder84r(self, z):
        out = torch.zeros((z.size(0), self.z2r_dim))
        out[:, -1] = 1.
        x, hx = [], [None]
        t = torch.tanh(self.linear_init_1(z))
        hx = t
        if torch.cuda.is_available():
            out = out.cuda()
        for i in range(self.n_step):
            out = torch.cat([out, z], 1)
            hx = self.grucell_1(out, hx)
            out = self.linear_out_1(hx)
            x.append(out)
            if self.training:
                p = torch.rand(1).item()
                if p < self.eps:
                    out = out
                    # out = self.sampler[:, i, :]
                    # out = out.squeeze(1)
                else:
                    out = out
                    self.eps = self.k / \
                        (self.k + torch.exp(self.iteration / self.k))
            else:
                out = out
        return torch.stack(x, 1)

    def forward(self, z_8p, z_8r, z_4p, z_4r):
        if self.training:
            self.samplep = z_4p
            self.sampler = z_4r
            self.iteration += 1
        z4p = self.final_decoder84p(z_8p)
        z4r = self.final_decoder84r(z_8r)
        output = (z4p, z4r)
        return output


class Decoder4to2(nn.Module):
    def __init__(self, hidden_dims,
                 z1p_dim, z1r_dim,
                 z2p_dim, z2r_dim,
                 n_step,
                 k=1000):
        super(Decoder4to2, self).__init__()
        self.grucell_0 = nn.GRUCell(z1p_dim + z2p_dim,
                                    hidden_dims)
        self.grucell_1 = nn.GRUCell(
            z1r_dim + z2r_dim, hidden_dims)
        self.linear_init_0 = nn.Linear(z1p_dim, hidden_dims)
        self.linear_init_1 = nn.Linear(z1r_dim, hidden_dims)
        self.linear_out_0 = nn.Linear(hidden_dims, z2p_dim)
        self.linear_out_1 = nn.Linear(hidden_dims, z2r_dim)
        self.linear_out_0_ = nn.Linear(hidden_dims, z2p_dim)
        self.linear_out_1_ = nn.Linear(hidden_dims, z2r_dim)
        self.n_step = n_step
        self.hidden_dims = hidden_dims
        self.z1p_dim = z1p_dim
        self.z1r_dim = z1r_dim
        self.z2p_dim = z2p_dim
        self.z2r_dim = z2r_dim
        self.eps = 1
        self.samplep = None
        self.sampler = None
        self.iteration = 0
        self.k = torch.FloatTensor([k])

    def final_decoder42p(self, z):
        out = torch.zeros((z.size(0), self.z2p_dim))
        out[:, -1] = 1.
        x, hx = [], [None]
        t = torch.tanh(self.linear_init_0(z))
        hx = t
        if torch.cuda.is_available():
            out = out.cuda()
        for i in range(self.n_step):
            out = torch.cat([out, z], 1)
            hx = self.grucell_0(out, hx)
            out = self.linear_out_0(hx)
            x.append(out)
            if self.training:
                p = torch.rand(1).item()
                if p < self.eps:
                    out = out
                    # out = self.samplep[:, i, :]
                    # out = out.squeeze(1)
                else:
                    out = out
            else:
                out = out
        return torch.stack(x, 1)

    def final_decoder42r(self, z):
        out = torch.zeros((z.size(0), self.z2r_dim))
        out[:, -1] = 1.
        x, hx = [], [None]
        t = torch.tanh(self.linear_init_1(z))
        hx = t
        if torch.cuda.is_available():
            out = out.cuda()
        for i in range(self.n_step):
            out = torch.cat([out, z], 1)
            hx = self.grucell_1(out, hx)
            out = self.linear_out_1(hx)
            x.append(out)
            if self.training:
                p = torch.rand(1).item()
                if p < self.eps:
                    out = out
                    # out = self.sampler[:, i, :]
                    # out = out.squeeze(1)
                else:
                    out = out
                    self.eps = self.k / \
                        (self.k + torch.exp(self.iteration / self.k))
            else:
                out = out
        return torch.stack(x, 1)

    def forward(self, z_4p, z_4r, z_2p, z_2r):
        if self.training:
            self.samplep = z_2p
            self.sampler = z_2r
            self.iteration += 1
        z2p = self.final_decoder42p(z_4p)
        z2r = self.final_decoder42r(z_4r)
        output = (z2p, z2r)
        return output
# Source: application/core/models.py from victor-freitas/ProjetoNCS (Apache-2.0 License)
# This is an auto-generated Django model module.
# You'll have to do the following manually to clean this up:
# * Rearrange models' order
# * Make sure each model has one field with primary_key=True
# * Make sure each ForeignKey has `on_delete` set to the desired behavior.
# * Remove `managed = False` lines if you wish to allow Django to create, modify, and delete the table
# Feel free to rename the models, but don't rename db_table values or field names.
from django.db import models
class Cliente(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    cpf_cnpj = models.IntegerField(db_column='CPF_CNPJ', unique=True)  # Field name made lowercase.
    # nome = models.CharField(db_column='NOME', max_length=100)  # Field name made lowercase.
    razao = models.CharField(db_column='RAZAO', max_length=100, blank=True, null=True)  # Field name made lowercase.
    endereco = models.CharField(db_column='ENDERECO', max_length=80)  # Field name made lowercase.
    cep = models.CharField(db_column='CEP', max_length=20)  # Field name made lowercase.
    email = models.CharField(db_column='EMAIL', max_length=200)  # Field name made lowercase.
    telefone = models.CharField(db_column='TELEFONE', max_length=11)  # Field name made lowercase.
    celular = models.CharField(db_column='CELULAR', max_length=11, blank=True, null=True)  # Field name made lowercase.
    id_seguimento = models.ForeignKey('Tiposeguimento', models.DO_NOTHING, db_column='ID_SEGUIMENTO')  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'Cliente'

    def __str__(self):
        return self.razao


class Fornecedor(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    cpf_cnpj = models.IntegerField(db_column='CPF_CNPJ', unique=True)  # Field name made lowercase.
    # nome = models.CharField(db_column='NOME', max_length=100)  # Field name made lowercase.
    razao = models.CharField(db_column='RAZAO', max_length=100, blank=True, null=True)  # Field name made lowercase.
    endereco = models.CharField(db_column='ENDERECO', max_length=80)  # Field name made lowercase.
    cep = models.CharField(db_column='CEP', max_length=20)  # Field name made lowercase.
    email = models.CharField(db_column='EMAIL', max_length=200)  # Field name made lowercase.
    telefone = models.CharField(db_column='TELEFONE', max_length=11)  # Field name made lowercase.
    celular = models.CharField(db_column='CELULAR', max_length=11, blank=True, null=True)  # Field name made lowercase.
    pessoa_contato = models.CharField(db_column='PESSOA_CONTATO', max_length=100, blank=True, null=True)  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'Fornecedor'

    def __str__(self):
        return self.razao


class Funcionario(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    nome = models.CharField(db_column='NOME', max_length=100)  # Field name made lowercase.
    cpf = models.IntegerField(db_column='CPF')  # Field name made lowercase.
    cargo = models.SmallIntegerField(db_column='CARGO')  # Field name made lowercase.
    id_setor = models.ForeignKey('Setor', models.DO_NOTHING, db_column='ID_SETOR')  # Field name made lowercase.
    login = models.CharField(db_column='LOGIN', max_length=100)  # Field name made lowercase.
    senha = models.CharField(db_column='SENHA', max_length=50)  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'Funcionario'


class Materiaprima(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    nome = models.CharField(db_column='NOME', max_length=60)  # Field name made lowercase.
    forma_emb = models.CharField(db_column='FORMA_EMB', max_length=60)  # Field name made lowercase.
    peso = models.CharField(db_column='PESO', max_length=20)  # Field name made lowercase.
    unid_medida = models.CharField(db_column='UNID_MEDIDA', max_length=50)  # Field name made lowercase.
    quantidade_max = models.IntegerField(db_column='QUANTIDADE_MAX')  # Field name made lowercase.
    quantidade = models.IntegerField(db_column='QUANTIDADE')  # Field name made lowercase.
    quantidade_min = models.IntegerField(db_column='QUANTIDADE_MIN')  # Field name made lowercase.
    descricao = models.CharField(db_column='DESCRICAO', max_length=500)  # Field name made lowercase.
    data_recebimento = models.DateField(db_column='DATA_RECEBIMENTO')  # Field name made lowercase.
    id_fornecedor = models.ForeignKey(Fornecedor, models.DO_NOTHING, db_column='ID_FORNECEDOR')  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'MateriaPrima'

    def __str__(self):
        return self.nome


class Ordemdeproducao(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    descricao = models.CharField(db_column='Descricao', max_length=500)  # Field name made lowercase.
    id_status = models.ForeignKey('Statusordemproducao', models.DO_NOTHING, db_column='ID_STATUS')  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'OrdemDeProducao'


class Pedido(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    data_pedido = models.DateField(db_column='DATA_PEDIDO')  # Field name made lowercase.
    valor = models.CharField(db_column='VALOR', max_length=20, blank=True, null=True)  # Field name made lowercase.
    id_cliente = models.ForeignKey(Cliente, models.DO_NOTHING, db_column='ID_CLIENTE')  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'Pedido'


class Pedidomp(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    data_pedido = models.DateField(db_column='DATA_PEDIDO')  # Field name made lowercase.
    data_prevista = models.DateField(db_column='DATA_PREVISTA')  # Field name made lowercase.
    descricao = models.CharField(db_column='DESCRICAO', max_length=500, blank=True, null=True)  # Field name made lowercase.
    valor = models.CharField(db_column='VALOR', max_length=20, blank=True, null=True)  # Field name made lowercase.
    id_fornecedor = models.ForeignKey(Fornecedor, models.DO_NOTHING, db_column='ID_FORNECEDOR')  # Field name made lowercase.
    id_funcionario = models.ForeignKey(Funcionario, models.DO_NOTHING, db_column='ID_FUNCIONARIO')  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'PedidoMP'


class Produto(models.Model):
    id = models.AutoField(db_column='ID', primary_key=True)  # Field name made lowercase.
    nome = models.CharField(db_column='NOME', max_length=60)  # Field name made lowercase.
    forma_emb = models.CharField(db_column='FORMA_EMB', max_length=60)  # Field name made lowercase.
    peso = models.CharField(db_column='PESO', max_length=20)  # Field name made lowercase.
    unid_medida = models.CharField(db_column='UNID_MEDIDA', max_length=50)  # Field name made lowercase.
    id_tipo = models.ForeignKey('Tipoproduto', models.DO_NOTHING, db_column='ID_TIPO')  # Field name made lowercase.
    preco = models.CharField(db_column='PRECO', max_length=10, blank=True, null=True)  # Field name made lowercase.
    quantidade = models.IntegerField(db_column='QUANTIDADE', blank=True, null=True)  # Field name made lowercase.
    desc_produto = models.CharField(db_column='DESC_PRODUTO', max_length=500)  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'Produto'


class Setor(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    nome = models.CharField(db_column='NOME', max_length=100)  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'Setor'


class Statusordemproducao(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    status_nome = models.CharField(db_column='STATUS_NOME', max_length=30)  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'StatusOrdemProducao'


class Tipoproduto(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    nome = models.CharField(db_column='NOME', max_length=100)  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'TipoProduto'


class Tiposeguimento(models.Model):
    id = models.SmallIntegerField(db_column='ID', primary_key=True)  # Field name made lowercase.
    nome = models.CharField(db_column='NOME', max_length=100)  # Field name made lowercase.

    class Meta:
        managed = False
        db_table = 'TipoSeguimento'
# Source: restler/unit_tests/log_baseline_test_files/test_grammar_body.py from mkleshchenok/restler-fuzzer (MIT License)
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
from __future__ import print_function
import json
from engine import primitives
from engine.core import requests
from engine.errors import ResponseParsingException
from engine import dependencies
_city_put_name = dependencies.DynamicVariable(
    "_city_put_name"
)

_city_house_put_name = dependencies.DynamicVariable(
    "_city_house_put_name"
)

def parse_cityNamePut(data):
    temp_123 = None
    try:
        data = json.loads(data)
    except Exception as error:
        raise ResponseParsingException("Exception parsing response, data was not valid json: {}".format(error))

    try:
        temp_123 = str(data["name"])
    except Exception as error:
        pass

    if temp_123:
        dependencies.set_variable("_city_put_name", temp_123)

def parse_cityHouseNamePut(data):
    temp_123 = None
    try:
        data = json.loads(data)
    except Exception as error:
        raise ResponseParsingException("Exception parsing response, data was not valid json: {}".format(error))

    try:
        temp_123 = str(data["name"])
    except Exception as error:
        pass

    if temp_123:
        dependencies.set_variable("_city_house_put_name", temp_123)

req_collection = requests.RequestCollection([])

request = requests.Request([
    primitives.restler_static_string("GET "),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("city"),
    primitives.restler_static_string(" HTTP/1.1\r\n"),
    primitives.restler_static_string("Accept: application/json\r\n"),
    primitives.restler_static_string("Host: restler.unit.test.server.com\r\n"),
    primitives.restler_static_string("Content-Type: application/json\r\n"),
    primitives.restler_refreshable_authentication_token("authentication_token_tag"),
    primitives.restler_static_string("\r\n")
],
requestId="/city"
)
req_collection.add_request(request)

request = requests.Request([
    primitives.restler_static_string("PUT "),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("city"),
    primitives.restler_static_string("/"),
    primitives.restler_custom_payload_uuid4_suffix("cityName"),
    primitives.restler_static_string(" HTTP/1.1\r\n"),
    primitives.restler_static_string("Accept: application/json\r\n"),
    primitives.restler_static_string("Host: restler.unit.test.server.com\r\n"),
    primitives.restler_static_string("Content-Type: application/json\r\n"),
    primitives.restler_refreshable_authentication_token("authentication_token_tag"),
    primitives.restler_static_string("\r\n"),
    primitives.restler_static_string("{"),
    primitives.restler_static_string('"properties":'),
    primitives.restler_static_string("{"),
    primitives.restler_static_string('"population":'),
    primitives.restler_fuzzable_int('10000', quoted=True),
    primitives.restler_static_string(', "area": "5000",'),
    primitives.restler_fuzzable_string('strtest', quoted=True),
    primitives.restler_static_string(':'),
    primitives.restler_fuzzable_bool('true', quoted=True),
    primitives.restler_static_string(',"subproperties":'),
    primitives.restler_static_string("{"),
    primitives.restler_static_string('"subtest":'),
    primitives.restler_fuzzable_bool("true", quoted=False),
    primitives.restler_static_string("}"),
    primitives.restler_static_string("}"),
    primitives.restler_static_string("}"),
    primitives.restler_static_string("\r\n"),
    {
        'post_send':
        {
            'parser': parse_cityNamePut,
            'dependencies':
            [
                _city_put_name.writer()
            ]
        }
    },
],
requestId="/city/{cityName}"
)
req_collection.add_request(request)

request = requests.Request([
    primitives.restler_static_string("GET "),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("city"),
    primitives.restler_static_string("/"),
    primitives.restler_static_string(_city_put_name.reader()),
    primitives.restler_static_string("?"),
    primitives.restler_static_string("location="),
    primitives.restler_custom_payload("location"),
    primitives.restler_static_string("&"),
    primitives.restler_static_string("group="),
    primitives.restler_fuzzable_group("fuzzable_group_tag", ['A','BB','CCC']),
    primitives.restler_static_string(" HTTP/1.1\r\n"),
    primitives.restler_static_string("Accept: application/json\r\n"),
    primitives.restler_static_string("Host: restler.unit.test.server.com\r\n"),
    primitives.restler_static_string("Content-Type: application/json\r\n"),
    primitives.restler_refreshable_authentication_token("authentication_token_tag"),
    primitives.restler_static_string("\r\n")
],
requestId="/city/{cityName}"
)
req_collection.add_request(request)

request = requests.Request([
    primitives.restler_static_string("DELETE "),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("city"),
    primitives.restler_static_string("/"),
    primitives.restler_static_string(_city_put_name.reader()),
    primitives.restler_static_string(" HTTP/1.1\r\n"),
    primitives.restler_static_string("Accept: application/json\r\n"),
    primitives.restler_static_string("Host: restler.unit.test.server.com\r\n"),
    primitives.restler_static_string("Content-Type: application/json\r\n"),
    primitives.restler_refreshable_authentication_token("authentication_token_tag"),
    primitives.restler_static_string("\r\n")
],
requestId="/city/{cityName}"
)
req_collection.add_request(request)

request = requests.Request([
    primitives.restler_static_string("GET "),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("city"),
    primitives.restler_static_string("/"),
    primitives.restler_static_string(_city_put_name.reader()),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("house"),
    primitives.restler_static_string(" HTTP/1.1\r\n"),
    primitives.restler_static_string("Accept: application/json\r\n"),
    primitives.restler_static_string("Host: restler.unit.test.server.com\r\n"),
    primitives.restler_static_string("Content-Type: application/json\r\n"),
    primitives.restler_refreshable_authentication_token("authentication_token_tag"),
    primitives.restler_static_string("\r\n")
],
requestId="/city/{cityName}/house"
)
req_collection.add_request(request)

request = requests.Request([
    primitives.restler_static_string("PUT "),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("city"),
    primitives.restler_static_string("/"),
    primitives.restler_static_string(_city_put_name.reader()),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("house"),
    primitives.restler_static_string("/"),
    primitives.restler_custom_payload_uuid4_suffix("houseName"),
    primitives.restler_static_string(" HTTP/1.1\r\n"),
    primitives.restler_static_string("Accept: application/json\r\n"),
    primitives.restler_static_string("Host: restler.unit.test.server.com\r\n"),
    primitives.restler_static_string("Content-Type: application/json\r\n"),
    primitives.restler_refreshable_authentication_token("authentication_token_tag"),
    primitives.restler_static_string("\r\n"),
    primitives.restler_static_string("["),
    primitives.restler_static_string("{"),
    primitives.restler_static_string('"house":'),
    primitives.restler_custom_payload_uuid4_suffix("houseName", quoted=True),
    primitives.restler_static_string(',"group":'),
    primitives.restler_fuzzable_group("fuzzable_group_tag", ['A','BB','CCC'], quoted=True),
    primitives.restler_static_string("}"),
    primitives.restler_static_string(","),
    primitives.restler_static_string("{"),
    primitives.restler_static_string('"arraytest":'),
    primitives.restler_custom_payload("location", quoted=True),
    primitives.restler_static_string("}"),
    primitives.restler_static_string("]"),
    primitives.restler_static_string("\r\n"),
    {
        'post_send':
        {
            'parser': parse_cityHouseNamePut,
            'dependencies':
            [
                _city_house_put_name.writer()
            ]
        }
    },
],
requestId="/city/{cityName}/house/{houseName}"
)
req_collection.add_request(request)

request = requests.Request([
    primitives.restler_static_string("GET "),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("city"),
    primitives.restler_static_string("/"),
    primitives.restler_static_string(_city_put_name.reader()),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("house"),
    primitives.restler_static_string("/"),
    primitives.restler_static_string(_city_house_put_name.reader()),
    primitives.restler_static_string(" HTTP/1.1\r\n"),
    primitives.restler_static_string("Accept: application/json\r\n"),
    primitives.restler_static_string("Host: restler.unit.test.server.com\r\n"),
    primitives.restler_static_string("Content-Type: application/json\r\n"),
    primitives.restler_refreshable_authentication_token("authentication_token_tag"),
    primitives.restler_static_string("\r\n")
],
requestId="/city/{cityName}/house/{houseName}"
)
req_collection.add_request(request)

request = requests.Request([
    primitives.restler_static_string("DELETE "),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("city"),
    primitives.restler_static_string("/"),
    primitives.restler_static_string(_city_put_name.reader()),
    primitives.restler_static_string("/"),
    primitives.restler_static_string("house"),
    primitives.restler_static_string("/"),
    primitives.restler_static_string(_city_house_put_name.reader()),
    primitives.restler_static_string(" HTTP/1.1\r\n"),
    primitives.restler_static_string("Accept: application/json\r\n"),
    primitives.restler_static_string("Host: restler.unit.test.server.com\r\n"),
    primitives.restler_static_string("Content-Type: application/json\r\n"),
    primitives.restler_refreshable_authentication_token("authentication_token_tag"),
    primitives.restler_static_string("\r\n")
],
requestId="/city/{cityName}/house/{houseName}"
)
req_collection.add_request(request)
# Source: loldib/getratings/models/NA/na_warwick/__init__.py from koliupy/loldib (Apache-2.0 License)
from .na_warwick_top import *
from .na_warwick_jng import *
from .na_warwick_mid import *
from .na_warwick_bot import *
from .na_warwick_sup import *
# Source: bentoml/h2o.py from francoisserra/BentoML (Apache-2.0 License)
from ._internal.frameworks.h2o import load
from ._internal.frameworks.h2o import save
from ._internal.frameworks.h2o import load_runner
__all__ = ["load", "load_runner", "save"]
# Source: matplotlib/matplot3.py from dogancantorun8/python-application (MIT License)
# -*- coding: utf-8 -*-
"""
Created on Sun Jan 17 17:12:38 2021
@author: Dogancan Torun
"""
# To draw histograms of several datasets at once:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
df = pd.read_csv(r'Covid-19-Datasets\covid_19_data.csv')
x = np.random.randn(2000).reshape(1000, 2)
plt.title('Normal Random Numbers Histogram', fontsize=15)
plt.xlabel('x', fontsize=12)
plt.ylabel('frequency', fontsize=12)
plt.xticks(np.arange(-10, 10, 0.5))
plt.hist(x, bins=20, color=['green', 'blue'])
# Using the edgecolor property of the histogram:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
df = pd.read_csv(r'Covid-19-Datasets\covid_19_data.csv')
x = np.random.randn(2000)
plt.title('Normal Random Numbers Histogram', fontsize=15)
plt.xlabel('x', fontsize=12)
plt.ylabel('frequency', fontsize=12)
plt.xticks(np.arange(-10, 10, 0.5))
plt.hist(x, rwidth=1, edgecolor='red', color='yellow')
# Drawing a pie chart
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
x = [360, 500, 180, 900, 1300]
labels = ['XRP', 'BTC', 'Avax', 'NEO', 'BTG']
plt.pie(x, labels=labels)
# To change the colors of the pie slices:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
#df = pd.read_csv(r'Covid-19-Datasets\covid_19_data.csv')
x = [360, 500, 180, 900, 1300]
labels = ['XRP', 'BTC', 'Avax', 'NEO', 'BTG']
plt.pie(x, labels=labels, colors=['red', 'green', 'blue', 'yellow', 'magenta'])
# The explode parameter detaches a chosen slice from the pie
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
#df = pd.read_csv(r'Covid-19-Datasets\covid_19_data.csv')
x = [360, 500, 180, 900, 1300]
labels = ['XRP', 'BTC', 'Avax', 'NEO', 'BTG']
plt.pie(x, labels=labels, colors=['red', 'green', 'blue', 'yellow', 'magenta'], explode=[0.1, 0, 0, 0, 0])
# Use autopct to show percentages on the pie slices
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
x = [664812242762, 141428512203, 24330211680, 15364089757, 12576793068]
labels = ['BTC', 'ETH', 'USDT', 'DOT', 'XRP']
plt.pie(x, labels=labels, explode=[0, 0, 0, 0, 0], autopct='%.0f%%')
# Enlarge the chart by changing the pie radius; Matplotlib sizes are given in inches, not pixels
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
x = [664812242762, 141428512203, 24330211680, 15364089757, 12576793068]
labels = ['BTC', 'ETH', 'USDT', 'DOT', 'XRP']
plt.pie(x, labels=labels, explode=[0, 0, 0, 0, 0], autopct='%.0f%%', radius=2)
# The figure size can also be changed, in inches
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
x = [664812242762, 141428512203, 24330211680, 15364089757, 12576793068]
labels = ['BTC', 'ETH', 'USDT', 'DOT', 'XRP']
plt.figure(figsize=(10, 10))  # configured via the figsize parameter
plt.pie(x, labels=labels, explode=[0, 0, 0, 0, 0], autopct='%.0f%%')
| 30.377551 | 106 | 0.717165 | 459 | 2,977 | 4.625272 | 0.298475 | 0.014131 | 0.015544 | 0.056524 | 0.720207 | 0.720207 | 0.720207 | 0.720207 | 0.720207 | 0.720207 | 0 | 0.122205 | 0.128653 | 2,977 | 97 | 107 | 30.690722 | 0.696222 | 0.237151 | 0 | 0.842105 | 0 | 0 | 0.14924 | 0.031278 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.421053 | 0 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
6c526a0744d6295ed91ceeb44f102d86957fbb67 | 8,218 | py | Python | users/tests.py | nikita03565/projects_fair | e4a6095f804f500a0285332a5fab051b2b61acc1 | [
"Apache-2.0"
] | null | null | null | users/tests.py | nikita03565/projects_fair | e4a6095f804f500a0285332a5fab051b2b61acc1 | [
"Apache-2.0"
] | null | null | null | users/tests.py | nikita03565/projects_fair | e4a6095f804f500a0285332a5fab051b2b61acc1 | [
"Apache-2.0"
] | null | null | null | from rest_framework.test import APITestCase, APIClient
from rest_framework import status
from rest_framework.authtoken.models import Token
from django.urls import reverse
from .models import User
class AccountTests(APITestCase):
def test_new_user_registration(self):
url = reverse("signup")
data = {
'email': 'foobar@example.com',
'password': 'somepassword',
'first_name': 'First',
'last_name': 'Last'
}
response = self.client.post(url, data, format='json')
self.assertEqual(User.objects.count(), 1)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(response.data['email'], data['email'])
self.assertFalse('password' in response.data)
token = Token.objects.get(user=User.objects.latest('id'))
self.assertEqual(response.data['token'], token.key)
def test_registration_user_with_short_password(self):
url = reverse("signup")
data = {
'email': 'foobarbaz@example.com',
'password': 'foo',
'first_name': 'First',
'last_name': 'Last'
}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(User.objects.count(), 0)
self.assertEqual(len(response.data['password']), 1)
def test_registration_user_with_no_password(self):
url = reverse("signup")
data = {
'email': 'foobarbaz@example.com',
'password': '',
'first_name': 'First',
'last_name': 'Last'
}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(User.objects.count(), 0)
self.assertEqual(len(response.data['password']), 1)
def test_create_user_with_invalid_email(self):
url = reverse("signup")
data = {
'email': 'testing',
            'password': 'foobarbaz',
'first_name': 'First',
'last_name': 'Last'
}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(User.objects.count(), 0)
self.assertEqual(len(response.data['email']), 1)
def test_registration_user_with_no_email(self):
url = reverse("signup")
data = {
'email': '',
'password': 'foobar123',
'first_name': 'First',
'last_name': 'Last'
}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(User.objects.count(), 0)
self.assertEqual(len(response.data['email']), 1)
def test_registration_user_with_preexisting_email(self):
url = reverse("signup")
data = {
'email': 'foobar@example.com',
'password': 'somepassword',
'first_name': 'First',
'last_name': 'Last'
}
User.objects.create_user_by_email(data["email"], data["password"], data["first_name"], data["last_name"])
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(User.objects.count(), 1)
def test_authorize_registered_user(self):
url = reverse("signup")
data = {
'email': 'foobar@example.com',
'password': 'somepassword',
'first_name': 'First',
'last_name': 'Last'
}
response = self.client.post(url, data, format='json')
token1 = response.data['token']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url = reverse("signin")
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(token1, response.data['token'])
def test_authorize_user(self):
self.user = User.objects.create_user(username='testUser', password='12345', email="test@test.test")
url = reverse("signin")
data = {'email': 'test@test.test', 'password': '12345'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_authorize_empty_fields(self):
url = reverse("signin")
data = {'email': '', 'password': '12345'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(response.data['email'][0], "This field may not be blank.")
data = {'email': 'test@test.test', 'password': ''}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(response.data['password'][0], "This field may not be blank.")
data = {'email': '', 'password': ''}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(response.data['password'][0], "This field may not be blank.")
self.assertEqual(response.data['email'][0], "This field may not be blank.")
def test_authorize_wrongData(self):
url = reverse("signin")
data = {'email': 'wrongtest@test.test', 'password': '12345'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
data = {'email': 'test@test.test', 'password': '12345wrong7'}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_change_password_unauthorized(self):
url = reverse("password-change")
self.user = User.objects.create_user(username='testUser', password='12345', email="test@test.test")
response = self.client.post(url, {'new_password': "12345678"}, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_change_password(self):
url = reverse("password-change")
self.user = User.objects.create_user(username='testUser', password='12345', email="test@test.test")
data = {'email': 'test@test.test', 'password': '12345'}
response = self.client.post(reverse("signin"), data, format='json')
client = APIClient()
client.credentials(HTTP_AUTHORIZATION='Token ' + response.data['token'])
response = client.post(url, {'new_password': "12345678"}, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data['detail'], 'Password has been saved.')
response = self.client.post(reverse("signin"), data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
response = self.client.post(reverse("signin"), {'email': 'test@test.test', 'password': '12345678'}, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_user_signout_unauthorized(self):
url = reverse("signout")
response = self.client.post(url, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_user_signout(self):
url = reverse("signout")
self.user = User.objects.create_user(username='testUser', password='12345', email="test@test.test")
data = {'email': 'test@test.test', 'password': '12345'}
response = self.client.post(reverse("signin"), data, format='json')
token = response.data['token']
client = APIClient()
client.credentials(HTTP_AUTHORIZATION='Token ' + token)
response = client.post(url, format='json')
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
response = client.post(url, {'new_password': "12345678"}, format='json')
| 44.421622 | 122 | 0.635434 | 941 | 8,218 | 5.388948 | 0.108395 | 0.112404 | 0.122461 | 0.086768 | 0.823901 | 0.797476 | 0.767501 | 0.727076 | 0.727076 | 0.684283 | 0 | 0.025277 | 0.220127 | 8,218 | 184 | 123 | 44.663043 | 0.765954 | 0 | 0 | 0.660377 | 0 | 0 | 0.169019 | 0.005111 | 0 | 0 | 0 | 0 | 0.245283 | 1 | 0.08805 | false | 0.226415 | 0.031447 | 0 | 0.125786 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
dd84cdf25ea74e3205f1e43bedbcf230de5840d3 | 400 | py | Python | response/serializers.py | dmichau85/response | 672a675660400766286628c349f324bff56e986c | [
"MIT"
] | 2 | 2020-09-17T17:24:32.000Z | 2020-10-16T10:49:03.000Z | response/serializers.py | dmichau85/response | 672a675660400766286628c349f324bff56e986c | [
"MIT"
] | 39 | 2020-10-02T15:56:55.000Z | 2022-01-19T11:58:41.000Z | response/serializers.py | dmichau85/response | 672a675660400766286628c349f324bff56e986c | [
"MIT"
] | 3 | 2020-10-30T19:46:31.000Z | 2021-05-14T04:59:39.000Z | from .core.serializers import (ActionSerializer, CommsChannelSerializer,
EventSerializer, ExternalUserSerializer,
IncidentSerializer, TimelineEventSerializer)
__all__ = (
"ActionSerializer",
"CommsChannelSerializer",
"EventSerializer",
"ExternalUserSerializer",
"IncidentSerializer",
"TimelineEventSerializer",
)
| 30.769231 | 75 | 0.66 | 17 | 400 | 15.294118 | 0.647059 | 0.292308 | 0.407692 | 0.576923 | 0.892308 | 0.892308 | 0 | 0 | 0 | 0 | 0 | 0 | 0.265 | 400 | 12 | 76 | 33.333333 | 0.884354 | 0 | 0 | 0 | 0 | 0 | 0.29 | 0.1675 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.090909 | 0 | 1 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6630987ef7de453fc6739a3f6d8bbe33a993510c | 109 | py | Python | sfaira/versions/topologies/human/__init__.py | theislab/sfaira | 77a7b49936047a0cdddc5ace4482186a868c3a7a | [
"BSD-3-Clause"
] | 110 | 2020-09-08T07:47:15.000Z | 2022-03-29T03:33:56.000Z | sfaira/versions/topologies/human/__init__.py | theislab/sfaira | 77a7b49936047a0cdddc5ace4482186a868c3a7a | [
"BSD-3-Clause"
] | 405 | 2020-09-15T15:05:46.000Z | 2022-03-16T14:44:23.000Z | sfaira/versions/topologies/human/__init__.py | theislab/sfaira | 77a7b49936047a0cdddc5ace4482186a868c3a7a | [
"BSD-3-Clause"
] | 20 | 2021-03-30T15:30:14.000Z | 2022-03-07T12:52:58.000Z | from sfaira.versions.topologies.human import celltype
from sfaira.versions.topologies.human import embedding
| 36.333333 | 54 | 0.87156 | 14 | 109 | 6.785714 | 0.571429 | 0.210526 | 0.378947 | 0.589474 | 0.821053 | 0.821053 | 0 | 0 | 0 | 0 | 0 | 0 | 0.073395 | 109 | 2 | 55 | 54.5 | 0.940594 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
b0df61be96519e42aacbacced4d542eb3ef659a6 | 31,252 | py | Python | sdk/python/pulumi_azure/datafactory/data_flow.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 109 | 2018-06-18T00:19:44.000Z | 2022-02-20T05:32:57.000Z | sdk/python/pulumi_azure/datafactory/data_flow.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 663 | 2018-06-18T21:08:46.000Z | 2022-03-31T20:10:11.000Z | sdk/python/pulumi_azure/datafactory/data_flow.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 41 | 2018-07-19T22:37:38.000Z | 2022-03-14T10:56:26.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['DataFlowArgs', 'DataFlow']
@pulumi.input_type
class DataFlowArgs:
def __init__(__self__, *,
data_factory_id: pulumi.Input[str],
script: pulumi.Input[str],
sinks: pulumi.Input[Sequence[pulumi.Input['DataFlowSinkArgs']]],
sources: pulumi.Input[Sequence[pulumi.Input['DataFlowSourceArgs']]],
annotations: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
description: Optional[pulumi.Input[str]] = None,
folder: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
transformations: Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowTransformationArgs']]]] = None):
"""
The set of arguments for constructing a DataFlow resource.
:param pulumi.Input[str] data_factory_id: The ID of Data Factory in which to associate the Data Flow with. Changing this forces a new resource.
:param pulumi.Input[str] script: The script for the Data Factory Data Flow.
:param pulumi.Input[Sequence[pulumi.Input['DataFlowSinkArgs']]] sinks: One or more `sink` blocks as defined below.
:param pulumi.Input[Sequence[pulumi.Input['DataFlowSourceArgs']]] sources: One or more `source` blocks as defined below.
:param pulumi.Input[Sequence[pulumi.Input[str]]] annotations: List of tags that can be used for describing the Data Factory Data Flow.
:param pulumi.Input[str] description: The description for the Data Factory Data Flow.
:param pulumi.Input[str] folder: The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
:param pulumi.Input[str] name: Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
:param pulumi.Input[Sequence[pulumi.Input['DataFlowTransformationArgs']]] transformations: One or more `transformation` blocks as defined below.
"""
pulumi.set(__self__, "data_factory_id", data_factory_id)
pulumi.set(__self__, "script", script)
pulumi.set(__self__, "sinks", sinks)
pulumi.set(__self__, "sources", sources)
if annotations is not None:
pulumi.set(__self__, "annotations", annotations)
if description is not None:
pulumi.set(__self__, "description", description)
if folder is not None:
pulumi.set(__self__, "folder", folder)
if name is not None:
pulumi.set(__self__, "name", name)
if transformations is not None:
pulumi.set(__self__, "transformations", transformations)
@property
@pulumi.getter(name="dataFactoryId")
def data_factory_id(self) -> pulumi.Input[str]:
"""
The ID of Data Factory in which to associate the Data Flow with. Changing this forces a new resource.
"""
return pulumi.get(self, "data_factory_id")
@data_factory_id.setter
def data_factory_id(self, value: pulumi.Input[str]):
pulumi.set(self, "data_factory_id", value)
@property
@pulumi.getter
def script(self) -> pulumi.Input[str]:
"""
The script for the Data Factory Data Flow.
"""
return pulumi.get(self, "script")
@script.setter
def script(self, value: pulumi.Input[str]):
pulumi.set(self, "script", value)
@property
@pulumi.getter
def sinks(self) -> pulumi.Input[Sequence[pulumi.Input['DataFlowSinkArgs']]]:
"""
One or more `sink` blocks as defined below.
"""
return pulumi.get(self, "sinks")
@sinks.setter
def sinks(self, value: pulumi.Input[Sequence[pulumi.Input['DataFlowSinkArgs']]]):
pulumi.set(self, "sinks", value)
@property
@pulumi.getter
def sources(self) -> pulumi.Input[Sequence[pulumi.Input['DataFlowSourceArgs']]]:
"""
One or more `source` blocks as defined below.
"""
return pulumi.get(self, "sources")
@sources.setter
def sources(self, value: pulumi.Input[Sequence[pulumi.Input['DataFlowSourceArgs']]]):
pulumi.set(self, "sources", value)
@property
@pulumi.getter
def annotations(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of tags that can be used for describing the Data Factory Data Flow.
"""
return pulumi.get(self, "annotations")
@annotations.setter
def annotations(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "annotations", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
The description for the Data Factory Data Flow.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def folder(self) -> Optional[pulumi.Input[str]]:
"""
The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
"""
return pulumi.get(self, "folder")
@folder.setter
def folder(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "folder", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def transformations(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowTransformationArgs']]]]:
"""
One or more `transformation` blocks as defined below.
"""
return pulumi.get(self, "transformations")
@transformations.setter
def transformations(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowTransformationArgs']]]]):
pulumi.set(self, "transformations", value)
@pulumi.input_type
class _DataFlowState:
def __init__(__self__, *,
annotations: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
data_factory_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
folder: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
script: Optional[pulumi.Input[str]] = None,
sinks: Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowSinkArgs']]]] = None,
sources: Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowSourceArgs']]]] = None,
transformations: Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowTransformationArgs']]]] = None):
"""
Input properties used for looking up and filtering DataFlow resources.
:param pulumi.Input[Sequence[pulumi.Input[str]]] annotations: List of tags that can be used for describing the Data Factory Data Flow.
:param pulumi.Input[str] data_factory_id: The ID of Data Factory in which to associate the Data Flow with. Changing this forces a new resource.
:param pulumi.Input[str] description: The description for the Data Factory Data Flow.
:param pulumi.Input[str] folder: The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
:param pulumi.Input[str] name: Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
:param pulumi.Input[str] script: The script for the Data Factory Data Flow.
:param pulumi.Input[Sequence[pulumi.Input['DataFlowSinkArgs']]] sinks: One or more `sink` blocks as defined below.
:param pulumi.Input[Sequence[pulumi.Input['DataFlowSourceArgs']]] sources: One or more `source` blocks as defined below.
:param pulumi.Input[Sequence[pulumi.Input['DataFlowTransformationArgs']]] transformations: One or more `transformation` blocks as defined below.
"""
if annotations is not None:
pulumi.set(__self__, "annotations", annotations)
if data_factory_id is not None:
pulumi.set(__self__, "data_factory_id", data_factory_id)
if description is not None:
pulumi.set(__self__, "description", description)
if folder is not None:
pulumi.set(__self__, "folder", folder)
if name is not None:
pulumi.set(__self__, "name", name)
if script is not None:
pulumi.set(__self__, "script", script)
if sinks is not None:
pulumi.set(__self__, "sinks", sinks)
if sources is not None:
pulumi.set(__self__, "sources", sources)
if transformations is not None:
pulumi.set(__self__, "transformations", transformations)
@property
@pulumi.getter
def annotations(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of tags that can be used for describing the Data Factory Data Flow.
"""
return pulumi.get(self, "annotations")
@annotations.setter
def annotations(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "annotations", value)
@property
@pulumi.getter(name="dataFactoryId")
def data_factory_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of Data Factory in which to associate the Data Flow with. Changing this forces a new resource.
"""
return pulumi.get(self, "data_factory_id")
@data_factory_id.setter
def data_factory_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "data_factory_id", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
The description for the Data Factory Data Flow.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def folder(self) -> Optional[pulumi.Input[str]]:
"""
The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
"""
return pulumi.get(self, "folder")
@folder.setter
def folder(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "folder", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def script(self) -> Optional[pulumi.Input[str]]:
"""
The script for the Data Factory Data Flow.
"""
return pulumi.get(self, "script")
@script.setter
def script(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "script", value)
@property
@pulumi.getter
def sinks(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowSinkArgs']]]]:
"""
One or more `sink` blocks as defined below.
"""
return pulumi.get(self, "sinks")
@sinks.setter
def sinks(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowSinkArgs']]]]):
pulumi.set(self, "sinks", value)
@property
@pulumi.getter
def sources(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowSourceArgs']]]]:
"""
One or more `source` blocks as defined below.
"""
return pulumi.get(self, "sources")
@sources.setter
def sources(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowSourceArgs']]]]):
pulumi.set(self, "sources", value)
@property
@pulumi.getter
def transformations(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowTransformationArgs']]]]:
"""
One or more `transformation` blocks as defined below.
"""
return pulumi.get(self, "transformations")
@transformations.setter
def transformations(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['DataFlowTransformationArgs']]]]):
pulumi.set(self, "transformations", value)
class DataFlow(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
annotations: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
data_factory_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
folder: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
script: Optional[pulumi.Input[str]] = None,
sinks: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSinkArgs']]]]] = None,
sources: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSourceArgs']]]]] = None,
transformations: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowTransformationArgs']]]]] = None,
__props__=None):
"""
Manages a Data Flow inside an Azure Data Factory.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_account = azure.storage.Account("exampleAccount",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
account_tier="Standard",
account_replication_type="LRS")
example_factory = azure.datafactory.Factory("exampleFactory",
location=example_resource_group.location,
resource_group_name=example_resource_group.name)
example_linked_custom_service = azure.datafactory.LinkedCustomService("exampleLinkedCustomService",
data_factory_id=example_factory.id,
type="AzureBlobStorage",
type_properties_json=example_account.primary_connection_string.apply(lambda primary_connection_string: f\"\"\"{{
"connectionString": "{primary_connection_string}"
}}
\"\"\"))
example1 = azure.datafactory.DatasetJson("example1",
resource_group_name=example_resource_group.name,
data_factory_name=example_factory.name,
linked_service_name=example_linked_custom_service.name,
azure_blob_storage_location=azure.datafactory.DatasetJsonAzureBlobStorageLocationArgs(
container="container",
path="foo/bar/",
filename="foo.txt",
),
encoding="UTF-8")
example2 = azure.datafactory.DatasetJson("example2",
resource_group_name=example_resource_group.name,
data_factory_name=example_factory.name,
linked_service_name=example_linked_custom_service.name,
azure_blob_storage_location=azure.datafactory.DatasetJsonAzureBlobStorageLocationArgs(
container="container",
path="foo/bar/",
filename="bar.txt",
),
encoding="UTF-8")
example_data_flow = azure.datafactory.DataFlow("exampleDataFlow",
data_factory_id=example_factory.id,
sources=[azure.datafactory.DataFlowSourceArgs(
name="source1",
dataset=azure.datafactory.DataFlowSourceDatasetArgs(
name=example1.name,
),
)],
sinks=[azure.datafactory.DataFlowSinkArgs(
name="sink1",
dataset=azure.datafactory.DataFlowSinkDatasetArgs(
name=example2.name,
),
)],
script=\"\"\"source(
allowSchemaDrift: true,
validateSchema: false,
limit: 100,
ignoreNoFilesFound: false,
documentForm: 'documentPerLine') ~> source1
source1 sink(
allowSchemaDrift: true,
validateSchema: false,
skipDuplicateMapInputs: true,
skipDuplicateMapOutputs: true) ~> sink1
\"\"\")
```
## Import
Data Factory Data Flow can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:datafactory/dataFlow:DataFlow example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example/providers/Microsoft.DataFactory/factories/example/dataflows/example
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[str]]] annotations: List of tags that can be used for describing the Data Factory Data Flow.
:param pulumi.Input[str] data_factory_id: The ID of Data Factory in which to associate the Data Flow with. Changing this forces a new resource.
:param pulumi.Input[str] description: The description for the Data Factory Data Flow.
:param pulumi.Input[str] folder: The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
:param pulumi.Input[str] name: Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
:param pulumi.Input[str] script: The script for the Data Factory Data Flow.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSinkArgs']]]] sinks: One or more `sink` blocks as defined below.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSourceArgs']]]] sources: One or more `source` blocks as defined below.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowTransformationArgs']]]] transformations: One or more `transformation` blocks as defined below.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: DataFlowArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a Data Flow inside an Azure Data Factory.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_account = azure.storage.Account("exampleAccount",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
account_tier="Standard",
account_replication_type="LRS")
example_factory = azure.datafactory.Factory("exampleFactory",
location=example_resource_group.location,
resource_group_name=example_resource_group.name)
example_linked_custom_service = azure.datafactory.LinkedCustomService("exampleLinkedCustomService",
data_factory_id=example_factory.id,
type="AzureBlobStorage",
type_properties_json=example_account.primary_connection_string.apply(lambda primary_connection_string: f\"\"\"{{
"connectionString": "{primary_connection_string}"
}}
\"\"\"))
example1 = azure.datafactory.DatasetJson("example1",
resource_group_name=example_resource_group.name,
data_factory_name=example_factory.name,
linked_service_name=example_linked_custom_service.name,
azure_blob_storage_location=azure.datafactory.DatasetJsonAzureBlobStorageLocationArgs(
container="container",
path="foo/bar/",
filename="foo.txt",
),
encoding="UTF-8")
example2 = azure.datafactory.DatasetJson("example2",
resource_group_name=example_resource_group.name,
data_factory_name=example_factory.name,
linked_service_name=example_linked_custom_service.name,
azure_blob_storage_location=azure.datafactory.DatasetJsonAzureBlobStorageLocationArgs(
container="container",
path="foo/bar/",
filename="bar.txt",
),
encoding="UTF-8")
example_data_flow = azure.datafactory.DataFlow("exampleDataFlow",
data_factory_id=example_factory.id,
sources=[azure.datafactory.DataFlowSourceArgs(
name="source1",
dataset=azure.datafactory.DataFlowSourceDatasetArgs(
name=example1.name,
),
)],
sinks=[azure.datafactory.DataFlowSinkArgs(
name="sink1",
dataset=azure.datafactory.DataFlowSinkDatasetArgs(
name=example2.name,
),
)],
script=\"\"\"source(
allowSchemaDrift: true,
validateSchema: false,
limit: 100,
ignoreNoFilesFound: false,
documentForm: 'documentPerLine') ~> source1
source1 sink(
allowSchemaDrift: true,
validateSchema: false,
skipDuplicateMapInputs: true,
skipDuplicateMapOutputs: true) ~> sink1
\"\"\")
```
## Import
Data Factory Data Flow can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:datafactory/dataFlow:DataFlow example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example/providers/Microsoft.DataFactory/factories/example/dataflows/example
```
:param str resource_name: The name of the resource.
:param DataFlowArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(DataFlowArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
annotations: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
data_factory_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
folder: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
script: Optional[pulumi.Input[str]] = None,
sinks: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSinkArgs']]]]] = None,
sources: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSourceArgs']]]]] = None,
transformations: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowTransformationArgs']]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = DataFlowArgs.__new__(DataFlowArgs)
__props__.__dict__["annotations"] = annotations
if data_factory_id is None and not opts.urn:
raise TypeError("Missing required property 'data_factory_id'")
__props__.__dict__["data_factory_id"] = data_factory_id
__props__.__dict__["description"] = description
__props__.__dict__["folder"] = folder
__props__.__dict__["name"] = name
if script is None and not opts.urn:
raise TypeError("Missing required property 'script'")
__props__.__dict__["script"] = script
if sinks is None and not opts.urn:
raise TypeError("Missing required property 'sinks'")
__props__.__dict__["sinks"] = sinks
if sources is None and not opts.urn:
raise TypeError("Missing required property 'sources'")
__props__.__dict__["sources"] = sources
__props__.__dict__["transformations"] = transformations
super(DataFlow, __self__).__init__(
'azure:datafactory/dataFlow:DataFlow',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
annotations: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
data_factory_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
folder: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
script: Optional[pulumi.Input[str]] = None,
sinks: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSinkArgs']]]]] = None,
sources: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSourceArgs']]]]] = None,
transformations: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowTransformationArgs']]]]] = None) -> 'DataFlow':
"""
Get an existing DataFlow resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[str]]] annotations: List of tags that can be used for describing the Data Factory Data Flow.
:param pulumi.Input[str] data_factory_id: The ID of the Data Factory with which to associate the Data Flow. Changing this forces a new resource to be created.
:param pulumi.Input[str] description: The description for the Data Factory Data Flow.
:param pulumi.Input[str] folder: The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
:param pulumi.Input[str] name: Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
:param pulumi.Input[str] script: The script for the Data Factory Data Flow.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSinkArgs']]]] sinks: One or more `sink` blocks as defined below.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowSourceArgs']]]] sources: One or more `source` blocks as defined below.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DataFlowTransformationArgs']]]] transformations: One or more `transformation` blocks as defined below.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _DataFlowState.__new__(_DataFlowState)
__props__.__dict__["annotations"] = annotations
__props__.__dict__["data_factory_id"] = data_factory_id
__props__.__dict__["description"] = description
__props__.__dict__["folder"] = folder
__props__.__dict__["name"] = name
__props__.__dict__["script"] = script
__props__.__dict__["sinks"] = sinks
__props__.__dict__["sources"] = sources
__props__.__dict__["transformations"] = transformations
return DataFlow(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def annotations(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
List of tags that can be used for describing the Data Factory Data Flow.
"""
return pulumi.get(self, "annotations")
@property
@pulumi.getter(name="dataFactoryId")
def data_factory_id(self) -> pulumi.Output[str]:
"""
The ID of the Data Factory with which to associate the Data Flow. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "data_factory_id")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
The description for the Data Factory Data Flow.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter
def folder(self) -> pulumi.Output[Optional[str]]:
"""
The folder that this Data Flow is in. If not specified, the Data Flow will appear at the root level.
"""
return pulumi.get(self, "folder")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Specifies the name of the Data Factory Data Flow. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def script(self) -> pulumi.Output[str]:
"""
The script for the Data Factory Data Flow.
"""
return pulumi.get(self, "script")
@property
@pulumi.getter
def sinks(self) -> pulumi.Output[Sequence['outputs.DataFlowSink']]:
"""
One or more `sink` blocks as defined below.
"""
return pulumi.get(self, "sinks")
@property
@pulumi.getter
def sources(self) -> pulumi.Output[Sequence['outputs.DataFlowSource']]:
"""
One or more `source` blocks as defined below.
"""
return pulumi.get(self, "sources")
@property
@pulumi.getter
def transformations(self) -> pulumi.Output[Optional[Sequence['outputs.DataFlowTransformation']]]:
"""
One or more `transformation` blocks as defined below.
"""
return pulumi.get(self, "transformations")
# --- apps/incidents/tests/test_escalation_helper.py (repo seanlefevre/openduty, MIT License) ---
import pytest
from django.utils import timezone
from datetime import timedelta, datetime
from django.core.exceptions import ValidationError
from django_dynamic_fixture import G
from schedule.models import Calendar, Event
from apps.services.models import Service
from apps.policies.models import SchedulePolicy, SchedulePolicyRule, Group
from apps.incidents.escalation_helper import (
get_escalation_for_service, get_current_events_users, get_events_users_inbetween, services_where_user_is_on_call
)
from apps.incidents.models import Incident, IncidentSilenced
from apps.commons.tests.fixtures import authenticated_client, base_user, other_user
@pytest.mark.django_db
def test_get_escalation_works_with_no_recurrence(base_user):
schedule_policy = G(SchedulePolicy)
service = G(Service, policy=schedule_policy)
calendar = G(Calendar)
group = G(Group)
base_user.groups.add(group)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
user_id=base_user.id
)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
group_id=group.id
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=base_user.username,
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=group.name,
)
events = get_escalation_for_service(service)
assert len(events) == 2
@pytest.mark.django_db
def test_get_escalation_for_service_with_notifications_disabled(base_user):
schedule_policy = G(SchedulePolicy)
service = G(Service, policy=schedule_policy, notifications_disabled=True)
calendar = G(Calendar)
group = G(Group)
base_user.groups.add(group)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
user_id=base_user.id
)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
group_id=group.id
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=base_user.username,
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=group.name,
)
events = get_escalation_for_service(service)
assert len(events) == 0
@pytest.mark.django_db
def test_get_escalation_for_service_no_schedule(base_user, other_user):
schedule_policy = G(SchedulePolicy)
service = G(Service, policy=schedule_policy, notifications_disabled=False)
calendar = G(Calendar)
group = G(Group)
other_user.groups.add(group)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
user_id=base_user.id
)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
group_id=group.id
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=base_user.username,
)
events = get_escalation_for_service(service)
assert len(events) == 3
@pytest.mark.django_db
def test_get_escalation_fails_with_no_recurrence_after_event_end(base_user):
schedule_policy = G(SchedulePolicy)
service = G(Service, policy=schedule_policy, )
calendar = G(Calendar)
group = G(Group)
base_user.groups.add(group)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
user_id=base_user.id
)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
group_id=group.id
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=base_user.username,
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=group.name,
)
events = get_escalation_for_service(service)
assert len(events) == 2
@pytest.mark.django_db
def test_get_current_events_users(base_user):
calendar = G(Calendar)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=base_user.username,
)
current_events_users = get_current_events_users(calendar)
assert len(current_events_users)
@pytest.mark.django_db
def test_get_current_events_users_by_group(base_user):
calendar = G(Calendar)
group = G(Group)
base_user.groups.add(group)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=group.name
)
current_events_users = get_current_events_users(calendar)
assert len(current_events_users)
@pytest.mark.django_db
def test_get_events_users_inbetween_with_timezone(base_user):
calendar = G(Calendar)
since = timezone.now() - timedelta(days=2)
until = timezone.now() + timedelta(days=2)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=base_user.username,
)
events_users_in_between = get_events_users_inbetween(calendar, since, until)
assert len(events_users_in_between) == 1
@pytest.mark.django_db
def test_get_events_users_inbetween_no_timezone(base_user):
calendar = G(Calendar)
since = datetime.now() - timedelta(days=2)
until = datetime.now() + timedelta(days=2)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=base_user.username,
)
events_users_in_between = get_events_users_inbetween(calendar, since, until)
assert len(events_users_in_between) == 1
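# The _with_timezone/_no_timezone test pair above exists because naive and
# aware datetimes cannot be compared directly, so get_events_users_inbetween
# has to cope with both kinds of `since`/`until`. A self-contained stdlib
# demonstration of the pitfall (datetime's own timezone is aliased to `tz`
# to avoid clashing with django.utils.timezone imported in this module):

```python
from datetime import datetime, timezone as tz

aware = datetime.now(tz.utc)
naive = datetime.now()

# Comparing aware with naive raises TypeError.
try:
    aware < naive
except TypeError as exc:
    print("cannot compare:", exc)

# Attaching a tzinfo to the naive value makes the comparison legal.
normalized = naive.replace(tzinfo=tz.utc)
print(isinstance(aware < normalized, bool))  # -> True
```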
@pytest.mark.django_db
def test_get_events_users_inbetween_with_users(base_user):
Group.objects.all().delete()
calendar = G(Calendar)
since = datetime.now() - timedelta(days=2)
until = datetime.now() + timedelta(days=2)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=base_user.username,
)
events_users_in_between = get_events_users_inbetween(calendar, since, until)
assert len(events_users_in_between) == 1
@pytest.mark.django_db
def test_get_events_users_inbetween_with_group(base_user):
Group.objects.all().delete()
calendar = G(Calendar)
group = G(Group)
group2 = G(Group)
base_user.groups.add(group, group2)
since = datetime.now() - timedelta(days=2)
until = datetime.now() + timedelta(days=2)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=group.name,
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=base_user,
title=group2.name,
)
events_users_in_between = get_events_users_inbetween(calendar, since, until)
assert len(events_users_in_between) == 1
@pytest.mark.django_db
def test_services_where_user_is_on_call(base_user, other_user):
schedule_policy = G(SchedulePolicy)
calendar = G(Calendar)
calendar2 = G(Calendar)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar2,
escalate_after=1,
user_id=base_user.id,
group_id=None
)
G(
Event,
start=timezone.now() - timedelta(days=1),
end=timezone.now() + timedelta(days=1),
calendar=calendar,
creator=other_user,
title=other_user.username,
)
G(
SchedulePolicyRule,
schedule_policy=schedule_policy,
position=0,
schedule=calendar,
escalate_after=1,
user_id=None,
group_id=None
)
service = G(Service, policy=schedule_policy, notifications_disabled=False)
assert service in services_where_user_is_on_call(other_user)
assert service in services_where_user_is_on_call(base_user)
| 28.896341 | 116 | 0.659105 | 1,131 | 9,478 | 5.279399 | 0.082228 | 0.060291 | 0.101825 | 0.128622 | 0.867526 | 0.855636 | 0.844917 | 0.826495 | 0.798359 | 0.743092 | 0 | 0.00989 | 0.242562 | 9,478 | 327 | 117 | 28.984709 | 0.821841 | 0 | 0 | 0.770492 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039344 | 1 | 0.036066 | false | 0 | 0.039344 | 0 | 0.07541 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b0f2a11607bbbf6354f6ebc0eb99997d0d0edc82 | 159,850 | py | Python | atp-auto-core-open/atp/api/mysql_manager.py | rebecca1202/testAuto | 0c80d34a2f27c593a1ab7869679b75f50bd08d13 | [
"Apache-2.0"
] | 130 | 2020-04-01T02:27:54.000Z | 2022-03-16T10:11:56.000Z | atp-auto-core-open/atp/api/mysql_manager.py | rebecca1202/testAuto | 0c80d34a2f27c593a1ab7869679b75f50bd08d13 | [
"Apache-2.0"
] | 8 | 2020-03-31T09:41:21.000Z | 2021-06-02T01:19:12.000Z | atp-auto-core-open/atp/api/mysql_manager.py | rebecca1202/testAuto | 0c80d34a2f27c593a1ab7869679b75f50bd08d13 | [
"Apache-2.0"
] | 53 | 2020-06-15T03:41:00.000Z | 2022-03-31T10:41:21.000Z | # -*- coding:utf-8 -*-
from atp.extensions import db
from sqlalchemy import func
from sqlalchemy.orm import aliased
from sqlalchemy import or_, distinct, case
from atp.models.atp_auto import (
EnvInfo, User, TestcaseTag, BaseProjectInfo, BaseSystemInfo, BaseModuleInfo, BaseTestcaseInfo,
BaseJobHistory, UiProjectInfo, UiSystemInfo, UiModuleInfo, UICasePageInfo, UICasePageObjectInfo, UiTestcaseInfo,
ApiTestcaseRequestQll, ApiIntfDefaultRequest, BaseModuleInfoBak, BaseTestcaseInfoBak, CeleryTaskRecord,
ApiRunTaskResult, GenerateDataRecord, ApiTestcaseMainTagRelation, ApiTestcaseMainCustomFlow)
from atp.models.atp_auto import (
ApiCompanyInfo, ApiIntfInfo, ApiProjectInfo, ApiProjectIntfRelation, ApiProjectSystemRelation,
ApiPublicVariableInfo, ApiSystemInfo, ApiTestcaseInfo, ApiTestcaseRequest, ApiTestcaseTagRelation, ApiTestReport,
ApiProductLine, ApiTestcaseMain, ApiTestcaseSub, ApiTestcaseReuseRecord, ApiTaskInfo, GitDiffVersion
)
import logging
from sqlalchemy.exc import IntegrityError
logging.basicConfig(level=logging.DEBUG)
class SessionHandler(object):
def __init__(self):
self.db = db
self.session = None
def __enter__(self):
self.session = self.db.session
return self
def __exit__(self, exc_type, exc_val, exc_tb):
if self.session:
self.session.close()
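# SessionHandler above wraps the Flask-SQLAlchemy session in a context
# manager so that __exit__ always closes it, even on error. The same
# pattern with only the stdlib, using sqlite3 in place of db.session
# (SqliteSessionHandler is a hypothetical stand-in, not part of this codebase):

```python
import sqlite3


class SqliteSessionHandler:
    def __init__(self, dsn=":memory:"):
        self.dsn = dsn
        self.conn = None

    def __enter__(self):
        self.conn = sqlite3.connect(self.dsn)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Always release the connection, mirroring SessionHandler.__exit__.
        if self.conn:
            self.conn.close()


with SqliteSessionHandler() as sh:
    row = sh.conn.execute("SELECT 1 + 1").fetchone()
print(row[0])  # -> 2
# After the with-block the connection is closed and further use fails.
```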
class ApiCompanyInfoManager(object):
@staticmethod
def insert_company(**kwargs):
with SessionHandler() as sh:
obj = ApiCompanyInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def batch_insert_company(insert_list):
with SessionHandler() as sh:
objs = [ApiCompanyInfo(**kw) for kw in insert_list]
sh.session.bulk_save_objects(objs)
sh.session.commit()
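# batch_insert_company above uses bulk_save_objects to persist many rows in
# one batched round trip instead of one INSERT per object. An analogous
# stdlib sketch with sqlite3.executemany; the table and column names mimic
# the model but are illustrative only:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE api_company_info (id INTEGER PRIMARY KEY, company_name TEXT)"
)

# One statement, many parameter sets - the batched-insert idea.
insert_list = [{"company_name": "acme"}, {"company_name": "umbrella"}]
conn.executemany(
    "INSERT INTO api_company_info (company_name) VALUES (:company_name)",
    insert_list,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM api_company_info").fetchone()[0]
print(count)  # -> 2
```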
@staticmethod
def update_company(id_, **kwargs):
with SessionHandler() as sh:
obj = ApiCompanyInfo.query.filter_by(id=id_).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_company(id_):
with SessionHandler() as session_handler:
obj = ApiCompanyInfo.query.filter_by(id=id_).first()
if obj:
session_handler.session.delete(obj)
session_handler.session.commit()
@staticmethod
def get_company(**kwargs):
with SessionHandler() as session_handler:
obj = ApiCompanyInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def get_companies(**kwargs):
with SessionHandler() as session_handler:
objs = ApiCompanyInfo.query.filter_by(**kwargs).all()
return objs
@staticmethod
def query_api_subtree(company_id):
with SessionHandler() as sh:
return sh.session.query(
ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiSystemInfo.git_url
).outerjoin(
ApiIntfInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).filter(
ApiSystemInfo.api_company_id == company_id
).order_by(db.asc(ApiSystemInfo.system_name), db.asc(ApiIntfInfo.intf_name)).all()
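# query_api_subtree above compiles to roughly the SQL below: a LEFT OUTER
# JOIN from systems to interfaces with a concatenated display name. MySQL's
# CONCAT(a, '-', b) is written a || '-' || b in sqlite, which lets the shape
# be sketched with the stdlib; the schema is a minimal illustrative subset:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE api_system_info (id INTEGER PRIMARY KEY, api_company_id INTEGER, system_name TEXT);
CREATE TABLE api_intf_info (id INTEGER PRIMARY KEY, api_system_id INTEGER, intf_name TEXT, intf_desc TEXT);
INSERT INTO api_system_info VALUES (1, 10, 'pay'), (2, 10, 'auth');
INSERT INTO api_intf_info VALUES (1, 1, '/pay/create', 'create order');
""")

rows = conn.execute("""
SELECT s.id, s.system_name, i.id, i.intf_desc || '-' || i.intf_name
FROM api_system_info s
LEFT OUTER JOIN api_intf_info i ON s.id = i.api_system_id
WHERE s.api_company_id = 10
ORDER BY s.system_name ASC, i.intf_name ASC
""").fetchall()

# 'auth' has no interfaces, so the outer join keeps it with NULL columns.
print(rows)
```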
@staticmethod
def query_api_project_subtree(company_id, start_day=None):
with SessionHandler() as sh:
if start_day:
return sh.session.query(
ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
ApiTestcaseInfo.testcase_name,
).outerjoin(
ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
).filter(
ApiProjectInfo.api_company_id == company_id,
ApiProjectInfo.create_time > start_day
).order_by(
db.asc(ApiProjectInfo.id),
db.asc(ApiSystemInfo.system_name),
db.asc(ApiIntfInfo.intf_name),
db.asc(ApiTestcaseInfo.index),
).all()
else:
return sh.session.query(
ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
ApiTestcaseInfo.testcase_name,
).outerjoin(
ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
).filter(
ApiProjectInfo.api_company_id == company_id
).order_by(
db.asc(ApiProjectInfo.id),
db.asc(ApiSystemInfo.system_name),
db.asc(ApiIntfInfo.intf_name),
db.asc(ApiTestcaseInfo.index),
).all()
@staticmethod
def query_api_project_subtree_for_main_case(company_id):
with SessionHandler() as sh:
return sh.session.query(
ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseMain.id,
ApiTestcaseMain.testcase_name,
).outerjoin(
ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseMain, ApiTestcaseMain.api_intf_id == ApiIntfInfo.id
).filter(
ApiProjectInfo.api_company_id == company_id
).order_by(
db.asc(ApiProjectInfo.id),
db.asc(ApiSystemInfo.system_name),
db.asc(ApiIntfInfo.intf_name),
db.asc(ApiTestcaseMain.index),
).all()
@staticmethod
def query_api_product_line_subtree(company_id):
with SessionHandler() as sh:
return sh.session.query(
ApiProductLine.id, ApiProductLine.product_line_name,
ApiTestcaseMain.id, func.concat(ApiTestcaseMain.id, '_', ApiTestcaseMain.testcase_name)
).outerjoin(
ApiTestcaseMain, ApiProductLine.id == ApiTestcaseMain.api_product_line_id
).filter(
ApiProductLine.api_company_id == company_id,
# ApiTestcaseMain.case_type == 2
).order_by(
db.asc(ApiProductLine.index),
db.asc(ApiTestcaseMain.index),
).all()
@staticmethod
def query_api_intf_case_subtree(company_id):
with SessionHandler() as sh:
return sh.session.query(
ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name),
ApiTestcaseInfo.id,
func.concat(ApiTestcaseInfo.id, '_', ApiTestcaseInfo.testcase_name, '__', ApiTestcaseInfo.expect_result)
).join(
ApiIntfInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).filter(
ApiSystemInfo.api_company_id == company_id
).order_by(
db.asc(ApiSystemInfo.system_name),
db.asc(ApiIntfInfo.intf_name),
db.asc(ApiTestcaseInfo.index),
).all()
@staticmethod
def query_api_project_subtree_patch(company_id):
with SessionHandler() as sh:
return sh.session.query(
ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
).outerjoin(
ApiProjectSystemRelation, ApiProjectSystemRelation.api_project_id == ApiProjectInfo.id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiProjectSystemRelation.api_system_id
).filter(
ApiProjectInfo.api_company_id == company_id
).order_by(
db.asc(ApiProjectInfo.id), db.asc(ApiSystemInfo.system_name)).all()
@staticmethod
def query_api_subtree_for_xmind(project_id):
with SessionHandler() as sh:
return sh.session.query(
ApiProjectInfo.project_name, ApiSystemInfo.system_name, ApiIntfInfo.intf_type, ApiIntfInfo.intf_name,
ApiTestcaseInfo.id, ApiTestcaseInfo.testcase_name, ApiTestcaseInfo.expect_result
).outerjoin(
ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
).filter(
ApiProjectInfo.id == project_id).all()
@staticmethod
def query_api_subtree_for_xmind_by_system_id(system_id):
with SessionHandler() as sh:
return sh.session.query(
ApiSystemInfo.system_name, ApiIntfInfo.intf_type, ApiIntfInfo.intf_name,
ApiTestcaseInfo.id, ApiTestcaseInfo.testcase_name, ApiTestcaseInfo.expect_result
).outerjoin(
ApiIntfInfo, ApiIntfInfo.api_system_id == ApiSystemInfo.id
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
).filter(
ApiSystemInfo.id == system_id).all()
@staticmethod
def query_api_project_subtree_by_testcase_id(company_id, testcase_id):
with SessionHandler() as sh:
return sh.session.query(
ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
ApiTestcaseInfo.testcase_name,
).outerjoin(
ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
).filter(
ApiProjectInfo.api_company_id == company_id,
ApiTestcaseInfo.id == testcase_id
).order_by(
db.desc(ApiProjectInfo.id)
).all()
@staticmethod
def query_api_subtree_by_testcase_id(company_id, testcase_id):
with SessionHandler() as sh:
return sh.session.query(
ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
ApiTestcaseInfo.testcase_name,
).outerjoin(
ApiIntfInfo, ApiIntfInfo.api_system_id == ApiSystemInfo.id
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
).filter(
ApiSystemInfo.api_company_id == company_id,
ApiTestcaseInfo.id == testcase_id
).order_by(
db.desc(ApiSystemInfo.id)
).all()
@staticmethod
def query_api_project_subtree_like_intf_url(company_id, intf_url):
with SessionHandler() as sh:
return sh.session.query(
ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
ApiTestcaseInfo.testcase_name,
).outerjoin(
ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
).filter(
ApiProjectInfo.api_company_id == company_id,
ApiIntfInfo.intf_name.ilike('%{}%'.format(intf_url))
).order_by(
db.desc(ApiProjectInfo.id),
db.asc(ApiSystemInfo.system_name),
db.asc(ApiIntfInfo.intf_name),
).all()
@staticmethod
def query_api_subtree_like_intf_url(company_id, intf_url):
with SessionHandler() as sh:
return sh.session.query(
ApiSystemInfo.id, ApiSystemInfo.system_name,
ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
ApiTestcaseInfo.testcase_name,
).outerjoin(
ApiIntfInfo, ApiIntfInfo.api_system_id == ApiSystemInfo.id
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
).filter(
ApiSystemInfo.api_company_id == company_id,
ApiIntfInfo.intf_name.ilike('%{}%'.format(intf_url))
).order_by(
db.asc(ApiSystemInfo.system_name),
                db.asc(ApiIntfInfo.intf_name),
                db.asc(ApiTestcaseInfo.index),
            ).all()

    @staticmethod
    def query_api_project_subtree_like_intf_desc(company_id, intf_desc):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
                ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
                ApiTestcaseInfo.testcase_name,
            ).outerjoin(
                ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
            ).outerjoin(
                ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
            ).outerjoin(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
            ).outerjoin(
                ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
            ).filter(
                ApiProjectInfo.api_company_id == company_id,
                ApiIntfInfo.intf_desc.ilike('%{}%'.format(intf_desc))
            ).order_by(
                db.desc(ApiProjectInfo.id),
                db.asc(ApiSystemInfo.system_name),
                db.asc(ApiIntfInfo.intf_name),
            ).all()

    @staticmethod
    def query_api_subtree_like_intf_desc(company_id, intf_desc):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiSystemInfo.id, ApiSystemInfo.system_name,
                ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
                ApiTestcaseInfo.testcase_name,
            ).outerjoin(
                ApiIntfInfo, ApiIntfInfo.api_system_id == ApiSystemInfo.id
            ).outerjoin(
                ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
            ).filter(
                ApiSystemInfo.api_company_id == company_id,
                ApiIntfInfo.intf_desc.ilike('%{}%'.format(intf_desc))
            ).order_by(
                db.asc(ApiSystemInfo.system_name),
                db.asc(ApiIntfInfo.intf_name),
                db.asc(ApiTestcaseInfo.index),
            ).all()

    @staticmethod
    def query_api_project_subtree_like_testcase_name(company_id, testcase_name):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
                ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
                ApiTestcaseInfo.testcase_name,
            ).outerjoin(
                ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
            ).outerjoin(
                ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
            ).outerjoin(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
            ).outerjoin(
                ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
            ).filter(
                ApiProjectInfo.api_company_id == company_id,
                ApiTestcaseInfo.testcase_name.ilike('%{}%'.format(testcase_name))
            ).order_by(
                db.desc(ApiProjectInfo.id),
                db.asc(ApiSystemInfo.system_name),
                db.asc(ApiIntfInfo.intf_name),
            ).all()

    @staticmethod
    def query_api_project_subtree_like_testcase_creator(company_id, testcase_creator):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
                ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name), ApiTestcaseInfo.id,
                ApiTestcaseInfo.testcase_name,
            ).outerjoin(
                ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
            ).outerjoin(
                ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
            ).outerjoin(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
            ).outerjoin(
                ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
            ).filter(
                ApiProjectInfo.api_company_id == company_id,
                ApiTestcaseInfo.creator.ilike('%{}%'.format(testcase_creator))
            ).order_by(
                db.desc(ApiProjectInfo.id),
                db.asc(ApiSystemInfo.system_name),
                db.asc(ApiIntfInfo.intf_name),
            ).all()


class ApiIntfInfoManager(object):

    @staticmethod
    def insert_intf(**kwargs):
        with SessionHandler() as sh:
            obj = ApiIntfInfo(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_intf(insert_list):
        with SessionHandler() as sh:
            objs = [ApiIntfInfo(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_intf(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiIntfInfo.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_intf(id_):
        with SessionHandler() as sh:
            obj = ApiIntfInfo.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_intf(**kwargs):
        with SessionHandler() as sh:
            obj = ApiIntfInfo.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_intfs(**kwargs):
        with SessionHandler() as sh:
            objs = ApiIntfInfo.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_intfs_in_id_list(id_list):
        with SessionHandler() as sh:
            objs = ApiIntfInfo.query.filter(ApiIntfInfo.id.in_(id_list)).all()
            return objs

    @staticmethod
    def get_intfs_in_system_id_list(system_id_list):
        with SessionHandler() as sh:
            objs = ApiIntfInfo.query.filter(ApiIntfInfo.api_system_id.in_(system_id_list)).all()
            return objs


class ApiProjectInfoManager(object):

    @staticmethod
    def insert_project(**kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectInfo(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_project(insert_list):
        with SessionHandler() as sh:
            objs = [ApiProjectInfo(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_project(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectInfo.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_project(id_):
        with SessionHandler() as sh:
            obj = ApiProjectInfo.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_project(**kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectInfo.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_projects(**kwargs):
        with SessionHandler() as sh:
            objs = ApiProjectInfo.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_projects_reverse(**kwargs):
        with SessionHandler() as sh:
            objs = ApiProjectInfo.query.filter_by(**kwargs).order_by(db.desc(ApiProjectInfo.id)).all()
            return objs

    @staticmethod
    def query_api_project_subtree(project_id):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
                ApiIntfInfo.id, func.concat(ApiIntfInfo.intf_desc, '-', ApiIntfInfo.intf_name),
                ApiTestcaseInfo.id, func.concat(ApiTestcaseInfo.testcase_name, '_', ApiTestcaseInfo.expect_result),
            ).outerjoin(
                ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
            ).outerjoin(
                ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
            ).outerjoin(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
            ).outerjoin(
                ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
            ).filter(
                ApiProjectInfo.id == project_id
            ).order_by(
                db.asc(ApiSystemInfo.system_name),
                db.asc(ApiIntfInfo.intf_name),
                db.asc(ApiTestcaseInfo.index),
            ).all()

    @staticmethod
    def count_api_project_subtree(project_id):
        with SessionHandler() as sh:
            return sh.session.query(
                func.count(ApiTestcaseInfo.id)
            ).outerjoin(
                ApiIntfInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
            ).outerjoin(
                ApiProjectIntfRelation, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
            ).outerjoin(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
                # ).outerjoin(
                #     ApiProjectInfo, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
                # ).
            ).filter(
                ApiProjectIntfRelation.api_project_id == project_id
            ).first()

    @staticmethod
    def count_api_project_subtree_group_by_project_id(company_id):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiProjectInfo.id, func.count(ApiTestcaseInfo.id)
            ).outerjoin(
                ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id
            ).outerjoin(
                ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id
            ).outerjoin(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
            ).outerjoin(
                ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id
            ).filter(
                ApiProjectInfo.api_company_id == company_id
            ).group_by(
                ApiProjectInfo.id
            ).all()

    @staticmethod
    def query_api_project_subtree_patch(project_id):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiProjectInfo.id, ApiProjectInfo.project_name, ApiSystemInfo.id, ApiSystemInfo.system_name,
            ).outerjoin(
                ApiProjectSystemRelation, ApiProjectSystemRelation.api_project_id == ApiProjectInfo.id
            ).outerjoin(
                ApiSystemInfo, ApiSystemInfo.id == ApiProjectSystemRelation.api_system_id
            ).filter(
                ApiProjectInfo.id == project_id
            ).order_by(
                db.asc(ApiProjectInfo.id), db.asc(ApiSystemInfo.system_name)).all()


class ApiProjectIntfRelationManager(object):

    @staticmethod
    def insert_relation(**kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectIntfRelation(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_relation(insert_list):
        with SessionHandler() as sh:
            objs = [ApiProjectIntfRelation(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_relation(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectIntfRelation.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_relation(id_):
        with SessionHandler() as sh:
            obj = ApiProjectIntfRelation.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_relation(**kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectIntfRelation.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_relations(**kwargs):
        with SessionHandler() as sh:
            objs = ApiProjectIntfRelation.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_distinct_intf_ids():
        with SessionHandler() as sh:
            return sh.session.query(
                ApiProjectIntfRelation.api_intf_id
            ).filter().distinct().all()


class ApiProjectSystemRelationManager(object):

    @staticmethod
    def insert_relation(**kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectSystemRelation(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_relation(insert_list):
        with SessionHandler() as sh:
            objs = [ApiProjectSystemRelation(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_relation(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectSystemRelation.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_relation(id_):
        with SessionHandler() as sh:
            obj = ApiProjectSystemRelation.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_relation(**kwargs):
        with SessionHandler() as sh:
            obj = ApiProjectSystemRelation.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_relations(**kwargs):
        with SessionHandler() as sh:
            objs = ApiProjectSystemRelation.query.filter_by(**kwargs).all()
            return objs


class ApiPublicVariableInfoManager(object):

    @staticmethod
    def insert_variable(**kwargs):
        with SessionHandler() as sh:
            obj = ApiPublicVariableInfo(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_variable(insert_list):
        with SessionHandler() as sh:
            objs = [ApiPublicVariableInfo(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_variable(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiPublicVariableInfo.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_variable(id_):
        with SessionHandler() as sh:
            obj = ApiPublicVariableInfo.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_variable(**kwargs):
        with SessionHandler() as sh:
            obj = ApiPublicVariableInfo.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_variables(**kwargs):
        with SessionHandler() as sh:
            objs = ApiPublicVariableInfo.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_variables_in_id_list(id_list):
        with SessionHandler() as sh:
            objs = ApiPublicVariableInfo.query.filter(ApiPublicVariableInfo.id.in_(id_list)).all()
            return objs

    @staticmethod
    def public_variable_paginate(page, num, keywords=None):
        with SessionHandler() as sh:
            if keywords:
                value = '%{0}%'.format(keywords)
                data = ApiPublicVariableInfo.query.order_by(db.desc(ApiPublicVariableInfo.create_time)).filter(
                    or_(ApiPublicVariableInfo.variable_name.ilike(value),
                        ApiPublicVariableInfo.value.ilike(value),
                        ApiPublicVariableInfo.type.ilike(value),
                        ApiPublicVariableInfo.api_system_id.ilike(value))
                ).paginate(page=page, per_page=num, error_out=False)
            else:
                data = ApiPublicVariableInfo.query.order_by(
                    db.desc(ApiPublicVariableInfo.create_time)).filter().paginate(
                    page=page, per_page=num, error_out=False)
            return data

    @staticmethod
    def whether_variable_name_canbeupdated(variable_name, id_, api_system_id):
        """When renaming a variable, check whether the new name is already used by another variable."""
        with SessionHandler() as sh:
            obj = ApiPublicVariableInfo.query.filter(
                ApiPublicVariableInfo.id != id_,
                ApiPublicVariableInfo.variable_name == variable_name,
                ApiPublicVariableInfo.api_system_id == api_system_id).first()
            if obj:
                return True
            else:
                return False

    @staticmethod
    def whether_variable_name_canbeupdated_in_company_id(variable_name, id_, api_company_id):
        """When renaming a variable, check whether the new name is already used by another variable."""
        with SessionHandler() as sh:
            obj = ApiPublicVariableInfo.query.filter(
                ApiPublicVariableInfo.id != id_,
                ApiPublicVariableInfo.variable_name == variable_name,
                ApiPublicVariableInfo.api_company_id == api_company_id).first()
            if obj:
                return True
            else:
                return False


class ApiSystemInfoManager(object):

    @staticmethod
    def insert_system(**kwargs):
        with SessionHandler() as sh:
            obj = ApiSystemInfo(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_system(insert_list):
        with SessionHandler() as sh:
            objs = [ApiSystemInfo(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_system(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiSystemInfo.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_system(id_):
        with SessionHandler() as sh:
            obj = ApiSystemInfo.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_system(**kwargs):
        with SessionHandler() as sh:
            obj = ApiSystemInfo.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_systems(**kwargs):
        with SessionHandler() as sh:
            objs = ApiSystemInfo.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def count_system_with_intf_in_company_id(company_id):
        with SessionHandler() as sh:
            return sh.session.query(
                func.count(distinct(ApiSystemInfo.id))
            ).join(
                ApiIntfInfo, ApiIntfInfo.api_system_id == ApiSystemInfo.id
            ).filter(ApiSystemInfo.api_company_id == company_id).first()


class ApiTestcaseInfoManager(object):

    @staticmethod
    def insert_testcase(**kwargs):
        with SessionHandler() as sh:
            api_intf_id = kwargs.get('api_intf_id')
            pre_obj = ApiTestcaseInfo.query.filter_by(api_intf_id=api_intf_id).order_by(
                db.desc(ApiTestcaseInfo.index)).first()
            index = pre_obj.index + 1 if pre_obj else 0
            kwargs.update({'index': index})
            obj = ApiTestcaseInfo(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_testcase(insert_list):
        with SessionHandler() as sh:
            objs = [ApiTestcaseInfo(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_testcase(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseInfo.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_testcase(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseInfo.query.filter_by(id=id_).first()
            if obj:
                to_update_objs = ApiTestcaseInfo.query.filter(
                    ApiTestcaseInfo.api_intf_id == obj.api_intf_id, ApiTestcaseInfo.index > obj.index
                ).all()
                for to_update_obj in to_update_objs:
                    to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
                    sh.session.add(to_update_obj)
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_testcase(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseInfo.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_testcases(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestcaseInfo.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def count_testcases_in_intf_id_list(intf_id_list):
        with SessionHandler() as sh:
            # return sh.session.query(func.count(ApiTestcaseInfo.id)).filter_by(**kwargs).first()
            return sh.session.query(
                func.count(ApiTestcaseInfo.id)
            ).filter(
                ApiTestcaseInfo.api_intf_id.in_(intf_id_list), ApiTestcaseInfo.case_status == 0
            ).first()

    @staticmethod
    def get_testcases_order_by_create_time_desc(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestcaseInfo.query.filter_by(**kwargs).order_by(db.desc(ApiTestcaseInfo.create_time)).all()
            return objs

    @staticmethod
    def query_all_testcases_include():
        with SessionHandler() as sh:
            return sh.session.query(ApiTestcaseInfo.include).filter().all()

    @staticmethod
    def paging_query_testcase_by_intf_id(intf_id, page_no, page_size, testcase_name=None):
        """Paginated query of the testcases under a given interface."""
        with SessionHandler() as sh:
            if testcase_name:
                value = '%{0}%'.format(testcase_name)
                pagination_obj = ApiTestcaseInfo.query.order_by(db.asc(ApiTestcaseInfo.index)).filter(
                    ApiTestcaseInfo.api_intf_id == intf_id, ApiTestcaseInfo.testcase_name.ilike(value)).paginate(
                    page=page_no, per_page=page_size, error_out=False)
            else:
                pagination_obj = ApiTestcaseInfo.query.order_by(db.asc(ApiTestcaseInfo.index)).filter(
                    ApiTestcaseInfo.api_intf_id == intf_id).paginate(page=page_no, per_page=page_size, error_out=False)
            return pagination_obj

    @staticmethod
    def query_testcase_belong(testcase_id):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiTestcaseInfo.include, ApiTestcaseInfo.testcase_name, ApiSystemInfo.system_name,
                ApiIntfInfo.intf_name, ApiTestcaseRequest.request
            ).outerjoin(
                ApiIntfInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
            ).outerjoin(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
            ).outerjoin(
                ApiTestcaseRequest, ApiTestcaseRequest.api_testcase_id == ApiTestcaseInfo.id
            ).filter(ApiTestcaseInfo.id == testcase_id).first()

    @staticmethod
    def get_testcases_in_id_list(id_list):
        with SessionHandler() as sh:
            objs = ApiTestcaseInfo.query.filter(ApiTestcaseInfo.id.in_(id_list)).all()
            return objs

    @staticmethod
    def get_testcases_in_intf_id_list(intf_id_list):
        with SessionHandler() as sh:
            objs = ApiTestcaseInfo.query.filter(ApiTestcaseInfo.api_intf_id.in_(intf_id_list)).all()
            return objs

    @staticmethod
    def get_last_obj_by_intf(intf_id):
        with SessionHandler() as sh:
            return ApiTestcaseInfo.query.filter_by(api_intf_id=intf_id).order_by(db.desc(ApiTestcaseInfo.index)).first()

    @staticmethod
    def get_last_obj():
        with SessionHandler() as sh:
            return ApiTestcaseInfo.query.order_by(db.desc(ApiTestcaseInfo.id)).first()

    @staticmethod
    def index_update_while_remove_testcase(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseInfo.query.filter_by(id=id_).first()
            if obj:
                to_update_objs = ApiTestcaseInfo.query.filter(
                    ApiTestcaseInfo.api_intf_id == obj.api_intf_id, ApiTestcaseInfo.index > obj.index
                ).all()
                for to_update_obj in to_update_objs:
                    to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
                    sh.session.add(to_update_obj)
                sh.session.commit()

    @staticmethod
    def filter_task_testcase_ids(intf_id, tag_id_list):
        with SessionHandler() as sh:
            # # The "automatable" tag is added by default
            # if 12 not in tag_id_list:
            #     tag_id_list.append(12)
            return sh.session.query(
                ApiTestcaseInfo.id
            ).join(
                ApiTestcaseTagRelation, ApiTestcaseInfo.id == ApiTestcaseTagRelation.api_testcase_id
            ).filter(
                ApiTestcaseInfo.api_intf_id == intf_id,
                ApiTestcaseTagRelation.tag_id.in_(tag_id_list),
                ApiTestcaseInfo.case_status == 0
            ).distinct().all()

    @staticmethod
    def filter_task_testcase_ids_(intf_ids, tag_id_list):
        with SessionHandler() as sh:
            # # The "automatable" tag is added by default
            # if 12 not in tag_id_list:
            #     tag_id_list.append(12)
            return sh.session.query(
                ApiTestcaseInfo.id,
                ApiTestcaseInfo.api_intf_id
            ).join(
                ApiTestcaseTagRelation, ApiTestcaseInfo.id == ApiTestcaseTagRelation.api_testcase_id
            ).filter(
                ApiTestcaseInfo.api_intf_id.in_(intf_ids),
                ApiTestcaseTagRelation.tag_id.in_(tag_id_list),
                ApiTestcaseInfo.case_status == 0
            ).distinct().all()

    @staticmethod
    def get_id_and_setup_case_list():
        with SessionHandler() as sh:
            return sh.session.query(
                ApiTestcaseInfo.id, ApiTestcaseInfo.setup_case_list
            ).filter().all()

    @staticmethod
    def get_intf_and_case_info_in_case_ids(case_ids):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiIntfInfo.id, ApiIntfInfo.intf_name, ApiIntfInfo.intf_desc, ApiIntfInfo.intf_type,
                ApiTestcaseInfo.id, ApiTestcaseInfo.testcase_name, ApiTestcaseInfo.creator,
                ApiTestcaseInfo.last_modifier, ApiTestcaseInfo.case_status, ApiTestcaseInfo.last_run,
                ApiTestcaseInfo.create_time, ApiTestcaseInfo.update_time, ApiTestcaseInfo.last_run_time
            ).join(
                ApiTestcaseInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id,
            ).filter(
                ApiTestcaseInfo.id.in_(case_ids),
            ).order_by(
                ApiIntfInfo.api_system_id, ApiIntfInfo.intf_name, ApiTestcaseInfo.id
            ).all()

    @staticmethod
    def get_recent_testcases_by_time(time_):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiProjectInfo.project_name, ApiSystemInfo.system_name, ApiTestcaseInfo.id,
                ApiTestcaseInfo.create_time, ApiTestcaseInfo.last_modify_time, ApiTestcaseInfo.creator,
                ApiTestcaseInfo.last_modifier, ApiProjectInfo.id,
            ).join(
                ApiProjectIntfRelation, ApiProjectInfo.id == ApiProjectIntfRelation.api_project_id,
            ).join(
                ApiIntfInfo, ApiIntfInfo.id == ApiProjectIntfRelation.api_intf_id,
            ).join(
                ApiTestcaseInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id,
            ).join(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id,
            ).filter(
                or_(ApiTestcaseInfo.create_time > time_, ApiTestcaseInfo.last_modify_time > time_),
                ApiSystemInfo.id.notin_([448, 450, 451]), ApiTestcaseInfo.case_status == 0
            ).order_by(
                ApiProjectInfo.project_name, ApiSystemInfo.id, ApiTestcaseInfo.id
            ).all()

    @staticmethod
    def get_recent_testcases_by_time_not_belong_project(time_, intf_id_list):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiSystemInfo.system_name, ApiTestcaseInfo.id,
                ApiTestcaseInfo.create_time, ApiTestcaseInfo.last_modify_time, ApiTestcaseInfo.creator,
                ApiTestcaseInfo.last_modifier
            ).join(
                ApiIntfInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id,
            ).join(
                ApiTestcaseInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id,
            ).filter(
                or_(ApiTestcaseInfo.create_time > time_, ApiTestcaseInfo.last_modify_time > time_),
                ApiIntfInfo.id.notin_(intf_id_list), ApiSystemInfo.id.notin_([448, 450, 451]),
                ApiTestcaseInfo.case_status == 0
            ).order_by(
                ApiSystemInfo.id, ApiTestcaseInfo.id
            ).all()

    @staticmethod
    def get_testcase_id_in_company_id(company_id):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiTestcaseInfo.id
            ).join(
                ApiIntfInfo, ApiTestcaseInfo.api_intf_id == ApiIntfInfo.id,
            ).join(
                ApiSystemInfo, ApiIntfInfo.api_system_id == ApiSystemInfo.id,
            ).filter(
                ApiSystemInfo.api_company_id == company_id,
                ApiTestcaseInfo.case_status == 0
            ).all()

    @staticmethod
    def count_testcase_in_tag_id(testcase_id_list, tag_id):
        with SessionHandler() as sh:
            return sh.session.query(
                func.count(ApiTestcaseTagRelation.id)
            ).join(
                ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseTagRelation.api_testcase_id,
            ).filter(
                ApiTestcaseTagRelation.tag_id == tag_id,
                ApiTestcaseInfo.id.in_(testcase_id_list)
            ).first()

    @staticmethod
    def count_intf_in_company_id(company_id):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiIntfInfo.intf_type, func.count(distinct(ApiIntfInfo.id)),
            ).join(
                ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id,
            ).filter(
                ApiSystemInfo.api_company_id == company_id,
            ).group_by(
                ApiIntfInfo.intf_type
            ).all()


class ApiTestcaseRequestManager(object):

    @staticmethod
    def insert_request(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequest(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_request(insert_list):
        with SessionHandler() as sh:
            objs = [ApiTestcaseRequest(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def delete_request(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequest.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def update_request(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequest.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def get_request(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequest.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def update_request_by_testcase_id(testcase_id, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequest.query.filter_by(api_testcase_id=testcase_id).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()


class ApiTestcaseTagRelationManager(object):

    @staticmethod
    def insert_relation(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseTagRelation(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_relation(insert_list):
        with SessionHandler() as sh:
            objs = [ApiTestcaseTagRelation(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def delete_relation(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseTagRelation.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_relation(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseTagRelation.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_relations(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestcaseTagRelation.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def query_tag_info_by_testcase(testcase_id):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiTestcaseTagRelation.tag_id, TestcaseTag.tag_name,
            ).outerjoin(
                TestcaseTag, ApiTestcaseTagRelation.tag_id == TestcaseTag.id
            ).filter(
                ApiTestcaseTagRelation.api_testcase_id == testcase_id).all()

    @staticmethod
    def get_relations_in_case_ids(case_ids):
        with SessionHandler() as sh:
            objs = ApiTestcaseTagRelation.query.filter(ApiTestcaseTagRelation.api_testcase_id.in_(case_ids)).all()
            return objs


class ApiTestcaseMainTagRelationManager(object):

    @staticmethod
    def insert_relation(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseMainTagRelation(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_relation(insert_list):
        with SessionHandler() as sh:
            objs = [ApiTestcaseMainTagRelation(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def delete_relation(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseMainTagRelation.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_relation(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseMainTagRelation.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_relations(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestcaseMainTagRelation.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def query_tag_info_by_testcase(testcase_id):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiTestcaseMainTagRelation.tag_id, TestcaseTag.tag_name,
            ).outerjoin(
                TestcaseTag, ApiTestcaseMainTagRelation.tag_id == TestcaseTag.id
            ).filter(
                ApiTestcaseMainTagRelation.api_testcase_id == testcase_id).all()


class ApiTestReportManager(object):

    @staticmethod
    def insert_report(**kwargs):
        with SessionHandler() as sh:
            kwargs.pop('report', None)
            obj = ApiTestReport(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_report(insert_list):
        with SessionHandler() as sh:
            objs = [ApiTestReport(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_report(id_, **kwargs):
        with SessionHandler() as sh:
            kwargs.pop('report', None)
            obj = ApiTestReport.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_report(id_):
        with SessionHandler() as sh:
            obj = ApiTestReport.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_report(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestReport.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_reports(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestReport.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_next_report_id():
        """Get the next id of the api_test_report table."""
        with SessionHandler() as sh:
            obj = sh.session.query(func.max(ApiTestReport.id)).first()
            if not obj[0]:
                return 1
            return obj[0] + 1

    @staticmethod
    def paging_query_reports(page_no, page_size, project_id=None, start_time=None, end_time=None, executor=None):
        with SessionHandler() as sh:
            if project_id and start_time and end_time and executor:
                pagination_obj = ApiTestReport.query.filter(
                    ApiTestReport.create_time.between(start_time, end_time),
                    ApiTestReport.api_project_id == project_id,
                    ApiTestReport.executor == executor,
                    ApiTestReport.status.in_(('success', 'fail'))
                ).order_by(db.desc(ApiTestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
            elif project_id and start_time and end_time:
                pagination_obj = ApiTestReport.query.filter(
                    ApiTestReport.create_time.between(start_time, end_time),
                    ApiTestReport.api_project_id == project_id,
                    ApiTestReport.status.in_(('success', 'fail'))
                ).order_by(db.desc(ApiTestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
            elif project_id and executor:
                pagination_obj = ApiTestReport.query.filter(
                    ApiTestReport.api_project_id == project_id,
                    ApiTestReport.executor == executor,
                    ApiTestReport.status.in_(('success', 'fail'))
                ).order_by(db.desc(ApiTestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
            elif start_time and end_time and executor:
                pagination_obj = ApiTestReport.query.filter(
                    ApiTestReport.create_time.between(start_time, end_time),
                    ApiTestReport.executor == executor,
                    ApiTestReport.status.in_(('success', 'fail'))
                ).order_by(db.desc(ApiTestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
            elif project_id:
                pagination_obj = ApiTestReport.query.filter(
                    ApiTestReport.api_project_id == project_id,
                    ApiTestReport.status.in_(('success', 'fail'))
                ).order_by(db.desc(ApiTestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
            elif start_time and end_time:
                pagination_obj = ApiTestReport.query.filter(
                    ApiTestReport.create_time.between(start_time, end_time),
                    ApiTestReport.status.in_(('success', 'fail'))
                ).order_by(db.desc(ApiTestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
            elif executor:
                pagination_obj = ApiTestReport.query.filter(
                    ApiTestReport.executor == executor,
                    ApiTestReport.status.in_(('success', 'fail'))
                ).order_by(db.desc(ApiTestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
            else:
                pagination_obj = ApiTestReport.query.filter(
                    ApiTestReport.status.in_(('success', 'fail'))
                ).order_by(db.desc(ApiTestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
            return pagination_obj


class ApiProductLineManager(object):

    @staticmethod
    def insert_product_line(**kwargs):
        with SessionHandler() as sh:
            api_company_id = kwargs.get('api_company_id')
            pre_obj = ApiProductLine.query.filter_by(api_company_id=api_company_id).order_by(
                db.desc(ApiProductLine.index)).first()
            index = pre_obj.index + 1 if pre_obj else 0
            kwargs.update({'index': index})
            obj = ApiProductLine(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def insert_folder(**kwargs):
        with SessionHandler() as sh:
            parent_id = kwargs.get('parent_id')
            pre_obj = ApiProductLine.query.filter_by(parent_id=parent_id).order_by(
                db.desc(ApiProductLine.index)).first()
            index = pre_obj.index + 1 if pre_obj else 0
            kwargs.update({'index': index})
            obj = ApiProductLine(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def update_product_line(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiProductLine.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_product_line(id_):
        with SessionHandler() as sh:
            obj = ApiProductLine.query.filter_by(id=id_).first()
            if obj:
                to_update_objs = ApiProductLine.query.filter(
                    ApiProductLine.api_company_id == obj.api_company_id, ApiProductLine.index > obj.index
                ).all()
                for to_update_obj in to_update_objs:
                    to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
                    sh.session.add(to_update_obj)
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def delete_folder(id_):
        with SessionHandler() as sh:
            obj = ApiProductLine.query.filter_by(id=id_).first()
            if obj:
                to_update_objs = ApiProductLine.query.filter(
                    ApiProductLine.parent_id == obj.parent_id, ApiProductLine.index > obj.index
                ).all()
                for to_update_obj in to_update_objs:
                    to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
                    sh.session.add(to_update_obj)
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_product_line(**kwargs):
        with SessionHandler() as sh:
            obj = ApiProductLine.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    # @custom_func_wrapper
    def get_product_lines(**kwargs):
        with SessionHandler() as sh:
            objs = ApiProductLine.query.filter_by(**kwargs).all()
            return objs

class ApiTestcaseMainManager(object):
    @staticmethod
    def insert_testcase_main(**kwargs):
        with SessionHandler() as sh:
            api_intf_id = kwargs.get('api_intf_id', None)
            api_product_line_id = kwargs.get('api_product_line_id', None)
            if api_product_line_id:
                pre_obj = ApiTestcaseMain.query.filter_by(api_product_line_id=api_product_line_id).order_by(
                    db.desc(ApiTestcaseMain.index)).first()
            else:
                pre_obj = ApiTestcaseMain.query.filter_by(api_intf_id=api_intf_id).order_by(
                    db.desc(ApiTestcaseMain.index)).first()
            index = pre_obj.index + 1 if pre_obj else 0
            kwargs.update({'index': index})
            obj = ApiTestcaseMain(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_testcase_main(insert_list):
        with SessionHandler() as sh:
            objs = [ApiTestcaseMain(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_testcase_main(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseMain.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_testcase_main(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseMain.query.filter_by(id=id_).first()
            if obj:
                if obj.case_type == 2:
                    to_update_objs = ApiTestcaseMain.query.filter(
                        ApiTestcaseMain.api_product_line_id == obj.api_product_line_id,
                        ApiTestcaseMain.index > obj.index
                    ).all()
                else:
                    to_update_objs = ApiTestcaseMain.query.filter(
                        ApiTestcaseMain.api_intf_id == obj.api_intf_id, ApiTestcaseMain.index > obj.index
                    ).all()
                for to_update_obj in to_update_objs:
                    to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
                    sh.session.add(to_update_obj)
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_testcase_main(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseMain.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_testcase_mains(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestcaseMain.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def count_testcases_in_id_list(id_list):
        with SessionHandler() as sh:
            return sh.session.query(
                func.count(ApiTestcaseMain.id)
            ).filter(
                ApiTestcaseMain.id.in_(id_list), ApiTestcaseMain.case_status == 0
            ).first()

    @staticmethod
    def paging_query_testcase_by_intf_id(intf_id, page_no, page_size, testcase_name=None):
        """Paginated query of the testcases under a given interface."""
        with SessionHandler() as sh:
            if testcase_name:
                value = '%{0}%'.format(testcase_name)
                pagination_obj = ApiTestcaseMain.query.order_by(db.asc(ApiTestcaseMain.index)).filter(
                    ApiTestcaseMain.api_intf_id == intf_id, ApiTestcaseMain.testcase_name.ilike(value)).paginate(
                    page=page_no, per_page=page_size, error_out=False)
            else:
                pagination_obj = ApiTestcaseMain.query.order_by(db.asc(ApiTestcaseMain.index)).filter(
                    ApiTestcaseMain.api_intf_id == intf_id).paginate(page=page_no, per_page=page_size,
                                                                     error_out=False)
            return pagination_obj

    @staticmethod
    def paging_query_testcase_by_product_line_id(product_line_id, page_no, page_size, testcase_name=None):
        """Paginated query of the testcases under a given product line."""
        with SessionHandler() as sh:
            if testcase_name:
                value = '%{0}%'.format(testcase_name)
                pagination_obj = ApiTestcaseMain.query.order_by(db.asc(ApiTestcaseMain.index)).filter(
                    ApiTestcaseMain.api_product_line_id == product_line_id,
                    ApiTestcaseMain.testcase_name.ilike(value)).paginate(
                    page=page_no, per_page=page_size, error_out=False)
            else:
                pagination_obj = ApiTestcaseMain.query.order_by(db.asc(ApiTestcaseMain.index)).filter(
                    ApiTestcaseMain.api_product_line_id == product_line_id).paginate(page=page_no,
                                                                                     per_page=page_size,
                                                                                     error_out=False)
            return pagination_obj

    @staticmethod
    def index_update_while_remove_testcase(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseMain.query.filter_by(id=id_).first()
            if obj:
                if obj.case_type == 2:
                    to_update_objs = ApiTestcaseMain.query.filter(
                        ApiTestcaseMain.api_product_line_id == obj.api_product_line_id,
                        ApiTestcaseMain.index > obj.index
                    ).all()
                else:
                    to_update_objs = ApiTestcaseMain.query.filter(
                        ApiTestcaseMain.api_intf_id == obj.api_intf_id, ApiTestcaseMain.index > obj.index
                    ).all()
                for to_update_obj in to_update_objs:
                    to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
                    sh.session.add(to_update_obj)
                sh.session.commit()

    @staticmethod
    def get_last_obj_by_intf(intf_id):
        with SessionHandler() as sh:
            return ApiTestcaseMain.query.filter_by(api_intf_id=intf_id).order_by(
                db.desc(ApiTestcaseMain.index)).first()

    @staticmethod
    def get_last_obj_by_product_line(product_line_id):
        with SessionHandler() as sh:
            return ApiTestcaseMain.query.filter_by(api_product_line_id=product_line_id).order_by(
                db.desc(ApiTestcaseMain.index)).first()

    @staticmethod
    def get_last_obj():
        with SessionHandler() as sh:
            return ApiTestcaseMain.query.order_by(db.desc(ApiTestcaseMain.id)).first()

    @staticmethod
    def get_testcases_in_id_list(id_list):
        with SessionHandler() as sh:
            objs = ApiTestcaseMain.query.filter(ApiTestcaseMain.id.in_(id_list)).all()
            return objs

    @staticmethod
    def get_valid_testcases_in_id_list(id_list):
        with SessionHandler() as sh:
            objs = ApiTestcaseMain.query.filter(
                ApiTestcaseMain.id.in_(id_list), ApiTestcaseMain.case_status == 0).all()
            return objs

    @staticmethod
    def get_testcase_mains_in_tag(api_product_line_id, tag_id_list=None):
        with SessionHandler() as sh:
            if not tag_id_list:
                return sh.session.query(
                    ApiTestcaseMain.id,
                    func.concat(ApiTestcaseMain.testcase_name, '__', ApiTestcaseMain.expect_result),
                    ApiTestcaseMain.sub_list
                ).filter(
                    ApiTestcaseMain.api_product_line_id == api_product_line_id
                ).order_by(db.asc(ApiTestcaseMain.id)).all()
            elif 1 == len(tag_id_list):
                return sh.session.query(
                    ApiTestcaseMain.id,
                    func.concat(ApiTestcaseMain.testcase_name, '__', ApiTestcaseMain.expect_result),
                    ApiTestcaseMain.sub_list
                ).join(
                    ApiTestcaseMainTagRelation, ApiTestcaseMain.id == ApiTestcaseMainTagRelation.api_testcase_id
                ).filter(
                    ApiTestcaseMain.api_product_line_id == api_product_line_id,
                    ApiTestcaseMainTagRelation.tag_id == tag_id_list[0]
                ).order_by(db.asc(ApiTestcaseMain.id)).all()
            elif 2 == len(tag_id_list):
                main_tag_relation1 = aliased(ApiTestcaseMainTagRelation)
                main_tag_relation2 = aliased(ApiTestcaseMainTagRelation)
                return sh.session.query(
                    ApiTestcaseMain.id,
                    func.concat(ApiTestcaseMain.testcase_name, '__', ApiTestcaseMain.expect_result),
                    ApiTestcaseMain.sub_list
                ).join(
                    main_tag_relation1, ApiTestcaseMain.id == main_tag_relation1.api_testcase_id
                ).join(
                    main_tag_relation2, ApiTestcaseMain.id == main_tag_relation2.api_testcase_id
                ).filter(
                    ApiTestcaseMain.api_product_line_id == api_product_line_id,
                    main_tag_relation1.tag_id == tag_id_list[0],
                    main_tag_relation2.tag_id == tag_id_list[1]
                ).order_by(db.asc(ApiTestcaseMain.id)).all()
            else:
                return []

    @staticmethod
    def filter_task_testcase_ids(product_line_id, tag_id_list):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiTestcaseMain.id
            ).join(
                ApiTestcaseMainTagRelation, ApiTestcaseMain.id == ApiTestcaseMainTagRelation.api_testcase_id
            ).filter(
                ApiTestcaseMain.api_product_line_id == product_line_id,
                ApiTestcaseMainTagRelation.tag_id.in_(tag_id_list),
                ApiTestcaseMain.case_status == 0
            ).distinct().all()

    @staticmethod
    def filter_task_testcase_ids_(product_line_ids, tag_id_list):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiTestcaseMain.id
            ).join(
                ApiTestcaseMainTagRelation, ApiTestcaseMain.id == ApiTestcaseMainTagRelation.api_testcase_id
            ).filter(
                ApiTestcaseMain.api_product_line_id.in_(product_line_ids),
                ApiTestcaseMainTagRelation.tag_id.in_(tag_id_list),
                ApiTestcaseMain.case_status == 0
            ).distinct().all()

class ApiTestcaseSubManager(object):
    @staticmethod
    def insert_testcase_sub(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseSub(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_testcase_sub(insert_list):
        with SessionHandler() as sh:
            objs = [ApiTestcaseSub(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_testcase_sub(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseSub.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_update_testcase_sub(update_list):
        with SessionHandler() as sh:
            from atp.utils.tools import json_dumps
            sub_id_list = []
            for sub in update_list:
                sub_id = sub.pop('sub_id', None)
                if not sub:
                    # Referenced sub-case: skip it without updating
                    sub_id_list.append(sub_id)
                    continue
                hr_request = sub.pop('request')
                if isinstance(hr_request, dict):
                    hr_request = json_dumps(hr_request)
                # Insert or update ApiTestcaseSub and ApiTestcaseRequestQll
                if not sub_id:
                    # insert
                    last_sub_obj = sh.session.query(func.max(ApiTestcaseSub.id)).first()
                    next_sub_id = last_sub_obj[0] + 1 if last_sub_obj[0] else 1
                    sub['id'] = next_sub_id
                    sub_id_list.append(next_sub_id)
                    obj = ApiTestcaseSub(**sub)
                    sh.session.add(obj)
                    sh.session.commit()
                    ApiTestcaseRequestQllManager.insert_request(
                        api_testcase_id=next_sub_id,
                        request=hr_request,
                    )
                else:
                    # update
                    sub_id_list.append(sub_id)
                    obj = ApiTestcaseSub.query.filter_by(id=sub_id).first()
                    for column in sub:
                        obj = obj_set_value(obj, column, sub[column])
                    sh.session.add(obj)
                    sh.session.commit()
                    ApiTestcaseRequestQllManager.update_request_by_testcase_id(
                        testcase_id=sub_id,
                        request=hr_request,
                    )
            return sub_id_list

    @staticmethod
    def delete_testcase_sub(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseSub.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()
                ApiTestcaseRequestQllManager.delete_request_by_testcase_id(testcase_id=id_)

    @staticmethod
    def get_testcase_sub(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseSub.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    # @custom_func_wrapper
    def get_testcase_subs(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestcaseSub.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_testcase_subs_in_id_list(id_list):
        with SessionHandler() as sh:
            objs = ApiTestcaseSub.query.filter(ApiTestcaseSub.id.in_(id_list)).all()
            return objs

class ApiTestcaseRequestQllManager(object):
    @staticmethod
    def insert_request(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequestQll(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_request(insert_list):
        with SessionHandler() as sh:
            objs = [ApiTestcaseRequestQll(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def delete_request(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequestQll.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def delete_request_by_testcase_id(testcase_id):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequestQll.query.filter_by(api_testcase_id=testcase_id).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def update_request(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequestQll.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def get_request(**kwargs):
        with SessionHandler() as sh:
            return ApiTestcaseRequestQll.query.filter_by(**kwargs).first()

    @staticmethod
    def get_requests(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestcaseRequestQll.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_redundant_requests():
        with SessionHandler() as sh:
            return sh.session.query(
                ApiTestcaseRequestQll.request
            ).group_by(
                ApiTestcaseRequestQll.request
            ).having(
                func.count(ApiTestcaseRequestQll.id) > 1
            ).all()

    @staticmethod
    def update_request_by_testcase_id(testcase_id, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseRequestQll.query.filter_by(api_testcase_id=testcase_id).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def get_requests_in_case_id_list(case_id_list):
        with SessionHandler() as sh:
            objs = ApiTestcaseRequestQll.query.filter(
                ApiTestcaseRequestQll.api_testcase_id.in_(case_id_list)).all()
            return objs

class EnvInfoManager(object):
    @staticmethod
    def insert_env(**kwargs):
        with SessionHandler() as sh:
            obj = EnvInfo(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def update_env(id_, **kwargs):
        with SessionHandler() as sh:
            obj = EnvInfo.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_env(id_):
        with SessionHandler() as sh:
            obj = EnvInfo.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def env_info(env_name=None):
        with SessionHandler() as sh:
            if env_name:
                # .all() was missing here: this branch returned a query object
                # while the other branch returned a list
                obj = EnvInfo.query.filter_by(env_name=env_name).all()
            else:
                obj = EnvInfo.query.all()
            return obj

    @staticmethod
    def is_env_id_exist(id):
        with SessionHandler() as sh:
            obj = EnvInfo.query.filter_by(id=id).first()
            return obj is not None

    @staticmethod
    def is_env_name_exist(name):
        with SessionHandler() as sh:
            obj = EnvInfo.query.filter_by(env_name=name).first()
            return obj is not None

    @staticmethod
    def get_env_info(id_):
        with SessionHandler() as sh:
            obj = EnvInfo.query.filter_by(id=id_).first()
            return obj

    @staticmethod
    def get_env(**kwargs):
        with SessionHandler() as sh:
            obj = EnvInfo.query.filter_by(**kwargs).first()
            return obj
# class ProjectInfoManager(object):
# @staticmethod
# def insert_project(**kwargs):
# obj = ProjectInfo(**kwargs)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def update_project(id_, **kwargs):
# obj = ProjectInfo.query.filter_by(id=id_).first()
# for column in kwargs:
# obj = obj_set_value(obj, column, kwargs[column])
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def delete_project(id_):
# obj = ProjectInfo.query.filter_by(id=id_).first()
# sh.session.delete(obj)
# sh.session.commit()
#
# @staticmethod
# def is_project_name_exist(project_name):
# project_name = '%' + project_name + '%'
# obj = ProjectInfo.query.filter(ProjectInfo.project_name.like(project_name)).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def is_project_name_exist_for_update(project_name):
# obj = ProjectInfo.query.filter_by(project_name=project_name).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def is_project_name_exist_union(id, project_name):
# obj = ProjectInfo.query.filter_by(project_name=project_name, id=id).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def is_project_id_exist(id):
# obj = ProjectInfo.query.filter_by(id=id).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def project_info(project_name):
# if project_name == "":
# obj = ProjectInfo.query.all()
# else:
# project_name = '%' + project_name + '%'
# obj = ProjectInfo.query.filter(ProjectInfo.project_name.like(project_name)).all()
#
# return obj
#
# @staticmethod
# def system_info(id_):
#
# obj = SystemInfo.query.filter_by(project_id=id_).all()
#
# return obj
#
# @staticmethod
# def select_id_by(project_name):
# obj = ProjectInfo.query.filter_by(project_name=project_name).first()
#
# return obj.id
#
# @staticmethod
# def get_project_info(id_):
# obj = ProjectInfo.query.filter_by(id=id_).first()
# return obj
#
#
# class SystemInfoManager(object):
# @staticmethod
# def insert_system(**kwargs):
# project_id = kwargs.get('project_id')
# pre_obj = SystemInfo.query.filter_by(project_id=project_id).order_by(db.desc(SystemInfo.index)).first()
# index = pre_obj.index + 1 if (pre_obj and pre_obj.index is not None) else 0
# kwargs.update({'index': index})
# obj = SystemInfo(**kwargs)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def update_system(id_, **kwargs):
# obj = SystemInfo.query.filter_by(id=id_).first()
# for column in kwargs:
# obj = obj_set_value(obj, column, kwargs[column])
# # obj.system_name = kwargs.pop('system_name')
# # obj.test_user = kwargs.pop('test_user', None)
# # obj.dev_user = kwargs.pop('dev_user', None)
# # obj.publish_app = kwargs.pop('publish_app', None)
# # obj.simple_desc = kwargs.pop('simple_desc', None)
# # obj.project_id = kwargs.pop('project_id', None)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def delete_system(id_):
# obj = SystemInfo.query.filter_by(id=id_).first()
# if obj:
# to_update_objs = SystemInfo.query.filter(
# SystemInfo.project_id == obj.project_id, SystemInfo.index > obj.index
# ).all()
# for to_update_obj in to_update_objs:
# to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
# sh.session.add(to_update_obj)
# sh.session.delete(obj)
# sh.session.commit()
#
# @staticmethod
# def is_system_name_exist(system_name, project_id=None):
# if project_id:
# obj = SystemInfo.query.filter_by(system_name=system_name, project_id=project_id).first()
# else:
# obj = SystemInfo.query.filter_by(system_name=system_name).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def query_systeminfo_by_id(_id):
# obj = SystemInfo.query.filter_by(id=_id).first()
# return obj
#
# @staticmethod
# def is_system_exist(system_name, project_id):
# obj = SystemInfo.query.filter_by(system_name=system_name, project_id=project_id).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def is_project_id_exist(project_id):
# obj = ProjectInfo.query.filter_by(id=project_id).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def is_system_id_exist(id):
# obj = SystemInfo.query.filter_by(id=id).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def system_info(system_name):
# if system_name == "":
# obj = SystemInfo.query.all()
# else:
# system_name = '%' + system_name + '%'
# obj = SystemInfo.query.filter(SystemInfo.system_name.like(system_name)).all()
#
# return obj
#
# @staticmethod
# def project_info(id_):
# obj = ProjectInfo.query.filter_by(id=id_).all()
# return obj
#
# @staticmethod
# def select_by_project_id(project_id):
# obj = SystemInfo.query.filter_by(project_id=project_id).all()
# return obj
#
# @staticmethod
# def query_id_by_projectid(project_id, system_name):
# obj = SystemInfo.query.filter_by(system_name=system_name, project_id=project_id).first()
# return obj.id
#
# @staticmethod
# def get_max_index(project_id):
# obj = SystemInfo.query.filter_by(project_id=project_id).order_by(db.desc(SystemInfo.index)).first()
# return obj.id
#
# @staticmethod
# def up_system(id_):
# obj = SystemInfo.query.filter_by(id=id_).first()
# if obj and obj.index != 0:
# pre_obj = SystemInfo.query.filter_by(project_id=obj.project_id, index=obj.index - 1).first()
# pre_obj = obj_set_value(pre_obj, 'index', obj.index)
# sh.session.add(pre_obj)
# obj = obj_set_value(obj, 'index', obj.index - 1)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def down_system(id_):
# obj = SystemInfo.query.filter_by(id=id_).first()
# if obj:
# next_obj = SystemInfo.query.filter_by(project_id=obj.project_id, index=obj.index + 1).first()
# if next_obj:
# next_obj = obj_set_value(next_obj, 'index', obj.index)
# sh.session.add(next_obj)
# obj = obj_set_value(obj, 'index', obj.index + 1)
# sh.session.add(obj)
# sh.session.commit()
#
#
# class ModuleInfoManager(object):
# @staticmethod
# def insert_module(**kwargs):
# obj = ModuleInfo(**kwargs)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def update_module(id_, **kwargs):
# obj = ModuleInfo.query.filter_by(id=id_).first()
# for column in kwargs:
# obj = obj_set_value(obj, column, kwargs[column])
# # obj.module_name = kwargs.pop('module_name')
# # obj.test_user = kwargs.pop('test_user', None)
# # obj.simple_desc = kwargs.pop('simple_desc', None)
# # obj.system_id = kwargs.pop('system_id', None)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def delete_module(id_):
# obj = ModuleInfo.query.filter_by(id=id_).first()
# if obj:
# sh.session.delete(obj)
# sh.session.commit()
#
# @staticmethod
# def query_module(system_id, module_name):
# obj = ModuleInfo.query.filter_by(system_id=system_id, module_name=module_name).all()
# return obj
#
# @staticmethod
# def query_module_id(id):
# obj = ModuleInfo.query.filter_by(id=id).all()
# return obj
#
# @staticmethod
# def query_module_by_id(id_):
# obj = ModuleInfo.query.filter_by(id=id_).first()
# return obj
#
# @staticmethod
# def query_module_name(name, system_id):
# obj = ModuleInfo.query.order_by(db.desc(ModuleInfo.create_time)).filter_by(module_name=name, system_id=system_id).first()
# return obj.id
#
# @staticmethod
# def query_all_module(system_id):
# obj = ModuleInfo.query.filter_by(system_id=system_id).all()
# return obj
#
# @staticmethod
# def is_module_exits(module_name, system_id):
# obj = ModuleInfo.query.filter_by(module_name=module_name, system_id=system_id).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def delete_module_by_system_id(system_id):
# """Delete the modules under the given system_id."""
# objs = ModuleInfo.query.filter_by(system_id=system_id).all()
# if objs:
# [sh.session.delete(obj) for obj in objs]
# sh.session.commit()
#
#
# class TestsuiteInfoManager(object):
# @staticmethod
# def insert_testsuite(**kwargs):
# obj = TestsuiteInfo(**kwargs)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def update_testsuite(id_, **kwargs):
# obj = TestsuiteInfo.query.filter_by(id=id_).first()
# for column in kwargs:
# obj = obj_set_value(obj, column, kwargs[column])
# # obj.testsuite_name = kwargs.pop('testsuite_name')
# # obj.simple_desc = kwargs.pop('simple_desc', None)
# # obj.module_id = kwargs.pop('module_id', None)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def delete_testsuite(id_):
# obj = TestsuiteInfo.query.filter_by(id=id_).first()
# if obj:
# sh.session.delete(obj)
# sh.session.commit()
#
# @staticmethod
# def query_all_testsuite(module_id):
# obj = TestsuiteInfo.query.filter_by(module_id=module_id).all()
# return obj
#
# @staticmethod
# def query_testsuiteid(testsuite_name, module_id):
# '''On import: check whether a duplicate testsuite exists for the given testsuite name and module.'''
# obj = TestsuiteInfo.query.filter_by(testsuite_name=testsuite_name, module_id=module_id).first()
# return obj.id
#
# @staticmethod
# def query_testsuite_by_id(id_):
# obj = TestsuiteInfo.query.filter_by(id=id_).first()
# return obj
#
# @staticmethod
# def is_testsuite_exits(testsuite_name, module_id):
# obj = TestsuiteInfo.query.filter_by(testsuite_name=testsuite_name, module_id=module_id).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def delete_testsuite_by_module_id(module_id):
# """Delete the testsuites under the given module_id."""
# objs = TestsuiteInfo.query.filter_by(module_id=module_id).all()
# if objs:
# [sh.session.delete(obj) for obj in objs]
# sh.session.commit()
#
# @staticmethod
# def query_testsuite_by_projectid(id_):
# testsuite_list = []
# system_objs = SystemInfo.query.filter_by(project_id=id_).all()
# for obj in system_objs:
# module_objs = ModuleInfo.query.filter_by(system_id=obj.id).all()
# for obj in module_objs:
# testsuites = TestsuiteInfo.query.filter_by(module_id=obj.id).all()
# for testsuite in testsuites:
# testsuite_list.append(testsuite.id)
# return testsuite_list
#
# @staticmethod
# def get_all_testsuites():
# return TestsuiteInfo.query.filter_by().all()
#
#
# class TestcaseInfoManager(object):
# @staticmethod
# def insert_testcase(**kwargs):
# testsuite_id = kwargs.get('testsuite_id')
# pre_obj = TestcaseInfo.query.filter_by(testsuite_id=testsuite_id).order_by(db.desc(TestcaseInfo.index)).first()
# index = pre_obj.index + 1 if pre_obj else 0
# kwargs.update({'index': index})
# obj = TestcaseInfo(**kwargs)
# sh.session.add(obj)
# sh.session.commit()
#
# '''Insert a batch of cases in a single commit.'''
#
# @staticmethod
# def insert_testcases(testcases):
# sh.session.add_all(testcases)
# sh.session.commit()
#
# @staticmethod
# def update_testcase(id_, **kwargs):
# obj = TestcaseInfo.query.filter_by(id=id_).first()
# for column in kwargs:
# obj = obj_set_value(obj, column, kwargs[column])
# # obj.testcase_name = kwargs.pop('testcase_name')
# # obj.type = kwargs.pop('type')
# # obj.include = kwargs.pop('include', None)
# # obj.request = kwargs.pop('request')
# # obj.testsuite_id = kwargs.pop('testsuite_id', None)
# # obj.module_id = kwargs.pop('module_id', None)
# # obj.system_id = kwargs.pop('system_id', None)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def delete_testcase(id_):
# obj = TestcaseInfo.query.filter_by(id=id_).first()
# if obj:
# to_update_objs = TestcaseInfo.query.filter(
# TestcaseInfo.testsuite_id == obj.testsuite_id, TestcaseInfo.index > obj.index
# ).all()
# for to_update_obj in to_update_objs:
# to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
# sh.session.add(to_update_obj)
# sh.session.delete(obj)
# sh.session.commit()
#
# @staticmethod
# def get_testcase(id_):
# obj = TestcaseInfo.query.filter_by(id=id_).first()
# return obj
#
# @staticmethod
# def get_all_testcases():
# objs = TestcaseInfo.query.filter_by().all()
# return objs
#
# @staticmethod
# def query_all_testcases_include():
# return sh.session.query(TestcaseInfo.include).filter().all()
#
# @staticmethod
# def is_testcase_id_exist(id_):
# """Check whether the id(s) exist when querying or deleting testcases."""
# if isinstance(id_, list):
# objs = TestcaseInfo.query.filter(TestcaseInfo.id.in_(id_)).all()
# if len(id_) == len(objs):
# return True
# else:
# return False
# else:
# obj = TestcaseInfo.query.filter_by(id=id_).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def query_suite_testcase(testsuite_id, page_no, page_size, testcase_name=None):
# """Get the testcases under a given testsuite."""
#
# if testcase_name:
# value = '%{0}%'.format(testcase_name)
# pagination_obj = TestcaseInfo.query.order_by(db.asc(TestcaseInfo.index)).filter \
# (TestcaseInfo.testsuite_id == testsuite_id, TestcaseInfo.testcase_name.ilike(value)). \
# paginate(page=page_no, per_page=page_size, error_out=False)
# else:
# pagination_obj = TestcaseInfo.query.order_by(db.asc(TestcaseInfo.index)).filter \
# (TestcaseInfo.testsuite_id == testsuite_id). \
# paginate(page=page_no, per_page=page_size, error_out=False)
# return pagination_obj
#
# @staticmethod
# def is_case_exists_by_testsuiteid(testsuite_id):
# """Get the testcases under a given testsuite."""
# obj = TestcaseInfo.query.filter_by(testsuite_id=testsuite_id).order_by(TestcaseInfo.index.asc(), TestcaseInfo.id.asc()).all()
# return obj
#
# @staticmethod
# def is_case_exists_by_moduleid(module_id):
# """Get the testcases under a given module."""
# objs = TestcaseInfo.query.filter_by(module_id=module_id).all()
# return objs
#
# @staticmethod
# def is_case_exists_by_systemid(system_id):
# """Get the testcases under a given system."""
# objs = TestcaseInfo.query.filter_by(system_id=system_id).all()
# return objs
#
# @staticmethod
# def get_no_index_testcases():
# """Get testcases that have not been assigned an index yet."""
# objs = TestcaseInfo.query.filter_by(index=None).all()
# return objs
#
# @staticmethod
# def get_testcases_by(**kwargs):
# """Get testcases matching the given conditions."""
# objs = TestcaseInfo.query.filter_by(**kwargs).all()
# return objs
#
# @staticmethod
# def get_case_status(id_list):
# objs = TestcaseInfo.query.filter(TestcaseInfo.id.in_(id_list)).all()
# return objs
#
# @staticmethod
# def get_system_testcases(system_id):
# """Get all testcases under a given system."""
# objs = TestcaseInfo.query.filter_by(system_id=system_id).all()
# return objs
#
# @staticmethod
# def get_last_obj_by_testsuite(testsuite_id):
# return TestcaseInfo.query.filter_by(testsuite_id=testsuite_id).order_by(db.desc(TestcaseInfo.index)).first()
#
# @staticmethod
# def up_testcase(id_):
# obj = TestcaseInfo.query.filter_by(id=id_).first()
# if obj and obj.index != 0:
# pre_obj = TestcaseInfo.query.filter_by(testsuite_id=obj.testsuite_id, index=obj.index - 1).first()
# pre_obj = obj_set_value(pre_obj, 'index', obj.index)
# sh.session.add(pre_obj)
# obj = obj_set_value(obj, 'index', obj.index - 1)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def down_testcase(id_):
# obj = TestcaseInfo.query.filter_by(id=id_).first()
# if obj:
# next_obj = TestcaseInfo.query.filter_by(testsuite_id=obj.testsuite_id, index=obj.index + 1).first()
# if next_obj:
# next_obj = obj_set_value(next_obj, 'index', obj.index)
# sh.session.add(next_obj)
# obj = obj_set_value(obj, 'index', obj.index + 1)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def index_update_while_remove_testcase(id_):
# obj = TestcaseInfo.query.filter_by(id=id_).first()
# if obj:
# to_update_objs = TestcaseInfo.query.filter(
# TestcaseInfo.testsuite_id == obj.testsuite_id, TestcaseInfo.index > obj.index
# ).all()
# for to_update_obj in to_update_objs:
# to_update_obj = obj_set_value(to_update_obj, 'index', to_update_obj.index - 1)
# sh.session.add(to_update_obj)
# sh.session.commit()
#
#
# class TestReportManager(object):
# @staticmethod
# def insert_testreport(**kwargs):
# obj = TestReport(**kwargs)
# sh.session.add(obj)
# try:
# sh.session.commit()
# return True
# except IntegrityError:
# sh.session.rollback()
# return False
#
# @staticmethod
# def delete_testreport(id_):
# obj = TestReport.query.filter_by(id=id_).first()
# if obj:
# sh.session.delete(obj)
# sh.session.commit()
#
# @staticmethod
# def query_testreport(system_id):
# obj = TestReport.query.filter_by(system_id=system_id).all()
# return obj
#
# @staticmethod
# def get_next_report_id():
# """Get the next id of the test_report table."""
# obj = sh.session.query(func.max(TestReport.id)).first()
# if not obj[0]:
# return 1
# return obj[0] + 1
#
# @staticmethod
# def update_testreport(id_, **kwargs):
# obj = TestReport.query.filter_by(id=id_).first()
# for column in kwargs:
# obj = obj_set_value(obj, column, kwargs[column])
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def query_testreport_by_id(id_):
# obj = TestReport.query.filter_by(id=id_).first()
# return obj
#
# @staticmethod
# def query_report_by_executor(executor):
# obj = TestReport.query.filter_by(executor=executor).order_by(db.desc(TestReport.id)).first()
# return obj
#
# @staticmethod
# def paging_query_reports(page_no, page_size, project_id=None, start_time=None, end_time=None, executor=None):
# if project_id and start_time and end_time and executor:
# pagination_obj = TestReport.query.filter(TestReport.create_time.between(start_time, end_time),
# TestReport.project_id == project_id,
# TestReport.executor == executor,
# TestReport.status.in_(('success', 'fail'))).order_by(
# db.desc(TestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
#
# elif project_id and start_time and end_time:
# pagination_obj = TestReport.query.filter(TestReport.create_time.between(start_time, end_time),
# TestReport.project_id == project_id,
# TestReport.status.in_(('success', 'fail'))).order_by(
# db.desc(TestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
#
# elif project_id and executor:
# pagination_obj = TestReport.query.filter(TestReport.project_id == project_id,
# TestReport.executor == executor,
# TestReport.status.in_(('success', 'fail'))).order_by(
# db.desc(TestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
#
# elif start_time and end_time and executor:
# pagination_obj = TestReport.query.filter(TestReport.create_time.between(start_time, end_time),
# TestReport.executor == executor,
# TestReport.status.in_(('success', 'fail'))).order_by(
# db.desc(TestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
#
# elif project_id:
# pagination_obj = TestReport.query.filter(TestReport.project_id == project_id,
# TestReport.status.in_(('success', 'fail'))).order_by(
# db.desc(TestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
#
# elif start_time and end_time:
# pagination_obj = TestReport.query.filter(TestReport.create_time.between(start_time, end_time),
# TestReport.status.in_(('success', 'fail'))).order_by(
# db.desc(TestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
#
# elif executor:
# pagination_obj = TestReport.query.filter(TestReport.executor == executor,
# TestReport.status.in_(('success', 'fail'))).order_by(
# db.desc(TestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
#
# else:
# pagination_obj = TestReport.query.filter(TestReport.status.in_(('success', 'fail'))).order_by(
# db.desc(TestReport.id)).paginate(page=page_no, per_page=page_size, error_out=False)
# return pagination_obj
#
#
# class PublicVariableInfoManage(object):
# @staticmethod
# def insert_public_variable(**kwargs):
# """Insert a new variable."""
# obj = PublicVariableInfo(**kwargs)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def is_variable_name_exist(system_id, variable_name):
# """Check whether the variable exists before querying or deleting it."""
# obj = PublicVariableInfo.query.filter_by(system_id=system_id, variable_name=variable_name).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def public_variable_paginate(page, num, keywords=None):
# if keywords:
# value = '%{0}%'.format(keywords)
# data = PublicVariableInfo.query.order_by(db.desc(PublicVariableInfo.create_time)).filter(
# or_(PublicVariableInfo.variable_name.ilike(value),
# PublicVariableInfo.vaule.ilike(value),
# PublicVariableInfo.type.ilike(value),
# PublicVariableInfo.system_id.ilike(value)), ).paginate(page=page, per_page=num, error_out=False)
# else:
# data = PublicVariableInfo.query.order_by(db.desc(PublicVariableInfo.create_time)).filter().paginate(
# page=page, per_page=num, error_out=False)
# return data
#
# @staticmethod
# def fetch_all_publicvariable():
# obj = PublicVariableInfo.query.order_by(db.desc(PublicVariableInfo.create_time)).filter().all()
# return obj
#
# @staticmethod
# def get_all_variables():
# obj = PublicVariableInfo.query.filter().all()
# return obj
#
# @staticmethod
# def query_variable_detail_byname(system_id, variable_name):
# """Get variable details by variable name."""
# obj = PublicVariableInfo.query.filter_by(system_id=system_id, variable_name=variable_name).first()
# return obj
#
# @staticmethod
# def query_variable_detail_byid(id_):
# """Return variable details by variable id."""
# obj = PublicVariableInfo.query.filter_by(id=id_).first()
# return obj
#
# @staticmethod
# def whether_variable_name_canbeupdated(variable_name, id_, system_id):
# """When renaming a variable, check whether the new name is already used by another variable."""
# obj = PublicVariableInfo.query.filter(PublicVariableInfo.id != id_,
# PublicVariableInfo.variable_name == variable_name,
# PublicVariableInfo.system_id == system_id).first()
# if obj:
# return True
# else:
# return False
#
# @staticmethod
# def delete_variable(system_id, variable_name):
# obj = PublicVariableInfo.query.filter_by(system_id=system_id, variable_name=variable_name).first()
# if obj:
# sh.session.delete(obj)
# sh.session.commit()
# return True
# else:
# return False
#
# @staticmethod
# def update_variable(id_, **kwargs):
# obj = PublicVariableInfo.query.filter_by(id=id_).first()
# for column in kwargs:
# obj = obj_set_value(obj, column, kwargs[column])
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def get_variables(id_list):
# objs = PublicVariableInfo.query.filter(PublicVariableInfo.id.in_(id_list)).all()
# return objs
#
# @staticmethod
# def get_variable(**kwargs):
# obj = PublicVariableInfo.query.filter_by(**kwargs).first()
# return obj
#
#
# class TestPlanManager(object):
# @staticmethod
# def insert_test_plan(**kwargs):
# obj = TestPlan(**kwargs)
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def update_test_plan(id_, **kwargs):
# obj = TestPlan.query.filter_by(id=id_).first()
# for column in kwargs:
# obj = obj_set_value(obj, column, kwargs[column])
# sh.session.add(obj)
# sh.session.commit()
#
# @staticmethod
# def delete_test_plan(id_):
# obj = TestPlan.query.filter_by(id=id_).first()
# if obj:
# sh.session.delete(obj)
# sh.session.commit()
#
# @staticmethod
# def get_test_plan(id_):
# obj = TestPlan.query.filter_by(id=id_).first()
# return obj
#
# @staticmethod
# def get_test_plan_by_name(plan_name):
# obj = TestPlan.query.filter_by(plan_name=plan_name).first()
# return obj
#
# @staticmethod
# def paging_query_plans(page_no, page_size, search_keyword=None):
# if search_keyword:
# return sh.session.query(
# TestPlan.id, TestPlan.plan_name, TestPlan.crontab, TestPlan.simple_desc, TestPlan.creator,
# TestPlan.project_id, TestPlan.env_id,
# TestPlan.last_modifier, ProjectInfo.project_name, EnvInfo.env_name
# ).outerjoin(
# EnvInfo, EnvInfo.id == TestPlan.env_id
# ).outerjoin(
# ProjectInfo, ProjectInfo.id == TestPlan.project_id
# ).filter(TestPlan.plan_name.ilike('%{0}%'.format(search_keyword))).order_by(
# db.desc(TestPlan.id)).paginate(page=page_no, per_page=page_size, error_out=False)
# else:
# return sh.session.query(
# TestPlan.id, TestPlan.plan_name, TestPlan.crontab, TestPlan.simple_desc, TestPlan.creator,
# TestPlan.project_id, TestPlan.env_id,
# TestPlan.last_modifier, ProjectInfo.project_name, EnvInfo.env_name
# ).outerjoin(
# EnvInfo, EnvInfo.id == TestPlan.env_id
# ).outerjoin(
# ProjectInfo, ProjectInfo.id == TestPlan.project_id
# ).filter().order_by(
# db.desc(TestPlan.id)).paginate(page=page_no, per_page=page_size, error_out=False)
class UserManager(object):
@staticmethod
def insert_user(**kwargs):
with SessionHandler() as sh:
obj = User(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def update_user(id_, **kwargs):
with SessionHandler() as sh:
obj = User.query.filter_by(id=id_).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_user(id_):
with SessionHandler() as sh:
obj = User.query.filter_by(id=id_).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_user(id_):
with SessionHandler() as sh:
obj = User.query.filter_by(id=id_).first()
return obj
@staticmethod
def get_user_by_username(username):
with SessionHandler() as sh:
obj = User.query.filter_by(username=username).first()
return obj
@staticmethod
def get_user_by_nickname(nickname):
with SessionHandler() as sh:
obj = User.query.filter_by(nickname=nickname).first()
return obj
@staticmethod
def paging_query_users(page_no, page_size):
with SessionHandler() as sh:
pagination_obj = User.query.filter().order_by(db.asc(User.id)).paginate(
page=page_no, per_page=page_size, error_out=False)
return pagination_obj
@staticmethod
def get_all_username_nickname():
with SessionHandler() as sh:
return sh.session.query(
User.username, User.nickname
).filter().all()
class TestcaseTagManager(object):
@staticmethod
def get_testcase_tag(id_):
with SessionHandler() as sh:
return TestcaseTag.query.filter_by(id=id_).first()
@staticmethod
def query_testcase_tag():
with SessionHandler() as sh:
return TestcaseTag.query.filter_by().all()
@staticmethod
def query_testcase_tags(**kwargs):
with SessionHandler() as sh:
return TestcaseTag.query.filter_by(**kwargs).all()
@staticmethod
def get_tag_categories():
with SessionHandler() as sh:
            return sh.session.query(TestcaseTag.tag_category).distinct().all()
def stat_api_testcase():
with SessionHandler() as sh:
return sh.session.query(
ApiCompanyInfo.company_name, func.count(ApiTestcaseInfo.id)
).outerjoin(
ApiSystemInfo, ApiCompanyInfo.id == ApiSystemInfo.api_company_id
).outerjoin(
ApiIntfInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).filter(
# ApiTestcaseInfo.case_status == '0'
).group_by(
ApiCompanyInfo.id
).all()
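The query above counts testcases per company by walking the company -> system -> intf -> testcase join chain and grouping by company. A minimal pure-Python sketch of the same aggregation, using hypothetical dict/tuple records in place of the ORM rows:

```python
# Hypothetical stand-ins for the joined rows: system -> company name,
# intf -> system, and (testcase_id, intf_id) pairs.
from collections import Counter

systems = {1: 'ACME', 2: 'ACME', 3: 'Globex'}             # system_id -> company_name
intfs = {10: 1, 11: 2, 12: 3}                             # intf_id -> system_id
testcases = [(100, 10), (101, 10), (102, 11), (103, 12)]  # (testcase_id, intf_id)

# Equivalent of GROUP BY company: resolve each testcase to its company, then count.
counts = Counter(systems[intfs[intf_id]] for _, intf_id in testcases)
print(dict(counts))  # {'ACME': 3, 'Globex': 1}
```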
def get_reuse_group_by_testcase_id(start_date, end_date, intf_id):
with SessionHandler() as sh:
return sh.session.query(
ApiTestcaseInfo.id, ApiTestcaseInfo.testcase_name, func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseReuseRecord, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).filter(
ApiTestcaseInfo.api_intf_id == intf_id, ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
ApiTestcaseInfo.id
).all()
def get_reuse_group_by_intf_id(start_date, end_date, system_id):
with SessionHandler() as sh:
return sh.session.query(
ApiIntfInfo.id, ApiIntfInfo.intf_name, func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).outerjoin(
ApiTestcaseReuseRecord, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).filter(
ApiIntfInfo.api_system_id == system_id, ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
ApiIntfInfo.id
).all()
def get_reuse_group_by_system_id(start_date, end_date, company_id):
with SessionHandler() as sh:
return sh.session.query(
ApiSystemInfo.id, ApiSystemInfo.system_name, func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiIntfInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).outerjoin(
ApiTestcaseInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).outerjoin(
ApiTestcaseReuseRecord, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).filter(
ApiSystemInfo.api_company_id == company_id, ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
ApiSystemInfo.id
).all()
def get_reuse_group_by_day(start_date, end_date, company_id=None, system_id=None, intf_id=None):
with SessionHandler() as sh:
if intf_id:
return sh.session.query(
ApiTestcaseReuseRecord.record_date, func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).filter(
ApiTestcaseInfo.api_intf_id == intf_id, ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
ApiTestcaseReuseRecord.record_date
).all()
elif system_id:
return sh.session.query(
ApiTestcaseReuseRecord.record_date, func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).filter(
ApiIntfInfo.api_system_id == system_id, ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
ApiTestcaseReuseRecord.record_date
).all()
elif company_id:
return sh.session.query(
ApiTestcaseReuseRecord.record_date, func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).filter(
ApiSystemInfo.api_company_id == company_id,
ApiTestcaseReuseRecord.record_date.between(start_date, end_date),
).group_by(
ApiTestcaseReuseRecord.record_date
).all()
def get_reuse_group_by_week(start_date, end_date, company_id=None, system_id=None, intf_id=None):
with SessionHandler() as sh:
if intf_id:
return sh.session.query(
func.week(ApiTestcaseReuseRecord.record_date), func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).filter(
ApiTestcaseInfo.api_intf_id == intf_id, func.year(ApiTestcaseReuseRecord.record_date) == 2019,
ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
func.week(ApiTestcaseReuseRecord.record_date)
).all()
elif system_id:
return sh.session.query(
func.week(ApiTestcaseReuseRecord.record_date), func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).filter(
ApiIntfInfo.api_system_id == system_id, func.year(ApiTestcaseReuseRecord.record_date) == 2019,
ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
func.week(ApiTestcaseReuseRecord.record_date)
).all()
elif company_id:
return sh.session.query(
func.week(ApiTestcaseReuseRecord.record_date), func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).filter(
ApiSystemInfo.api_company_id == company_id, func.year(ApiTestcaseReuseRecord.record_date) == 2019,
ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
func.week(ApiTestcaseReuseRecord.record_date)
).all()
def get_reuse_group_by_month(start_date, end_date, company_id=None, system_id=None, intf_id=None):
with SessionHandler() as sh:
if intf_id:
return sh.session.query(
func.month(ApiTestcaseReuseRecord.record_date), func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).filter(
ApiTestcaseInfo.api_intf_id == intf_id, func.year(ApiTestcaseReuseRecord.record_date) == 2019,
ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
func.month(ApiTestcaseReuseRecord.record_date)
).all()
elif system_id:
return sh.session.query(
func.month(ApiTestcaseReuseRecord.record_date), func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).filter(
ApiIntfInfo.api_system_id == system_id, func.year(ApiTestcaseReuseRecord.record_date) == 2019,
ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
func.month(ApiTestcaseReuseRecord.record_date)
).all()
elif company_id:
return sh.session.query(
func.month(ApiTestcaseReuseRecord.record_date), func.sum(ApiTestcaseReuseRecord.total_times),
func.sum(ApiTestcaseReuseRecord.success_times), func.sum(ApiTestcaseReuseRecord.fail_times)
).outerjoin(
ApiTestcaseInfo, ApiTestcaseInfo.id == ApiTestcaseReuseRecord.api_testcase_id
).outerjoin(
ApiIntfInfo, ApiIntfInfo.id == ApiTestcaseInfo.api_intf_id
).outerjoin(
ApiSystemInfo, ApiSystemInfo.id == ApiIntfInfo.api_system_id
).filter(
ApiSystemInfo.api_company_id == company_id, func.year(ApiTestcaseReuseRecord.record_date) == 2019,
ApiTestcaseReuseRecord.record_date.between(start_date, end_date)
).group_by(
func.month(ApiTestcaseReuseRecord.record_date)
).all()
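The `_by_day`/`_by_week`/`_by_month` variants differ only in the grouping key (`record_date` itself vs. `func.week(...)` vs. `func.month(...)`). A sketch of that period bucketing in pure Python, summing each record's counters under a calendar-period key; the records are hypothetical, and ISO week numbering from `isocalendar()` is only analogous to SQL's `WEEK()`, whose numbering conventions differ by mode:

```python
# Hypothetical (record_date, total_times, success_times) rows.
from collections import defaultdict
from datetime import date

records = [
    (date(2019, 1, 7), 5, 4),   # ISO week 2
    (date(2019, 1, 8), 3, 3),   # ISO week 2
    (date(2019, 2, 1), 2, 1),   # ISO week 5
]

# Equivalent of GROUP BY week: accumulate [total, success] per week number.
by_week = defaultdict(lambda: [0, 0])
for record_date, total, success in records:
    week_key = record_date.isocalendar()[1]
    by_week[week_key][0] += total
    by_week[week_key][1] += success

print(dict(by_week))  # {2: [8, 7], 5: [2, 1]}
```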
def obj_set_value(obj, attr, value):
if value is not None and hasattr(obj, attr):
setattr(obj, attr, value)
return obj
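`obj_set_value` deliberately skips `None` values and attributes the object does not already have, so partial-update kwargs cannot clear a column or create stray attributes. A self-contained sketch of that behavior (`_Demo` is a hypothetical stand-in for a model instance, and the helper is restated locally so the snippet runs on its own):

```python
class _Demo:
    def __init__(self):
        self.name = 'old'

def obj_set_value(obj, attr, value):
    # Assign only when the value is not None and the attribute exists.
    if value is not None and hasattr(obj, attr):
        setattr(obj, attr, value)
    return obj

d = _Demo()
obj_set_value(d, 'name', 'new')      # updates the existing attribute
obj_set_value(d, 'name', None)       # None is ignored; 'new' is kept
obj_set_value(d, 'missing', 'x')     # unknown attribute is ignored
print(d.name)                        # new
print(hasattr(d, 'missing'))         # False
```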
class BaseProjectInfoManager(object):
@staticmethod
def insert_base_project(**kwargs):
with SessionHandler() as sh:
obj = BaseProjectInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def update_base_project(id_, **kwargs):
with SessionHandler() as sh:
obj = BaseProjectInfo.query.filter_by(id=id_).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_base_project(**kwargs):
with SessionHandler() as sh:
obj = BaseProjectInfo.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_project(**kwargs):
with SessionHandler() as sh:
obj = BaseProjectInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def get_all_projects():
with SessionHandler() as sh:
objs = BaseProjectInfo.query.filter_by().all()
return objs
@staticmethod
def base_project_info(project_name):
with SessionHandler() as sh:
if project_name:
project_name = '%' + project_name + '%'
obj = BaseProjectInfo.query.filter(BaseProjectInfo.project_name.like(project_name)).all()
else:
obj = BaseProjectInfo.query.all()
return obj
class BaseSystemInfoManager(object):
@staticmethod
def insert_base_system(**kwargs):
with SessionHandler() as sh:
obj = BaseSystemInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_base_system(**kwargs):
with SessionHandler() as sh:
obj = BaseSystemInfo.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_system(system_name, project_id):
with SessionHandler() as sh:
obj = BaseSystemInfo.query.filter_by(system_name=system_name, project_id=project_id).first()
return obj
@staticmethod
def get_systems(project_id):
with SessionHandler() as sh:
return BaseSystemInfo.query.filter_by(project_id=project_id).all()
@staticmethod
def get_total_systems():
with SessionHandler() as sh:
return BaseSystemInfo.query.filter_by().all()
@staticmethod
def get_all_system(project_id):
with SessionHandler() as sh:
obj = BaseSystemInfo.query.filter_by(project_id=project_id).all()
return obj
@staticmethod
def query_system(**kwargs):
with SessionHandler() as sh:
obj = BaseSystemInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def update_system(**kwargs):
with SessionHandler() as sh:
systemId = kwargs.pop('systemId')
obj = BaseSystemInfo.query.filter_by(id=systemId).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
class BaseModuleInfoManager(object):
@staticmethod
def insert_base_module(**kwargs):
with SessionHandler() as sh:
obj = BaseModuleInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_base_module(**kwargs):
with SessionHandler() as sh:
obj = BaseModuleInfo.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def batch_delete_by_module_ids(module_ids):
with SessionHandler() as sh:
objs = BaseModuleInfo.query.filter(BaseModuleInfo.id.in_(module_ids)).all()
if objs:
                for obj in objs:
                    sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_module(**kwargs):
with SessionHandler() as sh:
obj = BaseModuleInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def get_modules(**kwargs):
with SessionHandler() as sh:
objs = BaseModuleInfo.query.filter_by(**kwargs).all()
return objs
@staticmethod
def get_modules_by_module_ids(module_ids):
with SessionHandler() as sh:
objs = BaseModuleInfo.query.filter(BaseModuleInfo.id.in_(module_ids)).all()
return objs
class BaseModuleInfoBakManager(object):
@staticmethod
def insert_base_module(**kwargs):
with SessionHandler() as sh:
obj = BaseModuleInfoBak(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_base_module(**kwargs):
with SessionHandler() as sh:
obj = BaseModuleInfoBak.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def batch_delete_by_module_ids(module_ids):
with SessionHandler() as sh:
objs = BaseModuleInfoBak.query.filter(BaseModuleInfoBak.id.in_(module_ids)).all()
if objs:
                for obj in objs:
                    sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_module(**kwargs):
with SessionHandler() as sh:
return BaseModuleInfoBak.query.filter_by(**kwargs).first()
@staticmethod
def get_modules(**kwargs):
with SessionHandler() as sh:
return BaseModuleInfoBak.query.filter_by(**kwargs).all()
class BaseTestcaseInfoManager(object):
@staticmethod
def insert_base_testcase(**kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def batch_insert_base_testcase(insert_list):
with SessionHandler() as sh:
objs = [BaseTestcaseInfo(**kw) for kw in insert_list]
sh.session.bulk_save_objects(objs)
sh.session.commit()
@staticmethod
def update_base_testcase(id_, **kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfo.query.filter_by(id=id_).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_base_testcase(**kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfo.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def batch_delete_by_module_ids(module_ids):
with SessionHandler() as sh:
objs = BaseTestcaseInfo.query.filter(BaseTestcaseInfo.module_id.in_(module_ids)).all()
if objs:
                for obj in objs:
                    sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_testcase(**kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def get_all_testcase(**kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfo.query.filter_by(**kwargs).all()
return obj
@staticmethod
def get_testcases_in_module_ids(module_ids):
with SessionHandler() as sh:
objs = BaseTestcaseInfo.query.filter(BaseTestcaseInfo.module_id.in_(module_ids)).all()
return objs
@staticmethod
def query_all_basecase(module_id, page_no, page_size, testcase_name=None):
        """Fetch the testcases under a given module."""
with SessionHandler() as sh:
if testcase_name:
value = '%{0}%'.format(testcase_name)
pagination_obj = BaseTestcaseInfo.query.order_by(db.asc(BaseTestcaseInfo.id)).filter(
BaseTestcaseInfo.module_id == module_id, BaseTestcaseInfo.testcase_name.ilike(value)).paginate(
page=page_no, per_page=page_size, error_out=False)
else:
pagination_obj = BaseTestcaseInfo.query.order_by(db.asc(BaseTestcaseInfo.id)).filter(
BaseTestcaseInfo.module_id == module_id).paginate(page=page_no, per_page=page_size, error_out=False)
return pagination_obj
@staticmethod
def count_testcases(**kwargs):
with SessionHandler() as sh:
return sh.session.query(func.count(BaseTestcaseInfo.id)).filter_by(**kwargs).first()
@staticmethod
def group_testcases_by_module_id():
with SessionHandler() as sh:
return sh.session.query(BaseTestcaseInfo.module_id, func.count(BaseTestcaseInfo.id)).filter_by().group_by(
BaseTestcaseInfo.module_id).all()
class BaseTestcaseInfoBakManager(object):
@staticmethod
def insert_base_testcase(**kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfoBak(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def batch_insert_base_testcase(insert_list):
with SessionHandler() as sh:
objs = [BaseTestcaseInfoBak(**kw) for kw in insert_list]
sh.session.bulk_save_objects(objs)
sh.session.commit()
@staticmethod
def delete_base_testcase(**kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfoBak.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def batch_delete_by_module_ids(module_ids):
with SessionHandler() as sh:
objs = BaseTestcaseInfoBak.query.filter(BaseTestcaseInfoBak.module_id.in_(module_ids)).all()
if objs:
                for obj in objs:
                    sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_testcase(**kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfoBak.query.filter_by(**kwargs).first()
return obj
@staticmethod
def get_all_testcase(**kwargs):
with SessionHandler() as sh:
obj = BaseTestcaseInfoBak.query.filter_by(**kwargs).all()
return obj
@staticmethod
def get_testcases_in_module_ids(module_ids):
with SessionHandler() as sh:
            objs = BaseTestcaseInfoBak.query.filter(BaseTestcaseInfoBak.module_id.in_(module_ids)).all()
return objs
@staticmethod
def query_all_basecase(module_id, page_no, page_size, testcase_name=None):
        """Fetch the testcases under a given module."""
with SessionHandler() as sh:
if testcase_name:
value = '%{0}%'.format(testcase_name)
pagination_obj = BaseTestcaseInfoBak.query.order_by(db.asc(BaseTestcaseInfoBak.id)).filter(
BaseTestcaseInfoBak.module_id == module_id,
BaseTestcaseInfoBak.testcase_name.ilike(value)).paginate(
page=page_no, per_page=page_size, error_out=False)
else:
pagination_obj = BaseTestcaseInfoBak.query.order_by(db.asc(BaseTestcaseInfoBak.id)).filter(
BaseTestcaseInfoBak.module_id == module_id).paginate(page=page_no, per_page=page_size,
error_out=False)
return pagination_obj
@staticmethod
def count_testcases(**kwargs):
with SessionHandler() as sh:
return sh.session.query(func.count(BaseTestcaseInfoBak.id)).filter_by(**kwargs).first()
@staticmethod
def group_testcases_by_module_id():
with SessionHandler() as sh:
return sh.session.query(BaseTestcaseInfoBak.module_id, func.count(BaseTestcaseInfoBak.id)).filter_by(
).group_by(BaseTestcaseInfoBak.module_id).all()
class BaseJobHistoryManager(object):
@staticmethod
def insert_job_history(**kwargs):
with SessionHandler() as sh:
obj = BaseJobHistory(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def get_last_record():
with SessionHandler() as sh:
obj = sh.session.query(func.max(BaseJobHistory.create_time).label("last_time")).first()
return obj
class UiProjectInfoManager(object):
@staticmethod
def insert_ui_project(**kwargs):
with SessionHandler() as sh:
obj = UiProjectInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def ui_project_info(project_name):
with SessionHandler() as sh:
if project_name:
project_name = '%' + project_name + '%'
obj = UiProjectInfo.query.filter(UiProjectInfo.project_name.like(project_name)).all()
else:
obj = UiProjectInfo.query.all()
return obj
class UiSystemInfoManager(object):
@staticmethod
def insert_ui_system(**kwargs):
with SessionHandler() as sh:
obj = UiSystemInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_ui_system(**kwargs):
with SessionHandler() as sh:
obj = UiSystemInfo.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_all_system(project_id):
with SessionHandler() as sh:
obj = UiSystemInfo.query.filter_by(project_id=project_id).all()
return obj
@staticmethod
def query_system(**kwargs):
with SessionHandler() as sh:
obj = UiSystemInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def update_system(**kwargs):
with SessionHandler() as sh:
systemId = kwargs.pop('systemId')
obj = UiSystemInfo.query.filter_by(id=systemId).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
class UiModuleInfoManager(object):
@staticmethod
def insert_ui_module(**kwargs):
with SessionHandler() as sh:
obj = UiModuleInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_ui_module(**kwargs):
with SessionHandler() as sh:
obj = UiModuleInfo.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_module(**kwargs):
with SessionHandler() as sh:
obj = UiModuleInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def get_modules(**kwargs):
with SessionHandler() as sh:
objs = UiModuleInfo.query.filter_by(**kwargs).all()
return objs
class UICasePageInfoManager(object):
@staticmethod
def insert_ui_page(**kwargs):
with SessionHandler() as sh:
obj = UICasePageInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def query_ui_page(**kwargs):
with SessionHandler() as sh:
obj = UICasePageInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def query_ui_pages(**kwargs):
with SessionHandler() as sh:
obj = UICasePageInfo.query.filter_by(**kwargs).all()
return obj
class UICasePageObjectInfoManage(object):
@staticmethod
def insert_ui_pageobject(**kwargs):
with SessionHandler() as sh:
obj = UICasePageObjectInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def query_paget_object(**kwargs):
with SessionHandler() as sh:
obj = UICasePageObjectInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def delete_paget_object(**kwargs):
with SessionHandler() as sh:
obj = UICasePageObjectInfo.query.filter_by(**kwargs).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def query_all_page_object(**kwargs):
with SessionHandler() as sh:
obj = UICasePageObjectInfo.query.filter_by(**kwargs).all()
return obj
@staticmethod
def eidt_page_object(**kwargs):
with SessionHandler() as sh:
objectId = kwargs.pop('object_id')
obj = UICasePageObjectInfo.query.filter_by(id=objectId).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
class UITestCaseInfoManage(object):
@staticmethod
def insert_ui_testcase(**kwargs):
with SessionHandler() as sh:
obj = UiTestcaseInfo(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def update_ui_testcase(id_, **kwargs):
with SessionHandler() as sh:
obj = UiTestcaseInfo.query.filter_by(id=id_).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
@staticmethod
def query_ui_testcase(**kwargs):
with SessionHandler() as sh:
obj = UiTestcaseInfo.query.filter_by(**kwargs).first()
return obj
@staticmethod
def query_ui_testcases(**kwargs):
with SessionHandler() as sh:
obj = UiTestcaseInfo.query.filter_by(**kwargs).all()
return obj
@staticmethod
def delete_ui_testcase(id_):
with SessionHandler() as sh:
obj = UiTestcaseInfo.query.filter_by(id=id_).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
class ApiIntfDefaultRequestManager(object):
@staticmethod
def insert_request(**kwargs):
with SessionHandler() as sh:
obj = ApiIntfDefaultRequest(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def batch_insert_request(insert_list):
with SessionHandler() as sh:
objs = [ApiIntfDefaultRequest(**kw) for kw in insert_list]
sh.session.bulk_save_objects(objs)
sh.session.commit()
@staticmethod
def delete_request(id_):
with SessionHandler() as sh:
obj = ApiIntfDefaultRequest.query.filter_by(id=id_).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def delete_request_by_intf_id(intf_id):
with SessionHandler() as sh:
obj = ApiIntfDefaultRequest.query.filter_by(api_intf_id=intf_id).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def update_request(id_, **kwargs):
with SessionHandler() as sh:
obj = ApiIntfDefaultRequest.query.filter_by(id=id_).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
@staticmethod
def get_request(**kwargs):
with SessionHandler() as sh:
obj = ApiIntfDefaultRequest.query.filter_by(**kwargs).first()
return obj
@staticmethod
def update_request_by_intf_id(intf_id, **kwargs):
with SessionHandler() as sh:
obj = ApiIntfDefaultRequest.query.filter_by(api_intf_id=intf_id).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
class ApiTestcaseReuseRecordManager(object):
@staticmethod
def insert_record(**kwargs):
with SessionHandler() as sh:
obj = ApiTestcaseReuseRecord(**kwargs)
sh.session.add(obj)
sh.session.commit()
@staticmethod
def batch_insert_record(insert_list):
with SessionHandler() as sh:
objs = [ApiTestcaseReuseRecord(**kw) for kw in insert_list]
sh.session.bulk_save_objects(objs)
sh.session.commit()
@staticmethod
def update_record(id_, **kwargs):
with SessionHandler() as sh:
obj = ApiTestcaseReuseRecord.query.filter_by(id=id_).first()
for column in kwargs:
obj = obj_set_value(obj, column, kwargs[column])
sh.session.add(obj)
sh.session.commit()
@staticmethod
def delete_record(id_):
with SessionHandler() as sh:
obj = ApiTestcaseReuseRecord.query.filter_by(id=id_).first()
if obj:
sh.session.delete(obj)
sh.session.commit()
@staticmethod
def get_record(**kwargs):
with SessionHandler() as sh:
obj = ApiTestcaseReuseRecord.query.filter_by(**kwargs).first()
return obj
@staticmethod
def get_records(**kwargs):
with SessionHandler() as sh:
objs = ApiTestcaseReuseRecord.query.filter_by(**kwargs).all()
return objs
@staticmethod
def get_recent_summary(testcase_id, start_date):
with SessionHandler() as sh:
return sh.session.query(
func.sum(ApiTestcaseReuseRecord.total_times), func.sum(ApiTestcaseReuseRecord.success_times),
).filter(
ApiTestcaseReuseRecord.api_testcase_id == testcase_id,
ApiTestcaseReuseRecord.is_setup == 0, ApiTestcaseReuseRecord.record_date > start_date
).first()
class ApiTaskInfoManager(object):

    @staticmethod
    def insert_task(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTaskInfo(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def update_task(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTaskInfo.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_task(id_):
        with SessionHandler() as sh:
            obj = ApiTaskInfo.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_task(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTaskInfo.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_tasks(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTaskInfo.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_tasks_paginate(page, num, company_id, keywords=None):
        with SessionHandler() as sh:
            task_type_list = [1, 2]
            if keywords:
                value = '%{0}%'.format(keywords)
                objs = ApiTaskInfo.query.order_by(db.desc(ApiTaskInfo.create_time)).filter(
                    ApiTaskInfo.api_company_id == company_id,
                    ApiTaskInfo.task_type.in_(task_type_list),
                    ApiTaskInfo.task_name.ilike(value)).paginate(page=page, per_page=num, error_out=False)
            else:
                objs = ApiTaskInfo.query.order_by(db.desc(ApiTaskInfo.create_time)).filter(
                    ApiTaskInfo.api_company_id == company_id,
                    ApiTaskInfo.task_type.in_(task_type_list)).paginate(
                    page=page, per_page=num, error_out=False)
            return objs

    @staticmethod
    def get_smoking_tasks_paginate(page, num, company_id, keywords=None):
        with SessionHandler() as sh:
            if keywords:
                value = '%{0}%'.format(keywords)
                objs = ApiTaskInfo.query.order_by(
                    db.desc(func.coalesce(ApiTaskInfo.update_time, ApiTaskInfo.create_time))).filter(
                    ApiTaskInfo.api_company_id == company_id,
                    ApiTaskInfo.task_type == 3,
                    ApiTaskInfo.task_name.ilike(value)).paginate(page=page, per_page=num, error_out=False)
            else:
                objs = ApiTaskInfo.query.order_by(
                    db.desc(func.coalesce(ApiTaskInfo.update_time, ApiTaskInfo.create_time))).filter(
                    ApiTaskInfo.api_company_id == company_id, ApiTaskInfo.task_type == 3).paginate(
                    page=page, per_page=num, error_out=False)
            return objs

    @staticmethod
    def get_regression_tasks_paginate(page, num, company_id, keywords=None):
        with SessionHandler() as sh:
            if keywords:
                value = '%{0}%'.format(keywords)
                objs = ApiTaskInfo.query.order_by(
                    db.desc(func.coalesce(ApiTaskInfo.update_time, ApiTaskInfo.create_time))).filter(
                    ApiTaskInfo.api_company_id == company_id,
                    ApiTaskInfo.task_type == 4,
                    ApiTaskInfo.task_name.ilike(value)).paginate(page=page, per_page=num, error_out=False)
            else:
                objs = ApiTaskInfo.query.order_by(
                    db.desc(func.coalesce(ApiTaskInfo.update_time, ApiTaskInfo.create_time))).filter(
                    ApiTaskInfo.api_company_id == company_id, ApiTaskInfo.task_type == 4).paginate(
                    page=page, per_page=num, error_out=False)
            return objs

    @staticmethod
    def get_tasks_in_project_id_list(project_id_list):
        with SessionHandler() as sh:
            objs = ApiTaskInfo.query.filter(
                ApiTaskInfo.api_project_id.in_(project_id_list)
            ).order_by(db.desc(ApiTaskInfo.id)).all()
            return objs
class CeleryTaskRecordManager(object):

    @staticmethod
    def insert_celery(**kwargs):
        with SessionHandler() as sh:
            obj = CeleryTaskRecord(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def batch_insert_celery(insert_list):
        with SessionHandler() as sh:
            objs = [CeleryTaskRecord(**kw) for kw in insert_list]
            sh.session.bulk_save_objects(objs)
            sh.session.commit()

    @staticmethod
    def update_celery(id_, **kwargs):
        with SessionHandler() as sh:
            obj = CeleryTaskRecord.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def update_celery_by_task_no(celery_task_no, **kwargs):
        with SessionHandler() as sh:
            obj = CeleryTaskRecord.query.filter_by(celery_task_no=celery_task_no).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_celery(id_):
        with SessionHandler() as sh:
            obj = CeleryTaskRecord.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_celery(**kwargs):
        with SessionHandler() as sh:
            obj = CeleryTaskRecord.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_celeries(**kwargs):
        with SessionHandler() as sh:
            objs = CeleryTaskRecord.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_callback_celery(api_run_task_result_id):
        with SessionHandler() as sh:
            callback_keywords = ['WAITING', 'RUNNING', 'ERROR', 'SUCCESS']
            obj = CeleryTaskRecord.query.filter(
                CeleryTaskRecord.api_run_task_result_id == api_run_task_result_id,
                CeleryTaskRecord.celery_task_status.in_(callback_keywords)
            ).first()
            return obj
class ApiRunTaskResultManager(object):

    @staticmethod
    def insert_result(**kwargs):
        with SessionHandler() as sh:
            obj = ApiRunTaskResult(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def update_result(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiRunTaskResult.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_result(id_):
        with SessionHandler() as sh:
            obj = ApiRunTaskResult.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_result(**kwargs):
        with SessionHandler() as sh:
            obj = ApiRunTaskResult.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_results(**kwargs):
        with SessionHandler() as sh:
            objs = ApiRunTaskResult.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_results_order_by_id_desc(**kwargs):
        with SessionHandler() as sh:
            objs = ApiRunTaskResult.query.filter_by(**kwargs).order_by(
                db.desc(ApiRunTaskResult.id)).limit(10).all()
            return objs

    @staticmethod
    def get_next_result_id():
        """Get the next id for this table."""
        with SessionHandler() as sh:
            obj = sh.session.query(func.max(ApiRunTaskResult.id)).first()
            if not obj[0]:
                return 1
            return obj[0] + 1

    @staticmethod
    def get_last_result_by_task_id(task_id):
        with SessionHandler() as sh:
            obj = ApiRunTaskResult.query.filter(ApiRunTaskResult.api_task_id == task_id).order_by(
                db.desc(ApiRunTaskResult.create_time)).first()
            return obj

    @staticmethod
    def get_results_group_by_run_date_in_company(company_id, recent_days):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiRunTaskResult.run_date, func.count(ApiRunTaskResult.id), func.sum(ApiRunTaskResult.total_cases),
                func.sum(ApiRunTaskResult.not_run_cases), func.sum(ApiRunTaskResult.run_cases),
                func.sum(ApiRunTaskResult.success_cases), func.sum(ApiRunTaskResult.fail_cases),
            ).outerjoin(
                ApiTaskInfo, ApiTaskInfo.id == ApiRunTaskResult.api_task_id
            ).outerjoin(
                ApiProjectInfo, ApiProjectInfo.id == ApiTaskInfo.api_project_id
            ).filter(
                ApiProjectInfo.api_company_id == company_id
            ).group_by(
                ApiRunTaskResult.run_date
            ).order_by(
                db.desc(ApiRunTaskResult.run_date)
            ).limit(int(recent_days))

    @staticmethod
    def get_results_group_by_run_date_in_company_ignore_project(company_id, recent_days):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiRunTaskResult.run_date, func.count(ApiRunTaskResult.id), func.sum(ApiRunTaskResult.total_cases),
                func.sum(ApiRunTaskResult.not_run_cases), func.sum(ApiRunTaskResult.run_cases),
                func.sum(ApiRunTaskResult.success_cases), func.sum(ApiRunTaskResult.fail_cases),
            ).outerjoin(
                ApiTaskInfo, ApiTaskInfo.id == ApiRunTaskResult.api_task_id
            ).filter(
                ApiTaskInfo.api_company_id == company_id
            ).group_by(
                ApiRunTaskResult.run_date
            ).order_by(
                db.desc(ApiRunTaskResult.run_date)
            ).limit(int(recent_days))

    @staticmethod
    def get_results_by_run_date_in_company(company_id, run_date):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiRunTaskResult.id, ApiTaskInfo.task_name, ApiTaskInfo.task_type,
                ApiProjectInfo.project_name, ApiRunTaskResult.total_cases, ApiRunTaskResult.not_run_cases,
                ApiRunTaskResult.run_cases, ApiRunTaskResult.success_cases, ApiRunTaskResult.fail_cases,
                ApiRunTaskResult.start_time, ApiRunTaskResult.end_time, ApiRunTaskResult.creator,
                EnvInfo.env_name
            ).outerjoin(
                ApiTaskInfo, ApiTaskInfo.id == ApiRunTaskResult.api_task_id
            ).outerjoin(
                ApiProjectInfo, ApiProjectInfo.id == ApiTaskInfo.api_project_id
            ).outerjoin(
                EnvInfo, EnvInfo.id == ApiRunTaskResult.run_env_id
            ).filter(
                ApiProjectInfo.api_company_id == company_id, ApiRunTaskResult.run_date == run_date
            ).order_by(
                db.desc(ApiRunTaskResult.start_time)
            ).all()

    @staticmethod
    def get_results_by_run_date_in_company_ignore_project(company_id, run_date):
        with SessionHandler() as sh:
            return sh.session.query(
                ApiRunTaskResult.id, ApiTaskInfo.task_name, ApiTaskInfo.task_type,
                ApiTaskInfo.api_project_id, ApiRunTaskResult.total_cases, ApiRunTaskResult.not_run_cases,
                ApiRunTaskResult.run_cases, ApiRunTaskResult.success_cases, ApiRunTaskResult.fail_cases,
                ApiRunTaskResult.start_time, ApiRunTaskResult.end_time, ApiRunTaskResult.creator,
                EnvInfo.env_name
            ).outerjoin(
                ApiTaskInfo, ApiTaskInfo.id == ApiRunTaskResult.api_task_id
            ).outerjoin(
                EnvInfo, EnvInfo.id == ApiRunTaskResult.run_env_id
            ).filter(
                ApiTaskInfo.api_company_id == company_id, ApiRunTaskResult.run_date == run_date,
                ApiTaskInfo.api_project_id == 0
            ).order_by(
                db.desc(ApiRunTaskResult.start_time)
            ).all()
class GitDiffVersionManager(object):

    @staticmethod
    def insert_git_diff_version(**kwargs):
        with SessionHandler() as sh:
            obj = GitDiffVersion(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def update_git_diff_version(id_, **kwargs):
        with SessionHandler() as sh:
            obj = GitDiffVersion.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def update_git_diff_version_by_seq_no(seq_no_, **kwargs):
        with SessionHandler() as sh:
            obj = GitDiffVersion.query.filter_by(seq_no=seq_no_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_git_diff_version(id_):
        with SessionHandler() as sh:
            obj = GitDiffVersion.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()
            sh.session.close()

    @staticmethod
    def delete_git_diff_version_by_obj(obj):
        with SessionHandler() as sh:
            if obj:
                sh.session.delete(obj)
                sh.session.commit()

    @staticmethod
    def get_git_diff_version(**kwargs):
        with SessionHandler() as sh:
            obj = GitDiffVersion.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_git_diff_versions(**kwargs):
        with SessionHandler() as sh:
            objs = GitDiffVersion.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_git_diff_versions_special(seq_no_, api_task_id_):
        with SessionHandler() as sh:
            # Note: the conditions must be passed to filter() as separate
            # arguments (or combined with and_()); Python's `and` operator
            # does not build a SQL AND between SQLAlchemy expressions.
            objs = GitDiffVersion.query.filter(
                GitDiffVersion.seq_no != seq_no_,
                GitDiffVersion.api_task_id == api_task_id_).all()
            return objs
class GenerateDataRecordManager(object):

    @staticmethod
    def insert_record(**kwargs):
        with SessionHandler() as sh:
            obj = GenerateDataRecord(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def get_record(**kwargs):
        with SessionHandler() as sh:
            obj = GenerateDataRecord.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_records(**kwargs):
        with SessionHandler() as sh:
            objs = GenerateDataRecord.query.filter_by(**kwargs).all()
            return objs

    @staticmethod
    def get_records_between(start, end):
        with SessionHandler() as sh:
            return sh.session.query(
                GenerateDataRecord.mobile_no
            ).filter(
                GenerateDataRecord.record_date.between(start, end)
            ).all()
class ApiTestcaseMainCustomFlowManager(object):

    @staticmethod
    def insert_flow(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseMainCustomFlow(**kwargs)
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def update_flow(id_, **kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseMainCustomFlow.query.filter_by(id=id_).first()
            for column in kwargs:
                obj = obj_set_value(obj, column, kwargs[column])
            sh.session.add(obj)
            sh.session.commit()

    @staticmethod
    def delete_flow(id_):
        with SessionHandler() as sh:
            obj = ApiTestcaseMainCustomFlow.query.filter_by(id=id_).first()
            if obj:
                sh.session.delete(obj)
                sh.session.commit()
            sh.session.close()

    @staticmethod
    def get_flow(**kwargs):
        with SessionHandler() as sh:
            obj = ApiTestcaseMainCustomFlow.query.filter_by(**kwargs).first()
            return obj

    @staticmethod
    def get_flows(**kwargs):
        with SessionHandler() as sh:
            objs = ApiTestcaseMainCustomFlow.query.filter_by(**kwargs).all()
            return objs


if __name__ == '__main__':
    pass
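Editor's note: every manager in this module repeats the same insert/update/delete/get boilerplate around `SessionHandler`. As an illustration only (none of these names come from the module itself), the shape of that repeated CRUD pattern can be captured by a small generic class:

```python
class CrudManager(object):
    """Generic insert/get/update/delete over an in-memory store keyed by id.

    Illustrative sketch only: a real refactor would parameterize the managers
    above by their SQLAlchemy model class instead of this dict-backed store.
    """

    def __init__(self):
        self._store = {}

    def insert(self, id_, **kwargs):
        self._store[id_] = dict(kwargs)

    def update(self, id_, **kwargs):
        self._store[id_].update(kwargs)

    def get(self, id_):
        return self._store.get(id_)

    def delete(self, id_):
        # Mirrors the managers above: deleting a missing row is a no-op.
        self._store.pop(id_, None)
```

Each concrete manager would then only contribute its model class and its handful of bespoke queries (pagination, joins, aggregates).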
# File: jovsatools/minitorch/coursework/Module-2/minitorch/__init__.py (jovsa/jovsatools, Apache-2.0)
from .tensor_data import *  # noqa: F401,F403
from .tensor import * # noqa: F401,F403
from .tensor_ops import * # noqa: F401,F403
from .operators import * # noqa: F401,F403
from .modules import * # noqa: F401,F403
from .autodiff import * # noqa: F401,F403
from .scalar import *  # noqa: F401,F403
# File: hrt/util.py (sachinkamath/http-request-translator, BSD-3-Clause)
import re
# Blindly copied from: https://gist.github.com/mnordhoff/2213179
# And even more blindly trusted. Fingers crossed.
re_ipv4_address = re.compile('^(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$')
re_ipv6_address = re.compile('^(?:(?:[0-9A-Fa-f]{1,4}:){6}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|::(?:[0-9A-Fa-f]{1,4}:){5}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:[0-9A-Fa-f]{1,4})?::(?:[0-9A-Fa-f]{1,4}:){4}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4})?::(?:[0-9A-Fa-f]{1,4}:){3}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:(?:[0-9A-Fa-f]{1,4}:){,2}[0-9A-Fa-f]{1,4})?::(?:[0-9A-Fa-f]{1,4}:){2}(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:(?:[0-9A-Fa-f]{1,4}:){,3}[0-9A-Fa-f]{1,4})?::[0-9A-Fa-f]{1,4}:(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:(?:[0-9A-Fa-f]{1,4}:){,4}[0-9A-Fa-f]{1,4})?::(?:[0-9A-Fa-f]{1,4}:[0-9A-Fa-f]{1,4}|(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))|(?:(?:[0-9A-Fa-f]{1,4}:){,5}[0-9A-Fa-f]{1,4})?::[0-9A-Fa-f]{1,4}|(?:(?:[0-9A-Fa-f]{1,4}:){,6}[0-9A-Fa-f]{1,4})?::)$')
# Homebrew
re_domain = re.compile(r'^(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9-])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}\.?)|)$', re.IGNORECASE)
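A minimal usage sketch for the patterns above (editor's illustration, not part of the original module; the IPv4 pattern is reproduced verbatim so the snippet is self-contained):

```python
import re

# Same IPv4 pattern as in hrt/util.py: four octets, each 0-255, no leading junk.
re_ipv4_address = re.compile(
    r'^(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}'
    r'(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$'
)

def is_ipv4(host):
    """Return True if `host` is a strict dotted-quad IPv4 address."""
    return re_ipv4_address.match(host) is not None

# is_ipv4('192.168.0.1') -> True
# is_ipv4('256.1.1.1')   -> False (octet out of range)
# is_ipv4('1.2.3')       -> False (only three octets)
```

The anchored `^...$` form means the patterns validate a whole host string; they are not suitable for scanning addresses out of larger text.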
# File: wntr/tests/test_metrics_todini.py (algchyhao/WNTR, BSD-3-Clause)
from __future__ import print_function
from nose.tools import *
from os.path import abspath, dirname, join
import wntr
testdir = dirname(abspath(str(__file__)))
datadir = join(testdir,'networks_for_testing')
def test_Todini_Fig2_optCost_GPM():
    inp_file = join(datadir, 'Todini_Fig2_optCost_GPM.inp')
    wn = wntr.network.WaterNetworkModel(inp_file)
    sim = wntr.sim.WNTRSimulator(wn)
    results = sim.run_sim()

    # Compute todini index
    head = results.node['head']
    pressure = results.node['pressure']
    demand = results.node['demand']
    flowrate = results.link['flowrate']
    todini = wntr.metrics.todini_index(head, pressure, demand, flowrate, wn, 30)  # h* = 30 m

    expected = 0.22
    error = abs(todini[0] - expected)
    # print(todini[0], expected, error)
    assert_less(error, 0.01)


def test_Todini_Fig2_optCost_CMH():
    inp_file = join(datadir, 'Todini_Fig2_optCost_CMH.inp')
    wn = wntr.network.WaterNetworkModel(inp_file)
    sim = wntr.sim.WNTRSimulator(wn)
    results = sim.run_sim()

    # Compute todini index
    head = results.node['head']
    pressure = results.node['pressure']
    demand = results.node['demand']
    flowrate = results.link['flowrate']
    todini = wntr.metrics.todini_index(head, pressure, demand, flowrate, wn, 30)  # h* = 30 m

    expected = 0.22
    error = abs(todini[0] - expected)
    # print(todini[0], expected, error)
    assert_less(error, 0.01)


def test_Todini_Fig2_solA_GPM():
    inp_file = join(datadir, 'Todini_Fig2_solA_GPM.inp')
    wn = wntr.network.WaterNetworkModel(inp_file)
    sim = wntr.sim.WNTRSimulator(wn)
    results = sim.run_sim()

    # Compute todini index
    head = results.node['head']
    pressure = results.node['pressure']
    demand = results.node['demand']
    flowrate = results.link['flowrate']
    todini = wntr.metrics.todini_index(head, pressure, demand, flowrate, wn, 30)  # h* = 30 m

    expected = 0.41
    error = abs(todini[0] - expected)
    # print(todini[0], expected, error)
    assert_less(error, 0.03)


def test_Todini_Fig2_solA_CMH():
    inp_file = join(datadir, 'Todini_Fig2_solA_CMH.inp')
    wn = wntr.network.WaterNetworkModel(inp_file)
    sim = wntr.sim.WNTRSimulator(wn)
    results = sim.run_sim()

    # Compute todini index
    head = results.node['head']
    pressure = results.node['pressure']
    demand = results.node['demand']
    flowrate = results.link['flowrate']
    todini = wntr.metrics.todini_index(head, pressure, demand, flowrate, wn, 30)  # h* = 30 m

    expected = 0.41
    error = abs(todini[0] - expected)
    # print(todini[0], expected, error)
    assert_less(error, 0.03)


if __name__ == '__main__':
    test_Todini_Fig2_solA_GPM()
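The four tests above share one body and differ only in input file, expected Todini index, and tolerance. A table-driven sketch of that structure (editor's illustration, not from the WNTR test suite; `check_case` is a hypothetical helper standing in for the shared simulate-then-assert body):

```python
# (input file, expected Todini index, absolute tolerance)
CASES = [
    ('Todini_Fig2_optCost_GPM.inp', 0.22, 0.01),
    ('Todini_Fig2_optCost_CMH.inp', 0.22, 0.01),
    ('Todini_Fig2_solA_GPM.inp', 0.41, 0.03),
    ('Todini_Fig2_solA_CMH.inp', 0.41, 0.03),
]

def check_case(todini_value, expected, tol):
    """Shared assertion used by every row of the table."""
    return abs(todini_value - expected) < tol
```

With pytest, the same table could drive `@pytest.mark.parametrize`, so adding a new network file is a one-line change.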
# File: notescribe/views/__init__.py (SatvikR/notescribe, Apache-2.0)
from notescribe.views.index import *
from notescribe.views.about import *
from notescribe.views.music import *
# File: src/slipstream/__init__.py (slipstream/slipstream-libcloud-driver, Apache-2.0)
#from pkgutil import extend_path
#__path__ = extend_path(__path__, __name__)
__import__("pkg_resources").declare_namespace(__name__)
# File: models.py (tsudijon/sghmc_dgp, MIT)
from sghmc_dgp import DGP
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp
from kernels import SquaredExponential
import likelihoods as llh
import gpflow
class ClassificationModel(object):
    def __init__(self, depth):
        class ARGS:
            num_inducing = 50
            iterations = 10000
            minibatch_size = 10
            window_size = 100
            num_posterior_samples = 100
            posterior_sample_spacing = 50
        self.ARGS = ARGS
        self.depth = depth
        self.model = None

    def fit(self, X, Y):
        lik = gpflow.likelihoods.Bernoulli()
        return self._fit(X, Y, lik)

    def _fit(self, X, Y, lik, **kwargs):
        if len(Y.shape) == 1:
            Y = Y[:, None]
        kerns = []
        if not self.model:
            for _ in range(self.depth):
                kerns.append(SquaredExponential(X.shape[1], ARD=True, lengthscales=float(X.shape[1])**0.5))
            mb_size = self.ARGS.minibatch_size if X.shape[0] > self.ARGS.minibatch_size else X.shape[0]
            self.model = DGP(X, Y, self.ARGS.num_inducing, kerns, lik,
                             minibatch_size=mb_size,
                             window_size=self.ARGS.window_size,
                             **kwargs)
        self.model.reset(X, Y)
        try:
            for _ in range(self.ARGS.iterations):
                self.model.sghmc_step()
                self.model.train_hypers()
                if _ % 100 == 1:
                    print('Iteration {}'.format(_))
                    self.model.print_sample_performance()
            self.model.collect_samples(self.ARGS.num_posterior_samples, self.ARGS.posterior_sample_spacing)
        except KeyboardInterrupt:  # pragma: no cover
            pass

    def _predict(self, Xs, S):
        ms, vs = [], []
        n = max(len(Xs) / 100, 1)  # predict in small batches
        for xs in np.array_split(Xs, n):
            m, v = self.model.predict_y(xs, S)
            ms.append(m)
            vs.append(v)
        return np.concatenate(ms, 1), np.concatenate(vs, 1)  # num_posterior_samples, N_test, D_y

    def predict(self, Xs):
        ms, vs = self._predict(Xs, self.ARGS.num_posterior_samples)
        m = np.average(ms, 0)
        v = np.average(vs + ms**2, 0) - m**2
        return m, v, ms

    def calculate_density(self, Xs, Ys):  ### change this
        ms, vs = self._predict(Xs, self.ARGS.num_posterior_samples)
        logps = norm.logpdf(np.repeat(Ys[None, :, :], self.ARGS.num_posterior_samples, axis=0), ms, np.sqrt(vs))
        return logsumexp(logps, axis=0) - np.log(self.ARGS.num_posterior_samples)

    def sample(self, Xs, S):  ### change this
        ms, vs = self._predict(Xs, S)
        return ms + vs**0.5 * np.random.randn(*ms.shape)
class MultiClassModel(object):
    def __init__(self, num_classes):
        class ARGS:
            num_inducing = 100
            iterations = 10000
            minibatch_size = 10000
            window_size = 100
            num_posterior_samples = 100
            posterior_sample_spacing = 50
        self.ARGS = ARGS
        self.model = None
        self.num_classes = num_classes

    def fit(self, X, Y):
        lik = gpflow.likelihoods.RobustMax(self.num_classes)
        return self._fit(X, Y, lik)

    def _fit(self, X, Y, lik, **kwargs):
        if len(Y.shape) == 1:
            Y = Y[:, None]
        kerns = []
        if not self.model:
            for _ in range(5):
                kerns.append(SquaredExponential(X.shape[1], ARD=True, lengthscales=float(X.shape[1])**0.5))
            mb_size = self.ARGS.minibatch_size if X.shape[0] > self.ARGS.minibatch_size else X.shape[0]
            self.model = DGP(X, Y, 100, kerns, lik,
                             minibatch_size=mb_size,
                             window_size=self.ARGS.window_size,
                             **kwargs)
        self.model.reset(X, Y)
        try:
            for _ in range(self.ARGS.iterations):
                self.model.sghmc_step()
                self.model.train_hypers()
                if _ % 100 == 1:
                    print('Iteration {}'.format(_))
                    self.model.print_sample_performance()
            self.model.collect_samples(self.ARGS.num_posterior_samples, self.ARGS.posterior_sample_spacing)
        except KeyboardInterrupt:  # pragma: no cover
            pass

    def _predict(self, Xs, S):
        ms, vs = [], []
        n = max(len(Xs) / 100, 1)  # predict in small batches
        for xs in np.array_split(Xs, n):
            m, v = self.model.predict_y(xs, S)
            ms.append(m)
            vs.append(v)
        return np.concatenate(ms, 1), np.concatenate(vs, 1)  # num_posterior_samples, N_test, D_y

    def predict(self, Xs):
        ms, vs = self._predict(Xs, self.ARGS.num_posterior_samples)
        m = np.average(ms, 0)
        v = np.average(vs + ms**2, 0) - m**2
        return m, v

    def sample(self, Xs, S):  ### change this
        ms, vs = self._predict(Xs, S)
        return ms + vs**0.5 * np.random.randn(*ms.shape)
class RegressionModel(object):
    def __init__(self):
        class ARGS:
            num_inducing = 50
            iterations = 10000
            minibatch_size = 10000
            window_size = 100
            num_posterior_samples = 100
            posterior_sample_spacing = 50
        self.ARGS = ARGS
        self.model = None

    def fit(self, X, Y):
        lik = llh.Gaussian(np.var(Y, 0))
        return self._fit(X, Y, lik)

    def _fit(self, X, Y, lik, **kwargs):
        if len(Y.shape) == 1:
            Y = Y[:, None]
        kerns = []
        if not self.model:
            for _ in range(5):
                kerns.append(SquaredExponential(X.shape[1], ARD=True, lengthscales=float(X.shape[1])**0.5))
            mb_size = self.ARGS.minibatch_size if X.shape[0] > self.ARGS.minibatch_size else X.shape[0]
            self.model = DGP(X, Y, 100, kerns, lik,
                             minibatch_size=mb_size,
                             window_size=self.ARGS.window_size,
                             **kwargs)
        self.model.reset(X, Y)
        try:
            for _ in range(self.ARGS.iterations):
                self.model.sghmc_step()
                self.model.train_hypers()
                if _ % 100 == 1:
                    print('Iteration {}'.format(_))
                    self.model.print_sample_performance()
            self.model.collect_samples(self.ARGS.num_posterior_samples, self.ARGS.posterior_sample_spacing)
        except KeyboardInterrupt:  # pragma: no cover
            pass

    def _predict(self, Xs, S):
        ms, vs = [], []
        n = max(len(Xs) / 100, 1)  # predict in small batches
        for xs in np.array_split(Xs, n):
            m, v = self.model.predict_y(xs, S)
            ms.append(m)
            vs.append(v)
        return np.concatenate(ms, 1), np.concatenate(vs, 1)  # num_posterior_samples, N_test, D_y

    def predict(self, Xs):
        ms, vs = self._predict(Xs, self.ARGS.num_posterior_samples)
        m = np.average(ms, 0)
        v = np.average(vs + ms**2, 0) - m**2
        return m, v

    def calculate_density(self, Xs, Ys):
        ms, vs = self._predict(Xs, self.ARGS.num_posterior_samples)
        logps = norm.logpdf(np.repeat(Ys[None, :, :], self.ARGS.num_posterior_samples, axis=0), ms, np.sqrt(vs))
        return logsumexp(logps, axis=0) - np.log(self.ARGS.num_posterior_samples)

    def sample(self, Xs, S):
        ms, vs = self._predict(Xs, S)
        return ms + vs**0.5 * np.random.randn(*ms.shape)
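All three model classes aggregate posterior samples the same way in `predict()`: per-sample Gaussian predictions (m_s, v_s) are moment-matched into a single mean and variance via m = mean(m_s) and v = mean(v_s + m_s**2) - m**2. A self-contained sketch of that identity (editor's illustration in plain Python, not code from the repository):

```python
def combine_gaussian_samples(means, variances):
    """Moment-match S Gaussian predictions into one mean/variance pair.

    Implements m = E_s[m_s], v = E_s[v_s + m_s**2] - m**2, i.e. the mean and
    variance of an equally weighted mixture of the per-sample Gaussians.
    """
    s = len(means)
    m = sum(means) / s
    v = sum(v_i + m_i ** 2 for m_i, v_i in zip(means, variances)) / s - m ** 2
    return m, v

# Example: samples N(0, 1) and N(2, 1) combine to mean 1.0, variance 2.0
# (the within-sample variance 1.0 plus the between-sample spread 1.0).
```

This is why the mixture variance exceeds the average per-sample variance whenever the sample means disagree: between-sample spread is folded in.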
# File: multi_buffer/multiple_gym/multiple_gym/envs/__init__.py (wx100059/multiple-gym, MIT)
from multiple_gym.envs.multiple_gym import multipleEnv
#from multiple_gym.envs.multiple_gym_extend import multipleEnvExtend
# File: app/jekylledit/controllers/__init__.py (klokantech/jekylledit, BSD-3-Clause)
from .base import app
from . import auth
from . import site
app.register_blueprint(auth.blueprint, url_prefix='/auth')
app.register_blueprint(auth.firebase.blueprint, url_prefix='/auth')
af0a5763d2720a9365d8a1b382eed54e16bb1fe3 | 30,481 | py | Python | examples/estimator/classifier/RandomForestClassifier/js/basics_embedded.py | mathewdgardner/sklearn-porter | d8927a6af06e96dd416be759321e93691c39cf73 | [
"MIT"
] | 1 | 2022-02-15T12:44:37.000Z | 2022-02-15T12:44:37.000Z | examples/estimator/classifier/RandomForestClassifier/js/basics_embedded.py | Stardustsky/sklearn-porter | d8927a6af06e96dd416be759321e93691c39cf73 | [
"MIT"
] | null | null | null | examples/estimator/classifier/RandomForestClassifier/js/basics_embedded.py | Stardustsky/sklearn-porter | d8927a6af06e96dd416be759321e93691c39cf73 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn_porter import Porter
iris_data = load_iris()
X = iris_data.data
y = iris_data.target
clf = RandomForestClassifier(n_estimators=15, max_depth=None,
min_samples_split=2, random_state=0)
clf.fit(X, y)
porter = Porter(clf, language='js')
output = porter.export(embed_data=True)
print(output)
"""
var RandomForestClassifier = function() {
var findMax = function(nums) {
var index = 0;
for (var i = 0; i < nums.length; i++) {
index = nums[i] > nums[index] ? i : index;
}
return index;
};
var trees = new Array();
trees.push(function(features) {
var classes = new Array(3);
if (features[3] <= 0.75) {
classes[0] = 47;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 4.85000038147) {
if (features[3] <= 1.65000009537) {
classes[0] = 0;
classes[1] = 42;
classes[2] = 0;
} else {
if (features[1] <= 3.0) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 3;
} else {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
}
}
} else {
if (features[0] <= 6.59999990463) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 27;
} else {
if (features[2] <= 5.19999980927) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 29;
}
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[3] <= 0.800000011921) {
classes[0] = 46;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[3] <= 1.75) {
if (features[2] <= 4.94999980927) {
classes[0] = 0;
classes[1] = 58;
classes[2] = 0;
} else {
if (features[2] <= 5.44999980927) {
if (features[1] <= 2.45000004768) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
} else {
classes[0] = 0;
classes[1] = 3;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 3;
}
}
} else {
if (features[2] <= 4.85000038147) {
if (features[1] <= 3.09999990463) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
} else {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 35;
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[0] <= 5.55000019073) {
if (features[3] <= 0.800000011921) {
classes[0] = 49;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[3] <= 1.60000002384) {
classes[0] = 0;
classes[1] = 12;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
}
}
} else {
if (features[3] <= 1.54999995232) {
if (features[3] <= 0.75) {
classes[0] = 2;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 5.0) {
classes[0] = 0;
classes[1] = 32;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
}
}
} else {
if (features[2] <= 4.65000009537) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
if (features[3] <= 1.70000004768) {
if (features[2] <= 5.44999980927) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 3;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 48;
}
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[0] <= 5.44999980927) {
if (features[1] <= 2.80000019073) {
if (features[1] <= 2.45000004768) {
classes[0] = 0;
classes[1] = 5;
classes[2] = 0;
} else {
if (features[0] <= 5.0) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 3;
} else {
classes[0] = 0;
classes[1] = 3;
classes[2] = 0;
}
}
} else {
classes[0] = 41;
classes[1] = 0;
classes[2] = 0;
}
} else {
if (features[0] <= 6.25) {
if (features[3] <= 1.70000004768) {
if (features[3] <= 0.600000023842) {
classes[0] = 3;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[1] <= 2.25) {
if (features[3] <= 1.25) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
if (features[2] <= 4.75) {
classes[0] = 0;
classes[1] = 3;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
}
}
} else {
classes[0] = 0;
classes[1] = 37;
classes[2] = 0;
}
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 8;
}
} else {
if (features[2] <= 4.94999980927) {
classes[0] = 0;
classes[1] = 10;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 35;
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[3] <= 0.699999988079) {
classes[0] = 50;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[3] <= 1.75) {
if (features[2] <= 5.05000019073) {
if (features[2] <= 4.94999980927) {
classes[0] = 0;
classes[1] = 56;
classes[2] = 0;
} else {
if (features[3] <= 1.60000002384) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
} else {
classes[0] = 0;
classes[1] = 3;
classes[2] = 0;
}
}
} else {
if (features[0] <= 6.05000019073) {
classes[0] = 0;
classes[1] = 2;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 5;
}
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 33;
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[3] <= 0.800000011921) {
classes[0] = 49;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 4.94999980927) {
if (features[0] <= 4.94999980927) {
if (features[3] <= 1.35000002384) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
}
} else {
if (features[2] <= 4.75) {
classes[0] = 0;
classes[1] = 49;
classes[2] = 0;
} else {
if (features[1] <= 2.59999990463) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
if (features[0] <= 6.05000019073) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
if (features[3] <= 1.59999990463) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 3;
}
}
}
}
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 44;
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[3] <= 0.699999988079) {
classes[0] = 46;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 4.75) {
if (features[0] <= 4.94999980927) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
} else {
classes[0] = 0;
classes[1] = 39;
classes[2] = 0;
}
} else {
if (features[2] <= 5.14999961853) {
if (features[0] <= 6.59999990463) {
if (features[3] <= 1.70000004768) {
if (features[3] <= 1.54999995232) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
} else {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 19;
}
} else {
classes[0] = 0;
classes[1] = 3;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 38;
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[2] <= 2.59999990463) {
classes[0] = 58;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 4.75) {
classes[0] = 0;
classes[1] = 37;
classes[2] = 0;
} else {
if (features[2] <= 5.14999961853) {
if (features[3] <= 1.75) {
if (features[0] <= 6.5) {
if (features[2] <= 4.94999980927) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
if (features[0] <= 6.15000009537) {
if (features[3] <= 1.54999995232) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
} else {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
}
}
} else {
classes[0] = 0;
classes[1] = 2;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 13;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 34;
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[3] <= 0.699999988079) {
classes[0] = 42;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[0] <= 6.25) {
if (features[2] <= 4.80000019073) {
if (features[0] <= 4.94999980927) {
if (features[1] <= 2.45000004768) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 3;
}
} else {
classes[0] = 0;
classes[1] = 36;
classes[2] = 0;
}
} else {
if (features[3] <= 1.54999995232) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 4;
} else {
if (features[3] <= 1.70000004768) {
classes[0] = 0;
classes[1] = 2;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 4;
}
}
}
} else {
if (features[3] <= 1.75) {
if (features[2] <= 5.05000019073) {
classes[0] = 0;
classes[1] = 15;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 4;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 39;
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[2] <= 2.59999990463) {
classes[0] = 55;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 4.94999980927) {
if (features[0] <= 5.94999980927) {
classes[0] = 0;
classes[1] = 23;
classes[2] = 0;
} else {
if (features[3] <= 1.64999997616) {
classes[0] = 0;
classes[1] = 16;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 4;
}
}
} else {
if (features[0] <= 6.59999990463) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 33;
} else {
if (features[0] <= 6.75) {
if (features[3] <= 2.0) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 4;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 14;
}
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[3] <= 0.800000011921) {
classes[0] = 52;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 4.75) {
classes[0] = 0;
classes[1] = 37;
classes[2] = 0;
} else {
if (features[3] <= 1.75) {
if (features[2] <= 4.94999980927) {
classes[0] = 0;
classes[1] = 4;
classes[2] = 0;
} else {
if (features[1] <= 2.65000009537) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
} else {
if (features[3] <= 1.54999995232) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
} else {
if (features[2] <= 5.44999980927) {
classes[0] = 0;
classes[1] = 2;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
}
}
}
}
} else {
if (features[2] <= 4.85000038147) {
if (features[1] <= 3.09999990463) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 6;
} else {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 43;
}
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[2] <= 2.59999990463) {
classes[0] = 47;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 4.75) {
classes[0] = 0;
classes[1] = 40;
classes[2] = 0;
} else {
if (features[2] <= 4.94999980927) {
if (features[1] <= 3.04999995232) {
if (features[3] <= 1.59999990463) {
classes[0] = 0;
classes[1] = 2;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 7;
}
} else {
classes[0] = 0;
classes[1] = 2;
classes[2] = 0;
}
} else {
if (features[0] <= 6.05000019073) {
if (features[2] <= 5.05000019073) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 4;
} else {
if (features[0] <= 5.94999980927) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 7;
} else {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
}
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 40;
}
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[3] <= 0.800000011921) {
classes[0] = 54;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[1] <= 2.45000004768) {
if (features[2] <= 4.75) {
classes[0] = 0;
classes[1] = 12;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
}
} else {
if (features[3] <= 1.60000002384) {
if (features[2] <= 5.0) {
classes[0] = 0;
classes[1] = 23;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
}
} else {
if (features[3] <= 1.75) {
if (features[0] <= 5.80000019073) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 3;
} else {
classes[0] = 0;
classes[1] = 2;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 53;
}
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[0] <= 5.44999980927) {
if (features[3] <= 0.800000011921) {
classes[0] = 36;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[2] <= 4.19999980927) {
classes[0] = 0;
classes[1] = 6;
classes[2] = 0;
} else {
if (features[1] <= 2.75) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
} else {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
}
}
}
} else {
if (features[2] <= 4.90000009537) {
if (features[1] <= 3.59999990463) {
classes[0] = 0;
classes[1] = 43;
classes[2] = 0;
} else {
classes[0] = 7;
classes[1] = 0;
classes[2] = 0;
}
} else {
if (features[3] <= 1.70000004768) {
if (features[3] <= 1.54999995232) {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
} else {
classes[0] = 0;
classes[1] = 4;
classes[2] = 0;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 50;
}
}
}
return findMax(classes);
});
trees.push(function(features) {
var classes = new Array(3);
if (features[2] <= 2.59999990463) {
classes[0] = 52;
classes[1] = 0;
classes[2] = 0;
} else {
if (features[3] <= 1.70000004768) {
if (features[0] <= 7.0) {
if (features[2] <= 5.0) {
classes[0] = 0;
classes[1] = 48;
classes[2] = 0;
} else {
if (features[0] <= 6.05000019073) {
classes[0] = 0;
classes[1] = 1;
classes[2] = 0;
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 2;
}
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 1;
}
} else {
classes[0] = 0;
classes[1] = 0;
classes[2] = 46;
}
}
return findMax(classes);
});
this.predict = function(features) {
var classes = new Array(3).fill(0);
for (var i = 0; i < trees.length; i++) {
classes[trees[i](features)]++;
}
return findMax(classes);
}
};
if (typeof process !== 'undefined' && typeof process.argv !== 'undefined') {
if (process.argv.length - 2 == 4) {
// Features:
var features = process.argv.slice(2);
// Prediction:
var prediction = new RandomForestClassifier().predict(features);
console.log(prediction);
}
}
"""
| 34.018973 | 76 | 0.266953 | 2,191 | 30,481 | 3.708352 | 0.059334 | 0.190031 | 0.124062 | 0.220554 | 0.881846 | 0.869662 | 0.843692 | 0.822031 | 0.799754 | 0.786585 | 0 | 0.184634 | 0.62721 | 30,481 | 895 | 77 | 34.056983 | 0.530406 | 0.000689 | 0 | 0 | 0 | 0 | 0.004662 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
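The JavaScript that `porter.export(embed_data=True)` emits above is an ensemble of hard-coded decision trees plus a majority vote: each tree returns per-class leaf counts, `findMax` takes the argmax, and `predict` tallies one vote per tree. The following Python sketch mirrors that vote logic; the two toy trees and their thresholds are invented for illustration (a real export embeds all fitted split values).

```python
# Majority-vote logic mirroring the exported RandomForestClassifier JS.
# tree_a / tree_b are made-up single-split stands-ins for the embedded trees.

def find_max(nums):
    # Argmax with first-winner tie-breaking, like the JS findMax.
    index = 0
    for i, v in enumerate(nums):
        if v > nums[index]:
            index = i
    return index

def tree_a(features):
    # toy stump: petal width <= 0.75 -> all-setosa leaf, else versicolor leaf
    return [47, 0, 0] if features[3] <= 0.75 else [0, 42, 0]

def tree_b(features):
    return [46, 0, 0] if features[3] <= 0.8 else [0, 0, 35]

def predict(trees, features):
    # Each tree casts one vote for its argmax class; the forest
    # prediction is the argmax of the vote counts.
    votes = [0, 0, 0]
    for tree in trees:
        votes[find_max(tree(features))] += 1
    return find_max(votes)
```

This is the same control flow as the generated `this.predict` at the bottom of the JS output, just without the embedded iris thresholds.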
bbadcb790c1f9f99f3c4cd83ecd0bd91aaab8b06 | 115 | py | Python | 2_fixtures_conftest/conftest.py | G00gl3r/test1 | 6c7eb1908d678b17afc65913a377cbbc8750b0fe | [
"MIT"
] | null | null | null | 2_fixtures_conftest/conftest.py | G00gl3r/test1 | 6c7eb1908d678b17afc65913a377cbbc8750b0fe | [
"MIT"
] | null | null | null | 2_fixtures_conftest/conftest.py | G00gl3r/test1 | 6c7eb1908d678b17afc65913a377cbbc8750b0fe | [
"MIT"
] | null | null | null | import pytest
# @pytest.fixture
# def first_fixture():
# print("\nPrint from 'first_fixture' in conftest.py")
| 19.166667 | 58 | 0.704348 | 15 | 115 | 5.266667 | 0.733333 | 0.303797 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156522 | 115 | 5 | 59 | 23 | 0.814433 | 0.808696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bbe06c7601b42a8ba4a317678d66646825d06bcd | 254,854 | py | Python | modules/models.py | sathvikb007/neurawkes | 6e8657b66e44de32e4aa5c0fc32fddd801b27d94 | [
"MIT"
] | null | null | null | modules/models.py | sathvikb007/neurawkes | 6e8657b66e44de32e4aa5c0fc32fddd801b27d94 | [
"MIT"
] | null | null | null | modules/models.py | sathvikb007/neurawkes | 6e8657b66e44de32e4aa5c0fc32fddd801b27d94 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Here are the models
continuous-time sequential model (CTSM)
@author: hongyuan
"""
import pickle
import time
import numpy
import theano
from theano import sandbox
import theano.tensor as tensor
import os
#import scipy.io
from collections import defaultdict
from theano.tensor.shared_randomstreams import RandomStreams
import utils
dtype=theano.config.floatX
#
class HawkesCTSM(object):
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
        print("initializing Hawkes CTSM ... ")
if settings['path_pre_train'] == None:
self.dim_process = settings['dim_process']
# initialize variables
self.mu = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='mu'
)
'''
we need to notice that: in these matrices of K * K
the (i, j) entry is the effect of j-th to i-th
this order may be changed in the neural hawkes
for the sake of implementation ease or the convention of Theano
'''
self.alpha = theano.shared(
numpy.ones(
(self.dim_process, self.dim_process),
dtype=dtype
), name='alpha'
)
self.delta = theano.shared(
numpy.ones(
(self.dim_process, self.dim_process),
dtype=dtype
), name='delta'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
#with open(settings['path_pre_train'], 'rb') as f:
# model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.mu = theano.shared(
model_pre_train['mu'], name='mu'
)
self.alpha = theano.shared(
model_pre_train['alpha'], name='alpha'
)
self.delta = theano.shared(
model_pre_train['delta'], name='delta'
)
#
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
self.mu, self.alpha, self.delta
]
self.grad_params = None
self.cost_to_optimize = None
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
        # to evaluate per-event intensity prediction
        # this should be filtered by mask
self.lambda_samples = None
self.num_of_samples = None
#
#
#
def compute_loss(
self,
seq_time_to_end, seq_time_to_current, seq_type_event,
time_since_start_to_end,
seq_mask, seq_mask_to_current
):
'''
use this function to compute negative log likelihood
seq_time_to_end : T * size_batch -- T-t_i
seq_time_to_current : T * T * size_batch --
for each batch, it is T * T, and at each time step t,
it tracks the ( t_i - t_i' ) for all t_i' < t_i
seq_type_event : T * size_batch -- for each data
and each time step, tracks the type of event k_i
time_since_start_to_end : size_batch -- time for seq
#
seq_mask : T * size_batch -- 1/0
seq_mask_to_current : T * T * size_batch -- 1/0
'''
        print("computing loss function of Hawkes model ... ")
# first compute the 3rd term in loss
alpha_over_seq = self.alpha[
:, seq_type_event
] # dim_process * T * size_batch
delta_over_seq = self.delta[
:, seq_type_event
] # dim_process * T * size_batch
#
term_3 = tensor.sum(
tensor.sum(
(
(
numpy.float32(1.0) - tensor.exp(
-delta_over_seq * seq_time_to_end[
None, :, :
]
)
) * alpha_over_seq / delta_over_seq
),
axis = 0
) * seq_mask,
axis = 0
) # (size_batch, )
# then we compute the 2nd term
term_2 = tensor.sum(self.mu) * time_since_start_to_end
# (size_batch, )
        # then we compute the 1st term, which is the trickiest
# we use seq_time_to_current : T * T * size_batch
# seq_mask_to_current : T * T * size_batch
lambda_over_seq = self.mu[:, None, None] + tensor.sum(
(
seq_mask_to_current[None,:,:,:]
* (
alpha_over_seq[:,None,:,:] * tensor.exp(
-delta_over_seq[:,None,:,:]
* seq_time_to_current[None,:,:,:]
)
)
)
, axis=2
) # dim_process * T * size_batch
#
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis=0
) # T * size_batch
#
# now we choose the right lambda for each step
# by using seq_type_event : T * size_batch
new_shape_0 = lambda_over_seq.shape[1]*lambda_over_seq.shape[2]
new_shape_1 = lambda_over_seq.shape[0]
#
back_shape_0 = lambda_over_seq.shape[1]
back_shape_1 = lambda_over_seq.shape[2]
#
lambda_target_over_seq = lambda_over_seq.transpose(
(1,2,0)
).reshape(
(
new_shape_0, new_shape_1
)
)[
tensor.arange(new_shape_0),
seq_type_event.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is NaN,
# it can also be the issue of underflow here
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
def compute_lambda(
self,
seq_type_event,
seq_sims_time_to_current,
seq_sims_mask,
seq_sims_mask_to_current
):
'''
use this function to compute intensity
seq_type_event : T * size_batch -- for each data
and each time step, tracks the type of event k_i
these are only used for computing intensity estimation
N is the # of MonteCarlo samples
seq_sims_time_to_current : N * T * size_batch -- for each batch, and at each time step t, track t_i-t_i' for t_i'<t_i
seq_sims_mask : N * size_batch
seq_sims_mask_to_current : N * T * size_batch
'''
        print("computing intensity ... ")
# first compute the 3rd term in loss
alpha_over_seq = self.alpha[
:, seq_type_event
] # dim_process * T * size_batch
delta_over_seq = self.delta[
:, seq_type_event
] # dim_process * T * size_batch
#
'''
in this block, we compute intensity
at sampled time
'''
#
lambda_samples = self.mu[:,None,None] + tensor.sum(
(
seq_sims_mask_to_current[None,:,:,:] * (
alpha_over_seq[:,None,:,:] * tensor.exp(
-delta_over_seq[:,None,:,:] * seq_sims_time_to_current[None,:,:,:]
)
)
), axis=2
)
# K * N * size_batch
self.lambda_samples = lambda_samples * seq_sims_mask[None,:,:]
self.num_of_samples = tensor.sum(seq_sims_mask)
#
#
#
def save_model(self, file_save):
        print("saving model ... ")
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
# Note : _scale means : we use a scaling parameter in the transfer function
#
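The `compute_loss` methods above evaluate, term by term, the multivariate Hawkes intensity lambda_k(t) = mu[k] + sum over past events i of alpha[k][k_i] * exp(-delta[k][k_i] * (t - t_i)). A plain-Python sketch of that quantity for a single sequence (the Theano code computes the same thing batched, with masks); the parameter values in the test are illustrative only:

```python
# Unbatched Hawkes intensity, matching the convention in the code above:
# alpha/delta row k, column k_i is the effect of event type k_i on type k.
import math

def hawkes_intensity(t, events, mu, alpha, delta):
    """events: list of (t_i, k_i) with t_i < t; returns [lambda_k(t)]."""
    K = len(mu)
    lam = list(mu)  # base rates mu_k
    for t_i, k_i in events:
        dt = t - t_i
        for k in range(K):
            # Exponentially decaying excitation from each past event.
            lam[k] += alpha[k][k_i] * math.exp(-delta[k][k_i] * dt)
    return lam
```

The batched Theano expression `self.mu[:, None, None] + tensor.sum(mask * alpha * exp(-delta * dt), axis=2)` is this double loop vectorized over time steps and batch entries.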
class HawkesInhibCTSM_scale(object):
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
        print("initializing Hawkes CTSM ... ")
if settings['path_pre_train'] == None:
self.dim_process = settings['dim_process']
# initialize variables
self.scale = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='scale'
)
#
self.mu = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='mu'
)
'''
we need to notice that: in these matrices of K * K
the (i, j) entry is the effect of j-th to i-th
this order may be changed in the neural hawkes
for the sake of implementation ease or the convention of Theano
'''
self.alpha = theano.shared(
numpy.ones(
(self.dim_process, self.dim_process),
dtype=dtype
), name='alpha'
)
self.delta = theano.shared(
numpy.ones(
(self.dim_process, self.dim_process),
dtype=dtype
), name='delta'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
#with open(settings['path_pre_train'], 'rb') as f:
# model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.scale = theano.shared(
model_pre_train['scale'], name='scale'
)
self.mu = theano.shared(
model_pre_train['mu'], name='mu'
)
self.alpha = theano.shared(
model_pre_train['alpha'], name='alpha'
)
self.delta = theano.shared(
model_pre_train['delta'], name='delta'
)
#
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
self.scale, # scale parameter
self.mu, self.alpha, self.delta
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
# tensor[(x == 0).nonzeros()]
#v_max = numpy.float32(1e9)
y = tensor.log(numpy.float32(1.0)+tensor.exp(x) )
z = tensor.switch(x>=100.0, x, y)
#a = tensor.switch(z>=v_max, v_max, z)
#y[(x>=100.0).nonzeros()] = x[(x>=100.0).nonzeros()]
#np.finfo(np.float32).max
return z
#
#
def soft_relu_scale(self, x):
# x is symbolic tensor
# usually last dim is dim_process
# but in this model, 0-th dim is dim_process
# this is important !
x /= self.scale[:,None,None]
y = tensor.log(numpy.float32(1.0)+tensor.exp(x) )
z = tensor.switch(x>=100.0, x, y)
z *= self.scale[:,None,None]
return z
#
#
def compute_loss(
self,
seq_time_to_current, seq_type_event,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask, seq_mask_to_current,
seq_sims_time_to_current,
seq_sims_mask_to_current,
seq_sims_mask
):
'''
use this function to compute negative log likelihood
seq_time_to_end : T * size_batch -- T-t_i
seq_time_to_current : T * T * size_batch --
for each batch, it is T * T, and at each time step t,
it tracks the ( t_i - t_i' ) for all t_i' < t_i
seq_type_event : T * size_batch -- for each data
and each time step, tracks the type of event k_i
time_since_start_to_end : size_batch -- time for seq
num_sims_start_to_end : size_batch -- N for each seq
#
seq_mask : T * size_batch -- 1/0
seq_mask_to_current : T * T * size_batch -- 1/0
#
seq_sims_mask : N * size_batch -- 1/0
'''
        print("computing loss function of Hawkes model ... ")
# first compute the 3rd term in loss
alpha_over_seq = self.alpha[
:, seq_type_event
] # dim_process * T * size_batch
delta_over_seq = self.delta[
:, seq_type_event
] # dim_process * T * size_batch
#
lambda_over_seq_sims_tilde = self.mu[:,None,None] + tensor.sum(
(
seq_sims_mask_to_current[None,:,:,:] * (
alpha_over_seq[:,None,:,:] * tensor.exp(
-delta_over_seq[:,None,:,:] * seq_sims_time_to_current[None,:,:,:]
)
)
), axis=2
)
# dim_process * N * size_batch
#
lambda_over_seq_sims = self.soft_relu_scale(
lambda_over_seq_sims_tilde
)
#
# dim_process * N * size_batch
#
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=0
)
# N * size_batch
# mask the lambda of simulations
lambda_sum_over_seq_sims *= seq_sims_mask
#
#
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
# (size_batch, )
term_2 = numpy.float32(0.0)
#
'''
        for this model, the computation of term_3 follows the same procedure as term_1, since we need to estimate lambda(s_j), i.e., we need large N * T * size_batch tensors for (1) time to current; (2) mask for (1).
then we can just follow the steps of term_1 to finish the integral estimation.
correspondingly, we need to modify the data processors, to generate the big tensors
'''
        # then we compute the 1st term, which is the trickiest
# we use seq_time_to_current : T * T * size_batch
# seq_mask_to_current : T * T * size_batch
lambda_over_seq_tilde = self.mu[:, None, None] + tensor.sum(
(
seq_mask_to_current[None,:,:,:]
* (
alpha_over_seq[:,None,:,:] * tensor.exp(
-delta_over_seq[:,None,:,:]
* seq_time_to_current[None,:,:,:]
)
)
)
, axis=2
)
# dim_process * T * size_batch
#
lambda_over_seq = self.soft_relu_scale(
lambda_over_seq_tilde
)
#
# dim_process * T * size_batch
#
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis=0
) # T * size_batch
# now we choose the right lambda for each step
# by using seq_type_event : T * size_batch
new_shape_0 = lambda_over_seq.shape[1]*lambda_over_seq.shape[2]
new_shape_1 = lambda_over_seq.shape[0]
#
back_shape_0 = lambda_over_seq.shape[1]
back_shape_1 = lambda_over_seq.shape[2]
#
lambda_target_over_seq = lambda_over_seq.transpose(
(1,2,0)
).reshape(
(
new_shape_0, new_shape_1
)
)[
tensor.arange(new_shape_0),
seq_type_event.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is NaN,
# it can also be the issue of underflow here
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
def save_model(self, file_save):
        print("saving model ... ")
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
#
class NeuralHawkesCTLSTM(object):
'''
This model uses:
    Adaptive base rate, interaction and decay
Continuous-time LSTM
Scale parameter s_k for softrelu curvature adjustment
Reduced version -- delta param is D * D, not D * D * K
'''
#
def __init__(self, settings):
print(">> inside models NeuralHawkesCTLSTM class")
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
        print("initializing Neural Hawkes with Continuous-time LSTM ... ")
        print('>> settings[\'path_pre_train\'] = {}'.format(settings['path_pre_train']))
if settings['path_pre_train'] == None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
# this is useless in cont-time lstm
self.dim_model = settings['dim_model']
# initialize variables
self.scale = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='scale'
)
#
# the 0-th axis -- self.dim_model
# is for dot product with hidden units
# dot(h, W_delta) --> delta of size:
# dim_model * dim_process
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
            # + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
#
self.W_recur = theano.shared(
utils.sample_weights(
2*self.dim_model, 7*self.dim_model
), name='W_recur'
)
'''
2 input :
event rep, hidden state
7 outputs :
4 regular LSTM gates
2 -- input_bar and forget_bar gate
1 -- cell memory decay gate
'''
self.b_recur = theano.shared(
numpy.zeros(
(7*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.scale = theano.shared(
model_pre_train['scale'], name='scale'
)
#
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
#
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.c_0_target = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0_target'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
#self.mu, #self.delta,
self.scale, # scale parameter
self.W_alpha,
self.Emb_event,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
#
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
# numerically stable softplus:
# switch to the linear branch for large x to avoid overflow
y = tensor.log(numpy.float32(1.0) + tensor.exp(x))
z = tensor.switch(x >= 100.0, x, y)
return z
#
#
def soft_relu_scale(self, x):
# x is a symbolic tensor
# its last dim is dim_process
# scaling before and after the softplus is important !
x /= self.scale
y = tensor.log(numpy.float32(1.0)+tensor.exp(x) )
z = tensor.switch(x>=100.0, x, y)
z *= self.scale
return z
#
#
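As a sanity check, the scaled softplus above can be mirrored in plain NumPy (a minimal sketch with a hypothetical helper name, not part of this module):

```python
# NumPy sketch of soft_relu_scale: f(x) = s * softplus(x / s), with a
# switch to the linear branch for large x so exp() cannot overflow.
import numpy as np

def soft_relu_scale_np(x, scale):
    x = x / scale
    y = np.log1p(np.exp(np.minimum(x, 100.0)))  # safe softplus branch
    z = np.where(x >= 100.0, x, y)              # linear branch for big x
    return z * scale

x = np.array([-500.0, 0.0, 500.0])
out = soft_relu_scale_np(x, 2.0)
# out[1] = 2 * log(2); out[2] falls on the linear branch and equals 500
```

The per-type scale s_k flattens or sharpens the curvature of the transfer function around zero, which is what the shared `scale` variable controls.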
def rnn_unit(
self,
emb_event_im1, time_interval_im1,
hidden_t_im1, cell_t_im1, cell_im1_target
):
'''
This LSTM unit works in continuous time.
What a regular LSTM does :
Take h_{i-1}, and update it to h_i
What a CT-LSTM does :
Take h(t_{i-1}), i.e. the hidden state decayed to t_{i-1}
Use it and update to h_i
h_i is then used to compute the Hawkes params
#
input:
emb_event_im1 = x_{i-1}
time_interval_im1 = t_i - t_{i-1}
h(t_{i-1}) right before THIS update
c(t_{i-1}) right before THIS update
c_{i-1}_target before THIS update
output: ( a leading # marks values that are not returned )
h(t_i) right before NEXT update at t_i
c(t_i) right before NEXT update at t_i
c_i_target over ( t_{i-1}, t_i ]
#h_i = h( t_{i-1} <-- t ) right after THIS update
c_i = c( t_{i-1} <-- t ) right after THIS update
decay_rate over ( t_{i-1}, t_i ]
gate_output over ( t_{i-1}, t_i ]
'''
#TODO: update LSTM state at t_{i-1}
pre_transform = tensor.concatenate(
[emb_event_im1, hidden_t_im1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
# 4 regular LSTM gates
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[
:, 3*self.dim_model:4*self.dim_model
]
)
# 2 -- input_bar and forget_bar gates
gate_input_target = tensor.nnet.sigmoid(
post_transform[
:, 4*self.dim_model:5*self.dim_model
]
)
gate_forget_target = tensor.nnet.sigmoid(
post_transform[
:, 5*self.dim_model:6*self.dim_model
]
)
# cell memory decay
decay_cell = self.soft_relu(
post_transform[
:, 6*self.dim_model:
]
)
# size : size_batch * dim_model
#TODO: decay cell memory
cell_i = gate_forget * cell_t_im1 + gate_input * gate_pre_c
cell_i_target = gate_forget_target * cell_im1_target + gate_input_target * gate_pre_c
#
cell_t_i = cell_i_target + (
cell_i - cell_i_target
) * tensor.exp(
-decay_cell * time_interval_im1[:, None]
)
hidden_t_i = gate_output * tensor.tanh(
cell_t_i
)
#TODO: get the hidden state right after this update, which is used to compute Hawkes params
hidden_i = gate_output * tensor.tanh(
cell_i
)
return hidden_t_i, cell_t_i, cell_i_target, cell_i, decay_cell, gate_output
#return hidden_t_i, cell_t_i, cell_i_target, hidden_i, cell_i, decay_cell, gate_output
#
#
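The closed-form cell decay at the heart of rnn_unit can be sketched in NumPy (assumed toy shapes, not this repo's API): between events, the cell relaxes from c_i toward its target \bar{c}_i at the rate given by the decay gate.

```python
import numpy as np

def decay_cell_np(cell, cell_target, decay_rate, dt):
    # c(t_{i-1} + dt) = c_bar + (c_i - c_bar) * exp(-delta * dt)
    return cell_target + (cell - cell_target) * np.exp(-decay_rate * dt)

c = np.array([1.0, -1.0])       # cell right after the update
c_bar = np.array([0.0, 0.0])    # target cell value
delta = np.array([2.0, 2.0])    # decay rates from the decay gate

near = decay_cell_np(c, c_bar, delta, 0.0)   # dt = 0: still equals c
far = decay_cell_np(c, c_bar, delta, 50.0)   # large dt: approaches c_bar
```

This is the same expression used both inside the scan step (for the state right before the next event) and later on simulated times.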
#
def compute_loss(
self,
seq_time_to_current,
seq_type_event, #seq_time_rep,
seq_time_values,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute log likelihood
seq_time_to_current : T * size_batch -- t_i - t_i-1
seq_type_event : (T+1) * size_batch -- k_i
seq_time_values : (T+1) * size_batch -- t_i - t_i-1 starting as 0.0 at BOS event
time_since_start_to_end : size_batch -- time for seq
num_sims_start_to_end : size_batch -- N for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
Warning: seq_time_values and seq_time_to_current carry overlapping information, so this function happens to use only one of them. Pass on_unused_input='warn' to theano.function to avoid the unused-input error.
'''
print(">> In models.NeuralHawkesCTLSTM.compute_loss()")
print("computing loss function of Neural Hawkes model with continuous-time LSTM ... ")
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th is BOS event
the 1-to-T is regular event
regular event id is 0, 1, 2, ..., K-1
the BOS is K
this setting is easier for the use of seq_type_event
'''
# T * size_batch * dim_model
'''
No need to pass time values through thresholds
Use time_values directly
'''
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
initial_cell_target_mat = tensor.outer(
self.expand, self.c_0_target
)
# size_batch * dim_model
# seq_emb_event and seq_time_values start with
# a special BOS event -- ( K, 0.0 )
# to initialize the h, c and \bar{c}
'''
seq_cell_target, seq_cell : cell right AFTER THIS occurrence, including BOS
seq_decay_cell, seq_gate_output : decay and gates AFTER THIS and BEFORE NEXT
seq_hidden_t, seq_cell_t : hidden and cell right BEFORE NEXT occurrence
'''
[seq_hidden_t, seq_cell_t, seq_cell_target, seq_cell, seq_decay_cell, seq_gate_output], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(
input=seq_emb_event[:-1, :, :],
taps=[0]
),
dict(
input=seq_time_to_current,
taps=[0]
)
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1]),
dict(initial=initial_cell_target_mat, taps=[-1]),
None, None, None
],
non_sequences = None
)
# size of outputs of this scan :
# T * size_batch * dim_model
'''
# This tensor is used to compute the effect/decay term
# it will be used to compute term_1 and term_3
# the (t, m, d) entry of this tensor is :
# for the m-th data in the batch, before the t-th event happens,
# the value of the hidden unit at the d-th dimension
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we should feed in the following:
seq_sims_time_to_current : time t - t_recent_event at each simulated time, for each seq in batch
seq_sims_index_in_hidden : index of the hidden units
at each simulated time, so that we can extract the right h(t), c(t), and decay(t)
to do this, we need to make sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden N * size_batch
to (N*size_batch, )
c) index rows to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in seq_sims_index_in_hidden correctly !
'''
#
#
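The indexing steps a)-d) described above amount to a row gather; a small NumPy sketch with made-up shapes:

```python
import numpy as np

T, B, D, N = 3, 2, 4, 5   # toy sizes: T events, batch B, dim_model D, N sims
rng = np.random.default_rng(0)
seq_hidden = rng.normal(size=(T, B, D))
# flat index (into the first two axes) of the latest event's hidden state
sims_index = rng.integers(0, T * B, size=(N, B))

flat = seq_hidden.reshape(T * B, D)        # a) (T*B) * D
gathered = flat[sims_index.flatten(), :]   # b) + c) (N*B) * D
sims_hidden = gathered.reshape(N, B, D)    # d) N * B * D
```

Because the flatten is row-major, entry (i, j) of sims_index lands at entry (i, j) of the result, which is exactly what the four reshape chains below rely on.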
shape_hidden = seq_cell_target.shape
# [ T , size_batch , dim_model ]
shape_sims_index = seq_sims_index_in_hidden.shape
# [ N, size_batch ]
#
seq_cell_target_sims = seq_cell_target.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
seq_cell_sims = seq_cell.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
seq_decay_cell_sims = seq_decay_cell.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
seq_gate_output_sims = seq_gate_output.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
#
seq_cell_with_time_sims = seq_cell_target_sims + (
seq_cell_sims - seq_cell_target_sims
) * tensor.exp(
-seq_decay_cell_sims * seq_sims_time_to_current[:, :, None]
)
seq_hidden_with_time_sims = seq_gate_output_sims * tensor.tanh(
seq_cell_with_time_sims
)
#
lambda_over_seq_sims_tilde = tensor.tensordot(
seq_hidden_with_time_sims, self.W_alpha,
(2, 0)
)
lambda_over_seq_sims = self.soft_relu_scale(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
#
#
term_2 = numpy.float32(0.0)
#
#
# compute term_1
# as the same procedure as term_3, but easier
# since we can directly use
# seq_hidden_t : T * size_batch * dim_model
#
lambda_over_seq_tilde = tensor.tensordot(
seq_hidden_t, self.W_alpha,
(2, 0)
)
lambda_over_seq = self.soft_relu_scale(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is NaN,
# it can also be the issue of underflow here
#
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
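term_3 above is a Monte Carlo estimate of the compensator: the integral of the total intensity over the observation window is approximated by the average intensity at N uniformly sampled times, scaled by the window length. A hedged NumPy sketch on a known integrand:

```python
import numpy as np

def mc_integral(lambda_at_sims, t_end):
    # average intensity at uniform sample times, times interval length
    return lambda_at_sims.mean() * t_end

rng = np.random.default_rng(0)
t_end = 2.0
sims = rng.uniform(0.0, t_end, size=100_000)
est = mc_integral(np.exp(-sims), t_end)   # estimates integral of exp(-t)
true_val = 1.0 - np.exp(-2.0)             # closed form for comparison
```

In the model, lambda_sum_over_seq_sims plays the role of the sampled integrand, masked per sequence, with time_since_start_to_end / num_sims_start_to_end supplying the scaling.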
def compute_lambda(
self,
seq_type_event, #seq_time_rep,
seq_time_values,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute intensity
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each data and each time step, track the time features of event k_i
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print("computing intensity of Neural Hawkes model with continuous-time LSTM ... ")
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th is BOS event
the 1-to-T is regular event
regular event id is 0, 1, 2, ..., K-1
the BOS is K
this setting is easier for the use of seq_type_event
'''
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
initial_cell_target_mat = tensor.outer(
self.expand, self.c_0_target
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden_t, seq_cell_t, seq_cell_target, seq_cell, seq_decay_cell, seq_gate_output], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(
input=seq_emb_event[:-1, :, :],
taps=[0]
),
dict(
input=seq_time_values[1:, :],
taps=[0]
)
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1]),
dict(initial=initial_cell_target_mat, taps=[-1]),
None, None, None
],
non_sequences = None
)
#
'''
# This tensor is used to compute the effect/decay term
# it will be used to compute term_1 and term_3
# the (t, m, d) entry of this tensor is :
# for the m-th data in the batch, before the t-th event happens,
# the value of the hidden unit at the d-th dimension
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we should feed in the following:
seq_sims_time_to_current : time t - t_recent_event at each simulated time, for each seq in batch
seq_sims_index_in_hidden : index of the hidden units
at each simulated time, so that we can extract the right h(t)
to do this, we need to make sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden N * size_batch
to (N*size_batch, )
c) index rows to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in seq_sims_index_in_hidden correctly !
'''
#
shape_hidden = seq_cell_target.shape
# [ T, size_batch, dim_model ]
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_cell_target_sims = seq_cell_target.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
seq_cell_sims = seq_cell.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
seq_decay_cell_sims = seq_decay_cell.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
seq_gate_output_sims = seq_gate_output.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
#
seq_cell_with_time_sims = seq_cell_target_sims + (
seq_cell_sims - seq_cell_target_sims
) * tensor.exp(
-seq_decay_cell_sims * seq_sims_time_to_current[:, :, None]
)
seq_hidden_with_time_sims = seq_gate_output_sims * tensor.tanh(
seq_cell_with_time_sims
)
#
lambda_over_seq_sims_tilde = tensor.tensordot(
seq_hidden_with_time_sims, self.W_alpha,
(2, 0)
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu_scale(
lambda_over_seq_sims_tilde
)
# N * size_batch * dim_process
# (2,0,1) --> dim_process * N * size_batch
'''
this block is to compute intensity
'''
self.lambda_samples = lambda_over_seq_sims.transpose((2,0,1)) * seq_sims_mask[None,:,:]
self.num_of_samples = tensor.sum(seq_sims_mask)
#
#
#
#
def compute_prediction_loss(
self,
seq_type_event, #seq_time_rep,
seq_time_values,
seq_mask,
time_diffs
):
#
print("computing prediction loss for neural Hawkes with continuous-time LSTM ... ")
seq_emb_event = self.Emb_event[seq_type_event, :]
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
initial_cell_target_mat = tensor.outer(
self.expand, self.c_0_target
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden_t, seq_cell_t, seq_cell_target, seq_cell, seq_decay_cell, seq_gate_output], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(
input=seq_emb_event[:-1, :, :],
taps=[0]
),
dict(
input=seq_time_values[1:, :],
taps=[0]
)
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1]),
dict(initial=initial_cell_target_mat, taps=[-1]),
None, None, None
],
non_sequences = None
)
# seq_hidden_t : T * size_batch * dim_model
seq_cell_with_time = seq_cell_target[
:, :, :, None
] + (
seq_cell[:, :, :, None] - seq_cell_target[:, :, :, None]
) * tensor.exp(
-seq_decay_cell[:, :, :, None] * time_diffs[
None, None, None, :
]
)
# T * size_batch * dim_model * M
seq_hidden_with_time = seq_gate_output[
:, :, :, None
] * tensor.tanh(
seq_cell_with_time
)
# T * size_batch * dim_model * M
lambda_over_seq_tilde = tensor.sum(
seq_hidden_with_time[
:, :, :, None, :
] * self.W_alpha[
None, None, :, :, None
], axis = 2
)
# T * size_batch * dim_process * M
# each time stamp, each seq in batch
# each process, each simulation for prediction
lambda_over_seq = self.soft_relu_scale(
lambda_over_seq_tilde.dimshuffle(3,0,1,2)
).dimshuffle(1,2,3,0)
#
# T * size_batch * dim_process * M
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis=2
)
# T * size_batch * M
term_1 = time_diffs
# (M, )
#
cum_num = tensor.arange(
time_diffs.shape[0]+numpy.int32(1)
)[1:] * numpy.float32(1.0)
# M
term_2 = tensor.exp(
(
-1.0 * tensor.extra_ops.cumsum(
lambda_sum_over_seq, axis = 2
) / cum_num[None, None, :]
) * time_diffs[
None, None, :
]
)
# T * size_batch * M
term_3 = lambda_sum_over_seq
# T * size_batch * M
density = term_2 * term_3
# T * size_batch * M
time_prediction = tensor.mean(
term_1[None, None, :] * density,
axis = 2
) * time_diffs[-1]
# T * size_batch
lambda_over_seq_over_sims = lambda_over_seq[
:, :, :, :
] * density[
:, :, None, :
] / lambda_sum_over_seq[
:, :, None, :
]
# T * size_batch * dim_process * M
prob_over_seq_over_type = tensor.mean(
lambda_over_seq_over_sims, axis = 3
) * time_diffs[-1]
# T * size_batch * dim_process
prob_over_seq_over_type /= tensor.sum(
prob_over_seq_over_type,
axis=2,
keepdims=True
)
# T * size_batch * dim_process
#type_prediction = tensor.argmax(
# prob_over_seq_over_type, axis = 2
#)
# T * size_batch
# Now we have :
# time_prediction, type_prediction, seq_mask
# all of -- T * size_batch
target_type = seq_type_event[1:, :]
target_time = seq_time_values[1:, :]
# Type first
new_shape_0 = target_type.shape[0] * target_type.shape[1]
new_shape_1 = self.dim_process
back_shape_0 = target_type.shape[0]
back_shape_1 = target_type.shape[1]
#
prob_over_seq = prob_over_seq_over_type.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
target_type.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
log_prob_over_seq = tensor.log(
prob_over_seq + numpy.float32(1e-9)
)
log_prob_over_seq *= seq_mask
self.log_likelihood_type_predict = tensor.sum(
log_prob_over_seq
)
#diff_type = tensor.abs_(
# target_type - type_prediction
#) * seq_mask
#diff_type = tensor.switch(
# diff_type >= numpy.float32(0.5),
# numpy.float32(1.0), numpy.float32(0.0)
#)
#
#self.num_of_errors = tensor.sum(diff_type)
# Time
diff_time = (
target_time - time_prediction
)**2
diff_time *= seq_mask
self.square_errors = tensor.sum(diff_time)
self.num_of_events = tensor.sum(seq_mask)
#TODO: Hamming loss for prediction checking
#
type_prediction = tensor.argmax(
prob_over_seq_over_type, axis = 2
)
diff_type = tensor.abs_(
target_type - type_prediction
) * seq_mask
diff_type = tensor.switch(
diff_type >= numpy.float32(0.5),
numpy.float32(1.0), numpy.float32(0.0)
)
self.num_of_errors = tensor.sum(diff_type)
#
self.cost_to_optimize = -self.log_likelihood_type_predict / self.num_of_events + self.square_errors / self.num_of_events + self.term_reg
#self.cost_to_optimize = -self.log_likelihood_type_predict + self.term_reg
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
self.abs_grad_params = 0.0
for grad_param in self.grad_params:
self.abs_grad_params += tensor.sum(
tensor.abs_(
grad_param
)
)
#
#
#
#
#
#TODO: memory efficient version of prediction loss
def predict_each_step(
self,
cell_target, cell,
decay_cell, gate_output,
time_diffs
):
# seqs : size_batch * dim_model
# time_diffs : M
cell_with_time = cell_target[
:, :, None
] + (
cell[:, :, None] - cell_target[:, :, None]
) * tensor.exp(
-decay_cell[:, :, None] * time_diffs[
None, None, :
]
)
# size_batch * dim_model * M
hidden_with_time = gate_output[
:, :, None
] * tensor.tanh(
cell_with_time
)
# size_batch * dim_model * M
lambda_tilde = tensor.sum(
hidden_with_time[
:, :, None, :
] * self.W_alpha[
None, :, :, None
], axis = 1
)
# size_batch * dim_process * M
lambda_each_step = self.soft_relu_scale(
lambda_tilde.dimshuffle(2, 0, 1)
).dimshuffle(1, 2, 0)
lambda_sum_each_step = tensor.sum(
lambda_each_step, axis=1
)
# size_batch * M
#TODO: compute integral
term_1 = time_diffs
cum_num = tensor.arange(
time_diffs.shape[0]+numpy.int32(1)
)[1:] * numpy.float32(1.0)
# M
term_2 = tensor.exp(
(
-1.0 * tensor.extra_ops.cumsum(
lambda_sum_each_step, axis=1
) / cum_num[None, :]
) * time_diffs[None, :]
)
# size_batch * M
term_3 = lambda_sum_each_step
density = term_2 * term_3
# size_batch * M
time_prediction_each_step = tensor.mean(
term_1[None, :] * density, axis=1
) * time_diffs[-1]
# size_batch
lambda_each_step_over_sims = lambda_each_step[
:, :, :
] * density[
:, None, :
] / lambda_sum_each_step[
:, None, :
]
# size_batch * dim_process * M
prob_over_type = tensor.mean(
lambda_each_step_over_sims, axis=2
) * time_diffs[-1]
# size_batch * dim_process
prob_over_type /= tensor.sum(
prob_over_type, axis=1, keepdims=True
)
# size_batch * dim_process
return prob_over_type, time_prediction_each_step
#
#
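The time prediction in predict_each_step integrates dt * p(dt) with density p(dt) = lambda(dt) * exp(-integral of lambda), evaluated on a grid of M candidate offsets; the cumsum / cum_num expression is a running average that approximates the inner integral. A toy NumPy sketch with a constant intensity, whose expected gap is 1 / lambda:

```python
import numpy as np

M, t_max = 20_000, 20.0
time_diffs = np.linspace(t_max / M, t_max, M)   # grid of candidate offsets

lam = np.full(M, 0.5)                           # constant intensity
cum_num = np.arange(1, M + 1, dtype=np.float64)
# running mean of lambda times t approximates the integral of lambda to t
survival = np.exp(-(np.cumsum(lam) / cum_num) * time_diffs)
density = survival * lam                        # p(dt)
pred = np.mean(time_diffs * density) * time_diffs[-1]
# for lambda = 0.5, the expected gap is 1 / 0.5 = 2
```

The final `* time_diffs[-1]` converts the mean over grid points into a Riemann-sum estimate of the integral, mirroring the Theano code above.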
def compute_prediction_loss_lessmem(
self,
seq_type_event,
seq_time_values,
seq_mask,
time_diffs
):
#
print("computing prediction loss of neural Hawkes with continuous-time LSTM ... ")
print("memory efficient version ... ")
seq_emb_event = self.Emb_event[seq_type_event, :]
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
initial_cell_target_mat = tensor.outer(
self.expand, self.c_0_target
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden_t, seq_cell_t, seq_cell_target, seq_cell, seq_decay_cell, seq_gate_output], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(
input=seq_emb_event[:-1, :, :],
taps=[0]
),
dict(
input=seq_time_values[1:, :],
taps=[0]
)
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1]),
dict(initial=initial_cell_target_mat, taps=[-1]),
None, None, None
],
non_sequences = None
)
#
#TODO: predict time and type for each step
[prob_over_seq_over_type, time_prediction], _ = theano.scan(
fn = self.predict_each_step,
sequences = [
dict(input=seq_cell_target, taps=[0]),
dict(input=seq_cell, taps=[0]),
dict(input=seq_decay_cell, taps=[0]),
dict(input=seq_gate_output, taps=[0])
],
outputs_info = [
None, None
],
non_sequences = time_diffs
)
#
target_type = seq_type_event[1:, :]
target_time = seq_time_values[1:, :]
# Type first
new_shape_0 = target_type.shape[0] * target_type.shape[1]
new_shape_1 = self.dim_process
back_shape_0 = target_type.shape[0]
back_shape_1 = target_type.shape[1]
#
prob_over_seq = prob_over_seq_over_type.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
target_type.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
log_prob_over_seq = tensor.log(
prob_over_seq + numpy.float32(1e-9)
)
log_prob_over_seq *= seq_mask
self.log_likelihood_type_predict = tensor.sum(
log_prob_over_seq
)
#
# Time
diff_time = (
target_time - time_prediction
)**2
diff_time *= seq_mask
self.square_errors = tensor.sum(diff_time)
self.num_of_events = tensor.sum(seq_mask)
#TODO: Hamming loss for prediction checking
#
type_prediction = tensor.argmax(
prob_over_seq_over_type, axis = 2
)
diff_type = tensor.abs_(
target_type - type_prediction
) * seq_mask
diff_type = tensor.switch(
diff_type >= numpy.float32(0.5),
numpy.float32(1.0), numpy.float32(0.0)
)
self.num_of_errors = tensor.sum(diff_type)
#
self.cost_to_optimize = -self.log_likelihood_type_predict / self.num_of_events + self.square_errors / self.num_of_events + self.term_reg
#self.cost_to_optimize = -self.log_likelihood_type_predict + self.term_reg
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
self.abs_grad_params = 0.0
for grad_param in self.grad_params:
self.abs_grad_params += tensor.sum(
tensor.abs_(
grad_param
)
)
#
#
#
#
#
def get_model(self):
print("getting model ... ")
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
return model_dict
#
#
#
def save_model(self, file_save):
model_dict = self.get_model()
print("saving model ... ")
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
#
#
#
#
#
#
#
#
# deprecated models
# TODO: modules below are deprecated
# they are models that we tried over this project
# most of them work, better than Hawkes baseline
# but still lose to our neural Hawkes with continuous-time LSTM
# most of them keep the decomposable structure of Hawkes
# and try to use neural networks to parametrize it
#
#
#
class HawkesInhibCTSM(object):
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
print("initializing Hawkes CTSM ... ")
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
# initialize variables
self.mu = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='mu'
)
'''
note that in these K * K matrices,
the (i, j) entry is the effect of the j-th type on the i-th
this order may be changed in the neural Hawkes models
for ease of implementation or to follow Theano conventions
'''
self.alpha = theano.shared(
numpy.ones(
(self.dim_process, self.dim_process),
dtype=dtype
), name='alpha'
)
self.delta = theano.shared(
numpy.ones(
(self.dim_process, self.dim_process),
dtype=dtype
), name='delta'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.mu = theano.shared(
model_pre_train['mu'], name='mu'
)
self.alpha = theano.shared(
model_pre_train['alpha'], name='alpha'
)
self.delta = theano.shared(
model_pre_train['delta'], name='delta'
)
#
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
self.mu, self.alpha, self.delta
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
return tensor.log(numpy.float32(1.0)+tensor.exp(x))
#
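The intensity this class parameterizes is the classical Hawkes form with a softplus transfer, so negative alpha entries give inhibition while the intensity stays positive. A toy NumPy sketch (hypothetical scalar helper, single event type, not this class's API):

```python
import numpy as np

def intensity(t, mu, alpha, delta, event_times):
    past = event_times[event_times < t]
    excitation = np.sum(alpha * np.exp(-delta * (t - past)))
    return np.log1p(np.exp(mu + excitation))   # softplus keeps lambda > 0

mu, alpha, delta = 0.0, -5.0, 1.0              # inhibitory interaction
events = np.array([1.0])

lam_before = intensity(0.5, mu, alpha, delta, events)  # no past events yet
lam_after = intensity(1.01, mu, alpha, delta, events)  # suppressed by event
```

With a plain Hawkes model, alpha = -5 would drive the intensity negative right after the event; the softplus is what makes self-inhibition legal here.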
def compute_loss(
self,
seq_time_to_current, seq_type_event,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask, seq_mask_to_current,
seq_sims_time_to_current,
seq_sims_mask_to_current,
seq_sims_mask
):
'''
use this function to compute the log likelihood
(its negation is the cost to optimize)
seq_time_to_current : T * T * size_batch --
for each batch, it is T * T, and at each time step t,
it tracks the ( t_i - t_i' ) for all t_i' < t_i
seq_type_event : T * size_batch -- for each data
and each time step, tracks the type of event k_i
time_since_start_to_end : size_batch -- time for seq
num_sims_start_to_end : size_batch -- N for each seq
#
seq_mask : T * size_batch -- 1/0
seq_mask_to_current : T * T * size_batch -- 1/0
#
seq_sims_time_to_current : N * T * size_batch -- s_j - t_i'
seq_sims_mask_to_current : N * T * size_batch -- 1/0
seq_sims_mask : N * size_batch -- 1/0
'''
print("computing loss function of Hawkes model ... ")
# first compute the 3rd term in loss
alpha_over_seq = self.alpha[
:, seq_type_event
] # dim_process * T * size_batch
delta_over_seq = self.delta[
:, seq_type_event
] # dim_process * T * size_batch
#
lambda_over_seq_sims_tilde = self.mu[:,None,None] + tensor.sum(
(
seq_sims_mask_to_current[None,:,:,:] * (
alpha_over_seq[:,None,:,:] * tensor.exp(
-delta_over_seq[:,None,:,:] * seq_sims_time_to_current[None,:,:,:]
)
)
), axis=2
) # dim_process * N * size_batch
#
lambda_over_seq_sims = tensor.log(
numpy.float32(1.0) + tensor.exp(
lambda_over_seq_sims_tilde
)
)
# dim_process * N * size_batch
#
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=0
)
# N * size_batch
# mask the lambda of simulations
lambda_sum_over_seq_sims *= seq_sims_mask
#
#
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
# (size_batch, )
term_2 = numpy.float32(0.0)
#
'''
for this model, the computation of term_3 follows the same procedure of term_1, since we need to estimate lambda(s_j), i.e, we need large N * T * size_batch tensors for (1) time to current; (2) mask for (1).
then we can just follow the steps of term_1 to finish the integral estimation.
correspondingly, we need to modify the data processors, to generate the big tensors
'''
# then we compute the 1st term, which is the trickiest
# we use seq_time_to_current : T * T * size_batch
# seq_mask_to_current : T * T * size_batch
lambda_over_seq_tilde = self.mu[:, None, None] + tensor.sum(
(
seq_mask_to_current[None,:,:,:]
* (
alpha_over_seq[:,None,:,:] * tensor.exp(
-delta_over_seq[:,None,:,:]
* seq_time_to_current[None,:,:,:]
)
)
)
, axis=2
) # dim_process * T * size_batch
#
lambda_over_seq = tensor.log(
numpy.float32(1.0) + tensor.exp(
lambda_over_seq_tilde
)
) # dim_process * T * size_batch
#
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis=0
) # T * size_batch
# now we choose the right lambda for each step
# by using seq_type_event : T * size_batch
new_shape_0 = lambda_over_seq.shape[1]*lambda_over_seq.shape[2]
new_shape_1 = lambda_over_seq.shape[0]
#
back_shape_0 = lambda_over_seq.shape[1]
back_shape_1 = lambda_over_seq.shape[2]
#
lambda_target_over_seq = lambda_over_seq.transpose(
(1,2,0)
).reshape(
(
new_shape_0, new_shape_1
)
)[
tensor.arange(new_shape_0),
seq_type_event.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is NaN,
# it can also be the issue of underflow here
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
#
class NeuralHawkesCTSM(object):
#
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Neural Hawkes CTSM ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
self.mu = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='mu'
)
'''
note: in these matrices of D * K,
the (i, j) entry is the effect of the i-th dimension
on the j-th event type;
this ordering may differ from that of the Hawkes model,
so be careful when interpreting the entries
'''
self.delta = theano.shared(
numpy.ones(
(self.dim_model, self.dim_process),
dtype=dtype
), name='delta'
)
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time, self.dim_model
), name='Emb_time'
)
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.mu = theano.shared(
model_pre_train['mu'], name='mu'
)
self.delta = theano.shared(
model_pre_train['delta'], name='delta'
)
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
self.mu, self.delta,
self.W_alpha,
self.Emb_event, self.Emb_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
return tensor.log(numpy.float32(1.0)+tensor.exp(x))
#
#
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
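# The gating above packs all four LSTM transforms into one dot product
# with W_recur. A minimal NumPy sketch of the same step; names and the
# toy shapes below are illustrative stand-ins, not taken from this module:

```python
import numpy as np

def lstm_step(emb_event_t, emb_time_t, h_tm1, c_tm1, W, b, D):
    # concatenate event embedding, time embedding and previous hidden state
    pre = np.concatenate([emb_event_t, emb_time_t, h_tm1], axis=1)
    post = pre.dot(W) + b                      # (batch, 4*D)
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    i = sigmoid(post[:, :D])                   # input gate
    f = sigmoid(post[:, D:2*D])                # forget gate
    o = sigmoid(post[:, 2*D:3*D])              # output gate
    g = np.tanh(post[:, 3*D:])                 # candidate cell
    c_t = f * c_tm1 + i * g
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# toy shapes: batch of 2, model dimension 3
D, B = 3, 2
rng = np.random.RandomState(0)
h, c = lstm_step(
    rng.randn(B, D), rng.randn(B, D),
    np.zeros((B, D)), np.zeros((B, D)),
    rng.randn(3 * D, 4 * D), np.zeros(4 * D), D
)
```

# Because h = o * tanh(c) with o in (0, 1), every entry of h stays
# strictly inside (-1, 1), which keeps the recurrence well-behaved.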
#
#
def compute_loss(
self,
seq_time_to_current,
seq_type_event, seq_time_rep,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute the log-likelihood
seq_time_to_current : T * size_batch -- t_i - t_{i-1}
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each sequence and each time step, the time features of event k_i
time_since_start_to_end : size_batch -- total time span of each seq
num_sims_start_to_end : size_batch -- N (# of simulated times) for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th entry is the BOS event,
entries 1 to T are regular events
regular event ids are 0, 1, 2, ..., K-1,
and the BOS id is K
this convention makes indexing with seq_type_event easier
'''
# (T+1) * size_batch * dim_model
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
# this tensor is used to compute the effect/decay term
# it is used for both term_1 and term_3
# the (t, m, d) entry of this tensor is:
# for the m-th sequence in the batch, the value of the
# d-th hidden unit just before the t-th event happens
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we feed in the following:
seq_sims_time_to_current : t - t_recent_event at each simulated time, for each seq in the batch
seq_sims_index_in_hidden : index of the hidden state
at each simulated time, so that we can extract the right h(t)
for this, the indexing must be correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden from N * size_batch
to (N*size_batch, )
c) index to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is filling in seq_sims_index_in_hidden correctly !!!
'''
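# The a)-to-d) indexing recipe in the docstring above can be sketched
# with NumPy on toy shapes. All sizes and the `t_pick` helper below are
# illustrative: entry (n, m) of the index array must point at flat row
# t * size_batch + m of the reshaped hidden matrix.

```python
import numpy as np

T, B, D, N = 4, 2, 3, 5
seq_hidden_for_lambda = np.arange(T * B * D, dtype=np.float32).reshape(T, B, D)
rng = np.random.RandomState(0)
# for batch element m, pick a random event index t and form t*B + m
t_pick = rng.randint(0, T, size=(N, B))
seq_sims_index_in_hidden = t_pick * B + np.arange(B)[None, :]

flat = seq_hidden_for_lambda.reshape(T * B, D)           # a)
rows = seq_sims_index_in_hidden.flatten()                # b)
seq_hidden_for_sims = flat[rows, :].reshape(N, B, D)     # c) + d)

# sanity check: entry (n, m, :) is h(t_pick[n, m]) of batch element m
assert np.allclose(
    seq_hidden_for_sims[0, 1],
    seq_hidden_for_lambda[t_pick[0, 1], 1]
)
```

# Filling the index array with anything other than t*B + m silently
# gathers hidden states from the wrong sequence, which is exactly the
# pitfall the docstring warns about.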
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :, None
] * tensor.exp(
-self.delta[
None, None, :, :
] * seq_sims_time_to_current[
:, :, None, None
]
)
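# The decay term just computed relies on broadcasting: the hidden states
# gain a trailing process axis, delta gains leading simulation and batch
# axes, and the elapsed times broadcast over both model and process axes.
# A NumPy sketch with made-up shapes (N simulated times, batch B,
# dim_model D, dim_process K):

```python
import numpy as np

N, B, D, K = 5, 2, 3, 4
rng = np.random.RandomState(0)
h = rng.randn(N, B, D)                           # gathered hidden states
delta = np.abs(rng.randn(D, K))                  # decay rates > 0
dt = rng.uniform(0.0, 1.0, size=(N, B))          # s_j - t_i, nonnegative
# (N, B, D, 1) * exp(-(1, 1, D, K) * (N, B, 1, 1)) -> (N, B, D, K)
hidden_with_time = h[:, :, :, None] * np.exp(
    -delta[None, None, :, :] * dt[:, :, None, None]
)
assert hidden_with_time.shape == (N, B, D, K)
```

# Since delta > 0 and dt >= 0, each exponential factor lies in (0, 1],
# so the decayed entries can never exceed the raw hidden states in
# magnitude.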
#
# N * size_batch * dim_model * dim_process
# self.W_alpha : dim_model * dim_process
lambda_over_seq_sims_tilde = self.mu[None, None, :] + tensor.sum(
seq_sims_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
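# term_3 is a Monte Carlo estimate of the compensator integral
# int_0^T sum_k lambda_k(t) dt: average the total intensity at N
# uniformly drawn simulated times and scale by the sequence length T.
# A NumPy sketch on a closed-form stand-in intensity
# (lambda(t) = exp(-t), whose integral over [0, T] is 1 - exp(-T)):

```python
import numpy as np

rng = np.random.RandomState(0)
T_end = 2.0
N = 100000
sim_times = rng.uniform(0.0, T_end, size=N)
lambda_at_sims = np.exp(-sim_times)          # stand-in total intensity
term_3 = np.sum(lambda_at_sims) * T_end / N  # same formula as the code
exact = 1.0 - np.exp(-T_end)
# the estimate converges to the exact integral as N grows
assert abs(term_3 - exact) < 1e-2
```

# In the model, lambda_sum_over_seq_sims plays the role of
# lambda_at_sims, with masked entries contributing zero to the sum.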
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# as the same procedure as term_3, but easier
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None
] * tensor.exp(
-self.delta[
None, None, :, :
] * seq_time_to_current[
:, :, None, None
]
)
# T * size_batch * dim_model * dim_process
lambda_over_seq_tilde = self.mu[None, None, :] + tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# T * size_batch * dim_process
lambda_over_seq = self.soft_relu(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if a NaN shows up here,
# it may also be caused by underflow
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
class GeneralizedNeuralHawkesCTSM(object):
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Generalized Neural Hawkes CTSM ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
self.mu = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='mu'
)
'''
note: in these matrices of D * K,
the (i, j) entry is the effect of the i-th dimension
on the j-th event type;
this ordering may differ from that of the Hawkes model,
so be careful when interpreting the entries
'''
#self.delta = theano.shared(
# numpy.ones(
# (self.dim_model, self.dim_process),
# dtype=dtype
# ), name='delta'
#)
#
self.W_delta = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model,
self.dim_model,
self.dim_process
)
)
), name = 'W_delta'
)
# the 0-th axis -- self.dim_model
# is for dot product with hidden units
# dot(h, W_delta) --> delta of size:
# dim_model * dim_process
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time, self.dim_model
), name='Emb_time'
)
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.mu = theano.shared(
model_pre_train['mu'], name='mu'
)
#self.delta = theano.shared(
# model_pre_train['delta'], name='delta'
#)
self.W_delta = theano.shared(
model_pre_train['W_delta'], name='W_delta'
)
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
self.mu, #self.delta,
self.W_delta, self.W_alpha,
self.Emb_event, self.Emb_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# numerically stable softplus on a symbolic tensor:
# log(1 + exp(x)) overflows for large x, but softplus(x) ~= x there,
# so we switch to the identity once x >= 100
y = tensor.log(numpy.float32(1.0) + tensor.exp(x))
z = tensor.switch(x >= 100.0, x, y)
return z
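# The switch above guards the softplus against overflow: for x >= 100,
# exp(x) overflows but softplus(x) is numerically equal to x. The same
# guard expressed in NumPy (the function name is illustrative):

```python
import numpy as np

def soft_relu_stable(x):
    # naive log(1 + exp(x)) overflows for large x; clamp before exp,
    # then fall back to the identity where x >= 100
    y = np.log1p(np.exp(np.minimum(x, 100.0)))
    return np.where(x >= 100.0, x, y)

x = np.array([-5.0, 0.0, 5.0, 500.0], dtype=np.float64)
out = soft_relu_stable(x)
# softplus(0) = log 2; the x = 500 entry stays finite and equals 500
assert np.isfinite(out).all()
```

# Without the guard, exp(500.0) evaluates to inf and the log-likelihood
# would propagate inf/NaN into the gradients.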
#
#
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
#
#
def compute_loss(
self,
seq_time_to_current,
seq_type_event, seq_time_rep,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute the log-likelihood
seq_time_to_current : T * size_batch -- t_i - t_{i-1}
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each sequence and each time step, the time features of event k_i
time_since_start_to_end : size_batch -- total time span of each seq
num_sims_start_to_end : size_batch -- N (# of simulated times) for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th entry is the BOS event,
entries 1 to T are regular events
regular event ids are 0, 1, 2, ..., K-1,
and the BOS id is K
this convention makes indexing with seq_type_event easier
'''
# (T+1) * size_batch * dim_model
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
# this tensor is used to compute the effect/decay term
# it is used for both term_1 and term_3
# the (t, m, d) entry of this tensor is:
# for the m-th sequence in the batch, the value of the
# d-th hidden unit just before the t-th event happens
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we feed in the following:
seq_sims_time_to_current : t - t_recent_event at each simulated time, for each seq in the batch
seq_sims_index_in_hidden : index of the hidden state
at each simulated time, so that we can extract the right h(t)
for this, the indexing must be correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden from N * size_batch
to (N*size_batch, )
c) index to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is filling in seq_sims_index_in_hidden correctly !!!
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
# self.W_delta : dim_model * dim_model * dim_process
#
delta_for_sims = self.soft_relu(
tensor.tensordot(
seq_hidden_for_sims, self.W_delta, (2,0)
)
)
#
# N * size_batch * dim_model * dim_process
#
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :, None
] * tensor.exp(
-delta_for_sims * seq_sims_time_to_current[
:, :, None, None
]
)
#
# N * size_batch * dim_model * dim_process
# self.W_alpha : dim_model * dim_process
lambda_over_seq_sims_tilde = self.mu[None, None, :] + tensor.sum(
seq_sims_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# as the same procedure as term_3, but easier
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
#
#
delta_for_lambda = self.soft_relu(
tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
)
# T * size_batch * dim_model * dim_process
#
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None
] * tensor.exp(
-delta_for_lambda * seq_time_to_current[
:, :, None, None
]
)
# T * size_batch * dim_model * dim_process
lambda_over_seq_tilde = self.mu[None, None, :] + tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# T * size_batch * dim_process
lambda_over_seq = self.soft_relu(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if a NaN shows up here,
# it may also be caused by underflow
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
class NeuralHawkesAdaptiveBaseCTSM(object):
# NOTE: the base rate is adaptive
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Generalized Neural Hawkes with Adaptive Base Rate CTSM ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
#self.mu = theano.shared(
# numpy.ones(
# (self.dim_process,), dtype=dtype
# ), name='mu'
#)
'''
note: in these matrices of D * K,
the (i, j) entry is the effect of the i-th dimension
on the j-th event type;
this ordering may differ from that of the Hawkes model,
so be careful when interpreting the entries
'''
#self.delta = theano.shared(
# numpy.ones(
# (self.dim_model, self.dim_process),
# dtype=dtype
# ), name='delta'
#)
#
self.W_mu = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model, self.dim_process
)
)
), name = 'W_mu'
)
#
#
self.W_delta = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model,
self.dim_model,
self.dim_process
)
)
), name = 'W_delta'
)
# the 0-th axis -- self.dim_model
# is for dot product with hidden units
# dot(h, W_delta) --> delta of size:
# dim_model * dim_process
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time, self.dim_model
), name='Emb_time'
)
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.W_mu = theano.shared(
model_pre_train['W_mu'], name='W_mu'
)
#self.delta = theano.shared(
# model_pre_train['delta'], name='delta'
#)
self.W_delta = theano.shared(
model_pre_train['W_delta'], name='W_delta'
)
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
#self.mu, #self.delta,
self.W_mu, self.W_delta, self.W_alpha,
self.Emb_event, self.Emb_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
return tensor.log(numpy.float32(1.0)+tensor.exp(x))
#
#
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
#
#
def compute_loss(
self,
seq_time_to_current,
seq_type_event, seq_time_rep,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute the log-likelihood
seq_time_to_current : T * size_batch -- t_i - t_{i-1}
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each sequence and each time step, the time features of event k_i
time_since_start_to_end : size_batch -- total time span of each seq
num_sims_start_to_end : size_batch -- N (# of simulated times) for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th entry is the BOS event,
entries 1 to T are regular events
regular event ids are 0, 1, 2, ..., K-1,
and the BOS id is K
this convention makes indexing with seq_type_event easier
'''
# (T+1) * size_batch * dim_model
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
# this tensor is used to compute the effect/decay term
# it is used for both term_1 and term_3
# the (t, m, d) entry of this tensor is:
# for the m-th sequence in the batch, the value of the
# d-th hidden unit just before the t-th event happens
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we feed in the following:
seq_sims_time_to_current : t - t_recent_event at each simulated time, for each seq in the batch
seq_sims_index_in_hidden : index of the hidden state
at each simulated time, so that we can extract the right h(t)
for this, the indexing must be correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden from N * size_batch
to (N*size_batch, )
c) index to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is filling in seq_sims_index_in_hidden correctly !!!
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
# self.W_delta : dim_model * dim_model * dim_process
#
delta_for_sims = self.soft_relu(
tensor.tensordot(
seq_hidden_for_sims, self.W_delta, (2,0)
)
)
#
# N * size_batch * dim_model * dim_process
#
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :, None
] * tensor.exp(
-delta_for_sims * seq_sims_time_to_current[
:, :, None, None
]
)
#
# N * size_batch * dim_model * dim_process
# self.W_alpha : dim_model * dim_process
mu_for_sims = tensor.tensordot(
seq_hidden_for_sims, self.W_mu, (2,0)
)
# N * size_batch * dim_process
#
lambda_over_seq_sims_tilde = mu_for_sims + tensor.sum(
seq_sims_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# as the same procedure as term_3, but easier
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
#
#
delta_for_lambda = self.soft_relu(
tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
)
# T * size_batch * dim_model * dim_process
#
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None
] * tensor.exp(
-delta_for_lambda * seq_time_to_current[
:, :, None, None
]
)
# T * size_batch * dim_model * dim_process
#
mu_for_lambda = tensor.tensordot(
seq_hidden_for_lambda, self.W_mu, (2,0)
)
# T * size_batch * dim_process
#
lambda_over_seq_tilde = mu_for_lambda + tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# T * size_batch * dim_process
lambda_over_seq = self.soft_relu(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if a NaN shows up here,
# it may also be caused by underflow
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
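save_model above pickles a dict of parameter arrays plus the three dimension settings, and the pre-train branch of __init__ reads the same dict back. A minimal sketch of that round trip with plain numpy and an in-memory buffer (the key names and sizes here are hypothetical, chosen to mirror the names used above):

```python
import io
import pickle
import numpy

# hypothetical parameter values, mirroring the key names save_model writes
model_dict = {
    'W_recur': numpy.zeros((6, 8), dtype=numpy.float32),
    'b_recur': numpy.zeros((8,), dtype=numpy.float32),
    'dim_process': 5,
    'dim_time': 10,
    'dim_model': 2,
}

buf = io.BytesIO()
pickle.dump(model_dict, buf)        # save_model writes this dict to a file
buf.seek(0)
model_pre_train = pickle.load(buf)  # the pre-train branch of __init__ reads it back
```

In the real code the buffer is a file opened with `'wb'` / `'rb'`; everything else is the same.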
class NeuralHawkesSimpleCTSM(object):
#TODO: all parameters controlled by one LSTM state
#
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Neural Hawkes Simple CTSM ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
#self.mu = theano.shared(
# numpy.ones(
# (self.dim_process,), dtype=dtype
# ), name='mu'
#)
'''
note that in these D * K matrices,
the (i, j) entry is the effect of the i-th dimension
on the j-th event type;
this order may differ from that of the Hawkes model,
so be careful when interpreting the parameters
'''
#self.delta = theano.shared(
# numpy.ones(
# (self.dim_model, self.dim_process),
# dtype=dtype
# ), name='delta'
#)
#
self.W_hawkes = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model,
3 * self.dim_process
)
)
), name = 'W_hawkes'
)
self.b_hawkes = theano.shared(
numpy.zeros(
(3*self.dim_process,), dtype=dtype
), name='b_hawkes'
)
#
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time, self.dim_model
), name='Emb_time'
)
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
#with open(settings['path_pre_train'], 'rb') as f:
# model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.W_hawkes = theano.shared(
model_pre_train['W_hawkes'], name = 'W_hawkes'
)
self.b_hawkes = theano.shared(
model_pre_train['b_hawkes'], name='b_hawkes'
)
#
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
#self.mu, #self.delta,
self.W_hawkes, self.b_hawkes,
self.Emb_event, self.Emb_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
# numerically stable softplus, matching the other classes:
# for x >= 100, exp(x) would overflow, and log(1+exp(x)) ~= x
y = tensor.log(numpy.float32(1.0) + tensor.exp(x))
return tensor.switch(x >= 100.0, x, y)
#
#
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
#
#
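rnn_unit above applies one fused affine transform and slices the result into the four LSTM gates, in the order: input gate, forget gate, output gate, candidate cell. A numpy sketch of a single step under that same layout, with made-up toy sizes:

```python
import numpy

def sigmoid(x):
    return 1.0 / (1.0 + numpy.exp(-x))

def lstm_step(emb_event_t, emb_time_t, hidden_tm1, cell_tm1, W_recur, b_recur):
    # mirrors rnn_unit: concatenate inputs, one affine map, slice into gates
    dim_model = hidden_tm1.shape[1]
    pre = numpy.concatenate([emb_event_t, emb_time_t, hidden_tm1], axis=1)
    post = pre.dot(W_recur) + b_recur  # size_batch * (4*dim_model)
    gate_input = sigmoid(post[:, :dim_model])
    gate_forget = sigmoid(post[:, dim_model:2*dim_model])
    gate_output = sigmoid(post[:, 2*dim_model:3*dim_model])
    gate_pre_c = numpy.tanh(post[:, 3*dim_model:])
    cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
    hidden_t = gate_output * numpy.tanh(cell_t)
    return hidden_t, cell_t

# toy sizes, random weights (assumed, for illustration only)
size_batch, dim_model = 2, 3
rng = numpy.random.RandomState(0)
W = rng.normal(scale=0.1, size=(3*dim_model, 4*dim_model))
b = numpy.zeros(4*dim_model)
h, c = lstm_step(
    rng.rand(size_batch, dim_model), rng.rand(size_batch, dim_model),
    numpy.zeros((size_batch, dim_model)), numpy.zeros((size_batch, dim_model)),
    W, b
)
```

The slicing convention (input, forget, output, candidate) has to match wherever W_recur was trained, which is why the pre-train branch reloads W_recur and b_recur as single blocks.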
def compute_loss(
self,
seq_time_to_current,
seq_type_event, seq_time_rep,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute log likelihood
seq_time_to_current : T * size_batch -- t_i - t_i-1
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each data and each time step, track the time features of event k_i
time_since_start_to_end : size_batch -- time for seq
num_sims_start_to_end : size_batch -- N for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th entry is the BOS event
entries 1 to T are regular events
regular event ids are 0, 1, 2, ..., K-1
the BOS id is K
this convention makes seq_type_event easy to index with
'''
# T * size_batch * dim_model
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
# This tensor is used to compute effect/decay term
# it will be used to compute term_1 and term_3
# the (t, m, d) entry of this tensor is:
# for the m-th sequence in the batch, before the t-th event happens,
# the value of the d-th hidden unit
'''
#
# first compute the 3rd term in loss
# here mu, alpha, delta all come from self.W_hawkes : dim_model * (3*dim_process)
#
'''
when using simulation, we should feed in the following:
seq_sims_time_to_current : t - t_recent_event at each simulated time, for each seq in batch
seq_sims_index_in_hidden : index of the hidden unit
at each simulated time, so that we can extract the right h(t)
to do this, we need to make sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden from N * size_batch
to (N*size_batch, )
c) index to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in seq_sims_index_in_hidden correctly !!!
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
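The reshape-index-reshape above is a batched gather: flatten the first two axes, pick rows by flat index, then restore the leading shape. A numpy sketch with toy sizes (all values here are illustrative):

```python
import numpy

T, size_batch, dim_model = 4, 2, 3
N = 5  # number of simulated times per sequence

seq_hidden = numpy.arange(
    T * size_batch * dim_model, dtype=numpy.float32
).reshape(T, size_batch, dim_model)
# flat indices into the (T*size_batch) leading axis, one per simulated time
sims_index = numpy.array(
    [[0, 1], [2, 3], [4, 5], [6, 7], [0, 3]], dtype=numpy.int32
)

# a) flatten leading axes, b) gather rows, c)+d) reshape back
flat = seq_hidden.reshape(T * size_batch, dim_model)
gathered = flat[sims_index.flatten(), :].reshape(N, size_batch, dim_model)
```

Entry (n, b) of `gathered` is the hidden-state row whose flat index was stored in `sims_index[n, b]`, which is exactly what seq_sims_index_in_hidden must encode.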
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
# self.W_hawkes : dim_model * (3*dim_process)
#
params_hawkes_for_sims = tensor.tensordot(
seq_hidden_for_sims, self.W_hawkes, (2,0)
) + self.b_hawkes[None, None, :]
#
mu_for_sims = params_hawkes_for_sims[
:, :, :self.dim_process
]
alpha_for_sims = params_hawkes_for_sims[
:, :, self.dim_process:2*self.dim_process
]
delta_for_sims = self.soft_relu(
params_hawkes_for_sims[
:, :, 2*self.dim_process:
]
)
#
lambda_over_seq_sims_tilde = mu_for_sims + tensor.exp(
-delta_for_sims * seq_sims_time_to_current[
:, :, None
]
) * alpha_for_sims
#
# N * size_batch * dim_process
# may overflow here
lambda_over_seq_sims = self.soft_relu(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
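term_3 above is a Monte Carlo estimate of the integral of the total intensity over [0, T_end]: sum lambda over N uniformly drawn times, then scale by T_end / N. A one-sequence numpy sketch, using an assumed constant intensity so the estimate can be checked against the exact integral:

```python
import numpy

rng = numpy.random.RandomState(0)
T_end = 10.0       # time_since_start_to_end for this sequence
num_sims = 100000  # num_sims_start_to_end

def lam(t):
    # constant intensity 2.0 (a toy stand-in for lambda_sum_over_seq_sims);
    # its exact integral over [0, T_end] is 2.0 * T_end = 20
    return 2.0 + 0.0 * t

sim_times = rng.uniform(0.0, T_end, size=num_sims)
term_3 = lam(sim_times).sum() * T_end / num_sims
```

With a non-constant lambda the same estimator converges to the true integral at the usual O(1/sqrt(N)) Monte Carlo rate.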
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# same procedure as term_3, but easier
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
#
params_hawkes_for_lambda = tensor.tensordot(
seq_hidden_for_lambda, self.W_hawkes, (2,0)
) + self.b_hawkes[None, None, :]
#
mu_for_lambda = params_hawkes_for_lambda[
:, :, :self.dim_process
]
alpha_for_lambda = params_hawkes_for_lambda[
:, :, self.dim_process:2*self.dim_process
]
delta_for_lambda = self.soft_relu(
params_hawkes_for_lambda[
:, :, 2*self.dim_process:
]
)
#
lambda_over_seq_tilde = mu_for_lambda + tensor.exp(
-delta_for_lambda * seq_time_to_current[
:, :, None
]
) * alpha_for_lambda
#
# T * size_batch * dim_process
#
lambda_over_seq = self.soft_relu(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is NaN, it may be caused by underflow here
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
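Adding a small epsilon before the log and multiplying by seq_mask afterwards makes padded positions contribute exactly zero and keeps near-zero intensities from producing -inf. A numpy sketch with toy values:

```python
import numpy

# T=2, size_batch=2; the second column is padding
lambda_target = numpy.array([[0.5, 0.0], [2.0, 0.0]], dtype=numpy.float32)
seq_mask = numpy.array([[1.0, 0.0], [1.0, 0.0]], dtype=numpy.float32)

# epsilon guards log(0); the mask zeroes out padded positions afterwards
log_lambda = numpy.log(lambda_target + numpy.float32(1e-9)) * seq_mask
term_1 = log_lambda.sum(axis=0)  # per-sequence contribution, shape (size_batch,)
```

Without the mask, the padded zeros would contribute log(1e-9) ~= -20.7 each and corrupt the likelihood.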
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
#
class NeuralHawkesCTSM_time(object):
'''
this model stems from neural hawkes
but encode time (positive real values) with neural nodes
'''
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Neural Hawkes CTSM with time encoding ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
self.mu = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='mu'
)
'''
note that in these D * K matrices,
the (i, j) entry is the effect of the i-th dimension
on the j-th event type;
this order may differ from that of the Hawkes model,
so be careful when interpreting the parameters
'''
self.delta = theano.shared(
numpy.ones(
(self.dim_model, self.dim_process),
dtype=dtype
), name='delta'
)
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time+numpy.int32(1), self.dim_model
), name='Emb_time'
)
# a dim_time vector for thresholding time
self.Threshold_time = theano.shared(
numpy.float32(settings['threshold_time']),
name='Threshold_time'
)
#
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
#with open(settings['path_pre_train'], 'rb') as f:
# model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.mu = theano.shared(
model_pre_train['mu'], name='mu'
)
self.delta = theano.shared(
model_pre_train['delta'], name='delta'
)
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
#
self.Threshold_time = theano.shared(
model_pre_train['Threshold_time'], name='Threshold_time'
)
#
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
self.mu, self.delta,
self.W_alpha,
self.Emb_event, self.Emb_time, self.Threshold_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
# numerically stable softplus:
# for x >= 100, exp(x) would overflow, and log(1+exp(x)) ~= x
y = tensor.log(numpy.float32(1.0) + tensor.exp(x))
z = tensor.switch(x >= 100.0, x, y)
return z
#
#
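soft_relu is the softplus function; the switch guards against overflow in exp(x) for large x, where log(1+exp(x)) is numerically indistinguishable from x anyway. A numpy version of the same guard (the clipping inside the second branch is only there because numpy.where evaluates both branches):

```python
import numpy

def soft_relu(x):
    x = numpy.asarray(x, dtype=numpy.float64)
    # for x >= 100, return x directly; exp(x) would overflow otherwise
    return numpy.where(
        x >= 100.0,
        x,
        numpy.log1p(numpy.exp(numpy.minimum(x, 100.0)))
    )

vals = soft_relu(numpy.array([-5.0, 0.0, 500.0]))
```

A naive numpy.log(1.0 + numpy.exp(500.0)) would overflow to inf; the guarded version returns 500 exactly.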
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
#
#
def compute_loss(
self,
seq_time_to_current,
seq_type_event,
#seq_time_rep,
seq_time_values,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute log likelihood
seq_time_to_current : T * size_batch -- t_i - t_i-1
seq_type_event : (T+1) * size_batch -- k_i
seq_time_values : (T+1) * size_batch -- for each data and each time step, track the time values of event k_i
time_since_start_to_end : size_batch -- time for seq
num_sims_start_to_end : size_batch -- N for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th entry is the BOS event
entries 1 to T are regular events
regular event ids are 0, 1, 2, ..., K-1
the BOS id is K
this convention makes seq_type_event easy to index with
'''
# T * size_batch * dim_model
'''
pass time values through thresholds
'''
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
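The time encoding above turns each positive scalar dt into a small feature vector: relu(dt - threshold_k) for a vector of thresholds, plus the raw value, giving a piecewise-linear representation of time. A numpy sketch with fixed (not learned) thresholds:

```python
import numpy

def encode_time(seq_time_values, thresholds):
    # seq_time_values : T * size_batch ; thresholds : (dim_time,)
    rep = numpy.maximum(
        seq_time_values[:, :, None] - thresholds[None, None, :], 0.0
    )  # T * size_batch * dim_time, relu of (dt - threshold_k)
    # append the raw time value as one extra feature (hence dim_time + 1
    # columns in Emb_time above)
    return numpy.concatenate([rep, seq_time_values[:, :, None]], axis=2)

thresholds = numpy.array([0.5, 1.0, 2.0], dtype=numpy.float32)
times = numpy.array([[0.2, 1.5]], dtype=numpy.float32)  # T=1, size_batch=2
feat = encode_time(times, thresholds)
```

Each threshold contributes a hinge, so a linear map of these features can represent any piecewise-linear function of dt with knots at the thresholds.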
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
# This tensor is used to compute effect/decay term
# it will be used to compute term_1 and term_3
# the (t, m, d) entry of this tensor is:
# for the m-th sequence in the batch, before the t-th event happens,
# the value of the d-th hidden unit
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we should feed in the following:
seq_sims_time_to_current : t - t_recent_event at each simulated time, for each seq in batch
seq_sims_index_in_hidden : index of the hidden unit
at each simulated time, so that we can extract the right h(t)
to do this, we need to make sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden from N * size_batch
to (N*size_batch, )
c) index to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in seq_sims_index_in_hidden correctly !!!
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :, None
] * tensor.exp(
-self.delta[
None, None, :, :
] * seq_sims_time_to_current[
:, :, None, None
]
)
#
# N * size_batch * dim_model * dim_process
# self.W_alpha : dim_model * dim_process
lambda_over_seq_sims_tilde = self.mu[None, None, :] + tensor.sum(
seq_sims_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# same procedure as term_3, but easier
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None
] * tensor.exp(
-self.delta[
None, None, :, :
] * seq_time_to_current[
:, :, None, None
]
)
# T * size_batch * dim_model * dim_process
lambda_over_seq_tilde = self.mu[None, None, :] + tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# T * size_batch * dim_process
lambda_over_seq = self.soft_relu(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is NaN, it may be caused by underflow here
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
class GeneralizedNeuralHawkesCTSM_time(object):
#
'''
this model stems from generalized neural hawkes
but encode time (positive real values) with neural nodes
'''
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Generalized Neural Hawkes CTSM with time encoding ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
self.mu = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='mu'
)
'''
note that in these D * K matrices,
the (i, j) entry is the effect of the i-th dimension
on the j-th event type;
this order may differ from that of the Hawkes model,
so be careful when interpreting the parameters
'''
#self.delta = theano.shared(
# numpy.ones(
# (self.dim_model, self.dim_process),
# dtype=dtype
# ), name='delta'
#)
#
self.W_delta = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model,
self.dim_model,
self.dim_process
)
)
), name = 'W_delta'
)
# the 0-th axis -- self.dim_model
# is for dot product with hidden units
# dot(h, W_delta) --> delta of size:
# dim_model * dim_process
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time+numpy.int32(1), self.dim_model
), name='Emb_time'
)
# a dim_time vector for thresholding time
self.Threshold_time = theano.shared(
numpy.float32(settings['threshold_time']),
name='Threshold_time'
)
#
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
#with open(settings['path_pre_train'], 'rb') as f:
# model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.mu = theano.shared(
model_pre_train['mu'], name='mu'
)
#self.delta = theano.shared(
# model_pre_train['delta'], name='delta'
#)
self.W_delta = theano.shared(
model_pre_train['W_delta'], name='W_delta'
)
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
#
self.Threshold_time = theano.shared(
model_pre_train['Threshold_time'], name='Threshold_time'
)
#
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta, i-row j-col is the effect of j to i
#
self.params = [
self.mu, #self.delta,
self.W_delta, self.W_alpha,
self.Emb_event, self.Emb_time,
self.Threshold_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
# numerically stable softplus:
# for x >= 100, exp(x) would overflow, and log(1+exp(x)) ~= x
y = tensor.log(numpy.float32(1.0) + tensor.exp(x))
z = tensor.switch(x >= 100.0, x, y)
return z
#
#
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
#
#
def compute_loss(
self,
seq_time_to_current,
seq_type_event, #seq_time_rep,
seq_time_values,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute log likelihood
seq_time_to_current : T * size_batch -- t_i - t_i-1
seq_type_event : (T+1) * size_batch -- k_i
seq_time_values : (T+1) * size_batch --
for each data and each time step, track the time value of event k_i
time_since_start_to_end : size_batch -- time for seq
num_sims_start_to_end : size_batch -- N for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th entry is the BOS event
entries 1 to T are regular events
regular event ids are 0, 1, 2, ..., K-1
the BOS id is K
this convention makes seq_type_event easy to index with
'''
# T * size_batch * dim_model
'''
pass time values through thresholds
'''
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
# This tensor is used to compute effect/decay term
# it will be used to compute term_1 and term_3
# the (t, m, d) entry of this tensor is:
# for the m-th sequence in the batch, before the t-th event happens,
# the value of the d-th hidden unit
'''
#
# first compute the 3rd term in loss
# self.W_delta : dim_model * dim_model * dim_process
#
'''
when using simulation, we should feed in the following:
seq_sims_time_to_current : t - t_recent_event at each simulated time, for each seq in batch
seq_sims_index_in_hidden : index of the hidden unit
at each simulated time, so that we can extract the right h(t)
to do this, we need to make sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden from N * size_batch
to (N*size_batch, )
c) index to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in seq_sims_index_in_hidden correctly !!!
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
# self.W_delta : dim_model * dim_model * dim_process
#
delta_for_sims = self.soft_relu(
tensor.tensordot(
seq_hidden_for_sims, self.W_delta, (2,0)
)
)
#
# N * size_batch * dim_model * dim_process
#
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :, None
] * tensor.exp(
-delta_for_sims * seq_sims_time_to_current[
:, :, None, None
]
)
#
# N * size_batch * dim_model * dim_process
# self.W_alpha : dim_model * dim_process
lambda_over_seq_sims_tilde = self.mu[None, None, :] + tensor.sum(
seq_sims_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# same procedure as term_3, but easier,
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
#
#
delta_for_lambda = self.soft_relu(
tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
)
# T * size_batch * dim_model * dim_process
#
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None
] * tensor.exp(
-delta_for_lambda * seq_time_to_current[
:, :, None, None
]
)
# T * size_batch * dim_model * dim_process
lambda_over_seq_tilde = self.mu[None, None, :] + tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# T * size_batch * dim_process
lambda_over_seq = self.soft_relu(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is a NaN,
# it may also be caused by underflow here
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
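# term_3 above is a Monte Carlo estimate of the integral of the total
# intensity: sum the intensities at the simulated times, then scale by
# time_since_start_to_end / num_sims_start_to_end. a toy NumPy check of that
# estimator, with an assumed linear intensity standing in for the network's lambda:

```python
import numpy as np

# Monte Carlo estimate of the integral term (term_3):
# int_0^T lambda(t) dt  ~=  sum_j lambda(s_j) * T / N,  with s_j ~ Uniform(0, T)
rng = np.random.default_rng(1)
T_end = 8.0                          # time_since_start_to_end for one sequence
N = 200000                           # num_sims_start_to_end
lam = lambda t: 1.0 + 0.5 * t        # toy intensity (assumed, for this check only)
sims = rng.uniform(0.0, T_end, size=N)
term_3 = lam(sims).sum() * T_end / N
# exact integral: 1.0 * T + 0.5 * T**2 / 2 = 24.0
```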
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
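# save_model pickles a dict of parameter arrays plus the three dimension
# settings, and the pre-train branch of __init__ reads the same dict back.
# a minimal round-trip sketch of that format (the keys below are a subset,
# chosen for illustration; the real dict holds every param in self.params):

```python
import io
import pickle
import numpy as np

# build a parameter dict the way save_model does
model_dict = {
    'W_alpha': np.ones((4, 3), dtype=np.float32),
    'dim_process': 3, 'dim_time': 8, 'dim_model': 4,
}
# round-trip through an in-memory buffer instead of a file path
buf = io.BytesIO()
pickle.dump(model_dict, buf)
buf.seek(0)
restored = pickle.load(buf)
```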
class NeuralHawkesAdaptiveBaseCTSM_time(object):
#TODO: the base rate is adaptive
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Generalized Neural Hawkes with Adaptive Base Rate CTSM ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
#self.mu = theano.shared(
# numpy.ones(
# (self.dim_process,), dtype=dtype
# ), name='mu'
#)
'''
note that in these matrices of D * K,
the (i, j) entry is the effect of the i-th dimension
on the j-th event type
this order may differ from that of the Hawkes model,
so we need to be careful when interpreting them
'''
#self.delta = theano.shared(
# numpy.ones(
# (self.dim_model, self.dim_process),
# dtype=dtype
# ), name='delta'
#)
#
self.W_mu = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model, self.dim_process
)
)
), name = 'W_mu'
)
#
#
self.W_delta = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model,
self.dim_model,
self.dim_process
)
)
), name = 'W_delta'
)
# the 0-th axis -- self.dim_model
# is for dot product with hidden units
# dot(h, W_delta) --> delta of size:
# dim_model * dim_process
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time+numpy.int32(1), self.dim_model
), name='Emb_time'
)
# a dim_time vector for thresholding time
self.Threshold_time = theano.shared(
numpy.float32(settings['threshold_time']),
name='Threshold_time'
)
#
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.W_mu = theano.shared(
model_pre_train['W_mu'], name='W_mu'
)
#self.delta = theano.shared(
# model_pre_train['delta'], name='delta'
#)
self.W_delta = theano.shared(
model_pre_train['W_delta'], name='W_delta'
)
#print "W_delta is : "
#print model_pre_train['W_delta']
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
#
self.Threshold_time = theano.shared(
model_pre_train['Threshold_time'], name='Threshold_time'
)
#
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta : the (i, j) entry is the effect of j on i
#
self.params = [
#self.mu, #self.delta,
self.W_mu, self.W_delta, self.W_alpha,
self.Emb_event, self.Emb_time,
self.Threshold_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
# numerically stable softplus : log(1 + exp(x)),
# switching to the identity for large x to avoid overflow
y = tensor.log(numpy.float32(1.0)+tensor.exp(x) )
z = tensor.switch(x>=100.0, x, y)
return z
#
#
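# soft_relu is a numerically stabilized softplus: it computes log(1 + exp(x))
# but switches to the identity once x >= 100, where exp(x) would overflow in
# float32. a NumPy sketch of the same guard:

```python
import numpy as np

def soft_relu_np(x, cutoff=100.0):
    # softplus log(1 + exp(x)); above the cutoff, exp(x) would overflow,
    # so fall back to the identity (softplus(x) ~= x there anyway)
    x = np.asarray(x, dtype=np.float64)
    # clip the exp argument so the unused branch of np.where cannot overflow
    return np.where(x >= cutoff, x, np.log1p(np.exp(np.minimum(x, cutoff))))
```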
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
#
#
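# rnn_unit is a standard LSTM step whose input is the concatenation
# [emb_event_t, emb_time_t, hidden_tm1] (hence W_recur is 3*dim_model by
# 4*dim_model), with the four gates read off as contiguous dim_model slices
# of the affine output. a NumPy sketch of the same slicing, with toy shapes:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_step(emb_event_t, emb_time_t, h_tm1, c_tm1, W_recur, b_recur):
    d = h_tm1.shape[1]                                              # dim_model
    pre = np.concatenate([emb_event_t, emb_time_t, h_tm1], axis=1)  # size_batch x 3d
    post = pre.dot(W_recur) + b_recur                               # size_batch x 4d
    i = sigmoid(post[:, :d])          # input gate
    f = sigmoid(post[:, d:2*d])       # forget gate
    o = sigmoid(post[:, 2*d:3*d])     # output gate
    g = np.tanh(post[:, 3*d:])        # candidate cell
    c_t = f * c_tm1 + i * g
    h_t = o * np.tanh(c_t)
    return h_t, c_t
```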
def compute_loss(
self,
seq_time_to_current,
seq_type_event, #seq_time_rep,
seq_time_values,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute the log-likelihood
seq_time_to_current : T * size_batch -- t_i - t_{i-1}
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each sequence and each time step, the time features of event k_i
time_since_start_to_end : size_batch -- total time of each seq
num_sims_start_to_end : size_batch -- N for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th entry is the BOS event,
entries 1 to T are regular events
regular event ids are 0, 1, 2, ..., K-1
and the BOS id is K;
this convention makes seq_type_event easier to use
'''
# T * size_batch * dim_model
'''
pass time values through thresholds
'''
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
this tensor is used to compute the effect/decay term
and it is used in both term_1 and term_3
the (t, m, d) entry of this tensor is :
for the m-th sequence in the batch, just before the t-th event happens,
the value of the d-th dimension of the hidden state
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we need to feed in the following:
seq_sims_time_to_current : time t - t_recent_event at each simulation time, for each seq in batch
seq_sims_index_in_hidden : index of the hidden state
at each simulation time, so that we can extract the right h(t)
to do this, we need to make sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden from N * size_batch
to (N*size_batch, )
c) index to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in seq_sims_index_in_hidden correctly !
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
# self.W_delta : dim_model * dim_model * dim_process
#
delta_for_sims = self.soft_relu(
tensor.tensordot(
seq_hidden_for_sims, self.W_delta, (2,0)
)
)
#
# N * size_batch * dim_model * dim_process
#
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :, None
] * tensor.exp(
-delta_for_sims * seq_sims_time_to_current[
:, :, None, None
]
)
#
# N * size_batch * dim_model * dim_process
# self.W_alpha : dim_model * dim_process
mu_for_sims = tensor.tensordot(
seq_hidden_for_sims, self.W_mu, (2,0)
)
# N * size_batch * dim_process
#
lambda_over_seq_sims_tilde = mu_for_sims + tensor.sum(
seq_sims_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# same procedure as term_3, but easier,
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
#
#
delta_for_lambda = self.soft_relu(
tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
)
# T * size_batch * dim_model * dim_process
#
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None
] * tensor.exp(
-delta_for_lambda * seq_time_to_current[
:, :, None, None
]
)
# T * size_batch * dim_model * dim_process
#
mu_for_lambda = tensor.tensordot(
seq_hidden_for_lambda, self.W_mu, (2,0)
)
# T * size_batch * dim_process
#
lambda_over_seq_tilde = mu_for_lambda + tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# T * size_batch * dim_process
lambda_over_seq = self.soft_relu(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is a NaN,
# it may also be caused by underflow here
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
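# compute_loss extracts h(t) at the simulated times with the
# reshape / fancy-index / reshape steps a)-d) described in its docstring.
# the same gather in NumPy, with small assumed shapes:

```python
import numpy as np

T, B, D, N = 4, 2, 3, 5                      # T, size_batch, dim_model, num sims
rng = np.random.default_rng(0)
seq_hidden = rng.normal(size=(T, B, D))      # T x size_batch x dim_model
# flat index (in 0 .. T*B-1) of the hidden state active at each simulated time
sims_index = rng.integers(0, T * B, size=(N, B))

# a) collapse to (T*B) x D, b) flatten the indices,
# c) gather rows, d) restore N x B x D
gathered = seq_hidden.reshape(T * B, D)[sims_index.flatten(), :].reshape(N, B, D)
```

each gathered[n, b] is the row of the flattened hidden tensor selected by sims_index[n, b].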
def compute_prediction(
self,
seq_type_event, #seq_time_rep,
seq_time_values,
seq_mask,
time_diffs
):
'''
use this function to compute predictions
seq_type_event : (T+1) * size_batch -- k_i
seq_time_values : (T+1) * size_batch -- t_i - t_{i-1}
seq_mask : T * size_batch -- 1/0
time_diffs : a vector of sampled time offsets,
shared across all items in the batch
'''
print "computing predictions ... "
seq_emb_event = self.Emb_event[seq_type_event, :]
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
# seq_hidden : (T+1) * size_batch * dim_model
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# seq_hidden_for_lambda :
# T * size_batch * dim_model
delta_for_lambda_pre = tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
#
delta_for_lambda = self.soft_relu(
delta_for_lambda_pre
)
# T * size_batch * dim_model * dim_process
# time_diffs : a vector of length M
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None, None
] * tensor.exp(
-delta_for_lambda[
:, :, :, :, None
] * time_diffs[
None, None, None, None, :
]
)
# T * size_batch * dim_model * dim_process * M
mu_for_lambda = tensor.tensordot(
seq_hidden_for_lambda, self.W_mu, (2,0)
)
# T * size_batch * dim_process
lambda_over_seq_tilde = tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :, None
], axis = 2
) + mu_for_lambda[:, :, :, None]
# T * size_batch * dim_process * M
# each time stamp, each seq in batch
# each process, each simulation for prediction
lambda_over_seq = self.soft_relu(
lambda_over_seq_tilde
)
#
# T * size_batch * dim_process * M
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis=2
)
# T * size_batch * M
cum_num = tensor.arange(
time_diffs.shape[0]+numpy.int32(1)
)[1:] * numpy.float32(1.0)
# M
term_1 = time_diffs
# M *
term_2 = tensor.exp(
-1.0 * tensor.extra_ops.cumsum(
lambda_sum_over_seq, axis = 2
) / cum_num[None, None, :]
)
# T * size_batch * M
term_3 = lambda_sum_over_seq
# T * size_batch * M
density = term_2 * term_3
time_prediction = tensor.mean(
term_1[None, None, :] * density,
axis = 2
)
# T * size_batch
lambda_over_seq_over_sims = lambda_over_seq[
:, :, :, :
] * density[
:, :, None, :
] / lambda_sum_over_seq[
:, :, None, :
]
# T * size_batch * dim_process * M
prob_over_seq_over_type = tensor.mean(
lambda_over_seq_over_sims, axis = 3
)
# T * size_batch * dim_process
type_prediction = tensor.argmax(
prob_over_seq_over_type, axis = 2
)
# T * size_batch
# Now we have :
# time_prediction, type_prediction, seq_mask
# all of -- T * size_batch
target_type = seq_type_event[1:, :]
target_time = seq_time_values[1:, :]
# Type first
diff_type = tensor.abs_(
target_type - type_prediction
) * seq_mask
diff_type = tensor.switch(
diff_type >= numpy.float32(0.5),
numpy.float32(1.0), numpy.float32(0.0)
)
self.num_of_errors = tensor.sum(diff_type)
# Time
diff_time = (
target_time - time_prediction
)**2
diff_time *= seq_mask
self.square_errors = tensor.sum(diff_time)
#
self.num_of_events = tensor.sum(seq_mask)
#
#
#TODO: for debug
#self.time_prediction = time_prediction
#self.target_time = target_time
#self.type_prediction = type_prediction
#self.target_type = target_type
#
#self.seq_hidden = seq_hidden_for_lambda[-1,0,:]
#self.intensity = lambda_over_seq
#self.cum_num = cum_num
#self.density = density
#self.seq_delta_pre = delta_for_lambda_pre[-1,0,:]
#self.seq_delta_pre_check = tensor.dot(
# self.seq_hidden, self.W_delta
#)
#self.seq_delta = delta_for_lambda
#self.lambda_tilde = lambda_over_seq_tilde
#
#
#
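# compute_prediction weights each candidate time diff by
# density = lambda_sum * exp(-cumulative integral of lambda_sum) and averages
# t * density to predict the next event time. a toy quadrature check of the
# underlying identity E[t] = int t p(t) dt, assuming a constant total
# intensity (so p(t) is exponential with mean 1/lambda):

```python
import numpy as np

lam = 2.0                                 # constant total intensity (toy case)
dt = 1e-3
t = np.arange(dt, 10.0, dt)               # grid of candidate inter-event times
density = lam * np.exp(-lam * t)          # p(t) = lambda * exp(-lambda * t)
t_pred = np.sum(t * density) * dt         # quadrature of E[t] = int t p(t) dt
# exact mean of an Exp(lambda) waiting time is 1 / lambda = 0.5
```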
def get_model(self):
print "getting model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
return model_dict
#
#
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
#
class NeuralHawkesAdaptiveBaseCTSM_time_scale(object):
#TODO: the base rate is adaptive,
# it uses a neural time encoder,
# and it uses a scale parameter s_k
# to adjust the curvature of soft_relu
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Generalized Neural Hawkes with Adaptive Base Rate CTSM with neural time encoder and scale s_k ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
#self.mu = theano.shared(
# numpy.ones(
# (self.dim_process,), dtype=dtype
# ), name='mu'
#)
'''
note that in these matrices of D * K,
the (i, j) entry is the effect of the i-th dimension
on the j-th event type
this order may differ from that of the Hawkes model,
so we need to be careful when interpreting them
'''
#self.delta = theano.shared(
# numpy.ones(
# (self.dim_model, self.dim_process),
# dtype=dtype
# ), name='delta'
#)
self.scale = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='scale'
)
#
self.W_mu = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model, self.dim_process
)
)
), name = 'W_mu'
)
#
#
self.W_delta = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model,
self.dim_model,
self.dim_process
)
)
), name = 'W_delta'
)
# the 0-th axis -- self.dim_model
# is for dot product with hidden units
# dot(h, W_delta) --> delta of size:
# dim_model * dim_process
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time+numpy.int32(1), self.dim_model
), name='Emb_time'
)
# a dim_time vector for thresholding time
self.Threshold_time = theano.shared(
numpy.float32(settings['threshold_time']),
name='Threshold_time'
)
#
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.scale = theano.shared(
model_pre_train['scale'], name='scale'
)
#
self.W_mu = theano.shared(
model_pre_train['W_mu'], name='W_mu'
)
#self.delta = theano.shared(
# model_pre_train['delta'], name='delta'
#)
#
self.W_delta = theano.shared(
model_pre_train['W_delta'], name='W_delta'
)
#print "W_delta is : "
#print model_pre_train['W_delta']
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
#
self.Threshold_time = theano.shared(
model_pre_train['Threshold_time'], name='Threshold_time'
)
#
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta : the (i, j) entry is the effect of j on i
#
self.params = [
#self.mu, #self.delta,
self.scale, # scale parameter
self.W_mu, self.W_delta, self.W_alpha,
self.Emb_event, self.Emb_time,
self.Threshold_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
#
#
def soft_relu(self, x):
# x is a symbolic tensor
# numerically stable softplus : log(1 + exp(x)),
# switching to the identity for large x to avoid overflow
y = tensor.log(numpy.float32(1.0)+tensor.exp(x) )
z = tensor.switch(x>=100.0, x, y)
return z
#
#
def soft_relu_scale(self, x):
# x is a symbolic tensor whose last dim is dim_process --
# this matters, since self.scale has one entry per process
# compute s_k * softplus(x / s_k)
x /= self.scale
y = tensor.log(numpy.float32(1.0)+tensor.exp(x) )
z = tensor.switch(x>=100.0, x, y)
z *= self.scale
return z
#
#
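# soft_relu_scale computes s_k * softplus(x / s_k) per process: dividing by
# the learned scale before the softplus and multiplying after changes the
# curvature of the transfer function, and as s_k shrinks the curve approaches
# a hard ReLU. a NumPy sketch with the same overflow guard:

```python
import numpy as np

def soft_relu_scale_np(x, s):
    # scaled softplus: s * log(1 + exp(x / s));
    # smaller s pushes the curve toward max(x, 0)
    x = np.asarray(x, dtype=np.float64) / s
    # identity branch for large arguments, as in the symbolic version
    y = np.where(x >= 100.0, x, np.log1p(np.exp(np.minimum(x, 100.0))))
    return y * s
```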
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
#
#
def compute_loss(
self,
seq_time_to_current,
seq_type_event, #seq_time_rep,
seq_time_values,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute the log-likelihood
seq_time_to_current : T * size_batch -- t_i - t_{i-1}
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each sequence and each time step, the time features of event k_i
time_since_start_to_end : size_batch -- total time of each seq
num_sims_start_to_end : size_batch -- N for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th entry is the BOS event,
entries 1 to T are regular events
regular event ids are 0, 1, 2, ..., K-1
and the BOS id is K;
this convention makes seq_type_event easier to use
'''
# T * size_batch * dim_model
'''
pass time values through thresholds
'''
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
this tensor is used to compute the effect/decay term
and it is used in both term_1 and term_3
the (t, m, d) entry of this tensor is :
for the m-th sequence in the batch, just before the t-th event happens,
the value of the d-th dimension of the hidden state
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we need to feed in the following:
seq_sims_time_to_current : time t - t_recent_event at each simulation time, for each seq in batch
seq_sims_index_in_hidden : index of the hidden state
at each simulation time, so that we can extract the right h(t)
to do this, we need to make sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden from N * size_batch
to (N*size_batch, )
c) index to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in seq_sims_index_in_hidden correctly !
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
# self.W_delta : dim_model * dim_model * dim_process
#
delta_for_sims = self.soft_relu(
tensor.tensordot(
seq_hidden_for_sims, self.W_delta, (2,0)
)
)
#
# N * size_batch * dim_model * dim_process
#
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :, None
] * tensor.exp(
-delta_for_sims * seq_sims_time_to_current[
:, :, None, None
]
)
#
# N * size_batch * dim_model * dim_process
# self.W_alpha : dim_model * dim_process
mu_for_sims = tensor.tensordot(
seq_hidden_for_sims, self.W_mu, (2,0)
)
# N * size_batch * dim_process
#
lambda_over_seq_sims_tilde = mu_for_sims + tensor.sum(
seq_sims_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu_scale(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# same procedure as term_3, but easier,
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
#
#
delta_for_lambda = self.soft_relu(
tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
)
# T * size_batch * dim_model * dim_process
#
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None
] * tensor.exp(
-delta_for_lambda * seq_time_to_current[
:, :, None, None
]
)
# T * size_batch * dim_model * dim_process
#
mu_for_lambda = tensor.tensordot(
seq_hidden_for_lambda, self.W_mu, (2,0)
)
# T * size_batch * dim_process
#
lambda_over_seq_tilde = mu_for_lambda + tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :
],
axis = 2
)
# T * size_batch * dim_process
lambda_over_seq = self.soft_relu_scale(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if there is a NaN,
# it may also be caused by underflow here
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
#
def compute_prediction_loss(
self,
seq_type_event, #seq_time_rep,
seq_time_values,
seq_mask,
time_diffs
):
#
print "computing predictions loss ... "
seq_emb_event = self.Emb_event[seq_type_event, :]
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
# seq_hidden : (T+1) * size_batch * dim_model
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# seq_hidden_for_lambda :
# T * size_batch * dim_model
delta_for_lambda_pre = tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
#
delta_for_lambda = self.soft_relu(
delta_for_lambda_pre
)
# T * size_batch * dim_model * dim_process
# time_diffs : a vector of length M
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None, None
] * tensor.exp(
-delta_for_lambda[
:, :, :, :, None
] * time_diffs[
None, None, None, None, :
]
)
# T * size_batch * dim_model * dim_process * M
mu_for_lambda = tensor.tensordot(
seq_hidden_for_lambda, self.W_mu, (2,0)
)
# T * size_batch * dim_process
lambda_over_seq_tilde = tensor.sum(
seq_hidden_with_time * self.W_alpha[
None, None, :, :, None
], axis = 2
) + mu_for_lambda[:, :, :, None]
# T * size_batch * dim_process * M
# each time stamp, each seq in batch
# each process, each simulation for prediction
lambda_over_seq = self.soft_relu_scale(
lambda_over_seq_tilde.dimshuffle(3,0,1,2)
).dimshuffle(1,2,3,0)
#
# T * size_batch * dim_process * M
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis=2
)
# T * size_batch * M
term_1 = time_diffs
# M *
#
#
cum_num = tensor.arange(
time_diffs.shape[0]+numpy.int32(1)
)[1:] * numpy.float32(1.0)
# M
term_2 = tensor.exp(
(
-1.0 * tensor.extra_ops.cumsum(
lambda_sum_over_seq, axis = 2
) / cum_num[None, None, :]
) * time_diffs[
None, None, :
]
)
#
# T * size_batch * M
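term_2 above approximates the survival probability P(no event in (0, t]) = exp(-integral of lambda_sum over (0, t]) on a grid of candidate times: cumsum divided by cum_num is the running average of lambda_sum over the grid points up to t, and multiplying by t turns that average into an integral estimate. A small numpy sketch of the estimator for a single sequence (illustrative names, constant intensity so the result has a closed form):

```python
import numpy as np

# M candidate times on a grid and the total intensity at each
time_diffs = np.linspace(0.1, 2.0, 20).astype(np.float32)   # (M,)
lam_sum = np.full_like(time_diffs, 1.5)                     # constant intensity

cum_num = np.arange(1, time_diffs.shape[0] + 1, dtype=np.float32)
# running mean of lam_sum up to each grid point, times t : ~ integral of lam
integral_est = (np.cumsum(lam_sum) / cum_num) * time_diffs
survival = np.exp(-integral_est)                            # P(no event by t)

# for constant intensity the estimate is exact : exp(-lam * t)
assert np.allclose(survival, np.exp(-1.5 * time_diffs))
```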
term_3 = lambda_sum_over_seq
# T * size_batch * M
density = term_2 * term_3
# T * size_batch * M
time_prediction = tensor.mean(
term_1[None, None, :] * density,
axis = 2
) * time_diffs[-1]
# T * size_batch
lambda_over_seq_over_sims = lambda_over_seq[
:, :, :, :
] * density[
:, :, None, :
] / lambda_sum_over_seq[
:, :, None, :
]
# T * size_batch * dim_process * M
prob_over_seq_over_type = tensor.mean(
lambda_over_seq_over_sims, axis = 3
) * time_diffs[-1]
# T * size_batch * dim_process
prob_over_seq_over_type /= tensor.sum(
prob_over_seq_over_type,
axis=2,
keepdims=True
)
# T * size_batch * dim_process
# time_prediction and seq_mask : T * size_batch
# type_prediction is computed further below
# from prob_over_seq_over_type
target_type = seq_type_event[1:, :]
target_time = seq_time_values[1:, :]
# Type first
new_shape_0 = target_type.shape[0] * target_type.shape[1]
new_shape_1 = self.dim_process
back_shape_0 = target_type.shape[0]
back_shape_1 = target_type.shape[1]
#
prob_over_seq = prob_over_seq_over_type.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
target_type.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
log_prob_over_seq = tensor.log(
prob_over_seq + numpy.float32(1e-9)
)
log_prob_over_seq *= seq_mask
self.log_likelihood_type_predict = tensor.sum(
log_prob_over_seq
)
# Time
diff_time = (
target_time - time_prediction
)**2
diff_time *= seq_mask
self.square_errors = tensor.sum(diff_time)
self.num_of_events = tensor.sum(seq_mask)
#TODO: Hamming loss for prediction checking
#
type_prediction = tensor.argmax(
prob_over_seq_over_type, axis = 2
)
diff_type = tensor.abs_(
target_type - type_prediction
) * seq_mask
diff_type = tensor.switch(
diff_type >= numpy.float32(0.5),
numpy.float32(1.0), numpy.float32(0.0)
)
self.num_of_errors = tensor.sum(diff_type)
#
self.cost_to_optimize = (
-self.log_likelihood_type_predict / self.num_of_events
+ self.square_errors / self.num_of_events
+ self.term_reg
)
#self.cost_to_optimize = -self.log_likelihood_type_predict + self.term_reg
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
self.abs_grad_params = 0.0
for grad_param in self.grad_params:
self.abs_grad_params += tensor.sum(
tensor.abs_(
grad_param
)
)
#
#
#
#TODO: memory efficient version of prediction loss
def predict_each_step(
self, hidden_for_lambda, time_diffs
):
# hidden_for_lambda : size_batch * dim_model
# time_diffs : M
delta_for_lambda = self.soft_relu(
tensor.tensordot(
hidden_for_lambda, self.W_delta, (1,0)
)
)
# delta_for_lambda : size_batch * dim_model * dim_process
hidden_with_time = hidden_for_lambda[
:, :, None, None
] * tensor.exp(
-delta_for_lambda[
:, :, :, None
] * time_diffs[
None, None, None, :
]
)
# hidden_with_time : size_batch * dim_model * dim_process * M
mu_for_lambda = tensor.tensordot(
hidden_for_lambda, self.W_mu, (1,0)
)
# mu_for_lambda : size_batch * dim_process
lambda_tilde = tensor.sum(
hidden_with_time * self.W_alpha[
None, :, : , None
], axis = 1
) + mu_for_lambda[:, :, None]
# size_batch * dim_process * M
lambda_each_step = self.soft_relu_scale(
lambda_tilde.dimshuffle(2,0,1)
).dimshuffle(1,2,0)
# size_batch * dim_process * M
lambda_sum_each_step = tensor.sum(
lambda_each_step, axis=1
)
# size_batch * M
#TODO: compute integral
term_1 = time_diffs
cum_num = tensor.arange(
time_diffs.shape[0]+numpy.int32(1)
)[1:] * numpy.float32(1.0)
# M
term_2 = tensor.exp(
(
-1.0 * tensor.extra_ops.cumsum(
lambda_sum_each_step, axis=1
) / cum_num[None, :]
) * time_diffs[None, :]
)
# size_batch * M
term_3 = lambda_sum_each_step
density = term_2 * term_3
# size_batch * M
time_prediction_each_step = tensor.mean(
term_1[None, :] * density, axis=1
) * time_diffs[-1]
# size_batch
lambda_each_step_over_sims = lambda_each_step[
:, :, :
] * density[
:, None, :
] / lambda_sum_each_step[
:, None, :
]
# size_batch * dim_process * M
prob_over_type = tensor.mean(
lambda_each_step_over_sims, axis=2
) * time_diffs[-1]
# size_batch * dim_process
prob_over_type /= tensor.sum(
prob_over_type, axis=1, keepdims=True
)
# size_batch * dim_process
return prob_over_type, time_prediction_each_step
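predict_each_step forms the density p(t) = lambda_sum(t) * exp(-integral of lambda_sum) on the grid and predicts the next event time as E[t] ~ mean(t * p(t)) * t_max, i.e. a uniform-grid estimate of the expectation integral. A hedged numpy sketch for one sequence with constant intensity, where E[t] has the closed form 1/lambda (names here are illustrative):

```python
import numpy as np

lam = 2.0
t_max = 10.0
M = 200000
t = np.linspace(t_max / M, t_max, M)     # uniform grid of candidate times

density = lam * np.exp(-lam * t)         # p(t) for a constant-rate process
# E[t] = integral of t * p(t) dt ~ mean over the grid times interval length
t_pred = np.mean(t * density) * t_max

assert abs(t_pred - 1.0 / lam) < 1e-3    # expectation of Exp(2) is 0.5
```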
#
def compute_prediction_loss_lessmem(
self,
seq_type_event, #seq_time_rep,
seq_time_values,
seq_mask,
time_diffs
):
#
print "computing predictions loss ... "
print "memory efficient version ... "
seq_emb_event = self.Emb_event[seq_type_event, :]
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#TODO: get sequence of hidden units
# seq_hidden : (T+1) * size_batch * dim_model
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# seq_hidden_for_lambda :
# T * size_batch * dim_model
#TODO: predict time and type for each step
[prob_over_seq_over_type, time_prediction], _ = theano.scan(
fn = self.predict_each_step,
sequences = dict(
input=seq_hidden_for_lambda, taps=[0]
),
outputs_info = [
None, None
],
non_sequences = time_diffs
)
#
target_type = seq_type_event[1:, :]
target_time = seq_time_values[1:, :]
# Type first
new_shape_0 = target_type.shape[0] * target_type.shape[1]
new_shape_1 = self.dim_process
back_shape_0 = target_type.shape[0]
back_shape_1 = target_type.shape[1]
#
prob_over_seq = prob_over_seq_over_type.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
target_type.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
log_prob_over_seq = tensor.log(
prob_over_seq + numpy.float32(1e-9)
)
log_prob_over_seq *= seq_mask
self.log_likelihood_type_predict = tensor.sum(
log_prob_over_seq
)
#
# Time
diff_time = (
target_time - time_prediction
)**2
diff_time *= seq_mask
self.square_errors = tensor.sum(diff_time)
self.num_of_events = tensor.sum(seq_mask)
#TODO: Hamming loss for prediction checking
#
type_prediction = tensor.argmax(
prob_over_seq_over_type, axis = 2
)
diff_type = tensor.abs_(
target_type - type_prediction
) * seq_mask
diff_type = tensor.switch(
diff_type >= numpy.float32(0.5),
numpy.float32(1.0), numpy.float32(0.0)
)
self.num_of_errors = tensor.sum(diff_type)
#
self.cost_to_optimize = (
-self.log_likelihood_type_predict / self.num_of_events
+ self.square_errors / self.num_of_events
+ self.term_reg
)
#self.cost_to_optimize = -self.log_likelihood_type_predict + self.term_reg
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
self.abs_grad_params = 0.0
for grad_param in self.grad_params:
self.abs_grad_params += tensor.sum(
tensor.abs_(
grad_param
)
)
#
#
#
#
#
def get_model(self):
print "getting model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
return model_dict
#
#
def save_model(self, file_save):
print "saving model ... "
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
#
#
#
#
class NeuralHawkesAdaptiveBaseCTSM_time_scale_r(object):
#TODO: the base rate is adaptive
# and it uses neural time encoder
# and it uses scale parameter s_k
# to adjust the soft_relu's curvature
#
# r means reduced version :
# delta param is D * D, not D * D * K
#
def __init__(self, settings):
self.size_batch = settings['size_batch']
self.coef_l2 = settings['coef_l2']
#
#
print "initializing Generalized Neural Hawkes with Adaptive Base Rate CTSM with neural time encoder and scale s_k ... "
if settings['path_pre_train'] is None:
self.dim_process = settings['dim_process']
self.dim_time = settings['dim_time']
# the dimension of time representations
self.dim_model = settings['dim_model']
# initialize variables
self.scale = theano.shared(
numpy.ones(
(self.dim_process,), dtype=dtype
), name='scale'
)
#
self.W_mu = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model, self.dim_process
)
)
), name = 'W_mu'
)
#
self.W_delta = theano.shared(
numpy.float32(
numpy.random.normal(
loc = 0.0, scale = 0.1,
size = (
self.dim_model,
self.dim_model
)
)
), name = 'W_delta'
)
#
# the 0-th axis -- self.dim_model
# is for dot product with hidden units
# dot(h, W_delta) --> delta of size:
# dim_model * dim_process
#
self.W_alpha = theano.shared(
utils.sample_weights(
self.dim_model, self.dim_process
), name='W_alpha'
)
# + 1 because there is a special BOS event
self.Emb_event = theano.shared(
utils.sample_weights(
self.dim_process+numpy.int32(1), self.dim_model
), name='Emb_event'
)
self.Emb_time = theano.shared(
utils.sample_weights(
self.dim_time+numpy.int32(1), self.dim_model
), name='Emb_time'
)
# a dim_time vector for thresholding time
self.Threshold_time = theano.shared(
numpy.float32(settings['threshold_time']),
name='Threshold_time'
)
#
self.W_recur = theano.shared(
utils.sample_weights(
3*self.dim_model, 4*self.dim_model
), name='W_recur'
)
self.b_recur = theano.shared(
numpy.zeros(
(4*self.dim_model,), dtype=dtype
), name='b_recur'
)
#
else:
path_pre_train = os.path.abspath(
settings['path_pre_train']
)
with open(path_pre_train, 'rb') as f:
model_pre_train = pickle.load(f)
self.dim_process = model_pre_train['dim_process']
self.dim_model = model_pre_train['dim_model']
self.dim_time = model_pre_train['dim_time']
#
self.scale = theano.shared(
model_pre_train['scale'], name='scale'
)
#
self.W_mu = theano.shared(
model_pre_train['W_mu'], name='W_mu'
)
#
self.W_delta = theano.shared(
model_pre_train['W_delta'], name='W_delta'
)
self.W_alpha = theano.shared(
model_pre_train['W_alpha'], name='W_alpha'
)
self.Emb_event = theano.shared(
model_pre_train['Emb_event'], name='Emb_event'
)
self.Emb_time = theano.shared(
model_pre_train['Emb_time'], name='Emb_time'
)
#
self.Threshold_time = theano.shared(
model_pre_train['Threshold_time'], name='Threshold_time'
)
#
self.W_recur = theano.shared(
model_pre_train['W_recur'], name='W_recur'
)
self.b_recur = theano.shared(
model_pre_train['b_recur'], name='b_recur'
)
#
self.h_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='h_0'
)
self.c_0 = theano.shared(
numpy.zeros(
(self.dim_model, ), dtype=dtype
), name='c_0'
)
self.expand = theano.shared(
numpy.ones(
(self.size_batch, ), dtype=dtype
), name='expand'
)
# alpha & delta : row i, col j is the effect of type j on type i
#
self.params = [
#self.mu, #self.delta,
self.scale, # scale parameter
self.W_mu, self.W_delta, self.W_alpha,
self.Emb_event, self.Emb_time,
self.Threshold_time,
self.W_recur, self.b_recur
#self.h_0, self.c_0
]
self.grad_params = None
self.cost_to_optimize = None
#
#
self.log_likelihood_seq = None
self.log_likelihood_type = None
self.log_likelihood_time = None
#
self.norm_l2 = numpy.float32(0.0)
for param in self.params:
self.norm_l2 += tensor.sum( param ** 2 )
self.term_reg = self.coef_l2 * self.norm_l2
#
# for intensity eval
#self.lambda_sum_over_seq = None
self.lambda_samples = None
self.num_of_samples = None
#
#
def soft_relu(self, x):
# numerically stable softplus : log(1 + exp(x))
# for large x, exp(x) would overflow,
# so fall back to the identity there
y = tensor.log(numpy.float32(1.0) + tensor.exp(x))
z = tensor.switch(x >= 100.0, x, y)
return z
#
#
def soft_relu_scale(self, x):
# scaled softplus : s_k * log(1 + exp(x / s_k))
# the last dim of x is dim_process,
# matching the per-type scales in self.scale
x /= self.scale
y = tensor.log(numpy.float32(1.0) + tensor.exp(x))
z = tensor.switch(x >= 100.0, x, y)
z *= self.scale
return z
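soft_relu_scale is a per-type scaled softplus, f_k(x) = s_k * log(1 + exp(x / s_k)), which keeps intensities positive while s_k controls how sharply the curve bends; the switch at x / s >= 100 avoids overflow in exp. A standalone numpy version mirroring the symbolic code (illustrative, not the model's API):

```python
import numpy as np

def soft_relu_scale(x, scale):
    # s * log(1 + exp(x / s)), falling back to x where exp would overflow
    x = x / scale
    y = np.log1p(np.exp(np.minimum(x, 100.0)))
    z = np.where(x >= 100.0, x, y)
    return z * scale

x = np.array([-5.0, 0.0, 5.0, 1000.0])
out = soft_relu_scale(x, 2.0)
# softplus(0) = log 2, scaled by s : 2 * log 2
assert np.isclose(out[1], 2.0 * np.log(2.0))
# very large inputs pass through unchanged
assert np.isclose(out[3], 1000.0)
```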
#
#
def rnn_unit(
self, emb_event_t, emb_time_t,
hidden_tm1, cell_tm1
):
pre_transform = tensor.concatenate(
[emb_event_t, emb_time_t, hidden_tm1],
axis = 1
)
post_transform = tensor.dot(
pre_transform, self.W_recur
) + self.b_recur
#
gate_input = tensor.nnet.sigmoid(
post_transform[:, :self.dim_model]
)
gate_forget = tensor.nnet.sigmoid(
post_transform[:, self.dim_model:2*self.dim_model]
)
gate_output = tensor.nnet.sigmoid(
post_transform[
:, 2*self.dim_model:3*self.dim_model
]
)
gate_pre_c = tensor.tanh(
post_transform[:, 3*self.dim_model:]
)
#
cell_t = gate_forget * cell_tm1 + gate_input * gate_pre_c
hidden_t = gate_output * tensor.tanh(cell_t)
return hidden_t, cell_t
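rnn_unit is a standard LSTM step whose input is the concatenation [event embedding, time embedding, previous hidden]; one dot with W_recur (3D x 4D) produces all four gate pre-activations, which are then sliced. A numpy sketch of the same slicing layout (weights here are random placeholders, not trained parameters):

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

rng = np.random.RandomState(0)
B, D = 2, 3                                   # batch size, dim_model
W = rng.randn(3 * D, 4 * D).astype(np.float32)
b = np.zeros(4 * D, dtype=np.float32)

emb_event = rng.randn(B, D).astype(np.float32)
emb_time = rng.randn(B, D).astype(np.float32)
h_tm1 = np.zeros((B, D), dtype=np.float32)
c_tm1 = np.zeros((B, D), dtype=np.float32)

pre = np.concatenate([emb_event, emb_time, h_tm1], axis=1)   # B x 3D
post = pre.dot(W) + b                                        # B x 4D
i = sigmoid(post[:, :D])            # input gate
f = sigmoid(post[:, D:2 * D])       # forget gate
o = sigmoid(post[:, 2 * D:3 * D])   # output gate
g = np.tanh(post[:, 3 * D:])        # candidate cell
c_t = f * c_tm1 + i * g
h_t = o * np.tanh(c_t)
```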
#
#
def compute_loss(
self,
seq_time_to_current,
seq_type_event, #seq_time_rep,
seq_time_values,
time_since_start_to_end,
num_sims_start_to_end,
seq_mask,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute log likelihood
seq_time_to_current : T * size_batch -- t_i - t_i-1
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each data and each time step, track the time features of event k_i
time_since_start_to_end : size_batch -- time for seq
num_sims_start_to_end : size_batch -- N for each seq
seq_mask : T * size_batch -- 1/0
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing loss function of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th is BOS event
the 1-to-T is regular event
regular event id is 0, 1, 2, ..., K-1
the BOS is K
this setting is easier for the use of seq_type_event
'''
# T * size_batch * dim_model
'''
pass time values through thresholds
'''
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
# This tensor is used to compute effect/decay term
# it will be used to compute term_1 and term_3
# the (t, m, d) entry of this tensor is :
# for the m-th sequence in the batch, right before the t-th event,
# the value of the hidden unit at the d-th dimension
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we should feed in the following:
seq_sims_time_to_current : time of t-t_recent_event at each simulation time for each seq in batch
seq_sims_index_in_hidden : index of the hidden units
at each time of simulation, so that we can extract the right h(t)
to do this, we need to be sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden N * size_batch
to (N*size_batch) * null
c) indexing to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in the seq_sims_index_in_hidden correctly !!!
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
# self.W_delta : dim_model * dim_model * dim_process
#
delta_for_sims = self.soft_relu(
tensor.tensordot(
seq_hidden_for_sims, self.W_delta, (2,0)
)
)
#
# N * size_batch * dim_model
#
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :
] * tensor.exp(
-delta_for_sims * seq_sims_time_to_current[
:, :, None
]
)
#
# N * size_batch * dim_model
# self.W_alpha : dim_model * dim_process
mu_for_sims = tensor.tensordot(
seq_hidden_for_sims, self.W_mu, (2,0)
)
# N * size_batch * dim_process
#
lambda_over_seq_sims_tilde = mu_for_sims + tensor.tensordot(
seq_sims_hidden_with_time, self.W_alpha,
(2, 0)
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu_scale(
lambda_over_seq_sims_tilde
)
lambda_sum_over_seq_sims = tensor.sum(
lambda_over_seq_sims, axis=2
)
lambda_sum_over_seq_sims *= seq_sims_mask
# N * size_batch
term_3 = tensor.sum(
lambda_sum_over_seq_sims, axis=0
) * time_since_start_to_end / num_sims_start_to_end
#
term_2 = numpy.float32(0.0)
#
# compute term_1
# as the same procedure as term_3, but easier
# since we can directly use
# seq_hidden_for_lambda : T * size_batch * dim_model
#
#
delta_for_lambda = self.soft_relu(
tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
)
# T * size_batch * dim_model
#
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :
] * tensor.exp(
-delta_for_lambda * seq_time_to_current[
:, :, None
]
)
# T * size_batch * dim_model
#
mu_for_lambda = tensor.tensordot(
seq_hidden_for_lambda, self.W_mu, (2,0)
)
# T * size_batch * dim_process
#
lambda_over_seq_tilde = mu_for_lambda + tensor.tensordot(
seq_hidden_with_time, self.W_alpha,
(2, 0)
)
# T * size_batch * dim_process
lambda_over_seq = self.soft_relu_scale(
lambda_over_seq_tilde
)
# T * size_batch * dim_process
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis = 2
)
# T * size_batch
#
new_shape_0 = lambda_over_seq.shape[0]*lambda_over_seq.shape[1]
new_shape_1 = lambda_over_seq.shape[2]
#
back_shape_0 = lambda_over_seq.shape[0]
back_shape_1 = lambda_over_seq.shape[1]
#
lambda_target_over_seq = lambda_over_seq.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
seq_type_event[1:,:].flatten()
].reshape(
(back_shape_0, back_shape_1)
)
# T * size_batch
# if NaN shows up here,
# it may also be caused by underflow
log_lambda_target_over_seq = tensor.log(
lambda_target_over_seq + numpy.float32(1e-9)
)
log_lambda_target_over_seq *= seq_mask
#
log_lambda_sum_over_seq = tensor.log(
lambda_sum_over_seq + numpy.float32(1e-9)
)
log_lambda_sum_over_seq *= seq_mask
#
term_1 = tensor.sum(
log_lambda_target_over_seq, axis=0
)
term_sum = tensor.sum(
log_lambda_sum_over_seq, axis=0
)
# (size_batch, )
#
'''
log-likelihood computed in this section is batch-wise
'''
log_likelihood_seq_batch = tensor.sum(
term_1 - term_2 - term_3
)
log_likelihood_type_batch = tensor.sum(
term_1 - term_sum
)
log_likelihood_time_batch = log_likelihood_seq_batch - log_likelihood_type_batch
#
self.cost_to_optimize = -log_likelihood_seq_batch + self.term_reg
#
self.log_likelihood_seq = log_likelihood_seq_batch
self.log_likelihood_type = log_likelihood_type_batch
self.log_likelihood_time = log_likelihood_time_batch
#
self.num_of_events = tensor.sum(seq_mask)
#
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
#
#
#
#
def compute_lambda(
self,
seq_type_event, #seq_time_rep,
seq_time_values,
seq_sims_time_to_current,
seq_sims_index_in_hidden,
seq_sims_mask
):
'''
use this function to compute intensity
seq_type_event : (T+1) * size_batch -- k_i
seq_time_rep : (T+1) * size_batch * dim_time --
for each data and each time step, track the time features of event k_i
seq_sims_time_to_current : N * size_batch -- s_j - t_i
seq_sims_index_in_hidden : N * size_batch -- int32
seq_sims_mask : N * size_batch -- 1/0
'''
print "computing intensities of Neural Hawkes model ... "
#
# we first process the past history of events with LSTM
seq_emb_event = self.Emb_event[seq_type_event, :]
'''
seq_type_event is (T + 1) * size_batch
the 0-th is BOS event
the 1-to-T is regular event
regular event id is 0, 1, 2, ..., K-1
the BOS is K
this setting is easier for the use of seq_type_event
'''
# T * size_batch * dim_model
'''
pass time values through thresholds
'''
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
#
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# T * size_batch * dim_model
'''
# This tensor is used to compute effect/decay term
# it will be used to compute term_1 and term_3
# the (t, m, d) entry of this tensor is :
# for the m-th sequence in the batch, right before the t-th event,
# the value of the hidden unit at the d-th dimension
'''
#
# first compute the 3rd term in loss
# self.delta : dim_model * dim_process
#
'''
when using simulation, we should feed in the following:
seq_sims_time_to_current : time of t-t_recent_event at each simulation time for each seq in batch
seq_sims_index_in_hidden : index of the hidden units
at each time of simulation, so that we can extract the right h(t)
to do this, we need to be sure the indexing is correct:
a) reshape T * size_batch * dim_model
to (T*size_batch) * dim_model
b) flatten seq_sims_index_in_hidden N * size_batch
to (N*size_batch) * null
c) indexing to get (N*size_batch) * dim_model
d) reshape it back to N * size_batch * dim_model
the crucial part is to fill in the seq_sims_index_in_hidden correctly !!!
'''
#
shape_hidden = seq_hidden_for_lambda.shape
shape_sims_index = seq_sims_index_in_hidden.shape
#
seq_hidden_for_sims = seq_hidden_for_lambda.reshape(
(shape_hidden[0]*shape_hidden[1], shape_hidden[2])
)[
seq_sims_index_in_hidden.flatten(), :
].reshape(
(
shape_sims_index[0], shape_sims_index[1], shape_hidden[2]
)
)
# N * size_batch * dim_model
# seq_sims_time_to_current : N * size_batch
# self.W_delta : dim_model * dim_model * dim_process
#
delta_for_sims = self.soft_relu(
tensor.tensordot(
seq_hidden_for_sims, self.W_delta, (2,0)
)
)
#
# N * size_batch * dim_model
#
seq_sims_hidden_with_time = seq_hidden_for_sims[
:, :, :
] * tensor.exp(
-delta_for_sims * seq_sims_time_to_current[
:, :, None
]
)
#
# N * size_batch * dim_model
# self.W_alpha : dim_model * dim_process
mu_for_sims = tensor.tensordot(
seq_hidden_for_sims, self.W_mu, (2,0)
)
# N * size_batch * dim_process
#
lambda_over_seq_sims_tilde = mu_for_sims + tensor.tensordot(
seq_sims_hidden_with_time, self.W_alpha,
(2, 0)
)
# N * size_batch * dim_process
lambda_over_seq_sims = self.soft_relu_scale(
lambda_over_seq_sims_tilde
)
#
'''
this block is to compute intensity
'''
self.lambda_samples = lambda_over_seq_sims.transpose((2,0,1)) * seq_sims_mask[None,:,:]
self.num_of_samples = tensor.sum(seq_sims_mask)
#
#
#
#
def compute_prediction_loss(
self,
seq_type_event, #seq_time_rep,
seq_time_values,
seq_mask,
time_diffs
):
#
print "computing predictions loss ... "
seq_emb_event = self.Emb_event[seq_type_event, :]
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
# seq_hidden : (T+1) * size_batch * dim_model
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# seq_hidden_for_lambda :
# T * size_batch * dim_model
delta_for_lambda_pre = tensor.tensordot(
seq_hidden_for_lambda, self.W_delta, (2,0)
)
# T * size_batch * dim_model
delta_for_lambda = self.soft_relu(
delta_for_lambda_pre
)
# T * size_batch * dim_model
# time_diffs : a vector of length M
seq_hidden_with_time = seq_hidden_for_lambda[
:, :, :, None
] * tensor.exp(
-delta_for_lambda[
:, :, :, None
] * time_diffs[
None, None, None, :
]
)
# T * size_batch * dim_model * M
mu_for_lambda = tensor.tensordot(
seq_hidden_for_lambda, self.W_mu, (2,0)
)
# T * size_batch * dim_process
lambda_over_seq_tilde = tensor.sum(
seq_hidden_with_time[
:, :, :, None, :
] * self.W_alpha[
None, None, :, :, None
], axis = 2
) + mu_for_lambda[:, :, :, None]
# T * size_batch * dim_process * M
# each time stamp, each seq in batch
# each process, each simulation for prediction
lambda_over_seq = self.soft_relu_scale(
lambda_over_seq_tilde.dimshuffle(3,0,1,2)
).dimshuffle(1,2,3,0)
#
# T * size_batch * dim_process * M
lambda_sum_over_seq = tensor.sum(
lambda_over_seq, axis=2
)
# T * size_batch * M
term_1 = time_diffs
# M *
#
#
cum_num = tensor.arange(
time_diffs.shape[0]+numpy.int32(1)
)[1:] * numpy.float32(1.0)
# M
term_2 = tensor.exp(
(
-1.0 * tensor.extra_ops.cumsum(
lambda_sum_over_seq, axis = 2
) / cum_num[None, None, :]
) * time_diffs[
None, None, :
]
)
#
# T * size_batch * M
term_3 = lambda_sum_over_seq
# T * size_batch * M
density = term_2 * term_3
# T * size_batch * M
time_prediction = tensor.mean(
term_1[None, None, :] * density,
axis = 2
) * time_diffs[-1]
# T * size_batch
lambda_over_seq_over_sims = lambda_over_seq[
:, :, :, :
] * density[
:, :, None, :
] / lambda_sum_over_seq[
:, :, None, :
]
# T * size_batch * dim_process * M
prob_over_seq_over_type = tensor.mean(
lambda_over_seq_over_sims, axis = 3
) * time_diffs[-1]
# T * size_batch * dim_process
prob_over_seq_over_type /= tensor.sum(
prob_over_seq_over_type,
axis=2,
keepdims=True
)
# T * size_batch * dim_process
# time_prediction and seq_mask : T * size_batch
# type_prediction is computed further below
# from prob_over_seq_over_type
target_type = seq_type_event[1:, :]
target_time = seq_time_values[1:, :]
# Type first
new_shape_0 = target_type.shape[0] * target_type.shape[1]
new_shape_1 = self.dim_process
back_shape_0 = target_type.shape[0]
back_shape_1 = target_type.shape[1]
#
prob_over_seq = prob_over_seq_over_type.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
target_type.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
log_prob_over_seq = tensor.log(
prob_over_seq + numpy.float32(1e-9)
)
log_prob_over_seq *= seq_mask
self.log_likelihood_type_predict = tensor.sum(
log_prob_over_seq
)
# Time
diff_time = (
target_time - time_prediction
)**2
diff_time *= seq_mask
self.square_errors = tensor.sum(diff_time)
self.num_of_events = tensor.sum(seq_mask)
#TODO: Hamming loss for prediction checking
#
type_prediction = tensor.argmax(
prob_over_seq_over_type, axis = 2
)
diff_type = tensor.abs_(
target_type - type_prediction
) * seq_mask
diff_type = tensor.switch(
diff_type >= numpy.float32(0.5),
numpy.float32(1.0), numpy.float32(0.0)
)
self.num_of_errors = tensor.sum(diff_type)
#
self.cost_to_optimize = (
-self.log_likelihood_type_predict / self.num_of_events
+ self.square_errors / self.num_of_events
+ self.term_reg
)
#self.cost_to_optimize = -self.log_likelihood_type_predict + self.term_reg
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
self.abs_grad_params = 0.0
for grad_param in self.grad_params:
self.abs_grad_params += tensor.sum(
tensor.abs_(
grad_param
)
)
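The optimised objective is the average negative type log-likelihood plus the average squared time error plus the regulariser. A tiny numeric sketch of that arithmetic (all numbers invented):

```python
log_likelihood_type = -12.0  # sum of masked log-probabilities (example value)
square_errors = 5.0          # sum of masked squared time errors
num_events = 10.0            # number of unmasked events
term_reg = 0.01              # regularisation term

cost = -log_likelihood_type / num_events + square_errors / num_events + term_reg
# cost = 1.2 + 0.5 + 0.01 = 1.71
```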
#
#
#
#
#TODO: memory efficient version of prediction loss
def predict_each_step(
self, hidden_for_lambda, time_diffs
):
# hidden_for_lambda : size_batch * dim_model
# time_diffs : M
delta_for_lambda = self.soft_relu(
tensor.tensordot(
hidden_for_lambda, self.W_delta, (1,0)
)
)
# delta_for_lambda : size_batch * dim_model
hidden_with_time = hidden_for_lambda[
:, :, None
] * tensor.exp(
-delta_for_lambda[
:, :, None
] * time_diffs[
None, None, :
]
)
# hidden_with_time : size_batch * dim_model * M
mu_for_lambda = tensor.tensordot(
hidden_for_lambda, self.W_mu, (1,0)
)
# mu_for_lambda : size_batch * dim_process
lambda_tilde = tensor.sum(
hidden_with_time[
:, :, None, :
] * self.W_alpha[
None, :, : , None
], axis = 1
) + mu_for_lambda[:, :, None]
# size_batch * dim_process * M
lambda_each_step = self.soft_relu_scale(
lambda_tilde.dimshuffle(2,0,1)
).dimshuffle(1,2,0)
# size_batch * dim_process * M
lambda_sum_each_step = tensor.sum(
lambda_each_step, axis=1
)
# size_batch * M
#TODO: compute integral
term_1 = time_diffs
cum_num = tensor.arange(
time_diffs.shape[0]+numpy.int32(1)
)[1:] * numpy.float32(1.0)
# M
term_2 = tensor.exp(
(
-1.0 * tensor.extra_ops.cumsum(
lambda_sum_each_step, axis=1
) / cum_num[None, :]
) * time_diffs[None, :]
)
# size_batch * M
term_3 = lambda_sum_each_step
density = term_2 * term_3
# size_batch * M
time_prediction_each_step = tensor.mean(
term_1[None, :] * density, axis=1
) * time_diffs[-1]
# size_batch
lambda_each_step_over_sims = lambda_each_step[
:, :, :
] * density[
:, None, :
] / lambda_sum_each_step[
:, None, :
]
# size_batch * dim_process * M
prob_over_type = tensor.mean(
lambda_each_step_over_sims, axis=2
) * time_diffs[-1]
# size_batch * dim_process
prob_over_type /= tensor.sum(
prob_over_type, axis=1, keepdims=True
)
# size_batch * dim_process
return prob_over_type, time_prediction_each_step
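The integral above uses a running-mean trick: `cumsum(lambda) / cum_num * t` approximates the compensator `∫₀ᵗ λ(s) ds` on a uniform grid, giving the density `p(t) = λ(t)·exp(-∫₀ᵗ λ)`, whose mean over the grid (scaled by the grid length) estimates the expected next-event time. A NumPy sketch with a constant intensity, where the true answer is known in closed form (`1/λ` for an exponential waiting time):

```python
import numpy as np

lam = 2.0          # constant total intensity (toy example)
M = 100000         # number of grid points, like time_diffs above
t_max = 10.0
time_diffs = np.linspace(t_max / M, t_max, M)  # sample times in (0, t_max]

lam_each = np.full(M, lam)
cum_num = np.arange(1, M + 1).astype(float)
# running-average approximation of the integral of lambda from 0 to t
integral = (np.cumsum(lam_each) / cum_num) * time_diffs
density = lam_each * np.exp(-integral)         # p(t) = lambda(t) exp(-integral)
expected_time = np.mean(time_diffs * density) * t_max
# for constant lambda the true expected waiting time is 1 / lam = 0.5
```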
#
def compute_prediction_loss_lessmem(
self,
seq_type_event, #seq_time_rep,
seq_time_values,
seq_mask,
time_diffs
):
#
print("computing predictions loss ...")
print("memory efficient version ...")
seq_emb_event = self.Emb_event[seq_type_event, :]
seq_time_rep = tensor.nnet.relu(
seq_time_values[:,:,None] - self.Threshold_time[None,None,:]
) # T/T+1 * size_batch * dim_time
#
seq_time_rep = tensor.concatenate(
[seq_time_rep, seq_time_values[:,:,None]],
axis=2
)
#
seq_emb_time = tensor.tensordot(
seq_time_rep, self.Emb_time, (2,0)
)
#
initial_hidden_mat = tensor.outer(
self.expand, self.h_0
)
initial_cell_mat = tensor.outer(
self.expand, self.c_0
)
# size_batch * dim_model
# seq_emb_event and seq_emb_time start with
# a special BOS event,
# to initialize the h and c
[seq_hidden, seq_cell], _ = theano.scan(
fn = self.rnn_unit,
sequences = [
dict(input=seq_emb_event, taps=[0]),
dict(input=seq_emb_time, taps=[0])
],
outputs_info = [
dict(initial=initial_hidden_mat, taps=[-1]),
dict(initial=initial_cell_mat, taps=[-1])
],
non_sequences = None
)
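`theano.scan` threads the hidden and cell state through the sequence one step at a time. In plain Python it is equivalent to the loop below; the step function here is a toy stand-in for `self.rnn_unit` (an assumption, not the model's actual LSTM update):

```python
import numpy as np

def rnn_unit(emb_event, emb_time, h_prev, c_prev):
    # toy stand-in for the model's recurrent update
    c = 0.5 * c_prev + emb_event + emb_time
    h = np.tanh(c)
    return h, c

T, B, D = 4, 2, 3
seq_emb_event = np.ones((T, B, D)) * 0.1
seq_emb_time = np.ones((T, B, D)) * 0.2
h = np.zeros((B, D))   # plays the role of initial_hidden_mat
c = np.zeros((B, D))   # plays the role of initial_cell_mat

seq_hidden, seq_cell = [], []
for t in range(T):     # what theano.scan does, step by step
    h, c = rnn_unit(seq_emb_event[t], seq_emb_time[t], h, c)
    seq_hidden.append(h)
    seq_cell.append(c)
seq_hidden = np.stack(seq_hidden)  # (T, B, D), like the scan output
```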
#TODO: get sequence of hidden units
# seq_hidden : (T+1) * size_batch * dim_model
seq_hidden_for_lambda = seq_hidden[:-1, :, :]
# seq_hidden_for_lambda :
# T * size_batch * dim_model
#TODO: predict time and type for each step
[prob_over_seq_over_type, time_prediction], _ = theano.scan(
fn = self.predict_each_step,
sequences = dict(
input=seq_hidden_for_lambda, taps=[0]
),
outputs_info = [
None, None
],
non_sequences = time_diffs
)
#
target_type = seq_type_event[1:, :]
target_time = seq_time_values[1:, :]
# Type first
new_shape_0 = target_type.shape[0] * target_type.shape[1]
new_shape_1 = self.dim_process
back_shape_0 = target_type.shape[0]
back_shape_1 = target_type.shape[1]
#
prob_over_seq = prob_over_seq_over_type.reshape(
(new_shape_0, new_shape_1)
)[
tensor.arange(new_shape_0),
target_type.flatten()
].reshape(
(back_shape_0, back_shape_1)
)
log_prob_over_seq = tensor.log(
prob_over_seq + numpy.float32(1e-9)
)
log_prob_over_seq *= seq_mask
self.log_likelihood_type_predict = tensor.sum(
log_prob_over_seq
)
#
# Time
diff_time = (
target_time - time_prediction
)**2
diff_time *= seq_mask
self.square_errors = tensor.sum(diff_time)
self.num_of_events = tensor.sum(seq_mask)
#TODO: Hamming loss for prediction checking
#
type_prediction = tensor.argmax(
prob_over_seq_over_type, axis = 2
)
diff_type = tensor.abs_(
target_type - type_prediction
) * seq_mask
diff_type = tensor.switch(
diff_type >= numpy.float32(0.5),
numpy.float32(1.0), numpy.float32(0.0)
)
self.num_of_errors = tensor.sum(diff_type)
#
self.cost_to_optimize = (
-self.log_likelihood_type_predict / self.num_of_events
+ self.square_errors / self.num_of_events
+ self.term_reg
)
#self.cost_to_optimize = -self.log_likelihood_type_predict + self.term_reg
self.grad_params = tensor.grad(
self.cost_to_optimize, self.params
)
self.abs_grad_params = 0.0
for grad_param in self.grad_params:
self.abs_grad_params += tensor.sum(
tensor.abs_(
grad_param
)
)
#
#
#
#
#
def get_model(self):
print("getting model ...")
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
return model_dict
#
#
def save_model(self, file_save):
print("saving model ...")
model_dict = {}
for param in self.params:
model_dict[param.name] = numpy.copy(
param.get_value()
)
model_dict['dim_process'] = self.dim_process
model_dict['dim_time'] = self.dim_time
model_dict['dim_model'] = self.dim_model
with open(file_save, 'wb') as f:
pickle.dump(model_dict, f)
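`save_model` pickles a plain dict of parameter arrays plus the scalar dimensions, so it can be reloaded with `pickle.load`. A minimal round-trip sketch (the dict contents and file name here are hypothetical, not the model's real parameter set):

```python
import pickle
import tempfile
import numpy

# stand-in for the dict that save_model builds
model_dict = {
    'W_alpha': numpy.zeros((3, 3)),
    'dim_process': 3,
    'dim_time': 5,
    'dim_model': 8,
}

with tempfile.NamedTemporaryFile(suffix='.pkl', delete=False) as f:
    pickle.dump(model_dict, f)
    path = f.name

with open(path, 'rb') as f:
    loaded = pickle.load(f)
```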
#
#
#
#
#
#
# --- ATSAMD51P19A/libsrc/ATSAMD51P19A/RTC_.py (t-ikegami/WioTerminal-CircuitPython, MIT license) ---
import uctypes as ct
RTC_MODE0 = {
'CTRLA' : ( 0x00, {
'reg' : 0x00 | ct.UINT16,
'SWRST' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'ENABLE' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'MODE' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 2 << ct.BF_LEN,
'MATCHCLR' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'PRESCALER' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 4 << ct.BF_LEN,
'BKTRST' : 0x00 | ct.BFUINT16 | 13 << ct.BF_POS | 1 << ct.BF_LEN,
'GPTRST' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'COUNTSYNC' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'CTRLB' : ( 0x02, {
'reg' : 0x00 | ct.UINT16,
'GP0EN' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'GP2EN' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBMAJ' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBASYNC' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'RTCOUT' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'DMAEN' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBF' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 3 << ct.BF_LEN,
'ACTF' : 0x00 | ct.BFUINT16 | 12 << ct.BF_POS | 3 << ct.BF_LEN,
}),
'EVCTRL' : ( 0x04, {
'reg' : 0x00 | ct.UINT32,
'PEREO0' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO1' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO2' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO3' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO4' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO5' : 0x00 | ct.BFUINT32 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO6' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO7' : 0x00 | ct.BFUINT32 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'CMPEO0' : 0x00 | ct.BFUINT32 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'CMPEO1' : 0x00 | ct.BFUINT32 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEREO' : 0x00 | ct.BFUINT32 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVFEO' : 0x00 | ct.BFUINT32 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEVEI' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTENCLR' : ( 0x08, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTENSET' : ( 0x0A, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTFLAG' : ( 0x0C, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'DBGCTRL' : ( 0x0E, {
'reg' : 0x00 | ct.UINT8,
'DBGRUN' : 0x00 | ct.BFUINT8 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'SYNCBUSY' : ( 0x10, {
'reg' : 0x00 | ct.UINT32,
'SWRST' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'ENABLE' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'FREQCORR' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'COUNT' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'COMP0' : 0x00 | ct.BFUINT32 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'COMP1' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'COUNTSYNC' : 0x00 | ct.BFUINT32 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
'GP0' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
'GP1' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 1 << ct.BF_LEN,
'GP2' : 0x00 | ct.BFUINT32 | 18 << ct.BF_POS | 1 << ct.BF_LEN,
'GP3' : 0x00 | ct.BFUINT32 | 19 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'FREQCORR' : ( 0x14, {
'reg' : 0x00 | ct.UINT8,
'VALUE' : 0x00 | ct.BFUINT8 | 0 << ct.BF_POS | 7 << ct.BF_LEN,
'SIGN' : 0x00 | ct.BFUINT8 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'COUNT' : 0x18 | ct.UINT32,
'COMP' : ( 0x20 | ct.ARRAY, 2 | ct.UINT32 ),
'GP' : ( 0x40 | ct.ARRAY, 4 | ct.UINT32 ),
'TAMPCTRL' : ( 0x60, {
'reg' : 0x00 | ct.UINT32,
'IN0ACT' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 2 << ct.BF_LEN,
'IN1ACT' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 2 << ct.BF_LEN,
'IN2ACT' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 2 << ct.BF_LEN,
'IN3ACT' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 2 << ct.BF_LEN,
'IN4ACT' : 0x00 | ct.BFUINT32 | 8 << ct.BF_POS | 2 << ct.BF_LEN,
'TAMLVL0' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL1' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL2' : 0x00 | ct.BFUINT32 | 18 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL3' : 0x00 | ct.BFUINT32 | 19 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL4' : 0x00 | ct.BFUINT32 | 20 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC0' : 0x00 | ct.BFUINT32 | 24 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC1' : 0x00 | ct.BFUINT32 | 25 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC2' : 0x00 | ct.BFUINT32 | 26 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC3' : 0x00 | ct.BFUINT32 | 27 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC4' : 0x00 | ct.BFUINT32 | 28 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'TIMESTAMP' : 0x64 | ct.UINT32,
'TAMPID' : ( 0x68, {
'reg' : 0x00 | ct.UINT32,
'TAMPID0' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID1' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID2' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID3' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID4' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEVT' : 0x00 | ct.BFUINT32 | 31 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'BKUP' : ( 0x80 | ct.ARRAY, 8 | ct.UINT32 ),
}
RTC_MODE1 = {
'CTRLA' : ( 0x00, {
'reg' : 0x00 | ct.UINT16,
'SWRST' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'ENABLE' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'MODE' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 2 << ct.BF_LEN,
'PRESCALER' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 4 << ct.BF_LEN,
'BKTRST' : 0x00 | ct.BFUINT16 | 13 << ct.BF_POS | 1 << ct.BF_LEN,
'GPTRST' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'COUNTSYNC' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'CTRLB' : ( 0x02, {
'reg' : 0x00 | ct.UINT16,
'GP0EN' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'GP2EN' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBMAJ' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBASYNC' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'RTCOUT' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'DMAEN' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBF' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 3 << ct.BF_LEN,
'ACTF' : 0x00 | ct.BFUINT16 | 12 << ct.BF_POS | 3 << ct.BF_LEN,
}),
'EVCTRL' : ( 0x04, {
'reg' : 0x00 | ct.UINT32,
'PEREO0' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO1' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO2' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO3' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO4' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO5' : 0x00 | ct.BFUINT32 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO6' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO7' : 0x00 | ct.BFUINT32 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'CMPEO0' : 0x00 | ct.BFUINT32 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'CMPEO1' : 0x00 | ct.BFUINT32 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'CMPEO2' : 0x00 | ct.BFUINT32 | 10 << ct.BF_POS | 1 << ct.BF_LEN,
'CMPEO3' : 0x00 | ct.BFUINT32 | 11 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEREO' : 0x00 | ct.BFUINT32 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVFEO' : 0x00 | ct.BFUINT32 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEVEI' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTENCLR' : ( 0x08, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP2' : 0x00 | ct.BFUINT16 | 10 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP3' : 0x00 | ct.BFUINT16 | 11 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTENSET' : ( 0x0A, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP2' : 0x00 | ct.BFUINT16 | 10 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP3' : 0x00 | ct.BFUINT16 | 11 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTFLAG' : ( 0x0C, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP2' : 0x00 | ct.BFUINT16 | 10 << ct.BF_POS | 1 << ct.BF_LEN,
'CMP3' : 0x00 | ct.BFUINT16 | 11 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'DBGCTRL' : ( 0x0E, {
'reg' : 0x00 | ct.UINT8,
'DBGRUN' : 0x00 | ct.BFUINT8 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'SYNCBUSY' : ( 0x10, {
'reg' : 0x00 | ct.UINT32,
'SWRST' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'ENABLE' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'FREQCORR' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'COUNT' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'COMP0' : 0x00 | ct.BFUINT32 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'COMP1' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'COMP2' : 0x00 | ct.BFUINT32 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'COMP3' : 0x00 | ct.BFUINT32 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'COUNTSYNC' : 0x00 | ct.BFUINT32 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
'GP0' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
'GP1' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 1 << ct.BF_LEN,
'GP2' : 0x00 | ct.BFUINT32 | 18 << ct.BF_POS | 1 << ct.BF_LEN,
'GP3' : 0x00 | ct.BFUINT32 | 19 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'FREQCORR' : ( 0x14, {
'reg' : 0x00 | ct.UINT8,
'VALUE' : 0x00 | ct.BFUINT8 | 0 << ct.BF_POS | 7 << ct.BF_LEN,
'SIGN' : 0x00 | ct.BFUINT8 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'COUNT' : 0x18 | ct.UINT16,
'PER' : 0x1C | ct.UINT16,
'COMP' : ( 0x20 | ct.ARRAY, 4 | ct.UINT16 ),
'GP' : ( 0x40 | ct.ARRAY, 4 | ct.UINT32 ),
'TAMPCTRL' : ( 0x60, {
'reg' : 0x00 | ct.UINT32,
'IN0ACT' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 2 << ct.BF_LEN,
'IN1ACT' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 2 << ct.BF_LEN,
'IN2ACT' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 2 << ct.BF_LEN,
'IN3ACT' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 2 << ct.BF_LEN,
'IN4ACT' : 0x00 | ct.BFUINT32 | 8 << ct.BF_POS | 2 << ct.BF_LEN,
'TAMLVL0' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL1' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL2' : 0x00 | ct.BFUINT32 | 18 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL3' : 0x00 | ct.BFUINT32 | 19 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL4' : 0x00 | ct.BFUINT32 | 20 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC0' : 0x00 | ct.BFUINT32 | 24 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC1' : 0x00 | ct.BFUINT32 | 25 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC2' : 0x00 | ct.BFUINT32 | 26 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC3' : 0x00 | ct.BFUINT32 | 27 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC4' : 0x00 | ct.BFUINT32 | 28 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'TIMESTAMP' : ( 0x64, {
'reg' : 0x00 | ct.UINT32,
'COUNT' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 16 << ct.BF_LEN,
}),
'TAMPID' : ( 0x68, {
'reg' : 0x00 | ct.UINT32,
'TAMPID0' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID1' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID2' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID3' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID4' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEVT' : 0x00 | ct.BFUINT32 | 31 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'BKUP' : ( 0x80 | ct.ARRAY, 8 | ct.UINT32 ),
}
RTC_MODE2 = {
'CTRLA' : ( 0x00, {
'reg' : 0x00 | ct.UINT16,
'SWRST' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'ENABLE' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'MODE' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 2 << ct.BF_LEN,
'CLKREP' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'MATCHCLR' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'PRESCALER' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 4 << ct.BF_LEN,
'BKTRST' : 0x00 | ct.BFUINT16 | 13 << ct.BF_POS | 1 << ct.BF_LEN,
'GPTRST' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'CLOCKSYNC' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'CTRLB' : ( 0x02, {
'reg' : 0x00 | ct.UINT16,
'GP0EN' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'GP2EN' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBMAJ' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBASYNC' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'RTCOUT' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'DMAEN' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBF' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 3 << ct.BF_LEN,
'ACTF' : 0x00 | ct.BFUINT16 | 12 << ct.BF_POS | 3 << ct.BF_LEN,
}),
'EVCTRL' : ( 0x04, {
'reg' : 0x00 | ct.UINT32,
'PEREO0' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO1' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO2' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO3' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO4' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO5' : 0x00 | ct.BFUINT32 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO6' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PEREO7' : 0x00 | ct.BFUINT32 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARMEO0' : 0x00 | ct.BFUINT32 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARMEO1' : 0x00 | ct.BFUINT32 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEREO' : 0x00 | ct.BFUINT32 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVFEO' : 0x00 | ct.BFUINT32 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEVEI' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTENCLR' : ( 0x08, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARM0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARM1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTENSET' : ( 0x0A, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARM0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARM1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'INTFLAG' : ( 0x0C, {
'reg' : 0x00 | ct.UINT16,
'PER0' : 0x00 | ct.BFUINT16 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'PER1' : 0x00 | ct.BFUINT16 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'PER2' : 0x00 | ct.BFUINT16 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'PER3' : 0x00 | ct.BFUINT16 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'PER4' : 0x00 | ct.BFUINT16 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'PER5' : 0x00 | ct.BFUINT16 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'PER6' : 0x00 | ct.BFUINT16 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'PER7' : 0x00 | ct.BFUINT16 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARM0' : 0x00 | ct.BFUINT16 | 8 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARM1' : 0x00 | ct.BFUINT16 | 9 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPER' : 0x00 | ct.BFUINT16 | 14 << ct.BF_POS | 1 << ct.BF_LEN,
'OVF' : 0x00 | ct.BFUINT16 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'DBGCTRL' : ( 0x0E, {
'reg' : 0x00 | ct.UINT8,
'DBGRUN' : 0x00 | ct.BFUINT8 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'SYNCBUSY' : ( 0x10, {
'reg' : 0x00 | ct.UINT32,
'SWRST' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'ENABLE' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'FREQCORR' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'CLOCK' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARM0' : 0x00 | ct.BFUINT32 | 5 << ct.BF_POS | 1 << ct.BF_LEN,
'ALARM1' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 1 << ct.BF_LEN,
'MASK0' : 0x00 | ct.BFUINT32 | 11 << ct.BF_POS | 1 << ct.BF_LEN,
'MASK1' : 0x00 | ct.BFUINT32 | 12 << ct.BF_POS | 1 << ct.BF_LEN,
'CLOCKSYNC' : 0x00 | ct.BFUINT32 | 15 << ct.BF_POS | 1 << ct.BF_LEN,
'GP0' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
'GP1' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 1 << ct.BF_LEN,
'GP2' : 0x00 | ct.BFUINT32 | 18 << ct.BF_POS | 1 << ct.BF_LEN,
'GP3' : 0x00 | ct.BFUINT32 | 19 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'FREQCORR' : ( 0x14, {
'reg' : 0x00 | ct.UINT8,
'VALUE' : 0x00 | ct.BFUINT8 | 0 << ct.BF_POS | 7 << ct.BF_LEN,
'SIGN' : 0x00 | ct.BFUINT8 | 7 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'CLOCK' : ( 0x18, {
'reg' : 0x00 | ct.UINT32,
'SECOND' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 6 << ct.BF_LEN,
'MINUTE' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 6 << ct.BF_LEN,
'HOUR' : 0x00 | ct.BFUINT32 | 12 << ct.BF_POS | 5 << ct.BF_LEN,
'DAY' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 5 << ct.BF_LEN,
'MONTH' : 0x00 | ct.BFUINT32 | 22 << ct.BF_POS | 4 << ct.BF_LEN,
'YEAR' : 0x00 | ct.BFUINT32 | 26 << ct.BF_POS | 6 << ct.BF_LEN,
}),
'GP' : ( 0x40 | ct.ARRAY, 4 | ct.UINT32 ),
'ALARM0' : ( 0x20, {
'reg' : 0x00 | ct.UINT32,
'SECOND' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 6 << ct.BF_LEN,
'MINUTE' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 6 << ct.BF_LEN,
'HOUR' : 0x00 | ct.BFUINT32 | 12 << ct.BF_POS | 5 << ct.BF_LEN,
'DAY' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 5 << ct.BF_LEN,
'MONTH' : 0x00 | ct.BFUINT32 | 22 << ct.BF_POS | 4 << ct.BF_LEN,
'YEAR' : 0x00 | ct.BFUINT32 | 26 << ct.BF_POS | 6 << ct.BF_LEN,
}),
'MASK0' : ( 0x24, {
'reg' : 0x00 | ct.UINT8,
'SEL' : 0x00 | ct.BFUINT8 | 0 << ct.BF_POS | 3 << ct.BF_LEN,
}),
'ALARM1' : ( 0x28, {
'reg' : 0x00 | ct.UINT32,
'SECOND' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 6 << ct.BF_LEN,
'MINUTE' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 6 << ct.BF_LEN,
'HOUR' : 0x00 | ct.BFUINT32 | 12 << ct.BF_POS | 5 << ct.BF_LEN,
'DAY' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 5 << ct.BF_LEN,
'MONTH' : 0x00 | ct.BFUINT32 | 22 << ct.BF_POS | 4 << ct.BF_LEN,
'YEAR' : 0x00 | ct.BFUINT32 | 26 << ct.BF_POS | 6 << ct.BF_LEN,
}),
'MASK1' : ( 0x2C, {
'reg' : 0x00 | ct.UINT8,
'SEL' : 0x00 | ct.BFUINT8 | 0 << ct.BF_POS | 3 << ct.BF_LEN,
}),
'TAMPCTRL' : ( 0x60, {
'reg' : 0x00 | ct.UINT32,
'IN0ACT' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 2 << ct.BF_LEN,
'IN1ACT' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 2 << ct.BF_LEN,
'IN2ACT' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 2 << ct.BF_LEN,
'IN3ACT' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 2 << ct.BF_LEN,
'IN4ACT' : 0x00 | ct.BFUINT32 | 8 << ct.BF_POS | 2 << ct.BF_LEN,
'TAMLVL0' : 0x00 | ct.BFUINT32 | 16 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL1' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL2' : 0x00 | ct.BFUINT32 | 18 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL3' : 0x00 | ct.BFUINT32 | 19 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMLVL4' : 0x00 | ct.BFUINT32 | 20 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC0' : 0x00 | ct.BFUINT32 | 24 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC1' : 0x00 | ct.BFUINT32 | 25 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC2' : 0x00 | ct.BFUINT32 | 26 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC3' : 0x00 | ct.BFUINT32 | 27 << ct.BF_POS | 1 << ct.BF_LEN,
'DEBNC4' : 0x00 | ct.BFUINT32 | 28 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'TIMESTAMP' : ( 0x64, {
'reg' : 0x00 | ct.UINT32,
'SECOND' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 6 << ct.BF_LEN,
'MINUTE' : 0x00 | ct.BFUINT32 | 6 << ct.BF_POS | 6 << ct.BF_LEN,
'HOUR' : 0x00 | ct.BFUINT32 | 12 << ct.BF_POS | 5 << ct.BF_LEN,
'DAY' : 0x00 | ct.BFUINT32 | 17 << ct.BF_POS | 5 << ct.BF_LEN,
'MONTH' : 0x00 | ct.BFUINT32 | 22 << ct.BF_POS | 4 << ct.BF_LEN,
'YEAR' : 0x00 | ct.BFUINT32 | 26 << ct.BF_POS | 6 << ct.BF_LEN,
}),
'TAMPID' : ( 0x68, {
'reg' : 0x00 | ct.UINT32,
'TAMPID0' : 0x00 | ct.BFUINT32 | 0 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID1' : 0x00 | ct.BFUINT32 | 1 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID2' : 0x00 | ct.BFUINT32 | 2 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID3' : 0x00 | ct.BFUINT32 | 3 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPID4' : 0x00 | ct.BFUINT32 | 4 << ct.BF_POS | 1 << ct.BF_LEN,
'TAMPEVT' : 0x00 | ct.BFUINT32 | 31 << ct.BF_POS | 1 << ct.BF_LEN,
}),
'BKUP' : ( 0x80 | ct.ARRAY, 8 | ct.UINT32 ),
}
RTC_ = {
'MODE0' : ( 0x00, RTC_MODE0 ),
'MODE1' : ( 0x00, RTC_MODE1 ),
'MODE2' : ( 0x00, RTC_MODE2 ),
}
RTC = ct.struct(0x40002400, RTC_)
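Each bitfield entry above packs an offset, a `uctypes` type code, a bit position (`BF_POS`) and a bit width (`BF_LEN`) into one integer; reading a field through the struct amounts to a shift-and-mask on the register value. A plain-Python sketch of that masking, using the CTRLA field positions from the layout above (the register value itself is made up):

```python
def read_bitfield(reg_value, pos, length):
    """Extract a bitfield the way uctypes does: shift right, then mask."""
    return (reg_value >> pos) & ((1 << length) - 1)

ctrla = 0b1000_0000_0000_0110  # example CTRLA value: ENABLE=1, MODE=1, COUNTSYNC=1
enable = read_bitfield(ctrla, 1, 1)      # ENABLE at bit 1, width 1
mode = read_bitfield(ctrla, 2, 2)        # MODE at bits 2..3, width 2
countsync = read_bitfield(ctrla, 15, 1)  # COUNTSYNC at bit 15, width 1
```

On the device, `ct.struct(0x40002400, RTC_MODE0)` exposes the same fields as attributes (e.g. `rtc.CTRLA.ENABLE`), and reading one performs this shift-and-mask against the memory-mapped register.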
# --- test/wecall_acceptance/call_filters/test_combined_allele_strand_bias.py (dylex/wecall, MIT license) ---
# All content Copyright (C) 2018 Genomics plc
from wecall.genomics.variant import Variant
from wecall_test_drivers.base_test import BaseTest
from wecall_test_drivers.svc_driver import SVCDriver
class TestAllelePlusStrandBiasFilteringBehaviour(BaseTest):
def test_should_allow_mildly_strand_biased_calls(self):
chrom = 'chr1'
svc = SVCDriver(self)
reads = 10
strand_bias = 6
svc.with_ref_sequence(
"AAAGCGTACAACCGGGTTAGTCACAAACCCGTTACGTATGCATG", chrom=chrom
).with_read(
"............................................",
n_rev=reads + strand_bias, n_fwd=reads - strand_bias, chrom=chrom
).with_read(
"................G...........................",
n_rev=reads - strand_bias, n_fwd=reads + strand_bias, chrom=chrom
)
expect = svc.call()
expect.with_output_vcf() \
.record_count(1) \
.has_record_for_variant(Variant(chrom, 16, 'T', 'G')) \
.with_no_filters()
def test_should_allow_mildly_allele_biased_calls(self):
chrom = 'chr1'
svc = SVCDriver(self)
reads = 10
allele_bias = 5
svc.with_ref_sequence(
"AAAGCGTACAACCGGGTTAGTCACAAACCCGTTACGTATGCATG", chrom=chrom
).with_read(
"............................................",
n_rev=reads + allele_bias, n_fwd=reads + allele_bias, chrom=chrom
).with_read(
"................G...........................",
n_rev=reads - allele_bias, n_fwd=reads - allele_bias, chrom=chrom
)
expect = svc.call()
expect.with_output_vcf() \
.record_count(1) \
.has_record_for_variant(Variant(chrom, 16, 'T', 'G')) \
.with_no_filters()
def test_should_stop_mildly_allele_and_strand_biased_calls(self):
chrom = 'chr1'
svc = SVCDriver(self)
reads = 10
allele_bias = 5
strand_bias = 4
svc.with_ref_sequence(
"AAAGCGTACAACCGGGTTAGTCACAAACCCGTTACGTATGCATG",
chrom=chrom).with_read(
"............................................",
n_rev=reads + allele_bias + strand_bias,
n_fwd=reads + allele_bias - strand_bias,
chrom=chrom).with_read(
"................G...........................",
n_rev=reads - allele_bias - strand_bias,
n_fwd=reads - allele_bias + strand_bias,
chrom=chrom)
expect = svc.call()
expect.with_output_vcf() \
.record_count(1) \
.has_record_for_variant(Variant(chrom, 16, 'T', 'G')) \
.with_filters({'AB+SB'})
def test_should_allow_mildly_allele_and_strand_biased_calls_with_lower_specified_threshold(self):
chrom = 'chr1'
svc = SVCDriver(self)
reads = 10
allele_bias = 5
strand_bias = 4
svc.with_ref_sequence(
"AAAGCGTACAACCGGGTTAGTCACAAACCCGTTACGTATGCATG",
chrom=chrom).with_read(
"............................................",
n_rev=reads + allele_bias + strand_bias,
n_fwd=reads + allele_bias - strand_bias,
chrom=chrom).with_read(
"................G...........................",
n_rev=reads - allele_bias - strand_bias,
n_fwd=reads - allele_bias + strand_bias,
chrom=chrom)
svc.with_allele_plus_strand_bias_p(0.03)
expect = svc.call()
expect.with_output_vcf() \
.record_count(1) \
.has_record_for_variant(Variant(chrom, 16, 'T', 'G')) \
.with_no_filters()
| 34.813084 | 101 | 0.534497 | 380 | 3,725 | 4.881579 | 0.184211 | 0.086253 | 0.097035 | 0.077628 | 0.838275 | 0.825337 | 0.781132 | 0.781132 | 0.781132 | 0.781132 | 0 | 0.013931 | 0.28698 | 3,725 | 106 | 102 | 35.141509 | 0.684488 | 0.011544 | 0 | 0.804598 | 0 | 0 | 0.151359 | 0.143478 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045977 | false | 0 | 0.034483 | 0 | 0.091954 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a5823b0b8ae9509c7bb1d249242233ef04068178 | 17,412 | py | Python | getLib.py | dfraser74/cuntoir | 92fe2b0d59f3e079884ff37ebf6e3c650efe6be1 | [
"MIT"
] | null | null | null | getLib.py | dfraser74/cuntoir | 92fe2b0d59f3e079884ff37ebf6e3c650efe6be1 | [
"MIT"
] | null | null | null | getLib.py | dfraser74/cuntoir | 92fe2b0d59f3e079884ff37ebf6e3c650efe6be1 | [
"MIT"
] | null | null | null | import MySQLdb as mysql
import time
import hashlib
import authLib
from types import StringType
def getAll(dataDict):
username = dataDict["username"].strip()
authCode = dataDict["authCode"].strip()
sort = dataDict["sort"]
doneFlag = dataDict["archived"]
timeOffset = dataDict["timeOffset"]
if(doneFlag == "true" and authLib.checkIfPremium(username) == 0):
return(3)
if(doneFlag == "false"):
buttonText = "<i class='fa fa-check-square-o' aria-hidden='true'></i>"
buttonVal = "class='archiveButton'"
onClick = "completeTaskPost"
if(doneFlag == "true"):
buttonText = "<i class='fa fa-reply' aria-hidden='true'></i>"
buttonVal = "class='restoreButton'"
onClick = "restoreTaskPost"
auth = authLib.checkAuthCode(dataDict)
if(auth != 1):
return(0)
returnString = ""
db = authLib.dbCon()
c = db.cursor()
command = "SELECT * FROM tasks WHERE BINARY username = %s AND BINARY done = %s"
c.execute(command, [username, doneFlag])
tasks = c.fetchall()
tasks = list(tasks)
if(len(tasks) == 0):
return(2)
if(sort == "default"):
tasks.sort(key = lambda x:x[3])
if(sort == "createTime"):
tasks.sort(key = lambda x:x[2], reverse=True)
if(doneFlag == "true"):
returnString += "<div class='task' style='height:auto;' id='infoHeader'><h2 style='margin:auto;'>Archived Tasks:</h2></div>"
for task in tasks:
# print(task)
taskId = str(task[0])
username = task[1]
createTime = float(task[2])
dueTime = float(task[3])
text = task[4]
title = task[6]
tags = task[7]
pushable = task[8]
recurring = task[10]
if(recurring == "false"):
recurringString = ""
else:
recurringString = " <i class='fa fa-repeat' aria-hidden='true'></i>("+recurring.title()+")"
if("'" in title):
title = title.replace("'", "&#39;")
if("'" in text):
text = text.replace("'", "&#39;")
timeString = time.strftime("%d/%m/%Y %H:%M", time.gmtime(dueTime - float(timeOffset)))
dateSearchList = time.strftime("%d/%m/%Y", time.gmtime(dueTime - float(timeOffset))).split("/")
dateSearchList[1] = str(int(dateSearchList[1]) - 1)
returnString += "<div class='task' id='"+taskId+"'><h2 class='taskTitle' onclick='openEdit("+taskId+");'>" + title + "</h2>"
if(text != ""):
returnString += "<div class='taskBody'>" + text + "</div>"
else:
returnString += "<div class='taskBody'><span class='italic'>No details</span></div>"
returnString += "<div class='tagAndDueTimeWrapper'><div class='dueTime' onclick='dateSearch("+dateSearchList[0]+","+dateSearchList[1]+","+dateSearchList[2]+");'>" + timeString + recurringString
returnString += "</div>"
returnString += "<div class='taskTags'>"
if(len(tags) < 1):
returnString += "<span class='noTaskTag'><span class='italic'>No tags</span></span>"
for tag in tags.split(","):
if(tag == ""):
continue
returnString += "<span class='taskTag' onclick='getTagged(\""+tag+"\");'>"+tag+"</span>"
returnString += "</div></div>"
returnString += "<button type='button' " + buttonVal + " onclick='" + onClick +"(" + taskId + ");'>" + buttonText + "</button>"
if(doneFlag == "true"):
returnString += "<button type='button' class='deleteButton' onclick='deleteTask(" + taskId + ");'><i class='fa fa-times' aria-hidden='true'></i></button>"
else:
if(pushable == "true"):
returnString += "<button type='button' class='notificationToggle' onclick='updatePushable("+taskId+",\"true\");'><i class='fa fa-bell' aria-hidden='true'></i></button>"
else:
returnString += "<button type='button' class='notificationToggle' onclick='updatePushable(" + taskId + ",\"false\");'><i class='fa fa-bell-o' aria-hidden='true'></i></button>"
returnString += "</div>"
if(doneFlag == "true"):
returnString += "<div class='task' id='infoFooter' style='height:auto;'><input type='button' id='archiveButton' onclick='getAll();' value='Go Back'></div>"
return(returnString)
def getTagged(dataDict):
username = dataDict["username"].strip()
authCode = dataDict["authCode"].strip()
searchTag = dataDict["tag"].strip()
sort = dataDict["sort"]
auth = authLib.checkAuthCode(dataDict)
timeOffset = float(dataDict["timeOffset"])
if(auth != 1):
return(0)
returnString = ""
db = authLib.dbCon()
c = db.cursor()
command = "SELECT * FROM tasks WHERE BINARY username = %s AND BINARY done != %s"
c.execute(command, [username, "true"])
tasks = c.fetchall()
tasks = list(tasks)
for task in tasks:
for item in task:
if(type(item) == StringType):
item = item.decode("utf-8")
if(len(tasks) == 0):
return(2)
if(sort == "default"):
tasks.sort(key = lambda x:x[3])
infoString = "<div class='task' id='infoHeader' style='height:auto;width:auto;'><h2 class='taskTitle'>Tasks tagged with \"" + searchTag + "\" :</h2></div>"
returnString += infoString
for task in tasks:
taskId = str(task[0])
username = task[1]
createTime = float(task[2])
dueTime = float(task[3])
text = task[4]
title = task[6]
tags = task[7]
pushable = task[8]
recurring = task[10]
if(recurring == "false"):
recurringString = ""
else:
recurringString = " <i class='fa fa-repeat' aria-hidden='true'></i>("+recurring.title()+")"
if("'" in title):
title = title.replace("'", "&#39;")
if("'" in text):
text = text.replace("'", "&#39;")
if(searchTag not in tags.split(",")):
continue
if(task[5] == "true"):
continue
timeString = time.strftime("%d/%m/%Y %H:%M", time.gmtime(dueTime - timeOffset))
dateSearchList = time.strftime("%d/%m/%Y", time.gmtime(dueTime - float(timeOffset))).split("/")
dateSearchList[1] = str(int(dateSearchList[1]) - 1)
returnString += "<div class='task' id='" + taskId + "'><h2 class='taskTitle' onclick='openEdit(" + taskId + ");'>" + title + "</h2>"
if(text != ""):
returnString += "<div class='taskBody'>" + text + "</div>"
else:
returnString += "<div class='taskBody'><span class='italic'>No details</span></div>"
returnString += "<div class='tagAndDueTimeWrapper'><div class='dueTime' onclick='dateSearch("+dateSearchList[0]+","+dateSearchList[1]+","+dateSearchList[2]+");'>" + timeString + recurringString
returnString += "</div>"
returnString += "<div class='taskTags'>"
if(len(tags) < 1):
returnString += "<span class='noTaskTag'><span class='italic'>No tags</span></span>"
for tag in tags.split(","):
if(tag == ""):
continue
returnString += "<span class='taskTag' onclick='getTagged(\""+tag+"\");'>"+tag+"</span>"
returnString += "</div></div>"
if(pushable == "true"):
returnString += "<button type='button' class='notificationToggle' onclick='updatePushable("+taskId+",\"true\");'><i class='fa fa-bell' aria-hidden='true'></i></button>"
else:
returnString += "<button type='button' class='notificationToggle' onclick='updatePushable(" + taskId + ",\"false\");'><i class='fa fa-bell-o' aria-hidden='true'></i></button>"
returnString += "<button type='button' class='archiveButton' onclick='completeTaskPost(" + taskId + ");'><i class='fa fa-check-square-o' aria-hidden='true'></i></button>"
returnString += "</div>"
if(returnString == infoString):
return(2)
returnString += "<div class='task' id='infoFooter' style='height:auto;'><input type='button' id='archiveButton' onclick='getAll();' value='Go Back'></div>"
return(returnString)
def search(dataDict):
username = dataDict["username"].strip()
authCode = dataDict["authCode"].strip()
searchString = dataDict["searchString"].strip().lower()
sort = dataDict["sort"]
timeOffset = float(dataDict["timeOffset"])
searchTags = dataDict["searchTags"]
searchTitle = dataDict["searchTitle"]
searchBody = dataDict["searchBody"]
auth = authLib.checkAuthCode(dataDict)
if(auth != 1):
return(0)
returnString = ""
infoString = "<div class='task' id='infoHeader' style='height:auto;width:auto;'><h2 class='taskTitle'>Tasks matching \"" + searchString + "\" :</h2></div>"
returnString += infoString
db = authLib.dbCon()
c = db.cursor()
command = "SELECT * FROM tasks WHERE BINARY username = %s AND BINARY done != %s"
c.execute(command, [username, "true"])
tasks = c.fetchall()
tasks = list(tasks)
for task in tasks:
for item in task:
if(type(item) == StringType):
item = item.decode("utf-8")
if(len(tasks) == 0):
return(2)
if(sort == "default"):
tasks.sort(key = lambda x:x[3])
for task in tasks:
taskId = str(task[0])
username = task[1]
createTime = float(task[2])
dueTime = float(task[3])
text = task[4]
title = task[6]
tags = task[7]
tagsToCompare = tags
titleToCompare = title
textToCompare = text
pushable = task[8]
recurring = task[10]
if(recurring == "false"):
recurringString = ""
else:
recurringString = " <i class='fa fa-repeat' aria-hidden='true'></i>("+recurring.title()+")"
if("'" in title):
title = title.replace("'", "&#39;")
if("'" in text):
text = text.replace("'", "&#39;")
# Reset the compared fields to empty strings if they shouldn't be searched
if(searchTitle == "false"):
titleToCompare = ""
if(searchBody == "false"):
textToCompare = ""
if(searchTags == "false"):
tagsToCompare = ""
if(searchString not in tagsToCompare.lower().split(",") and searchString not in titleToCompare.lower() and searchString not in textToCompare.lower()):
continue
timeString = time.strftime("%d/%m/%Y %H:%M", time.gmtime(dueTime - timeOffset))
dateSearchList = time.strftime("%d/%m/%Y", time.gmtime(dueTime - float(timeOffset))).split("/")
dateSearchList[1] = str(int(dateSearchList[1]) - 1)
returnString += "<div class='task' id='" + taskId + "'><h2 class='taskTitle' onclick='openEdit(" + taskId + ");'>" + title + "</h2>"
if(text != ""):
returnString += "<div class='taskBody'>" + text + "</div>"
else:
returnString += "<div class='taskBody'><span class='italic'>No details</span></div>"
returnString += "<div class='tagAndDueTimeWrapper'><div class='dueTime' onclick='dateSearch("+dateSearchList[0]+","+dateSearchList[1]+","+dateSearchList[2]+");'>" + timeString + recurringString
returnString += "</div>"
returnString += "<div class='taskTags'>"
if(len(tags) < 1):
returnString += "<span class='noTaskTag'><span class='italic'>No tags</span></span>"
for tag in tags.split(","):
if(tag == ""):
continue
returnString += "<span class='taskTag' onclick='getTagged(\""+tag+"\");'>"+tag+"</span>"
returnString += "</div></div>"
if(pushable == "true"):
returnString += "<button type='button' class='notificationToggle' onclick='updatePushable("+taskId+",\"true\");'><i class='fa fa-bell' aria-hidden='true'></i></button>"
else:
returnString += "<button type='button' class='notificationToggle' onclick='updatePushable(" + taskId + ",\"false\");'><i class='fa fa-bell-o' aria-hidden='true'></i></button>"
returnString += "<button type='button' class='archiveButton' onclick='completeTaskPost(" + taskId + ");'><i class='fa fa-check-square-o' aria-hidden='true'></i></button>"
returnString += "</div>"
if(returnString == infoString):
return(2)
returnString += "<div class='task' id='infoFooter' style='height:auto;'><input type='button' id='archiveButton' onclick='getAll();' value='Go Back'></div>"
return(returnString)
def dateSearch(dataDict):
username = dataDict["username"].strip()
authCode = dataDict["authCode"].strip()
sort = dataDict["sort"]
lowerTime = float(dataDict["lowerTime"])
upperTime = float(dataDict["upperTime"])
timeOffset = float(dataDict["timeOffset"])
auth = authLib.checkAuthCode(dataDict)
if(auth != 1):
return(0)
infoString = "<div class='task' id='infoHeader' style='height:auto;width:auto;'><h2 class='taskTitle'>Tasks on " + time.strftime("%d/%m/%Y", time.gmtime(lowerTime - timeOffset)) + ":</h2></div>"
returnString = ""
returnString += infoString
db = authLib.dbCon()
c = db.cursor()
command = "SELECT * FROM tasks WHERE BINARY username = %s AND BINARY done != %s"
c.execute(command, [username, "true"])
tasks = c.fetchall()
tasks = list(tasks)
for task in tasks:
for item in task:
if(type(item) == StringType):
item = item.decode("utf-8")
if(sort == "default"):
tasks.sort(key = lambda x:x[3])
for task in tasks:
taskId = str(task[0])
username = task[1]
createTime = float(task[2])
dueTime = float(task[3])
text = task[4]
title = task[6]
tags = task[7]
pushable = task[8]
recurring = task[10]
if(recurring == "false"):
recurringString = ""
else:
recurringString = " <i class='fa fa-repeat' aria-hidden='true'></i>("+recurring.title()+")"
if("'" in title):
title = title.replace("'", "&#39;")
if("'" in text):
text = text.replace("'", "&#39;")
if(lowerTime > dueTime or upperTime < dueTime):
continue
timeString = time.strftime("%d/%m/%Y %H:%M", time.gmtime(dueTime - timeOffset))
dateSearchList = time.strftime("%d/%m/%Y", time.gmtime(dueTime - float(timeOffset))).split("/")
dateSearchList[1] = str(int(dateSearchList[1]) - 1)
returnString += "<div class='task' id='" + taskId + "'><h2 class='taskTitle' onclick='openEdit(" + taskId + ");'>" + title + "</h2>"
if(text != ""):
returnString += "<div class='taskBody'>" + text + "</div>"
else:
returnString += "<div class='taskBody'><span class='italic'>No details</span></div>"
returnString += "<div class='tagAndDueTimeWrapper'><div class='dueTime' onclick='dateSearch("+dateSearchList[0]+","+dateSearchList[1]+","+dateSearchList[2]+");'>" + timeString + recurringString
returnString += "</div>"
returnString += "<div class='taskTags'>"
if(len(tags) < 1):
returnString += "<span class='noTaskTag'><span class='italic'>No tags</span></span>"
for tag in tags.split(","):
if(tag == ""):
continue
returnString += "<span class='taskTag' onclick='getTagged(\""+tag+"\");'>"+tag+"</span>"
returnString += "</div></div>"
if(pushable == "true"):
returnString += "<button type='button' class='notificationToggle' onclick='updatePushable("+taskId+",\"true\");'><i class='fa fa-bell' aria-hidden='true'></i></button>"
else:
returnString += "<button type='button' class='notificationToggle' onclick='updatePushable(" + taskId + ",\"false\");'><i class='fa fa-bell-o' aria-hidden='true'></i></button>"
returnString += "<button type='button' class='archiveButton' onclick='completeTaskPost(" + taskId + ");'><i class='fa fa-check-square-o' aria-hidden='true'></i></button>"
returnString += "</div>"
if(returnString == infoString):
returnString += "<div class='task' id='infoFooter' style='height:auto;'><h2 class='taskTitle'>No tasks</h2><input type='button' id='archiveButton' onclick='getAll();' value='Go Back'></div>"
db.close()
return(returnString)
returnString += "<div class='task' id='infoFooter' style='height:auto;'><input type='button' id='archiveButton' onclick='getAll();' value='Go Back'></div>"
db.close()
return(returnString)
def getTaskDates(dataDict):
username = dataDict["username"].strip()
authCode = dataDict["authCode"].strip()
sort = dataDict["sort"]
month = int(dataDict["month"])
year = int(dataDict["year"])
auth = authLib.checkAuthCode(dataDict)
if(auth != 1):
return(0)
db = authLib.dbCon()
c = db.cursor()
command = "SELECT dueTime FROM tasks WHERE BINARY username = %s AND BINARY done != %s"
c.execute(command, [username, "true"])
tasks = c.fetchall()
tasks = list(tasks)
returnString = str(month) + ";"
for task in tasks:
dueTime = task[0]
dueTimeString = time.strftime("%d/%m/%Y", time.gmtime(dueTime))
returnString += (dueTimeString + ",")
db.close()
returnString += ";"+str(year)
return(returnString)
| 48.773109 | 201 | 0.57581 | 1,865 | 17,412 | 5.375871 | 0.095979 | 0.056852 | 0.051865 | 0.017953 | 0.833533 | 0.828446 | 0.813186 | 0.805705 | 0.798923 | 0.770995 | 0 | 0.008688 | 0.23323 | 17,412 | 356 | 202 | 48.910112 | 0.742267 | 0.004193 | 0 | 0.82235 | 0 | 0.048711 | 0.334641 | 0.101333 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014327 | false | 0 | 0.014327 | 0 | 0.028653 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3c00938559f1de487b105c234eb8ddc9fe3dbc19 | 4,588 | py | Python | caluma/form/tests/snapshots/snap_test_history.py | JohnL17/caluma | 208ebd9442bd0958d45c71d45e332c3e7f0576c2 | [
"MIT"
] | null | null | null | caluma/form/tests/snapshots/snap_test_history.py | JohnL17/caluma | 208ebd9442bd0958d45c71d45e332c3e7f0576c2 | [
"MIT"
] | null | null | null | caluma/form/tests/snapshots/snap_test_history.py | JohnL17/caluma | 208ebd9442bd0958d45c71d45e332c3e7f0576c2 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots["test_document_as_of 1"] = {
"documentAsOf": {
"documentId": "890ca108-d93d-4725-9066-7d0bddad8230",
"historicalAnswers": {
"edges": [
{
"node": {
"__typename": "HistoricalStringAnswer",
"historyUserId": "admin",
"value": "first admin - revision 1",
}
}
]
},
"meta": {},
}
}
snapshots["test_document_as_of 2"] = {
"documentAsOf": {
"documentId": "890ca108-d93d-4725-9066-7d0bddad8230",
"historicalAnswers": {
"edges": [
{
"node": {
"__typename": "HistoricalStringAnswer",
"historyUserId": "AnonymousUser",
"value": "first anon - revision 3",
}
}
]
},
"meta": {},
}
}
snapshots["test_document_as_of 3"] = {
"documentAsOf": {
"documentId": "890ca108-d93d-4725-9066-7d0bddad8230",
"historicalAnswers": {
"edges": [
{
"node": {
"__typename": "HistoricalStringAnswer",
"historyUserId": "AnonymousUser",
"value": "second anon - revision 4",
}
}
]
},
"meta": {},
}
}
snapshots["test_historical_table_answer 1"] = {
"d1": {
"historicalAnswers": {
"edges": [
{
"node": {
"__typename": "HistoricalTableAnswer",
"value": [
{
"historicalAnswers": {
"edges": [
{
"node": {
"historyType": "+",
"value": "first row value",
}
}
]
}
},
{
"historicalAnswers": {
"edges": [
{
"node": {
"historyType": "+",
"value": "second row value",
}
}
]
}
},
],
}
}
]
}
},
"d2": {
"historicalAnswers": {
"edges": [
{
"node": {
"__typename": "HistoricalTableAnswer",
"value": [
{
"historicalAnswers": {
"edges": [
{
"node": {
"historyType": "+",
"value": "first row value",
}
}
]
}
},
{
"historicalAnswers": {
"edges": [
{
"node": {
"historyType": "-",
"value": "second row value",
}
}
]
}
},
],
}
}
]
}
},
}
| 33.007194 | 76 | 0.236922 | 150 | 4,588 | 7.066667 | 0.353333 | 0.186792 | 0.220755 | 0.160377 | 0.783019 | 0.759434 | 0.704717 | 0.704717 | 0.704717 | 0.704717 | 0 | 0.051621 | 0.670663 | 4,588 | 138 | 77 | 33.246377 | 0.649901 | 0.013514 | 0 | 0.415385 | 0 | 0 | 0.214681 | 0.053947 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.015385 | 0 | 0.015385 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
3c43e9727378d55eeb901bd7b3d18907849ae556 | 1,475 | py | Python | CodingInterview2/61_ContinousCards/test_continous_cards.py | hscspring/TheAlgorithms-Python | 5c2faea1d2d25a9a81a4786e053b0cc58ab46c6f | [
"MIT"
] | 10 | 2020-07-06T11:00:58.000Z | 2022-01-29T09:25:24.000Z | CodingInterview2/61_ContinousCards/test_continous_cards.py | hscspring/TheAlgorithms-Python | 5c2faea1d2d25a9a81a4786e053b0cc58ab46c6f | [
"MIT"
] | null | null | null | CodingInterview2/61_ContinousCards/test_continous_cards.py | hscspring/TheAlgorithms-Python | 5c2faea1d2d25a9a81a4786e053b0cc58ab46c6f | [
"MIT"
] | 3 | 2020-07-13T06:39:23.000Z | 2020-08-15T16:29:48.000Z | from continous_cards import is_continous, is_continous2
def test1():
data = [1, 3, 2, 5, 4]
assert is_continous(data) == True
assert is_continous2(data) == True
def test2():
data = [1, 3, 2, 6, 4]
assert is_continous(data) == False
assert is_continous2(data) == False
def test3():
data = [0, 3, 2, 6, 4]
assert is_continous(data) == True
assert is_continous2(data) == True
def test4():
data = [0, 3, 1, 6, 4]
assert is_continous(data) == False
assert is_continous2(data) == False
def test5():
data = [1, 3, 0, 5, 0]
assert is_continous(data) == True
assert is_continous2(data) == True
def test6():
data = [1, 3, 0, 7, 0]
assert is_continous(data) == False
assert is_continous2(data) == False
def test7():
data = [1, 0, 0, 5, 0]
assert is_continous(data) == True
assert is_continous2(data) == True
def test8():
data = [1, 0, 0, 7, 0]
assert is_continous(data) == False
assert is_continous2(data) == False
def test9():
data = [3, 0, 0, 0, 0]
assert is_continous(data) == True
assert is_continous2(data) == True
def test10():
data = [0, 0, 0, 0, 0]
assert is_continous(data) == True
assert is_continous2(data) == True
# 有对子
def test11():
data = [1, 0, 0, 1, 0]
assert is_continous(data) == False
assert is_continous2(data) == False
def test12():
assert is_continous([]) == False
assert is_continous2([]) == False
| 19.666667 | 55 | 0.604068 | 217 | 1,475 | 3.981567 | 0.147465 | 0.222222 | 0.236111 | 0.267361 | 0.732639 | 0.732639 | 0.732639 | 0.732639 | 0.726852 | 0.726852 | 0 | 0.075045 | 0.250169 | 1,475 | 74 | 56 | 19.932432 | 0.706148 | 0.002034 | 0 | 0.458333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.25 | false | 0 | 0.020833 | 0 | 0.270833 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
3c6f86fb10d3d828a9d1440f28ad7b3c3a607495 | 5,827 | py | Python | packages/augur-core/tests/trading/test_setOrderPrice.py | peachbits/augur | 2b4c5938f04cdef33f2d4f9144132070f968feff | [
"MIT"
] | null | null | null | packages/augur-core/tests/trading/test_setOrderPrice.py | peachbits/augur | 2b4c5938f04cdef33f2d4f9144132070f968feff | [
"MIT"
] | null | null | null | packages/augur-core/tests/trading/test_setOrderPrice.py | peachbits/augur | 2b4c5938f04cdef33f2d4f9144132070f968feff | [
"MIT"
] | null | null | null | #!/usr/bin/env python
from eth_tester.exceptions import TransactionFailed
from utils import longTo32Bytes, longToHexString, fix, AssertLog, stringToBytes, EtherDelta, PrintGasUsed, BuyWithCash, TokenDelta, nullAddress
from constants import ASK, BID, YES, NO, LONG, SHORT
from pytest import raises, mark
def test_orders_set_order_price_all_tokens(contractsFixture, market, cash):
orders = contractsFixture.contracts['Orders']
createOrder = contractsFixture.contracts['CreateOrder']
nullOrder = longTo32Bytes(0)
tradeGroupID = "42"
# create order
with BuyWithCash(cash, fix('10', '50'), contractsFixture.accounts[0], "create order"):
orderId = createOrder.publicCreateOrder(BID, fix(10), 50, market.address, YES, longTo32Bytes(0), longTo32Bytes(0), tradeGroupID, False, nullAddress)
assert orders.getAmount(orderId) == fix('10')
assert orders.getPrice(orderId) == 50
assert orders.getOrderCreator(orderId) == contractsFixture.accounts[0]
assert orders.getOrderMoneyEscrowed(orderId) == fix('10', '50')
assert orders.getOrderSharesEscrowed(orderId) == 0
# Change the price to 60
# We have to provide the extra tokens required by the increase in our BID
with raises(TransactionFailed):
orders.setOrderPrice(orderId, 60, nullOrder, nullOrder)
with BuyWithCash(cash, fix('10', '10'), contractsFixture.accounts[0], "set order price higher"):
assert orders.setOrderPrice(orderId, 60, nullOrder, nullOrder)
# See that the order price and money escrowed has changed
assert orders.getAmount(orderId) == fix('10')
assert orders.getPrice(orderId) == 60
assert orders.getOrderCreator(orderId) == contractsFixture.accounts[0]
assert orders.getOrderMoneyEscrowed(orderId) == fix('10', '60')
assert orders.getOrderSharesEscrowed(orderId) == 0
# Now if we set the price lower again to 50 we'll receive a refund for the difference
with TokenDelta(cash, fix(10, 10), contractsFixture.accounts[0], "Did not recieve a refund for lowering order price"):
assert orders.setOrderPrice(orderId, 50, nullOrder, nullOrder)
# See that the order price and money escrowed has changed
assert orders.getAmount(orderId) == fix('10')
assert orders.getPrice(orderId) == 50
assert orders.getOrderCreator(orderId) == contractsFixture.accounts[0]
assert orders.getOrderMoneyEscrowed(orderId) == fix('10', '50')
assert orders.getOrderSharesEscrowed(orderId) == 0
def test_orders_set_order_price_all_shares(contractsFixture, market, cash):
orders = contractsFixture.contracts['Orders']
createOrder = contractsFixture.contracts['CreateOrder']
completeSets = contractsFixture.contracts['CompleteSets']
nullOrder = longTo32Bytes(0)
tradeGroupID = "42"
# create order using only shares
with BuyWithCash(cash, fix('10', '100'), contractsFixture.accounts[0], "buy complete set"):
assert completeSets.publicBuyCompleteSets(market.address, fix(10))
orderId = createOrder.publicCreateOrder(ASK, fix(10), 50, market.address, YES, longTo32Bytes(0), longTo32Bytes(0), tradeGroupID, False, nullAddress)
assert orders.getAmount(orderId) == fix('10')
assert orders.getPrice(orderId) == 50
assert orders.getOrderCreator(orderId) == contractsFixture.accounts[0]
assert orders.getOrderMoneyEscrowed(orderId) == 0
assert orders.getOrderSharesEscrowed(orderId) == fix(10)
# Change the price to 60
# We don't need to provide any additional tokens since we're covering this order entirely with shares
assert orders.setOrderPrice(orderId, 60, nullOrder, nullOrder)
# See that the order price has changed and the money escrowed has not changed
assert orders.getAmount(orderId) == fix('10')
assert orders.getPrice(orderId) == 60
assert orders.getOrderCreator(orderId) == contractsFixture.accounts[0]
assert orders.getOrderMoneyEscrowed(orderId) == 0
assert orders.getOrderSharesEscrowed(orderId) == fix(10)
def test_orders_set_order_price_partial_shares(contractsFixture, market, cash):
orders = contractsFixture.contracts['Orders']
createOrder = contractsFixture.contracts['CreateOrder']
completeSets = contractsFixture.contracts['CompleteSets']
nullOrder = longTo32Bytes(0)
tradeGroupID = "42"
# create order using partial shares along with tokens escrowed
with BuyWithCash(cash, fix('5', '100'), contractsFixture.accounts[0], "buy complete set"):
assert completeSets.publicBuyCompleteSets(market.address, fix(5))
with BuyWithCash(cash, fix('5', '50'), contractsFixture.accounts[0], "create order"):
orderId = createOrder.publicCreateOrder(ASK, fix(10), 50, market.address, YES, longTo32Bytes(0), longTo32Bytes(0), tradeGroupID, False, nullAddress)
assert orders.getAmount(orderId) == fix('10')
assert orders.getPrice(orderId) == 50
assert orders.getOrderCreator(orderId) == contractsFixture.accounts[0]
assert orders.getOrderMoneyEscrowed(orderId) == fix('5', '50')
assert orders.getOrderSharesEscrowed(orderId) == fix(5)
# Change the price to 40
# We have to provide the extra tokens required by the decrease in our ASK
with raises(TransactionFailed):
orders.setOrderPrice(orderId, 40, nullOrder, nullOrder)
with BuyWithCash(cash, fix('5', '10'), contractsFixture.accounts[0], "set order price higher"):
assert orders.setOrderPrice(orderId, 40, nullOrder, nullOrder)
# See that the order price and money escrowed has changed
assert orders.getAmount(orderId) == fix('10')
assert orders.getPrice(orderId) == 40
assert orders.getOrderCreator(orderId) == contractsFixture.accounts[0]
assert orders.getOrderMoneyEscrowed(orderId) == fix('5', '60')
assert orders.getOrderSharesEscrowed(orderId) == fix(5)
| 50.669565 | 156 | 0.736914 | 667 | 5,827 | 6.409295 | 0.182909 | 0.109474 | 0.081871 | 0.045848 | 0.851462 | 0.818012 | 0.738713 | 0.703392 | 0.703392 | 0.677427 | 0 | 0.03416 | 0.155998 | 5,827 | 114 | 157 | 51.114035 | 0.835096 | 0.131457 | 0 | 0.671053 | 0 | 0 | 0.05648 | 0 | 0 | 0 | 0 | 0 | 0.552632 | 1 | 0.039474 | false | 0 | 0.052632 | 0 | 0.092105 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5931ac25ea1a4de8ac0fca63e4df76cdbc6616ad | 12,811 | py | Python | ksteta3pi/newpot/MC_12_11102441_MagDown.py | Williams224/davinci-scripts | 730642d2ff13543eca4073a4ce0932631195de56 | [
"MIT"
] | null | null | null | ksteta3pi/newpot/MC_12_11102441_MagDown.py | Williams224/davinci-scripts | 730642d2ff13543eca4073a4ce0932631195de56 | [
"MIT"
] | null | null | null | ksteta3pi/newpot/MC_12_11102441_MagDown.py | Williams224/davinci-scripts | 730642d2ff13543eca4073a4ce0932631195de56 | [
"MIT"
] | null | null | null | #-- GAUDI jobOptions generated on Mon Jul 27 18:44:24 2015
#-- Contains event types :
#-- 11102441 - 147 files - 3017500 events - 874.08 GBytes
#-- Extra information about the data processing phases:
from Gaudi.Configuration import *
from GaudiConf import IOHelper
IOHelper('ROOT').inputFiles(['LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000001_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000002_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000003_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000004_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000005_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000006_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000007_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000008_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000009_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000010_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000011_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000012_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000013_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000014_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000015_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000016_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000017_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000018_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000019_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000020_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000021_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000022_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000023_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000024_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000025_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000026_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000027_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000028_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000029_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000030_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000031_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000032_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000033_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000034_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000035_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000036_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000037_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000038_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000039_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000040_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000041_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000042_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000043_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000045_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000046_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000047_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000048_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000049_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000050_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000051_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000052_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000053_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000054_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000055_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000056_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000057_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000058_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000059_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000060_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000061_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000062_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000064_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000065_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000066_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000067_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000068_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000069_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000070_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000071_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000072_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000073_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000074_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000075_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000076_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000077_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000079_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000080_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000081_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000082_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000083_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000084_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000085_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000086_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000087_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000088_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000089_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000090_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000091_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000092_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000093_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000094_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000095_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000096_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000097_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000098_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000099_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000100_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000101_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000102_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000103_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000104_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000105_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000106_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000107_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000108_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000109_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000110_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000111_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000112_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000113_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000114_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000115_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000116_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000117_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000118_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000119_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000120_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000122_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000123_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000124_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000125_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000126_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000127_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000128_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000129_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000130_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000131_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000132_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000133_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000134_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000135_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000136_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000137_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000138_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000139_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000140_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000142_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000143_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000144_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000145_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000146_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000147_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000148_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000149_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000150_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000151_2.AllStreams.dst',
'LFN:/lhcb/MC/2012/ALLSTREAMS.DST/00038845/0000/00038845_00000152_2.AllStreams.dst'
], clear=True)
import unicodedata
from functools import partial
import regex
from sqlitedict import SqliteDict
import json
import config
from build_rindex.build_rvindex import IndexDB, load_from_file
# from doc_retri.hotpot_preliminary_doc_retri import filter_word
from build_rindex.persistent_index_db import IndexingDB
from build_rindex.term_manage import load_wiki_abstract_terms
from inspect_wikidump.init_inspect import TOTAL_NUM_DOC
from inspect_wikidump.inspect_whole_file import get_first_paragraph_index
from wiki_util import wiki_db_tool
from tqdm import tqdm
from typing import Dict, Tuple, List
from sklearn.utils import murmurhash3_32
POS_INCLUDED = ['ADJ', 'ADV', 'INTJ', 'NOUN', 'PROPN', 'VERB']
STOPWORDS = {
'i', 'me', 'my', 'myself', 'we', 'our', 'ours', 'ourselves', 'you', 'your',
'yours', 'yourself', 'yourselves', 'he', 'him', 'his', 'himself', 'she',
'her', 'hers', 'herself', 'it', 'its', 'itself', 'they', 'them', 'their',
'theirs', 'themselves', 'what', 'which', 'who', 'whom', 'this', 'that',
'these', 'those', 'am', 'is', 'are', 'was', 'were', 'be', 'been', 'being',
'have', 'has', 'had', 'having', 'do', 'does', 'did', 'doing', 'a', 'an',
'the', 'and', 'but', 'if', 'or', 'because', 'as', 'until', 'while', 'of',
'at', 'by', 'for', 'with', 'about', 'against', 'between', 'into', 'through',
'during', 'before', 'after', 'above', 'below', 'to', 'from', 'up', 'down',
'in', 'out', 'on', 'off', 'over', 'under', 'again', 'further', 'then',
'once', 'here', 'there', 'when', 'where', 'why', 'how', 'all', 'any',
'both', 'each', 'few', 'more', 'most', 'other', 'some', 'such', 'no', 'nor',
'not', 'only', 'own', 'same', 'so', 'than', 'too', 'very', 's', 't', 'can',
'will', 'just', 'don', 'should', 'now', 'd', 'll', 'm', 'o', 're', 've',
'y', 'ain', 'aren', 'couldn', 'didn', 'doesn', 'hadn', 'hasn', 'haven',
'isn', 'ma', 'mightn', 'mustn', 'needn', 'shan', 'shouldn', 'wasn', 'weren',
'won', 'wouldn', "'ll", "'re", "'ve", "n't", "'s", "'d", "'m", "''", "``"
}
def normalize(text):
    """Resolve different types of Unicode encodings."""
return unicodedata.normalize('NFD', text)
def filter_word(text):
    """Take out English stopwords, punctuation, and compound endings."""
text = normalize(text)
if regex.match(r'^\p{P}+$', text):
return True
if text.lower() in STOPWORDS:
return True
return False
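The filter above relies on the third-party `regex` module for the `^\p{P}+$` punctuation test. A hedged, stdlib-only sketch of the same behavior (with an abbreviated, purely illustrative STOPWORDS subset) can use Unicode categories instead:

```python
import unicodedata

# Abbreviated subset for illustration; the real module defines the full list.
STOPWORDS = {'the', 'a', 'an', 'is', 'of'}

def is_punct(text):
    # True when every character is Unicode punctuation (mirrors ^\p{P}+$)
    return bool(text) and all(unicodedata.category(ch).startswith('P') for ch in text)

def filter_word(text):
    text = unicodedata.normalize('NFD', text)
    return is_punct(text) or text.lower() in STOPWORDS

print(filter_word('The'))    # True  (stopword)
print(filter_word('...'))    # True  (pure punctuation)
print(filter_word('Paris'))  # False (kept as an index term)
```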
def filter_ngram(gram, mode='any'):
"""Decide whether to keep or discard an n-gram.
Args:
gram: list of tokens (length N)
mode: Option to throw out ngram if
'any': any single token passes filter_word
'all': all tokens pass filter_word
'ends': book-ended by filterable tokens
"""
filtered = [filter_word(w) for w in gram]
if mode == 'any':
return any(filtered)
elif mode == 'all':
return all(filtered)
elif mode == 'ends':
return filtered[0] or filtered[-1]
else:
raise ValueError('Invalid mode: %s' % mode)
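The three modes are easiest to see on a concrete gram. This sketch re-implements only the mode dispatch over precomputed per-token `filter_word` flags; the flag values below are illustrative (a stopword followed by two content words):

```python
def combine(filtered, mode='any'):
    # Mirrors filter_ngram's mode dispatch on a list of per-token flags.
    if mode == 'any':
        return any(filtered)
    if mode == 'all':
        return all(filtered)
    if mode == 'ends':
        return filtered[0] or filtered[-1]
    raise ValueError('Invalid mode: %s' % mode)

# Hypothetical filter_word flags for a gram like ['the', 'Eiffel', 'Tower']:
flags = [True, False, False]
print(combine(flags, 'any'))   # True  -> gram discarded
print(combine(flags, 'all'))   # False -> gram kept
print(combine(flags, 'ends'))  # True  -> discarded (stopword at the start)
```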
def get_ngrams(terms, poss=None, n=1, filter_fn=None, included_tags=None, as_strings=True, lower=True):
"""Returns a list of all ngrams from length 1 to n.
"""
def _skip(gram):
if not filter_fn:
return False
return filter_fn(gram)
ngrams = [(s, e + 1)
for s in range(len(terms))
for e in range(s, min(s + n, len(terms)))
if not _skip(terms[s:e + 1])]
if poss is not None and included_tags is not None: # We do filtering according to pos.
filtered_ngram = []
for (s, e) in ngrams:
if any([poss[i] in included_tags for i in range(s, e)]):
filtered_ngram.append((s, e))
ngrams = filtered_ngram
# Concatenate into strings
if as_strings:
r_list = []
for (s, e) in ngrams:
if lower:
r_list.append(' '.join(terms[s:e]).lower())
else:
r_list.append(' '.join(terms[s:e]))
return r_list
else:
return ngrams
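Stripped of the POS filtering and the `filter_fn` hook, the span enumeration in `get_ngrams` reduces to a few lines. This minimal sketch shows the unigram-plus-bigram output for a short title:

```python
def ngrams_up_to(terms, n=2):
    # All spans of length 1..n, joined into lowercase strings (no filtering).
    spans = [(s, e + 1)
             for s in range(len(terms))
             for e in range(s, min(s + n, len(terms)))]
    return [' '.join(terms[s:e]).lower() for s, e in spans]

print(ngrams_up_to(['New', 'York', 'City']))
# ['new', 'new york', 'york', 'york city', 'city']
```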
def whole_wiki_pages_title_raw_indexing():
whole_tokenized_db_cursor = wiki_db_tool.get_cursor(config.WHOLE_PROCESS_FOR_RINDEX_DB)
whole_tokenized_db_cursor.execute("SELECT * from unnamed")
title_abs_raw_indexdb = IndexDB()
abs_file_name = config.PDATA_ROOT / "reverse_indexing/abs_rindexdb"
content_indexdb = IndexDB()
content_index_file_name = ''
with SqliteDict(str(config.WHOLE_WIKI_DB), flag='r', encode=json.dumps, decode=json.loads) as whole_wiki_db:
for key, value in tqdm(whole_tokenized_db_cursor, total=TOTAL_NUM_DOC):
valid_page = True
item = json.loads(value)
# print(item)
article_title = item['title']
article_clean_text = item['clean_text']
article_poss = item['poss']
abs_index = get_first_paragraph_index(whole_wiki_db[article_title])
if abs_index == -1:
valid_page = False
# print(whole_wiki_db[article_title])
                # This page is not valid.
article_term_list = []
article_poss_list = []
title_term_list = []
title_poss_list = []
abstract_term_list = []
abstract_poss_list = []
assert len(article_clean_text) == len(article_poss)
for p_i, (paragraph_text, paragraph_poss) in enumerate(zip(article_clean_text, article_poss)):
for sent_text, sent_poss in zip(paragraph_text, paragraph_poss):
if p_i == 0: # In title.
title_term_list.extend(sent_text)
title_poss_list.extend(sent_poss)
                        continue  # Terms in the title are not included again in the abstract and article term lists.
else:
if p_i == abs_index: # If the terms are in abstract
abstract_term_list.extend(sent_text)
abstract_poss_list.extend(sent_poss)
article_term_list.extend(sent_text)
article_poss_list.extend(sent_poss)
# print("Title:", title_term_list, title_poss_list)
title_ngram = get_ngrams(title_term_list, title_poss_list, 3,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
abs_ngram = get_ngrams(abstract_term_list, abstract_poss_list, 3,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
# print(article_title)
# print(title_ngram)
# print(abs_ngram)
added_terms_num = 0
for added_term in title_ngram + abs_ngram:
title_abs_raw_indexdb.inverted_index.add(added_term, article_title)
added_terms_num += 1
title_abs_raw_indexdb.document_length_table.add(article_title, added_terms_num)
# break
# content_t_ngram = get_ngrams(title_term_list, title_poss_list, 3,
# filter_fn=partial(filter_ngram, mode='any'),
# included_tags=POS_INCLUDED)
#
# content_c_ngram = get_ngrams(abstract_term_list, abstract_poss_list, 3,
# filter_fn=partial(filter_ngram, mode='any'),
# included_tags=POS_INCLUDED)
#
# added_terms_num = 0
# for added_term in content_t_ngram + content_c_ngram:
# content_indexdb.inverted_index.add(added_term, article_title)
# added_terms_num += 1
#
# content_indexdb.document_length_table.add(article_title, added_terms_num)
#
title_abs_raw_indexdb.save_to_file(abs_file_name)
# print(title_term_list)
# print(title_ngram)
# print(abs_ngram)
# print("Title:(ngram):", get_ngrams(title_term_list, title_poss_list, 3, included_tags=POS_INCLUDED))
# print(abstract_term_list, abstract_poss_list)
# print(article_term_list, article_poss_list)
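The loop above accumulates two structures per page: a posting list (`inverted_index`) and a per-document indexed-term count (`document_length_table`). A toy stand-in shows the shape of the data; the `MiniIndex` class and its method names are illustrative, not the real `IndexDB` API:

```python
from collections import defaultdict

class MiniIndex:
    """Toy stand-in for IndexDB: posting lists plus per-document lengths."""
    def __init__(self):
        self.inverted_index = defaultdict(set)  # term -> set of doc keys
        self.document_length = {}               # doc key -> indexed-term count

    def add_doc(self, key, terms):
        for term in terms:
            self.inverted_index[term].add(key)
        self.document_length[key] = len(terms)

idx = MiniIndex()
idx.add_doc('Paris', ['paris', 'capital', 'france'])
idx.add_doc('Lyon', ['lyon', 'city', 'france'])
print(sorted(idx.inverted_index['france']))  # ['Lyon', 'Paris']
print(idx.document_length['Paris'])          # 3
```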
def whole_wiki_pages_title_raw_indexing_paragraph_level(limited_terms=True):
key_separator = '/'
whole_tokenized_db_cursor = wiki_db_tool.get_cursor(config.WHOLE_PROCESS_FOR_RINDEX_DB)
whole_tokenized_db_cursor.execute("SELECT * from unnamed")
wiki_p_level_indexdb = IndexDB()
file_name = config.PDATA_ROOT / "reverse_indexing/wiki_p_level_limited_gram_rindexdb"
count = 0
if limited_terms:
limited_terms_set = load_wiki_abstract_terms(config.PRO_ROOT / "data/processed/wiki_abs_3gram_terms.txt")
else:
limited_terms_set = []
limited_terms_set = set(limited_terms_set)
for key, value in tqdm(whole_tokenized_db_cursor, total=TOTAL_NUM_DOC):
item = json.loads(value)
article_title = item['title']
article_clean_text = item['clean_text']
article_poss = item['poss']
title_term_list = []
title_poss_list = []
title_ngram = None
assert len(article_clean_text) == len(article_poss)
for p_i, (paragraph_text, paragraph_poss) in enumerate(zip(article_clean_text, article_poss)):
paragraph_term_list = []
paragraph_poss_list = []
for sent_text, sent_poss in zip(paragraph_text, paragraph_poss):
if p_i == 0: # In title.
title_term_list.extend(sent_text)
title_poss_list.extend(sent_poss)
                    continue  # Terms in the title are not included again in the paragraph term lists.
else: # p_i != 0
paragraph_term_list.extend(sent_text)
paragraph_poss_list.extend(sent_poss)
if p_i == 0 and title_ngram is None:
title_ngram = get_ngrams(title_term_list, title_poss_list, 2,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if p_i >= 100:
break
paragraph_ngram = get_ngrams(paragraph_term_list, paragraph_poss_list, 2,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if len(paragraph_ngram) == 0:
continue
added_terms_num = 0
paragraph_key = key_separator.join((article_title, str(p_i)))
for added_term in title_ngram + paragraph_ngram:
if added_term in limited_terms_set:
wiki_p_level_indexdb.inverted_index.add(added_term, paragraph_key)
added_terms_num += 1
elif ' ' not in added_term:
wiki_p_level_indexdb.inverted_index.add(added_term, paragraph_key)
added_terms_num += 1
else:
pass
wiki_p_level_indexdb.document_length_table.add(paragraph_key, added_terms_num)
count += 1
# if count >= 1000:
# break
wiki_p_level_indexdb.save_to_file(file_name)
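Paragraph keys are built by joining the article title and paragraph index with `/`. Since article titles can themselves contain `/` (e.g. `AC/DC`), a consumer splitting these keys back apart should split from the right; the `split_key` helper below is an assumption for illustration, not part of the source:

```python
KEY_SEP = '/'

def make_key(title, p_i):
    # title/paragraph-index composite key, as built in the loop above
    return KEY_SEP.join((title, str(p_i)))

def split_key(key):
    # rsplit so titles that themselves contain '/' survive a round trip
    title, p_i = key.rsplit(KEY_SEP, 1)
    return title, int(p_i)

print(split_key(make_key('AC/DC', 3)))  # ('AC/DC', 3)
```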
def whole_wiki_pages_title_raw_indexing_paragraph_level_unigram():
key_separator = '/'
whole_tokenized_db_cursor = wiki_db_tool.get_cursor(config.WHOLE_PROCESS_FOR_RINDEX_DB)
whole_tokenized_db_cursor.execute("SELECT * from unnamed")
wiki_p_level_indexdb = IndexDB()
file_name = config.PDATA_ROOT / "reverse_indexing/wiki_p_level_unigram_rindexdb"
count = 0
# if limited_terms:
# limited_terms_set = load_wiki_abstract_terms(config.PRO_ROOT / "data/processed/wiki_abs_3gram_terms.txt")
# else:
# limited_terms_set = []
#
# limited_terms_set = set(limited_terms_set)
for key, value in tqdm(whole_tokenized_db_cursor, total=TOTAL_NUM_DOC):
item = json.loads(value)
article_title = item['title']
article_clean_text = item['clean_text']
article_poss = item['poss']
title_term_list = []
title_poss_list = []
title_ngram = None
assert len(article_clean_text) == len(article_poss)
for p_i, (paragraph_text, paragraph_poss) in enumerate(zip(article_clean_text, article_poss)):
paragraph_term_list = []
paragraph_poss_list = []
for sent_text, sent_poss in zip(paragraph_text, paragraph_poss):
if p_i == 0: # In title.
title_term_list.extend(sent_text)
title_poss_list.extend(sent_poss)
                    continue  # Terms in the title are not included again in the paragraph term lists.
else: # p_i != 0
paragraph_term_list.extend(sent_text)
paragraph_poss_list.extend(sent_poss)
if p_i == 0 and title_ngram is None:
title_ngram = get_ngrams(title_term_list, title_poss_list, 1,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if p_i >= 100:
break
paragraph_ngram = get_ngrams(paragraph_term_list, paragraph_poss_list, 1,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if len(paragraph_ngram) == 0:
continue
added_terms_num = 0
paragraph_key = key_separator.join((article_title, str(p_i)))
for added_term in title_ngram + paragraph_ngram:
# if added_term in limited_terms_set:
# wiki_p_level_indexdb.inverted_index.add(added_term, paragraph_key)
# added_terms_num += 1
# elif ' ' not in added_term:
wiki_p_level_indexdb.inverted_index.add(added_term, paragraph_key)
added_terms_num += 1
# else:
# pass
wiki_p_level_indexdb.document_length_table.add(paragraph_key, added_terms_num)
count += 1
        # NOTE: debug cap; only the first 1,000 pages are indexed while this
        # block is left active.
        if count >= 1000:
            break
wiki_p_level_indexdb.save_to_file(file_name)
def whole_wiki_pages_title_raw_indexing_paragraph_level_unigram_size_limited(hash_size=2**24):
key_separator = '/'
whole_tokenized_db_cursor = wiki_db_tool.get_cursor(config.WHOLE_PROCESS_FOR_RINDEX_DB)
whole_tokenized_db_cursor.execute("SELECT * from unnamed")
wiki_p_level_indexdb = IndexDB()
file_name = config.PDATA_ROOT / "reverse_indexing/wiki_p_level_unigram_rindexdb_hash_size_limited"
count = 0
# if limited_terms:
# limited_terms_set = load_wiki_abstract_terms(config.PRO_ROOT / "data/processed/wiki_abs_3gram_terms.txt")
# else:
# limited_terms_set = []
#
# limited_terms_set = set(limited_terms_set)
for key, value in tqdm(whole_tokenized_db_cursor, total=TOTAL_NUM_DOC):
item = json.loads(value)
article_title = item['title']
article_clean_text = item['clean_text']
article_poss = item['poss']
title_term_list = []
title_poss_list = []
title_ngram = None
assert len(article_clean_text) == len(article_poss)
for p_i, (paragraph_text, paragraph_poss) in enumerate(zip(article_clean_text, article_poss)):
paragraph_term_list = []
paragraph_poss_list = []
for sent_text, sent_poss in zip(paragraph_text, paragraph_poss):
if p_i == 0: # In title.
title_term_list.extend(sent_text)
title_poss_list.extend(sent_poss)
                    continue  # Terms in the title are not included again in the paragraph term lists.
else: # p_i != 0
paragraph_term_list.extend(sent_text)
paragraph_poss_list.extend(sent_poss)
if p_i == 0 and title_ngram is None:
title_ngram = get_ngrams(title_term_list, title_poss_list, 2,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if p_i >= 100:
break
paragraph_ngram = get_ngrams(paragraph_term_list, paragraph_poss_list, 2,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if len(paragraph_ngram) == 0:
continue
added_terms_num = 0
paragraph_key = key_separator.join((article_title, str(p_i)))
for added_term in title_ngram + paragraph_ngram:
# if added_term in limited_terms_set:
# wiki_p_level_indexdb.inverted_index.add(added_term, paragraph_key)
# added_terms_num += 1
# elif ' ' not in added_term:
                # Built-in hash() takes a single argument and is salted per
                # process; bucket both keys with the imported murmurhash3_32
                # so the saved index is stable and bounded.
                hash_value_added_term = murmurhash3_32(added_term, positive=True) % hash_size
                hash_value_paragraph_key = murmurhash3_32(paragraph_key, positive=True)
                wiki_p_level_indexdb.inverted_index.add(hash_value_added_term, hash_value_paragraph_key)
                added_terms_num += 1
                # else:
                #     pass
            hash_value_paragraph_key = murmurhash3_32(paragraph_key, positive=True)
            wiki_p_level_indexdb.document_length_table.add(hash_value_paragraph_key, added_terms_num)
count += 1
        # NOTE: debug cap; only the first 1,000 pages are indexed while this
        # block is left active.
        if count >= 1000:
            break
wiki_p_level_indexdb.save_to_file(file_name)
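The hash-size-limited variant buckets string terms into a bounded integer key space (the hashing trick). A stdlib sketch of the same idea, using `hashlib.md5` in place of the imported `murmurhash3_32` in case that dependency is unavailable, also illustrates why Python's built-in `hash()` is unsuitable here:

```python
import hashlib

def bucket(term, num_buckets=2**24):
    # Deterministic across runs, unlike the built-in hash(), which Python
    # salts per process (PYTHONHASHSEED).
    h = int.from_bytes(hashlib.md5(term.encode('utf-8')).digest()[:8], 'big')
    return h % num_buckets

print(bucket('france') == bucket('france'))  # True: stable for equal terms
print(0 <= bucket('paris') < 2**24)          # True: bounded key space
```

Distinct terms can collide in the same bucket; that is the accepted trade-off for a fixed-size key space.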
def whole_wiki_pages_title_raw_indexing_paragraph_level_unigram_size_limited_memory_saving():
key_separator = '/'
whole_tokenized_db_cursor = wiki_db_tool.get_cursor(config.WHOLE_PROCESS_FOR_RINDEX_DB)
whole_tokenized_db_cursor.execute("SELECT * from unnamed")
wiki_p_level_indexdb = IndexDB()
file_name = config.PDATA_ROOT / "reverse_indexing/wiki_p_level_unigram_rindexdb"
count = 0
# if limited_terms:
# limited_terms_set = load_wiki_abstract_terms(config.PRO_ROOT / "data/processed/wiki_abs_3gram_terms.txt")
# else:
# limited_terms_set = []
#
# limited_terms_set = set(limited_terms_set)
for key, value in tqdm(whole_tokenized_db_cursor, total=TOTAL_NUM_DOC):
item = json.loads(value)
article_title = item['title']
article_clean_text = item['clean_text']
article_poss = item['poss']
title_term_list = []
title_poss_list = []
title_ngram = None
assert len(article_clean_text) == len(article_poss)
for p_i, (paragraph_text, paragraph_poss) in enumerate(zip(article_clean_text, article_poss)):
paragraph_term_list = []
paragraph_poss_list = []
for sent_text, sent_poss in zip(paragraph_text, paragraph_poss):
if p_i == 0: # In title.
title_term_list.extend(sent_text)
title_poss_list.extend(sent_poss)
                    continue  # Terms in the title are not included again in the paragraph term lists.
else: # p_i != 0
paragraph_term_list.extend(sent_text)
paragraph_poss_list.extend(sent_poss)
if p_i == 0 and title_ngram is None:
title_ngram = get_ngrams(title_term_list, title_poss_list, 1,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if p_i >= 100:
break
paragraph_ngram = get_ngrams(paragraph_term_list, paragraph_poss_list, 1,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if len(paragraph_ngram) == 0:
continue
added_terms_num = 0
paragraph_key = key_separator.join((article_title, str(p_i)))
for added_term in title_ngram + paragraph_ngram:
# if added_term in limited_terms_set:
# wiki_p_level_indexdb.inverted_index.add(added_term, paragraph_key)
# added_terms_num += 1
# elif ' ' not in added_term:
                # Built-in hash() is salted per process (PYTHONHASHSEED), so an
                # index saved with it cannot be reloaded reliably in another
                # run; use the imported murmurhash3_32 for stable values.
                hash_value_added_term = murmurhash3_32(added_term, positive=True)
                hash_value_paragraph_key = murmurhash3_32(paragraph_key, positive=True)
                wiki_p_level_indexdb.inverted_index.add(hash_value_added_term, hash_value_paragraph_key)
                added_terms_num += 1
                # else:
                #     pass
            hash_value_paragraph_key = murmurhash3_32(paragraph_key, positive=True)
            wiki_p_level_indexdb.document_length_table.add(hash_value_paragraph_key, added_terms_num)
count += 1
# if count >= 1000:
# break
wiki_p_level_indexdb.save_to_file(file_name, memory_saving=True)
def whole_wiki_pages_title_raw_indexing_paragraph_level_to_indexdb():
key_separator = '/'
whole_tokenized_db_cursor = wiki_db_tool.get_cursor(config.WHOLE_PROCESS_FOR_RINDEX_DB)
whole_tokenized_db_cursor.execute("SELECT * from unnamed")
# wiki_p_level_indexdb = IndexDB()
file_name = config.PDATA_ROOT / "reverse_indexing/wiki_p_level_persistent_indexdb.db"
index_db = IndexingDB(file_name)
index_db.create_tables()
count = 0
term_title_items_buffer_list: List[Tuple[str, str, int]] = []
title_items_buffer_list: List[Tuple[str, int]] = []
for key, value in tqdm(whole_tokenized_db_cursor, total=TOTAL_NUM_DOC):
item = json.loads(value)
article_title = item['title']
article_clean_text = item['clean_text']
article_poss = item['poss']
title_term_list = []
title_poss_list = []
title_ngram = None
assert len(article_clean_text) == len(article_poss)
paragraph_term_title_dict: Dict[Tuple[str, str], int] = dict()
paragraph_title_dict: Dict[str, int] = dict()
for p_i, (paragraph_text, paragraph_poss) in enumerate(zip(article_clean_text, article_poss)):
paragraph_term_list = []
paragraph_poss_list = []
for sent_text, sent_poss in zip(paragraph_text, paragraph_poss):
if p_i == 0: # In title.
title_term_list.extend(sent_text)
title_poss_list.extend(sent_poss)
                        continue  # Terms in the title are not included again in the paragraph term lists.
else: # p_i != 0
paragraph_term_list.extend(sent_text)
paragraph_poss_list.extend(sent_poss)
if p_i == 0 and title_ngram is None:
title_ngram = get_ngrams(title_term_list, title_poss_list, 1,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
continue
paragraph_ngram = get_ngrams(paragraph_term_list, paragraph_poss_list, 1,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if len(paragraph_ngram) == 0:
continue
added_terms_num = 0
paragraph_key = key_separator.join((article_title, str(p_i)))
for added_term in title_ngram + paragraph_ngram:
paragraph_term_title_dict[(added_term, paragraph_key)] = \
paragraph_term_title_dict.get((added_term, paragraph_key), 0) + 1
added_terms_num += 1
paragraph_title_dict[paragraph_key] = added_terms_num
count += 1
if p_i >= 60:
break
if count >= 5000:
break
for (term, paragraph_key), ovalue in paragraph_term_title_dict.items():
term_title_items_buffer_list.append((term, paragraph_key, ovalue))
for paragraph_title, ovalue in paragraph_title_dict.items():
title_items_buffer_list.append((paragraph_title, ovalue))
if len(term_title_items_buffer_list) >= 1000: # Flush
index_db.insert_many_items(term_title_items_buffer_list)
index_db.insert_many_articles(title_items_buffer_list)
term_title_items_buffer_list = []
title_items_buffer_list = []
index_db.insert_many_items(term_title_items_buffer_list)
index_db.insert_many_articles(title_items_buffer_list)
index_db.close()
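The function above accumulates `(term, paragraph_key, count)` rows in memory and flushes them with `insert_many_*` whenever the buffer reaches 1000 entries, followed by a final flush for the remainder. A minimal, self-contained sketch of that buffered-flush pattern using only the standard-library `sqlite3` module (the table and column names here are illustrative, not the ones `IndexingDB` actually uses):

```python
import sqlite3

def buffered_insert(rows, batch_size=1000):
    """Insert rows in batches, flushing whenever the buffer fills."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE postings (term TEXT, doc_key TEXT, freq INTEGER)")
    buffer = []
    for row in rows:
        buffer.append(row)
        if len(buffer) >= batch_size:  # Flush, mirroring the 1000-item threshold above.
            conn.executemany("INSERT INTO postings VALUES (?, ?, ?)", buffer)
            buffer = []
    # Final flush for whatever is left, like the trailing insert_many_* calls above.
    conn.executemany("INSERT INTO postings VALUES (?, ?, ?)", buffer)
    conn.commit()
    return conn

conn = buffered_insert([("cat", "Title/1", 2), ("dog", "Title/2", 1)], batch_size=1)
print(conn.execute("SELECT COUNT(*) FROM postings").fetchone()[0])  # -> 2
```

Batching with `executemany` amortizes per-statement overhead, which matters when indexing millions of paragraphs.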
def whole_wiki_pages_title_raw_indexing_article_level_to_indexdb():
whole_tokenized_db_cursor = wiki_db_tool.get_cursor(config.WHOLE_PROCESS_FOR_RINDEX_DB)
whole_tokenized_db_cursor.execute("SELECT * from unnamed")
# wiki_p_level_indexdb = IndexDB()
file_name = config.PDATA_ROOT / "reverse_indexing/wiki_a_level_persistent_indexdb.db"
index_db = IndexingDB(file_name)
index_db.create_tables()
count = 0
term_title_items_buffer_list: List[Tuple[str, str, int]] = []
title_items_buffer_list: List[Tuple[str, int]] = []
for key, value in tqdm(whole_tokenized_db_cursor, total=TOTAL_NUM_DOC):
article_term_title_dict: Dict[Tuple[str, str], int] = dict()
article_title_dict: Dict[str, int] = dict()
item = json.loads(value)
article_title = item['title']
article_clean_text = item['clean_text']
article_poss = item['poss']
title_term_list = []
title_poss_list = []
title_ngram = None
article_ngram = []
assert len(article_clean_text) == len(article_poss)
for p_i, (paragraph_text, paragraph_poss) in enumerate(zip(article_clean_text, article_poss)):
paragraph_term_list = []
paragraph_poss_list = []
for sent_text, sent_poss in zip(paragraph_text, paragraph_poss):
if p_i == 0: # In title.
title_term_list.extend(sent_text)
title_poss_list.extend(sent_poss)
continue # Terms that appear in the title are not added again to the paragraph/article term lists.
else: # p_i != 0
paragraph_term_list.extend(sent_text)
paragraph_poss_list.extend(sent_poss)
if p_i == 0 and title_ngram is None:
title_ngram = get_ngrams(title_term_list, title_poss_list, 2,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
continue
paragraph_ngram = get_ngrams(paragraph_term_list, paragraph_poss_list, 2,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if len(paragraph_ngram) == 0:
continue
article_ngram.extend(paragraph_ngram)
if p_i >= 60:
break
added_terms_num = 0
for added_term in title_ngram + article_ngram:
article_term_title_dict[(added_term, article_title)] = \
article_term_title_dict.get((added_term, article_title), 0) + 1
added_terms_num += 1
article_title_dict[article_title] = added_terms_num
count += 1
if count >= 200:
break
for (term, article_title), ovalue in article_term_title_dict.items():
term_title_items_buffer_list.append((term, article_title, ovalue))
for article_title, ovalue in article_title_dict.items():
title_items_buffer_list.append((article_title, ovalue))
if len(term_title_items_buffer_list) >= 1000: # Flush
index_db.insert_many_items(term_title_items_buffer_list)
index_db.insert_many_articles(title_items_buffer_list)
term_title_items_buffer_list = []
title_items_buffer_list = []
index_db.insert_many_items(term_title_items_buffer_list)
index_db.insert_many_articles(title_items_buffer_list)
index_db.close()
def whole_wiki_pages_title_raw_indexing_article_level(limited_terms=True):
whole_tokenized_db_cursor = wiki_db_tool.get_cursor(config.WHOLE_PROCESS_FOR_RINDEX_DB)
whole_tokenized_db_cursor.execute("SELECT * from unnamed")
wiki_p_level_indexdb = IndexDB()
file_name = config.PDATA_ROOT / "reverse_indexing/wiki_a_level_limited_gram_rindexdb"
if limited_terms:
limited_terms_set = load_wiki_abstract_terms(config.PRO_ROOT / "data/processed/wiki_abs_3gram_terms.txt")
else:
limited_terms_set = []
limited_terms_set = set(limited_terms_set)
count = 0
for key, value in tqdm(whole_tokenized_db_cursor, total=TOTAL_NUM_DOC):
item = json.loads(value)
article_title = item['title']
article_clean_text = item['clean_text']
article_poss = item['poss']
title_term_list = []
title_poss_list = []
title_ngram = None
assert len(article_clean_text) == len(article_poss)
# article_term_list = []
# article_poss_list = []
article_ngram = []
for p_i, (paragraph_text, paragraph_poss) in enumerate(zip(article_clean_text, article_poss)):
paragraph_term_list = []
paragraph_poss_list = []
for sent_text, sent_poss in zip(paragraph_text, paragraph_poss):
if p_i == 0: # In title.
title_term_list.extend(sent_text)
title_poss_list.extend(sent_poss)
continue # Terms that appear in the title are not added again to the paragraph/article term lists.
else: # p_i != 0
paragraph_term_list.extend(sent_text)
paragraph_poss_list.extend(sent_poss)
if p_i == 0 and title_ngram is None:
title_ngram = get_ngrams(title_term_list, title_poss_list, 1,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
continue
paragraph_ngram = get_ngrams(paragraph_term_list, paragraph_poss_list, 1,
filter_fn=partial(filter_ngram, mode='any'),
included_tags=POS_INCLUDED)
if len(paragraph_ngram) == 0:
continue
article_ngram.extend(paragraph_ngram)
if p_i >= 80:
break
added_terms_num = 0
for added_term in title_ngram + article_ngram:
if added_term in limited_terms_set:
wiki_p_level_indexdb.inverted_index.add(added_term, article_title)
added_terms_num += 1
elif ' ' not in added_term:
wiki_p_level_indexdb.inverted_index.add(added_term, article_title)
added_terms_num += 1
wiki_p_level_indexdb.document_length_table.add(article_title, added_terms_num)
count += 1
# if count >= 5000:
# break
wiki_p_level_indexdb.save_to_file(file_name)
def hash(token, num_buckets=None):
"""Unsigned 32 bit murmurhash for feature hashing."""
if num_buckets is None:
return murmurhash3_32(token, positive=True)
else:
return murmurhash3_32(token, positive=True) % num_buckets
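For illustration, the bucketing behaviour of `hash()` can be exercised without scikit-learn by substituting any unsigned 32-bit hash. In this sketch `zlib.crc32` stands in for `murmurhash3_32` (an assumption made only for the demo; the real function above requires `sklearn.utils.murmurhash3_32`):

```python
import zlib

def demo_hash(token, num_buckets=None):
    """Same shape as hash() above, with zlib.crc32 standing in for murmurhash3_32."""
    h = zlib.crc32(token.encode("utf-8")) & 0xFFFFFFFF  # force unsigned 32-bit
    return h if num_buckets is None else h % num_buckets

bucket = demo_hash("the", num_buckets=2 ** 24)
print(0 <= bucket < 2 ** 24)  # -> True
```

The `num_buckets` modulus is what makes the size-limited index variants (e.g. the `2 ** 24` call in `__main__`) possible: distinct terms may collide, but memory stays bounded.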
if __name__ == '__main__':
# abs_rindexdb = IndexDB()
# abs_rindexdb.load_from_file(config.PDATA_ROOT / "reverse_indexing/abs_rindexdb")
# print(len(abs_rindexdb))
# query = "What science fantasy young adult series, told in first person, has a set of companion books narrating the stories of enslaved worlds and alien species?"
# whole_wiki_pages_title_raw_indexing_paragraph_level()
# whole_wiki_pages_title_raw_indexing_paragraph_level(limited_terms=True)
# whole_wiki_pages_title_raw_indexing_paragraph_level_unigram()
# whole_wiki_pages_title_raw_indexing_paragraph_level_unigram_size_limited(2 ** 24)
whole_wiki_pages_title_raw_indexing_paragraph_level_unigram_size_limited_memory_saving()
# g_score_dict = dict()
# load_from_file(g_score_dict,
# config.PDATA_ROOT / "reverse_indexing/wiki_p_level_unigram_rindexdb_hash_size_limited/scored_db/default-tf-idf.score.txt",
# config.PDATA_ROOT / "reverse_indexing/wiki_p_level_unigram_rindexdb/inverted_index.txt",
# with_int_type=True, memory_efficient=True)
# print(g_score_dict)
# whole_wiki_pages_title_raw_indexing_article_level()
# whole_wiki_pages_title_raw_indexing_paragraph_level_to_indexdb()
# whole_wiki_pages_title_raw_indexing_article_level_to_indexdb()
# whole_wiki_pages_title_raw_indexing()
# File: sdk/python/pulumi_vault/azure/backend.py (repo: pulumi/pulumi-vault; licenses: ECL-2.0, Apache-2.0)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['BackendArgs', 'Backend']
@pulumi.input_type
class BackendArgs:
def __init__(__self__, *,
subscription_id: pulumi.Input[str],
tenant_id: pulumi.Input[str],
client_id: Optional[pulumi.Input[str]] = None,
client_secret: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a Backend resource.
:param pulumi.Input[str] subscription_id: - The subscription id for the Azure Active Directory.
:param pulumi.Input[str] tenant_id: - The tenant id for the Azure Active Directory.
:param pulumi.Input[str] client_id: - The OAuth2 client id to connect to Azure.
:param pulumi.Input[str] client_secret: - The OAuth2 client secret to connect to Azure.
:param pulumi.Input[str] description: Human-friendly description of the mount for the backend.
:param pulumi.Input[str] environment: - The Azure environment.
:param pulumi.Input[str] path: - The unique path this backend should be mounted at. Defaults to `azure`.
"""
pulumi.set(__self__, "subscription_id", subscription_id)
pulumi.set(__self__, "tenant_id", tenant_id)
if client_id is not None:
pulumi.set(__self__, "client_id", client_id)
if client_secret is not None:
pulumi.set(__self__, "client_secret", client_secret)
if description is not None:
pulumi.set(__self__, "description", description)
if environment is not None:
pulumi.set(__self__, "environment", environment)
if path is not None:
pulumi.set(__self__, "path", path)
@property
@pulumi.getter(name="subscriptionId")
def subscription_id(self) -> pulumi.Input[str]:
"""
- The subscription id for the Azure Active Directory.
"""
return pulumi.get(self, "subscription_id")
@subscription_id.setter
def subscription_id(self, value: pulumi.Input[str]):
pulumi.set(self, "subscription_id", value)
@property
@pulumi.getter(name="tenantId")
def tenant_id(self) -> pulumi.Input[str]:
"""
- The tenant id for the Azure Active Directory.
"""
return pulumi.get(self, "tenant_id")
@tenant_id.setter
def tenant_id(self, value: pulumi.Input[str]):
pulumi.set(self, "tenant_id", value)
@property
@pulumi.getter(name="clientId")
def client_id(self) -> Optional[pulumi.Input[str]]:
"""
- The OAuth2 client id to connect to Azure.
"""
return pulumi.get(self, "client_id")
@client_id.setter
def client_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "client_id", value)
@property
@pulumi.getter(name="clientSecret")
def client_secret(self) -> Optional[pulumi.Input[str]]:
"""
- The OAuth2 client secret to connect to Azure.
"""
return pulumi.get(self, "client_secret")
@client_secret.setter
def client_secret(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "client_secret", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Human-friendly description of the mount for the backend.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def environment(self) -> Optional[pulumi.Input[str]]:
"""
- The Azure environment.
"""
return pulumi.get(self, "environment")
@environment.setter
def environment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "environment", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
- The unique path this backend should be mounted at. Defaults to `azure`.
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@pulumi.input_type
class _BackendState:
def __init__(__self__, *,
client_id: Optional[pulumi.Input[str]] = None,
client_secret: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
subscription_id: Optional[pulumi.Input[str]] = None,
tenant_id: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Backend resources.
:param pulumi.Input[str] client_id: - The OAuth2 client id to connect to Azure.
:param pulumi.Input[str] client_secret: - The OAuth2 client secret to connect to Azure.
:param pulumi.Input[str] description: Human-friendly description of the mount for the backend.
:param pulumi.Input[str] environment: - The Azure environment.
:param pulumi.Input[str] path: - The unique path this backend should be mounted at. Defaults to `azure`.
:param pulumi.Input[str] subscription_id: - The subscription id for the Azure Active Directory.
:param pulumi.Input[str] tenant_id: - The tenant id for the Azure Active Directory.
"""
if client_id is not None:
pulumi.set(__self__, "client_id", client_id)
if client_secret is not None:
pulumi.set(__self__, "client_secret", client_secret)
if description is not None:
pulumi.set(__self__, "description", description)
if environment is not None:
pulumi.set(__self__, "environment", environment)
if path is not None:
pulumi.set(__self__, "path", path)
if subscription_id is not None:
pulumi.set(__self__, "subscription_id", subscription_id)
if tenant_id is not None:
pulumi.set(__self__, "tenant_id", tenant_id)
@property
@pulumi.getter(name="clientId")
def client_id(self) -> Optional[pulumi.Input[str]]:
"""
- The OAuth2 client id to connect to Azure.
"""
return pulumi.get(self, "client_id")
@client_id.setter
def client_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "client_id", value)
@property
@pulumi.getter(name="clientSecret")
def client_secret(self) -> Optional[pulumi.Input[str]]:
"""
- The OAuth2 client secret to connect to Azure.
"""
return pulumi.get(self, "client_secret")
@client_secret.setter
def client_secret(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "client_secret", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Human-friendly description of the mount for the backend.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def environment(self) -> Optional[pulumi.Input[str]]:
"""
- The Azure environment.
"""
return pulumi.get(self, "environment")
@environment.setter
def environment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "environment", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
- The unique path this backend should be mounted at. Defaults to `azure`.
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter(name="subscriptionId")
def subscription_id(self) -> Optional[pulumi.Input[str]]:
"""
- The subscription id for the Azure Active Directory.
"""
return pulumi.get(self, "subscription_id")
@subscription_id.setter
def subscription_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "subscription_id", value)
@property
@pulumi.getter(name="tenantId")
def tenant_id(self) -> Optional[pulumi.Input[str]]:
"""
- The tenant id for the Azure Active Directory.
"""
return pulumi.get(self, "tenant_id")
@tenant_id.setter
def tenant_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tenant_id", value)
class Backend(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
client_id: Optional[pulumi.Input[str]] = None,
client_secret: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
subscription_id: Optional[pulumi.Input[str]] = None,
tenant_id: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Create a Backend resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] client_id: - The OAuth2 client id to connect to Azure.
:param pulumi.Input[str] client_secret: - The OAuth2 client secret to connect to Azure.
:param pulumi.Input[str] description: Human-friendly description of the mount for the backend.
:param pulumi.Input[str] environment: - The Azure environment.
:param pulumi.Input[str] path: - The unique path this backend should be mounted at. Defaults to `azure`.
:param pulumi.Input[str] subscription_id: - The subscription id for the Azure Active Directory.
:param pulumi.Input[str] tenant_id: - The tenant id for the Azure Active Directory.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: BackendArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Create a Backend resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param BackendArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(BackendArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
client_id: Optional[pulumi.Input[str]] = None,
client_secret: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
subscription_id: Optional[pulumi.Input[str]] = None,
tenant_id: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = BackendArgs.__new__(BackendArgs)
__props__.__dict__["client_id"] = client_id
__props__.__dict__["client_secret"] = client_secret
__props__.__dict__["description"] = description
__props__.__dict__["environment"] = environment
__props__.__dict__["path"] = path
if subscription_id is None and not opts.urn:
raise TypeError("Missing required property 'subscription_id'")
__props__.__dict__["subscription_id"] = subscription_id
if tenant_id is None and not opts.urn:
raise TypeError("Missing required property 'tenant_id'")
__props__.__dict__["tenant_id"] = tenant_id
super(Backend, __self__).__init__(
'vault:azure/backend:Backend',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
client_id: Optional[pulumi.Input[str]] = None,
client_secret: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
subscription_id: Optional[pulumi.Input[str]] = None,
tenant_id: Optional[pulumi.Input[str]] = None) -> 'Backend':
"""
Get an existing Backend resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] client_id: - The OAuth2 client id to connect to Azure.
:param pulumi.Input[str] client_secret: - The OAuth2 client secret to connect to Azure.
:param pulumi.Input[str] description: Human-friendly description of the mount for the backend.
:param pulumi.Input[str] environment: - The Azure environment.
:param pulumi.Input[str] path: - The unique path this backend should be mounted at. Defaults to `azure`.
:param pulumi.Input[str] subscription_id: - The subscription id for the Azure Active Directory.
:param pulumi.Input[str] tenant_id: - The tenant id for the Azure Active Directory.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _BackendState.__new__(_BackendState)
__props__.__dict__["client_id"] = client_id
__props__.__dict__["client_secret"] = client_secret
__props__.__dict__["description"] = description
__props__.__dict__["environment"] = environment
__props__.__dict__["path"] = path
__props__.__dict__["subscription_id"] = subscription_id
__props__.__dict__["tenant_id"] = tenant_id
return Backend(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="clientId")
def client_id(self) -> pulumi.Output[Optional[str]]:
"""
- The OAuth2 client id to connect to Azure.
"""
return pulumi.get(self, "client_id")
@property
@pulumi.getter(name="clientSecret")
def client_secret(self) -> pulumi.Output[Optional[str]]:
"""
- The OAuth2 client secret to connect to Azure.
"""
return pulumi.get(self, "client_secret")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
Human-friendly description of the mount for the backend.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter
def environment(self) -> pulumi.Output[Optional[str]]:
"""
- The Azure environment.
"""
return pulumi.get(self, "environment")
@property
@pulumi.getter
def path(self) -> pulumi.Output[Optional[str]]:
"""
- The unique path this backend should be mounted at. Defaults to `azure`.
"""
return pulumi.get(self, "path")
@property
@pulumi.getter(name="subscriptionId")
def subscription_id(self) -> pulumi.Output[str]:
"""
- The subscription id for the Azure Active Directory.
"""
return pulumi.get(self, "subscription_id")
@property
@pulumi.getter(name="tenantId")
def tenant_id(self) -> pulumi.Output[str]:
"""
- The tenant id for the Azure Active Directory.
"""
return pulumi.get(self, "tenant_id")
# File: data/subqueries.py (repo: delta-reporter/delta-core; license: Apache-2.0)
import models
from app import db
from sqlalchemy.sql import func
class TestCounts:
# Subqueries to return test amounts by test run id
total_tests_by_test_run_id = (
db.session.query(models.Test.test_run_id, func.count("*").label("tests_count"))
.group_by(models.Test.test_run_id)
.subquery()
)
failed_tests_by_test_run_id = (
db.session.query(
models.Test.test_run_id, func.count("*").label("failed_tests_count"),
)
.filter(models.Test.test_status_id == 1)
.group_by(models.Test.test_run_id)
.subquery()
)
passed_tests_by_test_run_id = (
db.session.query(
models.Test.test_run_id, func.count("*").label("passed_tests_count"),
)
.filter(models.Test.test_status_id == 2)
.group_by(models.Test.test_run_id)
.subquery()
)
running_tests_by_test_run_id = (
db.session.query(
models.Test.test_run_id, func.count("*").label("running_tests_count"),
)
.filter(models.Test.test_status_id == 3)
.group_by(models.Test.test_run_id)
.subquery()
)
incomplete_tests_by_test_run_id = (
db.session.query(
models.Test.test_run_id, func.count("*").label("incomplete_tests_count"),
)
.filter(models.Test.test_status_id == 4)
.group_by(models.Test.test_run_id)
.subquery()
)
skipped_tests_by_test_run_id = (
db.session.query(
models.Test.test_run_id, func.count("*").label("skipped_tests_count"),
)
.filter(models.Test.test_status_id == 5)
.group_by(models.Test.test_run_id)
.subquery()
)
# Subqueries to return test amounts by test suite history id
total_tests_by_test_suite_history_id = (
db.session.query(
models.Test.test_suite_history_id, func.count("*").label("tests_count"),
)
.group_by(models.Test.test_suite_history_id)
.subquery()
)
failed_tests_by_test_suite_history_id = (
db.session.query(
models.Test.test_suite_history_id,
func.count("*").label("failed_tests_count"),
)
.filter(models.Test.test_status_id == 1)
.group_by(models.Test.test_suite_history_id)
.subquery()
)
passed_tests_by_test_suite_history_id = (
db.session.query(
models.Test.test_suite_history_id,
func.count("*").label("passed_tests_count"),
)
.filter(models.Test.test_status_id == 2)
.group_by(models.Test.test_suite_history_id)
.subquery()
)
running_tests_by_test_suite_history_id = (
db.session.query(
models.Test.test_suite_history_id,
func.count("*").label("running_tests_count"),
)
.filter(models.Test.test_status_id == 3)
.group_by(models.Test.test_suite_history_id)
.subquery()
)
incomplete_tests_by_test_suite_history_id = (
db.session.query(
models.Test.test_suite_history_id,
func.count("*").label("incomplete_tests_count"),
)
.filter(models.Test.test_status_id == 4)
.group_by(models.Test.test_suite_history_id)
.subquery()
)
skipped_tests_by_test_suite_history_id = (
db.session.query(
models.Test.test_suite_history_id,
func.count("*").label("skipped_tests_count"),
)
.filter(models.Test.test_status_id == 5)
.group_by(models.Test.test_suite_history_id)
.subquery()
)
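Each subquery in `TestCounts` follows the same recipe: group tests by a key and count them, optionally filtered on one `test_status_id` (the filters above use 1=failed, 2=passed, 3=running, 4=incomplete, 5=skipped). A framework-free sketch of that per-status counting in plain Python (no SQLAlchemy; names here are illustrative):

```python
from collections import Counter

def counts_by_run(tests):
    """tests: iterable of (test_run_id, test_status_id) pairs."""
    totals, per_status = Counter(), Counter()
    for run_id, status_id in tests:
        totals[run_id] += 1                      # like total_tests_by_test_run_id
        per_status[(run_id, status_id)] += 1     # like the status-filtered subqueries
    return totals, per_status

totals, per_status = counts_by_run([(7, 1), (7, 2), (7, 2), (8, 5)])
print(totals[7], per_status[(7, 2)])  # -> 3 2
```

In the real class the same shape is expressed as SQL so the database does the grouping; the repetition across status ids could likewise be collapsed into a small factory taking the status id and label as parameters.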
# File: Feature Selection/BorutaSHAP Implementation.py (repo: caballerown/MMF-DSS; license: MIT)
# Caballero, Gaw, Jenkins, and Johnstone
# Toward Automated Instructor Pilots in Legacy Air Force Systems:
# Physiology-based Flight Difficulty Classification via Machine Learning
# BorutaSHAP
# Physiological features only
# Import Libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from BorutaShap import BorutaShap
#matplotlib inline
# Import input features
input_df = pd.read_csv('total_dev_all_phys.csv')
# Drop unnecessary columns
input_df = input_df.drop(['Subject-Run'], axis=1)
# Import response/output values
output_df = pd.read_csv('PerfMetrics_dev.csv')
difficulty = output_df['Difficulty']
# Feature selection via BorutaShap
Feature_Selector = BorutaShap(importance_measure='shap',classification = True)
Feature_Selector.fit(X=input_df, y = difficulty, n_trials = 100, random_state=42)
input_df = Feature_Selector.Subset()
# Assign training data
x_train = input_df
y_train = difficulty
# load test data
x_test = pd.read_csv('total_val_all_phys.csv')
x_test = x_test.loc[:, input_df.columns]
output_df_test = pd.read_csv('PerfMetrics_val.csv')
difficulty_test = output_df_test['Difficulty']
y_test = difficulty_test
# Convert to binary
y_train_binary = y_train.copy()
y_train_binary[:] = [x if x != 2 else 1 for x in y_train_binary]
y_train_binary[:] = [x if x != 3 else 4 for x in y_train_binary]
y_test_binary = y_test.copy()
y_test_binary[:] = [x if x != 2 else 1 for x in y_test_binary]
y_test_binary[:] = [x if x != 3 else 4 for x in y_test_binary]
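The four comprehensions above fold the 4-level difficulty labels into two classes, mapping {1, 2} -> 1 and {3, 4} -> 4. An equivalent one-pass helper (purely illustrative; the script itself keeps the comprehension form):

```python
def binarize_difficulty(labels):
    """Map difficulty levels {1, 2} -> 1 and {3, 4} -> 4."""
    return [1 if x in (1, 2) else 4 for x in labels]

print(binarize_difficulty([1, 2, 3, 4, 2]))  # -> [1, 1, 4, 4, 1]
```

Collapsing to two classes turns the 4-way difficulty rating into a binary easy-vs-hard classification target.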
# Save files
x_train.to_csv(r'C:\Users\masked_user\Desktop\xtrain.csv', index=False, header=True)
y_train.to_csv(r'C:\Users\masked_user\Desktop\ytrain.csv', index=False, header=True)
x_test.to_csv(r'C:\Users\masked_user\Desktop\xtest.csv', index=False, header=True)
y_test.to_csv(r'C:\Users\masked_user\Desktop\ytest.csv', index=False, header=True)
# Caballero, Gaw, Jenkins, and Johnstone
# Toward Automated Instructor Pilots in Legacy Air Force Systems:
# Physiology-based Flight Difficulty Classification via Machine Learning
# BorutaSHAP
# All features included
# Import Libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from BorutaShap import BorutaShap
#matplotlib inline
# Import input features
input_df = pd.read_csv('total_dev_all.csv')
# Drop unnecessary columns
input_df = input_df.drop(['Subject-Run'], axis=1)
# Import response/output values
output_df = pd.read_csv('PerfMetrics_dev.csv')
difficulty = output_df['Difficulty']
# Feature selection via BorutaShap
Feature_Selector = BorutaShap(importance_measure='shap',classification = True)
Feature_Selector.fit(X=input_df, y = difficulty, n_trials = 100, random_state=42)
input_df = Feature_Selector.Subset()
# Assign training data
x_train = input_df
y_train = difficulty
# load test data
x_test = pd.read_csv('total_val_all.csv')
x_test = x_test.loc[:, input_df.columns]
output_df_test = pd.read_csv('PerfMetrics_val.csv')
difficulty_test = output_df_test['Difficulty']
y_test = difficulty_test
# Convert to binary
y_train_binary = y_train.copy()
y_train_binary[:] = [x if x != 2 else 1 for x in y_train_binary]
y_train_binary[:] = [x if x != 3 else 4 for x in y_train_binary]
y_test_binary = y_test.copy()
y_test_binary[:] = [x if x != 2 else 1 for x in y_test_binary]
y_test_binary[:] = [x if x != 3 else 4 for x in y_test_binary]
# Save files
x_train.to_csv(r'C:\Users\masked_user\Desktop\xtrain2.csv', index=False, header=True)
y_train.to_csv(r'C:\Users\masked_user\Desktop\ytrain2.csv', index=False, header=True)
x_test.to_csv(r'C:\Users\masked_user\Desktop\xtest2.csv', index=False, header=True)
y_test.to_csv(r'C:\Users\masked_user\Desktop\ytest2.csv', index=False, header=True)
| 32.614035 | 85 | 0.768693 | 618 | 3,718 | 4.391586 | 0.18123 | 0.035372 | 0.044215 | 0.029477 | 0.963154 | 0.946205 | 0.946205 | 0.946205 | 0.946205 | 0.946205 | 0 | 0.009804 | 0.122109 | 3,718 | 113 | 86 | 32.902655 | 0.821691 | 0.226466 | 0 | 0.785714 | 0 | 0 | 0.188467 | 0.125176 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.178571 | 0 | 0.178571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
59d1497a9354aac614d41d50170727316ed1627a | 3,979 | py | Python | auto.py | subhash26jan96/cluster | bdc6d72872d4bd3f16cb358e32c764175adc8083 | [
"Apache-2.0"
] | null | null | null | auto.py | subhash26jan96/cluster | bdc6d72872d4bd3f16cb358e32c764175adc8083 | [
"Apache-2.0"
] | null | null | null | auto.py | subhash26jan96/cluster | bdc6d72872d4bd3f16cb358e32c764175adc8083 | [
"Apache-2.0"
] | 1 | 2018-09-12T20:38:57.000Z | 2018-09-12T20:38:57.000Z |
#!/usr/bin/python2
print "Content-type: text/html"
print ""
import time,os,sys,commands,re,cgi,cgitb
cgitb.enable()
print """
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="generator" content="CoffeeCup HTML Editor (www.coffeecup.com)">
<meta name="dcterms.created" content="Mon, 20 Jun 2016 14:55:04 GMT">
<meta name="description" content="">
<meta name="keywords" content="">
<title>CG Hadoop</title>
<style>
body
{
margin:0;
padding:0;
}
.container{
height: 100%;
width: 100%;
}
#valuep{
z-index:1;
padding:0;
margin:1%;
width:10%;
height:30%;
text-align:center;
position:fixed;
left:7%;
bottom:40%;
border: 2px solid;
font-family:Helvetica;
background:rgba(0,0,0,.13);
color:#fff;
filter:blur(50px);
}
.forebodyimg
{
padding:0;
margin:0;
width:auto;
height:50%;
position:fixed;
left:37%;
bottom:25%;
opacity: 0.4;
}
#headerdiv
{position: fixed;
width:100%;
height:10%;
padding:0;
z-index: 1;
}
.imagehead
{
width:100%;
height:100%;
opacity: 0.8;
}
#bodydiv
{position: fixed;
width:100%;
height:90%;
}
#forebodyimg
{
width:100%;
height:100%;
}
#footerdiv
{ border: 5px solid green;
position: fixed;
top: 90%;
width:100%;
height:15%;
padding:0;
z-index: 1;
}
#footerimg
{
width:100%;
height:100%;
opacity: 0.8;
}
</style>
</head>
"""
print """
<div id="bodydiv">
<img class="imagehead" src="http://192.168.0.1/photoshop-spotlight-background-free-psd-1.jpg">
<img class="forebodyimg" src="http://192.168.0.1/ele1.png" />
</div>
<div id="footerdiv">
<img id="footerimg" src="http://192.168.0.1/darkBlue.jpg">
</div>
"""
print """
<head>
<script>
var auto = prompt("Maximum No. Of Available Data Nodes: 2");
if (auto!="")
{
document.location="http://192.168.0.1/cgi-bin/automate.py?auto=" + auto;
}
</script>
</head>
</body>
</html>
"""
| 14.575092 | 102 | 0.490576 | 441 | 3,979 | 4.426304 | 0.297052 | 0.057377 | 0.086066 | 0.052254 | 0.820697 | 0.814549 | 0.791496 | 0.791496 | 0.791496 | 0.791496 | 0 | 0.085714 | 0.349083 | 3,979 | 272 | 103 | 14.628676 | 0.667954 | 0.004272 | 0 | 0.763547 | 0 | 0.029557 | 0.965152 | 0.05202 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.004926 | null | null | 0.029557 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
ab7dab90d212943af166fef4b10d16adf00ddbf7 | 12,585 | py | Python | benchmarks/benchmark_down_sample_sampler.py | kvpradap/py_entitymatching | 4ff803df1a03cf4d77ef935357355e6de5dd9438 | [
"BSD-3-Clause"
] | 165 | 2016-08-28T14:30:01.000Z | 2022-03-29T17:24:03.000Z | benchmarks/benchmark_down_sample_sampler.py | mvahit/py_entitymatching | 6724081d7d95c547e5a51625b4a8207c6c1737f8 | [
"MIT",
"BSD-2-Clause",
"BSD-3-Clause"
] | 70 | 2016-11-22T00:35:22.000Z | 2022-03-11T22:26:26.000Z | benchmarks/benchmark_down_sample_sampler.py | mvahit/py_entitymatching | 6724081d7d95c547e5a51625b4a8207c6c1737f8 | [
"MIT",
"BSD-2-Clause",
"BSD-3-Clause"
] | 53 | 2016-09-22T02:07:34.000Z | 2022-03-19T18:57:06.000Z |
# Write the benchmarking functions here.
# See "Writing benchmarks" in the asv docs for more information.
import os
import py_entitymatching as em
from py_entitymatching.utils.generic_helper import get_install_path
import sys
if sys.version[0] == '2':
reload(sys)
sys.setdefaultencoding("utf-8")
PATH = get_install_path()
DATASET_PATH = os.sep.join([PATH, 'datasets', 'example_datasets'])
class TimeDownSampleRestaurants:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'restaurants', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'restaurants', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 500
self.y_param = 2
except AssertionError:
print("Dataset \'restaurants\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleElectronics:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'electronics', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'electronics', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 500
self.y_param = 5
except AssertionError:
print("Dataset \'electronics\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleAnime:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'anime', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'anime', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 1000
self.y_param = 1
except AssertionError:
print("Dataset \'anime\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleBooks:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'books', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'books', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 2000
self.y_param = 2
except AssertionError:
print("Dataset \'books\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleCitations:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'citations', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'citations', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 3000
self.y_param = 2
except AssertionError:
print("Dataset \'citations\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleBikes:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'bikes', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'bikes', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 2500
self.y_param = 2
except AssertionError:
print("Dataset \'bikes\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleCosmetics:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'cosmetics', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'cosmetics', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 4000
self.y_param = 1
except AssertionError:
print("Dataset \'cosmetics\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleEbooks:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'ebooks', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'ebooks', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 3000
self.y_param = 1
except AssertionError:
print("Dataset \'ebooks\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleMovies:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'movies', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'movies', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 1000
self.y_param = 2
except AssertionError:
print("Dataset \'movies\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleMusic:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'music', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'music', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 1500
self.y_param = 2
except AssertionError:
print("Dataset \'music\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleBeer:
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'beer', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'beer', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 500
self.y_param = 10
except AssertionError:
print("Dataset \'beer\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleASongs1:
timeout = 2000.0
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'songs', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'songs', 'A.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 2000
self.y_param = 2
except AssertionError:
print("Dataset \'songs\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleASongs2:
timeout = 2000.0
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'songs', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'songs', 'A.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 3000
self.y_param = 1
except AssertionError:
print("Dataset \'songs\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleASongs3:
timeout = 2000.0
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'songs', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'songs', 'A.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 4000
self.y_param = 1
except AssertionError:
print("Dataset \'songs\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleCitation1:
timeout = 2000.0
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'citation', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'citation', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 1000
self.y_param = 1
except AssertionError:
print("Dataset \'citation\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
del self.y_param
class TimeDownSampleCitation2:
timeout = 2000.0
def setup(self):
path_for_a = os.sep.join([DATASET_PATH, 'citation', 'A.csv'])
path_for_b = os.sep.join([DATASET_PATH, 'citation', 'B.csv'])
try:
self.A = em.read_csv_metadata(path_for_a)
self.B = em.read_csv_metadata(path_for_b)
self.size = 2000
self.y_param = 1
except AssertionError:
print("Dataset \'citation\' not found. Please visit the project website to download the dataset.")
raise SystemExit
def time_down_sample_tables(self):
em.down_sample(self.A, self.B, self.size, self.y_param)
def teardown(self):
del self.A
del self.B
del self.size
        del self.y_param
| 31.941624 | 113 | 0.606436 | 1,753 | 12,585 | 4.156874 | 0.064461 | 0.061479 | 0.065871 | 0.070262 | 0.891313 | 0.891313 | 0.870454 | 0.870454 | 0.845067 | 0.845067 | 0 | 0.01243 | 0.290425 | 12,585 | 394 | 114 | 31.941624 | 0.803583 | 0.011204 | 0 | 0.805031 | 0 | 0 | 0.145234 | 0 | 0 | 0 | 0 | 0 | 0.050314 | 1 | 0.150943 | false | 0 | 0.012579 | 0 | 0.22956 | 0.050314 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
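The benchmark classes above differ only in dataset name, sample size, and `y_param`. A hedged sketch (assuming asv discovers module-level classes regardless of how they were created; the `em.down_sample` call is stubbed out here) of generating them from a table instead of hand-copying:

```python
# Sketch: build the near-identical asv benchmark classes from
# (class suffix, dataset, size, y_param) rows. Values mirror a few of the
# hand-written classes above; the actual down_sample call is stubbed.
DATASETS = [
    ('Restaurants', 'restaurants', 500, 2),
    ('Electronics', 'electronics', 500, 5),
    ('Anime', 'anime', 1000, 1),
]

def make_benchmark(cls_suffix, dataset, size, y_param):
    def setup(self):
        self.dataset = dataset
        self.size = size
        self.y_param = y_param
    def time_down_sample_tables(self):
        pass  # real version: em.down_sample(self.A, self.B, self.size, self.y_param)
    return type('TimeDownSample' + cls_suffix, (),
                {'setup': setup,
                 'time_down_sample_tables': time_down_sample_tables})

for suffix, name, size, y in DATASETS:
    globals()['TimeDownSample' + suffix] = make_benchmark(suffix, name, size, y)

b = globals()['TimeDownSampleAnime']()
b.setup()
print(b.dataset, b.size, b.y_param)  # anime 1000 1
```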
ab99a2195848a3c6d84fffd256ad7e91e2dfb5f5 | 30,348 | py | Python | Hydrogen_Container/unit_tests.py | Jay4C/Python-Macros-For_FreeCAD | 12ce5441a26731377fa43e86ccd2be675740d3a0 | [
"MIT"
] | null | null | null | Hydrogen_Container/unit_tests.py | Jay4C/Python-Macros-For_FreeCAD | 12ce5441a26731377fa43e86ccd2be675740d3a0 | [
"MIT"
] | null | null | null | Hydrogen_Container/unit_tests.py | Jay4C/Python-Macros-For_FreeCAD | 12ce5441a26731377fa43e86ccd2be675740d3a0 | [
"MIT"
] | null | null | null |
import time
import unittest
import os
import pywinauto.mouse
import pywinauto.keyboard
class unit_tests_hydrogen_container(unittest.TestCase):
# ok
# https://info-container.fr/dimensions-des-containers-maritimes/#:~:text=Dimensions%20des%20containers%20ext%C3%A9rieures%20maximales%20%20%20,591%20standard%20%2F%202%20896%20high%20cube%20
def test_part_container_20pieds(self):
print("test_part_container_20pieds")
if os.path.exists("part_container_20pieds.py"):
os.remove("part_container_20pieds.py")
else:
print("The file does not exist")
# Writing to file
with open("part_container_20pieds.py", "w") as file:
# Writing data to a file
file.write("""import FreeCAD, Part, Mesh
DOC = FreeCAD.activeDocument()
DOC_NAME = "part_container_20pieds"
def clear_doc():
# Clear the active document deleting all the objects
for obj in DOC.Objects:
DOC.removeObject(obj.Name)
def setview():
# Rearrange View
FreeCAD.Gui.SendMsgToActiveView("ViewFit")
FreeCAD.Gui.activeDocument().activeView().viewAxometric()
if DOC is None:
FreeCAD.newDocument(DOC_NAME)
FreeCAD.setActiveDocument(DOC_NAME)
DOC = FreeCAD.activeDocument()
else:
clear_doc()
# EPS= tolerance to use to cut the parts
EPS = 0.10
EPS_C = EPS * -0.5
# maximum exterior dimensions
longueur_exterieure_maximale = 6058
largeur_exterieure_maximale = 2438
hauteur_exterieure_maximale = 2591
container = Part.makeBox(longueur_exterieure_maximale, largeur_exterieure_maximale, hauteur_exterieure_maximale)
# maximum interior dimensions
longueur_interieure_maximale = 5867
largeur_interieure_maximale = 2330
Hauteur_interieure_maximale = 2350
box_1 = Part.makeBox(longueur_interieure_maximale, largeur_interieure_maximale, Hauteur_interieure_maximale)
# container cut by box_1
x = (longueur_exterieure_maximale - longueur_interieure_maximale)/2
y = (largeur_exterieure_maximale - largeur_interieure_maximale)/2
z = (hauteur_exterieure_maximale - Hauteur_interieure_maximale)/2
box_1_vector = FreeCAD.Vector(x, y, z)
box_1.translate(box_1_vector)
container = container.cut(box_1)
Part.show(container)
DOC.recompute()
__objs__=[]
__objs__.append(FreeCAD.getDocument("part_container_20pieds").getObject("Shape"))
stl_file = u"part_container_20pieds.stl"
Mesh.export(__objs__, stl_file)
setview()
FreeCADGui.getDocument("part_container_20pieds").getObject("Shape").Transparency = 80
# Generate PNG files
file = 'part_container_20pieds_'
# Shaded
Gui.runCommand('Std_DrawStyle',5)
i = 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
# Wireframe
Gui.runCommand('Std_DrawStyle',2)
i += 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
""")
time.sleep(3)
pywinauto.mouse.click(button="left", coords=(460, 750))
time.sleep(3)
pywinauto.mouse.click(button="left", coords=(70, 670))
time.sleep(3)
pywinauto.keyboard.send_keys(
'exec{(}open{(}"part_container_20pieds.py"{)}.read{(}{)}{)}'
)
time.sleep(3)
pywinauto.keyboard.send_keys('{ENTER}')
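The `send_keys` strings in these tests wrap every parenthesis as `{(}` / `{)}` because pywinauto's keyboard mini-language reserves characters such as `+ ^ % ~ ( ) { }`. A small hypothetical helper (not part of the original tests) that produces those escaped strings automatically:

```python
# Characters assumed special in pywinauto's send_keys mini-language;
# literal occurrences are wrapped in braces, e.g. '(' -> '{(}'.
SPECIAL = '+^%~(){}'

def escape_keys(text):
    return ''.join('{%s}' % ch if ch in SPECIAL else ch for ch in text)

cmd = 'exec(open("part_container_20pieds.py").read())'
print(escape_keys(cmd))
# exec{(}open{(}"part_container_20pieds.py"{)}.read{(}{)}{)}
```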
# ok
# https://www.leroymerlin.fr/produits/quincaillerie/rangement-utilitaire/etagere-utilitaire/etagere-metallique-utilitaire/etagere-acier-orange-bleu-epoxy-simonrack-5-tablettes-l-100-x-h-200-x-p-30-cm-82130991.html
def test_part_etagere(self):
print("test_part_etagere")
if os.path.exists("part_etagere.py"):
os.remove("part_etagere.py")
else:
print("The file does not exist")
# Writing to file
with open("part_etagere.py", "w") as file:
# Writing data to a file
file.write("""import FreeCAD, Part, Mesh
DOC = FreeCAD.activeDocument()
DOC_NAME = "part_etagere"
def clear_doc():
# Clear the active document deleting all the objects
for obj in DOC.Objects:
DOC.removeObject(obj.Name)
def setview():
# Rearrange View
FreeCAD.Gui.SendMsgToActiveView("ViewFit")
FreeCAD.Gui.activeDocument().activeView().viewAxometric()
if DOC is None:
FreeCAD.newDocument(DOC_NAME)
FreeCAD.setActiveDocument(DOC_NAME)
DOC = FreeCAD.activeDocument()
else:
clear_doc()
# EPS= tolerance to use to cut the parts
EPS = 0.10
EPS_C = EPS * -0.5
# maximum exterior dimensions
longueur_exterieure_maximale = 1000
largeur_exterieure_maximale = 300
hauteur_exterieure_maximale = 2000
etagere = Part.makeBox(longueur_exterieure_maximale, largeur_exterieure_maximale, hauteur_exterieure_maximale)
box_1 = Part.makeBox(longueur_exterieure_maximale - 2*20, largeur_exterieure_maximale, (hauteur_exterieure_maximale - 100 - 5*20)/4)
box_2 = Part.makeBox(longueur_exterieure_maximale, largeur_exterieure_maximale - 2*20, (hauteur_exterieure_maximale - 100 - 5*20)/4)
# etagere cut by box_1
box_1_vector = FreeCAD.Vector(20, 0, hauteur_exterieure_maximale)
box_1.translate(box_1_vector)
for i in range(0, 5):
box_1_vector = FreeCAD.Vector(0, 0, - (hauteur_exterieure_maximale - 100 - 5*20)/4 - 20)
box_1.translate(box_1_vector)
etagere = etagere.cut(box_1)
# etagere cut by box_2
box_2_vector = FreeCAD.Vector(0, 20, hauteur_exterieure_maximale)
box_2.translate(box_2_vector)
for i in range(0, 5):
box_2_vector = FreeCAD.Vector(0, 0, - (hauteur_exterieure_maximale - 100 - 5*20)/4 - 20)
box_2.translate(box_2_vector)
etagere = etagere.cut(box_2)
Part.show(etagere)
DOC.recompute()
__objs__=[]
__objs__.append(FreeCAD.getDocument("part_etagere").getObject("Shape"))
stl_file = u"part_etagere.stl"
Mesh.export(__objs__, stl_file)
FreeCADGui.getDocument("part_etagere").getObject("Shape").ShapeColor = (1.00, 1.00, 0.00)
setview()
# Generate PNG files
file = 'part_etagere_'
# Shaded
Gui.runCommand('Std_DrawStyle',5)
i = 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
# Wireframe
Gui.runCommand('Std_DrawStyle',2)
i += 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
""")
time.sleep(3)
pywinauto.mouse.click(button="left", coords=(460, 750))
time.sleep(3)
pywinauto.mouse.click(button="left", coords=(70, 670))
time.sleep(3)
pywinauto.keyboard.send_keys(
'exec{(}open{(}"part_etagere.py"{)}.read{(}{)}{)}'
)
time.sleep(3)
pywinauto.keyboard.send_keys('{ENTER}')
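The shelf macro above sizes each cutout as `(hauteur_exterieure_maximale - 100 - 5*20)/4`: total height minus a base clearance and five 20 mm shelf plates, divided over four openings. A hypothetical helper (names are illustrative, not from the macro) to check that arithmetic:

```python
# Mirrors the expression (hauteur_exterieure_maximale - 100 - 5*20)/4
# from the embedded etagere macro; parameter names are assumptions.
def shelf_opening_height(total_height=2000, base_clearance=100,
                         n_shelves=5, shelf_thickness=20, n_openings=4):
    return (total_height - base_clearance
            - n_shelves * shelf_thickness) / n_openings

print(shelf_opening_height())  # 450.0
```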
# ok
def test_part_plancher(self):
print("test_part_plancher")
if os.path.exists("part_plancher.py"):
os.remove("part_plancher.py")
else:
print("The file does not exist")
# Writing to file
with open("part_plancher.py", "w") as file:
# Writing data to a file
file.write("""import FreeCAD, Part, Mesh
DOC = FreeCAD.activeDocument()
DOC_NAME = "part_plancher"
def clear_doc():
# Clear the active document deleting all the objects
for obj in DOC.Objects:
DOC.removeObject(obj.Name)
def setview():
# Rearrange View
FreeCAD.Gui.SendMsgToActiveView("ViewFit")
FreeCAD.Gui.activeDocument().activeView().viewAxometric()
if DOC is None:
FreeCAD.newDocument(DOC_NAME)
FreeCAD.setActiveDocument(DOC_NAME)
DOC = FreeCAD.activeDocument()
else:
clear_doc()
# EPS= tolerance to use to cut the parts
EPS = 0.10
EPS_C = EPS * -0.5
# maximum exterior dimensions
hauteur_exterieure_maximale = 2591
# maximum interior dimensions
longueur_interieure_maximale = 5867
largeur_interieure_maximale = 2330
hauteur_interieure_maximale = 2350
plancher = Part.makeBox(longueur_interieure_maximale, largeur_interieure_maximale, (hauteur_exterieure_maximale - hauteur_interieure_maximale)/2)
Part.show(plancher)
DOC.recompute()
__objs__=[]
__objs__.append(FreeCAD.getDocument("part_plancher").getObject("Shape"))
stl_file = u"part_plancher.stl"
Mesh.export(__objs__, stl_file)
FreeCADGui.getDocument("part_plancher").getObject("Shape").ShapeColor = (1.00, 0.00, 0.00)
setview()
# Generate PNG files
file = 'part_plancher_'
# Shaded
Gui.runCommand('Std_DrawStyle',5)
i = 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
# Wireframe
Gui.runCommand('Std_DrawStyle',2)
i += 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
""")
time.sleep(3)
pywinauto.mouse.click(button="left", coords=(460, 750))
time.sleep(3)
pywinauto.mouse.click(button="left", coords=(70, 670))
time.sleep(3)
pywinauto.keyboard.send_keys(
'exec{(}open{(}"part_plancher.py"{)}.read{(}{)}{)}'
)
time.sleep(3)
pywinauto.keyboard.send_keys('{ENTER}')
# ok
def test_assembly_we_and_etagere(self):
print("test_assembly_we_and_etagere")
if os.path.exists("assembly_we_and_etagere.py"):
os.remove("assembly_we_and_etagere.py")
else:
print("The file does not exist")
# Writing to file
with open("assembly_we_and_etagere.py", "w") as file:
# Writing data to a file
file.write("""import FreeCAD, Part, Mesh
DOC = FreeCAD.activeDocument()
DOC_NAME = "assembly_we_and_etagere"
def clear_doc():
# Clear the active document deleting all the objects
for obj in DOC.Objects:
DOC.removeObject(obj.Name)
def setview():
# Rearrange View
FreeCAD.Gui.SendMsgToActiveView("ViewFit")
FreeCAD.Gui.activeDocument().activeView().viewAxometric()
if DOC is None:
FreeCAD.newDocument(DOC_NAME)
FreeCAD.setActiveDocument(DOC_NAME)
DOC = FreeCAD.activeDocument()
else:
clear_doc()
# EPS= tolerance to use to cut the parts
EPS = 0.10
EPS_C = EPS * -0.5
# maximum exterior dimensions
longueur_exterieure_maximale = 1000
largeur_exterieure_maximale = 300
hauteur_exterieure_maximale = 2000
etagere = Part.makeBox(longueur_exterieure_maximale, largeur_exterieure_maximale, hauteur_exterieure_maximale)
box_1 = Part.makeBox(longueur_exterieure_maximale - 2*20, largeur_exterieure_maximale, (hauteur_exterieure_maximale - 100 - 5*20)/4)
box_2 = Part.makeBox(longueur_exterieure_maximale, largeur_exterieure_maximale - 2*20, (hauteur_exterieure_maximale - 100 - 5*20)/4)
# etagere cut by box_1
box_1_vector = FreeCAD.Vector(20, 0, hauteur_exterieure_maximale)
box_1.translate(box_1_vector)
for i in range(0, 5):
box_1_vector = FreeCAD.Vector(0, 0, - (hauteur_exterieure_maximale - 100 - 5*20)/4 - 20)
box_1.translate(box_1_vector)
etagere = etagere.cut(box_1)
# etagere cut by box_2
box_2_vector = FreeCAD.Vector(0, 20, hauteur_exterieure_maximale)
box_2.translate(box_2_vector)
for i in range(0, 5):
box_2_vector = FreeCAD.Vector(0, 0, - (hauteur_exterieure_maximale - 100 - 5*20)/4 - 20)
box_2.translate(box_2_vector)
etagere = etagere.cut(box_2)
Part.show(etagere)
DOC.recompute()
FreeCADGui.getDocument("assembly_we_and_etagere").getObject("Shape").ShapeColor = (1.00, 1.00, 0.00)
# level 1 / insert assembly_water_electrolyzer
vector = App.Vector(250/2 + 50, 250/2 + 25, 100 + 40 + (450 + 20)*0)
Mesh.insert(u"assembly_water_electrolyzer.stl", "assembly_we_and_etagere")
FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer").Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
# insert assembly_water_electrolyzer
for i in range(1, 3):
vector = App.Vector((250/2 + 62.5) + (250 + 62.5)*i, 250/2 + 25, 100 + 40 + (450 + 20)*0)
Mesh.insert(u"assembly_water_electrolyzer.stl", "assembly_we_and_etagere")
FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer00" + str(i)).Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
# level 2 / insert assembly_water_electrolyzer
for i in range(3, 6):
    vector = App.Vector((250/2 + 62.5) + (250 + 62.5)*(i-3), 250/2 + 25, 100 + 40 + (450 + 20)*1)
    Mesh.insert(u"assembly_water_electrolyzer.stl", "assembly_we_and_etagere")
    FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer00" + str(i)).Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
# level 3 / insert assembly_water_electrolyzer
for i in range(6, 9):
    vector = App.Vector((250/2 + 62.5) + (250 + 62.5)*(i-6), 250/2 + 25, 100 + 40 + (450 + 20)*2)
    Mesh.insert(u"assembly_water_electrolyzer.stl", "assembly_we_and_etagere")
    FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer00" + str(i)).Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
# level 4 / insert assembly_water_electrolyzer
for i in range(9, 12):
    if i < 10:
        vector = App.Vector((250/2 + 62.5) + (250 + 62.5)*(i-9), 250/2 + 25, 100 + 40 + (450 + 20)*3)
        Mesh.insert(u"assembly_water_electrolyzer.stl", "assembly_we_and_etagere")
        FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer00" + str(i)).Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
    else:
        vector = App.Vector((250/2 + 62.5) + (250 + 62.5)*(i-9), 250/2 + 25, 100 + 40 + (450 + 20)*3)
        Mesh.insert(u"assembly_water_electrolyzer.stl", "assembly_we_and_etagere")
        FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer0" + str(i)).Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
# level 5 / insert assembly_water_electrolyzer
for i in range(12, 15):
    vector = App.Vector((250/2 + 62.5) + (250 + 62.5)*(i-12), 250/2 + 25, 100 + 40 + (450 + 20)*4)
    Mesh.insert(u"assembly_water_electrolyzer.stl", "assembly_we_and_etagere")
    FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer0" + str(i)).Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
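The five level loops all compute the same grid placement, and the object-name suffix switches from "00" + str(i) to "0" + str(i) at i = 10 because FreeCAD pads duplicate object names to three digits. Both rules can be captured in plain Python (sketch only; the helper names are illustrative):

```python
def object_suffix(i):
    # FreeCAD names repeated inserts base, base001, ..., base010, ...
    return "" if i == 0 else f"{i:03d}"

def electrolyzer_position(i, per_level=3):
    # Same arithmetic as the loops above (note: the very first unit in
    # the script uses a slightly different x offset, 50 instead of 62.5).
    level, col = divmod(i, per_level)
    x = (250 / 2 + 62.5) + (250 + 62.5) * col
    y = 250 / 2 + 25
    z = 100 + 40 + (450 + 20) * level
    return (x, y, z)
```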
setview()
__objs__=[]
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("Shape"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer003"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer007"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer008"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer009"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer006"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer004"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer010"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer011"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer012"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer001"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer013"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer005"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer014"))
__objs__.append(FreeCAD.getDocument("assembly_we_and_etagere").getObject("assembly_water_electrolyzer002"))
stl_file = u"assembly_we_and_etagere.stl"
Mesh.export(__objs__, stl_file)
del __objs__
# Generate PNG files
file = 'assembly_we_and_etagere_'
# Shaded
Gui.runCommand('Std_DrawStyle',5)
i = 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
# Wireframe
Gui.runCommand('Std_DrawStyle',2)
i += 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
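The fourteen saveImage calls above differ only in the view method and the running index, so the file-naming scheme can be expressed compactly (sketch; this helper is not part of the script):

```python
VIEWS = ["viewIsometric", "viewFront", "viewTop", "viewRight",
         "viewRear", "viewBottom", "viewLeft"]

def snapshot_names(prefix, n_styles=2):
    # One PNG per view and per draw style (shaded, then wireframe),
    # numbered 1..14 exactly like the script above.
    return [f"{prefix}{i}.png" for i in range(1, n_styles * len(VIEWS) + 1)]
```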
""")
        time.sleep(3)
        pywinauto.mouse.click(button="left", coords=(460, 750))
        time.sleep(3)
        pywinauto.mouse.click(button="left", coords=(70, 670))
        time.sleep(3)
        pywinauto.keyboard.send_keys(
            'exec{(}open{(}"assembly_we_and_etagere.py"{)}.read{(}{)}{)}'
        )
        time.sleep(3)
        pywinauto.keyboard.send_keys('{ENTER}')
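pywinauto's send_keys treats parentheses and braces as key-syntax characters, so literal parentheses must be escaped as {(} and {)}. The command string typed above can be built like this (helper name is illustrative):

```python
def freecad_exec_command(script_name):
    # Escape literal parentheses for pywinauto.keyboard.send_keys.
    return 'exec{(}open{(}"%s"{)}.read{(}{)}{)}' % script_name
```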
    # ok
    def test_assembly_wee_and_plancher(self):
        print("test_assembly_wee_and_plancher")
        if os.path.exists("assembly_wee_and_plancher.py"):
            os.remove("assembly_wee_and_plancher.py")
        else:
            print("The file does not exist")
        # Writing to file
        with open("assembly_wee_and_plancher.py", "w") as file:
            # Writing data to a file
            file.write("""import FreeCAD, FreeCADGui, Part, Mesh
DOC = FreeCAD.activeDocument()
DOC_NAME = "assembly_wee_and_plancher"
def clear_doc():
    # Clear the active document, deleting all the objects
    for obj in DOC.Objects:
        DOC.removeObject(obj.Name)

def setview():
    # Rearrange view
    FreeCAD.Gui.SendMsgToActiveView("ViewFit")
    FreeCAD.Gui.activeDocument().activeView().viewAxometric()

if DOC is None:
    FreeCAD.newDocument(DOC_NAME)
    FreeCAD.setActiveDocument(DOC_NAME)
    DOC = FreeCAD.activeDocument()
else:
    clear_doc()
# EPS= tolerance to use to cut the parts
EPS = 0.10
EPS_C = EPS * -0.5
# maximum exterior dimensions
hauteur_exterieure_maximale = 2591
# maximum interior dimensions
longueur_interieure_maximale = 5867
largeur_interieure_maximale = 2330
hauteur_interieure_maximale = 2350
plancher = Part.makeBox(longueur_interieure_maximale, largeur_interieure_maximale, (hauteur_exterieure_maximale - hauteur_interieure_maximale)/2)
Part.show(plancher)
DOC.recompute()
FreeCADGui.getDocument("assembly_wee_and_plancher").getObject("Shape").ShapeColor = (1.00, 0.00, 0.00)
# row 1 / insert assembly_we_and_etagere
vector = App.Vector(0, 0, (hauteur_exterieure_maximale - hauteur_interieure_maximale)/2)
Mesh.insert(u"assembly_we_and_etagere.stl", "assembly_wee_and_plancher")
FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere").Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
# row 1 / insert remaining assembly_we_and_etagere units
for i in range(1, 5):
    vector = App.Vector((1000 + 200)*i, (300 + 1100)*0, (hauteur_exterieure_maximale - hauteur_interieure_maximale)/2)
    Mesh.insert(u"assembly_we_and_etagere.stl", "assembly_wee_and_plancher")
    FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere00" + str(i)).Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
# row 2 / insert assembly_we_and_etagere
for i in range(5, 10):
    vector = App.Vector((1000 + 200)*(i-5), (300 + 1100)*1, (hauteur_exterieure_maximale - hauteur_interieure_maximale)/2)
    Mesh.insert(u"assembly_we_and_etagere.stl", "assembly_wee_and_plancher")
    FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere00" + str(i)).Placement = App.Placement(vector, App.Rotation(App.Vector(0,0,1), 0))
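The two row loops place one shelf assembly every 1200 mm in x and every 1400 mm in y. The grid arithmetic in isolation (sketch; the helper name and defaults are illustrative):

```python
def rack_position(i, per_row=5, dx=1000 + 200, dy=300 + 1100):
    # Column/row placement used by the loops above (z omitted here).
    row, col = divmod(i, per_row)
    return (dx * col, dy * row)
```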
setview()
__objs__=[]
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("Shape"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere001"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere002"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere003"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere004"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere005"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere006"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere007"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere008"))
__objs__.append(FreeCAD.getDocument("assembly_wee_and_plancher").getObject("assembly_we_and_etagere009"))
stl_file = u"assembly_wee_and_plancher.stl"
Mesh.export(__objs__, stl_file)
del __objs__
# Generate PNG files
file = 'assembly_wee_and_plancher_'
# Shaded
Gui.runCommand('Std_DrawStyle',5)
i = 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
# Wireframe
Gui.runCommand('Std_DrawStyle',2)
i += 1
Gui.activeDocument().activeView().viewIsometric()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewFront()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewTop()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRight()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewRear()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewBottom()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
i += 1
Gui.activeDocument().activeView().viewLeft()
Gui.activeDocument().activeView().saveImage(file + str(i) + '.png',1117,388,'Current')
""")
        time.sleep(3)
        pywinauto.mouse.click(button="left", coords=(460, 750))
        time.sleep(3)
        pywinauto.mouse.click(button="left", coords=(70, 670))
        time.sleep(3)
        pywinauto.keyboard.send_keys(
            'exec{(}open{(}"assembly_wee_and_plancher.py"{)}.read{(}{)}{)}'
        )
        time.sleep(3)
        pywinauto.keyboard.send_keys('{ENTER}')
if __name__ == '__main__':
    unittest.main()
| 34.098876 | 217 | 0.721926 | 3,915 | 30,348 | 5.384163 | 0.065134 | 0.116941 | 0.18573 | 0.063096 | 0.927985 | 0.909673 | 0.896627 | 0.886048 | 0.869681 | 0.852934 | 0 | 0.052754 | 0.119283 | 30,348 | 889 | 218 | 34.137233 | 0.735895 | 0.020067 | 0 | 0.792332 | 0 | 0.161342 | 0.898217 | 0.639502 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007987 | false | 0 | 0.015974 | 0 | 0.025559 | 0.015974 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
aba4bcc9d9aef3ab4cb6bea761fbe2dc6d416e7a | 905 | py | Python | tests/experiments/test_utils.py | Ragabov/tensorflow-nalu | 5c3f56366727cdd668f3b6c419c75689cbd83dee | [
"MIT"
] | 9 | 2018-08-21T01:53:07.000Z | 2018-09-21T09:04:21.000Z | tests/experiments/test_utils.py | Ragabov/tensorflow-nalu | 5c3f56366727cdd668f3b6c419c75689cbd83dee | [
"MIT"
] | null | null | null | tests/experiments/test_utils.py | Ragabov/tensorflow-nalu | 5c3f56366727cdd668f3b6c419c75689cbd83dee | [
"MIT"
] | null | null | null | import unittest
import numpy as np

from experiments.utils import *


class UtilsTest(unittest.TestCase):
    def test_add_generate_synthetic_arithmetic_dataset(self):
        X, Y, boundaries = generate_synthetic_arithmetic_dataset("add", 0, 100, 100, 2)
        print(X.shape)
        expected_output = np.array([np.sum(X[i][boundaries[0]:boundaries[1]]) +
                                    np.sum(X[i][boundaries[2]:boundaries[3]]) for i in range(2)])
        np.testing.assert_allclose(Y, expected_output)

    def test_mult_generate_synthetic_arithmetic_dataset(self):
        X, Y, boundaries = generate_synthetic_arithmetic_dataset("mult", 0, 100, 100, 2)
        print(X.shape)
        expected_output = np.array([np.sum(X[i][boundaries[0]:boundaries[1]]) *
                                    np.sum(X[i][boundaries[2]:boundaries[3]]) for i in range(2)])
        np.testing.assert_allclose(Y, expected_output)
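Both tests assume the generator returns X, Y, and span boundaries such that Y combines the sums of two sub-spans of each row. A hypothetical stand-alone sketch of that contract (the real experiments.utils implementation may differ):

```python
import random

def make_arithmetic_example(op, length=100, value_range=100, seed=None):
    # One synthetic example: x is a random vector, (a, b, c, d) are two
    # sub-span boundaries, and y = sum(x[a:b]) <op> sum(x[c:d]).
    rng = random.Random(seed)
    x = [rng.uniform(0, value_range) for _ in range(length)]
    a, b, c, d = sorted(rng.sample(range(length + 1), 4))
    s1, s2 = sum(x[a:b]), sum(x[c:d])
    y = s1 + s2 if op == "add" else s1 * s2
    return x, y, (a, b, c, d)
```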
| 39.347826 | 97 | 0.645304 | 120 | 905 | 4.683333 | 0.325 | 0.120996 | 0.192171 | 0.241993 | 0.825623 | 0.825623 | 0.825623 | 0.825623 | 0.825623 | 0.825623 | 0 | 0.037143 | 0.226519 | 905 | 22 | 98 | 41.136364 | 0.765714 | 0 | 0 | 0.4 | 1 | 0 | 0.007743 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 1 | 0.133333 | false | 0 | 0.133333 | 0 | 0.333333 | 0.133333 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
abb3ebf3357ebbe0ea59d975235626e636018714 | 27,936 | py | Python | dingtalk/python/alibabacloud_dingtalk/im_1_0/client.py | yndu13/dingtalk-sdk | 700fb7bb49c4d3167f84afc5fcb5e7aa5a09735f | [
"Apache-2.0"
] | null | null | null | dingtalk/python/alibabacloud_dingtalk/im_1_0/client.py | yndu13/dingtalk-sdk | 700fb7bb49c4d3167f84afc5fcb5e7aa5a09735f | [
"Apache-2.0"
] | null | null | null | dingtalk/python/alibabacloud_dingtalk/im_1_0/client.py | yndu13/dingtalk-sdk | 700fb7bb49c4d3167f84afc5fcb5e7aa5a09735f | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# This file is auto-generated, don't edit it. Thanks.
from Tea.core import TeaCore
from alibabacloud_tea_openapi.client import Client as OpenApiClient
from alibabacloud_tea_openapi import models as open_api_models
from alibabacloud_tea_util.client import Client as UtilClient
from alibabacloud_dingtalk.im_1_0 import models as dingtalkim__1__0_models
from alibabacloud_tea_util import models as util_models
from alibabacloud_openapi_util.client import Client as OpenApiUtilClient
class Client(OpenApiClient):
    """
    *\
    """
    def __init__(
        self,
        config: open_api_models.Config,
    ):
        super().__init__(config)
        self._endpoint_rule = ''
        if UtilClient.empty(self._endpoint):
            self._endpoint = 'api.dingtalk.com'
    def update_group_permission(
        self,
        request: dingtalkim__1__0_models.UpdateGroupPermissionRequest,
    ) -> dingtalkim__1__0_models.UpdateGroupPermissionResponse:
        runtime = util_models.RuntimeOptions()
        headers = dingtalkim__1__0_models.UpdateGroupPermissionHeaders()
        return self.update_group_permission_with_options(request, headers, runtime)

    async def update_group_permission_async(
        self,
        request: dingtalkim__1__0_models.UpdateGroupPermissionRequest,
    ) -> dingtalkim__1__0_models.UpdateGroupPermissionResponse:
        runtime = util_models.RuntimeOptions()
        headers = dingtalkim__1__0_models.UpdateGroupPermissionHeaders()
        return await self.update_group_permission_with_options_async(request, headers, runtime)

    def update_group_permission_with_options(
        self,
        request: dingtalkim__1__0_models.UpdateGroupPermissionRequest,
        headers: dingtalkim__1__0_models.UpdateGroupPermissionHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.UpdateGroupPermissionResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.open_conversation_id):
            body['openConversationId'] = request.open_conversation_id
        if not UtilClient.is_unset(request.permission_group):
            body['permissionGroup'] = request.permission_group
        if not UtilClient.is_unset(request.status):
            body['status'] = request.status
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.UpdateGroupPermissionResponse(),
            self.do_roarequest('UpdateGroupPermission', 'im_1.0', 'HTTP', 'PUT', 'AK', f'/v1.0/im/sceneGroups/permissions', 'json', req, runtime)
        )
    async def update_group_permission_with_options_async(
        self,
        request: dingtalkim__1__0_models.UpdateGroupPermissionRequest,
        headers: dingtalkim__1__0_models.UpdateGroupPermissionHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.UpdateGroupPermissionResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.open_conversation_id):
            body['openConversationId'] = request.open_conversation_id
        if not UtilClient.is_unset(request.permission_group):
            body['permissionGroup'] = request.permission_group
        if not UtilClient.is_unset(request.status):
            body['status'] = request.status
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.UpdateGroupPermissionResponse(),
            await self.do_roarequest_async('UpdateGroupPermission', 'im_1.0', 'HTTP', 'PUT', 'AK', f'/v1.0/im/sceneGroups/permissions', 'json', req, runtime)
        )
    def update_the_group_roles_of_group_member(
        self,
        request: dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberRequest,
    ) -> dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberResponse:
        runtime = util_models.RuntimeOptions()
        headers = dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberHeaders()
        return self.update_the_group_roles_of_group_member_with_options(request, headers, runtime)

    async def update_the_group_roles_of_group_member_async(
        self,
        request: dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberRequest,
    ) -> dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberResponse:
        runtime = util_models.RuntimeOptions()
        headers = dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberHeaders()
        return await self.update_the_group_roles_of_group_member_with_options_async(request, headers, runtime)

    def update_the_group_roles_of_group_member_with_options(
        self,
        request: dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberRequest,
        headers: dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.open_conversation_id):
            body['openConversationId'] = request.open_conversation_id
        if not UtilClient.is_unset(request.user_id):
            body['userId'] = request.user_id
        if not UtilClient.is_unset(request.open_role_ids):
            body['openRoleIds'] = request.open_role_ids
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberResponse(),
            self.do_roarequest('UpdateTheGroupRolesOfGroupMember', 'im_1.0', 'HTTP', 'PUT', 'AK', f'/v1.0/im/sceneGroups/members/groupRoles', 'json', req, runtime)
        )

    async def update_the_group_roles_of_group_member_with_options_async(
        self,
        request: dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberRequest,
        headers: dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.open_conversation_id):
            body['openConversationId'] = request.open_conversation_id
        if not UtilClient.is_unset(request.user_id):
            body['userId'] = request.user_id
        if not UtilClient.is_unset(request.open_role_ids):
            body['openRoleIds'] = request.open_role_ids
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.UpdateTheGroupRolesOfGroupMemberResponse(),
            await self.do_roarequest_async('UpdateTheGroupRolesOfGroupMember', 'im_1.0', 'HTTP', 'PUT', 'AK', f'/v1.0/im/sceneGroups/members/groupRoles', 'json', req, runtime)
        )
    def send_interactive_card(
        self,
        request: dingtalkim__1__0_models.SendInteractiveCardRequest,
    ) -> dingtalkim__1__0_models.SendInteractiveCardResponse:
        runtime = util_models.RuntimeOptions()
        headers = dingtalkim__1__0_models.SendInteractiveCardHeaders()
        return self.send_interactive_card_with_options(request, headers, runtime)

    async def send_interactive_card_async(
        self,
        request: dingtalkim__1__0_models.SendInteractiveCardRequest,
    ) -> dingtalkim__1__0_models.SendInteractiveCardResponse:
        runtime = util_models.RuntimeOptions()
        headers = dingtalkim__1__0_models.SendInteractiveCardHeaders()
        return await self.send_interactive_card_with_options_async(request, headers, runtime)

    def send_interactive_card_with_options(
        self,
        request: dingtalkim__1__0_models.SendInteractiveCardRequest,
        headers: dingtalkim__1__0_models.SendInteractiveCardHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.SendInteractiveCardResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.card_template_id):
            body['cardTemplateId'] = request.card_template_id
        if not UtilClient.is_unset(request.open_conversation_id):
            body['openConversationId'] = request.open_conversation_id
        if not UtilClient.is_unset(request.receiver_user_id_list):
            body['receiverUserIdList'] = request.receiver_user_id_list
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.out_track_id):
            body['outTrackId'] = request.out_track_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        if not UtilClient.is_unset(request.robot_code):
            body['robotCode'] = request.robot_code
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.conversation_type):
            body['conversationType'] = request.conversation_type
        if not UtilClient.is_unset(request.callback_route_key):
            body['callbackRouteKey'] = request.callback_route_key
        if not UtilClient.is_unset(request.card_data):
            body['cardData'] = request.card_data
        if not UtilClient.is_unset(request.private_data):
            body['privateData'] = request.private_data
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        if not UtilClient.is_unset(request.chat_bot_id):
            body['chatBotId'] = request.chat_bot_id
        if not UtilClient.is_unset(request.user_id_type):
            body['userIdType'] = request.user_id_type
        if not UtilClient.is_unset(request.at_open_ids):
            body['atOpenIds'] = request.at_open_ids
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.SendInteractiveCardResponse(),
            self.do_roarequest('SendInteractiveCard', 'im_1.0', 'HTTP', 'POST', 'AK', f'/v1.0/im/interactiveCards/send', 'json', req, runtime)
        )

    async def send_interactive_card_with_options_async(
        self,
        request: dingtalkim__1__0_models.SendInteractiveCardRequest,
        headers: dingtalkim__1__0_models.SendInteractiveCardHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.SendInteractiveCardResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.card_template_id):
            body['cardTemplateId'] = request.card_template_id
        if not UtilClient.is_unset(request.open_conversation_id):
            body['openConversationId'] = request.open_conversation_id
        if not UtilClient.is_unset(request.receiver_user_id_list):
            body['receiverUserIdList'] = request.receiver_user_id_list
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.out_track_id):
            body['outTrackId'] = request.out_track_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        if not UtilClient.is_unset(request.robot_code):
            body['robotCode'] = request.robot_code
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.conversation_type):
            body['conversationType'] = request.conversation_type
        if not UtilClient.is_unset(request.callback_route_key):
            body['callbackRouteKey'] = request.callback_route_key
        if not UtilClient.is_unset(request.card_data):
            body['cardData'] = request.card_data
        if not UtilClient.is_unset(request.private_data):
            body['privateData'] = request.private_data
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        if not UtilClient.is_unset(request.chat_bot_id):
            body['chatBotId'] = request.chat_bot_id
        if not UtilClient.is_unset(request.user_id_type):
            body['userIdType'] = request.user_id_type
        if not UtilClient.is_unset(request.at_open_ids):
            body['atOpenIds'] = request.at_open_ids
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.SendInteractiveCardResponse(),
            await self.do_roarequest_async('SendInteractiveCard', 'im_1.0', 'HTTP', 'POST', 'AK', f'/v1.0/im/interactiveCards/send', 'json', req, runtime)
        )
def update_interactive_card(
self,
request: dingtalkim__1__0_models.UpdateInteractiveCardRequest,
) -> dingtalkim__1__0_models.UpdateInteractiveCardResponse:
runtime = util_models.RuntimeOptions()
headers = dingtalkim__1__0_models.UpdateInteractiveCardHeaders()
return self.update_interactive_card_with_options(request, headers, runtime)
async def update_interactive_card_async(
self,
request: dingtalkim__1__0_models.UpdateInteractiveCardRequest,
) -> dingtalkim__1__0_models.UpdateInteractiveCardResponse:
runtime = util_models.RuntimeOptions()
headers = dingtalkim__1__0_models.UpdateInteractiveCardHeaders()
return await self.update_interactive_card_with_options_async(request, headers, runtime)
    def update_interactive_card_with_options(
        self,
        request: dingtalkim__1__0_models.UpdateInteractiveCardRequest,
        headers: dingtalkim__1__0_models.UpdateInteractiveCardHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.UpdateInteractiveCardResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.out_track_id):
            body['outTrackId'] = request.out_track_id
        if not UtilClient.is_unset(request.card_data):
            body['cardData'] = request.card_data
        if not UtilClient.is_unset(request.private_data):
            body['privateData'] = request.private_data
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        if not UtilClient.is_unset(request.user_id_type):
            body['userIdType'] = request.user_id_type
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.UpdateInteractiveCardResponse(),
            self.do_roarequest('UpdateInteractiveCard', 'im_1.0', 'HTTP', 'PUT', 'AK', f'/v1.0/im/interactiveCards', 'json', req, runtime)
        )
    async def update_interactive_card_with_options_async(
        self,
        request: dingtalkim__1__0_models.UpdateInteractiveCardRequest,
        headers: dingtalkim__1__0_models.UpdateInteractiveCardHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.UpdateInteractiveCardResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.out_track_id):
            body['outTrackId'] = request.out_track_id
        if not UtilClient.is_unset(request.card_data):
            body['cardData'] = request.card_data
        if not UtilClient.is_unset(request.private_data):
            body['privateData'] = request.private_data
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        if not UtilClient.is_unset(request.user_id_type):
            body['userIdType'] = request.user_id_type
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.UpdateInteractiveCardResponse(),
            await self.do_roarequest_async('UpdateInteractiveCard', 'im_1.0', 'HTTP', 'PUT', 'AK', f'/v1.0/im/interactiveCards', 'json', req, runtime)
        )
    def query_members_of_group_role(
        self,
        request: dingtalkim__1__0_models.QueryMembersOfGroupRoleRequest,
    ) -> dingtalkim__1__0_models.QueryMembersOfGroupRoleResponse:
        runtime = util_models.RuntimeOptions()
        headers = dingtalkim__1__0_models.QueryMembersOfGroupRoleHeaders()
        return self.query_members_of_group_role_with_options(request, headers, runtime)

    async def query_members_of_group_role_async(
        self,
        request: dingtalkim__1__0_models.QueryMembersOfGroupRoleRequest,
    ) -> dingtalkim__1__0_models.QueryMembersOfGroupRoleResponse:
        runtime = util_models.RuntimeOptions()
        headers = dingtalkim__1__0_models.QueryMembersOfGroupRoleHeaders()
        return await self.query_members_of_group_role_with_options_async(request, headers, runtime)
    def query_members_of_group_role_with_options(
        self,
        request: dingtalkim__1__0_models.QueryMembersOfGroupRoleRequest,
        headers: dingtalkim__1__0_models.QueryMembersOfGroupRoleHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.QueryMembersOfGroupRoleResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.open_conversation_id):
            body['openConversationId'] = request.open_conversation_id
        if not UtilClient.is_unset(request.open_role_id):
            body['openRoleId'] = request.open_role_id
        if not UtilClient.is_unset(request.timestamp):
            body['timestamp'] = request.timestamp
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.QueryMembersOfGroupRoleResponse(),
            self.do_roarequest('QueryMembersOfGroupRole', 'im_1.0', 'HTTP', 'POST', 'AK', f'/v1.0/im/sceneGroups/roles/members/query', 'json', req, runtime)
        )
    async def query_members_of_group_role_with_options_async(
        self,
        request: dingtalkim__1__0_models.QueryMembersOfGroupRoleRequest,
        headers: dingtalkim__1__0_models.QueryMembersOfGroupRoleHeaders,
        runtime: util_models.RuntimeOptions,
    ) -> dingtalkim__1__0_models.QueryMembersOfGroupRoleResponse:
        UtilClient.validate_model(request)
        body = {}
        if not UtilClient.is_unset(request.open_conversation_id):
            body['openConversationId'] = request.open_conversation_id
        if not UtilClient.is_unset(request.open_role_id):
            body['openRoleId'] = request.open_role_id
        if not UtilClient.is_unset(request.timestamp):
            body['timestamp'] = request.timestamp
        if not UtilClient.is_unset(request.ding_token_grant_type):
            body['dingTokenGrantType'] = request.ding_token_grant_type
        if not UtilClient.is_unset(request.ding_org_id):
            body['dingOrgId'] = request.ding_org_id
        if not UtilClient.is_unset(request.ding_isv_org_id):
            body['dingIsvOrgId'] = request.ding_isv_org_id
        if not UtilClient.is_unset(request.ding_suite_key):
            body['dingSuiteKey'] = request.ding_suite_key
        if not UtilClient.is_unset(request.ding_oauth_app_id):
            body['dingOauthAppId'] = request.ding_oauth_app_id
        real_headers = {}
        if not UtilClient.is_unset(headers.common_headers):
            real_headers = headers.common_headers
        if not UtilClient.is_unset(headers.x_acs_dingtalk_access_token):
            real_headers['x-acs-dingtalk-access-token'] = headers.x_acs_dingtalk_access_token
        req = open_api_models.OpenApiRequest(
            headers=real_headers,
            body=OpenApiUtilClient.parse_to_map(body)
        )
        return TeaCore.from_map(
            dingtalkim__1__0_models.QueryMembersOfGroupRoleResponse(),
            await self.do_roarequest_async('QueryMembersOfGroupRole', 'im_1.0', 'HTTP', 'POST', 'AK', f'/v1.0/im/sceneGroups/roles/members/query', 'json', req, runtime)
        )