hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
f13fd3250f36412bd4635a9e674a1d18279b7f07 | 1,737 | py | Python | tests/test_golr_schema_generator.py | deepakunni3/golr-schema-generator | a3a20e53b35471f02889caecefe0309af45f9361 | [
"BSD-3-Clause"
] | null | null | null | tests/test_golr_schema_generator.py | deepakunni3/golr-schema-generator | a3a20e53b35471f02889caecefe0309af45f9361 | [
"BSD-3-Clause"
] | 5 | 2020-02-18T01:49:23.000Z | 2020-02-19T00:11:31.000Z | tests/test_golr_schema_generator.py | deepakunni3/golr-schema-generator | a3a20e53b35471f02889caecefe0309af45f9361 | [
"BSD-3-Clause"
] | null | null | null | import os
from golr_schema_generator.golr_schema_generator import GolrSchemaGenerator
cwd = os.path.abspath(os.path.dirname(__file__))
resource_dir = os.path.join(cwd, 'resources')
target_dir = os.path.join(cwd, 'target')
def test_simple_config():
config = os.path.join(resource_dir, 'simple-config.yaml')
schema_output = os.path.join(target_dir, 'simple-config-schema.xml')
generator = GolrSchemaGenerator(config)
generator.generate_schema()
generator.export_schema(schema_output)
it = generator.xml_root.iter()
elements = [x for x in it]
assert len(elements) == 37
children = list(generator.xml_root)
assert len(children) == 3
fields = children[1]
assert len(list(fields)) == 11
def test_oban_config():
config = os.path.join(resource_dir, 'oban-config.yaml')
schema_output = os.path.join(target_dir, 'oban-config-schema.xml')
generator = GolrSchemaGenerator(config)
generator.generate_schema()
generator.export_schema(schema_output)
it = generator.xml_root.iter()
elements = [x for x in it]
assert len(elements) == 304
children = list(generator.xml_root)
assert len(children) == 3
fields = children[1]
assert len(list(fields)) == 278
def test_ont_config():
config = os.path.join(resource_dir, 'ont-config.yaml')
schema_output = os.path.join(target_dir, 'ont-config-schema.xml')
generator = GolrSchemaGenerator(config)
generator.generate_schema()
generator.export_schema(schema_output)
it = generator.xml_root.iter()
elements = [x for x in it]
assert len(elements) == 114
children = list(generator.xml_root)
assert len(children) == 3
fields = children[1]
assert len(list(fields)) == 88 | 34.058824 | 75 | 0.707542 | 234 | 1,737 | 5.081197 | 0.205128 | 0.050463 | 0.067283 | 0.045416 | 0.821699 | 0.794786 | 0.794786 | 0.711522 | 0.711522 | 0.608074 | 0 | 0.014573 | 0.170409 | 1,737 | 51 | 76 | 34.058824 | 0.810548 | 0 | 0 | 0.545455 | 0 | 0 | 0.075374 | 0.03855 | 0 | 0 | 0 | 0 | 0.204545 | 1 | 0.068182 | false | 0 | 0.045455 | 0 | 0.113636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
74cb23cce49055be03c6b02d882f4f5680eb22ac | 18,855 | py | Python | pcdet/ops/torch_hash/torch_hash_utils.py | xiangruhuang/OpenPCDet | d82d9594a0629ffed0c457aedc304e0805e93221 | [
"Apache-2.0"
] | null | null | null | pcdet/ops/torch_hash/torch_hash_utils.py | xiangruhuang/OpenPCDet | d82d9594a0629ffed0c457aedc304e0805e93221 | [
"Apache-2.0"
] | null | null | null | pcdet/ops/torch_hash/torch_hash_utils.py | xiangruhuang/OpenPCDet | d82d9594a0629ffed0c457aedc304e0805e93221 | [
"Apache-2.0"
] | null | null | null | import torch
from .torch_hash_cuda import (
hash_insert_gpu,
correspondence,
voxel_graph_gpu,
points_in_radius_gpu
)
class HashTable:
def __init__(self, size=2 ** 20, dtype=torch.float32, device='cuda:0'):
self.size = size
self.keys = torch.zeros(size, dtype=torch.long) - 1
self.values = torch.zeros(size, 4, dtype=dtype)
self.reverse_indices = torch.zeros(size, dtype=torch.long)
self.device = device
self.qmin = torch.zeros(4, dtype=torch.int32).to(self.device) - 1
self.qmax = torch.zeros(4, dtype=torch.int32).to(self.device) + 1
self.keys = self.keys.to(self.device)
self.values = self.values.to(self.device)
self.reverse_indices = self.reverse_indices.to(self.device)
self.corres_pool_size = 10000
self.corres = torch.zeros(self.corres_pool_size,
dtype=torch.long,
device=self.device)-1
rp = torch.tensor([999269, 999437, 999377], dtype=torch.long)
self.rp0, self.rp1, self.rp2 = rp
def clear(self):
self.keys[:] = -1
def points_in_radius_step2(self, query_points, temporal_offset,
radius=0.5):
"""
Args:
query_points (M, D): query points
temporal_offset: offset in temporal (last) dimension.
        Returns:
            point_indices: indices of the hashed reference points lying
                within `radius` of any query point
"""
try:
voxel_size = self.voxel_size
ndim = self.ndim
pc_range = self.pc_range
except Exception as e:
raise ValueError(f'make sure you call hash_into_gpu first: {e}')
query_points = query_points.cuda()
voxel_coors_query = torch.round(
(query_points-pc_range[:ndim]) / voxel_size
).long() + 1
self.qmin[-1] = temporal_offset
self.qmax[-1] = temporal_offset
if not hasattr(self, 'visited'):
self.visited = torch.zeros(self.num_points, dtype=torch.long,
device=self.device)
else:
self.visited[:] = 0
# look up points from hash table
points_in_radius_gpu(self.keys, self.values, self.reverse_indices,
self.dims, voxel_coors_query,
query_points, self.qmin, self.qmax,
radius, self.visited)
point_indices = torch.where(self.visited)[0]
return point_indices
#def hash(self, voxel_coors):
# """
# Args:
# voxel_coors [N, 4]: voxel coordinates
# Returns:
# keys: [N] integers
# hash_indices: [N] integers
# """
# insert_keys = torch.zeros_like(voxel_coors[:, 0])
# indices = torch.zeros_like(insert_keys)
# for i in range(voxel_coors.shape[1]):
# insert_keys = insert_keys * self.dims[i] + voxel_coors[:, i]
# indices = indices * self.dims[i] + voxel_coors[:, i]
# indices = (indices * self.rp0 + self.rp1)
# indices = indices % self.size
# return insert_keys.to(torch.long), indices.to(torch.long)
def find_voxels(self, keys):
"""
Args:
keys [N]: integers
Returns:
voxel_coors [N, 4]: voxel coordinates
"""
voxel_coors = []
for i in range(3, -1, -1):
            voxel_coors.append(keys % self.dims[i])
            keys = keys // self.dims[i]
voxel_coors = torch.stack(voxel_coors, dim=-1)
return voxel_coors
@torch.no_grad()
def find_corres(self, ref_points, query_points, voxel_size,
pc_range, temporal_offset):
"""
Args:
ref_points (N, D): reference points
query_points (M, D): query points
temporal_offset: offset in temporal (last) dimension.
Returns:
eq (M): corresponding point index in query_points
er (M): corresponding point index in ref_points
"""
ref_points = ref_points.cuda()
query_points = query_points.cuda()
voxel_size = voxel_size.cuda()
self.ndim = ndim = ref_points.shape[-1]
points = torch.cat([ref_points, query_points], dim=0)
points[:] = torch.max(
torch.min(points, pc_range[ndim:]),
pc_range[:ndim])
voxel_coors = torch.round((points-pc_range[:ndim]) / voxel_size
).long() + 1
self.dims = ((pc_range[ndim:] - pc_range[:ndim]) / voxel_size).long()+3
#self.dims = (voxel_coors.max(0)[0] - voxel_coors.min(0)[0]) + 3
self.keys[:] = -1
# hash points into hash table
hash_insert_gpu(self.keys, self.values, self.reverse_indices, self.dims,
voxel_coors[:ref_points.shape[0]], ref_points)
self.qmin[-1] = temporal_offset
self.qmax[-1] = temporal_offset
if query_points.shape[0] > self.corres_pool_size:
self.corres_pool_size = query_points.shape[0]*2
self.corres = torch.zeros(self.corres_pool_size,
dtype=torch.long,
device=self.device)-1
else:
self.corres[:] = -1
corres = self.corres
# look up points from hash table
correspondence(self.keys, self.values, self.reverse_indices,
self.dims, voxel_coors[ref_points.shape[0]:],
query_points, self.qmin, self.qmax, corres)
corres = corres[:query_points.shape[0]]
mask = (corres != -1)
corres0 = torch.where(mask)[0]
corres = torch.stack([corres0, corres[mask]], dim=0)
return corres
@torch.no_grad()
def hash_into_gpu(self, ref_points, voxel_size, pc_range):
"""Hash Points into GPU
Args:
ref_points (N, D): reference points
Returns:
"""
if ref_points.device == torch.device('cpu'):
ref_points = ref_points.cuda()
self.voxel_size = voxel_size = voxel_size.clone().cuda()
self.pc_range = pc_range = pc_range.clone().cuda()
points = ref_points
self.ndim = ndim = points.shape[-1]
voxel_coors = torch.round((points-pc_range[:ndim]) / voxel_size
).long() + 1
self.dims = ((pc_range[ndim:]-pc_range[:ndim]) / voxel_size).long()+3
self.keys[:] = -1
self.num_points = ref_points.shape[0]
# hash points into hash table
hash_insert_gpu(self.keys, self.values, self.reverse_indices,
self.dims, voxel_coors, points)
@torch.no_grad()
def find_corres_step2(self, query_points, temporal_offset):
"""
Args:
            query_points (M, D): query points
            temporal_offset: offset in temporal (last) dimension.
Returns:
eq (M): corresponding point index in query_points
er (M): corresponding point index in ref_points
"""
pc_range = self.pc_range
voxel_size = self.voxel_size
ndim = self.ndim
query_points = query_points.cuda()
voxel_coors = torch.round(
(query_points-pc_range[:ndim]) / voxel_size
).long() + 1
self.qmin[-1] = temporal_offset
self.qmax[-1] = temporal_offset
if query_points.shape[0] > self.corres_pool_size:
self.corres_pool_size = query_points.shape[0]*2
self.corres = torch.zeros(self.corres_pool_size,
dtype=torch.long,
device=self.device)-1
else:
self.corres[:query_points.shape[0]] = -1
corres = self.corres
# look up points from hash table
correspondence(self.keys, self.values, self.reverse_indices,
self.dims, voxel_coors,
query_points, self.qmin, self.qmax, corres)
corres = corres[:query_points.shape[0]]
corres0 = torch.where(corres != -1)[0]
corres = torch.stack([corres0, corres[corres0]], dim=0)
return corres
@torch.no_grad()
def voxel_graph(self, ref_points, query_points, voxel_size,
pc_range, temporal_offset,
radius=0.5, max_num_neighbors=32):
"""
Args:
ref_points (N, D): reference points
query_points (M, D): query points
temporal_offset: offset in temporal (last) dimension.
Returns:
eq (M): corresponding point index in query_points
er (M): corresponding point index in ref_points
"""
assert ref_points.shape[0] + query_points.shape[0] < self.size * 2
ref_points = ref_points.cuda()
query_points = query_points.cuda()
voxel_size = voxel_size.cuda()
pc_range = pc_range.cuda()
self.ndim = ndim = ref_points.shape[-1]
voxel_coors_ref = torch.round(
(ref_points-pc_range[:ndim]) / voxel_size
).long() + 1
voxel_coors_query = torch.round(
(query_points-pc_range[:ndim]) / voxel_size
).long() + 1
self.dims = ((pc_range[ndim:]-pc_range[:ndim]) / voxel_size).long()+3
self.keys[:] = -1
# hash points into hash table
hash_insert_gpu(self.keys, self.values, self.reverse_indices, self.dims,
voxel_coors_ref, ref_points)
self.qmin[-1] = temporal_offset
self.qmax[-1] = temporal_offset
if query_points.shape[0]*max_num_neighbors > self.corres_pool_size:
self.corres_pool_size = query_points.shape[0]*max_num_neighbors*2
self.corres = torch.zeros(self.corres_pool_size,
dtype=torch.long,
device=self.device)-1
else:
self.corres[:(query_points.shape[0]*max_num_neighbors)] = -1
corres = self.corres
# look up points from hash table
voxel_graph_gpu(self.keys, self.values, self.reverse_indices,
self.dims, voxel_coors_query,
query_points, self.qmin, self.qmax,
max_num_neighbors, radius, corres)
#corres = corres.cpu()
corres = corres[:(query_points.shape[0]*max_num_neighbors)]
corres = corres.view(-1, max_num_neighbors)
mask = (corres != -1)
corres0, corres1 = torch.where(mask)
corres = torch.stack([corres0, corres[(corres0, corres1)]], dim=0)
return corres
@torch.no_grad()
def voxel_graph_step2(self, query_points, temporal_offset,
radius=0.5, max_num_neighbors=32):
"""
Args:
query_points (M, D): query points
temporal_offset: offset in temporal (last) dimension.
Returns:
eq (M): corresponding point index in query_points
er (M): corresponding point index in ref_points
"""
try:
voxel_size = self.voxel_size
ndim = self.ndim
pc_range = self.pc_range
except Exception as e:
raise ValueError(f'make sure you call hash_into_gpu first: {e}')
query_points = query_points.cuda()
voxel_coors_query = torch.round(
(query_points-pc_range[:ndim]) / voxel_size
).long() + 1
self.qmin[-1] = temporal_offset
self.qmax[-1] = temporal_offset
if query_points.shape[0]*max_num_neighbors > self.corres_pool_size:
self.corres_pool_size = query_points.shape[0]*max_num_neighbors*3//2
self.corres = torch.zeros(self.corres_pool_size,
dtype=torch.long,
device=self.device)-1
else:
self.corres[:] = -1
corres = self.corres
# look up points from hash table
voxel_graph_gpu(self.keys, self.values, self.reverse_indices,
self.dims, voxel_coors_query,
query_points, self.qmin, self.qmax,
max_num_neighbors, radius, corres)
corres = corres[:(query_points.shape[0]*max_num_neighbors)
].view(-1, max_num_neighbors)
mask = (corres != -1)
corres0, corres1 = torch.where(mask)
corres = torch.stack([corres0, corres[(corres0, corres1)]], dim=0)
return corres
def test_corres(ht, n_ref, n_query, ndim, radius):
import time
ref = torch.randn(n_ref, ndim)
query = torch.randn(n_query, ndim)
ref[:, -1] = 0
query[:, -1] = 0
pall = torch.cat([ref, query], dim=0)
pc_range = torch.cat(
[pall.min(0)[0]-3, pall.max(0)[0]+3], dim=0
).cuda()
voxel_size = torch.tensor([radius, radius, radius, radius])
start_time = time.time()
eq, er = ht.find_corres(ref, query, voxel_size, pc_range, 0)
end_time = time.time()
dist = (ref[er] - query[eq]).norm(p=2, dim=-1)
er = er[dist <= radius]
eq = eq[dist <= radius]
tree = NN(n_neighbors=1).fit(ref)
dists, index_r = tree.kneighbors(query)
dists, index_r = dists[:, 0], index_r[:, 0]
index_q = torch.arange(query.shape[0])
index_r = index_r[dists < radius]
index_q = index_q[dists < radius]
assert (er.cpu()-index_r).abs().sum() < 1e-5
assert (eq.cpu()-index_q).abs().sum() < 1e-5
return end_time - start_time
def test_vgraph(ht, n_ref, n_query, ndim, radius, n_ngbr=16):
import time
ref = torch.randn(n_ref, ndim)
query = torch.randn(n_query, ndim)
ref[:, -1] = 0
query[:, -1] = 0
pall = torch.cat([ref, query], dim=0)
pc_range = torch.cat(
[pall.min(0)[0]-3, pall.max(0)[0]+3], dim=0
).cuda()
voxel_size = torch.tensor([radius, radius, radius, radius])
start_time = time.time()
eq, er = ht.voxel_graph(ref, query, voxel_size, pc_range, 0,
radius=radius, max_num_neighbors=n_ngbr)
end_time = time.time()
dist = (ref[er] - query[eq]).norm(p=2, dim=-1)
er = er[dist <= radius]
eq = eq[dist <= radius]
tree = NN(n_neighbors=n_ngbr).fit(ref)
dists, index_r = tree.kneighbors(query)
dists, index_r = dists.reshape(-1), torch.tensor(index_r.reshape(-1))
index_q = torch.arange(query.shape[0]).repeat(n_ngbr, 1).T.reshape(-1)
index_r = index_r[dists < radius]
index_q = index_q[dists < radius]
from torch_scatter import scatter
checksum1 = scatter(index_r, index_q, dim=0, dim_size=query.shape[0],
reduce='sum')
checksum2 = scatter(er, eq, dim=0, dim_size=query.shape[0],
reduce='sum')
assert (checksum1 - checksum2.cpu()).abs().sum() < 1e-5
return end_time - start_time
def test_multistep_corres(ht, n_ref, n_query, ndim, radius):
import time
ht.clear()
ref = torch.randn(n_ref, ndim)
ref[:, -1] = 0
voxel_size = torch.tensor([radius, radius, radius, radius])
pc_range = torch.cat(
[ref.min(0)[0]-3, ref.max(0)[0]+3], dim=0
).cuda()
ht.hash_into_gpu(ref, voxel_size, pc_range)
total_time = 0
    for i in range(3):
query = torch.randn(n_query, ndim)
query[:, -1] = 0
query_cuda = query.cuda()
total_time -= time.time()
eq, er = ht.find_corres_step2(query_cuda, 0)
total_time += time.time()
dist = (ref[er] - query[eq]).norm(p=2, dim=-1)
er = er[dist <= radius]
eq = eq[dist <= radius]
tree = NN(n_neighbors=1).fit(ref)
dists, index_r = tree.kneighbors(query)
dists, index_r = dists.reshape(-1), index_r.reshape(-1)
index_q = torch.arange(query.shape[0])
index_r = index_r[dists < radius]
index_q = index_q[dists < radius]
#assert (er.cpu()-index_r).abs().sum() < 1e-5
#assert (eq.cpu()-index_q).abs().sum() < 1e-5
return total_time / 3.0
def test_multistep_vgraph(ht, n_ref, n_query, ndim, radius, n_ngbr=16):
import time
ht.clear()
ref = torch.randn(n_ref, ndim)
ref[:, -1] = 0
voxel_size = torch.tensor([radius, radius, radius, radius])
pc_range = torch.cat(
[ref.min(0)[0]-3, ref.max(0)[0]+3], dim=0
).cuda()
ht.hash_into_gpu(ref, voxel_size, pc_range)
total_time = 0
for i in range(3):
query = torch.randn(n_query, ndim)
query[:, -1] = 0
query_cuda = query.cuda()
total_time -= time.time()
eq, er = ht.voxel_graph_step2(query_cuda, 0, radius=radius,
max_num_neighbors=n_ngbr)
total_time += time.time()
dist = (ref[er] - query[eq]).norm(p=2, dim=-1)
er = er[dist <= radius]
eq = eq[dist <= radius]
tree = NN(n_neighbors=n_ngbr).fit(ref)
dists, index_r = tree.kneighbors(query)
dists, index_r = dists.reshape(-1), torch.tensor(index_r.reshape(-1))
index_q = torch.arange(query.shape[0]).repeat(n_ngbr, 1).T.reshape(-1)
index_r = index_r[dists < radius]
index_q = index_q[dists < radius]
from torch_scatter import scatter
checksum1 = scatter(index_r, index_q, dim=0, dim_size=query.shape[0],
reduce='sum')
checksum2 = scatter(er, eq, dim=0, dim_size=query.shape[0],
reduce='sum')
assert (checksum1 - checksum2.cpu()).abs().sum() < 1e-5
return total_time / 3.0
if __name__ == '__main__':
from sklearn.neighbors import NearestNeighbors as NN
ht = HashTable(2**21)
radius = 0.01
n_ref, n_query = 1000000, 100
ndim = 4
for i in range(3):
time = test_corres(ht, n_ref, n_query, ndim, radius)
print(f'corres test {i:03d}: time={time:.6f}')
for i in range(3):
time = test_multistep_corres(ht, n_ref, n_query, ndim, radius)
print(f'multi-step test {i:03d}, time={time:.6f}')
for i in range(3):
time = test_vgraph(ht, n_ref, n_query, ndim, radius, 16)
print(f'vgraph test {i:03d}: time={time:.6f}')
for i in range(3):
time = test_multistep_vgraph(ht, n_ref, n_query, ndim, radius, 16)
print(f'multi-step vgraph test {i:03d}: time={time:.6f}')
| 35.508475 | 80 | 0.558207 | 2,422 | 18,855 | 4.151941 | 0.070603 | 0.057975 | 0.022673 | 0.025358 | 0.83751 | 0.82687 | 0.794252 | 0.779037 | 0.748111 | 0.719869 | 0 | 0.024686 | 0.321082 | 18,855 | 530 | 81 | 35.575472 | 0.760878 | 0.12209 | 0 | 0.684971 | 0 | 0 | 0.017466 | 0 | 0 | 0 | 0 | 0 | 0.014451 | 1 | 0.037572 | false | 0 | 0.026012 | 0 | 0.095376 | 0.011561 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
741ce84d5d00f669062b5e6883dedc08a865f681 | 10,069 | py | Python | casanovo/denovo/train_test.py | Noble-Lab/casanovo | a7cefa2f0fb5f7ba1c20943d81d4869e98079db4 | [
"Apache-2.0"
] | 11 | 2022-02-09T21:40:07.000Z | 2022-03-29T16:45:55.000Z | casanovo/denovo/train_test.py | Noble-Lab/casanovo | a7cefa2f0fb5f7ba1c20943d81d4869e98079db4 | [
"Apache-2.0"
] | 14 | 2022-02-11T15:59:02.000Z | 2022-03-24T15:54:12.000Z | casanovo/denovo/train_test.py | Noble-Lab/casanovo | a7cefa2f0fb5f7ba1c20943d81d4869e98079db4 | [
"Apache-2.0"
] | null | null | null | """Training and testing functionality for the de novo peptide sequencing model"""
import logging, importlib, os
from pathlib import Path
import pytorch_lightning as pl
from depthcharge.data import AnnotatedSpectrumIndex, SpectrumIndex
from casanovo.denovo import DeNovoDataModule, Spec2Pep
def train(train_data_path, val_data_path, model_path, config_path):
"""Train a Casanovo model with options specified in config.py."""
#Use custom config file if specified
    if config_path is None:
from casanovo import config
else:
importlib.machinery.SourceFileLoader('config', config_path).load_module()
import config
#Set random seed across PyTorch, numpy and python.random
pl.utilities.seed.seed_everything(seed=config.random_seed, workers=True)
#Index training and validation data
train_data_path = Path(train_data_path)
if train_data_path.is_file():
raise FileNotFoundError(f'train_data_path expects directory path but file path was provided instead')
    train_mgf_files = [train_data_path / f for f in os.listdir(train_data_path) if (train_data_path / f).suffix.lower() == ".mgf"]
val_data_path = Path(val_data_path)
if val_data_path.is_file():
raise FileNotFoundError(f'val_data_path expects directory path but file path was provided instead')
val_mgf_files = [val_data_path/f for f in os.listdir(val_data_path) if (val_data_path/f).suffix.lower() == ".mgf"]
train_index = AnnotatedSpectrumIndex(config.train_annot_spec_idx_path, train_mgf_files, overwrite=config.train_spec_idx_overwrite)
val_index = AnnotatedSpectrumIndex(config.val_annot_spec_idx_path, val_mgf_files, overwrite=config.val_spec_idx_overwrite)
#Initialize data loaders
train_loader = DeNovoDataModule(
train_index=train_index,
n_peaks=config.n_peaks,
min_mz=config.min_mz,
max_mz=config.max_mz,
min_intensity=config.min_intensity,
fragment_tol_mass=config.fragment_tol_mass,
preprocess_spec=config.preprocess_spec,
num_workers=config.num_workers,
batch_size=config.train_batch_size
)
val_loader = DeNovoDataModule(
valid_index=val_index,
n_peaks=config.n_peaks,
min_mz=config.min_mz,
max_mz=config.max_mz,
min_intensity=config.min_intensity,
fragment_tol_mass=config.fragment_tol_mass,
preprocess_spec=config.preprocess_spec,
num_workers=config.num_workers,
batch_size=config.val_batch_size
)
train_loader.setup()
val_loader.setup()
# Initialize the model
    if config.train_from_scratch:
model = Spec2Pep(
dim_model=config.dim_model,
n_head=config.n_head,
dim_feedforward=config.dim_feedforward,
n_layers=config.n_layers,
dropout=config.dropout,
dim_intensity=config.dim_intensity,
custom_encoder=config.custom_encoder,
max_length=config.max_length,
residues=config.residues,
max_charge=config.max_charge,
n_log=config.n_log,
tb_summarywriter=config.tb_summarywriter,
warmup_iters = config.warmup_iters,
max_iters = config.max_iters,
lr=config.learning_rate,
weight_decay=config.weight_decay
)
else:
model_path = Path(model_path)
if model_path.is_dir():
raise FileNotFoundError(f'model_path expects file path but directory path was provided instead')
model = Spec2Pep().load_from_checkpoint(
model_path,
dim_model=config.dim_model,
n_head=config.n_head,
dim_feedforward=config.dim_feedforward,
n_layers=config.n_layers,
dropout=config.dropout,
dim_intensity=config.dim_intensity,
custom_encoder=config.custom_encoder,
max_length=config.max_length,
residues=config.residues,
max_charge=config.max_charge,
n_log=config.n_log,
tb_summarywriter=config.tb_summarywriter,
warmup_iters = config.warmup_iters,
max_iters = config.max_iters,
lr=config.learning_rate,
weight_decay=config.weight_decay
)
#Create Trainer object and checkpoint callback to save model
    if config.save_model:
checkpoint_callback = pl.callbacks.ModelCheckpoint(
dirpath=config.model_save_folder_path,
save_weights_only=config.save_weights_only,
filename='{epoch}',
every_n_epochs=config.every_n_epochs,
save_top_k=-1
)
trainer = pl.Trainer(
accelerator=config.accelerator,
logger=config.logger,
gpus=config.gpus,
max_epochs=config.max_epochs,
num_sanity_val_steps=config.num_sanity_val_steps,
callbacks=[checkpoint_callback]
)
else:
trainer = pl.Trainer(
accelerator=config.accelerator,
logger=config.logger,
gpus=config.gpus,
max_epochs=config.max_epochs,
num_sanity_val_steps=config.num_sanity_val_steps
)
#Train the model
trainer.fit(model, train_loader.train_dataloader(), val_loader.val_dataloader())
def test_evaluate(test_data_path, model_path, config_path):
"""Run inference a pre-trained Casanovo model with evaluation and using options specified in config.py."""
#Use custom config file if specified
    if config_path is None:
from casanovo import config
else:
importlib.machinery.SourceFileLoader('config', config_path).load_module()
import config
# Initialize the pre-trained model
model_path = Path(model_path)
if model_path.is_dir():
raise FileNotFoundError(f'model_path expects file path but directory path was provided instead')
model_trained = Spec2Pep().load_from_checkpoint(
model_path,
dim_model=config.dim_model,
n_head=config.n_head,
dim_feedforward=config.dim_feedforward,
n_layers=config.n_layers,
dropout=config.dropout,
dim_intensity=config.dim_intensity,
custom_encoder=config.custom_encoder,
max_length=config.max_length,
residues=config.residues,
max_charge=config.max_charge,
n_log=config.n_log,
)
#Index test data
test_data_path = Path(test_data_path)
if test_data_path.is_file():
raise FileNotFoundError(f'test_data_path expects directory path but file path was provided instead')
mgf_files = [test_data_path/f for f in os.listdir(test_data_path) if (test_data_path/f).suffix.lower() == ".mgf"]
index = AnnotatedSpectrumIndex(config.test_annot_spec_idx_path, mgf_files, overwrite=config.test_spec_idx_overwrite)
#Initialize the data loader
    loaders = DeNovoDataModule(
test_index=index,
n_peaks=config.n_peaks,
min_mz=config.min_mz,
max_mz=config.max_mz,
min_intensity=config.min_intensity,
fragment_tol_mass=config.fragment_tol_mass,
preprocess_spec=config.preprocess_spec,
num_workers=config.num_workers,
batch_size=config.test_batch_size
)
loaders.setup(stage='test', annotated=True)
#Create Trainer object
trainer = pl.Trainer(
accelerator=config.accelerator,
logger=config.logger,
gpus=config.gpus,
max_epochs=config.max_epochs,
num_sanity_val_steps=config.num_sanity_val_steps
)
#Run test
trainer.validate(model_trained, loaders.test_dataloader())
def test_denovo(test_data_path, model_path, config_path, output_path):
"""Run inference with a pre-trained Casanovo model without evaluation and using options specified in config.py."""
#Use custom config file if specified
    if config_path is None:
from casanovo import config
else:
importlib.machinery.SourceFileLoader('config', config_path).load_module()
import config
# Initialize the pre-trained model
model_path = Path(model_path)
if model_path.is_dir():
raise FileNotFoundError(f'model_path expects file path but directory path was provided instead')
model_trained = Spec2Pep().load_from_checkpoint(
model_path,
dim_model=config.dim_model,
n_head=config.n_head,
dim_feedforward=config.dim_feedforward,
n_layers=config.n_layers,
dropout=config.dropout,
dim_intensity=config.dim_intensity,
custom_encoder=config.custom_encoder,
max_length=config.max_length,
residues=config.residues,
max_charge=config.max_charge,
n_log=config.n_log,
output_path=output_path
)
#Index test data
test_data_path = Path(test_data_path)
if test_data_path.is_file():
raise FileNotFoundError(f'test_data_path expects directory path but file path was provided instead')
mgf_files = [test_data_path/f for f in os.listdir(test_data_path) if (test_data_path/f).suffix.lower() == ".mgf"]
index = SpectrumIndex(config.test_annot_spec_idx_path, mgf_files, overwrite=config.test_spec_idx_overwrite)
#Initialize the data loader
    loaders = DeNovoDataModule(
test_index=index,
n_peaks=config.n_peaks,
min_mz=config.min_mz,
max_mz=config.max_mz,
min_intensity=config.min_intensity,
fragment_tol_mass=config.fragment_tol_mass,
preprocess_spec=config.preprocess_spec,
num_workers=config.num_workers,
batch_size=config.test_batch_size
)
loaders.setup(stage='test', annotated=False)
#Create Trainer object
trainer = pl.Trainer(
accelerator=config.accelerator,
logger=config.logger,
gpus=config.gpus,
max_epochs=config.max_epochs,
num_sanity_val_steps=config.num_sanity_val_steps
)
# Run the model without evaluation
trainer.test(model_trained, loaders.test_dataloader()) | 37.996226 | 134 | 0.694011 | 1,280 | 10,069 | 5.152344 | 0.125781 | 0.038817 | 0.029113 | 0.020622 | 0.778165 | 0.760879 | 0.752237 | 0.718271 | 0.710993 | 0.710993 | 0 | 0.000774 | 0.230112 | 10,069 | 265 | 135 | 37.996226 | 0.849974 | 0.088191 | 0 | 0.714286 | 0 | 0 | 0.059171 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014286 | false | 0 | 0.066667 | 0 | 0.080952 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
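The custom-config branch above relies on `SourceFileLoader(...).load_module()`, which has been deprecated since Python 3.4. A spec-based equivalent — sketched here as a hypothetical `load_config` helper, not part of the Casanovo API — keeps the same behavior, including registering the module so a later `import config` resolves to it:

```python
import importlib.util
import sys

def load_config(config_path):
    """Load a user-supplied config.py as a module named 'config' (hypothetical helper)."""
    spec = importlib.util.spec_from_file_location("config", config_path)
    config = importlib.util.module_from_spec(spec)
    sys.modules["config"] = config   # so a later `import config` finds this module
    spec.loader.exec_module(config)  # actually executes config.py
    return config
```

Unlike `load_module()`, this path gives explicit control over when the file is executed and under which module name it is registered.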
7440864b52076ca17ae2090937e9ad6b35771117 | 13,001 | py | Python | tests/stubs/models/domain/ml_model.py | PSE-TECO-2020-TEAM1/e2e-ml_model-management | 7f01a008648e25a29c639a5e16124b2e399eb821 | [
"MIT"
] | 1 | 2021-05-04T08:46:19.000Z | 2021-05-04T08:46:19.000Z | tests/stubs/models/domain/ml_model.py | PSE-TECO-2020-TEAM1/e2e-ml_model-management | 7f01a008648e25a29c639a5e16124b2e399eb821 | [
"MIT"
] | null | null | null | tests/stubs/models/domain/ml_model.py | PSE-TECO-2020-TEAM1/e2e-ml_model-management | 7f01a008648e25a29c639a5e16124b2e399eb821 | [
"MIT"
] | 1 | 2022-01-28T21:21:32.000Z | 2022-01-28T21:21:32.000Z | from app.ml.objects.classification.factory import get_classifier
from app.ml.objects.column_transfomer import PandasColumnTransformer
from sklearn.pipeline import Pipeline
from tests.stubs.models.domain.feature_extraction_data import get_labels_of_data_windows_4_2, get_labels_of_data_windows_5_1
from app.models.domain.performance_metrics import PerformanceMetrics, SingleMetric
from app.ml.objects.classification.enum import Classifier
from app.ml.objects.feature.enum import Feature
from app.ml.objects.normalization.enum import Normalization
from app.ml.objects.imputation.enum import Imputation
from app.models.domain.sliding_window import SlidingWindow
from app.models.domain.training_config import PerComponentConfig, PipelineConfig, TrainingConfig
from app.models.domain.ml_model import MlModel
from app.ml.objects.classification.classifier_config_spaces.util import config_spaces
from sklearn.preprocessing import LabelEncoder
from app.ml.objects.imputation.factory import get_imputer
from app.ml.objects.normalization.factory import get_normalizer
from sklearn.compose import make_column_selector
MATCH_REST_REGEX = ".*"
def get_rfc_default_hyperparameters():
rfc_default_hyperparameters = config_spaces[Classifier.RANDOM_FOREST_CLASSIFIER].get_default_configuration(
).get_dictionary()
for name, value in rfc_default_hyperparameters.items():
if value == "None":
rfc_default_hyperparameters[name] = None
elif value == "True":
rfc_default_hyperparameters[name] = True
elif value == "False":
rfc_default_hyperparameters[name] = False
return rfc_default_hyperparameters
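get_rfc_default_hyperparameters() above converts the strings "None"/"True"/"False" that ConfigSpace's get_dictionary() returns back into Python literals. The same idea as a reusable helper (a sketch, not part of the project's API):

```python
def coerce_config_values(params):
    """Map ConfigSpace's stringified literals back to Python values."""
    literals = {"None": None, "True": True, "False": False}
    return {
        name: literals[value] if isinstance(value, str) and value in literals else value
        for name, value in params.items()
    }
```

This avoids repeating the if/elif ladder for every classifier whose default configuration is pulled from a config space.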
def get_training_config_stub_5_1():
return TrainingConfig(
model_name="Model_5_1",
sliding_window=SlidingWindow(window_size=5, sliding_step=1),
perComponentConfigs={
"x_Accelerometer": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.MEAN_IMPUTATION,
normalization=Normalization.MIN_MAX_SCALER)
),
"y_Accelerometer": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.LINEAR_INTERPOLATION,
normalization=Normalization.NORMALIZER)
),
"z_Accelerometer": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.ZERO_INTERPOLATION,
normalization=Normalization.STANDARD_SCALER)
),
"x_Gyroscope": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.MEAN_IMPUTATION,
normalization=Normalization.MIN_MAX_SCALER)
),
"y_Gyroscope": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.LINEAR_INTERPOLATION,
normalization=Normalization.NORMALIZER)
),
"z_Gyroscope": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.ZERO_INTERPOLATION,
normalization=Normalization.STANDARD_SCALER)
)
},
classifier=Classifier.RANDOM_FOREST_CLASSIFIER,
hyperparameters=get_rfc_default_hyperparameters()
)
def get_label_performance_metrics_stub_5_1():
return [PerformanceMetrics(label='Rotate', metrics=[SingleMetric(name='precision', score=0.0), SingleMetric(name='recall', score=0.0), SingleMetric(name='f1-score', score=0.0), SingleMetric(
name='support', score=1.0)]), PerformanceMetrics(label='Shake', metrics=[SingleMetric(name='precision', score=0.0), SingleMetric(name='recall', score=0.0), SingleMetric(name='f1-score', score=0.0), SingleMetric(name='support', score=0.0)])]
def get_column_order_stub_5_1():
return ['x_Accelerometer__minimum', 'x_Accelerometer__maximum', 'y_Accelerometer__minimum', 'y_Accelerometer__maximum', 'z_Accelerometer__minimum',
'z_Accelerometer__maximum', 'x_Gyroscope__minimum', 'x_Gyroscope__maximum', 'y_Gyroscope__minimum', 'y_Gyroscope__maximum', 'z_Gyroscope__minimum', 'z_Gyroscope__maximum']
def get_label_encoder_stub_5_1():
return LabelEncoder().fit(get_labels_of_data_windows_5_1())
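The stub above just fits a LabelEncoder on the window labels; for reference, the encoder maps the sorted unique labels to integer class indices (assuming scikit-learn's standard behavior, with toy labels):

```python
from sklearn.preprocessing import LabelEncoder

le = LabelEncoder().fit(["Shake", "Rotate", "Shake"])
classes = list(le.classes_)                 # sorted unique labels: ['Rotate', 'Shake']
codes = le.transform(["Shake", "Rotate"])   # integer indices into classes_
labels = le.inverse_transform([0, 1])       # back to string labels
```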
def get_pipeline_stub_5_1():
return Pipeline(steps=[
("imputation", PandasColumnTransformer([
("x_Accelerometer", get_imputer(Imputation.MEAN_IMPUTATION),
make_column_selector(pattern="x_Accelerometer" + MATCH_REST_REGEX)),
("y_Accelerometer", get_imputer(Imputation.LINEAR_INTERPOLATION),
make_column_selector(pattern="y_Accelerometer" + MATCH_REST_REGEX)),
("z_Accelerometer", get_imputer(Imputation.ZERO_INTERPOLATION),
make_column_selector(pattern="z_Accelerometer" + MATCH_REST_REGEX)),
("x_Gyroscope", get_imputer(Imputation.MEAN_IMPUTATION),
make_column_selector(pattern="x_Gyroscope" + MATCH_REST_REGEX)),
("y_Gyroscope", get_imputer(Imputation.LINEAR_INTERPOLATION),
make_column_selector(pattern="y_Gyroscope" + MATCH_REST_REGEX)),
("z_Gyroscope", get_imputer(Imputation.ZERO_INTERPOLATION),
make_column_selector(pattern="z_Gyroscope" + MATCH_REST_REGEX)),
])),
("normalization", PandasColumnTransformer([
("x_Accelerometer", get_normalizer(Normalization.MIN_MAX_SCALER),
make_column_selector(pattern="x_Accelerometer" + MATCH_REST_REGEX)),
("y_Accelerometer", get_normalizer(Normalization.NORMALIZER),
make_column_selector(pattern="y_Accelerometer" + MATCH_REST_REGEX)),
("z_Accelerometer", get_normalizer(Normalization.STANDARD_SCALER),
make_column_selector(pattern="z_Accelerometer" + MATCH_REST_REGEX)),
("x_Gyroscope", get_normalizer(Normalization.MIN_MAX_SCALER),
make_column_selector(pattern="x_Gyroscope" + MATCH_REST_REGEX)),
("y_Gyroscope", get_normalizer(Normalization.NORMALIZER),
make_column_selector(pattern="y_Gyroscope" + MATCH_REST_REGEX)),
("z_Gyroscope", get_normalizer(Normalization.STANDARD_SCALER),
make_column_selector(pattern="z_Gyroscope" + MATCH_REST_REGEX))
])),
("classification", get_classifier(Classifier.RANDOM_FOREST_CLASSIFIER, get_rfc_default_hyperparameters()))
])
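The pipeline stub above wires one imputer and one normalizer per sensor axis via regex column selectors. A minimal runnable version of the same pattern, using plain scikit-learn pieces instead of the project's factories (toy data and hypothetical column names):

```python
import pandas as pd
from sklearn.compose import ColumnTransformer, make_column_selector
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

df = pd.DataFrame({
    "x_Accelerometer__minimum": [0.0, None, 4.0],  # the NaN gets mean-imputed to 2.0
    "x_Accelerometer__maximum": [1.0, 3.0, 5.0],
})
pipe = Pipeline(steps=[
    ("imputation", ColumnTransformer([
        ("x_Accelerometer", SimpleImputer(strategy="mean"),
         make_column_selector(pattern="x_Accelerometer.*")),
    ])),
    ("normalization", MinMaxScaler()),
])
out = pipe.fit_transform(df)  # ndarray, one row per window, scaled to [0, 1]
```

The regex selector is what lets each axis keep its own imputation/normalization strategy without listing every derived feature column by hand.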
def get_ml_model_stub_5_1() -> MlModel:
return MlModel(
_id="60736013b961d239c76711a3",
config=get_training_config_stub_5_1(),
label_performance_metrics=get_label_performance_metrics_stub_5_1(),
column_order=get_column_order_stub_5_1(),
label_encoder_object_file_ID="6073604f8741014a0cd09780",
pipeline_object_file_ID="607360560b3beb16ac03de77"
)
def get_training_config_stub_4_2():
return TrainingConfig(
model_name="Model_4_2",
sliding_window=SlidingWindow(window_size=4, sliding_step=2),
perComponentConfigs={
"x_Accelerometer": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.MEAN_IMPUTATION,
normalization=Normalization.MIN_MAX_SCALER)
),
"y_Accelerometer": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.LINEAR_INTERPOLATION,
normalization=Normalization.NORMALIZER)
),
"z_Accelerometer": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.ZERO_INTERPOLATION,
normalization=Normalization.STANDARD_SCALER)
),
"x_Gyroscope": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.MEAN_IMPUTATION,
normalization=Normalization.MIN_MAX_SCALER)
),
"y_Gyroscope": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.LINEAR_INTERPOLATION,
normalization=Normalization.NORMALIZER)
),
"z_Gyroscope": PerComponentConfig(
features=[Feature.MINIMUM, Feature.MAXIMUM],
pipeline_config=PipelineConfig(imputation=Imputation.ZERO_INTERPOLATION,
normalization=Normalization.STANDARD_SCALER)
)
},
classifier=Classifier.RANDOM_FOREST_CLASSIFIER,
hyperparameters=get_rfc_default_hyperparameters()
)
def get_label_performance_metrics_stub_4_2():
return [PerformanceMetrics(label='Rotate', metrics=[SingleMetric(name='precision', score=0.0), SingleMetric(name='recall', score=0.0), SingleMetric(name='f1-score', score=0.0), SingleMetric(name='support', score=1.0)]), PerformanceMetrics(label='Shake', metrics=[SingleMetric(name='precision', score=0.0), SingleMetric(name='recall', score=0.0), SingleMetric(name='f1-score', score=0.0), SingleMetric(name='support', score=0.0)])]
def get_label_encoder_stub_4_2():
return LabelEncoder().fit(get_labels_of_data_windows_4_2())
def get_column_order_stub_4_2():
return ['x_Accelerometer__minimum', 'x_Accelerometer__maximum', 'y_Accelerometer__minimum', 'y_Accelerometer__maximum', 'z_Accelerometer__minimum',
'z_Accelerometer__maximum', 'x_Gyroscope__minimum', 'x_Gyroscope__maximum', 'y_Gyroscope__minimum', 'y_Gyroscope__maximum', 'z_Gyroscope__minimum', 'z_Gyroscope__maximum']
def get_pipeline_stub_4_2():
return Pipeline(steps=[
("imputation", PandasColumnTransformer([
("x_Accelerometer", get_imputer(Imputation.MEAN_IMPUTATION),
make_column_selector(pattern="x_Accelerometer" + MATCH_REST_REGEX)),
("y_Accelerometer", get_imputer(Imputation.LINEAR_INTERPOLATION),
make_column_selector(pattern="y_Accelerometer" + MATCH_REST_REGEX)),
("z_Accelerometer", get_imputer(Imputation.ZERO_INTERPOLATION),
make_column_selector(pattern="z_Accelerometer" + MATCH_REST_REGEX)),
("x_Gyroscope", get_imputer(Imputation.MEAN_IMPUTATION),
make_column_selector(pattern="x_Gyroscope" + MATCH_REST_REGEX)),
("y_Gyroscope", get_imputer(Imputation.LINEAR_INTERPOLATION),
make_column_selector(pattern="y_Gyroscope" + MATCH_REST_REGEX)),
("z_Gyroscope", get_imputer(Imputation.ZERO_INTERPOLATION),
make_column_selector(pattern="z_Gyroscope" + MATCH_REST_REGEX)),
])),
("normalization", PandasColumnTransformer([
("x_Accelerometer", get_normalizer(Normalization.MIN_MAX_SCALER),
make_column_selector(pattern="x_Accelerometer" + MATCH_REST_REGEX)),
("y_Accelerometer", get_normalizer(Normalization.NORMALIZER),
make_column_selector(pattern="y_Accelerometer" + MATCH_REST_REGEX)),
("z_Accelerometer", get_normalizer(Normalization.STANDARD_SCALER),
make_column_selector(pattern="z_Accelerometer" + MATCH_REST_REGEX)),
("x_Gyroscope", get_normalizer(Normalization.MIN_MAX_SCALER),
make_column_selector(pattern="x_Gyroscope" + MATCH_REST_REGEX)),
("y_Gyroscope", get_normalizer(Normalization.NORMALIZER),
make_column_selector(pattern="y_Gyroscope" + MATCH_REST_REGEX)),
("z_Gyroscope", get_normalizer(Normalization.STANDARD_SCALER),
make_column_selector(pattern="z_Gyroscope" + MATCH_REST_REGEX))
])),
("classification", get_classifier(Classifier.RANDOM_FOREST_CLASSIFIER, get_rfc_default_hyperparameters()))
])
def get_ml_model_stub_4_2() -> MlModel:
return MlModel(
_id="607367362d98418cae5a1522",
config=get_training_config_stub_4_2(),
label_performance_metrics=get_label_performance_metrics_stub_4_2(),
column_order=get_column_order_stub_4_2(),
label_encoder_object_file_ID="60736744c795e80494dbc3e9",
pipeline_object_file_ID="60736749a4c260179e9be00d"
)
| 56.038793 | 434 | 0.691485 | 1,292 | 13,001 | 6.55418 | 0.092105 | 0.029523 | 0.053141 | 0.070855 | 0.852149 | 0.786845 | 0.764289 | 0.757204 | 0.73453 | 0.73453 | 0 | 0.019491 | 0.214676 | 13,001 | 231 | 435 | 56.281385 | 0.809892 | 0 | 0 | 0.656863 | 0 | 0 | 0.130836 | 0.033228 | 0 | 0 | 0 | 0 | 0 | 1 | 0.063725 | false | 0 | 0.083333 | 0.058824 | 0.210784 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7485326fa193c45183eea432242fe661913999a3 | 205 | py | Python | datasets/__init__.py | mgwillia/unsupervised-analysis | 66b5c7af391199394ace3a1ff396d46ac5bd7ce5 | [
"MIT"
] | null | null | null | datasets/__init__.py | mgwillia/unsupervised-analysis | 66b5c7af391199394ace3a1ff396d46ac5bd7ce5 | [
"MIT"
] | null | null | null | datasets/__init__.py | mgwillia/unsupervised-analysis | 66b5c7af391199394ace3a1ff396d46ac5bd7ce5 | [
"MIT"
] | null | null | null | from .cub import *
from .cars import *
from .dogs import *
from .flowers import *
from .imagenet import *
from .imagenet_features import *
from .neighbors_dataset import *
from .transforms_dataset import * | 25.625 | 33 | 0.770732 | 27 | 205 | 5.740741 | 0.407407 | 0.451613 | 0.232258 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15122 | 205 | 8 | 33 | 25.625 | 0.890805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
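The wildcard imports above re-export everything each dataset module chooses to expose; star-imports only stay bounded if each submodule defines `__all__` (or prefixes private names with an underscore). A small self-contained demonstration with a throwaway module:

```python
import sys
import types

# Build a throwaway module: __all__ restricts what `import *` exports
mod = types.ModuleType("demo_dataset")
exec("__all__ = ['loader']\nloader = 'cub'\ncache = {}", mod.__dict__)
sys.modules["demo_dataset"] = mod

ns = {}
exec("from demo_dataset import *", ns)
# ns now contains 'loader' but not 'cache'
</imports```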
77b283ae45c45d0e649fcbf4c32ab43ddbaddded | 8,485 | py | Python | Chess/ChessAI.py | katrinaalaimo/chess | 87cd6f5b442dc40c42aa758d0b22715570dbadee | [
"MIT"
] | null | null | null | Chess/ChessAI.py | katrinaalaimo/chess | 87cd6f5b442dc40c42aa758d0b22715570dbadee | [
"MIT"
] | null | null | null | Chess/ChessAI.py | katrinaalaimo/chess | 87cd6f5b442dc40c42aa758d0b22715570dbadee | [
"MIT"
] | null | null | null | import random
# Depth of the algorithm determining AI moves. Higher set_depth == harder AI. Lower if engine is too slow.
set_depth = 4
# Positive values are good for white, negative for black. i.e. black checkmate = -1000
checkmate_points = 1000
stalemate_points = 0
piece_scores = {'K': 200.0, 'Q': 9.0, 'R': 5.0, 'B': 3.3, 'N': 3.2, 'P': 1.0}
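piece_scores above drives the material term of the evaluation; in isolation that term looks like this (a sketch assuming the engine's `'--'` empty-square and `'wQ'`/`'bP'` piece-naming conventions):

```python
piece_scores = {'K': 200.0, 'Q': 9.0, 'R': 5.0, 'B': 3.3, 'N': 3.2, 'P': 1.0}

def material_score(board):
    """Sum piece values over the board: positive favors white, negative black."""
    score = 0.0
    for rank in board:
        for square in rank:
            if square != '--':  # '--' marks an empty square
                value = piece_scores[square[1]]
                score += value if square[0] == 'w' else -value
    return score
```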
piece_positions = {
'wP': [
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
[5.0, 5.0, 5.0, 5.0, 5.0, 5.0, 5.0, 5.0],
[1.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0, 1.0],
[0.5, 0.5, 1.0, 2.5, 2.5, 1.0, 0.5, 0.5],
[0.0, 0.0, 0.0, 2.0, 2.0, 0.0, 0.0, 0.0],
[0.5, -0.5, -1.0, 0.0, 0.0, -1.0, -0.5, 0.5],
[0.5, 1.0, 1.0, -2.0, -2.0, 1.0, 1.0, 0.5],
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]],
'bP': [
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
[0.5, 1.0, 1.0, -2.0, -2.0, 1.0, 1.0, 0.5],
[0.5, -0.5, -1.0, 0.0, 0.0, -1.0, -0.5, 0.5],
[0.0, 0.0, 0.0, 2.0, 2.0, 0.0, 0.0, 0.0],
[0.5, 0.5, 1.0, 2.5, 2.5, 1.0, 0.5, 0.5],
[1.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0, 1.0],
[5.0, 5.0, 5.0, 5.0, 5.0, 5.0, 5.0, 5.0],
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]],
'wN': [
[-5.0, -4.0, -3.0, -3.0, -3.0, -3.0, -4.0, -5.0],
[-4.0, -2.0, 0.0, 0.0, 0.0, 0.0, -2.0, -4.0],
[-3.0, 0.0, 1.0, 1.5, 1.5, 1.0, 0.0, -3.0],
[-3.0, 0.5, 1.5, 2.0, 2.0, 1.5, 0.5, -3.0],
[-3.0, 0.0, 1.5, 2.0, 2.0, 1.5, 0.0, -3.0],
[-3.0, 0.5, 1.0, 1.5, 1.5, 1.0, 0.5, -3.0],
[-4.0, -2.0, 0.0, 0.5, 0.5, 0.0, -2.0, -4.0],
[-5.0, -4.0, -3.0, -3.0, -3.0, -3.0, -4.0, -5.0]],
'bN': [
[-5.0, -4.0, -3.0, -3.0, -3.0, -3.0, -4.0, -5.0],
[-4.0, -2.0, 0.0, 0.5, 0.5, 0.0, -2.0, -4.0],
[-3.0, 0.5, 1.0, 1.5, 1.5, 1.0, 0.5, -3.0],
[-3.0, 0.0, 1.5, 2.0, 2.0, 1.5, 0.0, -3.0],
[-3.0, 0.5, 1.5, 2.0, 2.0, 1.5, 0.5, -3.0],
[-3.0, 0.0, 1.0, 1.5, 1.5, 1.0, 0.0, -3.0],
[-4.0, -2.0, 0.0, 0.0, 0.0, 0.0, -2.0, -4.0],
[-5.0, -4.0, -3.0, -3.0, -3.0, -3.0, -4.0, -5.0]],
'wB': [
[-2.0, -1.0, -1.0, -1.0, -1.0, -1.0, -1.0, -2.0],
[-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0],
[-1.0, 0.0, 0.5, 1.0, 1.0, 0.5, 0.0, -1.0],
[-1.0, 0.5, 0.5, 1.0, 1.0, 0.5, 0.5, -1.0],
[-1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, -1.0],
[-1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, -1.0],
[-1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.5, -1.0],
[-2.0, -1.0, -1.0, -1.0, -1.0, -1.0, -1.0, -2.0]],
'bB': [
[-2.0, -1.0, -1.0, -1.0, -1.0, -1.0, -1.0, -2.0],
[-1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.5, -1.0],
[-1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, -1.0],
[-1.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, -1.0],
[-1.0, 0.5, 0.5, 1.0, 1.0, 0.5, 0.5, -1.0],
[-1.0, 0.0, 0.5, 1.0, 1.0, 0.5, 0.0, -1.0],
[-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0],
[-2.0, -1.0, -1.0, -1.0, -1.0, -1.0, -1.0, -2.0]],
'wR': [
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
[0.5, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[0.0, 0.0, 0.0, 0.5, 0.5, 0.0, 0.0, 0.0]],
'bR': [
[0.0, 0.0, 0.0, 0.5, 0.5, 0.0, 0.0, 0.0],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[-0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5],
[0.5, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.5],
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]],
'wQ': [
[-2.0, -1.0, -1.0, -0.5, -0.5, -1.0, -1.0, -2.0],
[-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0],
[-1.0, 0.0, 0.5, 0.5, 0.5, 0.5, 0.0, -1.0],
[-0.5, 0.0, 0.5, 0.5, 0.5, 0.5, 0.0, -0.5],
[0.0, 0.0, 0.5, 0.5, 0.5, 0.5, 0.0, -0.5],
[-1.0, 0.5, 0.5, 0.5, 0.5, 0.5, 0.0, -1.0],
[-1.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.0, -1.0],
[-2.0, -1.0, -1.0, -0.5, -0.5, -1.0, -1.0, -2.0]],
"bQ": [
[-2.0, -1.0, -1.0, -0.5, -0.5, -1.0, -1.0, -2.0],
[-1.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.0, -1.0],
[-1.0, 0.5, 0.5, 0.5, 0.5, 0.5, 0.0, -1.0],
[0.0, 0.0, 0.5, 0.5, 0.5, 0.5, 0.0, -0.5],
[-0.5, 0.0, 0.5, 0.5, 0.5, 0.5, 0.0, -0.5],
[-1.0, 0.0, 0.5, 0.5, 0.5, 0.5, 0.0, -1.0],
[-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0],
[-2.0, -1.0, -1.0, -0.5, -0.5, -1.0, -1.0, -2.0]],
'wK': [ # Uses chessprogramming.org King middle game values
[-3.0, -4.0, -4.0, -5.0, -5.0, -4.0, -4.0, -3.0],
[-3.0, -4.0, -4.0, -5.0, -5.0, -4.0, -4.0, -3.0],
[-3.0, -4.0, -4.0, -5.0, -5.0, -4.0, -4.0, -3.0],
[-3.0, -4.0, -4.0, -5.0, -5.0, -4.0, -4.0, -3.0],
[-2.0, -3.0, -3.0, -4.0, -4.0, -3.0, -3.0, -2.0],
[-1.0, -2.0, -2.0, -2.0, -2.0, -2.0, -2.0, -1.0],
[2.0, 2.0, 0.0, 0.0, 0.0, 0.0, 2.0, 2.0],
[2.0, 3.0, 1.0, 0.0, 0.0, 1.0, 3.0, 2.0]],
'bK': [ # Uses chessprogramming.org King middle game values
[2.0, 3.0, 1.0, 0.0, 0.0, 1.0, 3.0, 2.0],
[2.0, 2.0, 0.0, 0.0, 0.0, 0.0, 2.0, 2.0],
[-1.0, -2.0, -2.0, -2.0, -2.0, -2.0, -2.0, -1.0],
[-2.0, -3.0, -3.0, -4.0, -4.0, -3.0, -3.0, -2.0],
[-3.0, -4.0, -4.0, -5.0, -5.0, -4.0, -4.0, -3.0],
[-3.0, -4.0, -4.0, -5.0, -5.0, -4.0, -4.0, -3.0],
[-3.0, -4.0, -4.0, -5.0, -5.0, -4.0, -4.0, -3.0],
[-3.0, -4.0, -4.0, -5.0, -5.0, -4.0, -4.0, -3.0]]}
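Each black table above is a hand-mirrored copy of its white counterpart, which invites transcription typos. Deriving the mirror programmatically avoids that (a sketch; the module hard-codes both instead):

```python
def mirror_table(white_table):
    """Black's piece-square table is the white table flipped rank-wise."""
    return [list(rank) for rank in reversed(white_table)]
```

With this, `piece_positions['bP']` could simply be `mirror_table(piece_positions['wP'])`, and the two tables can never drift apart.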
def find_random_move(valid_moves):
return random.choice(valid_moves)
def find_best_move(game_state, valid_moves):
"""Helper method to make first recursive call"""
global next_move
next_move = None
random.shuffle(valid_moves)
find_negamax_move_alphabeta(game_state, valid_moves, set_depth, -checkmate_points, checkmate_points,
1 if game_state.white_to_move else -1)
return next_move
def find_negamax_move_alphabeta(game_state, valid_moves, depth, alpha, beta, turn_multiplier):
"""
NegaMax algorithm with alpha beta pruning.
Alpha beta pruning eliminates the need to check all moves within the game_state tree when
a better branch has been found or a branch has too low of a score.
alpha: lower bound (best score the current player is already guaranteed);
beta: upper bound (best score the opponent will allow).
If the max score is greater than alpha, it becomes the new alpha value.
If alpha becomes >= beta, the rest of the branch cannot affect the result, so break out of it.
White is always trying to maximise score and black is always
trying to minimise score. Once the possibility of a higher max or lower min
has been eliminated, there is no need to check further branches.
"""
global next_move
if depth == 0:
return turn_multiplier * score_board(game_state)
max_score = -checkmate_points
for move in valid_moves:
game_state.make_move(move)
next_moves = game_state.get_valid_moves()
score = -find_negamax_move_alphabeta(game_state, next_moves, depth - 1, -beta, -alpha, -turn_multiplier)
if score > max_score:
max_score = score
if depth == set_depth:
next_move = move
game_state.undo_move()
# Pruning
if max_score > alpha:
alpha = max_score
if alpha >= beta:
break
return max_score
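The same control flow as find_negamax_move_alphabeta, stripped of the chess game state and run on an explicit toy game tree (hypothetical `children`/`value` callbacks; leaf values are from white's perspective):

```python
def negamax(node, depth, alpha, beta, color, children, value):
    """Generic negamax with alpha-beta pruning over an explicit game tree."""
    kids = children(node)
    if depth == 0 or not kids:
        return color * value(node)
    best = float('-inf')
    for child in kids:
        score = -negamax(child, depth - 1, -beta, -alpha, -color, children, value)
        best = max(best, score)
        alpha = max(alpha, best)
        if alpha >= beta:
            break  # prune: the opponent will never allow this line
    return best

tree = {'root': ['a', 'b'], 'a': ['a1', 'a2'], 'b': ['b1', 'b2']}
leaves = {'a1': 3, 'a2': 5, 'b1': -2, 'b2': 9}
best = negamax('root', 2, float('-inf'), float('inf'), 1,
               lambda n: tree.get(n, []), lambda n: leaves.get(n, 0))
# best == 3: white picks 'a' because black answers 'b' with b1 (-2); b2 is pruned
```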
def score_board(game_state):
"""Positive score is good for white; negative score is good for black."""
if game_state.checkmate:
if game_state.white_to_move:
return -checkmate_points # Black wins
else:
return checkmate_points # White wins
elif game_state.stalemate:
return stalemate_points
score = 0
for row in range(len(game_state.board)):
for column in range(len(game_state.board)):
if game_state.board[row][column][0] == 'w':
score += piece_scores[game_state.board[row][column][1]]
score += piece_positions[game_state.board[row][column]][row][column]
elif game_state.board[row][column][0] == 'b':
score -= piece_scores[game_state.board[row][column][1]]
score -= piece_positions[game_state.board[row][column]][row][column]
return score
| 43.512821 | 112 | 0.438774 | 2,004 | 8,485 | 1.81487 | 0.073852 | 0.269453 | 0.317569 | 0.350839 | 0.564476 | 0.564476 | 0.51691 | 0.492989 | 0.469068 | 0.467143 | 0 | 0.257815 | 0.283677 | 8,485 | 194 | 113 | 43.737113 | 0.340573 | 0.12033 | 0 | 0.53125 | 0 | 0 | 0.004324 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.025 | false | 0 | 0.00625 | 0.00625 | 0.08125 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
77c43d8fb929cd620a1cef91922a8e5de2063402 | 98 | py | Python | __init__.py | aleozlx/imeta.py | 3312743626c461bc74f4aac2b05b5a9cd91231a0 | [
"MIT"
] | null | null | null | __init__.py | aleozlx/imeta.py | 3312743626c461bc74f4aac2b05b5a9cd91231a0 | [
"MIT"
] | null | null | null | __init__.py | aleozlx/imeta.py | 3312743626c461bc74f4aac2b05b5a9cd91231a0 | [
"MIT"
] | null | null | null | __all__ = ['metadata', 'add_xval']
from .metadata import metadata
from .metadata import add_xval
| 19.6 | 34 | 0.765306 | 13 | 98 | 5.307692 | 0.461538 | 0.202899 | 0.521739 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132653 | 98 | 4 | 35 | 24.5 | 0.811765 | 0 | 0 | 0 | 0 | 0 | 0.163265 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
77ed904cca99b53a6e2c926e65a62f91529afe18 | 46,639 | py | Python | test/test_checks_build.py | zultron/catkin_lint | 7076a3626f5673e58c519346fa52cc78e759d100 | [
"BSD-3-Clause"
] | null | null | null | test/test_checks_build.py | zultron/catkin_lint | 7076a3626f5673e58c519346fa52cc78e759d100 | [
"BSD-3-Clause"
] | null | null | null | test/test_checks_build.py | zultron/catkin_lint | 7076a3626f5673e58c519346fa52cc78e759d100 | [
"BSD-3-Clause"
] | null | null | null | import unittest
import catkin_lint.checks.build as cc
from .helper import create_env, create_manifest, create_manifest2, mock_lint, patch, mock_open, posix_and_nt
import sys
sys.stderr = sys.stdout
import os
import stat
class ChecksBuildTest(unittest.TestCase):
@posix_and_nt
@patch("os.path.isdir", lambda x: x == os.path.normpath("/mock-path/include"))
def test_includes(self):
"""Test include_directories()"""
env = create_env()
pkg = create_manifest("mock")
result = mock_lint(env, pkg, "include_directories(include)", checks=cc.includes)
self.assertEqual([], result)
result = mock_lint(env, pkg, "include_directories(/somewhere/else/but/absolute)", checks=cc.includes)
self.assertEqual([], result)
result = mock_lint(env, pkg, "find_package(catkin REQUIRED) include_directories(${catkin_INCLUDE_DIRS})", checks=cc.includes)
self.assertEqual([], result)
result = mock_lint(env, pkg, "include_directories(missing_include)", checks=cc.includes)
self.assertEqual([ "MISSING_BUILD_INCLUDE_PATH" ], result)
@posix_and_nt
@patch("os.path.isfile", lambda x: x in [os.path.normpath("/mock-path/src/a.cpp"), os.path.normpath("/mock-path/src/b.cpp")])
def test_source_files(self):
"""Test add_executable() and add_library()"""
env = create_env()
pkg = create_manifest("mock")
result = mock_lint(env, pkg, "add_executable(mock IMPORTED) add_library(mock_lib IMPORTED)", checks=cc.source_files)
self.assertEqual([], result)
result = mock_lint(env, pkg, "add_executable(mock src/a.cpp src/b.cpp) add_library(mock_lib src/a.cpp src/b.cpp)", checks=cc.source_files)
self.assertEqual([], result)
result = mock_lint(env, pkg, "add_executable(mock ${CMAKE_CURRENT_SOURCE_DIR}/src/a.cpp) add_library(mock_lib ${CMAKE_CURRENT_SOURCE_DIR}/src/a.cpp)", checks=cc.source_files)
self.assertEqual([], result)
result = mock_lint(env, pkg, "add_executable(mock src/missing.cpp)", checks=cc.source_files)
self.assertEqual([ "MISSING_FILE" ], result)
result = mock_lint(env, pkg, "add_library(mock src/missing.cpp)", checks=cc.source_files)
self.assertEqual([ "MISSING_FILE" ], result)
result = mock_lint(env, pkg, "add_executable(mock src/b.cpp src/a.cpp)", checks=cc.source_files)
self.assertEqual([ "UNSORTED_LIST" ], result)
result = mock_lint(env, pkg, "add_library(mock src/b.cpp src/a.cpp)", checks=cc.source_files)
self.assertEqual([ "UNSORTED_LIST" ], result)
@posix_and_nt
@patch("os.path.isdir", lambda x: x == os.path.normpath("/mock-path/in_package"))
def test_link_directories(self):
"""Test link_directories()"""
env = create_env()
pkg = create_manifest("mock")
result = mock_lint(env, pkg, "link_directories(in_package)", checks=cc.link_directories)
self.assertEqual([ "LINK_DIRECTORY" ], result)
result = mock_lint(env, pkg, "link_directories(/not/in/package)", checks=cc.link_directories)
self.assertEqual([ "EXTERNAL_LINK_DIRECTORY" ], result)
@posix_and_nt
@patch("os.path.isfile", lambda x: x == os.path.normpath("/mock-path/FindLocal.cmake"))
def test_depends(self):
"""Test dependency checks"""
env = create_env()
pkg = create_manifest("mock", build_depends=[ "other_catkin" ])
result = mock_lint(env, pkg,
"""
find_package(catkin REQUIRED COMPONENTS other_catkin)
""",
checks=cc.depends)
self.assertEqual([ "ORDER_VIOLATION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
catkin_package()
find_package(catkin REQUIRED COMPONENTS other_catkin)
""",
checks=cc.depends)
self.assertEqual([ "ORDER_VIOLATION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin COMPONENTS other_catkin)
""",
checks=cc.depends)
self.assertEqual([ "MISSING_REQUIRED" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED other_catkin)
""",
checks=cc.depends)
self.assertEqual([ "MISSING_COMPONENTS", "UNCONFIGURED_BUILD_DEPEND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_catkin)
find_package(other_catkin)
""",
checks=cc.depends)
self.assertEqual([ "DUPLICATE_FIND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_catkin)
find_package(unknown_package REQUIRED)
""",
checks=cc.depends)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(other_catkin REQUIRED)
find_package(catkin REQUIRED COMPONENTS other_catkin)
""",
checks=cc.depends)
self.assertEqual([ "DUPLICATE_FIND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_system)
find_package(other_catkin REQUIRED)
""",
checks=cc.depends)
self.assertEqual([ "NO_CATKIN_COMPONENT", "MISSING_DEPEND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS unknown_package)
find_package(other_catkin REQUIRED)
""",
checks=cc.depends)
self.assertEqual([ "UNKNOWN_PACKAGE" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
""",
checks=cc.depends)
self.assertEqual([ "UNCONFIGURED_BUILD_DEPEND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin)
""",
checks=cc.depends)
self.assertEqual([ "MISSING_REQUIRED" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin)
if(other_catkin_FOUND)
endif()
""",
checks=cc.depends)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
if(CATKIN_ENABLE_TESTING)
find_package(other_catkin REQUIRED)
endif()
""",
checks=cc.depends)
self.assertEqual(["UNCONFIGURED_BUILD_DEPEND"], result)
pkg = create_manifest("mock", test_depends=[ "other_catkin" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
if(CATKIN_ENABLE_TESTING)
find_package(other_catkin REQUIRED)
endif()
""",
checks=cc.depends)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
""",
checks=cc.depends)
self.assertEqual(["UNGUARDED_TEST_DEPEND"], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
if(CATKIN_ENABLE_TESTING)
else()
find_package(other_catkin REQUIRED)
endif()
""",
checks=cc.depends)
self.assertEqual(["UNGUARDED_TEST_DEPEND"], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
if(CATKIN_ENABLE_TESTING)
else()
if(CATKIN_ENABLE_TESTING)
find_package(other_catkin REQUIRED)
endif()
endif()
""",
checks=cc.depends)
self.assertEqual([], result)
pkg = create_manifest("mock", build_depends=[ "other_catkin"], test_depends=[ "other_catkin" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
if(CATKIN_ENABLE_TESTING)
find_package(other_catkin REQUIRED)
endif()
""",
checks=cc.depends)
self.assertEqual(["UNCONFIGURED_BUILD_DEPEND"], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
""",
checks=cc.depends)
self.assertEqual([], result)
pkg = create_manifest("mock", build_depends=[ "first_pkg", "second_pkg" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS first_pkg second_pkg)
""",
checks=cc.depends)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS second_pkg first_pkg)
""",
checks=cc.depends)
self.assertEqual([ "UNSORTED_LIST" ], result)
@posix_and_nt
@patch("os.path.isfile", lambda x: x == os.path.normpath("/mock-path/src/source.cpp"))
def test_targets(self):
"""Test checks catkin packages with declared targets"""
env = create_env()
pkg = create_manifest("mock", build_depends=[ "other_catkin" ], run_depends=[ "other_catkin" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_catkin)
catkin_package()
include_directories(${catkin_INCLUDE_DIRS})
add_executable(${PROJECT_NAME}/prog src/source.cpp)
set_target_properties(${PROJECT_NAME}/prog PROPERTIES OUTPUT_NAME "prog")
target_link_libraries(${PROJECT_NAME}/prog ${catkin_LIBRARIES})
""",
checks=cc.targets)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_catkin)
catkin_package()
include_directories(${catkin_INCLUDE_DIRS})
target_link_libraries(${PROJECT_NAME}/prog ${catkin_LIBRARIES})
add_executable(${PROJECT_NAME}/prog src/source.cpp)
set_target_properties(${PROJECT_NAME}/prog PROPERTIES OUTPUT_NAME "prog")
""",
checks=cc.targets)
self.assertEqual([ "ORDER_VIOLATION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_catkin)
catkin_package()
add_executable(${PROJECT_NAME}_prog src/source.cpp)
target_link_libraries(${PROJECT_NAME}_prog ${catkin_LIBRARIES})
""",
checks=cc.targets)
self.assertEqual([ "MISSING_CATKIN_INCLUDE" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_catkin)
catkin_package()
include_directories(${catkin_INCLUDE_DIRS} ${other_catkin_INCLUDE_DIRS})
add_executable(${PROJECT_NAME}_prog src/source.cpp)
target_link_libraries(${PROJECT_NAME}_prog ${catkin_LIBRARIES})
""",
checks=cc.targets)
self.assertEqual([ "DUPLICATE_BUILD_INCLUDE" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_catkin)
catkin_package()
include_directories(${catkin_INCLUDE_DIRS})
add_executable(${PROJECT_NAME}_prog src/source.cpp)
target_link_libraries(${PROJECT_NAME}_prog ${catkin_LIBRARIES})
""",
checks=cc.targets)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
add_executable(${PROJECT_NAME}_prog src/source.cpp)
find_package(catkin REQUIRED COMPONENTS other_catkin)
catkin_package()
include_directories(${catkin_INCLUDE_DIRS})
target_link_libraries(${PROJECT_NAME}_prog ${catkin_LIBRARIES})
""",
checks=cc.targets)
self.assertEqual([ "CATKIN_ORDER_VIOLATION", "ORDER_VIOLATION" ], result)
pkg = create_manifest("mock", meta=True)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_metapackage()
add_executable(${PROJECT_NAME}_prog src/source.cpp)
""",
checks=cc.targets)
self.assertEqual([ "INVALID_META_COMMAND" ], result)
@posix_and_nt
@patch("os.path.isfile", lambda x: x == os.path.normpath("/mock-path/src/source.cpp"))
def test_name_check(self):
"""Test checks for invalid names"""
env = create_env()
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
add_executable(${PROJECT_NAME}/prog src/source.cpp)
target_link_libraries(${PROJECT_NAME}/prog ${catkin_LIBRARIES})
""",
checks=cc.name_check)
self.assertEqual([ "INVALID_TARGET_OUTPUT" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
add_executable(prog src/source.cpp)
target_link_libraries(prog ${catkin_LIBRARIES})
""",
checks=cc.name_check)
self.assertEqual([ "TARGET_NAME_COLLISION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
add_library(lib${PROJECT_NAME} src/source.cpp)
""",
checks=cc.name_check)
self.assertEqual([ "REDUNDANT_LIB_PREFIX" ], result)
@posix_and_nt
@patch("os.path.isfile", lambda x: x in [ os.path.normpath("/mock-path/bin/script"), os.path.normpath("/mock-path/share/file"), os.path.normpath("/mock-path/src/source.cpp") ])
@patch("os.path.isdir", lambda x: x == os.path.normpath("/mock-path/include"))
def test_installs(self):
"""Test installation checks"""
env = create_env()
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
include_directories(include)
catkin_package(INCLUDE_DIRS include)
add_executable(${PROJECT_NAME} src/source.cpp)
add_executable(test_${PROJECT_NAME} src/source.cpp)
add_executable(${PROJECT_NAME}_example src/source.cpp)
install(PROGRAMS bin/script DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
install(FILES share/file DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION})
install(TARGETS ${PROJECT_NAME} RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
install(DIRECTORY include/ DESTINATION ${CATKIN_PACKAGE_INCLUDE_DESTINATION})
""",
checks=cc.installs)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
install(EXPORT stuff DESTINATION "${missing_variable}")
""",
checks=cc.installs)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
install(PROGRAMS bin/script DESTINATION bin)
""",
checks=cc.installs)
self.assertEqual([ "INSTALL_DESTINATION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
install(PROGRAMS bin/missing_script DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual([ "MISSING_FILE" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
install(DIRECTORY missing DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION})
""",
checks=cc.installs)
self.assertEqual([ "MISSING_DIRECTORY" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
install(PROGRAMS bin/script DESTINATION "${missing_variable}")
""",
checks=cc.installs)
self.assertEqual([ "INSTALL_DESTINATION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package(LIBRARIES ${PROJECT_NAME})
add_library(${PROJECT_NAME} src/source.cpp)
add_executable(${PROJECT_NAME}_prog src/source.cpp)
""",
checks=cc.installs)
self.assertEqual([ "UNINSTALLED_EXPORT_LIB", "MISSING_INSTALL_TARGET" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package(INCLUDE_DIRS include)
add_executable(test_${PROJECT_NAME} src/source.cpp)
""",
checks=cc.installs)
self.assertEqual([ "MISSING_BUILD_INCLUDE", "MISSING_INSTALL_INCLUDE" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
add_library(${PROJECT_NAME}_lib src/source.cpp)
add_executable(${PROJECT_NAME} src/source.cpp)
target_link_libraries(${PROJECT_NAME} ${PROJECT_NAME}_lib)
install(TARGETS ${PROJECT_NAME} RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual([ "UNINSTALLED_DEPEND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
add_library(${PROJECT_NAME}_lib STATIC src/source.cpp)
add_executable(${PROJECT_NAME} src/source.cpp)
target_link_libraries(${PROJECT_NAME} ${PROJECT_NAME}_lib)
install(TARGETS ${PROJECT_NAME} RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
install(TARGETS ${PROJECT_NAME} RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual([ "UNDEFINED_INSTALL_TARGET" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package()
add_library(${PROJECT_NAME}_target1 src/source.cpp)
add_executable(${PROJECT_NAME}_target2 src/source.cpp)
install(TARGETS ${PROJECT_NAME}_target2 ${PROJECT_NAME}_target1 RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual([ "UNSORTED_LIST" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_install_python(PROGRAMS bin/missing DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual(["MISSING_FILE"], result)
open_func = "builtins.open" if sys.version_info[0] >= 3 else "__builtin__.open"
# Work around a limitation of older Python mock_open() implementations
with patch(open_func, new_callable=mock_open, read_data="test\nthis\n"):
with open("anything", "r") as f:
if f.readline() != "test\n":
return
with patch(open_func, new_callable=mock_open, read_data="no python shebang\ncontent\n"):
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_install_python(PROGRAMS bin/script DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual(["MISSING_SHEBANG"], result)
with patch(open_func, new_callable=mock_open, read_data="#!/wrong/shebang\ncontent\n"):
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_install_python(PROGRAMS bin/script DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual(["MISSING_SHEBANG"], result)
with patch(open_func, new_callable=mock_open, read_data="#!/usr/bin/python\ncontent\n"):
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_install_python(PROGRAMS bin/script DESTINATION wrong/destination)
""",
checks=cc.installs)
self.assertEqual(["INSTALL_DESTINATION"], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_install_python(PROGRAMS bin/script DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.installs)
self.assertEqual([], result)
def test_tests(self):
"""Test unit test checks"""
env = create_env()
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_download_test_data()
""",
checks=cc.tests)
self.assertEqual(["UNGUARDED_TEST_CMD"], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
if(CATKIN_ENABLE_TESTING)
add_rostest()
endif()
""",
checks=cc.tests)
self.assertEqual(["MISSING_DEPEND"], result)
@posix_and_nt
@patch("os.path.isfile", lambda x: x == os.path.normpath("/mock-path/src/source.cpp"))
@patch("os.path.isdir", lambda x: x in [ os.path.normpath("/mock-path/include"), os.path.normpath("/mock-path/include/mock") ])
def test_exports(self):
"""Test checks for exported libraries"""
env = create_env()
pkg = create_manifest("mock", build_depends=[ "other_catkin", "other_system" ], run_depends=[ "other_catkin", "other_system" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
find_package(other_system REQUIRED)
catkin_package(
INCLUDE_DIRS include
CATKIN_DEPENDS other_catkin
DEPENDS other_system
LIBRARIES ${PROJECT_NAME}
)
include_directories(include)
add_library(${PROJECT_NAME} src/source.cpp)
""",
checks=cc.exports)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
find_package(other_system REQUIRED)
catkin_package(
INCLUDE_DIRS ${CMAKE_CURRENT_BINARY_DIR}
CATKIN_DEPENDS other_catkin
DEPENDS other_system
)
""",
checks=cc.exports)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_catkin)
find_package(other_system REQUIRED)
find_path(Stuff_INCLUDE_DIRS stuff.h)
find_library(Stuff_LIBRARIES stuff)
catkin_package(
CATKIN_DEPENDS other_catkin
DEPENDS Stuff other_system
)
""",
checks=cc.exports)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
find_package(other_system REQUIRED)
catkin_package(
INCLUDE_DIRS missing_include
CATKIN_DEPENDS other_catkin
DEPENDS other_system
)
""",
checks=cc.exports)
self.assertEqual([ "MISSING_EXPORT_INCLUDE_PATH" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
find_package(other_system REQUIRED)
catkin_package(
DEPENDS other_catkin other_system
)
""",
checks=cc.exports)
self.assertEqual([ "CATKIN_AS_SYSTEM_DEPEND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
find_package(other_system REQUIRED)
catkin_package(
CATKIN_DEPENDS other_catkin other_system
)
""",
checks=cc.exports)
self.assertEqual([ "SYSTEM_AS_CATKIN_DEPEND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
find_package(unknown_package REQUIRED)
catkin_package(
CATKIN_DEPENDS other_catkin unknown_package
)
""",
checks=cc.exports)
self.assertEqual([ "UNKNOWN_PACKAGE" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
catkin_package(
CATKIN_DEPENDS other_catkin
DEPENDS other_system
)
""",
checks=cc.exports)
self.assertEqual([ "UNCONFIGURED_SYSTEM_DEPEND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
catkin_package(
CATKIN_DEPENDS other_catkin
INCLUDE_DIRS /not/in/package
)
""",
checks=cc.exports)
self.assertEqual([ "EXTERNAL_INCLUDE_PATH" ], result)
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package(CATKIN_DEPENDS other_catkin)
""",
checks=cc.exports)
self.assertEqual([ "MISSING_DEPEND" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
pkg_check_modules(FOO foo)
catkin_package(DEPENDS FOO)
""",
checks=cc.exports)
self.assertEqual([ "EXPORTED_PKG_CONFIG"], result)
pkg = create_manifest("mock", build_depends=[ "other_msgs" ], run_depends=[ "other_msgs"])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS other_msgs)
catkin_package()
""",
checks=cc.exports)
self.assertEqual([ "SUGGEST_CATKIN_DEPEND" ], result)
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package(INCLUDE_DIRS include)
include_directories(include)
add_library(${PROJECT_NAME} src/source.cpp)
""",
checks=cc.exports)
self.assertEqual([ "MISSING_EXPORT_LIB" ], result)
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package(INCLUDE_DIRS include LIBRARIES ${PROJECT_NAME})
include_directories(include)
add_library(${PROJECT_NAME} src/source.cpp)
set_target_properties(${PROJECT_NAME} PROPERTIES OUTPUT_NAME "renamed")
""",
checks=cc.exports)
self.assertEqual([ "EXPORT_LIB_RENAMED" ], result)
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package(INCLUDE_DIRS include LIBRARIES ${PROJECT_NAME})
include_directories(include)
add_executable(${PROJECT_NAME} src/source.cpp)
""",
checks=cc.exports)
self.assertEqual([ "EXPORT_LIB_NOT_LIB" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package(INCLUDE_DIRS include)
add_executable(test_${PROJECT_NAME} src/source.cpp)
""",
checks=cc.exports)
self.assertEqual([ "MISSING_BUILD_INCLUDE" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
catkin_package(INCLUDE_DIRS include)
include_directories(include/${PROJECT_NAME})
add_executable(test_${PROJECT_NAME} src/source.cpp)
""",
checks=cc.exports)
self.assertEqual([ "MISSING_BUILD_INCLUDE", "AMBIGUOUS_BUILD_INCLUDE" ], result)
pkg = create_manifest("mock", build_depends=[ "first_pkg", "second_pkg" ], run_depends=[ "first_pkg", "second_pkg" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS first_pkg second_pkg)
catkin_package(CATKIN_DEPENDS first_pkg second_pkg)
""",
checks=cc.exports)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS first_pkg second_pkg)
catkin_package(CATKIN_DEPENDS second_pkg first_pkg)
""",
checks=cc.exports)
self.assertEqual([ "UNSORTED_LIST" ], result)
@posix_and_nt
@patch("os.path.isfile", lambda x: x == os.path.normpath("/mock-path/config.xml"))
def test_plugins(self):
"""Test checks for exported plugins"""
from catkin_pkg.package import Export
env = create_env()
pkg = create_manifest("mock", run_depends=[ "other_catkin" ])
plugin = Export("other_catkin")
plugin.attributes = { "plugin": "${prefix}/config.xml" }
pkg.exports += [ plugin ]
result = mock_lint(env, pkg, "install(FILES config.xml DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION})", checks=cc.plugins)
self.assertEqual([], result)
result = mock_lint(env, pkg, "", checks=cc.plugins)
self.assertEqual([ "PLUGIN_MISSING_INSTALL" ], result)
result = mock_lint(env, pkg, "install(FILES config.xml DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})", checks=cc.plugins)
self.assertEqual([ "PLUGIN_MISSING_INSTALL" ], result)
pkg = create_manifest("mock", run_depends=[ "other_catkin" ])
plugin = Export("other_catkin")
plugin.attributes = { "plugin": "config.xml" }
pkg.exports += [ plugin ]
result = mock_lint(env, pkg, "install(FILES config.xml DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION})", checks=cc.plugins)
self.assertEqual([ "PLUGIN_EXPORT_PREFIX" ], result)
pkg = create_manifest("mock")
plugin = Export("other_catkin")
plugin.attributes = { "plugin": "${prefix}/config.xml" }
pkg.exports += [ plugin ]
result = mock_lint(env, pkg, "install(FILES config.xml DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION})", checks=cc.plugins)
self.assertEqual([ "PLUGIN_DEPEND" ], result)
pkg = create_manifest("mock")
plugin = Export("mock")
plugin.attributes = { "plugin": "${prefix}/config.xml" }
pkg.exports += [ plugin ]
result = mock_lint(env, pkg, "install(FILES config.xml DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION})", checks=cc.plugins)
self.assertEqual([], result)
pkg = create_manifest("mock", run_depends=[ "other_catkin" ])
plugin = Export("other_catkin")
plugin.attributes = { "plugin": "${prefix}/missing_config.xml" }
pkg.exports += [ plugin ]
result = mock_lint(env, pkg, "install(FILES missing_config.xml DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION})", checks=cc.plugins)
self.assertEqual([ "PLUGIN_MISSING_FILE" ], result)
@posix_and_nt
@patch("os.path.isfile", lambda x: "exist" in x)
@patch("os.stat", lambda x: os.stat_result((stat.S_IXUSR if "script" in x else 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)))
def test_dynamic_reconfigure(self):
"""Test checks for dynamic reconfigure scripts"""
env = create_env()
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
generate_dynamic_reconfigure_options(cfg/existing_script.cfg)
""",
checks=cc.dynamic_reconfigure)
self.assertEqual(["UNCONFIGURED_BUILD_DEPEND"], result)
pkg = create_manifest("mock", build_depends=["dynamic_reconfigure"])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS dynamic_reconfigure)
generate_dynamic_reconfigure_options(cfg/existing_script.cfg)
""",
checks=cc.dynamic_reconfigure)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS dynamic_reconfigure)
generate_dynamic_reconfigure_options(cfg/missing_script.cfg)
""",
checks=cc.dynamic_reconfigure)
self.assertEqual(["MISSING_FILE"], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS dynamic_reconfigure)
generate_dynamic_reconfigure_options(cfg/existing_non_executable.cfg)
""",
checks=cc.dynamic_reconfigure)
self.assertEqual(["SCRIPT_NOT_EXECUTABLE"], result)
@posix_and_nt
@patch("os.walk", lambda x, topdown: iter([("/mock-path/bin", [], ["script"])]))
@patch("os.path.isfile", lambda x: x == os.path.normpath("/mock-path/bin/script"))
@patch("os.path.isdir", lambda x: x == os.path.normpath("/mock-path/bin"))
@patch("os.stat", lambda x: os.stat_result((stat.S_IXUSR, 0, 0, 0, 0, 0, 0, 0, 0, 0)))
def test_scripts(self):
"""Test checks for executable scripts"""
env = create_env()
pkg = create_manifest("mock")
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
install(PROGRAMS bin/script DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})
""",
checks=cc.scripts)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
install(DIRECTORY bin/ DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION} USE_SOURCE_PERMISSIONS)
""",
checks=cc.scripts)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
""",
checks=cc.scripts)
self.assertEqual(["UNINSTALLED_SCRIPT"], result)
pkg = create_manifest("mock", build_depends=["dynamic_reconfigure"])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS dynamic_reconfigure)
generate_dynamic_reconfigure_options(bin/script)
""",
checks=cc.scripts)
self.assertEqual([], result)
def test_message_generation(self):
"""Test ROS message generation checks"""
env = create_env()
pkg = create_manifest("mock", build_depends=[ "message_generation", "other_catkin" ], run_depends=[ "message_runtime", "other_catkin" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(message_generation REQUIRED)
find_package(other_catkin REQUIRED)
add_message_files(FILES message.msg)
generate_messages(DEPENDENCIES other_catkin)
catkin_package(CATKIN_DEPENDS message_runtime other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(message_generation REQUIRED)
find_package(other_catkin REQUIRED)
generate_messages(DEPENDENCIES other_catkin)
add_message_files(FILES message.msg)
catkin_package(CATKIN_DEPENDS message_runtime other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([ "ORDER_VIOLATION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(message_generation REQUIRED)
find_package(other_catkin REQUIRED)
catkin_package(CATKIN_DEPENDS message_runtime other_catkin)
add_message_files(FILES message.msg)
generate_messages(DEPENDENCIES other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([ "ORDER_VIOLATION", "ORDER_VIOLATION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(message_generation REQUIRED)
find_package(other_catkin REQUIRED)
add_message_files(FILES message.msg)
generate_messages(DEPENDENCIES other_catkin)
find_package(catkin REQUIRED)
catkin_package(CATKIN_DEPENDS message_runtime other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([ "CATKIN_ORDER_VIOLATION", "CATKIN_ORDER_VIOLATION" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(message_generation REQUIRED)
find_package(other_catkin REQUIRED)
add_message_files(FILES message.msg)
generate_messages(DEPENDENCIES other_catkin)
catkin_package(CATKIN_DEPENDS message_runtime)
""",
checks=cc.message_generation)
self.assertEqual([ "MISSING_MSG_CATKIN" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(message_generation REQUIRED)
find_package(other_catkin REQUIRED)
add_message_files(FILES message.msg)
catkin_package(CATKIN_DEPENDS message_runtime other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([ "MISSING_GENERATE_MSG" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(message_generation REQUIRED)
find_package(other_catkin REQUIRED)
generate_messages(DEPENDENCIES other_catkin)
catkin_package(CATKIN_DEPENDS message_runtime other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([ "UNUSED_GENERATE_MSG" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(message_generation REQUIRED)
find_package(other_catkin REQUIRED)
add_message_files(FILES message.msg)
generate_messages(DEPENDENCIES other_catkin)
catkin_package(CATKIN_DEPENDS other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([ "MISSING_CATKIN_DEPEND" ], result)
pkg = create_manifest("mock", meta=True)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
add_message_files(FILES message.msg)
generate_messages()
catkin_metapackage()
""",
checks=cc.message_generation)
self.assertEqual([ "INVALID_META_COMMAND", "INVALID_META_COMMAND" ], result)
pkg = create_manifest("mock", build_depends=[ "other_catkin" ], run_depends=[ "message_runtime", "other_catkin" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED)
find_package(other_catkin REQUIRED)
add_message_files(FILES message.msg)
generate_messages(DEPENDENCIES other_catkin)
catkin_package(CATKIN_DEPENDS message_runtime other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([ "UNCONFIGURED_BUILD_DEPEND" ], result)
pkg = create_manifest("mock", build_depends=[ "message_generation" ], run_depends=[ "message_runtime" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS message_generation)
add_message_files(FILES message.msg)
generate_messages(DEPENDENCIES other_catkin)
catkin_package(CATKIN_DEPENDS message_runtime other_catkin)
""",
checks=cc.message_generation)
self.assertEqual([ "MISSING_DEPEND", "UNCONFIGURED_MSG_DEPEND", "MISSING_MSG_DEPEND", "MISSING_MSG_DEPEND" ], result)
pkg = create_manifest("mock", build_depends=[ "message_generation", "first_pkg", "second_pkg" ], run_depends=[ "message_runtime", "first_pkg", "second_pkg" ])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS first_pkg message_generation second_pkg)
add_message_files(FILES message1.msg message2.msg)
generate_messages(DEPENDENCIES first_pkg second_pkg)
catkin_package(CATKIN_DEPENDS first_pkg message_runtime second_pkg)
""",
checks=cc.message_generation)
self.assertEqual([], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS first_pkg message_generation second_pkg)
add_message_files(FILES message1.msg message2.msg)
generate_messages(DEPENDENCIES second_pkg first_pkg)
catkin_package(CATKIN_DEPENDS first_pkg message_runtime second_pkg)
""",
checks=cc.message_generation)
self.assertEqual([ "UNSORTED_LIST" ], result)
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS first_pkg message_generation second_pkg)
add_message_files(FILES message2.msg message1.msg)
generate_messages(DEPENDENCIES first_pkg second_pkg)
catkin_package(CATKIN_DEPENDS first_pkg message_runtime second_pkg)
""",
checks=cc.message_generation)
self.assertEqual([ "UNSORTED_LIST" ], result)
def test_format2_message_exports(self):
"""Test checks for package format version 2 features"""
env = create_env()
pkg = create_manifest2("mock", build_depends=["message_generation"], exec_depends=["message_runtime"])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS message_generation)
catkin_package(CATKIN_DEPENDS message_runtime)
""",
checks=cc.exports)
self.assertEqual([], result)
pkg = create_manifest2("mock", build_depends=["message_generation"], build_export_depends=["message_runtime"])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS message_generation)
catkin_package(CATKIN_DEPENDS message_runtime)
""",
checks=cc.exports)
self.assertEqual([], result)
pkg = create_manifest2("mock", build_depends=["message_generation"])
result = mock_lint(env, pkg,
"""
project(mock)
find_package(catkin REQUIRED COMPONENTS message_generation)
catkin_package(CATKIN_DEPENDS message_runtime)
""",
checks=cc.exports)
self.assertEqual(["MISSING_DEPEND"], result)
from database.Song import Song
from database.Queue import Queue
from database.Setting import Setting
from database.Ban import Ban
# Unless explicitly stated otherwise all files in this repository are licensed under the Apache-2.0 License.
# This product includes software developed at Datadog (https://www.datadoghq.com/).
# Copyright 2019-Present Datadog, Inc.
import re # noqa: F401
import sys # noqa: F401
from datadog_api_client.v1.api_client import ApiClient, Endpoint
from datadog_api_client.v1.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from datadog_api_client.v1.model.api_error_response import APIErrorResponse
from datadog_api_client.v1.model.check_can_delete_slo_response import CheckCanDeleteSLOResponse
from datadog_api_client.v1.model.slo_bulk_delete import SLOBulkDelete
from datadog_api_client.v1.model.slo_bulk_delete_response import SLOBulkDeleteResponse
from datadog_api_client.v1.model.slo_delete_response import SLODeleteResponse
from datadog_api_client.v1.model.slo_history_response import SLOHistoryResponse
from datadog_api_client.v1.model.slo_list_response import SLOListResponse
from datadog_api_client.v1.model.slo_response import SLOResponse
from datadog_api_client.v1.model.service_level_objective import ServiceLevelObjective
from datadog_api_client.v1.model.service_level_objective_request import ServiceLevelObjectiveRequest
class ServiceLevelObjectivesApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def __check_can_delete_slo(
self,
ids,
**kwargs
):
"""Check if SLOs can be safely deleted # noqa: E501
Check if a SLO can be safely deleted. For example, assure an SLO can be deleted without disrupting a dashboard. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.check_can_delete_slo(ids, async_req=True)
>>> result = thread.get()
Args:
ids (str): A comma separated list of the IDs of the service level objectives objects.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
CheckCanDeleteSLOResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['ids'] = \
ids
return self.call_with_http_info(**kwargs)
self.check_can_delete_slo = Endpoint(
settings={
'response_type': (CheckCanDeleteSLOResponse,),
'auth': [
'apiKeyAuth',
'appKeyAuth'
],
'endpoint_path': '/api/v1/slo/can_delete',
'operation_id': 'check_can_delete_slo',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'ids',
],
'required': [
'ids',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'ids':
(str,),
},
'attribute_map': {
'ids': 'ids',
},
'location_map': {
'ids': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__check_can_delete_slo
)
def __create_slo(
self,
body,
**kwargs
):
"""Create an SLO object # noqa: E501
Create a service level objective object. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_slo(body, async_req=True)
>>> result = thread.get()
Args:
body (ServiceLevelObjectiveRequest): Service level objective request object.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SLOListResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['body'] = \
body
return self.call_with_http_info(**kwargs)
self.create_slo = Endpoint(
settings={
'response_type': (SLOListResponse,),
'auth': [
'apiKeyAuth',
'appKeyAuth'
],
'endpoint_path': '/api/v1/slo',
'operation_id': 'create_slo',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [
'body',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(ServiceLevelObjectiveRequest,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__create_slo
)
def __delete_slo(
self,
slo_id,
**kwargs
):
"""Delete an SLO # noqa: E501
Permanently delete the specified service level objective object. If an SLO is used in a dashboard, the `DELETE /v1/slo/` endpoint returns a 409 conflict error because the SLO is referenced in a dashboard. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_slo(slo_id, async_req=True)
>>> result = thread.get()
Args:
slo_id (str): The ID of the service level objective.
Keyword Args:
force (str): Delete the SLO even if it's referenced by other resources (for example, a dashboard). [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SLODeleteResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['slo_id'] = \
slo_id
return self.call_with_http_info(**kwargs)
self.delete_slo = Endpoint(
settings={
'response_type': (SLODeleteResponse,),
'auth': [
'apiKeyAuth',
'appKeyAuth'
],
'endpoint_path': '/api/v1/slo/{slo_id}',
'operation_id': 'delete_slo',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'slo_id',
'force',
],
'required': [
'slo_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'slo_id':
(str,),
'force':
(str,),
},
'attribute_map': {
'slo_id': 'slo_id',
'force': 'force',
},
'location_map': {
'slo_id': 'path',
'force': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__delete_slo
)
def __delete_slo_timeframe_in_bulk(
self,
body,
**kwargs
):
"""Bulk Delete SLO Timeframes # noqa: E501
Delete (or partially delete) multiple service level objective objects. This endpoint facilitates deletion of one or more thresholds for one or more service level objective objects. If all thresholds are deleted, the service level objective object is deleted as well. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_slo_timeframe_in_bulk(body, async_req=True)
>>> result = thread.get()
Args:
body (SLOBulkDelete): Delete multiple service level objective objects request body.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SLOBulkDeleteResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['body'] = \
body
return self.call_with_http_info(**kwargs)
self.delete_slo_timeframe_in_bulk = Endpoint(
settings={
'response_type': (SLOBulkDeleteResponse,),
'auth': [
'apiKeyAuth',
'appKeyAuth'
],
'endpoint_path': '/api/v1/slo/bulk_delete',
'operation_id': 'delete_slo_timeframe_in_bulk',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [
'body',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(SLOBulkDelete,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__delete_slo_timeframe_in_bulk
)
def __get_slo(
self,
slo_id,
**kwargs
):
"""Get an SLO's details # noqa: E501
Get a service level objective object. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_slo(slo_id, async_req=True)
>>> result = thread.get()
Args:
slo_id (str): The ID of the service level objective object.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SLOResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['slo_id'] = \
slo_id
return self.call_with_http_info(**kwargs)
self.get_slo = Endpoint(
settings={
'response_type': (SLOResponse,),
'auth': [
'apiKeyAuth',
'appKeyAuth'
],
'endpoint_path': '/api/v1/slo/{slo_id}',
'operation_id': 'get_slo',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'slo_id',
],
'required': [
'slo_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'slo_id':
(str,),
},
'attribute_map': {
'slo_id': 'slo_id',
},
'location_map': {
'slo_id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_slo
)
def __get_slo_history(
self,
slo_id,
from_ts,
to_ts,
**kwargs
):
"""Get an SLO's history # noqa: E501
Get a specific SLO’s history, regardless of its SLO type. The detailed history data is structured according to the source data type. For example, metric data is included for event SLOs that use the metric source, and monitor SLO types include the monitor transition history. **Note:** There are different response formats for event based and time based SLOs. Examples of both are shown. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_slo_history(slo_id, from_ts, to_ts, async_req=True)
>>> result = thread.get()
Args:
slo_id (str): The ID of the service level objective object.
from_ts (int): The `from` timestamp for the query window in epoch seconds.
to_ts (int): The `to` timestamp for the query window in epoch seconds.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SLOHistoryResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['slo_id'] = \
slo_id
kwargs['from_ts'] = \
from_ts
kwargs['to_ts'] = \
to_ts
return self.call_with_http_info(**kwargs)
self.get_slo_history = Endpoint(
settings={
'response_type': (SLOHistoryResponse,),
'auth': [
'apiKeyAuth',
'appKeyAuth'
],
'endpoint_path': '/api/v1/slo/{slo_id}/history',
'operation_id': 'get_slo_history',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'slo_id',
'from_ts',
'to_ts',
],
'required': [
'slo_id',
'from_ts',
'to_ts',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'slo_id':
(str,),
'from_ts':
(int,),
'to_ts':
(int,),
},
'attribute_map': {
'slo_id': 'slo_id',
'from_ts': 'from_ts',
'to_ts': 'to_ts',
},
'location_map': {
'slo_id': 'path',
'from_ts': 'query',
'to_ts': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_slo_history
)
def __list_sl_os(
self,
ids,
**kwargs
):
"""Search SLOs # noqa: E501
Get multiple service level objective objects by their IDs. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_sl_os(ids, async_req=True)
>>> result = thread.get()
Args:
ids (str): A comma-separated list of the IDs of the service level objective objects.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SLOListResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['ids'] = \
ids
return self.call_with_http_info(**kwargs)
self.list_sl_os = Endpoint(
settings={
'response_type': (SLOListResponse,),
'auth': [
'apiKeyAuth',
'appKeyAuth'
],
'endpoint_path': '/api/v1/slo',
'operation_id': 'list_sl_os',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'ids',
],
'required': [
'ids',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'ids':
(str,),
},
'attribute_map': {
'ids': 'ids',
},
'location_map': {
'ids': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__list_sl_os
)
def __update_slo(
self,
slo_id,
body,
**kwargs
):
"""Update an SLO # noqa: E501
Update the specified service level objective object. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_slo(slo_id, body, async_req=True)
>>> result = thread.get()
Args:
slo_id (str): The ID of the service level objective object.
body (ServiceLevelObjective): The edited service level objective request object.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SLOListResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['slo_id'] = \
slo_id
kwargs['body'] = \
body
return self.call_with_http_info(**kwargs)
self.update_slo = Endpoint(
settings={
'response_type': (SLOListResponse,),
'auth': [
'apiKeyAuth',
'appKeyAuth'
],
'endpoint_path': '/api/v1/slo/{slo_id}',
'operation_id': 'update_slo',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'slo_id',
'body',
],
'required': [
'slo_id',
'body',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'slo_id':
(str,),
'body':
(ServiceLevelObjective,),
},
'attribute_map': {
'slo_id': 'slo_id',
},
'location_map': {
'slo_id': 'path',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__update_slo
)
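Every wrapper method above repeats the same block of `kwargs.get` defaulting before delegating to `call_with_http_info`. That boilerplate reduces to a small table of defaults applied with `dict.setdefault`. The sketch below is illustrative only; `apply_call_defaults` is a hypothetical helper, not part of the generated client:

```python
# Hypothetical helper, for illustration: collapses the repeated
# kwargs-defaulting boilerplate seen in each generated wrapper above.
_CALL_DEFAULTS = {
    'async_req': False,
    '_return_http_data_only': True,
    '_preload_content': True,
    '_request_timeout': None,
    '_check_input_type': True,
    '_check_return_type': True,
    '_host_index': None,
}

def apply_call_defaults(kwargs):
    # setdefault leaves caller-supplied values untouched, exactly like
    # the kwargs['x'] = kwargs.get('x', default) lines in each wrapper.
    for key, value in _CALL_DEFAULTS.items():
        kwargs.setdefault(key, value)
    return kwargs
```

A wrapper would then read `kwargs = apply_call_defaults(kwargs); kwargs['ids'] = ids; return self.call_with_http_info(**kwargs)`.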
# benchmark_constructor/structure_collectors/StructureCollector.py
# (repo: Kortemme-Lab/benchmark_set_construct, license: MIT)
class StructureCollector:
    '''Base class for structure collectors.'''

    def __init__(self):
        pass
# uniflex_module_rs_signal_gen/__init__.py
# (repo: uniflex/module_rs_signal_gen, license: MIT)
from .module_rs_signal_gen import *
# tester/data_handler.py
# (repo: pchero/asterisk-outbound, license: BSD-2-Clause)
# -*- coding: utf-8 -*-
"""
Created on Wed Jan 4 23:48:26 2017
@author: pchero
"""
import sqlite3 as db
import ast
class DataHandler(object):
# database
con = None # connection
cur = None # cursor
view_handler = None
def __init__(self):
self.con = db.connect(":memory:")
self.cur = self.con.cursor()
self._create_tables()
return
def _create_tables(self):
self.cur.execute("create table campaigns(uuid text not null primary key, data text)")
self.cur.execute("create table plans(uuid text not null primary key, data text)")
self.cur.execute("create table dlmas(uuid text not null primary key, data text)")
self.cur.execute("create table destinations(uuid text not null primary key, data text)")
self.cur.execute("create table diallists(uuid text not null primary key, dlma_uuid text, data text)")
self.cur.execute("create table dialings(uuid text not null primary key, data text)")
def _update_list_items(self, table):
try:
self.view_handler.update_list_items(table)
except:
pass
return
def set_view_handler(self, handler):
self.view_handler = handler
def plan_insert(self, uuid, data):
print("plan_insert")
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """insert or ignore into plans(uuid, data) values ("%s", "%s");""" % (uuid, str(data))
self.cur.execute(sql)
# view handler update
self._update_list_items("plan")
return True
def plan_update(self, uuid, data):
# get uuid
if uuid == None:
print("Wrong input parameter.")
return False
sql = """update plans set data="%s" where uuid="%s";""" % (str(data), uuid)
self.cur.execute(sql)
self._update_list_items("plan")
print("Plan info updated.")
return True
def plan_delete(self, uuid):
# get uuid
if uuid == None:
print("Wrong input parameter.")
return False
sql = """delete from plans where uuid="%s";""" % (uuid)
self.cur.execute(sql)
print("Plan info deleted. uuid[%s]" % uuid)
# view handler update
self._update_list_items("plan")
return True
def plan_get(self, uuid):
if uuid == None:
return None
sql = """select data from plans where uuid="%s";"""% (uuid)
self.cur.execute(sql)
tmp = self.cur.fetchone()
if tmp == None:
return None
res = ast.literal_eval(str(tmp[0]))
return res
def plan_get_list_all(self):
sql = "select uuid from plans"
self.cur.execute(sql)
res = []
for tmp in self.cur:
res.append(tmp[0])
return res
def campaign_insert(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """insert or ignore into campaigns(uuid, data) values ("%s", "%s");""" % (uuid, str(data))
self.cur.execute(sql)
# view handler update
self._update_list_items("campaign")
print("Campaign inserted. uuid[%s]" % uuid)
return True
def campaign_update(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """update campaigns set data="%s" where uuid="%s";""" % (str(data), uuid)
self.cur.execute(sql)
# view handler update
self._update_list_items("campaign")
print("Campaign info updated.")
return True
def campaign_delete(self, uuid):
# get uuid
print("campaign_delete")
if uuid == None:
print("Wrong input parameter.")
return False
sql = """delete from campaigns where uuid="%s";""" % uuid
self.cur.execute(sql)
# view handler update
self._update_list_items("campaign")
print("Campaign info deleted. uuid[%s]" % uuid)
return True
def campaign_get(self, uuid):
if uuid == None:
return None
sql = """select data from campaigns where uuid="%s";"""% (uuid)
self.cur.execute(sql)
tmp = self.cur.fetchone()
if tmp == None:
return None
res = ast.literal_eval(str(tmp[0]))
return res
def campaign_get_list_all(self):
sql = "select uuid from campaigns"
self.cur.execute(sql)
res = []
for tmp in self.cur:
res.append(tmp[0])
return res
def dlma_insert(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """insert or ignore into dlmas(uuid, data) values ("%s", "%s");""" % (uuid, str(data))
self.cur.execute(sql)
# view handler update
self._update_list_items("dlma")
print("Dlma inserted. uuid[%s]" % uuid)
return True
def dlma_update(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """update dlmas set data="%s" where uuid="%s";""" % (str(data), uuid)
self.cur.execute(sql)
# view handler update
self._update_list_items("dlma")
print("Dlma info updated.")
return True
def dlma_delete(self, uuid):
# get uuid
if uuid == None:
print("Wrong input parameter.")
return False
sql = """delete from dlmas where uuid="%s";""" % uuid
self.cur.execute(sql)
# view handler update
self._update_list_items("dlma")
print("Dlma info deleted. uuid[%s]" % uuid)
return True
def dlma_get(self, uuid):
if uuid == None:
return None
sql = """select data from dlmas where uuid="%s";"""% (uuid)
self.cur.execute(sql)
tmp = self.cur.fetchone()
if tmp == None:
return None
res = ast.literal_eval(str(tmp[0]))
return res
def dlma_get_list_all(self):
self.cur.execute("select uuid from dlmas")
res = []
for tmp in self.cur:
            res.append(tmp[0])
return res
def dlma_get_diallist_list_all(self, uuid):
if uuid == None:
return None
sql = """select uuid from diallists where dlma_uuid="%s";"""% (uuid)
self.cur.execute(sql)
res = []
for tmp in self.cur:
res.append(tmp[0])
return res
def destination_insert(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """insert or ignore into destinations(uuid, data) values ("%s", "%s");""" % (uuid, str(data))
self.cur.execute(sql)
# view handler update
self._update_list_items("destination")
print("Destination inserted. uuid[%s]" % uuid)
return True
def destination_update(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """update destinations set data="%s" where uuid="%s";""" % (str(data), uuid)
self.cur.execute(sql)
# view handler update
self._update_list_items("destination")
print("Destination info updated.")
return True
def destination_delete(self, uuid):
# get uuid
if uuid == None:
print("Wrong input parameter.")
return False
sql = """delete from destinations where uuid="%s";""" % uuid
self.cur.execute(sql)
# view handler update
self._update_list_items("destination")
print("Destination info deleted. uuid[%s]" % uuid)
return True
def destination_get(self, uuid):
if uuid == None:
return None
sql = """select data from destinations where uuid="%s";"""% (uuid)
self.cur.execute(sql)
tmp = self.cur.fetchone()
if tmp == None:
return None
res = ast.literal_eval(str(tmp[0]))
return res
def destination_get_list_all(self):
self.cur.execute("select uuid from destinations")
res = []
for tmp in self.cur:
            res.append(tmp[0])
return res
def diallist_insert(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
# get dlma_uuid
dlma_uuid = data["DlmaUuid"]
sql = """insert or ignore into diallists(uuid, dlma_uuid, data) values ("%s", "%s", "%s");""" % (uuid, dlma_uuid, str(data))
self.cur.execute(sql)
# view handler update
self._update_list_items("diallist")
print("Diallist inserted. uuid[%s]" % uuid)
return True
def diallist_update(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
# get dlma_uuid
dlma_uuid = data["DlmaUuid"]
sql = """update diallists set dlma_uuid="%s", data="%s" where uuid="%s";""" % (dlma_uuid, str(data), uuid)
self.cur.execute(sql)
# view handler update
self._update_list_items("diallist")
print("Diallist updated. uuid[%s]" % uuid)
return True
def diallist_delete(self, uuid):
# get uuid
if uuid == None:
print("Wrong input parameter.")
return False
sql = """delete from diallists where uuid="%s";""" % (uuid)
self.cur.execute(sql)
        # view handler update
        self._update_list_items("diallist")
        print("Diallist info deleted. uuid[%s]" % uuid)
        return True
def diallist_get(self, uuid):
if uuid == None:
return None
sql = """select data from diallists where uuid="%s";"""% (uuid)
self.cur.execute(sql)
tmp = self.cur.fetchone()
if tmp == None:
return None
res = ast.literal_eval(str(tmp[0]))
return res
def diallist_get_list_all(self):
self.cur.execute("select uuid from diallists")
res = []
for tmp in self.cur:
            res.append(tmp[0])
return res
def dialing_insert(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """insert or ignore into dialings(uuid, data) values ("%s", "%s");""" % (uuid, str(data))
self.cur.execute(sql)
# view handler update
self._update_list_items("dialing")
print("Dialing inserted. uuid[%s]" % uuid)
return True
def dialing_update(self, uuid, data):
if uuid == None or data == None:
print("Wrong input parameter")
return False
sql = """update dialings set data="%s" where uuid="%s";""" % (str(data), uuid)
self.cur.execute(sql)
# view handler update
self._update_list_items("dialing")
print("Dialing info updated.")
return True
def dialing_delete(self, uuid):
# get uuid
if uuid == None:
print("Wrong input parameter.")
return False
sql = """delete from dialings where uuid="%s";""" % uuid
self.cur.execute(sql)
# view handler update
self._update_list_items("dialing")
print("Dialing info deleted. uuid[%s]" % uuid)
return True
def dialing_get(self, uuid):
if uuid == None:
return None
sql = """select data from dialings where uuid="%s";"""% (uuid)
self.cur.execute(sql)
tmp = self.cur.fetchone()
if tmp == None:
return None
res = ast.literal_eval(str(tmp[0]))
return res
def dialing_get_list_all(self):
self.cur.execute("select uuid from dialings")
res = []
for tmp in self.cur:
res.append(tmp)
return res
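The get/insert pairs above round-trip Python dicts through SQLite as text: `str(data)` on the way in, `ast.literal_eval` on the way out. A minimal self-contained sketch of that round-trip (the `items` table, `uuid` value, and dict contents here are illustrative, not taken from the code above):

```python
import ast
import sqlite3

# an in-memory database stands in for the manager's connection
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("create table items(uuid text primary key, data text)")

data = {"DlmaUuid": "dlma-1", "Name": "test"}

# store the dict as its repr; the parameterized query keeps quotes safe
cur.execute("insert into items(uuid, data) values (?, ?)", ("u-1", str(data)))

# read it back and rebuild the dict with ast.literal_eval
row = cur.execute("select data from items where uuid=?", ("u-1",)).fetchone()
restored = ast.literal_eval(row[0])

print(restored == data)  # True
```

`ast.literal_eval` only evaluates Python literals, so this round-trip is safe for plain dicts of strings and numbers, but it will fail for values whose repr is not a literal.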
| 25.986028 | 132 | 0.518396 | 1,483 | 13,019 | 4.461227 | 0.068105 | 0.05396 | 0.078295 | 0.069377 | 0.869559 | 0.831923 | 0.821191 | 0.777358 | 0.745466 | 0.737455 | 0 | 0.002677 | 0.368769 | 13,019 | 500 | 133 | 26.038 | 0.802385 | 0.039404 | 0 | 0.646465 | 0 | 0.003367 | 0.220261 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117845 | false | 0.003367 | 0.006734 | 0 | 0.353535 | 0.124579 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
24a19cde931daa285ab131eeaa32f6bf97a33847 | 26 | py | Python | goamazondownloader/vlfsensor/__init__.py | AdrianoPereira/goamazondownloader | 05671ee59c0a5a4c20edd5e4825d5ebf589d1327 | [
"MIT"
] | 1 | 2021-05-07T16:04:29.000Z | 2021-05-07T16:04:29.000Z | goamazondownloader/vlfsensor/__init__.py | AdrianoPereira/goamazondownloader | 05671ee59c0a5a4c20edd5e4825d5ebf589d1327 | [
"MIT"
] | 1 | 2020-08-14T16:34:32.000Z | 2020-08-14T16:34:32.000Z | goamazondownloader/vlfsensor/__init__.py | AdrianoPereira/goamazondownloader | 05671ee59c0a5a4c20edd5e4825d5ebf589d1327 | [
"MIT"
] | null | null | null | from ._vlfsensor import *
| 13 | 25 | 0.769231 | 3 | 26 | 6.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 26 | 1 | 26 | 26 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
24a94a7a548c32c8fdc99718a8950eeaf010b346 | 160 | py | Python | server/tests/test_database_manager_server.py | GabrielWechta/cans | 2a24edb47d06ce095f3a2f68e4323dc4931a48af | [
"MIT"
] | null | null | null | server/tests/test_database_manager_server.py | GabrielWechta/cans | 2a24edb47d06ce095f3a2f68e4323dc4931a48af | [
"MIT"
] | null | null | null | server/tests/test_database_manager_server.py | GabrielWechta/cans | 2a24edb47d06ce095f3a2f68e4323dc4931a48af | [
"MIT"
] | null | null | null | """DatabaseManager tests."""
from server.database_manager_server import DatabaseManager
def test_dummy():
    """A dummy test."""
    assert DatabaseManager
| 17.777778 | 58 | 0.7375 | 17 | 160 | 6.764706 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 160 | 8 | 59 | 20 | 0.845588 | 0.225 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7045c190840ec5cac18666d1d38cf8d7e6b25010 | 28 | py | Python | pys60/contextphone.py | crutchwalkfactory/jaikuengine-mobile-client | c47100ec009d47a4045b3d98addc9b8ad887b132 | [
"MIT"
] | null | null | null | pys60/contextphone.py | crutchwalkfactory/jaikuengine-mobile-client | c47100ec009d47a4045b3d98addc9b8ad887b132 | [
"MIT"
] | null | null | null | pys60/contextphone.py | crutchwalkfactory/jaikuengine-mobile-client | c47100ec009d47a4045b3d98addc9b8ad887b132 | [
"MIT"
] | null | null | null | from _contextphone import *
| 14 | 27 | 0.821429 | 3 | 28 | 7.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 28 | 1 | 28 | 28 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
568cb330cce713a090b07ec407b2c57a129ac24d | 151 | py | Python | duo/admin.py | JocelynHeckenkamp/Django-Duoboard | 0e8e7f8f32ab771cd4a4a668cc6853c20020b2f6 | [
"MIT"
] | null | null | null | duo/admin.py | JocelynHeckenkamp/Django-Duoboard | 0e8e7f8f32ab771cd4a4a668cc6853c20020b2f6 | [
"MIT"
] | 3 | 2020-02-11T23:37:50.000Z | 2021-06-10T19:08:41.000Z | duo/admin.py | JocelynHeckenkamp/Django-Duoboard | 0e8e7f8f32ab771cd4a4a668cc6853c20020b2f6 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import User, Username
from .forms import Submit
admin.site.register(Username)
admin.site.register(User)
| 21.571429 | 34 | 0.81457 | 22 | 151 | 5.590909 | 0.545455 | 0.146341 | 0.276423 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10596 | 151 | 6 | 35 | 25.166667 | 0.911111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3b21a515ef93675a016296b6138f2ad91e7fd0cf | 58 | py | Python | src/bund/__init__.py | vulogov/the-bund | 3e631148d5eda69e32aa91b42d49f5ebc2610b43 | [
"Apache-2.0"
] | null | null | null | src/bund/__init__.py | vulogov/the-bund | 3e631148d5eda69e32aa91b42d49f5ebc2610b43 | [
"Apache-2.0"
] | null | null | null | src/bund/__init__.py | vulogov/the-bund | 3e631148d5eda69e32aa91b42d49f5ebc2610b43 | [
"Apache-2.0"
] | null | null | null | from bund.grammar import *
from bund.Parser import Parser
| 19.333333 | 30 | 0.810345 | 9 | 58 | 5.222222 | 0.555556 | 0.340426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 58 | 2 | 31 | 29 | 0.94 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3b3568ab425b32c62d4f13899708d9f80f3a4d9c | 44 | py | Python | core/views.py | U-S-S-EnterScience/LetsRoll | 54a1131400e272ff753bb93b94bbd55f23c094a6 | [
"MIT"
] | 1 | 2021-06-03T18:01:01.000Z | 2021-06-03T18:01:01.000Z | core/views.py | U-S-S-EnterScience/LetsRoll | 54a1131400e272ff753bb93b94bbd55f23c094a6 | [
"MIT"
] | null | null | null | core/views.py | U-S-S-EnterScience/LetsRoll | 54a1131400e272ff753bb93b94bbd55f23c094a6 | [
"MIT"
] 1 | 2021-06-03T18:01:08.000Z | 2021-06-03T18:01:08.000Z | from django.shortcuts import render
# your class here
8e57858c2500926c452cf84153e7a4359468abba | 42 | py | Python | py_file_carving/libary/plugins/pictures/__init__.py | wahlflo/pyFileCarving | 7bbbbedccb551273fd4b22614c86f51bc876bd78 | [
"MIT"
] | null | null | null | py_file_carving/libary/plugins/pictures/__init__.py | wahlflo/pyFileCarving | 7bbbbedccb551273fd4b22614c86f51bc876bd78 | [
"MIT"
] | null | null | null | py_file_carving/libary/plugins/pictures/__init__.py | wahlflo/pyFileCarving | 7bbbbedccb551273fd4b22614c86f51bc876bd78 | [
"MIT"
] | null | null | null | from .jpg import JPG
from .png import PNG
| 14 | 20 | 0.761905 | 8 | 42 | 4 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 42 | 2 | 21 | 21 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8e76d4d6ea5d1c171bff78df2415d81f183aea43 | 154 | py | Python | python/testData/inspections/PyUnresolvedReferencesInspection3K/NestedPackageNamedAsSourceRoot/a.py | teddywest32/intellij-community | e0268d7a1da1d318b441001448cdd3e8929b2f29 | [
"Apache-2.0"
] | null | null | null | python/testData/inspections/PyUnresolvedReferencesInspection3K/NestedPackageNamedAsSourceRoot/a.py | teddywest32/intellij-community | e0268d7a1da1d318b441001448cdd3e8929b2f29 | [
"Apache-2.0"
] | null | null | null | python/testData/inspections/PyUnresolvedReferencesInspection3K/NestedPackageNamedAsSourceRoot/a.py | teddywest32/intellij-community | e0268d7a1da1d318b441001448cdd3e8929b2f29 | [
"Apache-2.0"
] | 1 | 2022-01-02T19:58:08.000Z | 2022-01-02T19:58:08.000Z | from lib1 import m1
from lib1.m1 import x
from lib1.m1 import <error descr="Unresolved reference 'something'">something</error>
print(m1, x, something)
| 22 | 85 | 0.766234 | 24 | 154 | 4.916667 | 0.458333 | 0.20339 | 0.169492 | 0.271186 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052239 | 0.12987 | 154 | 6 | 86 | 25.666667 | 0.828358 | 0 | 0 | 0 | 0 | 0 | 0.207792 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.75 | null | null | 0.25 | 1 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
8ea9d950c1dc7cdd9a54a55660582ea317b400f6 | 12,789 | py | Python | tests/test_chapter5.py | GoodMonsters/Building-Data-Science-Applications-with-FastAPI | d2218d225c5b93723ecf46c19619ed5d3f2473e6 | [
"MIT"
] | 107 | 2021-03-26T20:18:51.000Z | 2022-03-26T03:38:08.000Z | tests/test_chapter5.py | GoodMonsters/Building-Data-Science-Applications-with-FastAPI | d2218d225c5b93723ecf46c19619ed5d3f2473e6 | [
"MIT"
] | 4 | 2021-06-09T08:48:21.000Z | 2021-12-27T09:04:43.000Z | tests/test_chapter5.py | GoodMonsters/Building-Data-Science-Applications-with-FastAPI | d2218d225c5b93723ecf46c19619ed5d3f2473e6 | [
"MIT"
] | 58 | 2021-03-12T20:51:19.000Z | 2022-03-27T15:49:49.000Z | from typing import Optional
import httpx
import pytest
from fastapi import status
from chapter5.chapter5_what_is_dependency_injection_01 import (
    app as chapter5_what_is_dependency_injection_01_app,
)
from chapter5.chapter5_function_dependency_01 import (
    app as chapter5_function_dependency_01_app,
)
from chapter5.chapter5_function_dependency_02 import (
    app as chapter5_function_dependency_02_app,
)
from chapter5.chapter5_function_dependency_03 import (
    app as chapter5_function_dependency_03_app,
)
from chapter5.chapter5_class_dependency_01 import (
    app as chapter5_class_dependency_01_app,
)
from chapter5.chapter5_class_dependency_02 import (
    app as chapter5_class_dependency_02_app,
)
from chapter5.chapter5_path_dependency_01 import (
    app as chapter5_path_dependency_01_app,
)
from chapter5.chapter5_router_dependency_01 import (
    app as chapter5_router_dependency_01_app,
)
from chapter5.chapter5_router_dependency_02 import (
    app as chapter5_router_dependency_02_app,
)
from chapter5.chapter5_global_dependency_01 import (
    app as chapter5_global_dependency_01_app,
)


@pytest.mark.fastapi(app=chapter5_what_is_dependency_injection_01_app)
@pytest.mark.asyncio
class TestWhatIsDependencyInjection01:
    async def test_missing_header(self, client: httpx.AsyncClient):
        client.headers.pop("User-Agent")
        response = await client.get("/")
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_valid_header(self, client: httpx.AsyncClient):
        response = await client.get("/", headers={"User-Agent": "HTTPX"})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"user_agent": "HTTPX"}


@pytest.mark.fastapi(app=chapter5_function_dependency_01_app)
@pytest.mark.parametrize("path", ["/items", "/things"])
@pytest.mark.asyncio
class TestFunctionDependency01:
    async def test_default_pagination(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path)
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 0, "limit": 10}

    async def test_custom_pagination(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"skip": 10, "limit": 100})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 10, "limit": 100}


@pytest.mark.fastapi(app=chapter5_function_dependency_02_app)
@pytest.mark.parametrize("path", ["/items", "/things"])
@pytest.mark.asyncio
class TestFunctionDependency02:
    async def test_default_pagination(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path)
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 0, "limit": 10}

    async def test_invalid_skip(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"skip": -10})
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_invalid_limit(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"limit": -10})
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_capped_limit(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"skip": 10, "limit": 1000})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 10, "limit": 100}

    async def test_custom_pagination(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"skip": 10, "limit": 100})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 10, "limit": 100}


@pytest.mark.fastapi(app=chapter5_function_dependency_03_app)
@pytest.mark.asyncio
class TestFunctionDependency03:
    async def test_get_404(self, client: httpx.AsyncClient):
        response = await client.get("/posts/4")
        assert response.status_code == status.HTTP_404_NOT_FOUND

    async def test_get_ok(self, client: httpx.AsyncClient):
        response = await client.get("/posts/1")
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json["id"] == 1

    async def test_update_404(self, client: httpx.AsyncClient):
        response = await client.patch("/posts/4", json={})
        assert response.status_code == status.HTTP_404_NOT_FOUND

    async def test_update_ok(self, client: httpx.AsyncClient):
        response = await client.patch("/posts/1", json={"title": "New title"})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json["id"] == 1
        assert json["title"] == "New title"

    async def test_delete_404(self, client: httpx.AsyncClient):
        response = await client.delete("/posts/4")
        assert response.status_code == status.HTTP_404_NOT_FOUND

    async def test_delete_ok(self, client: httpx.AsyncClient):
        response = await client.delete("/posts/1")
        assert response.status_code == status.HTTP_204_NO_CONTENT


@pytest.mark.fastapi(app=chapter5_class_dependency_01_app)
@pytest.mark.parametrize("path", ["/items", "/things"])
@pytest.mark.asyncio
class TestClassDependency01:
    async def test_default_pagination(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path)
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 0, "limit": 10}

    async def test_invalid_skip(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"skip": -10})
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_invalid_limit(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"limit": -10})
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_capped_limit(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"skip": 10, "limit": 1000})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 10, "limit": 50}

    async def test_custom_pagination(self, client: httpx.AsyncClient, path: str):
        response = await client.get(path, params={"skip": 10, "limit": 50})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 10, "limit": 50}


@pytest.mark.fastapi(app=chapter5_class_dependency_02_app)
@pytest.mark.asyncio
class TestClassDependency02:
    async def test_skip_limit_default_pagination(self, client: httpx.AsyncClient):
        response = await client.get("/items")
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 0, "limit": 10}

    async def test_skip_limit_invalid_skip(self, client: httpx.AsyncClient):
        response = await client.get("/items", params={"skip": -10})
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_skip_limit_invalid_limit(self, client: httpx.AsyncClient):
        response = await client.get("/items", params={"limit": -10})
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_skip_limit_capped_limit(self, client: httpx.AsyncClient):
        response = await client.get("/items", params={"skip": 10, "limit": 1000})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 10, "limit": 50}

    async def test_skip_limit_custom_pagination(self, client: httpx.AsyncClient):
        response = await client.get("/items", params={"skip": 10, "limit": 50})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"skip": 10, "limit": 50}

    async def test_page_size_default_pagination(self, client: httpx.AsyncClient):
        response = await client.get("/things")
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"page": 1, "size": 10}

    async def test_page_size_invalid_page(self, client: httpx.AsyncClient):
        response = await client.get("/things", params={"page": -10})
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_page_size_invalid_size(self, client: httpx.AsyncClient):
        response = await client.get("/things", params={"size": -10})
        assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY

    async def test_page_size_capped_limit(self, client: httpx.AsyncClient):
        response = await client.get("/things", params={"page": 10, "size": 1000})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"page": 10, "size": 50}

    async def test_page_size_custom_pagination(self, client: httpx.AsyncClient):
        response = await client.get("/things", params={"page": 10, "size": 50})
        assert response.status_code == status.HTTP_200_OK
        json = response.json()
        assert json == {"page": 10, "size": 50}


@pytest.mark.fastapi(app=chapter5_path_dependency_01_app)
@pytest.mark.asyncio
class TestPathDependency01:
    @pytest.mark.parametrize("secret_header", [None, "INVALID_VALUE"])
    async def test_invalid_header(
        self, client: httpx.AsyncClient, secret_header: Optional[str]
    ):
        headers = {}
        if secret_header:
            headers["Secret-Header"] = secret_header
        response = await client.get("/protected-route", headers=headers)
        assert response.status_code == status.HTTP_403_FORBIDDEN

    async def test_valid_header(self, client: httpx.AsyncClient):
        response = await client.get(
            "/protected-route", headers={"Secret-Header": "SECRET_VALUE"}
        )
        assert response.status_code == status.HTTP_200_OK


@pytest.mark.fastapi(app=chapter5_router_dependency_01_app)
@pytest.mark.parametrize("path", ["/router/route1", "/router/route2"])
@pytest.mark.asyncio
class TestRouterDependency01:
    @pytest.mark.parametrize("secret_header", [None, "INVALID_VALUE"])
    async def test_invalid_header(
        self, client: httpx.AsyncClient, path: str, secret_header: Optional[str]
    ):
        headers = {}
        if secret_header:
            headers["Secret-Header"] = secret_header
        response = await client.get(path, headers=headers)
        assert response.status_code == status.HTTP_403_FORBIDDEN

    async def test_valid_header(self, path: str, client: httpx.AsyncClient):
        response = await client.get(path, headers={"Secret-Header": "SECRET_VALUE"})
        assert response.status_code == status.HTTP_200_OK


@pytest.mark.fastapi(app=chapter5_router_dependency_02_app)
@pytest.mark.parametrize("path", ["/router/route1", "/router/route2"])
@pytest.mark.asyncio
class TestRouterDependency02:
    @pytest.mark.parametrize("secret_header", [None, "INVALID_VALUE"])
    async def test_invalid_header(
        self, client: httpx.AsyncClient, path: str, secret_header: Optional[str]
    ):
        headers = {}
        if secret_header:
            headers["Secret-Header"] = secret_header
        response = await client.get(path, headers=headers)
        assert response.status_code == status.HTTP_403_FORBIDDEN

    async def test_valid_header(self, path: str, client: httpx.AsyncClient):
        response = await client.get(path, headers={"Secret-Header": "SECRET_VALUE"})
        assert response.status_code == status.HTTP_200_OK


@pytest.mark.fastapi(app=chapter5_global_dependency_01_app)
@pytest.mark.parametrize("path", ["/route1", "/route2"])
@pytest.mark.asyncio
class TestGlobalDependency01:
    @pytest.mark.parametrize("secret_header", [None, "INVALID_VALUE"])
    async def test_invalid_header(
        self, client: httpx.AsyncClient, path: str, secret_header: Optional[str]
    ):
        headers = {}
        if secret_header:
            headers["Secret-Header"] = secret_header
        response = await client.get(path, headers=headers)
        assert response.status_code == status.HTTP_403_FORBIDDEN

    async def test_valid_header(self, path: str, client: httpx.AsyncClient):
        response = await client.get(path, headers={"Secret-Header": "SECRET_VALUE"})
        assert response.status_code == status.HTTP_200_OK
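The pagination tests above encode the expected capping behavior: a requested `limit` of 1000 comes back as 100 in the function-dependency app and 50 in the class-dependency app. A minimal sketch of that clamping logic, reconstructed from the test expectations alone (the function name and `cap` parameter are assumptions, not the book's actual dependency):

```python
def pagination(skip: int = 0, limit: int = 10, cap: int = 100) -> dict:
    """Clamp the requested limit to a maximum, as the tested apps appear to do."""
    return {"skip": skip, "limit": min(limit, cap)}

print(pagination())                             # {'skip': 0, 'limit': 10}
print(pagination(skip=10, limit=1000))          # {'skip': 10, 'limit': 100}
print(pagination(skip=10, limit=1000, cap=50))  # {'skip': 10, 'limit': 50}
```

In the real apps this logic lives inside a FastAPI dependency, so the clamping runs before the path operation ever sees the query parameters.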
| 38.176119 | 84 | 0.700211 | 1,597 | 12,789 | 5.379461 | 0.065122 | 0.035386 | 0.053079 | 0.106158 | 0.933069 | 0.911535 | 0.832732 | 0.787568 | 0.769759 | 0.732744 | 0 | 0.035738 | 0.183908 | 12,789 | 334 | 85 | 38.290419 | 0.787391 | 0 | 0 | 0.589431 | 0 | 0 | 0.066072 | 0 | 0 | 0 | 0 | 0 | 0.227642 | 1 | 0 | false | 0 | 0.056911 | 0 | 0.097561 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8ec7a0a4a34ca91a3566acbb612d7d098ab180dd | 125 | py | Python | badwing/level/__init__.py | kfields/badwing | 5f53c98cbb6fca8390e1632fa559f5201861365b | [
"MIT"
] | 3 | 2020-03-23T06:43:25.000Z | 2022-02-18T16:35:56.000Z | badwing/level/__init__.py | kfields/badwing | 5f53c98cbb6fca8390e1632fa559f5201861365b | [
"MIT"
] | 2 | 2020-03-26T02:05:36.000Z | 2021-08-02T19:13:06.000Z | badwing/level/__init__.py | kfields/badwing | 5f53c98cbb6fca8390e1632fa559f5201861365b | [
"MIT"
] | null | null | null | from badwing.level.level import Level
from badwing.level.tile import TileLevel
from badwing.level.sticker import StickerLevel | 41.666667 | 46 | 0.864 | 18 | 125 | 6 | 0.444444 | 0.305556 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088 | 125 | 3 | 46 | 41.666667 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
8ec9fec9e8d043db8cbdb3f547b18040c6aac280 | 25 | py | Python | test/json/aes/__init__.py | vincent-musedev/libacvp | b11247d9d0b2fbd88954358272a43d35c059be7b | [
"BSD-2-Clause",
"Apache-2.0"
] | 45 | 2016-08-01T11:47:34.000Z | 2022-02-22T21:27:27.000Z | test/json/aes/__init__.py | vincent-musedev/libacvp | b11247d9d0b2fbd88954358272a43d35c059be7b | [
"BSD-2-Clause",
"Apache-2.0"
] | 221 | 2016-08-04T17:10:36.000Z | 2022-01-21T19:53:36.000Z | test/json/aes/__init__.py | vincent-musedev/libacvp | b11247d9d0b2fbd88954358272a43d35c059be7b | [
"BSD-2-Clause",
"Apache-2.0"
] | 94 | 2016-10-23T11:08:19.000Z | 2022-01-21T11:50:16.000Z | from .aes import main_aes | 25 | 25 | 0.84 | 5 | 25 | 4 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12 | 25 | 1 | 25 | 25 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d9152b95484598ace42e21728bc149715f072346 | 195 | py | Python | __init__.py | rvencu/crawlingathome | 003a6e76d9f7aa5a5d17f5ae780857bd7f86e6f0 | [
"MIT"
] | null | null | null | __init__.py | rvencu/crawlingathome | 003a6e76d9f7aa5a5d17f5ae780857bd7f86e6f0 | [
"MIT"
] | null | null | null | __init__.py | rvencu/crawlingathome | 003a6e76d9f7aa5a5d17f5ae780857bd7f86e6f0 | [
"MIT"
] | null | null | null | from .version import PrintVersion as _printversion
_printversion()
from .core import init
from .recycler import dump, load
from .version import VERSION as __version__
from .errors import * | 27.857143 | 51 | 0.789744 | 25 | 195 | 5.92 | 0.44 | 0.148649 | 0.22973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164103 | 195 | 7 | 52 | 27.857143 | 0.907975 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.833333 | 0 | 0.833333 | 0.333333 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d9175089aa9a7a43bb82211ed4ae946856d81ef3 | 8,754 | py | Python | pyfrechet/distance.py | compgeomTU/frechetForCurves | 625bfe32a45d23b194226b4ac7713ded09bd2825 | [
"MIT"
] | null | null | null | pyfrechet/distance.py | compgeomTU/frechetForCurves | 625bfe32a45d23b194226b4ac7713ded09bd2825 | [
"MIT"
] | null | null | null | pyfrechet/distance.py | compgeomTU/frechetForCurves | 625bfe32a45d23b194226b4ac7713ded09bd2825 | [
"MIT"
] | null | null | null | ## @package pyfrechet
#
# Module for working with Frechet distances.
from ._strong_distance import lib as _sd
from ._weak_distance import lib as _wd
import os
## Super class of StrongDistance and WeakDistance.
#
# Class should not be used as a standalone object. Used in seperate modules
# to check if StrongDistance and WeakDistance are instances of Distance
class Distance:
## Protected - filepath of file containing first curve.
_curve_1_file: str
## Protected - filepath of file containing first curve.
_curve_2_file: str
## Protected - last input of epsilon set to class.
_epsilon: float
## Should be only called by sub classes.
# @param self Object pointer.
# @param curve_1_file Filepath of curve one.
# @param curve_2_file Filepath of curve one.
def __init__(self, curve_1_file, curve_2_file):
curve1_abspath = os.path.abspath(curve_1_file)
curve2_abspath = os.path.abspath(curve_2_file)
curve1_ascii = curve1_abspath.encode('ascii')
curve2_ascii = curve2_abspath.encode('ascii')
self._curve_1_file = curve1_ascii
self._curve_2_file = curve2_ascii
## Formatted string containing curve filepaths and child class name.
# @param self Object pointer.
# @param name_ Name of sub class.
# @return Formatted string.
def __str__(self, name_):
try:
curve_1_file, curve_2_file = self._curve_1_file, self._curve_2_file
except:
curve_1_file, curve_2_file = "N/A", "N/A"
return f"""
Frechet Distance | {name_}
========================================
Curve 1 File | {curve_1_file}
Curve 2 File | {curve_2_file}
"""
## Protected - raises IOError if sub class's setCurves() fails to read curve file.
# @param self Object pointer.
# @param errno Error number returned from C program.
def _checkSetCurvesErrno(self, errno):
if errno != 0:
raise IOError(f"{os.strerror(errno)} raised while running setCurves().\n"
f"Check file paths and formats for error:\n"
f"{self._curve_1_file}\n"
f"{self._curve_2_file}"
)
## Use to get filepaths of curves.
# @param self Object pointer.
# @returns First curve filepath then second curve filepath.
def getFileNames(self):
return self._curve_1_file, self._curve_2_file
## Use to get last input of epsilon.
# @param self Object pointer.
# @returns Epsilon.
def getEpsilon(self):
return self._epsilon
## Frechet Distance API
#
# Class can be used to load curves from files, build free space data structure
# and check if paths exist in free space.
class StrongDistance(Distance):
## Constructs an empty object with no curves
# @param self Object pointer.
def __init__(self):
_sd.create_freespace_reachabilitytable()
## Formatted string containing curve filepaths and child class name.
# @param self Object pointer.
# @return Formatted string.
def __str__(self):
return super().__str__(self.__class__.__name__)
## Creates instance that includes two curves from two separate files.
# @param cls Class pointer.
# @param curve_1_file Filepath of file containing first curve.
# @param curve_2_file Filepath of file containing second curve.
# @param reverse_curve_2 True if coordinates for second curve need to be reversed.
# @return Object pointer.
@classmethod
def setCurves(cls, curve_1_file, curve_2_file, reverse_curve_2=False):
self = cls.__new__(cls)
super(StrongDistance, self).__init__(curve_1_file, curve_2_file)
errno = _sd.createcurves(self._curve_1_file, self._curve_2_file, \
reverse_curve_2)
self._checkSetCurvesErrno(errno)
self.__init__()
return self
## Use to set free space data structure for epsilon.
# @param self Object pointer.
# @param epsilon Epsilon.
def setFreeSpace(self, epsilon):
self._epsilon = epsilon
_sd.setfreespace(epsilon)
## Use to get coodinates of first curve
# @param self Object pointer.
# @return Array of coordinates; array of structs containing variables 'x' and 'y.'
def getCurve1(self):
return _sd.getcurve1()
## Use to get coodinates of second curve
# @param self Object pointer.
# @return Array of coordinates; array of structs containing variables 'x' and 'y.'
def getCurve2(self):
return _sd.getcurve2()
## Use to get number of coodinates for first curve
# @param self Object pointer.
# @return Lenght of array returned by getCurve1().
def getCurve1Lenght(self):
return _sd.getcurve1lenght()
## Use to get number of coodinates for second curve
# @param self Object pointer.
# @return Lenght of array returned by getCurve2().
def getCurve2Lenght(self):
return _sd.getcurve2lenght()
## Use to get free space data struct
# @param self Object pointer.
# @return Typedef struct freespace.
def getFreeSpace(self):
return _sd.getfreespace()
## Use to check if path exists inside free space.
# @param self Object pointer.
# @return True if path exists; False if path is not found.
def isReachable(self):
_sd.setreachabilitytable()
return _sd.isreachable()
## Weak Frechet Distance API
#
# Class can be used to load curves from files, build free space data structure
# and check if paths exist in free space.
class WeakDistance(Distance):
## Constructs an empty object with no curves
# @param self Object pointer.
def __init__(self):
_wd.create_freespace_reachabilitytable()
## Formatted string containing curve filepaths and child class name.
# @param self Object pointer.
# @return Formatted string.
def __str__(self):
return super().__str__(self.__class__.__name__)
## Creates instance that includes two curves from two separate files.
# @param cls Class pointer.
# @param curve_1_file Filepath of file containing first curve.
# @param curve_2_file Filepath of file containing second curve.
# @param reverse_curve_2 True if coordinates for second curve need to be reversed.
# @return Object pointer.
@classmethod
def setCurves(cls, curve_1_file, curve_2_file, reverse_curve_2=False):
self = cls.__new__(cls)
super(WeakDistance, self).__init__(curve_1_file, curve_2_file)
        errno = _wd.createcurves(self._curve_1_file, self._curve_2_file,
                                 reverse_curve_2)
self._checkSetCurvesErrno(errno)
self.__init__()
return self
## Use to set free space data structure for epsilon.
# @param self Object pointer.
# @param epsilon Epsilon.
def setFreeSpace(self, epsilon):
self._epsilon = epsilon
_wd.setfreespace(epsilon)
    ## Use to get coordinates of first curve
# @param self Object pointer.
# @return Array of coordinates; array of structs containing variables 'x' and 'y.'
def getCurve1(self):
return _wd.getcurve1()
    ## Use to get coordinates of second curve
# @param self Object pointer.
# @return Array of coordinates; array of structs containing variables 'x' and 'y.'
def getCurve2(self):
return _wd.getcurve2()
    ## Use to get number of coordinates for first curve
    # @param self Object pointer.
    # @return Length of array returned by getCurve1().
def getCurve1Lenght(self):
return _wd.getcurve1lenght()
    ## Use to get number of coordinates for second curve
    # @param self Object pointer.
    # @return Length of array returned by getCurve2().
def getCurve2Lenght(self):
return _wd.getcurve2lenght()
## Use to get free space data struct
# @param self Object pointer.
# @return Typedef struct freespace.
def getFreeSpace(self):
return _wd.getfreespace()
## Use to check if path exists inside free space.
# @param self Object pointer.
# @return True if path exists; False if path is not found.
def isReachable(self):
        if _wd.getcurve1lenght() == 1 or _wd.getcurve2lenght() == 1:
            try:
                # Epsilon is stored as an instance attribute by setFreeSpace(),
                # so it must be read through self, not super().
                return _wd.computemaxdistances(self._epsilon)
            except AttributeError:
                raise AttributeError("No value for Epsilon exists because "
                                     "setFreeSpace() was never called.")
        else:
            return _wd.isreachable()
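`isReachable()` above answers the decision question "does a monotone path exist in the free space for the current epsilon" by delegating to the native `_wd` module. As an independent, hedged illustration of the same quantity (not the algorithm `_wd` implements), here is the classic discrete Fréchet distance dynamic program of Eiter and Mannila, plus the corresponding decision check:

```python
from math import dist, inf

def discrete_frechet(p, q):
    # ca[i][j] is the coupling distance between the prefixes p[:i+1] and
    # q[:j+1]; the recurrence takes the cheapest predecessor cell and
    # maxes it with the current point-pair distance.
    n, m = len(p), len(q)
    ca = [[inf] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = dist(p[i], q[j])
            if i == 0 and j == 0:
                ca[i][j] = d
            elif i == 0:
                ca[i][j] = max(ca[i][j - 1], d)
            elif j == 0:
                ca[i][j] = max(ca[i - 1][j], d)
            else:
                ca[i][j] = max(min(ca[i - 1][j], ca[i][j - 1],
                                   ca[i - 1][j - 1]), d)
    return ca[-1][-1]

def is_reachable(p, q, epsilon):
    # Decision version: a monotone coupling within epsilon exists iff the
    # discrete Frechet distance is at most epsilon.
    return discrete_frechet(p, q) <= epsilon

p = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
q = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
print(discrete_frechet(p, q))  # 1.0
print(is_reachable(p, q, 0.5))  # False
```

The sketch uses `math.dist` (Python 3.8+) and treats curves as lists of (x, y) tuples, mirroring the x/y structs returned by `getCurve1()`/`getCurve2()`.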
| 36.936709 | 87 | 0.654329 | 1,092 | 8,754 | 5.039377 | 0.156593 | 0.028348 | 0.062693 | 0.09195 | 0.760676 | 0.72724 | 0.702526 | 0.691259 | 0.663638 | 0.651645 | 0 | 0.012311 | 0.266964 | 8,754 | 236 | 88 | 37.09322 | 0.845255 | 0.477724 | 0 | 0.372549 | 0 | 0 | 0.103604 | 0.013964 | 0 | 0 | 0 | 0 | 0 | 1 | 0.245098 | false | 0 | 0.029412 | 0.137255 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
d93a073e8ca65a862c7e8292904e84226e6cca0f | 28 | py | Python | examples/project5/one/foo/foo.py | jeanslack/winpackit | 44bfebcf2191fbc32acfa4a61e7c53d8a8fbae4d | [
"MIT"
] | 2 | 2021-09-19T21:34:13.000Z | 2021-11-25T07:27:29.000Z | examples/project5/one/foo/foo.py | jeanslack/winpackit | 44bfebcf2191fbc32acfa4a61e7c53d8a8fbae4d | [
"MIT"
] | 1 | 2021-09-19T23:20:02.000Z | 2021-09-22T17:35:08.000Z | examples/project5/one/foo/foo.py | jeanslack/winpackit | 44bfebcf2191fbc32acfa4a61e7c53d8a8fbae4d | [
"MIT"
] | 1 | 2021-09-21T22:01:45.000Z | 2021-09-21T22:01:45.000Z | print('this is foo/foo.py')
| 14 | 27 | 0.678571 | 6 | 28 | 3.166667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 28 | 1 | 28 | 28 | 0.76 | 0 | 0 | 0 | 0 | 0 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
d98fb772da7f3e7be98ff5a284e96ee6b910ad56 | 8,030 | py | Python | MARKBOT/MarkBot-v2/cogs/Shrug.py | z404/Project-Mark | 87f4a13cc99dc7d1ceeaacaab8bcaa310355250b | [
"MIT"
] | 6 | 2021-05-17T06:04:37.000Z | 2021-08-11T06:35:15.000Z | MARKBOT/MarkBot-v2/cogs/Shrug.py | z404/Project-Mark | 87f4a13cc99dc7d1ceeaacaab8bcaa310355250b | [
"MIT"
] | 10 | 2021-08-30T07:37:48.000Z | 2021-12-14T12:17:53.000Z | MARKBOT/MarkBot-v2/cogs/Shrug.py | z404/Project-Mark | 87f4a13cc99dc7d1ceeaacaab8bcaa310355250b | [
"MIT"
] | 1 | 2021-08-28T14:45:08.000Z | 2021-08-28T14:45:08.000Z | # For Discord
import discord
from discord.ext import commands
# from discord_slash import cog_ext, SlashContext
from tabulate import tabulate
# Function to write into database
def write_db(db: dict) -> None:
with open("database", 'w+') as file:
file.write(str(db))
# Function to get database
def get_db() -> dict:
with open('database') as file:
return eval(file.read())
# Loading the config file
with open('config.json') as file:
config = eval(file.read())
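`write_db`/`get_db` above round-trip the database through `str()`/`eval()`, which executes whatever the file contains. A JSON-based equivalent is shown below as a hedged alternative (these helper names are mine, not what the bot uses); note that JSON coerces non-string dict keys to strings, so integer IDs would come back as strings:

```python
import json

def write_db_json(db, path="database.json"):
    # Same contract as write_db(), but serialized as JSON so that loading
    # the file never evaluates its contents as Python code.
    with open(path, "w") as f:
        json.dump(db, f)

def get_db_json(path="database.json"):
    # JSON counterpart of get_db().
    with open(path) as f:
        return json.load(f)
```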
class Shrug(commands.Cog):
def __init__(self, bot: commands.Bot):
self.bot = bot
self.db = get_db()
@commands.group(name="shrug", pass_context=True)
async def _shrug(self, ctx: commands.Context):
"""Command to manage a shrug channel"""
if not ctx.invoked_subcommand:
# TODO: Add help command here for shrug channel management
pass
@_shrug.command(pass_context=True, name="reset")
async def _reset(self, ctx: commands.Context):
'''Resets Shrug channel in a server'''
def check_if_server_has_shrug(server_id):
for i in self.db.keys():
if str(server_id) == i[:len(str(server_id))] and "shrugchan" in i:
return True, i
return False, None
ret = check_if_server_has_shrug(ctx.guild.id)
if ret[0]:
del(self.db[ret[1]])
write_db(self.db)
await ctx.send("Successfully reset shrug channel!")
else:
await ctx.send("What's a shrug channel? I can't reset what I don't know. I'm not a supercomputer y'know")
@_shrug.command(pass_context=True, name="set")
async def _set(self, ctx: commands.Context, channel: discord.TextChannel):
'''Set Shrug channel in a server'''
def check_if_server_has_shrug(server_id):
for i in self.db.keys():
if str(server_id) == i[:len(str(server_id))] and "shrugchan" in i:
return True
return False
if check_if_server_has_shrug(ctx.guild.id):
await ctx.send("This server already has a shrug channel! Type `!resetshrugchannel` to reset shrug channel settings for this server!")
else:
await ctx.send("Please wait while I figure out what a shrug means to humans")
            message_count = {}
            async for message in channel.history(oldest_first=True, limit=None):
                if message.content == "¯\\_(ツ)_/¯" or repr(message.content) == r"'¯\\\\_(ツ)\\_/¯'":
                    try:
                        message_count[message.author.id] += 1
                    except KeyError:
                        message_count[message.author.id] = 1
self.db[str(ctx.guild.id)+str(channel.id) +
'shrugchan'] = message_count
write_db(self.db)
await ctx.send("Shrug channel is now configured!")
@_shrug.command(pass_context=True, name="leaderboard")
async def _leaderboard(self, ctx: commands.Context):
'''Displays a shrug leaderboard for the server'''
def check_if_server_has_shrug(server_id):
for i in self.db.keys():
if str(server_id) == i[:len(str(server_id))] and "shrugchan" in i:
return True, i
return False, None
ret = check_if_server_has_shrug(ctx.guild.id)
if ret[0]:
leaderdata = self.db[ret[1]]
X = list(leaderdata.keys())
Y = list(leaderdata.values())
userlst = [x for _, x in sorted(zip(Y, X))]
userlst.reverse()
finaldict = []
for i in userlst:
finaldict.append(
[str(ctx.guild.get_member(i)), str(leaderdata[i])])
headers = ["User", "Shrugs"]
table = tabulate(finaldict, headers, tablefmt="fancy_grid")
# await ctx.send("Leaderboard:\n```"+table+"```")
desc = "```yaml\n"+table+"```\n Total shrugs: "+str(sum(Y))
embed = discord.Embed(title="Leaderboard",
description=desc, color=0x00ff00)
await ctx.send(embed=embed)
# await ctx.send(str(leaderdata))
else:
await ctx.send("No shrug channel has been set! Set a shrug channel by typing `!setshrugchannel`")
# @cog_ext.cog_subcommand(name="reset", base="shrug")
# async def _reset_slash(self, ctx: SlashContext):
# '''Resets Shrug channel in a server'''
# def check_if_server_has_shrug(server_id):
# for i in self.db.keys():
# if str(server_id) == i[:len(str(server_id))] and "shrugchan" in i:
# return True, i
# return False, None
# ret = check_if_server_has_shrug(ctx.guild.id)
# if ret[0]:
# del(self.db[ret[1]])
# write_db(self.db)
# await ctx.send("Successfully reset shrug channel!")
# else:
# await ctx.send("What's a shrug channel? I can't reset what I don't know. I'm not a supercomputer y'know")
# @cog_ext.cog_subcommand(name="set", base="shrug")
# async def _set_slash(self, ctx: SlashContext, channel: discord.TextChannel):
# '''Set Shrug channel in a server'''
# def check_if_server_has_shrug(server_id):
# for i in self.db.keys():
# if str(server_id) == i[:len(str(server_id))] and "shrugchan" in i:
# return True
# return False
# if check_if_server_has_shrug(ctx.guild.id):
# await ctx.send("This server already has a shrug channel! Type `!resetshrugchannel` to reset shrug channel settings for this server!")
# else:
# await ctx.send("Please wait while I figure out what a shrug means to humans")
# counter = 0
# message_count = {}
# async for message in channel.history(oldest_first=True, limit=None):
# if message.content == "¯\\_(ツ)_/¯" or repr(message.content) == r"'¯\\\\_(ツ)\\_/¯'":
# try:
# message_count[message.author.id] += 1
# except:
# message_count[message.author.id] = 1
# self.db[str(ctx.guild.id)+str(channel.id) +
# 'shrugchan'] = message_count
# write_db(self.db)
# await ctx.send("Shrug channel is now configured!")
# @cog_ext.cog_subcommand(name="leaderboard", base="shrug")
# async def _leaderboard_slash(self, ctx: SlashContext):
# '''Displays a shrug leaderboard for the server'''
# def check_if_server_has_shrug(server_id):
# for i in self.db.keys():
# if str(server_id) == i[:len(str(server_id))] and "shrugchan" in i:
# return True, i
# return False, None
# ret = check_if_server_has_shrug(ctx.guild.id)
# if ret[0]:
# leaderdata = self.db[ret[1]]
# X = list(leaderdata.keys())
# Y = list(leaderdata.values())
# userlst = [x for _, x in sorted(zip(Y, X))]
# userlst.reverse()
# finaldict = []
# for i in userlst:
# finaldict.append(
# [str(ctx.guild.get_member(i)), str(leaderdata[i])])
# headers = ["User", "Shrugs"]
# table = tabulate(finaldict, headers, tablefmt="fancy_grid")
# # await ctx.send("Leaderboard:\n```"+table+"```")
# desc = "```yaml\n"+table+"```\n Total shrugs: "+str(sum(Y))
# embed = discord.Embed(title="Leaderboard",
# description=desc, color=0x00ff00)
# await ctx.send(embed=embed)
# # await ctx.send(str(leaderdata))
# else:
# await ctx.send("No shrug channel has been set! Set a shrug channel by typing `!setshrugchannel`")
def setup(bot):
bot.add_cog(Shrug(bot))
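The `_leaderboard` command ranks users with `sorted(zip(Y, X))` followed by a reverse. The self-contained sketch below isolates just that zip/sort idiom with invented data (the names are hypothetical, standing in for the stored `{user_id: shrug_count}` dict):

```python
# Invented data standing in for the stored {user_id: shrug_count} mapping.
leaderdata = {"alice": 3, "bob": 7, "carol": 5}
ids = list(leaderdata.keys())
counts = list(leaderdata.values())
# sorted(zip(counts, ids)) orders (count, id) pairs by count ascending;
# keeping only the id and reversing yields a descending leaderboard.
ranked = [user for _, user in sorted(zip(counts, ids))]
ranked.reverse()
print(ranked)  # ['bob', 'carol', 'alice']
```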
| 44.120879 | 147 | 0.560149 | 1,015 | 8,030 | 4.3133 | 0.15665 | 0.05482 | 0.049338 | 0.043856 | 0.79534 | 0.77958 | 0.758337 | 0.758337 | 0.758337 | 0.758337 | 0 | 0.004356 | 0.313823 | 8,030 | 181 | 148 | 44.364641 | 0.788748 | 0.433998 | 0 | 0.261905 | 0 | 0.02381 | 0.134727 | 0 | 0 | 0 | 0.001858 | 0.005525 | 0 | 1 | 0.083333 | false | 0.059524 | 0.035714 | 0 | 0.214286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
79a86ed7f7916c2ba77617f0f70a1d3c994fc28c | 3,271 | py | Python | django_outlook/views.py | weijia/django-outlook | 3f6b22e3ec34890c9b84699a22b2e39b8cadc723 | [
"MIT"
] | null | null | null | django_outlook/views.py | weijia/django-outlook | 3f6b22e3ec34890c9b84699a22b2e39b8cadc723 | [
"MIT"
] | 2 | 2018-12-26T19:10:02.000Z | 2021-11-15T17:47:55.000Z | django_outlook/views.py | weijia/django-outlook | 3f6b22e3ec34890c9b84699a22b2e39b8cadc723 | [
"MIT"
] | null | null | null | from django.views.generic import TemplateView, RedirectView
from django_outlook.o365_utils.adv_connection import OutlookConnection
from django_outlook.o365_utils.login_token_storage import LoginTokenStorage
from django_outlook.o365_utils.token_storage import TokenStorage
from djangoautoconf.django_utils import retrieve_param
from djangoautoconf.local_key_manager import get_local_key
class O365AuthRedirectView(RedirectView):
permanent = False # Not always redirect to the same page
def get_redirect_url(self, *args, **kwargs):
# article = get_object_or_404(Article, pk=kwargs['pk'])
# article.update_counter()
# return super().get_redirect_url(*args, **kwargs)
token_storage = TokenStorage(self.request.user)
c = OutlookConnection(
get_local_key("o365_app_settings.o365_app_client_id"),
get_local_key("o365_app_settings.o365_app_secret"),
token_storage,
)
auth_url = c.get_auth_url()
return auth_url
class OutlookAuthResultView(TemplateView):
template_name = 'django_outlook/key_got.html'
def get_context_data(self, **kwargs):
# return super(OutlookAuthResultView, self).get_context_data(**kwargs)
# param = retrieve_param(self.request)
token_storage = TokenStorage(self.request.user)
c = OutlookConnection(
get_local_key("o365_app_settings.o365_app_client_id"),
get_local_key("o365_app_settings.o365_app_secret"),
token_storage,
)
token_url = "%s/?%s" % ("https://localhost", self.request.META['QUERY_STRING'])
res = c.update_extra_data(token_url)
return res
class O365LoginRedirectView(RedirectView):
permanent = False # Not always redirect to the same page
def get_redirect_url(self, *args, **kwargs):
# article = get_object_or_404(Article, pk=kwargs['pk'])
# article.update_counter()
# return super().get_redirect_url(*args, **kwargs)
token_storage = LoginTokenStorage(self.request.user)
c = OutlookConnection(
get_local_key("o365_login_app_settings.o365_app_client_id"),
get_local_key("o365_login_app_settings.o365_app_secret"),
token_storage,
redirect_url='https://%s/django_outlook/login_result/' % self.request.get_host()
)
auth_url = c.get_auth_url()
return auth_url
class OutlookLoginResultView(TemplateView):
template_name = 'django_outlook/key_got.html'
def get_context_data(self, **kwargs):
# return super(OutlookLoginResultView, self).get_context_data(**kwargs)
# param = retrieve_param(self.request)
token_storage = LoginTokenStorage(self.request.user)
c = OutlookConnection(
get_local_key("o365_login_app_settings.o365_app_client_id"),
get_local_key("o365_login_app_settings.o365_app_secret"),
token_storage,
redirect_url='https://%s/django_outlook/login_result/' % self.request.get_host()
)
token_url = "%s/?%s" % ("https://localhost", self.request.META['QUERY_STRING'])
param = retrieve_param(self.request)
res = c.update_extra_data(token_url, param["state"])
return res
| 41.405063 | 92 | 0.694895 | 394 | 3,271 | 5.416244 | 0.200508 | 0.039363 | 0.046392 | 0.056232 | 0.802718 | 0.752577 | 0.752577 | 0.727273 | 0.727273 | 0.727273 | 0 | 0.026569 | 0.206053 | 3,271 | 78 | 93 | 41.935897 | 0.795148 | 0.165699 | 0 | 0.690909 | 0 | 0 | 0.186672 | 0.130339 | 0 | 0 | 0 | 0 | 0 | 1 | 0.072727 | false | 0 | 0.109091 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
79acf3ebfeff08d3ff7c9ee0d9af46eb34116b22 | 3,268 | py | Python | quedadas/tests/tests_friends.py | fevsea/meet-Run-Server | 48454a4665f55da019334271641c514df231f177 | [
"MIT"
] | null | null | null | quedadas/tests/tests_friends.py | fevsea/meet-Run-Server | 48454a4665f55da019334271641c514df231f177 | [
"MIT"
] | null | null | null | quedadas/tests/tests_friends.py | fevsea/meet-Run-Server | 48454a4665f55da019334271641c514df231f177 | [
"MIT"
] | null | null | null | from django.contrib.auth.models import User
from django.urls import reverse
from rest_framework import status
from rest_framework.authtoken.models import Token
from rest_framework.test import APITestCase
from populateDB import create_basic_user
from populateDB import create_basic_user_2
class FriendsTests(APITestCase):
def setUp(self):
pass
def test_add_friend(self):
create_basic_user()
self.user = User.objects.get(username='awaisI')
token = Token.objects.create(user=self.user)
self.client.credentials(HTTP_AUTHORIZATION='Token ' + token.key)
token.save()
create_basic_user_2()
response = self.client.post(
reverse('friends', kwargs={'pk': 2}),
format='json'
)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)  # check that the request was created
response2 = self.client.get(
reverse('friends', kwargs={'pk': 2}),
data={'accepted': False}
)
        self.assertEqual(response2.data['count'], 1)  # check that the request was created and is not yet accepted
self.user = User.objects.get(username='ericR')
token = Token.objects.create(user=self.user)
self.client.credentials(HTTP_AUTHORIZATION='Token ' + token.key)
token.save()
        response = self.client.post(  # accept the request
reverse('friends', kwargs={'pk': 1}),
format='json'
)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)  # check that the request was created
response2 = self.client.get(
reverse('friends', kwargs={'pk': 1}),
data={'accepted': True}
)
        self.assertEqual(response2.data['count'], 1)  # check that the request was accepted
response2 = self.client.get(
reverse('friends', kwargs={'pk': 1}),
data={'accepted': False}
)
        self.assertEqual(response2.data['count'], 0)  # check that there are no pending requests left
        response = self.client.delete(  # delete the friendship
reverse('friends', kwargs={'pk': 1}),
format='json'
)
        self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)  # check that it was deleted
response2 = self.client.get(
reverse('friends', kwargs={'pk': 1}),
data={'accepted': True}
)
        self.assertEqual(response2.data['count'], 1)  # check that the request was accepted
response2 = self.client.get(
reverse('friends', kwargs={'pk': 1}),
data={'accepted': False}
)
        self.assertEqual(response2.data['count'], 0)  # check that there are no pending requests left
def test_add_friend_no_exists(self):
create_basic_user()
self.user = User.objects.get(username='awaisI')
token = Token.objects.create(user=self.user)
self.client.credentials(HTTP_AUTHORIZATION='Token ' + token.key)
token.save()
response = self.client.post(
reverse('friends', kwargs={'pk': 2}),
format='json'
)
self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)
| 37.563218 | 114 | 0.626683 | 379 | 3,268 | 5.30343 | 0.224274 | 0.059701 | 0.089552 | 0.098507 | 0.80199 | 0.801493 | 0.751741 | 0.743284 | 0.734826 | 0.734826 | 0 | 0.015631 | 0.25612 | 3,268 | 86 | 115 | 38 | 0.811189 | 0.116891 | 0 | 0.638889 | 0 | 0 | 0.068522 | 0 | 0 | 0 | 0 | 0 | 0.125 | 1 | 0.041667 | false | 0.013889 | 0.097222 | 0 | 0.152778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8db34073615412750b94da91eb39e81a1bbf48eb | 4,281 | py | Python | preprocess/bilingual/download_data.py | jtbates/paraphrastic-representations-at-scale | 3df4a61e52640bb358467721d1de50d71b26889b | [
"BSD-3-Clause"
] | 68 | 2021-05-06T03:08:04.000Z | 2022-03-28T16:25:42.000Z | preprocess/bilingual/download_data.py | jtbates/paraphrastic-representations-at-scale | 3df4a61e52640bb358467721d1de50d71b26889b | [
"BSD-3-Clause"
] | 7 | 2021-06-03T06:04:48.000Z | 2021-10-05T22:51:35.000Z | preprocess/bilingual/download_data.py | jtbates/paraphrastic-representations-at-scale | 3df4a61e52640bb358467721d1de50d71b26889b | [
"BSD-3-Clause"
] | 6 | 2021-06-03T05:20:45.000Z | 2021-07-30T08:43:46.000Z | import os
import urllib.request
import sys
def check_site_exist(url):
request = urllib.request.Request(url)
request.get_method = lambda: 'HEAD'
try:
urllib.request.urlopen(request)
return True
except urllib.request.HTTPError:
return False
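`download_for_lang()` below branches on `lang < "en"` because OPUS names each bilingual archive with the two language codes in alphabetical order (e.g. `de-en` but `en-fr`). A small hedged helper capturing just that naming rule (the function name is mine, not part of the script):

```python
def opus_pair(lang, pivot="en"):
    # OPUS archives list the two ISO codes in alphabetical order, which is
    # what the `first` flag in download_for_lang() tracks.
    return f"{lang}-{pivot}" if lang < pivot else f"{pivot}-{lang}"

print(opus_pair("de"))  # de-en
print(opus_pair("fr"))  # en-fr
```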
def download_for_lang(lang):
first = False
if lang < "en":
first = True
cmd = "rm -Rf en-{0}".format(lang)
os.system(cmd)
cmd = "mkdir en-{0}".format(lang)
os.system(cmd)
"""
#download os
if first:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=OpenSubtitles/v2018/moses/{0}-en.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=OpenSubtitles/v2018/moses/{0}-en.txt.zip -O en-{0}.os.txt.zip".format(lang)
else:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=OpenSubtitles/v2018/moses/en-{0}.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=OpenSubtitles/v2018/moses/en-{0}.txt.zip -O en-{0}.os.txt.zip".format(lang)
if exists:
os.system(cmd)
cmd = "unzip -d en-{0}/os/ en-{0}.os.txt.zip".format(lang)
os.system(cmd)
if lang == "zh":
cmd = "wget http://opus.nlpl.eu/download.php?f=OpenSubtitles/v2018/moses/en-zh_cn.txt.zip -O en-zh.os.txt.zip"
os.system(cmd)
cmd = "unzip -d en-{0}/os/ en-{0}.os.txt.zip".format(lang)
os.system(cmd)
#download tanzil
if first:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=Tanzil/v1/moses/{0}-en.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=Tanzil/v1/moses/{0}-en.txt.zip -O en-{0}.tz.txt.zip".format(lang)
else:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=Tanzil/v1/moses/en-{0}.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=Tanzil/v1/moses/en-{0}.txt.zip -O en-{0}.tz.txt.zip".format(lang)
if exists:
os.system(cmd)
cmd = "unzip -d en-{0}/tz/ en-{0}.tz.txt.zip".format(lang)
os.system(cmd)
#download europarl
if first:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=Europarl/v8/moses/{0}-en.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=Europarl/v8/moses/{0}-en.txt.zip -O en-{0}.ep.txt.zip".format(lang)
else:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=Europarl/v8/moses/en-{0}.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=Europarl/v8/moses/en-{0}.txt.zip -O en-{0}.ep.txt.zip".format(lang)
if exists:
os.system(cmd)
cmd = "unzip -d en-{0}/ep/ en-{0}.ep.txt.zip".format(lang)
os.system(cmd)
#download un
if first:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=MultiUN/v1/moses/{0}-en.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=MultiUN/v1/moses/{0}-en.txt.zip -O en-{0}.un.txt.zip".format(lang)
else:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=MultiUN/v1/moses/en-{0}.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=MultiUN/v1/moses/en-{0}.txt.zip -O en-{0}.un.txt.zip".format(lang)
if exists:
os.system(cmd)
cmd = "unzip -d en-{0}/un/ en-{0}.un.txt.zip".format(lang)
os.system(cmd)
"""
#download global voices
if first:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=GlobalVoices/v2017q3/moses/{0}-en.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=GlobalVoices/v2017q3/moses/{0}-en.txt.zip -O en-{0}.gv.txt.zip".format(lang)
else:
exists = check_site_exist("http://opus.nlpl.eu/download.php?f=GlobalVoices/v2017q3/moses/en-{0}.txt.zip".format(lang))
cmd = "wget http://opus.nlpl.eu/download.php?f=GlobalVoices/v2017q3/moses/en-{0}.txt.zip -O en-{0}.gv.txt.zip".format(lang)
if exists:
os.system(cmd)
cmd = "unzip -d en-{0}/gv/ en-{0}.gv.txt.zip".format(lang)
os.system(cmd)
if len(sys.argv) == 1:
langs = ["fr", "de", "ru", "zh", "ar", "es", "tr"]
else:
langs = sys.argv[1].split("-")
for i in langs:
download_for_lang(i)
os.system("rm *.zip") | 40.771429 | 131 | 0.622985 | 713 | 4,281 | 3.701262 | 0.110799 | 0.086396 | 0.118227 | 0.157635 | 0.843122 | 0.843122 | 0.840091 | 0.814324 | 0.759757 | 0.759757 | 0 | 0.027833 | 0.177529 | 4,281 | 105 | 132 | 40.771429 | 0.72167 | 0.005139 | 0 | 0.166667 | 0 | 0.111111 | 0.330377 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.083333 | 0 | 0.194444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5c15452796682cb065309a22f593b42278724405 | 3,075 | py | Python | client_wishlist/wishlist/views.py | EVolpert/client_whishlist | b2da64a53e978bc77bc4fb9a8c9b9dc4af66c5b1 | [
"CC0-1.0"
] | null | null | null | client_wishlist/wishlist/views.py | EVolpert/client_whishlist | b2da64a53e978bc77bc4fb9a8c9b9dc4af66c5b1 | [
"CC0-1.0"
] | 5 | 2021-03-30T14:20:02.000Z | 2021-09-22T19:29:15.000Z | client_wishlist/wishlist/views.py | EVolpert/client_whishlist | b2da64a53e978bc77bc4fb9a8c9b9dc4af66c5b1 | [
"CC0-1.0"
] | 1 | 2020-08-18T16:35:12.000Z | 2020-08-18T16:35:12.000Z | from django.shortcuts import get_object_or_404
from rest_framework import status
from rest_framework.views import APIView
from rest_framework.response import Response
from customers.auth import authenticate
from wishlist.models import Wishlist
from wishlist.serializers import WishlistSerializer
from wishlist.wishlist_svc import get_or_update, delete_product_from_wishlist
class WishListDetailView(APIView):
def get(self, request, customer_id):
try:
token = request.META['HTTP_AUTHORIZATION']
authenticated = authenticate(customer_id=customer_id, token=token)
except KeyError:
response = Response(status=status.HTTP_400_BAD_REQUEST)
else:
if authenticated:
wishlist = get_object_or_404(Wishlist, customer__id=customer_id)
serializer = WishlistSerializer(wishlist)
response = Response(serializer.data, status=status.HTTP_200_OK)
else:
response = Response(status=status.HTTP_401_UNAUTHORIZED)
return response
def delete(self, request, customer_id):
try:
token = request.META['HTTP_AUTHORIZATION']
authenticated = authenticate(customer_id=customer_id, token=token)
except KeyError:
response = Response(status=status.HTTP_400_BAD_REQUEST)
else:
if authenticated:
wishlist = get_object_or_404(Wishlist, customer__id=customer_id)
wishlist.delete()
response = Response(status=status.HTTP_204_NO_CONTENT)
else:
response = Response(status=status.HTTP_401_UNAUTHORIZED)
return response
class WishlistUpdateView(APIView):
def post(self, request, customer_id, product_id):
try:
token = request.META['HTTP_AUTHORIZATION']
authenticated = authenticate(customer_id=customer_id, token=token)
except KeyError:
response = Response(status=status.HTTP_400_BAD_REQUEST)
else:
if authenticated:
wishlist = get_or_update(customer_id, product_id)
serializer = WishlistSerializer(wishlist)
response = Response(serializer.data, status=status.HTTP_200_OK)
else:
response = Response(status=status.HTTP_401_UNAUTHORIZED)
return response
def delete(self, request, customer_id, product_id):
try:
token = request.META['HTTP_AUTHORIZATION']
authenticated = authenticate(customer_id=customer_id, token=token)
except KeyError:
response = Response(status=status.HTTP_400_BAD_REQUEST)
else:
if authenticated:
                wishlist = delete_product_from_wishlist(customer_id, product_id)
                serializer = WishlistSerializer(wishlist)
response = Response(serializer.data, status=status.HTTP_200_OK)
else:
response = Response(status=status.HTTP_401_UNAUTHORIZED)
return response
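Every view above reads the raw token from `request.META['HTTP_AUTHORIZATION']` and treats a missing header as a 400. A hedged, framework-free sketch of that header lookup, assuming the conventional `Token <key>` scheme used by DRF's TokenAuthentication (the helper name is mine):

```python
def parse_token_header(meta):
    # Django exposes the Authorization header as META["HTTP_AUTHORIZATION"];
    # for token auth its value has the form "Token <key>".
    value = meta.get("HTTP_AUTHORIZATION", "")
    scheme, _, key = value.partition(" ")
    return key if scheme == "Token" and key else None

print(parse_token_header({"HTTP_AUTHORIZATION": "Token abc123"}))  # abc123
```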
| 39.935065 | 81 | 0.66374 | 317 | 3,075 | 6.18612 | 0.179811 | 0.09179 | 0.097909 | 0.128506 | 0.759816 | 0.743498 | 0.710862 | 0.710862 | 0.710862 | 0.710862 | 0 | 0.02008 | 0.27122 | 3,075 | 76 | 82 | 40.460526 | 0.854975 | 0 | 0 | 0.712121 | 0 | 0 | 0.023422 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060606 | false | 0 | 0.121212 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
30c4bcb9b8b8cf76dd11cf1c07eb8c766fd069e8 | 42 | py | Python | bert4ms/models/__init__.py | lvyufeng/bert4ms | c62a41f0c1a8c94b1d51313211ad1fd2d3ededef | [
"Apache-2.0"
] | 25 | 2021-08-17T04:55:49.000Z | 2022-02-25T09:29:58.000Z | bert4ms/models/__init__.py | lvyufeng/bert4ms | c62a41f0c1a8c94b1d51313211ad1fd2d3ededef | [
"Apache-2.0"
] | 1 | 2022-02-19T14:15:00.000Z | 2022-02-25T07:14:27.000Z | bert4ms/models/__init__.py | lvyufeng/bert4ms | c62a41f0c1a8c94b1d51313211ad1fd2d3ededef | [
"Apache-2.0"
] | 1 | 2022-01-21T09:21:27.000Z | 2022-01-21T09:21:27.000Z | from .bert import *
# from .ernie import * | 21 | 22 | 0.690476 | 6 | 42 | 4.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 42 | 2 | 22 | 21 | 0.852941 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
30e04dd33bdadc0d59aca461bd3e9ac422e919c9 | 86 | py | Python | app/core/events/__init__.py | zmops/zeus-modbus | 3de989f2233e994876cf2a98ac46d9213e53ff3c | [
"Apache-2.0"
] | 3 | 2022-01-26T04:27:49.000Z | 2022-03-04T14:02:41.000Z | app/core/events/__init__.py | zmops/zeus-modbus | 3de989f2233e994876cf2a98ac46d9213e53ff3c | [
"Apache-2.0"
] | null | null | null | app/core/events/__init__.py | zmops/zeus-modbus | 3de989f2233e994876cf2a98ac46d9213e53ff3c | [
"Apache-2.0"
] | null | null | null | from .start import create_start_app_handler
from .stop import create_stop_app_handler
| 28.666667 | 43 | 0.883721 | 14 | 86 | 5 | 0.5 | 0.342857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093023 | 86 | 2 | 44 | 43 | 0.897436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
30ec4accf077b6e2f3a08f9897392a1dce282032 | 100 | py | Python | src/models.py | gwynhowell/gae-remoteapi-service-accounts | 524c67fd94d104ba16ead76e5938815722befb85 | [
"MIT"
] | 1 | 2016-09-01T13:54:50.000Z | 2016-09-01T13:54:50.000Z | src/models.py | gwynhowell/gae-remoteapi-service-accounts | 524c67fd94d104ba16ead76e5938815722befb85 | [
"MIT"
] | null | null | null | src/models.py | gwynhowell/gae-remoteapi-service-accounts | 524c67fd94d104ba16ead76e5938815722befb85 | [
"MIT"
] | null | null | null | # add model definitions here
from google.appengine.ext import ndb
class User(ndb.Model):
pass | 14.285714 | 36 | 0.75 | 15 | 100 | 5 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.18 | 100 | 7 | 37 | 14.285714 | 0.914634 | 0.26 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
ebd44ac26d47efc95150fc7d4c164441d5039051 | 39 | py | Python | convlab2/policy/ppo/__init__.py | ljw23/ConvLab-2 | 13d48ea0e441701bd66100689b6c25b561f15525 | [
"Apache-2.0"
] | 339 | 2020-03-04T09:43:22.000Z | 2022-03-26T17:27:38.000Z | convlab2/policy/ppo/__init__.py | ljw23/ConvLab-2 | 13d48ea0e441701bd66100689b6c25b561f15525 | [
"Apache-2.0"
] | 122 | 2020-04-12T04:19:06.000Z | 2022-03-23T14:20:57.000Z | convlab2/policy/ppo/__init__.py | ljw23/ConvLab-2 | 13d48ea0e441701bd66100689b6c25b561f15525 | [
"Apache-2.0"
] | 138 | 2020-02-18T16:48:04.000Z | 2022-03-26T17:27:43.000Z | from convlab2.policy.ppo.ppo import PPO | 39 | 39 | 0.846154 | 7 | 39 | 4.714286 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027778 | 0.076923 | 39 | 1 | 39 | 39 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ebd469d19fee31df25b608351786ecda63fba4bd | 22,089 | py | Python | scripts/python/AttributeMenu.py | zengchen2015/Houdini_HPro | 105d05454c84fbb42dc888350642bb79b1aa75b3 | [
"MIT"
] | 16 | 2021-05-08T21:31:51.000Z | 2022-03-17T19:36:26.000Z | scripts/python/AttributeMenu.py | zengchen2015/Houdini_HPro | 105d05454c84fbb42dc888350642bb79b1aa75b3 | [
"MIT"
] | null | null | null | scripts/python/AttributeMenu.py | zengchen2015/Houdini_HPro | 105d05454c84fbb42dc888350642bb79b1aa75b3 | [
"MIT"
] | 1 | 2022-01-24T21:50:06.000Z | 2022-01-24T21:50:06.000Z | import hou
def point(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
for point_attr in point_attrs_tuple:
point_attr_name = point_attr.name()
final_list.append(point_attr_name)
final_list.append(point_attr_name)
return final_list
def vertex(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
for vertex_attr in vertex_attrs_tuple:
vertex_attr_name = vertex_attr.name()
final_list.append(vertex_attr_name)
final_list.append(vertex_attr_name)
return final_list
def prim(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
for prim_attr in prim_attrs_tuple:
prim_attr_name = prim_attr.name()
final_list.append(prim_attr_name)
final_list.append(prim_attr_name)
return final_list
def detail(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
global_attrs_tuple = geo.globalAttribs()
for global_attr in global_attrs_tuple:
global_attr_name = global_attr.name()
final_list.append(global_attr_name)
final_list.append(global_attr_name)
return final_list
#############################################################################
def vertexInt(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
vertex_int_attrs_list = []
for vertex_attr in vertex_attrs_tuple:
if vertex_attr.size() == 1 and vertex_attr.dataType().name() == "Int" and not vertex_attr.isArrayType():
name = vertex_attr.name()
vertex_int_attrs_list.append(name)
vertex_int_attrs_list.sort()
for attr_name in vertex_int_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def vertexFloat(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
vertex_float_attrs_list = []
for vertex_attr in vertex_attrs_tuple:
if vertex_attr.size() == 1 and vertex_attr.dataType().name() == "Float" and not vertex_attr.isArrayType():
name = vertex_attr.name()
vertex_float_attrs_list.append(name)
vertex_float_attrs_list.sort()
for attr_name in vertex_float_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def vertexString(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
vertex_string_attrs_list = []
for vertex_attr in vertex_attrs_tuple:
if vertex_attr.size() == 1 and vertex_attr.dataType().name() == "String" and not vertex_attr.isArrayType():
name = vertex_attr.name()
vertex_string_attrs_list.append(name)
vertex_string_attrs_list.sort()
for attr_name in vertex_string_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def vertexVector2(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
vertex_vector2_attrs_list = []
for vertex_attr in vertex_attrs_tuple:
if vertex_attr.size() == 2 and not vertex_attr.isArrayType():
name = vertex_attr.name()
vertex_vector2_attrs_list.append(name)
vertex_vector2_attrs_list.sort()
for attr_name in vertex_vector2_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def vertexVector(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
vertex_vector_attrs_list = []
for vertex_attr in vertex_attrs_tuple:
if vertex_attr.size() == 3 and not vertex_attr.isArrayType():
name = vertex_attr.name()
vertex_vector_attrs_list.append(name)
vertex_vector_attrs_list.sort()
for attr_name in vertex_vector_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def vertexVector4(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
vertex_vector4_attrs_list = []
for vertex_attr in vertex_attrs_tuple:
if vertex_attr.size() == 4 and not vertex_attr.isArrayType():
name = vertex_attr.name()
vertex_vector4_attrs_list.append(name)
vertex_vector4_attrs_list.sort()
for attr_name in vertex_vector4_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def vertexIntArray(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
vertex_int_array_attrs_list = []
for vertex_attr in vertex_attrs_tuple:
if vertex_attr.size() == 1 and vertex_attr.dataType().name() == "Int" and vertex_attr.isArrayType():
name = vertex_attr.name()
vertex_int_array_attrs_list.append(name)
vertex_int_array_attrs_list.sort()
for attr_name in vertex_int_array_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def vertexFloatArray(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
vertex_attrs_tuple = geo.vertexAttribs()
vertex_float_array_attrs_list = []
for vertex_attr in vertex_attrs_tuple:
if vertex_attr.size() == 1 and vertex_attr.dataType().name() == "Float" and vertex_attr.isArrayType():
name = vertex_attr.name()
vertex_float_array_attrs_list.append(name)
vertex_float_array_attrs_list.sort()
for attr_name in vertex_float_array_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
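The typed menus above all share the same size/data-type/array filter. A sketch of that predicate with plain stand-in classes instead of `hou.Attrib` (`FakeAttrib` and `typed_attrib_menu` are hypothetical, written only to mirror the `size()` / `dataType().name()` / `isArrayType()` checks used above):

```python
class _DataType:
    def __init__(self, name):
        self._name = name
    def name(self):
        return self._name

class FakeAttrib:
    # Stand-in mimicking the hou.Attrib accessors used by the menu functions.
    def __init__(self, name, size, type_name, is_array):
        self._name, self._size = name, size
        self._dtype, self._is_array = _DataType(type_name), is_array
    def name(self):
        return self._name
    def size(self):
        return self._size
    def dataType(self):
        return self._dtype
    def isArrayType(self):
        return self._is_array

def typed_attrib_menu(attribs, size, type_name=None, is_array=False):
    # Filter by tuple size, optional data type, and array-ness; sort;
    # then emit each name twice as a token/label pair.
    names = sorted(a.name() for a in attribs
                   if a.size() == size
                   and (type_name is None or a.dataType().name() == type_name)
                   and a.isArrayType() == is_array)
    flat = []
    for n in names:
        flat.append(n)
        flat.append(n)
    return flat

attribs = [FakeAttrib("id", 1, "Int", False),
           FakeAttrib("pscale", 1, "Float", False),
           FakeAttrib("uv", 3, "Float", False)]
print(typed_attrib_menu(attribs, 1, "Float"))  # ['pscale', 'pscale']
```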
#############################################################################
def pointInt(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
point_int_attrs_list = []
for point_attr in point_attrs_tuple:
if point_attr.size() == 1 and point_attr.dataType().name() == "Int" and not point_attr.isArrayType():
name = point_attr.name()
point_int_attrs_list.append(name)
point_int_attrs_list.sort()
for attr_name in point_int_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def pointFloat(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
point_float_attrs_list = []
for point_attr in point_attrs_tuple:
if point_attr.size() == 1 and point_attr.dataType().name() == "Float" and not point_attr.isArrayType():
name = point_attr.name()
point_float_attrs_list.append(name)
point_float_attrs_list.sort()
for attr_name in point_float_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def pointString(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
point_string_attrs_list = []
for point_attr in point_attrs_tuple:
if point_attr.size() == 1 and point_attr.dataType().name() == "String" and not point_attr.isArrayType():
name = point_attr.name()
point_string_attrs_list.append(name)
point_string_attrs_list.sort()
for attr_name in point_string_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def pointVector2(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
point_vector2_attrs_list = []
for point_attr in point_attrs_tuple:
if point_attr.size() == 2 and not point_attr.isArrayType():
name = point_attr.name()
point_vector2_attrs_list.append(name)
point_vector2_attrs_list.sort()
for attr_name in point_vector2_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def pointVector(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
point_vector_attrs_list = []
for point_attr in point_attrs_tuple:
if point_attr.size() == 3 and not point_attr.isArrayType():
name = point_attr.name()
point_vector_attrs_list.append(name)
point_vector_attrs_list.sort()
for attr_name in point_vector_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def pointVector4(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
point_vector4_attrs_list = []
for point_attr in point_attrs_tuple:
if point_attr.size() == 4 and not point_attr.isArrayType():
name = point_attr.name()
point_vector4_attrs_list.append(name)
point_vector4_attrs_list.sort()
for attr_name in point_vector4_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def pointIntArray(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
point_int_array_attrs_list = []
for point_attr in point_attrs_tuple:
if point_attr.size() == 1 and point_attr.dataType().name() == "Int" and point_attr.isArrayType():
name = point_attr.name()
point_int_array_attrs_list.append(name)
point_int_array_attrs_list.sort()
for attr_name in point_int_array_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def pointFloatArray(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
point_attrs_tuple = geo.pointAttribs()
point_float_array_attrs_list = []
for point_attr in point_attrs_tuple:
if point_attr.size() == 1 and point_attr.dataType().name() == "Float" and point_attr.isArrayType():
name = point_attr.name()
point_float_array_attrs_list.append(name)
point_float_array_attrs_list.sort()
for attr_name in point_float_array_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
################################################################
def primInt(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_int_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 1 and prim_attr.dataType().name() == "Int" and not prim_attr.isArrayType():
name = prim_attr.name()
prim_int_attrs_list.append(name)
prim_int_attrs_list.sort()
for attr_name in prim_int_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primFloat(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_float_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 1 and prim_attr.dataType().name() == "Float" and not prim_attr.isArrayType():
name = prim_attr.name()
prim_float_attrs_list.append(name)
prim_float_attrs_list.sort()
for attr_name in prim_float_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primString(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_str_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 1 and prim_attr.dataType().name() == "String" and not prim_attr.isArrayType():
name = prim_attr.name()
prim_str_attrs_list.append(name)
prim_str_attrs_list.sort()
for attr_name in prim_str_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primVector2(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_vector2_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 2 and not prim_attr.isArrayType():
name = prim_attr.name()
prim_vector2_attrs_list.append(name)
prim_vector2_attrs_list.sort()
for attr_name in prim_vector2_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primVector(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_vector_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 3 and not prim_attr.isArrayType():
name = prim_attr.name()
prim_vector_attrs_list.append(name)
prim_vector_attrs_list.sort()
for attr_name in prim_vector_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primVector4(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_vector4_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 4 and not prim_attr.isArrayType():
name = prim_attr.name()
prim_vector4_attrs_list.append(name)
prim_vector4_attrs_list.sort()
for attr_name in prim_vector4_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primIntArray(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
        prim_int_array_attrs_list = []
        for prim_attr in prim_attrs_tuple:
            if prim_attr.size() == 1 and prim_attr.dataType().name() == "Int" and prim_attr.isArrayType():
                name = prim_attr.name()
                prim_int_array_attrs_list.append(name)
        prim_int_array_attrs_list.sort()
        for attr_name in prim_int_array_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primFloatArray(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_float_array_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 1 and prim_attr.dataType().name() == "Float" and prim_attr.isArrayType():
name = prim_attr.name()
prim_float_array_attrs_list.append(name)
prim_float_array_attrs_list.sort()
for attr_name in prim_float_array_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primStringArray(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_string_array_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 1 and prim_attr.dataType().name() == "String" and prim_attr.isArrayType():
name = prim_attr.name()
prim_string_array_attrs_list.append(name)
prim_string_array_attrs_list.sort()
for attr_name in prim_string_array_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def primVectorArray(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
prim_attrs_tuple = geo.primAttribs()
prim_vector_array_attrs_list = []
for prim_attr in prim_attrs_tuple:
if prim_attr.size() == 3 and prim_attr.isArrayType():
name = prim_attr.name()
prim_vector_array_attrs_list.append(name)
prim_vector_array_attrs_list.sort()
for attr_name in prim_vector_array_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
###############################################################
def detailVector(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
global_attrs_tuple = geo.globalAttribs()
global_vector_attrs_list = []
for global_attr in global_attrs_tuple:
if global_attr.size() == 3 and not global_attr.isArrayType():
name = global_attr.name()
global_vector_attrs_list.append(name)
global_vector_attrs_list.sort()
for attr_name in global_vector_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
return final_list
def detailString(node_relative_path):
node = hou.pwd()
input_node = node.node(node_relative_path)
geo = input_node.geometry()
final_list = []
if geo:
global_attrs_tuple = geo.globalAttribs()
global_string_attrs_list = []
for global_attr in global_attrs_tuple:
if global_attr.size() == 1 and global_attr.dataType().name() == "String" and not global_attr.isArrayType():
name = global_attr.name()
global_string_attrs_list.append(name)
global_string_attrs_list.sort()
for attr_name in global_string_attrs_list:
final_list.append(attr_name)
final_list.append(attr_name)
    return final_list

# idc/base/views.py (noracami/curly-guacamole, MIT)
from django.shortcuts import render, redirect
# Create your views here.
def home(request):
return redirect('shift_list')
# Levels.py (clleger/cirkits, MIT)
from kivy.uix.button import Button
from kivy.uix.floatlayout import FloatLayout
from kivy.uix.scatterlayout import ScatterLayout
import boolean
from widgets import BooleanOutput, ToggleInput, Wire, BGate, UGate, NumberDisplay
from kivy.graphics import Line, Color
def create_binary_level(operation, input1_default=False, input2_default=False):
def get_binary_level(widget):
input1 = ToggleInput("Input 0")#, size_hint=(0.2, 0.2), pos_hint={'x':50, 'y':300})
input1.size_hint = (50./960, 175./640)
input1.center = (100, 480)
widget.add_widget(input1)
input1.button.state = 'down' if input1_default else 'normal'
input2 = ToggleInput("Input 1", size_hint=(50./960, 175./640), center=(100, 180))#{'x':70, 'y':125})
# input2.size = (50, 200)
# input2.center = (100, 200)
widget.add_widget(input2)
input2.button.state = 'down' if input2_default else 'normal'
output = BooleanOutput("Output 0")
output.size_hint = (100./960, 120./640)
output.center = (800, 340)
widget.add_widget(output)
op = operation(input1, input2, output)
gate = BGate(op, size_hint=(200./960, 175./640))
gate.center = (400, 360)
widget.add_widget(gate)
with widget.canvas:
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_width = 5.
line_1G = Wire(input1, op, c1, points=(160, 500), width=line_width, joint='round')
line_1G.points += [250, 500]
line_1G.points += [250, 385]
line_1G.points += [350, 385]
c2 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_2G = Wire(input2, op, c2, points=(160, 200), width=line_width, joint='round')
line_2G.points += [250, 200]
line_2G.points += [250, 335]
line_2G.points += [350, 335]
c3 = Color(0.22, 0.22, 0.22)
line_GO = Wire(op, output, c3, points=(575, 360), width=line_width * 1.5)
line_GO.points += [725, 360]
# self.button.state = 'down' if new_val else 'normal'
# input2.set_value(input2_default)
return get_binary_level
get_level_1 = create_binary_level(boolean.INPUT0_BGATE, False, False)
get_level_2 = create_binary_level(boolean.INPUT1_BGATE, False, False)
get_level_3 = create_binary_level(boolean.AND_BGATE, False, False)
get_level_4 = create_binary_level(boolean.OR_BGATE, False, False)
get_level_5 = create_binary_level(boolean.XOR_BGATE, False, False)
get_level_6 = create_binary_level(boolean.XNOR_BGATE, True, False)
get_level_8 = create_binary_level(boolean.NAND_BGATE, True, True)
get_level_9 = create_binary_level(boolean.NOR_BGATE, True, True)
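`create_binary_level` is a closure factory: each `get_level_*` closes over a gate class and default input states. A minimal sketch of the same pattern without Kivy (`make_level` and the lambda gates are hypothetical stand-ins for the `boolean.*_BGATE` classes):

```python
def make_level(gate, a_default=False, b_default=False):
    # Returns a zero-argument level builder, like get_level_1 .. get_level_9;
    # a real builder would create widgets, here we just evaluate the gate.
    def build():
        return gate(a_default, b_default)
    return build

and_level = make_level(lambda a, b: a and b, True, True)
xor_level = make_level(lambda a, b: a != b, True, False)
print(and_level(), xor_level())  # True True
```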
def get_level_C1(widget):
input1_default = True
input2_default = True
input1 = ToggleInput("Input 0") # , size_hint=(0.2, 0.2), pos_hint={'x':50, 'y':300})
input1.size_hint = (50. / 960, 175. / 640)
input1.center = (100, 480)
widget.add_widget(input1)
input1.button.state = 'down' if input1_default else 'normal'
input2 = ToggleInput("Input 1", size_hint=(50. / 960, 175. / 640), center=(100, 180)) # {'x':70, 'y':125})
widget.add_widget(input2)
input2.button.state = 'down' if input2_default else 'normal'
out2 = BooleanOutput("Output 1")
out2.size_hint = (100. / 960, 120. / 640)
out2.center = (800, 340)
widget.add_widget(out2)
op = boolean.AND_BGATE(input1, input2, None)
gate = BGate(op, size_hint=(200. / 960, 175. / 640))
gate.center = (300, 360)
widget.add_widget(gate)
op2 = boolean.INVERTER_UGATE(op, out2)
gate2 = UGate(op2, size_hint=(200. /960, 175./ 640))
gate2.center = (500, 360)
widget.add_widget(gate2)
with widget.canvas:
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_width = 5.
line_1G = Wire(input1, op, c1, points=(150, 500), width=line_width, joint='round')
line_1G.points += [200, 500]
line_1G.points += [200, 385]
line_1G.points += [250, 385]
c2 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_2G = Wire(input2, op, c2, points=(150, 200), width=line_width, joint='round')
line_2G.points += [200, 200]
line_2G.points += [200, 335]
line_2G.points += [250, 335]
c3 = Color(0, 0, 0.22 if not op else 0.66, mode='hsv')
line_GO = Wire(op, op2, c3, points=(415, 360), width=line_width * 1.5)
line_GO.points += [500, 360]
c4 = Color(0, 0, 0.22 if not op2 else 0.66, mode='hsv')
line_GO2 = Wire(op2, out2, c4, points=(650, 360), width=line_width * 1.5)
line_GO2.points += [725, 360]
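Level C1 chains an AND gate into an inverter, i.e. it computes NAND. A plain-Python sketch of that composition (`AND` and `NOT` are stand-ins for the `boolean.AND_BGATE` / `boolean.INVERTER_UGATE` objects):

```python
def AND(a, b):
    return a and b

def NOT(a):
    return not a

# Full truth table of NOT(AND(a, b)): only (True, True) yields False.
nand_rows = [(a, b, NOT(AND(a, b)))
             for a in (False, True) for b in (False, True)]
print(nand_rows)
```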
def get_level_C2(widget):
input1_default = False
input2_default = False
input1 = ToggleInput("Input 0") # , size_hint=(0.2, 0.2), pos_hint={'x':50, 'y':300})
input1.size_hint = (50. / 960, 175. / 640)
input1.center = (100, 480)
widget.add_widget(input1)
input1.button.state = 'down' if input1_default else 'normal'
inv1 = boolean.INVERTER_UGATE(input1, None)
inn1 = UGate(inv1, size_hint=(200./ 960, 175./ 640))
inn1.center = (170, 500)
widget.add_widget(inn1)
input2 = ToggleInput("Input 1", size_hint=(50. / 960, 175. / 640), center=(100, 180)) # {'x':70, 'y':125})
widget.add_widget(input2)
input2.button.state = 'down' if input2_default else 'normal'
inv2 = boolean.INVERTER_UGATE(input2, None)
inn2 = UGate(inv2, size_hint=(200./ 960, 175./ 640))
inn2.center = (170, 200)
widget.add_widget(inn2)
out2 = BooleanOutput("Output 1")
out2.size_hint = (100. / 960, 120. / 640)
out2.center = (800, 340)
widget.add_widget(out2)
op = boolean.AND_BGATE(inn1.op, inn2.op, None)
gate = BGate(op, size_hint=(200. / 960, 175. / 640))
gate.center = (440, 360)
widget.add_widget(gate)
op2 = boolean.INVERTER_UGATE(op, out2)
gate2 = UGate(op2, size_hint=(200. /960, 175./ 640))
gate2.center = (600, 360)
widget.add_widget(gate2)
with widget.canvas:
line_width = 5.
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, inv1, c1, points=(150, 500), width=line_width, joint='round')
line_1G.points += [160, 500]
ci1 = Color(0, 0.9, 0.3 if not inv1 else 0.9, mode='hsv')
line_1G = Wire(inv1, op, ci1, points=(305, 500), width=line_width, joint='round')
line_1G.points += [335, 500]
line_1G.points += [335, 385]
line_1G.points += [375, 385]
c2 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_2G = Wire(input2, inv2, c2, points=(150, 200), width=line_width, joint='round')
line_2G.points += [160, 200]
ci2 = Color(5. / 7, 0.9, 0.3 if not inv2 else 0.9, mode='hsv')
line_2G = Wire(inv2, op, ci2, points=(305, 200), width=line_width, joint='round')
line_2G.points += [335, 200]
line_2G.points += [335, 335]
line_2G.points += [375, 335]
c3 = Color(0, 0, 0.22 if not op else 0.66, mode='hsv')
line_GO = Wire(op, op2, c3, points=(555, 360), width=line_width * 1.5)
line_GO.points += [600, 360]
c4 = Color(0, 0, 0.22 if not op2 else 0.66, mode='hsv')
line_GO2 = Wire(op2, out2, c4, points=(730, 360), width=line_width * 1.5)
line_GO2.points += [750, 360]
def get_level_C3(widget):
input1_default = False
input2_default = False
input1 = ToggleInput("Input 0") # , size_hint=(0.2, 0.2), pos_hint={'x':50, 'y':300})
input1.size_hint = (50. / 960, 175. / 640)
input1.center = (100, 480)
widget.add_widget(input1)
input1.button.state = 'down' if input1_default else 'normal'
inv1 = boolean.INVERTER_UGATE(input1, None)
inn1 = UGate(inv1, size_hint=(200./ 960, 175./ 640))
inn1.center = (170, 500)
widget.add_widget(inn1)
input2 = ToggleInput("Input 1", size_hint=(50. / 960, 175. / 640), center=(100, 180)) # {'x':70, 'y':125})
widget.add_widget(input2)
input2.button.state = 'down' if input2_default else 'normal'
inv2 = boolean.INVERTER_UGATE(input2, None)
inn2 = UGate(inv2, size_hint=(200./ 960, 175./ 640))
inn2.center = (170, 200)
widget.add_widget(inn2)
out2 = BooleanOutput("Output 1")
out2.size_hint = (100. / 960, 120. / 640)
out2.center = (800, 340)
widget.add_widget(out2)
Y = boolean.AND_BGATE(inn1.op, inn2.op, None)
gateY = BGate(Y, size_hint=(200. / 960, 175. / 640))
gateY.center = (440, 500)
widget.add_widget(gateY)
Z = boolean.AND_BGATE(inn1.op, inn2.op, None)
gateZ = BGate(Z, size_hint=(200. / 960, 175. / 640))
gateZ.center = (440, 300)
widget.add_widget(gateZ)
    op2 = boolean.OR_BGATE(Y, Z, out2)
gate2 = BGate(op2, size_hint=(200. /960, 350./ 640))
gate2.center = (600, 360)
widget.add_widget(gate2)
with widget.canvas:
line_width = 5.
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, inv1, c1, points=(150, 500), width=line_width, joint='round')
line_1G.points += [160, 500]
ci1 = Color(0, 0.9, 0.3 if not inv1 else 0.9, mode='hsv')
line_1G = Wire(inv1, Y, ci1, points=(305, 500), width=line_width, joint='round')
line_1G.points += [335, 500]
line_1G.points += [335, 385]
line_1G.points += [375, 385]
c2 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_2G = Wire(input2, inv2, c2, points=(150, 200), width=line_width, joint='round')
line_2G.points += [160, 200]
ci2 = Color(5. / 7, 0.9, 0.3 if not inv2 else 0.9, mode='hsv')
line_2G = Wire(inv2, Y, ci2, points=(305, 200), width=line_width, joint='round')
line_2G.points += [335, 200]
line_2G.points += [335, 335]
line_2G.points += [375, 335]
c3 = Color(0, 0, 0.22 if not Y else 0.66, mode='hsv')
line_GO = Wire(Y, op2, c3, points=(555, 360), width=line_width * 1.5)
line_GO.points += [600, 360]
c4 = Color(0, 0, 0.22 if not op2 else 0.66, mode='hsv')
line_GO2 = Wire(op2, out2, c4, points=(730, 360), width=line_width * 1.5)
line_GO2.points += [750, 360]
def get_level_N1(widget):
input1_default = False
input2_default = False
input3_default = False
input4_default = False
input5_default = False
base_default = True
input1 = ToggleInput("16", size_hint=(25. / 960, 80. / 640), center=(200, 520))
widget.add_widget(input1)
input1.button.state = 'down' if input1_default else 'normal'
input2 = ToggleInput("8", size_hint=(25. / 960, 80. / 640), center=(300, 520))
widget.add_widget(input2)
input2.button.state = 'down' if input2_default else 'normal'
input3 = ToggleInput("4", size_hint=(25. / 960, 80. / 640), center=(400, 520))
widget.add_widget(input3)
input3.button.state = 'down' if input3_default else 'normal'
input4 = ToggleInput("2", size_hint=(25. / 960, 80. / 640), center=(500, 520))
widget.add_widget(input4)
input4.button.state = 'down' if input4_default else 'normal'
input5 = ToggleInput("1", size_hint=(25. / 960, 80. / 640), center=(600, 520), rows=3)
widget.add_widget(input5)
input5.button.state = 'down' if input5_default else 'normal'
display = NumberDisplay(5, "Display", size_hint=(200. / 960, 100. / 640), center=(300, 320), rows=3)
display.display.editable = False
widget.add_widget(display)
base = ToggleInput("bin/dec", size_hint=(25. / 960, 80. / 640), center=(250, 220), rows=3)
widget.add_widget(base)
base.button.state = 'down' if base_default else 'normal'
input5.set_output(display.bits[0])
input4.set_output(display.bits[1])
input3.set_output(display.bits[2])
input2.set_output(display.bits[3])
input1.set_output(display.bits[4])
base.set_output(display.base_in)
with widget.canvas:
line_width = 2.
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, display, c1, points=(163, 460), width=line_width, joint='round')
line_1G.points += [163, 400]
line_1G.points += [265, 400]
line_1G.points += [265, 380]
c1 = Color(0, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_1G = Wire(input2, display, c1, points=(263, 460), width=line_width, joint='round')
line_1G.points += [263, 415]
line_1G.points += [305, 415]
line_1G.points += [305, 380]
c1 = Color(0, 0.9, 0.3 if not input3 else 0.9, mode='hsv')
line_1G = Wire(input3, display, c1, points=(363, 460), width=line_width, joint='round')
line_1G.points += [363, 430]
line_1G.points += [345, 430]
line_1G.points += [345, 380]
c1 = Color(0, 0.9, 0.3 if not input4 else 0.9, mode='hsv')
line_1G = Wire(input4, display, c1, points=(463, 460), width=line_width, joint='round')
line_1G.points += [463, 415]
line_1G.points += [385, 415]
line_1G.points += [385, 380]
c1 = Color(0, 0.9, 0.3 if not input5 else 0.9, mode='hsv')
line_1G = Wire(input5, display, c1, points=(563, 460), width=line_width, joint='round')
line_1G.points += [563, 400]
line_1G.points += [425, 400]
line_1G.points += [425, 380]
c2 = Color(4. / 7, 0.9, 0.3 if not base else 0.9, mode='hsv')
line_2G = Wire(base, display.base_in, c2, points=(213, 260), width=line_width, joint='round')
line_2G.points += [213, 300]
line_2G.points += [243, 300]
def get_level_N2(widget):
base_default = True
output1 = BooleanOutput("16", size_hint=(25. / 960, 80. / 640), center = (200, 140))
widget.add_widget(output1)
output2 = BooleanOutput("8", size_hint=(25. / 960, 80. / 640), center = (300, 140))
widget.add_widget(output2)
output3 = BooleanOutput("4", size_hint=(25. / 960, 80. / 640), center = (400, 140))
widget.add_widget(output3)
output4 = BooleanOutput("2", size_hint=(25. / 960, 80. / 640), center = (500, 140))
widget.add_widget(output4)
output5 = BooleanOutput("1", size_hint=(25. / 960, 80. / 640), center = (600, 140))
widget.add_widget(output5)
display = NumberDisplay(5, "Display", size_hint=(200. / 960, 100. / 640), center=(300, 320), rows=3)
widget.add_widget(display)
base = ToggleInput("bin/dec", size_hint=(25. / 960, 80. / 640), center=(200, 320), rows=3)
widget.add_widget(base)
base.button.state = 'down' if base_default else 'normal'
display.bits[0].set_output(output5)
display.bits[1].set_output(output4)
display.bits[2].set_output(output3)
display.bits[3].set_output(output2)
display.bits[4].set_output(output1)
base.set_output(display.base_in)
with widget.canvas:
line_width = 2.
c1 = Color(0, 0.9, 0.3 if not output1 else 0.9, mode='hsv')
line_1G = Wire(display.bits[4], output1, c1, points=(265, 270), width=line_width, joint='round')
line_1G.points += [265, 250]
line_1G.points += [163, 250]
line_1G.points += [163, 160]
c1 = Color(0, 0.9, 0.3 if not output2 else 0.9, mode='hsv')
line_1G = Wire(display.bits[3], output2, c1, points=(305, 270), width=line_width, joint='round')
line_1G.points += [305, 235]
line_1G.points += [263, 235]
line_1G.points += [263, 160]
c1 = Color(0, 0.9, 0.3 if not output3 else 0.9, mode='hsv')
line_1G = Wire(display.bits[2], output3, c1, points=(345, 270), width=line_width, joint='round')
line_1G.points += [345, 220]
line_1G.points += [363, 220]
line_1G.points += [363, 160]
c1 = Color(0, 0.9, 0.3 if not output4 else 0.9, mode='hsv')
line_1G = Wire(display.bits[1], output4, c1, points=(385, 270), width=line_width, joint='round')
line_1G.points += [385, 235]
line_1G.points += [463, 235]
line_1G.points += [463, 160]
c1 = Color(0, 0.9, 0.3 if not output5 else 0.9, mode='hsv')
line_1G = Wire(display.bits[0], output5, c1, points=(425, 270), width=line_width, joint='round')
line_1G.points += [425, 250]
line_1G.points += [563, 250]
line_1G.points += [563, 160]
c2 = Color(4. / 7, 0.9, 0.3 if not base else 0.9, mode='hsv')
line_2G = Wire(base, display.base_in, c2, points=(243, 300), width=line_width, joint='round')
line_2G.points += [183, 300]
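Level N2 exposes the display's five bits as outputs weighted 16/8/4/2/1, with `display.bits[0]` driving the "1" output. The decimal value the level teaches is the usual positional sum; a minimal stand-alone restatement (the helper name is illustrative, not part of the game code):

```python
def bits_to_decimal(bits):
    """bits[0] is the least-significant bit, mirroring display.bits."""
    return sum(b << i for i, b in enumerate(bits))

assert bits_to_decimal([1, 0, 0, 0, 1]) == 17  # outputs "1" and "16" lit
```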
def get_level_HA1(widget):
base_default = True
input1_default = False
input2_default = False
input1 = ToggleInput("A", size_hint=(25. / 960, 80. / 640), center=(200, 520))
widget.add_widget(input1)
input1.button.state = 'down' if input1_default else 'normal'
input2 = ToggleInput("B", size_hint=(25. / 960, 80. / 640), center=(200, 320))
widget.add_widget(input2)
input2.button.state = 'down' if input2_default else 'normal'
display = NumberDisplay(5, "Display", size_hint=(200. / 960, 100. / 640), center=(600, 400), rows=3)
display.display.editable = False
widget.add_widget(display)
base = ToggleInput("bin/dec", size_hint=(25. / 960, 80. / 640), center=(860, 400), rows=3)
widget.add_widget(base)
base.button.state = 'down' if base_default else 'normal'
carry_op = boolean.AND_BGATE(input1, input2, None)
carry_gate = BGate(carry_op, size_hint=(100. / 960, 80. / 640))
carry_gate.replaceable = True
carry_gate.center = (400, 360)
widget.add_widget(carry_gate)
sum_op = boolean.XOR_BGATE(input1, input2, None)
sum_gate = BGate(sum_op, size_hint=(100. / 960, 80. / 640))
sum_gate.replaceable = True
sum_gate.center = (400, 540)
widget.add_widget(sum_gate)
sum_op.set_output(display.bits[0])
carry_op.set_output(display.bits[1])
base.set_output(display.base_in)
with widget.canvas:
line_width = 2.
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, carry_gate, c1, points=(215, 520), width=line_width, joint='round')
line_1G.points += [350, 520]
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, sum_gate, c1, points=(215, 520), width=line_width, joint='round')
line_1G.points += [255, 520]
line_1G.points += [255, 340]
line_1G.points += [350, 340]
c1 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_1G = Wire(input2, carry_gate, c1, points=(215, 320), width=line_width, joint='round')
line_1G.points += [350, 320]
c1 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_1G = Wire(input2, sum_gate, c1, points=(215, 320), width=line_width, joint='round')
line_1G.points += [235, 320]
line_1G.points += [235, 500]
line_1G.points += [350, 500]
c2 = Color(4. / 7, 0.4, 0.3 if not carry_op else 0.9, mode='hsv')
line_2G = Wire(carry_op, display.bits[0], c2, points=(435, 330), width=line_width, joint='round')
line_2G.points += [485, 330]
line_2G.points += [485, 500]
line_2G.points += [690, 500]
line_2G.points += [690, 480]
c2 = Color(4. / 7, 0.4, 0.3 if not sum_op else 0.9, mode='hsv')
line_2G = Wire(sum_op, display.bits[1], c2, points=(435, 510), width=line_width, joint='round')
line_2G.points += [730, 510]
line_2G.points += [730, 480]
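The half-adder level above wires sum = A XOR B and carry = A AND B into the two display bits. As a quick sanity check independent of the Kivy widgets, the gate pair can be verified over the full truth table (a plain-Python sketch; the function name is illustrative):

```python
def half_adder(a, b):
    """One-bit half adder: returns (sum, carry) = (a XOR b, a AND b)."""
    return a ^ b, a & b

# Exhaustive truth-table check: carry*2 + sum must equal a + b.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        assert c * 2 + s == a + b
```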
def get_level_FA1(widget):
base_default = True
input1_default = False
input2_default = False
input3_default = False
input1 = ToggleInput("A", size_hint=(25. / 960, 80. / 640), center=(100, 520))
widget.add_widget(input1)
input1.button.state = 'down' if input1_default else 'normal'
input2 = ToggleInput("B", size_hint=(25. / 960, 80. / 640), center=(100, 320))
widget.add_widget(input2)
input2.button.state = 'down' if input2_default else 'normal'
input3 = ToggleInput("C_in", size_hint=(25. / 960, 80. / 640), center=(280, 600))
widget.add_widget(input3)
input3.button.state = 'down' if input3_default else 'normal'
display = NumberDisplay(5, "Display", size_hint=(200. / 960, 100. / 640), center=(600, 400), rows=3)
display.display.editable = False
widget.add_widget(display)
base = ToggleInput("bin/dec", size_hint=(25. / 960, 80. / 640), center=(860, 400), rows=3)
widget.add_widget(base)
base.button.state = 'down' if base_default else 'normal'
sum_op = boolean.XOR_BGATE(input1, input2, None)
sum_gate = BGate(sum_op, size_hint=(100. / 960, 80. / 640), center = (225, 540))
sum_gate.replaceable = True
widget.add_widget(sum_gate)
sum_op2 = boolean.XOR_BGATE(sum_op, input3, None)
sum_gate2 = BGate(sum_op2, size_hint=(100. / 960, 80. / 640), center = (450, 540))
sum_gate2.replaceable = True
widget.add_widget(sum_gate2)
carry_op = boolean.AND_BGATE(input1, input2, None)
carry_gate = BGate(carry_op, size_hint=(100. / 960, 80. / 640), center = (225, 360))
carry_gate.replaceable = True
widget.add_widget(carry_gate)
carry_op2 = boolean.AND_BGATE(input3, sum_op, None)
carry_gate2 = BGate(carry_op2, size_hint=(100. / 960, 80. / 640), center = (360, 480))
carry_gate2.replaceable = True
widget.add_widget(carry_gate2)
carry_op3 = boolean.OR_BGATE(carry_op, carry_op2, None)
carry_gate3 = BGate(carry_op3, size_hint=(100. / 960, 80. / 640), center = (470, 360))
carry_gate3.replaceable = True
widget.add_widget(carry_gate3)
sum_op2.set_output(display.bits[0])
carry_op3.set_output(display.bits[1])
carry_out = BooleanOutput("C_out", size_hint=(25. / 960, 80. / 640), center = (800, 240))
carry_op3.set_output(carry_out)
widget.add_widget(carry_out)
base.set_output(display.base_in)
with widget.canvas:
line_width = 2.
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, carry_gate, c1, points=(115, 520), width=line_width, joint='round')
line_1G.points += [175, 520]
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, sum_gate, c1, points=(115, 520), width=line_width, joint='round')
line_1G.points += [155, 520]
line_1G.points += [155, 340]
line_1G.points += [175, 340]
c1 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_1G = Wire(input2, carry_gate, c1, points=(115, 320), width=line_width, joint='round')
line_1G.points += [175, 320]
c1 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_1G = Wire(input2, sum_gate, c1, points=(115, 320), width=line_width, joint='round')
line_1G.points += [135, 320]
line_1G.points += [135, 500]
line_1G.points += [175, 500]
c1 = Color(2. / 7, 0.4, 0.3 if not sum_op else 0.9, mode='hsv')
line_1G = Wire(sum_op, sum_gate2, c1, points=(260, 510), width=line_width, joint='round')
line_1G.points += [330, 510]
line_1G.points += [330, 500]
line_1G.points += [400, 500]
c2 = Color(2. / 7, 0.4, 0.3 if not carry_op2 else 0.9, mode='hsv')
line_2G = Wire(carry_op2, carry_op3, c2, points=(390, 450), width=line_width, joint='round')
line_2G.points += [408, 450]
line_2G.points += [408, 340]
line_2G.points += [430, 340]
c2 = Color(2. / 7, 0.4, 0.3 if not carry_op else 0.9, mode='hsv')
line_2G = Wire(carry_op, carry_op3, c2, points=(265, 330), width=line_width, joint='round')
line_2G.points += [305, 330]
line_2G.points += [305, 320]
line_2G.points += [430, 320]
c2 = Color(4. / 7, 0.4, 0.3 if not carry_op3 else 0.9, mode='hsv')
line_2G = Wire(carry_op3, display.bits[0], c2, points=(505, 330), width=line_width, joint='round')
line_2G.points += [535, 330]
line_2G.points += [535, 490]
line_2G.points += [690, 490]
line_2G.points += [690, 460]
c2 = Color(4. / 7, 0.4, 0.3 if not sum_op2 else 0.9, mode='hsv')
line_2G = Wire(sum_op2, display.bits[1], c2, points=(485, 510), width=line_width, joint='round')
line_2G.points += [730, 510]
line_2G.points += [730, 460]
c1 = Color(2. / 7, 0.4, 0.3 if not sum_op else 0.9, mode='hsv')
line_1G = Wire(sum_op, carry_gate2, c1, points=(260, 510), width=line_width, joint='round')
line_1G.points += [280, 510]
line_1G.points += [280, 440]
line_1G.points += [320, 440]
c3 = Color(2. / 7, 0.9, 0.3 if not input3 else 0.9, mode='hsv')
line_3G = Wire(input3, carry_gate2, c3, points=(265, 580), width=line_width, joint='round')
line_3G.points += [295, 580]
line_3G.points += [295, 460]
line_3G.points += [320, 460]
c3 = Color(2. / 7, 0.9, 0.3 if not input3 else 0.9, mode='hsv')
line_3G = Wire(input3, sum_gate2, c3, points=(265, 580), width=line_width, joint='round')
line_3G.points += [360, 580]
line_3G.points += [360, 520]
line_3G.points += [400, 520]
c2 = Color(4. / 7, 0.4, 0.3 if not carry_op3 else 0.9, mode='hsv')
line_2G = Wire(carry_op3, carry_out, c2, points=(505, 330), width=line_width, joint='round')
line_2G.points += [535, 330]
line_2G.points += [535, 245]
line_2G.points += [750, 245]
c2 = Color(4. / 7, 0.9, 0.3 if not base else 0.9, mode='hsv')
line_2G = Wire(base, display.base_in, c2, points=(790, 365), width=line_width, joint='round')
line_2G.points += [760, 365]
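The full-adder level composes two XOR gates for the sum and an OR over the two AND gates for the carry, i.e. carry_out = (A AND B) OR (C_in AND (A XOR B)). A minimal stand-alone check of that composition (pure Python, no widgets; names mirror the ops above but are illustrative):

```python
def full_adder(a, b, c_in):
    """Gate network from the level: two XORs, two ANDs, one OR."""
    p = a ^ b                      # sum_op  (first XOR)
    s = p ^ c_in                   # sum_op2 (second XOR)
    c_out = (a & b) | (c_in & p)   # carry_op OR carry_op2
    return s, c_out

# carry_out*2 + sum must equal a + b + c_in for all eight input rows.
for a in (0, 1):
    for b in (0, 1):
        for c_in in (0, 1):
            s, c_out = full_adder(a, b, c_in)
            assert c_out * 2 + s == a + b + c_in
```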
def get_level_HA2(widget):
base_default = True
input1_default = False
input2_default = False
input1 = ToggleInput("A", size_hint=(25. / 960, 80. / 640), center=(100, 520))
widget.add_widget(input1)
input1.button.state = 'down' if input1_default else 'normal'
input2 = ToggleInput("B", size_hint=(25. / 960, 80. / 640), center=(100, 320))
widget.add_widget(input2)
input2.button.state = 'down' if input2_default else 'normal'
display = NumberDisplay(5, "Display", size_hint=(200. / 960, 100. / 640), center=(600, 400), rows=3)
display.display.editable = False
widget.add_widget(display)
base = ToggleInput("bin/dec", size_hint=(25. / 960, 80. / 640), center=(860, 400), rows=3)
widget.add_widget(base)
base.button.state = 'down' if base_default else 'normal'
sum_op = boolean.XOR_BGATE(input1, input2, None)
sum_gate = BGate(sum_op, size_hint=(100. / 960, 80. / 640), center = (225, 540))
sum_gate.replaceable = True
widget.add_widget(sum_gate)
carry_op = boolean.AND_BGATE(input1, input2, None)
carry_gate = BGate(carry_op, size_hint=(100. / 960, 80. / 640), center = (225, 360))
carry_gate.replaceable = True
widget.add_widget(carry_gate)
sum_op.set_output(display.bits[0])
carry_op.set_output(display.bits[1])
carry_out = BooleanOutput("C_out", size_hint=(25. / 960, 80. / 640), center = (800, 240))
carry_op.set_output(carry_out)
widget.add_widget(carry_out)
base.set_output(display.base_in)
with widget.canvas:
line_width = 2.
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, carry_gate, c1, points=(115, 520), width=line_width, joint='round')
line_1G.points += [175, 520]
c1 = Color(0, 0.9, 0.3 if not input1 else 0.9, mode='hsv')
line_1G = Wire(input1, sum_gate, c1, points=(115, 520), width=line_width, joint='round')
line_1G.points += [155, 520]
line_1G.points += [155, 340]
line_1G.points += [175, 340]
c1 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_1G = Wire(input2, carry_gate, c1, points=(115, 320), width=line_width, joint='round')
line_1G.points += [175, 320]
c1 = Color(5. / 7, 0.9, 0.3 if not input2 else 0.9, mode='hsv')
line_1G = Wire(input2, sum_gate, c1, points=(115, 320), width=line_width, joint='round')
line_1G.points += [135, 320]
line_1G.points += [135, 500]
line_1G.points += [175, 500]
c1 = Color(2. / 7, 0.4, 0.3 if not sum_op else 0.9, mode='hsv')
line_1G = Wire(sum_op, display.bits[1], c1, points=(260, 510), width=line_width, joint='round')
line_1G.points += [330, 510]
line_1G.points += [730, 510]
line_1G.points += [730, 460]
c2 = Color(4. / 7, 0.4, 0.3 if not carry_op else 0.9, mode='hsv')
line_2G = Wire(carry_op, display.bits[0], c2, points=(265, 330), width=line_width, joint='round')
line_2G.points += [535, 330]
line_2G.points += [535, 490]
line_2G.points += [690, 490]
line_2G.points += [690, 460]
c2 = Color(4. / 7, 0.4, 0.3 if not carry_op else 0.9, mode='hsv')
line_2G = Wire(carry_op, carry_out, c2, points=(265, 330), width=line_width, joint='round')
line_2G.points += [535, 330]
line_2G.points += [535, 245]
line_2G.points += [750, 245]
c2 = Color(4. / 7, 0.9, 0.3 if not base else 0.9, mode='hsv')
line_2G = Wire(base, display.base_in, c2, points=(790, 365), width=line_width, joint='round')
line_2G.points += [760, 365]
# __all__ = (get_level_2, get_level_3)
# --- src/tools_helper/__init__.py (bjoern-hempel/pytorch-classification, MIT) ---
# __init__.py
from .tools import *
# --- bsmedit/__init__.py (tianzhuqiao/bsmedit, MIT) ---
from .version import __version__
def to_byte(xpm):
return [x.encode('utf-8') for x in xpm]
# --- sine/module_name/core.py (SineObama/setup.py, MIT) ---
# encoding: UTF-8
# Insert your code here.
# --- pysts/utils/__init__.py (sdswart/pysts, MIT) ---
from . import utils, modules
# --- general/thesislogger/__init__.py (duennbart/masterthesis_VAE, BSD-3-Clause) ---
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from general.thesislogger.ThesisLogger import *
# --- skmob/models/tests/test_sts_epr.py (vlingenfelter/scikit-mobility, BSD-3-Clause) ---
from ...utils import constants
import pandas as pd
import geopandas as gpd
import numpy as np
import shapely
import pytest
from contextlib import ExitStack
from sklearn.metrics import mean_absolute_error
from ...preprocessing import detection, clustering
from ...models.sts_epr import STS_epr
from ...core.trajectorydataframe import TrajDataFrame
from ...models.markov_diary_generator import MarkovDiaryGenerator
def global_variables():
# tessellation
tess_polygons = [[[7.481, 45.184],
[7.481, 45.216],
[7.526, 45.216],
[7.526, 45.184],
[7.481, 45.184]],
[[7.481, 45.216],
[7.481, 45.247],
[7.526, 45.247],
[7.526, 45.216],
[7.481, 45.216]],
[[7.526, 45.184],
[7.526, 45.216],
[7.571, 45.216],
[7.571, 45.184],
[7.526, 45.184]],
[[7.526, 45.216],
[7.526, 45.247],
[7.571, 45.247],
[7.571, 45.216],
[7.526, 45.216]]]
geom = [shapely.geometry.Polygon(p) for p in tess_polygons]
tessellation = gpd.GeoDataFrame(geometry=geom, crs="EPSG:4326")
tessellation = tessellation.reset_index().rename(columns={"index": constants.TILE_ID})
relevance = np.random.randint(5, 10, size=len(tessellation))
tessellation[constants.RELEVANCE] = relevance
social_graph = [[0,1],[0,2],[0,3],[1,3],[2,4]]
# mobility diary generator
lats_lngs = np.array([[39.978253, 116.3272755],
[40.013819, 116.306532],
[39.878987, 116.1266865],
[40.013819, 116.306532],
[39.97958, 116.313649],
[39.978696, 116.3262205],
[39.98153775, 116.31079],
[39.978161, 116.3272425],
[38.978161, 115.3272425]])
traj = pd.DataFrame(lats_lngs, columns=[constants.LATITUDE, constants.LONGITUDE])
traj[constants.DATETIME] = pd.to_datetime([
'20130101 8:34:04', '20130101 10:34:08', '20130105 10:34:08',
'20130110 12:34:15', '20130101 1:34:28', '20130101 3:34:54',
'20130101 4:34:55', '20130105 5:29:12', '20130115 00:29:12'])
traj[constants.UID] = [1 for _ in range(5)] + [2 for _ in range(3)] + [3]
tdf = TrajDataFrame(traj)
ctdf = clustering.cluster(tdf)
mdg = MarkovDiaryGenerator()
mdg.fit(ctdf, 3, lid='cluster')
return tessellation, social_graph, mdg
tessellation, social_graph, mdg = global_variables()
sts_epr = STS_epr()
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/01/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [social_graph, 'random'])
@pytest.mark.parametrize('n_agents', [1,5])
@pytest.mark.parametrize('rsl', [True, False])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
# First test set: CORRECT arguments, no ERRORS expected (#test: 8)
def test_sts_generate_success(start_date, end_date, spatial_tessellation, diary_generator,
social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
sts_epr = STS_epr()
tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
social_graph=social_graph, diary_generator=diary_generator, n_agents= n_agents, rsl=rsl,
relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
assert isinstance(tdf, TrajDataFrame)
# Second test set: WRONG arguments, expected to FAIL
# test 2.1: wrong n_agents (#test: 3)
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/01/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [social_graph])
@pytest.mark.parametrize('n_agents', [-2,-1,0])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
@pytest.mark.xfail(raises=ValueError)
def test_sts_wrong_n_agents(start_date, end_date, spatial_tessellation, diary_generator,
social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
sts_epr = STS_epr()
tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
social_graph=social_graph, diary_generator=diary_generator, n_agents= n_agents, rsl=rsl,
relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
assert isinstance(tdf, TrajDataFrame)
# test 2.2: end_date prior to start_date (#test: 1)
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/10 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [social_graph])
@pytest.mark.parametrize('n_agents', [5])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
@pytest.mark.xfail(raises=ValueError)
def test_sts_wrong_dates(start_date, end_date, spatial_tessellation, diary_generator,
social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
sts_epr = STS_epr()
tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
social_graph=social_graph, diary_generator=diary_generator, n_agents= n_agents, rsl=rsl,
relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
# test 2.3: wrong rsl type (#test: 3)
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/01/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [social_graph])
@pytest.mark.parametrize('n_agents', [5])
@pytest.mark.parametrize('rsl', [1, None, 'True'])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
@pytest.mark.xfail(raises=TypeError)
def test_sts_wrong_rsl_type(start_date, end_date, spatial_tessellation, diary_generator,
social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
sts_epr = STS_epr()
tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
social_graph=social_graph, diary_generator=diary_generator, n_agents= n_agents, rsl=rsl,
relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
# test 2.4: wrong type for the spatial_tessellation (#test: 5)
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/01/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', ["", None, [], "tessellation", [1,2,3]])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [social_graph])
@pytest.mark.parametrize('n_agents', [5])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
@pytest.mark.xfail(raises=TypeError)
def test_sts_wrong_tex_type(start_date, end_date, spatial_tessellation, diary_generator,
social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
sts_epr = STS_epr()
tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
social_graph=social_graph, diary_generator=diary_generator, n_agents= n_agents, rsl=rsl,
relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
# test 2.5: # of tiles in spatial_tessellation < 3 (#test: 3)
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/01/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [pd.DataFrame(),tessellation[:1],tessellation[:2]])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [social_graph])
@pytest.mark.parametrize('n_agents', [5])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
@pytest.mark.xfail(raises=ValueError)
def test_sts_wrong_tiles_num(start_date, end_date, spatial_tessellation, diary_generator,
social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
sts_epr = STS_epr()
tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
social_graph=social_graph, diary_generator=diary_generator, n_agents= n_agents, rsl=rsl,
relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
# test 2.6: wrong relevance column name (#test: 1)
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/01/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [social_graph])
@pytest.mark.parametrize('n_agents', [5])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['rel'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
@pytest.mark.xfail(raises=IndexError)
def test_sts_wrong_relevance_col_name(start_date, end_date, spatial_tessellation, diary_generator,
social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
sts_epr = STS_epr()
tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
social_graph=social_graph, diary_generator=diary_generator, n_agents= n_agents, rsl=rsl,
relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
# test 2.7: wrong type for the diary_generator (#test: 3)
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/01/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [[],None,pd.DataFrame()])
@pytest.mark.parametrize('social_graph', [social_graph])
@pytest.mark.parametrize('n_agents', [5])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
@pytest.mark.xfail(raises=TypeError)
def test_sts_wrong_diary_generator_type(start_date, end_date, spatial_tessellation, diary_generator,
social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
sts_epr = STS_epr()
tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
social_graph=social_graph, diary_generator=diary_generator, n_agents= n_agents, rsl=rsl,
relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
# Third test set: assert the correctness of the model's functions

def all_equal(a, b, threshold=1e-3):
    return mean_absolute_error(a, b) <= threshold


def correcteness_set_exp(a, b):
    # an exploration must never pick an already visited location (a[i] > 0)
    for i in range(len(b)):
        if a[i] > 0 and b[i] > 0:
            return False
    return True


def correcteness_set_ret(a, b):
    # a return must never pick an unvisited location (a[i] == 0)
    for i in range(len(b)):
        if b[i] > 0 and a[i] == 0:
            return False
    return True


def correcteness_set_exp_social(lva, lvc, choices):
    # a social exploration must pick locations unvisited by the agent (lva)
    # and visited by the contact (lvc)
    for i in range(len(choices)):
        if choices[i] > 0:
            if not (lva[i] == 0 and lvc[i] > 0):
                return False
    return True


def correcteness_set_ret_social(lva, lvc, choices):
    # a social return must pick locations visited by both the agent (lva)
    # and the contact (lvc)
    for i in range(len(choices)):
        if choices[i] > 0:
            if not (lva[i] > 0 and lvc[i] > 0):
                return False
    return True
# test 3.1: correct random_weighted_choice (#test: 1)
@pytest.mark.parametrize('size', [1000])
@pytest.mark.parametrize('n_picks', [int(1e4)])
def test_weighted_random_choice(size, n_picks):
    np.random.seed(24)
    sts_epr = STS_epr()
    weights = np.random.randint(0, 10, size=size)
    theoretical = weights / np.sum(weights)
    empirical = [0]*len(weights)
    for _ in range(n_picks):
        i = sts_epr.random_weighted_choice(weights)
        empirical[i] += 1
    empirical = empirical / np.sum(empirical)
    assert all_equal(theoretical, empirical)
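The empirical-vs-theoretical check used in test 3.1 can be reproduced in isolation. The sketch below uses a hypothetical `weighted_choice` as a stand-in for `STS_epr.random_weighted_choice` (the library class is not redefined here): draw a large sample from a weighted distribution and verify the observed frequencies match the normalized weights within a small mean absolute error.

```python
import numpy as np

def weighted_choice(weights, rng):
    # hypothetical stand-in for STS_epr.random_weighted_choice
    probs = np.asarray(weights, dtype=float)
    return rng.choice(len(probs), p=probs / probs.sum())

rng = np.random.default_rng(24)
weights = rng.integers(1, 10, size=50)
theoretical = weights / weights.sum()

counts = np.zeros(len(weights))
for _ in range(100_000):
    counts[weighted_choice(weights, rng)] += 1
empirical = counts / counts.sum()

# with 100k picks the two distributions agree within all_equal's default threshold
mae = np.abs(theoretical - empirical).mean()
assert mae <= 1e-3
```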
# test 3.2: correct exploration choices (#test: 1)
# create a fake location vector of size n for the agent A (id=0)
# m elements = 0 and j elements > 0, m+j=n
# EXPLORATION
@pytest.mark.parametrize('m', [3])
@pytest.mark.parametrize('j', [1])
@pytest.mark.parametrize('n_picks', [int(1e4)])
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/03/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', ['random'])
@pytest.mark.parametrize('n_agents', [2])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
def test_correctness_exp(m, j, n_picks, start_date, end_date, spatial_tessellation, diary_generator,
                         social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
    sts_epr = STS_epr()
    tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
                           social_graph=social_graph, diary_generator=diary_generator, n_agents=n_agents, rsl=rsl,
                           relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
    np.random.seed(random_state)

    # create a fake location vector of size n for the agent A (id=0)
    # m elements = 0 and j elements > 0, m+j=n
    location_vector = [0]*m + list(np.random.randint(5, 10, size=j-1))
    np.random.shuffle(location_vector)
    # diary constraint = {home, last_location};
    # fix, as an example, both home and current location to location 0
    location_vector = [5] + location_vector
    choices = [0]*len(location_vector)
    # assign this location vector to the agent with id=0
    sts_epr.agents[0]['location_vector'] = np.array(location_vector)
    # assign the home location (location 0)
    sts_epr.agents[0]['home_location'] = 0
    # assign the current location (location 0)
    sts_epr.agents[0]['current_location'] = 0

    sts_epr.compute_od_row(0)
    v_dist = np.array((sts_epr.distance_matrix[0].todense())[0])[0]
    v_rel = sts_epr.relevances

    for _ in range(n_picks):
        location_id = sts_epr.make_individual_exploration_action(0)
        choices[location_id] += 1

    # test 1: correctness of the choices; i.e., no location i s.t. lv[i] > 0 is picked
    res_1 = correcteness_set_exp(location_vector, choices)

    # test 2: correct probabilities
    empirical = choices / np.sum(choices)
    # avoid division by 0 for the distance of location 0 from itself
    v_dist[0] = 1
    theoretical = np.array(1 / v_dist**2) * v_rel * v_rel[0]
    theoretical[0] = 0
    theoretical = theoretical / np.sum(theoretical)
    res_2 = all_equal(theoretical, empirical, threshold=1e-2)

    assert (res_1, res_2) == (True, True)
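The theoretical distribution checked in test 3.2 follows a gravity-style rule: an unvisited location is weighted by its relevance times the current location's relevance, divided by the squared distance. A minimal standalone sketch with made-up relevances and distances (the array names are illustrative, not the library's API):

```python
import numpy as np

rel = np.array([5.0, 1.0, 2.0, 4.0])    # location relevances (made up)
dist = np.array([1.0, 2.0, 3.0, 4.0])   # distances from the current location (index 0)
current = 0

scores = rel * rel[current] / dist**2
scores[current] = 0.0                   # the current location cannot be "explored"
probs = scores / scores.sum()           # normalized exploration probabilities
```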
# test 3.3: correct return choices (#test: 1)
# create a fake location vector of size n for the agent A (id=0)
# m elements = 0 and j elements > 0, m+j=n
# RETURN
@pytest.mark.parametrize('m', [100])
@pytest.mark.parametrize('j', [500])
@pytest.mark.parametrize('n_picks', [int(1e4)])
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/03/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', ['random'])
@pytest.mark.parametrize('n_agents', [2])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
def test_correctness_ret(m, j, n_picks, start_date, end_date, spatial_tessellation, diary_generator,
                         social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
    sts_epr = STS_epr()
    tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
                           social_graph=social_graph, diary_generator=diary_generator, n_agents=n_agents, rsl=rsl,
                           relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
    np.random.seed(random_state)

    # create a fake location vector of size n for the agent A (id=0)
    # m elements = 0 and j elements > 0, m+j=n
    location_vector = [0]*m + list(np.random.randint(5, 10, size=j-2))
    np.random.shuffle(location_vector)
    # diary constraint = {home, last_location}; fix, as an example, locations 0 and 1
    location_vector = [4, 1] + location_vector
    choices = [0]*len(location_vector)
    # assign this location vector to the agent with id=0
    sts_epr.agents[0]['location_vector'] = np.array(location_vector)
    # assign the home location (location 0)
    sts_epr.agents[0]['home_location'] = 0
    # assign the current location (location 1)
    sts_epr.agents[0]['current_location'] = 1

    for _ in range(n_picks):
        location_id = sts_epr.make_individual_return_action(0)
        choices[location_id] += 1

    # test 1: correctness of the choices; i.e., no location i s.t. lv[i] = 0 is picked;
    # we can mark the home and current location as unvisited to prove
    # the correctness of the choices
    location_vector[0] = 0
    location_vector[1] = 0
    res_1 = correcteness_set_ret(location_vector, choices)

    # test 2: correct probabilities
    empirical = choices / np.sum(choices)
    theoretical = location_vector / np.sum(location_vector)
    res_2 = all_equal(theoretical, empirical)

    assert (res_1, res_2) == (True, True)
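Test 3.3 checks preferential return: a previously visited location is re-picked with probability proportional to its visit count. The same property can be sketched standalone with an illustrative visit-count vector (names made up, not the library's API):

```python
import numpy as np

rng = np.random.default_rng(2)
visits = np.array([0, 7, 3, 0, 10])          # per-location visit counts (made up)
theoretical = visits / visits.sum()

picks = rng.choice(len(visits), size=1_000_000, p=theoretical)
empirical = np.bincount(picks, minlength=len(visits)) / len(picks)

# unvisited locations are never picked, and frequencies track visit counts
assert empirical[0] == 0 and empirical[3] == 0
assert np.abs(empirical - theoretical).mean() <= 1e-3
```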
# test 3.4: correct exp social choices (#test: 1)
# create a fake location vector of size n for the agent A (id=0) and agent C (id=1)
# agent A and C are connected in the social graph
# m elements = 0 and j elements > 0, m+j=n
# EXP SOCIAL
@pytest.mark.parametrize('m', [100])
@pytest.mark.parametrize('j', [500])
@pytest.mark.parametrize('n_picks', [int(1e4)])
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/02/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [[(0,1)]])
@pytest.mark.parametrize('n_agents', [2])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
def test_correctness_exp_social(m, j, n_picks, start_date, end_date, spatial_tessellation, diary_generator,
                                social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
    sts_epr = STS_epr()
    tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
                           social_graph=social_graph, diary_generator=diary_generator, n_agents=n_agents, rsl=rsl,
                           relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
    np.random.seed(random_state)

    # agent A (id=0)
    location_vector_a = [0]*(m-2) + list(np.random.randint(5, 10, size=j-2))
    np.random.shuffle(location_vector_a)
    location_vector_a = [5, 2] + location_vector_a + [0]*2
    choices = [0]*len(location_vector_a)
    # assign the home location (location 0)
    sts_epr.agents[0]['home_location'] = 0
    # assign the current location (location 1)
    sts_epr.agents[0]['current_location'] = 1
    # assign this location vector to the agent with id=0
    sts_epr.agents[0]['location_vector'] = np.array(location_vector_a)

    # agent C (id=1)
    location_vector_c = [0]*m + list(np.random.randint(5, 10, size=j-2))
    np.random.shuffle(location_vector_c)
    location_vector_c = location_vector_c + [5, 3]
    # assign this location vector to the agent with id=1
    sts_epr.agents[1]['location_vector'] = np.array(location_vector_c)

    for _ in range(n_picks):
        location_id = sts_epr.make_social_action(0, 'exploration')
        choices[location_id] += 1

    # test 1: correctness of the choices
    res_1 = correcteness_set_exp_social(location_vector_a, location_vector_c, choices)

    # test 2: correct probabilities
    empirical = choices / np.sum(choices)
    set_c = [location_vector_c[i] if (location_vector_a[i] == 0 and location_vector_c[i] > 0) else 0
             for i in range(len(location_vector_a))]
    theoretical = set_c / np.sum(set_c)
    res_2 = all_equal(theoretical, empirical)

    assert (res_1, res_2) == (True, True)
# test 3.5: correct ret social choices (#test: 1)
# create a fake location vector of size n for the agent A (id=0) and agent C (id=1)
# agent A and C are connected in the social graph
# m elements = 0 and j elements > 0, m+j=n
# RET SOCIAL
@pytest.mark.parametrize('m', [100])
@pytest.mark.parametrize('j', [500])
@pytest.mark.parametrize('n_picks', [int(1e4)])
@pytest.mark.parametrize('start_date', [pd.to_datetime('2020/01/01 08:00:00')])
@pytest.mark.parametrize('end_date', [pd.to_datetime('2020/02/10 08:00:00')])
@pytest.mark.parametrize('spatial_tessellation', [tessellation])
@pytest.mark.parametrize('diary_generator', [mdg])
@pytest.mark.parametrize('social_graph', [[(0,1)]])
@pytest.mark.parametrize('n_agents', [2])
@pytest.mark.parametrize('rsl', [True])
@pytest.mark.parametrize('relevance_column',['relevance'])
@pytest.mark.parametrize('random_state', [2])
@pytest.mark.parametrize('show_progress', [True])
def test_correctness_ret_social(m, j, n_picks, start_date, end_date, spatial_tessellation, diary_generator,
                                social_graph, n_agents, rsl, relevance_column, random_state, show_progress):
    sts_epr = STS_epr()
    tdf = sts_epr.generate(start_date=start_date, end_date=end_date, spatial_tessellation=spatial_tessellation,
                           social_graph=social_graph, diary_generator=diary_generator, n_agents=n_agents, rsl=rsl,
                           relevance_column=relevance_column, random_state=random_state, show_progress=show_progress)
    np.random.seed(random_state)

    # agent A (id=0)
    location_vector_a = [0]*(m-2) + list(np.random.randint(5, 10, size=j-2))
    np.random.shuffle(location_vector_a)
    location_vector_a = [5, 2] + location_vector_a + [0]*2
    choices = [0]*len(location_vector_a)
    # assign the home location (location 0)
    sts_epr.agents[0]['home_location'] = 0
    # assign the current location (location 1)
    sts_epr.agents[0]['current_location'] = 1
    # assign this location vector to the agent with id=0
    sts_epr.agents[0]['location_vector'] = np.array(location_vector_a)

    # agent C (id=1)
    location_vector_c = [0]*m + list(np.random.randint(5, 10, size=j-2))
    np.random.shuffle(location_vector_c)
    location_vector_c = location_vector_c + [5, 3]
    # assign this location vector to the agent with id=1
    sts_epr.agents[1]['location_vector'] = np.array(location_vector_c)

    for _ in range(n_picks):
        location_id = sts_epr.make_social_action(0, 'return')
        choices[location_id] += 1

    # we can mark the home and current location as unvisited to prove
    # the correctness of the choices
    location_vector_a[0] = 0
    location_vector_a[1] = 0
    # test 1: correctness of the choices
    res_1 = correcteness_set_ret_social(location_vector_a, location_vector_c, choices)

    # test 2: correct probabilities
    empirical = choices / np.sum(choices)
    set_c = [location_vector_c[i] if (location_vector_a[i] > 0 and location_vector_c[i] > 0) else 0
             for i in range(len(location_vector_a))]
    theoretical = set_c / np.sum(set_c)
    res_2 = all_equal(theoretical, empirical)

    assert (res_1, res_2) == (True, True)
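Tests 3.4 and 3.5 verify the candidate sets for social actions: for a social exploration, agent A may only pick locations its contact C has visited and A has not; for a social return, both must have visited the location, with pick probability proportional to C's visit counts. With illustrative visit-count vectors (made up, not taken from the library), both sets can be built directly:

```python
import numpy as np

lv_a = np.array([5, 2, 0, 0, 3])   # agent A's visit counts (made up)
lv_c = np.array([0, 4, 6, 0, 2])   # contact C's visit counts (made up)

# social exploration: locations C has visited but A has not
exp_set = np.where((lv_a == 0) & (lv_c > 0), lv_c, 0)
# social return: locations both A and C have visited
ret_set = np.where((lv_a > 0) & (lv_c > 0), lv_c, 0)

exp_probs = exp_set / exp_set.sum()    # → only index 2 is a candidate here
ret_probs = ret_set / ret_set.sum()    # → indices 1 and 4, weighted 4:2
```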
| 40.425039 | 118 | 0.689849 | 3,647 | 26,155 | 4.727173 | 0.073759 | 0.081787 | 0.163225 | 0.022274 | 0.850406 | 0.840255 | 0.833585 | 0.829872 | 0.8279 | 0.817343 | 0 | 0.051995 | 0.174957 | 26,155 | 647 | 119 | 40.425039 | 0.74693 | 0.107819 | 0 | 0.690176 | 0 | 0 | 0.104722 | 0 | 0 | 0 | 0 | 0 | 0.017632 | 1 | 0.047859 | false | 0 | 0.030227 | 0.002519 | 0.103275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ba3e62ce473d45a922fa62721e10121231cb803b | 158 | py | Python | neurokit2/hrv/__init__.py | vansjyo/NeuroKit | 238cd3d89467f7922c68a3a4c1f44806a8466922 | [
"MIT"
] | null | null | null | neurokit2/hrv/__init__.py | vansjyo/NeuroKit | 238cd3d89467f7922c68a3a4c1f44806a8466922 | [
"MIT"
] | null | null | null | neurokit2/hrv/__init__.py | vansjyo/NeuroKit | 238cd3d89467f7922c68a3a4c1f44806a8466922 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from .hrv_time import hrv_time
from .hrv_frequency import hrv_frequency
from .hrv_nonlinear import hrv_nonlinear
from .hrv import hrv
| 26.333333 | 40 | 0.78481 | 25 | 158 | 4.72 | 0.36 | 0.237288 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007299 | 0.132911 | 158 | 5 | 41 | 31.6 | 0.854015 | 0.132911 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ba42773a265533baef05e230b6026e6f6168b819 | 69 | py | Python | Python/tests/base_test.py | kouassives/kyops | 2f4aa2d9eed6b608f28e70c0bb53a15bfc4c5d36 | [
"MIT"
] | null | null | null | Python/tests/base_test.py | kouassives/kyops | 2f4aa2d9eed6b608f28e70c0bb53a15bfc4c5d36 | [
"MIT"
] | null | null | null | Python/tests/base_test.py | kouassives/kyops | 2f4aa2d9eed6b608f28e70c0bb53a15bfc4c5d36 | [
"MIT"
] | null | null | null | from unittest import TestCase
class BaseTestCase(TestCase):
    pass
ba81260e4f0c7626d913ad99355ba161c6533011 | 139 | py | Python | python/metadata_guardian/__init__.py | fvaleye/metadata-guardian | ab5d6cada67785c3cfd98112f68e8fdb193d1617 | [
"Apache-2.0"
] | 9 | 2021-12-31T20:32:35.000Z | 2022-02-18T17:51:49.000Z | python/metadata_guardian/__init__.py | fvaleye/metadata-guardian | ab5d6cada67785c3cfd98112f68e8fdb193d1617 | [
"Apache-2.0"
] | 1 | 2022-02-25T16:35:04.000Z | 2022-02-28T21:08:53.000Z | python/metadata_guardian/__init__.py | fvaleye/metadata-guardian | ab5d6cada67785c3cfd98112f68e8fdb193d1617 | [
"Apache-2.0"
] | null | null | null | from .conf import *
from .data_rules import *
from .exceptions import *
from .report import *
from .scanner import *
from .source import *
| 19.857143 | 25 | 0.741007 | 19 | 139 | 5.368421 | 0.473684 | 0.490196 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172662 | 139 | 6 | 26 | 23.166667 | 0.886957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
baa9f2ce92342e59633d649eb8e7ca5905ff9476 | 40 | py | Python | shamanai/env/flappybirds/gym_foo/envs/__init__.py | adaptationio/Shaman-RL | 548fa847e6ba2105cc0a876b02db3f3d7c179c54 | [
"MIT"
] | 2 | 2020-06-13T04:38:08.000Z | 2022-03-22T08:38:10.000Z | shamanai/env/flappybirds/gym_foo/envs/__init__.py | adaptationio/Shaman-RL | 548fa847e6ba2105cc0a876b02db3f3d7c179c54 | [
"MIT"
] | 1 | 2020-11-13T17:46:38.000Z | 2020-11-13T17:46:38.000Z | shamanai/env/flappybirds/gym_foo/envs/__init__.py | adaptationio/Shaman-AI | 548fa847e6ba2105cc0a876b02db3f3d7c179c54 | [
"MIT"
] | null | null | null | from gym_foo.envs.foo_env import FooEnv
| 20 | 39 | 0.85 | 8 | 40 | 4 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 40 | 1 | 40 | 40 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bab74d81fc9e760485e3790f65eef6d1880ea858 | 45 | py | Python | src/samplemanager/__init__.py | cowanml/samplemanager | c802559d84b8470255edbe593996db58d86ec914 | [
"BSD-3-Clause"
] | null | null | null | src/samplemanager/__init__.py | cowanml/samplemanager | c802559d84b8470255edbe593996db58d86ec914 | [
"BSD-3-Clause"
] | 2 | 2015-02-25T20:58:52.000Z | 2015-02-25T20:59:31.000Z | src/samplemanager/__init__.py | cowanml/samplemanager | c802559d84b8470255edbe593996db58d86ec914 | [
"BSD-3-Clause"
] | null | null | null | from . import conf, commands, odm_templates
| 15 | 43 | 0.777778 | 6 | 45 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155556 | 45 | 2 | 44 | 22.5 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
baf8a84f5e9b331aff9ed4a18b51721cc4ea6bbb | 1,244 | py | Python | natural_selection/genetic_programs/operators/initialisation.py | Zipfian-Science/natural-selection | 5bf04142a73f39a83e86ad0eb53ba0fecb365864 | [
"Apache-2.0"
] | null | null | null | natural_selection/genetic_programs/operators/initialisation.py | Zipfian-Science/natural-selection | 5bf04142a73f39a83e86ad0eb53ba0fecb365864 | [
"Apache-2.0"
] | 1 | 2021-02-26T10:10:43.000Z | 2021-02-26T10:10:43.000Z | natural_selection/genetic_programs/operators/initialisation.py | Zipfian-Science/natural-selection | 5bf04142a73f39a83e86ad0eb53ba0fecb365864 | [
"Apache-2.0"
] | null | null | null | def initialise_population_full_method(adam, n : int, island=None):
"""
Classic random initialisation function to create a pool of `n` programs from a starting GeneticProgram `adam`.
This method grows a population of programs each with node trees that extend to `max_depth` for the whole tree.
Args:
adam (GeneticProgram): A genetic program already initialised with a node tree.
n (int): Population size.
island (Island): Needed to wrap to `create_node` and `create_genetic_program` methods.
Returns:
list: Population members.
"""
raise NotImplementedError('Coming sometime soon')
def initialise_population_grow_method(adam, n : int, island=None):
"""
Classic random initialisation function to create a pool of `n` programs from a starting GeneticProgram `adam`.
Node trees are grown from the root but not necessarily to `max_depth`.
Args:
adam (GeneticProgram): A genetic program already initialised with a node tree.
n (int): Population size.
island (Island): Needed to wrap to `create_node` and `create_genetic_program` methods.
Returns:
list: Population members.
"""
raise NotImplementedError('Coming sometime soon') | 42.896552 | 114 | 0.709807 | 161 | 1,244 | 5.397516 | 0.385093 | 0.018412 | 0.052934 | 0.032221 | 0.773303 | 0.773303 | 0.773303 | 0.773303 | 0.773303 | 0.773303 | 0 | 0 | 0.217846 | 1,244 | 29 | 115 | 42.896552 | 0.893114 | 0.727492 | 0 | 0.5 | 0 | 0 | 0.158103 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
79f4b146b493e6b7e1f72c416af99ffb0936a737 | 396 | py | Python | scripts/treasury.py | PyramiDAO/market-sim | 2b0461466da6e246de22d60a5c15addfe47d80f9 | [
"MIT"
] | null | null | null | scripts/treasury.py | PyramiDAO/market-sim | 2b0461466da6e246de22d60a5c15addfe47d80f9 | [
"MIT"
] | null | null | null | scripts/treasury.py | PyramiDAO/market-sim | 2b0461466da6e246de22d60a5c15addfe47d80f9 | [
"MIT"
] | 1 | 2022-02-05T06:33:19.000Z | 2022-02-05T06:33:19.000Z |
import multiprocessing as mp
class Treasury:
    def __init__(self, balance):
        # TODO add Vault/Strategy class
        self.balance = balance
        # initialize a multiprocessing Queue to hold the simulated results across different processes
        self.simulation_result_queue = mp.Queue()

    def get_simulation_result_queue(self):
        return self.simulation_result_queue
030cbc7e7934eca006238f64cb1c8e9ae02db67c | 33 | py | Python | src/core/__init__.py | SuperstonkQuants/q1_correlations | 2836ddd79be2b289f77607aade9345cfa18e743c | [
"MIT"
] | 4 | 2021-06-07T16:54:52.000Z | 2021-06-13T17:57:17.000Z | src/core/__init__.py | HomeDepotHanksApeGang/q1_correlations | 2836ddd79be2b289f77607aade9345cfa18e743c | [
"MIT"
] | 1 | 2021-06-07T17:34:05.000Z | 2021-06-07T17:34:05.000Z | src/core/__init__.py | HomeDepotHanksApeGang/q1_correlations | 2836ddd79be2b289f77607aade9345cfa18e743c | [
"MIT"
] | 1 | 2021-06-07T17:22:30.000Z | 2021-06-07T17:22:30.000Z | from src.core.Stock import Stock
| 16.5 | 32 | 0.818182 | 6 | 33 | 4.5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 33 | 1 | 33 | 33 | 0.931034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0361e64da601101156b11bd2918a48fa3d02f5af | 234 | py | Python | data/NullHubLogger.py | penguix0/lego-hub-tk | 7a8a26baf0b17400b89eccc5a6374f8d9daed32a | [
"MIT"
] | 16 | 2021-02-17T01:59:39.000Z | 2022-03-29T05:10:12.000Z | data/NullHubLogger.py | penguix0/lego-hub-tk | 7a8a26baf0b17400b89eccc5a6374f8d9daed32a | [
"MIT"
] | 15 | 2021-04-20T04:01:36.000Z | 2022-02-01T02:46:30.000Z | data/NullHubLogger.py | penguix0/lego-hub-tk | 7a8a26baf0b17400b89eccc5a6374f8d9daed32a | [
"MIT"
] | 9 | 2021-04-18T20:29:21.000Z | 2022-03-31T11:50:04.000Z | from data.HubLogger import HubLogger
class NullHubLogger(HubLogger):
    def program_runstatus_update(self, timestamp, program_id, is_running):
        pass

    def telemetry_update(self, timestamp, message, hubstatus):
        pass
03766ddc2e3c4d3a65134c5ebdf15ab08672ef6a | 186 | py | Python | yatsim_dashboard/admin.py | ucanyiit/YATSIM | c6c758e3e28086ee1902ed33c0972a29c247f9d1 | [
"MIT"
] | null | null | null | yatsim_dashboard/admin.py | ucanyiit/YATSIM | c6c758e3e28086ee1902ed33c0972a29c247f9d1 | [
"MIT"
] | 2 | 2022-03-05T09:59:41.000Z | 2022-03-05T09:59:50.000Z | yatsim_dashboard/admin.py | ucanyiit/YATSIM | c6c758e3e28086ee1902ed33c0972a29c247f9d1 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Cell, Room, Train, Wagon
admin.site.register(Room)
admin.site.register(Cell)
admin.site.register(Train)
admin.site.register(Wagon)
| 20.666667 | 44 | 0.795699 | 28 | 186 | 5.285714 | 0.428571 | 0.243243 | 0.459459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091398 | 186 | 8 | 45 | 23.25 | 0.87574 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
ceea8fb4a4eb1e42532a2ce3819aa83904066b0a | 99 | py | Python | backend/shrunk/util/__init__.py | kevinmonisit/shrunk | 55106356735c3491f8c8c0774f5ae500ba1c970a | [
"MIT"
] | 13 | 2015-05-08T00:26:23.000Z | 2021-07-28T15:42:10.000Z | backend/shrunk/util/__init__.py | kevinmonisit/shrunk | 55106356735c3491f8c8c0774f5ae500ba1c970a | [
"MIT"
] | 68 | 2015-01-12T20:27:44.000Z | 2021-05-17T19:08:05.000Z | backend/shrunk/util/__init__.py | kevinmonisit/shrunk | 55106356735c3491f8c8c0774f5ae500ba1c970a | [
"MIT"
] | 7 | 2015-08-05T20:31:20.000Z | 2022-01-28T21:14:06.000Z | from . import decorators, ldap, stats, string
__all__ = ['decorators', 'ldap', 'stats', 'string']
| 24.75 | 51 | 0.676768 | 11 | 99 | 5.727273 | 0.636364 | 0.444444 | 0.603175 | 0.793651 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141414 | 99 | 3 | 52 | 33 | 0.741176 | 0 | 0 | 0 | 0 | 0 | 0.252525 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
cef8a4ead89b759503b7c126f2d76ec139d65524 | 38 | py | Python | ccbonfire/backbone/__init__.py | zcuncun/Bonfire | 47640f16dac840b68c0dd91573e89b772b126344 | [
"MIT"
] | 1 | 2022-02-06T14:10:14.000Z | 2022-02-06T14:10:14.000Z | ccbonfire/backbone/__init__.py | zcuncun/Bonfire | 47640f16dac840b68c0dd91573e89b772b126344 | [
"MIT"
] | null | null | null | ccbonfire/backbone/__init__.py | zcuncun/Bonfire | 47640f16dac840b68c0dd91573e89b772b126344 | [
"MIT"
] | null | null | null | from .toy_backbone import ToyBackbone
| 19 | 37 | 0.868421 | 5 | 38 | 6.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 38 | 1 | 38 | 38 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
300b7c36b210b9f6fa539aeb3c9b465bb61b1de7 | 169 | py | Python | apex/contrib/bottleneck/__init__.py | gilshm/apex | 3d61e4a8145a10ed3a86391dd85e3f65b226517b | [
"BSD-3-Clause"
] | null | null | null | apex/contrib/bottleneck/__init__.py | gilshm/apex | 3d61e4a8145a10ed3a86391dd85e3f65b226517b | [
"BSD-3-Clause"
] | null | null | null | apex/contrib/bottleneck/__init__.py | gilshm/apex | 3d61e4a8145a10ed3a86391dd85e3f65b226517b | [
"BSD-3-Clause"
] | 1 | 2021-12-20T00:49:01.000Z | 2021-12-20T00:49:01.000Z | from .bottleneck import Bottleneck, SpatialBottleneck
from .halo_exchangers import HaloExchangerNoComm, HaloExchangerAllGather, HaloExchangerSendRecv, HaloExchangerPeer
| 56.333333 | 114 | 0.893491 | 13 | 169 | 11.538462 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071006 | 169 | 2 | 115 | 84.5 | 0.955414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
307a5e66f9f20e0f5137fe4d08ef93721c864927 | 467 | py | Python | dbtlint/exceptions.py | boxysean/dbtlint | a626c2895e6063655144ebf23eb99ef2b7b5442b | [
"Apache-2.0"
] | 1 | 2021-08-12T08:36:59.000Z | 2021-08-12T08:36:59.000Z | dbtlint/exceptions.py | boxysean/dbtlint | a626c2895e6063655144ebf23eb99ef2b7b5442b | [
"Apache-2.0"
] | null | null | null | dbtlint/exceptions.py | boxysean/dbtlint | a626c2895e6063655144ebf23eb99ef2b7b5442b | [
"Apache-2.0"
] | null | null | null | class RuntimeException(Exception):
    pass


class JinjaTemplateError(RuntimeException):
    def __init__(self, message):
        self.message = message

    def __str__(self):
        return f"JinjaTemplateError: {self.message}"


class SqlLintError(RuntimeException):
    def __init__(self, message, temp_file_path):
        self.message = message
        self.temp_file_path = temp_file_path

    def __str__(self):
        return f"SqlLintError: {self.message}"
| 23.35 | 52 | 0.698073 | 50 | 467 | 6.08 | 0.32 | 0.217105 | 0.118421 | 0.177632 | 0.335526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.211991 | 467 | 19 | 53 | 24.578947 | 0.826087 | 0 | 0 | 0.307692 | 0 | 0 | 0.132762 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.307692 | false | 0.076923 | 0 | 0.153846 | 0.692308 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
062530ca7dc59491f534942e67d0d33f285df15b | 5,289 | py | Python | tests/test_signal_analog_cli.py | agilefall/signal_analog | 4ab4f9f0bff4a1067b1981219c194142d09a43c6 | [
"BSD-3-Clause"
] | null | null | null | tests/test_signal_analog_cli.py | agilefall/signal_analog | 4ab4f9f0bff4a1067b1981219c194142d09a43c6 | [
"BSD-3-Clause"
] | null | null | null | tests/test_signal_analog_cli.py | agilefall/signal_analog | 4ab4f9f0bff4a1067b1981219c194142d09a43c6 | [
"BSD-3-Clause"
] | null | null | null | import betamax
import click
import requests
from betamax_serializers import pretty_json
from click.testing import CliRunner
from signal_analog.charts import TimeSeriesChart
from signal_analog.cli import CliBuilder
from signal_analog.dashboards import Dashboard
from signal_analog.flow import Data
# Global config. This will store all recorded requests in the 'mocks' dir
with betamax.Betamax.configure() as config:
    betamax.Betamax.register_serializer(pretty_json.PrettyJSONSerializer)
    config.cassette_library_dir = 'tests/mocks'

# Don't get in the habit of doing this, but it simplifies testing
global_session = requests.Session()
global_recorder = betamax.Betamax(global_session)
def test_cli_create_success():
    program = Data('cpu.utilization').publish()
    chart = TimeSeriesChart().with_name('lol').with_program(program)

    with global_recorder.use_cassette('cli_create_success',
                                      serialize_with='prettyjson'):
        dashboard = Dashboard(session=global_session)\
            .with_name('testy mctesterson')\
            .with_charts(chart)

        cli = CliBuilder().with_resources(dashboard).build()
        runner = CliRunner()
        result = runner.invoke(cli, args=['--api-key', 'foo', 'create'])

        assert result.exit_code == 0
def test_cli_create_failure():
    program = Data('cpu.utilization').publish()
    chart = TimeSeriesChart().with_name('lol').with_program(program)

    with global_recorder.use_cassette('cli_create_failure',
                                      serialize_with='prettyjson'):
        dashboard = Dashboard(session=global_session)\
            .with_name('testy mctesterson')\
            .with_charts(chart)

        cli = CliBuilder().with_resources(dashboard).build()
        runner = CliRunner()
        result = runner.invoke(cli, args=['--api-key', 'foo', 'create'])

        assert result.exception
def test_cli_create_force_success():
    program = Data('cpu.utilization').publish()
    chart = TimeSeriesChart().with_name('lol').with_program(program)

    with global_recorder.use_cassette('cli_create_force_success',
                                      serialize_with='prettyjson'):
        dashboard = Dashboard(session=global_session)\
            .with_name('testy mctesterson')\
            .with_charts(chart)

        cli = CliBuilder().with_resources(dashboard).build()
        runner = CliRunner()
        result = runner.invoke(cli, args=['--api-key', 'foo', 'create', '-f'])

        assert result.exit_code == 0
def test_cli_create_interactive_success():
    program = Data('cpu.utilization').publish()
    chart = TimeSeriesChart().with_name('lol').with_program(program)

    with global_recorder.use_cassette('cli_create_interactive_success',
                                      serialize_with='prettyjson'):
        dashboard = Dashboard(session=global_session)\
            .with_name('testy mctesterson')\
            .with_charts(chart)

        cli = CliBuilder().with_resources(dashboard).build()
        runner = CliRunner()
        result = runner.invoke(cli, args=['--api-key', 'foo', 'create', '-i'], input='y')

        assert result.exit_code == 0
def test_cli_create_interactive_failure():
program = Data('cpu.utilization').publish()
chart = TimeSeriesChart().with_name('lol').with_program(program)
with global_recorder.use_cassette('cli_create_interactive_failure',
serialize_with='prettyjson'):
dashboard = Dashboard(session=global_session)\
.with_name('testy mctesterson')\
.with_charts(chart)
cli = CliBuilder().with_resources(dashboard).build()
runner = CliRunner()
result = runner.invoke(cli, args=['--api-key', 'foo', 'create', '-i'], input='n')
click.echo(result.exception)
assert result.exception
def test_cli_update_success():
program = Data('cpu.utilization').publish()
chart = TimeSeriesChart().with_name('lol').with_program(program)
with global_recorder.use_cassette('cli_update_success',
serialize_with='prettyjson'):
dashboard = Dashboard(session=global_session)\
.with_name('testy mctesterson')\
.with_charts(chart)
cli = CliBuilder().with_resources(dashboard).build()
runner = CliRunner()
result = runner.invoke(cli, args=['--api-key', 'foo', 'update',
'--description', 'updated_dashboard_description'])
assert result.exit_code == 0
def test_cli_update_failure():
program = Data('cpu.utilization').publish()
chart = TimeSeriesChart().with_name('lol').with_program(program)
with global_recorder.use_cassette('cli_update_failure',
serialize_with='prettyjson'):
dashboard = Dashboard(session=global_session)\
.with_name('testy mctesterson')\
.with_charts(chart)
cli = CliBuilder().with_resources(dashboard).build()
runner = CliRunner()
result = runner.invoke(cli, args=['--api-key', 'foo', 'update',
'--description', 'updated_dashboard_description'])
assert result.exception
# File: package_template/tests/__init__.py (repo: KIT-HYD/package-template, license: MIT)
from .core import TestMyModel
# File: eckerd-django-google-sso/models.py (repo: ChristopherDavenport/eckerd-django-google-sso, license: MIT)
"""Models for the eckerd-django-google-sso app."""
from django.contrib.auth.models import AbstractUser


class CustomUser(AbstractUser):
    pass
# File: cogs/utils/__init__.py (repo: dashr9230/Discord-Bot, license: Apache-2.0)
from .helpers import *
from .tictactoe import *
# File: utils/__init__.py (repo: AleCandido/opale, license: BSD-3-Clause)
from . import build_theme
# File: calchas_datamodel/abstractLiteralExpression.py (repo: s-i-newton/calchas-datamodel, license: Apache-2.0)
from .abstractExpression import AbstractExpression
from abc import ABCMeta


class AbstractLiteralExpression(AbstractExpression, metaclass=ABCMeta):
    """
    AbstractLiteralExpression ::= IntegerLiteralExpression
                                | FloatLiteralExpression
    """
    pass
# File: grpc_wrappers/wrappers/__init__.py (repo: matthewwardrop/python-grpc-wrappers, license: MIT)
from . import google  # noqa: F401
# File: opal_common/confi/__init__.py (repo: MatanyaStroh/opal, license: Apache-2.0)
from .confi import *
# File: nlp/command/__init__.py (repo: aira/.public_object_detector_app, license: MIT)
from nlp.command.color import DescribeObjectColor
from nlp.command.describe import DescribeScene
# File: koku/sources/test/api/test_source_status.py (repo: rubik-ai/koku, license: Apache-2.0)
#
# Copyright 2021 Red Hat Inc.
# SPDX-License-Identifier: Apache-2.0
#
"""Test the Sources Status HTTP Client."""
from unittest.mock import create_autospec
from unittest.mock import patch
from uuid import uuid4
from django.test.utils import override_settings
from django.urls import reverse
from faker import Faker
from rest_framework import status
from rest_framework.serializers import ValidationError
from rest_framework.test import APIClient
from api.iam.test.iam_test_case import IamTestCase
from api.provider.models import Provider
from api.provider.models import Sources
from providers.provider_access import ProviderAccessor
from providers.provider_errors import ProviderErrors
from sources.api.source_status import SourceStatus
from sources.sources_http_client import SourcesHTTPClient
from sources.sources_http_client import SourcesHTTPClientError
from sources.sources_provider_coordinator import SourcesProviderCoordinator
faker = Faker()
@override_settings(ROOT_URLCONF="sources.urls")
class SourcesStatusTest(IamTestCase):
    """Source Status Test Class."""
    def test_http_endpoint_source_not_found(self):
        """
        Test sources status returns 404 when source isn't found.

        When there's no provider or source, the endpoint should return 404.
        """
        url = reverse("source-status")
        client = APIClient()
        response = client.get(url + "?source_id=1", **self.headers)
        self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)
    def test_mock_response_returns_false(self):
        """
        Test sources status returns False.

        This test ensures that a mock response contains the payload 'False'.
        """
        url = reverse("source-status")
        client = APIClient()
        response = client.get(url + "?source_id=1", **self.headers)
        mock_response = create_autospec(response, data=False, status=status.HTTP_200_OK)
        mock_response_source_status = mock_response.data
        expected_source_status = False
        expected_HTTP_code = status.HTTP_200_OK
        self.assertEqual(mock_response_source_status, expected_source_status)
        self.assertEqual(mock_response.status, expected_HTTP_code)
    def test_mock_response_returns_true(self):
        """
        Test sources status returns True.

        response.data should contain a True value.
        """
        url = reverse("source-status")
        client = APIClient()
        response = client.get(url + "?source_id=1", **self.headers)
        mock_response = create_autospec(response, data=True, status=status.HTTP_200_OK)
        mock_response_source_status = mock_response.data
        expected_source_status = True
        expected_HTTP_code = status.HTTP_200_OK
        self.assertEqual(mock_response_source_status, expected_source_status)
        self.assertEqual(mock_response.status, expected_HTTP_code)
    def test_missing_query_parameter(self):
        """
        Test when the user accesses this API without giving a parameter, for example '?source_id=1'.

        The API should respond with an error that the query parameter 'source_id' is missing.
        The API should respond with HTTP_400_BAD_REQUEST.
        """
        url = reverse("source-status")
        client = APIClient()
        response = client.get(url, **self.headers)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(response.data, "Missing query parameter source_id")
    def test_source_id_not_integer(self):
        """
        Test when the user accesses this API with a non-integer parameter, for example '?source_id=string'.

        The API should respond with an error that the source_id must be an integer.
        The API should respond with HTTP_400_BAD_REQUEST.
        """
        url = reverse("source-status")
        client = APIClient()
        response = client.get(url + "?source_id=string", **self.headers)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertEqual(response.data, "source_id must be an integer")
    def test_post_status(self):
        """Test that the API pushes sources status with POST."""
        mock_status = {"availability_status": "available", "availability_status_error": ""}
        with patch.object(SourcesHTTPClient, "build_source_status", return_value=mock_status):
            url = reverse("source-status")
            client = APIClient()
            # Insert a source with ID 1
            Sources.objects.create(
                source_id=1,
                name="New AWS Mock Test Source",
                source_type=Provider.PROVIDER_AWS,
                authentication={"authentication": {"rolearn": "myarn"}},
                billing_source={"bucket": "my-bucket"},
                koku_uuid="uuid",
                offset=1,
            )
            json_data = {"source_id": 1}
            with patch.object(SourcesHTTPClient, "set_source_status", return_value=True):
                response = client.post(url, data=json_data, **self.headers)
                self.assertEquals(response.status_code, 204)
    @patch("sources.api.source_status.SourcesProviderCoordinator.create_account")
    @patch("sources.api.source_status.SourcesProviderCoordinator.update_account")
    @patch("sources.api.source_status.SourcesHTTPClient.set_source_status")
    def test_push_status_first_gcp_table_discovery(
        self, mock_set_source_status, mock_update_account, mock_create_account
    ):
        """Test push_status for initial discovery of the GCP BigQuery table id."""
        mock_status = {"availability_status": "available", "availability_status_error": ""}
        with patch.object(SourcesHTTPClient, "build_source_status", return_value=mock_status):
            test_source_id = 1
            # Insert a source with ID 1
            Sources.objects.create(
                source_id=test_source_id,
                name="New GCP Mock Test Source",
                source_type=Provider.PROVIDER_GCP,
                authentication={"credentials": {"project_id": "test_project_id"}},
                billing_source={"data_source": {"dataset": "test_dataset", "table_id": "cost_table"}},
                status={},
                offset=1,
            )
            with patch.object(ProviderAccessor, "cost_usage_source_ready", returns=True):
                with patch.object(SourceStatus, "update_source_name", returns=True):
                    status_obj = SourceStatus(test_source_id)
                    status_obj.push_status()
                    mock_set_source_status.assert_called()
                    mock_create_account.assert_called()
                    mock_update_account.assert_not_called()
    @patch("sources.api.source_status.SourcesProviderCoordinator.create_account")
    @patch("sources.api.source_status.SourcesProviderCoordinator.update_account")
    @patch("sources.api.source_status.SourcesHTTPClient.set_source_status")
    def test_push_status_first_gcp_table_discovery_update(
        self, mock_set_source_status, mock_update_account, mock_create_account
    ):
        """Test push_status for initial discovery of the GCP BigQuery table id after the dataset was updated."""
        mock_status = {"availability_status": "available", "availability_status_error": ""}
        with patch.object(SourcesHTTPClient, "build_source_status", return_value=mock_status):
            test_source_id = 1
            # Insert a source with ID 1
            Sources.objects.create(
                source_id=test_source_id,
                name="New GCP Mock Test Source",
                source_type=Provider.PROVIDER_GCP,
                koku_uuid=faker.uuid4(),
                authentication={"credentials": {"project_id": "test_project_id"}},
                billing_source={"data_source": {"dataset": "test_dataset"}},
                status={},
                offset=1,
            )
            with patch.object(ProviderAccessor, "cost_usage_source_ready", returns=True):
                with patch.object(SourceStatus, "update_source_name", returns=True):
                    status_obj = SourceStatus(test_source_id)
                    status_obj.push_status()
                    mock_set_source_status.assert_called()
                    mock_create_account.assert_not_called()
                    mock_update_account.assert_called()
    @patch("sources.api.source_status.SourcesHTTPClient.set_source_status")
    def test_push_status_second_gcp_table_discovery(self, mock_set_source_status):
        """Test push_status when the GCP BigQuery table id is not already known."""
        mock_status = {"availability_status": "available", "availability_status_error": ""}
        with patch.object(SourcesHTTPClient, "build_source_status", return_value=mock_status):
            test_source_id = 1
            # Insert a source with ID 1
            Sources.objects.create(
                source_id=test_source_id,
                name="New GCP Mock Test Source",
                source_type=Provider.PROVIDER_GCP,
                authentication={"credentials": {"project_id": "test_project_id"}},
                billing_source={"data_source": {"dataset": "test_dataset"}},
                status={"availability_status": "available", "availability_status_error": ""},
                offset=1,
            )
            with patch.object(ProviderAccessor, "cost_usage_source_ready", returns=True):
                with patch.object(SourceStatus, "update_source_name", returns=True):
                    with patch.object(SourcesProviderCoordinator, "create_account", returns=True):
                        status_obj = SourceStatus(test_source_id)
                        status_obj.push_status()
                        mock_set_source_status.assert_called()
    @patch("sources.api.source_status.SourcesHTTPClient.set_source_status")
    def test_push_status_gcp_table_discovery_completed(self, mock_set_source_status):
        """Test push_status when the GCP BigQuery table id is already known."""
        mock_status = {"availability_status": "available", "availability_status_error": ""}
        with patch.object(SourcesHTTPClient, "build_source_status", return_value=mock_status):
            test_source_id = 1
            # Insert a source with ID 1
            Sources.objects.create(
                source_id=test_source_id,
                name="New GCP Mock Test Source",
                source_type=Provider.PROVIDER_GCP,
                koku_uuid=faker.uuid4(),
                authentication={"credentials": {"project_id": "test_project_id"}},
                billing_source={"data_source": {"dataset": "test_dataset", "table_id": "billtable"}},
                status={"availability_status": "available", "availability_status_error": ""},
                offset=1,
            )
            with patch.object(ProviderAccessor, "cost_usage_source_ready", returns=True):
                with patch.object(SourceStatus, "update_source_name", returns=True):
                    with patch.object(SourcesProviderCoordinator, "update_account", returns=True):
                        status_obj = SourceStatus(test_source_id)
                        status_obj.push_status()
                        mock_set_source_status.assert_called()
    @patch("sources.api.source_status.SourcesProviderCoordinator.update_account")
    @patch("sources.api.source_status.SourcesHTTPClient.get_source_details")
    def test_update_source_name(self, mock_get_source_details, mock_update_account):
        """Test that the source name is queued for update when out of sync with platform."""
        mock_status = {"availability_status": "available", "availability_status_error": ""}
        with patch.object(SourcesHTTPClient, "build_source_status", return_value=mock_status):
            # Insert a source with ID 1
            test_source_id = 1
            Sources.objects.create(
                source_id=test_source_id,
                name="New AWS Mock Test Source",
                source_type=Provider.PROVIDER_AWS,
                authentication={"authentication": {"rolearn": "myarn"}},
                billing_source={"bucket": "my-bucket"},
                koku_uuid="uuid",
                offset=1,
            )
            status_obj = SourceStatus(test_source_id)
            with patch.object(
                SourcesHTTPClient, "get_source_details", return_value={"name": "New Name", "source_type_id": "1"}
            ):
                status_obj.update_source_name()
                mock_update_account.assert_called()
    @patch("sources.api.source_status.SourcesProviderCoordinator.update_account")
    @patch("sources.api.source_status.SourcesHTTPClient.get_source_details")
    def test_update_source_name_no_change(self, mock_get_source_details, mock_update_account):
        """Test that the source name is not queued for update when in sync with the platform."""
        mock_status = {"availability_status": "available", "availability_status_error": ""}
        source_name = "AWS source"
        with patch.object(SourcesHTTPClient, "build_source_status", return_value=mock_status):
            # Insert a source with ID 1
            test_source_id = 1
            Sources.objects.create(
                source_id=test_source_id,
                name=source_name,
                source_type=Provider.PROVIDER_AWS,
                authentication={"authentication": {"rolearn": "myarn"}},
                billing_source={"bucket": "my-bucket"},
                koku_uuid="uuid",
                offset=1,
            )
            status_obj = SourceStatus(test_source_id)
            with patch.object(
                SourcesHTTPClient, "get_source_details", return_value={"name": source_name, "source_type_id": "1"}
            ):
                status_obj.update_source_name()
                mock_update_account.assert_not_called()
    def test_post_status_error(self):
        """Test that the API pushes source status via POST when a connection error occurs."""
        mock_status = {"availability_status": "available", "availability_status_error": ""}
        with patch.object(SourcesHTTPClient, "build_source_status", return_value=mock_status):
            url = reverse("source-status")
            client = APIClient()
            # Insert a source with ID 1
            Sources.objects.create(
                source_id=1,
                name="New AWS Mock Test Source",
                source_type=Provider.PROVIDER_AWS,
                authentication={"authentication": {"rolearn": "myarn"}},
                billing_source={"bucket": "my-bucket"},
                koku_uuid="uuid",
                offset=1,
            )
            json_data = {"source_id": 1}
            with patch.object(SourcesHTTPClient, "set_source_status", side_effect=SourcesHTTPClientError):
                response = client.post(url, data=json_data, **self.headers)
                self.assertEquals(response.status_code, 204)
    def test_available(self):
        """Test that availability status is available when cost_usage_source_ready is True."""
        request = self.request_context.get("request")
        test_matrix = [
            {
                "name": "New AWS Mock Test Source",
                "source_type": Provider.PROVIDER_AWS,
                "authentication": {"credentials": {"role_arn": "fake-iam"}},
                "billing_source": {"data_source": {"bucket": "my-bucket"}},
                "offset": 1,
            },
            {
                "name": "New Azure Mock Test Source",
                "source_type": Provider.PROVIDER_AZURE,
                "authentication": {
                    "credentials": {
                        "subscription_id": "subid",
                        "client_id": "testid",
                        "tenant_id": "tenant",
                        "client_secret": "secret",
                    }
                },
                "billing_source": {"data_source": {"resource_group": "rg", "storage_account": "sa"}},
                "offset": 1,
            },
            {
                "name": "New OCP Mock Test Source",
                "source_type": Provider.PROVIDER_OCP,
                "authentication": {"credentials": {"cluster_id": "cluster_id"}},
                "offset": 1,
            },
            {
                "name": "New GCP Mock Test Source",
                "source_type": Provider.PROVIDER_GCP,
                "authentication": {"credentials": {"project_id": "test_project_id"}},
                "billing_source": {"data_source": {"dataset": "test_dataset"}},
                "offset": 1,
            },
        ]
        for i, test in enumerate(test_matrix):
            with self.subTest(test=test):
                with patch.object(ProviderAccessor, "cost_usage_source_ready", returns=True):
                    provider = Provider.objects.create(
                        name=test.get("name"), created_by=request.user, customer=request.user.customer, active=True
                    )
                    test["koku_uuid"] = str(provider.uuid)
                    url = reverse("source-status")
                    client = APIClient()
                    # Insert a source with ID 1
                    Sources.objects.create(source_id=i, **test)
                    response = client.get(url + f"?source_id={i}", **self.headers)
                    actual_source_status = response.data
                    self.assertEquals("available", actual_source_status.get("availability_status"))
                    self.assertTrue(Provider.objects.get(uuid=provider.uuid).active)
    @override_settings(
        DEMO_ACCOUNTS={"123": {"arn:aws:iam::999:role/DEMO": {"report_prefix": "cur", "report_name": "awscost"}}}
    )
    @patch("sources.api.source_status.SourcesHTTPClient.get_source_details")
    def test_available_demo_accounts(self, mock_cost_usage_ready):
        """Test availability status for demo accounts."""
        test_source_id = 3
        source_json = {
            "name": "New AWS Mock Test Source",
            "source_type": Provider.PROVIDER_AWS,
            "source_id": test_source_id,
            "account_id": 123,
            "authentication": {"credentials": {"role_arn": "fake-iam"}},
            "billing_source": {"data_source": {"bucket": "my-bucket"}},
            "offset": 1,
        }
        source_json["koku_uuid"] = faker.uuid4()
        url = reverse("source-status")
        client = APIClient()
        # Insert a source with ID 1
        Sources.objects.create(**source_json)
        _ = client.get(url + f"?source_id={test_source_id}", **self.headers)
        mock_cost_usage_ready.assert_not_called()
    def test_not_ready_for_status(self):
        """Test availability status when a source is not ready for a status check."""
        request = self.request_context.get("request")
        test_matrix = [
            {
                "name": "New AWS Mock Test Source",
                "source_type": Provider.PROVIDER_AWS,
                "authentication": {},
                "billing_source": {"data_source": {"bucket": "my-bucket"}},
                "offset": 1,
            },
            {
                "name": "New Azure Mock Test Source",
                "source_type": Provider.PROVIDER_AZURE,
                "authentication": {
                    "credentials": {"client_id": "testid", "tenant_id": "tenant", "client_secret": "secret"}
                },
                "billing_source": {"data_source": {"resource_group": "rg", "storage_account": "sa"}},
                "offset": 1,
            },
            {
                "name": "New Azure Mock Test Source 2",
                "source_type": Provider.PROVIDER_AZURE,
                "authentication": {
                    "credentials": {
                        "subscription_id": "subid",
                        "client_id": "testid",
                        "tenant_id": "tenant",
                        "client_secret": "secret",
                    }
                },
                "billing_source": {"data_source": {"storage_account": "sa"}},
                "offset": 1,
            },
            {
                "name": "New Azure Mock Test Source 3",
                "source_type": Provider.PROVIDER_AZURE,
                "authentication": {
                    "credentials": {
                        "subscription_id": "subid",
                        "client_id": "testid",
                        "tenant_id": "tenant",
                        "client_secret": "secret",
                    }
                },
                "billing_source": {},
                "offset": 1,
            },
            {
                "name": "New Azure Mock Test Source 4",
                "source_type": Provider.PROVIDER_AZURE,
                "authentication": {},
                "billing_source": {"data_source": {"resource_group": "rg", "storage_account": "sa"}},
                "offset": 1,
            },
            {
                "name": "New OCP Mock Test Source",
                "source_type": Provider.PROVIDER_OCP,
                "authentication": {},
                "offset": 1,
            },
            {
                "name": "New GCP Mock Test Source",
                "source_type": Provider.PROVIDER_GCP,
                "authentication": {"credentials": {"project_id": "test_project_id"}},
                "billing_source": {},
                "offset": 1,
            },
        ]
        for i, test in enumerate(test_matrix):
            with self.subTest(test=test):
                with patch.object(ProviderAccessor, "cost_usage_source_ready", returns=True):
                    provider = Provider.objects.create(
                        name=test.get("name"), created_by=request.user, customer=request.user.customer, active=True
                    )
                    test["koku_uuid"] = str(provider.uuid)
                    url = reverse("source-status")
                    client = APIClient()
                    # Insert a source with ID 1
                    Sources.objects.create(source_id=i, **test)
                    response = client.get(url + f"?source_id={i}", **self.headers)
                    self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)
    def test_aws_unavailable(self):
        """Test that the API returns an unavailable status when an AWS source cannot be validated."""
        url = reverse("source-status")
        client = APIClient()
        # Insert a source with ID 1
        Sources.objects.create(
            source_id=1,
            name="New AWS Mock Test Source",
            source_type=Provider.PROVIDER_AWS,
            authentication={"credentials": {"role_arn": "fake-iam"}},
            billing_source={"data_source": {"bucket": "my-bucket"}},
            koku_uuid=faker.uuid4(),
            offset=1,
        )
        response = client.get(url + "?source_id=1", **self.headers)
        actual_source_status = response.data
        expected = {
            "availability_status": "unavailable",
            "availability_status_error": ProviderErrors.AWS_ROLE_ARN_UNREACHABLE_MESSAGE,
        }
        self.assertEquals(actual_source_status, expected)
    def test_azure_unavailable(self):
        """Test that the API returns an unavailable status when an Azure source cannot be validated."""
        url = reverse("source-status")
        client = APIClient()
        # Insert a source with ID 1
        credentials = {
            "subscription_id": faker.uuid4(),
            "tenant_id": faker.uuid4(),
            "client_id": faker.uuid4(),
            "client_secret": faker.word(),
        }
        data_source = {"resource_group": faker.word(), "storage_account": faker.word()}
        Sources.objects.create(
            source_id=1,
            name="New Azure Mock Test Source",
            source_type=Provider.PROVIDER_AZURE,
            authentication={"credentials": credentials},
            billing_source={"data_source": data_source},
            koku_uuid=faker.uuid4(),
            offset=1,
        )
        response = client.get(url + "?source_id=1", **self.headers)
        actual_source_status = response.data
        expected = {
            "availability_status": "unavailable",
            "availability_status_error": ProviderErrors.AZURE_INCORRECT_TENANT_ID_MESSAGE,
        }
        self.assertEquals(actual_source_status, expected)
    def test_ocp_unavailable(self):
        """Test that the API returns an unavailable status when an OCP source cannot be validated."""
        url = reverse("source-status")
        client = APIClient()
        # Insert a source with ID 1
        Sources.objects.create(
            source_id=1,
            name="New OCP Mock Test Source",
            source_type=Provider.PROVIDER_OCP,
            authentication={"credentials": {"provider_resoure_name": ""}},
            billing_source={"data_source": {}},
            koku_uuid=faker.uuid4(),
            offset=1,
        )
        response = client.get(url + "?source_id=1", **self.headers)
        actual_source_status = response.data
        expected = {
            "availability_status": "unavailable",
            "availability_status_error": "Provider resource name is a required parameter for OCP.",
        }
        self.assertEquals(actual_source_status, expected)

    # TODO double check these new tests
    def test_post_status_provider_available(self):
        """Test that the provider active flag is set to true when the source is available."""
        request = self.request_context.get("request")
        source_id = 1
        source_name = "New AWS Mock Test Source"
        with patch.object(ProviderAccessor, "cost_usage_source_ready", returns=True):
            provider = Provider.objects.create(
                name=source_name, created_by=request.user, customer=request.user.customer, active=False
            )
            Sources.objects.create(
                source_id=1,
                name=source_name,
                source_type=Provider.PROVIDER_AWS,
                authentication={"credentials": {"role_arn": "fake-iam"}},
                billing_source={"data_source": {"bucket": "my-bucket"}},
                koku_uuid=str(provider.uuid),
                offset=1,
            )
            status_obj = SourceStatus(source_id)
            status_obj.status()
            self.assertTrue(Provider.objects.get(uuid=provider.uuid).active)
    def test_post_status_provider_unavailable(self):
        """Test that the provider active flag is set to false when the source is unavailable."""
        request = self.request_context.get("request")
        source_id = 1
        source_name = "New AWS Mock Test Source"
        with patch.object(ProviderAccessor, "cost_usage_source_ready", side_effect=ValidationError("test error")):
            provider = Provider.objects.create(
                name=source_name, created_by=request.user, customer=request.user.customer, active=True
            )
            Sources.objects.create(
                source_id=1,
                name=source_name,
                source_type=Provider.PROVIDER_AWS,
                authentication={"credentials": {"role_arn": "fake-iam"}},
                billing_source={"data_source": {"bucket": "my-bucket"}},
                koku_uuid=str(provider.uuid),
                offset=1,
            )
            status_obj = SourceStatus(source_id)
            status_obj.status()
            self.assertFalse(Provider.objects.get(uuid=provider.uuid).active)
def test_post_status_wrong_provider(self):
"""Test for logs when provider mismatch is detected while setting status."""
source_id = 1
source_name = "New AWS Mock Test Source"
with patch.object(ProviderAccessor, "cost_usage_source_ready", return_value=True):
Sources.objects.create(
source_id=source_id,
name=source_name,
source_type=Provider.PROVIDER_AWS,
authentication={"credentials": {"role_arn": "fake-iam"}},
billing_source={"data_source": {"bucket": "my-bucket"}},
koku_uuid=str(uuid4()),
offset=1,
)
status_obj = SourceStatus(source_id)
with self.assertLogs("sources.api.source_status", level="INFO") as logger:
status_obj.status()
expected = f"INFO:sources.api.source_status:No provider found for Source ID: {source_id}"
self.assertIn(expected, logger.output)
| 46.751634 | 115 | 0.598036 | 2,988 | 28,612 | 5.46988 | 0.083668 | 0.049927 | 0.02478 | 0.041361 | 0.848324 | 0.826358 | 0.808921 | 0.800355 | 0.787261 | 0.780409 | 0 | 0.006915 | 0.297463 | 28,612 | 611 | 116 | 46.828151 | 0.806179 | 0.088005 | 0 | 0.680556 | 0 | 0 | 0.216669 | 0.057964 | 0 | 0 | 0 | 0.001637 | 0.063492 | 1 | 0.043651 | false | 0 | 0.035714 | 0 | 0.081349 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d19b9b72bb363998411bcd6194e5fd25d9500afb | 34 | py | Python | unittests/__init__.py | landcast/flaskseed | 15b73fef7345e8c05c4b6efe26c889f9818fabe3 | [
"Apache-2.0"
] | null | null | null | unittests/__init__.py | landcast/flaskseed | 15b73fef7345e8c05c4b6efe26c889f9818fabe3 | [
"Apache-2.0"
] | 1 | 2018-10-21T14:28:46.000Z | 2018-10-21T14:28:46.000Z | unittests/__init__.py | landcast/flaskseed | 15b73fef7345e8c05c4b6efe26c889f9818fabe3 | [
"Apache-2.0"
] | null | null | null | from unittests.test_base import *
| 17 | 33 | 0.823529 | 5 | 34 | 5.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 34 | 1 | 34 | 34 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0603897d3dd335a75934e4a8fc8d488124e9f8c1 | 9,978 | py | Python | testcases/generated/domainservice_test.py | Tanc009/jdcloud-cli | 4e11de77c68501f44e7026c0ad1c24e5d043197e | [
"Apache-2.0"
] | 95 | 2018-06-05T10:49:32.000Z | 2019-12-31T11:07:36.000Z | testcases/generated/domainservice_test.py | Tanc009/jdcloud-cli | 4e11de77c68501f44e7026c0ad1c24e5d043197e | [
"Apache-2.0"
] | 22 | 2018-06-05T10:58:59.000Z | 2020-07-31T12:13:19.000Z | testcases/generated/domainservice_test.py | Tanc009/jdcloud-cli | 4e11de77c68501f44e7026c0ad1c24e5d043197e | [
"Apache-2.0"
] | 21 | 2018-06-04T12:50:27.000Z | 2020-11-05T10:55:28.000Z | # coding=utf8
# Copyright 2018 JDCLOUD.COM
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# NOTE: This class is auto generated by the jdcloud code generator program.
import unittest
import os
import json
class DomainserviceTest(unittest.TestCase):
def test_describe_action_log(self):
cmd = """python ../../main.py domainservice describe-action-log --page-number '5' --page-size '5' --start-time 'xxx' --end-time 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_domains(self):
cmd = """python ../../main.py domainservice describe-domains --page-number '5' --page-size '5'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_create_domain(self):
cmd = """python ../../main.py domainservice create-domain --pack-id '5' --domain-name 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_modify_domain(self):
cmd = """python ../../main.py domainservice modify-domain --domain-id 'xxx' --domain-name 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_delete_domain(self):
cmd = """python ../../main.py domainservice delete-domain --domain-id 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_domain_query_count(self):
cmd = """python ../../main.py domainservice describe-domain-query-count --domain-id 'xxx' --domain-name 'xxx' --start 'xxx' --end 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_domain_query_traffic(self):
cmd = """python ../../main.py domainservice describe-domain-query-traffic --domain-id 'xxx' --domain-name 'xxx' --start 'xxx' --end 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_resource_record(self):
cmd = """python ../../main.py domainservice describe-resource-record --domain-id 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_create_resource_record(self):
cmd = """python ../../main.py domainservice create-resource-record --domain-id 'xxx' --req '{"":""}'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_modify_resource_record(self):
cmd = """python ../../main.py domainservice modify-resource-record --domain-id 'xxx' --resource-record-id 'xxx' --req '{"":""}'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_delete_resource_record(self):
cmd = """python ../../main.py domainservice delete-resource-record --domain-id 'xxx' --resource-record-id 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_modify_resource_record_status(self):
cmd = """python ../../main.py domainservice modify-resource-record-status --domain-id 'xxx' --resource-record-id 'xxx' --action 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_view_tree(self):
cmd = """python ../../main.py domainservice describe-view-tree --domain-id 'xxx' --pack-id '5' --view-id '5'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_batch_set_resource_records(self):
cmd = """python ../../main.py domainservice batch-set-resource-records --domain-id 'xxx' --req '[{"":""}]'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_user_view(self):
cmd = """python ../../main.py domainservice describe-user-view --domain-id 'xxx' --view-id '5' --page-number '5' --page-size '5'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_create_user_view(self):
cmd = """python ../../main.py domainservice create-user-view --domain-id 'xxx' --req '{"":""}'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_delete_user_view(self):
cmd = """python ../../main.py domainservice delete-user-view --domain-id 'xxx' --req '{"":""}'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_user_view_ip(self):
cmd = """python ../../main.py domainservice describe-user-view-ip --domain-id 'xxx' --view-id '5' --page-number '5' --page-size '5'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_create_user_view_ip(self):
cmd = """python ../../main.py domainservice create-user-view-ip --domain-id 'xxx' --req '{"":""}'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_delete_user_view_ip(self):
cmd = """python ../../main.py domainservice delete-user-view-ip --domain-id 'xxx' --req '{"":""}'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_monitor(self):
cmd = """python ../../main.py domainservice describe-monitor --domain-id 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_create_monitor(self):
cmd = """python ../../main.py domainservice create-monitor --domain-id 'xxx' --sub-domain-name 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_modify_monitor(self):
cmd = """python ../../main.py domainservice modify-monitor --domain-id 'xxx' --update-monitor '{"":""}'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_monitor_target(self):
cmd = """python ../../main.py domainservice describe-monitor-target --domain-id 'xxx' --sub-domain-name 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_create_monitor_target(self):
cmd = """python ../../main.py domainservice create-monitor-target --domain-id 'xxx' --sub-domain-name 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_modify_monitor_status(self):
cmd = """python ../../main.py domainservice modify-monitor-status --domain-id 'xxx' --monitor-id 'xxx' --action 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_delete_monitor(self):
cmd = """python ../../main.py domainservice delete-monitor --domain-id 'xxx' --monitor-id 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
def test_describe_monitor_alarm(self):
cmd = """python ../../main.py domainservice describe-monitor-alarm --domain-id 'xxx'"""
with os.popen(cmd) as f:
content = f.read()
print(content)
result = json.loads(content)
self.assertIsInstance(result, dict)
| 35.892086 | 149 | 0.602826 | 1,219 | 9,978 | 4.866284 | 0.109106 | 0.025287 | 0.061362 | 0.080243 | 0.859575 | 0.854855 | 0.837829 | 0.780175 | 0.701113 | 0.612778 | 0 | 0.002942 | 0.250451 | 9,978 | 277 | 150 | 36.021661 | 0.790213 | 0.06354 | 0 | 0.7 | 0 | 0.12 | 0.29391 | 0.033133 | 0 | 0 | 0 | 0 | 0.14 | 1 | 0.14 | false | 0 | 0.015 | 0 | 0.16 | 0.14 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0615f54ddddbf1a3c6f917ef5f1b3350d9d516e7 | 1,386 | py | Python | acmicpc/12761/12761.py | love-adela/algorithm | 4ccd02173c96f8369962f1fd4e5166a221690fa2 | [
"MIT"
] | 3 | 2019-03-09T05:19:23.000Z | 2019-04-06T09:26:36.000Z | acmicpc/12761/12761.py | love-adela/algorithm | 4ccd02173c96f8369962f1fd4e5166a221690fa2 | [
"MIT"
] | 1 | 2020-02-23T10:38:04.000Z | 2020-02-23T10:38:04.000Z | acmicpc/12761/12761.py | love-adela/algorithm | 4ccd02173c96f8369962f1fd4e5166a221690fa2 | [
"MIT"
] | 1 | 2019-05-22T13:47:53.000Z | 2019-05-22T13:47:53.000Z | import sys
from collections import deque

MAX_NUM = 100001

A, B, N, M = map(int, sys.stdin.readline().split())

visited = [False] * MAX_NUM
visited[N] = True
queue = deque([(N, 0)])

while queue:
    current, count = queue.popleft()
    if current == M:
        print(count)
        break
    # Eight candidate moves: step by 1, jump by A or B, or multiply by A or B.
    for nxt in (current + 1, current - 1, current + A, current - A,
                current + B, current - B, current * A, current * B):
        if 0 <= nxt < MAX_NUM and not visited[nxt]:
            visited[nxt] = True
            queue.append((nxt, count + 1))
ae1cbf126fc5d17039855bcf980e4e5db9093d73 | 198 | py | Python | 8_kyu/For_Twins_2_Math_operations.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | 8_kyu/For_Twins_2_Math_operations.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | 8_kyu/For_Twins_2_Math_operations.py | UlrichBerntien/Codewars-Katas | bbd025e67aa352d313564d3862db19fffa39f552 | [
"MIT"
] | null | null | null | def ice_brick_volume(radius: int, bottle_length: int, rim_length: int) -> int:
if radius < 0 or bottle_length < rim_length:
return 0
return 2*radius*radius*(bottle_length-rim_length) | 49.5 | 78 | 0.722222 | 31 | 198 | 4.354839 | 0.451613 | 0.266667 | 0.222222 | 0.311111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018519 | 0.181818 | 198 | 4 | 79 | 49.5 | 0.814815 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
ae1fa5033a828c9bd11d308f75a5e70aabee29ba | 16,400 | py | Python | benchml/models/mod_bench_xtal.py | rudolfspetrovs/benchml | 896673f387a6bb9b185664ddd54f569a1ba54e51 | [
"Apache-2.0"
] | 3 | 2021-08-12T13:25:31.000Z | 2022-03-21T21:30:22.000Z | benchml/models/mod_bench_xtal.py | rudolfspetrovs/benchml | 896673f387a6bb9b185664ddd54f569a1ba54e51 | [
"Apache-2.0"
] | 5 | 2020-12-08T08:59:41.000Z | 2022-01-22T06:46:09.000Z | benchml/models/mod_bench_xtal.py | rudolfspetrovs/benchml | 896673f387a6bb9b185664ddd54f569a1ba54e51 | [
"Apache-2.0"
] | 1 | 2021-06-25T11:07:32.000Z | 2021-06-25T11:07:32.000Z | import numpy as np
import benchml.transforms as btf
from benchml.hyper import GridHyper, Hyper
from benchml.models.common import (
get_acsf_krr_kwargs,
get_acsf_rr_kwargs,
get_compile_gylm,
get_mbtr_krr_kwargs,
get_mbtr_rr_kwargs,
get_pdf_gylm_krr_kwargs,
get_pdf_gylm_rr_kwargs,
get_pdf_soap_krr_kwargs,
get_pdf_soap_rr_kwargs,
make_soap_krr,
make_soap_rr,
)
whiten_hyper = [False] # NOTE: False = no whitening in ridge models
regularization_range = np.logspace(-9, +7, 17)
def compile_physchem(custom_fields=None, with_hyper=False, **kwargs):
if custom_fields is None:
custom_fields = []
models = []
for bins in [5, 10, 20]:
models.extend(
[
btf.Module(
tag="bxtal_physchem_s%02d_rf" % bins,
transforms=[
btf.ExtXyzInput(tag="input"),
btf.PhyschemXtal(
tag="descriptor",
args={"bins": bins},
inputs={"configs": "input.configs"},
),
btf.DoDivideBySize(
tag="input_norm",
args={
"config_to_size": "lambda c: len(c)",
"skip_if_not_force": True,
"force": None,
},
inputs={
"configs": "input.configs",
"meta": "input.meta",
"y": "input.y",
},
),
btf.RandomForestRegressor(
tag="predictor", inputs={"X": "descriptor.X", "y": "input_norm.y"}
),
btf.UndoDivideBySize(
tag="output", inputs={"y": "predictor.y", "sizes": "input_norm.sizes"}
),
],
hyper=GridHyper(
Hyper({"input_norm.force": [False, True]}),
Hyper({"predictor.max_depth": [None]}),
),
broadcast={"meta": "input.meta"},
outputs={"y": "output.y"},
),
btf.Module(
tag="bxtal_physchem_s%02d_rr" % bins,
transforms=[
btf.ExtXyzInput(tag="input"),
btf.PhyschemXtal(
tag="descriptor",
args={"bins": bins},
inputs={"configs": "input.configs"},
),
btf.WhitenMatrix(tag="whiten", inputs={"X": "descriptor.X"}),
btf.DoDivideBySize(
tag="input_norm",
args={
"config_to_size": "lambda c: len(c)",
"skip_if_not_force": True,
"force": None,
},
inputs={
"configs": "input.configs",
"meta": "input.meta",
"y": "input.y",
},
),
btf.Ridge(
tag="predictor",
args={"alpha": None},
inputs={"X": "whiten.X", "y": "input_norm.y"},
),
btf.UndoDivideBySize(
tag="output", inputs={"y": "predictor.y", "sizes": "input_norm.sizes"}
),
],
hyper=GridHyper(
Hyper({"input_norm.force": [False, True]}),
Hyper(
{
"predictor.alpha": regularization_range,
}
),
),
broadcast={"meta": "input.meta"},
outputs={"y": "output.y"},
),
]
)
return models
def compile_ecfp(**kwargs):
# TODO
return []
def compile_esm(*args, **kwargs):
models = []
for permutation in ["sorted_l2", "eigenspectrum"]:
models.extend(
[
btf.Module(
tag="bxtal_esm_%s_rr" % permutation,
transforms=[
btf.ExtXyzInput(tag="input"),
btf.DscribeEwaldSumMatrix(
tag="descriptor_atomic",
args={"permutation": permutation},
inputs={"configs": "input.configs"},
),
btf.ReduceMatrix(
tag="descriptor",
args={"reduce": "np.mean(x, axis=0)", "norm": False, "epsilon": 1e-10},
inputs={"X": "descriptor_atomic.X"},
),
btf.WhitenMatrix(tag="whiten", inputs={"X": "descriptor.X"}),
btf.DoDivideBySize(
tag="input_norm",
args={
"config_to_size": "lambda c: len(c)",
"skip_if_not_force": False,
"force": None,
},
inputs={
"configs": "input.configs",
"meta": "input.meta",
"y": "input.y",
},
),
btf.Ridge(tag="predictor", inputs={"X": "whiten.X", "y": "input_norm.y"}),
btf.UndoDivideBySize(
tag="output", inputs={"y": "predictor.y", "sizes": "input_norm.sizes"}
),
],
hyper=GridHyper(
Hyper({"input_norm.force": [False, True]}),
Hyper(
{
"predictor.alpha": regularization_range,
}
),
),
broadcast={"meta": "input.meta"},
outputs={"y": "output.y"},
),
btf.Module(
tag="bxtal_esm_%s_krr" % permutation,
transforms=[
btf.ExtXyzInput(tag="input"),
btf.DscribeEwaldSumMatrix(
tag="descriptor_atomic",
args={"permutation": permutation},
inputs={"configs": "input.configs"},
),
btf.ReduceMatrix(
tag="descriptor",
args={"reduce": "np.mean(x, axis=0)", "norm": False, "epsilon": 1e-10},
inputs={"X": "descriptor_atomic.X"},
),
btf.KernelDot(tag="kernel", inputs={"X": "descriptor.X"}),
btf.DoDivideBySize(
tag="input_norm",
args={
"config_to_size": "lambda c: len(c)",
"skip_if_not_force": False,
"force": None,
},
inputs={
"configs": "input.configs",
"meta": "input.meta",
"y": "input.y",
},
),
btf.KernelRidge(
tag="predictor",
args={
"alpha": None,
},
inputs={"K": "kernel.K", "y": "input_norm.y"},
),
btf.UndoDivideBySize(
tag="output", inputs={"y": "predictor.y", "sizes": "input_norm.sizes"}
),
],
hyper=GridHyper(
Hyper({"input_norm.force": [False, True]}),
Hyper(
{
"predictor.alpha": regularization_range,
}
),
),
broadcast={"meta": "input.meta"},
outputs={"y": "output.y"},
),
]
)
return models
def compile_acsf(adjust_to_species=None, *args, **kwargs):
if adjust_to_species is None:
adjust_to_species = ["C", "N", "O"]
models = []
for scalerange, sharpness, scale in zip(
[0.85, 1.2, 1.8], [1.0, 1.0, 1.2], ["minimal", "smart", "longrange"]
):
for extensive in [False, True]:
models.extend(
[
btf.Module(
tag="bxtal_acsf_%s_%s_rr" % (scale, "ext" if extensive else "int"),
**get_acsf_rr_kwargs(
scalerange, sharpness, extensive, whiten_hyper, regularization_range
),
),
btf.Module(
tag="bxtal_acsf_%s_%s_krr" % (scale, "ext" if extensive else "int"),
**get_acsf_krr_kwargs(scalerange, extensive, regularization_range),
),
]
)
return models
def compile_mbtr(**kwargs):
models = []
for _ in ["default"]:
for extensive in [False, True]:
models.extend(
[
btf.Module(
tag="bxtal_mbtr_%s_rr" % ("ext" if extensive else "int"),
**get_mbtr_rr_kwargs(extensive, whiten_hyper, regularization_range),
),
btf.Module(
tag="bxtal_mbtr_%s_krr" % ("ext" if extensive else "int"),
**get_mbtr_krr_kwargs(extensive, regularization_range),
),
]
)
return models
def compile_soap(*args, **kwargs):
krr_int_settings = GridHyper(
Hyper({"descriptor_atomic.normalize": [False]}),
Hyper({"descriptor_atomic.mode": ["minimal", "smart", "longrange"]}),
Hyper({"descriptor_atomic.crossover": [False, True]}),
Hyper({"descriptor.reduce_op": ["mean"]}),
Hyper({"descriptor.normalize": [False]}),
Hyper({"descriptor.reduce_by_type": [False]}),
Hyper({"whiten.centre": [False]}),
Hyper({"whiten.scale": [False]}),
Hyper({"predictor.power": [2]}),
)
krr_int_hyper = GridHyper(Hyper({"predictor.power": [1, 2, 3]}))
krr_ext_settings = GridHyper(
Hyper({"descriptor_atomic.normalize": [False]}),
Hyper({"descriptor_atomic.mode": ["minimal", "smart", "longrange"]}),
Hyper({"descriptor_atomic.crossover": [False, True]}),
Hyper({"descriptor.reduce_op": ["sum"]}),
Hyper({"descriptor.normalize": [False]}),
Hyper({"descriptor.reduce_by_type": [False]}),
Hyper({"whiten.centre": [False]}),
Hyper({"whiten.scale": [False]}),
Hyper({"predictor.power": [1]}),
)
krr_ext_hyper = GridHyper(Hyper({"predictor.power": [1, 2, 3]}))
rr_int_settings = GridHyper(
Hyper({"descriptor_atomic.normalize": [False]}),
Hyper({"descriptor_atomic.mode": ["minimal", "smart", "longrange"]}),
Hyper({"descriptor_atomic.crossover": [False, True]}),
Hyper({"descriptor.reduce_op": ["mean"]}),
Hyper({"descriptor.normalize": [False]}),
Hyper({"descriptor.reduce_by_type": [False]}),
Hyper({"whiten.centre": [True]}),
Hyper({"whiten.scale": [True]}),
)
rr_int_hyper = GridHyper(Hyper({"whiten.centre": whiten_hyper, "whiten.scale": whiten_hyper}))
rr_ext_settings = GridHyper(
Hyper({"descriptor_atomic.normalize": [False]}),
Hyper({"descriptor_atomic.mode": ["minimal", "smart", "longrange"]}),
Hyper({"descriptor_atomic.crossover": [False, True]}),
Hyper({"descriptor.reduce_op": ["sum"]}),
Hyper({"descriptor.normalize": [False]}),
Hyper({"descriptor.reduce_by_type": [False]}),
Hyper({"whiten.centre": [False]}),
Hyper({"whiten.scale": [False]}),
)
rr_ext_hyper = GridHyper(Hyper({"whiten.centre": whiten_hyper, "whiten.scale": whiten_hyper}))
models = []
for hidx, updates in enumerate(krr_int_settings):
tag = "%s_%s" % (
updates["descriptor_atomic.mode"],
"cross" if updates["descriptor_atomic.crossover"] else "nocross",
)
model = make_soap_krr(
"bxtal_soap_%s_int_krr" % tag,
extensive=False,
regularization_range=regularization_range,
)
model.hyperUpdate(updates)
model.hyper.add(krr_int_hyper)
models.append(model)
for hidx, updates in enumerate(krr_ext_settings):
tag = "%s_%s" % (
updates["descriptor_atomic.mode"],
"cross" if updates["descriptor_atomic.crossover"] else "nocross",
)
model = make_soap_krr(
"bxtal_soap_%s_ext_krr" % tag, extensive=True, regularization_range=regularization_range
)
model.hyperUpdate(updates)
model.hyper.add(krr_ext_hyper)
models.append(model)
for hidx, updates in enumerate(rr_int_settings):
tag = "%s_%s" % (
updates["descriptor_atomic.mode"],
"cross" if updates["descriptor_atomic.crossover"] else "nocross",
)
model = make_soap_rr(
"bxtal_soap_%s_int_rr" % tag, extensive=False, regularization_range=regularization_range
)
model.hyperUpdate(updates)
model.hyper.add(rr_int_hyper)
models.append(model)
for hidx, updates in enumerate(rr_ext_settings):
tag = "%s_%s" % (
updates["descriptor_atomic.mode"],
"cross" if updates["descriptor_atomic.crossover"] else "nocross",
)
model = make_soap_rr(
"bxtal_soap_%s_ext_rr" % tag, extensive=True, regularization_range=regularization_range
)
model.hyperUpdate(updates)
model.hyper.add(rr_ext_hyper)
models.append(model)
return models
compile_gylm = get_compile_gylm("bxtal", whiten_hyper, regularization_range)
def compile_pdf():
models = []
for minimal in [False, True]:
models.extend(make_pdf_rr(minimal))
models.extend(make_pdf_krr(minimal))
return models
def make_pdf_krr(minimal):
return [
btf.Module(
tag="bxtal_pdf_soap_%s_krr" % ("minimal" if minimal else "standard"),
**get_pdf_soap_krr_kwargs(minimal, regularization_range),
),
btf.Module(
tag="bxtal_pdf_gylm_%s_krr" % ("minimal" if minimal else "standard"),
**get_pdf_gylm_krr_kwargs(minimal, regularization_range),
),
]
def make_pdf_rr(minimal):
return [
btf.Module(
tag="bxtal_pdf_soap_%s_rr" % ("minimal" if minimal else "standard"),
**get_pdf_soap_rr_kwargs(minimal, whiten_hyper, regularization_range),
),
btf.Module(
tag="bxtal_pdf_gylm_%s_rr" % ("minimal" if minimal else "standard"),
**get_pdf_gylm_rr_kwargs(minimal, whiten_hyper, regularization_range),
),
]
def register_all():
return {
"bxtal_physchem": compile_physchem,
"bxtal_ecfp": compile_ecfp,
"bxtal_esm": compile_esm,
"bxtal_acsf": compile_acsf,
"bxtal_mbtr": compile_mbtr,
"bxtal_soap": compile_soap,
"bxtal_gylm": compile_gylm,
"bxtal_pdf": compile_pdf,
}
| 39.805825 | 100 | 0.444207 | 1,375 | 16,400 | 5.069091 | 0.111273 | 0.055093 | 0.02066 | 0.029268 | 0.823242 | 0.788092 | 0.768867 | 0.730273 | 0.7066 | 0.646055 | 0 | 0.0046 | 0.430061 | 16,400 | 411 | 101 | 39.902676 | 0.741093 | 0.002866 | 0 | 0.606218 | 0 | 0 | 0.181896 | 0.044648 | 0 | 0 | 0 | 0.002433 | 0 | 1 | 0.025907 | false | 0 | 0.010363 | 0.010363 | 0.062176 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ae2265d576b31dc5189a8fbdf2a036b016a9bf70 | 42 | py | Python | crane/__init__.py | victorlin/crane | a888e9731295e4b8332c44b5168fbceb87546fa5 | [
"Apache-2.0"
] | 4 | 2015-01-13T09:41:59.000Z | 2016-02-13T16:10:03.000Z | crane/__init__.py | fangpenlin/crane | a888e9731295e4b8332c44b5168fbceb87546fa5 | [
"Apache-2.0"
] | null | null | null | crane/__init__.py | fangpenlin/crane | a888e9731295e4b8332c44b5168fbceb87546fa5 | [
"Apache-2.0"
] | null | null | null | from .builders import BuilderBase # noqa
| 21 | 41 | 0.785714 | 5 | 42 | 6.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 42 | 1 | 42 | 42 | 0.942857 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ae379fe311c75d8d6ea820e512b7990543083db9 | 37 | py | Python | py_tea_code/3.mypro-modules/math3/build/lib/baizhanSuperMath/demo2.py | qq4215279/study_python | b0eb9dedfc4abb2fd6c024a599e7375869c3d77a | [
"Apache-2.0"
] | null | null | null | py_tea_code/3.mypro-modules/math3/build/lib/baizhanSuperMath/demo2.py | qq4215279/study_python | b0eb9dedfc4abb2fd6c024a599e7375869c3d77a | [
"Apache-2.0"
] | null | null | null | py_tea_code/3.mypro-modules/math3/build/lib/baizhanSuperMath/demo2.py | qq4215279/study_python | b0eb9dedfc4abb2fd6c024a599e7375869c3d77a | [
"Apache-2.0"
] | null | null | null | def multiple():
print("multiple") | 18.5 | 21 | 0.648649 | 4 | 37 | 6 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162162 | 37 | 2 | 21 | 18.5 | 0.774194 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
ae40dc9f22a8469c207eea4e91acf2ffdba1cb25 | 187 | py | Python | timm/utils/random.py | KnockerPulsar/pytorch-image-models | 893f5dde27ae6b17389f738bd6e37160e2868c72 | [
"Apache-2.0"
] | null | null | null | timm/utils/random.py | KnockerPulsar/pytorch-image-models | 893f5dde27ae6b17389f738bd6e37160e2868c72 | [
"Apache-2.0"
] | null | null | null | timm/utils/random.py | KnockerPulsar/pytorch-image-models | 893f5dde27ae6b17389f738bd6e37160e2868c72 | [
"Apache-2.0"
] | null | null | null | import random
import numpy as np
import torch
def random_seed(seed=42, rank=0):
torch.manual_seed(seed + rank)
np.random.seed(seed + rank)
random.seed(seed + rank)
| 18.7 | 35 | 0.668449 | 29 | 187 | 4.241379 | 0.413793 | 0.260163 | 0.341463 | 0.292683 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020833 | 0.229947 | 187 | 9 | 36 | 20.777778 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.428571 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
88358ee4b8e42cb2326701f5fd05bd2c0e35c0f0 | 126 | py | Python | shiyanlou_cs214-ca51b72ef6/mysite/mysite/views.py | tongxindao/shiyanlou | 1d002ea342deb69066c287db9935f77f49f0a09e | [
"Apache-2.0"
] | null | null | null | shiyanlou_cs214-ca51b72ef6/mysite/mysite/views.py | tongxindao/shiyanlou | 1d002ea342deb69066c287db9935f77f49f0a09e | [
"Apache-2.0"
] | null | null | null | shiyanlou_cs214-ca51b72ef6/mysite/mysite/views.py | tongxindao/shiyanlou | 1d002ea342deb69066c287db9935f77f49f0a09e | [
"Apache-2.0"
] | null | null | null | # _*_ coding: utf-8 _*_
from django.http import HttpResponse
def first_page(request):
return HttpResponse("<p>世界好</p>")
| 18 | 37 | 0.706349 | 17 | 126 | 4.941176 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009346 | 0.150794 | 126 | 6 | 38 | 21 | 0.775701 | 0.166667 | 0 | 0 | 0 | 0 | 0.097087 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
8841e89aae3cab417d7af4ca2226e145bcde9bf7 | 70 | py | Python | 7KYU/add.py | yaznasivasai/python_codewars | 25493591dde4649dc9c1ec3bece8191a3bed6818 | [
"MIT"
] | 4 | 2021-07-17T22:48:03.000Z | 2022-03-25T14:10:58.000Z | 7KYU/add.py | yaznasivasai/python_codewars | 25493591dde4649dc9c1ec3bece8191a3bed6818 | [
"MIT"
] | null | null | null | 7KYU/add.py | yaznasivasai/python_codewars | 25493591dde4649dc9c1ec3bece8191a3bed6818 | [
"MIT"
] | 3 | 2021-06-14T14:18:16.000Z | 2022-03-16T06:02:02.000Z | def add(a):
def add2(b):
return a + b
return add2
# django_any/contrib/__init__.py (kadych/django-whatever, MIT)
from . import auth
from . import default
any_user = auth.any_user
any_model_with_defaults = default.any_model_with_defaults
# test/test_npu/test_network_ops/test_pow.py (Ascend/pytorch, BSD-3-Clause)
# Copyright (c) 2020, Huawei Technologies. All rights reserved.
#
# Licensed under the BSD 3-Clause License (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://opensource.org/licenses/BSD-3-Clause
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import torch
import numpy as np
import copy
from common_utils import TestCase, run_tests
from common_device_type import dtypes, instantiate_device_type_tests
from util_test import create_common_tensor
class TestPow(TestCase):
def cpu_op_exec(self, input1, input2):
output = torch.pow(input1, input2)
output = output.numpy()
return output
def npu_op_exec(self, input1, input2):
output = torch.pow(input1, input2)
output = output.to("cpu")
output = output.numpy()
return output
def npu_op_exec_out(self, input1, input2, out):
torch.pow(input1, input2, out=out)
output = out.to("cpu")
output = output.numpy()
return output
def cpu_op_inplace_exec(self, input1, input2):
input1.pow_(input2)
output = input1.numpy()
return output
def npu_op_inplace_exec(self, input1, input2):
input1.pow_(input2)
output = input1.to("cpu")
output = output.numpy()
return output
def cpu_op_exec_tensor_scalar(self, input1, n):
output = torch.pow(input1, n)
output = output.numpy()
return output
def npu_op_exec_tensor_scalar(self, input1, n):
output = torch.pow(input1, n)
output = output.to("cpu")
output = output.numpy()
return output
def npu_op_exec_tensor_scalar_out(self, input1, n, out):
output = torch.pow(input1, n, out=out)
output = out.to("cpu")
output = output.numpy()
return output
def cpu_op_exec_scalar_tensor(self, n, input1):
output = torch.pow(n, input1)
output = output.numpy()
return output
def npu_op_exec_scalar_tensor(self, n, input1):
output = torch.pow(n, input1)
output = output.to("cpu")
output = output.numpy()
return output
def npu_op_exec_scalar_tensor_out(self, n, input1, out):
torch.pow(n, input1, out=out)
output = out.to("cpu")
output = output.numpy()
return output
def pow_result(self, shape_format):
for item in shape_format:
cpu_input1, npu_input1 = create_common_tensor(item[0], 0, 1)
cpu_input2, npu_input2 = create_common_tensor(item[1], 0, 1)
npu_input3 = copy.deepcopy(cpu_input1).to("npu")
if cpu_input1.dtype == torch.float16:
cpu_input1 = cpu_input1.to(torch.float32)
cpu_input2 = cpu_input2.to(torch.float32)
cpu_output = self.cpu_op_exec(cpu_input1, cpu_input2)
npu_output = self.npu_op_exec(npu_input1, npu_input2)
npu_output_out = self.npu_op_exec_out(npu_input1, npu_input2, npu_input3)
cpu_output_inp = self.cpu_op_inplace_exec(cpu_input1, cpu_input2)
npu_output_inp = self.npu_op_inplace_exec(npu_input1, npu_input2)
cpu_output = cpu_output.astype(npu_output.dtype)
cpu_output_inp = cpu_output_inp.astype(npu_output_inp.dtype)
self.assertRtolEqual(cpu_output, npu_output)
self.assertRtolEqual(cpu_output, npu_output_out)
self.assertRtolEqual(cpu_output_inp, npu_output_inp)
def pow_result_scalar_tensor(self, shape_format):
for item in shape_format:
scalar = np.random.randint(0, 1)
cpu_input1, npu_input1 = create_common_tensor(item, 0, 1)
npu_input3 = copy.deepcopy(cpu_input1).to("npu")
if cpu_input1.dtype == torch.float16:
cpu_input1 = cpu_input1.to(torch.float32)
cpu_output_scalar = self.cpu_op_exec_scalar_tensor(scalar, cpu_input1)
npu_output_scalar = self.npu_op_exec_scalar_tensor(scalar, npu_input1)
npu_output_scalar_out = self.npu_op_exec_scalar_tensor_out(scalar, npu_input1, npu_input3)
cpu_output_scalar = cpu_output_scalar.astype(npu_output_scalar.dtype)
self.assertRtolEqual(cpu_output_scalar, npu_output_scalar)
self.assertRtolEqual(cpu_output_scalar, npu_output_scalar_out)
def pow_result_tensor_scalar_(self, shape_format):
for item in shape_format:
scalar = np.random.randint(0, 1)
cpu_input1, npu_input1 = create_common_tensor(item, 0, 1)
npu_input3 = copy.deepcopy(cpu_input1).to("npu")
if cpu_input1.dtype == torch.float16:
cpu_input1 = cpu_input1.to(torch.float32)
cpu_output_tensor_scalar = self.cpu_op_exec_tensor_scalar(cpu_input1, scalar)
npu_output_tensor_scalar = self.npu_op_exec_tensor_scalar(npu_input1, scalar)
npu_output_tensor_scalar_out = self.npu_op_exec_tensor_scalar_out(npu_input1, scalar, npu_input3)
cpu_output_tensor_scalar = cpu_output_tensor_scalar.astype(npu_output_tensor_scalar.dtype)
self.assertRtolEqual(cpu_output_tensor_scalar, npu_output_tensor_scalar)
self.assertRtolEqual(cpu_output_tensor_scalar, npu_output_tensor_scalar_out)
# scalar_tensor-------------------------------------------------------
def test_pow_shape_format_scalar_tensor_fp16_1d(self, device):
format_list = [-1, 0, 3]
shape_format = [[np.float16, i, [18]] for i in format_list]
self.pow_result_scalar_tensor(shape_format)
def test_pow_shape_format_scalar_tensor_fp32_1d(self, device):
format_list = [-1, 0, 3]
shape_format = [[np.float32, i, [18]] for i in format_list]
self.pow_result_scalar_tensor(shape_format)
def test_pow_shape_format_scalar_tensor_fp16_2d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float16, i, [18, 64]] for i in format_list]
self.pow_result_scalar_tensor(shape_format)
def test_pow_shape_format_scalar_tensor_fp32_2d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float32, i, [18, 64]] for i in format_list]
self.pow_result_scalar_tensor(shape_format)
def test_pow_shape_format_scalar_tensor_fp16_3d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float16, i, [18, 64, 128]] for i in format_list]
self.pow_result_scalar_tensor(shape_format)
def test_pow_shape_format_scalar_tensor_fp32_3d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float32, i, [18, 64, 128]] for i in format_list]
self.pow_result_scalar_tensor(shape_format)
def test_pow_shape_format_scalar_tensor_fp16_4d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float16, i, [18, 64, 128, 256]] for i in format_list]
self.pow_result_scalar_tensor(shape_format)
def test_pow_shape_format_scalar_tensor_fp32_4d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float32, i, [18, 64, 128, 256]] for i in format_list]
self.pow_result_scalar_tensor(shape_format)
# tensor_scalar-----------------------------------------------------------
    def test_pow_shape_format_tensor_scalar_fp16_1d(self, device):
format_list = [-1, 0, 3]
shape_format = [[np.float16, i, [18]] for i in format_list]
self.pow_result_tensor_scalar_(shape_format)
def test_pow_shape_format_tensor_scalar_fp32_1d(self, device):
format_list = [-1, 0, 3]
shape_format = [[np.float32, i, [18]] for i in format_list]
self.pow_result_tensor_scalar_(shape_format)
    def test_pow_shape_format_tensor_scalar_fp16_2d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float16, i, [18, 64]] for i in format_list]
self.pow_result_tensor_scalar_(shape_format)
def test_pow_shape_format_tensor_scalar_fp32_2d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float32, i, [18, 64]] for i in format_list]
self.pow_result_tensor_scalar_(shape_format)
    def test_pow_shape_format_tensor_scalar_fp16_3d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float16, i, [18, 64, 128]] for i in format_list]
self.pow_result_tensor_scalar_(shape_format)
def test_pow_shape_format_tensor_scalar_fp32_3d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float32, i, [18, 64, 128]] for i in format_list]
self.pow_result_tensor_scalar_(shape_format)
    def test_pow_shape_format_tensor_scalar_fp16_4d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float16, i, [18, 64, 128, 256]] for i in format_list]
self.pow_result_tensor_scalar_(shape_format)
def test_pow_shape_format_tensor_scalar_fp32_4d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[np.float32, i, [18, 64, 128, 256]] for i in format_list]
self.pow_result_tensor_scalar_(shape_format)
# tensor_tensor-----------------------------------------------------------
def test_pow_shape_format_fp16_1d(self, device):
format_list = [-1, 0, 3]
shape_format = [[[np.float16, i, [5]], [np.float16, i, []]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp32_1d(self, device):
        format_list = [-1, 0, 3]
shape_format = [[[np.float32, i, [5]], [np.float32, i, []]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp16_2d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float16, i, [448, 1]], [np.float16, i, []]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp32_2d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float32, i, [448, 1]], [np.float32, i, []]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp16_3d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float16, i, [16, 640, 640]], [np.float16, i, []]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp32_3d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float32, i, [16, 640, 640]], [np.float32, i, []]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp16_4d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float16, i, [32, 3, 3, 3]], [np.float16, i, []]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp32_4d(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float32, i, [32, 3, 3, 3]], [np.float32, i, []]] for i in format_list]
self.pow_result(shape_format)
    # broadcast -------------------------------------------------------------
def test_pow_shape_format_fp16_2d_broadcast(self, device):
format_list = [-1, 0, 3, 29]
        shape_format = [[[np.float16, i, [448, 20]], [np.float16, i, [448, 1]]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp32_2d_broadcast(self, device):
format_list = [-1, 0, 3, 29]
        shape_format = [[[np.float32, i, [448, 20]], [np.float32, i, [448, 1]]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp16_3d_broadcast(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float16, i, [16, 640, 640]], [np.float16, i, [16, 640, 1]]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp32_3d_broadcast(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float32, i, [16, 640, 640]], [np.float32, i, [16, 1, 1]]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp16_4d_broadcast(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float16, i, [32, 3, 3, 3]], [np.float16, i, [32, 1, 1, 1]]] for i in format_list]
self.pow_result(shape_format)
def test_pow_shape_format_fp32_4d_broadcast(self, device):
format_list = [-1, 0, 3, 29]
shape_format = [[[np.float32, i, [32, 3, 3, 3]], [np.float32, i, [32, 3, 1, 1]]] for i in format_list]
self.pow_result(shape_format)
instantiate_device_type_tests(TestPow, globals(), except_for="cpu")
if __name__ == "__main__":
run_tests()
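The broadcast tests above exercise elementwise power on tensors with mismatched trailing dimensions (e.g. `[448, 20]` against `[448, 1]`). A minimal sketch of the same broadcasting semantics, using NumPy in place of torch so it runs anywhere (shapes shrunk for readability):

```python
import numpy as np

base = np.full((4, 1), 2.0)       # shape (4, 1), analogous to the [448, 1] operand
exp = np.array([1.0, 2.0, 3.0])   # shape (3,)

# (4, 1) ** (3,) broadcasts to (4, 3): each row is 2 ** [1, 2, 3]
out = base ** exp
assert out.shape == (4, 3)
print(out[0])                     # → [2. 4. 8.]
```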
# WeatherPy/api_keys.py (Sreeteja27/python-api-challenge, ADSL)
# OpenWeatherMap API Key
api_key = "dc40273bbbf4cb8e9aee7fe659e2f605"

# rl/sims/__init__.py (gtrll/librl, MIT)
from rl.sims.envs import Cartpole, Hopper, Snake, Walker3d
from rl.sims.utils import create_sim_env
# etl/parsers/etw/Microsoft_Windows_MPRMSG.py (IMULMUL/etl-parser, Apache-2.0)
# -*- coding: utf-8 -*-
"""
Microsoft-Windows-MPRMSG
GUID : f2c628ae-d26c-4352-9c45-74754e1e2f9f
"""
from construct import Int8sl, Int8ul, Int16ul, Int16sl, Int32sl, Int32ul, Int64sl, Int64ul, Bytes, Double, Float32l, Struct
from etl.utils import WString, CString, SystemTime, Guid
from etl.dtyp import Sid
from etl.parsers.etw.core import Etw, declare, guid
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20001, version=0)
class Microsoft_Windows_MPRMSG_20001_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20002, version=0)
class Microsoft_Windows_MPRMSG_20002_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20003, version=0)
class Microsoft_Windows_MPRMSG_20003_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20004, version=0)
class Microsoft_Windows_MPRMSG_20004_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20005, version=0)
class Microsoft_Windows_MPRMSG_20005_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20006, version=0)
class Microsoft_Windows_MPRMSG_20006_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20009, version=0)
class Microsoft_Windows_MPRMSG_20009_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20010, version=0)
class Microsoft_Windows_MPRMSG_20010_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20011, version=0)
class Microsoft_Windows_MPRMSG_20011_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20012, version=0)
class Microsoft_Windows_MPRMSG_20012_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20013, version=0)
class Microsoft_Windows_MPRMSG_20013_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20016, version=0)
class Microsoft_Windows_MPRMSG_20016_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20018, version=0)
class Microsoft_Windows_MPRMSG_20018_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20019, version=0)
class Microsoft_Windows_MPRMSG_20019_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20020, version=0)
class Microsoft_Windows_MPRMSG_20020_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20021, version=0)
class Microsoft_Windows_MPRMSG_20021_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20022, version=0)
class Microsoft_Windows_MPRMSG_20022_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20023, version=0)
class Microsoft_Windows_MPRMSG_20023_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20024, version=0)
class Microsoft_Windows_MPRMSG_20024_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20025, version=0)
class Microsoft_Windows_MPRMSG_20025_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20026, version=0)
class Microsoft_Windows_MPRMSG_20026_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20027, version=0)
class Microsoft_Windows_MPRMSG_20027_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20028, version=0)
class Microsoft_Windows_MPRMSG_20028_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20029, version=0)
class Microsoft_Windows_MPRMSG_20029_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20030, version=0)
class Microsoft_Windows_MPRMSG_20030_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20031, version=0)
class Microsoft_Windows_MPRMSG_20031_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20032, version=0)
class Microsoft_Windows_MPRMSG_20032_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20033, version=0)
class Microsoft_Windows_MPRMSG_20033_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20034, version=0)
class Microsoft_Windows_MPRMSG_20034_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20035, version=0)
class Microsoft_Windows_MPRMSG_20035_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20036, version=0)
class Microsoft_Windows_MPRMSG_20036_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20037, version=0)
class Microsoft_Windows_MPRMSG_20037_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20038, version=0)
class Microsoft_Windows_MPRMSG_20038_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20042, version=0)
class Microsoft_Windows_MPRMSG_20042_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20043, version=0)
class Microsoft_Windows_MPRMSG_20043_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20044, version=0)
class Microsoft_Windows_MPRMSG_20044_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20045, version=0)
class Microsoft_Windows_MPRMSG_20045_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20046, version=0)
class Microsoft_Windows_MPRMSG_20046_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20047, version=0)
class Microsoft_Windows_MPRMSG_20047_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20051, version=0)
class Microsoft_Windows_MPRMSG_20051_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20052, version=0)
class Microsoft_Windows_MPRMSG_20052_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20053, version=0)
class Microsoft_Windows_MPRMSG_20053_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20054, version=0)
class Microsoft_Windows_MPRMSG_20054_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20055, version=0)
class Microsoft_Windows_MPRMSG_20055_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20056, version=0)
class Microsoft_Windows_MPRMSG_20056_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20057, version=0)
class Microsoft_Windows_MPRMSG_20057_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20058, version=0)
class Microsoft_Windows_MPRMSG_20058_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20059, version=0)
class Microsoft_Windows_MPRMSG_20059_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20060, version=0)
class Microsoft_Windows_MPRMSG_20060_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20061, version=0)
class Microsoft_Windows_MPRMSG_20061_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20062, version=0)
class Microsoft_Windows_MPRMSG_20062_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20063, version=0)
class Microsoft_Windows_MPRMSG_20063_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20064, version=0)
class Microsoft_Windows_MPRMSG_20064_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20065, version=0)
class Microsoft_Windows_MPRMSG_20065_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20066, version=0)
class Microsoft_Windows_MPRMSG_20066_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20067, version=0)
class Microsoft_Windows_MPRMSG_20067_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20068, version=0)
class Microsoft_Windows_MPRMSG_20068_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20069, version=0)
class Microsoft_Windows_MPRMSG_20069_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20070, version=0)
class Microsoft_Windows_MPRMSG_20070_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20071, version=0)
class Microsoft_Windows_MPRMSG_20071_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20072, version=0)
class Microsoft_Windows_MPRMSG_20072_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20079, version=0)
class Microsoft_Windows_MPRMSG_20079_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20080, version=0)
class Microsoft_Windows_MPRMSG_20080_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20081, version=0)
class Microsoft_Windows_MPRMSG_20081_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20082, version=0)
class Microsoft_Windows_MPRMSG_20082_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20084, version=0)
class Microsoft_Windows_MPRMSG_20084_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20085, version=0)
class Microsoft_Windows_MPRMSG_20085_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20086, version=0)
class Microsoft_Windows_MPRMSG_20086_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20087, version=0)
class Microsoft_Windows_MPRMSG_20087_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20088, version=0)
class Microsoft_Windows_MPRMSG_20088_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20090, version=0)
class Microsoft_Windows_MPRMSG_20090_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20091, version=0)
class Microsoft_Windows_MPRMSG_20091_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20092, version=0)
class Microsoft_Windows_MPRMSG_20092_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20093, version=0)
class Microsoft_Windows_MPRMSG_20093_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20094, version=0)
class Microsoft_Windows_MPRMSG_20094_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20096, version=0)
class Microsoft_Windows_MPRMSG_20096_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20098, version=0)
class Microsoft_Windows_MPRMSG_20098_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20099, version=0)
class Microsoft_Windows_MPRMSG_20099_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20100, version=0)
class Microsoft_Windows_MPRMSG_20100_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20101, version=0)
class Microsoft_Windows_MPRMSG_20101_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20102, version=0)
class Microsoft_Windows_MPRMSG_20102_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20103, version=0)
class Microsoft_Windows_MPRMSG_20103_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20104, version=0)
class Microsoft_Windows_MPRMSG_20104_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20105, version=0)
class Microsoft_Windows_MPRMSG_20105_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20106, version=0)
class Microsoft_Windows_MPRMSG_20106_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20107, version=0)
class Microsoft_Windows_MPRMSG_20107_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20108, version=0)
class Microsoft_Windows_MPRMSG_20108_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20110, version=0)
class Microsoft_Windows_MPRMSG_20110_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20111, version=0)
class Microsoft_Windows_MPRMSG_20111_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20112, version=0)
class Microsoft_Windows_MPRMSG_20112_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20113, version=0)
class Microsoft_Windows_MPRMSG_20113_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20114, version=0)
class Microsoft_Windows_MPRMSG_20114_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20125, version=0)
class Microsoft_Windows_MPRMSG_20125_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20126, version=0)
class Microsoft_Windows_MPRMSG_20126_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"param5" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20127, version=0)
class Microsoft_Windows_MPRMSG_20127_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20132, version=0)
class Microsoft_Windows_MPRMSG_20132_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20138, version=0)
class Microsoft_Windows_MPRMSG_20138_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20143, version=0)
class Microsoft_Windows_MPRMSG_20143_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20144, version=0)
class Microsoft_Windows_MPRMSG_20144_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20145, version=0)
class Microsoft_Windows_MPRMSG_20145_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20146, version=0)
class Microsoft_Windows_MPRMSG_20146_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20147, version=0)
class Microsoft_Windows_MPRMSG_20147_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20148, version=0)
class Microsoft_Windows_MPRMSG_20148_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20149, version=0)
class Microsoft_Windows_MPRMSG_20149_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20150, version=0)
class Microsoft_Windows_MPRMSG_20150_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20151, version=0)
class Microsoft_Windows_MPRMSG_20151_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20152, version=0)
class Microsoft_Windows_MPRMSG_20152_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20153, version=0)
class Microsoft_Windows_MPRMSG_20153_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20157, version=0)
class Microsoft_Windows_MPRMSG_20157_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20165, version=0)
class Microsoft_Windows_MPRMSG_20165_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20166, version=0)
class Microsoft_Windows_MPRMSG_20166_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20167, version=0)
class Microsoft_Windows_MPRMSG_20167_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20168, version=0)
class Microsoft_Windows_MPRMSG_20168_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20169, version=0)
class Microsoft_Windows_MPRMSG_20169_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20170, version=0)
class Microsoft_Windows_MPRMSG_20170_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20171, version=0)
class Microsoft_Windows_MPRMSG_20171_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20172, version=0)
class Microsoft_Windows_MPRMSG_20172_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20173, version=0)
class Microsoft_Windows_MPRMSG_20173_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"param5" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20174, version=0)
class Microsoft_Windows_MPRMSG_20174_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20175, version=0)
class Microsoft_Windows_MPRMSG_20175_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20176, version=0)
class Microsoft_Windows_MPRMSG_20176_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20177, version=0)
class Microsoft_Windows_MPRMSG_20177_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20178, version=0)
class Microsoft_Windows_MPRMSG_20178_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20179, version=0)
class Microsoft_Windows_MPRMSG_20179_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20180, version=0)
class Microsoft_Windows_MPRMSG_20180_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20181, version=0)
class Microsoft_Windows_MPRMSG_20181_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20182, version=0)
class Microsoft_Windows_MPRMSG_20182_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20183, version=0)
class Microsoft_Windows_MPRMSG_20183_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20184, version=0)
class Microsoft_Windows_MPRMSG_20184_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20185, version=0)
class Microsoft_Windows_MPRMSG_20185_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20186, version=0)
class Microsoft_Windows_MPRMSG_20186_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20190, version=0)
class Microsoft_Windows_MPRMSG_20190_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20191, version=0)
class Microsoft_Windows_MPRMSG_20191_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20192, version=0)
class Microsoft_Windows_MPRMSG_20192_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20193, version=0)
class Microsoft_Windows_MPRMSG_20193_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20196, version=0)
class Microsoft_Windows_MPRMSG_20196_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20197, version=0)
class Microsoft_Windows_MPRMSG_20197_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20198, version=0)
class Microsoft_Windows_MPRMSG_20198_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20199, version=0)
class Microsoft_Windows_MPRMSG_20199_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20202, version=0)
class Microsoft_Windows_MPRMSG_20202_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20203, version=0)
class Microsoft_Windows_MPRMSG_20203_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20204, version=0)
class Microsoft_Windows_MPRMSG_20204_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20205, version=0)
class Microsoft_Windows_MPRMSG_20205_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20206, version=0)
class Microsoft_Windows_MPRMSG_20206_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20207, version=0)
class Microsoft_Windows_MPRMSG_20207_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20208, version=0)
class Microsoft_Windows_MPRMSG_20208_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20209, version=0)
class Microsoft_Windows_MPRMSG_20209_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20210, version=0)
class Microsoft_Windows_MPRMSG_20210_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20211, version=0)
class Microsoft_Windows_MPRMSG_20211_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20212, version=0)
class Microsoft_Windows_MPRMSG_20212_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20213, version=0)
class Microsoft_Windows_MPRMSG_20213_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20214, version=0)
class Microsoft_Windows_MPRMSG_20214_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20215, version=0)
class Microsoft_Windows_MPRMSG_20215_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20216, version=0)
class Microsoft_Windows_MPRMSG_20216_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20217, version=0)
class Microsoft_Windows_MPRMSG_20217_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20218, version=0)
class Microsoft_Windows_MPRMSG_20218_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20219, version=0)
class Microsoft_Windows_MPRMSG_20219_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20220, version=0)
class Microsoft_Windows_MPRMSG_20220_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20230, version=0)
class Microsoft_Windows_MPRMSG_20230_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20247, version=0)
class Microsoft_Windows_MPRMSG_20247_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20248, version=0)
class Microsoft_Windows_MPRMSG_20248_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20249, version=0)
class Microsoft_Windows_MPRMSG_20249_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20250, version=0)
class Microsoft_Windows_MPRMSG_20250_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20251, version=0)
class Microsoft_Windows_MPRMSG_20251_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"param5" / WString,
"param6" / WString,
"param7" / WString,
"param8" / WString,
"param9" / WString,
"param10" / WString,
"param11" / WString,
"param12" / WString,
"param13" / WString,
"param14" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20252, version=0)
class Microsoft_Windows_MPRMSG_20252_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20253, version=0)
class Microsoft_Windows_MPRMSG_20253_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20254, version=0)
class Microsoft_Windows_MPRMSG_20254_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20255, version=0)
class Microsoft_Windows_MPRMSG_20255_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20256, version=0)
class Microsoft_Windows_MPRMSG_20256_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20257, version=0)
class Microsoft_Windows_MPRMSG_20257_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20258, version=0)
class Microsoft_Windows_MPRMSG_20258_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20259, version=0)
class Microsoft_Windows_MPRMSG_20259_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"param5" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20260, version=0)
class Microsoft_Windows_MPRMSG_20260_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20261, version=0)
class Microsoft_Windows_MPRMSG_20261_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20262, version=0)
class Microsoft_Windows_MPRMSG_20262_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20263, version=0)
class Microsoft_Windows_MPRMSG_20263_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20264, version=0)
class Microsoft_Windows_MPRMSG_20264_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20265, version=0)
class Microsoft_Windows_MPRMSG_20265_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20266, version=0)
class Microsoft_Windows_MPRMSG_20266_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20267, version=0)
class Microsoft_Windows_MPRMSG_20267_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20268, version=0)
class Microsoft_Windows_MPRMSG_20268_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20269, version=0)
class Microsoft_Windows_MPRMSG_20269_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20270, version=0)
class Microsoft_Windows_MPRMSG_20270_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20271, version=0)
class Microsoft_Windows_MPRMSG_20271_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"param5" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20272, version=0)
class Microsoft_Windows_MPRMSG_20272_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"param5" / WString,
"param6" / WString,
"param7" / WString,
"param8" / WString,
"param9" / WString,
"param10" / WString,
"param11" / WString,
"param12" / WString,
"param13" / WString,
"param14" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20274, version=0)
class Microsoft_Windows_MPRMSG_20274_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"param5" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20275, version=0)
class Microsoft_Windows_MPRMSG_20275_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20276, version=0)
class Microsoft_Windows_MPRMSG_20276_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20279, version=0)
class Microsoft_Windows_MPRMSG_20279_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"param4" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20280, version=0)
class Microsoft_Windows_MPRMSG_20280_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20281, version=0)
class Microsoft_Windows_MPRMSG_20281_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20282, version=0)
class Microsoft_Windows_MPRMSG_20282_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20283, version=0)
class Microsoft_Windows_MPRMSG_20283_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20284, version=0)
class Microsoft_Windows_MPRMSG_20284_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20285, version=0)
class Microsoft_Windows_MPRMSG_20285_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20286, version=0)
class Microsoft_Windows_MPRMSG_20286_0(Etw):
pattern = Struct(
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20287, version=0)
class Microsoft_Windows_MPRMSG_20287_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20288, version=0)
class Microsoft_Windows_MPRMSG_20288_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20289, version=0)
class Microsoft_Windows_MPRMSG_20289_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20290, version=0)
class Microsoft_Windows_MPRMSG_20290_0(Etw):
pattern = Struct(
"param1" / WString,
"param2" / WString,
"param3" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20291, version=0)
class Microsoft_Windows_MPRMSG_20291_0(Etw):
pattern = Struct(
"param1" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
@declare(guid=guid("f2c628ae-d26c-4352-9c45-74754e1e2f9f"), event_id=20292, version=0)
class Microsoft_Windows_MPRMSG_20292_0(Etw):
pattern = Struct(
"MaxCount" / WString,
"HResult" / WString,
"__binLength" / Int32ul,
"binary" / Bytes(lambda this: this.__binLength)
)
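# Each `pattern` above encodes the same MPR payload shape: a run of
# null-terminated UTF-16LE strings ("paramN") followed by a UInt32 length and
# that many raw bytes. The sketch below decodes that layout with only the
# standard library; it is illustrative (the `parse_mprmsg_payload` helper and
# the sample payload are invented here) and is not the `construct`/`Etw`
# machinery the classes actually use.

```python
import struct


def parse_mprmsg_payload(data: bytes, n_params: int):
    """Decode n_params null-terminated UTF-16LE strings followed by a
    UInt32-length-prefixed binary blob, mirroring the Struct pattern."""
    offset = 0
    params = []
    for _ in range(n_params):
        end = offset
        # scan aligned two-byte units for the UTF-16LE null terminator
        while data[end:end + 2] != b"\x00\x00":
            end += 2
        params.append(data[offset:end].decode("utf-16-le"))
        offset = end + 2
    (bin_length,) = struct.unpack_from("<I", data, offset)
    offset += 4
    binary = data[offset:offset + bin_length]
    return params, binary


# invented sample payload: two strings plus a 3-byte blob
payload = ("interface0".encode("utf-16-le") + b"\x00\x00"
           + "connected".encode("utf-16-le") + b"\x00\x00"
           + struct.pack("<I", 3) + b"\x01\x02\x03")
params, binary = parse_mprmsg_payload(payload, 2)
```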

# premium/backend/tests/baserow_premium/admin/dashboard/test_admin_dashboard_handler.py (ashishdhngr/baserow, MIT)
import pytest
from pytz import timezone
from datetime import timedelta, datetime, date
from django.test.utils import override_settings
from baserow.core.models import UserLogEntry
from baserow_premium.admin.dashboard.handler import AdminDashboardHandler
@pytest.mark.django_db
@override_settings(DEBUG=True)
def test_get_new_user_counts(premium_data_fixture):
tz = timezone("UTC")
premium_data_fixture.create_user(date_joined=datetime(2020, 12, 30, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 1, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 2, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 3, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 4, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 5, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 10, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 23, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 24, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 25, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 26, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 27, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 28, tzinfo=tz))
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 29, tzinfo=tz))
premium_data_fixture.create_user(
date_joined=datetime(2021, 1, 30, 12, 1, tzinfo=tz)
)
premium_data_fixture.create_user(
date_joined=datetime(2021, 1, 30, 15, 1, tzinfo=tz)
)
handler = AdminDashboardHandler()
now = datetime(2021, 1, 30, 23, 59, tzinfo=tz)
assert handler.get_new_user_counts(
{
"last_24_hours": timedelta(hours=24),
"last_7_days": timedelta(days=7),
"last_30_days": timedelta(days=30),
"last_40_days": timedelta(days=40),
"last_10_days": timedelta(days=10),
"last_2_days": timedelta(days=2),
},
now=now,
) == {
"last_24_hours": 2,
"last_7_days": 8,
"last_30_days": 15,
"last_40_days": 16,
"last_10_days": 9,
"last_2_days": 3,
}
assert handler.get_new_user_counts(
{
"last_24_hours": timedelta(hours=24),
"last_7_days": timedelta(days=7),
},
now=now,
include_previous=True,
) == {
"last_24_hours": 2,
"last_7_days": 8,
"previous_last_24_hours": 1,
"previous_last_7_days": 1,
}
@pytest.mark.django_db
@override_settings(DEBUG=True)
def test_get_active_user_counts(premium_data_fixture):
tz = timezone("UTC")
user_1 = premium_data_fixture.create_user()
user_2 = premium_data_fixture.create_user()
user_3 = premium_data_fixture.create_user()
def create_entries(user, dates):
for d in dates:
entry = UserLogEntry()
entry.actor = user
entry.action = "SIGNED_IN"
entry.save()
# To override the auto_now_add.
entry.timestamp = d
entry.save()
create_entries(
user_1,
[
datetime(2020, 12, 30, tzinfo=tz),
datetime(2021, 1, 1, tzinfo=tz),
datetime(2021, 1, 2, tzinfo=tz),
datetime(2021, 1, 3, tzinfo=tz),
datetime(2021, 1, 4, tzinfo=tz),
datetime(2021, 1, 5, tzinfo=tz),
datetime(2021, 1, 7, tzinfo=tz),
datetime(2021, 1, 7, tzinfo=tz),
datetime(2021, 1, 7, tzinfo=tz),
datetime(2021, 1, 8, tzinfo=tz),
datetime(2021, 1, 9, tzinfo=tz),
datetime(2021, 1, 10, tzinfo=tz),
datetime(2021, 1, 20, tzinfo=tz),
datetime(2021, 1, 21, tzinfo=tz),
datetime(2021, 1, 22, tzinfo=tz),
datetime(2021, 1, 29, tzinfo=tz),
],
)
create_entries(
user_2,
[
datetime(2020, 12, 20, tzinfo=tz),
datetime(2021, 1, 1, tzinfo=tz),
datetime(2021, 1, 2, tzinfo=tz),
datetime(2021, 1, 3, tzinfo=tz),
datetime(2021, 1, 4, tzinfo=tz),
datetime(2021, 1, 10, tzinfo=tz),
datetime(2021, 1, 11, tzinfo=tz),
datetime(2021, 1, 12, tzinfo=tz),
datetime(2021, 1, 13, tzinfo=tz),
datetime(2021, 1, 14, tzinfo=tz),
datetime(2021, 1, 15, tzinfo=tz),
datetime(2021, 1, 16, tzinfo=tz),
datetime(2021, 1, 20, tzinfo=tz),
datetime(2021, 1, 21, tzinfo=tz),
datetime(2021, 1, 24, tzinfo=tz),
],
)
create_entries(
user_3,
[
datetime(2020, 12, 20, tzinfo=tz),
datetime(2020, 12, 21, tzinfo=tz),
datetime(2020, 12, 23, tzinfo=tz),
datetime(2020, 12, 25, tzinfo=tz),
datetime(2020, 12, 27, tzinfo=tz),
datetime(2020, 12, 30, tzinfo=tz),
],
)
handler = AdminDashboardHandler()
now = datetime(2021, 1, 30, 23, 59, tzinfo=tz)
assert handler.get_active_user_count(
{
"last_24_hours": timedelta(hours=24),
"last_7_days": timedelta(days=7),
"last_30_days": timedelta(days=30),
"last_40_days": timedelta(days=40),
"last_10_days": timedelta(days=10),
},
now=now,
) == {
"last_24_hours": 0,
"last_7_days": 2,
"last_30_days": 2,
"last_40_days": 3,
"last_10_days": 2,
}
assert handler.get_active_user_count(
{
"last_24_hours": timedelta(hours=24),
"last_7_days": timedelta(days=7),
"last_30_days": timedelta(days=30),
"last_40_days": timedelta(days=40),
"last_10_days": timedelta(days=10),
},
now=now,
include_previous=True,
) == {
"last_24_hours": 0,
"last_7_days": 2,
"last_30_days": 2,
"last_40_days": 3,
"last_10_days": 2,
"previous_last_24_hours": 1,
"previous_last_7_days": 2,
"previous_last_30_days": 3,
"previous_last_40_days": 2,
"previous_last_10_days": 2,
}
@pytest.mark.django_db
@override_settings(DEBUG=True)
def test_get_new_users_per_day(premium_data_fixture):
utc = timezone("UTC")
gmt3 = timezone("Etc/GMT+3")
premium_data_fixture.create_user(
date_joined=datetime(2020, 12, 29, 12, 1, tzinfo=utc)
)
premium_data_fixture.create_user(date_joined=datetime(2021, 1, 1, 1, 1, tzinfo=utc))
premium_data_fixture.create_user(
date_joined=datetime(2021, 1, 1, 12, 1, tzinfo=utc)
)
premium_data_fixture.create_user(
date_joined=datetime(2021, 1, 2, 12, 1, tzinfo=utc)
)
premium_data_fixture.create_user(
date_joined=datetime(2021, 1, 2, 12, 1, tzinfo=utc)
)
premium_data_fixture.create_user(
date_joined=datetime(2021, 1, 2, 12, 1, tzinfo=utc)
)
premium_data_fixture.create_user(
date_joined=datetime(2021, 1, 30, 12, 1, tzinfo=utc)
)
premium_data_fixture.create_user(
date_joined=datetime(2021, 1, 30, 15, 1, tzinfo=utc)
)
handler = AdminDashboardHandler()
now = datetime(2021, 1, 30, 23, 59, tzinfo=utc)
counts = handler.get_new_user_count_per_day(timedelta(days=30), now)
assert len(counts) == 3
assert counts[0]["date"] == date(2021, 1, 1)
assert counts[0]["count"] == 2
assert counts[1]["date"] == date(2021, 1, 2)
assert counts[1]["count"] == 3
assert counts[2]["date"] == date(2021, 1, 30)
assert counts[2]["count"] == 2
now = datetime(2021, 1, 1, 13, 00, tzinfo=utc)
counts = handler.get_new_user_count_per_day(timedelta(days=1), now)
assert len(counts) == 1
assert counts[0]["date"] == date(2021, 1, 1)
assert counts[0]["count"] == 2
now = datetime(2021, 1, 1, 13, 00, tzinfo=gmt3)
counts = handler.get_new_user_count_per_day(timedelta(days=1), now)
assert len(counts) == 2
assert counts[0]["date"] == date(2020, 12, 31)
assert counts[0]["count"] == 1
assert counts[1]["date"] == date(2021, 1, 1)
assert counts[1]["count"] == 1
@pytest.mark.django_db
@override_settings(DEBUG=True)
def test_get_active_users_per_day(premium_data_fixture):
utc = timezone("UTC")
gmt3 = timezone("Etc/GMT+3")
user_1 = premium_data_fixture.create_user()
user_2 = premium_data_fixture.create_user()
user_3 = premium_data_fixture.create_user()
def create_entries(user, dates):
for d in dates:
entry = UserLogEntry()
entry.actor = user
entry.action = "SIGNED_IN"
entry.save()
# To override the auto_now_add.
entry.timestamp = d
entry.save()
create_entries(
user_1,
[
datetime(2020, 12, 29, tzinfo=utc),
datetime(2021, 1, 1, 1, 1, tzinfo=utc),
datetime(2021, 1, 1, 12, 1, tzinfo=utc),
datetime(2021, 1, 1, 13, 1, tzinfo=utc),
datetime(2021, 1, 1, 14, 1, tzinfo=utc),
datetime(2021, 1, 10, 14, 1, tzinfo=utc),
],
)
create_entries(
user_2,
[
datetime(2020, 12, 29, tzinfo=utc),
datetime(2021, 1, 1, 1, 1, tzinfo=utc),
datetime(2021, 1, 10, 12, 1, tzinfo=utc),
datetime(2021, 1, 10, 13, 1, tzinfo=utc),
],
)
create_entries(
user_3,
[
datetime(2021, 1, 2, tzinfo=utc),
datetime(2021, 1, 10, tzinfo=utc),
],
)
handler = AdminDashboardHandler()
now = datetime(2021, 1, 30, 23, 59, tzinfo=utc)
counts = handler.get_active_user_count_per_day(timedelta(days=30), now)
assert len(counts) == 3
assert counts[0]["date"] == date(2021, 1, 1)
assert counts[0]["count"] == 2
assert counts[1]["date"] == date(2021, 1, 2)
assert counts[1]["count"] == 1
assert counts[2]["date"] == date(2021, 1, 10)
assert counts[2]["count"] == 3
now = datetime(2021, 1, 1, 13, 00, tzinfo=utc)
counts = handler.get_active_user_count_per_day(timedelta(days=1), now)
assert len(counts) == 1
assert counts[0]["date"] == date(2021, 1, 1)
assert counts[0]["count"] == 2
now = datetime(2021, 1, 1, 13, 00, tzinfo=gmt3)
counts = handler.get_active_user_count_per_day(timedelta(days=1), now)
assert len(counts) == 2
assert counts[0]["date"] == date(2020, 12, 31)
assert counts[0]["count"] == 2
assert counts[1]["date"] == date(2021, 1, 1)
assert counts[1]["count"] == 1

# src/handler/getinformationbyconfig/contract/request.py (julianVelandia/Eureka, MIT)
class Params:
language: str
config_name: str
def __init__(self, language: str, config_name: str):
self.language = language
self.config_name = config_name
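# For illustration only: the same request contract can be written as a
# dataclass, which auto-generates __init__, __repr__, and __eq__. This is a
# hypothetical alternative sketch, not code from the repository.

```python
from dataclasses import dataclass


@dataclass
class Params:
    """Hypothetical dataclass equivalent of the handwritten contract above."""
    language: str
    config_name: str


# value-based equality comes for free with @dataclass
p = Params(language="python", config_name="default")
same = p == Params(language="python", config_name="default")
```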

# mosaic/__init__.py (histuckyi/mosaic, MIT)
"""
This is the top-level namespace of the mosaic package.
"""
# import
from . import mosaic # noqa
from . import image # noqa

# example/__init__.py (xwkgch/IsoTensor, MIT)
from . import mera
from . import tnr
from . import meraad

# src/no2migen/__init__.py (no2fpga/no2migen, RSA-MD)
from .migen import *

# alosi/data/__init__.py (harvard-vpal/alosi, Apache-2.0)
from . import google_drive, olx

# neuroglia/calcium/oasis/__init__.py (NileGraddis/neuroglia, BSD-3-Clause)
# from .oasis_methods import oasisAR1, constrained_oasisAR1, oasisAR2, constrained_oasisAR2
from . import functions
from . import oasis_methods

# raco/python/__init__.py (uwescience/raco, BSD-3-Clause)
from convert import convert

# 15_pr_07.py (Madhav2204/Python-vs-code, Apache-2.0)
n = 3
for i in range(3):
print(" " * (n-i-1), end="")
print("*" * (2*i+1), end="")
print(" " * (n-i-1)) | 24 | 33 | 0.366667 | 21 | 120 | 2.095238 | 0.47619 | 0.136364 | 0.318182 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0.3 | 120 | 5 | 34 | 24 | 0.452381 | 0 | 0 | 0 | 0 | 0 | 0.025641 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.6 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
118451a77127d5dc39c55c8f38f353b168a7ddf2 | 23 | py | Python | api/api/resources/weather/__init__.py | coreyjwhite/flext | 86b2614a3db203512627c7cd12f8ecc9fcface90 | [
"MIT"
] | null | null | null | api/api/resources/weather/__init__.py | coreyjwhite/flext | 86b2614a3db203512627c7cd12f8ecc9fcface90 | [
"MIT"
] | null | null | null | api/api/resources/weather/__init__.py | coreyjwhite/flext | 86b2614a3db203512627c7cd12f8ecc9fcface90 | [
"MIT"
] | null | null | null | from . import forecast

# src/python/lowerandupper.py (Praggya17/HacktoberFestContribute, MIT)
# converting string to uppercase
def uppercase(string):
    return string.upper()


# converting string to lowercase
def lowercase(string):
    return string.lower()

# src/vak/models/__init__.py (jspaaks/vak, BSD-3-Clause)
from .models import find, from_model_config_map
from . import teenytweetynet
__all__ = ["find", "from_model_config_map", "teenytweetynet"]

# data_cleaning/__init__.py (peter-amerkhanian/Bus_Accidents, MIT)
from data_cleaning.create_html import make_html_table
from data_cleaning.duplicate_handling import get_first_non_null, within_5, max_two_routes

# keys/generate_key.py (Crain-32/wwrando, MIT)
import secrets
print(secrets.token_hex(16))

# users/models.py (DavLivesey/hw05_final, BSD-2-Clause)
from django.contrib.auth import get_user_model
from django.db import models
User = get_user_model()

# analysis/distances.py (morphic-team/research-results, MIT)
import csv
import json
d = json.loads('{"pure":[102054.8731037809,121618.95754858853,72806.80266132006,196661.6322107075,76332.73825128668,67539.53215371947,116835.04508051295,45854.01881859638,19740.322605579095,76116.27803035868,102174.89172236627,18208.408528395943,266440.334996511,20717.97373258692,18705.09802294008,136602.76066983046,135832.64528585435,152502.49924799197,266172.3924627057,33181.53072776702,102299.72419228953,127147.15748041899,19391.051372890197,151362.5702525659,104605.78481100459,76477.07235616203,76415.32945500512,65198.21968579202,155068.14169928487,10380.490350259128,170849.5318224179,282028.6053132273,133055.69841921094,17095.246967327483,1391.7214945234784,65007.52121969288,64070.13468868874,48632.47367106624,168861.18457228836,133933.99294272612,102150.99195438308,160246.43318793422,169011.04239331698,25138.1988224694,490206.07664434414,217643.53143564775,102287.70424662932,155459.84966864,68536.37738582528,26971.335687079172,63170.49717558881,15121.598050938912,82614.42032087577,48209.7291333436,74159.957001691,111780.00430325403,96039.89526430155,54119.48468367237,112047.57189320821,63298.10554374254,74403.20707375549,69095.79294273315,147592.71721718187,122844.19872379744,84182.8631401318,14729.864537061074,157706.90752111064,76725.45282845711,482890.0768378575,157957.20599412633,139604.30001444148,280265.36005996005,309491.7299394059,212597.1935790098,372239.0088832956,423870.08290918387,216535.238409009,171953.09572829836,387102.2837775287,502006.97408226377,76585.16332639303,157592.00719283897,112036.1941637891,168979.5241521824,75140.27425070861,82011.81798001625,73380.47739419722,135370.75381285424,76553.606232692,92969.3930166596,179968.6390836263,19303.68727080851,176904.98946026296,419956.92867986206,58100.04580042804,212663.85101972238,240641.0519369769,209580.25485453813,504987.01813949045,119964.49523804557,382010.5015393246,218439.1847352873,77757.88216268522,135310.06604263792,175830.23857108227,254999.41060136774,246093.29452050079,114423.907
26520744,246109.62561845334,219750.31972594344,201680.37070446767,186559.62741336372,186529.8734552982,179085.02650626752,174819.48601813108,179074.20996247316,66758.94145005009,219750.6247741363,178863.15313936025,169105.82290917906,166701.78630250652,147404.9610279823,180876.70964955192,148376.41228008983,295244.30009532935,216053.78604703746,155247.77814994077,268340.5073169057,246093.29452050079,241301.29968817858,212598.9413321571,82047.74856367928,21749.72183975632,218658.2061158593,445091.27328411327,277592.9437882211,60883.11054065835,8067.228255661756,9336.453611054956,43918.12154772328,8216.391046661523,71847.17675376624,179365.5022240786,17280.952261991435,17682.342040186155,17303.11342697027,17268.198454909845,17327.022812738203,17319.827072623444,18260.467212481846,18228.84847197738,17803.001825392574,18108.297996857753,17682.342040186155,51878.57414749761,18054.554647283305,44884.103143811255,18345.90501780133,60065.68829778242,44884.103143811255,60065.68829778242,18358.66256995795,18263.518237913308,17647.106864023383,17621.56050568087,17654.4132878234,18125.564007980436,18123.89495657167,171995.13103387345,102054.8731037809,39240.27270837228,76116.27803035868,201526.38206419977,8307.427496016251,16522.529930261197,79616.58590240408,16522.529930261197,39966.99572614851,44822.31737778307,44886.99013340696,44474.06213091847,45235.14399024054,440.87586762275515,39983.007161972324,178384.9199176947,39595.59777853531,85531.07799488939,204595.22012238955,94686.32923850809,121987.39172951812,118566.60740965896,44884.103143811255,44838.49771824976,111987.67621247414,123143.51087803733,145350.06259224546,94054.49666729372,388786.4895404398,388799.5854387024,498932.5889043086,39240.27270837228,39969.02293575861,204595.22012238955,145813.50791027083,25073.145707438314,33357.75520896938,28061.83135919827,485088.10443168815,486186.93447802775,257092.7149234824,329688.2262344128,51928.63019071519,485273.66694397735,308605.2298170282,76116.27803035868,486122.1833830
076,257563.53077721084,487727.8452534729,18407.26735936022,485058.29657905706,494227.42663199245,31160.444169295602,81782.99764711561,486119.32966556685,2091.860649686922,28225.51156600926,258153.33434784945,487058.7678338829,41907.3363533799,739909.714326515,485128.4635191034,31160.444169295602,28239.966478279333,78774.07985066652,53619.8862372966,50263.780612556395,120783.9674749375,41907.3363533799,219580.52558385738,186531.86350341517,102062.24542911512,127468.53199208992,164206.22848825596,80620.87855671081,44829.42508504858,135373.60374411114,132959.6471764488,76861.67302702599,20683.795733272047,164206.22848825596,2034.6395308849842,3091.8023958652257,187288.2563616965,1837.020925361182,84465.88386179216,112213.21922272086,29603.212991376768,180267.67150100815,195833.42836066702,157209.183614533,180278.2361202585,180120.63097502023,80700.6852746477,64760.752350325776,184850.69831332474,204067.36199858514,189013.24618709594,194197.4917995847,157877.90667383687,194239.0467429149,17286.414238875976,209580.25485453813,2069.406876446226,35432.177093434955,150831.41557627186,173971.73531007464,44838.88668411276,196894.16521007227],"hybrid":[16862.610330125346,8.434237744761583,25496.522096218876,127471.20974736377,81866.40216932316,44869.756947862326,241301.29968817858,6809.791751450423,79605.86920362037,44415.74727067786,44884.103143811255,44884.103143811255,44884.103143811255,44884.103143811255,44884.103143811255,25072.575405387506,36039.659385371335,50526.070484645286,39240.27270837228,76116.27803035868,139604.30001444148,34359.88591046033,59836.65593778577,45474.646934749006,44863.37657287107,69423.20145563065,187320.00208209874,2056.6015957179243,216301.4087611589]}')
import csv

# Write the distances out in long format: one (type, distance) row per value.
# newline='' is the csv-module recommendation to avoid blank lines on Windows.
f = open('crow-distances-ibis.csv', 'w', newline='')
distances_csv = csv.DictWriter(f, fieldnames=['type', 'distance'])
distances_csv.writeheader()
for pure_crow_distance in d['pure']:
    distances_csv.writerow({
        'type': 'pure',
        'distance': pure_crow_distance,
    })
for hybrid_crow_distance in d['hybrid']:
    distances_csv.writerow({
        'type': 'hybrid',
        'distance': hybrid_crow_distance,
    })
f.close()
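As a sanity check, a file with the same `type`/`distance` schema can be round-tripped with `csv.DictReader`. This is a minimal, self-contained sketch: the file name and the two sample values below are illustrative stand-ins, not taken from the full dataset.

```python
import csv

# Write a tiny long-format file with the same schema as above
# (sample values are illustrative, not the real distances).
with open('crow-distances-sample.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=['type', 'distance'])
    writer.writeheader()
    writer.writerow({'type': 'pure', 'distance': 16862.61})
    writer.writerow({'type': 'hybrid', 'distance': 8.43})

# Read it back, grouping distances by their 'type' column.
by_type = {}
with open('crow-distances-sample.csv', newline='') as f:
    for row in csv.DictReader(f):
        by_type.setdefault(row['type'], []).append(float(row['distance']))

print(by_type)
```

Grouping on read-back makes it easy to confirm that every row landed under the expected type label.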