hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
235999ccf160401d757ca2409a60b6fbd3ad27ee | 146 | py | Python | tests/clear_team_data_for_redispatching.py | alibaba/easydispatch | 2cf32a374d12c804ff396f90b789c2a838003c5d | [
"Apache-2.0"
] | 11 | 2021-05-04T03:15:20.000Z | 2022-02-16T07:44:16.000Z | tests/clear_team_data_for_redispatching.py | alibaba/easydispatch | 2cf32a374d12c804ff396f90b789c2a838003c5d | [
"Apache-2.0"
] | 4 | 2021-06-21T11:12:37.000Z | 2021-06-29T11:54:18.000Z | tests/clear_team_data_for_redispatching.py | alibaba/easydispatch | 2cf32a374d12c804ff396f90b789c2a838003c5d | [
"Apache-2.0"
] | 2 | 2021-05-05T00:42:44.000Z | 2021-05-10T12:51:58.000Z | from dispatch.common.utils.kandbox_clear_data import clear_team_data_for_redispatching
clear_team_data_for_redispatching(org_code='0', team_id=1)
| 48.666667 | 86 | 0.890411 | 24 | 146 | 4.916667 | 0.666667 | 0.152542 | 0.220339 | 0.271186 | 0.491525 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014286 | 0.041096 | 146 | 2 | 87 | 73 | 0.828571 | 0 | 0 | 0 | 0 | 0 | 0.006849 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
236802c1b1619db5fa65b8de00998bbe955c7c2d | 48 | py | Python | pixsfm/residuals/__init__.py | prajwalchidananda/pixel-perfect-sfm | 78f56bc36d6fb38920bd01807d197fdc04728423 | [
"Apache-2.0"
] | null | null | null | pixsfm/residuals/__init__.py | prajwalchidananda/pixel-perfect-sfm | 78f56bc36d6fb38920bd01807d197fdc04728423 | [
"Apache-2.0"
] | null | null | null | pixsfm/residuals/__init__.py | prajwalchidananda/pixel-perfect-sfm | 78f56bc36d6fb38920bd01807d197fdc04728423 | [
"Apache-2.0"
] | null | null | null | from .._pixsfm._residuals import * # noqa F403
| 24 | 47 | 0.729167 | 6 | 48 | 5.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 0.166667 | 48 | 1 | 48 | 48 | 0.75 | 0.1875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
88cb5503438db132e3f37a41251195488dc6fde6 | 33 | py | Python | async_polygon/__init__.py | mirage-deadline/async-Polygon | 7b08c1943c11313f17b48c0caa4374de90fb01e2 | [
"MIT"
] | null | null | null | async_polygon/__init__.py | mirage-deadline/async-Polygon | 7b08c1943c11313f17b48c0caa4374de90fb01e2 | [
"MIT"
] | null | null | null | async_polygon/__init__.py | mirage-deadline/async-Polygon | 7b08c1943c11313f17b48c0caa4374de90fb01e2 | [
"MIT"
] | null | null | null | from .rest import RestAsyncClient | 33 | 33 | 0.878788 | 4 | 33 | 7.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 33 | 1 | 33 | 33 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
88dfd1ab43f1ecbdaac81765d1183373a09b62aa | 126 | py | Python | app/core/utilities/__init__.py | michaelscales88/mWreporting_final | b0399fb32fd594c2f5a20d47c2c0dceaecb6f326 | [
"MIT"
] | 2 | 2019-06-10T21:15:03.000Z | 2020-01-02T13:12:45.000Z | app/core/utilities/__init__.py | michaelscales88/python-reporting-app | b0399fb32fd594c2f5a20d47c2c0dceaecb6f326 | [
"MIT"
] | 14 | 2018-01-18T19:07:15.000Z | 2018-05-16T18:44:55.000Z | app/core/utilities/__init__.py | michaelscales88/mWreporting_final | b0399fb32fd594c2f5a20d47c2c0dceaecb6f326 | [
"MIT"
] | null | null | null | from .forms import *
from .health_check import *
from .helpers import *
from .logger import *
from .authenticate_jwt import *
| 21 | 31 | 0.761905 | 17 | 126 | 5.529412 | 0.529412 | 0.425532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15873 | 126 | 5 | 32 | 25.2 | 0.886792 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
88e3a97dcab079c3e94c7d1cc7a27d2fdd4c7799 | 44,435 | py | Python | build/lib/DissNet/metrics.py | Davi1990/DissNet | 10755322a87e8ce8d22f0641651334f86088fc6d | [
"MIT"
] | null | null | null | build/lib/DissNet/metrics.py | Davi1990/DissNet | 10755322a87e8ce8d22f0641651334f86088fc6d | [
"MIT"
] | null | null | null | build/lib/DissNet/metrics.py | Davi1990/DissNet | 10755322a87e8ce8d22f0641651334f86088fc6d | [
"MIT"
] | null | null | null | """
miscellaneous functions and classes to extract connectivity metrics
Author: Davide Momi, PhD [momi.davide89@gmail.com], https://twitter.com/davemomi
"""
import numpy as np
import pandas as pd
from math import pi
import glob
import seaborn as sns
import matplotlib.pyplot as plt
import bct as bct
class Connectivity_metrics(object):
def __init__(self, matrices_files, net_label_txt, labels_dic):
self.matrices_files = matrices_files
self.net_label_txt = net_label_txt
self.labels_dic = labels_dic
def nodes_overall_conn(self, make_symmetric=True, upper_threshold=None,
lower_threshold=None):
'''
computing the overall connectivity of each node
regardless of network affiliation
Parameters
----------
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
Returns
-------
float data : numpy array |
numpy array (dim number of subject X number of node)
representing the connectivity of each node regardless
of network affiliation
'''
self.nodes_conn = []
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep= ' ', header=None)
self.matrix = np.array(self.matrix)
            if make_symmetric:
                # mirror the stored triangle into a full symmetric matrix
                self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
            self.max = np.max(self.matrix)
            if upper_threshold is not None:
                self.matrix[self.matrix < upper_threshold * self.max / 100] = 0
            if lower_threshold is not None:
                self.matrix[self.matrix > lower_threshold * self.max / 100] = 0
            np.fill_diagonal(self.matrix, 0)
for nodes in range(self.matrix.shape[0]):
self._node_conn = np.sum(self.matrix[nodes])
self.nodes_conn.append(self._node_conn)
self.nodes_conn = np.array(self.nodes_conn)
self.nodes_conn = self.nodes_conn.reshape(len(self.matrices_files), self.matrix.shape[0])
return self.nodes_conn
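The symmetrize-threshold-sum pipeline that `nodes_overall_conn` applies per subject can be sketched standalone with NumPy (the helper name `node_overall_conn` and the matrix values are illustrative, not part of DissNet's API):

```python
import numpy as np

def node_overall_conn(matrix, make_symmetric=True,
                      upper_threshold=None, lower_threshold=None):
    m = matrix.astype(float).copy()
    if make_symmetric:
        # mirror the stored triangle into a full symmetric matrix
        m = m + m.T - np.diag(m.diagonal())
    mx = m.max()
    if upper_threshold is not None:
        m[m < upper_threshold * mx / 100] = 0   # drop weak edges
    if lower_threshold is not None:
        m[m > lower_threshold * mx / 100] = 0   # drop strong edges
    np.fill_diagonal(m, 0)
    return m.sum(axis=1)  # overall connectivity per node

upper = np.array([[0., 2., 4.],
                  [0., 0., 6.],
                  [0., 0., 0.]])
conn = node_overall_conn(upper)   # -> [6., 8., 10.]
```

With `upper_threshold=50` the cut is 50% of the maximum (3.0 here), so the weight-2 edge is zeroed and the sums become `[4., 6., 10.]`.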
def node_inner_conn(self, sbj_number, nodes_number, make_symmetric=True,
upper_threshold=None, lower_threshold=None):
'''
computing the connectivity of each node with its own network
Parameters
----------
sbj_number: int |
number of subjects
nodes_number: int|
number of nodes
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
Returns
-------
float data : numpy array |
numpy array (dim number of subject X number of node)
representing the connectivity of each node with its own
network
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
self.all_conn = np.zeros([sbj_number, nodes_number])
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep= ' ', header=None)
self.matrix = np.array(self.matrix)
            if make_symmetric:
                # mirror the stored triangle into a full symmetric matrix
                self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
            self.max = np.max(self.matrix)
            if upper_threshold is not None:
                self.matrix[self.matrix < upper_threshold * self.max / 100] = 0
            if lower_threshold is not None:
                self.matrix[self.matrix > lower_threshold * self.max / 100] = 0
            np.fill_diagonal(self.matrix, 0)
for network in net:
for nodes in self.labels_dic[network]:
self.sub_matrix =self.matrix[nodes]
self.streamlines_sum = np.sum(self.sub_matrix[self.labels_dic[network]])
self.all_conn[subj, nodes] = self.streamlines_sum/self.labels_dic[network].shape[0]
return self.all_conn
def node_outer_conn(self, sbj_number, nodes_number, make_symmetric=True,
upper_threshold=None, lower_threshold=None):
'''
computing the connectivity of each node with the other nodes
which don't belong to the same network
Parameters
----------
sbj_number: int |
number of subjects
nodes_number: int|
number of nodes
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
Returns
-------
float data : numpy array |
numpy array (dim number of subject X number of node)
representing the connectivity of each node with regions that
            are outside the node's network
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
self.all_conn = np.zeros([sbj_number, nodes_number])
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep= ' ', header=None)
self.matrix = np.array(self.matrix)
            if make_symmetric:
                # mirror the stored triangle into a full symmetric matrix
                self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
            self.max = np.max(self.matrix)
            if upper_threshold is not None:
                self.matrix[self.matrix < upper_threshold * self.max / 100] = 0
            if lower_threshold is not None:
                self.matrix[self.matrix > lower_threshold * self.max / 100] = 0
            np.fill_diagonal(self.matrix, 0)
self.nodes_ranges = np.arange(len(self.labels_dic['nodes']))
for network in net:
self.outer_idx = np.setdiff1d(self.nodes_ranges, self.labels_dic[network])
for nodes in self.outer_idx:
self.sub_matrix =self.matrix[nodes]
self.streamlines_sum = np.sum(self.sub_matrix[self.outer_idx])
self.all_conn[subj, nodes] = self.streamlines_sum/self.outer_idx.shape[0]
return self.all_conn
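The complement step in `node_outer_conn` — everything not in the network, via `np.setdiff1d` — can be illustrated on a toy atlas (node indices and matrix values are hypothetical):

```python
import numpy as np

# Toy atlas: 6 nodes, a hypothetical network owning nodes 1 and 3.
all_nodes = np.arange(6)
network_idx = np.array([1, 3])
outer_idx = np.setdiff1d(all_nodes, network_idx)   # -> [0, 2, 4, 5]

# Mean connectivity of node 0 to the nodes outside the network,
# as node_outer_conn averages it.
matrix = np.arange(36, dtype=float).reshape(6, 6)
np.fill_diagonal(matrix, 0)
outer_conn_node0 = matrix[0, outer_idx].sum() / outer_idx.shape[0]  # -> 2.75
```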
def node_ranking(self, sbj_number, nodes_number, networks_number,
make_symmetric=True, upper_threshold=None, lower_threshold=None):
'''
        computing how much each node is connected with each network
Parameters
----------
sbj_number: int |
number of subjects
nodes_number: int|
number of nodes
networks_number: int|
number of networks
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
Returns
-------
float data : numpy array |
            a 3D numpy array (number of subjects X number of nodes X
            number of networks) representing the connectivity of each
            node with every network
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
self.all_conn = np.zeros([sbj_number, nodes_number, networks_number])
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep= ' ', header=None)
self.matrix = np.array(self.matrix)
            if make_symmetric:
                # mirror the stored triangle into a full symmetric matrix
                self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
            self.max = np.max(self.matrix)
            if upper_threshold is not None:
                self.matrix[self.matrix < upper_threshold * self.max / 100] = 0
            if lower_threshold is not None:
                self.matrix[self.matrix > lower_threshold * self.max / 100] = 0
            np.fill_diagonal(self.matrix, 0)
for nodes in range(self.matrix.shape[0]):
self.node_conn = self.matrix[nodes]
for network in net:
self.streamlines_sum = np.sum(self.node_conn[self.labels_dic[network]])
self.all_conn[subj, nodes, net.index(network)] = self.streamlines_sum/self.labels_dic[network].shape[0]
return self.all_conn
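The core of `node_ranking` — for every node, the mean weight to each network — can be sketched on a hypothetical two-network parcellation (all names and values below are illustrative):

```python
import numpy as np

# Hypothetical toy parcellation: 4 nodes split into two networks.
labels_dic = {"NetA": np.array([0, 1]), "NetB": np.array([2, 3])}
net = ["NetA", "NetB"]
matrix = np.array([[0., 5., 1., 1.],
                   [5., 0., 1., 1.],
                   [1., 1., 0., 5.],
                   [1., 1., 5., 0.]])

# For every node, sum the weights to each network's members and
# normalize by the network size.
ranking = np.zeros((matrix.shape[0], len(net)))
for node in range(matrix.shape[0]):
    for j, network in enumerate(net):
        idx = labels_dic[network]
        ranking[node, j] = matrix[node, idx].sum() / idx.shape[0]
# ranking[0] -> [2.5, 1.0]: node 0 is ranked closer to its own NetA.
```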
def net_inner_conn(self, make_symmetric=True, upper_threshold=None,
lower_threshold=None):
'''
        computing how much each network is connected with itself
Parameters
----------
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
Returns
-------
float data : numpy array |
numpy array (dim number of subject X number of network)
representing the connectivity of each network with itself
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
self.all_conn = []
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep= ' ', header=None)
self.matrix = np.array(self.matrix)
            if make_symmetric:
                # mirror the stored triangle into a full symmetric matrix
                self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
            self.max = np.max(self.matrix)
            if upper_threshold is not None:
                self.matrix[self.matrix < upper_threshold * self.max / 100] = 0
            if lower_threshold is not None:
                self.matrix[self.matrix > lower_threshold * self.max / 100] = 0
            np.fill_diagonal(self.matrix, 0)
for network in net:
self.subj_matrix = self.matrix[self.labels_dic[network]]
self.subj_matrix = self.subj_matrix[:,self.labels_dic[network]]
self.streamlines_sum = np.sum(np.sum(self.subj_matrix))
self.conn_measure = self.streamlines_sum/len(self.labels_dic[network])
self.all_conn.append(self.conn_measure)
self.all_conn = np.array(self.all_conn)
self.all_conn = self.all_conn.reshape(len(self.matrices_files), len(net))
return self.all_conn
def net_outer_conn(self, make_symmetric=True, upper_threshold=None,
lower_threshold=None):
'''
computing how much each network is connected with the other
networks
Parameters
----------
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
Returns
-------
float data : numpy array |
numpy array (dim number of subject X number of network)
representing the connectivity of each network with other networks
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
self.all_conn = []
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep= ' ', header=None)
self.matrix = np.array(self.matrix)
            if make_symmetric:
                # mirror the stored triangle into a full symmetric matrix
                self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
            self.max = np.max(self.matrix)
            if upper_threshold is not None:
                self.matrix[self.matrix < upper_threshold * self.max / 100] = 0
            if lower_threshold is not None:
                self.matrix[self.matrix > lower_threshold * self.max / 100] = 0
            np.fill_diagonal(self.matrix, 0)
self.nodes_ranges = np.arange(len(self.labels_dic['nodes']))
for network in net:
self.outer_idx = np.setdiff1d(self.nodes_ranges, self.labels_dic[network])
self.subj_matrix = self.matrix[self.labels_dic[network]]
self.subj_matrix = self.subj_matrix[:,self.outer_idx]
self.streamlines_sum = np.sum(np.sum(self.subj_matrix))
self.conn_measure = self.streamlines_sum/self.outer_idx.shape[0]
self.all_conn.append(self.conn_measure)
self.all_conn = np.array(self.all_conn)
self.all_conn = self.all_conn.reshape(len(self.matrices_files), len(net))
return self.all_conn
def net_ranking(self, sbj_number, nodes_number, make_symmetric=True,
upper_threshold=None, lower_threshold=None,
percentage_value=False):
'''
        computing how much each network is connected with every network
Parameters
----------
sbj_number: int |
number of subjects
nodes_number: int|
number of nodes
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
        percentage_value: Boolean |
            True returns values expressed as percentages;
            False returns raw values
Returns
-------
float data : numpy array |
            a 3D numpy array (number of subjects X number of networks X
            number of networks) representing the connectivity of each
            network with every network
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
self.all_conn = self.node_ranking(sbj_number, nodes_number, len(net), make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold)
self.all_conn_rank = np.zeros([sbj_number, len(net), len(net)])
for subj in range(len(self.matrices_files)):
self.subj2use = self.all_conn[subj,:,:]
for network in net:
self.net2use = self.subj2use[self.labels_dic[network],:]
if percentage_value==False:
self.all_conn_rank[subj, net.index(network), :] = np.mean(self.net2use, axis=0)
else:
self.all_conn_rank[subj, net.index(network), :] = 100* np.mean(self.net2use, axis=0)/np.sum(np.mean(self.net2use, axis=0))
return self.all_conn_rank
def all_standard_metrics(self, sbj_number, nodes_number, networks_number,
make_symmetric=True, upper_threshold=None,
lower_threshold=None, percentage_value=False):
self.metrics_dict = {
"nodes_overall_conn": self.nodes_overall_conn(make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold),
"node_inner_conn": self.node_inner_conn(sbj_number, nodes_number, make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold),
"node_outer_conn": self.node_outer_conn(sbj_number, nodes_number, make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold),
"node_ranking": self.node_ranking(sbj_number, nodes_number, networks_number, make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold),
"net_inner_conn": self.net_inner_conn(make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold),
"net_outer_conn": self.net_outer_conn(make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold),
"net_ranking": self.net_ranking(sbj_number, nodes_number, make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold, percentage_value=percentage_value)
}
return self.metrics_dict
class Graph_Theory(object):
def __init__(self, matrices_files, net_label_txt, labels_dic):
self.matrices_files = matrices_files
self.net_label_txt = net_label_txt
self.labels_dic = labels_dic
def nodal_degree(self, sbj_number, nodes_number, make_symmetric=True,
upper_threshold=None, lower_threshold=None, binarize=False):
'''
computing graph theory node measures regardless of network affiliation
Parameters
----------
sbj_number: int |
number of subjects
nodes_number: int|
number of nodes
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
        binarize: Boolean |
            True makes the connectivity matrix binary
            (Default is False)
Returns
-------
        dict : dictionary with the following keys |
            degree: int | number of links connected to the node
            node_strength_undir: float | sum of weights of links
            connected to the node
'''
self.all_nodal_degree = {
"degree": np.zeros([sbj_number, nodes_number]),
# "in_degree" : np.zeros([sbj_number, nodes_number]),
# "out_degree" : np.zeros([sbj_number, nodes_number]),
# "joint_in_degree" : np.zeros([sbj_number, nodes_number]),
# "joint_out_degree" : np.zeros([sbj_number, nodes_number]),
# "joint_bilateral" : np.zeros([sbj_number, nodes_number]),
# "node_strength_dir": np.zeros([sbj_number, nodes_number]),
"node_strength_undir":np.zeros([sbj_number, nodes_number])
}
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep= ' ', header=None)
self.matrix = np.array(self.matrix)
            if make_symmetric:
                # mirror the stored triangle into a full symmetric matrix
                self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
            self.max = np.max(self.matrix)
            if upper_threshold is not None:
                self.matrix[self.matrix < upper_threshold * self.max / 100] = 0
            if lower_threshold is not None:
                self.matrix[self.matrix > lower_threshold * self.max / 100] = 0
            if binarize:
                self.matrix = bct.algorithms.binarize(self.matrix)
            np.fill_diagonal(self.matrix, 0)
self.deg = bct.algorithms.degrees_und(self.matrix)
# self.all_nodal_degree['in_degree'][subj] = self.inp
# self.all_nodal_degree['out_degree'][subj] = self.od
self.all_nodal_degree['degree'][subj] = self.deg
# self.J, self.J_od, self.J_id, self.J_bl = bct.algorithms.jdegree(self.matrix)
# self.all_nodal_degree['joint_in_degree'][subj] = self.J_id
# self.all_nodal_degree['joint_out_degree'][subj] = self.J_od
# self.all_nodal_degree['joint_bilateral'][subj] = self.J_bl
# self.nodestr_dir = bct.algorithms.strengths_dir(self.matrix)
# self.all_nodal_degree['node_strength_dir'][subj] = self.nodestr_dir
self.nodestr_undir = bct.algorithms.strengths_und(self.matrix)
self.all_nodal_degree['node_strength_undir'][subj] = self.nodestr_undir
return self.all_nodal_degree
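What `degrees_und` and `strengths_und` compute for an undirected matrix can be reproduced with plain NumPy (the weights below are illustrative; bct's functions are the ones actually used above):

```python
import numpy as np

# Small undirected weighted matrix: node 1 links to both neighbors.
w = np.array([[0., 2., 0.],
              [2., 0., 3.],
              [0., 3., 0.]])

degree = np.count_nonzero(w, axis=1)   # links per node -> [1, 2, 1]
strength = w.sum(axis=1)               # summed weights -> [2., 5., 3.]
```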
def network_level_degree(self, sbj_number, nodes_number, label_dic,
make_symmetric=True, upper_threshold=None,
lower_threshold=None, binarize=False,):
'''
computing graph theory node measures specific for each network
Parameters
----------
sbj_number: int |
number of subjects
nodes_number: int|
number of nodes
label_dic: dict |
            dictionary computed using files.labels()
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
        binarize: Boolean |
            True makes the connectivity matrix binary
            (Default is False)
Returns
-------
        dict : dictionary with the following keys |
            degree: float | mean degree of the nodes in each network
            node_strength_undir: float | mean strength of the nodes
            in each network
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
self.degree = self.nodal_degree(sbj_number, nodes_number, make_symmetric=make_symmetric, upper_threshold=upper_threshold, lower_threshold=lower_threshold, binarize=binarize)
self.values = np.zeros([sbj_number, len(self.degree.keys()), len(net)])
self.list = list(self.degree.keys())
for subject in range(sbj_number):
for key in self.list:
for network in net:
self.values[subject, self.list.index(key), net.index(network)] = np.mean(self.degree[key][subject][label_dic[network]])
self.d = {}
for i in self.degree.keys():
self.d[i] = self.values[:, self.list.index(i), :]
return self.d
def physical_connectivity(self, sbj_number, networks_number, label_dic,
make_symmetric=True, upper_threshold=None,
lower_threshold=None, binarize=False):
'''
Density is the fraction of present connections to possible connections.
Parameters
----------
sbj_number: int |
number of subjects
networks_number: int|
number of networks
label_dic: dict |
            dictionary computed using files.labels()
        make_symmetric: Boolean |
            True indicates that the matrix is upper or lower
            triangular and needs to be symmetrized;
            False indicates that the matrix is already full
        upper_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries below that threshold are set to 0
            (Default is None)
        lower_threshold: int |
            an integer from 0 to 100, expressed as a percentage of the
            maximum value; entries above that threshold are set to 0
            (Default is None)
        binarize: Boolean |
            True makes the connectivity matrix binary
            (Default is False)
Returns
-------
dict: dictionary with the following keys |
Density_und: int | Density is the fraction of present connections
to possible connections
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
# keep results in a separate attribute so the dict does not shadow this method
self.physical_conn = {
"Density_und": np.zeros([sbj_number, networks_number]),
}
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep=' ', header=None)
self.matrix = np.array(self.matrix)
if make_symmetric:
self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
self.max = np.max(self.matrix.flatten())
if upper_threshold is not None:
self.matrix[self.matrix < upper_threshold*self.max/100] = 0
if lower_threshold is not None:
self.matrix[self.matrix > lower_threshold*self.max/100] = 0
if binarize:
self.matrix = bct.algorithms.binarize(self.matrix)
np.fill_diagonal(self.matrix, 0)
for network in net:
self.net_matrix = self.matrix[label_dic[network]]
self.net_matrix = self.net_matrix[:, label_dic[network]]
self.kden, self.n, self.k = bct.algorithms.density_und(self.net_matrix)
self.physical_conn['Density_und'][subj, net.index(network)] = self.kden
return self.physical_conn
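`bct.algorithms.density_und` returns the triple `(kden, n, k)` unpacked above. A rough pure-NumPy equivalent, assuming a binary undirected matrix with a zeroed diagonal, makes the density formula explicit:

```python
import numpy as np

def density_und(matrix):
    """Fraction of present connections to possible connections (undirected)."""
    n = matrix.shape[0]                            # number of vertices
    k = int(np.count_nonzero(np.triu(matrix, 1)))  # edges above the diagonal
    kden = 2.0 * k / (n * (n - 1))                 # possible edges: n*(n-1)/2
    return kden, n, k

adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]])
kden, n, k = density_und(adj)
print(kden, n, k)  # density 2/3 with 3 vertices and 2 edges
```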
def modularity(self, sbj_number, networks_number, label_dic,
make_symmetric=True, upper_threshold=None,
lower_threshold=None, binarize=False):
'''
Computing the modularity of the adjacency matrix
Parameters
----------
sbj_number: int |
number of subjects
networks_number: int|
number of networks
label_dic: dict |
dictionary computed using files.labels()
make_symmetric: Boolean |
True indicates that the matrix is either upper
or lower triangular and needs to be symmetrized
False indicates that the matrix is already a full matrix
upper_threshold: int |
an integer value ranging from 0 to 100 representing the
percentage of values with respect to the maximum. Values
under that threshold will be set to 0 (Default is None)
lower_threshold: int |
an integer value ranging from 0 to 100 representing the
percentage of values with respect to the maximum. Values
above that threshold will be set to 0 (Default is None)
binarize: Boolean |
True will make the connectivity matrix binary
Default is False
Returns
-------
dict: dictionary with the following keys |
community_louvain: int | Modularity values
similarity_idx: float32 | Values indicating how much each original
network is similar to the new modules found
with the modularity algorithm
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
# keep results in a separate attribute so the dict does not shadow this method
self.modularity_dict = {
"community_louvain": np.zeros([sbj_number, networks_number]),
"similarity_idx": np.zeros([sbj_number, networks_number])
}
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep=' ', header=None)
self.matrix = np.array(self.matrix)
if make_symmetric:
self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
self.max = np.max(self.matrix.flatten())
if upper_threshold is not None:
self.matrix[self.matrix < upper_threshold*self.max/100] = 0
if lower_threshold is not None:
self.matrix[self.matrix > lower_threshold*self.max/100] = 0
if binarize:
self.matrix = bct.algorithms.binarize(self.matrix)
np.fill_diagonal(self.matrix, 0)
for network in net:
self.net_matrix = self.matrix[label_dic[network]]
self.net_matrix = self.net_matrix[:, label_dic[network]]
self.ci, self.q = bct.algorithms.community_louvain(self.net_matrix)
self.modularity_dict['community_louvain'][subj, net.index(network)] = self.q
self.unico = np.unique(self.ci)
self.index = {}
for values in self.unico:
self.index[values] = np.where(self.ci == values)
self.similarity_matrix = np.zeros([len(net), self.unico.shape[0]])
for network in net:
for module in self.unico:
self.simil = np.intersect1d(label_dic[network], self.index[module])
self.similarity_matrix[net.index(network), module-1] = self.simil.shape[0]
self.modularity_dict['similarity_idx'][subj, net.index(network)] = (np.max(self.similarity_matrix[net.index(network)]) - np.min(self.similarity_matrix[net.index(network)])) / label_dic[network].shape[0]
return self.modularity_dict
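The similarity index computed above measures, for each original network, the spread of its overlap with the detected modules, normalized by network size. A small sketch of that computation with hypothetical node labels:

```python
import numpy as np

# Hypothetical: node indices of one original network
network_nodes = np.array([0, 1, 2, 3])
# Hypothetical module membership produced by community detection
modules = {1: np.array([0, 1, 2]), 2: np.array([3, 4, 5])}

# Overlap of the network with every module, as in np.intersect1d above
overlaps = np.array([np.intersect1d(network_nodes, idx).shape[0]
                     for idx in modules.values()])
# similarity_idx = (max overlap - min overlap) / network size
similarity_idx = (overlaps.max() - overlaps.min()) / network_nodes.shape[0]
print(similarity_idx)  # 0.5
```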
def centrality(self, sbj_number, nodes_number, atlas,
make_symmetric=True, upper_threshold=None,
lower_threshold=None, binarize=False):
'''
Computing centrality measures of the adjacency matrix
Parameters
----------
sbj_number: int |
number of subjects
nodes_number: int|
number of nodes
atlas: excel file |
please see the example available in the repo (e.g. new_atlas_coords.xlsx)
make_symmetric: Boolean |
True indicates that the matrix is either upper
or lower triangular and needs to be symmetrized
False indicates that the matrix is already a full matrix
upper_threshold: int |
an integer value ranging from 0 to 100 representing the
percentage of values with respect to the maximum. Values
under that threshold will be set to 0 (Default is None)
lower_threshold: int |
an integer value ranging from 0 to 100 representing the
percentage of values with respect to the maximum. Values
above that threshold will be set to 0 (Default is None)
binarize: Boolean |
True will make the connectivity matrix binary
Default is False
Returns
-------
dict: dictionary with the following keys |
edge_betweeness_bin: | np.ndarray
Betweenness centrality is the fraction of all
shortest paths in the network that pass through a
given node. Nodes with high values of betweenness
centrality participate in a large number of shortest
paths. It will return the node betweenness centrality
vector computed on the binarized matrix.
edge_betweeness_wei: | np.ndarray
Same as edge_betweeness_bin, but computed on the
weighted matrix. It will return the node betweenness
centrality vector.
eigenvector_centrality_und: | np.ndarray
Eigenvector centrality is a self-referential measure
of centrality: nodes have high eigenvector centrality
if they connect to other nodes that have high
eigenvector centrality. The eigenvector centrality of
node i is equivalent to the ith element in the eigenvector
corresponding to the largest eigenvalue of the adjacency matrix.
It will return the eigenvector associated with the
largest eigenvalue of the matrix
coreness_kcoreness_centrality_bu: | np.ndarray
The k-core is the largest subgraph comprising nodes
of degree at least k. The coreness of a node is k if
the node belongs to the k-core but not to the (k+1)-core.
This function computes the coreness of all nodes for a
given binary undirected connection matrix.
It will return the node coreness.
kn_kcoreness_centrality_bu: | np.ndarray
The k-core is the largest subgraph comprising nodes
of degree at least k. The coreness of a node is k if
the node belongs to the k-core but not to the (k+1)-core.
This function computes the coreness of all nodes for a
given binary undirected connection matrix.
It will return the size of k-core
module_degree_zscore: | np.ndarray
The within-module degree z-score is a within-module
version of degree centrality. It will return
within-module degree Z-score
participation_coef: | np.ndarray
Participation coefficient is a measure of diversity
of intermodular connections of individual nodes.
It will return the participation coefficient
subgraph_centrality: | np.ndarray
The subgraph centrality of a node is a weighted sum
of closed walks of different lengths in the network
starting and ending at the node. This function returns
a vector of subgraph centralities for each node of the
network. It will return the subgraph centrality
'''
with open(self.net_label_txt) as f:
net=f.read().splitlines()
self.atlas = pd.read_excel(atlas, header=None)
self.atlas = np.array(self.atlas)
self.ci_original = self.atlas[:,8]
# keep results in a separate attribute so the dict does not shadow this method
self.centrality_dict = {
"edge_betweeness_bin": np.zeros([sbj_number, nodes_number]),
"edge_betweeness_wei": np.zeros([sbj_number, nodes_number]),
"eigenvector_centrality_und": np.zeros([sbj_number, nodes_number]),
"coreness_kcoreness_centrality_bu": np.zeros([sbj_number, nodes_number]),
"kn_kcoreness_centrality_bu": np.zeros([sbj_number, nodes_number]),
"module_degree_zscore": np.zeros([sbj_number, nodes_number]),
"participation_coef": np.zeros([sbj_number, nodes_number]),
"subgraph_centrality": np.zeros([sbj_number, nodes_number])
}
for subj in range(len(self.matrices_files)):
self.matrix = pd.read_csv(self.matrices_files[subj], sep=' ', header=None)
self.matrix = np.array(self.matrix)
if make_symmetric:
self.matrix = self.matrix + self.matrix.T - np.diag(self.matrix.diagonal())
self.max = np.max(self.matrix.flatten())
if upper_threshold is not None:
self.matrix[self.matrix < upper_threshold*self.max/100] = 0
if lower_threshold is not None:
self.matrix[self.matrix > lower_threshold*self.max/100] = 0
self.matrix_bin = bct.algorithms.binarize(self.matrix)
self.matrix_weight = self.matrix
if binarize:
self.matrix = bct.algorithms.binarize(self.matrix)
np.fill_diagonal(self.matrix, 0)
np.fill_diagonal(self.matrix_bin, 0)
np.fill_diagonal(self.matrix_weight, 0)
self.BC = bct.betweenness_bin(self.matrix_bin)
self.centrality_dict['edge_betweeness_bin'][subj] = self.BC
self.BC_w = bct.betweenness_wei(self.matrix_weight)
self.centrality_dict['edge_betweeness_wei'][subj] = self.BC_w
self.v = bct.eigenvector_centrality_und(self.matrix)
self.centrality_dict['eigenvector_centrality_und'][subj] = self.v
self.coreness, self.kn = bct.kcoreness_centrality_bu(self.matrix_bin)
self.centrality_dict['coreness_kcoreness_centrality_bu'][subj] = self.coreness
self.centrality_dict['kn_kcoreness_centrality_bu'][subj] = self.kn
self.Z = bct.module_degree_zscore(self.matrix, ci=self.ci_original)
self.centrality_dict['module_degree_zscore'][subj] = self.Z
self.P = bct.participation_coef(self.matrix, ci=self.ci_original)
self.centrality_dict['participation_coef'][subj] = self.P
self.Cs = bct.subgraph_centrality(self.matrix_bin)
self.centrality_dict['subgraph_centrality'][subj] = self.Cs
return self.centrality_dict
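Every method above repeats the same preprocessing: symmetrize a triangular matrix, zero out values below or above a percentage of the maximum, and optionally binarize. A self-contained sketch of that pipeline in plain NumPy:

```python
import numpy as np

def preprocess(matrix, make_symmetric=True,
               upper_threshold=None, lower_threshold=None, binarize=False):
    """Symmetrize, threshold by percentage of the maximum, optionally binarize."""
    matrix = np.array(matrix, dtype=float)
    if make_symmetric:
        # mirror the triangular half without doubling the diagonal
        matrix = matrix + matrix.T - np.diag(matrix.diagonal())
    peak = matrix.max()
    if upper_threshold is not None:   # keep only the strongest links
        matrix[matrix < upper_threshold * peak / 100] = 0
    if lower_threshold is not None:   # keep only the weakest links
        matrix[matrix > lower_threshold * peak / 100] = 0
    if binarize:
        matrix = (matrix != 0).astype(float)
    np.fill_diagonal(matrix, 0)
    return matrix

upper = np.array([[1., 8., 2.],
                  [0., 1., 10.],
                  [0., 0., 1.]])
print(preprocess(upper, upper_threshold=50, binarize=True))
# [[0. 1. 0.]
#  [1. 0. 1.]
#  [0. 1. 0.]]
```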
| 44.038652 | 213 | 0.583391 | 5,230 | 44,435 | 4.816444 | 0.063289 | 0.092894 | 0.051131 | 0.059547 | 0.841763 | 0.810361 | 0.782374 | 0.774593 | 0.752719 | 0.736443 | 0 | 0.008459 | 0.342905 | 44,435 | 1,008 | 214 | 44.082341 | 0.854271 | 0.411477 | 0 | 0.658537 | 0 | 0 | 0.026598 | 0.007313 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036585 | false | 0 | 0.017073 | 0 | 0.090244 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ccbe1b68cbe8eeea685ac2474ece0c132e53dd8c | 57 | py | Python | utils/__init__.py | nasimdjanovich/async_telegram_bot | 250a36d4382aa51e2f02d5a1de0326890d8f973a | [
"MIT"
] | 3 | 2022-02-17T13:01:11.000Z | 2022-02-17T15:45:03.000Z | utils/__init__.py | turdibek-jumabaev/AsyncTelebot-shablon | f25d28389326ba13a1747a6534c9643c49feeeca | [
"MIT"
] | null | null | null | utils/__init__.py | turdibek-jumabaev/AsyncTelebot-shablon | f25d28389326ba13a1747a6534c9643c49feeeca | [
"MIT"
] | null | null | null | from . import notify_admins
from . import set_my_commands | 28.5 | 29 | 0.842105 | 9 | 57 | 5 | 0.777778 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122807 | 57 | 2 | 29 | 28.5 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ccc36f94c578336d97eafa2850141cdbb54f7ee0 | 155 | py | Python | subject/admin.py | nightwarriorftw/signals | c8274b846f7fd139eb4863250c99a97c33324b34 | [
"MIT"
] | null | null | null | subject/admin.py | nightwarriorftw/signals | c8274b846f7fd139eb4863250c99a97c33324b34 | [
"MIT"
] | null | null | null | subject/admin.py | nightwarriorftw/signals | c8274b846f7fd139eb4863250c99a97c33324b34 | [
"MIT"
] | null | null | null | from django.contrib import admin
from subject.models import Author
from subject.models import Book
admin.site.register(Author)
admin.site.register(Book)
| 19.375 | 33 | 0.825806 | 23 | 155 | 5.565217 | 0.478261 | 0.171875 | 0.265625 | 0.359375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103226 | 155 | 7 | 34 | 22.142857 | 0.920863 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
aeb152909d511b8313faf73de63b2732e1560147 | 4,621 | py | Python | FDARecall.py | Zebra/Savanna-Python-SDK | 973f642a13f8c695a1781322ebf804c34cdb1b47 | [
"MIT"
] | 1 | 2021-03-08T06:49:57.000Z | 2021-03-08T06:49:57.000Z | FDARecall.py | Zebra/Savanna-Python-SDK | 973f642a13f8c695a1781322ebf804c34cdb1b47 | [
"MIT"
] | 1 | 2022-01-25T16:36:43.000Z | 2022-01-25T16:36:43.000Z | FDARecall.py | Zebra/Savanna-Python-SDK | 973f642a13f8c695a1781322ebf804c34cdb1b47 | [
"MIT"
] | 1 | 2020-06-11T14:00:40.000Z | 2020-06-11T14:00:40.000Z | import http
from SavannaAPI import SavannaAPI
from urllib.error import URLError
import logging
"""
FDARecall --- Provides access to the Savanna FDA recall APIs.
@author Dbuhrsmith@zebra.com
"""
class FDARecall:
@staticmethod
def deviceSearch(search):
"""Returns medical device recall notices for a given description
@param search A simple one word search string
@return A JSONObject containing a result from the device recall search, if
any
@throws HTTPError Thrown if there is an error calling the service
"""
try:
return FDARecall.deviceSearch_limit(search, 1)
except URLError as error:
logging.error(error)
raise
@staticmethod
def deviceSearch_limit(search, limit):
"""Returns medical device recall notices for a given description
@param search A simple one word search string
@param limit Maximum number of records to return
@return A JSONObject containing a result from the device recall search, if
any
@throws HTTPError Thrown if there is an error calling the service
"""
try:
return SavannaAPI.callService("recalls/device/description?val={}&limit={}"
.format(search, limit))
except URLError as error:
logging.error(error)
raise
@staticmethod
def drugSearch(search):
"""Returns drug recall notices for a given description
@param search A simple one word search string
@return A JSONObject containing results from the drug recall search, if any
@throws HTTPError Thrown if there is an error calling the service
"""
try:
return FDARecall.drugSearch_limit(search, 1)
except URLError as error:
logging.error(error)
raise
@staticmethod
def drugSearch_limit(search, limit):
"""Returns drug recall notices for a given description
@param search A simple one word search string
@param limit Maximum number of records to return
@return A JSONObject containing results from the drug recall search, if any
@throws HTTPError Thrown if there is an error calling the service
"""
try:
return SavannaAPI.callService("recalls/drug/description?val={}&limit={}"
.format(search, limit))
except URLError as error:
logging.error(error)
raise
@staticmethod
def foodUpc(upc):
"""Returns food recall notices for a given UPC code
@param upc A valid UPC code for a food item
@return A JSONObject containing a result from the food recall lookup, if any
@throws HTTPError Thrown if there is an error calling the service
"""
try:
return FDARecall.foodUpc_limit(upc, 1)
except URLError as error:
logging.error(error)
raise
@staticmethod
def foodUpc_limit(upc, limit):
"""Returns food recall notices for a given UPC code
@param upc A valid UPC code for a food item
@param limit Maximum number of records to return (maximum 99)
@return A JSONObject containing a result from the food recall lookup, if any
@throws HTTPError Thrown if there is an error calling the service
"""
try:
return SavannaAPI.callService("recalls/food/upc?val={}&limit={}"
.format(upc, limit))
except URLError as error:
logging.error(error)
raise
@staticmethod
def drugUpc(upc):
"""Returns FDA drug recall notices for a UPC code
@param upc A valid UPC code for a drug product
@return A JSONObject containing a result from the drug recall lookup, if any
@throws HTTPError Thrown if there is an error calling the service
"""
try:
return FDARecall.drugUpc_limit(upc, 1)
except URLError as error:
logging.error(error)
raise
@staticmethod
def drugUpc_limit(upc, limit):
"""Returns FDA drug recall notices for a UPC code
@param upc A valid UPC code for a drug product
@param limit Maximum number of records to return
@return A JSONObject containing a result from the drug recall lookup, if any
@throws HTTPError Thrown if there is an error calling the service
"""
try:
return SavannaAPI.callService("recalls/drug/upc?val={}&limit={}"
.format(upc, limit))
except URLError as error:
logging.error(error)
raise | 29.062893 | 86 | 0.63298 | 559 | 4,621 | 5.216458 | 0.148479 | 0.013717 | 0.043896 | 0.046639 | 0.87963 | 0.87963 | 0.87963 | 0.87963 | 0.865912 | 0.865912 | 0 | 0.001879 | 0.308808 | 4,621 | 159 | 87 | 29.062893 | 0.911083 | 0.460074 | 0 | 0.676923 | 0 | 0 | 0.070159 | 0.070159 | 0 | 0 | 0 | 0 | 0 | 1 | 0.123077 | false | 0 | 0.061538 | 0 | 0.323077 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
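All of the wrappers above differ only in the endpoint path they hand to SavannaAPI.callService. A sketch of that path construction (the UPC value is made up; base URL and authentication are handled inside SavannaAPI):

```python
def recall_path(category, field, value, limit):
    # e.g. category="drug", field="upc" -> "recalls/drug/upc?val=...&limit=..."
    return "recalls/{}/{}?val={}&limit={}".format(category, field, value, limit)

print(recall_path("drug", "upc", "0300450449109", 1))
# recalls/drug/upc?val=0300450449109&limit=1
```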
4e15a4b4f10db8efd8efb5f30bfa821c169da02d | 123 | py | Python | Hello-World/Python/hello_world_chandr_sekhar_bala.py | RoyalTechie/Hacktoberfest | 5a66f81972a3db15d7c48c8d163d8de4df01d0fe | [
"Apache-2.0"
] | 1 | 2021-10-08T17:18:23.000Z | 2021-10-08T17:18:23.000Z | Hello-World/Python/hello_world_chandr_sekhar_bala.py | RoyalTechie/Hacktoberfest | 5a66f81972a3db15d7c48c8d163d8de4df01d0fe | [
"Apache-2.0"
] | null | null | null | Hello-World/Python/hello_world_chandr_sekhar_bala.py | RoyalTechie/Hacktoberfest | 5a66f81972a3db15d7c48c8d163d8de4df01d0fe | [
"Apache-2.0"
] | null | null | null | '''
LANGUAGE: Python
AUTHOR: Chandra Sekhar Bala
GITHUB: https://github.com/Chandra-Sekhar-Bala
'''
print('Hello, World!')
| 17.571429 | 46 | 0.723577 | 16 | 123 | 5.5625 | 0.75 | 0.292135 | 0.382022 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 123 | 6 | 47 | 20.5 | 0.801802 | 0.739837 | 0 | 0 | 0 | 0 | 0.541667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
4e17afde18d26184f765204137a776a28011d69f | 147 | py | Python | commanderbot/lib/utils/__init__.py | CommanderBot-Dev/commanderbot-ext | c8798b4475b892c234a1e4ffbfb4fed3fb702938 | [
"MIT"
] | 4 | 2020-09-25T19:22:48.000Z | 2021-06-16T18:08:49.000Z | commanderbot/lib/utils/__init__.py | CommanderBot-Dev/commanderbot-py | 835841f733e466c5a0e6724d4020747c55856fe3 | [
"MIT"
] | 23 | 2021-08-30T04:07:29.000Z | 2021-11-08T17:44:41.000Z | commanderbot/lib/utils/__init__.py | CommanderBot-Dev/commanderbot-py | 835841f733e466c5a0e6724d4020747c55856fe3 | [
"MIT"
] | 3 | 2020-09-25T19:23:22.000Z | 2021-03-16T18:19:48.000Z | from .colors import *
from .datetimes import *
from .json_path import *
from .timedeltas import *
from .utils import *
from .yield_fields import *
| 21 | 27 | 0.755102 | 20 | 147 | 5.45 | 0.5 | 0.458716 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163265 | 147 | 6 | 28 | 24.5 | 0.886179 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9d851ba26a7a95a56fc980e0bc23be832cfa24ac | 35,517 | py | Python | main.py | salmanahmedshaikh/GeoFlinkViaRESTAPI | b537a7636e944e4c06d1e875c7eeba9b8f201b26 | [
"MIT"
] | 1 | 2021-11-23T12:16:59.000Z | 2021-11-23T12:16:59.000Z | main.py | salmanahmedshaikh/GeoFlinkViaRESTAPI | b537a7636e944e4c06d1e875c7eeba9b8f201b26 | [
"MIT"
] | null | null | null | main.py | salmanahmedshaikh/GeoFlinkViaRESTAPI | b537a7636e944e4c06d1e875c7eeba9b8f201b26 | [
"MIT"
] | 1 | 2021-10-06T16:11:51.000Z | 2021-10-06T16:11:51.000Z | import requests
import os
import json
import time
def main():
base_url = "http://localhost:29999/"
# x = getAllJars(base_url)
# x = getWebUIConfig(base_url)
# x = uploadJar(base_url, path)
# x = deleteJar(base_url, jar_id)
jar_id = "79da2b57-386f-409d-9964-4e55c0aa221c_GeoFlinkProject20210315.jar"
experimentFrequency = 3
executionTimeSeconds = 120
waitBetweenExecutionsSec = 10
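The commented-out helpers above (getAllJars, uploadJar, deleteJar) and the experiment loops below go through Flink's monitoring REST API; jobs for an uploaded jar are started via its /jars/&lt;jar_id&gt;/run endpoint. A hedged sketch of the URL construction that submission relies on (the jar id below is hypothetical):

```python
def build_run_url(base_url, jar_id):
    # Flink's REST API runs an uploaded jar via POST /jars/<jar_id>/run;
    # query parameters such as programArgs go in the request body.
    return "{}jars/{}/run".format(base_url, jar_id)

print(build_run_url("http://localhost:29999/", "abc_GeoFlink.jar"))
# http://localhost:29999/jars/abc_GeoFlink.jar/run
```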
# Range Query
'''
outputFilePathAndName = "qureyOutput_RangeQuery.csv"
inputTopicNameList = ["TaxiDrive17MillionGeoJSON", "NYCBuildingsPolygons", "NYCBuildingsLineStrings"]
ifApproximateQuery = ["true", "false"]
radiusList = ["0.0005", "0.005", "0.05", "0.5"]
wIntervalList = ["50", "100", "150", "200", "250"]
wStepList = ["25", "50", "75", "100", "125"]
uniformGridSizeList = ["100", "200", "300", "400", "500"]
queryOptionListWindowed = []
queryOptionListRealtime = []
dateFormat = ""
gridMinX = ""
gridMaxX = ""
gridMinY = ""
gridMaxY = ""
trajIDSet = ""
queryPoint = ""
queryPolygon = ""
queryLineString = ""
file = openFile(outputFilePathAndName)
file.write(
"queryOption" + "," + "approximateQuery" + "," + "inputTopicName" + "," + "radius" + "," + "wInterval" + "," + "wStep" + "," + "uniformGridSize" + "," + "executionCost1, executionCost2, executionCost3" + "," + "avg_time_sec" + ", " + "numberRecords1, numberRecords2, numberRecords3, " + "avg_records" + ", " + "throughput" + "\n")
file.flush()
file.close()
for inputTopicName in inputTopicNameList:
if inputTopicName == "TaxiDrive17MillionGeoJSON":
queryOptionListWindowed = ["1", "6", "11"]
queryOptionListRealtime = ["2", "7", "12"]
dateFormat = "yyyy-MM-dd HH:mm:ss"
gridMinX = "115.50000"
gridMaxX = "117.60000"
gridMinY = "39.60000"
gridMaxY = "41.10000"
trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
queryPoint = "[116.14319183444924, 40.07271444145411]"
queryPolygon = "[116.14319183444924, 40.07271444145411], [116.14305232274667, 40.06231150684208], [116.16313670438304, 40.06152322130762], [116.14319183444924, 40.07271444145411]"
queryLineString = "[116.14319183444924, 40.07271444145411], [116.14305232274667, 40.06231150684208], [116.16313670438304, 40.06152322130762]"
elif inputTopicName == "NYCBuildingsPolygons":
queryOptionListWindowed = ["16", "21", "26"]
queryOptionListRealtime = ["17", "22", "27"]
dateFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"
gridMinX = "-74.25540"
gridMaxX = "-73.70007"
gridMinY = "40.49843"
gridMaxY = "40.91506"
trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
queryPoint = "[-74.0000, 40.72714]"
queryPolygon = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744], [-73.98452330316861, 40.67563064195701]"
queryLineString = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744]"
else:
queryOptionListWindowed = ["31", "36", "41"]
queryOptionListRealtime = ["32", "37", "42"]
dateFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"
gridMinX = "-74.25540"
gridMaxX = "-73.70007"
gridMinY = "40.49843"
gridMaxY = "40.91506"
trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
queryPoint = "[-74.0000, 40.72714]"
queryPolygon = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744], [-73.98452330316861, 40.67563064195701]"
queryLineString = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744]"
for queryOption in queryOptionListWindowed:
for approximateQuery in ifApproximateQuery:
radius = "0.005"
wInterval = "100"
wStep = "50"
uniformGridSize = "200"
executeAndSave(queryOption, approximateQuery, inputTopicName, radius, wInterval, wStep,
uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
trajIDSet, queryPoint, queryPolygon, queryLineString,
experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
base_url, jar_id, outputFilePathAndName)
for radius in radiusList:
approximateQuery = "true"
wInterval = "100"
wStep = "50"
uniformGridSize = "200"
executeAndSave(queryOption, approximateQuery, inputTopicName, radius, wInterval, wStep,
uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
trajIDSet, queryPoint, queryPolygon, queryLineString,
experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
base_url, jar_id, outputFilePathAndName)
for wInterval in wIntervalList:
approximateQuery = "true"
radius = "0.005"
wStep = "50"
uniformGridSize = "200"
executeAndSave(queryOption, approximateQuery, inputTopicName, radius, wInterval, wStep,
uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
trajIDSet, queryPoint, queryPolygon, queryLineString,
experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
base_url, jar_id, outputFilePathAndName)
for wStep in wStepList:
approximateQuery = "true"
radius = "0.005"
uniformGridSize = "200"
wInterval = "100"
executeAndSave(queryOption, approximateQuery, inputTopicName, radius, wInterval, wStep,
uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
trajIDSet, queryPoint, queryPolygon, queryLineString,
experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
base_url, jar_id, outputFilePathAndName)
for uniformGridSize in uniformGridSizeList:
approximateQuery = "true"
radius = "0.005"
wInterval = "100"
wStep = "50"
executeAndSave(queryOption, approximateQuery, inputTopicName, radius, wInterval, wStep,
uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
trajIDSet, queryPoint, queryPolygon, queryLineString,
experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
base_url, jar_id, outputFilePathAndName)
for queryOption in queryOptionListRealtime:
for approximateQuery in ifApproximateQuery:
radius = "0.005"
wInterval = "100"
wStep = "50"
uniformGridSize = "200"
executeAndSave(queryOption, approximateQuery, inputTopicName, radius, wInterval, wStep,
uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
trajIDSet, queryPoint, queryPolygon, queryLineString,
experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
base_url, jar_id, outputFilePathAndName)
for radius in radiusList:
approximateQuery = "true"
wInterval = "100"
wStep = "50"
uniformGridSize = "200"
executeAndSave(queryOption, approximateQuery, inputTopicName, radius, wInterval, wStep,
uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
trajIDSet, queryPoint, queryPolygon, queryLineString,
experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
base_url, jar_id, outputFilePathAndName)
for uniformGridSize in uniformGridSizeList:
approximateQuery = "true"
radius = "0.005"
wInterval = "100"
wStep = "50"
executeAndSave(queryOption, approximateQuery, inputTopicName, radius, wInterval, wStep,
uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
trajIDSet, queryPoint, queryPolygon, queryLineString,
experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
base_url, jar_id, outputFilePathAndName)
'''
'''
# kNN Query
outputFilePathAndName = "qureyOutput_kNNQuery.csv"
inputTopicNameList = ["TaxiDrive17MillionGeoJSON", "NYCBuildingsPolygons", "NYCBuildingsLineStrings"]
kList = ["25", "50", "75", "100", "125"]
radiusList = ["0.0005", "0.005", "0.05", "0.5"]
wIntervalList = ["50", "100", "150", "200", "250"]
wStepList = ["25", "50", "75", "100", "125"]
uniformGridSizeList = ["100", "200", "300", "400", "500"]
queryOptionListWindowed = []
queryOptionListRealtime = []
dateFormat = ""
gridMinX = ""
gridMaxX = ""
gridMinY = ""
gridMaxY = ""
trajIDSet = ""
queryPoint = ""
queryPolygon = ""
queryLineString = ""
file = openFile(outputFilePathAndName)
file.write(
"queryOption" + "," + "approximateQuery" + "," + "inputTopicName" + "," + "radius" + "," + "k" + "," + "wInterval" + "," + "wStep" + "," + "uniformGridSize" + "," + "executionCost1, executionCost2, executionCost3" + "," + "avg_time_sec" + ", " + "numberRecords1, numberRecords2, numberRecords3, " + "avg_records" + ", " + "throughput" + "\n")
file.flush()
file.close()
for inputTopicName in inputTopicNameList:
if inputTopicName == "TaxiDrive17MillionGeoJSON":
queryOptionListWindowed = ["51", "56", "61"]
queryOptionListRealtime = ["52", "57", "62"]
dateFormat = "yyyy-MM-dd HH:mm:ss"
gridMinX = "115.50000"
gridMaxX = "117.60000"
gridMinY = "39.60000"
gridMaxY = "41.10000"
trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
queryPoint = "[116.14319183444924, 40.07271444145411]"
queryPolygon = "[116.14319183444924, 40.07271444145411], [116.14305232274667, 40.06231150684208], [116.16313670438304, 40.06152322130762], [116.14319183444924, 40.07271444145411]"
queryLineString = "[116.14319183444924, 40.07271444145411], [116.14305232274667, 40.06231150684208], [116.16313670438304, 40.06152322130762]"
elif inputTopicName == "NYCBuildingsPolygons":
queryOptionListWindowed = ["66", "71", "76"]
queryOptionListRealtime = ["67", "72", "77"]
dateFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"
gridMinX = "-74.25540"
gridMaxX = "-73.70007"
gridMinY = "40.49843"
gridMaxY = "40.91506"
            trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
            queryPoint = "[-74.0000, 40.72714]"
            queryPolygon = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744], [-73.98452330316861, 40.67563064195701]"
            queryLineString = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744]"
        else:
            queryOptionListWindowed = ["81", "86", "91"]
            queryOptionListRealtime = ["82", "87", "92"]
            dateFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"
            gridMinX = "-74.25540"
            gridMaxX = "-73.70007"
            gridMinY = "40.49843"
            gridMaxY = "40.91506"
            trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
            queryPoint = "[-74.0000, 40.72714]"
            queryPolygon = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744], [-73.98452330316861, 40.67563064195701]"
            queryLineString = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744]"
        for queryOption in queryOptionListWindowed:
            for k in kList:
                approximateQuery = "true"
                radius = "0.005"
                wInterval = "100"
                wStep = "50"
                uniformGridSize = "200"
                executeAndSave(queryOption, approximateQuery, inputTopicName, radius, k, wInterval, wStep,
                               uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
                               trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for radius in radiusList:
                approximateQuery = "true"
                wInterval = "100"
                wStep = "50"
                uniformGridSize = "200"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, radius, k, wInterval, wStep,
                               uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
                               trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for wInterval in wIntervalList:
                approximateQuery = "true"
                radius = "0.005"
                wStep = "50"
                uniformGridSize = "200"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, radius, k, wInterval, wStep,
                               uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
                               trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for wStep in wStepList:
                approximateQuery = "true"
                radius = "0.005"
                wInterval = "100"
                uniformGridSize = "200"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, radius, k, wInterval, wStep,
                               uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
                               trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for uniformGridSize in uniformGridSizeList:
                approximateQuery = "true"
                radius = "0.005"
                wInterval = "100"
                wStep = "50"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, radius, k, wInterval, wStep,
                               uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
                               trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
        for queryOption in queryOptionListRealtime:
            for k in kList:
                approximateQuery = "true"
                radius = "0.005"
                wInterval = "100"
                wStep = "50"
                uniformGridSize = "200"
                executeAndSave(queryOption, approximateQuery, inputTopicName, radius, k, wInterval, wStep,
                               uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
                               trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for radius in radiusList:
                approximateQuery = "true"
                wInterval = "100"
                wStep = "50"
                uniformGridSize = "200"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, radius, k, wInterval, wStep,
                               uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
                               trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for uniformGridSize in uniformGridSizeList:
                approximateQuery = "true"
                radius = "0.005"
                wInterval = "100"
                wStep = "50"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, radius, k, wInterval, wStep,
                               uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY,
                               trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
    '''
    # Join Query
    outputFilePathAndName = "qureyOutput_JoinQuery.csv"
    inputTopicNameList = ["TaxiDrive17MillionGeoJSON", "NYCBuildingsPolygons", "NYCBuildingsLineStrings"]
    radiusList = ["0.0005", "0.005", "0.05", "0.5"]
    wIntervalList = ["50", "100", "150", "200", "250"]
    wStepList = ["25", "50", "75", "100", "125"]
    uniformGridSizeList = ["100", "200", "300", "400", "500"]
    queryOptionListWindowed = []
    queryOptionListRealtime = []
    dateFormat = ""
    gridMinX = ""
    gridMaxX = ""
    gridMinY = ""
    gridMaxY = ""
    trajIDSet = ""
    queryPoint = ""
    queryPolygon = ""
    queryLineString = ""
    file = openFile(outputFilePathAndName)
    file.write("queryOption,approximateQuery,inputTopicName,radius,k,wInterval,wStep,uniformGridSize,"
               "executionCost1, executionCost2, executionCost3,avg_time_sec, "
               "numberRecords1, numberRecords2, numberRecords3, avg_records, throughput\n")
    file.flush()
    file.close()
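As an aside, the CSV header above is assembled by hand with string concatenation. The standard-library `csv` module produces the same row while handling separators and quoting automatically; a minimal, self-contained sketch (the column names are taken from the header above, written to an in-memory buffer so there are no file side effects):

```python
import csv
import io

# Build one CSV row with csv.writer instead of chained "+" concatenation.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["queryOption", "approximateQuery", "inputTopicName", "radius", "k"])
print(buf.getvalue().strip())  # queryOption,approximateQuery,inputTopicName,radius,k
```

The same `csv.writer` could be pointed at the result file handle directly if the row-building in this script were ever reworked.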
    for inputTopicName in inputTopicNameList:
        if inputTopicName == "TaxiDrive17MillionGeoJSON":
            queryOptionListWindowed = ["101", "106", "111"]
            queryOptionListRealtime = ["102", "107", "112"]
            queryTopicName = "TaxiDriveQuery"
            dateFormat = "yyyy-MM-dd HH:mm:ss"
            gridMinX = "115.50000"
            gridMaxX = "117.60000"
            gridMinY = "39.60000"
            gridMaxY = "41.10000"
            trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
            queryPoint = "[116.14319183444924, 40.07271444145411]"
            queryPolygon = "[116.14319183444924, 40.07271444145411], [116.14305232274667, 40.06231150684208], [116.16313670438304, 40.06152322130762], [116.14319183444924, 40.07271444145411]"
            queryLineString = "[116.14319183444924, 40.07271444145411], [116.14305232274667, 40.06231150684208], [116.16313670438304, 40.06152322130762]"
        elif inputTopicName == "NYCBuildingsPolygons":
            queryOptionListWindowed = ["116", "121", "126"]
            queryOptionListRealtime = ["117", "122", "127"]
            queryTopicName = "NYCBuildingsPolygonsQuery"
            dateFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"
            gridMinX = "-74.25540"
            gridMaxX = "-73.70007"
            gridMinY = "40.49843"
            gridMaxY = "40.91506"
            trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
            queryPoint = "[-74.0000, 40.72714]"
            queryPolygon = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744], [-73.98452330316861, 40.67563064195701]"
            queryLineString = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744]"
        else:
            queryOptionListWindowed = ["131", "136", "141"]
            queryOptionListRealtime = ["132", "137", "142"]
            queryTopicName = "NYCBuildingsLineStringsQuery"
            dateFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"
            gridMinX = "-74.25540"
            gridMaxX = "-73.70007"
            gridMinY = "40.49843"
            gridMaxY = "40.91506"
            trajIDSet = "9211800, 9320801, 9090500, 7282400, 10390100"
            queryPoint = "[-74.0000, 40.72714]"
            queryPolygon = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744], [-73.98452330316861, 40.67563064195701]"
            queryLineString = "[-73.98452330316861, 40.67563064195701], [-73.98776303794413, 40.671603874732455], [-73.97826680869485, 40.666980275860936], [-73.97297380718484, 40.67347172572744]"
        for queryOption in queryOptionListWindowed:
            for radius in radiusList:
                approximateQuery = "true"
                wInterval = "100"
                wStep = "50"
                uniformGridSize = "200"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, queryTopicName, radius, k,
                               wInterval, wStep, uniformGridSize, dateFormat, gridMinX, gridMaxX,
                               gridMinY, gridMaxY, trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for wInterval in wIntervalList:
                approximateQuery = "true"
                radius = "0.005"
                wStep = "50"
                uniformGridSize = "200"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, queryTopicName, radius, k,
                               wInterval, wStep, uniformGridSize, dateFormat, gridMinX, gridMaxX,
                               gridMinY, gridMaxY, trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for wStep in wStepList:
                approximateQuery = "true"
                radius = "0.005"
                wInterval = "100"
                uniformGridSize = "200"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, queryTopicName, radius, k,
                               wInterval, wStep, uniformGridSize, dateFormat, gridMinX, gridMaxX,
                               gridMinY, gridMaxY, trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for uniformGridSize in uniformGridSizeList:
                approximateQuery = "true"
                radius = "0.005"
                wInterval = "100"
                wStep = "50"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, queryTopicName, radius, k,
                               wInterval, wStep, uniformGridSize, dateFormat, gridMinX, gridMaxX,
                               gridMinY, gridMaxY, trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
        for queryOption in queryOptionListRealtime:
            for radius in radiusList:
                approximateQuery = "true"
                wInterval = "100"
                wStep = "50"
                uniformGridSize = "200"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, queryTopicName, radius, k,
                               wInterval, wStep, uniformGridSize, dateFormat, gridMinX, gridMaxX,
                               gridMinY, gridMaxY, trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
            for uniformGridSize in uniformGridSizeList:
                approximateQuery = "true"
                radius = "0.005"
                wInterval = "100"
                wStep = "50"
                k = "50"
                executeAndSave(queryOption, approximateQuery, inputTopicName, queryTopicName, radius, k,
                               wInterval, wStep, uniformGridSize, dateFormat, gridMinX, gridMaxX,
                               gridMinY, gridMaxY, trajIDSet, queryPoint, queryPolygon, queryLineString,
                               experimentFrequency, executionTimeSeconds, waitBetweenExecutionsSec,
                               base_url, jar_id, outputFilePathAndName)
    # x = getAllJobsOverview(base_url)
    # x = getAllJars(base_url)
    # print(x.status_code)
    # print(x.text)
    # y = json.loads(x.text)
    # for rows in y["jobs"]:
    #     if rows["state"] == "RUNNING":
    #         terminateJob(base_url, rows["jid"])


def executeAndSave(queryOption, approximateQuery, inputTopicName, queryTopicName, radius, k, wInterval, wStep,
                   uniformGridSize, dateFormat, gridMinX, gridMaxX, gridMinY, gridMaxY, trajIDSet, queryPoint,
                   queryPolygon, queryLineString, experimentFrequency, executionTimeSeconds,
                   waitBetweenExecutionsSec, base_url, jar_id, outputFilePathAndName):
    executionCostList = []
    numberRecordList = []
    for i in range(experimentFrequency):
        parameters = {
            "programArgsList": ["--onCluster", "true",
                                "--approximateQuery", approximateQuery,
                                "--queryOption", queryOption,
                                "--inputTopicName", inputTopicName,
                                "--queryTopicName", queryTopicName,
                                "--outputTopicName", "QueryLatency",
                                "--inputFormat", "GeoJSON",
                                "--dateFormat", dateFormat,
                                "--radius", radius,
                                "--aggregate", "SUM",
                                "--wType", "TIME",
                                "--wInterval", wInterval,
                                "--wStep", wStep,
                                "--uniformGridSize", uniformGridSize,
                                "--k", k,
                                "--trajDeletionThreshold", "1000",  # passed as a string, like every other program argument
                                "--outOfOrderAllowedLateness", "1",
                                "--omegaJoinDuration", "1",
                                "--gridMinX", gridMinX,
                                "--gridMaxX", gridMaxX,
                                "--gridMinY", gridMinY,
                                "--gridMaxY", gridMaxY,
                                "--trajIDSet", trajIDSet,
                                "--queryPoint", queryPoint,
                                "--queryPolygon", queryPolygon,
                                "--queryLineString", queryLineString],
            "parallelism": 30}
        x = submitJob(base_url, jar_id, parameters)
        if x.status_code == 200:
            print("Job submitted: " + queryOption + "," + approximateQuery + "," + inputTopicName + ","
                  + radius + "," + k + "," + wInterval + "," + wStep + "," + uniformGridSize)
            # Let the job run for executionTimeSeconds
            time.sleep(executionTimeSeconds)
            job_id = x.json()['jobid']
            y = getJobOverview(base_url, job_id)
            print(str(y.status_code) + ", " + y.text)
            # Poll until the first vertex reports that all records have been written
            while json.dumps(y.json()['vertices'][0]['metrics']['write-records-complete']) != "true":
                time.sleep(1)
                y = getJobOverview(base_url, job_id)
                print(str(y.status_code) + ", " + y.text)
            duration = str(y.json()['vertices'][0]['duration'])
            print('duration : ' + duration)
            records = str(y.json()['vertices'][0]['metrics']['write-records'])
            print('records : ' + records)
            executionCostList.append(duration)
            numberRecordList.append(records)
            z = terminateJob(base_url, job_id)
            print(str(z.status_code) + ", " + z.text)
        # wait at least waitBetweenExecutionsSec seconds before starting the next job
        time.sleep(waitBetweenExecutionsSec)
    file = openFile(outputFilePathAndName)
    avg_time_ms = average(executionCostList)
    avg_time_sec = avg_time_ms / 1000
    avg_records = average(numberRecordList)
    throughput = avg_records / avg_time_sec
    row = (queryOption + "," + approximateQuery + "," + inputTopicName + "," + radius + "," + k + ","
           + wInterval + "," + wStep + "," + uniformGridSize + ","
           + ", ".join(executionCostList) + "," + str(avg_time_sec) + ", "
           + ", ".join(numberRecordList) + "," + str(avg_records) + ", " + str(throughput))
    file.write(row + "\n")
    print(row)
    file.flush()
    file.close()
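# Note: executeAndSave polls the job overview in an unbounded while loop, so a job that never
# reports write-records-complete would hang the whole experiment run. A bounded-polling helper
# is one common safeguard; the sketch below is illustrative only (wait_until is a hypothetical
# name, not part of this script):

```python
import time

def wait_until(predicate, timeout_sec=300, poll_sec=1):
    """Poll predicate() until it returns truthy or timeout_sec elapses."""
    deadline = time.monotonic() + timeout_sec
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(poll_sec)
    return False

# Example: a predicate that becomes true on the third poll attempt.
attempts = {"n": 0}
def ready():
    attempts["n"] += 1
    return attempts["n"] >= 3

print(wait_until(ready, timeout_sec=10, poll_sec=0.01))  # True
```

A caller would then branch on the return value instead of looping forever, e.g. skip the result row and terminate the job when the wait times out.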


def submitJob(base_url, jar_id, parameters):
    url = base_url + "jars/" + jar_id + "/run"
    myheader = {'content-type': 'application/json'}
    return requests.post(url, data=json.dumps(parameters), headers=myheader)


def terminateJob(base_url, job_id):
    # PATCH on /jobs/:jobid requests cancellation of the job
    url = base_url + "jobs/" + job_id
    return requests.patch(url)


def uploadJar(base_url, path):
    url = base_url + "jars/upload"
    with open(path, "rb") as jar_file:
        myfile = {"jarfile": (os.path.basename(path), jar_file, "application/x-java-archive")}
        return requests.post(url, files=myfile)


def deleteJar(base_url, jar_id):
    return requests.delete(base_url + "jars/" + jar_id)


def getAllJobsOverview(base_url):
    return requests.get(base_url + "jobs/overview")


def getJobOverview(base_url, job_id):
    return requests.get(base_url + "jobs/" + job_id)


def getAllJars(base_url):
    return requests.get(base_url + "jars")


def getWebUIConfig(base_url):
    return requests.get(base_url + "config")


def getFlinkClusterOverview(base_url):
    return requests.get(base_url + "overview")


def openFile(filePathAndName):
    # Append mode, so rows from successive runs accumulate in the same CSV
    return open(filePathAndName, 'a')


def listSum(data):
    # Named listSum to avoid shadowing the built-in sum()
    total = 0
    for x in data:
        total += float(x)
    return total


def average(data):
    return listSum(data) / float(len(data))


def variance(data):
    avg = average(data)
    total = 0
    for x in data:
        total += (avg - float(x)) ** 2
    return total / len(data)


def std_deviation(data):
    return variance(data) ** 0.5


def calculate(data):
    return (average(data), std_deviation(data))
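# The statistics helpers above consume lists of numeric strings (Flink reports vertex
# durations in milliseconds). A quick self-contained cross-check of the same mean and
# population-variance formulas using the standard library; the sample values below are
# made up for illustration:

```python
import statistics

durations_ms = ["1200", "1350", "1100"]  # hypothetical vertex durations, as strings
values = [float(x) for x in durations_ms]
avg = statistics.mean(values)
std = statistics.pstdev(values)  # population form, matching variance() / len(data) above
print(round(avg, 2), round(std, 2))
```

For larger experiment batches, `statistics.pstdev` and `statistics.mean` could replace the hand-rolled helpers entirely; the string-to-float conversion is the only extra step.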


if __name__ == "__main__":
    main()


# utils/__init__.py (GCL-staging/GMI)
from models.logreg import LogReg


# grAdapt/utils/__init__.py (mkduong-ai/grAdapt)
from . import math


# src/qilib/__init__.py (QuTech-Delft/qilib)
from qilib.version import __version__


# app/views.py (dice-project/DICE-Monitoring-WUI)
from flask import render_template
from app import app


@app.route('/login.html')
def login():
    return render_template('pages/login.html', title="Login")


@app.route('/')
@app.route('/index.html')
def index():
    return render_template('pages/index.html', title="Home", header="Home")


@app.route('/blank.html')
def blank():
    return render_template('pages/blank.html', title="Blank", header="Blank", nav="Blank Page")


@app.route('/flot.html')
def flot():
    return render_template('pages/flot.html', title="Flot", header="Flot Charts", nav="Flot Page")


@app.route('/morris.html')
def morris():
    return render_template('pages/morris.html', title="Morris", header="Morris.js Charts", nav="Morris Page")


@app.route('/tables.html')
def tables():
    return render_template('pages/tables.html', title="Tables", header="Tables", nav="Tables Page")


@app.route('/forms.html')
def forms():
    return render_template('pages/forms.html', title="Forms", header="Forms", nav="Forms Page")


@app.route('/panels-wells.html')
def panels_wells():
    return render_template('pages/panels-wells.html', title="Panels and Wells", header="Panels and Wells",
                           nav="Panels and Wells Page")


@app.route('/buttons.html')
def buttons():
    return render_template('pages/buttons.html', title="Buttons", header="Buttons", nav="Buttons Page")


@app.route('/notifications.html')
def notifications():
    return render_template('pages/notifications.html', title="Notifications", header="Notifications",
                           nav="Notifications Page")


@app.route('/typography.html')
def typography():
    return render_template('pages/typography.html', title="Typography", header="Typography", nav="Typography Page")


@app.route('/icons.html')
def icons():
    return render_template('pages/icons.html', title="Icons", header="Icons", nav="Icons Page")


@app.route('/grid.html')
def grid():
    return render_template('pages/grid.html', title="Grid", header="Grid", nav="Grid Page")


# DICE Resource Routes
@app.route('/nodes.html')
def nodes():
    return render_template('pages/nodes.html', title="Monitored Nodes", header='Nodes', nav="Grid page")


@app.route('/nodes-map.html')
def nodeMap():
    return render_template('/pages/nodes-map.html', title="Node Map", header='Map', nav="Grid page")


@app.route('/hdfs.html')
def hdfs():
    return render_template('/pages/hdfs.html', title="HDFS Metrics", header='HDFS', nav="Grid page")


@app.route('/yarn.html')
def yarn():
    return render_template('/pages/yarn.html', title="YARN Metrics", header='YARN', nav="Grid page")


@app.route('/spark.html')
def spark():
    return render_template('pages/spark.html', title="Spark Metrics", header='Spark', nav="Grid page")


@app.route('/storm.html')
def storm():
    return render_template('pages/storm.html', title="Storm Metrics", header='Storm', nav="Grid page")


@app.route('/kafka.html')
def kafka():
    return render_template('pages/kafka.html', title="Kafka Metrics", header='Kafka', nav="Grid page")


@app.route('/system.html')
def systemMetrics():
    return render_template('pages/system.html', title="System Metrics", header='System', nav="Grid page")


@app.route('/dmon-ctrl.html')
def dmonController():
    return render_template('pages/dmon-ctrl.html', title="DMON Controller", header='Controller', nav="Grid page")


@app.route('/dmon-plat.html')
def dmonPlat():
    # Function name matches the route and template (was swapped with dmonMap)
    return render_template('pages/dmon-plat.html', title="DMON Platform", header='Platform', nav="Grid page")


@app.route('/dmon-map.html')
def dmonMap():
    return render_template('pages/dmon-map.html', title="DMON Map", header='Map', nav="Grid page")


@app.route('/dmon-mas.html')
def dmonMAS():
    return render_template('pages/dmon-mas.html', title="DMON MAS", header='MAS', nav="Grid page")


# models/__init__.py (sr-vazkez/ecommerce-fastapi)
from models.complain import *
from models.user import *


# aioopm/__init__.py (jawilson/aioopm)
from .operating_status import current_operating_status


# tasks/__init__.py (wmww/instigate)
from .subprocess import execute_run, scout_run
from .git_config import git_config


# scraper.py (laboratoryyingong/web-scraper-pet)
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.common.by import By
import time
import re


# achilles/terminal/train/__init__.py (np-core/druid)
from .commands import train


# workin/forms/__init__.py (waderly/workin)
#!/usr/bin/env python
# encoding: utf-8
from wtforms import *
from wtforms.validators import *
from workin.forms.base import BaseForm


# lookup/nas201bench/__init__.py (hyunghunny/bananas)
from .api import NAS201Bench


# counter_attack/tests/__init__.py (samuelemarro/anti-attacks)
from counter_attack.tests.accuracy import accuracy_test
from counter_attack.tests.adversarial_perturbation import adversarial_perturbation_test
from counter_attack.tests.attack import attack_test
from counter_attack.tests.boundary_distance import boundary_distance_test
from counter_attack.tests.parallelization import parallelization_test
from counter_attack.tests.radius import radius_test
from counter_attack.tests.roc_curve import roc_curve_test
from counter_attack.tests.shallow_defense import shallow_defense_test
from counter_attack.tests.shallow_rejector import shallow_rejector_test


# autosklearn/classification.py (tuggeluk/auto-sklearn)
from autosklearn.estimators import AutoSklearnClassifier


# File: nameyourapp/models.py (repo: wannadrunk/python-flask-boilerplate, license: BSD-3-Clause)
from .extensions import db

class MyModel(db.Model):
pass


# File: tworaven_apps/ta2_interfaces/views_non_streaming_requests.py
# (repo: TwoRavens/TwoRavens, license: Apache-2.0)
"""
Functions for when the UI sends JSON requests to route to TA2s as gRPC calls
- Right now this code is quite redundant. Wait for integration to factor it out,
e.g. lots may change--including the "req_" files being part of a separate service
"""
import json
from collections import OrderedDict
from django.shortcuts import render
from django.http import JsonResponse #, HttpResponse, Http404
from django.views.decorators.csrf import csrf_exempt
from tworaven_apps.raven_auth.models import User
from tworaven_apps.ta2_interfaces.models import StoredRequest, StoredResponse
from tworaven_apps.ta2_interfaces.req_hello import ta2_hello
from tworaven_apps.ta2_interfaces.req_search_solutions import \
(search_solutions, end_search_solutions,
stop_search_solutions, describe_solution,
score_solution, fit_solution,
produce_solution,
solution_export, solution_export_with_saved_response,
solution_export3,
list_primitives)
from tworaven_apps.utils.json_helper import json_loads
from tworaven_apps.utils.view_helper import \
(get_request_body,
get_request_body_as_json,
get_authenticated_user,
get_json_error,
get_json_success)
from tworaven_apps.call_captures.models import ServiceCallEntry
from tworaven_apps.utils.view_helper import SESSION_KEY, get_session_key
from tworaven_apps.ta2_interfaces.ta2_search_solutions_helper import \
SearchSolutionsHelper
from tworaven_apps.user_workspaces.utils import get_latest_user_workspace
from tworaven_apps.user_workspaces.reset_util import ResetUtil
from tworaven_apps.ta2_interfaces import static_vals as ta2_static
from tworaven_apps.behavioral_logs.log_entry_maker import LogEntryMaker
from tworaven_apps.behavioral_logs import static_vals as bl_static
@csrf_exempt
def view_hello_heartbeat(request):
"""Hello to TA2 with no logging. Used for testing"""
# Let's call the TA2!
#
resp_info = ta2_hello()
if not resp_info.success:
return JsonResponse(get_json_error(resp_info.err_msg))
json_str = resp_info.result_obj
# Convert JSON str to python dict - err catch here
# - let it blow up for now--should always return JSON
json_format_info = json_loads(json_str)
if not json_format_info.success:
return JsonResponse(get_json_error(json_format_info.err_msg))
json_info = get_json_success('success!',
data=json_format_info.result_obj)
return JsonResponse(json_info)
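Every view in this module follows the same guard pattern around a result object exposing `success`, `err_msg`, and `result_obj`. A minimal self-contained sketch of that pattern, using a hypothetical `Result` namedtuple in place of the real helper classes from `tworaven_apps.utils`:

```python
from collections import namedtuple

# Hypothetical stand-in for the result objects returned by ta2_hello(),
# json_loads(), etc.; the real classes live in tworaven_apps.utils.
Result = namedtuple('Result', ['success', 'err_msg', 'result_obj'])

def ok(obj):
    return Result(True, None, obj)

def err(msg):
    return Result(False, msg, None)

def handle(resp_info):
    """Mirror the guard used by every view: bail out on failure,
    wrap the payload on success."""
    if not resp_info.success:
        return {'success': False, 'message': resp_info.err_msg}
    return {'success': True, 'data': resp_info.result_obj}
```

With this shape, each view reduces to a chain of `if not x.success: return error` guards, which is exactly the control flow repeated throughout the module.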
@csrf_exempt
def view_hello(request):
"""gRPC: Call from UI as a hearbeat"""
user_info = get_authenticated_user(request)
if not user_info.success:
return JsonResponse(get_json_error(user_info.err_msg))
# --------------------------------
# Behavioral logging
# --------------------------------
log_data = dict(session_key=get_session_key(request),
feature_id=ta2_static.HELLO,
activity_l1=bl_static.L1_SYSTEM_ACTIVITY,
activity_l2=bl_static.L2_APP_LAUNCH)
LogEntryMaker.create_ta2ta3_entry(user_info.result_obj, log_data)
# note: this is just a heartbeat, so no params are sent
#
# Begin to log D3M call
#
call_entry = None
if ServiceCallEntry.record_d3m_call():
call_entry = ServiceCallEntry.get_dm3_entry(\
request_obj=request,
call_type='Hello',
request_msg=('no params for this call'))
# Let's call the TA2!
#
resp_info = ta2_hello()
if not resp_info.success:
return JsonResponse(get_json_error(resp_info.err_msg))
json_str = resp_info.result_obj
# Convert JSON str to python dict - err catch here
# - let it blow up for now--should always return JSON
json_format_info = json_loads(json_str)
if not json_format_info.success:
return JsonResponse(get_json_error(json_format_info.err_msg))
# Save D3M log
#
if call_entry:
call_entry.save_d3m_response(json_format_info.result_obj)
json_info = get_json_success('success!',
data=json_format_info.result_obj)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_search_describe_fit_score_solutions(request):
"""gRPC: Call from UI with params to
Search, Describe, Fit, and Score solutions"""
# ------------------------------------
# Retrieve the User
# ------------------------------------
user_info = get_authenticated_user(request)
if not user_info.success:
return JsonResponse(get_json_error(user_info.err_msg))
user_obj = user_info.result_obj
websocket_id = user_obj.username # websocket pushes currently based on username
user_id = user_obj.id
# ------------------------------------
# Parse the JSON request
# ------------------------------------
req_json_info = get_request_body_as_json(request)
if not req_json_info.success:
return JsonResponse(get_json_error(req_json_info.err_msg))
extra_params = {SESSION_KEY: get_session_key(request)}
search_info = SearchSolutionsHelper.make_search_solutions_call(\
req_json_info.result_obj,
websocket_id,
user_id,
**extra_params)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
json_info = get_json_success('success!', data=search_info.result_obj)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_search_solutions(request):
"""gRPC: Call from UI with a SearchSolutionsRequest"""
req_body_info = get_request_body(request)
if not req_body_info.success:
return JsonResponse(get_json_error(req_body_info.err_msg))
# Begin to log D3M call
#
call_entry = None
if ServiceCallEntry.record_d3m_call():
call_entry = ServiceCallEntry.get_dm3_entry(\
request_obj=request,
call_type=ta2_static.SEARCH_SOLUTIONS,
request_msg=req_body_info.result_obj)
# Let's call the TA2!
#
search_info = search_solutions(req_body_info.result_obj)
#print('search_info', search_info)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# Convert JSON str to python dict - err catch here
# - let it blow up for now--should always return JSON
#json_dict = json.loads(search_info.result_obj, object_pairs_hook=OrderedDict)
json_format_info = json_loads(search_info.result_obj)
if not json_format_info.success:
return JsonResponse(get_json_error(json_format_info.err_msg))
# Save D3M log
#
if call_entry:
call_entry.save_d3m_response(json_format_info.result_obj)
json_info = get_json_success('success!', data=json_format_info.result_obj)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_end_search_solutions(request):
"""gRPC: Call from UI with a EndSearchSolutionsRequest"""
print('view_end_search_solutions 1')
user_info = get_authenticated_user(request)
if not user_info.success:
return JsonResponse(get_json_error(user_info.err_msg))
user = user_info.result_obj
print('view_end_search_solutions 2')
req_body_info = get_request_body(request)
if not req_body_info.success:
return JsonResponse(get_json_error(req_body_info.err_msg))
print('view_end_search_solutions 3')
# --------------------------------
# Behavioral logging
# --------------------------------
log_data = dict(session_key=get_session_key(request),
feature_id=ta2_static.END_SEARCH_SOLUTIONS,
activity_l1=bl_static.L1_SYSTEM_ACTIVITY,
activity_l2=bl_static.L2_ACTIVITY_BLANK)
LogEntryMaker.create_ta2ta3_entry(user, log_data)
print('view_end_search_solutions 4')
# Let's call the TA2 and end the session!
#
params = dict(user=user)
search_info = end_search_solutions(req_body_info.result_obj,
**params)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# The session is over, write the log entries files
#
#LogEntryMaker.write_user_log_from_request(request)
# User is done at this point!
# Write out the log and delete it....
user_workspace = None
ws_info = get_latest_user_workspace(request)
if ws_info.success:
user_workspace = ws_info.result_obj
ResetUtil.write_and_clear_behavioral_logs(user, user_workspace)
json_info = get_json_success('success!', data=search_info.result_obj)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_stop_search_solutions(request):
"""gRPC: Call from UI with a StopSearchSolutions"""
user_info = get_authenticated_user(request)
if not user_info.success:
return JsonResponse(get_json_error(user_info.err_msg))
req_body_info = get_request_body(request)
if not req_body_info.success:
return JsonResponse(get_json_error(req_body_info.err_msg))
# Begin to log D3M call
#
call_entry = None
if ServiceCallEntry.record_d3m_call():
call_entry = ServiceCallEntry.get_dm3_entry(\
request_obj=request,
call_type=ta2_static.STOP_SEARCH_SOLUTIONS,
request_msg=req_body_info.result_obj)
# --------------------------------
# Behavioral logging
# --------------------------------
log_data = dict(session_key=get_session_key(request),
feature_id=ta2_static.STOP_SEARCH_SOLUTIONS,
activity_l1=bl_static.L1_SYSTEM_ACTIVITY,
activity_l2=bl_static.L2_ACTIVITY_BLANK)
LogEntryMaker.create_ta2ta3_entry(user_info.result_obj, log_data)
# Let's call the TA2!
#
search_info = stop_search_solutions(req_body_info.result_obj)
#print('search_info', search_info)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# Convert JSON str to python dict - err catch here
# - let it blow up for now--should always return JSON
json_dict = json.loads(search_info.result_obj, object_pairs_hook=OrderedDict)
# Save D3M log
#
if call_entry:
call_entry.save_d3m_response(json_dict)
json_info = get_json_success('success!', data=json_dict)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_describe_solution(request):
"""gRPC: Call from UI with a DescribeSolutionRequest"""
req_body_info = get_request_body(request)
if not req_body_info.success:
return JsonResponse(get_json_error(req_body_info.err_msg))
# Begin to log D3M call
#
call_entry = None
if ServiceCallEntry.record_d3m_call():
call_entry = ServiceCallEntry.get_dm3_entry(\
request_obj=request,
call_type='DescribeSolution',
request_msg=req_body_info.result_obj)
# Let's call the TA2!
#
search_info = describe_solution(req_body_info.result_obj)
#print('search_info', search_info)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# Convert JSON str to python dict - err catch here
# - let it blow up for now--should always return JSON
json_dict = json.loads(search_info.result_obj, object_pairs_hook=OrderedDict)
# Save D3M log
#
if call_entry:
call_entry.save_d3m_response(json_dict)
json_info = get_json_success('success!', data=json_dict)
return JsonResponse(json_info, safe=False)
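The `object_pairs_hook=OrderedDict` argument used in the views above makes `json.loads` return a mapping that preserves the key order of the JSON text rather than a plain dict. A quick stdlib illustration:

```python
import json
from collections import OrderedDict

payload = '{"b": 1, "a": 2}'
# keys come back in document order, not sorted
parsed = json.loads(payload, object_pairs_hook=OrderedDict)
```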
@csrf_exempt
def view_score_solution(request):
"""gRPC: Call from UI with a ScoreSolutionRequest"""
req_body_info = get_request_body(request)
if not req_body_info.success:
return JsonResponse(get_json_error(req_body_info.err_msg))
# Begin to log D3M call
#
call_entry = None
if ServiceCallEntry.record_d3m_call():
call_entry = ServiceCallEntry.get_dm3_entry(\
request_obj=request,
call_type='ScoreSolution',
request_msg=req_body_info.result_obj)
# Let's call the TA2!
#
search_info = score_solution(req_body_info.result_obj)
#print('search_info', search_info)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# Convert JSON str to python dict - err catch here
#
json_format_info = json_loads(search_info.result_obj)
if not json_format_info.success:
return JsonResponse(get_json_error(json_format_info.err_msg))
# Save D3M log
#
if call_entry:
call_entry.save_d3m_response(json_format_info.result_obj)
json_info = get_json_success('success!',
data=json_format_info.result_obj)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_fit_solution(request):
"""gRPC: Call from UI with a FitSolutionRequest"""
req_body_info = get_request_body(request)
if not req_body_info.success:
return JsonResponse(get_json_error(req_body_info.err_msg))
# Begin to log D3M call
#
call_entry = None
if ServiceCallEntry.record_d3m_call():
call_entry = ServiceCallEntry.get_dm3_entry(\
request_obj=request,
call_type='FitSolution',
request_msg=req_body_info.result_obj)
# Let's call the TA2!
#
search_info = fit_solution(req_body_info.result_obj)
#print('search_info', search_info)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# Convert JSON str to python dict - err catch here
#
json_format_info = json_loads(search_info.result_obj)
if not json_format_info.success:
return JsonResponse(get_json_error(json_format_info.err_msg))
# Save D3M log
#
if call_entry:
call_entry.save_d3m_response(json_format_info.result_obj)
json_info = get_json_success('success!', data=json_format_info.result_obj)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_produce_solution(request):
"""gRPC: Call from UI with a ProduceSolutionRequest"""
req_body_info = get_request_body(request)
if not req_body_info.success:
return JsonResponse(get_json_error(req_body_info.err_msg))
# Begin to log D3M call
#
call_entry = None
if ServiceCallEntry.record_d3m_call():
call_entry = ServiceCallEntry.get_dm3_entry(\
request_obj=request,
call_type='ProduceSolution',
request_msg=req_body_info.result_obj)
# Let's call the TA2!
#
search_info = produce_solution(req_body_info.result_obj)
#print('search_info', search_info)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# Convert JSON str to python dict - err catch here
#
json_format_info = json_loads(search_info.result_obj)
if not json_format_info.success:
return JsonResponse(get_json_error(json_format_info.err_msg))
# Save D3M log
#
if call_entry:
call_entry.save_d3m_response(json_format_info.result_obj)
json_info = get_json_success('success!', data=json_format_info.result_obj)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_solution_export3(request):
"""gRPC: Call from UI with a SolutionExportRequest"""
# Retrieve the User
#
user_info = get_authenticated_user(request)
if not user_info.success:
return JsonResponse(get_json_error(user_info.err_msg))
user = user_info.result_obj
req_body_info = get_request_body_as_json(request)
if not req_body_info.success:
return JsonResponse(get_json_error(req_body_info.err_msg))
session_key = get_session_key(request)
# Let's call the TA2!
#
search_info = solution_export3(user,
req_body_info.result_obj,
session_key=session_key)
# print('search_info', search_info)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# Convert JSON str to python dict - err catch here
#
json_format_info = json_loads(search_info.result_obj)
if not json_format_info.success:
return JsonResponse(get_json_error(json_format_info.err_msg))
json_info = get_json_success('success!', data=json_format_info.result_obj)
return JsonResponse(json_info, safe=False)
@csrf_exempt
def view_list_primitives(request):
"""gRPC: Call from UI with a ListPrimitivesRequest"""
user_info = get_authenticated_user(request)
if not user_info.success:
return JsonResponse(get_json_error(user_info.err_msg))
# --------------------------------
# (2) Begin to log D3M call
# --------------------------------
call_entry = None
if ServiceCallEntry.record_d3m_call():
call_entry = ServiceCallEntry.get_dm3_entry(\
request_obj=request,
call_type='ListPrimitives',
request_msg='no params for this call')
# --------------------------------
# (2a) Behavioral logging
# --------------------------------
log_data = dict(session_key=get_session_key(request),
feature_id=ta2_static.LIST_PRIMITIVES,
activity_l1=bl_static.L1_SYSTEM_ACTIVITY,
activity_l2=bl_static.L2_ACTIVITY_BLANK)
LogEntryMaker.create_ta2ta3_entry(user_info.result_obj, log_data)
# Let's call the TA2!
#
search_info = list_primitives()
#print('search_info', search_info)
if not search_info.success:
return JsonResponse(get_json_error(search_info.err_msg))
# Convert JSON str to python dict - err catch here
#
json_format_info = json_loads(search_info.result_obj)
if not json_format_info.success:
return JsonResponse(get_json_error(json_format_info.err_msg))
# Save D3M log
#
if call_entry:
call_entry.save_d3m_response(json_format_info.result_obj)
json_info = get_json_success('success!', data=json_format_info.result_obj)
return JsonResponse(json_info, safe=False)
| 33.854281 | 86 | 0.680835 | 2,429 | 18,586 | 4.850144 | 0.087279 | 0.048383 | 0.05407 | 0.086156 | 0.818182 | 0.778711 | 0.762838 | 0.739326 | 0.724047 | 0.709023 | 0 | 0.007288 | 0.224793 | 18,586 | 548 | 87 | 33.916058 | 0.810383 | 0.18105 | 0 | 0.676667 | 0 | 0 | 0.021547 | 0.00665 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04 | false | 0 | 0.063333 | 0 | 0.26 | 0.013333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4ecffbb55661ac45323dab0fa6c592e3c61a7d36 | 16,980 | py | Python | snakes/levels.py | japinol7/snakes | bb501736027897bacab498ad7bbbe622cf4b9755 | [
"MIT"
] | 12 | 2019-04-15T07:20:31.000Z | 2019-05-18T22:03:35.000Z | snakes/levels.py | japinol7/snakes | bb501736027897bacab498ad7bbbe622cf4b9755 | [
"MIT"
] | null | null | null | snakes/levels.py | japinol7/snakes | bb501736027897bacab498ad7bbbe622cf4b9755 | [
"MIT"
] | null | null | null | """Module levels."""
__author__ = 'Joan A. Pinol (japinol)'
from random import randint
from snakes.actors import Actor
from snakes.apples import Apple, AppleType
from snakes.bats import Bat, BatType
from snakes.bullets import Bullet
from snakes.cartridges import Cartridge, CartridgeType
from snakes.mines import Mine, MineType
from snakes.recovery_potions import RecoveryPotion, RecoveryPotionType
from snakes.snakes import Snake
APPLE_TYPE_01_LEVEL_MULTIPLIER = 0.19
APPLE_TYPE_02_LEVEL_MULTIPLIER = 0.24
APPLE_TYPE_03_LEVEL_MULTIPLIER = 0.29
MINE_TYPE_01_LEVEL_MULTIPLIER = 0.25
MINE_TYPE_02_LEVEL_MULTIPLIER = 0.17
BAT_TYPE_01_LEVEL_MULTIPLIER = 0.22
BAT_TYPE_02_LEVEL_MULTIPLIER = 0.17
BAT_TYPE_03_LEVEL_MULTIPLIER = 0.13
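These multipliers feed formulas of the form `int(1 + level * mult)` (apples) and `int(level * mult)` (mines, bats) in `Level_others.start_up` below. A small sketch of how the counts grow with the level number, reusing two of the constants:

```python
APPLE_TYPE_01_LEVEL_MULTIPLIER = 0.19
MINE_TYPE_01_LEVEL_MULTIPLIER = 0.25

# counts at levels 8, 10 and 20, using the same formulas as Level_others
apples = [int(1 + level * APPLE_TYPE_01_LEVEL_MULTIPLIER) for level in (8, 10, 20)]
mines = [int(level * MINE_TYPE_01_LEVEL_MULTIPLIER) for level in (8, 10, 20)]
```

Truncation via `int()` keeps the counts flat over several levels, so difficulty ramps up in steps rather than every level.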
class Level:
"""Represents a level."""
def __init__(self, game):
self.id = None
self.start_time = None
self.end_time = None
self.game = game
self.name = None
self.next_level = None
def start_up(self):
self.start_time = self.game.current_time
def clean_up(self):
self.end_time = self.game.current_time
# Remove current mines, cartridges and recovery potions from the board
for mine in self.game.mines:
mine.kill()
for cartridge in self.game.cartridges:
cartridge.kill()
for rec_potion in self.game.rec_potions:
rec_potion.kill()
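`start_up` and `clean_up` bracket a level's lifetime. A minimal sketch of how a game loop might drive that lifecycle, with simplified stand-ins for `Game` and `Level` (the real classes carry much more state):

```python
class Game:
    """Stripped-down stand-in: just a clock plus the item groups clean_up touches."""
    def __init__(self):
        self.current_time = 0
        self.mines, self.cartridges, self.rec_potions = [], [], []

class MiniLevel:
    def __init__(self, game):
        self.game = game
        self.start_time = self.end_time = None

    def start_up(self):
        self.start_time = self.game.current_time

    def clean_up(self):
        self.end_time = self.game.current_time
        # the real clean_up also kills leftover mines/cartridges/potions here

game = Game()
level = MiniLevel(game)
level.start_up()
game.current_time = 42   # time passes while the level is played
level.clean_up()
```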
class Level_01(Level):
"""Represents level 1."""
def __init__(self, game):
super().__init__(game)
self.id = 0
self.name = f'{self.id + 1:03}'
self.next_level = self.id + 1
# Initialize actors
Actor.initialize_actors([Apple, Mine, Bat, Snake, Bullet, Cartridge, RecoveryPotion])
def start_up(self):
super().start_up()
# Put apples on the board
Apple.create_some_random_pos(n=2, apple_type=AppleType.T1_RED,
apple_list=self.game.apples, game=self.game)
def clean_up(self):
super().clean_up()
class Level_02(Level):
"""Represents level 2."""
def __init__(self, game):
super().__init__(game)
self.id = 1
self.name = f'{self.id + 1:03}'
self.next_level = self.id + 1
def start_up(self):
super().start_up()
# Put apples on the board
Apple.create_some_random_pos(n=2, apple_type=AppleType.T2_GREEN,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=1, apple_type=AppleType.T3_YELLOW,
apple_list=self.game.apples, game=self.game)
def clean_up(self):
super().clean_up()
class Level_03(Level):
"""Represents level 3."""
def __init__(self, game):
super().__init__(game)
self.id = 2
self.name = f'{self.id + 1:03}'
self.next_level = self.id + 1
def start_up(self):
super().start_up()
# Put apples on the board
Apple.create_some_random_pos(n=1, apple_type=AppleType.T1_RED,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=1, apple_type=AppleType.T2_GREEN,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=1, apple_type=AppleType.T3_YELLOW,
apple_list=self.game.apples, game=self.game)
# Put mines on the board
Mine.create_some_random_pos(n=1, mine_type=MineType.T1_AQUA, mine_list=self.game.mines,
probability_each=80, game=self.game)
# Put bats on the board
Bat.create_some_random_pos(n=1, bat_type=BatType.T1_BLUE, bat_list=self.game.bats,
probability_each=30, game=self.game)
def clean_up(self):
super().clean_up()
class Level_04(Level):
"""Represents level 4."""
def __init__(self, game):
super().__init__(game)
self.id = 3
self.name = f'{self.id + 1:03}'
self.next_level = self.id + 1
def start_up(self):
super().start_up()
# Put apples on the board
Apple.create_some_random_pos(n=2, apple_type=AppleType.T1_RED,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=2, apple_type=AppleType.T3_YELLOW,
apple_list=self.game.apples, game=self.game)
# Put mines on the board
Mine.create_some_random_pos(n=1, mine_type=MineType.T1_AQUA, mine_list=self.game.mines,
probability_each=100, game=self.game)
Mine.create_some_random_pos(n=1, mine_type=MineType.T2_LILAC, mine_list=self.game.mines,
probability_each=80, game=self.game)
# Put bats on the board
Bat.create_some_random_pos(n=1, bat_type=BatType.T2_LILAC, bat_list=self.game.bats,
probability_each=80, game=self.game)
def clean_up(self):
super().clean_up()
class Level_05(Level):
"""Represents level 5."""
def __init__(self, game):
super().__init__(game)
self.id = 4
self.name = f'{self.id + 1:03}'
self.next_level = self.id + 1
def start_up(self):
super().start_up()
# Put apples on the board
Apple.create_some_random_pos(n=1, apple_type=AppleType.T1_RED, probability_each=100,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=3, apple_type=AppleType.T2_GREEN, probability_each=40,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=3, apple_type=AppleType.T3_YELLOW, probability_each=40,
apple_list=self.game.apples, game=self.game)
# Put mines on the board
Mine.create_some_random_pos(n=2, mine_type=MineType.T1_AQUA, mine_list=self.game.mines,
probability_each=100, game=self.game)
Mine.create_some_random_pos(n=1, mine_type=MineType.T2_LILAC, mine_list=self.game.mines,
probability_each=100, game=self.game)
Mine.create_some_random_pos(n=1, mine_type=MineType.T2_LILAC, mine_list=self.game.mines,
probability_each=30, game=self.game)
# Put bats on the board
Bat.create_some_random_pos(n=1, bat_type=BatType.T1_BLUE, bat_list=self.game.bats,
probability_each=100, game=self.game)
def clean_up(self):
super().clean_up()
class Level_06(Level):
"""Represents level 6."""
def __init__(self, game):
super().__init__(game)
self.id = 5
self.name = f'{self.id + 1:03}'
self.next_level = self.id + 1
def start_up(self):
super().start_up()
# Put apples on the board
Apple.create_some_random_pos(n=1, apple_type=AppleType.T1_RED, probability_each=100,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=3, apple_type=AppleType.T2_GREEN, probability_each=40,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=4, apple_type=AppleType.T3_YELLOW, probability_each=20,
apple_list=self.game.apples, game=self.game)
# Put mines on the board
Mine.create_some_random_pos(n=3, mine_type=MineType.T1_AQUA, mine_list=self.game.mines,
probability_each=75, game=self.game)
Mine.create_some_random_pos(n=1, mine_type=MineType.T2_LILAC, mine_list=self.game.mines,
probability_each=100, game=self.game)
Mine.create_some_random_pos(n=1, mine_type=MineType.T2_LILAC, mine_list=self.game.mines,
probability_each=30, game=self.game)
# Put bats on the board
Bat.create_some_random_pos(n=2, bat_type=BatType.T1_BLUE, bat_list=self.game.bats,
probability_each=60, game=self.game)
def clean_up(self):
super().clean_up()
class Level_07(Level):
"""Represents level 7."""
def __init__(self, game):
super().__init__(game)
self.id = 6
self.name = f'{self.id + 1:03}'
self.next_level = self.id + 1
def start_up(self):
super().start_up()
# Put apples on the board
Apple.create_some_random_pos(n=1, apple_type=AppleType.T3_YELLOW, probability_each=100,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=1, apple_type=AppleType.T1_RED, probability_each=40,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=1, apple_type=AppleType.T2_GREEN, probability_each=40,
apple_list=self.game.apples, game=self.game)
Apple.create_some_random_pos(n=1, apple_type=AppleType.T3_YELLOW, probability_each=40,
apple_list=self.game.apples, game=self.game)
# Put mines on the board
Mine.create_some_random_pos(n=2, mine_type=MineType.T1_AQUA, mine_list=self.game.mines,
probability_each=100, game=self.game)
Mine.create_some_random_pos(n=1, mine_type=MineType.T2_LILAC, mine_list=self.game.mines,
probability_each=100, game=self.game)
Mine.create_some_random_pos(n=1, mine_type=MineType.T2_LILAC, mine_list=self.game.mines,
probability_each=40, game=self.game)
# Put bats on the board
Bat.create_some_random_pos(n=1, bat_type=BatType.T2_LILAC, bat_list=self.game.bats,
probability_each=100, game=self.game)
def clean_up(self):
super().clean_up()
class Level_others(Level):
"""Represents other high levels."""
def __init__(self, game):
super().__init__(game)
self.id = 7
self.name = f'{self.id + 1:03}'
self.next_level = self.id
def start_up(self):
super().start_up()
self.id = self.game.current_level_no
self.name = f'{self.game.current_level_no + 1:03}'
level = self.game.current_level_no + 1
# Put apples on the board
Apple.create_some_random_pos(n=int(1 + level * APPLE_TYPE_01_LEVEL_MULTIPLIER),
apple_type=AppleType.T1_RED, apple_list=self.game.apples,
probability_each=75, game=self.game)
Apple.create_some_random_pos(n=int(1 + level * APPLE_TYPE_02_LEVEL_MULTIPLIER),
apple_type=AppleType.T2_GREEN, apple_list=self.game.apples,
probability_each=90, game=self.game)
Apple.create_some_random_pos(n=int(1 + level * APPLE_TYPE_03_LEVEL_MULTIPLIER),
apple_type=AppleType.T3_YELLOW, apple_list=self.game.apples,
probability_each=85, game=self.game)
# Put mines on the board
Mine.create_some_random_pos(n=int(level * MINE_TYPE_01_LEVEL_MULTIPLIER),
mine_type=MineType.T1_AQUA, mine_list=self.game.mines,
probability_each=80, game=self.game)
Mine.create_some_random_pos(n=int(level * MINE_TYPE_02_LEVEL_MULTIPLIER),
mine_type=MineType.T2_LILAC, mine_list=self.game.mines,
probability_each=70, game=self.game)
# Put cartridges and recovery potions on the board
random_items_prob = randint(1, 100)
if random_items_prob < 21:
Cartridge.create_some_random_pos(n=1, cartridge_type=CartridgeType.T1_LASER1,
cartridge_list=self.game.cartridges,
probability_each=100, game=self.game)
elif random_items_prob < 61:
Cartridge.create_some_random_pos(n=1, cartridge_type=CartridgeType.T2_LASER2,
cartridge_list=self.game.cartridges,
probability_each=100, game=self.game)
if level >= 14 and randint(1, 100) >= 80:
Cartridge.create_some_random_pos(n=1, cartridge_type=CartridgeType.T3_PHOTONIC,
cartridge_list=self.game.cartridges,
probability_each=80, game=self.game)
RecoveryPotion.create_some_random_pos(n=1, rec_potion_type=RecoveryPotionType.T2_POWER,
rec_potion_list=self.game.rec_potions,
probability_each=20, game=self.game)
elif random_items_prob < 81:
Cartridge.create_some_random_pos(n=1, cartridge_type=CartridgeType.T3_PHOTONIC,
cartridge_list=self.game.cartridges,
probability_each=90, game=self.game)
RecoveryPotion.create_some_random_pos(n=1, rec_potion_type=RecoveryPotionType.T1_HEALTH,
rec_potion_list=self.game.rec_potions,
probability_each=80, game=self.game)
if level >= 14 and randint(1, 100) >= 70:
Cartridge.create_some_random_pos(n=1, cartridge_type=CartridgeType.T4_NEUTRONIC,
cartridge_list=self.game.cartridges,
probability_each=60, game=self.game)
Cartridge.create_some_random_pos(n=1, cartridge_type=CartridgeType.T2_LASER2,
cartridge_list=self.game.cartridges,
probability_each=80, game=self.game)
else:
Cartridge.create_some_random_pos(n=1, cartridge_type=CartridgeType.T2_LASER2,
cartridge_list=self.game.cartridges,
probability_each=70, game=self.game)
Cartridge.create_some_random_pos(n=1, cartridge_type=CartridgeType.T3_PHOTONIC,
cartridge_list=self.game.cartridges,
probability_each=30, game=self.game)
if level >= 10 and randint(1, 100) >= 56:
RecoveryPotion.create_some_random_pos(n=1, rec_potion_type=RecoveryPotionType.T1_HEALTH,
rec_potion_list=self.game.rec_potions,
probability_each=60, game=self.game)
RecoveryPotion.create_some_random_pos(n=1, rec_potion_type=RecoveryPotionType.T2_POWER,
rec_potion_list=self.game.rec_potions,
probability_each=60, game=self.game)
# Put bats on the board
if level > 9:
Bat.create_some_random_pos(n=1, bat_type=BatType.T3_RED, bat_list=self.game.bats,
probability_each=90, game=self.game)
Bat.create_some_random_pos(n=1, bat_type=BatType.T2_LILAC, bat_list=self.game.bats,
probability_each=100, game=self.game)
Bat.create_some_random_pos(n=int(level * BAT_TYPE_01_LEVEL_MULTIPLIER),
bat_type=BatType.T1_BLUE, bat_list=self.game.bats,
probability_each=80, game=self.game)
Bat.create_some_random_pos(n=int(level * BAT_TYPE_02_LEVEL_MULTIPLIER),
bat_type=BatType.T2_LILAC, bat_list=self.game.bats,
probability_each=68, game=self.game)
Bat.create_some_random_pos(n=1, bat_type=BatType.T3_RED, bat_list=self.game.bats,
probability_each=34, game=self.game)
def clean_up(self):
super().clean_up()
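The module defines `Level_01` through `Level_07` plus a catch-all `Level_others`; one plausible way to pick a class from `game.current_level_no` (hypothetical helper, the actual selection logic is not shown in this module):

```python
def level_class_for(level_no, named_levels, fallback):
    """Return the entry for a 1-based level number, using the
    fallback (Level_others above) past the named levels."""
    if 1 <= level_no <= len(named_levels):
        return named_levels[level_no - 1]
    return fallback

# stand-in names for Level_01..Level_07 and Level_others
named = [f'Level_{i:02}' for i in range(1, 8)]
```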


# File: conman/redirects/views.py (repo: meshy/django-conman, license: BSD-2-Clause)
from django.views.generic import RedirectView
class RouteRedirectView(RedirectView):
"""Redirect to the target Route."""
permanent = False # Set to django 1.9's default to avoid RemovedInDjango19Warning
def get_redirect_url(self, *args, route, **kwargs):
"""
Return the route's target url.
Save the route's redirect type for use by RedirectView.
"""
self.permanent = route.permanent
return route.target.url
class URLRedirectView(RedirectView):
"""Redirect to a URLRedirect Route's target URL."""
permanent = False # Set to django 1.9's default to avoid RemovedInDjango19Warning
def get_redirect_url(self, *args, route, **kwargs):
"""
Return the target url.
Save the route's redirect type for use by RedirectView.
"""
self.permanent = route.permanent
return route.target
| 30 | 86 | 0.664444 | 110 | 900 | 5.4 | 0.336364 | 0.040404 | 0.045455 | 0.063973 | 0.703704 | 0.703704 | 0.703704 | 0.703704 | 0.703704 | 0.703704 | 0 | 0.011852 | 0.25 | 900 | 29 | 87 | 31.034483 | 0.868148 | 0.408889 | 0 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.090909 | 0 | 0.818182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
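The two RedirectView subclasses in the row above share one pattern: `get_redirect_url` takes the route as a keyword-only argument, records its `permanent` flag on the instance (so the base class picks the right HTTP status), and returns the target URL. A framework-free sketch of that pattern, with `FakeRoute` and `MiniRedirectView` as illustrative stand-ins rather than Django API:

```python
class FakeRoute:
    """Stand-in for a redirect route (illustrative only)."""
    def __init__(self, target, permanent):
        self.target = target
        self.permanent = permanent

class MiniRedirectView:
    """Minimal sketch of the RedirectView pattern used above."""
    permanent = False  # mirror the Django >= 1.9 default

    def get_redirect_url(self, *args, route, **kwargs):
        # Save the route's redirect type before returning its target,
        # exactly as the conman views do.
        self.permanent = route.permanent
        return route.target

view = MiniRedirectView()
url = view.get_redirect_url(route=FakeRoute('/new/', permanent=True))
```

In real Django, `self.permanent` set here is read afterwards by `RedirectView.get()` to choose between a 301 and a 302 response.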
14ddc3dbde35c395927d464ea0b243e891e67157 | 2,670 | py | Python | tests/test_visual_assessment_of_tendency.py | gags13/pyclustertend | c13c44ba4807d659536ff74aca0253d288e1d5b4 | [
"BSD-3-Clause"
] | null | null | null | tests/test_visual_assessment_of_tendency.py | gags13/pyclustertend | c13c44ba4807d659536ff74aca0253d288e1d5b4 | [
"BSD-3-Clause"
] | null | null | null | tests/test_visual_assessment_of_tendency.py | gags13/pyclustertend | c13c44ba4807d659536ff74aca0253d288e1d5b4 | [
"BSD-3-Clause"
] | null | null | null | import numpy as np
from pathlib import Path
from sklearn import datasets
from unittest.mock import patch
from pyclustertend import vat, ivat, compute_ordered_dissimilarity_matrix, compute_ivat_ordered_dissimilarity_matrix
TEST_DIR = Path(__file__).resolve().parent
def test_compute_ordered_dissimilarity_matrix():
# given
iris = datasets.load_iris()
iris_dataset = iris.data
expected_ordered_matrix = np.load(TEST_DIR / 'data/iris_vat.npy')
# when
ordered_matrix = compute_ordered_dissimilarity_matrix(iris_dataset)
# then
np.testing.assert_allclose(ordered_matrix, expected_ordered_matrix, atol=0.1)
def test_compute_ivat_ordered_dissimilarity_matrix():
# given
iris = datasets.load_iris()
iris_dataset = iris.data
expected_ordered_matrix = np.load(TEST_DIR / 'data/iris_ivat.npy')
# when
ordered_matrix = compute_ivat_ordered_dissimilarity_matrix(iris_dataset)
# then
np.testing.assert_allclose(ordered_matrix, expected_ordered_matrix, atol=0.1)
@patch('pyclustertend.visual_assessment_of_tendency.compute_ivat_ordered_dissimilarity_matrix')
def test_ivat_call_compute_ivat_ordered_dissimilarity_matrix_to_obtain_the_ordered_matrix(mock_compute_ivat):
# given
iris = datasets.load_iris()
iris_dataset = iris.data
mock_compute_ivat.return_value = np.ones((3, 3))
# when
ivat(iris_dataset)
# then
mock_compute_ivat.assert_called_once_with(iris_dataset)
@patch('pyclustertend.visual_assessment_of_tendency.compute_ordered_dissimilarity_matrix')
def test_vat_call_compute_ordered_dissimilarity_matrix_to_obtain_the_ordered_matrix(mock_compute_vat):
# given
iris = datasets.load_iris()
iris_dataset = iris.data
mock_compute_vat.return_value = np.ones((3, 3))
# when
vat(iris_dataset)
# then
mock_compute_vat.assert_called_once_with(iris_dataset)
@patch('pyclustertend.visual_assessment_of_tendency.compute_ivat_ordered_dissimilarity_matrix')
def test_ivat_does_not_return_the_matrix_by_default(mock_compute_ivat):
# given
iris = datasets.load_iris()
iris_dataset = iris.data
mock_compute_ivat.return_value = np.ones((3, 3))
# when
output_result = ivat(iris_dataset)
# then
assert output_result is None
@patch('pyclustertend.visual_assessment_of_tendency.compute_ordered_dissimilarity_matrix')
def test_vat_does_not_return_the_matrix_by_default(mock_compute_vat):
# given
iris = datasets.load_iris()
iris_dataset = iris.data
mock_compute_vat.return_value = np.ones((3, 3))
# when
output_result = vat(iris_dataset)
# then
assert output_result is None
| 29.340659 | 116 | 0.78015 | 361 | 2,670 | 5.310249 | 0.174515 | 0.080334 | 0.162754 | 0.113198 | 0.887846 | 0.86072 | 0.829421 | 0.829421 | 0.788732 | 0.743871 | 0 | 0.005259 | 0.145318 | 2,670 | 90 | 117 | 29.666667 | 0.834794 | 0.035581 | 0 | 0.565217 | 0 | 0 | 0.142801 | 0.129108 | 0 | 0 | 0 | 0 | 0.130435 | 1 | 0.130435 | false | 0 | 0.108696 | 0 | 0.23913 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
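The tests in the row above rely on how `unittest.mock.patch` works: it replaces the attribute in the module *where the name is looked up*, so code that resolves the name at call time sees the mock, and `assert_called_once_with` then verifies the collaboration. A self-contained sketch of that mechanism (the `compute`/`pipeline` names are illustrative):

```python
from unittest.mock import patch

def compute(data):
    return len(data)

def pipeline(data):
    # Resolves `compute` in this module at call time, so patching works.
    return compute(data)

# patch() must target the name where it is *used*, not where it is defined.
with patch(f'{__name__}.compute') as mock_compute:
    mock_compute.return_value = 42
    result = pipeline([1, 2, 3])
    # Verify the collaborator was invoked exactly once with the input.
    mock_compute.assert_called_once_with([1, 2, 3])
```

Outside the `with` block the original `compute` is restored automatically, which is why the decorator form used above needs no explicit teardown.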
14e43faba6e45303a571de79b0830b07f787a83d | 26 | py | Python | python/init.py | tasptz/gpuencode | 2bb9682d6c9184ae4d934fdffafcabfa52c41ceb | [
"MIT"
] | null | null | null | python/init.py | tasptz/gpuencode | 2bb9682d6c9184ae4d934fdffafcabfa52c41ceb | [
"MIT"
] | null | null | null | python/init.py | tasptz/gpuencode | 2bb9682d6c9184ae4d934fdffafcabfa52c41ceb | [
"MIT"
] | null | null | null | from .pygpuencode import * | 26 | 26 | 0.807692 | 3 | 26 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.913043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
14e9cce50207cdeb1051b49fc17d59edf14be3ff | 62 | py | Python | Chapter03/todo_app/models/__init__.py | PacktPublishing/Odoo-11-Development-Essentials-Third-Edition | 3cfeae0c2ce5a81d69f62a5be1ed28d74a7a78f5 | [
"MIT"
] | 31 | 2018-05-29T00:16:45.000Z | 2021-07-20T00:45:13.000Z | Chapter03/todo_app/models/__init__.py | PacktPublishing/Odoo-11-Development-Essentials-Third-Edition | 3cfeae0c2ce5a81d69f62a5be1ed28d74a7a78f5 | [
"MIT"
] | 1 | 2018-04-27T08:47:12.000Z | 2018-06-27T06:56:44.000Z | Chapter03/todo_app/models/__init__.py | PacktPublishing/Odoo-11-Development-Essentials-Third-Edition | 3cfeae0c2ce5a81d69f62a5be1ed28d74a7a78f5 | [
"MIT"
] | 31 | 2018-03-30T08:43:10.000Z | 2020-08-31T13:36:33.000Z | from . import todo_task_model
from . import res_partner_model
| 20.666667 | 31 | 0.83871 | 10 | 62 | 4.8 | 0.7 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 62 | 2 | 32 | 31 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
09209d0d4b78ee4fb949e05079bf7d32abeadf1d | 204 | py | Python | Python_Assignment_1/12.py | tanujdey7/Python | 5fe28582ac955cc46cc24af1c807023d768edbdb | [
"WTFPL"
] | null | null | null | Python_Assignment_1/12.py | tanujdey7/Python | 5fe28582ac955cc46cc24af1c807023d768edbdb | [
"WTFPL"
] | null | null | null | Python_Assignment_1/12.py | tanujdey7/Python | 5fe28582ac955cc46cc24af1c807023d768edbdb | [
"WTFPL"
] | null | null | null | num1 = 10
num2 = 20
print("Number 1: "+str(num1))
print("Number 2: "+str(num2))
num1 = num1 ^ num2
num2 = num2 ^ num1
num1 = num1 ^ num2
print("Number 1: "+str(num1))
print("Number 2: "+str(num2)) | 22.666667 | 30 | 0.607843 | 33 | 204 | 3.757576 | 0.272727 | 0.354839 | 0.193548 | 0.241935 | 0.612903 | 0.612903 | 0.612903 | 0.612903 | 0.612903 | 0.612903 | 0 | 0.140244 | 0.196078 | 204 | 9 | 31 | 22.666667 | 0.615854 | 0 | 0 | 0.666667 | 0 | 0 | 0.203046 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.444444 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
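The row above swaps two integers with the classic three-step XOR trick, avoiding a temporary variable. The steps, annotated, plus the idiomatic Python alternative (tuple unpacking), which needs no bit manipulation at all:

```python
num1, num2 = 10, 20

# XOR swap: each step re-derives one original value from the combined bits.
num1 ^= num2   # num1 now holds num1 ^ num2
num2 ^= num1   # (num1 ^ num2) ^ num2 == original num1
num1 ^= num2   # (num1 ^ num2) ^ num1 == original num2

# Idiomatic Python: tuple unpacking swaps in one step.
a, b = 10, 20
a, b = b, a
```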
116ea89680c208dcc92afa6cfd19dd7f7da6a315 | 14,743 | py | Python | tests/test_report_producer_json.py | jsoref/centos-package-cron | 0c7e3e24b91619916a515c8ef492dcfa863dae66 | [
"BSD-2-Clause"
] | 83 | 2015-03-19T09:07:57.000Z | 2021-10-14T02:19:58.000Z | tests/test_report_producer_json.py | jsoref/centos-package-cron | 0c7e3e24b91619916a515c8ef492dcfa863dae66 | [
"BSD-2-Clause"
] | 26 | 2015-01-08T17:29:10.000Z | 2020-03-04T19:56:19.000Z | tests/test_report_producer_json.py | jsoref/centos-package-cron | 0c7e3e24b91619916a515c8ef492dcfa863dae66 | [
"BSD-2-Clause"
] | 21 | 2016-05-17T19:22:56.000Z | 2021-02-15T14:27:08.000Z | # -*- coding: utf-8 -*-
import unittest
from centos_package_cron.report_producer import *
from centos_package_cron.package import *
from centos_package_cron.errata_item import *
from mock import Mock
import json
class db_session_fetcher_mock:
def __enter__(self):
return "foo"
def __exit__(self, type, value, traceback):
howdy = 'howdy'
class ReportProducerTest(unittest.TestCase):
def setUp(self):
self.pkg_fetcher_mock = Mock()
self.pkg_checker_mock = Mock()
self.annoyance_check_mock = Mock()
self.annoy_fetcher_mock = Mock()
self.annoy_fetcher_mock.fetch = Mock(return_value=self.annoyance_check_mock)
self.db_session_mock = db_session_fetcher_mock()
self.advisories_on_installed = []
self.pkg_checker_mock.findAdvisoriesOnInstalledPackages = lambda: self.advisories_on_installed
self.advisory_alerts_not_necessary = []
self.annoyance_check_mock.is_advisory_alert_necessary = lambda advisory: advisory not in self.advisory_alerts_not_necessary
self.general_update_not_necessary = []
self.annoyance_check_mock.is_package_alert_necessary = lambda package: package not in self.general_update_not_necessary
self.old_advisories_removed_for_advisory_set = []
self.annoyance_check_mock.remove_old_advisories = lambda active_ad: self.old_advisories_removed_for_advisory_set.append(active_ad)
self.old_general_alerts_removed = []
self.annoyance_check_mock.remove_old_alerts_for_package = lambda package: self.old_general_alerts_removed.append(package)
self.general_updates = []
self.pkg_fetcher_mock.get_package_updates = lambda: self.general_updates
self.changelogs = {}
self.pkg_fetcher_mock.get_package_changelog = lambda name,vers,release: self.changelogs[(name,vers,release)]
self.depends_on = {}
self.pkg_fetcher_mock.get_what_depends_on = lambda name: self.depends_on[name]
def get_producer(self,repo_exclude=[], repo_include=[], skip_old=True,include_depends_on=False):
return ReportProducer(repo_exclude,
repo_include,
skip_old,
'doesnt matter',
self.pkg_fetcher_mock,
self.pkg_checker_mock,
self.annoy_fetcher_mock,
self.db_session_mock,
include_depends_on)
def test_get_report_content_security_advisories_as_json(self):
# arrange
producer = self.get_producer()
package = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff'}
advisory = ErrataItem('adv id', ErrataType.SecurityAdvisory,ErrataSeverity.Important, ['i686','x86_64'], ['7'], [{'name': 'libgcrypt','version':'1.5.3', 'release':'4.el7', 'arch':'x86_64'}],[])
installed_packages = [package]
self.advisories_on_installed = [{'advisory': advisory, 'installed_packages':installed_packages}]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result)['advisories'] == [{"advisory_id": "adv id", "packages": ["libgcrypt-1.5.3-4.el7"], "severity": "Important"}]
def test_get_report_content_general_updates_as_json(self):
# arrange
producer = self.get_producer()
package = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package]
advisory = ErrataItem('adv id', ErrataType.SecurityAdvisory,ErrataSeverity.Important, ['i686','x86_64'], ['7'], [{'name': 'libgcrypt','version':'1.5.3', 'release':'4.el7', 'arch':'x86_64'}],[])
installed_packages = [package]
self.advisories_on_installed = [{'advisory': advisory, 'installed_packages':installed_packages}]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result)['updates'] == [{"name":"libgcrypt-1.5.3-4.el7","source":"updates"}]
def test_get_report_content_as_json_no_updates(self):
# arrange
producer = self.get_producer()
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {
'advisories':[],
'updates':[]
}
def test_get_report_content_as_json_no_updates_dependsonenabled(self):
# arrange
producer = self.get_producer(include_depends_on=True)
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {
'advisories':[],
'updates':[]
}
def test_get_report_content_as_json_general_but_no_security_advisories(self):
# arrange
producer = self.get_producer()
package = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package]
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff'}
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {
'advisories':[],
'updates':[{
"name":"libgcrypt-1.5.3-4.el7",
"source":"updates",
"changelog": "stuff"
}]
}
assert self.old_general_alerts_removed == [package]
def test_get_report_content_as_json_general_but_no_security_advisories_depends_on_enabled(self):
# arrange
producer = self.get_producer(include_depends_on=True)
package1 = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
package2 = Package('foo', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package1, package2]
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff', ('foo', '1.5.3', '4.el7'): 'bah'}
self.depends_on['libgcrypt'] = [Package('openssl', '1.2', '4.el7','x86_64', ''), Package('gnutls', '1.3', '4.el7','x86_64', '')]
self.depends_on['foo'] = [Package('bah', '1.2', '4.el7','x86_64', '')]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {
'advisories':[],
'updates':[
{
"name":"foo-1.5.3-4.el7",
"source":"updates",
"required_by": ["bah"],
"changelog": "bah"
},
{
"name":"libgcrypt-1.5.3-4.el7",
"source":"updates",
"required_by": ["openssl","gnutls"],
"changelog": "stuff"
}
]
}
assert self.old_general_alerts_removed == [package1, package2]
def test_get_report_content_as_json_general_but_no_security_advisories_depends_on_enabled_and_some_packages_have_no_dependencies(self):
# arrange
producer = self.get_producer(include_depends_on=True)
package1 = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
package2 = Package('foo', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package1, package2]
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff', ('foo', '1.5.3', '4.el7'): 'bah'}
self.depends_on['libgcrypt'] = []
self.depends_on['foo'] = [Package('bah', '1.2', '4.el7','x86_64', '')]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {
'advisories':[],
'updates':[
{
"name":"foo-1.5.3-4.el7",
"source":"updates",
"required_by": ["bah"],
"changelog": "bah"
},
{
"name":"libgcrypt-1.5.3-4.el7",
"source":"updates",
"changelog": "stuff"
}
]
}
assert self.old_general_alerts_removed == [package1, package2]
def test_get_report_content_as_json_unicode_in_changelog(self):
# arrange
producer = self.get_producer()
package = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package]
offending_changelog = '* Thu Aug 14 07:00:00 2014 Luk\xc3\xa1\xc5\xa1 Nykr\xc3\xbdn <lnykryn@redhat.com> - 9.49.17-1.1\n- fedora-readonly: fix prefix detection'
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): offending_changelog}
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {
'advisories':[],
'updates':[
{
"name":"libgcrypt-1.5.3-4.el7",
"source":"updates",
"changelog": u"* Thu Aug 14 07:00:00 2014 Lukáš Nykrýn <lnykryn@redhat.com> - 9.49.17-1.1\n- fedora-readonly: fix prefix detection"
}
]
}
def test_get_report_content_as_json_general_but_no_security_advisories_already_notified(self):
# arrange
producer = self.get_producer()
package = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package]
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff'}
self.general_update_not_necessary = [package]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {'advisories': [], 'updates': []}
assert self.old_general_alerts_removed == []
def test_get_report_content_as_json_both_security_and_general_updates(self):
# arrange
producer = self.get_producer()
package = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package]
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff'}
advisory = ErrataItem('adv id', ErrataType.SecurityAdvisory,ErrataSeverity.Important, ['i686','x86_64'], ['7'], [{'name': 'libgcrypt','version':'1.5.3', 'release':'4.el7', 'arch':'x86_64'}],[])
installed_packages = [package]
self.advisories_on_installed = [{'advisory': advisory, 'installed_packages': installed_packages}]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {
'advisories':[{
"advisory_id": "adv id",
"packages": ["libgcrypt-1.5.3-4.el7"],
"severity": "Important"}],
'updates':[{
"name":"libgcrypt-1.5.3-4.el7",
"source":"updates",
"changelog": "stuff"
}]
}
assert self.old_advisories_removed_for_advisory_set == [[advisory]]
assert self.old_general_alerts_removed == [package]
def test_get_report_content_as_json_both_security_and_general_updates_already_notified(self):
# arrange
producer = self.get_producer()
package = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package]
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff'}
advisory = ErrataItem('adv id', ErrataType.SecurityAdvisory,ErrataSeverity.Important, ['i686','x86_64'], ['7'], [{'name': 'libgcrypt','version':'1.5.3', 'release':'4.el7', 'arch':'x86_64'}],[])
installed_packages = [package]
self.advisories_on_installed = [{'advisory': advisory, 'installed_packages':installed_packages}]
self.general_update_not_necessary = self.general_updates
self.advisory_alerts_not_necessary = [advisory]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {'advisories': [], 'updates': []}
assert self.old_general_alerts_removed == []
def test_get_report_content_as_json_both_security_and_some_general_updates_already_notified(self):
# arrange
producer = self.get_producer()
package1 = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
package2 = Package('foo', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package1, package2]
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff', ('foo', '1.5.3', '4.el7'): 'bah'}
advisory = ErrataItem('adv id', ErrataType.SecurityAdvisory,ErrataSeverity.Important, ['i686','x86_64'], ['7'], [{'name': 'libgcrypt','version':'1.5.3', 'release':'4.el7', 'arch':'x86_64'}],[])
installed_packages = [package1]
self.advisories_on_installed = [{'advisory': advisory, 'installed_packages':installed_packages}]
self.general_update_not_necessary = [package1]
self.advisory_alerts_not_necessary = [advisory]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {'advisories': [], 'updates': [
{
"name":"foo-1.5.3-4.el7",
"source":"updates",
"changelog": "bah"
}
]}
assert self.old_general_alerts_removed == [package2]
def test_get_report_content_as_json_skip_old_turned_off(self):
# arrange
producer = self.get_producer(skip_old=False)
package = Package('libgcrypt', '1.5.3', '4.el7', 'x86_64', 'updates')
self.general_updates = [package]
self.changelogs = {('libgcrypt', '1.5.3', '4.el7'): 'stuff'}
advisory = ErrataItem('adv id', ErrataType.SecurityAdvisory,ErrataSeverity.Important, ['i686','x86_64'], ['7'], [{'name': 'libgcrypt','version':'1.5.3', 'release':'4.el7', 'arch':'x86_64'}],[])
installed_packages = [package]
self.advisories_on_installed = [{'advisory': advisory, 'installed_packages':installed_packages}]
# act
result = producer.get_report_content_as_json()
# assert
assert json.loads(result) == {
'advisories':[{
"advisory_id": "adv id",
"packages": ["libgcrypt-1.5.3-4.el7"],
"severity": "Important"}],
'updates':[{
"name":"libgcrypt-1.5.3-4.el7",
"source":"updates",
"changelog": "stuff"
}]
}
assert self.old_general_alerts_removed == []
assert self.old_advisories_removed_for_advisory_set == []
if __name__ == "__main__":
unittest.main() | 43.361765 | 201 | 0.590382 | 1,645 | 14,743 | 5.001824 | 0.09848 | 0.024307 | 0.016772 | 0.019446 | 0.858046 | 0.813077 | 0.763977 | 0.737482 | 0.71913 | 0.717914 | 0 | 0.041694 | 0.26304 | 14,743 | 340 | 202 | 43.361765 | 0.715601 | 0.018178 | 0 | 0.596 | 0 | 0.008 | 0.176931 | 0.014548 | 0 | 0 | 0 | 0 | 0.092 | 1 | 0.068 | false | 0 | 0.06 | 0.008 | 0.144 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
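The `setUp` in the row above uses a lightweight stubbing style: plain lambdas are assigned over a `Mock`'s auto-generated attributes to give deterministic return values, and side-effecting calls are recorded by appending to plain lists that the tests assert on afterwards. A compact sketch of the same technique (the fetcher method names here are illustrative):

```python
from unittest.mock import Mock

fetcher = Mock()
removed = []

# Assigning lambdas replaces the Mock's auto-created attributes with
# deterministic stubs; the list captures side-effecting calls for later
# assertions, mirroring e.g. self.old_general_alerts_removed above.
fetcher.get_package_updates = lambda: ['pkg-a', 'pkg-b']
fetcher.remove_old_alerts_for_package = lambda pkg: removed.append(pkg)

for pkg in fetcher.get_package_updates():
    fetcher.remove_old_alerts_for_package(pkg)
```

Compared with configuring `return_value`/`side_effect`, this style trades call-tracking (`assert_called_once_with` no longer works on the replaced attributes) for very terse setup.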
1197e17ec4a01a0e4f1509b0d46406d160ca238f | 33 | py | Python | mozetl/basic/__init__.py | harterrt/python_mozetl | 06dd376b6e3e60355ac3307ab75b0154c6d15d81 | [
"MIT"
] | 28 | 2017-05-01T20:06:19.000Z | 2021-11-14T19:41:51.000Z | mozetl/basic/__init__.py | harterrt/python_mozetl | 06dd376b6e3e60355ac3307ab75b0154c6d15d81 | [
"MIT"
] | 302 | 2017-04-25T17:59:54.000Z | 2022-03-24T13:19:34.000Z | mozetl/basic/__init__.py | harterrt/python_mozetl | 06dd376b6e3e60355ac3307ab75b0154c6d15d81 | [
"MIT"
] | 36 | 2017-04-25T18:31:37.000Z | 2022-01-25T02:05:20.000Z | from .transform import * # noqa
| 16.5 | 32 | 0.69697 | 4 | 33 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.212121 | 33 | 1 | 33 | 33 | 0.884615 | 0.121212 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
11fb6980bf1f144dd51c458fbf379a1e6f7363e7 | 134 | py | Python | npw/db/__init__.py | eirannejad/NavisPythonWrapper | 9eb22b2dfe7934dc06bb30d92e4cb3afd8a74843 | [
"MIT"
] | 9 | 2018-06-06T15:18:35.000Z | 2022-01-09T13:51:27.000Z | npw/db/__init__.py | eirannejad/NavisPythonWrapper | 9eb22b2dfe7934dc06bb30d92e4cb3afd8a74843 | [
"MIT"
] | null | null | null | npw/db/__init__.py | eirannejad/NavisPythonWrapper | 9eb22b2dfe7934dc06bb30d92e4cb3afd8a74843 | [
"MIT"
] | null | null | null | """Wrappers for Navisworks API classes."""
from npw.db import element
def wrap(nvswrks_mi):
return element.Element(nvswrks_mi)
| 16.75 | 42 | 0.746269 | 19 | 134 | 5.157895 | 0.789474 | 0.183673 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149254 | 134 | 7 | 43 | 19.142857 | 0.859649 | 0.268657 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
eeb87ef4c2446160b87cb9f1de127980c16e481a | 274 | py | Python | python/muse/algorithms/meta/Search.py | networkhermit/algorithm | 5c7221b12fac2de947a7c75ee40ff4ff519b443f | [
"MIT"
] | null | null | null | python/muse/algorithms/meta/Search.py | networkhermit/algorithm | 5c7221b12fac2de947a7c75ee40ff4ff519b443f | [
"MIT"
] | null | null | null | python/muse/algorithms/meta/Search.py | networkhermit/algorithm | 5c7221b12fac2de947a7c75ee40ff4ff519b443f | [
"MIT"
] | null | null | null | from muse.algorithms.search import BinarySearch
from muse.algorithms.search import LinearSearch
def binarySearch(arr: list, key: object) -> int:
return BinarySearch.find(arr, key)
def linearSearch(arr: list, key: object) -> int:
return LinearSearch.find(arr, key)
| 30.444444 | 48 | 0.759124 | 36 | 274 | 5.777778 | 0.416667 | 0.076923 | 0.173077 | 0.230769 | 0.528846 | 0.240385 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138686 | 274 | 8 | 49 | 34.25 | 0.881356 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
eebf6d6395ca275d1eb3750cf38c4bc912be3a51 | 300 | py | Python | app.py | msmannan00/Genesis-Auto-Crawler | c0cf79a0fc7a12e056108ffc24faf0d3baa949ad | [
"MIT"
] | 1 | 2020-03-02T17:19:50.000Z | 2020-03-02T17:19:50.000Z | app.py | msmannan00/Genesis-Auto-Crawler | c0cf79a0fc7a12e056108ffc24faf0d3baa949ad | [
"MIT"
] | null | null | null | app.py | msmannan00/Genesis-Auto-Crawler | c0cf79a0fc7a12e056108ffc24faf0d3baa949ad | [
"MIT"
] | null | null | null | from crawler.crawler_instance.application_controller.application_controller import application_controller
from crawler.crawler_instance.application_controller.application_enums import APPICATION_COMMANDS
application_controller.get_instance().invoke_triggers(APPICATION_COMMANDS.S_START_APPLICATION)
| 60 | 105 | 0.923333 | 33 | 300 | 7.969697 | 0.424242 | 0.39924 | 0.136882 | 0.197719 | 0.441065 | 0.441065 | 0.441065 | 0 | 0 | 0 | 0 | 0 | 0.033333 | 300 | 4 | 106 | 75 | 0.906897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
eec4ecfbb313be794b9e6bcad67fb73b31cd925f | 96 | py | Python | venv/lib/python3.8/site-packages/numpy/ma/tests/test_core.py | GiulianaPola/select_repeats | 17a0d053d4f874e42cf654dd142168c2ec8fbd11 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/numpy/ma/tests/test_core.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/numpy/ma/tests/test_core.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/b8/5e/b6/584e57840903af8c8365fcbe61d7d22164b39c04589d0f5cc7f79fe025 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.4375 | 0 | 96 | 1 | 96 | 96 | 0.458333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
eedd7362b2665a6bcd28e805fcbc51960ac81f9e | 26 | py | Python | backend/apps/authentication/migrations/__init__.py | apopovidis/speechex | e38da51d8b3e58cbaf5965a832de4e14fa0a73db | [
"MIT"
] | null | null | null | backend/apps/authentication/migrations/__init__.py | apopovidis/speechex | e38da51d8b3e58cbaf5965a832de4e14fa0a73db | [
"MIT"
] | null | null | null | backend/apps/authentication/migrations/__init__.py | apopovidis/speechex | e38da51d8b3e58cbaf5965a832de4e14fa0a73db | [
"MIT"
] | null | null | null | from ....settings import * | 26 | 26 | 0.692308 | 3 | 26 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e109f93feff6676da644531687ef9ee6d0bda07a | 29 | py | Python | lab_new/core.py | GuoBioinfoLab/guo-lab-home | b863ee02e14d5c7f2769ba5ab31be21fa1a36a8e | [
"MIT"
] | null | null | null | lab_new/core.py | GuoBioinfoLab/guo-lab-home | b863ee02e14d5c7f2769ba5ab31be21fa1a36a8e | [
"MIT"
] | null | null | null | lab_new/core.py | GuoBioinfoLab/guo-lab-home | b863ee02e14d5c7f2769ba5ab31be21fa1a36a8e | [
"MIT"
] | null | null | null |
from lab_new import app
| 7.25 | 24 | 0.689655 | 5 | 29 | 3.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.310345 | 29 | 3 | 25 | 9.666667 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e1149ed39a04f3dadc2c878a8ca3c60f5882a6e1 | 118 | py | Python | Python/Tests/TestData/DebuggerProject/EGGceptionOnCall.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 404 | 2019-05-07T02:21:57.000Z | 2022-03-31T17:03:04.000Z | Python/Tests/TestData/DebuggerProject/EGGceptionOnCall.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 1,672 | 2019-05-06T21:09:38.000Z | 2022-03-31T23:16:04.000Z | Python/Tests/TestData/DebuggerProject/EGGceptionOnCall.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 186 | 2019-05-13T03:17:37.000Z | 2022-03-31T16:24:05.000Z | import os, sys
sys.path.append(os.path.abspath('EGG.egg'))
import EGG.function_exception
EGG.function_exception.f()
| 16.857143 | 43 | 0.779661 | 19 | 118 | 4.736842 | 0.526316 | 0.244444 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076271 | 118 | 6 | 44 | 19.666667 | 0.825688 | 0 | 0 | 0 | 0 | 0 | 0.059322 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
011d1c046a10cc4eb13f0466fb6a9af732f23a24 | 26 | py | Python | ofsc/__init__.py | btoron/pyOFSC | 0a954ba8f43782ba46d5780efd57a2cde5f51cf4 | [
"MIT"
] | 3 | 2020-05-26T16:49:04.000Z | 2022-02-03T22:42:04.000Z | ofsc/__init__.py | btoron/pyOFSC | 0a954ba8f43782ba46d5780efd57a2cde5f51cf4 | [
"MIT"
] | 2 | 2021-04-06T18:14:30.000Z | 2021-08-24T13:01:05.000Z | ofsc/__init__.py | btoron/pyOFSC | 0a954ba8f43782ba46d5780efd57a2cde5f51cf4 | [
"MIT"
] | null | null | null | from .ofsc_proxy import *
| 13 | 25 | 0.769231 | 4 | 26 | 4.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 26 | 1 | 26 | 26 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6d6a96cda94093818265722e79579286e2ba083c | 16,267 | py | Python | tests/sc/test_organizations.py | csanders-git/pyTenable | dea25ba02b049bfe3a8cd151c155c3ccf9b2a285 | [
"MIT"
] | null | null | null | tests/sc/test_organizations.py | csanders-git/pyTenable | dea25ba02b049bfe3a8cd151c155c3ccf9b2a285 | [
"MIT"
] | null | null | null | tests/sc/test_organizations.py | csanders-git/pyTenable | dea25ba02b049bfe3a8cd151c155c3ccf9b2a285 | [
"MIT"
] | null | null | null | from tenable.errors import *
from ..checker import check
import pytest, os
def test_organizations_constructor_name_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(name=1)

def test_organizations_constructor_description_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(description=1)

def test_organizations_constructor_address_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(address=1)

def test_organizations_constructor_city_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(city=1)

def test_organizations_constructor_state_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(state=1)

def test_organizations_constructor_country_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(country=1)

def test_organizations_constructor_phone_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(phone=1)

def test_organizations_constructor_lce_ids_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(lce_ids=1)

def test_organizations_constructor_lce_ids_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(lce_ids=['one'])

def test_organizations_constructor_zone_selection_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(zone_selection=1)

def test_organizations_constructor_zone_selection_unexpectedvalueerror(sc):
    with pytest.raises(UnexpectedValueError):
        sc.organizations._constructor(zone_selection='something')

def test_organizations_constructor_restricted_ips_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(restricted_ips=1)

def test_organizations_constructor_restricted_ips_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(restricted_ips=[1])

def test_organizations_constructor_repos_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(repos=1)

def test_organizations_constructor_repos_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(repos=['one'])

def test_organizations_constructor_pub_sites_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(pub_sites=1)

def test_organizations_constructor_pub_sites_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(pub_sites=['one'])

def test_organizations_constructor_ldap_ids_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(ldap_ids=1)

def test_organizations_constructor_ldap_ids_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(ldap_ids=['one'])

def test_organizations_constructor_nessus_managers_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(nessus_managers=1)

def test_organizations_constructor_nessus_managers_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(nessus_managers=['one'])

def test_organizations_constructor_info_links_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(info_links=1)

def test_organizations_constructor_info_links_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(info_links=[1])

def test_organizations_constructor_info_links_item_name_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(info_links=[(1, 'http://site.com/%IP%')])

def test_organizations_constructor_info_links_item_link_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(info_links=[('name', 1)])

def test_organizations_constructor_vuln_score_low_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(vuln_score_low='one')

def test_organizations_constructor_vuln_score_medium_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(vuln_score_medium='one')

def test_organizations_constructor_vuln_score_high_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(vuln_score_high='one')

def test_organizations_constructor_vuln_score_critical_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations._constructor(vuln_score_critical='one')

def test_organizations_constructor_success(sc):
    o = sc.organizations._constructor(
        name='name',
        description='description',
        address='123 main street',
        city='anytown',
        state='IL',
        country='USA',
        phone='999.888.7766',
        lce_ids=[1, 2, 3],
        zone_selection='auto_only',
        restricted_ips=['127.0.0.1', '127.0.0.0/8'],
        repos=[1, 2, 3],
        pub_sites=[1, 2, 3],
        ldap_ids=[1, 2, 3],
        nessus_managers=[1, 2, 3],
        info_links=[('link', 'http://url/%IP%')],
        vuln_score_low=1,
        vuln_score_medium=5,
        vuln_score_high=10,
        vuln_score_critical=40,
    )
    assert o == {
        'name': 'name',
        'description': 'description',
        'address': '123 main street',
        'city': 'anytown',
        'state': 'IL',
        'country': 'USA',
        'phone': '999.888.7766',
        'lces': [{'id': 1}, {'id': 2}, {'id': 3}],
        'zoneSelection': 'auto_only',
        'restrictedIPs': '127.0.0.1,127.0.0.0/8',
        'repositories': [{'id': 1}, {'id': 2}, {'id': 3}],
        'pubSites': [{'id': 1}, {'id': 2}, {'id': 3}],
        'ldaps': [{'id': 1}, {'id': 2}, {'id': 3}],
        'nessusManagers': [{'id': 1}, {'id': 2}, {'id': 3}],
        'ipInfoLinks': [{'name': 'link', 'link': 'http://url/%IP%'}],
        'vulnScoreLow': 1,
        'vulnScoreMedium': 5,
        'vulnScoreHigh': 10,
        'vulnScoreCritical': 40
    }
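The success test above pins down the snake_case-to-camelCase document mapping that `_constructor` performs. A minimal illustrative sketch of that mapping (a hypothetical helper, not the actual pyTenable implementation):

```python
def org_document(**kwargs):
    # Hypothetical sketch: turn snake_case keyword arguments into the
    # camelCase document shape asserted in the test above.
    doc = {}
    if 'lce_ids' in kwargs:
        doc['lces'] = [{'id': i} for i in kwargs['lce_ids']]
    if 'restricted_ips' in kwargs:
        # the API expects a single comma-separated string of IPs/CIDRs
        doc['restrictedIPs'] = ','.join(kwargs['restricted_ips'])
    if 'info_links' in kwargs:
        doc['ipInfoLinks'] = [
            {'name': name, 'link': link} for name, link in kwargs['info_links']
        ]
    return doc

print(org_document(lce_ids=[1, 2, 3], restricted_ips=['127.0.0.1', '127.0.0.0/8']))
# → {'lces': [{'id': 1}, {'id': 2}, {'id': 3}], 'restrictedIPs': '127.0.0.1,127.0.0.0/8'}
```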
@pytest.fixture
def org(request, admin, vcr):
    with vcr.use_cassette('test_organizations_create_success'):
        o = admin.organizations.create('New Org')

    def teardown():
        try:
            with vcr.use_cassette('test_organizations_delete_success'):
                admin.organizations.delete(int(o['id']))
        except APIError:
            pass

    request.addfinalizer(teardown)
    return o

@pytest.mark.vcr()
def test_organizations_create_success(admin, org):
    assert isinstance(org, dict)
    check(org, 'id', str)
    check(org, 'name', str)
    check(org, 'description', str)
    check(org, 'email', str)
    check(org, 'address', str)
    check(org, 'city', str)
    check(org, 'state', str)
    check(org, 'country', str)
    check(org, 'phone', str)
    check(org, 'fax', str)
    check(org, 'ipInfoLinks', list)
    for i in org['ipInfoLinks']:
        check(i, 'name', str)
        check(i, 'link', str)
    check(org, 'zoneSelection', str)
    check(org, 'restrictedIPs', str)
    check(org, 'vulnScoreLow', str)
    check(org, 'vulnScoreMedium', str)
    check(org, 'vulnScoreHigh', str)
    check(org, 'vulnScoreCritical', str)
    check(org, 'createdTime', str)
    check(org, 'modifiedTime', str)
    check(org, 'userCount', str)
    check(org, 'lces', list)
    for i in org['lces']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(org, 'repositories', list)
    for i in org['repositories']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
        check(i, 'type', str)
        check(i, 'dataFormat', str)
        check(i, 'groupAssign', str)
    check(org, 'zones', list)
    for i in org['zones']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(org, 'ldaps', list)
    for i in org['ldaps']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(org, 'nessusManagers', list)
    for i in org['nessusManagers']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(org, 'pubSites', list)
    for i in org['pubSites']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)

@pytest.mark.vcr()
def test_organizations_delete_success(admin, org):
    admin.organizations.delete(int(org['id']))

@pytest.mark.vcr()
def test_organizations_details_success(admin, org):
    o = admin.organizations.details(int(org['id']))
    assert isinstance(o, dict)
    check(o, 'id', str)
    check(o, 'name', str)
    check(o, 'description', str)
    check(o, 'email', str)
    check(o, 'address', str)
    check(o, 'city', str)
    check(o, 'state', str)
    check(o, 'country', str)
    check(o, 'phone', str)
    check(o, 'fax', str)
    check(o, 'ipInfoLinks', list)
    for i in o['ipInfoLinks']:
        check(i, 'name', str)
        check(i, 'link', str)
    check(o, 'zoneSelection', str)
    check(o, 'restrictedIPs', str)
    check(o, 'vulnScoreLow', str)
    check(o, 'vulnScoreMedium', str)
    check(o, 'vulnScoreHigh', str)
    check(o, 'vulnScoreCritical', str)
    check(o, 'createdTime', str)
    check(o, 'modifiedTime', str)
    check(o, 'userCount', str)
    check(o, 'lces', list)
    for i in o['lces']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(o, 'repositories', list)
    for i in o['repositories']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
        check(i, 'type', str)
        check(i, 'dataFormat', str)
        check(i, 'groupAssign', str)
    check(o, 'zones', list)
    for i in o['zones']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(o, 'ldaps', list)
    for i in o['ldaps']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(o, 'nessusManagers', list)
    for i in o['nessusManagers']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(o, 'pubSites', list)
    for i in o['pubSites']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)

@pytest.mark.vcr()
def test_organizations_edit_success(admin, org):
    o = admin.organizations.edit(int(org['id']), name='new org name')
    assert isinstance(o, dict)
    check(o, 'id', str)
    check(o, 'name', str)
    check(o, 'description', str)
    check(o, 'email', str)
    check(o, 'address', str)
    check(o, 'city', str)
    check(o, 'state', str)
    check(o, 'country', str)
    check(o, 'phone', str)
    check(o, 'fax', str)
    check(o, 'ipInfoLinks', list)
    for i in o['ipInfoLinks']:
        check(i, 'name', str)
        check(i, 'link', str)
    check(o, 'zoneSelection', str)
    check(o, 'restrictedIPs', str)
    check(o, 'vulnScoreLow', str)
    check(o, 'vulnScoreMedium', str)
    check(o, 'vulnScoreHigh', str)
    check(o, 'vulnScoreCritical', str)
    check(o, 'createdTime', str)
    check(o, 'modifiedTime', str)
    check(o, 'userCount', str)
    check(o, 'lces', list)
    for i in o['lces']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(o, 'repositories', list)
    for i in o['repositories']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
        check(i, 'type', str)
        check(i, 'dataFormat', str)
        check(i, 'groupAssign', str)
    check(o, 'zones', list)
    for i in o['zones']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(o, 'ldaps', list)
    for i in o['ldaps']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(o, 'nessusManagers', list)
    for i in o['nessusManagers']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)
    check(o, 'pubSites', list)
    for i in o['pubSites']:
        check(i, 'id', str)
        check(i, 'name', str)
        check(i, 'description', str)

@pytest.mark.vcr()
def test_organizations_accept_risk_rules_id_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.accept_risk_rules('one')

@pytest.mark.vcr()
def test_organizations_accept_risk_rules_repos_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.accept_risk_rules(1, repos=1)

@pytest.mark.vcr()
def test_organizations_accept_risk_rules_repos_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.accept_risk_rules(1, repos=['one'])

@pytest.mark.vcr()
def test_organizations_accept_risk_rules_plugin_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.accept_risk_rules(1, plugin='one')

@pytest.mark.vcr()
def test_organizations_accept_risk_rules_port_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.accept_risk_rules(1, port='one')

@pytest.mark.vcr()
def test_organizations_accept_risk_rules_success(admin, org):
    rules = admin.organizations.accept_risk_rules(int(org['id']))
    assert isinstance(rules, list)
    for r in rules:
        check(r, 'id', str)
        check(r, 'hostType', str)
        check(r, 'hostValue', str)
        check(r, 'port', str)
        check(r, 'protocol', str)
        check(r, 'expires', str)
        check(r, 'status', str)
        check(r, 'repository', dict)
        check(r['repository'], 'id', str)
        check(r['repository'], 'name', str)
        check(r['repository'], 'description', str)
        check(r, 'organization', dict)
        check(r['organization'], 'id', str)
        check(r['organization'], 'name', str)
        check(r['organization'], 'description', str)
        check(r, 'user', dict)
        check(r['user'], 'id', str)
        check(r['user'], 'username', str)
        check(r['user'], 'firstname', str)
        check(r['user'], 'lastname', str)
        check(r, 'plugin', dict)
        check(r['plugin'], 'id', str)
        check(r['plugin'], 'name', str)
        check(r['plugin'], 'description', str)

@pytest.mark.vcr()
def test_organizations_recast_risk_rules_id_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.recast_risk_rules('one')

@pytest.mark.vcr()
def test_organizations_recast_risk_rules_repos_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.recast_risk_rules(1, repos=1)

@pytest.mark.vcr()
def test_organizations_recast_risk_rules_repos_item_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.recast_risk_rules(1, repos=['one'])

@pytest.mark.vcr()
def test_organizations_recast_risk_rules_plugin_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.recast_risk_rules(1, plugin='one')

@pytest.mark.vcr()
def test_organizations_recast_risk_rules_port_typeerror(sc):
    with pytest.raises(TypeError):
        sc.organizations.recast_risk_rules(1, port='one')

@pytest.mark.vcr()
def test_organizations_recast_risk_rules_success(admin, org):
    rules = admin.organizations.recast_risk_rules(int(org['id']))
    assert isinstance(rules, list)
    for r in rules:
        check(r, 'id', str)
        check(r, 'hostType', str)
        check(r, 'hostValue', str)
        check(r, 'port', str)
        check(r, 'protocol', str)
        check(r, 'status', str)
        check(r, 'repository', dict)
        check(r['repository'], 'id', str)
        check(r['repository'], 'name', str)
        check(r['repository'], 'description', str)
        check(r, 'organization', dict)
        check(r['organization'], 'id', str)
        check(r['organization'], 'name', str)
        check(r['organization'], 'description', str)
        check(r, 'user', dict)
        check(r['user'], 'id', str)
        check(r['user'], 'username', str)
        check(r['user'], 'firstname', str)
        check(r['user'], 'lastname', str)
        check(r, 'plugin', dict)
        check(r['plugin'], 'id', str)
        check(r['plugin'], 'name', str)
check(r['plugin'], 'description', str) | 34.537155 | 79 | 0.639577 | 2,036 | 16,267 | 4.942534 | 0.070727 | 0.127199 | 0.044718 | 0.069761 | 0.866541 | 0.81745 | 0.738746 | 0.714598 | 0.707046 | 0.656365 | 0 | 0.009542 | 0.207537 | 16,267 | 471 | 80 | 34.537155 | 0.771081 | 0 | 0 | 0.568396 | 0 | 0 | 0.145562 | 0.005348 | 0 | 0 | 0 | 0 | 0.014151 | 1 | 0.113208 | false | 0.009434 | 0.007075 | 0 | 0.122642 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6d9ebe6edc833812931953e71687a2335b76ec75 | 95 | py | Python | cliutils/__init__.py | grecoe/CliSkin | 536b555a52e00b6a3b52ae7ab0da6e708ee8b65b | [
"MIT"
] | null | null | null | cliutils/__init__.py | grecoe/CliSkin | 536b555a52e00b6a3b52ae7ab0da6e708ee8b65b | [
"MIT"
] | null | null | null | cliutils/__init__.py | grecoe/CliSkin | 536b555a52e00b6a3b52ae7ab0da6e708ee8b65b | [
"MIT"
] | null | null | null | from cliutils.configuration import Configuration
from cliutils.app_helper import CliAppHelpers
| 31.666667 | 48 | 0.894737 | 11 | 95 | 7.636364 | 0.636364 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084211 | 95 | 2 | 49 | 47.5 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6da8752f7754f3a8540469f620fd066123c1751d | 3,964 | py | Python | Generative/MAKEDATA/util/integrator.py | NREL/GANISP | 3ce6979e26f837d05b8f7cfbe2b949f900b6026b | [
"BSD-3-Clause"
] | null | null | null | Generative/MAKEDATA/util/integrator.py | NREL/GANISP | 3ce6979e26f837d05b8f7cfbe2b949f900b6026b | [
"BSD-3-Clause"
] | null | null | null | Generative/MAKEDATA/util/integrator.py | NREL/GANISP | 3ce6979e26f837d05b8f7cfbe2b949f900b6026b | [
"BSD-3-Clause"
] | 1 | 2022-02-23T13:48:06.000Z | 2022-02-23T13:48:06.000Z | import numpy as np
def l96StepRK2(u,Sim):
    # One explicit-midpoint (RK2) step of the Lorenz-96 system.
    utmp = u.copy()
    source = u[Sim['im'],:] * (u[Sim['ip'],:] - u[Sim['im2'],:]) + Sim['R'] - u
    k1 = utmp + Sim['Timestep'] * 0.5 * source
    source = k1[Sim['im'],:] * (k1[Sim['ip'],:] - k1[Sim['im2'],:]) + Sim['R'] - k1
    u = utmp + Sim['Timestep'] * source
    return u

def l96StepBackRK2(u,Sim):
    # RK2 step of the sign-reversed (backward-in-time) Lorenz-96 system.
    utmp = u.copy()
    source = -u[Sim['im'],:] * (u[Sim['ip'],:] - u[Sim['im2'],:]) - Sim['R'] + u
    k1 = utmp + Sim['Timestep'] * 0.5 * source
    source = -k1[Sim['im'],:] * (k1[Sim['ip'],:] - k1[Sim['im2'],:]) - Sim['R'] + k1
    u = utmp + Sim['Timestep'] * source
    return u

def l96qoi(u):
    # Quantity of interest: mean energy of the state.
    return np.sum(u**2)/(2*len(u))

def ksStepETDRK4(u,Sim):
    # One ETDRK4 (exponential time differencing RK4) step of the
    # Kuramoto-Sivashinsky equation, advanced in Fourier space.
    g = Sim['g']
    E = Sim['E']
    E_2 = Sim['E_2']
    Q = Sim['Q']
    f1 = Sim['f1']
    f2 = Sim['f2']
    f3 = Sim['f3']
    v = np.fft.fft(u,axis=0)
    Nv = g*np.fft.fft(np.real(np.fft.ifft(v,axis=0))**2,axis=0)
    a = E_2*v + Q*Nv
    Na = g*np.fft.fft(np.real(np.fft.ifft(a,axis=0))**2,axis=0)
    b = E_2*v + Q*Na
    Nb = g*np.fft.fft(np.real(np.fft.ifft(b,axis=0))**2,axis=0)
    c = E_2*a + Q*(2*Nb-Nv)
    Nc = g*np.fft.fft(np.real(np.fft.ifft(c,axis=0))**2,axis=0)
    v = E*v + Nv*f1 + 2*(Na+Nb)*f2 + Nc*f3
    u = np.real(np.fft.ifft(v,axis=0))
    return u

def ksStepRK2(u,Sim):
    # Pseudo-spectral RK2 step of the KS equation; aliased modes are zeroed
    # before each nonlinear-term evaluation.
    g = Sim['g']
    E = Sim['E']
    E_2 = Sim['E_2']
    Q = Sim['Q']
    f1 = Sim['f1']
    f2 = Sim['f2']
    f3 = Sim['f3']
    k = Sim['k']
    h = Sim['Timestep']
    indexAlias = Sim['indexAlias']
    v = np.fft.fft(u,axis=0)
    vcopy = v.copy()
    v[indexAlias]=0
    source = (k**2-k**4)*v - 0.5j*k*np.fft.fft(np.fft.ifft(v,axis=0)**2,axis=0)
    k1 = vcopy + h * 0.5 * source
    k1[indexAlias]=0
    source = (k**2-k**4)*k1 - 0.5j*k*np.fft.fft(np.fft.ifft(k1,axis=0)**2,axis=0)
    v = vcopy + h * source
    u = np.real(np.fft.ifft(v,axis=0))
    return u

def ksStepBackRegularizedETDRK4(u,Sim):
    # ETDRK4 step of the regularized backward KS equation; the dealiasing
    # branch is disabled by default.
    beta = Sim['beta']
    k = Sim['k']
    h = Sim['Timestep']
    Ndof = Sim['Ndof']
    g = Sim['g']
    E = Sim['Eback']
    E_2 = Sim['E_2back']
    Q = Sim['Qback']
    f1 = Sim['f1back']
    f2 = Sim['f2back']
    f3 = Sim['f3back']
    indexAlias = Sim['indexAlias']
    dealias = False
    v = np.fft.fft(u,axis=0)
    if dealias:
        vtmp = v.copy()
        vtmp[indexAlias] = 0
        Nv = g*np.fft.fft(np.real(np.fft.ifft(vtmp,axis=0))**2,axis=0)
        a = E_2*v + Q*Nv
        atmp = a.copy()
        atmp[indexAlias] = 0
        Na = g*np.fft.fft(np.real(np.fft.ifft(atmp,axis=0))**2,axis=0)
        b = E_2*v + Q*Na
        btmp = b.copy()
        btmp[indexAlias] = 0
        Nb = g*np.fft.fft(np.real(np.fft.ifft(btmp,axis=0))**2,axis=0)
        c = E_2*a + Q*(2*Nb-Nv)
        ctmp = c.copy()
        ctmp[indexAlias] = 0
        Nc = g*np.fft.fft(np.real(np.fft.ifft(c,axis=0))**2,axis=0)
    else:
        Nv = g*np.fft.fft(np.real(np.fft.ifft(v,axis=0))**2,axis=0)
        a = E_2*v + Q*Nv
        Na = g*np.fft.fft(np.real(np.fft.ifft(a,axis=0))**2,axis=0)
        b = E_2*v + Q*Na
        Nb = g*np.fft.fft(np.real(np.fft.ifft(b,axis=0))**2,axis=0)
        c = E_2*a + Q*(2*Nb-Nv)
        Nc = g*np.fft.fft(np.real(np.fft.ifft(c,axis=0))**2,axis=0)
    v = E*v + Nv*f1 + 2*(Na+Nb)*f2 + Nc*f3
    u = np.real(np.fft.ifft(v,axis=0))
    return u

def ksStepBackRegularizedRK2(u,Sim):
    # RK2 step of the regularized backward KS equation; the (1+beta*k^4)
    # factor damps the high wavenumbers and aliased modes are zeroed.
    beta = Sim['beta']
    k = Sim['k']
    h = Sim['Timestep']
    Ndof = Sim['Ndof']
    indexAlias = Sim['indexAlias']
    v = np.fft.fft(u,axis=0)
    vcopy = v.copy()
    v[indexAlias]=0
    source = -(k**2-k**4)*v/(1+beta*(k**4)) + 0.5j*k*np.fft.fft(np.fft.ifft(v,axis=0)**2,axis=0)/(1+beta*(k**4))
    k1 = vcopy + h * 0.5 * source
    k1[indexAlias]=0
    source = -(k**2-k**4)*k1/(1+beta*(k**4)) + 0.5j*k*np.fft.fft(np.fft.ifft(k1,axis=0)**2,axis=0)/(1+beta*(k**4))
    v = vcopy + h * source
    u = np.real(np.fft.ifft(v,axis=0))
    return u
| 27.527778 | 115 | 0.49773 | 746 | 3,964 | 2.624665 | 0.093834 | 0.102145 | 0.081716 | 0.081716 | 0.818693 | 0.814096 | 0.805414 | 0.790603 | 0.790603 | 0.783453 | 0 | 0.059783 | 0.257316 | 3,964 | 143 | 116 | 27.72028 | 0.605299 | 0 | 0 | 0.678261 | 0 | 0 | 0.04971 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.06087 | false | 0 | 0.008696 | 0.008696 | 0.130435 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
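Every `*RK2` function above follows the same two-stage explicit-midpoint pattern: evaluate the source term, take a half step, then advance the full step using the midpoint slope. A minimal scalar sketch of that pattern (illustrative only, not taken from the repository):

```python
def rk2_step(f, u, dt):
    # One explicit-midpoint (RK2) step for du/dt = f(u):
    # half-step to the midpoint, then advance with the midpoint slope.
    k1 = u + 0.5 * dt * f(u)
    return u + dt * f(k1)

# Linear decay du/dt = -u over one step of size 0.1
u = rk2_step(lambda x: -x, 1.0, 0.1)  # ≈ 0.905
```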
6db8068347c1ef7647d68c9f10de5b5e35628040 | 139 | py | Python | kao_generator/__init__.py | cloew/KaoGenerator | c749814a8ce2bff70acf28982ae424369ff349c8 | [
"MIT"
] | null | null | null | kao_generator/__init__.py | cloew/KaoGenerator | c749814a8ce2bff70acf28982ae424369ff349c8 | [
"MIT"
] | null | null | null | kao_generator/__init__.py | cloew/KaoGenerator | c749814a8ce2bff70acf28982ae424369ff349c8 | [
"MIT"
] | null | null | null | from .gen_or_fn import GenOrFn, RunGenOrFn
from .kao_generator import KaoGenerator
from .decorators import kao_generator, wrap_generators | 46.333333 | 54 | 0.856115 | 19 | 139 | 6 | 0.684211 | 0.210526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107914 | 139 | 3 | 54 | 46.333333 | 0.919355 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6db97cf5c1683e4cf39dc4952a2a362333621bf5 | 119 | py | Python | tests/basics/list_insert.py | learnforpractice/micropython-cpp | 004bc8382f74899e7b876cc29bfa6a9cc976ba10 | [
"MIT"
] | 13,648 | 2015-01-01T01:34:51.000Z | 2022-03-31T16:19:53.000Z | tests/basics/list_insert.py | learnforpractice/micropython-cpp | 004bc8382f74899e7b876cc29bfa6a9cc976ba10 | [
"MIT"
] | 7,092 | 2015-01-01T07:59:11.000Z | 2022-03-31T23:52:18.000Z | tests/basics/list_insert.py | learnforpractice/micropython-cpp | 004bc8382f74899e7b876cc29bfa6a9cc976ba10 | [
"MIT"
] | 4,942 | 2015-01-02T11:48:50.000Z | 2022-03-31T19:57:10.000Z | a = [1, 2, 3]
a.insert(1, 42)
print(a)
a.insert(-1, -1)
print(a)
a.insert(99, 99)
print(a)
a.insert(-99, -99)
print(a)
| 11.9 | 18 | 0.579832 | 28 | 119 | 2.464286 | 0.285714 | 0.405797 | 0.304348 | 0.565217 | 0.57971 | 0.57971 | 0.57971 | 0.57971 | 0 | 0 | 0 | 0.156863 | 0.142857 | 119 | 9 | 19 | 13.222222 | 0.519608 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.444444 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
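The prints above exercise Python's index-clamping behaviour: `list.insert` never raises for an out-of-range index, it clamps it to the ends of the list. A quick sketch of those semantics:

```python
a = [1, 2, 3]
a.insert(99, 99)   # an index past the end behaves like append
assert a == [1, 2, 3, 99]
a.insert(-99, 0)   # an index before the start behaves like prepend
assert a == [0, 1, 2, 3, 99]
```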
0984b328e438fe37d3edac1323cd91b33fc30400 | 169 | py | Python | fastapi_rest_jsonapi/resource/utils.py | Zenor27/fastapi-rest-jsonapi | 1c6eaad0791949bbaf9f4032fb7ecd483e80a02a | [
"MIT"
] | 2 | 2022-03-01T00:59:04.000Z | 2022-03-03T06:17:51.000Z | fastapi_rest_jsonapi/resource/utils.py | Zenor27/fastapi-rest-jsonapi | 1c6eaad0791949bbaf9f4032fb7ecd483e80a02a | [
"MIT"
] | 9 | 2022-01-16T15:47:35.000Z | 2022-03-28T18:47:18.000Z | fastapi_rest_jsonapi/resource/utils.py | Zenor27/fastapi-rest-jsonapi | 1c6eaad0791949bbaf9f4032fb7ecd483e80a02a | [
"MIT"
] | null | null | null | from fastapi_rest_jsonapi.resource import Resource, ResourceDetail
def is_detail_resource(resource: Resource) -> bool:
    return issubclass(resource, ResourceDetail)
| 28.166667 | 66 | 0.822485 | 19 | 169 | 7.105263 | 0.684211 | 0.325926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112426 | 169 | 5 | 67 | 33.8 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 6 |
09858f7386b3216a5cc50ad99cf9bcba2fc3da2b | 9,809 | py | Python | test/test_senders/test_mh.py | jwodder/outgoing | 133d38c980fd6fe59e4edf0bb0f89392ecde68de | [
"MIT"
] | 4 | 2021-03-05T00:53:03.000Z | 2022-01-04T20:59:19.000Z | test/test_senders/test_mh.py | jwodder/outgoing | 133d38c980fd6fe59e4edf0bb0f89392ecde68de | [
"MIT"
] | null | null | null | test/test_senders/test_mh.py | jwodder/outgoing | 133d38c980fd6fe59e4edf0bb0f89392ecde68de | [
"MIT"
] | 1 | 2021-09-04T13:25:51.000Z | 2021-09-04T13:25:51.000Z | from email.message import EmailMessage
import logging
from mailbox import MH
from operator import itemgetter
from pathlib import Path
from typing import List, Union
from mailbits import email2dict
import pytest
from outgoing import Sender, from_dict
from outgoing.senders.mailboxes import MHSender
@pytest.mark.parametrize("folder", [None, "work", ["important", "work"]])
def test_mh_construct(
    folder: Union[str, List[str], None], monkeypatch: pytest.MonkeyPatch, tmp_path: Path
) -> None:
    monkeypatch.chdir(tmp_path)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
            "folder": folder,
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    assert isinstance(sender, Sender)
    assert isinstance(sender, MHSender)
    assert sender.dict() == {
        "configpath": tmp_path / "foo.txt",
        "path": tmp_path / "inbox",
        "folder": folder,
    }
    assert sender._mbox is None

def test_mh_send_no_folder_new_path(
    caplog: pytest.LogCaptureFixture,
    monkeypatch: pytest.MonkeyPatch,
    test_email1: EmailMessage,
    tmp_path: Path,
) -> None:
    caplog.set_level(logging.DEBUG, logger="outgoing")
    monkeypatch.chdir(tmp_path)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    with sender as s:
        assert sender is s
        sender.send(test_email1)
    inbox = MH("inbox")
    assert inbox.list_folders() == []
    msgs = list(inbox)
    assert len(msgs) == 1
    assert email2dict(test_email1) == email2dict(msgs[0])
    assert caplog.record_tuples == [
        (
            "outgoing.senders.mailboxes",
            logging.DEBUG,
            f"Opening MH mailbox at {tmp_path/'inbox'}, root folder",
        ),
        (
            "outgoing.senders.mailboxes",
            logging.INFO,
            f"Adding e-mail {test_email1['Subject']!r} to MH mailbox at"
            f" {tmp_path/'inbox'}, root folder",
        ),
        (
            "outgoing.senders.mailboxes",
            logging.DEBUG,
            f"Closing MH mailbox at {tmp_path/'inbox'}, root folder",
        ),
    ]

def test_mh_send_folder_str_new_path(
    caplog: pytest.LogCaptureFixture,
    monkeypatch: pytest.MonkeyPatch,
    test_email1: EmailMessage,
    tmp_path: Path,
) -> None:
    caplog.set_level(logging.DEBUG, logger="outgoing")
    monkeypatch.chdir(tmp_path)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
            "folder": "work",
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    with sender:
        sender.send(test_email1)
    inbox = MH("inbox")
    assert inbox.list_folders() == ["work"]
    work = inbox.get_folder("work")
    msgs = list(work)
    assert len(msgs) == 1
    assert email2dict(test_email1) == email2dict(msgs[0])
    assert caplog.record_tuples == [
        (
            "outgoing.senders.mailboxes",
            logging.DEBUG,
            f"Opening MH mailbox at {tmp_path/'inbox'}, folder 'work'",
        ),
        (
            "outgoing.senders.mailboxes",
            logging.INFO,
            f"Adding e-mail {test_email1['Subject']!r} to MH mailbox at"
            f" {tmp_path/'inbox'}, folder 'work'",
        ),
        (
            "outgoing.senders.mailboxes",
            logging.DEBUG,
            f"Closing MH mailbox at {tmp_path/'inbox'}, folder 'work'",
        ),
    ]

def test_mh_send_folder_list_new_path(
    caplog: pytest.LogCaptureFixture,
    monkeypatch: pytest.MonkeyPatch,
    test_email1: EmailMessage,
    tmp_path: Path,
) -> None:
    caplog.set_level(logging.DEBUG, logger="outgoing")
    monkeypatch.chdir(tmp_path)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
            "folder": ["important", "work"],
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    with sender:
        sender.send(test_email1)
    inbox = MH("inbox")
    assert inbox.list_folders() == ["important"]
    important = inbox.get_folder("important")
    assert important.list_folders() == ["work"]
    work = important.get_folder("work")
    msgs = list(work)
    assert len(msgs) == 1
    assert email2dict(test_email1) == email2dict(msgs[0])
    assert caplog.record_tuples == [
        (
            "outgoing.senders.mailboxes",
            logging.DEBUG,
            f"Opening MH mailbox at {tmp_path/'inbox'}, folder 'important'/'work'",
        ),
        (
            "outgoing.senders.mailboxes",
            logging.INFO,
            f"Adding e-mail {test_email1['Subject']!r} to MH mailbox at"
            f" {tmp_path/'inbox'}, folder 'important'/'work'",
        ),
        (
            "outgoing.senders.mailboxes",
            logging.DEBUG,
            f"Closing MH mailbox at {tmp_path/'inbox'}, folder 'important'/'work'",
        ),
    ]

def test_mh_send_no_folder_extant_path(
    monkeypatch: pytest.MonkeyPatch,
    test_email1: EmailMessage,
    test_email2: EmailMessage,
    tmp_path: Path,
) -> None:
    monkeypatch.chdir(tmp_path)
    inbox = MH("inbox")
    inbox.lock()
    inbox.add(test_email1)
    inbox.unlock()
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    with sender:
        sender.send(test_email2)
    assert inbox.list_folders() == []
    msgs = list(inbox)
    assert len(msgs) == 2
    msgs.sort(key=itemgetter("Subject"))
    assert email2dict(test_email1) == email2dict(msgs[0])
    assert email2dict(test_email2) == email2dict(msgs[1])

def test_mh_send_folder_str_extant_path(
    monkeypatch: pytest.MonkeyPatch,
    test_email1: EmailMessage,
    test_email2: EmailMessage,
    tmp_path: Path,
) -> None:
    monkeypatch.chdir(tmp_path)
    inbox = MH("inbox")
    inbox.add(test_email1)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
            "folder": "work",
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    with sender:
        sender.send(test_email2)
    assert inbox.list_folders() == ["work"]
    work = inbox.get_folder("work")
    msgs = list(work)
    assert len(msgs) == 1
    assert email2dict(test_email2) == email2dict(msgs[0])

def test_mh_send_extant_folder_str_extant_path(
    monkeypatch: pytest.MonkeyPatch,
    test_email1: EmailMessage,
    test_email2: EmailMessage,
    tmp_path: Path,
) -> None:
    monkeypatch.chdir(tmp_path)
    inbox = MH("inbox")
    inbox.add_folder("work").add(test_email1)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
            "folder": "work",
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    with sender:
        sender.send(test_email2)
    assert inbox.list_folders() == ["work"]
    work = inbox.get_folder("work")
    msgs = list(work)
    assert len(msgs) == 2
    msgs.sort(key=itemgetter("Subject"))
    assert email2dict(test_email1) == email2dict(msgs[0])
    assert email2dict(test_email2) == email2dict(msgs[1])

def test_mh_send_folder_list_extant_path(
    monkeypatch: pytest.MonkeyPatch,
    test_email1: EmailMessage,
    test_email2: EmailMessage,
    tmp_path: Path,
) -> None:
    monkeypatch.chdir(tmp_path)
    inbox = MH("inbox")
    inbox.add(test_email1)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
            "folder": ["important", "work"],
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    with sender:
        sender.send(test_email2)
    assert inbox.list_folders() == ["important"]
    important = inbox.get_folder("important")
    assert important.list_folders() == ["work"]
    work = important.get_folder("work")
    msgs = list(work)
    assert len(msgs) == 1
    assert email2dict(test_email2) == email2dict(msgs[0])

def test_mh_send_partially_extant_folder_list(
    monkeypatch: pytest.MonkeyPatch,
    test_email1: EmailMessage,
    test_email2: EmailMessage,
    tmp_path: Path,
) -> None:
    monkeypatch.chdir(tmp_path)
    inbox = MH("inbox")
    inbox.add_folder("important").add(test_email1)
    inbox.add_folder("work")
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
            "folder": ["important", "work"],
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    with sender:
        sender.send(test_email2)
    assert sorted(inbox.list_folders()) == ["important", "work"]
    assert list(inbox.get_folder("work")) == []
    important = inbox.get_folder("important")
    assert important.list_folders() == ["work"]
    work = important.get_folder("work")
    msgs = list(work)
    assert len(msgs) == 1
    assert email2dict(test_email2) == email2dict(msgs[0])

def test_mh_send_no_context(
    monkeypatch: pytest.MonkeyPatch, test_email1: EmailMessage, tmp_path: Path
) -> None:
    monkeypatch.chdir(tmp_path)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    sender.send(test_email1)
    inbox = MH("inbox")
    assert inbox.list_folders() == []
    msgs = list(inbox)
    assert len(msgs) == 1
    assert email2dict(test_email1) == email2dict(msgs[0])

def test_mh_close_unopened(monkeypatch: pytest.MonkeyPatch, tmp_path: Path) -> None:
    monkeypatch.chdir(tmp_path)
    sender = from_dict(
        {
            "method": "mh",
            "path": "inbox",
        },
        configpath=str(tmp_path / "foo.txt"),
    )
    assert isinstance(sender, MHSender)
    with pytest.raises(ValueError) as excinfo:
        sender.close()
    assert str(excinfo.value) == "Mailbox is not open"
| 28.765396 | 88 | 0.593435 | 1,089 | 9,809 | 5.172635 | 0.089991 | 0.054678 | 0.031955 | 0.027694 | 0.866146 | 0.861885 | 0.844133 | 0.844133 | 0.841647 | 0.823362 | 0 | 0.011906 | 0.272199 | 9,809 | 340 | 89 | 28.85 | 0.77714 | 0 | 0 | 0.698113 | 0 | 0 | 0.153736 | 0.031808 | 0 | 0 | 0 | 0 | 0.13522 | 1 | 0.034591 | false | 0 | 0.09434 | 0 | 0.128931 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
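The tests above drive outgoing's `MHSender` against the standard library's `mailbox.MH` class. A minimal standalone sketch of the `mailbox.MH` behaviour those fixtures rely on (temporary paths are illustrative):

```python
import mailbox
import os
import tempfile
from email.message import EmailMessage

root = tempfile.mkdtemp()
inbox = mailbox.MH(os.path.join(root, "inbox"))  # directory is created on demand

msg = EmailMessage()
msg["Subject"] = "hello"
msg.set_content("hi there")
inbox.add(msg)                   # file the message in the root folder

work = inbox.add_folder("work")  # MH folders are themselves MH mailboxes
work.add(msg)

assert inbox.list_folders() == ["work"]
assert len(list(inbox)) == 1     # the root folder holds only its own message
assert len(list(work)) == 1
```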
0987eab2950c32f7585988908cc296e33d3145ca | 543 | py | Python | tests/test_import.py | gusbeane/galpy | d6db971285f163456c81775fc2fdc7d75189762c | [
"BSD-3-Clause"
] | 147 | 2015-01-01T14:06:17.000Z | 2022-03-24T14:47:41.000Z | tests/test_import.py | gusbeane/galpy | d6db971285f163456c81775fc2fdc7d75189762c | [
"BSD-3-Clause"
] | 269 | 2015-01-07T15:58:31.000Z | 2022-03-30T18:42:08.000Z | tests/test_import.py | gusbeane/galpy | d6db971285f163456c81775fc2fdc7d75189762c | [
"BSD-3-Clause"
] | 110 | 2015-02-08T10:57:24.000Z | 2021-12-28T07:56:49.000Z | ###################TEST WHETHER THE PACKAGE CAN BE IMPORTED####################
def test_top_import():
    import galpy

def test_orbit_import():
    import galpy.orbit

def test_potential_import():
    import galpy.potential

def test_actionAngle_import():
    import galpy.actionAngle

def test_df_import():
    import galpy.df

def test_util_import():
    import galpy.util
    import galpy.util.multi
    import galpy.util.plot
    import galpy.util.coords
    import galpy.util.conversion
| 22.625 | 79 | 0.626151 | 64 | 543 | 5.125 | 0.3125 | 0.335366 | 0.310976 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.244936 | 543 | 23 | 80 | 23.608696 | 0.8 | 0.073665 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | true | 0 | 1 | 0 | 1.375 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
61d50fc00b55d341cad10a57026378a5beb62d48 | 24 | py | Python | Python/p171.py | ekazyam/study | bd4d6bae8624c7b6e166881c898afa0afd3b0c70 | [
"MIT"
] | null | null | null | Python/p171.py | ekazyam/study | bd4d6bae8624c7b6e166881c898afa0afd3b0c70 | [
"MIT"
] | null | null | null | Python/p171.py | ekazyam/study | bd4d6bae8624c7b6e166881c898afa0afd3b0c70 | [
"MIT"
] | null | null | null | str_list = [1,2,3,4,5]
| 12 | 23 | 0.541667 | 7 | 24 | 1.714286 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0.166667 | 24 | 1 | 24 | 24 | 0.35 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
61ee710e6ba25d13271f4b749b7548deb9282c2c | 286 | py | Python | registration/www/conf/2017-eu/agenda.py | aherbhagya/registration | 4eee8d3a182748e7b027f4bb61596f080eedd88f | [
"MIT"
] | null | null | null | registration/www/conf/2017-eu/agenda.py | aherbhagya/registration | 4eee8d3a182748e7b027f4bb61596f080eedd88f | [
"MIT"
] | null | null | null | registration/www/conf/2017-eu/agenda.py | aherbhagya/registration | 4eee8d3a182748e7b027f4bb61596f080eedd88f | [
"MIT"
] | null | null | null | import frappe
import babel.dates
from frappe.utils import get_time
def get_context(context):
context.get_formated_time = get_formated_time
def get_formated_time(time):
return babel.dates.format_time(get_time(time), format='short', locale=(frappe.local.lang or "").replace("-", "_")) | 31.777778 | 115 | 0.783217 | 43 | 286 | 4.953488 | 0.44186 | 0.15493 | 0.211268 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087413 | 286 | 9 | 115 | 31.777778 | 0.816092 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.428571 | 0.142857 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
1111e35e9bc62bcd74ecdd7cf9d7612bedd21f01 | 74 | py | Python | ahunt_package/visualization/__init__.py | theSekyi/ahunt_package | 5c27553c8496db1918085c48630db4c4d6d37b5f | [
"MIT"
] | null | null | null | ahunt_package/visualization/__init__.py | theSekyi/ahunt_package | 5c27553c8496db1918085c48630db4c4d6d37b5f | [
"MIT"
] | null | null | null | ahunt_package/visualization/__init__.py | theSekyi/ahunt_package | 5c27553c8496db1918085c48630db4c4d6d37b5f | [
"MIT"
] | null | null | null | from .image_viz import *
from .loss import *
from .evaluation_viz import * | 24.666667 | 29 | 0.77027 | 11 | 74 | 5 | 0.545455 | 0.327273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148649 | 74 | 3 | 29 | 24.666667 | 0.873016 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
feeb4a678b6252ba08e116c44afded03877d3ad2 | 186 | py | Python | sample_project/testapp/views.py | timgates42/django-webpacker | 5e8072f3eb7eb0b43a34df49c1208213791d5ce9 | [
"MIT"
] | 79 | 2017-04-08T10:28:25.000Z | 2021-07-24T12:17:57.000Z | sample_project/testapp/views.py | timgates42/django-webpacker | 5e8072f3eb7eb0b43a34df49c1208213791d5ce9 | [
"MIT"
] | 9 | 2018-02-06T03:29:52.000Z | 2021-06-10T17:57:07.000Z | sample_project/testapp/views.py | timgates42/django-webpacker | 5e8072f3eb7eb0b43a34df49c1208213791d5ce9 | [
"MIT"
] | 20 | 2017-04-08T12:14:52.000Z | 2022-02-22T21:49:57.000Z | from django.shortcuts import render
# Create your views here.
def index(request):
return render(request, 'index.html')
def login_page(request):
return render(request, 'login.html') | 18.6 | 37 | 0.758065 | 26 | 186 | 5.384615 | 0.615385 | 0.185714 | 0.271429 | 0.371429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 186 | 10 | 38 | 18.6 | 0.864198 | 0.123656 | 0 | 0 | 0 | 0 | 0.123457 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
3a0f121fb2d3c41ead46f941d19c85fdb2513280 | 80 | py | Python | OctaHomeCore/OctaFiles/menus/__init__.py | Tomcuzz/OctaHomeAutomation | 4f0c5ea8b3d5b6e67633ae9c4cb95287d2784f5e | [
"MIT"
] | 4 | 2016-08-14T22:07:03.000Z | 2020-10-05T14:43:03.000Z | OctaHomeCore/OctaFiles/menus/__init__.py | Tomcuzz/OctaHomeAutomation | 4f0c5ea8b3d5b6e67633ae9c4cb95287d2784f5e | [
"MIT"
] | null | null | null | OctaHomeCore/OctaFiles/menus/__init__.py | Tomcuzz/OctaHomeAutomation | 4f0c5ea8b3d5b6e67633ae9c4cb95287d2784f5e | [
"MIT"
] | null | null | null | from base import *
from top import *
from device import *
from settings import * | 20 | 22 | 0.7625 | 12 | 80 | 5.083333 | 0.5 | 0.491803 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 80 | 4 | 22 | 20 | 0.938462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3a2033fc7162bf7924cfe3dbf1c681e1f1662fec | 34 | py | Python | pyodbc/database-programming-pyodbc-python-playbook/02/demos/m2/Demo1/m2d1-PythonTest.py | sudeep0901/python | 7a50af12e72d21ca4cad7f2afa4c6f929552043f | [
"MIT"
] | null | null | null | pyodbc/database-programming-pyodbc-python-playbook/02/demos/m2/Demo1/m2d1-PythonTest.py | sudeep0901/python | 7a50af12e72d21ca4cad7f2afa4c6f929552043f | [
"MIT"
] | 3 | 2019-12-26T05:13:55.000Z | 2020-03-07T06:59:56.000Z | pyodbc/database-programming-pyodbc-python-playbook/02/demos/m2/Demo1/m2d1-PythonTest.py | sudeep0901/python | 7a50af12e72d21ca4cad7f2afa4c6f929552043f | [
"MIT"
] | null | null | null | print('Hello World from VS Code!') | 34 | 34 | 0.735294 | 6 | 34 | 4.166667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 34 | 1 | 34 | 34 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
28e64d528d60b8d167557798d437e1b9af287416 | 177 | py | Python | ovbpclient/models/odata/ftp_entry.py | openergy/ovbpclient | 975c317275f79c29a2fd7d4809b697d5b7c21afd | [
"MIT"
] | null | null | null | ovbpclient/models/odata/ftp_entry.py | openergy/ovbpclient | 975c317275f79c29a2fd7d4809b697d5b7c21afd | [
"MIT"
] | null | null | null | ovbpclient/models/odata/ftp_entry.py | openergy/ovbpclient | 975c317275f79c29a2fd7d4809b697d5b7c21afd | [
"MIT"
] | 2 | 2018-06-29T12:59:15.000Z | 2018-06-29T16:22:41.000Z | class FtpEntry:
"""
Not really a model
"""
def __init__(self, **data):
self.data = data
def __getattr__(self, item):
return self.data[item]
| 17.7 | 32 | 0.559322 | 21 | 177 | 4.333333 | 0.619048 | 0.263736 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.316384 | 177 | 9 | 33 | 19.666667 | 0.752066 | 0.101695 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
28f0c13bc5c52de856e48db2a8f06e62c4ac54b1 | 147 | py | Python | IsabelaFunctions/__init__.py | de-oliveira/IsabelaFunctions | 6ea9f181f1e9eda90db3e2542d72eaf00caf977b | [
"MIT"
] | null | null | null | IsabelaFunctions/__init__.py | de-oliveira/IsabelaFunctions | 6ea9f181f1e9eda90db3e2542d72eaf00caf977b | [
"MIT"
] | null | null | null | IsabelaFunctions/__init__.py | de-oliveira/IsabelaFunctions | 6ea9f181f1e9eda90db3e2542d72eaf00caf977b | [
"MIT"
] | null | null | null | from . import shifting
from . import fit
from . import mapsetup
from . import read
from . import orbits
from . import fieldmodel
from . import sun
| 18.375 | 24 | 0.761905 | 21 | 147 | 5.333333 | 0.428571 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 147 | 7 | 25 | 21 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e91fba4536c87e782216f71d913cc25a08f6b593 | 29 | py | Python | server/src/jsontest.py | sandernaert/brat | 6468258fcb0f83e092169d8c7e88a036077aab5c | [
"CC-BY-3.0"
] | 1 | 2017-09-25T20:53:58.000Z | 2017-09-25T20:53:58.000Z | server/src/jsontest.py | sandernaert/brat | 6468258fcb0f83e092169d8c7e88a036077aab5c | [
"CC-BY-3.0"
] | null | null | null | server/src/jsontest.py | sandernaert/brat | 6468258fcb0f83e092169d8c7e88a036077aab5c | [
"CC-BY-3.0"
] | null | null | null | from simplejson import dumps
| 14.5 | 28 | 0.862069 | 4 | 29 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3a6bfc4117b327ff7bf997a93807a29d165cdf92 | 158 | py | Python | bc/model/__init__.py | rjgpinel/rlbc | 55a7499e4ad10182d9a84ce3c2494231db6fd3b5 | [
"MIT"
] | 43 | 2019-10-16T02:56:13.000Z | 2022-01-25T02:04:51.000Z | bc/model/__init__.py | rjgpinel/rlbc | 55a7499e4ad10182d9a84ce3c2494231db6fd3b5 | [
"MIT"
] | 6 | 2019-10-16T03:44:24.000Z | 2021-06-19T21:59:09.000Z | bc/model/__init__.py | rjgpinel/rlbc | 55a7499e4ad10182d9a84ce3c2494231db6fd3b5 | [
"MIT"
] | 11 | 2020-03-23T01:47:46.000Z | 2021-11-25T07:43:25.000Z | from bc.model.flat import FlatPolicy
from bc.model.regression import Regression
from bc.model.skills import SkillsPolicy
from bc.model.film import FilmPolicy
| 31.6 | 42 | 0.848101 | 24 | 158 | 5.583333 | 0.458333 | 0.179104 | 0.328358 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101266 | 158 | 4 | 43 | 39.5 | 0.943662 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3aae42957a1d228febbf74ade8a43aae569269ca | 96 | py | Python | venv/lib/python3.8/site-packages/clikit/ui/component.py | Retraces/UkraineBot | 3d5d7f8aaa58fa0cb8b98733b8808e5dfbdb8b71 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/clikit/ui/component.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/clikit/ui/component.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/ea/45/ce/72b891e496b0800e17447689d147be2a58fa31e2345741a52b62720423 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.489583 | 0 | 96 | 1 | 96 | 96 | 0.40625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3ab61c8982444ab78f5c3dd03e8be71d4e4a6520 | 14,307 | py | Python | tests/test_protocol.py | eventphone/python-yate | 0598b132ec1747bec6f38248573a3453800eec05 | [
"MIT"
] | 4 | 2019-12-26T13:18:04.000Z | 2022-01-18T21:07:46.000Z | tests/test_protocol.py | eventphone/python-yate | 0598b132ec1747bec6f38248573a3453800eec05 | [
"MIT"
] | null | null | null | tests/test_protocol.py | eventphone/python-yate | 0598b132ec1747bec6f38248573a3453800eec05 | [
"MIT"
] | null | null | null | import unittest
from yate import protocol
class BaseEncodingTestCases(unittest.TestCase):
def test_encode_bytes(self):
result = protocol.yate_encode_bytes(b"test%")
self.assertEqual( b"test%%", result)
def test_encode_bytes2(self):
result = protocol.yate_encode_bytes(b"/bin:/usr/bin:/usr/local/bin")
self.assertEqual( b"/bin%z/usr/bin%z/usr/local/bin", result)
def test_decode_bytes(self):
result = protocol.yate_decode_bytes(b"test%%")
self.assertEqual(b"test%" , result)
def test_decode_bytes2(self):
result = protocol.yate_decode_bytes(b"/bin%z/usr/bin%z/usr/local/bin")
self.assertEqual(b"/bin:/usr/bin:/usr/local/bin" , result)
def test_decode_fails_invalid(self):
with self.assertRaises(Exception):
result = protocol.yate_decode_bytes(b"/bin%:/usr/bin%:/usr/local/bin")
class MessageDeserializationTestCases(unittest.TestCase):
def test_parse_yate_msg(self):
result = protocol.parse_yate_message('%%>message:0x7ff823883bb0.1932044751:1522601502:call.execute::id=sip/151:module=sip:status=incoming:address=172.20.23.1%z5060:billid=1522598913-105:answered=false:direction=incoming:callid=sip/4265e70a902406405ac10e1e275@DSIP/f-9024064014ea27935f/:caller=9940 DebügDÄCT:called=2049:callername=DebügDÄCT:antiloop=19:ip_host=172.20.23.1:ip_port=5060:ip_transport=UDP:connection_id=dect:connection_reliable=false:sip_uri=sip%z2049@172.20.23.2:sip_from=sip%z9940@172.20.23.2:sip_to=<sip%z2049@172.20.23.2>:sip_callid=4265e70a902406405ac10e1e275@DSIP:device=Mitel SIP-DECT (SW-Version=7.1-CK14):sip_contact="DebügDÄCT" <sip%z9940@172.20.1.3>;+sip.instance="<urn%zuuid%z1F102AF1-2C00-0100-8000-03029649cf19>":sip_supported=replaces, 100rel, path, gruu:sip_user-agent=Mitel SIP-DECT (SW-Version=7.1-CK14):sip_allow=INVITE, ACK, OPTIONS, CANCEL, BYE, REFER, NOTIFY, INFO, MESSAGE, UPDATE, PRACK:sip_content-type=application/sdp:username=9940:realm=voip.eventphone.de:newcall=true:domain=172.20.23.2:xsip_nonce_age=0:rtp_addr=172.20.23.12:media=yes:formats=alaw,mulaw:transport=RTP/AVP:rtp_rfc2833=101:rtp_port=16326:sdp_silenceSupp=off - - - -:sdp_sendrecv=:sdp_fmtp%z=0-15:rtp_forward=yes:handlers=javascript%z15,regexroute%z40,regexroute%z100,javascript%z15,cdrbuild%z50,regexroute%z80,regexroute%z100,sip%z100,register%z120:context=default:oconnection_id=dect:osip_X-EventphoneID=f45a42b9-35cc-11e8-b1ae-000c2991f54c:osip_X-CallType=default:callto=sip/sip%z2049@172.20.1.3'.encode("utf-8"))
self.assertEqual("message", result.msg_type)
self.assertEqual('0x7ff823883bb0.1932044751', result.id)
self.assertEqual(1522601502, result.time)
self.assertEqual('call.execute', result.name)
self.assertEqual('DebügDÄCT', result.params['callername'])
self.assertEqual(False, result.reply)
def test_parse_yate_msg_without_kv(self):
result = protocol.parse_yate_message('%%>message:0x7ff823883bb0.1932044751:1522601502:call.execute:'.encode("utf-8"))
self.assertEqual("message", result.msg_type)
self.assertEqual('0x7ff823883bb0.1932044751', result.id)
self.assertEqual(1522601502, result.time)
self.assertEqual('call.execute', result.name)
self.assertEqual({}, result.params)
self.assertEqual(False, result.reply)
def test_parse_install_message(self):
result = protocol.parse_yate_message(b"%%<install:50:test:true")
self.assertEqual("install", result.msg_type)
self.assertEqual(50, result.priority)
self.assertEqual("test", result.name)
self.assertEqual(True, result.success)
def test_parse_install_request_no_filter(self):
result = protocol.parse_yate_message(b"%%>install:70:chan.test")
self.assertEqual(result.priority, 70)
self.assertEqual(result.name, "chan.test")
def test_parse_install_request(self):
result = protocol.parse_yate_message(b"%%>install:70:chan.test:important:true")
self.assertEqual(result.priority, 70)
self.assertEqual(result.name, "chan.test")
self.assertEqual(result.filter_name, "important")
self.assertEqual(result.filter_value, "true")
def test_parse_uninstall_request(self):
result = protocol.parse_yate_message(b"%%>uninstall:test")
self.assertEqual(result.name, "test")
def test_parse_uninstall_message(self):
result = protocol.parse_yate_message(b"%%<uninstall:50:test:true")
self.assertEqual("uninstall", result.msg_type)
self.assertEqual(50, result.priority)
self.assertEqual("test", result.name)
self.assertEqual(True, result.success)
def test_parse_watch_request(self):
result = protocol.parse_yate_message(b"%%>watch:test")
self.assertEqual(result.name, "test")
def test_parse_watch_message(self):
result = protocol.parse_yate_message(b"%%<watch:call.execute:false")
self.assertEqual("watch", result.msg_type)
self.assertEqual("call.execute", result.name)
self.assertEqual(False, result.success)
def test_parse_unwatch_request(self):
result = protocol.parse_yate_message(b"%%>unwatch:test")
self.assertEqual(result.name, "test")
def test_parse_unwatch_message(self):
result = protocol.parse_yate_message(b"%%<unwatch:call.execute:true")
self.assertEqual("unwatch", result.msg_type)
self.assertEqual("call.execute", result.name)
self.assertEqual(True, result.success)
def test_parse_setlocal_message(self):
result = protocol.parse_yate_message(b"%%>setlocal:id:mychan0")
self.assertEqual("setlocal", result.msg_type)
self.assertEqual("id", result.param)
self.assertEqual("mychan0", result.value)
def test_parse_setlocal_param_request_message(self):
result = protocol.parse_yate_message(b"%%>setlocal:engine.version:")
self.assertEqual("setlocal", result.msg_type)
self.assertEqual("engine.version", result.param)
self.assertEqual("", result.value)
def test_parse_setlocal_positive_answer(self):
result = protocol.parse_yate_message(b"%%<setlocal:id:mychan0:true")
self.assertEqual("setlocal", result.msg_type)
self.assertEqual("id", result.param)
self.assertEqual("mychan0", result.value)
self.assertTrue(result.success)
def test_parse_setlocal_negative_answer(self):
result = protocol.parse_yate_message(b"%%<setlocal:id:oldchan:false")
self.assertEqual("setlocal", result.msg_type)
self.assertEqual("id", result.param)
self.assertEqual("oldchan", result.value)
self.assertFalse(result.success)
class MessageSerializationTestCases(unittest.TestCase):
def test_encode_yate_install_mgs(self):
message = protocol.InstallRequest(1, "call.execute")
result = message.encode()
self.assertEqual(b'%%>install:1:call.execute', result)
def test_encode_yate_install_mgs_filter(self):
message = protocol.InstallRequest(1, "call.execute", "test1", "test2")
result = message.encode()
self.assertEqual(b'%%>install:1:call.execute:test1:test2', result)
def test_encode_yate_msg(self):
result = protocol.parse_yate_message('%%>message:0x7ff823883bb0.1932044751:1522601502:call.execute::id=sip/151:module=sip:status=incoming:address=172.20.23.1%z5060:billid=1522598913-105:answered=false:direction=incoming:callid=sip/4265e70a902406405ac10e1e275@DSIP/f-9024064014ea27935f/:caller=9940 DebügDÄCT:called=2049:callername=DebügDÄCT:antiloop=19:ip_host=172.20.23.1:ip_port=5060:ip_transport=UDP:connection_id=dect:connection_reliable=false:sip_uri=sip%z2049@172.20.23.2:sip_from=sip%z9940@172.20.23.2:sip_to=<sip%z2049@172.20.23.2>:sip_callid=4265e70a902406405ac10e1e275@DSIP:device=Mitel SIP-DECT (SW-Version=7.1-CK14):sip_contact="DebügDÄCT" <sip%z9940@172.20.1.3>;+sip.instance="<urn%zuuid%z1F102AF1-2C00-0100-8000-03029649cf19>":sip_supported=replaces, 100rel, path, gruu:sip_user-agent=Mitel SIP-DECT (SW-Version=7.1-CK14):sip_allow=INVITE, ACK, OPTIONS, CANCEL, BYE, REFER, NOTIFY, INFO, MESSAGE, UPDATE, PRACK:sip_content-type=application/sdp:username=9940:realm=voip.eventphone.de:newcall=true:domain=172.20.23.2:xsip_nonce_age=0:rtp_addr=172.20.23.12:media=yes:formats=alaw,mulaw:transport=RTP/AVP:rtp_rfc2833=101:rtp_port=16326:sdp_silenceSupp=off - - - -:sdp_sendrecv=:sdp_fmtp%z=0-15:rtp_forward=yes:handlers=javascript%z15,regexroute%z40,regexroute%z100,javascript%z15,cdrbuild%z50,regexroute%z80,regexroute%z100,sip%z100,register%z120:context=default:oconnection_id=dect:osip_X-EventphoneID=f45a42b9-35cc-11e8-b1ae-000c2991f54c:osip_X-CallType=default:callto=sip/sip%z2049@172.20.1.3'.encode("utf-8"))
result = result.encode_answer_for_yate(False)
real_example_answer = '%%<message:0x7ff823883bb0.1932044751:false:call.execute::id=sip/151:module=sip:status=incoming:address=172.20.23.1%z5060:billid=1522598913-105:answered=false:direction=incoming:callid=sip/4265e70a902406405ac10e1e275@DSIP/f-9024064014ea27935f/:caller=9940 DebügDÄCT:called=2049:callername=DebügDÄCT:antiloop=19:ip_host=172.20.23.1:ip_port=5060:ip_transport=UDP:connection_id=dect:connection_reliable=false:sip_uri=sip%z2049@172.20.23.2:sip_from=sip%z9940@172.20.23.2:sip_to=<sip%z2049@172.20.23.2>:sip_callid=4265e70a902406405ac10e1e275@DSIP:device=Mitel SIP-DECT (SW-Version=7.1-CK14):sip_contact="DebügDÄCT" <sip%z9940@172.20.1.3>;+sip.instance="<urn%zuuid%z1F102AF1-2C00-0100-8000-03029649cf19>":sip_supported=replaces, 100rel, path, gruu:sip_user-agent=Mitel SIP-DECT (SW-Version=7.1-CK14):sip_allow=INVITE, ACK, OPTIONS, CANCEL, BYE, REFER, NOTIFY, INFO, MESSAGE, UPDATE, PRACK:sip_content-type=application/sdp:username=9940:realm=voip.eventphone.de:newcall=true:domain=172.20.23.2:xsip_nonce_age=0:rtp_addr=172.20.23.12:media=yes:formats=alaw,mulaw:transport=RTP/AVP:rtp_rfc2833=101:rtp_port=16326:sdp_silenceSupp=off - - - -:sdp_sendrecv=:sdp_fmtp%z=0-15:rtp_forward=yes:handlers=javascript%z15,regexroute%z40,regexroute%z100,javascript%z15,cdrbuild%z50,regexroute%z80,regexroute%z100,sip%z100,register%z120:context=default:oconnection_id=dect:osip_X-EventphoneID=f45a42b9-35cc-11e8-b1ae-000c2991f54c:osip_X-CallType=default:callto=sip/sip%z2049@172.20.1.3'.encode("utf-8")
result_split = result.split(b":")
real_example_split = real_example_answer.split(b":")
self.assertListEqual(real_example_split[0:4], result_split[0:4])
self.assertSetEqual(set(real_example_split[5:]), set(result_split[5:]))
def test_encode_yate_message_changed(self):
result = protocol.parse_yate_message(b"%%>message:id123:4711:call.execute:old_return:test=yes")
result.params["test"] = "false"
result.return_value = "new_return"
encodedMsg = result.encode_answer_for_yate(True)
self.assertEqual(b"%%<message:id123:true:call.execute:new_return:test=false", encodedMsg)
def test_encode_new_yate_message(self):
msg = protocol.MessageRequest("call.execute", {"caller": "nick"}, "")
result = msg.encode("id-4908", 4711)
self.assertEqual(b"%%>message:id-4908:4711:call.execute::caller=nick", result)
def test_encode_new_yate_message_no_params(self):
msg = protocol.MessageRequest("call.execute", {}, "")
result = msg.encode("id-4908", 4712)
self.assertEqual(b"%%>message:id-4908:4712:call.execute:", result)
def test_encode_install_message(self):
msg = protocol.InstallRequest("100", "call.hangup")
result = msg.encode()
self.assertEqual(b"%%>install:100:call.hangup", result)
def test_encode_install_message_with_filter(self):
msg = protocol.InstallRequest("100", "chan.hangup", "caller", "nick")
result = msg.encode()
self.assertEqual(b"%%>install:100:chan.hangup:caller:nick", result)
def test_encode_install_confirm(self):
msg = protocol.InstallConfirm(80, "chan.test", True)
result = msg.encode()
self.assertEqual(b"%%<install:80:chan.test:true", result)
def test_encode_uninstall_message(self):
msg = protocol.UninstallRequest("chan.hangup")
result = msg.encode()
self.assertEqual(b"%%>uninstall:chan.hangup", result)
def test_encode_uninstall_confirm(self):
msg = protocol.UninstallConfirm(80, "chan.test", True)
result = msg.encode()
self.assertEqual(b"%%<uninstall:80:chan.test:true", result)
def test_encode_watch_message(self):
msg = protocol.WatchRequest("chan.dtmf")
result = msg.encode()
self.assertEqual(b"%%>watch:chan.dtmf", result)
def test_encode_watch_confirm(self):
msg = protocol.WatchConfirm("chan.dtmf", True)
result = msg.encode()
self.assertEqual(b"%%<watch:chan.dtmf:true", result)
def test_encode_unwatch_message(self):
msg = protocol.UnwatchRequest("chan.dtmf")
result = msg.encode()
self.assertEqual(b"%%>unwatch:chan.dtmf", result)
def test_encode_unwatch_confirm(self):
msg = protocol.UnwatchConfirm("chan.dtmf", True)
result = msg.encode()
self.assertEqual(b"%%<unwatch:chan.dtmf:true", result)
def test_encode_setlocal_request(self):
msg = protocol.SetLocalRequest("id", "mychan0")
result = msg.encode()
self.assertEqual(b"%%>setlocal:id:mychan0", result)
def test_encode_setlocal_request_query_param(self):
msg = protocol.SetLocalRequest("engine.version")
result = msg.encode()
self.assertEqual(b"%%>setlocal:engine.version:", result)
def test_encode_setlocal_answer_success(self):
msg = protocol.SetLocalAnswer("id", "mychan0", True)
result = msg.encode()
self.assertEqual(b"%%<setlocal:id:mychan0:true", result)
def test_encode_setlocal_answer_failed(self):
msg = protocol.SetLocalAnswer("id", "oldchan", False)
result = msg.encode()
self.assertEqual(b"%%<setlocal:id:oldchan:false", result)
def test_encode_connect_message(self):
msg = protocol.ConnectToYate()
result = msg.encode()
self.assertEqual(b"%%>connect:global", result)
if __name__ == '__main__':
unittest.main()
| 61.403433 | 1,531 | 0.725868 | 1,922 | 14,307 | 5.245578 | 0.130593 | 0.104146 | 0.036501 | 0.033922 | 0.8415 | 0.784765 | 0.714045 | 0.663261 | 0.589466 | 0.536104 | 0 | 0.089505 | 0.130845 | 14,307 | 232 | 1,532 | 61.668103 | 0.721271 | 0 | 0 | 0.283422 | 0 | 0.016043 | 0.426854 | 0.359754 | 0 | 0 | 0.005871 | 0 | 0.40107 | 1 | 0.213904 | false | 0 | 0.02139 | 0 | 0.251337 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3ae7fbc205f06b93f71df9127f8251d817dfa957 | 1,813 | py | Python | platform_operator/helm_hooks.py | neuro-inc/platform-operator | 8e06e76658ac3b3e8fbc5409bebbcddd73ad044b | [
"Apache-2.0"
] | null | null | null | platform_operator/helm_hooks.py | neuro-inc/platform-operator | 8e06e76658ac3b3e8fbc5409bebbcddd73ad044b | [
"Apache-2.0"
] | 3 | 2022-02-07T11:50:51.000Z | 2022-02-07T11:51:20.000Z | platform_operator/helm_hooks.py | neuro-inc/platform-operator | 8e06e76658ac3b3e8fbc5409bebbcddd73ad044b | [
"Apache-2.0"
] | null | null | null | import asyncio
import aiohttp
from .kube_client import KubeClient
from .models import KubeConfig
LOCK_KEY = "helm"
def start_helm_chart_upgrade_hook(
deployment_namespace: str, deployment_name: str
) -> None:
kube_config = KubeConfig.load_from_env()
async def run() -> None:
async with KubeClient(kube_config) as kube_client:
await start_helm_chart_upgrade(
kube_client, deployment_namespace, deployment_name
)
loop = asyncio.get_event_loop()
loop.run_until_complete(run())
loop.close()
def end_helm_chart_upgrade_hook(
deployment_namespace: str, deployment_name: str
) -> None:
kube_config = KubeConfig.load_from_env()
async def run() -> None:
async with KubeClient(kube_config) as kube_client:
await end_helm_chart_upgrade(
kube_client, deployment_namespace, deployment_name
)
loop = asyncio.get_event_loop()
loop.run_until_complete(run())
loop.close()
async def start_helm_chart_upgrade(
kube_client: KubeClient, deployment_namespace: str, deployment_name: str
) -> None:
try:
acquire_lock = kube_client.acquire_lock(
deployment_namespace, deployment_name, LOCK_KEY, ttl_s=15 * 60, sleep_s=5
)
await asyncio.wait_for(acquire_lock, 10 * 60)
except aiohttp.ClientResponseError as ex:
if ex.status == 404:
pass
else:
raise
async def end_helm_chart_upgrade(
kube_client: KubeClient, deployment_namespace: str, deployment_name: str
) -> None:
try:
await kube_client.release_lock(deployment_namespace, deployment_name, LOCK_KEY)
except aiohttp.ClientResponseError as ex:
if ex.status == 404:
pass
else:
raise
| 26.661765 | 87 | 0.674573 | 220 | 1,813 | 5.236364 | 0.263636 | 0.078125 | 0.083333 | 0.111111 | 0.824653 | 0.807292 | 0.793403 | 0.717014 | 0.717014 | 0.717014 | 0 | 0.011054 | 0.251517 | 1,813 | 67 | 88 | 27.059701 | 0.837878 | 0 | 0 | 0.653846 | 0 | 0 | 0.002206 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0.038462 | 0.076923 | 0 | 0.115385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c906ce4b0aaed04864f18b71cc2e1807c0c15db4 | 157 | py | Python | rltime/exploration/__init__.py | frederikschubert/rltime | d1722ffd4cf7b4599655b8d9c64abc243919afc9 | [
"Apache-2.0"
] | 147 | 2019-09-05T10:41:15.000Z | 2021-12-28T23:41:54.000Z | rltime/exploration/__init__.py | frederikschubert/rltime | d1722ffd4cf7b4599655b8d9c64abc243919afc9 | [
"Apache-2.0"
] | 1 | 2020-10-18T14:55:53.000Z | 2021-11-18T10:41:36.000Z | rltime/exploration/__init__.py | frederikschubert/rltime | d1722ffd4cf7b4599655b8d9c64abc243919afc9 | [
"Apache-2.0"
] | 11 | 2019-09-08T09:18:28.000Z | 2020-11-30T12:41:37.000Z | from .epsilon_greedy import EpsilonGreedyExplorationManager
def get_types():
return {
"epsilon_greedy": EpsilonGreedyExplorationManager,
}
| 19.625 | 59 | 0.751592 | 12 | 157 | 9.583333 | 0.75 | 0.226087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184713 | 157 | 7 | 60 | 22.428571 | 0.898438 | 0 | 0 | 0 | 0 | 0 | 0.089172 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.2 | 0.2 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
a324f4cb0b19b2566ad8e13dbf7e542024889620 | 46 | py | Python | paraphraser/__init__.py | mahmoudeid789/paraphraser | 5426b6865601132bab5af932b66eb36304887bd1 | [
"MIT"
] | 371 | 2018-02-12T01:44:04.000Z | 2022-03-12T09:08:00.000Z | paraphraser/__init__.py | mahmoudeid789/paraphraser | 5426b6865601132bab5af932b66eb36304887bd1 | [
"MIT"
] | 28 | 2018-04-13T16:42:40.000Z | 2022-02-09T23:28:56.000Z | paraphraser/__init__.py | mahmoudeid789/paraphraser | 5426b6865601132bab5af932b66eb36304887bd1 | [
"MIT"
] | 104 | 2018-04-12T08:13:04.000Z | 2022-03-22T23:27:59.000Z | from .synonym_model import synonym_paraphrase
| 23 | 45 | 0.891304 | 6 | 46 | 6.5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 46 | 1 | 46 | 46 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a32593db98692d70aeafd346467f060a65f08291 | 282 | py | Python | mesonh_interface/__init__.py | pnarvor/nephelae_simulation | 7b3f3a2c2aaa49324f8b09a6ab62819c280efa4c | [
"BSD-3-Clause"
] | null | null | null | mesonh_interface/__init__.py | pnarvor/nephelae_simulation | 7b3f3a2c2aaa49324f8b09a6ab62819c280efa4c | [
"BSD-3-Clause"
] | null | null | null | mesonh_interface/__init__.py | pnarvor/nephelae_simulation | 7b3f3a2c2aaa49324f8b09a6ab62819c280efa4c | [
"BSD-3-Clause"
] | null | null | null | from .DimensionHelper import DimensionBounds
from .DimensionHelper import DimensionHelper
from .ScaledArray import ScaledArray
from .NetCDFInterface import NetCDFInterface
from .PeriodicContainer import PeriodicContainer
from .MesoNHVariable import MesoNHVariable
| 28.2 | 48 | 0.829787 | 24 | 282 | 9.75 | 0.333333 | 0.162393 | 0.213675 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148936 | 282 | 9 | 49 | 31.333333 | 0.975 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a3307c0c07033799eda3464b08c644c417f93904 | 19 | py | Python | vdffit/io/__init__.py | dstansby/vdffit | 1fe6adc5c5ef2ace266f91fa6e29af91544ca768 | [
"BSD-2-Clause"
] | null | null | null | vdffit/io/__init__.py | dstansby/vdffit | 1fe6adc5c5ef2ace266f91fa6e29af91544ca768 | [
"BSD-2-Clause"
] | null | null | null | vdffit/io/__init__.py | dstansby/vdffit | 1fe6adc5c5ef2ace266f91fa6e29af91544ca768 | [
"BSD-2-Clause"
] | null | null | null | from .cdf import *
| 9.5 | 18 | 0.684211 | 3 | 19 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 19 | 1 | 19 | 19 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a34d26d5e6da41e0c26c45a8c23b49f079b51cad | 2,415 | py | Python | work/my_func/data_info.py | tensorfly-gpu/Unet-for-localize-fovea | e6da986eb742f9632177b87fe07842cb92bdcd16 | [
"Apache-2.0"
] | null | null | null | work/my_func/data_info.py | tensorfly-gpu/Unet-for-localize-fovea | e6da986eb742f9632177b87fe07842cb92bdcd16 | [
"Apache-2.0"
] | null | null | null | work/my_func/data_info.py | tensorfly-gpu/Unet-for-localize-fovea | e6da986eb742f9632177b87fe07842cb92bdcd16 | [
"Apache-2.0"
] | null | null | null | #0 means left eye and 1 means right eye
train_lr = [1,1,1,0,0,1,1,0,1,0,
0,1,1,0,1,0,1,1,0,1,
0,1,1,0,0,1,1,1,0,0,
0,0,1,1,0,1,1,1,1,0,
0,0,0,1,0,0,0,0,1,1,
1,1,1,1,0,1,1,1,1,1,
0,0,0,1,1,1,1,1,1,1,
0,0,1,1,0,0,1,0,1,1,
0,0,1,1,0,1,0,1,0,0,
1,0,0,0,0,0,0,0,0,0,
]
test_lr = [1,0,0,0,0,0,1,0,0,1,
0,0,1,0,0,1,1,1,0,0,
1,1,1,1,0,1,1,1,0,0,
1,1,1,1,1,1,1,0,1,0,
1,1,0,1,1,0,0,0,0,1,
0,1,1,1,1,1,1,1,0,1,
1,0,1,0,0,0,1,1,1,1,
0,1,0,0,1,0,1,0,1,1,
1,0,1,1,1,1,1,1,1,0,
0,0,1,0,1,1,1,0,1,0,
]
#0 means 1956x1934 and 1 means 2992x2000
train_size = [0,0,1,1,1,0,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,
1,0,1,1,1,1,1,1,1,1,
1,1,1,1,1,0,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,
1,1,1,1,0,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,1,
]
test_size = [1,1,1,1,1,1,0,1,1,1,
1,1,1,1,1,1,0,1,1,1,
1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,0,1,1,1,
1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,1,1,1,0,
1,1,1,1,1,1,1,1,1,1,
1,1,1,1,1,1,0,1,1,1,
1,1,0,1,1,1,1,1,1,1,
1,1,0,1,1,1,1,1,1,1,
]
#get img path, inputs img index and mode
def get_img_path(index,mode='train'):
    assert index >= 0 and index < 100, 'index must be >= 0 and < 100'
assert mode=='train' or mode=='test', 'mode error'
if mode == 'train':
if index < 9:
path = 'data/training/fundus color images/000{}.jpg'.format(index+1)
elif index < 99:
path = 'data/training/fundus color images/00{}.jpg'.format(index+1)
else:
            path = 'data/training/fundus color images/0100.jpg'
elif mode == 'test':
if index < 9:
path = 'data/testing/fundus color images/010{}.jpg'.format(index+1)
elif index < 99:
path = 'data/testing/fundus color images/01{}.jpg'.format(index+1)
else:
path = 'data/testing/fundus color images/0200.jpg'
return path | 33.541667 | 81 | 0.40911 | 539 | 2,415 | 1.821892 | 0.085343 | 0.486762 | 0.610998 | 0.700611 | 0.713849 | 0.697556 | 0.527495 | 0.460285 | 0.40835 | 0.279022 | 0 | 0.300593 | 0.371843 | 2,415 | 72 | 82 | 33.541667 | 0.346737 | 0.047205 | 0 | 0.354839 | 0 | 0 | 0.138689 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 1 | 0.016129 | false | 0 | 0 | 0 | 0.032258 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6e02c6c9f03a2a01808554316ac061ced4adbaa6 | 96 | py | Python | venv/lib/python3.8/site-packages/yapftests/yapf_test.py | Retraces/UkraineBot | 3d5d7f8aaa58fa0cb8b98733b8808e5dfbdb8b71 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/yapftests/yapf_test.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/yapftests/yapf_test.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/bd/26/ac/38418f1d558d0d2c07019294869f15f6792ff994cf3ceb646c94ea7020 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.447917 | 0 | 96 | 1 | 96 | 96 | 0.447917 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
281e097539dbbca21238d5244d4f88638c0e41ee | 65 | py | Python | tests/unit/peapods/runtimes/remote/jinad/executor/my_executor.py | vishalbelsare/jina | ae72cc5ce1f7e7f4c662e72e96ea21dddc28bf43 | [
"Apache-2.0"
] | 3 | 2021-12-06T08:10:02.000Z | 2021-12-06T14:50:11.000Z | tests/unit/peapods/runtimes/remote/jinad/executor/my_executor.py | vishalbelsare/jina | ae72cc5ce1f7e7f4c662e72e96ea21dddc28bf43 | [
"Apache-2.0"
] | 2 | 2021-12-17T15:22:12.000Z | 2021-12-18T07:19:06.000Z | tests/unit/peapods/runtimes/remote/jinad/executor/my_executor.py | vishalbelsare/jina | ae72cc5ce1f7e7f4c662e72e96ea21dddc28bf43 | [
"Apache-2.0"
] | 1 | 2021-11-15T05:51:07.000Z | 2021-11-15T05:51:07.000Z | from jina import Executor
class MyExecutor(Executor):
pass
| 10.833333 | 27 | 0.753846 | 8 | 65 | 6.125 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 65 | 5 | 28 | 13 | 0.942308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
28744f6ca5f5d229231e5e7391490c721138fa53 | 20 | py | Python | SWIM-Executables/Windows/pyinstaller-2.0 for windows/buildtests/import/relimp2/bar/bar2/__init__.py | alexsigaras/SWIM | 1a35df8acb26bdcb307a1b8f60e9feba68ed1715 | [
"MIT"
] | 47 | 2020-03-08T08:43:28.000Z | 2022-03-18T18:51:55.000Z | SWIM-Executables/Windows/pyinstaller-2.0 for windows/buildtests/import/relimp2/bar/bar2/__init__.py | alexsigaras/SWIM | 1a35df8acb26bdcb307a1b8f60e9feba68ed1715 | [
"MIT"
] | null | null | null | SWIM-Executables/Windows/pyinstaller-2.0 for windows/buildtests/import/relimp2/bar/bar2/__init__.py | alexsigaras/SWIM | 1a35df8acb26bdcb307a1b8f60e9feba68ed1715 | [
"MIT"
] | 16 | 2020-03-08T08:43:30.000Z | 2022-01-10T22:05:57.000Z | from ..baz import *
| 10 | 19 | 0.65 | 3 | 20 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 20 | 1 | 20 | 20 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
95374db5c02c6c1a304afc720263d94fef6e997a | 106 | py | Python | catcoin/__init__.py | val-labs/catcoinledger | e46f5f188a0d326ebf4d01f95c091d5ae26d01fa | [
"Apache-2.0"
] | 1 | 2018-07-29T21:12:52.000Z | 2018-07-29T21:12:52.000Z | catcoin/__init__.py | val-labs/catcoinledger | e46f5f188a0d326ebf4d01f95c091d5ae26d01fa | [
"Apache-2.0"
] | null | null | null | catcoin/__init__.py | val-labs/catcoinledger | e46f5f188a0d326ebf4d01f95c091d5ae26d01fa | [
"Apache-2.0"
] | null | null | null | from catcoin import cli
from catcoin import api
from catcoin import version
from catcoin import peer2peer
| 21.2 | 29 | 0.849057 | 16 | 106 | 5.625 | 0.4375 | 0.488889 | 0.755556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011111 | 0.150943 | 106 | 4 | 30 | 26.5 | 0.988889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
953ac33e3e4aa149323f0b5a5ce21d6d725d39e2 | 630 | py | Python | Chapter7_CodingGuidelines/PEP8/test_isort.py | tomex74/UdemyPythonPro | b4b83483fa2d3337a2860d53ff38e68eb38b3ac4 | [
"MIT"
] | null | null | null | Chapter7_CodingGuidelines/PEP8/test_isort.py | tomex74/UdemyPythonPro | b4b83483fa2d3337a2860d53ff38e68eb38b3ac4 | [
"MIT"
] | null | null | null | Chapter7_CodingGuidelines/PEP8/test_isort.py | tomex74/UdemyPythonPro | b4b83483fa2d3337a2860d53ff38e68eb38b3ac4 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
import os
import sys
import numpy as np
from my_lib import Object
from my_lib import Object2
from my_lib import Object3
from third_party import lib1
from third_party import lib2
from third_party import lib3
from third_party import lib4
from third_party import lib5
from third_party import lib6
from third_party import lib7
from third_party import lib8
from third_party import lib9
from third_party import lib10
from third_party import lib11
from third_party import lib12
from third_party import lib13
from third_party import lib14
from third_party import lib15
print("Hey")
print("yo")
| 21 | 38 | 0.842857 | 107 | 630 | 4.747664 | 0.317757 | 0.265748 | 0.413386 | 0.590551 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042672 | 0.144444 | 630 | 29 | 39 | 21.724138 | 0.899814 | 0 | 0 | 0 | 0 | 0 | 0.007937 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.916667 | 0 | 0.916667 | 0.083333 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
953d188eb0b3fc8505231c7f856331dcb3b9766b | 52,103 | py | Python | web/transiq/restapi/serializers/driver.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | null | null | null | web/transiq/restapi/serializers/driver.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | 14 | 2020-06-05T23:06:45.000Z | 2022-03-12T00:00:18.000Z | web/transiq/restapi/serializers/driver.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | null | null | null | from django.contrib.auth.models import User
from django.db.models import Q
from rest_framework import serializers, ISO_8601
from rest_framework.validators import UniqueValidator, UniqueTogetherValidator
from driver.models import Driver, DriverAppUser, GPSLogNew, OTP, GPSDevice, GPSDeviceLog, TracknovateGPSDevice, \
TracknovateGPSDeviceLog, WaytrackerGPSDevice, WaytrackerGPSDeviceLog, TempoGoGPSDevice, TempoGoGPSDeviceLog, \
SecuGPSDevice, SecuGPSDeviceLog, MahindraGPSDevice, MahindraGPSDeviceLog, BharatGPSTrackerLog, GPSDeviceProvider
from fms.models import Document
from owner.models import Vehicle
from restapi.helper_api import DATE_FORMAT, DATETIME_FORMAT
from restapi.serializers.authentication import BankSerializer
from restapi.service.validators import PAN, MOBILE_NUMBER_REGEX, validate_mobile_number, validate_vehicle_number
from utils.models import TaxationID, IDDetails, Address, Bank
class TrackVehicleSerializer(serializers.Serializer):
vehicle_number = serializers.CharField(allow_null=True, max_length=40, read_only=True)
vehicle_status = serializers.ChoiceField(allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False, read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False, read_only=True)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False, read_only=True)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False, read_only=True)
device_id = serializers.CharField(max_length=50, read_only=True)
source = serializers.CharField(max_length=50, read_only=True)
def create(self, validated_data):
pass
def update(self, instance, validated_data):
pass
class DriverSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
    name = serializers.CharField(max_length=35, required=True)
phone = serializers.RegexField(regex=MOBILE_NUMBER_REGEX,
validators=[UniqueValidator(queryset=Driver.objects.all())])
alt_phone = serializers.RegexField(regex=MOBILE_NUMBER_REGEX, allow_blank=True, allow_null=True, min_length=10,
max_length=10, required=False)
alt_phone2 = serializers.RegexField(regex=MOBILE_NUMBER_REGEX, allow_blank=True, allow_null=True, min_length=10,
max_length=10, required=False)
pan = serializers.RegexField(regex=PAN, allow_blank=True, allow_null=True, max_length=11, required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=50, required=False)
driving_licence_location = serializers.CharField(allow_null=True, max_length=50, required=False)
driving_licence_validity = serializers.DateField(allow_null=True, required=False, format=DATE_FORMAT,
input_formats=[DATE_FORMAT, ISO_8601])
smartphone_available = serializers.BooleanField(required=False)
route = serializers.CharField(allow_null=True, allow_blank=True, max_length=255, required=False)
priority_level = serializers.CharField(allow_null=True, max_length=255, required=False)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
address = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Address.objects.all(), required=False,
validators=[UniqueValidator(queryset=Driver.objects.all())])
id_proof = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=IDDetails.objects.all(), required=False,
validators=[UniqueValidator(queryset=Driver.objects.all())])
account_details = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Bank.objects.all(), required=False,
validators=[UniqueValidator(queryset=Driver.objects.all())])
taxation_id = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=TaxationID.objects.all(), required=False,
validators=[UniqueValidator(queryset=Driver.objects.all())])
driving_licence = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Document.objects.all(),
required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
def create(self, validated_data):
instance = Driver.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
Driver.objects.filter(id=instance.id).update(**validated_data)
return Driver.objects.get(id=instance.id)
class FMSDriverSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
name = serializers.CharField(read_only=True)
phone = serializers.CharField(read_only=True)
driving_licence_number = serializers.CharField(read_only=True)
driving_licence_location = serializers.CharField(read_only=True)
pan = serializers.CharField(read_only=True)
driving_licence_validity = serializers.DateField(read_only=True, format=DATE_FORMAT,
input_formats=[DATE_FORMAT, ISO_8601])
docs = serializers.SerializerMethodField()
account_details = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Bank.objects.all(), required=False,
validators=[UniqueValidator(queryset=Driver.objects.all())])
def to_representation(self, instance):
self.fields['account_details'] = BankSerializer(read_only=True)
return super().to_representation(instance=instance)
def get_docs(self, instance):
return [
{'id': doc.id, 'url': doc.s3_upload.public_url(), 'document_category': doc.document_category,
'document_category_display': doc.get_document_category_display(),
'thumb_url': doc.s3_upload.public_url(),
'bucket': doc.s3_upload.bucket,
'folder': doc.s3_upload.folder,
'uuid': doc.s3_upload.uuid,
'filename': doc.s3_upload.filename,
'validity': None,
} for doc in
instance.driver_files.filter(document_category__in=['DL', 'PAN']).exclude(
Q(s3_upload=None) | Q(deleted=True))
]
@classmethod
def many_init(cls, *args, **kwargs):
kwargs['child'] = cls()
excluded_fields = [
'driving_licence_number', 'driving_licence_location', 'driving_licence_validity', 'docs'
]
for field in excluded_fields:
kwargs['child'].fields.pop(field)
return serializers.ListSerializer(*args, **kwargs)
def create(self, validated_data):
pass
def update(self, instance, validated_data):
pass
class DriverAppUserSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
device_id = serializers.CharField(max_length=50, validators=[UniqueValidator(queryset=DriverAppUser.objects.all())])
auth_token = serializers.CharField(max_length=40,
validators=[UniqueValidator(queryset=DriverAppUser.objects.all())])
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False)
number_verified = serializers.BooleanField(required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(
choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
location_time = serializers.DateTimeField(allow_null=True, required=False)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
is_active = serializers.BooleanField(required=False)
inactive_sms_sent_at = serializers.DateTimeField(allow_null=True, required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(), required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
def to_representation(self, instance):
self.fields['driver'] = DriverSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = DriverAppUser.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
DriverAppUser.objects.filter(id=instance.id).update(**validated_data)
return DriverAppUser.objects.get(id=instance.id)
class GPSLogNewSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
datetime = serializers.DateTimeField(help_text='log time', required=True)
device_id = serializers.CharField(help_text='imei or uuid generated on phone', max_length=50, required=True)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
altitude = serializers.FloatField(allow_null=True, required=False)
speed = serializers.FloatField(allow_null=True, required=False)
course = serializers.FloatField(allow_null=True, required=False)
accuracy = serializers.FloatField(allow_null=True, required=False)
provider = serializers.CharField(allow_null=True, max_length=20, required=False)
battery = serializers.FloatField(allow_null=True, required=False)
total_memory = serializers.FloatField(allow_null=True, required=False)
available_memory = serializers.FloatField(allow_null=True, required=False)
threshold = serializers.FloatField(allow_null=True, required=False)
low_memory = serializers.BooleanField(required=False)
android_release = serializers.CharField(allow_null=True, max_length=20, required=False)
android_sdk_int = serializers.IntegerField(allow_null=True, max_value=2147483647, min_value=-2147483648,
required=False)
version_name = serializers.CharField(allow_null=True, max_length=20, required=False)
version_code = serializers.IntegerField(allow_null=True, max_value=2147483647, min_value=-2147483648,
required=False)
brand = serializers.CharField(allow_null=True, max_length=30, required=False)
manufacturer = serializers.CharField(allow_null=True, max_length=30, required=False)
product = serializers.CharField(allow_null=True, max_length=30, required=False)
device = serializers.CharField(allow_null=True, max_length=30, required=False)
model = serializers.CharField(allow_null=True, max_length=30, required=False)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=DriverAppUser.objects.all(), required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
def to_representation(self, instance):
self.fields['driver'] = DriverAppUserSerializer(read_only=True)
return super().to_representation(instance=instance)
class Meta:
validators = [UniqueTogetherValidator(queryset=GPSLogNew.objects.all(), fields=('device_id', 'datetime'))]
def create(self, validated_data):
instance = GPSLogNew.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
GPSLogNew.objects.filter(id=instance.id).update(**validated_data)
return GPSLogNew.objects.get(id=instance.id)
class OTPSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
phone = serializers.CharField(max_length=20, validators=[UniqueValidator(queryset=OTP.objects.all())])
expires_at = serializers.DateTimeField()
otp = serializers.CharField(max_length=8)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
def create(self, validated_data):
instance = OTP.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
OTP.objects.filter(id=instance.id).update(**validated_data)
return OTP.objects.get(id=instance.id)
class GPSDeviceProviderSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
name = serializers.CharField(allow_blank=True, allow_null=True, max_length=50, required=False)
def create(self, validated_data):
pass
def update(self, instance, validated_data):
pass
class GPSDeviceSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
device_id = serializers.CharField(allow_null=True, max_length=50, required=False)
imei = serializers.CharField(allow_null=True, max_length=40, required=False)
address = serializers.CharField(allow_blank=True, allow_null=True, max_length=500, required=False)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
    driver_number = serializers.CharField(allow_null=True, min_length=10, max_length=10, required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_blank=True, allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(
choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
location_time = serializers.DateTimeField(allow_null=True, required=False, format=DATETIME_FORMAT)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
is_active = serializers.BooleanField(required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
device_provider = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=GPSDeviceProvider.objects.all(),
required=False)
vehicle = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Vehicle.objects.all(), required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(), required=False)
device_provider_data = serializers.SerializerMethodField()
class Meta:
validators = [
UniqueTogetherValidator(queryset=GPSDevice.objects.all(), fields=('device_id', 'device_provider'))]
def get_device_provider_data(self, instance):
if isinstance(instance.device_provider, GPSDeviceProvider):
return {'id': instance.device_provider.id, 'name': instance.device_provider.name}
return {'id': -1, 'name': None}
def to_representation(self, instance):
self.fields['driver'] = DriverSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = GPSDevice.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
GPSDevice.objects.filter(id=instance.id).update(**validated_data)
return GPSDevice.objects.get(id=instance.id)
def validate_driver_number(self, value):
if not validate_mobile_number(value) and value:
raise serializers.ValidationError("Not a valid mobile number")
return value
def validate_vehicle_number(self, value):
if not validate_vehicle_number(value) and value:
raise serializers.ValidationError("Not a valid vehicle number")
return value
class GPSDeviceLogSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
location_id = serializers.CharField(max_length=40,
validators=[UniqueValidator(queryset=GPSDeviceLog.objects.all())])
datetime = serializers.DateTimeField(help_text='log time')
vehicle_id = serializers.CharField(help_text='imei or uuid generated on phone', max_length=50)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
altitude = serializers.FloatField(allow_null=True, required=False)
speed = serializers.FloatField(allow_null=True, required=False)
course = serializers.FloatField(allow_null=True, required=False)
accuracy = serializers.FloatField(allow_null=True, required=False)
engine_on = serializers.BooleanField(required=False)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
device = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=GPSDevice.objects.all(), required=False)
def to_representation(self, instance):
self.fields['device'] = GPSDeviceSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = GPSDeviceLog.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
GPSDeviceLog.objects.filter(id=instance.id).update(**validated_data)
return GPSDeviceLog.objects.get(id=instance.id)
class TracknovateGPSDeviceSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
phone = serializers.CharField(max_length=20)
sim_number = serializers.CharField(max_length=20)
vehicle_id = serializers.CharField(max_length=40,
validators=[UniqueValidator(queryset=TracknovateGPSDevice.objects.all())])
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False)
number_verified = serializers.BooleanField(required=False)
current_duration = serializers.CharField(allow_null=True, required=False, style={'base_template': 'textarea.html'})
current_vstatus = serializers.CharField(allow_null=True, required=False, style={'base_template': 'textarea.html'})
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(
choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
location_time = serializers.DateTimeField(allow_null=True, required=False)
is_active = serializers.BooleanField(required=False)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
inactive_sms_sent_at = serializers.DateTimeField(allow_null=True, required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(),
required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
def to_representation(self, instance):
self.fields['driver'] = DriverSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = TracknovateGPSDevice.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
TracknovateGPSDevice.objects.filter(id=instance.id).update(**validated_data)
return TracknovateGPSDevice.objects.get(id=instance.id)
class TracknovateGPSDeviceLogSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
datetime = serializers.DateTimeField(help_text='log time')
vehicle_id = serializers.CharField(max_length=50)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
altitude = serializers.FloatField(allow_null=True, required=False)
speed = serializers.FloatField(allow_null=True, required=False)
course = serializers.FloatField(allow_null=True, required=False)
accuracy = serializers.FloatField(allow_null=True, required=False)
engine_on = serializers.BooleanField(required=False)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
device = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=TracknovateGPSDevice.objects.all(),
required=False)
def to_representation(self, instance):
self.fields['device'] = TracknovateGPSDeviceSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = TracknovateGPSDeviceLog.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
TracknovateGPSDeviceLog.objects.filter(id=instance.id).update(**validated_data)
return TracknovateGPSDeviceLog.objects.get(id=instance.id)
class WaytrackerGPSDeviceSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
vehicle_id = serializers.CharField(max_length=40,
validators=[UniqueValidator(queryset=WaytrackerGPSDevice.objects.all())])
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False)
number_verified = serializers.BooleanField(required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(
choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
location_time = serializers.DateTimeField(allow_null=True, required=False)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
is_active = serializers.BooleanField(required=False)
inactive_sms_sent_at = serializers.DateTimeField(allow_null=True, required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(),
required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
def to_representation(self, instance):
self.fields['driver'] = DriverSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = WaytrackerGPSDevice.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
WaytrackerGPSDevice.objects.filter(id=instance.id).update(**validated_data)
return WaytrackerGPSDevice.objects.get(id=instance.id)
class WaytrackerGPSDeviceLogSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
datetime = serializers.DateTimeField(help_text='log time')
vehicle_id = serializers.CharField(max_length=50)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
altitude = serializers.FloatField(allow_null=True, required=False)
speed = serializers.FloatField(allow_null=True, required=False)
course = serializers.FloatField(allow_null=True, required=False)
accuracy = serializers.FloatField(allow_null=True, required=False)
engine_on = serializers.BooleanField(required=False)
fuel = serializers.CharField(allow_null=True, max_length=10, required=False)
nearest_site = serializers.CharField(allow_null=True, max_length=150, required=False)
nearest_location = serializers.CharField(allow_null=True, max_length=150, required=False)
idle_time = serializers.CharField(allow_null=True, max_length=20, required=False)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
device = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=WaytrackerGPSDevice.objects.all(),
required=False)
def to_representation(self, instance):
self.fields['device'] = WaytrackerGPSDeviceSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = WaytrackerGPSDeviceLog.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
WaytrackerGPSDeviceLog.objects.filter(id=instance.id).update(**validated_data)
return WaytrackerGPSDeviceLog.objects.get(id=instance.id)
class TempoGoGPSDeviceSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
    device_id = serializers.CharField(max_length=50, validators=[UniqueValidator(queryset=TempoGoGPSDevice.objects.all())])
    imei = serializers.CharField(max_length=50, validators=[UniqueValidator(queryset=TempoGoGPSDevice.objects.all())])
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
    driver_number = serializers.CharField(allow_null=True, min_length=10, max_length=10, required=False)
number_verified = serializers.BooleanField(required=False)
driving_licence_number = serializers.CharField(allow_blank=True, allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(
allow_null=True, max_length=20, validators=[UniqueValidator(queryset=TempoGoGPSDevice.objects.all())],
required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(
choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
location_time = serializers.DateTimeField(allow_null=True, required=False)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
is_active = serializers.BooleanField(required=False)
inactive_sms_sent_at = serializers.DateTimeField(allow_null=True, required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(),
required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username", required=False)
source = serializers.SerializerMethodField()
def get_source(self, instance):
if isinstance(instance, TempoGoGPSDevice):
return "tempo-go"
return None
def to_representation(self, instance):
self.fields['driver'] = DriverSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = TempoGoGPSDevice.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
TempoGoGPSDevice.objects.filter(id=instance.id).update(**validated_data)
return TempoGoGPSDevice.objects.get(id=instance.id)
    def validate_driver_number(self, value):
        if value and not validate_mobile_number(value):
            raise serializers.ValidationError("Not a valid mobile number")
        return value
    def validate_vehicle_number(self, value):
        if value and not validate_vehicle_number(value):
            raise serializers.ValidationError("Not a valid vehicle number")
        return value
class TempoGoGPSDeviceLogSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
gps_log_id = serializers.CharField(max_length=50, required=True)
datetime = serializers.DateTimeField(help_text='log time', required=True)
vehicle_id = serializers.CharField(max_length=50)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
speed = serializers.FloatField(allow_null=True, required=False, min_value=0.0)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
    driver_number = serializers.CharField(allow_null=True, min_length=10, max_length=10, required=False)
driving_licence_number = serializers.CharField(allow_blank=True, allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
device = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=TempoGoGPSDevice.objects.all(),
required=False)
class Meta:
validators = [UniqueTogetherValidator(queryset=TempoGoGPSDeviceLog.objects.all(),
fields=('gps_log_id', 'datetime'))]
def to_representation(self, instance):
self.fields['device'] = TempoGoGPSDeviceSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = TempoGoGPSDeviceLog.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
TempoGoGPSDeviceLog.objects.filter(id=instance.id).update(**validated_data)
return TempoGoGPSDeviceLog.objects.get(id=instance.id)
    def validate_driver_number(self, value):
        if value and not validate_mobile_number(value):
            raise serializers.ValidationError("Not a valid mobile number")
        return value
    def validate_vehicle_number(self, value):
        if value and not validate_vehicle_number(value):
            raise serializers.ValidationError("Not a valid vehicle number")
        return value
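As a self-contained illustration of the guard used by the `validate_driver_number` / `validate_vehicle_number` hooks above (blank or null values pass through; only non-empty values are checked against the validator), here is a hedged sketch — the `demo_*` names are hypothetical stand-ins, not the real validators:

```python
import re


def demo_validate_mobile_number(value):
    # Assumption for illustration: a valid driver number is exactly 10 digits.
    return bool(value) and re.fullmatch(r"\d{10}", value) is not None


def demo_check_driver_number(value):
    # Mirrors the serializer hook: empty/None values skip validation entirely.
    if value and not demo_validate_mobile_number(value):
        raise ValueError("Not a valid mobile number")
    return value
```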
class SecuGPSDeviceSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
device_id = serializers.CharField(max_length=50, validators=[UniqueValidator(queryset=SecuGPSDevice.objects.all())])
imei = serializers.CharField(max_length=50)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=20, required=False)
number_verified = serializers.BooleanField(required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(
choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
location_time = serializers.DateTimeField(allow_null=True, required=False)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
address = serializers.CharField(allow_null=True, max_length=300, required=False)
status = serializers.CharField(allow_null=True, max_length=300, required=False)
is_active = serializers.BooleanField(required=False)
inactive_sms_sent_at = serializers.DateTimeField(allow_null=True, required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(), required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
def to_representation(self, instance):
self.fields['driver'] = DriverSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = SecuGPSDevice.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
SecuGPSDevice.objects.filter(id=instance.id).update(**validated_data)
return SecuGPSDevice.objects.get(id=instance.id)
class SecuGPSDeviceLogSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
datetime = serializers.DateTimeField(help_text='log time')
vehicle_id = serializers.CharField(max_length=50)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
speed = serializers.FloatField(allow_null=True, required=False)
address = serializers.CharField(allow_null=True, max_length=300, required=False)
status = serializers.CharField(allow_null=True, max_length=300, required=False)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_null=True, max_length=40, required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
device = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=SecuGPSDevice.objects.all(),
required=False)
def to_representation(self, instance):
self.fields['device'] = SecuGPSDeviceSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = SecuGPSDeviceLog.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
SecuGPSDeviceLog.objects.filter(id=instance.id).update(**validated_data)
return SecuGPSDeviceLog.objects.get(id=instance.id)
class MahindraGPSDeviceSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
device_id = serializers.CharField(max_length=50,
validators=[UniqueValidator(queryset=MahindraGPSDevice.objects.all())])
imei = serializers.CharField(allow_null=True, max_length=50, required=False)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
    driver_number = serializers.CharField(allow_null=True, min_length=10, max_length=10, required=False)
number_verified = serializers.BooleanField(required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=20, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(
choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
location_time = serializers.DateTimeField(allow_null=True, required=False)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
address = serializers.CharField(allow_null=True, max_length=300, required=False)
status = serializers.CharField(allow_null=True, max_length=300, required=False)
is_active = serializers.BooleanField(required=False)
inactive_sms_sent_at = serializers.DateTimeField(allow_null=True, required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(),
required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
source = serializers.SerializerMethodField()
def get_source(self, instance):
if isinstance(instance, MahindraGPSDevice):
return "mahindra-gps-device"
return None
def to_representation(self, instance):
self.fields['driver'] = DriverSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = MahindraGPSDevice.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
MahindraGPSDevice.objects.filter(id=instance.id).update(**validated_data)
return MahindraGPSDevice.objects.get(id=instance.id)
    def validate_driver_number(self, value):
        if value and not validate_mobile_number(value):
            raise serializers.ValidationError("Not a valid mobile number")
        return value
    def validate_vehicle_number(self, value):
        if value and not validate_vehicle_number(value):
            raise serializers.ValidationError("Not a valid vehicle number")
        return value
class MahindraGPSDeviceLogSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
datetime = serializers.DateTimeField(help_text='log time')
vehicle_id = serializers.CharField(max_length=50)
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
    speed = serializers.FloatField(allow_null=True, required=False, min_value=0.0)
fuel_efficiency = serializers.CharField(allow_null=True, max_length=30, required=False)
address = serializers.CharField(allow_null=True, max_length=300, required=False)
status = serializers.CharField(allow_null=True, max_length=300, required=False)
driver_name = serializers.CharField(allow_null=True, max_length=50, required=False)
    driver_number = serializers.CharField(allow_null=True, min_length=10, max_length=10, required=False)
driving_licence_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_number = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
device = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=MahindraGPSDevice.objects.all(),
required=False)
def to_representation(self, instance):
self.fields['device'] = MahindraGPSDeviceSerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = MahindraGPSDeviceLog.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
MahindraGPSDeviceLog.objects.filter(id=instance.id).update(**validated_data)
return MahindraGPSDeviceLog.objects.get(id=instance.id)
    def validate_driver_number(self, value):
        if value and not validate_mobile_number(value):
            raise serializers.ValidationError("Not a valid mobile number")
        return value
    def validate_vehicle_number(self, value):
        if value and not validate_vehicle_number(value):
            raise serializers.ValidationError("Not a valid vehicle number")
        return value
class BharatGPSTrackerLogSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
position_id = serializers.CharField(max_length=30,
validators=[UniqueValidator(queryset=BharatGPSTrackerLog.objects.all())],
allow_null=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
datetime = serializers.DateTimeField(help_text='log time', input_formats=['%d-%m-%Y %H:%M:%S', ISO_8601])
latitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
longitude = serializers.DecimalField(allow_null=True, decimal_places=10, max_digits=20, required=False)
speed = serializers.FloatField(allow_null=True, required=False)
address = serializers.CharField(allow_null=True, max_length=300, required=False)
status = serializers.CharField(allow_null=True, max_length=300, required=False)
driver_name = serializers.CharField(allow_blank=True, allow_null=True, max_length=50, required=False)
driver_number = serializers.CharField(allow_blank=True, allow_null=True, max_length=40, required=False)
driving_licence_number = serializers.CharField(allow_blank=True, allow_null=True, max_length=40, required=False)
vehicle_number = serializers.CharField(allow_blank=True, allow_null=True, max_length=40, required=False)
vehicle_type = serializers.CharField(allow_blank=True, allow_null=True, max_length=40, required=False)
vehicle_status = serializers.ChoiceField(allow_blank=True, allow_null=True, choices=(
('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
device = serializers.PrimaryKeyRelatedField(queryset=GPSDevice.objects.all())
def create(self, validated_data):
instance = BharatGPSTrackerLog.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
BharatGPSTrackerLog.objects.filter(id=instance.id).update(**validated_data)
return BharatGPSTrackerLog.objects.get(id=instance.id)
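Every `update()` above uses the `QuerySet.filter(id=...).update(**data)` + re-fetch pattern rather than mutating and saving the instance: `QuerySet.update` issues a direct SQL `UPDATE`, so `Model.save()` and pre/post-save signals never run, and the in-memory instance goes stale — hence the final `objects.get(id=...)`. A minimal dependency-free sketch (the `FakeTable` class is hypothetical, purely to illustrate why the re-fetch is needed):

```python
class FakeTable:
    """Toy stand-in for a model table, mimicking filter(id=...).update(...)."""

    def __init__(self):
        self.rows = {}

    def update(self, row_id, **fields):
        # Writes straight to storage, like QuerySet.update (no save(), no signals).
        self.rows[row_id].update(fields)

    def get(self, row_id):
        # Fresh copy, like re-fetching the model instance after the update.
        return dict(self.rows[row_id])


table = FakeTable()
table.rows[1] = {"id": 1, "latitude": None}
snapshot = table.get(1)               # stale in-memory "instance"
table.update(1, latitude="12.97")     # bypasses the snapshot entirely
assert snapshot["latitude"] is None   # the old object never sees the write
assert table.get(1)["latitude"] == "12.97"  # hence the re-fetch in update()
```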

# geomagio/algorithm/AlgorithmException.py (usgs/geomag-algorithms, CC0-1.0)
"""
Base class for exceptions thrown by Algorithms.
"""
class AlgorithmException(Exception):
"""Base class for exceptions thrown by Algorithms."""
pass

# hwhandler_api/core/__init__.py (kevimota/hwhandler-api, MIT)
from .base_system import *
from .base_command import *
from .system_fsm import *

# enas_ori/__init__.py (MichaelChuai/modelzoo, Apache-2.0)
from .layer_collection import *
from .macro_child import *
from .macro_controller import *
from .model import *
from .listeners import *

# tests/test_include.py (gvwilson/mccole, MIT)
"""Test snippet inclusion."""
import logging
from textwrap import dedent
from mccole.accounting import Info
from mccole.include import inclusion_to_html
def test_include_text_with_badly_formatted_spec(caplog):
info = Info(src="./page.md")
spec = ' whatever="a.txt" '
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == ""
assert len(caplog.record_tuples) == 1
assert "Unrecognized inclusion spec" in caplog.record_tuples[0][2]
def test_include_text_file_when_file_exists(fs):
info = Info(src="./page.md")
spec = ' file="a.txt" '
fs.create_file("a.txt", contents="AAA")
html = inclusion_to_html(info, spec, None)
assert html == '<pre title="a.txt"><code class="language-txt">AAA\n</code></pre>\n'
def test_include_text_file_when_file_missing(fs, caplog):
info = Info(src="./page.md")
spec = 'file="a.txt"'
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == "MISSING"
assert len(caplog.record_tuples) == 1
assert "Unable to read inclusion" in caplog.record_tuples[0][2]
def test_include_keep_subset_correct(fs):
info = Info(src="./page.md")
content = dedent(
"""\
goes
// [stays]
should
// [/stays]
goes
"""
)
fs.create_file("a.txt", contents=content)
spec = ' file="a.txt" keep="stays"'
html = inclusion_to_html(info, spec, None)
assert (
html == '<pre title="a.txt"><code class="language-txt">should\n</code></pre>\n'
)
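For reference, a minimal sketch of the `keep` slicing behaviour this test exercises (an assumed reimplementation for illustration, not mccole's actual code): only the lines between `// [key]` and `// [/key]` markers survive.

```python
def keep_section(text, key):
    """Return only the lines between // [key] and // [/key] markers."""
    out, keeping = [], False
    for line in text.splitlines():
        # Check the closing tag first: "[/key]" also contains no "[key]"
        # substring, but testing in this order keeps the logic obvious.
        if f"[/{key}]" in line:
            keeping = False
        elif f"[{key}]" in line:
            keeping = True
        elif keeping:
            out.append(line)
    return "\n".join(out)
```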
def test_include_keep_subset_field_in_wrong_order(fs, caplog):
info = Info(src="./page.md")
content = "AAA"
fs.create_file("a.txt", contents=content)
spec = 'keep="stays" file="a.txt"'
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == ""
assert len(caplog.record_tuples) == 1
assert "Unrecognized inclusion spec" in caplog.record_tuples[0][2]
def test_include_omit_subset_correct(fs):
info = Info(src="./page.md")
content = dedent(
"""\
stays
// [skip]
should not appear
// [/skip]
stays
"""
)
fs.create_file("a.txt", contents=content)
spec = ' file="a.txt" omit="skip"'
html = inclusion_to_html(info, spec, None)
assert (
html
== '<pre title="a.txt"><code class="language-txt">stays\nstays\n</code></pre>\n'
)
def test_include_omit_subset_field_in_wrong_order(fs, caplog):
info = Info(src="./page.md")
content = "AAA"
fs.create_file("a.txt", contents=content)
spec = 'omit="stays" file="a.txt"'
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == ""
assert len(caplog.record_tuples) == 1
assert "Unrecognized inclusion spec" in caplog.record_tuples[0][2]
def test_include_keep_omit_subset_correct(fs):
info = Info(src="./page.md")
content = dedent(
"""\
goes
// [yes]
should
// [no]
should not
// [/no]
// [/yes]
goes
"""
)
fs.create_file("a.txt", contents=content)
spec = ' file="a.txt" keep="yes"\tomit="no"'
html = inclusion_to_html(info, spec, None)
assert (
html == '<pre title="a.txt"><code class="language-txt">should\n</code></pre>\n'
)
def test_include_multi_correct(fs):
info = Info(src="./page.md")
fs.create_file("a.out", contents="OUT")
fs.create_file("a.py", contents="PY")
fs.create_file("a.sh", contents="RUN")
spec = ' pat="a.*" fill="py sh out" '
html = inclusion_to_html(info, spec, None)
for (s, b) in (("py", "PY"), ("sh", "RUN"), ("out", "OUT")):
expected = f'<pre title="a.{s}"><code class="language-{s}">{b}\n</code></pre>'
assert expected in html
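The `pat`/`fill` behaviour checked above can be sketched as follows (an assumption about the semantics, not mccole's implementation): `pat` is a glob matched against filenames, and `fill` lists the file suffixes in display order.

```python
from fnmatch import fnmatch


def order_matches(filenames, pat, fill):
    """Glob-match filenames against pat, then order them by suffix per fill."""
    order = fill.split()
    matches = [f for f in filenames if fnmatch(f, pat)]
    # Assumes every matched file's suffix appears in fill.
    return sorted(matches, key=lambda f: order.index(f.rsplit(".", 1)[1]))
```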
def test_keep_when_opening_tag_missing(fs, caplog):
info = Info(src="./page.md")
content = dedent(
"""\
goes
// missing opening tag
should appear
// [/stays]
goes
"""
)
fs.create_file("a.txt", contents=content)
spec = ' file="a.txt" keep="stays"'
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == '<pre title="a.txt"><code class="language-txt">\n</code></pre>\n'
assert len(caplog.record_tuples) == 1
assert "Failed to match inclusion 'keep' key" in caplog.record_tuples[0][2]
def test_keep_when_closing_tag_missing(fs, caplog):
info = Info(src="./page.md")
content = dedent(
"""\
goes
// [stays]
should appear
// missing closing tag
goes
"""
)
fs.create_file("a.txt", contents=content)
spec = ' file="a.txt" keep="stays"'
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == '<pre title="a.txt"><code class="language-txt">\n</code></pre>\n'
assert len(caplog.record_tuples) == 1
assert "Failed to match inclusion 'keep' key" in caplog.record_tuples[0][2]
def test_omit_when_opening_tag_missing(fs, caplog):
info = Info(src="./page.md")
content = dedent(
"""\
stays
// missing opening tag
should not appear
// [/no]
stays
"""
)
fs.create_file("a.txt", contents=content)
spec = ' file="a.txt" omit="stays"'
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == '<pre title="a.txt"><code class="language-txt">\n</code></pre>\n'
assert len(caplog.record_tuples) == 1
assert "Failed to match inclusion 'omit' key" in caplog.record_tuples[0][2]
def test_omit_when_closing_tag_missing(fs, caplog):
info = Info(src="./page.md")
content = dedent(
"""\
stays
// [no]
should not appear
// missing closing tag
stays
"""
)
fs.create_file("a.txt", contents=content)
spec = ' file="a.txt" omit="stays"'
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == '<pre title="a.txt"><code class="language-txt">\n</code></pre>\n'
assert len(caplog.record_tuples) == 1
assert "Failed to match inclusion 'omit' key" in caplog.record_tuples[0][2]
def test_include_keep_omit_closing_out_of_order(fs, caplog):
info = Info(src="./page.md")
content = dedent(
"""\
goes
// [yes]
should appear
// [no]
should not appear
// [/yes]
// [/no]
goes
"""
)
fs.create_file("a.txt", contents=content)
spec = ' file="a.txt" keep="yes"\tomit="no"'
with caplog.at_level(logging.WARNING):
html = inclusion_to_html(info, spec, None)
assert html == '<pre title="a.txt"><code class="language-txt">\n</code></pre>\n'
assert len(caplog.record_tuples) == 1
assert "Failed to match inclusion 'omit' key" in caplog.record_tuples[0][2]
| 29.275424 | 88 | 0.618903 | 963 | 6,909 | 4.288681 | 0.096573 | 0.031961 | 0.044552 | 0.050847 | 0.845036 | 0.843099 | 0.836804 | 0.789346 | 0.784504 | 0.771186 | 0 | 0.004985 | 0.216095 | 6,909 | 235 | 89 | 29.4 | 0.75757 | 0.003329 | 0 | 0.676056 | 0 | 0.070423 | 0.249551 | 0.073445 | 0 | 0 | 0 | 0 | 0.225352 | 1 | 0.098592 | false | 0 | 0.028169 | 0 | 0.126761 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
95d11433cf5de939625702672a0e81ef9392ecff | 12,788 | py | Python | Deep learning/merge_validation_set.py | jiahao95/project_lab-ss2020 | 7f4101f449b3f04221da9484d8bfbbe6258ba270 | [
"MIT"
] | null | null | null | Deep learning/merge_validation_set.py | jiahao95/project_lab-ss2020 | 7f4101f449b3f04221da9484d8bfbbe6258ba270 | [
"MIT"
] | null | null | null | Deep learning/merge_validation_set.py | jiahao95/project_lab-ss2020 | 7f4101f449b3f04221da9484d8bfbbe6258ba270 | [
"MIT"
] | 1 | 2020-12-03T05:08:17.000Z | 2020-12-03T05:08:17.000Z | #%%
import os
from pyteomics import mzid, mzml
import pandas as pd
import numpy as np
import glob
"""
Identically as how we did with the training data set, we randomly divided the test files into different
folders, then we generated different data frames and stored all of them in one single hdf file as our
validation daata set
"""
#%%
os.chdir('./test')
mzid_files=glob.glob('*.mzid')
indexed_mzid = mzid.chain.from_iterable(mzid_files,use_index=True)
def _parse_mzid_entry(entry):
spectrum_id = str(entry['spectrumID'])
seq = str(entry['SpectrumIdentificationItem'][0]['PeptideSequence'])
try:
mods = entry['SpectrumIdentificationItem'][0]['Modification']
except KeyError:
mods = None
rank = int(entry['SpectrumIdentificationItem'][0]['rank'])
file_location = str(entry['name'])
return file_location,spectrum_id,seq,mods,rank
all_mzid = []
for entry in indexed_mzid:
all_mzid.append(_parse_mzid_entry(entry))
file_location,spectrum_ids,seq,mods,rank = zip(*all_mzid)
mzid_df = pd.DataFrame({'file':file_location,'id':spectrum_ids,'seq':seq})
def _parse_mzml_entry(entry):
ID = str(entry['id'])
mz = np.array(entry['m/z array'])
intensities = np.array(entry['intensity array'])
return ID, mz, intensities
all_spectra = []
for file in np.unique(file_location):
print(file)
indexed = mzml.MzML(file)
for i,entry in enumerate(indexed.map(_parse_mzml_entry)):
tupl = (file,)+entry
all_spectra.append(tupl)
mzml_location, ids, mz, intensities = zip(*all_spectra)
spectra_df = pd.DataFrame({'file':mzml_location,'id':ids,'mz':mz,'intensities':intensities})
#### MERGE: mzid + mzml
merged_df = pd.merge(mzid_df,spectra_df,how='left',on=['file','id'])
merged_df = merged_df[['id','seq','mz','intensities']]
#%%
hdf_test = pd.HDFStore('/home/ubuntu/data/jiahao/files/test.hdf5', mode='w')
#%%
hdf_test.put(value=merged_df, key="df")
#%%
os.chdir('./test_1')
mzid_files_1 = glob.glob('*.mzid')
indexed_mzid_1 = mzid.chain.from_iterable(mzid_files_1,use_index=True)
def _parse_mzid_entry(entry):
spectrum_id = str(entry['spectrumID'])
seq = str(entry['SpectrumIdentificationItem'][0]['PeptideSequence'])
try:
mods = entry['SpectrumIdentificationItem'][0]['Modification']
except KeyError:
mods = None
rank = int(entry['SpectrumIdentificationItem'][0]['rank'])
file_location = str(entry['name'])
return file_location,spectrum_id,seq,mods,rank
all_mzid_1 = []
for entry in indexed_mzid_1:
all_mzid_1.append(_parse_mzid_entry(entry))
file_location,spectrum_ids,seq,mods,rank = zip(*all_mzid_1)
mzid_df_1 = pd.DataFrame({'file':file_location,'id':spectrum_ids,'seq':seq})
def _parse_mzml_entry(entry):
ID = str(entry['id'])
mz = np.array(entry['m/z array'])
intensities = np.array(entry['intensity array'])
return ID, mz, intensities
all_spectra_1 = []
for file in np.unique(file_location):
print(file)
indexed = mzml.MzML(file)
for i,entry in enumerate(indexed.map(_parse_mzml_entry)):
tupl = (file,)+entry
all_spectra_1.append(tupl)
mzml_location, ids, mz, intensities = zip(*all_spectra_1)
spectra_df_1 = pd.DataFrame({'file':mzml_location,'id':ids,'mz':mz,'intensities':intensities})
#### MERGE: mzid + mzml
merged_df_1 = pd.merge(mzid_df_1,spectra_df_1,how='left',on=['file','id'])
merged_df_1 = merged_df_1[['id','seq','mz','intensities']]
#%%
hdf_test.put(value=merged_df_1, key="df1")
#%%
os.chdir('./test_2')
mzid_files_2 = glob.glob('*.mzid')
indexed_mzid_2 = mzid.chain.from_iterable(mzid_files_2,use_index=True)
def _parse_mzid_entry(entry):
spectrum_id = str(entry['spectrumID'])
seq = str(entry['SpectrumIdentificationItem'][0]['PeptideSequence'])
try:
mods = entry['SpectrumIdentificationItem'][0]['Modification']
except KeyError:
mods = None
rank = int(entry['SpectrumIdentificationItem'][0]['rank'])
file_location = str(entry['name'])
return file_location,spectrum_id,seq,mods,rank
all_mzid_2 = []
for entry in indexed_mzid_2:
all_mzid_2.append(_parse_mzid_entry(entry))
file_location,spectrum_ids,seq,mods,rank = zip(*all_mzid_2)
mzid_df_2 = pd.DataFrame({'file':file_location,'id':spectrum_ids,'seq':seq})
def _parse_mzml_entry(entry):
ID = str(entry['id'])
mz = np.array(entry['m/z array'])
intensities = np.array(entry['intensity array'])
return ID, mz, intensities
all_spectra_2 = []
for file in np.unique(file_location):
print(file)
indexed = mzml.MzML(file)
for i,entry in enumerate(indexed.map(_parse_mzml_entry)):
tupl = (file,)+entry
all_spectra_2.append(tupl)
mzml_location, ids, mz, intensities = zip(*all_spectra_2)
spectra_df_2 = pd.DataFrame({'file':mzml_location,'id':ids,'mz':mz,'intensities':intensities})
#### MERGE: mzid + mzml
merged_df_2 = pd.merge(mzid_df_2,spectra_df_2,how='left',on=['file','id'])
merged_df_2 = merged_df_2[['id','seq','mz','intensities']]
#%%
hdf_test.put(value=merged_df_2, key="df2")
#%%
os.chdir('./test_4')
mzid_files_4 = glob.glob('*.mzid')
indexed_mzid_4 = mzid.chain.from_iterable(mzid_files_4,use_index=True)
def _parse_mzid_entry(entry):
spectrum_id = str(entry['spectrumID'])
seq = str(entry['SpectrumIdentificationItem'][0]['PeptideSequence'])
try:
mods = entry['SpectrumIdentificationItem'][0]['Modification']
except KeyError:
mods = None
rank = int(entry['SpectrumIdentificationItem'][0]['rank'])
file_location = str(entry['name'])
return file_location,spectrum_id,seq,mods,rank
all_mzid_4 = []
for entry in indexed_mzid_4:
all_mzid_4.append(_parse_mzid_entry(entry))
file_location,spectrum_ids,seq,mods,rank = zip(*all_mzid_4)
mzid_df_4 = pd.DataFrame({'file':file_location,'id':spectrum_ids,'seq':seq})
def _parse_mzml_entry(entry):
ID = str(entry['id'])
mz = np.array(entry['m/z array'])
intensities = np.array(entry['intensity array'])
return ID, mz, intensities
all_spectra_4 = []
for file in np.unique(file_location):
print(file)
indexed = mzml.MzML(file)
for i,entry in enumerate(indexed.map(_parse_mzml_entry)):
tupl = (file,)+entry
all_spectra_4.append(tupl)
mzml_location, ids, mz, intensities = zip(*all_spectra_4)
spectra_df_4 = pd.DataFrame({'file':mzml_location,'id':ids,'mz':mz,'intensities':intensities})
#### MERGE: mzid + mzml
merged_df_4 = pd.merge(mzid_df_4,spectra_df_4,how='left',on=['file','id'])
merged_df_4 = merged_df_4[['id','seq','mz','intensities']]
#%%
hdf_test.put(value=merged_df_4, key="df4")
#%%
os.chdir('./test_5')
mzid_files_5 = glob.glob('*.mzid')
indexed_mzid_5 = mzid.chain.from_iterable(mzid_files_5,use_index=True)
def _parse_mzid_entry(entry):
spectrum_id = str(entry['spectrumID'])
seq = str(entry['SpectrumIdentificationItem'][0]['PeptideSequence'])
try:
mods = entry['SpectrumIdentificationItem'][0]['Modification']
except KeyError:
mods = None
rank = int(entry['SpectrumIdentificationItem'][0]['rank'])
file_location = str(entry['name'])
return file_location,spectrum_id,seq,mods,rank
all_mzid_5 = []
for entry in indexed_mzid_5:
all_mzid_5.append(_parse_mzid_entry(entry))
file_location,spectrum_ids,seq,mods,rank = zip(*all_mzid_5)
mzid_df_5 = pd.DataFrame({'file':file_location,'id':spectrum_ids,'seq':seq})
def _parse_mzml_entry(entry):
ID = str(entry['id'])
mz = np.array(entry['m/z array'])
intensities = np.array(entry['intensity array'])
return ID, mz, intensities
all_spectra_5 = []
for file in np.unique(file_location):
print(file)
indexed = mzml.MzML(file)
for i,entry in enumerate(indexed.map(_parse_mzml_entry)):
tupl = (file,)+entry
all_spectra_5.append(tupl)
mzml_location, ids, mz, intensities = zip(*all_spectra_5)
spectra_df_5 = pd.DataFrame({'file':mzml_location,'id':ids,'mz':mz,'intensities':intensities})
#### MERGE: mzid + mzml
merged_df_5 = pd.merge(mzid_df_5,spectra_df_5,how='left',on=['file','id'])
merged_df_5 = merged_df_5[['id','seq','mz','intensities']]
#%%
hdf_test.put(value=merged_df_5, key="df5")
#%%
os.chdir('./test_6')
mzid_files_6 = glob.glob('*.mzid')
indexed_mzid_6 = mzid.chain.from_iterable(mzid_files_6,use_index=True)
def _parse_mzid_entry(entry):
spectrum_id = str(entry['spectrumID'])
seq = str(entry['SpectrumIdentificationItem'][0]['PeptideSequence'])
try:
mods = entry['SpectrumIdentificationItem'][0]['Modification']
except KeyError:
mods = None
rank = int(entry['SpectrumIdentificationItem'][0]['rank'])
file_location = str(entry['name'])
return file_location,spectrum_id,seq,mods,rank
all_mzid_6 = []
for entry in indexed_mzid_6:
all_mzid_6.append(_parse_mzid_entry(entry))
file_location,spectrum_ids,seq,mods,rank = zip(*all_mzid_6)
mzid_df_6 = pd.DataFrame({'file':file_location,'id':spectrum_ids,'seq':seq})
def _parse_mzml_entry(entry):
ID = str(entry['id'])
mz = np.array(entry['m/z array'])
intensities = np.array(entry['intensity array'])
return ID, mz, intensities
all_spectra_6 = []
for file in np.unique(file_location):
print(file)
indexed = mzml.MzML(file)
for i,entry in enumerate(indexed.map(_parse_mzml_entry)):
tupl = (file,)+entry
all_spectra_6.append(tupl)
mzml_location, ids, mz, intensities = zip(*all_spectra_6)
spectra_df_6 = pd.DataFrame({'file':mzml_location,'id':ids,'mz':mz,'intensities':intensities})
#### MERGE: mzid + mzml
merged_df_6 = pd.merge(mzid_df_6,spectra_df_6,how='left',on=['file','id'])
merged_df_6 = merged_df_6[['id','seq','mz','intensities']]
#%%
hdf_test.put(value=merged_df_6, key="df6")
# %%
os.chdir('./test_7')
mzid_files_7 = glob.glob('*.mzid')
indexed_mzid_7 = mzid.chain.from_iterable(mzid_files_7,use_index=True)
def _parse_mzid_entry(entry):
spectrum_id = str(entry['spectrumID'])
seq = str(entry['SpectrumIdentificationItem'][0]['PeptideSequence'])
try:
mods = entry['SpectrumIdentificationItem'][0]['Modification']
except KeyError:
mods = None
rank = int(entry['SpectrumIdentificationItem'][0]['rank'])
file_location = str(entry['name'])
return file_location,spectrum_id,seq,mods,rank
all_mzid_7 = []
for entry in indexed_mzid_7:
all_mzid_7.append(_parse_mzid_entry(entry))
file_location,spectrum_ids,seq,mods,rank = zip(*all_mzid_7)
mzid_df_7 = pd.DataFrame({'file':file_location,'id':spectrum_ids,'seq':seq})
def _parse_mzml_entry(entry):
ID = str(entry['id'])
mz = np.array(entry['m/z array'])
intensities = np.array(entry['intensity array'])
return ID, mz, intensities
all_spectra_7 = []
for file in np.unique(file_location):
print(file)
indexed = mzml.MzML(file)
for i,entry in enumerate(indexed.map(_parse_mzml_entry)):
tupl = (file,)+entry
all_spectra_7.append(tupl)
mzml_location, ids, mz, intensities = zip(*all_spectra_7)
spectra_df_7 = pd.DataFrame({'file':mzml_location,'id':ids,'mz':mz,'intensities':intensities})
#### MERGE: mzid + mzml
merged_df_7 = pd.merge(mzid_df_7,spectra_df_7,how='left',on=['file','id'])
merged_df_7 = merged_df_7[['id','seq','mz','intensities']]
#%%
hdf_test.put(value=merged_df_7, key="df7")
# %%
os.chdir('./test_8')
mzid_files_8 = glob.glob('*.mzid')
indexed_mzid_8 = mzid.chain.from_iterable(mzid_files_8,use_index=True)
def _parse_mzid_entry(entry):
spectrum_id = str(entry['spectrumID'])
seq = str(entry['SpectrumIdentificationItem'][0]['PeptideSequence'])
try:
mods = entry['SpectrumIdentificationItem'][0]['Modification']
except KeyError:
mods = None
rank = int(entry['SpectrumIdentificationItem'][0]['rank'])
file_location = str(entry['name'])
return file_location,spectrum_id,seq,mods,rank
all_mzid_8 = []
for entry in indexed_mzid_8:
all_mzid_8.append(_parse_mzid_entry(entry))
file_location,spectrum_ids,seq,mods,rank = zip(*all_mzid_8)
mzid_df_8 = pd.DataFrame({'file':file_location,'id':spectrum_ids,'seq':seq})
def _parse_mzml_entry(entry):
ID = str(entry['id'])
mz = np.array(entry['m/z array'])
intensities = np.array(entry['intensity array'])
return ID, mz, intensities
all_spectra_8 = []
for file in np.unique(file_location):
print(file)
indexed = mzml.MzML(file)
for i,entry in enumerate(indexed.map(_parse_mzml_entry)):
tupl = (file,)+entry
all_spectra_8.append(tupl)
mzml_location, ids, mz, intensities = zip(*all_spectra_8)
spectra_df_8 = pd.DataFrame({'file':mzml_location,'id':ids,'mz':mz,'intensities':intensities})
#### MERGE: mzid + mzml
merged_df_8 = pd.merge(mzid_df_8,spectra_df_8,how='left',on=['file','id'])
merged_df_8 = merged_df_8[['id','seq','mz','intensities']]
# %%
hdf_test.put(value=merged_df_8, key="df8")
hdf_test.close()
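The eight per-folder blocks above differ only in the directory name and the HDF key, so they could be collapsed into a single loop. A sketch of the key bookkeeping, assuming a hypothetical merge_folder(folder) helper that wraps the mzid/mzml parsing and merge performed above:

```python
def hdf_key(folder):
    """Map a test folder name to its HDF key: './test' -> 'df', './test_1' -> 'df1'."""
    name = folder.rstrip("/").rsplit("/", 1)[-1]
    suffix = name.split("_", 1)[1] if "_" in name else ""
    return "df" + suffix

folders = ["./test", "./test_1", "./test_2", "./test_4",
           "./test_5", "./test_6", "./test_7", "./test_8"]

# The loop body would call the (hypothetical) merge_folder(folder) and then
# hdf_test.put(value=merged, key=hdf_key(folder)); only the keys are shown here.
for folder in folders:
    print(folder, "->", hdf_key(folder))
```

Note that the original script chdirs into each folder relative to the previous one (`./test`, then `./test_1` from inside it), so a loop version would want absolute paths or an explicit chdir back between iterations.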
| 28.672646 | 103 | 0.697842 | 1,879 | 12,788 | 4.491219 | 0.063332 | 0.056879 | 0.091006 | 0.036023 | 0.87617 | 0.826994 | 0.792866 | 0.771063 | 0.771063 | 0.771063 | 0 | 0.014981 | 0.138724 | 12,788 | 445 | 104 | 28.737079 | 0.751226 | 0.01345 | 0 | 0.566434 | 0 | 0 | 0.147061 | 0.053831 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055944 | false | 0 | 0.017483 | 0 | 0.129371 | 0.027972 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
95ecd2517c9540bd715c01ccc03196448d217761 | 49 | py | Python | src/neptuneml_toolkit/modelzoo/decoder/__init__.py | riaz/neptuneml-toolkit | 6c68ba6d02a3d52116e6e8ca23d5f755693ae3d8 | [
"Apache-2.0"
] | 7 | 2021-11-11T15:47:39.000Z | 2022-03-31T03:44:40.000Z | src/neptuneml_toolkit/modelzoo/decoder/__init__.py | riaz/neptuneml-toolkit | 6c68ba6d02a3d52116e6e8ca23d5f755693ae3d8 | [
"Apache-2.0"
] | null | null | null | src/neptuneml_toolkit/modelzoo/decoder/__init__.py | riaz/neptuneml-toolkit | 6c68ba6d02a3d52116e6e8ca23d5f755693ae3d8 | [
"Apache-2.0"
] | 2 | 2021-11-23T19:32:25.000Z | 2022-02-24T18:42:42.000Z | from .dotproduct import *
from .distmult import * | 24.5 | 25 | 0.77551 | 6 | 49 | 6.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 49 | 2 | 26 | 24.5 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
95fed769094e63893bd3db7158bbe6ccbd3aaad6 | 23 | py | Python | muDIC/filtering/__init__.py | diehlpk/muDIC | b5d90aa62267b4bd0b88ae0a989cf09a51990654 | [
"MIT"
] | 70 | 2019-04-15T08:08:23.000Z | 2022-03-23T08:24:25.000Z | muDIC/filtering/__init__.py | diehlpk/muDIC | b5d90aa62267b4bd0b88ae0a989cf09a51990654 | [
"MIT"
] | 34 | 2019-05-03T18:09:43.000Z | 2022-02-10T11:36:29.000Z | muDIC/filtering/__init__.py | diehlpk/muDIC | b5d90aa62267b4bd0b88ae0a989cf09a51990654 | [
"MIT"
] | 37 | 2019-04-25T15:39:23.000Z | 2022-03-28T21:40:24.000Z | from .filters import *
| 11.5 | 22 | 0.73913 | 3 | 23 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
250cdc05bea4062653242f43b3c1fca567270e64 | 30 | py | Python | src/imu/__init__.py | heyuhang0/SAUVC2019 | 111a2ac5936b95c75930394a3df63536a47d61e9 | [
"Apache-2.0"
] | 1 | 2018-11-16T13:05:48.000Z | 2018-11-16T13:05:48.000Z | src/imu/__init__.py | heyuhang0/SAUVC2019 | 111a2ac5936b95c75930394a3df63536a47d61e9 | [
"Apache-2.0"
] | null | null | null | src/imu/__init__.py | heyuhang0/SAUVC2019 | 111a2ac5936b95c75930394a3df63536a47d61e9 | [
"Apache-2.0"
] | 2 | 2020-02-28T02:51:51.000Z | 2021-03-23T06:17:37.000Z | from imu.imu_class import IMU
| 15 | 29 | 0.833333 | 6 | 30 | 4 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2510c1f783addb06ec7600b0c483b80bc0cbbbed | 41 | py | Python | libs/yowsup/yowsup/yowsup/layers/protocol_calls/__init__.py | akshitpradhan/TomHack | 837226e7b38de1140c19bc2d478eeb9e379ed1fd | [
"MIT"
] | 22 | 2017-07-14T20:01:17.000Z | 2022-03-08T14:22:39.000Z | libs/yowsup/yowsup/yowsup/layers/protocol_calls/__init__.py | akshitpradhan/TomHack | 837226e7b38de1140c19bc2d478eeb9e379ed1fd | [
"MIT"
] | 6 | 2017-07-14T21:03:50.000Z | 2021-06-10T19:08:32.000Z | libs/yowsup/yowsup/yowsup/layers/protocol_calls/__init__.py | akshitpradhan/TomHack | 837226e7b38de1140c19bc2d478eeb9e379ed1fd | [
"MIT"
] | 13 | 2017-07-14T20:13:14.000Z | 2020-11-12T08:06:05.000Z | from .layer import YowCallsProtocolLayer
| 20.5 | 40 | 0.878049 | 4 | 41 | 9 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 41 | 1 | 41 | 41 | 0.972973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c25b945998264925d772e0839e155a19c639b903 | 149 | py | Python | app/usecases/user/create_user.py | rjNemo/graphql_python_template | 14bc5fd657f6bdba8d7293f21cfcec821fa6374f | [
"MIT"
] | 1 | 2021-05-02T01:47:57.000Z | 2021-05-02T01:47:57.000Z | app/usecases/user/create_user.py | rjNemo/graphql_python_template | 14bc5fd657f6bdba8d7293f21cfcec821fa6374f | [
"MIT"
] | null | null | null | app/usecases/user/create_user.py | rjNemo/graphql_python_template | 14bc5fd657f6bdba8d7293f21cfcec821fa6374f | [
"MIT"
] | null | null | null | from app.models.user import User
from app.repositories.users import add_user
def create_user(username: str) -> User:
return add_user(username)
| 21.285714 | 43 | 0.778523 | 23 | 149 | 4.913043 | 0.565217 | 0.123894 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.14094 | 149 | 6 | 44 | 24.833333 | 0.882813 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
c26a711a88f8ec40ff012f2eb1b31e8f31d513df | 39 | py | Python | modules/strange_lists/one.py | keianrao/PythonExplore | d6448f133ebb5ef5673942ea288306325a199e8a | [
"CC0-1.0"
] | null | null | null | modules/strange_lists/one.py | keianrao/PythonExplore | d6448f133ebb5ef5673942ea288306325a199e8a | [
"CC0-1.0"
] | null | null | null | modules/strange_lists/one.py | keianrao/PythonExplore | d6448f133ebb5ef5673942ea288306325a199e8a | [
"CC0-1.0"
] | null | null | null |
cycling = [1, 4, 2, 3, 5, 8, 6, 7]
| 13 | 37 | 0.384615 | 9 | 39 | 1.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.307692 | 0.333333 | 39 | 2 | 38 | 19.5 | 0.269231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c2ac6cb997485628b5a06cf8735db11982e7d464 | 79 | py | Python | code/0/07/0/import_hello_world.py | dudung/butir | 6684b6bd2160aa36969d51a531b5d4f8d11622a4 | [
"MIT"
] | null | null | null | code/0/07/0/import_hello_world.py | dudung/butir | 6684b6bd2160aa36969d51a531b5d4f8d11622a4 | [
"MIT"
] | null | null | null | code/0/07/0/import_hello_world.py | dudung/butir | 6684b6bd2160aa36969d51a531b5d4f8d11622a4 | [
"MIT"
] | null | null | null | """
import_hello_world.py
"""
from hello_world import printhello as ph
ph()
| 9.875 | 40 | 0.721519 | 12 | 79 | 4.5 | 0.666667 | 0.37037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164557 | 79 | 7 | 41 | 11.285714 | 0.818182 | 0.265823 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
c2e661d5c729c83494d63810f66c09cf8a7fcdd2 | 43 | py | Python | file2.py | cgrinstead12/saturdayclass1 | 6ee4cc9ffca4720bd6cd1f9d209624b71eaf46c5 | [
"MIT"
] | null | null | null | file2.py | cgrinstead12/saturdayclass1 | 6ee4cc9ffca4720bd6cd1f9d209624b71eaf46c5 | [
"MIT"
] | null | null | null | file2.py | cgrinstead12/saturdayclass1 | 6ee4cc9ffca4720bd6cd1f9d209624b71eaf46c5 | [
"MIT"
] | null | null | null | import random
print(random.randint(0,10))
| 10.75 | 27 | 0.767442 | 7 | 43 | 4.714286 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 0.093023 | 43 | 3 | 28 | 14.333333 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
6c2a38ec4b9155065384aec4d7e1679884048649 | 20,801 | py | Python | fhir/resources/tests/test_patient.py | cstoltze/fhir.resources | 52f99738935b7313089d89daf94d73ce7d167c9d | [
"BSD-3-Clause"
] | 144 | 2019-05-08T14:24:43.000Z | 2022-03-30T02:37:11.000Z | fhir/resources/tests/test_patient.py | cstoltze/fhir.resources | 52f99738935b7313089d89daf94d73ce7d167c9d | [
"BSD-3-Clause"
] | 82 | 2019-05-13T17:43:13.000Z | 2022-03-30T16:45:17.000Z | fhir/resources/tests/test_patient.py | cstoltze/fhir.resources | 52f99738935b7313089d89daf94d73ce7d167c9d | [
"BSD-3-Clause"
] | 48 | 2019-04-04T14:14:53.000Z | 2022-03-30T06:07:31.000Z | # -*- coding: utf-8 -*-
"""
Profile: http://hl7.org/fhir/StructureDefinition/Patient
Release: R4
Version: 4.0.1
Build ID: 9346c8cc45
Last updated: 2019-11-01T09:29:23.356+11:00
"""
from pydantic.validators import bytes_validator # noqa: F401
from .. import fhirtypes # noqa: F401
from .. import patient
def impl_patient_1(inst):
assert inst.active is True
assert inst.address[0].city == "Metropolis"
assert inst.address[0].country == "USA"
assert inst.address[0].line[0] == "100 Main St"
assert inst.address[0].postalCode == "44130"
assert inst.address[0].state == "Il"
assert inst.birthDate == fhirtypes.Date.validate("1956-05-27")
assert inst.gender == "male"
assert inst.id == "xds"
assert inst.identifier[0].system == "urn:oid:1.2.3.4.5"
assert inst.identifier[0].type.coding[0].code == "MR"
assert (
inst.identifier[0].type.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v2-0203"
)
assert inst.identifier[0].use == "usual"
assert inst.identifier[0].value == "89765a87b"
assert inst.managingOrganization.reference == "Organization/2"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.name[0].family == "Doe"
assert inst.name[0].given[0] == "John"
assert inst.text.status == "generated"
def test_patient_1(base_settings):
"""No. 1 tests collection for Patient.
Test File: patient-example-xds.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-xds.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_1(inst)
# testing reverse by generating data from itself and create again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_1(inst2)
def impl_patient_2(inst):
assert inst.active is True
assert inst.address[0].city == "Amsterdam"
assert inst.address[0].country == "NLD"
assert inst.address[0].line[0] == "Van Egmondkade 23"
assert inst.address[0].postalCode == "1024 RJ"
assert inst.address[0].use == "home"
assert inst.birthDate == fhirtypes.Date.validate("1944-11-17")
assert inst.communication[0].language.coding[0].code == "nl"
assert inst.communication[0].language.coding[0].display == "Dutch"
assert inst.communication[0].language.coding[0].system == "urn:ietf:bcp:47"
assert inst.communication[0].language.text == "Nederlands"
assert inst.communication[0].preferred is True
assert inst.contact[0].name.family == "Abels"
assert inst.contact[0].name.given[0] == "Sarah"
assert inst.contact[0].name.use == "usual"
assert inst.contact[0].relationship[0].coding[0].code == "C"
assert (
inst.contact[0].relationship[0].coding[0].system
== "http://terminology.hl7.org/CodeSystem/v2-0131"
)
assert inst.contact[0].telecom[0].system == "phone"
assert inst.contact[0].telecom[0].use == "mobile"
assert inst.contact[0].telecom[0].value == "0690383372"
assert inst.deceasedBoolean is False
assert inst.gender == "male"
assert inst.id == "f001"
assert inst.identifier[0].system == "urn:oid:2.16.840.1.113883.2.4.6.3"
assert inst.identifier[0].use == "usual"
assert inst.identifier[0].value == "738472983"
assert inst.identifier[1].system == "urn:oid:2.16.840.1.113883.2.4.6.3"
assert inst.identifier[1].use == "usual"
assert inst.managingOrganization.display == "Burgers University Medical Centre"
assert inst.managingOrganization.reference == "Organization/f001"
assert inst.maritalStatus.coding[0].code == "M"
assert inst.maritalStatus.coding[0].display == "Married"
assert (
inst.maritalStatus.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v3-MaritalStatus"
)
assert inst.maritalStatus.text == "Getrouwd"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.multipleBirthBoolean is True
assert inst.name[0].family == "van de Heuvel"
assert inst.name[0].given[0] == "Pieter"
assert inst.name[0].suffix[0] == "MSc"
assert inst.name[0].use == "usual"
assert inst.telecom[0].system == "phone"
assert inst.telecom[0].use == "mobile"
assert inst.telecom[0].value == "0648352638"
assert inst.telecom[1].system == "email"
assert inst.telecom[1].use == "home"
assert inst.telecom[1].value == "p.heuvel@gmail.com"
assert inst.text.status == "generated"
def test_patient_2(base_settings):
"""No. 2 tests collection for Patient.
Test File: patient-example-f001-pieter.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-f001-pieter.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_2(inst)
# testing reverse by generating data from itself and create again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_2(inst2)
def impl_patient_3(inst):
assert inst.active is True
assert inst.birthDate == fhirtypes.Date.validate("1982-08-02")
assert inst.deceasedBoolean is True
assert inst.gender == "female"
assert inst.id == "pat4"
assert inst.identifier[0].system == "urn:oid:0.1.2.3.4.5.6.7"
assert inst.identifier[0].type.coding[0].code == "MR"
assert (
inst.identifier[0].type.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v2-0203"
)
assert inst.identifier[0].use == "usual"
assert inst.identifier[0].value == "123458"
assert inst.managingOrganization.display == "ACME Healthcare, Inc"
assert inst.managingOrganization.reference == "Organization/1"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.name[0].family == "Notsowell"
assert inst.name[0].given[0] == "Sandy"
assert inst.name[0].use == "official"
assert inst.text.status == "generated"
def test_patient_3(base_settings):
"""No. 3 tests collection for Patient.
Test File: patient-example-d.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-d.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_3(inst)
# testing reverse by generating data from itself and create again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_3(inst2)
def impl_patient_4(inst):
assert inst.birthDate == fhirtypes.Date.validate("2017-05-15")
assert inst.contact[0].name.family == "Organa"
assert inst.contact[0].name.given[0] == "Leia"
assert inst.contact[0].name.use == "maiden"
assert inst.contact[0].relationship[0].coding[0].code == "72705000"
assert inst.contact[0].relationship[0].coding[0].display == "Mother"
assert inst.contact[0].relationship[0].coding[0].system == "http://snomed.info/sct"
assert inst.contact[0].relationship[0].coding[1].code == "N"
assert (
inst.contact[0].relationship[0].coding[1].system
== "http://terminology.hl7.org/CodeSystem/v2-0131"
)
assert inst.contact[0].relationship[0].coding[2].code == "MTH"
assert (
inst.contact[0].relationship[0].coding[2].system
== "http://terminology.hl7.org/CodeSystem/v3-RoleCode"
)
assert inst.contact[0].telecom[0].system == "phone"
assert inst.contact[0].telecom[0].use == "mobile"
assert inst.contact[0].telecom[0].value == "+31201234567"
assert inst.extension[0].url == (
"http://hl7.org/fhir/StructureDefinition/patient-" "mothersMaidenName"
)
assert inst.extension[0].valueString == "Organa"
assert inst.gender == "female"
assert inst.id == "infant-twin-1"
assert (
inst.identifier[0].system
== "http://coruscanthealth.org/main-hospital/patient-identifier"
)
assert inst.identifier[0].type.coding[0].code == "MR"
assert (
inst.identifier[0].type.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v2-0203"
)
assert inst.identifier[0].value == "MRN7465737865"
assert (
inst.identifier[1].system
== "http://new-republic.gov/galactic-citizen-identifier"
)
assert inst.identifier[1].value == "7465737865"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.multipleBirthInteger == 1
assert inst.name[0].family == "Solo"
assert inst.name[0].given[0] == "Jaina"
assert inst.name[0].use == "official"
assert inst.text.status == "generated"
def test_patient_4(base_settings):
"""No. 4 tests collection for Patient.
Test File: patient-example-infant-twin-1.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-infant-twin-1.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_4(inst)
    # Round-trip test: regenerate data from the instance and create it again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_4(inst2)
def impl_patient_5(inst):
assert inst.birthDate == fhirtypes.Date.validate("1995-10-12")
assert inst.gender == "female"
assert inst.generalPractitioner[0].display == "Too-Onebee"
assert inst.generalPractitioner[0].reference == "Practitioner/21B"
assert inst.id == "infant-mom"
assert inst.maritalStatus.coding[0].code == "M"
assert inst.maritalStatus.coding[0].display == "Married"
assert (
inst.maritalStatus.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v3-MaritalStatus"
)
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.name[0].family == "Solo"
assert inst.name[0].given[0] == "Leia"
assert inst.name[0].use == "official"
assert inst.name[1].family == "Organa"
assert inst.name[1].given[0] == "Leia"
assert inst.name[1].use == "maiden"
assert inst.text.status == "generated"
def test_patient_5(base_settings):
"""No. 5 tests collection for Patient.
Test File: patient-example-infant-mom.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-infant-mom.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_5(inst)
    # Round-trip test: regenerate data from the instance and create it again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_5(inst2)
def impl_patient_6(inst):
assert inst.birthDate == fhirtypes.Date.validate("2017-09-05")
assert inst.extension[0].url == (
"http://hl7.org/fhir/StructureDefinition/patient-" "mothersMaidenName"
)
assert inst.extension[0].valueString == "Everywoman"
assert inst.gender == "male"
assert inst.id == "newborn"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.multipleBirthInteger == 2
assert inst.text.status == "generated"
def test_patient_6(base_settings):
"""No. 6 tests collection for Patient.
Test File: patient-example-newborn.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-newborn.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_6(inst)
    # Round-trip test: regenerate data from the instance and create it again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_6(inst2)
def impl_patient_7(inst):
assert inst.contact[0].name.family == "Organa"
assert inst.contact[0].name.given[0] == "Leia"
assert inst.contact[0].name.use == "maiden"
assert inst.contact[0].relationship[0].coding[0].code == "72705000"
assert inst.contact[0].relationship[0].coding[0].display == "Mother"
assert inst.contact[0].relationship[0].coding[0].system == "http://snomed.info/sct"
assert inst.contact[0].relationship[0].coding[1].code == "N"
assert (
inst.contact[0].relationship[0].coding[1].system
== "http://terminology.hl7.org/CodeSystem/v2-0131"
)
assert inst.contact[0].relationship[0].coding[2].code == "MTH"
assert (
inst.contact[0].relationship[0].coding[2].system
== "http://terminology.hl7.org/CodeSystem/v3-RoleCode"
)
assert inst.contact[0].telecom[0].system == "phone"
assert inst.contact[0].telecom[0].use == "mobile"
assert inst.contact[0].telecom[0].value == "+31201234567"
assert inst.extension[0].url == (
"http://hl7.org/fhir/StructureDefinition/patient-" "mothersMaidenName"
)
assert inst.extension[0].valueString == "Organa"
assert inst.gender == "male"
assert inst.id == "infant-fetal"
assert (
inst.identifier[0].system
== "http://coruscanthealth.org/main-hospital/patient-identifier"
)
assert inst.identifier[0].type.coding[0].code == "MR"
assert (
inst.identifier[0].type.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v2-0203"
)
assert inst.identifier[0].value == "MRN657865757378"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_patient_7(base_settings):
"""No. 7 tests collection for Patient.
Test File: patient-example-infant-fetal.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-infant-fetal.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_7(inst)
    # Round-trip test: regenerate data from the instance and create it again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_7(inst2)
def impl_patient_8(inst):
assert inst.active is True
assert inst.address[0].line[0] == "2222 Home Street"
assert inst.address[0].use == "home"
assert inst.birthDate == fhirtypes.Date.validate("1973-05-31")
assert inst.gender == "female"
assert inst.id == "genetics-example1"
assert inst.identifier[0].system == "http://hl7.org/fhir/sid/us-ssn"
assert inst.identifier[0].type.coding[0].code == "SS"
assert (
inst.identifier[0].type.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v2-0203"
)
assert inst.identifier[0].value == "444222222"
assert inst.managingOrganization.reference == "Organization/hl7"
assert inst.meta.lastUpdated == fhirtypes.Instant.validate("2012-05-29T23:45:32Z")
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.name[0].family == "Everywoman"
assert inst.name[0].given[0] == "Eve"
assert inst.name[0].use == "official"
assert inst.telecom[0].system == "phone"
assert inst.telecom[0].use == "work"
assert inst.telecom[0].value == "555-555-2003"
assert inst.text.status == "generated"
def test_patient_8(base_settings):
"""No. 8 tests collection for Patient.
Test File: patient-genetics-example1.json
"""
filename = base_settings["unittest_data_dir"] / "patient-genetics-example1.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_8(inst)
    # Round-trip test: regenerate data from the instance and create it again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_8(inst2)
def impl_patient_9(inst):
assert inst.active is True
assert inst.gender == "other"
assert inst.id == "pat2"
assert inst.identifier[0].system == "urn:oid:0.1.2.3.4.5.6.7"
assert inst.identifier[0].type.coding[0].code == "MR"
assert (
inst.identifier[0].type.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v2-0203"
)
assert inst.identifier[0].use == "usual"
assert inst.identifier[0].value == "123456"
assert inst.link[0].other.reference == "Patient/pat1"
assert inst.link[0].type == "seealso"
assert inst.managingOrganization.display == "ACME Healthcare, Inc"
assert inst.managingOrganization.reference == "Organization/1"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.name[0].family == "Donald"
assert inst.name[0].given[0] == "Duck"
assert inst.name[0].given[1] == "D"
assert inst.name[0].use == "official"
assert inst.photo[0].contentType == "image/gif"
assert inst.text.status == "generated"
def test_patient_9(base_settings):
"""No. 9 tests collection for Patient.
Test File: patient-example-b.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-b.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_9(inst)
    # Round-trip test: regenerate data from the instance and create it again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_9(inst2)
def impl_patient_10(inst):
assert inst.active is True
assert inst.birthDate == fhirtypes.Date.validate("1982-01-23")
assert inst.deceasedDateTime == fhirtypes.DateTime.validate(
"2015-02-14T13:42:00+10:00"
)
assert inst.gender == "male"
assert inst.id == "pat3"
assert inst.identifier[0].system == "urn:oid:0.1.2.3.4.5.6.7"
assert inst.identifier[0].type.coding[0].code == "MR"
assert (
inst.identifier[0].type.coding[0].system
== "http://terminology.hl7.org/CodeSystem/v2-0203"
)
assert inst.identifier[0].use == "usual"
assert inst.identifier[0].value == "123457"
assert inst.managingOrganization.display == "ACME Healthcare, Inc"
assert inst.managingOrganization.reference == "Organization/1"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.name[0].family == "Notsowell"
assert inst.name[0].given[0] == "Simon"
assert inst.name[0].use == "official"
assert inst.text.status == "generated"
def test_patient_10(base_settings):
"""No. 10 tests collection for Patient.
Test File: patient-example-c.json
"""
filename = base_settings["unittest_data_dir"] / "patient-example-c.json"
inst = patient.Patient.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Patient" == inst.resource_type
impl_patient_10(inst)
    # Round-trip test: regenerate data from the instance and create it again.
data = inst.dict()
assert "Patient" == data["resourceType"]
inst2 = patient.Patient(**data)
impl_patient_10(inst2)
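The dump-and-rebuild round trip exercised by every test above can be sketched with a stdlib-only stand-in; the plain dataclass below is an illustration of the pattern, not the `fhir.resources` pydantic model the tests actually use.

```python
import json
from dataclasses import dataclass, asdict

# Stand-in for the round-trip pattern: parse JSON into a model,
# dump it back to a dict, then rebuild an equal instance from it.
@dataclass
class Patient:
    resourceType: str
    id: str

raw = '{"resourceType": "Patient", "id": "pat3"}'
inst = Patient(**json.loads(raw))
data = asdict(inst)
assert data["resourceType"] == "Patient"
inst2 = Patient(**data)  # recreate from its own dump
assert inst2 == inst
print(inst2.id)  # → pat3
```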
| 37.277778 | 88 | 0.666026 | 2,738 | 20,801 | 5.004748 | 0.106282 | 0.172225 | 0.056922 | 0.053638 | 0.859155 | 0.82325 | 0.795154 | 0.757717 | 0.693133 | 0.655258 | 0 | 0.048001 | 0.179751 | 20,801 | 557 | 89 | 37.344704 | 0.755128 | 0.077016 | 0 | 0.554795 | 0 | 0.004566 | 0.217186 | 0.023005 | 0 | 0 | 0 | 0 | 0.584475 | 1 | 0.045662 | false | 0 | 0.006849 | 0 | 0.052511 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6c52cacdbcf4759c9bc95d54f52adad7c932d91d | 111 | py | Python | test/src/Options/command-arg.py | milliburn/llvmPy | d6fa3002e823fae00cf33d9b2ea480604681376c | [
"MIT"
] | 1 | 2019-01-22T02:58:04.000Z | 2019-01-22T02:58:04.000Z | test/src/Options/command-arg.py | roberth-k/llvmPy | d6fa3002e823fae00cf33d9b2ea480604681376c | [
"MIT"
] | null | null | null | test/src/Options/command-arg.py | roberth-k/llvmPy | d6fa3002e823fae00cf33d9b2ea480604681376c | [
"MIT"
] | null | null | null | # RUN: llvmPy -c 'print("Test!")' | FileCheck %s
# RUN: llvmPy %s | FileCheck %s
print("Test!")
# CHECK: Test!
| 22.2 | 48 | 0.603604 | 16 | 111 | 4.1875 | 0.5 | 0.268657 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171171 | 111 | 4 | 49 | 27.75 | 0.728261 | 0.801802 | 0 | 0 | 0 | 0 | 0.277778 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
6c57646031932126cc7bd95b74597c48383ffc6f | 12,397 | py | Python | 14B-088/HI/analysis/HI_veldisp_profile.py | e-koch/VLA_Lband | 8fca7b2de0b88ce5c5011b34bf3936c69338d0b0 | [
"MIT"
] | 1 | 2021-03-08T23:19:12.000Z | 2021-03-08T23:19:12.000Z | 14B-088/HI/analysis/HI_veldisp_profile.py | e-koch/VLA_Lband | 8fca7b2de0b88ce5c5011b34bf3936c69338d0b0 | [
"MIT"
] | null | null | null | 14B-088/HI/analysis/HI_veldisp_profile.py | e-koch/VLA_Lband | 8fca7b2de0b88ce5c5011b34bf3936c69338d0b0 | [
"MIT"
] | null | null | null | import matplotlib.pyplot as p
from pandas import read_csv
from spectral_cube import SpectralCube, Projection
import numpy as np
import astropy.units as u
from astropy.coordinates import Angle
from astropy.io import fits
import os
from os.path import join as osjoin
from cube_analysis.profiles import radial_profile
from paths import (fourteenB_HI_data_path, fourteenB_HI_file_dict,
fourteenB_HI_data_wGBT_path, fourteenB_wGBT_HI_file_dict,
allfigs_path, alltables_path)
from galaxy_params import gal_feath as gal
from plotting_styles import (default_figure, onecolumn_figure,
twocolumn_twopanel_figure)
stack_figure_folder = allfigs_path("stacked_profiles")
if not os.path.exists(stack_figure_folder):
os.mkdir(allfigs_path(stack_figure_folder))
props_figure_folder = allfigs_path("HI_properties")
if not os.path.exists(props_figure_folder):
os.mkdir(allfigs_path(props_figure_folder))
lwidth_hdu = fits.open(fourteenB_HI_file_dict["LWidth"])[0]
lwidth = Projection.from_hdu(lwidth_hdu).to(u.km / u.s)
lwidth_feath_hdu = fits.open(fourteenB_wGBT_HI_file_dict["LWidth"])[0]
lwidth_feath = Projection.from_hdu(lwidth_feath_hdu).to(u.km / u.s)
# Create a radial profile of the HI vel disp out to 8 kpc.
# Beyond 8 kpc, noise is dominant. It may be some reflection of the
# warp, but I don't trust it right now.
rs, sd, sd_sigma = radial_profile(gal, lwidth, max_rad=8 * u.kpc)
rs_feath, sd_feath, sd_feath_sigma = \
radial_profile(gal, lwidth_feath, max_rad=8 * u.kpc)
onecolumn_figure(font_scale=1.)
p.errorbar(rs.value, sd.value,
yerr=sd_sigma.value, fmt="-",
drawstyle='steps-mid', label='VLA')
p.errorbar(rs_feath.value, sd_feath.value,
yerr=sd_feath_sigma.value, fmt="--",
drawstyle='steps-mid', label='VLA + GBT')
p.xlabel("Radius (kpc)")
p.ylabel("HI Line Width (km/s)")
p.legend(loc='lower left', frameon=True)
p.grid()
p.tight_layout()
p.savefig(osjoin(props_figure_folder, "hi_veldisp_profile.png"))
p.savefig(osjoin(props_figure_folder, "hi_veldisp_profile.pdf"))
p.close()
# Create the North and South portions.
rs_n, sd_n, sd_sigma_n = \
radial_profile(gal, lwidth, max_rad=8 * u.kpc,
pa_bounds=Angle([0.5 * np.pi * u.rad,
-0.5 * np.pi * u.rad]))
rs_s, sd_s, sd_sigma_s = \
radial_profile(gal, lwidth, max_rad=8 * u.kpc,
pa_bounds=Angle([-0.5 * np.pi * u.rad,
0.5 * np.pi * u.rad]))
p.plot(rs_n.value, sd_n.value, "-.", label="North",
drawstyle='steps-mid')
p.plot(rs_s.value, sd_s.value, "--", label="South",
drawstyle='steps-mid')
p.errorbar(rs.value, sd.value,
yerr=sd_sigma.value, fmt="-",
drawstyle='steps-mid', label='Total')
p.xlabel("Radius (kpc)")
p.ylabel("HI Line Width (km/s)")
p.grid()
p.legend(frameon=True)
p.tight_layout()
p.savefig(osjoin(props_figure_folder, "hi_veldisp_profile_n_s.png"))
p.savefig(osjoin(props_figure_folder, "hi_veldisp_profile_n_s.pdf"))
p.close()
rs_n, sd_n, sd_sigma_n = \
radial_profile(gal, lwidth_feath, max_rad=8 * u.kpc,
pa_bounds=Angle([0.5 * np.pi * u.rad,
-0.5 * np.pi * u.rad]))
rs_s, sd_s, sd_sigma_s = \
radial_profile(gal, lwidth_feath, max_rad=8 * u.kpc,
pa_bounds=Angle([-0.5 * np.pi * u.rad,
0.5 * np.pi * u.rad]))
p.plot(rs_n.value, sd_n.value, "-.", label="North",
drawstyle='steps-mid')
p.plot(rs_s.value, sd_s.value, "--", label="South",
drawstyle='steps-mid')
p.errorbar(rs_feath.value, sd_feath.value,
yerr=sd_feath_sigma.value, fmt="-",
drawstyle='steps-mid', label='Total')
p.xlabel("Radius (kpc)")
p.ylabel("HI Line Width (km/s)")
p.grid()
p.legend(frameon=True)
p.tight_layout()
p.savefig(osjoin(props_figure_folder, "hi_veldisp_profile_n_s_feather.png"))
p.savefig(osjoin(props_figure_folder, "hi_veldisp_profile_n_s_feather.pdf"))
p.close()
# Now load in the line stacking fits with the same bin size
hi_radial_fits = \
read_csv(fourteenB_HI_data_path("tables/hi_gaussian_hwhm_totalprof_fits_radial.csv"))
twocolumn_twopanel_figure(font_scale=1.25)
fig, ax = p.subplots(1, 2, sharex=True, sharey=True)
ax[0].errorbar(rs.value, sd.value,
yerr=sd_sigma.value, fmt="-",
drawstyle='steps-mid', label='Avg. Line Width')
ax[0].set_xlabel("Radius (kpc)")
ax[0].set_ylabel("HI Line Width (km/s)")
# Now the stacked fits
# Note that low_lim == up_lim for sigma
ax[0].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['rotsub_sigma'],
yerr=hi_radial_fits['rotsub_sigma_low_lim'],
fmt='D', label='Rot. Stack', alpha=0.5)
ax[0].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['centsub_sigma'],
yerr=hi_radial_fits['centsub_sigma_low_lim'],
fmt='o', label='Cent. Stack', alpha=0.5)
ax[0].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['peaksub_sigma'],
yerr=hi_radial_fits['peaksub_sigma_low_lim'],
fmt='^', label='Peak Stack', alpha=0.5)
ax[0].grid()
ax[0].fill_between([0, 0.5], 4, 15, color='gray', alpha=0.5)
ax[0].set_ylim([4., 15])
ax[0].set_xlim([0, 8.2])
ax[0].legend(frameon=True)
ax[0].text(0.2, 13.5, "VLA",
bbox={"boxstyle": "square", "facecolor": "w"})
ax[1].errorbar(rs.value, sd_feath.value,
yerr=sd_feath_sigma.value, fmt="-",
drawstyle='steps-mid', label='Averaged Line Width')
ax[1].set_xlabel("Radius (kpc)")
# Now the stacked fits
ax[1].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['rotsub_feath_sigma'],
yerr=hi_radial_fits['rotsub_feath_sigma_low_lim'],
fmt='D', label='Rot. Stack', alpha=0.5)
ax[1].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['centsub_feath_sigma'],
yerr=hi_radial_fits['centsub_feath_sigma_low_lim'],
fmt='o', label='Cent. Stack', alpha=0.5)
ax[1].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['peaksub_feath_sigma'],
yerr=hi_radial_fits['peaksub_feath_sigma_low_lim'],
fmt='^', label='Peak Stack', alpha=0.5)
ax[1].fill_between([0, 0.5], 4, 15, color='gray', alpha=0.5)
ax[1].set_ylim([4., 15])
ax[1].grid()
ax[1].text(5.9, 13.5, "VLA + GBT",
bbox={"boxstyle": "square", "facecolor": "w"})
p.tight_layout()
p.savefig(osjoin(stack_figure_folder, "hi_veldisp_w_stackedfits.png"))
p.savefig(osjoin(stack_figure_folder, "hi_veldisp_w_stackedfits.pdf"))
p.close()
onecolumn_figure()
p.errorbar(rs.value, sd_feath.value,
yerr=sd_feath_sigma.value, fmt="-",
drawstyle='steps-mid', label='Averaged Line Width')
p.xlabel("Radius (kpc)")
p.ylabel("HI Line Width (km/s)")
# Now the stacked fits
p.errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['rotsub_feath_sigma'],
yerr=hi_radial_fits['rotsub_feath_sigma_low_lim'],
fmt='D', label='Rot. Stack', alpha=0.5)
p.errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['centsub_feath_sigma'],
yerr=hi_radial_fits['centsub_feath_sigma_low_lim'],
fmt='o', label='Cent. Stack', alpha=0.5)
p.errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['peaksub_feath_sigma'],
yerr=hi_radial_fits['peaksub_feath_sigma_low_lim'],
fmt='^', label='Peak Stack', alpha=0.5)
p.fill_between([0, 0.5], 4, 17, color='gray', alpha=0.5)
p.ylim([5., 17])
p.grid()
p.tight_layout()
p.xlim([0, 8.2])
p.legend(frameon=True, ncol=2, loc='upper center')
p.savefig(osjoin(stack_figure_folder, "hi_veldisp_w_stackedfits_combinedonly.png"))
p.savefig(osjoin(stack_figure_folder, "hi_veldisp_w_stackedfits_combinedonly.pdf"))
p.close()
# Let's compare the line width from the second moment to the Gaussian width
rot_stack = SpectralCube.read(fourteenB_HI_data_path("stacked_spectra/rotation_stacked_radial_100pc.fits"))
cent_stack = SpectralCube.read(fourteenB_HI_data_path("stacked_spectra/centroid_stacked_radial_100pc.fits"))
peakvel_stack = SpectralCube.read(fourteenB_HI_data_path("stacked_spectra/peakvel_stacked_radial_100pc.fits"))
twocolumn_twopanel_figure(font_scale=1.2)
fig, ax = p.subplots(1, 3, sharey=True)
ax[0].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['rotsub_sigma'],
yerr=hi_radial_fits["rotsub_sigma_low_lim"],
fmt='D', label='Gaussian Fit', alpha=0.5)
ax[0].plot(hi_radial_fits['bin_center'],
rot_stack.linewidth_sigma().value / 1000.,
label="Moment")
ax[0].text(5, 11.5, "Rotation\nsubtracted",
bbox={"boxstyle": "square", "facecolor": "w"})
ax[0].legend(frameon=True, loc='lower right')
ax[0].grid()
# ax[0].set_xticklabels([])
ax[0].set_ylabel("HI Line Width (km/s)")
ax[0].set_xlabel("Radius (kpc)")
ax[1].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['centsub_sigma'],
yerr=hi_radial_fits["centsub_sigma_low_lim"],
fmt='D', label='Gaussian Fit', alpha=0.5)
ax[1].plot(hi_radial_fits['bin_center'],
cent_stack.linewidth_sigma().value / 1000.,
label="Moment")
ax[1].text(5, 11.5, "Centroid\nsubtracted",
bbox={"boxstyle": "square", "facecolor": "w"})
ax[1].grid()
ax[1].set_xlabel("Radius (kpc)")
ax[2].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['peaksub_sigma'],
yerr=hi_radial_fits["peaksub_sigma_low_lim"],
fmt='D', label='Gaussian Fit', alpha=0.5)
ax[2].plot(hi_radial_fits['bin_center'],
           peakvel_stack.linewidth_sigma().value / 1000.,
           label="Moment")
ax[2].text(5, 11.5, "Peak Vel.\nsubtracted",
bbox={"boxstyle": "square", "facecolor": "w"})
ax[2].grid()
ax[2].set_xlabel("Radius (kpc)")
p.tight_layout()
fig.savefig(osjoin(stack_figure_folder, "hi_veldisp_avg_vs_stackedfits.png"))
fig.savefig(osjoin(stack_figure_folder, "hi_veldisp_avg_vs_stackedfits.pdf"))
p.close()
rot_stack = SpectralCube.read(fourteenB_HI_data_wGBT_path("stacked_spectra/rotation_stacked_radial_100pc.fits"))
cent_stack = SpectralCube.read(fourteenB_HI_data_wGBT_path("stacked_spectra/centroid_stacked_radial_100pc.fits"))
peakvel_stack = SpectralCube.read(fourteenB_HI_data_wGBT_path("stacked_spectra/peakvel_stacked_radial_100pc.fits"))
fig, ax = p.subplots(1, 3, sharey=True)
ax[0].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['rotsub_feath_sigma'],
               yerr=hi_radial_fits["rotsub_feath_sigma_low_lim"],
fmt='D', label='Gaussian Fit', alpha=0.5)
ax[0].plot(hi_radial_fits['bin_center'],
rot_stack.linewidth_sigma().value / 1000.,
label="Moment")
ax[0].text(4.75, 14.5, "Rotation\nsubtracted",
bbox={"boxstyle": "square", "facecolor": "w"})
ax[0].legend(frameon=True, loc='lower left')
ax[0].grid()
# ax[0].set_xticklabels([])
ax[0].set_ylabel("HI Line Width (km/s)")
ax[0].set_xlabel("Radius (kpc)")
ax[1].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['centsub_feath_sigma'],
               yerr=hi_radial_fits["centsub_feath_sigma_low_lim"],
fmt='D', label='Gaussian Fit', alpha=0.5)
ax[1].plot(hi_radial_fits['bin_center'],
cent_stack.linewidth_sigma().value / 1000.,
label="Moment")
ax[1].text(4.75, 14.5, "Centroid\nsubtracted",
bbox={"boxstyle": "square", "facecolor": "w"})
ax[1].grid()
ax[1].set_xlabel("Radius (kpc)")
ax[2].errorbar(hi_radial_fits['bin_center'],
hi_radial_fits['peaksub_feath_sigma'],
               yerr=hi_radial_fits["peaksub_feath_sigma_low_lim"],
fmt='D', label='Gaussian Fit', alpha=0.5)
ax[2].plot(hi_radial_fits['bin_center'],
           peakvel_stack.linewidth_sigma().value / 1000.,
           label="Moment")
ax[2].text(4.75, 14.5, "Peak Vel.\nsubtracted",
bbox={"boxstyle": "square", "facecolor": "w"})
ax[2].grid()
ax[2].set_xlabel("Radius (kpc)")
ax[2].set_ylim([5, 17])
p.tight_layout()
fig.savefig(osjoin(stack_figure_folder, "hi_veldisp_avg_vs_stackedfits_feath.png"))
fig.savefig(osjoin(stack_figure_folder, "hi_veldisp_avg_vs_stackedfits_feath.pdf"))
p.close()
default_figure()
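The `radial_profile` helper imported from `cube_analysis` is not shown in this file. As a rough numpy-only sketch of the idea (azimuthal averaging in annular radius bins; the real helper additionally handles the galaxy's inclination, position angle, and physical scale, none of which are reproduced here):

```python
import numpy as np

# Average a 2-D map in annular radius bins about the image center.
def simple_radial_profile(image, bin_width=5.0):
    ny, nx = image.shape
    yy, xx = np.indices((ny, nx))
    r = np.hypot(yy - ny / 2, xx - nx / 2)
    edges = np.arange(0.0, r.max() + bin_width, bin_width)
    centers = 0.5 * (edges[:-1] + edges[1:])
    means = np.array([image[(r >= lo) & (r < hi)].mean()
                      for lo, hi in zip(edges[:-1], edges[1:])])
    return centers, means

yy, xx = np.indices((64, 64))
image = np.hypot(yy - 32.0, xx - 32.0)  # toy map whose value is its radius
centers, profile = simple_radial_profile(image)
print(bool(np.all(np.diff(profile) > 0)))  # → True: means rise with radius
```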
| 41.600671 | 115 | 0.668146 | 1,905 | 12,397 | 4.074016 | 0.115486 | 0.053601 | 0.080402 | 0.040588 | 0.817163 | 0.784564 | 0.750032 | 0.742559 | 0.734313 | 0.72233 | 0 | 0.024505 | 0.173752 | 12,397 | 297 | 116 | 41.740741 | 0.733184 | 0.03888 | 0 | 0.630769 | 0 | 0 | 0.23097 | 0.094354 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.05 | 0 | 0.05 | 0.003846 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6c60659e10b46a3fa2abb6fb3dbe7ecad98a0c2a | 57 | py | Python | ske_customization/customizations_for_ske/jinja.py | akshay83/ske_customization | 910e8ca88ffc83554ebb23f7480901dba9f08221 | [
"MIT"
] | null | null | null | ske_customization/customizations_for_ske/jinja.py | akshay83/ske_customization | 910e8ca88ffc83554ebb23f7480901dba9f08221 | [
"MIT"
] | null | null | null | ske_customization/customizations_for_ske/jinja.py | akshay83/ske_customization | 910e8ca88ffc83554ebb23f7480901dba9f08221 | [
"MIT"
] | null | null | null | import json
def json_load(value):
	return json.loads(value)
| 11.4 | 23 | 0.754386 | 10 | 57 | 4.2 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140351 | 57 | 4 | 24 | 14.25 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
6c74d06ce59645591465cd971b59f6b45b31556f | 2,179 | py | Python | main.py | 5U55/DadBot | 2ab28b09c44efcd6b8495c62bebe416619eb5c56 | [
"CC0-1.0"
] | null | null | null | main.py | 5U55/DadBot | 2ab28b09c44efcd6b8495c62bebe416619eb5c56 | [
"CC0-1.0"
] | null | null | null | main.py | 5U55/DadBot | 2ab28b09c44efcd6b8495c62bebe416619eb5c56 | [
"CC0-1.0"
] | null | null | null | import discord, os, random
from keep_alive import keep_alive
client = discord.Client()
reprimands = ['Language ', 'We don\'t use that language in this household ', 'Watch your mouth ', 'Hey, watch your language ']
@client.event
async def on_ready():
    print('we have logged in as {0.user}'.format(client))
@client.event
async def on_message(message):
    if message.author == client.user:
        return
    content = message.content.lower()
    if 'i am ' in content:
        await message.channel.send('Hi ' + message.content[content.index('i am ') + 5:] + ' I\'m Dad')
    if 'i\'m ' in content:
        await message.channel.send('Hi ' + message.content[content.index('i\'m ') + 4:] + ' I\'m Dad')
    if 'im ' in content:
        # Only greet when 'im' starts the message or a new word.
        if content.index('im ') == 0 or content[content.index('im ') - 1] == ' ':
            await message.channel.send('Hi ' + message.content[content.index('im ') + 3:] + ' I\'m Dad')
    if content in ('hell', 'heck', 'damn', 'shoot'):
        await message.channel.send(random.choice(reprimands) + message.author.display_name)
    if 'dad' in content:
        # Covers both 'dad' and 'dad bot' with a single reply.
        await message.channel.send('Yes, {}, you said my name?'.format(message.author.display_name))
    if 'what\'s up' in content or 'whats up' in content or 'wassup' in content:
        await message.channel.send('The sky ;)')
keep_alive()
client.run(os.getenv('TOKEN'))
| 48.422222 | 128 | 0.695732 | 310 | 2,179 | 4.854839 | 0.251613 | 0.223256 | 0.214618 | 0.183389 | 0.752159 | 0.706977 | 0.706977 | 0.627243 | 0.627243 | 0.627243 | 0 | 0.003138 | 0.122533 | 2,179 | 44 | 129 | 49.522727 | 0.783996 | 0 | 0 | 0.289474 | 0 | 0 | 0.121615 | 0.011932 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 0.078947 | 0.026316 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6c7a4a41ecff4bbd7b724718148c2d5b599900ef | 4,724 | py | Python | model/tree_models.py | phykn/xai_tree | 66f5cb4ea77686364478b1f16f937678b2e544a8 | [
"Apache-2.0"
] | 1 | 2022-02-06T17:49:26.000Z | 2022-02-06T17:49:26.000Z | model/tree_models.py | phykn/xai_tree | 66f5cb4ea77686364478b1f16f937678b2e544a8 | [
"Apache-2.0"
] | null | null | null | model/tree_models.py | phykn/xai_tree | 66f5cb4ea77686364478b1f16f937678b2e544a8 | [
"Apache-2.0"
] | null | null | null | from numpy import ndarray
from typing import Dict, Any, Optional
from lightgbm import LGBMRegressor, LGBMClassifier
from xgboost import XGBRegressor, XGBClassifier
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier, ExtraTreesRegressor, ExtraTreesClassifier
def lgb_reg(
x: ndarray,
y: ndarray,
params: Dict[str, Any]={},
random_state: int=42,
n_jobs: int=-1,
) -> Dict[str, Any]:
'''LightGBM Regressor'''
# Set random_state and n_jobs
params = params.copy()
params['random_state'] = random_state
params['n_jobs'] = n_jobs
# model
model = LGBMRegressor(**params)
model.fit(x, y)
return {
'model': model,
'params': params,
'name': 'LGBMRegressor',
'type': 'reg'
}
def lgb_clf(
x: ndarray,
y: ndarray,
params: Dict[str, Any]={},
random_state: int=42,
n_jobs: int=-1,
) -> Dict[str, Any]:
'''LightGBM Classifier'''
# Set random_state and n_jobs
    params = params.copy()
    params['random_state'] = random_state
    params['n_jobs'] = n_jobs
    # model
    model = LGBMClassifier(**params)
    model.fit(x, y)
    return {
        'model': model,
        'params': params,
        'name': 'LGBMClassifier',
        'type': 'clf'
    }


def xgb_reg(
    x: ndarray,
    y: ndarray,
    params: Dict[str, Any] = {},
    random_state: int = 42,
    n_jobs: Optional[int] = None,
) -> Dict[str, Any]:
    '''XGBoost Regressor'''
    # Set random_state and n_jobs
    params = params.copy()
    params['random_state'] = random_state
    params['n_jobs'] = n_jobs
    params['verbosity'] = 0
    # model
    model = XGBRegressor(**params)
    model.fit(x, y)
    return {
        'model': model,
        'params': params,
        'name': 'XGBRegressor',
        'type': 'reg'
    }


def xgb_clf(
    x: ndarray,
    y: ndarray,
    params: Dict[str, Any] = {},
    random_state: int = 42,
    n_jobs: Optional[int] = None,
) -> Dict[str, Any]:
    '''XGBoost Classifier'''
    # Set random_state and n_jobs
    params = params.copy()
    params['random_state'] = random_state
    params['n_jobs'] = n_jobs
    params['verbosity'] = 0
    params['use_label_encoder'] = False
    # model
    model = XGBClassifier(**params)
    model.fit(x, y)
    return {
        'model': model,
        'params': params,
        'name': 'XGBClassifier',
        'type': 'clf'
    }


def rf_reg(
    x: ndarray,
    y: ndarray,
    params: Dict[str, Any] = {},
    random_state: int = 42,
    n_jobs: int = -1,
) -> Dict[str, Any]:
    '''Random Forest Regressor'''
    # Set random_state and n_jobs
    params = params.copy()
    params['random_state'] = random_state
    params['n_jobs'] = n_jobs
    # model
    model = RandomForestRegressor(**params)
    model.fit(x, y)
    return {
        'model': model,
        'params': params,
        'name': 'RandomForestRegressor',
        'type': 'reg'
    }


def rf_clf(
    x: ndarray,
    y: ndarray,
    params: Dict[str, Any] = {},
    random_state: int = 42,
    n_jobs: int = -1,
) -> Dict[str, Any]:
    '''Random Forest Classifier'''
    # Set random_state and n_jobs
    params = params.copy()
    params['random_state'] = random_state
    params['n_jobs'] = n_jobs
    # model
    model = RandomForestClassifier(**params)
    model.fit(x, y)
    return {
        'model': model,
        'params': params,
        'name': 'RandomForestClassifier',
        'type': 'clf'
    }


def et_reg(
    x: ndarray,
    y: ndarray,
    params: Dict[str, Any] = {},
    random_state: int = 42,
    n_jobs: int = -1,
) -> Dict[str, Any]:
    '''Extra Trees Regressor'''
    # Set random_state and n_jobs
    params = params.copy()
    params['random_state'] = random_state
    params['n_jobs'] = n_jobs
    # model
    model = ExtraTreesRegressor(**params)
    model.fit(x, y)
    return {
        'model': model,
        'params': params,
        'name': 'ExtraTreesRegressor',
        'type': 'reg'
    }


def et_clf(
    x: ndarray,
    y: ndarray,
    params: Dict[str, Any] = {},
    random_state: int = 42,
    n_jobs: int = -1,
) -> Dict[str, Any]:
    '''Extra Trees Classifier'''
    # Set random_state and n_jobs
    params = params.copy()
    params['random_state'] = random_state
    params['n_jobs'] = n_jobs
    # model
    model = ExtraTreesClassifier(**params)
    model.fit(x, y)
    return {
        'model': model,
        'params': params,
        'name': 'ExtraTreesClassifier',
        'type': 'clf'
    }
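Every factory above follows the same pattern: copy the caller's `params` dict (so it is never mutated), inject the reproducibility settings (`random_state`, `n_jobs`), fit the estimator, and wrap the result in a small metadata dict with `model`, `params`, `name`, and `type` keys. A minimal sketch of that pattern and how such factories might be dispatched from a registry — the `DummyModel` estimator, `dummy_reg` factory, and `FACTORIES` registry here are hypothetical, for illustration only:

```python
from typing import Any, Dict, Optional


class DummyModel:
    """Hypothetical stand-in with a scikit-learn-like fit interface."""

    def __init__(self, **params: Any) -> None:
        self.params = params
        self.fitted = False

    def fit(self, x, y) -> "DummyModel":
        self.fitted = True
        return self


def dummy_reg(
    x,
    y,
    params: Dict[str, Any] = {},
    random_state: int = 42,
    n_jobs: Optional[int] = None,
) -> Dict[str, Any]:
    '''Dummy Regressor (same shape as the factories above)'''
    # Copy so the caller's dict is never mutated, then inject settings.
    params = params.copy()
    params['random_state'] = random_state
    params['n_jobs'] = n_jobs
    model = DummyModel(**params).fit(x, y)
    return {
        'model': model,
        'params': params,
        'name': 'DummyModel',
        'type': 'reg'
    }


# Dispatching a factory by name from a registry:
FACTORIES = {'dummy_reg': dummy_reg}
result = FACTORIES['dummy_reg']([[0], [1]], [0, 1], params={'alpha': 0.1})
```

Because each factory copies `params` before writing to it, the mutable `={}` default is never shared or mutated across calls, and the caller's original dict stays untouched.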