# ===== File: scripts/rocsparse-bench-compare.py (repo: raramakr/rocSPARSE, license: MIT) =====
#!/usr/bin/env python3
# ########################################################################
# Copyright (c) 2019-2021 Advanced Micro Devices, Inc.
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
#
# ########################################################################
import argparse
import subprocess
import os
import re # regexp package
import sys
import tempfile
import json
import xml.etree.ElementTree as ET
import rocsparse_bench_gnuplot_helper
def export_gnuplot(obasename, xargs, yargs, case_results, case_titles, verbose=False, debug=False):
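    """Write a gnuplot data file and command file comparing time, performance and
    bandwidth across cases, render the plots, and clean up unless debug is set."""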
num_cases = len(case_results)
datafile = open(obasename + ".dat", "w+")
len_xargs = len(xargs)
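    # Data layout: one gnuplot index block per (plot, case) pair; each row holds the
    # x label followed by time, flops and bandwidth triples (assumed here to be the
    # median/low/high values reported by rocsparse-bench).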
for iplot in range(len(yargs)):
for case_index in range(num_cases):
samples = case_results[case_index]
for ixarg in range(len_xargs):
isample = iplot * len_xargs + ixarg
tg = samples[isample]["timing"]
datafile.write(os.path.basename(os.path.splitext(xargs[ixarg])[0]) + " " +
tg["time"][0] + " " +
tg["time"][1] + " " +
tg["time"][2] + " " +
tg["flops"][0] + " " +
tg["flops"][1] + " " +
tg["flops"][2] + " " +
tg["bandwidth"][0] + " " +
tg["bandwidth"][1] + " "+
tg["bandwidth"][2] + "\n")
datafile.write("\n")
datafile.write("\n")
    datafile.close()
if verbose:
print('//rocsparse-bench-compare - write gnuplot file : \'' + obasename + '.gnuplot\'')
cmdfile = open(obasename + ".gnuplot", "w+")
# for each plot
    num_plots = len(yargs)
    for iplot in range(num_plots):
        # one output file per plot; suffix with the plot index when there are several
        if num_plots == 1:
            filename_extension = ".pdf"
        else:
            filename_extension = "." + str(iplot) + ".pdf"
#
# Reminder, files is what we want to compare.
#
plot_index=iplot * num_cases
# rocsparse_bench_gnuplot_helper.curve(cmdfile,
# obasename + "_msec"+ filename_extension,
# 'Time',
# range(plot_index,plot_index + num_cases),
# obasename + ".dat",
# [-0.5,len_xargs + 0.5],
# "milliseconds",
# 2,
# case_titles)
rocsparse_bench_gnuplot_helper.histogram(cmdfile,
obasename + "_msec"+ filename_extension,
'Time',
range(plot_index,plot_index + num_cases),
obasename + ".dat",
[-0.5,len_xargs + 0.5],
"milliseconds",
2,3,4,
case_titles)
rocsparse_bench_gnuplot_helper.histogram(cmdfile,
obasename + "_gflops"+ filename_extension,
'Performance',
range(plot_index,plot_index + num_cases),
obasename + ".dat",
[-0.5,len_xargs + 0.5],
"GFlops",
5,6,7,
case_titles)
rocsparse_bench_gnuplot_helper.histogram(cmdfile,
obasename + "_bandwitdh"+ filename_extension,
'Bandwidth',
range(plot_index,plot_index + num_cases),
obasename + ".dat",
[-0.5,len_xargs + 0.5],
"GBytes/s",
8,9,10,
case_titles)
    cmdfile.close()
rocsparse_bench_gnuplot_helper.call(obasename + ".gnuplot")
if verbose:
print('//rocsparse-bench-compare CLEANING')
if not debug:
os.remove(obasename + '.dat')
os.remove(obasename + '.gnuplot')
#
#
# MAIN
#
#
def main():
parser = argparse.ArgumentParser()
parser.add_argument('-o', '--obasename', required=False, default = 'a')
parser.add_argument('-v', '--verbose', required=False, default = False, action = "store_true")
parser.add_argument('-d', '--debug', required=False, default = False, action = "store_true")
user_args, case_names = parser.parse_known_args()
if len(case_names) < 2:
print('//rocsparse-bench-compare.error number of filenames provided is < 2, (num_cases = '+str(len(case_names))+')')
exit(1)
verbose=user_args.verbose
debug=user_args.debug
obasename = user_args.obasename
cases = []
num_cases = len(case_names)
case_titles = []
for case_index in range(num_cases):
case_titles.append(os.path.basename(os.path.splitext(case_names[case_index])[0]))
for case_index in range(num_cases):
with open(case_names[case_index],"r") as f:
cases.append(json.load(f))
# mytree = ET.parse('rocsparse-bench-csrmv.xml')
# myroot = mytree.getroot()
# print(len(myroot))
# for i in range(len(myroot)):
# for j in range(len(myroot[i])):
# print(myroot[i][j].attrib['cmd'])
# proc=subprocess.Popen(['bash', '-c', myroot[i][j].attrib['cmd']])
# proc.wait()
# rc = proc.returncode
# if rc != 0:
# print('//rocsparse-bench-compare.error running cmd')
# exit(1)
# return
cmd = [case['cmdline'] for case in cases]
xargs = [case['xargs'] for case in cases]
yargs = [case['yargs'] for case in cases]
case_results = [case['results'] for case in cases]
num_samples = len(case_results[0])
len_xargs = len(xargs[0])
if verbose:
print('//rocsparse-bench-compare INPUT CASES')
for case_index in range(num_cases):
print('//rocsparse-bench-compare - case'+str(case_index) +' : \'' + case_names[case_index] + '\'')
print('//rocsparse-bench-compare CHECKING')
####
# for i in range(1,num_cases):
# if cmd[0] != cmd[i]:
# print('cmdlines must be equal, cmdline from file \''+case_names[i]+'\' is not equal to cmdline from file \''+case_names[0]+'\'')
# exit(1)
# if verbose:
# print('//rocsparse-bench-compare - cmdlines checked.')
####
for case_index in range(1,num_cases):
if xargs[0] != xargs[case_index]:
print('xargs\'s must be equal, xargs from case \''+case_names[case_index]+'\' is not equal to xargs from case \''+case_names[0]+'\'')
exit(1)
if verbose:
print('//rocsparse-bench-compare - xargs checked.')
####
for case_index in range(1,num_cases):
if yargs[0] != yargs[case_index]:
print('yargs\'s must be equal, yargs from case \''+case_names[case_index]+'\' is not equal to yargs from case \''+case_names[0]+'\'')
exit(1)
if verbose:
print('//rocsparse-bench-compare - yargs checked.')
####
for case_index in range(1,num_cases):
if num_samples != len(case_results[case_index]):
print('num_samples\'s must be equal, num_samples from case \''+case_names[case_index]+'\' is not equal to num_samples from case \''+case_names[0]+'\'')
exit(1)
if verbose:
print('//rocsparse-bench-compare - num samples checked.')
####
if verbose:
print('//rocsparse-bench-compare - write data file : \'' + obasename + '.dat\'')
export_gnuplot(obasename,
xargs[0],
yargs[0],
case_results,
case_titles,
verbose,
debug)
if __name__ == "__main__":
main()
# ===== File: score/score_patients.py (repo: jacobdeasy/icu-score, license: MIT) =====
import argparse
import os

import numpy as np
import pandas as pd
from oasis import *
from saps2 import *
def score_patients(score_name, root, partition, out_dir='scores'):
"""Score a directory of patient timeseries."""
    score_func = eval(score_name + '_score')
    ts_files = sorted([f for f in os.listdir(root) if f != 'listfile.csv'])
scores = np.zeros(len(ts_files))
for i, ts_file in enumerate(ts_files):
ts = pd.read_csv(os.path.join(root, ts_file), dtype={'icd9': str})
        if ts['Hours'].min() > 24:
            # No info before 24 hours; leave the score at 0
            continue
        if ts.loc[(0 < ts['Hours']) & (ts['Hours'] < 24)].shape[0] == 0:
            # No info after admission; fall back to the last pre-admission row
            ts = ts.loc[ts['Hours'] < 0].iloc[-1, :]
        else:
            ts = ts.loc[(0 < ts['Hours']) & (ts['Hours'] < 24)]
scores[i] = score_func(ts)
score_arr = np.stack((np.array(ts_files), scores), axis=1)
score_df = pd.DataFrame(score_arr, columns=['stay', 'score'])
score_df.to_csv(
os.path.join(out_dir, f'{partition}_{score_name}_scores.csv'),
        index=False)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Score ICU patients.')
parser.add_argument('score_name', type=str,
help='ICU severity score name')
parser.add_argument('data', type=str,
help='path to patient directory')
parser.add_argument('--out', type=str, default='scores',
help='output directory')
args = parser.parse_args()
if not os.path.exists(args.out):
os.makedirs(args.out)
    score_patients(args.score_name, args.data, 'test', out_dir=args.out)
    score_patients(args.score_name, args.data, 'train', out_dir=args.out)
# ===== File: dems/aster/create_shapefile_l1a_dl.py (repo: subond/ww_tvol_study, license: MIT) =====
"""
@author: hugonnet
create shapefiles of union-cascaded tiles to download ASTER L1A data on EarthData Search
"""
import numpy as np
import os
import pandas as pd
from vectlib import write_poly_to_shp, polygon_list_to_multipoly, union_cascaded_multipoly
from tiledivlib import stack_tile_polygon
def main(in_tile_list, out_shp, min_inters_area):
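    """Write a zipped ESRI shapefile of the cascaded union of all tiles whose
    intersecting area exceeds min_inters_area."""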
    print('Recovering tile IDs with intersecting area greater than ' + str(min_inters_area) + '...')
    # recover the list of tiles whose intersecting area exceeds the threshold
    df = pd.read_csv(in_tile_list)  # fetch the tile list as a DataFrame
    tilelist = df['Tile_name']
    chk = df['Tot_area_intersecting [km2]']
    ind = chk[chk > min_inters_area].index  # filter the Series according to the criterion
    list_tiles = tilelist[ind].tolist()
    # create the stack of tile polygons
    _, list_poly = stack_tile_polygon(list_tiles, True)
    # merge them into a multipolygon
    multipoly = polygon_list_to_multipoly(list_poly)
    # derive the cascaded union
    cascpoly = union_cascaded_multipoly(multipoly)
    # write ESRI files to a folder and zip it
    write_poly_to_shp(cascpoly, out_shp, os.path.basename(out_shp), True)
if __name__ == '__main__':
rgi_naming_txt = '/home/atom/proj/aster_tdem/worldwide/rgi_neighb_merged_naming_convention.txt'
main_dir = '/home/atom/proj/aster_tdem/worldwide/'
text_file = open(rgi_naming_txt, 'r')
rgi_list = text_file.readlines()
for rgi_counter in np.arange(len(rgi_list)):
rgi_region = rgi_list[rgi_counter]
in_csv = os.path.join(main_dir, rgi_region[:-1].split('rgi60')[0] + 'rgi60',
'list_glacierized_tiles_' + rgi_region[:-1].split('rgi60')[0] + 'rgi60' + '.csv')
out_shp = os.path.join(main_dir, rgi_region[:-1].split('rgi60')[0] + 'rgi60',
'L1A_glacierized_extended_' + rgi_region[:-1].split('rgi60')[0] + 'rgi60')
min_glacier_area = 0.
main(in_csv, out_shp, min_glacier_area)
# ===== File: chapter17/full_system/face_track_behavior.py (repo: dannystaple/Learn-Robotics-Programming-Second-Edition, license: MIT) =====
import time
from image_app_core import start_server_process, get_control_instruction, put_output_image
import cv2
import os
import camera_stream
from pid_controller import PIController
from robot import Robot
class FaceTrackBehavior:
"""Behavior to find and point at a face."""
def __init__(self, robot):
self.robot = robot
cascade_path = "/usr/local/lib/python3.7/dist-packages/cv2/data/haarcascade_frontalface_default.xml"
assert os.path.exists(cascade_path), f"File {cascade_path} not found"
self.cascade = cv2.CascadeClassifier(cascade_path)
# Tuning values
self.center_x = 160
self.center_y = 120
self.min_size = 20
self.pan_pid = PIController(proportional_constant=0.1, integral_constant=0.03)
self.tilt_pid = PIController(proportional_constant=-0.1, integral_constant=-0.03)
# Current state
self.running = False
def process_control(self):
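        """Poll the web app for start/stop/exit control instructions."""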
instruction = get_control_instruction()
if instruction:
command = instruction['command']
if command == "start":
self.running = True
elif command == "stop":
self.running = False
self.pan_pid.reset()
self.tilt_pid.reset()
self.robot.servos.stop_all()
elif command == "exit":
print("Stopping")
exit()
def find_object(self, original_frame):
"""Search the frame for an object. Return the rectangle of the largest by w * h"""
gray_img = cv2.cvtColor(original_frame, cv2.COLOR_BGR2GRAY)
objects = self.cascade.detectMultiScale(gray_img)
largest = 0, (0, 0, 0, 0) # area, x, y, w, h
for (x, y, w, h) in objects:
item_area = w * h
if item_area > largest[0]:
largest = item_area, (x, y, w, h)
return largest[1]
def make_display(self, display_frame):
encoded_bytes = camera_stream.get_encoded_bytes_for_frame(display_frame)
put_output_image(encoded_bytes)
def process_frame(self, frame):
(x, y, w, h) = self.find_object(frame)
        cv2.rectangle(frame, (x, y), (x + w, y + h), [255, 0, 0])
self.make_display(frame)
return x, y, w, h
def run(self):
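        """Main loop: stream camera frames, track the largest detected face, and
        steer the pan/tilt servos with the PI controllers."""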
camera = camera_stream.setup_camera()
time.sleep(0.1)
print("Setup Complete")
for frame in camera_stream.start_stream(camera):
(x, y, w, h) = self.process_frame(frame)
self.process_control()
if self.running and h > self.min_size:
pan_error = self.center_x - (x + (w / 2))
pan_value = self.pan_pid.get_value(pan_error)
self.robot.set_pan(int(pan_value))
                tilt_error = self.center_y - (y + (h / 2))
tilt_value = self.tilt_pid.get_value(tilt_error)
self.robot.set_tilt(int(tilt_value))
print(f"x: {x}, y: {y}, pan_error: {pan_error}, tilt_error: {tilt_error}, pan_value: {pan_value:.2f}, tilt_value: {tilt_value:.2f}")
print("Setting up")
behavior = FaceTrackBehavior(Robot())
process = start_server_process('color_track_behavior.html')
try:
behavior.run()
finally:
process.terminate()
# ===== File: Others/findOdds.py (repo: johanaluna/DataScience_summary, license: MIT) =====
"""
Merge overlapping meeting time ranges into a minimal set of (start, end) ranges.
"""
import unittest
def merge_ranges(meetings):
    ## Meeting(start, end)
    # sort the array
    meetings.sort()
    # put a pointer at the first position
    i = 0
    # if we receive just one meeting or an empty list, return the array as-is
    if len(meetings) < 2:
        return meetings
    # go through the array up to the last position minus one,
    # because we compare the current meeting against the next one
    while i < len(meetings) - 1:
        # if the end hour of the current meeting is at or after
        # the start hour of the next meeting, the two overlap (or touch)
        if meetings[i][1] >= meetings[i + 1][0]:
            # keep the start hour of the current meeting
            start = meetings[i][0]
            # keep the later of the two end hours as the merged end
            end = max(meetings[i][1], meetings[i + 1][1])
            # store the merged (start, end) back at the current position
            meetings[i] = (start, end)
            # delete the next meeting, since it was merged into the current one
            del meetings[i + 1]
        else:
            i += 1
    return meetings
# Tests
class Test(unittest.TestCase):
def test_meetings_overlap(self):
actual = merge_ranges([(1, 3), (2, 4)])
expected = [(1, 4)]
self.assertEqual(actual, expected)
def test_meetings_touch(self):
actual = merge_ranges([(5, 6), (6, 8)])
expected = [(5, 8)]
self.assertEqual(actual, expected)
def test_meeting_contains_other_meeting(self):
actual = merge_ranges([(1, 8), (2, 5)])
expected = [(1, 8)]
self.assertEqual(actual, expected)
def test_meetings_stay_separate(self):
actual = merge_ranges([(1, 3), (4, 8)])
expected = [(1, 3), (4, 8)]
self.assertEqual(actual, expected)
def test_multiple_merged_meetings(self):
actual = merge_ranges([(1, 4), (2, 5), (5, 8)])
expected = [(1, 8)]
self.assertEqual(actual, expected)
def test_meetings_not_sorted(self):
actual = merge_ranges([(5, 8), (1, 4), (6, 8)])
expected = [(1, 4), (5, 8)]
self.assertEqual(actual, expected)
def test_one_long_meeting_contains_smaller_meetings(self):
actual = merge_ranges([(1, 10), (2, 5), (6, 8), (9, 10), (10, 12)])
expected = [(1, 12)]
self.assertEqual(actual, expected)
def test_sample_input(self):
actual = merge_ranges([(0, 1), (3, 5), (4, 8), (10, 12), (9, 10)])
expected = [(0, 1), (3, 8), (9, 12)]
self.assertEqual(actual, expected)
if __name__ == '__main__':
    unittest.main(verbosity=2)
# ===== File: scripts/weigh_beads/weigh_bead_efield.py (repo: charlesblakemore/opt_lev_analysis, license: MIT) =====
import os
import fnmatch
import sys
import dill as pickle
import scipy.interpolate as interp
import scipy.optimize as opti
import scipy.constants as constants
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.mlab as mlab
import bead_util as bu
import configuration as config
import transfer_func_util as tf
plt.rcParams.update({'font.size': 14})
dirs = ['/data/20180927/bead1/weigh_bead_dc/ramp_top_negative_bottom_at_p100', \
'/data/20180927/bead1/weigh_bead_dc/ramp_top_negative_bottom_at_p100_10_repeats'
]
dirs = ['/data/20180927/bead1/weigh_bead_20e_10v_bottom_constant', \
]
#
V2 = 100.0
amp_gain = 200 #????
#dirs = ['/data/20181119/bead1/mass_meas/neg_charge_2', \
# ]
dirs = ['/data/20181119/bead1/mass_meas/pos_charge_1', \
]
pos = True
mon_fac = 200
maxfiles = 1000 # Many more than necessary
lpf = 2500 # Hz
file_inds = (0, 500)
userNFFT = 2**12
diag = False
fullNFFT = False
###########################################################
power_dat = np.loadtxt('/power_v_bits/20181119_init.txt', delimiter=',')
bits_to_power = interp.interp1d(power_dat[0], power_dat[1])
e_top_dat = np.loadtxt('/calibrations/e-top_1V_optical-axis.txt', delimiter=',')
e_top_func = interp.interp1d(e_top_dat[0], e_top_dat[1])
e_bot_dat = np.loadtxt('/calibrations/e-bot_1V_optical-axis.txt', delimiter=',')
e_bot_func = interp.interp1d(e_bot_dat[0], e_bot_dat[1])
def line(x, a, b):
return a * x + b
def weigh_bead_efield(files, colormap='jet', sort='time', file_inds=(0,10000), \
pos=False):
    '''Loops over a list of file names, loads each file, extracts the electric
    force on the charged bead and the mean levitation feedback power, then fits
    power vs. force to a line and infers the bead mass from the fit.

       INPUTS: files, list of file names to extract data from
               file_inds, tuple selecting a slice of the time-sorted file list
               pos, boolean specifying whether the bead charge is positive
               (colormap and sort are currently unused)

       OUTPUTS: none, prints the implied mass and plots the fit
    '''
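    # Sort the file list by creation time (oldest first) before slicing.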
files = [(os.stat(path), path) for path in files]
files = [(stat.st_ctime, path) for stat, path in files]
files.sort(key = lambda x: (x[0]))
files = [obj[1] for obj in files]
files = files[file_inds[0]:file_inds[1]]
#files = files[::10]
date = files[0].split('/')[2]
charge_file = '/calibrations/charges/' + date
if pos:
charge_file += '_recharge.charge'
else:
charge_file += '.charge'
q_bead = np.load(charge_file)[0] * constants.elementary_charge
print(q_bead / constants.elementary_charge)
run_index = 0
masses = []
nfiles = len(files)
print("Processing %i files..." % nfiles)
eforce = []
power = []
    for fil_ind, fil in enumerate(files):
bu.progress_bar(fil_ind, nfiles)
# Load data
df = bu.DataFile()
try:
df.load(fil, load_other=True)
except:
continue
df.calibrate_stage_position()
df.calibrate_phase()
if fil_ind == 0:
init_phi = np.mean(df.zcal)
top_elec = mon_fac * np.mean(df.other_data[6])
bot_elec = mon_fac * np.mean(df.other_data[7])
# Synth plugged in negative so just adding instead of subtracting negative
Vdiff = V2 + amp_gain * df.synth_settings[0]
Vdiff = np.mean(df.electrode_data[2]) - np.mean(df.electrode_data[1])
Vdiff = top_elec - bot_elec
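        # NOTE: only this last Vdiff (top minus bottom electrode monitor) is kept;
        # the two assignments above are overwritten.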
force = - (Vdiff / (4.0e-3)) * q_bead
force2 = (top_elec * e_top_func(0.0) + bot_elec * e_bot_func(0.0)) * q_bead
try:
mean_fb = np.mean(df.pos_fb[2])
mean_pow = bits_to_power(mean_fb)
except:
continue
#eforce.append(force)
eforce.append(force2)
power.append(mean_pow)
eforce = np.array(eforce)
power = np.array(power)
power = power / np.mean(power)
inds = np.abs(eforce) < 2e-13
eforce = eforce[inds]
power = power[inds]
popt, pcov = opti.curve_fit(line, eforce*1e13, power, \
absolute_sigma=False, maxfev=10000)
test_vals = np.linspace(np.min(eforce*1e13), np.max(eforce*1e13), 100)
fit = line(test_vals, *popt)
lev_force = -popt[1] / (popt[0] * 1e13)
mass = lev_force / (9.806)
mass_err = np.sqrt( pcov[0,0] / popt[0]**2 + \
pcov[1,1] / popt[1]**2 + \
np.abs(pcov[0,1]) / np.abs(popt[0]*popt[1]) ) * mass
#masses.append(mass)
print(mass * 1e12)
print(mass_err * 1e12)
plt.figure()
plt.plot(eforce, power, 'o')
plt.xlabel('Elec. Force [N]', fontsize=14)
plt.ylabel('Levitation Power [arb]', fontsize=14)
plt.tight_layout()
plt.plot(test_vals*1e-13, fit, lw=2, color='r', \
label='Implied mass: %0.3f ng' % (mass*1e12))
plt.legend()
plt.show()
#print np.mean(masses) * 1e12
#print np.std(masses) * 1e12
allfiles, lengths = bu.find_all_fnames(dirs, sort_time=True)
weigh_bead_efield(allfiles, pos=pos)
# ===== File: python/scripts/train.py (repo: darwinbeing/deepdriving-tensorflow, license: MIT) =====
# The MIT license:
#
# Copyright 2017 Andre Netzeband
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
# documentation files (the "Software"), to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and
# to permit persons to whom the Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all copies or substantial portions of
# the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO
# THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
#
# Note: The DeepDriving project on this repository is derived from the DeepDriving project devloped by the princeton
# university (http://deepdriving.cs.princeton.edu/). The above license only applies to the parts of the code, which
# were not a derivative of the original DeepDriving project. For the derived parts, the original license and
# copyright is still valid. Keep this in mind, when using code from this project.
import time
import misc.settings
import deep_learning as dl
import deep_driving.model as model
class CTrainSettings(misc.settings.CSettings):
_Dict = {
'Data': {
'TrainingPath': "../../../training",
'ValidatingPath': "../../../validation",
'BatchSize': 128,
'ImageWidth': 210,
'ImageHeight': 280
},
'Trainer': {
'EpochSize': 10000,
'NumberOfEpochs': 50,
'SummaryPath': 'Summary',
'CheckpointPath': 'Checkpoint',
'CheckpointEpochs': 10,
},
'Optimizer':{
'StartingLearningRate': 0.005,
'EpochsPerDecay': 10,
'LearnRateDecay': 0.95,
'WeightDecay': 0.004,
'Momentum': 0.9
},
'Validation': {
'Samples': 1000
},
'PreProcessing':
{
'MeanFile': 'image-mean.tfrecord'
},
}
SettingFile = "train.cfg"
IsRetrain = True
def main():
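  """Build the trainer from the settings file and run the training loop."""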
Settings = CTrainSettings(SettingFile)
dl.summary.cleanSummary(Settings['Trainer']['SummaryPath'], 30)
Model = dl.CModel(model.CAlexNet)
Trainer = Model.createTrainer(model.CTrainer, model.CReader, model.CError, Settings)
Trainer.addPrinter(model.CPrinter())
Trainer.addSummaryMerger(model.CMerger())
if not IsRetrain:
Trainer.restore()
#Trainer.restore(3)
StartTime = time.time()
Trainer.train()
DeltaTime = time.time() - StartTime
print("Training took {}s ({})".format(DeltaTime, misc.time.getStringFromTime(DeltaTime)))
if __name__ == "__main__":
  main()
# ===== File: tests/python/test_kyber.py (repo: jyao1/qrllib, license: MIT) =====
# Distributed under the MIT software license, see the accompanying
# file LICENSE or http://www.opensource.org/licenses/mit-license.php.
from __future__ import print_function
import unittest
from unittest import TestCase
from pyqrllib.kyber import Kyber
class TestKyber(TestCase):
def __init__(self, *args, **kwargs):
super(TestKyber, self).__init__(*args, **kwargs)
def test_exchange_keys(self):
alice = Kyber()
bob = Kyber()
# Alice sends her public key to Bob
alice_public_key = alice.getPK()
# Bob receives the public key, derives a secret and a response
bob.kem_encode(alice_public_key)
cypherText = bob.getCypherText()
# Bob sends the cyphertext to alice
valid = alice.kem_decode(cypherText)
# Now Alice and Bob share the same key
alice_key = alice.getMyKey()
bob_key = bob.getMyKey()
self.assertTrue(valid)
self.assertEqual(alice_key, bob_key)
if __name__ == '__main__':
unittest.main()
# ===== File: samples/sample_capture_hdr.py (repo: Skylion007/zivid-python, license: BSD-3-Clause) =====
"""HDR capture sample."""
from zivid import Application, Settings
def _main():
app = Application()
camera = app.connect_camera()
settings = Settings(
acquisitions=[
Settings.Acquisition(aperture=aperture) for aperture in (10.90, 5.80, 2.83)
]
)
with camera.capture(settings) as hdr_frame:
hdr_frame.save("result.zdf")
if __name__ == "__main__":
_main()
# ===== File: fastdeploy/_app.py (repo: notAI-tech/fastDeploy-core, license: MIT) =====
from gevent import monkey
monkey.patch_all()
import gevent
import gevent.pool
import os
import time
import uuid
import ujson
import falcon
import logging
import datetime
import mimetypes
from functools import partial
from . import _utils
while "time_per_example" not in _utils.META_INDEX:
_utils.logger.info(f"Waiting for batch size search to finish.")
time.sleep(5)
ONLY_ASYNC = os.getenv("ONLY_ASYNC", False)
TIME_PER_EXAMPLE = _utils.META_INDEX["time_per_example"]
IS_FILE_INPUT = _utils.META_INDEX["IS_FILE_INPUT"]
def wait_and_read_pred(unique_id):
"""
Waits for and reads result for unique_id.
:param unique_id: unique_id of the input
:return response: json dumped python dict with keys "success" and "prediction"/ "reason"
:return status: HTTP status code
"""
# Keeping track of start_time for TIMEOUT implementation
start_time = time.time()
# Default response and status
response, status = (
{"success": False, "reason": "timeout"},
falcon.HTTP_503,
)
while True:
try:
# if result doesn't exist for this uuid, while loop continues/
pred, metrics = _utils.RESULTS_INDEX[unique_id]
try:
response = {"prediction": pred, "success": True}
# if return dict has any non json serializable values, we str() it
except:
_utils.logger.info(
f"unique_id: {unique_id} could not json serialize the result."
)
response = {"prediction": str(pred), "success": True}
status = falcon.HTTP_200
break
except:
# stop in case of timeout
if time.time() - start_time >= _utils.TIMEOUT:
                _utils.logger.warning(
f"unique_id: {unique_id} timedout, with timeout {_utils.TIMEOUT}"
)
break
gevent.time.sleep(TIME_PER_EXAMPLE * 0.501)
metrics = {}
return response, status, metrics
class Infer(object):
def on_post(self, req, resp):
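        """Accept a JSON body or a multipart file upload, queue it for the
        predictor loop, and either return a unique_id (async) or wait for
        the prediction (sync)."""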
try:
unique_id = str(uuid.uuid4())
req_params = req.params
is_async_request = ONLY_ASYNC or req_params.get("async")
_extra_options_for_predictor = {}
if (req.content_type == "application/json" and IS_FILE_INPUT) or (
req.content_type != "application/json" and not IS_FILE_INPUT
):
if IS_FILE_INPUT:
resp.media = {
"success": False,
"reason": f"Received json input. Expected multi-part file input.",
}
else:
resp.media = {
"success": False,
"reason": f"Received multi-part file input. Expected json input.",
}
resp.status = falcon.HTTP_400
else:
if req.content_type == "application/json":
in_data = req.media
try:
# Legacy. use data in "data" key if exists
in_data = in_data["data"]
except:
pass
_in_file_names = [None for _ in range(len(in_data))]
else:
in_data = []
_in_file_names = []
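                    # Walk the multipart form: parts without a filename carry extra
                    # options for the predictor (when ACCEPTS_EXTRAS); file parts
                    # are streamed to temporary files on disk.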
for part in req.get_media():
if not part.filename and _utils.META_INDEX["ACCEPTS_EXTRAS"]:
try:
_extra_options_for_predictor.update(
ujson.loads(part.text)
)
except:
pass
else:
_in_file_names.append(part.name)
_temp_file_path = (
f"{uuid.uuid4()}{os.path.splitext(part.filename)[1]}"
)
_temp_file = open(_temp_file_path, "wb")
while True:
chunk = part.stream.read(2048)
if not chunk:
break
_temp_file.write(chunk)
_temp_file.flush()
_temp_file.close()
in_data.append(_temp_file_path)
metrics = {
"received": time.time(),
"prediction_start": -1,
"prediction_end": -1,
"batch_size": len(in_data),
"predicted_in_batch": -1,
"responded": -1,
}
_utils.REQUEST_INDEX[unique_id] = (
in_data,
metrics,
[_extra_options_for_predictor.get(_) for _ in _in_file_names],
)
_utils.META_INDEX["TOTAL_REQUESTS"] += 1
if is_async_request:
resp.media = {"unique_id": unique_id, "success": True}
resp.status = falcon.HTTP_200
else:
preds, status, _metrics = wait_and_read_pred(unique_id)
if not len(_metrics):
_metrics = metrics
_metrics["responded"] = time.time()
_utils.METRICS_CACHE[len(_utils.METRICS_CACHE)] = (
unique_id,
_metrics,
in_data,
)
resp.media = preds
resp.status = status
except Exception as ex:
_utils.logger.exception(ex, exc_info=True)
resp.media = {"success": False, "reason": "malformed request"}
resp.status = falcon.HTTP_400
class Metrics(object):
def on_get(self, req, resp):
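        """Assemble response-time and auto-batching graph data for the dashboard
        from the cached per-request metrics."""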
try:
end_time = int(req.params.get("from_time", time.time()))
total_time = int(req.params.get("total_time", 3600))
loop_batch_size = _utils.META_INDEX["batch_size"]
batch_size_to_time_per_example = _utils.META_INDEX[
"batch_size_to_time_per_example"
]
first_end_time = 0
all_metrics_in_time_period = {
"time_graph_data": {
"labels": [],
"datasets": [
{"name": "Response time", "values": []},
{"name": "Prediction time", "values": []},
],
},
"auto_batching_graph_data": {
"labels": [],
"datasets": [
{"name": "Input batch size", "values": []},
{"name": "Dynamically batched to", "values": []},
],
},
"index_to_all_meta": {},
}
current_time = time.time()
n_metrics = len(_utils.METRICS_CACHE)
total_requests = n_metrics
for _ in reversed(range(n_metrics)):
unique_id, _metrics, _in_data = _utils.METRICS_CACHE[_]
                # at most a 5 second loop allowed
if time.time() - current_time >= 5:
break
received_time = _metrics["received"]
prediction_start = _metrics["prediction_start"]
prediction_end = _metrics["prediction_end"]
batch_size = _metrics["batch_size"]
predicted_in_batch = _metrics["predicted_in_batch"]
responded_at = _metrics["responded"]
prediction_time_per_example = (
prediction_end - prediction_start
) / predicted_in_batch
if current_time - received_time >= total_time:
break
if received_time <= 0 or responded_at <= 0:
continue
x_id = total_requests - len(
all_metrics_in_time_period["index_to_all_meta"]
)
all_metrics_in_time_period["auto_batching_graph_data"]["labels"].insert(
0, x_id
)
all_metrics_in_time_period["auto_batching_graph_data"]["datasets"][0][
"values"
].insert(0, batch_size)
all_metrics_in_time_period["auto_batching_graph_data"]["datasets"][1][
"values"
].insert(0, predicted_in_batch)
all_metrics_in_time_period["time_graph_data"]["labels"].insert(0, x_id)
all_metrics_in_time_period["time_graph_data"]["datasets"][0][
"values"
].insert(0, responded_at - received_time)
all_metrics_in_time_period["time_graph_data"]["datasets"][1][
"values"
].insert(
0,
batch_size
* (prediction_end - prediction_start)
/ predicted_in_batch,
)
all_metrics_in_time_period["index_to_all_meta"][
n_metrics - len(all_metrics_in_time_period["index_to_all_meta"])
] = {
"unique_id": unique_id,
"received_time": str(
datetime.datetime.fromtimestamp(received_time)
),
"prediction_time_per_example": prediction_time_per_example,
"batch_size": batch_size,
"start_to_end_time": responded_at - received_time,
"predicted_in_batch": predicted_in_batch,
}
resp.media = all_metrics_in_time_period
resp.status = falcon.HTTP_200
except Exception as ex:
logging.exception(ex, exc_info=True)
pass
ALL_META = {}
for k, v in _utils.META_INDEX.items():
ALL_META[k] = v
ALL_META["is_file_input"] = IS_FILE_INPUT
ALL_META["example"] = _utils.example
class Meta(object):
def on_get(self, req, resp):
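        """Return the model metadata, or stream the bundled example file when ?example=true."""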
if req.params.get("example") == "true":
resp.content_type = mimetypes.guess_type(_utils.example[0])[0]
resp.stream = open(_utils.example[0], "rb")
resp.downloadable_as = os.path.basename(_utils.example[0])
else:
resp.media = ALL_META
resp.status = falcon.HTTP_200
class Res(object):
def on_post(self, req, resp):
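        """Result-polling endpoint for async requests: returns the prediction once ready."""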
try:
unique_id = req.media["unique_id"]
_utils.logger.info(f"unique_id: {unique_id} Result request received.")
try:
pred, metrics = _utils.RESULTS_INDEX[unique_id]
resp.media = {"success": True, "prediction": pred}
except:
if unique_id in _utils.REQUEST_INDEX:
resp.media = {"success": None, "reason": "processing"}
else:
resp.media = {
"success": False,
"reason": "No request found with this unique_id",
}
resp.status = falcon.HTTP_200
except Exception as ex:
_utils.logger.exception(ex, exc_info=True)
resp.media = {"success": False, "reason": str(ex)}
resp.status = falcon.HTTP_400
json_handler = falcon.media.JSONHandler(
loads=ujson.loads, dumps=partial(ujson.dumps, ensure_ascii=False)
)
extra_handlers = {
"application/json": json_handler,
}
app = falcon.App(cors_enable=True)
app.req_options.media_handlers.update(extra_handlers)
app.resp_options.media_handlers.update(extra_handlers)
app.req_options.auto_parse_form_urlencoded = True
app = falcon.App(
middleware=falcon.CORSMiddleware(
allow_origins=_utils.ALLOWED_ORIGINS, allow_credentials=_utils.ALLOWED_ORIGINS
)
)
infer_api = Infer()
res_api = Res()
metrics_api = Metrics()
meta_api = Meta()
app.add_route("/infer", infer_api)
app.add_route("/result", res_api)
app.add_route("/metrics", metrics_api)
app.add_route("/meta", meta_api)
app.add_static_route(
"/",
_utils.FASTDEPLOY_UI_PATH,
fallback_filename="index.html",
)
# Backwards compatibility
app.add_route("/sync", infer_api)
from geventwebsocket import WebSocketApplication
class WebSocketInfer(WebSocketApplication):
def on_open(self):
self.connection_id = f"{uuid.uuid4()}"
self.n = 0
self.start_time = time.time()
_utils.logger.info(f"{self.connection_id} websocket connection opened.")
def on_message(self, message):
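        """Queue each websocket message as a single-input request and send the
        prediction back over the socket."""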
self.n += 1
try:
if message is not None:
metrics = {
"received": time.time(),
"prediction_start": -1,
"prediction_end": -1,
"batch_size": 1,
"predicted_in_batch": -1,
"responded": -1,
}
message_id = f"{self.connection_id}.{self.n}"
_utils.REQUEST_INDEX[message_id] = ([message], metrics)
preds, status, metrics = wait_and_read_pred(message_id)
if "prediction" in preds:
preds["prediction"] = preds["prediction"][0]
self.ws.send(ujson.dumps(preds))
except Exception as ex:
_utils.logger.exception(ex, exc_info=True)
pass
def on_close(self, reason):
_utils.logger.info(
f"{self.connection_id} websocket connection closed. Time spent: {time.time() - self.start_time} n_mesages: {self.n}"
)
pass
# ===== File: JSON/dumps_loads.py (repo: SpenceGuo/py3-learning, license: Apache-2.0) =====
"""
json.dumps and json.loads examples
The following example demonstrates converting a Python data structure to JSON:
"""
import json
data = {
"no": 1,
"name": "SPENCE",
"url": "https://www.google.com"
}
json_str = json.dumps(data)
print("Python 原始数据:", repr(data))
print("JSON对象:", json_str)
# From the output you can see that simple types, once encoded, look very similar
# to their original repr() output.
#
# Continuing the example above, we can convert a JSON-encoded string back into a
# Python data structure:
# Convert the JSON object into a Python dict
data2 = json.loads(json_str)
print("data2['name']: ", data2['name'])
print("data2['url']: ", data2['url'])
# If you are working with files instead of strings, you can use json.dump() and
# json.load() to encode and decode JSON data. For example:
# Write JSON data
with open('data.json', 'w') as f:
json.dump(data, f)
# Read the data back
with open('data.json', 'r') as f:
data = json.load(f)
# ===== File: app/api/init_db.py (repo: med-cab-1/machine_learning2, license: MIT) =====
#!/usr/bin/python
"""
File containing functions for database creation and management
"""
# IMPORTS
import pandas as pd
import sqlite3
def create_db():
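    """Load the cannabis strain CSV into a SQLite table named Cannabis."""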
print('Inside init_db file')
#df = pd.read_csv('../../Data/cannabis_new.csv')
df = pd.read_csv('Data/cannabis_new.csv')
print(df.head())
df = df.rename(columns={'Index': 'Strain_ID'})
#conn = sqlite3.connect('../../Data/cannabis.sqlite3')
conn = sqlite3.connect('Data/cannabis.sqlite3')
curs = conn.cursor()
curs.execute("DROP TABLE IF EXISTS Cannabis")
df.to_sql('Cannabis', con=conn)
# curs.close()
# conn.close()
print('Database Created!')
def say_hi():
print("Hello!")
if __name__ == '__main__':
create_db()
say_hi()
# ===== File: test.py (repo: cforclown/identifikasi-kualitas-daging-sapi-unggul-dgn-menggunakan-metode-regionprops, license: MIT) =====
# TESTING SCRIPT
import cv2
def checkQuality(contour, filteredFrame):
    """Scan the bounding-box ROI of a contour and print its non-empty pixels."""
    x, y, w, h = cv2.boundingRect(contour)
    # boundingRect returns width/height, not the far corner coordinates
    roi = filteredFrame[y:y + h, x:x + w]
    for row in range(0, roi.shape[0]):
        for col in range(0, roi.shape[1]):
            pixel = roi[row, col]
            if pixel is not None and pixel[0] != 0:
                print(pixel)
def coordsMouseDisp(event, x, y, flags, param):
    # left mouse double click
    if event == cv2.EVENT_LBUTTONDBLCLK:
        if img is not None and hsv is not None:
            # NumPy images are indexed [row, col], i.e. [y, x]
            print("Original BGR:", img[y, x])
            print("HSV values:", hsv[y, x])
def main():
global img
global hsv
img=cv2.imread('./Resources/Templates/template-1.jpg')
hsv=cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
while True:
cv2.imshow("Original", img)
cv2.imshow("HSV", hsv)
        # left mouse double-click event
        cv2.setMouseCallback("HSV", coordsMouseDisp)
        cv2.setMouseCallback("Original", coordsMouseDisp)
        if cv2.waitKey(1) & 0xFF == ord("q"):
cv2.destroyAllWindows()
break
if __name__ == "__main__":
main()
# ===== File: src/pollution/random_delay.py (repo: ccberg/LA, license: MIT) =====
import numpy as np
from numpy import random
from tqdm import tqdm
from src.pollution.tools import max_data
from src.trace_set.database import Database
from src.trace_set.set_hw import TraceSetHW
A = 5
B = 3
DELAY_AMP = 10
def random_delay(traces: np.ndarray, a: int = A, b: int = B, delay_amplitude: int = DELAY_AMP, delay_probability=.5):
"""
Based on the implementation of L. Wu & S. Picek (2020): "Remove Some Noise: On Pre-processing of Side-channel
Measurements with Autoencoders."
"""
res = np.zeros_like(traces)
num_traces, trace_length = traces.shape
max_sp = np.max(traces)
norm_factor = max_data(traces) / (max_sp + delay_amplitude)
if norm_factor < 1:
traces = np.array(norm_factor * traces, dtype=traces.dtype)
delay_amplitude *= norm_factor
for ix, trace in tqdm(enumerate(traces), total=num_traces, desc=f"Random delay ({delay_probability})"):
sp_old, sp_new = 0, 0
        # Computing (too many) random variables all at once yields a >2x speed increase.
do_jitter = random.binomial(1, delay_probability, trace_length)
lower_bound = random.randint(0, a - b, size=trace_length + 1)
upper_bound = random.randint(0, b, size=trace_length + 1) + lower_bound
while sp_new < trace_length and sp_old < trace_length:
r = do_jitter[sp_new]
res[ix, sp_new] = trace[sp_old]
sp_old += 1
sp_new += 1
if r:
for _ in range(upper_bound[sp_old]):
if sp_new + 3 > trace_length:
continue
spike = trace[sp_old] + delay_amplitude
delay_sequence = [trace[sp_old], spike, trace[sp_old + 1]]
res[ix, sp_new:sp_new + 3] = delay_sequence
sp_new += 3
return res
if __name__ == '__main__':
random_delay(TraceSetHW(Database.ascad_none).profile()[0], delay_probability=0.0001)
# ===== File: test/test_add_contact_into_group.py (repo: Beumanc278/python_training, license: Apache-2.0) =====
import random
import allure
from data.contacts import testdata as contact_testdata
from data.groups import testdata as group_testdata
from model.contact import Contact
def test_add_contact_into_group(app, db, check_ui):
with allure.step("Given a non-empty group list and a non-empty contact list"):
if not app.contact.get_contact_list_from_group():
list(map(lambda contact: app.contact.create(contact), contact_testdata))
if not app.group.get_group_list():
list(map(lambda group: app.group.create(group), group_testdata))
contacts = app.contact.get_contact_list_from_group()
groups = app.group.get_group_list()
with allure.step("Given a random contact and a random group"):
contact_for_add = random.choice(contacts)
group_for_add = random.choice(groups)
with allure.step(f"When I add the randomly chosen contact {contact_for_add} to the randomly chosen group"):
app.contact.add_contact_to_group(contact_for_add, group_for_add)
with allure.step("Then the randomly chosen contact is in the chosen group"):
assert contact_for_add in db.get_contacts_in_group(group_for_add)
if check_ui:
new_contacts = db.get_contact_list()
assert sorted(new_contacts, key=Contact.id_or_max) == sorted(app.contact.get_contact_list(), key=Contact.id_or_max)
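# A sketch of the sorting key used above (an assumption about how
# Contact.id_or_max is typically implemented in this kind of test suite):
#
#     @staticmethod
#     def id_or_max(contact):
#         return int(contact.id) if contact.id else sys.maxsize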
| 50.925926 | 127 | 0.735273 | 209 | 1,375 | 4.588517 | 0.253589 | 0.043796 | 0.058394 | 0.062565 | 0.212722 | 0.068822 | 0.068822 | 0 | 0 | 0 | 0 | 0 | 0.182545 | 1,375 | 26 | 128 | 52.884615 | 0.853203 | 0 | 0 | 0 | 0 | 0 | 0.173091 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 1 | 0.043478 | false | 0 | 0.217391 | 0 | 0.26087 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57cfbe53259cd5e43ea07166dcdb24c9e092b86e | 898 | py | Python | exact/exact/base/context_processors.py | maubreville/Exact | 2f4ce50054bfe5350a106ef3fa1a2f03c90bbbef | [
"MIT"
] | 43 | 2020-01-29T17:19:21.000Z | 2022-03-29T11:11:32.000Z | exact/exact/base/context_processors.py | maubreville/Exact | 2f4ce50054bfe5350a106ef3fa1a2f03c90bbbef | [
"MIT"
] | 41 | 2020-01-31T09:31:31.000Z | 2022-02-24T15:55:21.000Z | exact/exact/base/context_processors.py | maubreville/Exact | 2f4ce50054bfe5350a106ef3fa1a2f03c90bbbef | [
"MIT"
] | 16 | 2020-02-11T18:26:32.000Z | 2021-07-30T09:05:15.000Z | from django.conf import settings
from exact.users.models import Team
from exact.tagger_messages.models import TeamMessage
from django.db.models import Q
def base_data(request):
show_datasets = settings.SHOW_DEMO_DATASETS
if request.user.is_authenticated:
my_teams = Team.objects.filter(members=request.user)
unread_message_count = 0
#unread_message_count = TeamMessage.in_range(TeamMessage.get_messages_for_user(request.user).filter(~Q(read_by=request.user))).count()
else:
my_teams = None
unread_message_count = 0
return {
'IMPRINT_URL': settings.IMPRINT_URL,
'USE_IMPRINT': settings.USE_IMPRINT,
'IMPRINT_NAME': settings.IMPRINT_NAME,
'TOOLS_ENABLED': settings.TOOLS_ENABLED,
'my_teams': my_teams,
'unread_message_count': unread_message_count,
        'show_datasets': show_datasets
}
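# For base_data() to run on every request, it must be listed in the Django
# TEMPLATES setting (standard Django wiring; the dotted path is this module):
#
#     TEMPLATES = [{
#         ...
#         'OPTIONS': {'context_processors': [
#             ...,
#             'exact.base.context_processors.base_data',
#         ]},
#     }]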
| 33.259259 | 142 | 0.717149 | 114 | 898 | 5.342105 | 0.429825 | 0.106732 | 0.147783 | 0.062397 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002766 | 0.194878 | 898 | 26 | 143 | 34.538462 | 0.839557 | 0.148107 | 0 | 0.095238 | 0 | 0 | 0.115183 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.190476 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d009409eca5ae7e11656bf0054fd7cba2f11fd | 4,859 | py | Python | src/windshape/drone/control/Controller.py | Adrien4193/windshape | 4c73a4a85409f04518029f0ddb8bd7e3c60e4905 | [
"BSD-2-Clause"
] | null | null | null | src/windshape/drone/control/Controller.py | Adrien4193/windshape | 4c73a4a85409f04518029f0ddb8bd7e3c60e4905 | [
"BSD-2-Clause"
] | null | null | null | src/windshape/drone/control/Controller.py | Adrien4193/windshape | 4c73a4a85409f04518029f0ddb8bd7e3c60e4905 | [
"BSD-2-Clause"
] | null | null | null | import math
import numpy
# ROS main library
import rospy
# Parameters
from ControlParameters import ControlParameters
# PID controller (SISO)
from PIDController import PIDController
# Low-pass filter
from ..common.LowPassFilter import LowPassFilter
class Controller(object):
"""MIMO controller to compute drone attitude to reach a position.
Called as a function.
The attitude is computed from the X, Y, Z and Yaw errors between the
desired setpoint and the drone pose measured by the mocap system
using four PIDs.
Input: x, y, z, roll, pitch, yaw (numpy.array[6]) [m, rad]
Output: roll, pitch, yaw, thrust (numpy.array[4]) [rad, (0-1)]
"Friend" class of ControlParameters.
Inherits from object.
Overrides __init__, __del__, __call__.
"""
def __init__(self):
"""Initializes drone PIDs (x, y, z, yaw) and LP filter."""
# Control attributes
self.__parameters = ControlParameters()
# Filter for manual position setpoint
par = rospy.get_param('~control/sp_filter')
self.__filter = LowPassFilter(par, numpy.zeros(6))
# Loads PID parameters
pars = rospy.get_param('~control/pid')
r, p, y, t = (pars['roll'], pars['pitch'], pars['yaw'],
pars['thrust'])
# Roll, pitch, yaw, thrust
self.__pids = [ PIDController(r['kp'], r['ki'], r['kd'],
r['min'], r['max'], r['ff']),
PIDController(p['kp'], p['ki'], p['kd'],
p['min'], p['max'], p['ff']),
PIDController(y['kp'], y['ki'], y['kd'],
y['min'], y['max'], y['ff']),
PIDController(t['kp'], t['ki'], t['kd'],
t['min'], t['max'], t['ff'])]
def __del__(self):
"""Does nothing special."""
pass
def __call__(self, pose, estimate):
"""Returns attitude (numpy.array[4]) to reach the setpoint.
Process:
        1 - Computes X, Y, Z, Yaw error.
        2 - Projects XY errors on drone body frame.
        3 - Sets yaw feed-forward as desired yaw from estimate.
        4 - Computes R, P, Y, T from X, Y, Yaw, Z errors using PIDs.
        5 - Returns R, P, Y, T.
Args:
pose (numpy.array[6]): Current real drone pose.
estimate (float): Pose estimated by FCU (can be shifted).
"""
setpoint = self.__parameters.getSetpoint().toArray()
# Filters manual setpoint (from current drone pose)
if not self.__parameters.isFollowingTarget():
setpoint = self.__filter(setpoint)
else:
self.__filter.reset(pose)
# Computes pose error
error = setpoint - pose
# Projects XY from global frame to body frame of the drone
x, y = error[0:2]
xb = x * math.cos(pose[5]) + y * math.sin(pose[5])
yb = y * math.cos(pose[5]) - x * math.sin(pose[5])
# Computes yaw feed forward in drone estimated frame
yaw = setpoint[5] - pose[5] + estimate[5]
self.__pids[2].setFeedForward(setpoint[5])
# -yb -> roll, xb -> pitch, yaw -> yaw, z -> thrust
error = numpy.array([-yb, xb, error[5], error[2]])
# Calls PIDs and masks fields with manual attitude if needed
attitude = self.__computeAttitude(error)
# Display
self.__record(attitude, error)
return attitude
#
# Public methods to access parameters and perform reset.
#
def getParameters(self):
"""Returns the controller parameters (ControlParameters)."""
return self.__parameters
def reset(self):
"""Resets PIDs and display when controller is disabled."""
for pid in self.__pids:
pid.reset()
self.__parameters._setControlInput(numpy.zeros(4))
self.__parameters._setError(numpy.zeros(4))
self.__parameters._setSetparatedOutputs(*(3*[numpy.zeros(4)]))
#
# Private methods to make some computations and records.
#
def __angle(self, value):
"""Returns the given angle between -pi and pi.
Args:
value (float): Angle in radians to convert.
"""
result = value % (2*math.pi)
if result > math.pi:
result -= (2*math.pi)
return result
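    # Worked example for __angle (follows directly from the code above):
    #     __angle(3*math.pi/2) -> -math.pi/2
    #     __angle(5*math.pi)   ->  math.pi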
def __computeAttitude(self, error):
"""Computes RPYT from pose error (m, rad)."""
mask = self.__parameters.getMask()
# Initializes with manual attitude
attitude = list(self.__parameters.getAttitude())
# Roll, pitch, yaw, thrust
for axis in range(4):
# Replaces axis with controller value if not masked
if not mask[axis]:
attitude[axis] = self.__pids[axis](error[axis])
else:
self.__pids[axis].reset()
return numpy.array(attitude)
def __record(self, attitude, error):
"""Records outputs in parameters."""
# Total output RPYT
self.__parameters._setControlInput(attitude)
# Error RPYT
self.__parameters._setError(error)
# Separated P, I and D contributions
attitudes = [numpy.zeros(4), numpy.zeros(4), numpy.zeros(4)]
for axis, pid in enumerate(self.__pids):
p, i, d = pid.getSeparatedOutputs()
attitudes[0][axis] = p
attitudes[1][axis] = i
attitudes[2][axis] = d
self.__parameters._setSetparatedOutputs(*attitudes)
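# A minimal usage sketch (assumes a ROS node is running so rospy.get_param
# can resolve the ~control parameters; pose and estimate are numpy arrays
# of [x, y, z, roll, pitch, yaw] in metres and radians):
#
#     controller = Controller()
#     roll, pitch, yaw, thrust = controller(pose, estimate)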
| 26.845304 | 69 | 0.657954 | 664 | 4,859 | 4.686747 | 0.284639 | 0.053985 | 0.021208 | 0.017352 | 0.026671 | 0.010604 | 0 | 0 | 0 | 0 | 0 | 0.009827 | 0.204157 | 4,859 | 180 | 70 | 26.994444 | 0.794931 | 0.415106 | 0 | 0.027778 | 0 | 0 | 0.037491 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0.041667 | 0.083333 | 0 | 0.263889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d23712e6839672afb07b755f64158e43e0fce4 | 639 | py | Python | setup.py | xpqz/aoctools | 660f3ed9f2ebbe23bb2ceaba91fee7ef54a0c248 | [
"Apache-2.0"
] | null | null | null | setup.py | xpqz/aoctools | 660f3ed9f2ebbe23bb2ceaba91fee7ef54a0c248 | [
"Apache-2.0"
] | 4 | 2020-03-24T16:45:28.000Z | 2021-06-01T23:40:03.000Z | setup.py | xpqz/aoctools | 660f3ed9f2ebbe23bb2ceaba91fee7ef54a0c248 | [
"Apache-2.0"
] | null | null | null | import setuptools
with open("README.md", "r") as fh:
long_description = fh.read()
setuptools.setup(
name="aoctools",
version="0.0.1",
author="Stefan Kruger",
author_email="stefan.kruger@gmail.com",
description="Utility classes and algorithms for AoC",
long_description=long_description,
long_description_content_type="text/markdown",
url="https://github.com/xpqz/aoctools",
packages=setuptools.find_packages(),
    classifiers=[
        "Programming Language :: Python :: 3",
        "Operating System :: OS Independent",
    ],
entry_points={'console_scripts': []},
install_requires=[]
)
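# With this file in place, the package installs in the usual way, e.g.:
#
#     python -m pip install .
#     python setup.py sdist bdist_wheel   # legacy build commands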
| 27.782609 | 57 | 0.674491 | 72 | 639 | 5.833333 | 0.763889 | 0.142857 | 0.090476 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007663 | 0.183099 | 639 | 22 | 58 | 29.045455 | 0.796935 | 0 | 0 | 0 | 0 | 0 | 0.353678 | 0.035994 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.05 | 0 | 0.05 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d41b6f4bb5de73e10cec96c2999e9dc07450f4 | 32,085 | py | Python | PJLink/MathLinkEnvironment.py | b3m2a1/PJLink | 650a40eb00484ac758eb53562915c72cfeb31e8c | [
"MIT"
] | 27 | 2018-09-13T09:15:45.000Z | 2021-11-17T11:46:20.000Z | PJLink/MathLinkEnvironment.py | b3m2a1/PJLink | 650a40eb00484ac758eb53562915c72cfeb31e8c | [
"MIT"
] | 11 | 2018-09-24T22:46:20.000Z | 2020-05-03T09:12:14.000Z | PJLink/MathLinkEnvironment.py | b3m2a1/PJLink | 650a40eb00484ac758eb53562915c72cfeb31e8c | [
"MIT"
] | 9 | 2018-09-20T11:02:43.000Z | 2020-11-06T12:56:42.000Z | """The MathLinkEnvironment class is a single object that handles data type translation from flag ints
"""
from decimal import Decimal as decimal
import os
##############################################################################################
# #
# MathLinkEnvironment #
# #
##############################################################################################
class MathLinkEnvironment:
"""A class holding all of the MathLink environment flags that will be used elsewhere in the package
"""
### These are all used by MathLink itself
# The collection of packet type ints
PACKET_TYPES = {
"Illegal" : 0, # Constant returned by nextPacket
"Call" : 7, # Constant returned by nextPacket.
"Evaluate" : 13, # Constant returned by nextPacket.
"Return" : 3, # Constant returned by nextPacket.
"InputName" : 8, # Constant returned by nextPacket.
"EnterText" : 14, # Constant returned by nextPacket.
"EnterExpr" : 15, # Constant returned by nextPacket.
"OutputName" : 9, # Constant returned by nextPacket.
"ReturnText" : 4, # Constant returned by nextPacket.
"ReturnExpr" : 16, # Constant returned by nextPacket.
"Display" : 11, # Constant returned by nextPacket.
"DisplayEnd" : 12, # Constant returned by nextPacket.
"Message" : 5, # Constant returned by nextPacket.
"Text" : 2, # Constant returned by nextPacket.
"Input" : 1, # Constant returned by nextPacket.
"InputString": 21, # Constant returned by nextPacket.
"Menu" : 6, # Constant returned by nextPacket.
"Syntax" : 10, # Constant returned by nextPacket.
"Suspend" : 17, # Constant returned by nextPacket.
"Resume" : 18, # Constant returned by nextPacket.
"BeginDialog": 19, # Constant returned by nextPacket.
"EndDialog" : 20,
"FirstUser" : 128,
"LastUser" : 255,
"FE" : 100, # Catch-all for packets that need to go to FE.
"Expression" : 101 # Sent for Pr output
}
PACKET_TYPE_NAMES = {}
PACKET_TYPE_NAMES.update(tuple((item, key) for key, item in PACKET_TYPES.items()))
# The collection of message type ints
MESSAGE_TYPES = {
"Terminate" : 1,
"Interrupt" : 2,
"Abort" : 3,
# Used in putMessage() to cause the current Mathematica evaluation to be aborted.
"AuthenticateFailure" : 10
# Low-level message type that will be detected by a messagehandler function if the
# kernel fails to start because of an authentication error (e.g., incorrect password file).
}
MESSAGE_TYPE_NAMES = {}
MESSAGE_TYPE_NAMES.update(tuple((item, key) for key, item in MESSAGE_TYPES.items()))
# The collection of type characters
TYPE_TOKENS = {
# Constants for use in putNext() or returned by getNext() and getType().
"Function" : ord('F'),
"String" : ord('"'),
"Symbol" : ord('\043'),
"Real" : ord('*'),
"Integer" : ord('+'),
"Error" : 0,
"Object" : 100000
}
TYPE_TOKEN_NAMES = {}
TYPE_TOKEN_NAMES.update(tuple((item, key) for key, item in TYPE_TOKENS.items()))
# The collection of error type ints
ERROR_TYPES = {
# Some of these need to agree with C code.
"Ok" : 0,
"Memory" : 8,
"Unconnected" : 10,
"UnknownPacket" : 23,
"User" : 1000,
"NonMLError" : 1000,
"LinkIsNull" : 1000,
"OutOfMemory" : 1001,
"ArrayTooShallow" : 1002,
"BadComplex" : 1003,
"CreationFailed" : 1004,
"ConnectTimeout" : 1005,
"WrappedException" : 1006,
"BadObject" : 1100,
"FirstUserException" : 2000,
"SignalCaught" : 2100,
"UnknownCallType" : 2101,
"CannotPut" : 2201
}
ERROR_TYPE_NAMES = {}
ERROR_TYPE_NAMES.update(tuple((item, key) for key, item in ERROR_TYPES.items()))
ERROR_MESSAGES = {
"ArrayTooShallow" : "Array is not as deep as requested.",
"BadComplex" : "Expression could not be read as a complex number.",
"ConnectTimeout" : "The link was not connected before the requested time limit elapsed.",
"BadObject" : "Expression on link is not a valid Python object reference.",
"FallThrough" : "Extended error message not available.",
"LinkIsNull" : "Link is not open.",
"CreationFailed" : "Link failed to open.",
"SignalCaught" : "Signal was caught.",
"CannotPut" : "Cannot put object onto link"
}
# These must remain in sync with Mathematica and C code. They don't really belong here,
# but they are used in a few places, so it's convenient to dump them here.
# If you change any of these, consult KernelLinkImpl, which has a few that
# pick up where these leave off.
MAX_ARRAY_DEPTH = 9
TYPE_INTEGERS = {
# Constants for use in getArray().
"Boolean" : -1,
"Byte" : -2,
"Char" : -3,
"Short" : -4,
"Integer" : -5,
"Long" : -6,
"Float" : -7,
"Double" : -8,
"String" : -9,
"BigInteger" : -10,
"Decimal" : -11,
"Expr" : -12,
"Complex" : -13,
## exclusively for use in a KernelLink
"Object" : -14,
"FloatOrInt" : -15,
"DoubleOrInt" : -16,
"Array1D" : -17,
"Array2D" : 2*(-17),#TYPE_ARRAY
"Array3D" : 3*(-17),#TYPE_ARRAY1
"Array4D" : 4*(-17),#TYPE_ARRAY1
"Array5D" : 5*(-17),#TYPE_ARRAY1
"Array6D" : 6*(-17),#TYPE_ARRAY1
"Array7D" : 7*(-17),#TYPE_ARRAY1
"Array8D" : 8*(-17),#TYPE_ARRAY1
"Array9D" : 9*(-17),#TYPE_ARRAY1
"Bad" : -10000
}
TYPE_INTEGER_NAMES = {}
TYPE_INTEGER_NAMES.update(tuple((item, key) for key, item in TYPE_INTEGERS.items()))
# "Complex" must always be the last one (largest absolute value number) of the set of types that have a byte value representation.
# This rule does not apply to "Double"" or "Float", which are defined KernelLinkImpl and are not user-level constants.
# They are never supplied as an argument to any J/Link method.
### Just for NumPy support
try:
import numpy
HAS_NUMPY = True
except ImportError as e:
HAS_NUMPY = False
### For PIL support
try:
import PIL
HAS_PIL = True
except ImportError:
HAS_PIL = False
### Maps for ease of type detection
TYPE_MAP = {
int : "Long",
float : "Double",
str : "String",
bool : "Boolean",
decimal : "Decimal",
complex : "Complex"
}
TYPE_MAP.update(tuple((item, key) for key, item in TYPE_MAP.items()))
if HAS_NUMPY:
import numpy as np
NUMPY_TYPE_MAP = {
np.int64 : "Long",
np.int32 : "Integer",
np.int16 : "Short",
np.int8 : "Char",
np.float64 : "Double",
np.float32 : "Float",
np.complex128 : "Complex",
np.complex64 : "Complex",
np.bytes_ : "Byte",
np.byte : "Byte",
np.str_ : "String",
np.string_ : "String"
}
TYPE_MAP.update(NUMPY_TYPE_MAP.items())
NUMPY_TYPE_MAP.update(tuple((item, key) for key, item in NUMPY_TYPE_MAP.items()))
TYPE_MAP.update({
"Integer" : int,
"Short" : int,
"BigInteger" : int,
"Float" : float
})
TYPECODE_MAP = {
'i' : "Integer",
'h' : "Short",
'l' : "Long",
'f' : "Float",
'd' : "Double",
'b' : "Char",
'B' : "Byte"
}
TYPECODE_MAP.update(tuple((item, key) for key, item in TYPECODE_MAP.items()))
TYPENAME_MAP = {
"byte" : "Byte",
"char" : "Char",
"short" : "Short",
"int" : "Int",
"long" : "Long",
"float" : "Float",
"double" : "Double",
"bool" : "Boolean",
"bigint" : "BigInteger",
"bigdec" : "Decimal",
"complex": "Complex",
"expr" : "Expr",
"Real" : "Double",
"Integer": "Int"
}
CALL_TYPES = {
"CallPython" : 1,
"Throw" : 2,
"Clear" : 3,
"Get" : 4,
"Set" : 5,
"New" : 6
}
CALL_TYPE_NAMES = {}
CALL_TYPE_NAMES.update(tuple((item, key) for key, item in CALL_TYPES.items()))
### Types used by the Expr class
EXPR_TYPES = {
### Expr flags
'Unknown' : 0,
# Mathematica Expr types
'Integer' : 1,
'Real' : 2,
'String' : 3,
'Symbol' : 4,
'Rational' : 5,
'Complex' : 6,
# Python / Java types
'BigInteger' : 7, # Java legacy
'BigDecimal' : 8, # Java legacy
'Decimal' : 9, # python decimal.Decimal object
# Composite types
'FirstComposite' : 100,
'Function' : 100,
# Specialized arrays
'FirstArrayType' : 200,
#'IntArray' : 200, # I'm killing these because they don't really do anything in python
#'REALARRAY1' : 201,
#'INTARRAY2' : 202,
#'REALARRAY2' : 203,
# Generalized list support
'List' : 208,
# Association support
'Association' : 209,
# efficient buffered data objects
"Array" : 214,
'NumPyArray' : 215,
'BufferedNDArray' : 216,
# arbitrary object
'Object' : 999
}
EXPR_TYPE_NAMES = {}
EXPR_TYPE_NAMES.update(tuple((item, key) for key, item in EXPR_TYPES.items()))
### Strings used by Expr I think? Or MathLink ?
DECIMAL_POINT_STRING = '.'
EXP_STRING = '*^'
TICK_STRING = '`'
### Perfomance oriented flags
# Used by _getArray and friends
ALLOW_RAGGED_ARRAYS = False
# Not currently used -- will force copies of data buffers to protect against corruption
COPY_DATA_BUFFERS = False # I'm not sure I can actually disable this?
import platform
PLATFORM = platform.system()
del platform
MATHLINK_LIBRARY_NAME = None
CURRENT_MATHEMATICA = None
APPLICATIONS_ROOT = None
INSTALLATION_DIRECTORY = None
MATHEMATICA_BINARY = None
WOLFRAMKERNEL_BINARY = None
MATHLINK_LIBRARY_DIR = None
if HAS_NUMPY:
del np
# Used to turn logging on or off
ALLOW_LOGGING = False
LOG_FILE = os.path.join(os.path.dirname(os.path.abspath(__file__)), "log.txt")
def __init__(self):
raise TypeError("{} is a standalone class and cannot be instantiated".format(type(self).__name__))
@staticmethod
def __lookup(m, key, default=None):
try:
return m[key]
except KeyError:
return default
@classmethod
def toTypeInt(cls, o):
"""A convenience function that turns a python type, typcode, or type name into a type int
:param o:
:return:
"""
tint = None
try:
tint = cls.TYPE_INTEGERS[o]
except KeyError:
for m in (cls.TYPE_MAP, cls.TYPECODE_MAP, cls.TYPE_INTEGERS, cls.TYPENAME_MAP):
try:
tint = m[o]
if isinstance(tint, str):
tint = cls.TYPE_INTEGERS[tint]
break
except KeyError:
pass
return tint
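    # For example, all three lookup routes above agree:
    #     toTypeInt(float) == toTypeInt('d') == toTypeInt("Double") == -8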
@classmethod
def fromTypeInt(cls, tint, mode="intname"): ### THIS IS TERRIBLY DESIGNED TODO: MAKE IT NOT SUCK
"""A convenience function that turns a type int into a python type, typecode, or typename
:param tint:
        :param mode: a string determining what to return; possible values are "type", "typecode", "typename" (anything else falls back to the type-name map)
:return:
"""
try:
if isinstance(tint, int):
tint = cls.TYPE_INTEGER_NAMES[tint]
if mode == "typename":
if tint in cls.TYPENAME_MAP:
otype = cls.TYPENAME_MAP[tint]
elif isinstance(tint, str):
otype = tint
else:
otype = None
elif mode == "typecode":
otype = cls.TYPECODE_MAP[tint]
elif mode == "type":
otype = cls.TYPE_MAP[tint]
else:
otype = cls.TYPENAME_MAP[tint]
except KeyError:
otype = None
return otype
@classmethod
def getTypeNameFromTypeInt(cls, tint):
return cls.__lookup(cls.TYPE_INTEGER_NAMES, tint)
@classmethod
def getTypeCodeFromTypeInt(cls, tint):
if isinstance(tint, int):
tint = cls.getTypeNameFromTypeInt(tint)
return cls.__lookup(cls.TYPECODE_MAP, tint)
@classmethod
def getShortNameFromTypeInt(cls, tint):
return cls.fromTypeInt(cls.TYPENAME_MAP, "typename")
@classmethod
def getObjectTypeInt(cls, ob):
tint = None
for t, n in cls.TYPE_MAP.items():
if not isinstance(t, str) and isinstance(ob, t):
tint = cls.TYPE_INTEGERS[n]
return tint
@classmethod
def getObjectArrayTypeInt(cls, arr):
"""A convenience function that gets a type int for an iterable
        :param arr:
:return:
"""
from array import array
from .HelperClasses import BufferedNDArray
tint = None
if isinstance(arr, (bytes, bytearray)):
tint = cls.toTypeInt("Byte")
elif isinstance(arr, str):
tint = cls.toTypeInt("Char")
elif isinstance(arr, (BufferedNDArray, array)):
tint = cls.toTypeInt(arr.typecode)
if tint is None and cls.HAS_NUMPY:
import numpy as np
if isinstance(arr, np.ndarray):
tint = cls.toTypeInt(arr.dtype.type)
if tint is None:
item = arr
try:
while True:
item2 = item[0]
if item2 is item:
break
else:
item = item2
except:
pass
tint = cls.getObjectTypeInt(item)
return tint
@classmethod
def fromTypeToken(cls, tchar):
"""A convenience function that turns a type char into a str
:param tchar: (can be an int)
:return:
"""
try:
if isinstance(tchar, str):
tchar = ord(tchar)
retstr = cls.TYPE_TOKEN_NAMES[tchar]
except (KeyError, TypeError):
retstr = None
return retstr
@classmethod
def toTypeToken(cls, tstr):
"""A convenience function that turns a type char into a str
:param tchar: (can be an int)
:return:
"""
try:
tchar = cls.TYPE_TOKENS[tstr]
except KeyError:
tchar = None
return tchar
@classmethod
def getNumPyTypeInt(cls, dtype):
"""A convenience function that turns a numpy.dtype a type int
:param dtype:
:return:
"""
try:
ttype = dtype.type
except AttributeError:
ttype = dtype
return cls.toTypeInt(ttype)
@classmethod
def getNumPyType(cls, tint):
"""A convenience function that turns a numpy.dtype a type int
:param dtype:
:return:
"""
try:
ttype = cls.NUMPY_TYPE_MAP[tint]
except KeyError:
ttype = None
return ttype
@classmethod
def allowRagged(cls):
return cls.ALLOW_RAGGED_ARRAYS
@classmethod
def getExprTypeInt(cls, tstr):
try:
expint = cls.EXPR_TYPES[tstr]
except KeyError:
expint = None
return expint
@classmethod
def getExprTypeName(cls, tint):
try:
expname = cls.EXPR_TYPE_NAMES[tint]
except KeyError:
expname = None
return expname
@classmethod
def getErrorInt(cls, err_str):
try:
err_int = cls.ERROR_TYPES[err_str]
except KeyError:
err_int = None
return err_int
@classmethod
def getErrorName(cls, err_no):
try:
err_name = cls.ERROR_TYPE_NAMES[err_no]
except KeyError:
err_name = None
return err_name
@classmethod
def getErrorMessageText(cls, err_no, fallback = True):
if isinstance(err_no, int):
err_no = cls.getErrorName(err_no)
msg = None
try:
msg = cls.ERROR_MESSAGES[err_no]
except KeyError:
if fallback:
msg = cls.ERROR_MESSAGES["FallThrough"]
return msg
@classmethod
def getPacketInt(cls, packet_name):
try:
packet_int = cls.PACKET_TYPES[packet_name]
except KeyError:
packet_int = None
return packet_int
@classmethod
def getPacketName(cls, packet_int):
try:
packet_name = cls.PACKET_TYPE_NAMES[packet_int]
except KeyError:
packet_name = None
return packet_name
@classmethod
def getMessageInt(cls, packet_name):
try:
packet_int = cls.MESSAGE_TYPES[packet_name]
except KeyError:
packet_int = None
return packet_int
@classmethod
def getMessageName(cls, packet_int):
try:
packet_name = cls.MESSAGE_TYPE_NAMES[packet_int]
except KeyError:
packet_name = None
return packet_name
@classmethod
def getCallInt(cls, packet_name):
try:
packet_int = cls.CALL_TYPES[packet_name]
except KeyError:
packet_int = None
return packet_int
@classmethod
def getCallName(cls, packet_int):
try:
packet_name = cls.CALL_TYPE_NAMES[packet_int]
except KeyError:
packet_name = None
return packet_name
@classmethod
def system_name(cls):
plat = cls.PLATFORM
if plat == "Darwin":
sys_name = "MacOSX"
else:
sys_name = plat
return sys_name
@classmethod
def get_NativeLibrary_root(cls, use_default = True):
import os
base = os.path.dirname(os.path.abspath(__file__))
return os.path.join(base, "PJLinkNativeLibrary")
@classmethod
def get_Applications_root(cls, use_default = True):
import os
if cls.APPLICATIONS_ROOT is None or not use_default:
plat = cls.system_name()
if plat == "MacOSX":
root = os.sep + "Applications"
elif plat == "Linux": #too much stuff going on to really know if I'm handling this right
root = os.sep + os.path.join("usr", "local", "Wolfram", "Mathematica")
if not os.path.exists(root):
root = os.sep + os.path.join("opt", "Wolfram", "Mathematica")
elif plat == "Windows":
root = os.path.expandvars(os.path.join("%ProgramFiles%", "Wolfram Research", "Mathematica"))
else:
raise ValueError("Couldn't determine Current Mathematica for platform {}".format(plat, bin))
else:
root = cls.APPLICATIONS_ROOT
return root
@classmethod
def get_Installed_Mathematica(cls, use_default = True):
import os, re
root = cls.get_Applications_root(use_default=use_default)
mathematicas = []
for app in os.listdir(root):
if app.startswith("Mathematica") or app.startswith("Wolfram Desktop"):
mathematica = os.path.join(root, app)
app, ext = os.path.splitext(app)
vers = app.strip("Mathematica").strip("Wolfram Desktop").strip()
if len(vers)>0:
vers = vers
verNum = re.findall(r"\d+.\d", vers)[0]
verNum = float(verNum)
else:
vers = ""
verNum = 10000 # hopefully WRI never gets here...
mathematicas.append((mathematica, verNum, vers))
elif re.match(r"\d+.\d.*", app):
mathematica = os.path.join(root, app)
vers = app
verNum = re.findall(r"\d+.\d", vers)[0]
verNum = float(verNum)
mathematicas.append((mathematica, verNum, vers))
mathematicas = sorted(mathematicas, key = lambda tup: tup[1], reverse = True)
if len(mathematicas) == 0:
raise ValueError("couldn't find any Mathematica installations")
cls.CURRENT_MATHEMATICA = mathematicas[0][2]
return mathematicas[0][0]
@classmethod
def get_Mathematica_name(cls, version = None, use_default = True):
import os, re
mname = version
plat = cls.system_name()
if plat == "MaxOSX":
if mname is None:
mname = "Mathematica.app"
elif isinstance(mname, float) or (isinstance(mname, str) and re.match(r"\d\d.\d", mname)):
mname = "Mathematica {}.app".format(mname)
elif plat == "Linux":
if mname is None:
if cls.CURRENT_MATHEMATICA is None:
cls.get_Installed_Mathematica(use_default=use_default)
mname = str(cls.CURRENT_MATHEMATICA)
elif isinstance(mname, float) or (isinstance(mname, str) and re.match(r"\d\d.\d", mname)):
mname = str(mname)
elif plat == "Windows":
if mname is None:
if cls.CURRENT_MATHEMATICA is None:
cls.get_Installed_Mathematica(use_default=use_default)
mname = str(cls.CURRENT_MATHEMATICA)
elif isinstance(mname, float) or (isinstance(mname, str) and re.match(r"\d\d.\d", mname)):
mname = str(mname)
return mname
@classmethod
def get_Mathematica_root(cls, mname = None, use_default = True):
import os
if use_default:
root = cls.INSTALLATION_DIRECTORY
else:
root = None
if root is None:
plat = cls.system_name()
if mname is None and cls.CURRENT_MATHEMATICA is None:
root = cls.get_Installed_Mathematica(use_default=use_default)
if plat == "MacOSX":
root = os.path.join(root, "Contents")
else:
app_root = cls.get_Applications_root(use_default=use_default)
mname = cls.get_Mathematica_name(mname)
if plat == "MacOSX":
root = os.path.join(app_root, mname, "Contents")
elif plat == "Linux":
root = os.path.join(app_root, mname)
elif plat == "Windows":
root = os.path.join(app_root, mname)
else:
raise ValueError("Couldn't find Mathematica for platform {}".format(plat))
return root
@classmethod
def get_Kernel_binary(cls, version = None, use_default = True):
import platform, os
if use_default:
mbin = cls.WOLFRAMKERNEL_BINARY
else:
mbin = None
if mbin is None:
plat = cls.system_name()
try:
root = cls.get_Mathematica_root(version, use_default=use_default)
except ValueError:
if not (isinstance(version, str) and os.path.isfile(version)):
raise ValueError("Couldn't find WolframKernel executable for platform {}".format(plat))
else:
mbin = version
else:
if plat == "MacOSX":
mbin = os.path.join(root, "MacOS", "WolframKernel")
if not os.path.isfile(mbin):
mbin = os.path.join(root, "MacOS", "MathKernel")
elif plat == "Linux":
linux_base = os.path.join(root, "SystemFiles", "Kernel", "Binaries", "Linux-x86-64")
mbin = os.path.join(linux_base, "WolframKernel")
if not os.path.isfile(mbin):
mbin = os.path.join(linux_base, "MathKernel")
if not os.path.isfile(mbin):
mbin = os.path.join(linux_base, "math")
elif plat == "Windows":
mbin = os.path.join(root, "wolfram.exe")
if not os.path.isfile(mbin):
mbin = os.path.join(root, "math.exe")
if not os.path.isfile(mbin):
raise ValueError("Couldn't find WolframKernel executable for platform {} ({} is not a file)".format(plat, mbin))
return mbin
@classmethod
def get_Mathematica_binary(cls, version = None, use_default=True):
import platform, os
if use_default:
mbin = cls.MATHEMATICA_BINARY
else:
mbin = None
if mbin is None:
plat = cls.system_name()
try:
root = cls.get_Mathematica_root(version, use_default=use_default)
except ValueError:
if not (isinstance(version, str) and os.path.isfile(version)):
raise ValueError("Couldn't find Mathematica executable for platform {}".format(plat))
else:
mbin = version
else:
if plat == "MacOSX":
mbin = os.path.join(root, "MacOS", "Mathematica")
elif plat == "Linux":
bin_bit = "Linux"
if cls.get_is_64_bit():
bin_bit += "-x86-64"
mbin = os.path.join(root, "SystemFiles", "Kernel", "Binaries", bin_bit, "Mathematica")
elif plat == "Windows":
mbin = os.path.join(root, "Mathematica")
if not os.path.isfile(mbin):
raise ValueError("Couldn't find Mathematica executable for platform {} ({} is not a file)".format(plat, mbin))
return mbin
@classmethod
def get_is_64_bit(cls):
plat = cls.PLATFORM
if plat == "Windows":
            import struct
            # struct.calcsize("P") is the pointer size in bytes; *8 gives bitness.
            is_64 = 8 * struct.calcsize("P") == 64
else:
import sys
is_64 = sys.maxsize > 2**32
return is_64
@classmethod
def get_MathLink_library(cls, version = None, use_default = True):
import os
if use_default:
lib = cls.MATHLINK_LIBRARY_DIR
else:
lib = None
sys_name = cls.system_name()
if lib is None:
core_lib = os.path.join(cls.get_NativeLibrary_root(), "src", "MathLinkBinaries")
if os.path.exists(core_lib):
lib = os.path.join(core_lib, sys_name)
if cls.get_is_64_bit():
lib = lib + "-x86-64"
if not os.path.exists(lib):
lib = None
if lib is None:
try:
root = cls.get_Mathematica_root(version, use_default=use_default)
except ValueError:
# if not (isinstance(version, str) and os.path.isfile(version)):
raise ValueError("Don't know how to find MathLink library on system {}".format(sys_name))
# else:
# mbin = version
else:
lib = os.path.join(root, "SystemFiles", "Links", "MathLink", "DeveloperKit")
ext_list = [ "-x86" ]
if cls.get_is_64_bit():
ext_list.append("-x86-64")
core_lib = lib
for ext in ext_list:
lib = os.path.join(core_lib, sys_name + ext, "CompilerAdditions")
if os.path.exists(lib):
break
else:
lib = os.path.join(core_lib, sys_name, "CompilerAdditions")
if not os.path.exists(lib):
raise ValueError("Couldn't find MathLink library for platform {} (path {} does not exist)".format(sys_name, lib))
return lib
@classmethod
def get_MathLink_library_name(cls, version = None, use_default = True):
"""Finds the actual library file to use. On Mac this is just the .a, on Linux this is a .a with a different format,
on Windows this is a .lib file with yet a different naming convention...
:param version:
:type version:
:param use_default:
:type use_default:
:return:
:rtype:
"""
import os
if use_default:
lib_name = cls.MATHLINK_LIBRARY_NAME
else:
lib_name = None
if lib_name is None:
lib_dir = cls.get_MathLink_library(version, use_default=use_default)
# plat = cls.PLATFORM
math_link_names = []
for lib_file in os.listdir(lib_dir):
name, ext = os.path.splitext(lib_file)
if ext == ".a":
name_bits = name.split("MLi")
if len(name_bits)>1: #Mac versions
sort_bits = [int(v) for v in name_bits[1].split(".")]
math_link_names.append((name.strip("lib"), sort_bits))
else: #Linux versions
name_bits = name.split("ML")
if len(name_bits)>1:
sort_bits = [ int(v) for v in name_bits[1].split("i") ]
math_link_names.append((name.strip("lib"), sort_bits))
elif ext == ".lib" and name[-1] == "s": #Windows
name_bits = name.split("ml")
if len(name_bits)>1:
sort_bits = [ int(v) for v in name_bits[1].strip("s").split("i") ]
math_link_names.append((name.strip("lib"), sort_bits))
if len(math_link_names) == 0:
raise ValueError("Couldn't find any MathLink library files in {}".format(lib_dir))
math_link_names = sorted(math_link_names, key=lambda b:b[1], reverse=True)
lib_name = math_link_names[0][0]
return lib_name
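    # A minimal usage sketch for the discovery helpers above (returned paths
    # and library names depend on the local Mathematica installation):
    #
    #     MathLinkEnvironment.get_Kernel_binary()          # .../WolframKernel
    #     MathLinkEnvironment.get_MathLink_library_name()  # e.g. "MLi4"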
@classmethod
def log(cls, *expr):
if cls.ALLOW_LOGGING:
mode = "w+"
if os.path.isfile(cls.LOG_FILE):
mode = "a"
with open(cls.LOG_FILE, mode) as logger:
print(*expr, file=logger)
@classmethod
def logf(cls, logs, *args, **kwargs):
if cls.ALLOW_LOGGING:
cls.log(logs.format(*args, **kwargs))
TRACEBACK_LIMIT = 3
@classmethod
def get_tb(cls, limit = None):
if limit is None:
limit = cls.TRACEBACK_LIMIT
import traceback as tb
return tb.format_exc(limit)
@classmethod
def log_tb(cls, limit = None):
if cls.ALLOW_LOGGING:
cls.log(cls.get_tb(limit)) | 33.667366 | 134 | 0.525697 | 3,453 | 32,085 | 4.767738 | 0.177237 | 0.017858 | 0.0164 | 0.035716 | 0.33961 | 0.29788 | 0.258641 | 0.221223 | 0.206706 | 0.188969 | 0 | 0.01664 | 0.370672 | 32,085 | 953 | 135 | 33.667366 | 0.798683 | 0.142309 | 0 | 0.332402 | 0 | 0 | 0.110327 | 0 | 0 | 0 | 0 | 0.001049 | 0 | 1 | 0.055866 | false | 0.002793 | 0.032123 | 0.00419 | 0.189944 | 0.001397 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d46d4428b78c33cd132c9bc78ff2e64c5b2b97 | 3,379 | py | Python | dabstract/dataset/dbs/DCASE2018Task5.py | magics-tech/dabstract-1 | 9f7a2d99d0dff1df5c2f90c82b1eecc9c42c2c24 | [
"MIT"
] | 7 | 2020-11-04T13:21:01.000Z | 2021-12-14T13:08:04.000Z | dabstract/dataset/dbs/DCASE2018Task5.py | magics-tech/dabstract-1 | 9f7a2d99d0dff1df5c2f90c82b1eecc9c42c2c24 | [
"MIT"
] | null | null | null | dabstract/dataset/dbs/DCASE2018Task5.py | magics-tech/dabstract-1 | 9f7a2d99d0dff1df5c2f90c82b1eecc9c42c2c24 | [
"MIT"
] | 2 | 2020-11-26T09:25:23.000Z | 2021-09-22T12:05:14.000Z | import dcase_util
import pandas
from dabstract.dataprocessor.processing_chain import ProcessingChain
from dabstract.dataset.dataset import Dataset
from dabstract.dataprocessor.processors import *
from dabstract.utils import stringlist2ind
class DCASE2018Task5(Dataset):
"""DCASE2020Task1A dataset
This class downloads the datasets and prepares it in the dabstract format.
Parameters
----------
paths : dict or str:
Path configuration in the form of a dictionary.
For example::
$ paths={ 'data': path_to_data,
$ 'meta': path_to_meta,
$ 'feat': path_to_feat}
test_only : bool
        To specify whether this dataset should be used for testing only or for both training and testing.
This is only relevant if multiple datasets are combined and set_xval() is used.
For example::
test_only = 0 -> use for both train and test
test_only = 1 -> use only for test
check dabstract.dataset.dataset.Dataset for more info
Returns
-------
    DCASE2018Task5 dataset class
"""
def __init__(self, paths=None, test_only=0, **kwargs):
# init dict abstract
super().__init__(name=self.__class__.__name__, paths=paths, test_only=test_only)
# Data: get data
def set_data(self, paths):
"""Set the data"""
# audio
chain = ProcessingChain().add(WavDatareader(select_channel=0))
from dabstract.dataset.helpers import FolderDictSeqAbstract
self.add(
"audio",
FolderDictSeqAbstract(
paths["data"],
map_fct=chain,
file_info_save_path=os.path.join(
paths["feat"], self.__class__.__name__, "audio", "raw"
),
),
)
# get meta
if os.path.exists(os.path.join(paths["meta"], "meta_dabstract.txt")):
labels = pandas.read_csv(
os.path.join(paths["meta"], "meta_dabstract.txt"), delimiter="\t", header=None
)
else:
labels = pandas.read_csv(
os.path.join(paths["meta"], "meta.txt"), delimiter="\t", header=None
)
        # make sure audio and meta are aligned
filenames = labels[0].to_list()
resort = np.array(
[
filenames.index("audio/" + filename)
for filename in self["audio"]["example"]
]
)
labels = labels.reindex(resort)
labels.to_csv(os.path.join(paths["meta"], "meta_dabstract.txt"), sep="\t", header = False, index=False)
# add labels
self.add("identifier", labels[2].to_list(), lazy=False)
#self.add("source", [filename for filename in filenames], lazy=False)
self.add("scene", labels[1].to_list(), lazy=False)
self.add(
"scene_id", stringlist2ind(self['scene']), lazy=False
)
self.add("group", stringlist2ind(self['identifier']), lazy=False)
return self
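    # A minimal usage sketch (the paths dict is an assumption; see the
    # class docstring for the expected keys):
    #
    #     db = DCASE2018Task5(paths={'data': ..., 'meta': ..., 'feat': ...})
    #     wav, scene = db['audio'][0], db['scene'][0]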
def prepare(self, paths):
pass
# """Prepare the data"""
# dcase_util.datasets.dataset_factory(
# dataset_class_name="DCASE2018_Task5_DevelopmentSet",
# data_path=os.path.split(os.path.split(paths["data"])[0])[0],
# ).initialize() | 35.568421 | 115 | 0.580941 | 380 | 3,379 | 5.002632 | 0.342105 | 0.02525 | 0.026302 | 0.039453 | 0.148343 | 0.112046 | 0.088901 | 0.088901 | 0.070489 | 0.044187 | 0 | 0.013623 | 0.304824 | 3,379 | 95 | 116 | 35.568421 | 0.795658 | 0.331459 | 0 | 0.125 | 0 | 0 | 0.076959 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0.020833 | 0.145833 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d481ba49640879152ec8fdf7ba102e2a4ba52e | 28,568 | py | Python | appParsers/ParseDXF_Spline.py | DannyPol/flatcam | 25a8634d0658e98b7fae31a095f8bef40c1b3067 | [
"MIT"
] | 1 | 2022-02-11T06:19:34.000Z | 2022-02-11T06:19:34.000Z | appParsers/ParseDXF_Spline.py | MRemy2/FlatCam | d4f941335ca8a8d5351aab23b396f99da06a9029 | [
"MIT"
] | null | null | null | appParsers/ParseDXF_Spline.py | MRemy2/FlatCam | d4f941335ca8a8d5351aab23b396f99da06a9029 | [
"MIT"
] | null | null | null | # Author: vvlachoudis@gmail.com
# Vasilis Vlachoudis
# Date: 20-Oct-2015
# ##########################################################
# FlatCAM: 2D Post-processing for Manufacturing #
# File modified: Marius Adrian Stanciu #
# Date: 3/10/2019 #
# ##########################################################
import math
def norm(v):
return math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2])
def normalize_2(v):
m = norm(v)
return [v[0]/m, v[1]/m, v[2]/m]
# ------------------------------------------------------------------------------
# Convert a B-spline to polyline with a fixed number of segments
# ------------------------------------------------------------------------------
def spline2Polyline(xyz, degree, closed, segments, knots):
"""
:param xyz: DXF spline control points
:param degree: degree of the Spline curve
:param closed: closed Spline
:type closed: bool
:param segments: how many lines to use for Spline approximation
:param knots: DXF spline knots
:return: x,y,z coordinates (each is a list)
"""
# Check if last point coincide with the first one
if (Vector(xyz[0]) - Vector(xyz[-1])).length2() < 1e-10:
# it is already closed, treat it as open
closed = False
# FIXME we should verify if it is periodic,.... but...
# I am not sure :)
if closed:
xyz.extend(xyz[:degree])
knots = None
else:
# make base-1
# knots.insert(0, 0)
pass
npts = len(xyz)
if degree < 1 or degree > 3:
# print "invalid degree"
return None, None, None
# order:
k = degree+1
if npts < k:
# print "not enough control points"
return None, None, None
# resolution:
nseg = segments * npts
# WARNING: base 1
b = [0.0]*(npts*3+1) # polygon points
h = [1.0]*(npts+1) # set all homogeneous weighting factors to 1.0
p = [0.0]*(nseg*3+1) # returned curved points
i = 1
for pt in xyz:
b[i] = pt[0]
b[i+1] = pt[1]
b[i+2] = pt[2]
i += 3
# if periodic:
if closed:
_rbsplinu(npts, k, nseg, b, h, p, knots)
else:
_rbspline(npts, k, nseg, b, h, p, knots)
x = []
y = []
z = []
for i in range(1, 3*nseg+1, 3):
x.append(p[i])
y.append(p[i+1])
z.append(p[i+2])
# for i,xyz in enumerate(zip(x,y,z)):
# print i,xyz
return x, y, z
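# A minimal usage sketch (control points are an assumption; knots=None
# falls back to a generated uniform knot vector):
#
#     ctrl = [(0, 0, 0), (1, 2, 0), (3, 2, 0), (4, 0, 0)]
#     x, y, z = spline2Polyline(ctrl, degree=3, closed=False,
#                               segments=10, knots=None)
#     # len(x) == segments * len(ctrl)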
# ------------------------------------------------------------------------------
# Subroutine to generate a B-spline open knot vector with multiplicity
# equal to the order at the ends.
# c = order of the basis function
# n = the number of defining polygon vertices
# n+2 = index of x[] for the first occurrence of the maximum knot vector value
# n+order = maximum value of the knot vector -- $n + c$
# x[] = array containing the knot vector
# ------------------------------------------------------------------------------
def _knot(n, order):
x = [0.0]*(n+order+1)
for i in range(2, n+order+1):
if order < i < n+2:
x[i] = x[i-1] + 1.0
else:
x[i] = x[i-1]
return x
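# e.g. _knot(4, 3) -> [0.0, 0.0, 0.0, 0.0, 1.0, 2.0, 2.0, 2.0]
# (base-1 storage: x[0] is unused padding)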
# ------------------------------------------------------------------------------
# Subroutine to generate a B-spline uniform (periodic) knot vector.
#
# order = order of the basis function
# n = the number of defining polygon vertices
# n+order = maximum value of the knot vector -- $n + order$
# x[] = array containing the knot vector
# ------------------------------------------------------------------------------
def _knotu(n, order):
x = [0]*(n+order+1)
for i in range(2, n+order+1):
x[i] = float(i-1)
return x
# ------------------------------------------------------------------------------
# Subroutine to generate rational B-spline basis functions--open knot vector
# C code for An Introduction to NURBS
# by David F. Rogers. Copyright (C) 2000 David F. Rogers,
# All rights reserved.
# Name: rbasis
# Subroutines called: none
# Book reference: Chapter 4, Sec. 4. , p 296
# c = order of the B-spline basis function
# d = first term of the basis function recursion relation
# e = second term of the basis function recursion relation
# h[] = array containing the homogeneous weights
# npts = number of defining polygon vertices
# nplusc = constant -- npts + c -- maximum number of knot values
# r[] = array containing the rational basis functions
# r[1] contains the basis function associated with B1 etc.
# t = parameter value
# temp[] = temporary array
# x[] = knot vector
# ------------------------------------------------------------------------------
def _rbasis(c, t, npts, x, h, r):
nplusc = npts + c
temp = [0.0]*(nplusc+1)
# calculate the first order non-rational basis functions n[i]
for i in range(1, nplusc):
if x[i] <= t < x[i+1]:
temp[i] = 1.0
else:
temp[i] = 0.0
# calculate the higher order non-rational basis functions
for k in range(2, c+1):
for i in range(1, nplusc-k+1):
# if the lower order basis function is zero skip the calculation
if temp[i] != 0.0:
d = ((t-x[i])*temp[i])/(x[i+k-1]-x[i])
else:
d = 0.0
# if the lower order basis function is zero skip the calculation
if temp[i+1] != 0.0:
e = ((x[i+k]-t)*temp[i+1])/(x[i+k]-x[i+1])
else:
e = 0.0
temp[i] = d + e
# pick up last point
if t >= x[nplusc]:
temp[npts] = 1.0
# calculate sum for denominator of rational basis functions
s = 0.0
for i in range(1, npts+1):
s += temp[i]*h[i]
# form rational basis functions and put in r vector
for i in range(1, npts+1):
if s != 0.0:
r[i] = (temp[i]*h[i])/s
else:
r[i] = 0
# ------------------------------------------------------------------------------
# Generates a rational B-spline curve using a uniform open knot vector.
#
# C code for An Introduction to NURBS
# by David F. Rogers. Copyright (C) 2000 David F. Rogers,
# All rights reserved.
#
# Name: rbspline.c
# Subroutines called: _knot, rbasis
# Book reference: Chapter 4, Alg. p. 297
#
# b = array containing the defining polygon vertices
# b[1] contains the x-component of the vertex
# b[2] contains the y-component of the vertex
# b[3] contains the z-component of the vertex
# h = array containing the homogeneous weighting factors
# k = order of the B-spline basis function
# nbasis = array containing the basis functions for a single value of t
# nplusc = number of knot values
# npts = number of defining polygon vertices
# p[,] = array containing the curve points
# p[1] contains the x-component of the point
# p[2] contains the y-component of the point
# p[3] contains the z-component of the point
# p1 = number of points to be calculated on the curve
# t = parameter value 0 <= t <= npts - k + 1
# x[] = array containing the knot vector
# ------------------------------------------------------------------------------
def _rbspline(npts, k, p1, b, h, p, x):
nplusc = npts + k
nbasis = [0.0]*(npts+1) # zero and re-dimension the basis array
# generate the uniform open knot vector
if x is None or len(x) != nplusc+1:
x = _knot(npts, k)
icount = 0
# calculate the points on the rational B-spline curve
t = 0
step = float(x[nplusc])/float(p1-1)
for i1 in range(1, p1+1):
if x[nplusc] - t < 5e-6:
t = x[nplusc]
# generate the basis function for this value of t
nbasis = [0.0]*(npts+1) # zero and re-dimension the knot vector and the basis array
_rbasis(k, t, npts, x, h, nbasis)
# generate a point on the curve
for j in range(1, 4):
jcount = j
p[icount+j] = 0.0
# Do local matrix multiplication
for i in range(1, npts+1):
p[icount+j] += nbasis[i]*b[jcount]
jcount += 3
icount += 3
t += step
# ------------------------------------------------------------------------------
# Subroutine to generate a rational B-spline curve using an uniform periodic knot vector
#
# C code for An Introduction to NURBS
# by David F. Rogers. Copyright (C) 2000 David F. Rogers,
# All rights reserved.
#
# Name: rbsplinu.c
# Subroutines called: _knotu, _rbasis
# Book reference: Chapter 4, Alg. p. 298
#
# b[] = array containing the defining polygon vertices
# b[1] contains the x-component of the vertex
# b[2] contains the y-component of the vertex
# b[3] contains the z-component of the vertex
# h[] = array containing the homogeneous weighting factors
# k = order of the B-spline basis function
# nbasis = array containing the basis functions for a single value of t
# nplusc = number of knot values
# npts = number of defining polygon vertices
# p[,] = array containing the curve points
# p[1] contains the x-component of the point
# p[2] contains the y-component of the point
# p[3] contains the z-component of the point
# p1 = number of points to be calculated on the curve
# t = parameter value 0 <= t <= npts - k + 1
# x[] = array containing the knot vector
# ------------------------------------------------------------------------------
def _rbsplinu(npts, k, p1, b, h, p, x=None):
nplusc = npts + k
nbasis = [0.0]*(npts+1) # zero and re-dimension the basis array
# generate the uniform periodic knot vector
if x is None or len(x) != nplusc+1:
# zero and re dimension the knot vector and the basis array
x = _knotu(npts, k)
icount = 0
# calculate the points on the rational B-spline curve
t = k-1
step = (float(npts)-(k-1))/float(p1-1)
for i1 in range(1, p1+1):
if x[nplusc] - t < 5e-6:
t = x[nplusc]
# generate the basis function for this value of t
nbasis = [0.0]*(npts+1)
_rbasis(k, t, npts, x, h, nbasis)
# generate a point on the curve
for j in range(1, 4):
jcount = j
p[icount+j] = 0.0
# Do local matrix multiplication
for i in range(1, npts+1):
p[icount+j] += nbasis[i]*b[jcount]
jcount += 3
icount += 3
t += step
# Accuracy for comparison operators
_accuracy = 1E-15
def Cmp0(x):
"""Compare against zero within _accuracy"""
return abs(x) < _accuracy
def gauss(A, B):
"""Solve A*X = B using the Gauss elimination method"""
n = len(A)
s = [0.0] * n
X = [0.0] * n
p = [i for i in range(n)]
for i in range(n):
s[i] = max([abs(x) for x in A[i]])
for k in range(n - 1):
# select j>=k so that
# |A[p[j]][k]| / s[p[i]] >= |A[p[i]][k]| / s[p[i]] for i = k,k+1,...,n
j = k
ap = abs(A[p[j]][k]) / s[p[j]]
for i in range(k + 1, n):
api = abs(A[p[i]][k]) / s[p[i]]
if api > ap:
j = i
ap = api
if j != k:
p[k], p[j] = p[j], p[k] # Swap values
for i in range(k + 1, n):
z = A[p[i]][k] / A[p[k]][k]
A[p[i]][k] = z
for j in range(k + 1, n):
A[p[i]][j] -= z * A[p[k]][j]
for k in range(n - 1):
for i in range(k + 1, n):
B[p[i]] -= A[p[i]][k] * B[p[k]]
for i in range(n - 1, -1, -1):
X[i] = B[p[i]]
for j in range(i + 1, n):
X[i] -= A[p[i]][j] * X[j]
X[i] /= A[p[i]][i]
return X
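# Worked example: gauss([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]) returns
# [1.0, 3.0], i.e. the solution of 2x + y = 5, x + 3y = 10.
# Note that gauss() modifies A and B in place during elimination.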
# Vector class
# Inherits from List
class Vector(list):
"""Vector class"""
def __init__(self, x=3, *args):
"""Create a new vector,
Vector(size), Vector(list), Vector(x,y,z,...)"""
list.__init__(self)
if isinstance(x, int) and not args:
for i in range(x):
self.append(0.0)
elif isinstance(x, (list, tuple)):
for i in x:
self.append(float(i))
else:
self.append(float(x))
for i in args:
self.append(float(i))
# ----------------------------------------------------------------------
def set(self, x, y, z=None):
"""Set vector"""
self[0] = x
self[1] = y
if z:
self[2] = z
# ----------------------------------------------------------------------
def __repr__(self):
return "[%s]" % ", ".join([repr(x) for x in self])
# ----------------------------------------------------------------------
def __str__(self):
return "[%s]" % ", ".join([("%15g" % x).strip() for x in self])
# ----------------------------------------------------------------------
def eq(self, v, acc=_accuracy):
"""Test for equality with vector v within accuracy"""
if len(self) != len(v):
return False
s2 = 0.0
for a, b in zip(self, v):
s2 += (a - b) ** 2
return s2 <= acc ** 2
def __eq__(self, v):
return self.eq(v)
# ----------------------------------------------------------------------
def __neg__(self):
"""Negate vector"""
new = Vector(len(self))
for i, s in enumerate(self):
new[i] = -s
return new
# ----------------------------------------------------------------------
def __add__(self, v):
"""Add 2 vectors"""
size = min(len(self), len(v))
new = Vector(size)
for i in range(size):
new[i] = self[i] + v[i]
return new
# ----------------------------------------------------------------------
def __iadd__(self, v):
"""Add vector v to self"""
for i in range(min(len(self), len(v))):
self[i] += v[i]
return self
# ----------------------------------------------------------------------
def __sub__(self, v):
"""Subtract 2 vectors"""
size = min(len(self), len(v))
new = Vector(size)
for i in range(size):
new[i] = self[i] - v[i]
return new
# ----------------------------------------------------------------------
def __isub__(self, v):
"""Subtract vector v from self"""
for i in range(min(len(self), len(v))):
self[i] -= v[i]
return self
# ----------------------------------------------------------------------
# Scale or Dot product
# ----------------------------------------------------------------------
def __mul__(self, v):
"""scale*Vector() or Vector()*Vector() - Scale vector or dot product"""
if isinstance(v, list):
return self.dot(v)
else:
return Vector([x * v for x in self])
# ----------------------------------------------------------------------
# Scale or Dot product
# ----------------------------------------------------------------------
def __rmul__(self, v):
"""scale*Vector() or Vector()*Vector() - Scale vector or dot product"""
if isinstance(v, Vector):
return self.dot(v)
else:
return Vector([x * v for x in self])
# ----------------------------------------------------------------------
# Divide by floating point
# ----------------------------------------------------------------------
    def __truediv__(self, b):
        """Divide each component by a scalar (Python 3 '/' operator)"""
        return Vector([x / b for x in self])

    __div__ = __truediv__  # kept as an alias for Python 2 era callers
# ----------------------------------------------------------------------
def __xor__(self, v):
"""Cross product"""
return self.cross(v)
# ----------------------------------------------------------------------
def dot(self, v):
"""Dot product of 2 vectors"""
s = 0.0
for a, b in zip(self, v):
s += a * b
return s
# ----------------------------------------------------------------------
def cross(self, v):
"""Cross product of 2 vectors"""
if len(self) == 3:
return Vector(self[1] * v[2] - self[2] * v[1],
self[2] * v[0] - self[0] * v[2],
self[0] * v[1] - self[1] * v[0])
elif len(self) == 2:
return self[0] * v[1] - self[1] * v[0]
else:
raise Exception("Cross product needs 2d or 3d vectors")
# ----------------------------------------------------------------------
def length2(self):
"""Return length squared of vector"""
s2 = 0.0
for s in self:
s2 += s ** 2
return s2
# ----------------------------------------------------------------------
def length(self):
"""Return length of vector"""
s2 = 0.0
for s in self:
s2 += s ** 2
return math.sqrt(s2)
__abs__ = length
# ----------------------------------------------------------------------
def arg(self):
"""return vector angle"""
return math.atan2(self[1], self[0])
# ----------------------------------------------------------------------
def norm(self):
"""Normalize vector and return length"""
length = self.length()
if length > 0.0:
invlen = 1.0 / length
for i in range(len(self)):
self[i] *= invlen
return length
normalize = norm
# ----------------------------------------------------------------------
def unit(self):
"""return a unit vector"""
v = self.clone()
v.norm()
return v
# ----------------------------------------------------------------------
def clone(self):
"""Clone vector"""
return Vector(self)
# ----------------------------------------------------------------------
def x(self):
return self[0]
def y(self):
return self[1]
def z(self):
return self[2]
# ----------------------------------------------------------------------
def orthogonal(self):
"""return a vector orthogonal to self"""
xx = abs(self.x())
yy = abs(self.y())
if len(self) >= 3:
zz = abs(self.z())
if xx < yy:
if xx < zz:
return Vector(0.0, self.z(), -self.y())
else:
return Vector(self.y(), -self.x(), 0.0)
else:
if yy < zz:
return Vector(-self.z(), 0.0, self.x())
else:
return Vector(self.y(), -self.x(), 0.0)
else:
return Vector(-self.y(), self.x())
# ----------------------------------------------------------------------
def direction(self, zero=_accuracy):
"""return containing the direction if normalized with any of the axis"""
v = self.clone()
length = v.norm()
if abs(length) <= zero:
return "O"
if abs(v[0] - 1.0) < zero:
return "X"
elif abs(v[0] + 1.0) < zero:
return "-X"
elif abs(v[1] - 1.0) < zero:
return "Y"
elif abs(v[1] + 1.0) < zero:
return "-Y"
elif abs(v[2] - 1.0) < zero:
return "Z"
elif abs(v[2] + 1.0) < zero:
return "-Z"
else:
# nothing special about the direction, return N
return "N"
# ----------------------------------------------------------------------
# Set the vector directly in polar coordinates
# @param ma magnitude of vector
# @param ph azimuthal angle in radians
# @param th polar angle in radians
# ----------------------------------------------------------------------
def setPolar(self, ma, ph, th):
"""Set the vector directly in polar coordinates"""
sf = math.sin(ph)
cf = math.cos(ph)
st = math.sin(th)
ct = math.cos(th)
self[0] = ma * st * cf
self[1] = ma * st * sf
self[2] = ma * ct
# ----------------------------------------------------------------------
def phi(self):
"""return the azimuth angle."""
if Cmp0(self.x()) and Cmp0(self.y()):
return 0.0
return math.atan2(self.y(), self.x())
# ----------------------------------------------------------------------
def theta(self):
"""return the polar angle."""
if Cmp0(self.x()) and Cmp0(self.y()) and Cmp0(self.z()):
return 0.0
return math.atan2(self.perp(), self.z())
# ----------------------------------------------------------------------
def cosTheta(self):
"""return cosine of the polar angle."""
ptot = self.length()
if Cmp0(ptot):
return 1.0
else:
return self.z() / ptot
# ----------------------------------------------------------------------
def perp2(self):
"""return the transverse component squared
(R^2 in cylindrical coordinate system)."""
return self.x() * self.x() + self.y() * self.y()
# ----------------------------------------------------------------------
def perp(self):
"""@return the transverse component
(R in cylindrical coordinate system)."""
return math.sqrt(self.perp2())
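# Sketch: phi()/theta()/perp() recover the spherical parameters that
# setPolar() sets, e.g. for a vector along +Z:
# >>> v = Vector(0.0, 0.0, 3.0)
# >>> v.perp(), v.theta(), v.cosTheta() # (0.0, 0.0, 1.0)
# >>> Vector(1.0, 1.0, 0.0).phi() # pi/4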
# ----------------------------------------------------------------------
# Return a random 3D vector
# ----------------------------------------------------------------------
# @staticmethod
# def random():
# cosTheta = 2.0 * random.random() - 1.0
# sinTheta = math.sqrt(1.0 - cosTheta ** 2)
# phi = 2.0 * math.pi * random.random()
# return Vector(math.cos(phi) * sinTheta, math.sin(phi) * sinTheta, cosTheta)
# #===============================================================================
# # Cardinal cubic spline class
# #===============================================================================
# class CardinalSpline:
# def __init__(self, A=0.5):
# # The default matrix is the Catmull-Rom spline
# # which is equal to Cardinal matrix
# # for A = 0.5
# #
# # Note: Vasilis
# # The A parameter should be the fraction in t where
# # the second derivative is zero
# self.setMatrix(A)
#
# #-----------------------------------------------------------------------
# # Set the matrix according to Cardinal
# #-----------------------------------------------------------------------
# def setMatrix(self, A=0.5):
# self.M = []
# self.M.append([ -A, 2.-A, A-2., A ])
# self.M.append([2.*A, A-3., 3.-2.*A, -A ])
# self.M.append([ -A, 0., A, 0.])
# self.M.append([ 0., 1., 0, 0.])
#
# #-----------------------------------------------------------------------
# # Evaluate Cardinal spline at position t
# # @param P list or tuple with 4 points y positions
# # @param t [0..1] fraction of interval from points 1..2
# # @param k index of starting 4 elements in P
# # @return spline evaluation
# #-----------------------------------------------------------------------
# def __call__(self, P, t, k=1):
# T = [t*t*t, t*t, t, 1.0]
# R = [0.0]*4
# for i in range(4):
# for j in range(4):
# R[i] += T[j] * self.M[j][i]
# y = 0.0
# for i in range(4):
# y += R[i]*P[k+i-1]
#
# return y
#
# #-----------------------------------------------------------------------
# # Return the coefficients of a 3rd degree polynomial
# # f(x) = a t^3 + b t^2 + c t + d
# # @return [a, b, c, d]
# #-----------------------------------------------------------------------
# def coefficients(self, P, k=1):
# C = [0.0]*4
# for i in range(4):
# for j in range(4):
# C[i] += self.M[i][j] * P[k+j-1]
# return C
#
# #-----------------------------------------------------------------------
# # Evaluate the value of the spline using the coefficients
# #-----------------------------------------------------------------------
# def evaluate(self, C, t):
# return ((C[0]*t + C[1])*t + C[2])*t + C[3]
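# # A minimal sketch of using this class if it were re-enabled
# # (values are illustrative; P holds four y-values and t in [0, 1]
# # spans the interval between P[k] and P[k+1]):
# # spline = CardinalSpline() # Catmull-Rom, A = 0.5
# # P = [0.0, 1.0, 2.0, 1.0]
# # y = spline(P, 0.5, k=1)
# # C = spline.coefficients(P, k=1)
# # assert abs(spline.evaluate(C, 0.5) - y) < 1e-12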
#
# #===============================================================================
# # Cubic spline ensuring that the first and second derivative are continuous
# # adapted from Penelope Manual Appendix B.1
# # It requires all the points (xi,yi) and the assumption on how to deal
# # with the second derivative on the extremities
# # Option 1: assume zero as second derivative on both ends
# # Option 2: assume the same as the next or previous one
# #===============================================================================
# class CubicSpline:
# def __init__(self, X, Y):
# self.X = X
# self.Y = Y
# self.n = len(X)
#
# # Option #1
# s1 = 0.0 # zero based = s0
# sN = 0.0 # zero based = sN-1
#
# # Construct the tri-diagonal matrix
# A = []
# B = [0.0] * (self.n-2)
# for i in range(self.n-2):
# A.append([0.0] * (self.n-2))
#
# for i in range(1,self.n-1):
# hi = self.h(i)
# Hi = 2.0*(self.h(i-1) + hi)
# j = i-1
# A[j][j] = Hi
# if i+1<self.n-1:
# A[j][j+1] = A[j+1][j] = hi
#
# if i==1:
# B[j] = 6.*(self.d(i) - self.d(j)) - hi*s1
# elif i<self.n-2:
# B[j] = 6.*(self.d(i) - self.d(j))
# else:
# B[j] = 6.*(self.d(i) - self.d(j)) - hi*sN
#
#
# self.s = gauss(A,B)
# self.s.insert(0,s1)
# self.s.append(sN)
# # print ">> s <<"
# # pprint(self.s)
#
# #-----------------------------------------------------------------------
# def h(self, i):
# return self.X[i+1] - self.X[i]
#
# #-----------------------------------------------------------------------
# def d(self, i):
# return (self.Y[i+1] - self.Y[i]) / (self.X[i+1] - self.X[i])
#
# #-----------------------------------------------------------------------
# def coefficients(self, i):
# """return coefficients of cubic spline for interval i a*x**3+b*x**2+c*x+d"""
# hi = self.h(i)
# si = self.s[i]
# si1 = self.s[i+1]
# xi = self.X[i]
# xi1 = self.X[i+1]
# fi = self.Y[i]
# fi1 = self.Y[i+1]
#
# a = 1./(6.*hi)*(si*xi1**3 - si1*xi**3 + 6.*(fi*xi1 - fi1*xi)) + hi/6.*(si1*xi - si*xi1)
# b = 1./(2.*hi)*(si1*xi**2 - si*xi1**2 + 2*(fi1 - fi)) + hi/6.*(si - si1)
# c = 1./(2.*hi)*(si*xi1 - si1*xi)
# d = 1./(6.*hi)*(si1-si)
#
# return [d,c,b,a]
#
# #-----------------------------------------------------------------------
# def __call__(self, i, x):
# C = self.coefficients(i)
# return ((C[0]*x + C[1])*x + C[2])*x + C[3]
#
# #-----------------------------------------------------------------------
# # @return evaluation of cubic spline at x using coefficients C
# #-----------------------------------------------------------------------
# def evaluate(self, C, x):
# return ((C[0]*x + C[1])*x + C[2])*x + C[3]
#
# #-----------------------------------------------------------------------
# # Return evaluated derivative at x using coefficients C
# #-----------------------------------------------------------------------
# def derivative(self, C, x):
# a = 3.0*C[0] # derivative coefficients
# b = 2.0*C[1] # ... for sampling with rejection
# c = C[2]
# return (a*x + b)*x + c
#
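# # Likewise for CubicSpline, assuming the gauss() linear solver it
# # calls is defined elsewhere in this module:
# # cs = CubicSpline([0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 0.0, 1.0])
# # y = cs(1, 1.5) # evaluate on interval i=1, i.e. x in [X[1], X[2]]
# # C = cs.coefficients(1)
# # assert abs(cs.evaluate(C, 1.5) - y) < 1e-12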
| 34.585956 | 97 | 0.405734 | 3,447 | 28,568 | 3.334784 | 0.121845 | 0.00696 | 0.014615 | 0.02488 | 0.43906 | 0.406438 | 0.379121 | 0.327795 | 0.322575 | 0.301174 | 0 | 0.025231 | 0.303556 | 28,568 | 825 | 98 | 34.627879 | 0.552523 | 0.592621 | 0 | 0.300885 | 0 | 0 | 0.00578 | 0 | 0 | 0 | 0 | 0.001212 | 0 | 1 | 0.129794 | false | 0.00295 | 0.00295 | 0.023599 | 0.312684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d543b523a6a570e804080ffe9c0c868e31e09b | 4,001 | py | Python | invenio_app_ils/circulation/views.py | kpsherva/invenio-app-ils | 1ecf9d1ac436ce49d75fdd4e494526b056d9bb1e | [
"MIT"
] | null | null | null | invenio_app_ils/circulation/views.py | kpsherva/invenio-app-ils | 1ecf9d1ac436ce49d75fdd4e494526b056d9bb1e | [
"MIT"
] | null | null | null | invenio_app_ils/circulation/views.py | kpsherva/invenio-app-ils | 1ecf9d1ac436ce49d75fdd4e494526b056d9bb1e | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
#
# Copyright (C) 2018 CERN.
#
# invenio-app-ils is free software; you can redistribute it and/or modify it
# under the terms of the MIT License; see LICENSE file for more details.
"""Invenio App ILS Circulation views."""
from __future__ import absolute_import, print_function
from functools import wraps
from flask import Blueprint, abort, current_app, request
from invenio_circulation.errors import CirculationException, \
InvalidCirculationPermission
from invenio_circulation.links import loan_links_factory
from invenio_circulation.views import create_error_handlers
from invenio_records_rest.utils import obj_or_import_string
from invenio_rest import ContentNegotiatedMethodView
from invenio_app_ils.permissions import check_permission
from .api import create_loan, request_loan
def need_permissions(action):
"""View decorator to check permissions for the given action or abort.
:param action: The action needed.
"""
def decorator_builder(f):
@wraps(f)
def decorate(*args, **kwargs):
check_permission(
current_app.config["ILS_VIEWS_PERMISSIONS_FACTORY"](action)
)
return f(*args, **kwargs)
return decorate
return decorator_builder
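# A minimal sketch of applying the decorator above; the action name is
# one of those used by the resources later in this module:
#
# @need_permissions('circulation-loan-request')
# def post(self, **kwargs):
# ...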
def create_circulation_blueprint(_):
"""Add circulation views to the blueprint."""
blueprint = Blueprint(
"invenio_app_ils_circulation",
__name__,
url_prefix="",
)
create_error_handlers(blueprint)
rec_serializers = {
"application/json": (
"invenio_records_rest.serializers" ":json_v1_response"
)
}
serializers = {
mime: obj_or_import_string(func)
for mime, func in rec_serializers.items()
}
loan_request = LoanRequestResource.as_view(
LoanRequestResource.view_name,
serializers=serializers,
ctx=dict(links_factory=loan_links_factory),
)
blueprint.add_url_rule(
"/circulation/loans/request", view_func=loan_request, methods=["POST"]
)
loan_create = LoanCreateResource.as_view(
LoanCreateResource.view_name,
serializers=serializers,
ctx=dict(links_factory=loan_links_factory),
)
blueprint.add_url_rule(
"/circulation/loans/create", view_func=loan_create, methods=["POST"]
)
return blueprint
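# Sketch: a host application would register the returned blueprint on
# its Flask app (the app object here is illustrative):
#
# app.register_blueprint(create_circulation_blueprint(None))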
class IlsResource(ContentNegotiatedMethodView):
"""ILS resource."""
def __init__(self, serializers, ctx, *args, **kwargs):
"""Constructor."""
super(IlsResource, self).__init__(serializers, *args, **kwargs)
for key, value in ctx.items():
setattr(self, key, value)
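# Note: every key in ctx becomes an attribute, so passing
# ctx=dict(links_factory=loan_links_factory) makes self.links_factory
# available to the view methods below.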
class LoanRequestResource(IlsResource):
"""Loan request action resource."""
view_name = "loan_request"
@need_permissions('circulation-loan-request')
def post(self, **kwargs):
"""Loan request post method."""
try:
pid, loan = request_loan(request.get_json())
except InvalidCirculationPermission as ex:
current_app.logger.exception(ex.msg)
return abort(403)
except CirculationException as ex:
current_app.logger.exception(ex.msg)
return abort(400)
return self.make_response(
pid, loan, 202, links_factory=self.links_factory
)
class LoanCreateResource(IlsResource):
"""Loan create action resource."""
view_name = "loan_create"
@need_permissions('circulation-loan-create')
def post(self, **kwargs):
"""Loan create post method."""
try:
pid, loan = create_loan(request.get_json())
except InvalidCirculationPermission as ex:
current_app.logger.exception(ex.msg)
return abort(403)
except CirculationException as ex:
current_app.logger.exception(ex.msg)
return abort(400)
return self.make_response(
pid, loan, 202, links_factory=self.links_factory
)
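# Sketch of exercising the endpoints above from a client; the payload
# fields are illustrative, not the actual loan schema:
#
# import requests
# requests.post("https://localhost/circulation/loans/request",
# json={"item_pid": "...", "patron_pid": "..."})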
| 28.578571 | 78 | 0.672082 | 440 | 4,001 | 5.877273 | 0.288636 | 0.042537 | 0.020108 | 0.021655 | 0.310131 | 0.258314 | 0.258314 | 0.258314 | 0.258314 | 0.258314 | 0 | 0.007853 | 0.236191 | 4,001 | 139 | 79 | 28.784173 | 0.838351 | 0.126718 | 0 | 0.292135 | 0 | 0 | 0.072801 | 0.054164 | 0 | 0 | 0 | 0 | 0 | 1 | 0.078652 | false | 0 | 0.123596 | 0 | 0.370787 | 0.089888 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d57b41a068554c1044a53dcd6aadb0fdeeb06a | 3,287 | py | Python | node/blockchain/tests/test_models/test_block_message/test_pv_schedule_update.py | thenewboston-developers/Node | e71a405f4867786a54dd17ddd97595dd3a630018 | [
"MIT"
] | 18 | 2021-11-30T04:02:13.000Z | 2022-03-24T12:33:57.000Z | node/blockchain/tests/test_models/test_block_message/test_pv_schedule_update.py | thenewboston-developers/Node | e71a405f4867786a54dd17ddd97595dd3a630018 | [
"MIT"
] | 1 | 2022-02-04T17:07:38.000Z | 2022-02-04T17:07:38.000Z | node/blockchain/tests/test_models/test_block_message/test_pv_schedule_update.py | thenewboston-developers/Node | e71a405f4867786a54dd17ddd97595dd3a630018 | [
"MIT"
] | 5 | 2022-01-31T05:28:13.000Z | 2022-03-08T17:25:31.000Z | import re
from datetime import datetime
import pytest
from pydantic import ValidationError
from node.blockchain.facade import BlockchainFacade
from node.blockchain.inner_models import (
BlockMessage, BlockMessageUpdate, PVScheduleUpdateBlockMessage, PVScheduleUpdateSignedChangeRequest
)
from node.blockchain.types import Type
@pytest.mark.usefixtures('base_blockchain')
def test_create_from_signed_change_request(
pv_schedule_update_signed_change_request_message, primary_validator_key_pair
):
request = PVScheduleUpdateSignedChangeRequest.create_from_signed_change_request_message(
message=pv_schedule_update_signed_change_request_message,
signing_key=primary_validator_key_pair.private,
)
blockchain_facade = BlockchainFacade.get_instance()
expected_block_number = blockchain_facade.get_next_block_number()
expected_identifier = blockchain_facade.get_next_block_identifier()
message = BlockMessage.create_from_signed_change_request(request, blockchain_facade)
assert message.number == expected_block_number
assert message.identifier == expected_identifier
assert message.type == Type.PV_SCHEDULE_UPDATE
assert isinstance(message.timestamp, datetime)
assert message.timestamp.tzinfo is None
update = message.update
assert update.schedule == {'1': primary_validator_key_pair.public}
assert update.accounts is None
def test_serialize_deserialize_works(pv_schedule_update_block_message):
serialized = pv_schedule_update_block_message.json()
deserialized = BlockMessage.parse_raw(serialized)
assert deserialized.type == pv_schedule_update_block_message.type
assert deserialized.number == pv_schedule_update_block_message.number
assert deserialized.identifier == pv_schedule_update_block_message.identifier
assert deserialized.timestamp == pv_schedule_update_block_message.timestamp
assert deserialized.request.signer == pv_schedule_update_block_message.request.signer
assert deserialized.request.signature == pv_schedule_update_block_message.request.signature
assert deserialized.request.message == pv_schedule_update_block_message.request.message
assert deserialized.request == pv_schedule_update_block_message.request
assert deserialized.update == pv_schedule_update_block_message.update
assert deserialized == pv_schedule_update_block_message
serialized2 = deserialized.json()
assert serialized == serialized2
def test_block_identifier_is_mandatory(pv_schedule_update_signed_change_request, primary_validator_key_pair):
PVScheduleUpdateBlockMessage(
number=1,
identifier='0' * 64,
timestamp=datetime.utcnow(),
request=pv_schedule_update_signed_change_request,
update=BlockMessageUpdate(schedule={'1': primary_validator_key_pair.public}),
)
with pytest.raises(ValidationError) as exc_info:
PVScheduleUpdateBlockMessage(
number=1,
identifier=None,
timestamp=datetime.utcnow(),
request=pv_schedule_update_signed_change_request,
update=BlockMessageUpdate(schedule={'1': primary_validator_key_pair.public}),
)
assert re.search(r'identifier.*none is not an allowed value', str(exc_info.value), flags=re.DOTALL)
| 43.826667 | 109 | 0.797688 | 370 | 3,287 | 6.710811 | 0.224324 | 0.072493 | 0.115989 | 0.10149 | 0.374547 | 0.229561 | 0.159082 | 0.122433 | 0.102296 | 0.102296 | 0 | 0.003531 | 0.138424 | 3,287 | 74 | 110 | 44.418919 | 0.873234 | 0 | 0 | 0.163934 | 0 | 0 | 0.01795 | 0 | 0 | 0 | 0 | 0 | 0.311475 | 1 | 0.04918 | false | 0 | 0.114754 | 0 | 0.163934 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d6faddd02aa95de6df1404de8580095272c24b | 11,606 | py | Python | asdf/commands/edit.py | KenMighell/asdf | aae8d9aeb5ff0bfe7248bfa595f256d4756ade18 | [
"BSD-3-Clause"
] | null | null | null | asdf/commands/edit.py | KenMighell/asdf | aae8d9aeb5ff0bfe7248bfa595f256d4756ade18 | [
"BSD-3-Clause"
] | null | null | null | asdf/commands/edit.py | KenMighell/asdf | aae8d9aeb5ff0bfe7248bfa595f256d4756ade18 | [
"BSD-3-Clause"
] | null | null | null | """
Contains commands for lightweight text editing of an ASDF file.
"""
import io
import os
import re
import shutil
# Marked safe because the editor command is specified by an
# environment variable that the user controls.
import subprocess # nosec
import sys
import tempfile
import yaml
from .. import constants, generic_io, schema, util
from ..asdf import AsdfFile, open_asdf
from ..block import BlockManager
from .main import Command
__all__ = ["edit"]
if sys.platform.startswith("win"):
DEFAULT_EDITOR = "notepad"
else:
DEFAULT_EDITOR = "vi"
class Edit(Command):
@classmethod
def setup_arguments(cls, subparsers):
"""
Set up a command line argument parser for the edit subcommand.
"""
# Set up the parser
parser = subparsers.add_parser(
"edit",
description="Edit the YAML portion of an ASDF file in-place.",
)
# Need an input file
parser.add_argument(
"filename",
help="Path to an ASDF file.",
)
parser.set_defaults(func=cls.run)
return parser
@classmethod
def run(cls, args):
"""
Execute the edit subcommand.
"""
return edit(args.filename)
def read_yaml(fd):
"""
Read the YAML portion of an open ASDF file's content.
Parameters
----------
fd : GenericFile
Returns
-------
bytes
YAML content
int
total number of bytes available for YAML area
bool
True if the file contains binary blocks
"""
# All ASDF files produced by this library, even the binary files
# of an exploded ASDF file, include a YAML header, so we'll just
# let this raise an error if the end marker can't be found.
# Revisit this if someone starts producing files without a
# YAML section, which the standard permits but is not possible
# with current software.
reader = fd.reader_until(
constants.YAML_END_MARKER_REGEX,
7,
"End of YAML marker",
include=True,
)
content = reader.read()
reader = fd.reader_until(
constants.BLOCK_MAGIC,
len(constants.BLOCK_MAGIC),
include=False,
exception=False,
)
buffer = reader.read()
contains_blocks = fd.peek(len(constants.BLOCK_MAGIC)) == constants.BLOCK_MAGIC
return content, len(content) + len(buffer), contains_blocks
def write_edited_yaml_larger(path, new_content, version):
"""
Rewrite an ASDF file, replacing the YAML portion with the
specified YAML content and updating the block index if present.
The file is assumed to contain binary blocks.
Parameters
----------
path : str
Path to ASDF file
new_content : bytes
Updated YAML content
version : str or asdf.versioning.AsdfVersion
ASDF Standard version used when rewriting the file
"""
prefix = os.path.splitext(os.path.basename(path))[0] + "-"
# Since the original file may be large, create the temporary
# file in the same directory to avoid filling up the system
# temporary area.
temp_file = tempfile.NamedTemporaryFile(dir=os.path.dirname(path), prefix=prefix, suffix=".asdf", delete=False)
try:
temp_file.close()
with generic_io.get_file(temp_file.name, mode="w") as fd:
fd.write(new_content)
# Allocate additional space for future YAML updates:
pad_length = util.calculate_padding(len(new_content), True, fd.block_size)
fd.fast_forward(pad_length)
with generic_io.get_file(path) as original_fd:
# Consume the file up to the first block, which must exist
# as a precondition to using this method.
original_fd.seek_until(
constants.BLOCK_MAGIC,
len(constants.BLOCK_MAGIC),
)
ctx = AsdfFile(version=version)
blocks = BlockManager(ctx, copy_arrays=False, lazy_load=False)
blocks.read_internal_blocks(original_fd, past_magic=True, validate_checksums=False)
blocks.finish_reading_internal_blocks()
blocks.write_internal_blocks_serial(fd)
blocks.write_block_index(fd, ctx)
blocks.close()
# Swap in the new version of the file atomically:
shutil.copy(temp_file.name, path)
finally:
os.unlink(temp_file.name)
def write_edited_yaml(path, new_content, available_bytes):
"""
Overwrite the YAML portion of an ASDF file with the specified
YAML content. The content must fit in the space available.
Parameters
----------
path : str
Path to ASDF file
new_content : bytes
Updated YAML content
available_bytes : int
Number of bytes available for YAML
"""
# generic_io mode "rw" opens the file as "r+b":
with generic_io.get_file(path, mode="rw") as fd:
fd.write(new_content)
pad_length = available_bytes - len(new_content)
if pad_length > 0:
fd.write(b"\0" * pad_length)
def edit(path):
"""
Copy the YAML portion of an ASDF file to a temporary file, present
the file to the user for editing, then update the original file
with the modified YAML.
Parameters
----------
path : str
Path to ASDF file
"""
# Extract the YAML portion of the original file:
with generic_io.get_file(path, mode="r") as fd:
if util.get_file_type(fd) != util.FileType.ASDF:
print(f"Error: '{path}' is not an ASDF file.")
return 1
original_content, available_bytes, contains_blocks = read_yaml(fd)
original_asdf_version = parse_asdf_version(original_content)
original_yaml_version = parse_yaml_version(original_content)
prefix = os.path.splitext(os.path.basename(path))[0] + "-"
# We can't use temp_file's automatic delete because Windows
# won't allow reading the file from the editor process unless
# it is closed here.
temp_file = tempfile.NamedTemporaryFile(prefix=prefix, suffix=".yaml", delete=False)
try:
# Write the YAML to a temporary path:
temp_file.write(original_content)
temp_file.close()
# Loop so that the user can correct errors in the edited file:
while True:
open_editor(temp_file.name)
with open(temp_file.name, "rb") as f:
new_content = f.read()
if new_content == original_content:
print("No changes made to file")
return 0
try:
new_asdf_version = parse_asdf_version(new_content)
new_yaml_version = parse_yaml_version(new_content)
except Exception as e:
print("Error: failed to parse ASDF header: " + str(e))
choice = request_input("(c)ontinue editing or (a)bort? ", ["c", "a"])
if choice == "a":
return 1
else:
continue
if new_asdf_version != original_asdf_version or new_yaml_version != original_yaml_version:
print("Error: cannot modify ASDF Standard or YAML version using this tool.")
choice = request_input("(c)ontinue editing or (a)bort? ", ["c", "a"])
if choice == "a":
return 1
else:
continue
try:
# Blocks are not read during validation, so this will not raise
# an error even though we're only opening the YAML portion of
# the file.
with open_asdf(io.BytesIO(new_content), _force_raw_types=True):
pass
except yaml.YAMLError as e:
print("Error: failed to parse updated YAML:")
print_exception(e)
choice = request_input("(c)ontinue editing or (a)bort? ", ["c", "a"])
if choice == "a":
return 1
else:
continue
except schema.ValidationError as e:
print("Warning: updated ASDF tree failed validation:")
print_exception(e)
choice = request_input("(c)ontinue editing, (f)orce update, or (a)bort? ", ["c", "f", "a"])
if choice == "a":
return 1
elif choice == "c":
continue
except Exception as e:
print("Error: failed to read updated file as ASDF:")
print_exception(e)
choice = request_input("(c)ontinue editing or (a)bort? ", ["c", "a"])
if choice == "a":
return 1
else:
continue
# We've either opened the file without error, or
# the user has agreed to ignore validation errors.
# Break out of the loop so that we can update the
# original file.
break
finally:
os.unlink(temp_file.name)
if len(new_content) <= available_bytes:
# File has sufficient space allocated in the YAML area.
write_edited_yaml(path, new_content, available_bytes)
elif not contains_blocks:
# File does not have sufficient space, but there are
# no binary blocks, so we can just expand the file.
write_edited_yaml(path, new_content, len(new_content))
else:
# File does not have sufficient space, and binary blocks
# are present.
print("Warning: updated YAML larger than allocated space. File must be rewritten.")
choice = request_input("(c)ontinue or (a)bort? ", ["c", "a"])
if choice == "a":
return 1
else:
write_edited_yaml_larger(path, new_content, new_asdf_version)
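# Typical entry point: this function backs an "edit" subcommand, so an
# invocation would look roughly like (CLI name assumed from the
# argument wiring above):
# asdftool edit myfile.asdf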
def parse_asdf_version(content):
"""
Extract the ASDF Standard version from YAML content.
Parameters
----------
content : bytes
Returns
-------
asdf.versioning.AsdfVersion
ASDF Standard version
"""
comments = AsdfFile._read_comment_section(generic_io.get_file(io.BytesIO(content)))
return AsdfFile._find_asdf_version_in_comments(comments)
def parse_yaml_version(content):
"""
Extract the YAML version from YAML content.
Parameters
----------
content : bytes
Returns
-------
bytes
YAML version string.
"""
match = re.search(b"^%YAML (.*)$", content, flags=re.MULTILINE)
if match is None:
raise ValueError("YAML version number not found")
return match.group(1)
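# For example, parse_yaml_version(b"%YAML 1.1\n--- ...") returns b"1.1",
# since the regex captures everything after "%YAML " on that line.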
def print_exception(e):
"""
Print an exception, indented 4 spaces and elided if too many lines.
"""
lines = str(e).split("\n")
if len(lines) > 20:
lines = lines[0:20] + ["..."]
for line in lines:
print(f" {line}")
def request_input(message, choices):
"""
Request user input.
Parameters
----------
message : str
Message to display
choices : list of str
List of recognized inputs
"""
while True:
choice = input(message).strip().lower()
if choice in choices:
return choice
else:
print(f"Invalid choice: {choice}")
def open_editor(path):
"""
Launch an editor process with the file at path opened.
"""
editor = os.environ.get("EDITOR", DEFAULT_EDITOR)
# Marked safe because the editor command is specified by an
# environment variable that the user controls.
subprocess.run(f"{editor} {path}", check=True, shell=True) # nosec
| 31.367568 | 115 | 0.598397 | 1,439 | 11,606 | 4.703961 | 0.23975 | 0.023637 | 0.014478 | 0.014182 | 0.29118 | 0.238144 | 0.192938 | 0.148323 | 0.096321 | 0.08923 | 0 | 0.002507 | 0.312683 | 11,606 | 369 | 116 | 31.452575 | 0.846057 | 0.297519 | 0 | 0.324176 | 0 | 0 | 0.105202 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.06044 | false | 0.005495 | 0.065934 | 0 | 0.208791 | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d8f90fb39051b74c0923032a78ca2ae8b3fdb2 | 12,765 | py | Python | Manually_based_detector.py | PeterHedley94/viz_eda | b4a80fc1382ed75bef80b675b9cca1cf86fe2b7f | [
"MIT"
] | 5 | 2020-12-02T12:24:45.000Z | 2021-10-02T18:58:30.000Z | Manually_based_detector.py | PeterHedley94/viz_eda | b4a80fc1382ed75bef80b675b9cca1cf86fe2b7f | [
"MIT"
] | 7 | 2021-02-01T11:17:32.000Z | 2022-03-12T00:58:30.000Z | Manually_based_detector.py | Recycleye/viz_eda | b4a80fc1382ed75bef80b675b9cca1cf86fe2b7f | [
"MIT"
] | 3 | 2021-12-17T12:56:09.000Z | 2022-01-27T08:36:57.000Z | import json
import os
import cv2
import numpy as np
import pandas as pd
from pycocotools.coco import COCO
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor
from sklearn.preprocessing import StandardScaler
from anomaly_detector import create_destination
from crop_utils import batch_crop_images
from anomaly_analysis.anomaly_feature_extraction import get_roughness, get_histograms, \
get_obj_colors, get_proportion
def form_crop_image(image_id, annotation_id, cat_id, coco, image_path, crop_destination_path):
"""
:param image_id:{int or str} image_id of the image
:param annotation_id:{int or str} annotation id of the image
:param cat_id:{int or str} category id of image
:param coco:{coco format} coco dataset
:param image_path:{str} directory of the image
:param crop_destination_path:{str} directory to contain the cropped image
:return:
crop_image_filename: {str} the filename of the cropped image
"""
#create the filename
crop_image_filename = f"{image_id}_{annotation_id}_{cat_id}.jpg"
#check if the image has already been cropped
#if not then crop
if not os.path.exists(os.path.join(crop_destination_path, crop_image_filename)):
batch_crop_images(coco, img_ids=[image_id], img_source=image_path,
img_destination=crop_destination_path, proportion=0.005)
return crop_image_filename
def combine_feature_dataset(annotation_file, img_folder, intermediate_rlt_path, cat_name=[]):
"""
:param annotation_file:{str} path to JSON coco-style annotation file
:param img_folder:{str} path to folder containing images corresponding to annotation_file
:param intermediate_rlt_path:{str} path to hold intermediate result
:param cat_name:{list of str} categories needed to be analyzed
:return
feature_dataset:{pd.dataframe} final feature dataframe
"""
coco_data = COCO(annotation_file)
# detect anomaly in every category
for idx, cat in enumerate(cat_name):
cat = [cat]
print(cat[0] + ": " + str(idx + 1) + "/" + str(len(cat_name)))
# get all cat_ids,img_ids,ann_ids in this cat
cat_ids = coco_data.getCatIds(catNms=cat)
img_ids = coco_data.getImgIds(catIds=cat_ids)
ann_ids = coco_data.getAnnIds(catIds=cat_ids)
num_objs = len(ann_ids)
print("the number of objects is:", num_objs)
num_imgs = len(img_ids)
print("the number of images is:", num_imgs)
#cropped all images
create_destination(intermediate_rlt_path)
crop_destination_path = os.path.join(intermediate_rlt_path, "crop_bbox_images")
create_destination(crop_destination_path)
croped_image = []
croped_ann_id = []
croped_image_id = []
for imgid in img_ids:
all_ann_ids = coco_data.getAnnIds(imgIds=imgid, catIds=cat_ids, iscrowd=0)
objs = coco_data.loadAnns(ids=all_ann_ids)
for obj in objs:
form_crop_image(image_id=imgid, annotation_id=obj['id'], cat_id=obj['category_id'], coco=coco_data,
image_path=img_folder, crop_destination_path=crop_destination_path)
if os.path.exists(os.path.join(crop_destination_path,
str(obj['image_id']) + "_" + str(obj['id']) + "_" + str(
obj['category_id']) + ".jpg")):
img = cv2.imread(os.path.join(crop_destination_path,
str(obj['image_id']) + "_" + str(obj['id']) + "_" + str(
obj['category_id']) + ".jpg"))
croped_ann_id.append(obj['id'])
croped_image_id.append(obj['image_id'])
croped_image.append(img)
print("Getting average area...")
avg_area, area_data = get_proportion(cat_ids, croped_image_id, coco_data, croped_ann_id)
print("Getting average roughness of segmentation...")
avg_roughness, roughness = get_roughness(cat_ids, croped_image_id, coco_data, croped_ann_id)
print("Merge the dataset")
feature_dataset = area_data.merge(roughness, left_on='annID', right_on='annID')
assert (feature_dataset["imgID_x"].equals(feature_dataset["imgID_y"]) == True)
print("Getting colorHist !")
color_hist = get_histograms(croped_image, croped_ann_id, hist_size=16, hist_range=(0, 256), acc=False)
feature_dataset = feature_dataset.merge(color_hist, left_on="annID", right_on="annID")
print("Getting object color vector!")
obj_color_feature = get_obj_colors(croped_image, croped_ann_id)
feature_dataset = feature_dataset.merge(obj_color_feature, left_on="annID", right_on="annID")
#put ground truth in the feature dataset
objs = coco_data.loadAnns(ids=croped_ann_id)
anomaly_label = []
for obj in objs:
anomaly_label.append(obj["anomaly"])
feature_dataset["label"] = anomaly_label
return feature_dataset
def get_outliers(feature_dataset, nn=30, contam=0.05):
"""
:param feature_dataset: {pd.dataframe} feature dataset generated by combine_feature_dataset
:param nn:{int} number of neighbours used in LOF outlier detection
:param contam:{double} estimated percentage of outliers/anomalies in the given dataset
:return:
results:{pd.dataframe} df containing annID, the LOF and isolation forest labels (-1 for outlier, 1 for inlier),
and the negative outlier factor / anomaly score from both algorithms for all objects
var :{float} total explained variance ratio retained by PCA
"""
#expand the feature
results = pd.DataFrame()
train=feature_dataset
train=train.drop(["annID","imgID_x","imgID_y","label"], axis=1)
red = train['red'].apply(pd.Series)
# prefix each expanded red-histogram column with 'red_'
red = red.rename(columns=lambda x: 'red_' + str(x))
# concatenate the expanded columns back and drop the original list column
train = pd.concat([train[:], red[:]], axis=1)
train = train.drop(["red"], axis=1)
blue = train['blue'].apply(pd.Series)
# prefix each expanded blue-histogram column with 'blue_'
blue = blue.rename(columns=lambda x: 'blue_' + str(x))
# concatenate the expanded columns back and drop the original list column
train = pd.concat([train[:], blue[:]], axis=1)
train = train.drop(["blue"], axis=1)
green = train['green'].apply(pd.Series)
# prefix each expanded green-histogram column with 'green_'
green = green.rename(columns=lambda x: 'green_' + str(x))
# concatenate the expanded columns back and drop the original list column
train = pd.concat([train[:], green[:]], axis=1)
train = train.drop(["green"], axis=1)
#lof algorithm
n_component=0.8
if train.shape[0]==1:
results["lof"] = [1]
results["iforest"]=[1]
results["lof_negative_outlier_factor"]=[-100]
results["iforest_negative_outlier_factor"]=[-1]
results["annID"] = feature_dataset["annID"]
return results,1
# nomalising before pca
scaler = StandardScaler()
train_s = scaler.fit_transform(train)
#perform PCA
pca = PCA(n_components=n_component)
final_train = pd.DataFrame(pca.fit_transform(train_s))
explained_variance = np.sum(pca.explained_variance_ratio_)
if final_train.shape[0] < nn:
nn = final_train.shape[0]
#lof
lof = LocalOutlierFactor(n_neighbors=nn, contamination=contam)
results["lof"] = lof.fit_predict(final_train)
results["lof_negative_outlier_factor"] = lof.negative_outlier_factor_
# isolation forest algorithm
rng = np.random.RandomState(42)
#scikit-learn compatibility: newer releases require the "behaviour" argument of IsolationForest to be set explicitly
iforest = IsolationForest(n_estimators=100, contamination=contam, random_state=rng,behaviour="new").fit(final_train)
results["iforest"] = iforest.predict(final_train)
results["iforest_negative_outlier_factor"] = iforest.score_samples(final_train)
results["annID"]=feature_dataset["annID"]
return results,explained_variance
def get_anomalies(predicate, algorithm="lof"):
"""
:param predicate: {pd.dataframe} prediction result generated by get_outlier function
:param algorithm: {str} classification algorithm to get anomalies
:return:
preds: {pd.dataframe} df containing anomalies generated by corresponding algorithm
"""
preds = pd.DataFrame()
preds["annID"] = predicate["annID"]
if algorithm == "iforest":
preds["anomaly"] = predicate["iforest"]
preds["anomaly_score"] = predicate["iforest_negative_outlier_factor"]
else:
preds["anomaly"] = predicate["lof"]
preds["anomaly_score"] = predicate["lof_negative_outlier_factor"]
preds = preds[preds["anomaly"] == -1]
return preds
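# A minimal sketch of the full pipeline above (paths and category are
# illustrative):
# features = combine_feature_dataset("ann.json", "images", "tmp", cat_name=["bottle"])
# preds, variance = get_outliers(features, nn=30, contam=0.05)
# outliers = get_anomalies(preds, algorithm="lof")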
#===== Manually iforest ======
def detect_anomalies_manual_iforest(annotation_path, images_path, intermediate_rlt_path, cat_ids=None):
"""
:param annotation_path:{str} path to JSON coco-style annotation file
:param images_path:{str} path to folder containing images corresponding to annotation_path
:param intermediate_rlt_path:{str} path to hold intermediate result
"""
#generate output json path
anomaly_path = "output/output_manually_isolationforest.json"
if os.path.exists(anomaly_path):
with open(anomaly_path, 'r') as ano_f:
anomalies = json.load(ano_f)
return anomalies
#extract all category names
cocoData = COCO(annotation_path)
cats = cocoData.loadCats(cocoData.getCatIds())
names = [cat["name"] for cat in cats]
class_result = []
#begin to analysis
for idx, cat in enumerate(names):
cat = [cat]
print(cat[0] + ": " + str(idx + 1) + "/" + str(len(names)))
#generate feature dataset
feature_dataset = combine_feature_dataset(annotation_file=annotation_path, img_folder=images_path,
intermediate_rlt_path=intermediate_rlt_path, cat_name=cat)
#predict anomalies
print("Getting abnormal objects...")
preds_df, var = get_outliers(feature_dataset, contam=0.05)
#get anomalies according to the used algorithm
algorithm = 'iforest'
anomalies = get_anomalies(preds_df, algorithm)
#store result in json
for index, row in anomalies.iterrows():
anomaly = {
"id": int(row["annID"]),
"variance": var,
"anomaly_score": float(row["anomaly_score"])
}
class_result.append(anomaly)
with open(anomaly_path, 'w+') as outfile:
json.dump(class_result, outfile)
return class_result
def detect_anomalies_manual_lof(annotation_path, images_path, intermediate_rlt_path, cat_ids=None):
"""
:param annotation_path:{str} path to JSON coco-style annotation file
:param images_path:{str} path to folder containing images corresponding to annotation_path
:param intermediate_rlt_path:{str} path to hold intermediate result
"""
#generate output json path
anomaly_path = "output/output_manually_lof.json"
if os.path.exists(anomaly_path):
with open(anomaly_path, 'r') as ano_f:
anomalies = json.load(ano_f)
return anomalies
#extract all category names
cocoData = COCO(annotation_path)
cats = cocoData.loadCats(cocoData.getCatIds())
names = [cat["name"] for cat in cats]
class_result = []
for idx, cat in enumerate(names):
cat = [cat]
print(cat[0] + ": " + str(idx + 1) + "/" + str(len(names)))
#generate feature dataset
feature_dataset = combine_feature_dataset(annotation_file=annotation_path, img_folder=images_path,
intermediate_rlt_path=intermediate_rlt_path, cat_name=cat)
#predict anomalies
preds_df, var = get_outliers(feature_dataset, contam=0.05)
#get anomalies according to the used algorithm
algorithm = 'lof'
anomalies = get_anomalies(preds_df, algorithm)
#store result in json
for index, row in anomalies.iterrows():
anomaly = {
"id": int(row["annID"]),
"variance": var,
"anomaly_score": float(row["anomaly_score"])
}
class_result.append(anomaly)
with open(anomaly_path, 'w+') as outfile:
json.dump(class_result, outfile)
return class_result
if __name__ == "__main__":
annotation_file = "VOC_COCO/annotations/voc_add_anomaly.json"
image_folder = "VOC_COCO/images"
intermediate_path = "output/intermediate"
detect_anomalies_manual_iforest(annotation_file, image_folder, intermediate_path)
| 38.448795 | 120 | 0.662201 | 1,607 | 12,765 | 5.028625 | 0.172371 | 0.046776 | 0.028214 | 0.009652 | 0.455884 | 0.373592 | 0.359114 | 0.348224 | 0.33486 | 0.328301 | 0 | 0.005935 | 0.234391 | 12,765 | 331 | 121 | 38.564955 | 0.820935 | 0.221935 | 0 | 0.289617 | 0 | 0 | 0.10548 | 0.033787 | 0 | 0 | 0 | 0 | 0.005464 | 1 | 0.032787 | false | 0 | 0.071038 | 0 | 0.142077 | 0.060109 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57d96d1d01e7cac1022bf09907e3e5e8a364e1a1 | 1,174 | py | Python | download_from_alphavantage_daily/securities_info.py | wispwisp/finances | cccc6077701f97c233b9aaae73c6d7e2b3c5734b | [
"MIT"
] | null | null | null | download_from_alphavantage_daily/securities_info.py | wispwisp/finances | cccc6077701f97c233b9aaae73c6d7e2b3c5734b | [
"MIT"
] | null | null | null | download_from_alphavantage_daily/securities_info.py | wispwisp/finances | cccc6077701f97c233b9aaae73c6d7e2b3c5734b | [
"MIT"
] | null | null | null | import pandas as pd
import datetime
import urllib
import json
import apikey
def apply_request(req_str: str):
""" Request to API. Returns json
"""
assert isinstance(req_str, str)
http = urllib.request.urlopen(req_str)
return json.loads(http.read().decode('utf-8'))
def daily_prices_pattern(symbol, apikey=apikey.apikey):
""" String request pattern, formatted by symbol and apikey
- Data for security by its symbol (OHLC and volume)
"""
return "https://www.alphavantage.co/query?function=GLOBAL_QUOTE&symbol={}&apikey={}".format(
symbol, apikey
)
def get_today_security_info(symbol: str):
assert isinstance(symbol, str)
j = apply_request(daily_prices_pattern(symbol))
date = datetime.datetime.strptime(
j['Global Quote']['07. latest trading day'], "%Y-%m-%d").date()
open_ = float(j['Global Quote']['02. open'])
high = float(j['Global Quote']['03. high'])
low = float(j['Global Quote']['04. low'])
close = float(j['Global Quote']['05. price'])
volume = int(j['Global Quote']['06. volume'])
return (date, open_, high, low, close, volume)
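# A minimal usage sketch (requires a valid Alpha Vantage key in
# apikey.py; the symbol is illustrative):
# date, open_, high, low, close, volume = get_today_security_info("IBM")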
| 26.681818 | 96 | 0.642249 | 156 | 1,174 | 4.737179 | 0.474359 | 0.104195 | 0.097429 | 0.092016 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014024 | 0.210392 | 1,174 | 43 | 97 | 27.302326 | 0.783172 | 0.11925 | 0 | 0 | 0 | 0 | 0.221344 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.125 | false | 0 | 0.208333 | 0 | 0.458333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57da2c529a47750e8d39f62066ecb43dcb0b6615 | 2,290 | py | Python | constants.py | tiago-ferreiraa/QuickUMLS | 8179058b9bc0996dfe9bdbe7c0d95381f55fb552 | [
"MIT"
] | 1 | 2021-04-05T03:51:50.000Z | 2021-04-05T03:51:50.000Z | constants.py | tiago-ferreiraa/QuickUMLS | 8179058b9bc0996dfe9bdbe7c0d95381f55fb552 | [
"MIT"
] | null | null | null | constants.py | tiago-ferreiraa/QuickUMLS | 8179058b9bc0996dfe9bdbe7c0d95381f55fb552 | [
"MIT"
] | 1 | 2018-02-06T19:47:35.000Z | 2018-02-06T19:47:35.000Z | HEADERS_MRCONSO = [
'cui', 'lat', 'ts', 'lui', 'stt', 'sui', 'ispref', 'aui', 'saui',
'scui', 'sdui', 'sab', 'tty', 'code', 'str', 'srl', 'suppress', 'cvf'
]
HEADERS_MRSTY = [
'cui', 'sty', 'hier', 'desc', 'sid', 'num'
]
NEGATIONS = {'none', 'non', 'neither', 'nor', 'no', 'not'}
ACCEPTED_SEMTYPES = {
'T029', # Body Location or Region
'T023', # Body Part, Organ, or Organ Component
'T031', # Body Substance
'T060', # Diagnostic Procedure
'T047', # Disease or Syndrome
'T074', # Medical Device
'T200', # Clinical Drug
'T203', # Drug Delivery Device
'T033', # Finding
'T184', # Sign or Symptom
'T034', # Laboratory or Test Result
'T058', # Health Care Activity
'T059', # Laboratory Procedure
'T037', # Injury or Poisoning
'T061', # Therapeutic or Preventive Procedure
'T048', # Mental or Behavioral Dysfunction
'T046', # Pathologic Function
'T121', # Pharmacologic Substance
'T201', # Clinical Attribute
'T130', # Indicator, Reagent, or Diagnostic Aid
'T195', # Antibiotic
'T039', # Physiologic Function
'T040', # Organism Function
'T041', # Mental Process
'T170', # Intellectual Product
'T191' # Neoplastic Process
}
UNICODE_DASHES = {
u'\u002d', u'\u007e', u'\u00ad', u'\u058a', u'\u05be', u'\u1400',
u'\u1806', u'\u2010', u'\u2011', u'\u2010', u'\u2012', u'\u2013',
u'\u2014', u'\u2015', u'\u2053', u'\u207b', u'\u2212', u'\u208b',
u'\u2212', u'\u2212', u'\u2e17', u'\u2e3a', u'\u2e3b', u'\u301c',
u'\u3030', u'\u30a0', u'\ufe31', u'\ufe32', u'\ufe58', u'\ufe63',
u'\uff0d'
}
LANGUAGES = {
'BAQ', # Basque
'CHI', # Chinese
'CZE', # Czech
'DAN', # Danish
'DUT', # Dutch
'ENG', # English
'EST', # Estonian
'FIN', # Finnish
'FRE', # French
'GER', # German
'GRE', # Greek
'HEB', # Hebrew
'HUN', # Hungarian
'ITA', # Italian
'JPN', # Japanese
'KOR', # Korean
'LAV', # Latvian
'NOR', # Norwegian
'POL', # Polish
'POR', # Portuguese
'RUS', # Russian
'SCR', # Croatian
'SPA', # Spanish
'SWE', # Swedish
'TUR', # Turkish
}
| 29.74026 | 73 | 0.524454 | 248 | 2,290 | 4.826613 | 0.762097 | 0.015038 | 0.017544 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107034 | 0.286026 | 2,290 | 76 | 74 | 30.131579 | 0.625076 | 0.327511 | 0 | 0 | 0 | 0 | 0.317418 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57da3403bc8f8476be6108328d73449adeabe923 | 2,147 | py | Python | pymcxray/FileFormat/Results/test_XraySpectraSpecimen.py | drix00/pymcxray | bf650aa0f31c635040a6cb79fe1cb7ecf27b8990 | [
"Apache-2.0"
] | 1 | 2020-07-23T12:13:30.000Z | 2020-07-23T12:13:30.000Z | pymcxray/FileFormat/Results/test_XraySpectraSpecimen.py | drix00/pymcxray | bf650aa0f31c635040a6cb79fe1cb7ecf27b8990 | [
"Apache-2.0"
] | 3 | 2017-03-05T16:09:30.000Z | 2017-03-05T16:11:41.000Z | pymcxray/FileFormat/Results/test_XraySpectraSpecimen.py | drix00/pymcxray | bf650aa0f31c635040a6cb79fe1cb7ecf27b8990 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
"""
.. py:currentmodule:: FileFormat.Results.test_XraySpectraSpecimen
.. moduleauthor:: Hendrix Demers <hendrix.demers@mail.mcgill.ca>
Tests for the module `XraySpectraSpecimen`.
"""
# Script information for the file.
__author__ = "Hendrix Demers (hendrix.demers@mail.mcgill.ca)"
__version__ = ""
__date__ = ""
__copyright__ = "Copyright (c) 2012 Hendrix Demers"
__license__ = ""
# Standard library modules.
import unittest
import logging
# Third party modules.
# Local modules.
from pymcxray import get_current_module_path
# Project modules
import pymcxray.FileFormat.Results.XraySpectraSpecimen as XraySpectraSpecimen
# Globals and constants variables.
class TestXraySpectraSpecimen(unittest.TestCase):
"""
TestCase class for the module `XraySpectraSpecimen`.
"""
def setUp(self):
"""
Setup method.
"""
unittest.TestCase.setUp(self)
def tearDown(self):
"""
Teardown method.
"""
unittest.TestCase.tearDown(self)
def testSkeleton(self):
"""
First test to check if the testcase is working with the testing framework.
"""
#self.fail("Test if the testcase is working.")
self.assertTrue(True)
def test_read(self):
"""
Tests for method `read`.
"""
spectrumFile = XraySpectraSpecimen.XraySpectraSpecimen()
spectrumFile.path = get_current_module_path(__file__, "../../../test_data/results")
spectrumFile.basename = "ExperimentalSpectraMCXRay_Au100T250000A_E200d0keV_N1000e_N21000000X_t600s_w20eV_N64W"
spectrumFile.read()
self.assertEqual(40000, len(spectrumFile.energies_keV))
self.assertEqual(40000, len(spectrumFile.totals))
self.assertEqual(40000, len(spectrumFile.characteristics))
self.assertEqual(40000, len(spectrumFile.backgrounds))
#self.fail("Test if the testcase is working.")
if __name__ == '__main__': #pragma: no cover
logging.getLogger().setLevel(logging.DEBUG)
from pymcxray.Testings import runTestModuleWithCoverage
runTestModuleWithCoverage(__file__)
| 27.177215 | 118 | 0.696786 | 216 | 2,147 | 6.675926 | 0.467593 | 0.045076 | 0.058252 | 0.066574 | 0.214979 | 0.099861 | 0.099861 | 0.047157 | 0 | 0 | 0 | 0.032615 | 0.200279 | 2,147 | 78 | 119 | 27.525641 | 0.807222 | 0.293433 | 0 | 0 | 0 | 0 | 0.141421 | 0.10122 | 0 | 0 | 0 | 0 | 0.172414 | 1 | 0.137931 | false | 0 | 0.172414 | 0 | 0.344828 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57de324fbb9302512e614e71dd115e5bf98de430 | 22,209 | py | Python | ocdsmacaw/ocds.py | odscjames/open-contracting-macaw-alpha | f6a0f1d20c1ebdc9cc7278c0e51294e84a0cea26 | [
"BSD-3-Clause"
] | null | null | null | ocdsmacaw/ocds.py | odscjames/open-contracting-macaw-alpha | f6a0f1d20c1ebdc9cc7278c0e51294e84a0cea26 | [
"BSD-3-Clause"
] | null | null | null | ocdsmacaw/ocds.py | odscjames/open-contracting-macaw-alpha | f6a0f1d20c1ebdc9cc7278c0e51294e84a0cea26 | [
"BSD-3-Clause"
] | null | null | null | import re
import json
import collections
import ocdsmacaw.common.tools as tools
from ocdsmacaw.common.common import common_checks_context, get_additional_codelist_values
from django.utils.html import mark_safe, escape, conditional_escape, format_html
import commonmark
import bleach
validation_error_lookup = {
'date-time': mark_safe('Incorrect date format. Dates should use the form YYYY-MM-DDT00:00:00Z. Learn more about <a href="http://standard.open-contracting.org/latest/en/schema/reference/#date">dates in OCDS</a>.'),
}
@tools.ignore_errors
def get_releases_aggregates(json_data):
release_count = 0
unique_ocids = set()
tags = collections.Counter()
unique_lang = set()
unique_initation_type = set()
unique_release_ids = set()
duplicate_release_ids = set()
# for matching with contracts
unique_award_id = set()
planning_ocids = set()
tender_ocids = set()
awardid_ocids = set()
award_ocids = set()
contractid_ocids = set()
contract_ocids = set()
implementation_contractid_ocids = set()
implementation_ocids = set()
release_dates = []
tender_dates = []
award_dates = []
contract_dates = []
unique_buyers_identifier = dict()
unique_buyers_name_no_id = set()
unique_suppliers_identifier = dict()
unique_suppliers_name_no_id = set()
unique_procuring_identifier = dict()
unique_procuring_name_no_id = set()
unique_tenderers_identifier = dict()
unique_tenderers_name_no_id = set()
unique_organisation_schemes = set()
organisation_identifier_address = set()
organisation_name_no_id_address = set()
organisation_identifier_contact_point = set()
organisation_name_no_id_contact_point = set()
release_tender_item_ids = set()
release_award_item_ids = set()
release_contract_item_ids = set()
item_identifier_schemes = set()
unique_currency = set()
planning_doctype = collections.Counter()
planning_doc_count = 0
tender_doctype = collections.Counter()
tender_doc_count = 0
tender_milestones_doctype = collections.Counter()
tender_milestones_doc_count = 0
award_doctype = collections.Counter()
award_doc_count = 0
contract_doctype = collections.Counter()
contract_doc_count = 0
implementation_doctype = collections.Counter()
implementation_doc_count = 0
implementation_milestones_doctype = collections.Counter()
implementation_milestones_doc_count = 0
def process_org(org, unique_id, unique_name):
identifier = org.get('identifier')
org_id = None
if identifier:
org_id = identifier.get('id')
if org_id:
unique_id[org_id] = org.get('name', '') or ''
scheme = identifier.get('scheme')
if scheme:
unique_organisation_schemes.add(scheme)
if org.get('address'):
organisation_identifier_address.add(org_id)
if org.get('contactPoint'):
organisation_identifier_contact_point.add(org_id)
if not org_id:
name = org.get('name')
if name:
unique_name.add(name)
if org.get('address'):
organisation_name_no_id_address.add(name)
if org.get('contactPoint'):
organisation_name_no_id_contact_point.add(name)
def get_item_scheme(item):
classification = item.get('classification')
if classification:
scheme = classification.get('scheme')
if scheme:
item_identifier_schemes.add(scheme)
releases = tools.get_no_exception(json_data, 'releases', [])
for release in releases:
# ### Release Section ###
release_count = release_count + 1
ocid = release.get('ocid')
release_id = release.get('id')
if not ocid:
continue
if release_id:
if release_id in unique_release_ids:
duplicate_release_ids.add(release_id)
unique_release_ids.add(release_id)
unique_ocids.add(release['ocid'])
if 'tag' in release:
tags.update(tools.to_list(release['tag']))
initiation_type = release.get('initiationType')
if initiation_type:
unique_initation_type.add(initiation_type)
release_date = release.get('date', '')
if release_date:
release_dates.append(str(release_date))
if 'language' in release:
unique_lang.add(release['language'])
buyer = release.get('buyer')
if buyer:
process_org(buyer, unique_buyers_identifier, unique_buyers_name_no_id)
# ### Planning Section ###
planning = tools.get_no_exception(release, 'planning', {})
if planning and isinstance(planning, dict):
planning_ocids.add(ocid)
planning_doc_count += tools.update_docs(planning, planning_doctype)
# ### Tender Section ###
tender = tools.get_no_exception(release, 'tender', {})
if tender and isinstance(tender, dict):
tender_ocids.add(ocid)
tender_doc_count += tools.update_docs(tender, tender_doctype)
tender_period = tender.get('tenderPeriod')
if tender_period:
start_date = tender_period.get('startDate', '')
if start_date:
tender_dates.append(str(start_date))
procuring_entity = tender.get('procuringEntity')
if procuring_entity:
process_org(procuring_entity, unique_procuring_identifier, unique_procuring_name_no_id)
tenderers = tender.get('tenderers', [])
for tenderer in tenderers:
process_org(tenderer, unique_tenderers_identifier, unique_tenderers_name_no_id)
tender_items = tender.get('items', [])
for item in tender_items:
item_id = item.get('id')
if item_id and release_id:
release_tender_item_ids.add((ocid, release_id, item_id))
get_item_scheme(item)
milestones = tender.get('milestones')
if milestones:
for milestone in milestones:
tender_milestones_doc_count += tools.update_docs(milestone, tender_milestones_doctype)
# ### Award Section ###
awards = tools.get_no_exception(release, 'awards', [])
for award in awards:
if not isinstance(award, dict):
continue
award_id = award.get('id')
award_ocids.add(ocid)
if award_id:
unique_award_id.add(award_id)
awardid_ocids.add((award_id, ocid))
award_date = award.get('date', '')
if award_date:
award_dates.append(str(award_date))
award_items = award.get('items', [])
for item in award_items:
item_id = item.get('id')
if item_id and release_id and award_id:
release_award_item_ids.add((ocid, release_id, award_id, item_id))
get_item_scheme(item)
suppliers = award.get('suppliers', [])
for supplier in suppliers:
process_org(supplier, unique_suppliers_identifier, unique_suppliers_name_no_id)
award_doc_count += tools.update_docs(award, award_doctype)
# ### Contract section
contracts = tools.get_no_exception(release, 'contracts', [])
for contract in contracts:
contract_id = contract.get('id')
contract_ocids.add(ocid)
if contract_id:
contractid_ocids.add((contract_id, ocid))
period = contract.get('period')
if period:
start_date = period.get('startDate', '')
if start_date:
contract_dates.append(start_date)
contract_items = contract.get('items', [])
for item in contract_items:
item_id = item.get('id')
if item_id and release_id and contract_id:
release_contract_item_ids.add((ocid, release_id, contract_id, item_id))
get_item_scheme(item)
contract_doc_count += tools.update_docs(contract, contract_doctype)
implementation = contract.get('implementation')
if implementation:
implementation_ocids.add(ocid)
if contract_id:
implementation_contractid_ocids.add((contract_id, ocid))
implementation_doc_count += tools.update_docs(implementation, implementation_doctype)
implementation_milestones = implementation.get('milestones', [])
for milestone in implementation_milestones:
implementation_milestones_doc_count += tools.update_docs(milestone, implementation_milestones_doctype)
contracts_without_awards = []
for release in releases:
contracts = release.get('contracts', [])
for contract in contracts:
award_id = contract.get('awardID')
if award_id not in unique_award_id:
contracts_without_awards.append(contract)
unique_buyers_count = len(unique_buyers_identifier) + len(unique_buyers_name_no_id)
unique_buyers = [name + ' (' + str(id) + ')' for id, name in unique_buyers_identifier.items()] + list(unique_buyers_name_no_id)
unique_suppliers_count = len(unique_suppliers_identifier) + len(unique_suppliers_name_no_id)
unique_suppliers = [name + ' (' + str(id) + ')' for id, name in unique_suppliers_identifier.items()] + list(unique_suppliers_name_no_id)
unique_procuring_count = len(unique_procuring_identifier) + len(unique_procuring_name_no_id)
unique_procuring = [name + ' (' + str(id) + ')' for id, name in unique_procuring_identifier.items()] + list(unique_procuring_name_no_id)
unique_tenderers_count = len(unique_tenderers_identifier) + len(unique_tenderers_name_no_id)
unique_tenderers = [name + ' (' + str(id) + ')' for id, name in unique_tenderers_identifier.items()] + list(unique_tenderers_name_no_id)
unique_org_identifier_count = len(set(unique_buyers_identifier) |
set(unique_suppliers_identifier) |
set(unique_procuring_identifier) |
set(unique_tenderers_identifier))
unique_org_name_count = len(unique_buyers_name_no_id |
unique_suppliers_name_no_id |
unique_procuring_name_no_id |
unique_tenderers_name_no_id)
unique_org_count = unique_org_identifier_count + unique_org_name_count
def get_currencies(obj):
if isinstance(obj, dict):
for key, value in obj.items():
if key == 'currency':
unique_currency.add(value)
get_currencies(value)
if isinstance(obj, list):
for item in obj:
get_currencies(item)
get_currencies(json_data)
return dict(
release_count=release_count,
unique_ocids=sorted(unique_ocids, key=lambda x: str(x)),
unique_initation_type=sorted(unique_initation_type, key=lambda x: str(x)),
duplicate_release_ids=sorted(duplicate_release_ids, key=lambda x: str(x)),
tags=dict(tags),
unique_lang=sorted(unique_lang, key=lambda x: str(x)),
unique_award_id=sorted(unique_award_id, key=lambda x: str(x)),
planning_count=len(planning_ocids),
tender_count=len(tender_ocids),
award_count=len(awardid_ocids),
processes_award_count=len(award_ocids),
contract_count=len(contractid_ocids),
processes_contract_count=len(contract_ocids),
implementation_count=len(implementation_contractid_ocids),
processes_implementation_count=len(implementation_ocids),
min_release_date=min(release_dates) if release_dates else '',
max_release_date=max(release_dates) if release_dates else '',
min_tender_date=min(tender_dates) if tender_dates else '',
max_tender_date=max(tender_dates) if tender_dates else '',
min_award_date=min(award_dates) if award_dates else '',
max_award_date=max(award_dates) if award_dates else '',
min_contract_date=min(contract_dates) if contract_dates else '',
max_contract_date=max(contract_dates) if contract_dates else '',
unique_buyers_identifier=unique_buyers_identifier,
unique_buyers_name_no_id=sorted(unique_buyers_name_no_id, key=lambda x: str(x)),
unique_suppliers_identifier=unique_suppliers_identifier,
unique_suppliers_name_no_id=sorted(unique_suppliers_name_no_id, key=lambda x: str(x)),
unique_procuring_identifier=unique_procuring_identifier,
unique_procuring_name_no_id=sorted(unique_procuring_name_no_id, key=lambda x: str(x)),
unique_tenderers_identifier=unique_tenderers_identifier,
unique_tenderers_name_no_id=sorted(unique_tenderers_name_no_id, key=lambda x: str(x)),
unique_buyers=sorted(set(unique_buyers)),
unique_suppliers=sorted(set(unique_suppliers)),
unique_procuring=sorted(set(unique_procuring)),
unique_tenderers=sorted(set(unique_tenderers)),
unique_buyers_count=unique_buyers_count,
unique_suppliers_count=unique_suppliers_count,
unique_procuring_count=unique_procuring_count,
unique_tenderers_count=unique_tenderers_count,
unique_org_identifier_count=unique_org_identifier_count,
unique_org_name_count=unique_org_name_count,
unique_org_count=unique_org_count,
unique_organisation_schemes=sorted(unique_organisation_schemes, key=lambda x: str(x)),
organisations_with_address=len(organisation_identifier_address) + len(organisation_name_no_id_address),
organisations_with_contact_point=len(organisation_identifier_contact_point) + len(organisation_name_no_id_contact_point),
total_item_count=len(release_tender_item_ids) + len(release_award_item_ids) + len(release_contract_item_ids),
tender_item_count=len(release_tender_item_ids),
award_item_count=len(release_award_item_ids),
contract_item_count=len(release_contract_item_ids),
item_identifier_schemes=sorted(item_identifier_schemes, key=lambda x: str(x)),
unique_currency=sorted(unique_currency, key=lambda x: str(x)),
planning_doc_count=planning_doc_count,
tender_doc_count=tender_doc_count,
tender_milestones_doc_count=tender_milestones_doc_count,
award_doc_count=award_doc_count,
contract_doc_count=contract_doc_count,
implementation_doc_count=implementation_doc_count,
implementation_milestones_doc_count=implementation_milestones_doc_count,
planning_doctype=dict(planning_doctype),
tender_doctype=dict(tender_doctype),
tender_milestones_doctype=dict(tender_milestones_doctype),
award_doctype=dict(award_doctype),
contract_doctype=dict(contract_doctype),
implementation_doctype=dict(implementation_doctype),
implementation_milestones_doctype=dict(implementation_milestones_doctype),
contracts_without_awards=contracts_without_awards,
)
def _lookup_schema(schema, path, ref_info=None):
if len(path) == 0:
return schema, ref_info
if hasattr(schema, '__reference__'):
ref_info = {
'path': path,
'reference': schema.__reference__,
}
path_item, *child_path = path
    if 'items' in schema:
        return _lookup_schema(schema['items'], path, ref_info)
    elif 'properties' in schema:
        if path_item in schema['properties']:
            return _lookup_schema(schema['properties'][path_item], child_path, ref_info)
    # Neither branch matched (or the property was missing): always return a
    # (None, None) pair so callers unpacking two values do not fail.
    return None, None
def lookup_schema(schema, path):
return _lookup_schema(schema, path.split('/'))
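# Illustration (editor's addition): common_checks_ocds below calls this as, e.g.,
#   lookup_schema(schema_obj.get_release_pkg_schema_obj(deref=True),
#                 'releases/buyer/name')
# which descends through 'items'/'properties' in the dereferenced schema and
# returns the sub-schema for the buyer name, plus info about any $ref crossed
# along the way.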
def common_checks_ocds(context, upload_dir, json_data, schema_obj, api=False, cache=True):
schema_name = schema_obj.release_pkg_schema_name
if 'records' in json_data:
schema_name = schema_obj.record_pkg_schema_name
common_checks = common_checks_context(upload_dir, json_data, schema_obj, schema_name, context,
fields_regex=True, api=api, cache=cache)
validation_errors = common_checks['context']['validation_errors']
new_validation_errors = []
for (json_key, values) in validation_errors:
error = json.loads(json_key)
new_message = validation_error_lookup.get(error['message_type'])
if new_message:
error['message_safe'] = conditional_escape(new_message)
else:
if 'message_safe' in error:
error['message_safe'] = mark_safe(error['message_safe'])
else:
error['message_safe'] = conditional_escape(error['message'])
schema_block, ref_info = lookup_schema(schema_obj.get_release_pkg_schema_obj(deref=True), error['path_no_number'])
if schema_block and error['message_type'] != 'required':
if 'description' in schema_block:
error['schema_title'] = escape(schema_block.get('title', ''))
error['schema_description_safe'] = mark_safe(bleach.clean(
commonmark.commonmark(schema_block['description']),
tags=bleach.sanitizer.ALLOWED_TAGS + ['p']
))
if ref_info:
ref = ref_info['reference']['$ref']
if ref.endswith('release-schema.json'):
ref = ''
else:
ref = ref.strip('#')
ref_path = '/'.join(ref_info['path'])
schema = 'release-schema.json'
else:
ref = ''
ref_path = error['path_no_number']
schema = 'release-package-schema.json'
error['docs_ref'] = format_html('{},{},{}', schema, ref, ref_path)
new_validation_errors.append([json.dumps(error, sort_keys=True), values])
common_checks['context']['validation_errors'] = new_validation_errors
context.update(common_checks['context'])
if schema_name == 'record-package-schema.json':
context['records_aggregates'] = get_records_aggregates(json_data, ignore_errors=bool(validation_errors))
context['schema_url'] = schema_obj.record_pkg_schema_url
else:
additional_codelist_values = get_additional_codelist_values(schema_obj, json_data)
closed_codelist_values = {key: value for key, value in additional_codelist_values.items() if not value['isopen']}
open_codelist_values = {key: value for key, value in additional_codelist_values.items() if value['isopen']}
context.update({
'releases_aggregates': get_releases_aggregates(json_data, ignore_errors=bool(validation_errors)),
'additional_closed_codelist_values': closed_codelist_values,
'additional_open_codelist_values': open_codelist_values
})
context = add_conformance_rule_errors(context, json_data, schema_obj)
return context
@tools.ignore_errors
def get_records_aggregates(json_data):
# Unique ocids
unique_ocids = set()
if 'records' in json_data:
for record in json_data['records']:
# Gather all the ocids
if 'ocid' in record:
unique_ocids.add(record['ocid'])
# Number of records
count = len(json_data['records']) if 'records' in json_data else 0
return {
'count': count,
'unique_ocids': unique_ocids,
}
def get_bad_ocds_prefixes(json_data):
    '''Return a list of tuples ('ocid', 'path/to/ocid') for ocids with malformed prefixes'''
prefix_regex = re.compile(r'^ocds-[a-zA-Z0-9]{6}-')
releases = json_data.get('releases', [])
records = json_data.get('records', [])
bad_prefixes = []
if releases and isinstance(releases, list):
for n_rel, release in enumerate(releases):
if not isinstance(release, dict):
continue
ocid = release.get('ocid', '')
if ocid and isinstance(ocid, str) and not prefix_regex.match(ocid):
bad_prefixes.append((ocid, 'releases/%s/ocid' % n_rel))
elif records and isinstance(records, list):
for n_rec, record in enumerate(records):
if not isinstance(record, dict):
continue
            for n_rel, release in enumerate(record.get('releases', [])):
                if not isinstance(release, dict):
                    continue
                ocid = release.get('ocid', '')
                if ocid and isinstance(ocid, str) and not prefix_regex.match(ocid):
                    bad_prefixes.append((ocid, 'records/%s/releases/%s/ocid' % (n_rec, n_rel)))
compiled_release = record.get('compiledRelease', {})
if compiled_release:
                ocid = compiled_release.get('ocid', '')
                if ocid and isinstance(ocid, str) and not prefix_regex.match(ocid):
                    bad_prefixes.append((ocid, 'records/%s/compiledRelease/ocid' % n_rec))
return bad_prefixes
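# Illustration (editor's addition):
#   get_bad_ocds_prefixes({'releases': [{'ocid': 'xocds-abc123-1'}]})
# returns [('xocds-abc123-1', 'releases/0/ocid')], because the ocid does not
# start with the required 'ocds-' + 6 alphanumerics + '-' prefix.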
def add_conformance_rule_errors(context, json_data, schema_obj):
'''Return context dict augmented with conformance errors if any'''
ocds_prefixes_bad_format = get_bad_ocds_prefixes(json_data)
if ocds_prefixes_bad_format:
ocid_schema_description = schema_obj.get_release_schema_obj()['properties']['ocid']['description']
ocid_info_index = ocid_schema_description.index('For more information')
ocid_description = ocid_schema_description[:ocid_info_index]
ocid_info_url = ocid_schema_description[ocid_info_index:].split('[')[1].split(']')[1][1:-1]
context['conformance_errors'] = {
'ocds_prefixes_bad_format': ocds_prefixes_bad_format,
'ocid_description': ocid_description,
'ocid_info_url': ocid_info_url
}
return context
| 43.547059 | 217 | 0.657301 | 2,590 | 22,209 | 5.269498 | 0.089961 | 0.014947 | 0.01993 | 0.01231 | 0.372729 | 0.258719 | 0.16017 | 0.112984 | 0.062866 | 0.047773 | 0 | 0.001441 | 0.250124 | 22,209 | 509 | 218 | 43.632613 | 0.818062 | 0.013733 | 0 | 0.108491 | 0 | 0.002358 | 0.066523 | 0.013497 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023585 | false | 0 | 0.018868 | 0.002358 | 0.066038 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57dff369aa6167a5a61dcb434fd2d34ccbb54cf4 | 4,893 | py | Python | reservationsystem/views/trips.py | RajaFaizanNazir/bus-booking-system | df895793930535d7c1c50d84393d557be73780f3 | [
"MIT"
] | 1 | 2021-01-21T20:44:53.000Z | 2021-01-21T20:44:53.000Z | reservationsystem/views/trips.py | RajaFaizanNazir/bus-booking-system | df895793930535d7c1c50d84393d557be73780f3 | [
"MIT"
] | 7 | 2020-05-02T09:45:54.000Z | 2021-04-08T20:19:17.000Z | reservationsystem/views/trips.py | RajaFaizanNazir/bus-booking-system | df895793930535d7c1c50d84393d557be73780f3 | [
"MIT"
] | 1 | 2021-11-22T17:21:22.000Z | 2021-11-22T17:21:22.000Z | from rest_framework import permissions, status, serializers
from rest_framework.response import Response
from rest_framework.views import APIView
from reservationsystem.models import BusStation, BusSeat, Trip
from reservationsystem.services.trips import get_all_trips, get_available_seats
class TripListApi(APIView):
"""
List all Trips
"""
permission_classes = [permissions.AllowAny]
class InputSerializer(serializers.Serializer):
"""
Serializer for QueryParams validation on GET /trips
"""
date_from = serializers.DateTimeField()
date_to = serializers.DateTimeField()
busstop_error_message = "invalid param format 'station_end'; use GET '/stations' to retrieve list of stations"
error_messages = {
"does_not_exist": busstop_error_message,
"incorrect_type": busstop_error_message,
}
departure_station = serializers.PrimaryKeyRelatedField(
queryset=BusStation.objects.all(),
error_messages=error_messages
)
arrival_station = serializers.PrimaryKeyRelatedField(
queryset=BusStation.objects.all(),
error_messages=error_messages
)
def validate(self, data):
"""
Check that date_from is before date_to.
"""
if data['date_from'] > data['date_to']:
raise serializers.ValidationError("'date_to' must be later than 'date_from'")
return data
class OutputSerializer(serializers.ModelSerializer):
class Meta:
model = Trip
fields = ['id', 'name', 'departure_time']
def get(self, request, format=None):
"""
List all Trips
"""
# [Step1] retrieve and validate query params from request
query_params = self.InputSerializer(data=request.query_params)
if not query_params.is_valid():
return Response(query_params.errors, status=status.HTTP_400_BAD_REQUEST)
query_params = query_params.validated_data
# [Step2] search for appropriate trips
trips = get_all_trips(
date_from=query_params['date_from'],
date_to=query_params['date_to'],
departure_station=query_params['departure_station'],
arrival_station=query_params['arrival_station'],
)
# [Step3] return results
serializer = self.OutputSerializer(trips, many=True)
return Response(serializer.data)
class TripDetailApi(APIView):
"""
Query availability of Trip
"""
permission_classes = [permissions.AllowAny]
class InputSerializer(serializers.Serializer):
"""
Serializer for QueryParams validation on GET /trips
"""
busstop_error_message = "invalid param format 'station_end'; use GET '/stations' to retrieve list of stations"
error_messages = {
"does_not_exist": busstop_error_message,
"incorrect_type": busstop_error_message,
}
departure_station = serializers.PrimaryKeyRelatedField(
queryset=BusStation.objects.all(),
error_messages=error_messages
)
arrival_station = serializers.PrimaryKeyRelatedField(
queryset=BusStation.objects.all(),
error_messages=error_messages
)
class URLInputSerializer(serializers.Serializer):
"""
Serializer for pk id
"""
trip_error_message = "invalid param format 'trip_id'; use GET '/trips' to retreive list of available trips"
trip = serializers.PrimaryKeyRelatedField(
queryset=Trip.objects.all(),
error_messages={
"does_not_exist": trip_error_message,
"incorrect_type": trip_error_message,
})
class OutputSerializer(serializers.ModelSerializer):
class Meta:
model = BusSeat
fields = ['id', 'name']
def get(self, request, pk, format=None):
# [Step1] retrieve and validate query params from request
query_params = self.InputSerializer(data=request.query_params)
if not query_params.is_valid():
return Response(query_params.errors, status=status.HTTP_400_BAD_REQUEST)
query_params = query_params.validated_data
# [Step2] retrieve and validate url params from request
url_params = self.URLInputSerializer(data={'trip': pk})
if not url_params.is_valid():
return Response(url_params.errors, status=status.HTTP_400_BAD_REQUEST)
url_params = url_params.validated_data
# [Step3] get available seats
available_seats = get_available_seats(**query_params, **url_params)
# [Step4] return results
serializer = self.OutputSerializer(available_seats, many=True)
return Response(serializer.data)
| 35.977941 | 118 | 0.653382 | 500 | 4,893 | 6.164 | 0.218 | 0.067813 | 0.036989 | 0.037313 | 0.589552 | 0.535042 | 0.511681 | 0.472096 | 0.458793 | 0.458793 | 0 | 0.004446 | 0.264459 | 4,893 | 135 | 119 | 36.244444 | 0.851903 | 0.101982 | 0 | 0.464286 | 0 | 0 | 0.111006 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035714 | false | 0 | 0.059524 | 0 | 0.297619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57e29d0d8d3829d997d3e20f0140e84e33ef307d | 1,284 | py | Python | robot-server/robot_server/sessions/dependencies.py | mrakitin/opentrons | d9c7ed23d13cdb62bd1bc397dc2871d4bd5b77e9 | [
"Apache-2.0"
] | null | null | null | robot-server/robot_server/sessions/dependencies.py | mrakitin/opentrons | d9c7ed23d13cdb62bd1bc397dc2871d4bd5b77e9 | [
"Apache-2.0"
] | null | null | null | robot-server/robot_server/sessions/dependencies.py | mrakitin/opentrons | d9c7ed23d13cdb62bd1bc397dc2871d4bd5b77e9 | [
"Apache-2.0"
] | null | null | null | """Session router dependency-injection wire-up."""
from fastapi import Depends
from starlette.datastructures import State
from typing import cast
from opentrons.hardware_control import ThreadManager, API as HardwareAPI
from robot_server.service.dependencies import get_app_state, get_hardware
from .engine_store import EngineStore
from .session_store import SessionStore
_SESSION_STORE_KEY = "session_store"
_ENGINE_STORE_KEY = "engine_store"
def get_session_store(state: State = Depends(get_app_state)) -> SessionStore:
"""Get a singleton SessionStore to keep track of created sessions."""
session_store = getattr(state, _SESSION_STORE_KEY, None)
if session_store is None:
session_store = SessionStore()
setattr(state, _SESSION_STORE_KEY, session_store)
return session_store
def get_engine_store(
state: State = Depends(get_app_state),
hardware: ThreadManager = Depends(get_hardware),
) -> EngineStore:
"""Get a singleton EngineStore to keep track of created engines / runners."""
engine_store = getattr(state, _ENGINE_STORE_KEY, None)
if engine_store is None:
engine_store = EngineStore(hardware_api=cast(HardwareAPI, hardware))
setattr(state, _ENGINE_STORE_KEY, engine_store)
return engine_store
| 31.317073 | 81 | 0.76947 | 164 | 1,284 | 5.737805 | 0.304878 | 0.128587 | 0.035069 | 0.046759 | 0.223167 | 0.070138 | 0.070138 | 0 | 0 | 0 | 0 | 0 | 0.158879 | 1,284 | 40 | 82 | 32.1 | 0.871296 | 0.140187 | 0 | 0 | 0 | 0 | 0.022978 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.291667 | 0 | 0.458333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57ea8458cd9b990679be00301373298b816e1512 | 2,244 | py | Python | data_analysis/scene/detect_faces_video.py | vedb/data_analysis | b46f58ba424680353d3abd0014a7d0a339bf6e6c | [
"MIT"
] | null | null | null | data_analysis/scene/detect_faces_video.py | vedb/data_analysis | b46f58ba424680353d3abd0014a7d0a339bf6e6c | [
"MIT"
] | null | null | null | data_analysis/scene/detect_faces_video.py | vedb/data_analysis | b46f58ba424680353d3abd0014a7d0a339bf6e6c | [
"MIT"
] | null | null | null | """
Created on Tue Jun 16 16:41:07 2020
Usage: python detect_faces_video.py [file_name]
Parameters
----------
file_name: str
    video file name, or 'webcam' (the default) to capture from a camera
@author: KamranBinaee
"""
import cv2
import numpy as np
import argparse
import os
# Set the input arguments to the function and their types
parser = argparse.ArgumentParser(description="Detects faces in a video")
parser.add_argument(
"-file_name", type=str, nargs=1, help="video file name or webcam", default="webcam"
)
# Read the input arguments passed to the function and print them out
args = parser.parse_args()
if args.file_name == "webcam":
print("reading from: Webcam")
cap = cv2.VideoCapture(0)
else:
print("reading from: ", args.file_name[0])
cap = cv2.VideoCapture(os.getcwd() + "/FaceDetectionData/" + args.file_name[0])
cap.set(3, 640) # WIDTH
cap.set(4, 480) # HEIGHT
fps = cap.get(cv2.CAP_PROP_FPS)
print("FPS: ", fps)
video_size = (640, 480)
fourcc = "XVID"
# fourcc = 'FMP4'
out_video = cv2.VideoWriter(
os.getcwd() + "/output_faces.avi", cv2.VideoWriter_fourcc(*fourcc), fps, video_size
)
face_cascade = cv2.CascadeClassifier(
os.getcwd() + "/haarcascade_frontalface_default.xml"
)
eye_cascade = cv2.CascadeClassifier(os.getcwd() + "/haarcascade_eye.xml")
while True:
# Capture frame-by-frame
ret, frame = cap.read()
    if not ret:
        break
# Our operations on the frame come here
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
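    # detectMultiScale(image, scaleFactor, minNeighbors): 1.3 shrinks the
    # search window by ~23% per image-pyramid level; 5 overlapping
    # detections are required before a face is accepted (fewer false positives).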
faces = face_cascade.detectMultiScale(gray, 1.3, 5)
# print(len(faces))
# Display the resulting frame
for (x, y, w, h) in faces:
cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
roi_gray = gray[y : y + h, x : x + w]
roi_color = frame[y : y + h, x : x + w]
eyes = eye_cascade.detectMultiScale(roi_gray)
for (ex, ey, ew, eh) in eyes:
cv2.rectangle(roi_color, (ex, ey), (ex + ew, ey + eh), (0, 255, 0), 2)
cv2.imshow("frame", frame)
out_video.write(cv2.resize(frame, video_size, interpolation=cv2.INTER_AREA))
if cv2.waitKey(1) & 0xFF == ord("q"):
break
# When everything done, release the capture
out_video.release()
cap.release()
cv2.destroyAllWindows()
| 28.405063 | 87 | 0.652406 | 327 | 2,244 | 4.376147 | 0.428135 | 0.044724 | 0.025157 | 0.022362 | 0.095038 | 0.072676 | 0 | 0 | 0 | 0 | 0 | 0.037162 | 0.208556 | 2,244 | 78 | 88 | 28.769231 | 0.768581 | 0.222816 | 0 | 0.041667 | 0 | 0 | 0.122756 | 0.020845 | 0 | 0 | 0.002316 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.083333 | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57ea9ae7db2ecfdc085e49ef66700ffd929aa698 | 13,705 | py | Python | hamiltonian/heisenbergj1j2.py | remmyzen/nqs-tensorflow2 | 2af5d5ebb108eac4d2daa5082bdef11c8107bd1b | [
"MIT"
] | 4 | 2021-07-29T17:52:54.000Z | 2022-02-15T06:32:15.000Z | hamiltonian/heisenbergj1j2.py | remmyzen/nqs-tensorflow2 | 2af5d5ebb108eac4d2daa5082bdef11c8107bd1b | [
"MIT"
] | null | null | null | hamiltonian/heisenbergj1j2.py | remmyzen/nqs-tensorflow2 | 2af5d5ebb108eac4d2daa5082bdef11c8107bd1b | [
"MIT"
] | null | null | null | import tensorflow as tf
from hamiltonian import Hamiltonian
import itertools
import numpy as np
import scipy
import scipy.sparse.linalg
class HeisenbergJ1J2(Hamiltonian):
"""
This class is used to define Heisenberg J1-J2 model.
Nearest neighbor interaction along x-, y- and z-axis with magnitude J_1,
next nearest neighbor interaction along x-, y- and z-axis with magntitude J_2,
nearest neighbor interaction along z-axis with magnitude \Delta,.
$H_{HJ} = J_1 \sum_{<i,j>} (\Delta \sigma^z_i \sigma^z_j + \sigma^y_i \sigma^y_j + \sigma^x_i \sigma^x_j) + J_2 \sum_{<i,j>} (\sigma^z_i \sigma^z_j + \sigma^y_i \sigma^y_j + \sigma^x_i \sigma^x_j)$
"""
def __init__(self, graph, j1=1.0, delta=1.0, j2=1.0, total_sz = None):
"""
Construct an Heisenberg J1-J2 model.
Args:
j1: magnitude of the nearest neighbor interaction along x,y,z-axis
delta: magnitude of the nearest neighbor interaction along z-axis
j2: magnitude of the next nearest neighbor interaction along x,y,z-axis
total_sz: total_sz if we want to restrict the hilbert space
"""
Hamiltonian.__init__(self, graph)
self.j1 = j1
self.delta = delta
self.j2 = j2
self.total_sz = total_sz
def calculate_hamiltonian_matrix(self, samples, num_samples):
"""
Calculate the Hamiltonian matrix $H_{x,x'}$ from a given samples x.
Only non-zero elements are returned.
Args:
samples: The samples
num_samples: number of samples
Return:
The Hamiltonian where the first column contains the diagonal, which is $J_1 \sum_{<i,j>} x_i x_j + J_2 \sum_{<<i,j>>} x_i x_j$.
        The rest of the columns contain the off-diagonal elements, which are (J_1 - x_i * x_j) for
        nearest-neighbour bonds and (J_2 - x_i * x_j) for next-nearest-neighbour bonds.
        Therefore, the number of columns equals the total number of bonds + 1 and the number of rows = num_samples
"""
diagonal = tf.zeros((num_samples,))
off_diagonal = None
for (s, s_2) in self.graph.bonds:
# Diagonal element of the hamiltonian
# $J_1 \sum_{<i,j>} x_i x_j$
diagonal += self.j1 * samples[:,s] * samples[:,s_2]
# Off diagonal element of the hamiltonian
            # $(J_1 - x_i x_j)$
if off_diagonal is None:
off_diagonal = (self.j1 - samples[:,s] * samples[:, s_2])
off_diagonal = tf.reshape(off_diagonal, (num_samples, 1))
else:
temp = (self.j1 - samples[:,s] * samples[:, s_2])
temp = tf.reshape(temp, (num_samples, 1))
off_diagonal = tf.concat((off_diagonal, temp ),
axis=1)
for (s, s_2) in self.graph.bonds_next:
# Diagonal element of the hamiltonian
# $J_2 \sum_{<<i,j>>} x_i x_j$
diagonal += self.j2 * samples[:,s] * samples[:,s_2]
# Off diagonal element of the hamiltonian
            # $(J_2 - x_i x_j)$
if off_diagonal is None:
off_diagonal = (self.j2 - samples[:,s] * samples[:, s_2])
off_diagonal = tf.reshape(off_diagonal, (num_samples, 1))
else:
                temp = (self.j2 - samples[:,s] * samples[:, s_2])
temp = tf.reshape(temp, (num_samples, 1))
off_diagonal = tf.concat((off_diagonal, temp ),
axis=1)
diagonal = tf.reshape(diagonal, (num_samples, 1))
hamiltonian = tf.concat((diagonal, off_diagonal), axis=1)
return hamiltonian
def calculate_ratio(self, samples, model, num_samples):
"""
Calculate the ratio of \Psi(x') and \Psi(x) from a given x
as log(\Psi(x')) - log(\Psi(x))
\Psi is defined in the model.
However, the Hamiltonian determines which x' gives non-zero.
Args:
samples: the samples x
model: the model used to define \Psi
num_samples: the number of samples
Return:
The ratio where the first column contains \Psi(x) / \Psi(x).
The rest of the column contains the non-zero \Psi(x') / \Psi(x).
        In the Heisenberg model, this corresponds to x' where two nearest- or next-nearest-neighbour spins are flipped.
        Therefore, the number of columns equals the total number of bonds + 1 and the number of rows = num_samples
"""
lvd = model.log_val_diff(samples, samples)
for (s, s2) in self.graph.bonds:
## Flip 2 adjacent spin
flipped_s = tf.reshape(samples[:,s] * -1 , (num_samples, 1))
flipped_s2 = tf.reshape(samples[:,s2] * -1, (num_samples, 1))
if s == 0:
new_config = tf.concat((flipped_s, flipped_s2, samples[:,s2+1:]), axis = 1)
elif s2 == self.graph.num_points-1:
new_config = tf.concat((samples[:, :s], flipped_s, flipped_s2), axis = 1)
else:
new_config = tf.concat((samples[:, :s], flipped_s, flipped_s2, samples[:,s2+1:]), axis = 1)
lvd = tf.concat((lvd, model.log_val_diff(new_config, samples)), axis=1)
for (s, s2) in self.graph.bonds_next:
## Flip 2 adjacent spin
flipped_s = tf.reshape(samples[:,s] * -1 , (num_samples, 1))
flipped_s2 = tf.reshape(samples[:,s2] * -1, (num_samples, 1))
## Store the configuration between s and s2
middle_conf = tf.reshape(samples[:,s+1], (num_samples, 1))
if s == 0:
new_config = tf.concat((flipped_s, middle_conf, flipped_s2, samples[:,s2+1:]), axis = 1)
elif s2 == self.graph.num_points-1:
new_config = tf.concat((samples[:, :s], flipped_s, middle_conf, flipped_s2), axis = 1)
else:
new_config = tf.concat((samples[:, :s], flipped_s, middle_conf, flipped_s2, samples[:,s2+1:]), axis = 1)
lvd = tf.concat((lvd, model.log_val_diff(new_config, samples)), axis=1)
return lvd
def diagonalize(self):
"""
Diagonalize hamiltonian with exact diagonalization.
Only works for small systems (<= 10)!
"""
num_particles = self.graph.num_points
## Initialize zeroes hamiltonian
H = np.zeros((2 ** num_particles, 2 ** num_particles), dtype='complex')
## Calculate interaction energy
for i, a in self.graph.bonds:
togg_vect = np.zeros(num_particles)
togg_vect[i] = 1
togg_vect[a] = 1
temp = 1
for j in togg_vect:
if j == 1:
temp = np.kron(temp, self.SIGMA_X)
else:
temp = np.kron(temp, np.identity(2))
H += self.j1 * temp
temp = 1
for j in togg_vect:
if j == 1:
temp = np.kron(temp, self.SIGMA_Y)
else:
temp = np.kron(temp, np.identity(2))
H += self.j1 * temp
temp = 1
for j in togg_vect:
if j == 1:
temp = np.kron(temp, self.SIGMA_Z)
else:
temp = np.kron(temp, np.identity(2))
H += self.j1 * self.delta * temp
## Calculate interaction energy
for i, a in self.graph.bonds_next:
togg_vect = np.zeros(num_particles)
togg_vect[i] = 1
togg_vect[a] = 1
temp = 1
for j in togg_vect:
if j == 1:
temp = np.kron(temp, self.SIGMA_X)
else:
temp = np.kron(temp, np.identity(2))
H += self.j2 * temp
temp = 1
for j in togg_vect:
if j == 1:
temp = np.kron(temp, self.SIGMA_Y)
else:
temp = np.kron(temp, np.identity(2))
H += self.j2 * temp
temp = 1
for j in togg_vect:
if j == 1:
temp = np.kron(temp, self.SIGMA_Z)
else:
temp = np.kron(temp, np.identity(2))
H += self.j2 * temp
## Filter total sz
if self.total_sz is not None:
index = []
num_confs = 2 ** num_particles
for row in range(num_confs):
## configuration in binary 0 1
conf_bin = format(row, '#0%db' % (num_particles + 2))
## configuration in binary -1 1
conf = [1 if c == '1' else -1 for c in conf_bin[2:]]
if np.sum(conf) == self.total_sz:
index.append(row)
H = H[index]
H = H[:, index]
## Calculate the eigen value
self.eigen_values, self.eigen_vectors = np.linalg.eig(H)
self.hamiltonian = H
def diagonalize_sparse(self):
"""
Diagonalize hamiltonian with exact diagonalization with sparse matrix.
Only works for small (<= 20) systems!
"""
num_particles = self.graph.num_points
num_confs = 2 ** num_particles
## Constructing the COO sparse matrix
row_ind = []
col_ind = []
data = []
if self.total_sz is not None:
index = []
num_confs = 2 ** num_particles
for row in range(num_confs):
## configuration in binary 0 1
conf_bin = format(row, '#0%db' % (num_particles + 2))
## configuration in binary -1 1
conf = [1 if c == '1' else -1 for c in conf_bin[2:]]
if np.sum(conf) == self.total_sz:
index.append(row)
index = np.array(index)
for row in range(num_confs):
if self.total_sz is not None:
if row not in index:
continue
row_map = np.where(index == row)[0][0]
else:
row_map = row
## configuration in binary 0 1
conf_bin = format(row, '#0%db' % (num_particles + 2))
## configuration in binary -1 1
conf = [1 if c == '1' else -1 for c in conf_bin[2:]]
## Diagonal = J1 \sum SiSj + J2 \sum SiSj
row_ind.append(row_map)
col_ind.append(row_map)
total_j1 = 0
for (i,j) in self.graph.bonds:
total_j1 += conf[i] * conf[j]
total_j1 *= self.j1
total_j2 = 0
for (i,j) in self.graph.bonds_next:
total_j2 += conf[i] * conf[j]
total_j2 *= self.j2
data.append(total_j1 + total_j2)
for (i,j) in self.graph.bonds:
## flip i and j
conf_temp = conf[:]
conf_temp[i] *= -1
conf_temp[j] *= -1
col = int(''.join(['1' if a == 1 else '0' for a in conf_temp]), 2)
if col == row: continue
if self.total_sz is not None:
if col not in index:
continue
col_map = np.where(index == col)[0][0]
else:
col_map = col
value = self.j1 * (1 - conf[i] * conf[j])
if value != 0:
row_ind.append(row_map)
col_ind.append(col_map)
data.append(value)
for (i,j) in self.graph.bonds_next:
## flip i and j
conf_temp = conf[:]
conf_temp[i] *= -1
conf_temp[j] *= -1
col = int(''.join(['1' if a == 1 else '0' for a in conf_temp]), 2)
if col == row: continue
if self.total_sz is not None:
if col not in index:
continue
col_map = np.where(index == col)[0][0]
else:
col_map = col
value = self.j2 * (1 - conf[i] * conf[j])
if value != 0:
row_ind.append(row_map)
col_ind.append(col_map)
data.append(value)
row_ind = np.array(row_ind)
col_ind = np.array(col_ind)
data = np.array(data, dtype=float)
mat_coo = scipy.sparse.coo_matrix((data, (row_ind, col_ind)))
self.eigen_values, self.eigen_vectors = scipy.sparse.linalg.eigs(mat_coo, k=1, which='SR')
self.hamiltonian = mat_coo
def get_name(self):
"""
Get the name of the Hamiltonian
"""
if self.graph.pbc:
bc = 'pbc'
else:
bc = 'obc'
return 'heisenbergj1j2_%dd_%d_%.3f_%.3f_%.3f_%s' % (
self.graph.dimension, self.graph.length, self.j1,
self.j2, self.delta, bc)
def __str__(self):
return "Heisenberg J1-J2 %dD, delta=%.2f, j1=%.2f, j2=%.2f" % (self.graph.dimension, self.delta, self.j1, self.j2)
def to_xml(self):
str = ""
str += "<hamiltonian>\n"
str += "\t<type>heisenberg j1 j2</type>\n"
str += "\t<params>\n"
str += "\t\t<j1>%.2f</j1>\n" % self.j1
str += "\t\t<j2>%.2f</j2>\n" % self.j2
str += "\t\t<delta>%.2f</delta>\n" % self.delta
str += "\t</params>\n"
str += "</hamiltonian>\n"
return str
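# ----------------------------------------------------------------------
# Hedged usage sketch (editor's addition): `graph` stands for any object
# exposing the attributes used above (`bonds`, `bonds_next`, `num_points`,
# `dimension`, `length`, `pbc`), e.g. the package's own 1D lattice class.
#
#   ham = HeisenbergJ1J2(graph, j1=1.0, delta=1.0, j2=0.5, total_sz=0)
#   ham.diagonalize_sparse()              # small systems only (<= 20 spins)
#   e0 = ham.eigen_values.real.min()      # ground-state energy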
| 37.651099 | 201 | 0.50062 | 1,782 | 13,705 | 3.70651 | 0.108866 | 0.027252 | 0.018168 | 0.025435 | 0.662074 | 0.629674 | 0.586071 | 0.546556 | 0.506586 | 0.499924 | 0 | 0.028075 | 0.386647 | 13,705 | 363 | 202 | 37.754821 | 0.757673 | 0.214155 | 0 | 0.598214 | 0 | 0 | 0.026915 | 0.006196 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035714 | false | 0 | 0.026786 | 0.004464 | 0.089286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57ec2b8b22f7b18f80cfbb8c66154bb73a055fea | 4,995 | py | Python | th_trello/tests.py | Leopere/django-th | 86c999d16bcf30b6224206e5b40824309834ac8c | [
"BSD-3-Clause"
] | 1,069 | 2015-01-07T01:55:57.000Z | 2022-02-17T10:50:57.000Z | th_trello/tests.py | barrygolden/django-th | 86c999d16bcf30b6224206e5b40824309834ac8c | [
"BSD-3-Clause"
] | 207 | 2015-01-06T21:41:17.000Z | 2018-02-20T14:10:15.000Z | th_trello/tests.py | barrygolden/django-th | 86c999d16bcf30b6224206e5b40824309834ac8c | [
"BSD-3-Clause"
] | 117 | 2015-01-04T16:21:13.000Z | 2022-02-22T06:18:49.000Z | # coding: utf-8
from django.conf import settings
from django_th.tests.test_main import MainTest
from th_trello.models import Trello
from th_trello.forms import TrelloProviderForm, TrelloConsumerForm
from th_trello.my_trello import ServiceTrello
from trello import TrelloClient
from unittest.mock import patch
class TrelloTest(MainTest):
def create_trello(self):
trigger = self.create_triggerservice(consumer_name='ServiceTrello')
board_name = 'Trigger Happy'
list_name = 'To Do'
status = True
return Trello.objects.create(trigger=trigger,
board_name=board_name,
list_name=list_name,
status=status)
class TrelloModelAndFormTest(TrelloTest):
"""
TrelloTest Model
"""
def setUp(self):
"""
create a user
"""
        super(TrelloModelAndFormTest, self).setUp()
self.token = 'AZERTY123#TH#FOOBAR'
self.trigger_id = 1
def test_get_config_th(self):
"""
does this settings exists ?
"""
self.assertTrue(settings.TH_TRELLO_KEY)
self.assertIn('consumer_key', settings.TH_TRELLO_KEY)
self.assertIn('consumer_secret', settings.TH_TRELLO_KEY)
def test_get_services_list(self):
th_service = ('th_trello.my_trello.ServiceTrello',)
for service in th_service:
self.assertIn(service, settings.TH_SERVICES)
def test_trello(self):
t = self.create_trello()
self.assertTrue(isinstance(t, Trello))
self.assertEqual(t.show(), "My Trello %s %s %s" % (t.board_name,
t.list_name,
t.card_title))
self.assertEqual(t.__str__(), "%s %s %s" % (t.board_name,
t.list_name,
t.card_title))
"""
Form
"""
# provider
def test_valid_provider_form(self):
t = self.create_trello()
data = {'board_name': t.board_name, 'list_name': t.list_name}
form = TrelloProviderForm(data=data)
self.assertTrue(form.is_valid())
def test_invalid_provider_form(self):
form = TrelloProviderForm(data={})
self.assertFalse(form.is_valid())
# consumer
def test_valid_consumer_form(self):
t = self.create_trello()
data = {'board_name': t.board_name, 'list_name': t.list_name}
form = TrelloConsumerForm(data=data)
self.assertTrue(form.is_valid())
def test_invalid_consumer_form(self):
form = TrelloConsumerForm(data={})
self.assertFalse(form.is_valid())
def test_read_data(self):
        r = self.create_trello()
        kwargs = {'model_name': 'Trello', 'app_label': 'th_trello', 'trigger_id': r.trigger_id}
t = ServiceTrello()
t.read_data(**kwargs)
data = list()
self.assertTrue(type(data) is list)
self.assertTrue('trigger_id' in kwargs)
class ServiceTrelloTest(TrelloTest):
def setUp(self):
"""
create a user
"""
super(ServiceTrelloTest, self).setUp()
self.token = 'AZERTY123#TH#FOOBAR'
self.trigger_id = 1
def test_read_data(self):
"""
Test if the reading of the Trello object looks fine
"""
r = self.create_trello()
data = {'model_name': 'Trello',
'app_label': 'th_trello',
'trigger_id': r.trigger_id,
'link': 'http://foo.bar/some/thing/else/what/else',
'title': 'what else',
'content': 'foobar'}
se = ServiceTrello(self.token)
data = se.read_data(**data)
self.assertIsInstance(data, list)
def test_save_data(self):
"""
Test if the creation of the Trello object looks fine
"""
t = self.create_trello()
data = {'link': 'http://foo.bar/some/thing/else/what/else',
'title': 'what else',
'content': 'foobar'}
with patch.object(TrelloClient, 'add_board') as mock_save_data2:
with patch.object(TrelloClient, 'list_boards') as mock_save_data:
se = ServiceTrello(self.token)
se.save_data(self.trigger_id, **data)
mock_save_data.assert_called_once_with()
mock_save_data2.assert_called_once_with(t.board_name)
def test_save_data_no_title(self):
"""
Test if the creation of the Trello object looks fine (no title)
"""
self.create_trello()
data = {'link': '',
'title': '',
'content': ''}
se = ServiceTrello(self.token)
result = se.save_data(self.trigger_id, **data)
self.assertFalse(result)
| 31.815287 | 96 | 0.570971 | 556 | 4,995 | 4.906475 | 0.21223 | 0.028226 | 0.041056 | 0.036657 | 0.434751 | 0.394428 | 0.362903 | 0.265396 | 0.265396 | 0.265396 | 0 | 0.003245 | 0.321321 | 4,995 | 156 | 97 | 32.019231 | 0.801475 | 0.054855 | 0 | 0.333333 | 0 | 0 | 0.103807 | 0.007304 | 0 | 0 | 0 | 0 | 0.171717 | 1 | 0.141414 | false | 0 | 0.080808 | 0 | 0.262626 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57ec75f2c2a1f82a75ff48cf060ee8f3b2486121 | 1,685 | py | Python | tools/flip_video.py | TwentyBN/Sense | 6503c7e7c4bbf7604b46d9b9009d05dd0e9033f9 | [
"MIT"
] | 483 | 2020-12-14T23:13:48.000Z | 2021-07-29T06:01:44.000Z | tools/flip_video.py | TwentyBN/Sense | 6503c7e7c4bbf7604b46d9b9009d05dd0e9033f9 | [
"MIT"
] | 55 | 2020-12-16T23:20:15.000Z | 2021-07-12T18:15:48.000Z | tools/flip_video.py | TwentyBN/Sense | 6503c7e7c4bbf7604b46d9b9009d05dd0e9033f9 | [
"MIT"
] | 52 | 2021-01-07T02:17:27.000Z | 2021-07-29T02:06:34.000Z | #!/usr/bin/env python
"""
This script helps to flip videos horizontally for data augmentation.
Generally, it can be used to quickly double the size of your dataset, or,
in the case where you've collected data for an action performed on a specific side,
you can flip these videos and use them to classify the opposite side.
Usage:
flip_video.py --path_in=PATH_IN
[--path_out=PATH_OUT]
flip_video.py (-h | --help)
Options:
--path_in=PATH_IN Path to the folder containing videos to be flipped
    --path_out=PATH_OUT    Path to the folder to save flipped videos (defaults to --path_in)
"""
import ffmpeg
import os
from docopt import docopt
from os.path import join
if __name__ == '__main__':
# Parse arguments
args = docopt(__doc__)
videos_path_in = join(os.getcwd(), args['--path_in'])
videos_path_out = join(os.getcwd(), args['--path_out']) if args.get('--path_out') else videos_path_in
# Training script expects videos in MP4 format
VIDEO_EXT = '.mp4'
# Create directory to save flipped videos
os.makedirs(videos_path_out, exist_ok=True)
for video in os.listdir(videos_path_in):
print(f'Processing video: {video}')
        # os.path.splitext keeps file names containing extra dots intact
        flipped_video_name = os.path.splitext(video)[0] + '_flipped' + VIDEO_EXT
# Original video as input
original_video = ffmpeg.input(join(videos_path_in, video))
# Do horizontal flip
flipped_video = ffmpeg.hflip(original_video)
# Get flipped video output
flipped_video_output = ffmpeg.output(flipped_video, filename=join(videos_path_out, flipped_video_name))
# Run to render and save video
ffmpeg.run(flipped_video_output)
print("Processing complete!")
| 34.387755 | 111 | 0.703858 | 249 | 1,685 | 4.542169 | 0.409639 | 0.047745 | 0.035367 | 0.02122 | 0.06366 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002242 | 0.205935 | 1,685 | 48 | 112 | 35.104167 | 0.843049 | 0.461128 | 0 | 0 | 0 | 0 | 0.106383 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.222222 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57f23bf1f3fc47a0587b17ad1ee628523061d345 | 464 | py | Python | tests/test_utils.py | jborean93/httpcore | ed76a30277529c5687cbdb6e203d6dbd0d56833f | [
"BSD-3-Clause"
] | 381 | 2019-04-08T09:45:25.000Z | 2022-03-12T20:18:00.000Z | tests/test_utils.py | jborean93/httpcore | ed76a30277529c5687cbdb6e203d6dbd0d56833f | [
"BSD-3-Clause"
] | 334 | 2019-04-17T16:03:50.000Z | 2022-03-31T20:49:03.000Z | tests/test_utils.py | jborean93/httpcore | ed76a30277529c5687cbdb6e203d6dbd0d56833f | [
"BSD-3-Clause"
] | 82 | 2019-04-24T18:04:10.000Z | 2022-03-31T23:26:19.000Z | import itertools
from typing import List
import pytest
from httpcore._utils import exponential_backoff
@pytest.mark.parametrize(
"factor, expected",
[
(0.1, [0, 0.1, 0.2, 0.4, 0.8]),
(0.2, [0, 0.2, 0.4, 0.8, 1.6]),
(0.5, [0, 0.5, 1.0, 2.0, 4.0]),
],
)
def test_exponential_backoff(factor: float, expected: List[float]) -> None:
delays = list(itertools.islice(exponential_backoff(factor), 5))
assert delays == expected
| 23.2 | 73 | 0.616379 | 73 | 464 | 3.849315 | 0.39726 | 0.02847 | 0.042705 | 0.042705 | 0.067616 | 0.067616 | 0 | 0 | 0 | 0 | 0 | 0.093151 | 0.213362 | 464 | 19 | 74 | 24.421053 | 0.676712 | 0 | 0 | 0 | 0 | 0 | 0.034483 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 1 | 0.066667 | false | 0 | 0.266667 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57f69d5c801539d21b5f7acdd513cc3f6047f669 | 3,840 | py | Python | import_sequence.py | ConanODoyle/io_scene_dts | 3970d625b81a21a3d3691afddc358781633502e0 | [
"MIT"
] | 11 | 2017-11-03T18:25:27.000Z | 2021-11-11T01:29:12.000Z | import_sequence.py | ConanODoyle/io_scene_dts | 3970d625b81a21a3d3691afddc358781633502e0 | [
"MIT"
] | 11 | 2017-07-02T08:05:21.000Z | 2021-07-18T00:33:36.000Z | import_sequence.py | ConanODoyle/io_scene_dts | 3970d625b81a21a3d3691afddc358781633502e0 | [
"MIT"
] | 15 | 2017-06-18T05:11:05.000Z | 2021-08-21T01:30:53.000Z | # This file is currently unused
import bpy
def import_sequence(is_dsq, shape, seq):
if is_dsq:
name = shape.names[seq.nameIndex]
else:
name = seq.name
act = bpy.data.actions.new(name)
flags = ["priority {}".format(seq.priority)]
if seq.flags & Sequence.Cyclic:
flags.append("cyclic")
if seq.flags & Sequence.Blend:
flags.append("blend")
# sequences_text.append(name + ": " + ", ".join(flags))
    if is_dsq:
        nodes = shape.nodes
        # Assumed (editor's fix): translations are stored alongside rotations;
        # `translations` was previously referenced below but never assigned.
        translations = shape.translations
        rotations = shape.rotations
    else:
        nodes = tuple(map(lambda n: shape.names[n.name], shape.nodes))
        translations = shape.node_translations
        rotations = shape.node_rotations
if seq.flags & Sequence.UniformScale:
scales = tuple(map(lambda s: (s, s, s), shape.uniform_scales))
elif seq.flags & Sequence.AlignedScale:
scales = shape.aligned_scales
    elif seq.flags & Sequence.ArbitraryScale:
        print("Warning: Arbitrary scale animation not implemented")
        return  # was `break`, a SyntaxError outside a loop; skip this sequence
    else:
        print("Warning: Invalid scale flags found in sequence")
        return
nodes_translation = tuple(map(lambda p: p[0], filter(lambda p: p[1], zip(nodes, seq.translationMatters))))
nodes_rotation = tuple(map(lambda p: p[0], filter(lambda p: p[1], zip(nodes, seq.rotationMatters))))
nodes_scale = tuple(map(lambda p: p[0], filter(lambda p: p[1], zip(nodes, seq.scaleMatters))))
for matters_index, node_name in enumerate(nodes_translation):
data_path = 'pose.bones["{}"].location'.format(node_name)
fcus = tuple(map(lambda array_index: act.fcurves.new(data_path, array_index), range(3)))
for frame_index in range(seq.numKeyframes):
array = translations[seq.baseTranslation + matters_index * seq.numKeyframes + frame_index]
for array_index, fcu in enumerate(fcus):
fcu.keyframe_points.add(1)
key = fcu.keyframe_points[-1]
key.interpolation = "LINEAR"
key.co = (1 + frame_index, array[array_index])
for matters_index, node_name in enumerate(nodes_rotation):
data_path = 'pose.bones["{}"].rotation_quaternion'.format(node_name)
fcus = tuple(map(lambda array_index: act.fcurves.new(data_path, array_index), range(4)))
for frame_index in range(seq.numKeyframes):
array = rotations[seq.baseRotation + matters_index * seq.numKeyframes + frame_index]
for array_index, fcu in enumerate(fcus):
fcu.keyframe_points.add(1)
key = fcu.keyframe_points[-1]
key.interpolation = "LINEAR"
key.co = (1 + frame_index, array[array_index])
for matters_index, node_name in enumerate(nodes_scale):
data_path = 'pose.bones["{}"].scale'.format(node_name)
fcus = tuple(map(lambda array_index: act.fcurves.new(data_path, array_index), range(3)))
for frame_index in range(seq.numKeyframes):
array = scales[seq.baseScale + matters_index * seq.numKeyframes + frame_index]
for array_index, fcu in enumerate(fcus):
fcu.keyframe_points.add(1)
key = fcu.keyframe_points[-1]
key.interpolation = "LINEAR"
key.co = (1 + frame_index, array[array_index])
# if seq.flags & Sequence.Blend:
# if reference_frame is None:
# return fail(operator, "Missing 'reference' marker for blend animation '{}'".format(name))
# ref_vec = Vector(evaluate_all(curves, reference_frame))
# vec = ref_vec + vec
# if seq.flags & Sequence.Blend:
# if reference_frame is None:
# return fail(operator, "Missing 'reference' marker for blend animation '{}'".format(name))
# ref_rot = Quaternion(evaluate_all(curves, reference_frame))
# rot = ref_rot * rot
| 45.176471 | 110 | 0.632813 | 481 | 3,840 | 4.906445 | 0.226611 | 0.050847 | 0.047458 | 0.038136 | 0.617797 | 0.559746 | 0.559746 | 0.559746 | 0.526271 | 0.526271 | 0 | 0.00625 | 0.25 | 3,840 | 84 | 111 | 45.714286 | 0.813194 | 0.148177 | 0 | 0.442623 | 0 | 0 | 0.067219 | 0.025476 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016393 | false | 0 | 0.032787 | 0 | 0.04918 | 0.032787 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57f8be7a47894308866dc4a4d2a39176c683e9bb | 1,170 | py | Python | r2c_isg/functions/sample.py | returntocorp/inputset-generator | c33952cc5683e9e70b24f76936c42ec8e354d121 | [
"MIT"
] | 3 | 2019-11-02T20:14:34.000Z | 2020-01-23T21:47:20.000Z | r2c_isg/functions/sample.py | returntocorp/inputset-generator | c33952cc5683e9e70b24f76936c42ec8e354d121 | [
"MIT"
] | 19 | 2019-09-18T01:48:07.000Z | 2021-11-04T11:20:48.000Z | r2c_isg/functions/sample.py | returntocorp/inputset-generator | c33952cc5683e9e70b24f76936c42ec8e354d121 | [
"MIT"
] | 3 | 2019-11-15T22:31:13.000Z | 2020-03-10T10:19:39.000Z | import random
from r2c_isg.structures import Dataset
def sample(ds: Dataset, n: int,
on_versions: bool = True, seed: str = None) -> None:
"""Samples n projects in place."""
# seed random, if a seed was provided
if seed:
random.seed(seed)
# select a sample of versions in each project
if on_versions:
dropped = 0
for project in ds.projects:
dropped += len(project.versions)
if len(project.versions) > n:
project.versions = random.sample(project.versions, n)
dropped -= len(project.versions)
print(' Sampled {:,} versions from each of {:,} projects ({:,} '
'total versions dropped).'.format(n, len(ds.projects), dropped))
# select a sample of projects
elif len(ds.projects) > n:
orig_count = len(ds.projects)
ds.projects = random.sample(ds.projects, n)
print(' Sampled {:,} projects from {:,} (dropped {:,}).'
.format(n, orig_count, max(orig_count - n, 0)))
    else:
        # the dataset already contains n or fewer projects; nothing to drop
        raise Exception('Dataset has %d projects; cannot sample %d.'
                        % (len(ds.projects), n))
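# Hedged usage sketch (editor's addition): `ds` is any populated Dataset.
#
#   sample(ds, 5)                                  # keep <= 5 versions per project
#   sample(ds, 100, on_versions=False, seed='ci')  # down-sample to 100 projects
#                                                  # (assumes >100 to start with)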
| 32.5 | 80 | 0.582906 | 142 | 1,170 | 4.760563 | 0.373239 | 0.088757 | 0.079882 | 0.044379 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00365 | 0.297436 | 1,170 | 35 | 81 | 33.428571 | 0.818735 | 0.141026 | 0 | 0 | 0 | 0 | 0.183735 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.090909 | 0 | 0.136364 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
57f90524a58f45dd2786c57bd5bf34079c510359 | 908 | py | Python | tests/test_contrib_flask.py | nicoddemus/dependencies | 74180e2c6098d8ad03bc53c5703bdf8dc61c3ed9 | [
"BSD-2-Clause"
] | null | null | null | tests/test_contrib_flask.py | nicoddemus/dependencies | 74180e2c6098d8ad03bc53c5703bdf8dc61c3ed9 | [
"BSD-2-Clause"
] | null | null | null | tests/test_contrib_flask.py | nicoddemus/dependencies | 74180e2c6098d8ad03bc53c5703bdf8dc61c3ed9 | [
"BSD-2-Clause"
] | null | null | null | import pytest
pytest.importorskip("flask")
views = pytest.importorskip("flask.views")
contrib = pytest.importorskip("dependencies.contrib.flask")
http_methods = views.http_method_funcs
http_methods_no_head = http_methods - {"head"}
@pytest.fixture()
def app():
from flask_project.app import create_app
app = create_app()
app.config["TESTING"] = True
return app
@pytest.mark.parametrize("method", http_methods_no_head)
def test_dispatch_request(client, method):
"""Dispatch request to the `Injector` subclass attributes."""
response = getattr(client, method)("/test_dispatch_request/1/test/")
assert response.status_code == 200
assert response.data == b"<h1>OK</h1>"
def test_docstrings():
"""Access `method_view` docstring."""
assert (
contrib.method_view.__doc__
== "Create Flask method based dispatching view from injector class."
)
| 23.894737 | 76 | 0.713656 | 112 | 908 | 5.5625 | 0.473214 | 0.070626 | 0.073836 | 0.089888 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007895 | 0.162996 | 908 | 37 | 77 | 24.540541 | 0.811842 | 0.095815 | 0 | 0 | 0 | 0 | 0.201235 | 0.069136 | 0 | 0 | 0 | 0 | 0.136364 | 1 | 0.136364 | false | 0 | 0.227273 | 0 | 0.409091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17a829983d3b1a8c764fa28260f58e73e37fbea3 | 1,731 | py | Python | build/lib/cleth/io_.py | shmakn99/cleth | fcc2d42f0213c51b22f4a8d195d3a975497e111f | [
"MIT"
] | null | null | null | build/lib/cleth/io_.py | shmakn99/cleth | fcc2d42f0213c51b22f4a8d195d3a975497e111f | [
"MIT"
] | null | null | null | build/lib/cleth/io_.py | shmakn99/cleth | fcc2d42f0213c51b22f4a8d195d3a975497e111f | [
"MIT"
] | null | null | null | from organism import Organism
import networkx as nx
def load(string_id,ppi_infile_path,threshold,ess_infile_path=None,name=None):
'''
Parameters
------------
    string_id - The STRING database ID of the organism.
    ppi_infile_path - Path of the file containing the protein-protein interaction (PPI) data.
                      For this package this file is assumed to be obtained from the STRING database.
                      The format is-
                      Protein1 Protein2 Score1 Score2 ... Overall Score
    ess_infile_path - Path of the file containing the list of essential proteins.
                      For this package this file is assumed to be obtained from
                      The format is-
                      Protein 1
                      Protein 2
                      .
                      .
                      .
                      Protein N
    threshold - The cut-off to be considered for making edges between any two proteins.
                For example if the threshold is 700, all the PPIs below this overall score are neglected.
                This would result in what is generally referred to as a high confidence network.
    name - Taxonomical name of the organism.
'''
    G = nx.Graph()
    with open(ppi_infile_path) as f:
        for line in f:
            split_line = line.strip().split()
            if not split_line:
                continue  # skip blank lines instead of crashing on them
            if int(split_line[-1]) >= threshold:
                G.add_edge(split_line[0], split_line[1],
                           weight=int(split_line[-1]))

    essential_proteins = []
    if ess_infile_path is not None:
        with open(ess_infile_path) as f:
            for line in f:
                split_line = line.strip().split()
                if split_line:
                    essential_proteins.append(split_line[0])

    if name is not None:
        org = Organism(string_id, name=name)
    else:
        org = Organism(string_id)
    org.graph = G
    org.essential_proteins += essential_proteins
    return org
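# Hedged usage sketch (editor's addition); the paths and taxon ID below are
# hypothetical placeholders, not files shipped with the package.
#
#   org = load('511145', 'data/511145.protein.links.txt', 700,
#              ess_infile_path='data/essential.txt', name='Escherichia coli')
#   print(org.graph.number_of_nodes(), org.graph.number_of_edges())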
| 20.607143 | 90 | 0.722126 | 269 | 1,731 | 4.520446 | 0.371747 | 0.074013 | 0.032072 | 0.021382 | 0.287829 | 0.287829 | 0.287829 | 0.197368 | 0.197368 | 0.197368 | 0 | 0.009936 | 0.18602 | 1,731 | 83 | 91 | 20.855422 | 0.853087 | 0.482958 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033333 | false | 0 | 0.066667 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17a82bed1350fceabd4c5deaba7ba86dcaa3b673 | 7,001 | py | Python | src/py/flwr/server/strategy/strategy.py | sandracl72/flower | bb7f6e2e1f52753820784d262618113b4e7ebc42 | [
"Apache-2.0"
] | null | null | null | src/py/flwr/server/strategy/strategy.py | sandracl72/flower | bb7f6e2e1f52753820784d262618113b4e7ebc42 | [
"Apache-2.0"
] | null | null | null | src/py/flwr/server/strategy/strategy.py | sandracl72/flower | bb7f6e2e1f52753820784d262618113b4e7ebc42 | [
"Apache-2.0"
] | null | null | null | # Copyright 2020 Adap GmbH. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Flower server strategy."""
from abc import ABC, abstractmethod
from typing import Dict, List, Optional, Tuple, Union
from ...common import (
EvaluateIns,
EvaluateRes,
FitIns,
FitRes,
Parameters,
Scalar,
Weights,
)
from ...server.client_manager import ClientManager
from ...server.client_proxy import ClientProxy
class Strategy(ABC):
"""Abstract base class for server strategy implementations."""
@abstractmethod
def initialize_parameters(
self, client_manager: ClientManager
) -> Optional[Parameters]:
"""Initialize the (global) model parameters.
Parameters
----------
client_manager: ClientManager. The client manager which holds all currently
connected clients.
Returns
-------
parameters: Parameters (optional)
If parameters are returned, then the server will treat these as the
initial global model parameters.
"""
@abstractmethod
def configure_fit(
self, rnd: int, parameters: Parameters, client_manager: ClientManager
) -> List[Tuple[ClientProxy, FitIns]]:
"""Configure the next round of training.
Parameters
----------
rnd : int
The current round of federated learning.
parameters : Parameters
The current (global) model parameters.
client_manager : ClientManager
The client manager which holds all currently connected clients.
Returns
-------
A list of tuples. Each tuple in the list identifies a `ClientProxy` and the
`FitIns` for this particular `ClientProxy`. If a particular `ClientProxy`
is not included in this list, it means that this `ClientProxy`
will not participate in the next round of federated learning.
"""
@abstractmethod
def aggregate_fit(
self,
rnd: int,
results: List[Tuple[ClientProxy, FitRes]],
failures: List[BaseException],
) -> Union[
Tuple[Optional[Parameters], Dict[str, Scalar]],
Optional[Weights], # Deprecated
]:
"""Aggregate training results.
Parameters
----------
rnd : int
The current round of federated learning.
results : List[Tuple[ClientProxy, FitRes]]
Successful updates from the previously selected and configured
clients. Each pair of `(ClientProxy, FitRes)` constitutes a
            successful update from one of the previously selected clients. Note
that not all previously selected clients are necessarily included in
this list: a client might drop out and not submit a result. For each
client that did not submit an update, there should be an `Exception`
in `failures`.
failures : List[BaseException]
Exceptions that occurred while the server was waiting for client
updates.
Returns
-------
parameters: Parameters (optional)
If parameters are returned, then the server will treat these as the
new global model parameters (i.e., it will replace the previous
parameters with the ones returned from this method). If `None` is
returned (e.g., because there were only failures and no viable
            results) then the server will not update the previous model
parameters, the updates received in this round are discarded, and
the global model parameters remain the same.
"""
@abstractmethod
def configure_evaluate(
self, rnd: int, parameters: Parameters, client_manager: ClientManager
) -> List[Tuple[ClientProxy, EvaluateIns]]:
"""Configure the next round of evaluation.
Arguments:
rnd: Integer. The current round of federated learning.
parameters: Parameters. The current (global) model parameters.
client_manager: ClientManager. The client manager which holds all currently
connected clients.
Returns:
A list of tuples. Each tuple in the list identifies a `ClientProxy` and the
`EvaluateIns` for this particular `ClientProxy`. If a particular
`ClientProxy` is not included in this list, it means that this
`ClientProxy` will not participate in the next round of federated
evaluation.
"""
@abstractmethod
def aggregate_evaluate(
self,
rnd: int,
results: List[Tuple[ClientProxy, EvaluateRes]],
failures: List[BaseException],
) -> Union[
Tuple[Optional[float], Dict[str, Scalar]],
Optional[float], # Deprecated
]:
"""Aggregate evaluation results.
Arguments:
rnd: int. The current round of federated learning.
            results: List[Tuple[ClientProxy, EvaluateRes]]. Successful updates from
                the previously selected and configured clients. Each pair of
                `(ClientProxy, EvaluateRes)` constitutes a successful update from one
                of the previously selected clients. Note that not all previously
                selected clients are necessarily included in this list: a client might
                drop out and not submit a result. For each client that did not submit
                an update, there should be an `Exception` in `failures`.
failures: List[BaseException]. Exceptions that occurred while the server
was waiting for client updates.
Returns:
Optional `float` representing the aggregated evaluation result. Aggregation
typically uses some variant of a weighted average.
"""
@abstractmethod
def evaluate(
self, parameters: Parameters
) -> Optional[Tuple[float, Dict[str, Scalar]]]:
"""Evaluate the current model parameters.
This function can be used to perform centralized (i.e., server-side) evaluation
of model parameters.
Arguments:
parameters: Parameters. The current (global) model parameters.
Returns:
The evaluation result, usually a Tuple containing loss and a
dictionary containing task-specific metrics (e.g., accuracy).
"""
| 38.256831 | 87 | 0.636481 | 772 | 7,001 | 5.755181 | 0.272021 | 0.02926 | 0.033086 | 0.040513 | 0.55458 | 0.540626 | 0.521269 | 0.493135 | 0.493135 | 0.487283 | 0 | 0.001599 | 0.285531 | 7,001 | 182 | 88 | 38.467033 | 0.886645 | 0.67019 | 0 | 0.36 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12 | false | 0 | 0.1 | 0 | 0.24 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17a8ab28d36c8dfd4b9787fa8d8de84f56df509b | 428 | py | Python | 2018/csaw18finalsctf/v35/lab3_solve.py | theKidOfArcrania/ctf-writeups | dc03760ca8e8b9a246a4e4f27357ed427b1a51fe | [
"MIT"
] | 5 | 2019-04-06T19:32:21.000Z | 2020-04-18T07:16:37.000Z | 2018/csaw18finalsctf/v35/lab3_solve.py | theKidOfArcrania/ctf-writeups | dc03760ca8e8b9a246a4e4f27357ed427b1a51fe | [
"MIT"
] | null | null | null | 2018/csaw18finalsctf/v35/lab3_solve.py | theKidOfArcrania/ctf-writeups | dc03760ca8e8b9a246a4e4f27357ed427b1a51fe | [
"MIT"
] | null | null | null | from lab3_values import values
def mix(x):
return (x + ((x >> 5)^(x<<4))) & 0xffffffff
fn = 0x57415343
fprev = 0x41484148
#fprev = 0x57415343
#fn = 0x41484148
for x in values[::-1]:
tmp = (fn + (mix(fprev) ^ x)) & 0xffffffff
fn = fprev
fprev = tmp
#print("a[0:4] = {}; a[4:8] = {}".format(fn, fprev))
print(hex(fn)[2:] + '-' + hex(fprev)[2:])
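# Editor's note (assumption about the original checker): each backward step
# above inverts a forward update of the form
#   (a, b) -> (b - (mix(a) ^ x), a)   (mod 2**32)
# applied once per x in `values`; replaying it in reverse from the known
# final constants recovers the two 32-bit halves printed as the flag.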
| 19.454545 | 52 | 0.584112 | 63 | 428 | 3.952381 | 0.47619 | 0.096386 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145015 | 0.226636 | 428 | 21 | 53 | 20.380952 | 0.607251 | 0.207944 | 0 | 0 | 0 | 0 | 0.002985 | 0 | 0 | 0 | 0.119403 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.25 | 0.083333 | 0.416667 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17a9a35b7b76fd155e3fd5168962689cdeff53fc | 3,747 | py | Python | critiquebrainz/frontend/views/test/test_user.py | akshaaatt/critiquebrainz | 39184152af5f23adaa991c4b43ecbbb6f086f809 | [
"Apache-2.0"
] | 70 | 2015-03-10T00:08:21.000Z | 2022-02-20T05:36:53.000Z | critiquebrainz/frontend/views/test/test_user.py | akshaaatt/critiquebrainz | 39184152af5f23adaa991c4b43ecbbb6f086f809 | [
"Apache-2.0"
] | 279 | 2015-12-08T14:10:45.000Z | 2022-03-29T13:54:23.000Z | critiquebrainz/frontend/views/test/test_user.py | akshaaatt/critiquebrainz | 39184152af5f23adaa991c4b43ecbbb6f086f809 | [
"Apache-2.0"
] | 95 | 2015-03-12T21:39:42.000Z | 2022-03-10T00:51:04.000Z | from unittest import mock
import critiquebrainz.db.users as db_users
from critiquebrainz.db.user import User
from critiquebrainz.frontend.testing import FrontendTestCase
class UserViewsTestCase(FrontendTestCase):
def setUp(self):
super(UserViewsTestCase, self).setUp()
self.user = User(db_users.get_or_create(1, "aef06569-098f-4218-a577-b413944d9493", new_user_data={
"display_name": u"Tester",
}))
self.admin = User(db_users.get_or_create(2, "9371e5c7-5995-4471-a5a9-33481f897f9c", new_user_data={
"display_name": u"Admin",
}))
def test_reviews(self):
# test reviews for user not in db
response = self.client.get("/user/{user_id}".format(user_id="random-user-id"))
self.assert404(response, "Can't find a user with ID: random-user-id")
# test reviews for user present in db, but not logged in
response = self.client.get("/user/{user_id}".format(user_id=self.user.id))
self.assert200(response)
self.assertIn("Tester", str(response.data))
def test_info(self):
# test info for user not in db
response = self.client.get("/user/{user_id}/info".format(user_id="random-user-id"))
self.assert404(response, "Can't find a user with ID: random-user-id")
# test info for user present in db
response = self.client.get("/user/{user_id}/info".format(user_id=self.user.id))
self.assert200(response)
self.assertIn("Tester", str(response.data))
@mock.patch('critiquebrainz.db.user.User.is_admin')
def test_block_unblock(self, is_user_admin):
self.temporary_login(self.admin)
# test block user when user is not in db
response = self.client.get("/user/{user_id}/block".format(user_id="random-user-id"))
self.assert404(response, "Can't find a user with ID: random-user-id")
# make self.admin a moderator
is_user_admin.return_value = True
# admin blocks tester
response = self.client.post(
"user/{user_id}/block".format(user_id=self.user.id),
data=dict(reason="Test blocking user."),
follow_redirects=True,
)
self.assertIn("This user account has been blocked.", str(response.data))
user = db_users.get_by_id(self.user.id)
self.assertEqual(user["is_blocked"], True)
# testing when admin blocks an already blocked user
response = self.client.post(
"user/{user_id}/block".format(user_id=self.user.id),
data=dict(reason="Test blocking already blocker user."),
follow_redirects=True,
)
self.assertIn("This account is already blocked.", str(response.data))
# test unblock user when user is not in db
response = self.client.get("/user/{user_id}/unblock".format(user_id="random-user-id"))
self.assert404(response, "Can't find a user with ID: random-user-id")
# admin unblocks tester
response = self.client.post(
"user/{user_id}/unblock".format(user_id=self.user.id),
data=dict(reason="Test unblocking user."),
follow_redirects=True,
)
self.assertIn("This user account has been unblocked.", str(response.data))
user = db_users.get_by_id(self.user.id)
self.assertEqual(user["is_blocked"], False)
# testing when admin unblocks a user that is not blocked
response = self.client.post(
"user/{user_id}/unblock".format(user_id=self.user.id),
data=dict(reason="Test unblocking user that is not blocked."),
follow_redirects=True,
)
self.assertIn("This account is not blocked.", str(response.data))
| 42.101124 | 107 | 0.646651 | 511 | 3,747 | 4.632094 | 0.189824 | 0.091255 | 0.059147 | 0.047317 | 0.681876 | 0.653147 | 0.615125 | 0.608365 | 0.566117 | 0.566117 | 0 | 0.024255 | 0.229784 | 3,747 | 88 | 108 | 42.579545 | 0.795911 | 0.107553 | 0 | 0.393443 | 0 | 0 | 0.252401 | 0.058824 | 0 | 0 | 0 | 0 | 0.229508 | 1 | 0.065574 | false | 0 | 0.065574 | 0 | 0.147541 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17ac13e842ed688f786ef624a703a87e8887367e | 887 | py | Python | src/PYnative/exercise/Matplotlib/Q1.py | c-w-m/learning_python | 8f06aa41faf9195d978a7d21cbb329280b0d3200 | [
"CNRI-Python"
] | null | null | null | src/PYnative/exercise/Matplotlib/Q1.py | c-w-m/learning_python | 8f06aa41faf9195d978a7d21cbb329280b0d3200 | [
"CNRI-Python"
] | null | null | null | src/PYnative/exercise/Matplotlib/Q1.py | c-w-m/learning_python | 8f06aa41faf9195d978a7d21cbb329280b0d3200 | [
"CNRI-Python"
] | null | null | null | # Read Total profit of all months and show it using a line plot
import matplotlib.pyplot as plt
# My Solution
import pandas as pd
data = pd.read_csv("company_sales_data.csv")
plt.plot(data.month_number, data.total_profit)
plt.title("Company profit per month")
plt.xticks(range(1, 13))
plt.yticks([100000, 200000, 300000, 400000, 500000])
plt.xlabel("Month number")
plt.ylabel("Total profit")
plt.show()
# Given Solution
import pandas as pd
import matplotlib.pyplot as plt
df = pd.read_csv("D:\\Python\\Articles\\matplotlib\\sales_data.csv")
profitList = df['total_profit'].tolist()
monthList = df['month_number'].tolist()
plt.plot(monthList, profitList, label='Month-wise Profit data of last year')
plt.xlabel('Month number')
plt.ylabel('Profit in dollar')
plt.xticks(monthList)
plt.title('Company profit per month')
plt.yticks([100000, 200000, 300000, 400000, 500000])
plt.show()
| 27.71875 | 76 | 0.754228 | 139 | 887 | 4.748201 | 0.402878 | 0.066667 | 0.066667 | 0.072727 | 0.457576 | 0.30303 | 0.219697 | 0.127273 | 0 | 0 | 0 | 0.079646 | 0.10823 | 887 | 31 | 77 | 28.612903 | 0.754741 | 0.099211 | 0 | 0.363636 | 0 | 0 | 0.28805 | 0.08805 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.181818 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17adfc9cffb4c099c9a53de0491cef39faaf2b37 | 2,410 | py | Python | experiments/train.py | kongxiangrui15095288006/AutoML | a046ca3485b8a255067be5d75933813684a365e9 | [
"MIT"
] | 266 | 2017-12-11T20:58:42.000Z | 2022-03-13T06:35:57.000Z | experiments/train.py | kongxiangrui15095288006/AutoML | a046ca3485b8a255067be5d75933813684a365e9 | [
"MIT"
] | 14 | 2017-12-13T00:02:52.000Z | 2021-03-15T20:04:35.000Z | experiments/train.py | kongxiangrui15095288006/AutoML | a046ca3485b8a255067be5d75933813684a365e9 | [
"MIT"
] | 91 | 2017-12-15T00:25:26.000Z | 2022-03-16T13:30:01.000Z | import tensorflow as tf
import argparse
import sys
sys.path.append('../')
from cnn import CNN
from tensorflow.examples.tutorials.mnist import input_data
def main(action, name):
mnist = input_data.read_data_sets("../MNIST_data/", one_hot=True)
action = [int(x) for x in action.split(",")]
training_epochs = 10
batch_size = 100
    # group the flat action list into 4-tuples, one per conv layer; the 4th
    # entry of each tuple is used as that layer's dropout parameter below
    action = [action[x:x+4] for x in range(0, len(action), 4)]
    cnn_drop_rate = [c[3] for c in action]
model = CNN(784, 10, action)
loss_op = tf.reduce_mean(model.loss)
optimizer = tf.train.AdamOptimizer(learning_rate=0.0001)
train_op = optimizer.minimize(loss_op)
tf.summary.scalar('acc', model.accuracy)
tf.summary.scalar('loss', tf.reduce_mean(model.loss))
merged_summary_op = tf.summary.merge_all()
summary_writer = tf.summary.FileWriter(name, graph=tf.get_default_graph())
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
for epoch in range(training_epochs):
for step in range(int(mnist.train.num_examples/batch_size)):
batch_x, batch_y = mnist.train.next_batch(batch_size)
feed = {model.X: batch_x,
model.Y: batch_y,
model.dropout_keep_prob: 0.85,
model.cnn_dropout_rates: cnn_drop_rate}
_, summary = sess.run([train_op, merged_summary_op], feed_dict=feed)
summary_writer.add_summary(summary, step+(epoch+1)*int(mnist.train.num_examples/batch_size))
print("epoch: ", epoch+1, " of ", training_epochs)
batch_x, batch_y = mnist.test.next_batch(mnist.test.num_examples)
loss, acc = sess.run(
[loss_op, model.accuracy],
feed_dict={model.X: batch_x,
model.Y: batch_y,
model.dropout_keep_prob: 1.0,
model.cnn_dropout_rates: [1.0]*len(cnn_drop_rate)})
print("Network accuracy =", acc, " loss =", loss)
print("Final accuracy for", name, " =", acc)
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument('--architecture', default="5, 32, 2, 5, 3, 64, 2, 3")
parser.add_argument('--name', default="model")
args = parser.parse_args()
main(args.architecture, args.name)
| 37.65625 | 104 | 0.606639 | 318 | 2,410 | 4.36478 | 0.327044 | 0.025937 | 0.023775 | 0.024496 | 0.165706 | 0.110951 | 0.110951 | 0.063401 | 0.063401 | 0.063401 | 0 | 0.021579 | 0.269295 | 2,410 | 63 | 105 | 38.253968 | 0.76661 | 0 | 0 | 0.040816 | 0 | 0 | 0.0577 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.020408 | false | 0 | 0.102041 | 0 | 0.122449 | 0.061224 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17aed8ee4085834ee232d13ac00e9db2b2432020 | 4,133 | py | Python | dash_auth_external/routes.py | jamesholcombe/dash-auth-external | cab0b49b236223d037e22a09473597bd28f67ecd | [
"MIT"
] | 9 | 2022-01-04T23:00:07.000Z | 2022-01-18T18:56:37.000Z | dash_auth_external/routes.py | jamesholcombe/dash-auth-external | cab0b49b236223d037e22a09473597bd28f67ecd | [
"MIT"
] | null | null | null | dash_auth_external/routes.py | jamesholcombe/dash-auth-external | cab0b49b236223d037e22a09473597bd28f67ecd | [
"MIT"
] | null | null | null | from flask import session, redirect, request
import os
import base64
import re
import urllib.parse
from flask.app import Flask
import requests
import hashlib
from requests_oauthlib import OAuth2Session
os.environ["OAUTHLIB_INSECURE_TRANSPORT"] = "1"
def make_code_challenge(length: int = 40):
code_verifier = base64.urlsafe_b64encode(os.urandom(length)).decode("utf-8")
code_verifier = re.sub("[^a-zA-Z0-9]+", "", code_verifier)
code_challenge = hashlib.sha256(code_verifier.encode("utf-8")).digest()
code_challenge = base64.urlsafe_b64encode(code_challenge).decode("utf-8")
code_challenge = code_challenge.replace("=", "")
return code_challenge, code_verifier
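# Illustrative self-check (added sketch; `_pkce_s256_self_check` is not part
# of the original module): demonstrates the S256 relation the provider later
# verifies -- challenge == base64url(sha256(verifier)) with '=' padding
# stripped.
def _pkce_s256_self_check():
    code_challenge, code_verifier = make_code_challenge()
    digest = hashlib.sha256(code_verifier.encode("utf-8")).digest()
    expected = base64.urlsafe_b64encode(digest).decode("utf-8").replace("=", "")
    assert code_challenge == expected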
def make_auth_route(
app: Flask,
external_auth_url: str,
client_id: str,
auth_suffix: str,
redirect_uri: str,
with_pkce: bool = True,
client_secret: str = None,
scope: str = None,
auth_request_headers: dict = None,
):
@app.route(auth_suffix)
def get_auth_code():
"""
Redirect the user/resource owner to the OAuth provider
        using a URL with a few key OAuth parameters.
"""
# making code verifier and challenge for PKCE
if with_pkce:
code_challenge, code_verifier = make_code_challenge()
session["cv"] = code_verifier
# TODO implement this myself
oauth_session = OAuth2Session(
client_id,
redirect_uri=redirect_uri,
scope=scope,
)
if with_pkce:
authorization_url, state = oauth_session.authorization_url(
external_auth_url,
code_challenge=code_challenge,
code_challenge_method="S256",
)
else:
authorization_url, state = oauth_session.authorization_url(
external_auth_url,
)
resp = redirect(authorization_url)
return resp
return app
def build_token_body(
url, redirect_uri: str, client_id: str, with_pkce: bool, client_secret: str
):
query = urllib.parse.urlparse(url).query
redirect_params = urllib.parse.parse_qs(query)
code = redirect_params["code"][0]
state = redirect_params["state"][0]
    body = dict(
        grant_type="authorization_code",
        code=code,
        redirect_uri=redirect_uri,
        client_id=client_id,
        state=state,
        client_secret=client_secret,
    )
    if with_pkce:
        body["code_verifier"] = session["cv"]
    return body
def make_access_token_route(
app: Flask,
external_token_url: str,
redirect_suffix: str,
_home_suffix: str,
redirect_uri: str,
client_id: str,
_token_field_name: str,
with_pkce: bool = True,
client_secret: str = None,
token_request_headers: dict = None,
):
@app.route(redirect_suffix, methods=["GET", "POST"])
def get_token():
url = request.url
body = build_token_body(
url=url,
redirect_uri=redirect_uri,
with_pkce=with_pkce,
client_id=client_id,
client_secret=client_secret,
)
response_data = get_token_response_data(
external_token_url, body, token_request_headers
)
token = response_data[_token_field_name]
response = redirect(_home_suffix)
response.headers.add(_token_field_name, token)
return response
return app
def token_request(url: str, body: dict, headers: dict):
r = requests.post(url, data=body, headers=headers)
if r.status_code != 200:
raise requests.RequestException(
f"{r.status_code} {r.reason}:The request to the access token endpoint failed."
)
return r
def get_token_response_data(*args):
r = token_request(*args)
return r.json()
| 27.553333 | 90 | 0.632712 | 491 | 4,133 | 5.038697 | 0.244399 | 0.063056 | 0.034357 | 0.03557 | 0.2308 | 0.219078 | 0.174616 | 0.174616 | 0.174616 | 0.143897 | 0 | 0.01042 | 0.280184 | 4,133 | 149 | 91 | 27.738255 | 0.821176 | 0.041616 | 0 | 0.352941 | 0 | 0 | 0.04888 | 0.006874 | 0 | 0 | 0 | 0.006711 | 0 | 1 | 0.067227 | false | 0 | 0.07563 | 0 | 0.210084 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17b0ba6b90ba9cb2577f1f6ec4a58d8b1adbcd74 | 758 | py | Python | abm/in-depth/in_depth_agent_based_modeling/simulation_models/spm/task.py | vishalbelsare/bptk_py_tutorial | c1e581e9442ed02e55194529e9d75bbe3af7c491 | [
"MIT"
] | 34 | 2020-02-01T04:53:56.000Z | 2022-03-07T19:28:59.000Z | abm/in-depth/in_depth_agent_based_modeling/simulation_models/spm/task.py | vishalbelsare/bptk_py_tutorial | c1e581e9442ed02e55194529e9d75bbe3af7c491 | [
"MIT"
] | 3 | 2021-05-04T07:08:26.000Z | 2022-03-02T11:39:51.000Z | abm/in-depth/in_depth_agent_based_modeling/simulation_models/spm/task.py | vishalbelsare/bptk_py_tutorial | c1e581e9442ed02e55194529e9d75bbe3af7c491 | [
"MIT"
] | 14 | 2020-03-26T21:08:54.000Z | 2022-02-04T14:20:01.000Z | from BPTK_Py import Agent
from BPTK_Py import log
class Task(Agent):
def initialize(self):
self.agent_type = "task"
self.state = "open"
self.set_property("remaining_effort", {"type": "Double", "value": 0})
self.register_event_handler(["open"], "task_started", self.handle_started_event)
self.register_event_handler(["in_progress"], "task_progress", self.handle_progress_event)
def handle_started_event(self, event):
self.remaining_effort = self.effort
self.state = "in_progress"
def handle_progress_event(self, event):
self.remaining_effort = max(self.remaining_effort-event.data["progress"], 0)
if self.remaining_effort == 0:
self.state = "closed"
| 25.266667 | 97 | 0.668865 | 95 | 758 | 5.073684 | 0.336842 | 0.155602 | 0.157676 | 0.06639 | 0.136929 | 0.136929 | 0 | 0 | 0 | 0 | 0 | 0.005025 | 0.212401 | 758 | 29 | 98 | 26.137931 | 0.802345 | 0 | 0 | 0 | 0 | 0 | 0.137748 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0 | 0.125 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17b237f8a11e7c293a22cd71fb2d2b6f70c1241b | 4,458 | py | Python | pygeotile/tile.py | Murthy10/pyGeoTile | c744e540ba698fbe0d822616a62702918d24f71e | [
"MIT"
] | 93 | 2017-04-24T10:49:20.000Z | 2022-03-30T00:12:09.000Z | pygeotile/tile.py | Murthy10/pyGeoTile | c744e540ba698fbe0d822616a62702918d24f71e | [
"MIT"
] | 12 | 2017-04-24T09:40:54.000Z | 2021-12-09T16:26:19.000Z | pygeotile/tile.py | Murthy10/pyGeoTile | c744e540ba698fbe0d822616a62702918d24f71e | [
"MIT"
] | 9 | 2017-11-14T08:16:02.000Z | 2021-03-07T13:23:29.000Z | import math
import re
from functools import reduce
from collections import namedtuple
from .point import Point
from .meta import TILE_SIZE
BaseTile = namedtuple('BaseTile', 'tms_x tms_y zoom')
class Tile(BaseTile):
"""Immutable Tile class"""
@classmethod
def from_quad_tree(cls, quad_tree):
"""Creates a tile from a Microsoft QuadTree"""
assert bool(re.match('^[0-3]*$', quad_tree)), 'QuadTree value can only consists of the digits 0, 1, 2 and 3.'
zoom = len(str(quad_tree))
offset = int(math.pow(2, zoom)) - 1
google_x, google_y = [reduce(lambda result, bit: (result << 1) | bit, bits, 0)
for bits in zip(*(reversed(divmod(digit, 2))
for digit in (int(c) for c in str(quad_tree))))]
return cls(tms_x=google_x, tms_y=(offset - google_y), zoom=zoom)
@classmethod
def from_tms(cls, tms_x, tms_y, zoom):
"""Creates a tile from Tile Map Service (TMS) X Y and zoom"""
max_tile = (2 ** zoom) - 1
assert 0 <= tms_x <= max_tile, 'TMS X needs to be a value between 0 and (2^zoom) -1.'
assert 0 <= tms_y <= max_tile, 'TMS Y needs to be a value between 0 and (2^zoom) -1.'
return cls(tms_x=tms_x, tms_y=tms_y, zoom=zoom)
@classmethod
def from_google(cls, google_x, google_y, zoom):
"""Creates a tile from Google format X Y and zoom"""
max_tile = (2 ** zoom) - 1
assert 0 <= google_x <= max_tile, 'Google X needs to be a value between 0 and (2^zoom) -1.'
assert 0 <= google_y <= max_tile, 'Google Y needs to be a value between 0 and (2^zoom) -1.'
return cls(tms_x=google_x, tms_y=(2 ** zoom - 1) - google_y, zoom=zoom)
@classmethod
def for_point(cls, point, zoom):
"""Creates a tile for given point"""
latitude, longitude = point.latitude_longitude
return cls.for_latitude_longitude(latitude=latitude, longitude=longitude, zoom=zoom)
@classmethod
def for_pixels(cls, pixel_x, pixel_y, zoom):
"""Creates a tile from pixels X Y Z (zoom) in pyramid"""
tms_x = int(math.ceil(pixel_x / float(TILE_SIZE)) - 1)
tms_y = int(math.ceil(pixel_y / float(TILE_SIZE)) - 1)
return cls(tms_x=tms_x, tms_y=(2 ** zoom - 1) - tms_y, zoom=zoom)
@classmethod
def for_meters(cls, meter_x, meter_y, zoom):
"""Creates a tile from X Y meters in Spherical Mercator EPSG:900913"""
point = Point.from_meters(meter_x=meter_x, meter_y=meter_y)
pixel_x, pixel_y = point.pixels(zoom=zoom)
return cls.for_pixels(pixel_x=pixel_x, pixel_y=pixel_y, zoom=zoom)
@classmethod
def for_latitude_longitude(cls, latitude, longitude, zoom):
"""Creates a tile from lat/lon in WGS84"""
point = Point.from_latitude_longitude(latitude=latitude, longitude=longitude)
pixel_x, pixel_y = point.pixels(zoom=zoom)
return cls.for_pixels(pixel_x=pixel_x, pixel_y=pixel_y, zoom=zoom)
@property
def tms(self):
"""Gets the tile in pyramid from Tile Map Service (TMS)"""
return self.tms_x, self.tms_y
@property
def quad_tree(self):
"""Gets the tile in the Microsoft QuadTree format, converted from TMS"""
value = ''
tms_x, tms_y = self.tms
tms_y = (2 ** self.zoom - 1) - tms_y
for i in range(self.zoom, 0, -1):
digit = 0
mask = 1 << (i - 1)
if (tms_x & mask) != 0:
digit += 1
if (tms_y & mask) != 0:
digit += 2
value += str(digit)
return value
@property
def google(self):
"""Gets the tile in the Google format, converted from TMS"""
tms_x, tms_y = self.tms
return tms_x, (2 ** self.zoom - 1) - tms_y
@property
def bounds(self):
"""Gets the bounds of a tile represented as the most west and south point and the most east and north point"""
google_x, google_y = self.google
pixel_x_west, pixel_y_north = google_x * TILE_SIZE, google_y * TILE_SIZE
pixel_x_east, pixel_y_south = (google_x + 1) * TILE_SIZE, (google_y + 1) * TILE_SIZE
point_min = Point.from_pixel(pixel_x=pixel_x_west, pixel_y=pixel_y_south, zoom=self.zoom)
point_max = Point.from_pixel(pixel_x=pixel_x_east, pixel_y=pixel_y_north, zoom=self.zoom)
return point_min, point_max
__all__ = ['Tile']
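
# Usage sketch (added illustration, not part of the library): round-trip one
# tile through the addressing schemes implemented above.
if __name__ == '__main__':
    tile = Tile.from_tms(tms_x=67, tms_y=83, zoom=7)
    print(tile.google)     # Google XYZ flips the Y axis: (67, 44)
    print(tile.quad_tree)  # Microsoft QuadTree key: '1202211'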
| 40.899083 | 118 | 0.618663 | 692 | 4,458 | 3.790462 | 0.151734 | 0.027449 | 0.020587 | 0.0183 | 0.432711 | 0.367518 | 0.194053 | 0.174228 | 0.160884 | 0.149447 | 0 | 0.018394 | 0.268282 | 4,458 | 108 | 119 | 41.277778 | 0.785714 | 0.14087 | 0 | 0.24359 | 0 | 0 | 0.082515 | 0 | 0 | 0 | 0 | 0 | 0.064103 | 1 | 0.141026 | false | 0 | 0.076923 | 0 | 0.371795 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17b32a03eacb3b7786a65a8d9678832d9e175f53 | 857 | py | Python | tools/pythonpkg/tests/fast/arrow/test_multiple_reads.py | AldoMyrtaj/duckdb | 3aa4978a2ceab8df25e4b20c388bcd7629de73ed | [
"MIT"
] | 2,816 | 2018-06-26T18:52:52.000Z | 2021-04-06T10:39:15.000Z | tools/pythonpkg/tests/fast/arrow/test_multiple_reads.py | AldoMyrtaj/duckdb | 3aa4978a2ceab8df25e4b20c388bcd7629de73ed | [
"MIT"
] | 1,310 | 2021-04-06T16:04:52.000Z | 2022-03-31T13:52:53.000Z | tools/pythonpkg/tests/fast/arrow/test_multiple_reads.py | AldoMyrtaj/duckdb | 3aa4978a2ceab8df25e4b20c388bcd7629de73ed | [
"MIT"
] | 270 | 2021-04-09T06:18:28.000Z | 2022-03-31T11:55:37.000Z | import duckdb
import os
try:
import pyarrow
import pyarrow.parquet
can_run = True
except:
can_run = False
class TestArrowReads(object):
def test_multiple_queries_same_relation(self, duckdb_cursor):
if not can_run:
return
parquet_filename = os.path.join(os.path.dirname(os.path.realpath(__file__)),'data','userdata1.parquet')
cols = 'id, first_name, last_name, email, gender, ip_address, cc, country, birthdate, salary, title, comments'
userdata_parquet_table = pyarrow.parquet.read_table(parquet_filename)
userdata_parquet_table.validate(full=True)
rel = duckdb.from_arrow_table(userdata_parquet_table)
assert(rel.aggregate("(avg(salary))::INT").execute().fetchone()[0] == 149005)
assert(rel.aggregate("(avg(salary))::INT").execute().fetchone()[0] == 149005)
| 38.954545 | 118 | 0.697783 | 108 | 857 | 5.296296 | 0.592593 | 0.031469 | 0.104895 | 0.073427 | 0.181818 | 0.181818 | 0.181818 | 0.181818 | 0.181818 | 0.181818 | 0 | 0.021307 | 0.17853 | 857 | 21 | 119 | 40.809524 | 0.791193 | 0 | 0 | 0.105263 | 0 | 0.052632 | 0.184364 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 1 | 0.052632 | false | 0 | 0.210526 | 0 | 0.368421 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17b3d503d5b75870f3026522e73650f8bb3da1e8 | 899 | py | Python | python/seeder.py | lokhiufung/bookshelf-backend | fb34aea726404bd65d7c736afb16abdb2fc424a5 | [
"Apache-2.0"
] | null | null | null | python/seeder.py | lokhiufung/bookshelf-backend | fb34aea726404bd65d7c736afb16abdb2fc424a5 | [
"Apache-2.0"
] | null | null | null | python/seeder.py | lokhiufung/bookshelf-backend | fb34aea726404bd65d7c736afb16abdb2fc424a5 | [
"Apache-2.0"
] | null | null | null | import pymongo
from app import config
def main():
    # initialize a database; MongoDB creates databases and collections lazily
    # on the first insert, so we insert a placeholder document (deleted below)
    # just to materialize them
    db_name = config.DATABASE_NAME
col_name = config.BOOK_COLLECTION
client = pymongo.MongoClient("mongodb://localhost:27017/")
db = client[db_name]
    col = db[col_name]
col.insert_one({
'title': '<seeder-book>',
'url': 'https://<seeder-book>',
'description': '<seeder-book>',
'tags': [],
'book_id': "0000"
})
client.close()
print(f'created a database {db_name} and a collection {col_name}')
# remove seeder doc from db
client = pymongo.MongoClient("mongodb://localhost:27017/")
db = client[db_name]
    col = db[col_name]
col.delete_many({}) # remove all docs
client.close()
print(f'removed all seeder docs from the database {db_name} and collection {col_name}')
if __name__ == '__main__':
main()
| 23.657895 | 91 | 0.615128 | 112 | 899 | 4.732143 | 0.392857 | 0.056604 | 0.079245 | 0.056604 | 0.279245 | 0.279245 | 0.279245 | 0.279245 | 0.279245 | 0.279245 | 0 | 0.020679 | 0.246941 | 899 | 38 | 92 | 23.657895 | 0.762186 | 0.070078 | 0 | 0.32 | 0 | 0 | 0.328932 | 0.062425 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04 | false | 0 | 0.08 | 0 | 0.12 | 0.08 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17b3d797b5ecac3518638907a1a5630900abbc3b | 790 | py | Python | src/TimeLimitGenerator.py | RobertOlechowski/RR_Utils_Python | e25375638ba765c5f7bab545e63d1fdc9743d4c4 | [
"MIT"
] | null | null | null | src/TimeLimitGenerator.py | RobertOlechowski/RR_Utils_Python | e25375638ba765c5f7bab545e63d1fdc9743d4c4 | [
"MIT"
] | null | null | null | src/TimeLimitGenerator.py | RobertOlechowski/RR_Utils_Python | e25375638ba765c5f7bab545e63d1fdc9743d4c4 | [
"MIT"
] | null | null | null | import time
class TimeLimitGenerator:
    def __init__(self, time_limit_seconds, func):
        self._time_limit = time_limit_seconds
        self._func = func
        self._counter = 0  # make get_counter() safe before iteration begins
def get_counter(self):
return self._counter
def __iter__(self):
self._counter = 0
self._start_time = time.time()
self._prev_time = self._start_time
return self
def __next__(self):
current_time = time.time()
delta_time_sec = current_time - self._start_time
if delta_time_sec > self._time_limit:
raise StopIteration
since_prev_time = current_time - self._prev_time
self._prev_time = current_time
result = self._func(self._counter, delta_time_sec, since_prev_time)
self._counter += 1
return result
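
# Usage sketch (added illustration, not part of the original module): iterate
# for roughly two seconds; the callback receives (counter, elapsed_seconds,
# seconds_since_previous_step) on every step.
if __name__ == "__main__":
    gen = TimeLimitGenerator(2.0, lambda i, elapsed, delta: (i, elapsed, delta))
    for _ in gen:
        time.sleep(0.1)  # simulate per-step work
    print("steps completed:", gen.get_counter())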
| 26.333333 | 75 | 0.651899 | 99 | 790 | 4.666667 | 0.272727 | 0.103896 | 0.084416 | 0.103896 | 0.08658 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003509 | 0.278481 | 790 | 29 | 76 | 27.241379 | 0.807018 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.045455 | 0.045455 | 0.409091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17b67744b7c2e3cc06102338dce1301cfa770840 | 932 | py | Python | singleuser/srv/jupyter_via_proxy/jupyter_via_proxy/via.py | victor-moreno/jupyterhub-deploy-docker-VM | 002af508122d0f1919c704f719acd3d837174d4b | [
"BSD-3-Clause"
] | 3 | 2021-11-15T12:54:24.000Z | 2022-02-07T07:45:24.000Z | singleuser/srv/jupyter_via_proxy/jupyter_via_proxy/via.py | victor-moreno/jupyterhub-deploy-docker-VM | 002af508122d0f1919c704f719acd3d837174d4b | [
"BSD-3-Clause"
] | 1 | 2022-01-10T21:01:31.000Z | 2022-03-15T03:48:13.000Z | singleuser/srv/jupyter_via_proxy/jupyter_via_proxy/via.py | victor-moreno/jupyterhub-deploy-docker-VM | 002af508122d0f1919c704f719acd3d837174d4b | [
"BSD-3-Clause"
] | 1 | 2022-02-08T20:05:45.000Z | 2022-02-08T20:05:45.000Z | #!/usr/bin/env python
from flask import Flask, render_template, url_for
from optparse import OptionParser
import os
app = Flask(__name__)
@app.route('/')
def index():
return render_template('via.html')
if __name__ == '__main__':
parser = OptionParser(usage='Usage: %prog [options]')
parser.add_option('-l', '--listen', metavar='ADDRESS', dest='host', default='127.0.0.1', help='address to listen on [127.0.0.1]')
parser.add_option('-p', '--port', metavar='PORT', dest='port', type='int', default=5000, help='port to listen on [5000]')
# parser.add_option('-b', '--base_url', metavar='{base_url}', dest='BASE_URL', type='str', help='base_url')
(opts, args) = parser.parse_args()
# set options
for k in dir(opts):
if not k.startswith('_') and getattr(opts, k) is None:
delattr(opts, k)
app.config.from_object(opts)
app.run(host=opts.host, port=opts.port, threaded=True) | 37.28 | 133 | 0.655579 | 138 | 932 | 4.246377 | 0.507246 | 0.047782 | 0.076792 | 0.020478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02551 | 0.158798 | 932 | 25 | 134 | 37.28 | 0.721939 | 0.148069 | 0 | 0 | 0 | 0 | 0.183081 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.176471 | 0.058824 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17badf19bf270da535a1fca161c597d79bf9a0c8 | 3,104 | py | Python | setup.py | elijahr/aiozyre | b81e958ec45b422cd26a6b2de80a9cbf97ad0416 | [
"BSD-3-Clause"
] | 4 | 2020-01-07T10:23:27.000Z | 2021-03-24T08:19:33.000Z | setup.py | elijahr/aiozyre | b81e958ec45b422cd26a6b2de80a9cbf97ad0416 | [
"BSD-3-Clause"
] | 3 | 2020-05-24T04:45:10.000Z | 2020-09-11T18:15:56.000Z | setup.py | elijahr/aiozyre | b81e958ec45b422cd26a6b2de80a9cbf97ad0416 | [
"BSD-3-Clause"
] | 1 | 2020-05-23T15:05:30.000Z | 2020-05-23T15:05:30.000Z | import glob
import itertools
import os
import sys
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext as _build_ext
from setuptools.command.install import install as _install
DIR = os.path.dirname(__file__)
with open("README.md", "r") as fh:
long_description = fh.read()
class build_ext(_build_ext):
def initialize_options(self):
super(build_ext, self).initialize_options()
self.debug = '--debug' in sys.argv
def finalize_options(self):
from Cython.Build.Dependencies import cythonize
for item in itertools.chain(
glob.glob(os.path.join(DIR, 'src', 'aiozyre', '*.c')),
glob.glob(os.path.join(DIR, 'src', 'aiozyre', '*.h'))):
os.remove(item)
self.distribution.ext_modules[:] = cythonize(
self.distribution.ext_modules,
gdb_debug=self.debug,
)
super(build_ext, self).finalize_options()
# Never install as an egg
self.single_version_externally_managed = False
class install(_install):
user_options = _install.user_options + [
('debug', None, 'Build with debug symbols'),
]
def initialize_options(self):
super(install, self).initialize_options()
self.debug = '--debug' in sys.argv
def finalize_options(self):
super(install, self).finalize_options()
# Never install as an egg
self.single_version_externally_managed = False
def get_pyx():
for path in glob.glob(os.path.join(DIR, 'src', 'aiozyre', '*.pyx')):
module = 'aiozyre.%s' % os.path.splitext(os.path.basename(path))[0]
source = os.path.join('src', 'aiozyre', os.path.basename(path))
yield module, source
setup(
name='aiozyre',
version='1.1.5',
description='asyncio-friendly Python bindings for Zyre',
long_description=long_description,
long_description_content_type="text/markdown",
author='Elijah Shaw-Rutschman',
author_email='elijahr+aiozyre@gmail.com',
packages=['aiozyre'],
package_dir={
'aiozyre': os.path.join('src', 'aiozyre'),
},
data_files=['README.md', 'LICENSE'],
package_data={
'aiozyre': [
# Include cython source
'*.pyx',
'*.pxd',
],
},
cmdclass={
'build_ext': build_ext,
'install': install
},
ext_modules=[
Extension(
module,
sources=[source],
libraries=['czmq', 'zyre'],
)
for module, source in get_pyx()
],
setup_requires=['cython'],
extras_require={
'dev': [
'blessed',
'aioconsole',
]
},
classifiers=[
'Environment :: Console',
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Operating System :: MacOS :: MacOS X',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3',
'Topic :: System :: Networking',
'Framework :: AsyncIO',
],
zip_safe=False,
)
| 27.22807 | 75 | 0.596649 | 342 | 3,104 | 5.260234 | 0.380117 | 0.040022 | 0.027793 | 0.023346 | 0.275709 | 0.210673 | 0.210673 | 0.210673 | 0.158977 | 0.158977 | 0 | 0.002648 | 0.269974 | 3,104 | 113 | 76 | 27.469027 | 0.791262 | 0.022229 | 0 | 0.142857 | 0 | 0 | 0.191026 | 0.008248 | 0 | 0 | 0 | 0 | 0 | 1 | 0.054945 | false | 0 | 0.087912 | 0 | 0.175824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17bc2551886ce368e0b3dda66717113fed757f21 | 1,159 | py | Python | Week8/D_chainsaw_jugglers_small.py | ACM-UCI/Spring-2018-Practice | 559af1946a45eb5e6ba4b9f03eb0e95341426868 | [
"MIT"
] | 1 | 2018-04-18T06:47:32.000Z | 2018-04-18T06:47:32.000Z | Week8/D_chainsaw_jugglers_small.py | ACM-UCI/Spring-2018-Practice | 559af1946a45eb5e6ba4b9f03eb0e95341426868 | [
"MIT"
] | null | null | null | Week8/D_chainsaw_jugglers_small.py | ACM-UCI/Spring-2018-Practice | 559af1946a45eb5e6ba4b9f03eb0e95341426868 | [
"MIT"
] | null | null | null |
##code for SMALL testcase begins here
DP = [[0 for j in range(501)] for k in range(501)]
DPp = [[0 for j in range(501)] for k in range(501)]
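# DP[j][k] = max number of distinct (red, blue) handouts achievable with at
# most j red and k blue chainsaws in total: a 0/1 knapsack over the candidate
# pairs in rbs, with DPp holding the table from the previous item.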
#summation = 51
rbs = []
for i in range(35):
for j in range(35):
if i>0 or j>0:
rbs.append((i,j))
for i in range(len(rbs)):
red, blue = rbs[i]
for j in range(len(DP)):
for k in range(len(DP[j])):
if j-red >= 0 and k-blue >= 0:
DP[j][k] = max(DPp[j][k], DPp[j-red][k-blue]+1)
else:
DP[j][k] = DPp[j][k]
DPp = [[x for x in l] for l in DP]
##ends here
T = int(input())
for t in range(T):
#print(t)
r, b = map(int, input().split())
##
## ans = 0
##
## summation = 1
## while r > 0 or b > 0:
## rbs = [(x, summation-x) for x in range(summation+1)]
## rbs.sort(key=lambda x: abs(x[0]-x[1]))
## #print(rbs)
## for red, blue in rbs:
## if r >= red and b >= blue:
## r-=red
## b-= blue
## ans+=1
## summation+=1
## if summation > 500:
## break
print("Case #{}: {}".format(t+1, DP[r][b]))
| 25.195652 | 63 | 0.451251 | 195 | 1,159 | 2.682051 | 0.25641 | 0.147228 | 0.045889 | 0.08413 | 0.110899 | 0.110899 | 0.110899 | 0.110899 | 0.110899 | 0.110899 | 0 | 0.050938 | 0.356342 | 1,159 | 45 | 64 | 25.755556 | 0.650134 | 0.381363 | 0 | 0 | 0 | 0 | 0.017725 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.05 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17bd0aaa942f33294411e73e2bf01d05dd0073cf | 2,083 | py | Python | tests/test_regridder.py | cp4cds/c4cds-wps | 5abd9281195548bbd1e7653fe2ab1fee26745200 | [
"Apache-2.0"
] | null | null | null | tests/test_regridder.py | cp4cds/c4cds-wps | 5abd9281195548bbd1e7653fe2ab1fee26745200 | [
"Apache-2.0"
] | 4 | 2018-10-24T15:08:48.000Z | 2020-01-14T16:05:26.000Z | tests/test_regridder.py | cp4cds/c4cds-wps | 5abd9281195548bbd1e7653fe2ab1fee26745200 | [
"Apache-2.0"
] | null | null | null | import pytest
from c4cds.regridder import Regridder, GLOBAL, REGIONAL
from .common import C3S_CMIP5_NC, CORDEX_NC, C3S_CMIP5_ARCHIVE_BASE, CORDEX_ARCHIVE_BASE, resource_ok
def test_create_output_dir():
regridder = Regridder()
assert 'out_regrid/1_deg' in regridder.create_output_dir(domain_type=GLOBAL)
assert 'out_regrid/0.5_deg' in regridder.create_output_dir(domain_type=REGIONAL)
def test_get_grid_definition_file():
regridder = Regridder(archive_base=CORDEX_ARCHIVE_BASE)
assert 'grid_files/ll1deg_grid.nc' in regridder.get_grid_definition_file(
C3S_CMIP5_NC, domain_type=GLOBAL)
assert 'grid_files/ll0.5deg_AFR-44i.nc' in regridder.get_grid_definition_file(
CORDEX_NC, domain_type=REGIONAL)
@pytest.mark.skipif(not resource_ok(CORDEX_NC),
reason="Test data not available.")
def test_validate_input_grid():
regridder = Regridder(archive_base=CORDEX_ARCHIVE_BASE)
regridder.validate_input_grid(CORDEX_NC)
@pytest.mark.skipif(not resource_ok(CORDEX_NC),
reason="Test data not available.")
def test_validate_regridded_file_cordex():
regridder = Regridder(archive_base=CORDEX_ARCHIVE_BASE)
regridder.validate_regridded_file(CORDEX_NC, REGIONAL)
@pytest.mark.skip(reason='no regridded file')
def test_validate_regridded_file_cmip5():
regridder = Regridder(archive_base=C3S_CMIP5_ARCHIVE_BASE)
regridder.validate_regridded_file(C3S_CMIP5_NC, GLOBAL)
@pytest.mark.skip(reason="no grid file for CORDEX.")
def test_regrid_cordex():
regridder = Regridder(archive_base=CORDEX_ARCHIVE_BASE)
assert '0.5_deg/tasmin_AFR-44i_ECMWF-ERAINT_evaluation_r1i1p1_MOHC-HadRM3P_v1_mon_199001-199012.nc' \
in regridder.regrid(CORDEX_NC, REGIONAL)
@pytest.mark.skipif(not resource_ok(C3S_CMIP5_NC),
reason="Test data not available.")
def test_regrid_cmip5():
regridder = Regridder(archive_base=C3S_CMIP5_ARCHIVE_BASE)
assert '1_deg/tas_Amon_HadGEM2-ES_historical_r1i1p1_186001-186012.nc' \
in regridder.regrid(C3S_CMIP5_NC, GLOBAL)
| 37.872727 | 105 | 0.776284 | 292 | 2,083 | 5.157534 | 0.243151 | 0.102258 | 0.099602 | 0.115538 | 0.60425 | 0.51328 | 0.484728 | 0.409695 | 0.257636 | 0.103586 | 0 | 0.036171 | 0.137302 | 2,083 | 54 | 106 | 38.574074 | 0.801892 | 0 | 0 | 0.289474 | 0 | 0 | 0.168987 | 0.098416 | 0 | 0 | 0 | 0 | 0.157895 | 1 | 0.184211 | false | 0 | 0.078947 | 0 | 0.263158 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c0d5ddeaf776c63d3684a33a6d59abcaacc4e1 | 15,345 | py | Python | python/tk_houdini_flipbook/dialog.py | SinkingShipEntertainment/tk-houdini-flipbook | cea03404cc3056f640ee06003d3654523adb352b | [
"MIT"
] | null | null | null | python/tk_houdini_flipbook/dialog.py | SinkingShipEntertainment/tk-houdini-flipbook | cea03404cc3056f640ee06003d3654523adb352b | [
"MIT"
] | null | null | null | python/tk_houdini_flipbook/dialog.py | SinkingShipEntertainment/tk-houdini-flipbook | cea03404cc3056f640ee06003d3654523adb352b | [
"MIT"
] | 1 | 2021-06-28T22:13:04.000Z | 2021-06-28T22:13:04.000Z | # MIT License
# Copyright (c) 2020 Netherlands Film Academy
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
import os
import hou
from .create_flipbook import CreateFlipbook
from .create_slate import CreateSlate
from .submit_version import SubmitVersion
from PySide2 import QtGui
from PySide2 import QtWidgets
class FlipbookDialog(QtWidgets.QDialog):
def __init__(self, app, parent=None):
QtWidgets.QDialog.__init__(self, parent)
self.app = app
# create an instance of CreateFlipbook
# create an instance of CreateSlate
self.flipbook = CreateFlipbook(app)
self.slate = CreateSlate(app)
# other properties
self.setWindowTitle("SGTK Flipbook")
# define general layout
layout = QtWidgets.QVBoxLayout()
groupLayout = QtWidgets.QVBoxLayout()
# widgets
self.outputLabel = QtWidgets.QLabel(
"Flipbooking to: %s"
% (os.path.basename(self.flipbook.getOutputPath()["finFile"]))
)
self.outputToMplay = QtWidgets.QCheckBox("MPlay Output", self)
self.outputToMplay.setChecked(True)
self.beautyPassOnly = QtWidgets.QCheckBox("Beauty Pass", self)
self.useMotionblur = QtWidgets.QCheckBox("Motion Blur", self)
# save new version widget
self.saveNewVersionCheckbox = QtWidgets.QCheckBox(
"Save New Version", self)
self.saveNewVersionCheckbox.setChecked(True)
# description widget
self.descriptionLabel = QtWidgets.QLabel("Description")
self.description = QtWidgets.QLineEdit()
# resolution sub-widgets x
self.resolutionX = QtWidgets.QWidget()
resolutionXLayout = QtWidgets.QVBoxLayout()
self.resolutionXLabel = QtWidgets.QLabel("Width")
self.resolutionXLine = QtWidgets.QLineEdit()
self.resolutionX.default = "1920"
self.resolutionXLine.setPlaceholderText(self.resolutionX.default)
self.resolutionXLine.setInputMask("9990")
resolutionXLayout.addWidget(self.resolutionXLabel)
resolutionXLayout.addWidget(self.resolutionXLine)
self.resolutionX.setLayout(resolutionXLayout)
# resolution sub-widgets y
self.resolutionY = QtWidgets.QWidget()
resolutionYLayout = QtWidgets.QVBoxLayout()
self.resolutionYLabel = QtWidgets.QLabel("Height")
self.resolutionYLine = QtWidgets.QLineEdit()
self.resolutionY.default = "1080"
self.resolutionYLine.setPlaceholderText(self.resolutionY.default)
self.resolutionYLine.setInputMask("9990")
resolutionYLayout.addWidget(self.resolutionYLabel)
resolutionYLayout.addWidget(self.resolutionYLine)
self.resolutionY.setLayout(resolutionYLayout)
# resolution group
self.resolutionGroup = QtWidgets.QGroupBox("Resolution")
resolutionGroupLayout = QtWidgets.QHBoxLayout()
resolutionGroupLayout.addWidget(self.resolutionX)
resolutionGroupLayout.addWidget(self.resolutionY)
self.resolutionGroup.setLayout(resolutionGroupLayout)
# frame range widget
self.frameRange = QtWidgets.QGroupBox("Frame range")
frameRangeGroupLayout = QtWidgets.QHBoxLayout()
# frame range start sub-widget
self.frameRangeStart = QtWidgets.QWidget()
frameRangeStartLayout = QtWidgets.QVBoxLayout()
self.frameRangeStartLabel = QtWidgets.QLabel("Start")
self.frameRangeStartLine = QtWidgets.QLineEdit()
self.frameRangeStartLine.setPlaceholderText(
"%i" % (self.flipbook.getFrameRange()[0])
)
self.frameRangeStartLine.setInputMask("9000")
frameRangeStartLayout.addWidget(self.frameRangeStartLabel)
frameRangeStartLayout.addWidget(self.frameRangeStartLine)
self.frameRangeStart.setLayout(frameRangeStartLayout)
frameRangeGroupLayout.addWidget(self.frameRangeStart)
# frame range end sub-widget
self.frameRangeEnd = QtWidgets.QWidget()
frameRangeEndLayout = QtWidgets.QVBoxLayout()
self.frameRangeEndLabel = QtWidgets.QLabel("End")
self.frameRangeEndLine = QtWidgets.QLineEdit()
self.frameRangeEndLine.setPlaceholderText(
"%i" % (self.flipbook.getFrameRange()[1])
)
self.frameRangeEndLine.setInputMask("9000")
frameRangeEndLayout.addWidget(self.frameRangeEndLabel)
frameRangeEndLayout.addWidget(self.frameRangeEndLine)
self.frameRangeEnd.setLayout(frameRangeEndLayout)
frameRangeGroupLayout.addWidget(self.frameRangeEnd)
# frame range widget finalizing
self.frameRange.setLayout(frameRangeGroupLayout)
# copy to path widget
self.copyPathButton = QtWidgets.QPushButton("Copy Path to Clipboard")
# options group
self.optionsGroup = QtWidgets.QGroupBox("Flipbook options")
groupLayout.addWidget(self.outputToMplay)
groupLayout.addWidget(self.beautyPassOnly)
groupLayout.addWidget(self.useMotionblur)
groupLayout.addWidget(self.saveNewVersionCheckbox)
groupLayout.addWidget(self.copyPathButton)
self.optionsGroup.setLayout(groupLayout)
# button box buttons
self.cancelButton = QtWidgets.QPushButton("Cancel")
self.startButton = QtWidgets.QPushButton("Start Flipbook")
# lower right button box
buttonBox = QtWidgets.QDialogButtonBox()
buttonBox.addButton(
self.startButton, QtWidgets.QDialogButtonBox.ActionRole)
buttonBox.addButton(self.cancelButton,
QtWidgets.QDialogButtonBox.ActionRole)
# widgets additions
layout.addWidget(self.outputLabel)
layout.addWidget(self.descriptionLabel)
layout.addWidget(self.description)
layout.addWidget(self.frameRange)
layout.addWidget(self.resolutionGroup)
layout.addWidget(self.optionsGroup)
layout.addWidget(buttonBox)
# connect button functionality
self.cancelButton.clicked.connect(self.closeWindow)
self.startButton.clicked.connect(self.startFlipbook)
self.copyPathButton.clicked.connect(self.copyPathToClipboard)
# finally, set layout
self.setLayout(layout)
def closeWindow(self):
self.close()
def startFlipbook(self):
inputSettings = {}
outputPath = self.flipbook.getOutputPath()
description = self.validateDescription()
# create submitter class
submit = SubmitVersion(
self.app,
# @CH - updating finFile for jpg sequence instead mov
#outputPath["writeTempFile"].replace("$F4", "%04d"),
outputPath["finFile"],
int(self.validateFrameRange()[0]),
int(self.validateFrameRange()[1]),
description,
)
# validation of inputs
inputSettings["frameRange"] = self.validateFrameRange()
inputSettings["resolution"] = self.validateResolution()
inputSettings["mplay"] = self.validateMplay()
inputSettings["beautyPass"] = self.validateBeauty()
inputSettings["motionBlur"] = self.validateMotionBlur()
# inputSettings["output"] = outputPath["writeTempFile"].replace("$F4", "####")
inputSettings["output"] = outputPath["writeTempFile"]
# @CH - updating finFile for jpg sequence instead mov
#inputSettings["sessionLabel"] = outputPath["writeTempFile"]
inputSettings["sessionLabel"] = outputPath["finFile"]
self.app.logger.debug(
"Using the following settings, %s" % (inputSettings))
# retrieve full settings object
settings = self.flipbook.getFlipbookSettings(inputSettings)
# run the actual flipbook
try:
with hou.InterruptableOperation(
"Flipbooking",
long_operation_name="Creating a flipbook",
open_interrupt_dialog=True,
) as operation:
operation.updateLongProgress(0, "Starting Flipbook")
self.flipbook.runFlipbook(settings)
operation.updateLongProgress(
0.25, "Passing: Rendering to Nuke, please sit tight."
)
# breaking nuke dependency @CH
#
# heres the method request for Nuke
# self.slate.runSlate(
# outputPath["inputTempFile"],
# outputPath["finFile"],
# inputSettings,
# )
# debug purpose
# self.app.logger.debug()
operation.updateLongProgress(
0.35, "Converting image-sequence, please sit tight."
)
m = ">> Converting image-sequence using ffmpeg..."
self.app.logger.debug(m)
self.app.logger.debug(">> Saving to output path -> {}".format(
outputPath))
# converting mov sequence
submit.convert_sequence(
outputPath["inputTempFile"],
outputPath["finFile"])
operation.updateLongProgress(0.75, "Saving")
self.app.logger.debug(">> Saving...")
self.saveNewVersion()
operation.updateLongProgress(1, "Done, closing window.")
self.closeWindow()
# submit/upload version confirmation
text_confirm = "Upload as Version to Shotgun?"
upload_confirmed = hou.ui.displayMessage(
text_confirm,
buttons=('Not now', 'Ok',),
severity=hou.severityType.ImportantMessage,
default_choice=1,
close_choice=-1,
help="",
title="SG Version",
details="",
details_label="",
details_expanded=False)
if (upload_confirmed):
# operation.updateLongProgress(0.5, "Uploading to Shotgun")
self.app.logger.debug(">> Uploading to Shotgun...")
submit.submit_version()
self.app.logger.info("Done! Flipbook successful")
hou.ui.displayMessage("Done! Flipbook successful!")
except Exception as e:
self.app.logger.error("Oops, something went wrong!")
self.app.logger.error(e)
hou.ui.displayMessage("Oops, something went wrong: {}".format(e))
return
# copyPathButton callback
# copy the output path to the clipboard
def copyPathToClipboard(self):
path = self.flipbook.getOutputPath()['finFile']
self.app.logger.debug("Copying path to clipboard: %s" % path)
QtGui.QGuiApplication.clipboard().setText(path)
return
# saveNewVersion callback
def saveNewVersion(self):
# if validateSaveNewVersion returns true, save the current hipfile with an incremented version number
if(self.validateSaveNewVersion()):
self.app.logger.debug("Saving new version.")
hou.hipFile.saveAndIncrementFileName()
# if validateSaveNewVersion returns false, just save the current hipfile
else:
hou.hipFile.save()
# saveNewVersion validation
# check if the save new version option is ticked
def validateSaveNewVersion(self):
return self.saveNewVersionCheckbox.isChecked()
def validateFrameRange(self):
# validating the frame range input
frameRange = []
if self.frameRangeStartLine.hasAcceptableInput():
self.app.logger.debug(
"Setting start of frame range to %s" % (
self.frameRangeStartLine.text())
)
frameRange.append(int(self.frameRangeStartLine.text()))
else:
self.app.logger.debug(
"Setting start of frame range to %i"
% (self.flipbook.getFrameRange()[0])
)
frameRange.append(self.flipbook.getFrameRange()[0])
if self.frameRangeEndLine.hasAcceptableInput():
self.app.logger.debug(
"Setting end of frame range to %s" % (
self.frameRangeEndLine.text())
)
frameRange.append(int(self.frameRangeEndLine.text()))
else:
self.app.logger.debug(
"Setting end of frame range to %i" % (
self.flipbook.getFrameRange()[1])
)
frameRange.append(self.flipbook.getFrameRange()[1])
return tuple(frameRange)
def validateResolution(self):
# validating the resolution input
resolution = []
if self.resolutionXLine.hasAcceptableInput():
self.app.logger.debug(
"Setting width resolution to %s" % (
self.resolutionXLine.text())
)
resolution.append(int(self.resolutionXLine.text()))
else:
self.app.logger.debug(
"Setting width resolution to %s" % (self.resolutionX.default)
)
resolution.append(int(self.resolutionX.default))
if self.resolutionYLine.hasAcceptableInput():
self.app.logger.debug(
"Setting height resolution to %s" % (
self.resolutionYLine.text())
)
resolution.append(int(self.resolutionYLine.text()))
else:
self.app.logger.debug(
"Setting height resolution to %s" % (self.resolutionY.default)
)
resolution.append(int(self.resolutionY.default))
return tuple(resolution)
def validateMplay(self):
# validating the mplay checkbox
return self.outputToMplay.isChecked()
def validateBeauty(self):
# validating the beauty pass checkbox
return self.beautyPassOnly.isChecked()
def validateMotionBlur(self):
# validating the motion blur checkbox
return self.useMotionblur.isChecked()
    def validateDescription(self):
        # note: wrapping the encoded bytes in str() would yield "b'...'" on
        # Python 3, so return the text directly
        return self.description.text()
| 40.170157 | 109 | 0.632584 | 1,364 | 15,345 | 7.09824 | 0.271261 | 0.030882 | 0.025511 | 0.029746 | 0.117228 | 0.063004 | 0.054534 | 0.051229 | 0.037389 | 0.037389 | 0 | 0.005241 | 0.278788 | 15,345 | 381 | 110 | 40.275591 | 0.869612 | 0.180059 | 0 | 0.071713 | 0 | 0 | 0.088924 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047809 | false | 0.01992 | 0.031873 | 0.01992 | 0.119522 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c0ff208b3d9fad2f476aaa9302a5ad8c54a42d | 929 | py | Python | userbot/plugins/detail.py | aksr-aashish/FIREXUSERBOT | dff0b7bf028cb27779626ce523402346cc990402 | [
"MIT"
] | null | null | null | userbot/plugins/detail.py | aksr-aashish/FIREXUSERBOT | dff0b7bf028cb27779626ce523402346cc990402 | [
"MIT"
] | 1 | 2022-01-09T11:35:06.000Z | 2022-01-09T11:35:06.000Z | userbot/plugins/detail.py | aksr-aashish/FIREXUSERBOT | dff0b7bf028cb27779626ce523402346cc990402 | [
"MIT"
] | null | null | null | import asyncio
from userbot import CmdHelp, bot
from userbot.cmdhelp import CmdHelp
from userbot.utils import admin_cmd, sudo_cmd
CmdHelp("detail").add_command("detailed", None, "help to get detail of plugin").add()
@bot.on(sudo_cmd(pattern="detailed ?(.*)", allow_sudo=True))
@bot.on(admin_cmd(pattern="detailed ?(.*)"))
async def _(event):
help_plugs = event.pattern_match.group(1).lower()
if help_plugs:
if help_plugs in CmdHelp:
await event.edit(f"Details For 🗡 {CmdHelp[help_plugs]}")
else:
await event.edit(
f"Nothign Is Named as {help_plugs} `.help` to see valid plugs"
)
else:
help_string = "".join(f'`{i[0]}`, ' for i in CmdHelp.values())
help_string = help_string[:-2]
await event.edit("`Are You Commedy Me?`!\n\n" f"{help_string}")
await asyncio.sleep(2)
await event.edit("`Specify A Plugin`")
| 34.407407 | 85 | 0.632939 | 132 | 929 | 4.333333 | 0.462121 | 0.078671 | 0.097902 | 0.052448 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005548 | 0.223897 | 929 | 26 | 86 | 35.730769 | 0.786408 | 0 | 0 | 0.090909 | 0 | 0 | 0.248654 | 0.022605 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.181818 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c1601ccb8f54551bb1bf729a71067229571a33 | 2,345 | py | Python | TedSeg/learner/utils.py | tadeephuy/CoFo | 28461e923f112182887d66d1db499da7a2535b28 | [
"MIT"
] | 2 | 2022-02-15T07:58:29.000Z | 2022-02-25T10:08:59.000Z | TedSeg/learner/utils.py | tadeephuy/CoFo | 28461e923f112182887d66d1db499da7a2535b28 | [
"MIT"
] | null | null | null | TedSeg/learner/utils.py | tadeephuy/CoFo | 28461e923f112182887d66d1db499da7a2535b28 | [
"MIT"
] | null | null | null | from . import *
from ..utils import *
@torch.no_grad()
def show_img(self, bar=None):
ds = self.data[1].dataset
self.model.eval()
idx = np.random.randint(len(ds))
img, mask = ds[idx]
pred = self.model(img.unsqueeze(0).cuda())[0].detach().cpu()
pred = torch.sigmoid(pred).numpy()
img_pred = (pred[0]>=0.5)
img = img.permute(1,2,0).detach().numpy()
    # overlay the thresholded prediction as a purple tint; (img * 0.22) + 0.5
    # undoes the (img - 0.5) / 0.22 normalization applied at input time
    color_mask = np.array([0.4*img_pred, 0.0*img_pred, 0.92*img_pred])
    color_mask = np.transpose(color_mask, (1,2,0))
    img_pred = img_pred[...,None]
    img = (img * 0.22) + 0.5
    pred_blend = 0.4*color_mask + 0.6*img*img_pred + (1 - img_pred)*img
    # overlay the ground-truth mask as a green tint for side-by-side display
    mask = mask.permute(1,2,0)[...,0].detach().numpy()
    color_mask = np.array([0.4*mask, 1.0*mask, 0.25*mask])
    color_mask = np.transpose(color_mask, (1,2,0))
    mask = mask[...,None]
    target_blend = 0.5*color_mask + 0.5*img*mask + (1 - mask)*img
img = np.concatenate([img, target_blend, pred_blend], axis=1)*255
imgs_out = Image.fromarray(img.astype(np.uint8), 'RGB')
if bar is None:
display(imgs_out)
return
if not hasattr(bar, 'imgs_out'):
bar.imgs_out = display(imgs_out, display_id=True)
else:
bar.imgs_out.update(imgs_out)
Learner.show_img = show_img
def get_dice_score(self, test_loader):
self.model.eval()
dices = []
with torch.no_grad():
b = progress_bar(test_loader)
for xb, yb in b:
xb = xb.cuda()
yb = yb.cuda()
pred = self.model(xb)
dice = batch_dice_score(pred, yb)
dice = dice.detach().cpu().numpy()
dices.append(dice)
dice_score = np.concatenate(dices, axis=0)
b.comment = f'{dice_score.mean():.2f}'
return dice_score.mean()
Learner.get_dice_score = get_dice_score
@torch.no_grad()
def predict(self, img_path, preprocess=None):
self.model.eval()
img = self.data[1].dataset.imread(img_path)
if preprocess is None:
img = self.data[1].dataset.transforms(image=img)['image']
else:
img = preprocess(img)
img = torch.tensor(np.transpose(img, (2,0,1)))/255
if self.data[1].dataset.normalize:
img = (img - 0.5)/0.22
img = img[None].cuda()
mask = self.model(img).sigmoid().detach().cpu().numpy()[0]
return mask
Learner.predict = predict | 31.266667 | 71 | 0.604691 | 366 | 2,345 | 3.737705 | 0.251366 | 0.040936 | 0.026316 | 0.046784 | 0.118421 | 0.090643 | 0.090643 | 0.090643 | 0.090643 | 0 | 0 | 0.03861 | 0.226866 | 2,345 | 75 | 72 | 31.266667 | 0.71594 | 0 | 0 | 0.142857 | 0 | 0 | 0.016624 | 0.009804 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.031746 | 0 | 0.126984 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c1e8e11a64f95ca20819d8d491121fc16a8e05 | 1,408 | py | Python | quadlib/utils/Confirm.py | symunona/quadroscope | 8bd9591b6e8201daebc9804a9e35204a9a9eb512 | [
"MIT"
] | null | null | null | quadlib/utils/Confirm.py | symunona/quadroscope | 8bd9591b6e8201daebc9804a9e35204a9a9eb512 | [
"MIT"
] | null | null | null | quadlib/utils/Confirm.py | symunona/quadroscope | 8bd9591b6e8201daebc9804a9e35204a9a9eb512 | [
"MIT"
] | null | null | null | import pygame
from .. import utils
from ..states.State import State
from ..utils.Scroller import Scroller
from ..utils import pygame_utils
offsety = pygame_utils.Calc.centerY(utils.screen['fontsize'])
offsetx = pygame_utils.Calc.centerX(10)
class Confirm(State):
def __init__(self, stack, question, success_callback):
State.__init__(self, stack)
self.title = question
self.success_callback = success_callback
self.scroller = Scroller(['no','yes'])
def draw(self, surface):
State.draw(self, surface)
pygame_utils.txt_large(surface, (offsetx, offsety), self.scroller.get_value(), (0,255,255))
pygame_utils.txt(surface, (offsetx, offsety+30), self.scroller.get_value(self.scroller.get_index()+1), (128,128,128))
def event(self, event):
self.scroller.event(event)
if event.type == pygame.MOUSEBUTTONDOWN:
# ok
if event.button == 2 :
if self.scroller.get_value() == 'yes':
self.success_callback()
self.back()
return
# cancel
if event.button == 1 :
self.back()
return
State.event(self, event)
| 34.341463 | 126 | 0.533381 | 144 | 1,408 | 5.0625 | 0.340278 | 0.098765 | 0.082305 | 0.082305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025641 | 0.362926 | 1,408 | 41 | 127 | 34.341463 | 0.787068 | 0.006392 | 0 | 0.137931 | 0 | 0 | 0.011791 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.103448 | false | 0 | 0.172414 | 0 | 0.37931 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c292b7c9c0b62674e710a4685413cb9eb2fbed | 7,275 | py | Python | assets/sintel_lite_v2_1/tfx_exporter.py | fire/TressFX-OpenGL | 15f1820408d2b9c4696bb9fd8777d7445875705b | [
"MIT"
] | 7 | 2019-03-12T08:11:39.000Z | 2020-08-11T18:47:50.000Z | assets/sintel_lite_v2_1/tfx_exporter.py | fire/TressFX-OpenGL | 15f1820408d2b9c4696bb9fd8777d7445875705b | [
"MIT"
] | null | null | null | assets/sintel_lite_v2_1/tfx_exporter.py | fire/TressFX-OpenGL | 15f1820408d2b9c4696bb9fd8777d7445875705b | [
"MIT"
] | 3 | 2019-02-28T13:06:17.000Z | 2021-10-31T01:24:01.000Z | import struct
import cmath
from array import array
# DONE co.w defines if is movable, set 1 for root
# DONE increase sintel scale
# TODO wmtx?
# TODO check if conversion to mesh is required
# TODO remove tmp object
HEADER_SIZE_BYTES = 160
def debug(*argv):
print('[DEBUG]', ' '.join([str(x) for x in argv]))
def to_opengl_coordinates(co):
return [co[0], co[2], -co[1]]
def create_header_bytes(num_strands, verts_per_strand):
reserved_cnt = 32
reserved = ('I', 0, '_pad')
fields = [
('f', 4.0, 'version'),
('I', num_strands, 'numHairStrands'),
('I', verts_per_strand, 'numVerticesPerStrand'),
('I', HEADER_SIZE_BYTES, 'offsetVertexPosition'),
('I', 0, 'offsetStrandUV'),
('I', 0, 'offsetVertexUV'),
('I', 0, 'offsetStrandThickness'),
('I', 0, 'offsetVertexColor'),
*([reserved] * reserved_cnt) # spread_op(array of 32 of 'reserved')
]
# print('write ' +str(len(fields))+' fields')
# print(fields[16])
pack_fmt = ''.join([x[0] for x in fields])
size = struct.calcsize(pack_fmt)
if size != HEADER_SIZE_BYTES:
return 0, 'Expected header data to take {} bytes, it took {} bytes'.format(HEADER_SIZE_BYTES, size)
data = struct.pack(pack_fmt, *[x[1] for x in fields])
return size, data
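# Hedged sanity-check sketch (not in the original script): unpack a header
# produced above and confirm the fixed 160-byte layout. The strand counts
# are arbitrary example values. Nothing here runs on import.
def _check_header_roundtrip():
    size, data = create_header_bytes(num_strands=2, verts_per_strand=4)
    assert size == HEADER_SIZE_BYTES
    # 1 float + 7 uints + 32 reserved uints == 4 + 28 + 128 == 160 bytes
    version, num_strands, verts = struct.unpack_from('fII', data)
    assert (version, num_strands, verts) == (4.0, 2, 4)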
def find_strands(hair_mesh_object):
edges = [(e.vertices[0], e.vertices[1]) for e in hair_mesh_object.data.edges]
# debug('edges', edges)
# for each strand we have array of vertex ids
strands = []
last_vertex_id = None
for edge in edges:
if edge[0] != last_vertex_id: # new strand
strands.append([edge[0]])
current_strand = strands[-1]
current_strand.append(edge[1])
last_vertex_id = edge[1]
# debug - print indices
# for (i,s) in enumerate(strands):
# debug('strand {} vertices({}): [{}]'.format(i, len(s), ', '.join([str(x) for x in s])))
# debug - add 1st vertex from each strand to vertex group 'roots'
roots_vertex_group = hair_mesh_object.vertex_groups.new("roots")
for s in strands:
roots_vertex_group.add([s[0]], 1.0, "ADD")
return strands
def get_strand_vertex_locations(object, vertex_ids):
    def get_vert_location(vert_id):
co = object.data.vertices[vert_id].co
return [*to_opengl_coordinates(co)]
return [get_vert_location(x) for x in vertex_ids]
class StrandVertexDistributor:
    'Groups the strand-resampling helpers so they do not clutter the global namespace.'
expected_vertices_cnt = 0
def __init__(self, expected_vertices_cnt):
self.expected_vertices_cnt = expected_vertices_cnt
def distribute(self, strand):
vertices = [self._create_point(strand[0], False)]
segment_length = self._get_strand_len(strand) / (self.expected_vertices_cnt - 1)
last_original_vertex = 0 # id of last used point from original strand collection
measure_start_point = strand[0] # coordinates from which we start measurement
for i in range(self.expected_vertices_cnt - 2):
next_point_in = segment_length
            to_next_original_vertex = self._distance(measure_start_point, strand[last_original_vertex + 1])
            while next_point_in > to_next_original_vertex:
                last_original_vertex += 1
                measure_start_point = strand[last_original_vertex]
                next_point_in -= to_next_original_vertex
                to_next_original_vertex = self._distance(measure_start_point, strand[last_original_vertex + 1])
p = self._get_point(measure_start_point, strand[last_original_vertex + 1], next_point_in)
measure_start_point = p
vertices.append(self._create_point(p, True))
# add last vertex
vertices.append(self._create_point(strand[-1], True))
return vertices
def _distance(self, p1, p2):
''' p1, p2 <- arrays of 3 values [x, y, z] '''
L = [ p1[0] - p2[0] , p1[1] - p2[1] , p1[2] - p2[2] ]
return abs( cmath.sqrt( L[0]*L[0] + L[1]*L[1] + L[2]*L[2]) )
def _get_point(self, start, end, dist):
''' point between start and end in range of 'dist' from start '''
percent = dist / self._distance(start, end)
l = [0,0,0]
delta = [0,0,0]
        for i in range(3):
delta[i] = end[i] - start[i]
l[i] = start[i] + (percent * delta[i])
return l
def _get_strand_len(self, strand):
d = 0
for i in range(1, len(strand)):
d += self._distance(strand[i-1], strand[i])
return d
def _create_point(self, co, is_movable):
return [*co, 1.0 if is_movable else 0.0]
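# Hedged example (not in the original script): resample a straight 3-point
# strand into 5 evenly spaced vertices. Expected x-coordinates run from 0.0
# to 2.0 in steps of 0.5, with w == 0.0 only for the pinned root vertex.
def _example_distribute():
    distributor = StrandVertexDistributor(5)
    strand = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
    for vertex in distributor.distribute(strand):
        print(vertex)  # [x, y, z, is_movable]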
def export_hair(particle_modifier, verts_per_strand):
particle_system = particle_modifier.particle_system
strands_cnt = particle_system.settings.count
# create mesh from hair object to get nicer b-spline interpolated version
bpy.ops.object.modifier_convert(modifier=particle_modifier.name) # we prob. can remove this and use raw hair system
hair_mesh_object = bpy.context.object
debug('created tmp object: ', hair_mesh_object.name)
# process mesh version of hair
strands = find_strands(hair_mesh_object)
if len(strands) != strands_cnt:
        print('[Warning] expected {} strands, found {}'.format(strands_cnt, len(strands)))
        strands_cnt = len(strands)
# save strands to vertex positions
locations = []
vert_distributor = StrandVertexDistributor(verts_per_strand)
for (i, strand) in enumerate(strands):
vertex_locations = get_strand_vertex_locations(hair_mesh_object, strand)
for vert in vert_distributor.distribute(vertex_locations):
# locations.extend([*vert, 1])
locations.extend(vert)
#dbg
# for i in range(0, len(locations), 4):
# vert_in_strand_id = (i/4)%verts_per_strand
# if (vert_in_strand_id == 0): print('EDGE ', int((i/4)/verts_per_strand))
# print(' vert {:4}: (x={:7.4}, y={:7.4}, z={:7.4}, w={:.4})'.format(i, locations[i], locations[i+1], locations[i+2], locations[i+3]))
strands_array = array('f', locations)
# create header
header_bytes, header_data = create_header_bytes(strands_cnt, verts_per_strand)
if header_bytes == 0:
return False, header_data
return True, (header_data, strands_array)
def export_hair_systems(hair_object, verts_per_strand):
# TODO raise if verts_per_strand < len(strand.vertices), as this requires cutting vertices
hair_modifiers = [x for x in hair_object.modifiers if x.type == 'PARTICLE_SYSTEM']
if len(hair_modifiers) == 0:
return (False,
'Selected object does not have any hair systems, checked \'{}\''.format(hair_object.name))
for mod in hair_modifiers:
particle_system = mod.particle_system
strands_cnt = particle_system.settings.count
print('TFx exporting: \'{}\' :: \'{}\', {} hair strands'.format(
hair_object.name, particle_system.name, strands_cnt))
ok, data = export_hair(mod, verts_per_strand)
if not ok:
return False, data
filename = '//{}-{}.tfx'.format(hair_object.name, particle_system.name)
filepath = bpy.path.abspath(filename)
print('Filepath:', filepath)
with open(filepath, 'wb') as file:
debug('processing went ok, writing file')
for d in data:
file.write(d)
return True, None
if __name__ == '__main__':
print('')
print('----------------------')
verts_per_strand = 32
hair_object = bpy.context.object
ok, err_msg = export_hair_systems(hair_object, verts_per_strand)
if not ok:
print('[Error]', err_msg)
print('--- fin ---')
| 35.487805 | 139 | 0.684811 | 1,070 | 7,275 | 4.429907 | 0.207477 | 0.020253 | 0.035443 | 0.024262 | 0.173629 | 0.127637 | 0.115823 | 0.078481 | 0.030802 | 0.030802 | 0 | 0.017305 | 0.181856 | 7,275 | 204 | 140 | 35.661765 | 0.779066 | 0.196976 | 0 | 0.043165 | 0 | 0 | 0.094963 | 0.007318 | 0 | 0 | 0 | 0.004902 | 0 | 1 | 0.100719 | false | 0 | 0.021583 | 0.014388 | 0.251799 | 0.057554 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c2a41830fcaa626b4148295c1015717b48e9b3 | 475 | py | Python | vocab.py | alexalemi/textbench | bf39462734e8d345205f37bf08a10969ad0b8044 | [
"MIT"
] | 2 | 2015-05-20T10:11:26.000Z | 2015-07-13T08:48:46.000Z | vocab.py | alexalemi/textbench | bf39462734e8d345205f37bf08a10969ad0b8044 | [
"MIT"
] | null | null | null | vocab.py | alexalemi/textbench | bf39462734e8d345205f37bf08a10969ad0b8044 | [
"MIT"
] | null | null | null | from __future__ import print_function
import sys
from collections import Counter
from operator import itemgetter
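# Usage sketch (assumed invocation; not part of the original script):
#   $ python vocab.py corpus.txt
# prints every whitespace-delimited token occurring more than `cut` (10)
# times, one "word count" pair per line, most frequent first.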
def main():
cut = 10
counter = Counter()
with open(sys.argv[1], 'r') as f:
for line in f:
for word in line.split():
counter[word] += 1
for word, count in sorted(counter.items(), key=itemgetter(1), reverse=True):
if count > cut:
print(word, count)
if __name__ == "__main__":
main()
| 21.590909 | 80 | 0.606316 | 63 | 475 | 4.365079 | 0.539683 | 0.029091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014793 | 0.288421 | 475 | 21 | 81 | 22.619048 | 0.798817 | 0 | 0 | 0 | 0 | 0 | 0.018947 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.25 | 0 | 0.3125 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c2ce2cbd255faf557f3e9c18ec6fe8f7668bb7 | 2,258 | py | Python | CNN/CIFAR10/CNN_classifier.py | kopok2/DeepLearning | 6c66ce6d2d91706135cc844342aa12f8ca13b8dd | [
"MIT"
] | null | null | null | CNN/CIFAR10/CNN_classifier.py | kopok2/DeepLearning | 6c66ce6d2d91706135cc844342aa12f8ca13b8dd | [
"MIT"
] | null | null | null | CNN/CIFAR10/CNN_classifier.py | kopok2/DeepLearning | 6c66ce6d2d91706135cc844342aa12f8ca13b8dd | [
"MIT"
] | null | null | null | # coding=utf-8
"""CIFAR 100 Dataset CNN classifier."""
from keras.datasets import cifar100
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation, Flatten
from keras.layers.convolutional import Conv2D, MaxPooling2D
from keras.optimizers import Adam
if __name__ == '__main__':
IMG_CHANNELS = 3
IMG_X_SIZE = 32
IMG_Y_SIZE = 32
BATCH_SIZE = 128
EPOCHS = 20
CLASSES = 100
VERBOSE = 1
VALIDATION_SPLIT = 0.2
OPTIMIZER = Adam()
print("Loading data...")
(X_train, y_train), (X_test, y_test) = cifar100.load_data()
print("X_train shape:", X_train.shape)
print(X_train.shape[0], "train samples")
print(X_test.shape[0], "test samples")
Y_train = np_utils.to_categorical(y_train, CLASSES)
Y_test = np_utils.to_categorical(y_test, CLASSES)
X_train = X_train.astype("float32")
X_test = X_test.astype("float32")
X_train /= 255
X_test /= 255
print("Creating model...")
model = Sequential()
# Convolutional part
model.add(Conv2D(10, (3, 3), padding="same", input_shape=(IMG_X_SIZE, IMG_Y_SIZE, IMG_CHANNELS)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Conv2D(50, kernel_size=5, padding="same"))
model.add(Activation("relu"))
model.add(Dropout(0.25))
model.add(Conv2D(10, kernel_size=5, padding="same"))
model.add(Activation("relu"))
model.add(Dropout(0.25))
model.add(Conv2D(10, kernel_size=3, padding="same"))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2, 2)))
# Dense part
model.add(Flatten())
model.add(Dense(40))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(CLASSES))
model.add(Activation('softmax'))
model.summary()
model.compile(loss="categorical_crossentropy", optimizer=OPTIMIZER, metrics=["accuracy"])
model.fit(X_train, Y_train, batch_size=BATCH_SIZE, epochs=EPOCHS, validation_split=VALIDATION_SPLIT,
verbose=VERBOSE)
score = model.evaluate(X_test, Y_test, batch_size=BATCH_SIZE, verbose=VERBOSE)
print("Test score:", score[0])
print("Test accuracy:", score[1])
| 32.724638 | 104 | 0.68512 | 320 | 2,258 | 4.64375 | 0.275 | 0.102288 | 0.072678 | 0.074024 | 0.264468 | 0.236205 | 0.236205 | 0.236205 | 0.181696 | 0.181696 | 0 | 0.041711 | 0.171833 | 2,258 | 68 | 105 | 33.205882 | 0.752941 | 0.034101 | 0 | 0.163636 | 0 | 0 | 0.088817 | 0.011045 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.109091 | 0 | 0.109091 | 0.127273 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c7322f19f14429dd09d775f87b4ab3277746cb | 2,610 | py | Python | Chapter03/Ch3.HeartDisease.py | AcornPublishing/keras-projects | 1a8486a375af3bacf9aa78e93c9fc1736ac16d52 | [
"MIT"
] | null | null | null | Chapter03/Ch3.HeartDisease.py | AcornPublishing/keras-projects | 1a8486a375af3bacf9aa78e93c9fc1736ac16d52 | [
"MIT"
] | null | null | null | Chapter03/Ch3.HeartDisease.py | AcornPublishing/keras-projects | 1a8486a375af3bacf9aa78e93c9fc1736ac16d52 | [
"MIT"
] | null | null | null | import pandas as pd
#Import data
HDNames= ['age','sex','cp','trestbps','chol','fbs','restecg','thalach','exang','oldpeak','slope','ca','hal','HeartDisease']
Data = pd.read_excel('Ch3.ClevelandData.xlsx', names=HDNames)
print(Data.head(20))
print(Data.info())
summary = Data.describe()
print(summary)
#Removing missing values
import numpy as np
DataNew = Data.replace('?', np.nan)
print(DataNew.info())
print(DataNew.describe())
print(DataNew.isnull().sum())
DataNew = DataNew.dropna()
print(DataNew.info())
print(DataNew.isnull().sum())
#Divide DataFrame
InputNames = HDNames[:-1]  # feature names without the 'HeartDisease' target (a copy, so HDNames is not mutated)
Input = pd.DataFrame(DataNew.iloc[:, 0:13],columns=InputNames)
Target = pd.DataFrame(DataNew.iloc[:, 13],columns=['HeartDisease'])
#Data scaling
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
print(scaler.fit(Input))
InputScaled = scaler.fit_transform(Input)
InputScaled = pd.DataFrame(InputScaled,columns=InputNames)
summary = InputScaled.describe()
summary = summary.transpose()
print(summary)
#Data visualization
#DataScaled = pd.concat([InputScaled, Target], axis=1)
import matplotlib.pyplot as plt
boxplot = InputScaled.boxplot(column=InputNames,showmeans=True)
plt.show()
pd.plotting.scatter_matrix(InputScaled, figsize=(6, 6))
plt.show()
CorData = InputScaled.corr(method='pearson')
with pd.option_context('display.max_rows', None, 'display.max_columns', CorData.shape[1]):
print(CorData)
plt.matshow(CorData)
plt.xticks(range(len(CorData.columns)), CorData.columns)
plt.yticks(range(len(CorData.columns)), CorData.columns)
plt.colorbar()
plt.show()
#Split the data
from sklearn.model_selection import train_test_split
Input_train, Input_test, Target_train, Target_test = train_test_split(InputScaled, Target, test_size = 0.30, random_state = 5)
print(Input_train.shape)
print(Input_test.shape)
print(Target_train.shape)
print(Target_test.shape)
from keras.models import Sequential
from keras.layers import Dense
model = Sequential()
model.add(Dense(30, input_dim=13, activation='tanh'))
model.add(Dense(20, activation='tanh'))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='adam',loss='binary_crossentropy',metrics=['accuracy'])
model.fit(Input_train, Target_train, epochs=1000, verbose=1)
model.summary()
score = model.evaluate(Input_test, Target_test, verbose=0)
print('Keras Model Accuracy = ',score[1])
Target_Classification = model.predict(Input_test)
Target_Classification = (Target_Classification > 0.5)
from sklearn.metrics import confusion_matrix
print(confusion_matrix(Target_test, Target_Classification))
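# Hedged extension (not in the original script): a per-class precision/recall
# summary to complement the confusion matrix above.
from sklearn.metrics import classification_report
print(classification_report(Target_test, Target_Classification))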
| 23.727273 | 126 | 0.766667 | 347 | 2,610 | 5.665706 | 0.391931 | 0.030519 | 0.022889 | 0.021363 | 0.095626 | 0.039674 | 0.039674 | 0 | 0 | 0 | 0 | 0.013423 | 0.08659 | 2,610 | 109 | 127 | 23.944954 | 0.811242 | 0.056322 | 0 | 0.147541 | 0 | 0 | 0.088355 | 0.008958 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.131148 | 0 | 0.131148 | 0.278689 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17c9d62e972e34463d9f9f34d2e74a4c74ad6087 | 4,935 | py | Python | project/annunci/forms.py | Pasqualeso/Use_and_return | 1cb050868ca0ce790fee7f119c37a281854107d1 | [
"Apache-2.0"
] | null | null | null | project/annunci/forms.py | Pasqualeso/Use_and_return | 1cb050868ca0ce790fee7f119c37a281854107d1 | [
"Apache-2.0"
] | null | null | null | project/annunci/forms.py | Pasqualeso/Use_and_return | 1cb050868ca0ce790fee7f119c37a281854107d1 | [
"Apache-2.0"
] | null | null | null | from flask_wtf import FlaskForm
from wtforms import StringField, SelectField, IntegerField, DateField, FileField, SubmitField, TextAreaField
from wtforms.validators import DataRequired, Length
# listing (annuncio) registration form class
from wtforms.widgets import TextArea
class RegistrationFormAnnuncio(FlaskForm):
titolo_annuncio = StringField("Inserisci il titolo dell'annuncio", validators=[DataRequired(), Length(1, 64)])
categoria_annuncio = SelectField(
"Inserisci la categoria dell'annuncio",
choices=[("musica", "Musica"), ("telefonia", "Telefonia"), ("console_e_videogiochi", "Console e videogiochi"),
("informatica", "Informatica"),
("auto", "Accessori auto"), ("giocattoli", "Giocattoli"), ("fotografia", "Fotografia"),
("videomaker", "Video-maker"), ("altro", "Altro")]
, validators=[DataRequired()])
immagine_annuncio = FileField("Inserisci un'immagine", validators=[DataRequired()])
prezzo_per_giorno_annuncio = IntegerField("Inserisci il prezzo al giorno per l'annuncio",
validators=[DataRequired()])
descrizione_annuncio = TextAreaField("Inserisci una descrizione (Max 200 caratteri)",
validators=[DataRequired(), Length(1, 200)])
data_inizio_noleggio_annuncio = DateField("Inserisci una data di inizio noleggio", format='%Y-%m-%d',
validators=[DataRequired()])
data_fine_noleggio_annuncio = DateField("Inserisci una data di fine noleggio", format='%Y-%m-%d',
validators=[DataRequired()])
citta_annuncio = StringField("Inserisci la città dell'annuncio",
validators=[DataRequired(message='Città obbligatoria'), Length(1, 64)])
provincia_annuncio = SelectField(
"Inserisci la provincia dell'annuncio",
choices=[("ag", "Agrigento"), ("al", "Alessandria"), ("an", "Ancona"), ("ao", "Aosta"), ("ar", "Arezzo"),
("ap", "Ascoli Piceno"),
("at", "Asti"), ("av", "Avellino"), ("ba", "Bari"), ("bt", "Barletta-Andria-Trani"), ("bl", "Belluno"),
("bn", "Benevento"),
("bg", "Bergamo"), ("bi", "Biella"), ("bo", "Bologna"), ("bz", "Bolzano"), ("bs", "Brescia"),
("br", "Brindisi"), ("ca", "Cagliari"),
("cl", "Caltanissetta"), ("cb", "Campobasso"), ("ci", "Carbonia - iglesias "), ("ce", "Caserta"),
("ct", "Catania"), ("cz", "Catanzaro"),
("ch", "Chieti"), ("co", "Como"), ("cs", "Cosenza"), ("cr", "Cremona"), ("kr", "Crotone"),
("cn", "Cuneo"), ("en", "Enna"), ("fm", "Fermo"),
("fe", "Ferrara"), ("fi", "Firenze"), ("fg", "Foggia"), ("fc", "Forli-Cesena"), ("fr", "Frosinone"),
("ge", "Genova"), ("go", "Gorizia"),
("gr", "Grosseto"), ("im", "Imperia"), ("is", "Isernia"), ("sp", "La spezia"), ("aq", "L'aquila"),
("lt", "Latina"), ("le", "Lecce"),
("lc", "Lecco"), ("li", "Livorno"), ("lo", "Lodi"), ("lu", "Lucca"), ("mc", "Macerata"),
("mn", "Mantova"), ("ms", "Massa - Carrara"),
("mt", "Matera"), ("vs", "Medio Campidano"), ("me", "Messina"), ("mi", "Milano"), ("mo", "Modena"),
("mb", "Monza e della Brianza"),
("na", "Napoli"), ("no", "Novara"), ("nu", "Nuoro"), ("og", "Ogliastra"), ("ot", "Olbia - Tempio"),
("or", "Oristano"), ("pd", "Padova"),
("pa", "Palermo"), ("pr", "Parma"), ("pv", "Pavia"), ("pg", "Perugia"), ("pu", "Pesaro e Urbino"),
("pe", "Pescara"), ("pc", "Piacenza"),
("pi", "Pisa"), ("pt", "Pistoia"), ("pn", "Pordenone"), ("pz", "Potenza"), ("po", "Prato"),
("rg", "Ragusa"), ("ra", "Ravenna"), ("rc", "Reggio di Calabria"),
("re", "Reggio nell'Emilia"), ("ri", "Rieti"), ("rn", "Rimini"), ("rm", "Roma"), ("ro", "Rovigo"),
("sa", "Salerno"), ("ss", "Sassari"),
("sv", "Savona"), ("si", "Siena"), ("sr", "Siracusa"), ("so", "Sondrio"), ("ta", "Taranto"),
("te", "Teramo"), ("tr", "Terni"), ("to", "Torino"),
("tp", "Trapani"), ("tn", "Trento"), ("tv", "Treviso"), ("ts", "Trieste"), ("ud", "Udine"),
("va", "Varese"), ("ve", "Venezia"), ("vb", "Verbano - Cusio - Ossola "),
("vc", "Vercelli"), ("vr", "Verona"), ("vv", "ibo valentia"), ("vi", "Vicenza"), ("vt", "Viterbo")]
, validators=[DataRequired()])
via_annuncio = StringField("Inserisci la via dell'annuncio", validators=[DataRequired()])
cap_annuncio = IntegerField("Inserisci il cap dell'annuncio", validators=[DataRequired()])
submit_annuncio = SubmitField("Aggiungi")
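
# Hedged usage sketch (not part of the original module): typical FlaskForm
# handling inside a view function. Validation needs an active Flask request
# context; the field access below only illustrates the pattern.
def _example_handle_submission():
    form = RegistrationFormAnnuncio()
    if form.validate_on_submit():
        return form.titolo_annuncio.data, form.provincia_annuncio.data
    return None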
| 67.60274 | 120 | 0.509017 | 445 | 4,935 | 5.597753 | 0.701124 | 0.09715 | 0.060217 | 0.054597 | 0.065837 | 0.065837 | 0.065837 | 0 | 0 | 0 | 0 | 0.003549 | 0.257751 | 4,935 | 72 | 121 | 68.541667 | 0.676495 | 0.00689 | 0 | 0.081967 | 0 | 0 | 0.34456 | 0.008573 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.065574 | 0 | 0.278689 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17cb1bffebb70c650e6891783a5b2554fe932069 | 18,293 | py | Python | collective_blog/models/blog.py | AmatanHead/collective-blog | 9bf040faac43feae08b33900e30bf7d17b817ae4 | [
"MIT"
] | null | null | null | collective_blog/models/blog.py | AmatanHead/collective-blog | 9bf040faac43feae08b33900e30bf7d17b817ae4 | [
"MIT"
] | 4 | 2016-09-22T06:37:20.000Z | 2016-09-22T16:49:48.000Z | collective_blog/models/blog.py | AmatanHead/collective-blog | 9bf040faac43feae08b33900e30bf7d17b817ae4 | [
"MIT"
] | null | null | null | from django.db import models
from django.db.models import Q, QuerySet, F
from django.utils.translation import ugettext_lazy as _
from django.utils import timezone
from collective_blog import settings
from collective_blog.utils.errors import PermissionCheckFailed
from s_markdown.models import MarkdownField, HtmlCacheField
from s_markdown.datatype import Markdown
from s_markdown.renderer import BaseRenderer
from s_markdown.extensions import (FencedCodeExtension,
EscapeHtmlExtension,
SemiSaneListExtension,
StrikethroughExtension,
AutomailExtension,
AutolinkExtension,
CommentExtension)
from s_appearance.utils.icons import ICONS
from s_voting.models import VoteCacheField
from .post import PostVote
from .comment import CommentVote
from uuslug import uuslug
class Blog(models.Model):
name = models.CharField(max_length=100,
verbose_name=_('Name'),
unique=True)
slug = models.SlugField(max_length=100,
db_index=True,
unique=True,
blank=True,
editable=False)
about = MarkdownField(blank=True,
markdown=Markdown,
renderer=BaseRenderer(
extensions=[
'markdown.extensions.smarty',
'markdown.extensions.abbr',
'markdown.extensions.def_list',
'markdown.extensions.tables',
'markdown.extensions.smart_strong',
FencedCodeExtension(),
EscapeHtmlExtension(),
SemiSaneListExtension(),
StrikethroughExtension(),
AutolinkExtension(),
AutomailExtension(),
CommentExtension(),
]
),
verbose_name=_('About this blog'))
_about_html = HtmlCacheField(about)
icon = models.CharField(max_length=100, blank=True, choices=ICONS)
TYPES = (
('O', _('Open')),
('P', _('Private')),
)
type = models.CharField(max_length=2, default='0', choices=TYPES,
verbose_name=_('Type of the blog'))
JOIN_CONDITIONS = (
('A', _('Anyone can join')),
('K', _('Only users with high karma can join')),
('I', _('Manual approval required'))
)
join_condition = models.CharField(max_length=2, default='A',
choices=JOIN_CONDITIONS,
verbose_name=_('Who can join the blog'))
join_karma_threshold = models.SmallIntegerField(default=0,
verbose_name=_(
'Join karma threshold'))
POST_CONDITIONS = (
('A', _('Anyone can add posts')),
('K', _('Only users with high karma can add posts')),
)
post_condition = models.CharField(max_length=2, default='K',
choices=POST_CONDITIONS,
verbose_name=_('Who can add posts'))
post_membership_required = models.BooleanField(
default=True, verbose_name=_('Require membership to write posts'))
post_admin_required = models.BooleanField(
default=False, verbose_name=_('Only admins can write posts'))
post_karma_threshold = models.SmallIntegerField(
default=0, verbose_name=_('Post karma threshold'))
COMMENT_CONDITIONS = (
('A', _('Anyone can comment')),
('K', _('Only users with high karma can comment')),
)
comment_condition = models.CharField(max_length=2, default='A',
choices=COMMENT_CONDITIONS,
verbose_name=_(
'Who can comment in the blog'))
comment_membership_required = models.BooleanField(
default=False, verbose_name=_('Require membership to write comments'))
comment_karma_threshold = models.SmallIntegerField(
default=0, verbose_name=_('Comment karma threshold'))
members = models.ManyToManyField(settings.AUTH_USER_MODEL,
through='Membership',
editable=False)
# Common methods
# --------------
def save(self, force_insert=False, force_update=False, using=None,
update_fields=None):
self.slug = uuslug(self.name,
instance=self,
max_length=100,
start_no=2,
word_boundary=True,
save_order=True)
if not self.slug:
            self.slug = uuslug(self.name + '_blog',
instance=self,
max_length=100,
start_no=2,
word_boundary=True,
save_order=True)
self.slug = self.slug.lower()
super(Blog, self).save(force_insert, force_update, using, update_fields)
class Meta:
verbose_name = _("Blog")
verbose_name_plural = _("Blogs")
ordering = ("name",)
def __str__(self):
return str(self.name)
# Permissions control
# -------------------
def check_membership(self, user):
"""Check if the given user is a member of the blog"""
if user.is_anonymous():
return None
return Membership.objects.filter(blog=self, user=user).with_rating().first()
@staticmethod
def can_be_moderated_by(user):
"""Check if the user is a moderator with profile editing rights"""
return user.is_active and user.is_staff and (
user.has_perm('blog.change_membership') or
user.has_perm('blog.change_blog'))
@staticmethod
def is_banned(membership):
"""Check if the given user is banned in this blog
        Non-members (membership is None) are considered not banned.
"""
if membership is not None and not membership.is_left():
return membership.is_banned()
else:
return False
@staticmethod
def check_can_change_settings(membership):
"""Check if the given user has permissions to change settings
        Non-members (membership is None) have no rights.
"""
if membership is not None and not membership.is_left():
return membership.can_change_settings()
else:
return False
@staticmethod
def check_can_delete_posts(membership):
"""Check if the given user has permissions delete posts in the blog
No-members (membership==None) considered to have no rights.
"""
if membership is not None and not membership.is_left():
return membership.can_delete_posts()
else:
return False
@staticmethod
def check_can_delete_comments(membership):
"""Check if the given user has permissions delete comments in the blog
No-members (membership==None) considered to have no rights.
"""
if membership is not None and not membership.is_left():
return membership.can_delete_comments()
else:
return False
@staticmethod
def check_can_ban(membership):
"""Check if the given user has permissions to ban members of the blog
        Non-members (membership is None) have no rights.
"""
if membership is not None and not membership.is_left():
return membership.can_ban()
else:
return False
@staticmethod
def check_can_accept_new_users(membership):
"""Check if the given user has permissions to can accept new users
No-members (membership==None) considered to have no rights.
"""
if membership is not None and not membership.is_left():
return membership.can_accept_new_users()
else:
return False
@staticmethod
def check_can_manage_permissions(membership):
"""Check if the given user has permissions to manage permissions
of other users.
        Non-members (membership is None) have no rights.
"""
if membership is not None and not membership.is_left():
return membership.can_manage_permissions()
else:
return False
def check_can_post(self, user):
"""Check if the given user has permissions to add posts to this blog"""
if not user.is_active or user.is_anonymous():
return False
membership = self.check_membership(user)
if ((self.type != 'O' or self.post_membership_required or self.post_admin_required) and
(membership is None or
membership.is_banned() or
membership.is_left())):
return False
elif self.post_admin_required and membership.role not in ['O', 'A']:
return False
elif (self.post_condition == 'K' and
user.profile.karma < self.post_karma_threshold):
return False
else:
return True
def check_can_comment(self, user):
"""Check if the given user has permissions to add comments to this blog"""
if not user.is_active or user.is_anonymous():
return False
membership = self.check_membership(user)
if ((self.type != 'O' or self.comment_membership_required) and
(membership is None or
membership.is_banned() or
membership.is_left())):
return False
elif (self.comment_condition == 'K' and
user.profile.karma < self.post_karma_threshold):
return False
else:
return True
def check_can_join(self, user):
"""Checks if the user can join the blog
Note that joining process should go through the special method.
Note also that this method returns `True` for blogs with manual
approval required.
Makes database queries: `check_membership` and karma calculation.
"""
if not user.is_active or user.is_anonymous():
return False
membership = self.check_membership(user)
if membership is not None and not membership.is_left():
return False # Already joined
if self.join_condition == 'A':
return True
elif self.join_condition == 'K':
return user.profile.karma >= self.join_karma_threshold
elif self.join_condition == 'I':
return True # Can send a request
else:
return False
# Actions
# -------
def join(self, user, role=None):
"""Add the user to the blog's membership
:param user: User which wants to be a member.
:param role: Force the role of the user. Ignore join conditions.
            Also changes the role of users who have already joined.
        :return: A message, or None if `role` was passed.
:raises PermissionCheckFailed: If the user can't join the blog.
"""
if self.check_can_join(user) or role is not None:
membership, c = Membership.objects.get_or_create(user=user, blog=self)
if c:
post_rating = PostVote.objects.filter(object__author=user, object__blog=self).score()
membership.overall_posts_rating = post_rating
comment_rating = CommentVote.objects.filter(object__author=user, object__post__blog=self).score()
membership.overall_comments_rating = comment_rating
if role is not None:
membership.role = role
membership.save()
return
if membership.role == 'LB':
membership.role = 'B'
membership.save()
return _("Success. You are still banned, though")
elif membership.role != 'L':
return _("You've already joined to the=is blog")
elif self.join_condition == 'I':
membership.role = 'W'
membership.save()
return _("A request has been sent")
else:
membership.role = 'M'
membership.save()
return _("Success")
else:
raise PermissionCheckFailed(_("You can't join this blog"))
def leave(self, user):
"""Remove the user to the blog's membership
:param user: User which wants to leave.
"""
membership = self.check_membership(user)
if membership is not None and membership.role != 'O':
if membership.role == 'B':
membership.role = 'LB'
else:
membership.role = 'L'
membership.save()
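
# Hedged usage sketch (not part of the original module): the typical
# join-then-post permission flow. `user` and `blog` are assumed to be
# saved model instances.
def _example_join_flow(user, blog):
    if blog.check_can_join(user):
        # May still raise PermissionCheckFailed; otherwise returns a
        # message such as "Success" or "A request has been sent".
        print(blog.join(user))
    return blog.check_can_post(user)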
class MembershipQuerySet(QuerySet):
"""Queryset of votes
Allows for routine operations like getting overall rating etc.
"""
def with_rating(self):
"""Annotate rating of the member"""
return self.annotate(
rating=F('overall_posts_rating') * 10 + F('overall_comments_rating')
)
class MembershipManager(models.Manager):
"""Wrap objects to the `MembershipQuerySet`"""
def get_queryset(self):
return MembershipQuerySet(self.model)
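
# Hedged usage sketch (not part of the original module): rank the members
# of a blog using the `rating` annotation added by `with_rating`;
# `some_blog` is a placeholder instance.
def _example_top_members(some_blog):
    return (
        Membership.objects.filter(blog=some_blog)
        .with_rating()
        .order_by("-rating")
    )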
def _overall_posts_rating_cache_query(v):
return Q(user__pk=v.object.author.pk) & Q(blog__pk=v.object.blog.pk)
def _overall_comments_rating_cache_query(v):
return Q(user__pk=v.object.author.pk) & Q(blog__pk=v.object.post.blog.pk)
class Membership(models.Model):
"""Members of blogs"""
user = models.ForeignKey(settings.AUTH_USER_MODEL,
models.CASCADE)
blog = models.ForeignKey(Blog,
models.CASCADE)
COLORS = (
('gray', _('Gray')),
('black', _('Black')),
('blue', _('Blue')),
('orange', _('Orange')),
('purple', _('Purple')),
('marshy', _('Marshy')),
('turquoise', _('Turquoise')),
('red', _('Red')),
('yellow', _('Yellow')),
('green', _('Green')),
)
color = models.CharField(max_length=10, choices=COLORS, default='gray')
ROLES = (
('O', _('Owner')),
('M', _('Member')),
('B', _('Banned')),
('A', _('Administrator')),
('W', _('Waiting for approval')),
('L', _('Left the blog')),
('LB', _('Left the blog (banned)')),
)
ROLE_ORDERING = dict(O=0, A=2, W=3, M=4, B=5, LB=5, L=6)
role = models.CharField(max_length=2, choices=ROLES, default='L')
ban_expiration = models.DateTimeField(default=timezone.now)
can_change_settings_flag = models.BooleanField(
default=False, verbose_name=_("Can change blog's settings"))
can_delete_posts_flag = models.BooleanField(
default=False, verbose_name=_("Can delete posts"))
can_delete_comments_flag = models.BooleanField(
default=False, verbose_name=_("Can delete comments"))
can_ban_flag = models.BooleanField(
default=False, verbose_name=_("Can ban a member"))
can_accept_new_users_flag = models.BooleanField(
default=False, verbose_name=_("Can accept new users"))
can_manage_permissions_flag = models.BooleanField(
default=False, verbose_name=_("Can manage permissions"))
overall_posts_rating = VoteCacheField(PostVote, _overall_posts_rating_cache_query)
overall_comments_rating = VoteCacheField(CommentVote, _overall_comments_rating_cache_query)
# Common methods
# --------------
objects = MembershipManager()
class Meta:
unique_together = ('user', 'blog')
def __str__(self):
return str(self.user) + ' in ' + str(self.blog)
# Permissions control
# -------------------
def can_be_banned(self):
return self.role in ['M', 'B', 'LB']
def ban(self, time=None):
if self.can_be_banned():
if time is None:
self.role = 'B'
else:
self.ban_expiration = timezone.now() + time
self.save()
def unban(self):
if self.can_be_banned():
self.role = 'M'
self.ban_expiration = timezone.now()
self.save()
def is_banned(self):
return self.role == 'B' or self.ban_expiration >= timezone.now()
def ban_is_permanent(self):
return self.role == 'B'
def is_left(self):
return self.role in ['L', 'LB']
def _common_check(self, flag):
"""Check that the member can perform an action
Here to reduce code duplication.
"""
has_perms = self.user.is_active and self.user.is_staff and (
self.user.has_perm('blog.change_membership') or
self.user.has_perm('blog.change_blog'))
return has_perms or (self.role in ['O', 'A'] and
not self.is_left() and
not self.is_banned() and
(flag or self.role == 'O'))
def can_change_settings(self):
return self._common_check(self.can_change_settings_flag)
def can_delete_posts(self):
return self._common_check(self.can_delete_posts_flag)
def can_delete_comments(self):
return self._common_check(self.can_delete_comments_flag)
def can_ban(self):
return self._common_check(self.can_ban_flag)
def can_accept_new_users(self):
return self._common_check(self.can_accept_new_users_flag)
def can_manage_permissions(self):
return self._common_check(self.can_manage_permissions_flag)
| 34.065177 | 113 | 0.571202 | 1,975 | 18,293 | 5.095696 | 0.147848 | 0.028617 | 0.01093 | 0.014905 | 0.460056 | 0.388514 | 0.35463 | 0.29213 | 0.240064 | 0.191971 | 0 | 0.003036 | 0.33368 | 18,293 | 536 | 114 | 34.128731 | 0.822627 | 0.120429 | 0 | 0.276353 | 0 | 0 | 0.07988 | 0.012921 | 0 | 0 | 0 | 0 | 0 | 1 | 0.096866 | false | 0 | 0.042735 | 0.042735 | 0.415954 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17cbb46fb8beb427bd241afd6d6d4e42544e1a26 | 7,152 | py | Python | build/PureCloudPlatformClientV2/models/survey_assignment.py | cjohnson-ctl/platform-client-sdk-python | 38ce53bb8012b66e8a43cc8bd6ff00cf6cc99100 | [
"MIT"
] | 10 | 2019-02-22T00:27:08.000Z | 2021-09-12T23:23:44.000Z | libs/PureCloudPlatformClientV2/models/survey_assignment.py | rocketbot-cl/genesysCloud | dd9d9b5ebb90a82bab98c0d88b9585c22c91f333 | [
"MIT"
] | 5 | 2018-06-07T08:32:00.000Z | 2021-07-28T17:37:26.000Z | libs/PureCloudPlatformClientV2/models/survey_assignment.py | rocketbot-cl/genesysCloud | dd9d9b5ebb90a82bab98c0d88b9585c22c91f333 | [
"MIT"
] | 6 | 2020-04-09T17:43:07.000Z | 2022-02-17T08:48:05.000Z | # coding: utf-8
"""
Copyright 2016 SmartBear Software
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Ref: https://github.com/swagger-api/swagger-codegen
"""
from pprint import pformat
from six import iteritems
import re
import json
from ..utils import sanitize_for_serialization
class SurveyAssignment(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
"""
def __init__(self):
"""
SurveyAssignment - a model defined in Swagger
:param dict swaggerTypes: The key is attribute name
and the value is attribute type.
:param dict attributeMap: The key is attribute name
and the value is json key in definition.
"""
self.swagger_types = {
'survey_form': 'PublishedSurveyFormReference',
'flow': 'DomainEntityRef',
'invite_time_interval': 'str',
'sending_user': 'str',
'sending_domain': 'str'
}
self.attribute_map = {
'survey_form': 'surveyForm',
'flow': 'flow',
'invite_time_interval': 'inviteTimeInterval',
'sending_user': 'sendingUser',
'sending_domain': 'sendingDomain'
}
self._survey_form = None
self._flow = None
self._invite_time_interval = None
self._sending_user = None
self._sending_domain = None
@property
def survey_form(self):
"""
Gets the survey_form of this SurveyAssignment.
The survey form used for this survey.
:return: The survey_form of this SurveyAssignment.
:rtype: PublishedSurveyFormReference
"""
return self._survey_form
@survey_form.setter
def survey_form(self, survey_form):
"""
Sets the survey_form of this SurveyAssignment.
The survey form used for this survey.
:param survey_form: The survey_form of this SurveyAssignment.
:type: PublishedSurveyFormReference
"""
self._survey_form = survey_form
@property
def flow(self):
"""
Gets the flow of this SurveyAssignment.
The URI reference to the flow associated with this survey.
:return: The flow of this SurveyAssignment.
:rtype: DomainEntityRef
"""
return self._flow
@flow.setter
def flow(self, flow):
"""
Sets the flow of this SurveyAssignment.
The URI reference to the flow associated with this survey.
:param flow: The flow of this SurveyAssignment.
:type: DomainEntityRef
"""
self._flow = flow
@property
def invite_time_interval(self):
"""
Gets the invite_time_interval of this SurveyAssignment.
An ISO 8601 repeated interval consisting of the number of repetitions, the start datetime, and the interval (e.g. R2/2018-03-01T13:00:00Z/P1M10DT2H30M). Total duration must not exceed 90 days.
:return: The invite_time_interval of this SurveyAssignment.
:rtype: str
"""
return self._invite_time_interval
@invite_time_interval.setter
def invite_time_interval(self, invite_time_interval):
"""
Sets the invite_time_interval of this SurveyAssignment.
An ISO 8601 repeated interval consisting of the number of repetitions, the start datetime, and the interval (e.g. R2/2018-03-01T13:00:00Z/P1M10DT2H30M). Total duration must not exceed 90 days.
:param invite_time_interval: The invite_time_interval of this SurveyAssignment.
:type: str
"""
self._invite_time_interval = invite_time_interval
@property
def sending_user(self):
"""
Gets the sending_user of this SurveyAssignment.
User together with sendingDomain used to send email, null to use no-reply
:return: The sending_user of this SurveyAssignment.
:rtype: str
"""
return self._sending_user
@sending_user.setter
def sending_user(self, sending_user):
"""
Sets the sending_user of this SurveyAssignment.
User together with sendingDomain used to send email, null to use no-reply
:param sending_user: The sending_user of this SurveyAssignment.
:type: str
"""
self._sending_user = sending_user
@property
def sending_domain(self):
"""
Gets the sending_domain of this SurveyAssignment.
Validated email domain, required
:return: The sending_domain of this SurveyAssignment.
:rtype: str
"""
return self._sending_domain
@sending_domain.setter
def sending_domain(self, sending_domain):
"""
Sets the sending_domain of this SurveyAssignment.
Validated email domain, required
:param sending_domain: The sending_domain of this SurveyAssignment.
:type: str
"""
self._sending_domain = sending_domain
def to_dict(self):
"""
Returns the model properties as a dict
"""
result = {}
for attr, _ in iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
return result
def to_json(self):
"""
Returns the model as raw JSON
"""
return json.dumps(sanitize_for_serialization(self.to_dict()))
def to_str(self):
"""
Returns the string representation of the model
"""
return pformat(self.to_dict())
def __repr__(self):
"""
For `print` and `pprint`
"""
return self.to_str()
def __eq__(self, other):
"""
Returns true if both objects are equal
"""
return self.__dict__ == other.__dict__
def __ne__(self, other):
"""
Returns true if both objects are not equal
"""
return not self == other
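
# Hedged usage sketch (not part of the original module): populate the model
# via its setters and serialize it; the field values below are examples only.
def _example_serialize():
    assignment = SurveyAssignment()
    assignment.invite_time_interval = "R2/2018-03-01T13:00:00Z/P1M10DT2H30M"
    assignment.sending_user = "surveys"
    assignment.sending_domain = "example.com"
    return assignment.to_json()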
| 30.434043 | 200 | 0.608082 | 818 | 7,152 | 5.152812 | 0.243276 | 0.02847 | 0.104389 | 0.032028 | 0.440095 | 0.386951 | 0.354448 | 0.292764 | 0.253381 | 0.237248 | 0 | 0.013514 | 0.317114 | 7,152 | 234 | 201 | 30.564103 | 0.849509 | 0.465604 | 0 | 0.081395 | 0 | 0 | 0.080166 | 0.008943 | 0 | 0 | 0 | 0 | 0 | 1 | 0.197674 | false | 0 | 0.05814 | 0 | 0.395349 | 0.011628 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17cc2a1b0b2839207ad5506129850251e0a98d15 | 2,453 | py | Python | lib/JumpScale/servers/key_value_store/memory_store.py | Jumpscale/jumpscale_core8 | f80ac9b1ab99b833ee7adb17700dcf4ef35f3734 | [
"Apache-2.0"
] | 8 | 2016-04-14T14:04:57.000Z | 2020-06-09T00:24:34.000Z | lib/JumpScale/servers/key_value_store/memory_store.py | Jumpscale/jumpscale_core8 | f80ac9b1ab99b833ee7adb17700dcf4ef35f3734 | [
"Apache-2.0"
] | 418 | 2016-01-25T10:30:00.000Z | 2021-09-08T12:29:13.000Z | lib/JumpScale/servers/key_value_store/memory_store.py | Jumpscale/jumpscale_core8 | f80ac9b1ab99b833ee7adb17700dcf4ef35f3734 | [
"Apache-2.0"
] | 9 | 2016-04-21T07:21:17.000Z | 2022-01-24T10:35:54.000Z | from servers.key_value_store.store import KeyValueStoreBase
NAMESPACES = dict()
import re
from JumpScale import j  # provides j.exceptions used below (this import was missing in the original)
class MemoryKeyValueStore(KeyValueStoreBase):
def __init__(self, name=None, namespace=None):
self.name = name
if namespace:
self.db = NAMESPACES.setdefault(namespace, dict())
else:
self.db = dict()
KeyValueStoreBase.__init__(self, namespace=namespace)
self.dbindex = dict()
self.inMem = True
def get(self, key, secret=""):
key = str(key)
if not self.exists(key):
raise j.exceptions.RuntimeError("Could not find object with category %s key %s" % (self.category, key))
return self.db[key]
def getraw(self, key, secret="", die=False, modecheck="r"):
key = str(key)
if not self.exists(key):
if die == False:
return None
else:
raise j.exceptions.RuntimeError("Could not find object with category %s key %s" % (self.category, key))
return self.db[key]
def set(self, key, value, secret=""):
key = str(key)
self.db[key] = value
def delete(self, key, secret=""):
key = str(key)
if self.exists(key):
del(self.db[key])
def exists(self, key, secret=""):
key = str(key)
if key in self.db:
return True
else:
return False
def index(self, items, secret=""):
"""
@param items is {indexitem:key}
indexitem is e.g. $actorname:$state:$role (is a text which will be index to key)
indexitems are always made lowercase
key links to the object in the db
':' is not allowed in indexitem
"""
self.dbindex.update(items)
    def index_remove(self, keys, secret=""):
        # NOTE: `keys` is ignored; the whole index is cleared.
        self.dbindex = {}
def list(self, regex=".*", returnIndex=False, secret=""):
"""
regex is regex on the index, will return matched keys
e.g. .*:new:.* would match e.g. all obj with state new
"""
res = set()
for item, key in self.dbindex.items():
if re.match(regex, item) is not None:
if returnIndex is False:
for key2 in key.split(","):
res.add(key2)
else:
for key2 in key.split(","):
res.add((item, key2))
return list(res)
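
# Hedged usage sketch (not part of the original module); assumes
# KeyValueStoreBase.__init__ needs nothing beyond the namespace.
def _example_usage():
    store = MemoryKeyValueStore(name="demo", namespace="demo")
    store.set("1", {"state": "new"})
    store.index({"doc:new:editor": "1"})
    return store.list(regex=".*:new:.*")  # -> ["1"]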
| 30.6625 | 119 | 0.538932 | 298 | 2,453 | 4.399329 | 0.312081 | 0.032037 | 0.034325 | 0.045767 | 0.26926 | 0.26926 | 0.26926 | 0.187643 | 0.146453 | 0.146453 | 0 | 0.002492 | 0.345699 | 2,453 | 79 | 120 | 31.050633 | 0.81433 | 0.138606 | 0 | 0.314815 | 0 | 0 | 0.04689 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.037037 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17cd59fc54077918df61da8dbcd9437f5bf62395 | 25,611 | py | Python | cvnets/modules/mobilevit_block.py | apple/ml-cvnets | 84d992f413e52c0468f86d23196efd9dad885e6f | [
"AML"
] | 209 | 2021-10-30T08:32:10.000Z | 2022-03-31T16:18:03.000Z | cvnets/modules/mobilevit_block.py | apple/ml-cvnets | 84d992f413e52c0468f86d23196efd9dad885e6f | [
"AML"
] | 12 | 2021-12-04T10:47:11.000Z | 2022-03-31T15:39:40.000Z | cvnets/modules/mobilevit_block.py | apple/ml-cvnets | 84d992f413e52c0468f86d23196efd9dad885e6f | [
"AML"
] | 50 | 2021-11-01T08:15:02.000Z | 2022-03-29T08:17:34.000Z | #
# For licensing see accompanying LICENSE file.
# Copyright (C) 2022 Apple Inc. All Rights Reserved.
#
import numpy as np
from torch import nn, Tensor
import math
import torch
from torch.nn import functional as F
from typing import Optional, Dict, Tuple, Union, Sequence
from .transformer import TransformerEncoder, LinearAttnFFN
from .base_module import BaseModule
from ..misc.profiler import module_profile
from ..layers import ConvLayer, get_normalization_layer
class MobileViTBlock(BaseModule):
"""
This class defines the `MobileViT block <https://arxiv.org/abs/2110.02178?context=cs.LG>`_
Args:
opts: command line arguments
in_channels (int): :math:`C_{in}` from an expected input of size :math:`(N, C_{in}, H, W)`
transformer_dim (int): Input dimension to the transformer unit
ffn_dim (int): Dimension of the FFN block
n_transformer_blocks (Optional[int]): Number of transformer blocks. Default: 2
head_dim (Optional[int]): Head dimension in the multi-head attention. Default: 32
attn_dropout (Optional[float]): Dropout in multi-head attention. Default: 0.0
dropout (Optional[float]): Dropout rate. Default: 0.0
ffn_dropout (Optional[float]): Dropout between FFN layers in transformer. Default: 0.0
patch_h (Optional[int]): Patch height for unfolding operation. Default: 8
patch_w (Optional[int]): Patch width for unfolding operation. Default: 8
transformer_norm_layer (Optional[str]): Normalization layer in the transformer block. Default: layer_norm
conv_ksize (Optional[int]): Kernel size to learn local representations in MobileViT block. Default: 3
dilation (Optional[int]): Dilation rate in convolutions. Default: 1
no_fusion (Optional[bool]): Do not combine the input and output feature maps. Default: False
"""
def __init__(
self,
opts,
in_channels: int,
transformer_dim: int,
ffn_dim: int,
n_transformer_blocks: Optional[int] = 2,
head_dim: Optional[int] = 32,
attn_dropout: Optional[float] = 0.0,
dropout: Optional[int] = 0.0,
ffn_dropout: Optional[int] = 0.0,
patch_h: Optional[int] = 8,
patch_w: Optional[int] = 8,
transformer_norm_layer: Optional[str] = "layer_norm",
conv_ksize: Optional[int] = 3,
dilation: Optional[int] = 1,
no_fusion: Optional[bool] = False,
*args,
**kwargs
) -> None:
conv_3x3_in = ConvLayer(
opts=opts,
in_channels=in_channels,
out_channels=in_channels,
kernel_size=conv_ksize,
stride=1,
use_norm=True,
use_act=True,
dilation=dilation,
)
conv_1x1_in = ConvLayer(
opts=opts,
in_channels=in_channels,
out_channels=transformer_dim,
kernel_size=1,
stride=1,
use_norm=False,
use_act=False,
)
conv_1x1_out = ConvLayer(
opts=opts,
in_channels=transformer_dim,
out_channels=in_channels,
kernel_size=1,
stride=1,
use_norm=True,
use_act=True,
)
conv_3x3_out = None
if not no_fusion:
conv_3x3_out = ConvLayer(
opts=opts,
in_channels=2 * in_channels,
out_channels=in_channels,
kernel_size=conv_ksize,
stride=1,
use_norm=True,
use_act=True,
)
super().__init__()
self.local_rep = nn.Sequential()
self.local_rep.add_module(name="conv_3x3", module=conv_3x3_in)
self.local_rep.add_module(name="conv_1x1", module=conv_1x1_in)
assert transformer_dim % head_dim == 0
num_heads = transformer_dim // head_dim
global_rep = [
TransformerEncoder(
opts=opts,
embed_dim=transformer_dim,
ffn_latent_dim=ffn_dim,
num_heads=num_heads,
attn_dropout=attn_dropout,
dropout=dropout,
ffn_dropout=ffn_dropout,
transformer_norm_layer=transformer_norm_layer,
)
for _ in range(n_transformer_blocks)
]
global_rep.append(
get_normalization_layer(
opts=opts,
norm_type=transformer_norm_layer,
num_features=transformer_dim,
)
)
self.global_rep = nn.Sequential(*global_rep)
self.conv_proj = conv_1x1_out
self.fusion = conv_3x3_out
self.patch_h = patch_h
self.patch_w = patch_w
self.patch_area = self.patch_w * self.patch_h
self.cnn_in_dim = in_channels
self.cnn_out_dim = transformer_dim
self.n_heads = num_heads
self.ffn_dim = ffn_dim
self.dropout = dropout
self.attn_dropout = attn_dropout
self.ffn_dropout = ffn_dropout
self.dilation = dilation
self.n_blocks = n_transformer_blocks
self.conv_ksize = conv_ksize
def __repr__(self) -> str:
repr_str = "{}(".format(self.__class__.__name__)
repr_str += "\n\t Local representations"
if isinstance(self.local_rep, nn.Sequential):
for m in self.local_rep:
repr_str += "\n\t\t {}".format(m)
else:
repr_str += "\n\t\t {}".format(self.local_rep)
repr_str += "\n\t Global representations with patch size of {}x{}".format(
self.patch_h, self.patch_w
)
if isinstance(self.global_rep, nn.Sequential):
for m in self.global_rep:
repr_str += "\n\t\t {}".format(m)
else:
repr_str += "\n\t\t {}".format(self.global_rep)
if isinstance(self.conv_proj, nn.Sequential):
for m in self.conv_proj:
repr_str += "\n\t\t {}".format(m)
else:
repr_str += "\n\t\t {}".format(self.conv_proj)
if self.fusion is not None:
repr_str += "\n\t Feature fusion"
if isinstance(self.fusion, nn.Sequential):
for m in self.fusion:
repr_str += "\n\t\t {}".format(m)
else:
repr_str += "\n\t\t {}".format(self.fusion)
repr_str += "\n)"
return repr_str
def unfolding(self, feature_map: Tensor) -> Tuple[Tensor, Dict]:
patch_w, patch_h = self.patch_w, self.patch_h
patch_area = int(patch_w * patch_h)
batch_size, in_channels, orig_h, orig_w = feature_map.shape
new_h = int(math.ceil(orig_h / self.patch_h) * self.patch_h)
new_w = int(math.ceil(orig_w / self.patch_w) * self.patch_w)
interpolate = False
if new_w != orig_w or new_h != orig_h:
# Note: Padding can be done, but then it needs to be handled in attention function.
feature_map = F.interpolate(
feature_map, size=(new_h, new_w), mode="bilinear", align_corners=False
)
interpolate = True
# number of patches along width and height
num_patch_w = new_w // patch_w # n_w
num_patch_h = new_h // patch_h # n_h
num_patches = num_patch_h * num_patch_w # N
# [B, C, H, W] --> [B * C * n_h, p_h, n_w, p_w]
reshaped_fm = feature_map.reshape(
batch_size * in_channels * num_patch_h, patch_h, num_patch_w, patch_w
)
# [B * C * n_h, p_h, n_w, p_w] --> [B * C * n_h, n_w, p_h, p_w]
transposed_fm = reshaped_fm.transpose(1, 2)
# [B * C * n_h, n_w, p_h, p_w] --> [B, C, N, P] where P = p_h * p_w and N = n_h * n_w
reshaped_fm = transposed_fm.reshape(
batch_size, in_channels, num_patches, patch_area
)
# [B, C, N, P] --> [B, P, N, C]
transposed_fm = reshaped_fm.transpose(1, 3)
# [B, P, N, C] --> [BP, N, C]
patches = transposed_fm.reshape(batch_size * patch_area, num_patches, -1)
info_dict = {
"orig_size": (orig_h, orig_w),
"batch_size": batch_size,
"interpolate": interpolate,
"total_patches": num_patches,
"num_patches_w": num_patch_w,
"num_patches_h": num_patch_h,
}
return patches, info_dict
def folding(self, patches: Tensor, info_dict: Dict) -> Tensor:
n_dim = patches.dim()
assert n_dim == 3, "Tensor should be of shape BPxNxC. Got: {}".format(
patches.shape
)
# [BP, N, C] --> [B, P, N, C]
patches = patches.contiguous().view(
info_dict["batch_size"], self.patch_area, info_dict["total_patches"], -1
)
batch_size, pixels, num_patches, channels = patches.size()
num_patch_h = info_dict["num_patches_h"]
num_patch_w = info_dict["num_patches_w"]
# [B, P, N, C] --> [B, C, N, P]
patches = patches.transpose(1, 3)
# [B, C, N, P] --> [B*C*n_h, n_w, p_h, p_w]
feature_map = patches.reshape(
batch_size * channels * num_patch_h, num_patch_w, self.patch_h, self.patch_w
)
# [B*C*n_h, n_w, p_h, p_w] --> [B*C*n_h, p_h, n_w, p_w]
feature_map = feature_map.transpose(1, 2)
# [B*C*n_h, p_h, n_w, p_w] --> [B, C, H, W]
feature_map = feature_map.reshape(
batch_size, channels, num_patch_h * self.patch_h, num_patch_w * self.patch_w
)
if info_dict["interpolate"]:
feature_map = F.interpolate(
feature_map,
size=info_dict["orig_size"],
mode="bilinear",
align_corners=False,
)
return feature_map
def forward_spatial(self, x: Tensor) -> Tensor:
res = x
fm = self.local_rep(x)
# convert feature map to patches
patches, info_dict = self.unfolding(fm)
# learn global representations
for transformer_layer in self.global_rep:
patches = transformer_layer(patches)
# [B x Patch x Patches x C] --> [B x C x Patches x Patch]
fm = self.folding(patches=patches, info_dict=info_dict)
fm = self.conv_proj(fm)
if self.fusion is not None:
fm = self.fusion(torch.cat((res, fm), dim=1))
return fm
def forward_temporal(
self, x: Tensor, x_prev: Optional[Tensor] = None
) -> Union[Tensor, Tuple[Tensor, Tensor]]:
res = x
fm = self.local_rep(x)
# convert feature map to patches
patches, info_dict = self.unfolding(fm)
# learn global representations
for global_layer in self.global_rep:
if isinstance(global_layer, TransformerEncoder):
patches = global_layer(x=patches, x_prev=x_prev)
else:
patches = global_layer(patches)
# [B x Patch x Patches x C] --> [B x C x Patches x Patch]
fm = self.folding(patches=patches, info_dict=info_dict)
fm = self.conv_proj(fm)
if self.fusion is not None:
fm = self.fusion(torch.cat((res, fm), dim=1))
return fm, patches
def forward(
self, x: Union[Tensor, Tuple[Tensor]], *args, **kwargs
) -> Union[Tensor, Tuple[Tensor, Tensor]]:
if isinstance(x, Tuple) and len(x) == 2:
# for spatio-temporal MobileViT
return self.forward_temporal(x=x[0], x_prev=x[1])
elif isinstance(x, Tensor):
# For image data
return self.forward_spatial(x)
else:
raise NotImplementedError
def profile_module(
self, input: Tensor, *args, **kwargs
) -> Tuple[Tensor, float, float]:
params = macs = 0.0
res = input
out, p, m = module_profile(module=self.local_rep, x=input)
params += p
macs += m
patches, info_dict = self.unfolding(feature_map=out)
patches, p, m = module_profile(module=self.global_rep, x=patches)
params += p
macs += m
fm = self.folding(patches=patches, info_dict=info_dict)
out, p, m = module_profile(module=self.conv_proj, x=fm)
params += p
macs += m
if self.fusion is not None:
out, p, m = module_profile(
module=self.fusion, x=torch.cat((out, res), dim=1)
)
params += p
macs += m
return res, params, macs
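
# ---------------------------------------------------------------------------
# Hedged sketch (not part of the original module): a standalone check that
# the reshape/transpose pipeline used by `unfolding`/`folding` above is an
# exact round trip when H and W are multiples of the patch size. The sizes
# below are arbitrary example values. Nothing here runs on import.
# ---------------------------------------------------------------------------
def _example_unfold_fold_roundtrip() -> None:
    b, c, h, w, p_h, p_w = 2, 8, 16, 16, 4, 4
    n_h, n_w = h // p_h, w // p_w
    x = torch.randn(b, c, h, w)
    # [B, C, H, W] -> [B*C*n_h, p_h, n_w, p_w] -> [B*C*n_h, n_w, p_h, p_w]
    t = x.reshape(b * c * n_h, p_h, n_w, p_w).transpose(1, 2)
    # -> [B, C, N, P] -> [B, P, N, C] -> [BP, N, C]
    patches = t.reshape(b, c, n_h * n_w, p_h * p_w).transpose(1, 3)
    patches = patches.reshape(b * p_h * p_w, n_h * n_w, c)
    # Invert every step to recover the original feature map.
    y = patches.reshape(b, p_h * p_w, n_h * n_w, c).transpose(1, 3)
    y = y.reshape(b * c * n_h, n_w, p_h, p_w).transpose(1, 2)
    y = y.reshape(b, c, h, w)
    assert torch.equal(x, y)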
# TODO: Add reference to MobileViTv2 paper
class MobileViTBlockv2(BaseModule):
"""
This class defines the `MobileViTv2 block <>`_
Args:
opts: command line arguments
in_channels (int): :math:`C_{in}` from an expected input of size :math:`(N, C_{in}, H, W)`
attn_unit_dim (int): Input dimension to the attention unit
ffn_multiplier (int): Expand the input dimensions by this factor in FFN. Default is 2.
n_attn_blocks (Optional[int]): Number of attention units. Default: 2
attn_dropout (Optional[float]): Dropout in multi-head attention. Default: 0.0
dropout (Optional[float]): Dropout rate. Default: 0.0
ffn_dropout (Optional[float]): Dropout between FFN layers in transformer. Default: 0.0
patch_h (Optional[int]): Patch height for unfolding operation. Default: 8
patch_w (Optional[int]): Patch width for unfolding operation. Default: 8
conv_ksize (Optional[int]): Kernel size to learn local representations in MobileViT block. Default: 3
dilation (Optional[int]): Dilation rate in convolutions. Default: 1
attn_norm_layer (Optional[str]): Normalization layer in the attention block. Default: layer_norm_2d
"""
def __init__(
self,
opts,
in_channels: int,
attn_unit_dim: int,
ffn_multiplier: Optional[Union[Sequence[Union[int, float]], int, float]] = 2.0,
n_attn_blocks: Optional[int] = 2,
attn_dropout: Optional[float] = 0.0,
dropout: Optional[float] = 0.0,
ffn_dropout: Optional[float] = 0.0,
patch_h: Optional[int] = 8,
patch_w: Optional[int] = 8,
conv_ksize: Optional[int] = 3,
dilation: Optional[int] = 1,
attn_norm_layer: Optional[str] = "layer_norm_2d",
*args,
**kwargs
) -> None:
cnn_out_dim = attn_unit_dim
conv_3x3_in = ConvLayer(
opts=opts,
in_channels=in_channels,
out_channels=in_channels,
kernel_size=conv_ksize,
stride=1,
use_norm=True,
use_act=True,
dilation=dilation,
groups=in_channels,
)
conv_1x1_in = ConvLayer(
opts=opts,
in_channels=in_channels,
out_channels=cnn_out_dim,
kernel_size=1,
stride=1,
use_norm=False,
use_act=False,
)
super(MobileViTBlockv2, self).__init__()
self.local_rep = nn.Sequential(conv_3x3_in, conv_1x1_in)
self.global_rep, attn_unit_dim = self._build_attn_layer(
opts=opts,
d_model=attn_unit_dim,
ffn_mult=ffn_multiplier,
n_layers=n_attn_blocks,
attn_dropout=attn_dropout,
dropout=dropout,
ffn_dropout=ffn_dropout,
attn_norm_layer=attn_norm_layer,
)
self.conv_proj = ConvLayer(
opts=opts,
in_channels=cnn_out_dim,
out_channels=in_channels,
kernel_size=1,
stride=1,
use_norm=True,
use_act=False,
)
self.patch_h = patch_h
self.patch_w = patch_w
self.patch_area = self.patch_w * self.patch_h
self.cnn_in_dim = in_channels
self.cnn_out_dim = cnn_out_dim
self.transformer_in_dim = attn_unit_dim
self.dropout = dropout
self.attn_dropout = attn_dropout
self.ffn_dropout = ffn_dropout
self.n_blocks = n_attn_blocks
self.conv_ksize = conv_ksize
self.enable_coreml_compatible_fn = getattr(
opts, "common.enable_coreml_compatible_module", False
)
if self.enable_coreml_compatible_fn:
            # we set persistent to False so that these weights are not part of the model's state_dict
self.register_buffer(
name="unfolding_weights",
tensor=self._compute_unfolding_weights(),
persistent=False,
)
def _compute_unfolding_weights(self) -> Tensor:
# [P_h * P_w, P_h * P_w]
weights = torch.eye(self.patch_h * self.patch_w, dtype=torch.float)
# [P_h * P_w, P_h * P_w] --> [P_h * P_w, 1, P_h, P_w]
weights = weights.reshape(
(self.patch_h * self.patch_w, 1, self.patch_h, self.patch_w)
)
# [P_h * P_w, 1, P_h, P_w] --> [P_h * P_w * C, 1, P_h, P_w]
weights = weights.repeat(self.cnn_out_dim, 1, 1, 1)
return weights
def _build_attn_layer(
self,
opts,
d_model: int,
ffn_mult: Union[Sequence, int, float],
n_layers: int,
attn_dropout: float,
dropout: float,
ffn_dropout: float,
attn_norm_layer: str,
*args,
**kwargs
) -> Tuple[nn.Module, int]:
if isinstance(ffn_mult, Sequence) and len(ffn_mult) == 2:
ffn_dims = (
np.linspace(ffn_mult[0], ffn_mult[1], n_layers, dtype=float) * d_model
)
elif isinstance(ffn_mult, Sequence) and len(ffn_mult) == 1:
ffn_dims = [ffn_mult[0] * d_model] * n_layers
elif isinstance(ffn_mult, (int, float)):
ffn_dims = [ffn_mult * d_model] * n_layers
else:
raise NotImplementedError
        # ensure that dims are multiples of 16
ffn_dims = [int((d // 16) * 16) for d in ffn_dims]
global_rep = [
LinearAttnFFN(
opts=opts,
embed_dim=d_model,
ffn_latent_dim=ffn_dims[block_idx],
attn_dropout=attn_dropout,
dropout=dropout,
ffn_dropout=ffn_dropout,
norm_layer=attn_norm_layer,
)
for block_idx in range(n_layers)
]
global_rep.append(
get_normalization_layer(
opts=opts, norm_type=attn_norm_layer, num_features=d_model
)
)
return nn.Sequential(*global_rep), d_model
def __repr__(self) -> str:
repr_str = "{}(".format(self.__class__.__name__)
repr_str += "\n\t Local representations"
if isinstance(self.local_rep, nn.Sequential):
for m in self.local_rep:
repr_str += "\n\t\t {}".format(m)
else:
repr_str += "\n\t\t {}".format(self.local_rep)
repr_str += "\n\t Global representations with patch size of {}x{}".format(
self.patch_h,
self.patch_w,
)
if isinstance(self.global_rep, nn.Sequential):
for m in self.global_rep:
repr_str += "\n\t\t {}".format(m)
else:
repr_str += "\n\t\t {}".format(self.global_rep)
if isinstance(self.conv_proj, nn.Sequential):
for m in self.conv_proj:
repr_str += "\n\t\t {}".format(m)
else:
repr_str += "\n\t\t {}".format(self.conv_proj)
repr_str += "\n)"
return repr_str
def unfolding_pytorch(self, feature_map: Tensor) -> Tuple[Tensor, Tuple[int, int]]:
batch_size, in_channels, img_h, img_w = feature_map.shape
# [B, C, H, W] --> [B, C, P, N]
patches = F.unfold(
feature_map,
kernel_size=(self.patch_h, self.patch_w),
stride=(self.patch_h, self.patch_w),
)
patches = patches.reshape(
batch_size, in_channels, self.patch_h * self.patch_w, -1
)
return patches, (img_h, img_w)
def folding_pytorch(self, patches: Tensor, output_size: Tuple[int, int]) -> Tensor:
batch_size, in_dim, patch_size, n_patches = patches.shape
# [B, C, P, N]
patches = patches.reshape(batch_size, in_dim * patch_size, n_patches)
feature_map = F.fold(
patches,
output_size=output_size,
kernel_size=(self.patch_h, self.patch_w),
stride=(self.patch_h, self.patch_w),
)
return feature_map
def unfolding_coreml(self, feature_map: Tensor) -> Tuple[Tensor, Tuple[int, int]]:
        # im2col is not implemented in CoreML, so we emulate unfolding with a
        # grouped conv2d whose identity weights were precomputed in
        # _compute_unfolding_weights()
        # [B, C, H, W] --> [B, C, P, N]
        batch_size, in_channels, img_h, img_w = feature_map.shape
patches = F.conv2d(
feature_map,
self.unfolding_weights,
bias=None,
stride=(self.patch_h, self.patch_w),
padding=0,
dilation=1,
groups=in_channels,
)
patches = patches.reshape(
batch_size, in_channels, self.patch_h * self.patch_w, -1
)
return patches, (img_h, img_w)
def folding_coreml(self, patches: Tensor, output_size: Tuple[int, int]) -> Tensor:
        # col2im is not supported in CoreML, so tracing fails.
        # We therefore implement the folding via pixel_shuffle to keep the module traceable.
batch_size, in_dim, patch_size, n_patches = patches.shape
n_patches_h = output_size[0] // self.patch_h
n_patches_w = output_size[1] // self.patch_w
feature_map = patches.reshape(
batch_size, in_dim * self.patch_h * self.patch_w, n_patches_h, n_patches_w
)
        assert (
            self.patch_h == self.patch_w
        ), "For CoreML, patch_h and patch_w must be the same"
feature_map = F.pixel_shuffle(feature_map, upscale_factor=self.patch_h)
return feature_map
def resize_input_if_needed(self, x):
batch_size, in_channels, orig_h, orig_w = x.shape
if orig_h % self.patch_h != 0 or orig_w % self.patch_w != 0:
new_h = int(math.ceil(orig_h / self.patch_h) * self.patch_h)
new_w = int(math.ceil(orig_w / self.patch_w) * self.patch_w)
x = F.interpolate(
x, size=(new_h, new_w), mode="bilinear", align_corners=True
)
return x
def forward_spatial(self, x: Tensor, *args, **kwargs) -> Tensor:
x = self.resize_input_if_needed(x)
fm = self.local_rep(x)
# convert feature map to patches
if self.enable_coreml_compatible_fn:
patches, output_size = self.unfolding_coreml(fm)
else:
patches, output_size = self.unfolding_pytorch(fm)
# learn global representations on all patches
patches = self.global_rep(patches)
        # fold the patches back into a feature map: [B, C, P, N] --> [B, C, H, W]
if self.enable_coreml_compatible_fn:
fm = self.folding_coreml(patches=patches, output_size=output_size)
else:
fm = self.folding_pytorch(patches=patches, output_size=output_size)
fm = self.conv_proj(fm)
return fm
def forward_temporal(
self, x: Tensor, x_prev: Tensor, *args, **kwargs
) -> Union[Tensor, Tuple[Tensor, Tensor]]:
x = self.resize_input_if_needed(x)
fm = self.local_rep(x)
# convert feature map to patches
if self.enable_coreml_compatible_fn:
patches, output_size = self.unfolding_coreml(fm)
else:
patches, output_size = self.unfolding_pytorch(fm)
# learn global representations
for global_layer in self.global_rep:
if isinstance(global_layer, LinearAttnFFN):
patches = global_layer(x=patches, x_prev=x_prev)
else:
patches = global_layer(patches)
        # fold the patches back into a feature map: [B, C, P, N] --> [B, C, H, W]
if self.enable_coreml_compatible_fn:
fm = self.folding_coreml(patches=patches, output_size=output_size)
else:
fm = self.folding_pytorch(patches=patches, output_size=output_size)
fm = self.conv_proj(fm)
return fm, patches
def forward(
self, x: Union[Tensor, Tuple[Tensor]], *args, **kwargs
) -> Union[Tensor, Tuple[Tensor, Tensor]]:
if isinstance(x, Tuple) and len(x) == 2:
# for spatio-temporal data (e.g., videos)
return self.forward_temporal(x=x[0], x_prev=x[1])
elif isinstance(x, Tensor):
# for image data
return self.forward_spatial(x)
else:
raise NotImplementedError
def profile_module(
self, input: Tensor, *args, **kwargs
) -> Tuple[Tensor, float, float]:
params = macs = 0.0
input = self.resize_input_if_needed(input)
res = input
out, p, m = module_profile(module=self.local_rep, x=input)
params += p
macs += m
patches, output_size = self.unfolding_pytorch(feature_map=out)
patches, p, m = module_profile(module=self.global_rep, x=patches)
params += p
macs += m
fm = self.folding_pytorch(patches=patches, output_size=output_size)
out, p, m = module_profile(module=self.conv_proj, x=fm)
params += p
macs += m
return res, params, macs
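# --- Illustrative, standalone sanity check (not part of the original module;
# it relies on the torch / F imports at the top of this file). A minimal
# sketch verifying the CoreML-friendly unfold/fold trick used above: a grouped
# conv2d with identity weights reproduces F.unfold, and pixel_shuffle inverts
# it. Shapes and values below are arbitrary. ---
if __name__ == "__main__":
    B, C, H, W, P = 2, 4, 8, 8, 2  # P = patch_h = patch_w
    x = torch.randn(B, C, H, W)
    # one one-hot PxP filter per patch position, repeated per channel
    w = torch.eye(P * P).reshape(P * P, 1, P, P).repeat(C, 1, 1, 1)
    via_conv = F.conv2d(x, w, stride=(P, P), groups=C).reshape(B, C, P * P, -1)
    via_unfold = F.unfold(x, kernel_size=(P, P), stride=(P, P)).reshape(
        B, C, P * P, -1
    )
    print(torch.allclose(via_conv, via_unfold))  # expected: True
    # folding back: reshape to [B, C*P*P, H/P, W/P] and pixel_shuffle
    fm = via_conv.reshape(B, C * P * P, H // P, W // P)
    print(torch.allclose(F.pixel_shuffle(fm, upscale_factor=P), x))  # True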
| 35.228336 | 113 | 0.582055 | 3,367 | 25,611 | 4.173745 | 0.0891 | 0.037145 | 0.019925 | 0.022415 | 0.693873 | 0.638369 | 0.579876 | 0.54636 | 0.50893 | 0.483598 | 0 | 0.009542 | 0.316622 | 25,611 | 726 | 114 | 35.27686 | 0.793395 | 0.164187 | 0 | 0.557223 | 0 | 0 | 0.031216 | 0.001792 | 0 | 0 | 0 | 0.001377 | 0.005629 | 1 | 0.0394 | false | 0 | 0.018762 | 0 | 0.101313 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17cdaecb39f6435a61283b4e5e7041c26c062bf8 | 1,782 | py | Python | fpl/command_line/db.py | martgra/fpl-timeseries-data | f5135c27cc75e56370d310bb359157dd84c29cee | [
"Apache-2.0"
] | 2 | 2020-09-13T14:51:13.000Z | 2021-07-21T13:44:17.000Z | fpl/command_line/db.py | martgra/fpl-timeseries-data | f5135c27cc75e56370d310bb359157dd84c29cee | [
"Apache-2.0"
] | 22 | 2020-10-01T09:24:56.000Z | 2021-08-02T12:38:19.000Z | fpl/command_line/db.py | martgra/fpl2021 | f5135c27cc75e56370d310bb359157dd84c29cee | [
"Apache-2.0"
] | null | null | null | """Interaction with Cosmos DB through CLI."""
import os
from pathlib import Path
import click
import pandas as pd
from fpl.data.cosmos import ElementsInserter
@click.group(help="Procedures to interact with Azure Cosmos DB", name="cosmos")
@click.option("--uri", "-u", type=str, default=None)
@click.option("--token", "-t", type=str, default=None)
@click.pass_context
def cosmos_cli(ctx, uri, token):
"""Download group."""
common = {"database": "fplstats", "container": "elements", "partition_key": "download_time"}
try:
if uri and token:
db_client = ElementsInserter(uri, token, common)
else:
db_client = ElementsInserter(
os.getenv("AZURE_COSMOS_URI"), os.getenv("AZURE_COSMOS_TOKEN"), common
)
ctx.obj = db_client
except TypeError:
ctx.obj = None
print("ERROR IN CREDENTIALS")
@cosmos_cli.command(name="dump")
@click.option(
"--path",
"-p",
type=click.Path(exists=False),
default="./dump",
help="File path to where to write data dump",
)
@click.option("--last", "-l", is_flag=True, help="Get data with last timestamp")
@click.option(
"--format-type",
"-f",
type=click.Choice(["json", "csv"]),
help="Choose format in which to dump data",
)
@click.pass_obj
def dump(db_client, path, last, format_type):
"""Dump all or latest data."""
path = Path(path)
if str(path.suffix) == "":
        path = Path(str(path) + "." + format_type)
if last:
dataframe = pd.DataFrame(db_client.get_latest_download())
else:
dataframe = pd.DataFrame(db_client.search_db())
if format_type == "csv":
dataframe.to_csv(path)
else:
dataframe.to_json(path, force_ascii=False, orient="records", indent=4)
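# --- Illustrative usage (not part of the original module). The group is
# normally mounted on the package's top-level CLI; a hypothetical direct
# invocation would look like:
#   export AZURE_COSMOS_URI=... AZURE_COSMOS_TOKEN=...
#   python -m fpl.command_line.db dump --path ./dump --format-type csv --last
if __name__ == "__main__":
    cosmos_cli()  # click dispatches the sub-commands from sys.argv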
| 29.213115 | 96 | 0.632435 | 233 | 1,782 | 4.729614 | 0.403433 | 0.043557 | 0.025408 | 0.032668 | 0.092559 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000714 | 0.213805 | 1,782 | 60 | 97 | 29.7 | 0.785867 | 0.044893 | 0 | 0.1 | 0 | 0 | 0.199881 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04 | false | 0.04 | 0.1 | 0 | 0.14 | 0.02 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17d03ef0c7711355ff8f293c43026a69152f12ea | 2,160 | py | Python | project/models.py | Ken-mbira/Tuzo | fdc25a4af91a452d78c628e3c21b27f016138ba4 | [
"MIT"
] | 1 | 2021-11-04T21:50:48.000Z | 2021-11-04T21:50:48.000Z | project/models.py | Ken-mbira/Tuzo | fdc25a4af91a452d78c628e3c21b27f016138ba4 | [
"MIT"
] | null | null | null | project/models.py | Ken-mbira/Tuzo | fdc25a4af91a452d78c628e3c21b27f016138ba4 | [
"MIT"
] | null | null | null | from django.db import models
from django.core.validators import MinValueValidator,MaxValueValidator
from django.db.models import Avg
import math
from cloudinary.models import CloudinaryField
from account.models import Account
# Create your models here.
class Project(models.Model):
"""This defines the behaviours of a project
Args:
models ([type]): [description]
"""
owner = models.ForeignKey(Account,on_delete=models.RESTRICT)
name = models.CharField(max_length=50,unique=True)
date_added = models.DateTimeField(auto_now_add=True)
date_created = models.DateField()
description = models.TextField(blank=True)
repo_link = models.CharField(max_length=200)
live_link = models.CharField(max_length=200,blank=True)
image = CloudinaryField(blank=True)
def __str__(self):
return self.name
def update(self,new):
self.name = new.name
self.date_created = new.date_created
self.description = new.description
self.repo_link = new.repo_link
self.live_link = new.live_link
self.image = new.image
self.save()
class Vote(models.Model):
"""This define all behaviours of a vote to a project
Args:
models ([type]): [description]
"""
owner = models.ForeignKey(Account,on_delete=models.RESTRICT)
project = models.ForeignKey(Project,on_delete=models.CASCADE,related_name="votes")
design_vote = models.IntegerField(default=0,validators=[MaxValueValidator(10),MinValueValidator(0)])
design_comment = models.TextField()
usability_vote = models.IntegerField(default=0,validators=[MaxValueValidator(10),MinValueValidator(0)])
usability_comment = models.TextField()
content_vote = models.IntegerField(default=0,validators=[MaxValueValidator(10),MinValueValidator(0)])
content_comment = models.TextField()
overall_comment = models.TextField()
def __str__(self):
return self.owner.username + " comment | " + str(self.project.name)
@property
def average(self):
average = (self.design_vote + self.usability_vote + self.content_vote) / 3
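        # e.g. design=7, usability=8, content=9 -> trunc((7 + 8 + 9) / 3) == 8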
return math.trunc(average) | 36.610169 | 107 | 0.716667 | 261 | 2,160 | 5.789272 | 0.321839 | 0.049636 | 0.05824 | 0.047651 | 0.330245 | 0.303772 | 0.26274 | 0.26274 | 0.26274 | 0.26274 | 0 | 0.011844 | 0.179167 | 2,160 | 59 | 108 | 36.610169 | 0.840384 | 0.092593 | 0 | 0.097561 | 0 | 0 | 0.008316 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.097561 | false | 0 | 0.146341 | 0.04878 | 0.780488 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17d158b6437a6694cfb68d2a2960b0d73e216611 | 416 | py | Python | bevrand.playlistapi/api/db/database_models.py | fossabot/bevrand | cc444c6ac9e0f2838d4bc862cd2932babc77de78 | [
"MIT"
] | null | null | null | bevrand.playlistapi/api/db/database_models.py | fossabot/bevrand | cc444c6ac9e0f2838d4bc862cd2932babc77de78 | [
"MIT"
] | null | null | null | bevrand.playlistapi/api/db/database_models.py | fossabot/bevrand | cc444c6ac9e0f2838d4bc862cd2932babc77de78 | [
"MIT"
] | null | null | null | class MongoObject:
    """Common base class for all mongo objects."""
def __init__(self, id, user_name, list_name, beverages, display_name=None, image_url=None):
self.id = id
self.user = user_name
self.list = list_name
self.displayName = display_name
self.imageUrl = image_url
self.dateinserted = None
self.dateupdated = None
self.beverages = beverages | 34.666667 | 95 | 0.65625 | 52 | 416 | 5.019231 | 0.461538 | 0.091954 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.274038 | 416 | 12 | 96 | 34.666667 | 0.864238 | 0.09375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
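# --- Illustrative usage of the MongoObject class above (values are made up,
# not from the original repo) ---
if __name__ == "__main__":
    playlist = MongoObject(
        id="1", user_name="alice", list_name="favourites",
        beverages=["mojito", "cola"], display_name="Alice's favourites")
    print(playlist.user, playlist.list, playlist.beverages)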
17d2e11a21f2a29919b0d0f7268c7bfef1cc107d | 1,750 | py | Python | widgets/pad.py | peterhinch/micropython_ra8875 | a61314d62d6add831f6618c857b01d1a5b7ce388 | [
"MIT"
] | 6 | 2019-08-15T11:50:20.000Z | 2022-01-22T12:09:57.000Z | widgets/pad.py | peterhinch/micropython_ra8875 | a61314d62d6add831f6618c857b01d1a5b7ce388 | [
"MIT"
] | null | null | null | widgets/pad.py | peterhinch/micropython_ra8875 | a61314d62d6add831f6618c857b01d1a5b7ce388 | [
"MIT"
] | 2 | 2020-04-19T13:38:52.000Z | 2021-08-16T13:31:39.000Z | # pad.py Extension to lcd160gui providing the invisible touchpad class
# Released under the MIT License (MIT). See LICENSE.
# Copyright (c) 2020 Peter Hinch
# Usage: import classes as required:
# from gui.widgets.pad import Pad
import uasyncio as asyncio
from micropython_ra8875.py.ugui import Touchable
from micropython_ra8875.primitives.delay_ms import Delay_ms
# Pad coordinates relate to bounding box (BB). x, y are of BB top left corner.
# Likewise, width and height refer to the BB.
class Pad(Touchable):
long_press_time = 1000
def __init__(self, location, *, height=20, width=50, onrelease=True,
callback=None, args=[], lp_callback=None, lp_args=[]):
super().__init__(location, None, height, width, None, None, None, None, False, '', None)
self.callback = (lambda *_: None) if callback is None else callback
self.callback_args = args
self.onrelease = onrelease
self.lp_callback = lp_callback
self.lp_args = lp_args
self.lp_task = None # Long press not in progress
def show(self):
pass
def _touched(self, x, y): # Process touch
if self.lp_callback is not None:
self.lp_task = asyncio.create_task(self.longpress())
if not self.onrelease:
self.callback(self, *self.callback_args) # Callback not a bound method so pass self
def _untouched(self):
if self.lp_task is not None:
self.lp_task.cancel()
self.lp_task = None
if self.onrelease:
self.callback(self, *self.callback_args) # Callback not a bound method so pass self
async def longpress(self):
await asyncio.sleep_ms(Pad.long_press_time)
self.lp_callback(self, *self.lp_args)
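# --- Illustrative usage sketch (requires the RA8875 display and the GUI event
# loop, so it is left commented; `location` is an (x, y) tuple) ---
# def released(pad, arg):
#     print('pad released', arg)
# def held(pad):
#     print('long press')
# pad = Pad((0, 0), height=30, width=60, callback=released, args=(1,),
#           lp_callback=held)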
| 36.458333 | 96 | 0.675429 | 247 | 1,750 | 4.631579 | 0.388664 | 0.052448 | 0.043706 | 0.024476 | 0.16958 | 0.16958 | 0.136364 | 0.136364 | 0.136364 | 0.136364 | 0 | 0.017267 | 0.238857 | 1,750 | 47 | 97 | 37.234043 | 0.841592 | 0.26 | 0 | 0.133333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0.033333 | 0.1 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17d33576ff680e2d94743644465bde35d9d9f737 | 947 | py | Python | padinfo/view/monster_list/all_mats.py | muffin-rice/pad-cogs | 820ecf08f9569a3d7cf3264d0eb9567264b42edf | [
"MIT"
] | 2 | 2020-09-25T01:57:21.000Z | 2020-10-02T13:46:48.000Z | padinfo/view/monster_list/all_mats.py | muffin-rice/pad-cogs | 820ecf08f9569a3d7cf3264d0eb9567264b42edf | [
"MIT"
] | 43 | 2020-08-29T06:16:39.000Z | 2020-10-29T12:00:15.000Z | padinfo/view/monster_list/all_mats.py | muffin-rice/pad-cogs | 820ecf08f9569a3d7cf3264d0eb9567264b42edf | [
"MIT"
] | 6 | 2020-08-31T04:37:55.000Z | 2020-10-19T05:09:17.000Z | from typing import List, Optional, TYPE_CHECKING
from padinfo.view.materials import MaterialsViewState
from padinfo.view.monster_list.monster_list import MonsterListViewState
if TYPE_CHECKING:
from dbcog.models.monster_model import MonsterModel
class AllMatsViewState(MonsterListViewState):
VIEW_STATE_TYPE = "AllMats"
@classmethod
async def do_query(cls, dbcog, monster: "MonsterModel") -> Optional[List["MonsterModel"]]:
_, usedin, _, gemusedin, _, _, _, _ = await MaterialsViewState.do_query(dbcog, monster)
if usedin is None and gemusedin is None:
return None
monster_list = usedin or gemusedin
return monster_list
@classmethod
async def query_from_ims(cls, dbcog, ims) -> List["MonsterModel"]:
monster = await dbcog.find_monster(ims['raw_query'], ims['original_author_id'])
monster_list = await cls.do_query(dbcog, monster)
return monster_list
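# --- Illustrative usage (hypothetical async context; dbcog is supplied by the
# hosting cog, so this sketch stays commented) ---
# monster = await dbcog.find_monster("some query", author_id)
# mats = await AllMatsViewState.do_query(dbcog, monster)
# if mats is None:
#     ...  # the monster is not used as evo/gem material anywhere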
| 36.423077 | 95 | 0.727561 | 111 | 947 | 5.972973 | 0.378378 | 0.099548 | 0.048265 | 0.057315 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19113 | 947 | 25 | 96 | 37.88 | 0.865535 | 0 | 0 | 0.210526 | 0 | 0 | 0.073918 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.210526 | 0 | 0.473684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17d4abcfbfea6664cbf0599157e7211b1671191d | 16,686 | py | Python | testing/base_test_case.py | lagvier/echo-sense | fe8ab921e7f61c48b224f0cc2832103a395a6cf7 | [
"MIT"
] | null | null | null | testing/base_test_case.py | lagvier/echo-sense | fe8ab921e7f61c48b224f0cc2832103a395a6cf7 | [
"MIT"
] | null | null | null | testing/base_test_case.py | lagvier/echo-sense | fe8ab921e7f61c48b224f0cc2832103a395a6cf7 | [
"MIT"
] | 1 | 2019-02-20T13:22:22.000Z | 2019-02-20T13:22:22.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Base test case class to bootstrap application testing.
Code downloaded from: http://github.com/rzajac/gaeteststarter
@author: Simon Ndunda: Modified to add support for deferred tasks
@author: Rafal Zajac rzajac<at>gmail<dot>com
@copyright: Copyright 2007-2013 Rafal Zajac rzajac<at>gmail<dot>com. All rights reserved.
@license: Licensed under the MIT license
"""
# Python imports
import os
import logging
import json
import base64
import pickle
import webtest
import datetime
import unittest
# Google imports
from google.appengine.ext import ndb, testbed
from google.appengine.api.files import file_service_stub
from google.appengine.datastore import datastore_stub_util
from google.appengine.api.blobstore import blobstore_stub, file_blob_storage
class TestbedWithFiles(testbed.Testbed):
def init_blobstore_stub(self, blobstore_path='/tmp/testbed.blobstore', app_id='test-app'):
"""Helper method to create testbed with files"""
blob_storage = file_blob_storage.FileBlobStorage(
blobstore_path, app_id)
blob_stub = blobstore_stub.BlobstoreServiceStub(blob_storage)
file_stub = file_service_stub.FileServiceStub(blob_storage)
self._register_stub('blobstore', blob_stub)
self._register_stub('file', file_stub)
class BaseTestCase(unittest.TestCase):
"""Base class for all tests"""
# The WSGIApplication
#
    # In your tests, assign to it whatever your application
# returns from webapp2.WSGIApplication().
#
APPLICATION = None
# Internal property that wraps your application in webtest.TestApp()
_app = None
# This is the format usable with strftime / strptime for parsing the
# ``eta`` field for a particular task
TASK_ETA_FORMAT = "%Y/%m/%d %H:%M:%S"
# Setup helpers
def setup_testbed(self, app_id='test-app'):
logging.getLogger().setLevel(logging.DEBUG)
self.testbed = testbed.Testbed()
self.testbed.activate()
self.testbed.setup_env(app_id=app_id)
def setup_testbed_with_files(self, app_id='test-app'):
self.testbed = TestbedWithFiles()
self.testbed.activate()
self.testbed.setup_env(app_id=app_id)
def teardown_testbed(self):
self.testbed.deactivate()
def register_search_api_stub(self):
from google.appengine.api.search.simple_search_stub import SearchServiceStub
self.testbed._register_stub('search', SearchServiceStub())
def init_taskqueue_stub(self, queue_yaml_path='.'):
self.testbed.init_taskqueue_stub()
# Setup task queue stub
taskqueue_stub = self.get_task_queue_stub()
# Ensure dev appserver task queue knows where to find queue.yaml
taskqueue_stub._root_path = os.path.dirname(
os.path.dirname(queue_yaml_path))
def get_task_queue_stub(self):
"""Get task queue stub"""
return self.testbed.get_stub(testbed.TASKQUEUE_SERVICE_NAME)
def init_modules_stub(self):
self.testbed.init_modules_stub()
def init_urlfetch_stub(self):
self.testbed.init_urlfetch_stub()
def init_app_identity_stub(self):
self.testbed.init_app_identity_stub()
def init_mail_stub(self):
self.testbed.init_mail_stub()
def init_image_stub(self):
self.testbed.init_images_stub()
def init_blobstore_stub(self):
self.testbed.init_blobstore_stub()
def init_memcache_stub(self):
self.testbed.init_memcache_stub()
def init_datastore_stub(self, probability=1):
"""Initialize datastore stub
See: https://developers.google.com/appengine/docs/python/tools/localunittesting#Writing_HRD_Datastore_Tests
"""
ds_policy = datastore_stub_util.PseudoRandomHRConsistencyPolicy(
probability=probability)
self.testbed.init_datastore_v3_stub(consistency_policy=ds_policy)
# Application helpers
def set_application(self, application):
"""Set application and TestApp to use in tests"""
self.APPLICATION = application
self._app = webtest.TestApp(self.APPLICATION)
def clear_application(self):
"""Clear application and TestApp
You should put it in your tearDown()
"""
self.APPLICATION = None
self._app = None
def save_application(self):
"""Save currently used application so you can switch to different one
This helps when your app is composed from many
small applications that you define in app.yaml
file. Example:
- url: /admin/batch/.*
script: myapp.routes.app
login: admin
- url: /admin/scripts/.*
script: myapp.scripts.routes.app
login: admin
        In this case your application has at least two webapp2.WSGIApplication() instances
Returns: The current TestApp and APPLICATION tuple
"""
return self._app, self.APPLICATION
def restore_application(self, saved_application):
"""Restore APPLICATION saved with save_application method"""
self._app = saved_application[0]
self.APPLICATION = saved_application[1]
@property
def app(self):
"""Get application wrapped in webtest.TestApp"""
error = 'APPLICATION not set'
self.assertTrue(self.APPLICATION is not None, error)
error = '_app not set'
self.assertTrue(self._app is not None, error)
return self._app
# Helpers for testing web handlers and responses
def assertRedirects(self, response, to=None):
"""Asserts that a response from the test web server returns a 301 or 302 status.
This assertion would fail if you expect the page to redirect and instead
the server tells the browser that there was a 500 error, or some other
non-redirecting status code.
"""
error = 'Response did not redirect (status code was %i).' % response.status_int
self.assertTrue(response.status_int in (301, 302), error)
if to is not None:
error = 'Response redirected, but went to %s instead of %s' % (
response.location, to)
self.assertEqual(
response.location, 'http://localhost%s' % to, error)
def assertOK(self, response):
"""Asserts that a response from the test web server returns a 200 OK status code.
This assertion would fail if you expect a standard page to be returned
and instead the server tells the browser to redirect elsewhere.
"""
error = 'Response did not return a 200 OK (status code was %i)' % response.status_int
return self.assertEqual(response.status_int, 200, error)
def assertNotFound(self, response):
"""Asserts that a response from the test web server returns a 404 status code."""
error = 'Response was found (status code was %i)' % response.status_int
return self.assertEqual(response.status_int, 404, error)
def assertForbidden(self, response):
"""Asserts that a response from the test web server returns a 403 status code."""
error = 'Response was allowed (status code was %i)' % response.status_int
return self.assertEqual(response.status_int, 403, error)
def assertUnauthorized(self, response):
"""Asserts that a response from the test web server returns a 401 status code."""
error = 'Response was allowed (status code was %i)' % response.status_int
return self.assertEqual(response.status_int, 401, error)
def get(self, url, *args, **kwargs):
"""Performs GET request to your application"""
return self.app.get(url, *args, **kwargs)
def head(self, *args, **kwargs):
"""Performs HEAD request to your application"""
return self.app.head(*args, **kwargs)
def post(self, url, data, *args, **kwargs):
"""Performs POST request to your application"""
data = self.url_encode(data)
return self.app.post(url, data, *args, **kwargs)
def post_json(self, url, data, *args, **kwargs):
"""Performs POST request to your application and expects JSON"""
data = self.url_encode(data)
res = self.app.post(url, data, *args, **kwargs)
self.assertOK(res)
return json.loads(res.normal_body)
def get_json(self, url, *args, **kwargs):
"""Performs GET request to your application and expects JSON"""
res = self.app.get(url, *args, **kwargs)
self.assertOK(res)
return json.loads(res.normal_body)
def delete(self, *args, **kwargs):
"""Performs DELETE request to your application"""
return self.app.delete(*args, **kwargs)
def put(self, *args, **kwargs):
"""Performs PUT request to your application"""
return self.app.put(*args, **kwargs)
def url_encode(self, data):
"""Encode data in URL friendly way"""
if isinstance(data, dict):
items = []
for k, v in data.copy().items():
if isinstance(v, (list, tuple)):
for item in v:
items.append('%s=%s' % (k, item))
else:
items.append('%s=%s' % (k, v))
data = '&'.join(items)
return data
def get_cookie(self, cookie_name):
"""Get cookie from your application by name"""
return self.app.cookies.get(cookie_name)
def set_cookie(self, cookie_name, cookie_value):
"""Set cookie in your application"""
self.app.cookies[cookie_name] = cookie_value
# Task queue testing helpers
def assertTasksInQueue(self, n=None, url=None, name=None, queue_names=None):
"""Assert number of tasks in queue is not 0 or equal to n"""
tasks = self.get_tasks(url=url, name=name, queue_names=queue_names)
if n is None:
self.assertNotEqual(0, len(tasks))
else:
self.assertEqual(n, len(tasks))
def clear_task_queue(self):
"""Clear all task queues"""
stub = self.get_task_queue_stub()
for name in self.get_task_queue_names():
stub.FlushQueue(name)
def is_deferred_task(self, task):
return task.get("url") == "/_ah/queue/deferred"
def get_tasks(self, url=None, name=None, queue_names=None):
"""Get tasks
Arguments:
url - get task by URL
name - get task by name
queue_names - names of the queues to get tasks from
If none of the arguments is provided all tasks from all queues
will be returned.
Returns: array of tasks
"""
tasks = []
stub = self.get_task_queue_stub()
for queue_name in queue_names or self.get_task_queue_names():
tasks.extend(stub.GetTasks(queue_name))
if url is not None:
tasks = [t for t in tasks if t['url'] == url]
if name is not None:
tasks = [t for t in tasks if t['name'] == name]
for task in tasks:
params = {}
decoded_body = base64.b64decode(task['body'])
if not self.is_deferred_task(task) and decoded_body:
# urlparse.parse_qs doesn't seem to be in Python 2.5...
params = dict([item.split('=', 2)
for item in decoded_body.split('&')])
task.update({
'decoded_body': decoded_body,
'params': params,
})
if task.get('eta'):
task['eta_datetime'] = datetime.datetime.strptime(
task['eta'], self.TASK_ETA_FORMAT)
task['eta_date'] = task['eta_datetime'].date()
task['eta_time'] = task['eta_datetime'].time()
else:
task.update({
'eta_datetime': None,
'eta_date': None,
'eta_time': None,
})
return tasks
def get_task_queues(self, queue_name=None):
"""Get task queue names
If queue_name is provided only named queue is returned.
If there are no queues or queue_name is not found None is returned
Returns: task queue or None
"""
queues = self.get_task_queue_stub().GetQueues()
if queue_name is None:
return queues
else:
found = None
for queue in queues:
if queue['name'] == queue_name:
found = queue
break
return found
def get_task_queue_names(self):
"""Get all task names from all queues
Returns: array of task queue names
"""
return [q['name'] for q in self.get_task_queues()]
def execute_task(self, task, application=None):
"""Execute task and remove it from the queue"""
logging.debug("-------------Excecuting task: %s (%s)-----------------" %
(task.get("name"), task.get("url")))
save_app = (None, None)
if application is not None:
save_app = self.save_application()
self.set_application(application)
restore_app = True
else:
restore_app = False
if self.is_deferred_task(task):
(func, args, kwargs) = pickle.loads(task['decoded_body'])
func(*args, **kwargs)
else:
response = self.post(task['url'], task['params'])
self.assertOK(response)
stub = self.get_task_queue_stub()
stub.DeleteTask(task['queue_name'], task['name'])
if restore_app:
self.restore_application(save_app)
def execute_tasks(self, application=None):
"""Executes all currently queued tasks, and also remove them from the queue.
The tasks are executed against the provided web application.
Returns: Number of tasks that have been executed
"""
# Get all of the tasks, and then clear them.
tasks = self.get_tasks()
self.clear_task_queue()
# Run each of the tasks, checking that they succeeded.
for task in tasks:
self.execute_task(task, application)
return len(tasks)
def execute_tasks_until_empty(self, application=None):
"""Execute all tasks in the queue
        If any of the tasks already in the queue create more tasks, this
        method will execute those as well, until there are no more tasks
        to execute.
Returns: Number of tasks that have been executed
"""
total_count = 0
while True:
exec_count = self.execute_tasks(application)
logging.debug("executed %d tasks" % exec_count)
if exec_count > 0:
total_count += exec_count
else:
break
logging.debug(
"----------------Executed %d tasks (recursively)----------------" % total_count)
return total_count
# Other helper methods
def load_json_fixture(self, fixture_name):
"""Load JSON fixture and return Python structure"""
fixture = open('fixtures/%s.json' % fixture_name, 'r')
return json.loads(fixture.read())
def check_if_api_error(self, response):
"""Helper to test APIs
NOTE: You have to customize this method to match your API errors.
        It expects the API to return JSON with the following structure:
{
"status_code": 200,
"error": "The error message"
}
"""
self.assertTrue(response.status_int == 400 or response.status_int ==
401, 'API status code should be 400 or 401.')
response = json.loads(response.body)
self.assertTrue(response['status_code'] == 400 or response[
'status_code'] == 401, 'API status code should be 400 or 401.')
self.assertTrue(
'error' in response, 'Response should have error property.')
def compare_lists(self, list1, list2):
"""Compare lists using sets
Returns:
returns 0 if the lists are the same
"""
return len(set(list1) ^ set(list2))
def removeNDBCache(self, key):
"""Helper method to remove key from context cache"""
# key.delete(use_datastore=False)
ndb.get_context()._clear_memcache((key,)).get_result()
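# --- Illustrative subclass (not part of this module); `myapp.routes.app` is a
# hypothetical webapp2.WSGIApplication, so the sketch stays commented ---
# class MyHandlerTest(BaseTestCase):
#     def setUp(self):
#         self.setup_testbed()
#         self.init_datastore_stub()
#         self.init_taskqueue_stub()
#         self.set_application(myapp.routes.app)
#     def tearDown(self):
#         self.clear_application()
#         self.teardown_testbed()
#     def test_home_ok(self):
#         self.assertOK(self.get('/'))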
| 33.506024 | 119 | 0.614947 | 2,065 | 16,686 | 4.839225 | 0.193705 | 0.017112 | 0.020414 | 0.012809 | 0.251776 | 0.189933 | 0.179826 | 0.126489 | 0.118483 | 0.112078 | 0 | 0.008586 | 0.288026 | 16,686 | 497 | 120 | 33.573441 | 0.832576 | 0.270826 | 0 | 0.130081 | 0 | 0 | 0.078532 | 0.0105 | 0 | 0 | 0 | 0 | 0.089431 | 1 | 0.195122 | false | 0 | 0.056911 | 0.004065 | 0.373984 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17d54dd95a0e93baa717ccacb284a65fb4e5ab71 | 3,558 | py | Python | src/csv2list_ind.py | brsynth/rpVisualizer | fe48cbeb15e2e4807b41e7d3495dec11da34f83c | [
"MIT"
] | 1 | 2021-10-13T22:51:14.000Z | 2021-10-13T22:51:14.000Z | src/csv2list_ind.py | brsynth/rpVisualizer | fe48cbeb15e2e4807b41e7d3495dec11da34f83c | [
"MIT"
] | null | null | null | src/csv2list_ind.py | brsynth/rpVisualizer | fe48cbeb15e2e4807b41e7d3495dec11da34f83c | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Thu Jul 25 09:35:28 2019
@author: anael
"""
import os
import sys
sys.path.insert(0, '/home/rpviz/')
from smile2picture import picture,picture2
from smarts2tab import smarts2tab
def csv2list2(csvfolder, path, datapath, datainf, selenzyme_table):
    """Extract the reactions, reactants and products of one pathway from the
    parsed CSV rows (datapath, datainf) and build the node attributes used
    for drawing."""
    name = str(path)
LR=[] #List of reactions
Lreact=[]
Lprod=[]
for i in range(len(datapath)):
        if datapath[i][0] == str(path):  # keep only rows of the requested pathway
            LR.append((datapath[i][1][:-2]) + "/" + name)  # problem with the trailing 0
            Lreact.append(list((datapath[i][3]).split(":")))
            Lprod.append(list((datapath[i][4]).split(":")))
# GET NODES INFORMATION
species_name={}
species_smiles={}
reac_smiles={}
dic_types={}
rule_score={}
rule_id={}
for r in LR:
dic_types[r]="reaction"
for i in datainf:
            if i[1] == r.split("/")[0]:  # problem with the trailing 0
reac_smiles[r]=i[2]
rule_score[r]=i[12]
rule_id[r]=i[10]
# species_name[reactant]=i[8]
# species_smiles[reactant]=i[5]
#
#
# species_name[product]=i[8]
# species_smiles[product]=i[5]
# To individualize each reactant
Listprod=[]
for j in range(len(Lprod)):
for i in range(len(Lprod[j])):
Listprod.append(Lprod[j][i])
Listreact=[]
for j in range(len(Lreact)):
for i in range(len(Lreact[j])):
if Lreact[j][i] not in Listprod : #if not an intermediate product
Lreact[j][i]+='_'+name
if Lreact[j][i] in Listreact: #element already exists:
c=0
for k in Listreact:
if Lreact[j][i] in k:
c+=1
Lreact[j][i]+='_'+str(c+1)
Listreact.append(Lreact[j][i])
# SET ATTRIBUTES
sp_names={}
sp_smiles={}
for reac in Listreact:
dic_types[reac]="reactant"
for key in species_name.keys():
if key in reac:
sp_names[reac]=species_name[key]
sp_smiles[reac]=species_smiles[key]
for prod in Listprod:
dic_types[prod]="product"
#Attribute target
roots={}
# for i in range(len(Lprod)):
# for j in Lprod[i]:
# if 'TARGET' in j:
# roots[j]="target"
# roots[LR[-1]]="target_reaction"
image=picture(sp_smiles)
image2=picture2(reac_smiles)[0]
image2big=picture2(reac_smiles)[1]
if selenzyme_table=='Y':
data_tab=smarts2tab(reac_smiles)
else :
data_tab={i:"" for i in reac_smiles}
# DELETE USELESS REACTION NODES
# LR2=[]
# Lreact2=[]
# Lprod2=[]
# for i in range(len(LR)) :
# if Lreact[i]!=[]:
# LR2.append(LR[i])
# Lreact2.append(Lreact[i])
# Lprod2.append(Lprod[i])
#Attributes not available with the csv
species_links=dfG_prime_o=dfG_prime_m=dfG_uncert=flux_value\
=fba_obj_name={}
RdfG_o=RdfG_m=RdfG_uncert=0
return(LR, Lreact, Lprod, name, sp_smiles, reac_smiles,image,image2,\
sp_names, species_links,roots,dic_types,image2big,data_tab,\
dfG_prime_o,dfG_prime_m, dfG_uncert, flux_value, rule_id,rule_score,\
fba_obj_name,RdfG_o,RdfG_m,RdfG_uncert)
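# --- Illustrative call (hypothetical inputs; datapath_rows / datainf_rows are
# rows parsed from the pathway CSVs, with the column layout assumed above) ---
# out = csv2list2('results', 1, datapath_rows, datainf_rows,
#                 selenzyme_table='N')
# LR, Lreact, Lprod, name = out[0], out[1], out[2], out[3]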
| 25.782609 | 78 | 0.53204 | 453 | 3,558 | 4.033113 | 0.284768 | 0.014778 | 0.022989 | 0.030104 | 0.151615 | 0.106185 | 0.07225 | 0.07225 | 0.07225 | 0.039409 | 0 | 0.024566 | 0.336425 | 3,558 | 137 | 79 | 25.970803 | 0.749259 | 0.255762 | 0 | 0 | 0 | 0 | 0.016098 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014493 | false | 0 | 0.057971 | 0 | 0.072464 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17d554efbdb7e44fecf8e0bd9039027a469cf97d | 3,981 | py | Python | tests/providers/test_factory.py | keelerm84/antidote | a30d488cd6d3421e50a2414bc9a20af052d3b821 | [
"MIT"
] | null | null | null | tests/providers/test_factory.py | keelerm84/antidote | a30d488cd6d3421e50a2414bc9a20af052d3b821 | [
"MIT"
] | null | null | null | tests/providers/test_factory.py | keelerm84/antidote | a30d488cd6d3421e50a2414bc9a20af052d3b821 | [
"MIT"
] | null | null | null | import pytest
from antidote.core import DependencyContainer
from antidote.exceptions import DuplicateDependencyError
from antidote.providers.factory import Build, FactoryProvider
class Service:
def __init__(self, *args, **kwargs):
self.args = args
self.kwargs = kwargs
class AnotherService(Service):
pass
@pytest.fixture()
def provider():
container = DependencyContainer()
provider = FactoryProvider(container=container)
container.register_provider(provider)
return provider
@pytest.mark.parametrize(
'wrapped,kwargs',
[
(1, {'test': 1}),
(Service, {'another': 'no'}),
(Service, {'not_hashable': {'hey': 'hey'}})
]
)
def test_build_eq_hash(wrapped, kwargs):
b = Build(wrapped, **kwargs)
# does not fail
hash(b)
for f in (lambda e: e, hash):
assert f(Build(wrapped, **kwargs)) == f(b)
assert repr(wrapped) in repr(b)
assert repr(kwargs) in repr(b)
@pytest.mark.parametrize(
'args,kwargs',
[
[(), {}],
[(1,), {}],
[(), {'test': 1}],
]
)
def test_invalid_build(args: tuple, kwargs: dict):
with pytest.raises(TypeError):
Build(*args, **kwargs)
def test_simple(provider: FactoryProvider):
provider.register_class(Service)
dependency = provider.provide(Service)
assert isinstance(dependency.instance, Service)
assert repr(Service) in repr(provider)
def test_singleton(provider: FactoryProvider):
provider.register_class(Service, singleton=True)
provider.register_class(AnotherService, singleton=False)
provide = provider.provide
assert provide(Service).singleton is True
assert provide(AnotherService).singleton is False
def test_takes_dependency(provider: FactoryProvider):
provider.register_factory(factory=lambda cls: cls(), dependency=Service,
takes_dependency=True)
assert isinstance(provider.provide(Service).instance, Service)
assert provider.provide(AnotherService) is None
def test_build(provider: FactoryProvider):
provider.register_class(Service)
s = provider.provide(Build(Service, val=object)).instance
assert isinstance(s, Service)
assert dict(val=object) == s.kwargs
provider.register_factory(AnotherService, factory=AnotherService,
takes_dependency=True)
s = provider.provide(Build(AnotherService, val=object)).instance
assert isinstance(s, AnotherService)
assert (AnotherService,) == s.args
assert dict(val=object) == s.kwargs
def test_non_singleton_factory(provider: FactoryProvider):
def factory_builder():
def factory(o=object()):
return o
return factory
provider.register_factory('factory', factory=factory_builder, singleton=False)
provider.register_providable_factory('service', factory_dependency='factory')
service = provider.provide('service').instance
assert provider.provide('service').instance is not service
def test_duplicate_error(provider: FactoryProvider):
provider.register_class(Service)
with pytest.raises(DuplicateDependencyError):
provider.register_class(Service)
with pytest.raises(DuplicateDependencyError):
provider.register_factory(factory=lambda: Service(), dependency=Service)
with pytest.raises(DuplicateDependencyError):
provider.register_providable_factory(factory_dependency='dummy',
dependency=Service)
@pytest.mark.parametrize(
'kwargs',
[dict(factory='test', dependency=Service),
dict(factory=object(), dependency=Service)]
)
def test_invalid_type(provider: FactoryProvider, kwargs):
with pytest.raises(TypeError):
provider.register_factory(**kwargs)
@pytest.mark.parametrize('dependency', ['test', Service, object()])
def test_unknown_dependency(provider: FactoryProvider, dependency):
assert provider.provide(dependency) is None
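# --- Standalone sketch of the same API outside pytest (mirrors test_simple
# above; uses only the names imported at the top of this file) ---
if __name__ == "__main__":
    container = DependencyContainer()
    provider = FactoryProvider(container=container)
    container.register_provider(provider)
    provider.register_class(Service)
    print(isinstance(provider.provide(Service).instance, Service))  # True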
| 28.035211 | 82 | 0.697563 | 418 | 3,981 | 6.533493 | 0.186603 | 0.076163 | 0.046137 | 0.071402 | 0.207616 | 0.187111 | 0.078726 | 0.055657 | 0.055657 | 0.055657 | 0 | 0.001242 | 0.190907 | 3,981 | 141 | 83 | 28.234043 | 0.846631 | 0.003266 | 0 | 0.163265 | 0 | 0 | 0.031266 | 0 | 0 | 0 | 0 | 0 | 0.163265 | 1 | 0.142857 | false | 0.010204 | 0.040816 | 0.010204 | 0.234694 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17d5b9b2556f10de25250e8fb5620d7f58277e0f | 8,384 | py | Python | nwb_conversion_tools/utils.py | wuffi/nwb-conversion-tools | 39cfb95b714155b26a17fdda9ed7d801eefd14ea | [
"BSD-3-Clause"
] | null | null | null | nwb_conversion_tools/utils.py | wuffi/nwb-conversion-tools | 39cfb95b714155b26a17fdda9ed7d801eefd14ea | [
"BSD-3-Clause"
] | null | null | null | nwb_conversion_tools/utils.py | wuffi/nwb-conversion-tools | 39cfb95b714155b26a17fdda9ed7d801eefd14ea | [
"BSD-3-Clause"
] | null | null | null | """Authors: Cody Baker, Ben Dichter and Luiz Tauffer."""
import inspect
from datetime import datetime
import numpy as np
import pynwb
def get_base_schema(tag=None):
base_schema = dict(
required=[],
properties={},
type='object',
additionalProperties=False
)
if tag is not None:
base_schema.update(tag=tag)
return base_schema
def get_root_schema():
root_schema = get_base_schema()
root_schema.update({
"$schema": "http://json-schema.org/draft-07/schema#",
})
return root_schema
def get_input_schema():
input_schema = get_root_schema()
input_schema.update({
"title": "Source data and conversion options",
"description": "Schema for the source data and conversion options",
"version": "0.1.0",
"type": "object",
})
return input_schema
def get_schema_from_method_signature(class_method):
input_schema = get_base_schema()
for param in inspect.signature(class_method.__init__).parameters.values():
if param.name != 'self':
arg_spec = {
param.name: dict(
type='string'
)
}
if param.default is param.empty:
input_schema['required'].append(param.name)
elif param.default is not None:
arg_spec[param.name].update(default=param.default)
input_schema['properties'].update(arg_spec)
input_schema['additionalProperties'] = param.kind == inspect.Parameter.VAR_KEYWORD
return input_schema
def get_schema_from_hdmf_class(hdmf_class):
"""Get metadata schema from hdmf class"""
schema = get_base_schema()
schema['tag'] = hdmf_class.__module__ + '.' + hdmf_class.__name__
pynwb_children_fields = [f['name'] for f in hdmf_class.get_fields_conf() if f.get('child', False)]
docval = hdmf_class.__init__.__docval__
for docval_arg in docval['args']:
schema_arg = {docval_arg['name']: dict(description=docval_arg['doc'])}
# type float
if docval_arg['type'] == 'float' or (isinstance(docval_arg['type'], tuple) and 'float' in docval_arg['type']):
schema_arg[docval_arg['name']].update(type='number')
# type string
elif docval_arg['type'] is str or (isinstance(docval_arg['type'], tuple) and str in docval_arg['type']):
schema_arg[docval_arg['name']].update(type='string')
# type datetime
elif docval_arg['type'] is datetime or (isinstance(docval_arg['type'], tuple) and datetime in docval_arg['type']):
schema_arg[docval_arg['name']].update(type='string', format='date-time')
# if TimeSeries, skip it
elif docval_arg['type'] is pynwb.base.TimeSeries or \
(isinstance(docval_arg['type'], tuple) and
pynwb.base.TimeSeries in docval_arg['type']):
continue
# if PlaneSegmentation, skip it
elif docval_arg['type'] is pynwb.ophys.PlaneSegmentation or \
(isinstance(docval_arg['type'], tuple) and
pynwb.ophys.PlaneSegmentation in docval_arg['type']):
continue
else:
if not isinstance(docval_arg['type'], tuple):
docval_arg_type = [docval_arg['type']]
else:
docval_arg_type = docval_arg['type']
# if another nwb object (or list of nwb objects)
if any([t.__module__.split('.')[0] == 'pynwb' for t in docval_arg_type if hasattr(t, '__module__')]):
is_nwb = [t.__module__.split('.')[0] == 'pynwb' for t in list(docval_arg_type) if
hasattr(t, '__module__')]
item = docval_arg_type[np.where(is_nwb)[0][0]]
# if it is child
if docval_arg['name'] in pynwb_children_fields:
items = [get_schema_from_hdmf_class(item)]
schema_arg[docval_arg['name']].update(
type='array', items=items, minItems=1, maxItems=1
)
# if it is link
else:
target = item.__module__ + '.' + item.__name__
schema_arg[docval_arg['name']].update(
type='string',
target=target
)
else:
continue
# Check for default arguments
if 'default' in docval_arg:
if docval_arg['default'] is not None:
schema_arg[docval_arg['name']].update(default=docval_arg['default'])
else:
schema['required'].append(docval_arg['name'])
schema['properties'].update(schema_arg)
if 'allow_extra' in docval:
schema['additionalProperties'] = docval['allow_extra']
return schema
def get_schema_for_NWBFile():
schema = get_base_schema()
schema['tag'] = 'pynwb.file.NWBFile'
schema['required'] = ["session_description", "identifier", "session_start_time"]
schema['properties'] = {
"session_description": {
"type": "string",
"format": "long",
"description": "a description of the session where this data was generated"
},
"identifier": {
"type": "string",
"description": "a unique text identifier for the file"
},
"session_start_time": {
"type": "string",
"description": "the start date and time of the recording session",
"format": "date-time"
},
"experimenter": {
"type": "array",
"items": {"type": "string", "title": "experimenter"},
"description": "name of person who performed experiment"
},
"experimentd_description": {
"type": "string",
"description": "general description of the experiment"
},
"sessiond_id": {
"type": "string",
"description": "lab-specific ID for the session"
},
"institution": {
"type": "string",
"description": "institution(s) where experiment is performed"
},
"notes": {
"type": "string",
"description": "Notes about the experiment."
},
"pharmacology": {
"type": "string",
"description": "Description of drugs used, including how and when they were administered. Anesthesia(s), "
"painkiller(s), etc., plus dosage, concentration, etc."
},
"protocol": {
"type": "string",
"description": "Experimental protocol, if applicable. E.g., include IACUC protocol"
},
"related_publications": {
"type": "string",
"description": "Publication information.PMID, DOI, URL, etc. If multiple, concatenate together and describe"
" which is which. such as PMID, DOI, URL, etc"
},
"slices": {
"type": "string",
"description": "Description of slices, including information about preparation thickness, orientation, "
"temperature and bath solution"
},
"source_script": {
"type": "string",
"description": "Script file used to create this NWB file."
},
"source_script_file_name": {
"type": "string",
"description": "Name of the source_script file"
},
"data_collection": {
"type": "string",
"description": "Notes about data collection and analysis."
},
"surgery": {
"type": "string",
"description": "Narrative description about surgery/surgeries, including date(s) and who performed surgery."
},
"virus": {
"type": "string",
"description": "Information about virus(es) used in experiments, including virus ID, source, date made, "
"injection location, volume, etc."
},
"stimulus_notes": {
"type": "string",
"description": "Notes about stimuli, such as how and where presented."
},
"lab": {
"type": "string",
"description": "lab where experiment was performed"
}
}
return schema
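# --- Illustrative usage (assumes pynwb exposes Subject, a concrete
# docval-decorated container; any other hdmf class would work the same way) ---
if __name__ == "__main__":
    from pynwb.file import Subject
    subject_schema = get_schema_from_hdmf_class(Subject)
    print(subject_schema['tag'], sorted(subject_schema['properties']))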
| 37.097345 | 122 | 0.558802 | 873 | 8,384 | 5.164948 | 0.239404 | 0.073852 | 0.066312 | 0.027944 | 0.257707 | 0.185851 | 0.139721 | 0.083167 | 0.033932 | 0.033932 | 0 | 0.001927 | 0.319179 | 8,384 | 225 | 123 | 37.262222 | 0.788017 | 0.033516 | 0 | 0.203209 | 0 | 0 | 0.306408 | 0.00569 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032086 | false | 0 | 0.02139 | 0 | 0.085562 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17dc8945b0479c7215573feed672bee00f1b9f85 | 501 | py | Python | ctflearn/the-credit-card-fraudster/recover.py | onealmond/hacking-lab | 631e615944add02db3c2afef47bf1de7171eb065 | [
"MIT"
] | 9 | 2021-04-20T15:28:36.000Z | 2022-03-08T19:53:48.000Z | ctflearn/the-credit-card-fraudster/recover.py | onealmond/hacking-lab | 631e615944add02db3c2afef47bf1de7171eb065 | [
"MIT"
] | null | null | null | ctflearn/the-credit-card-fraudster/recover.py | onealmond/hacking-lab | 631e615944add02db3c2afef47bf1de7171eb065 | [
"MIT"
] | 6 | 2021-06-24T03:25:21.000Z | 2022-02-20T21:44:52.000Z | #!/usr/bin/env python3
# The Luhn algorithm https://www.geeksforgeeks.org/luhn-algorithm/
s = "543210******1234"
def luhn(s):
ret = 0
for i in range(len(s)-2, -1, -2):
a = int(s[i]) * 2
if a > 9:
a = a//10 + a%10
ret += a
ret += sum([int(s[i]) for i in range(len(s)-1, -1, -2)])
return ret
# brute-force the six masked digits: the reconstructed number must pass the
# Luhn check and be divisible by 123457
for i in range(0, 1000000):
t = s[:6] + str(i).rjust(6, '0') + s[-4:]
if luhn(t) % 10 == 0 and \
int(t) % 123457 == 0:
print(t)
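# sanity check: 4532015112830366 is a Luhn-valid test number (checksum 50)
assert luhn("4532015112830366") % 10 == 0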
| 22.772727 | 66 | 0.467066 | 89 | 501 | 2.629213 | 0.449438 | 0.051282 | 0.076923 | 0.141026 | 0.128205 | 0.128205 | 0 | 0 | 0 | 0 | 0 | 0.131965 | 0.319361 | 501 | 21 | 67 | 23.857143 | 0.554252 | 0.171657 | 0 | 0 | 0 | 0 | 0.041162 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0 | 0 | 0.133333 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17dd93241085bcefddefbef71a49db8b7ac2bc00 | 9,258 | py | Python | py/legacyhalos/integrate.py | Christopher-Bradshaw/legacyhalos | 8a7644425dedc85849dc532d8252f280fa6d1f56 | [
"MIT"
] | 2 | 2020-06-06T16:11:09.000Z | 2020-12-18T01:27:55.000Z | py/legacyhalos/integrate.py | Christopher-Bradshaw/legacyhalos | 8a7644425dedc85849dc532d8252f280fa6d1f56 | [
"MIT"
] | 87 | 2017-08-06T22:07:58.000Z | 2021-07-09T12:26:55.000Z | py/legacyhalos/integrate.py | Christopher-Bradshaw/legacyhalos | 8a7644425dedc85849dc532d8252f280fa6d1f56 | [
"MIT"
] | 3 | 2018-06-28T19:04:16.000Z | 2021-03-02T22:38:37.000Z | """
legacyhalos.integrate
=====================
Code to integrate the surface brightness profiles, including extrapolation.
"""
import os, warnings, pdb
import multiprocessing
import numpy as np
from scipy.interpolate import interp1d
from astropy.table import Table, Column, vstack, hstack
import legacyhalos.io
import legacyhalos.misc
import legacyhalos.hsc
import legacyhalos.ellipse
def _init_phot(nrad_uniform=30, ngal=1, band=('g', 'r', 'z')):
"""Initialize the output photometry table.
"""
phot = Table()
[phot.add_column(Column(name='RMAX_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
[phot.add_column(Column(name='FLUX10_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
[phot.add_column(Column(name='FLUX30_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
[phot.add_column(Column(name='FLUX100_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
[phot.add_column(Column(name='FLUXRMAX_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
[phot.add_column(Column(name='FLUX10_IVAR_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
[phot.add_column(Column(name='FLUX30_IVAR_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
[phot.add_column(Column(name='FLUX100_IVAR_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
[phot.add_column(Column(name='FLUXRMAX_IVAR_{}'.format(bb.upper()), dtype='f4', length=ngal)) for bb in band]
phot.add_column(Column(name='RAD', dtype='f4', length=ngal, shape=(nrad_uniform,)))
phot.add_column(Column(name='RAD_AREA', dtype='f4', length=ngal, shape=(nrad_uniform,)))
[phot.add_column(Column(name='FLUXRAD_{}'.format(bb.upper()), dtype='f4', length=ngal, shape=(nrad_uniform,))) for bb in band]
[phot.add_column(Column(name='FLUXRAD_IVAR_{}'.format(bb.upper()), dtype='f4', length=ngal, shape=(nrad_uniform,))) for bb in band]
return phot
def _dointegrate(radius, sb, sberr, rmin=None, rmax=None, band='r'):
"""Do the actual profile integration.
"""
from scipy import integrate
if len(radius) < 10:
return 0.0, 0.0, 0.0 # need at least 10 points
# Evaluate the profile at r=rmin
if rmin is None:
rmin = 0.0
sberr_rmin = sberr[0]
else:
sberr_rmin = interp1d(radius, sberr, kind='linear', fill_value='extrapolate')(rmin)
sb_rmin = interp1d(radius, sb, kind='quadratic', fill_value='extrapolate')(rmin)
if rmax is None:
rmax = radius.max() # [kpc]
if rmax > radius.max() or rmax < radius.min():
return 0.0, 0.0, 0.0 # do not extrapolate outward
else:
# Interpolate the last point to the desired rmax
#if band == 'z':
# pdb.set_trace()
sb_rmax = interp1d(radius, sb, kind='linear')(rmax)
sberr_rmax = np.sqrt(interp1d(radius, sberr**2, kind='linear')(rmax))
keep = np.where((radius > rmin) * (radius < rmax))[0]
nkeep = len(keep)
_radius = np.insert(radius[keep], [0, nkeep], [rmin, rmax])
_sb = np.insert(sb[keep], [0, nkeep], [sb_rmin, sb_rmax])
_sberr = np.insert(sberr[keep], [0, nkeep], [sberr_rmin, sberr_rmax])
# Integrate!
flux = 2 * np.pi * integrate.simps(x=_radius, y=_radius*_sb) # [nanomaggies]
ferr = 2 * np.pi * integrate.simps(x=_radius, y=_radius*_sberr) # [nanomaggies]
if band == 'r':
area = 2 * np.pi * integrate.simps(x=_radius, y=_radius) # [kpc2]
else:
area = 0.0
if flux < 0 or ferr < 0 or np.isnan(flux) or np.isnan(ferr):
#print('Negative or infinite flux or variance in band {}'.format(band))
return 0.0, 0.0, 0.0
else:
return flux, 1/ferr**2, area
def _integrate_one(args):
"""Wrapper for the multiprocessing."""
return integrate_one(*args)
def integrate_one(galaxy, galaxydir, phot=None, minerr=0.01, snrmin=1,
nrad_uniform=30, count=1):
"""Integrate over various radial ranges.
"""
if phot is None:
phot = _init_phot(ngal=1, nrad_uniform=nrad_uniform)
phot = Table(phot)
print(count, galaxy, nrad_uniform)
ellipsefit = legacyhalos.io.read_ellipsefit(galaxy, galaxydir)
if not bool(ellipsefit) or ellipsefit['success'] == False:
return phot
sbprofile = legacyhalos.ellipse.ellipse_sbprofile(ellipsefit, minerr=minerr,
snrmin=snrmin, linear=True)
allband, refpixscale = ellipsefit['bands'], ellipsefit['refpixscale']
arcsec2kpc = legacyhalos.misc.arcsec2kpc(ellipsefit['redshift']) # [kpc/arcsec]
def _get_sbprofile(sbprofile, band, minerr=0.01, snrmin=1):
sb = sbprofile['mu_{}'.format(band)] / arcsec2kpc**2 # [nanomaggies/kpc2]
sberr = sbprofile['muerr_{}'.format(band)] / arcsec2kpc**2 # [nanomaggies/kpc2]
radius = sbprofile['radius_{}'.format(band)] * arcsec2kpc # [kpc]
return radius, sb, sberr
# First integrate to r=10, 30, 100, and max kpc.
min_r, max_r = [], []
for band in allband:
radius, sb, sberr = _get_sbprofile(sbprofile, band, minerr=minerr, snrmin=snrmin)
if len(radius) == 0:
continue
min_r.append(radius.min())
max_r.append(radius.max())
for rmax in (10, 30, 100, None):
obsflux, obsivar, _ = _dointegrate(radius, sb, sberr, rmax=rmax, band=band)
#ff = interp1d(radius, np.cumsum(sb), kind='linear')(rmax)
if rmax is not None:
fkey = 'FLUX{}_{}'.format(rmax, band.upper())
ikey = 'FLUX{}_IVAR_{}'.format(rmax, band.upper())
else:
fkey = 'FLUXRMAX_{}'.format(band.upper())
ikey = 'FLUXRMAX_IVAR_{}'.format(band.upper())
phot[fkey] = obsflux
phot[ikey] = obsivar
phot['RMAX_{}'.format(band.upper())] = radius.max()
# Now integrate over fixed apertures to get the differential flux.
if len(min_r) == 0:
return phot
min_r, max_r = np.min(min_r), np.max(max_r)
if False:
rad_uniform = 10**np.linspace(np.log10(min_r), np.log10(max_r), nrad_uniform+1) # log-spacing
else:
rad_uniform = np.linspace(min_r**0.25, max_r**0.25, nrad_uniform+1)**4 # r^1/4 spacing
rmin_uniform, rmax_uniform = rad_uniform[:-1], rad_uniform[1:]
phot['RAD'][:] = (rmax_uniform - rmin_uniform) / 2 + rmin_uniform
for band in allband:
radius, sb, sberr = _get_sbprofile(sbprofile, band, minerr=minerr, snrmin=snrmin)
for ii, (rmin, rmax) in enumerate(zip(rmin_uniform, rmax_uniform)):
#if band == 'r' and ii == 49:
# pdb.set_trace()
obsflux, obsivar, obsarea = _dointegrate(radius, sb, sberr, rmin=rmin, rmax=rmax, band=band)
#print(band, ii, rmin, rmax, 22.5-2.5*np.log10(obsflux), obsarea)
if band == 'r':
phot['RAD_AREA'][0][ii] = obsarea
phot['FLUXRAD_{}'.format(band.upper())][0][ii] = obsflux
phot['FLUXRAD_IVAR_{}'.format(band.upper())][0][ii] = obsivar
return phot
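# Hypothetical single-object call (galaxy name and directory are illustrative):
#   phot = integrate_one('NGC1234', '/data/legacyhalos/NGC1234', nrad_uniform=30)
# which returns a one-row astropy Table with FLUX10/30/100/RMAX columns per band.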
def legacyhalos_integrate(sample, galaxy=None, galaxydir=None, nproc=1,
minerr=0.01, snrmin=1, nrad_uniform=30,
columns=None, verbose=False, clobber=False):
"""Wrapper script to integrate the profiles for the full sample.
columns - columns to include in the output table
"""
ngal = len(sample)
phot = _init_phot(ngal=ngal, nrad_uniform=nrad_uniform)
if columns is None:
columns = ['MEM_MATCH_ID', 'RA', 'DEC', 'Z_LAMBDA', 'LAMBDA_CHISQ', 'ID_CENT',
'MW_TRANSMISSION_G', 'MW_TRANSMISSION_R', 'MW_TRANSMISSION_Z']
if galaxy is None and galaxydir is None:
galaxy, galaxydir = legacyhalos.io.get_galaxy_galaxydir(sample)
#if hsc:
# galaxy, galaxydir = legacyhalos.hsc.get_galaxy_galaxydir(sample)
# columns = ['ID_S16A', 'RA', 'DEC', 'Z_BEST']
#else:
# columns = ['MEM_MATCH_ID', 'RA', 'DEC', 'Z_LAMBDA', 'LAMBDA_CHISQ', 'ID_CENT',
# 'MW_TRANSMISSION_G', 'MW_TRANSMISSION_R', 'MW_TRANSMISSION_Z']
integratedfile = legacyhalos.io.get_integrated_filename()
if os.path.exists(integratedfile) and clobber is False:
print('Output file {} exists; use --clobber.'.format(integratedfile))
return []
galaxy, galaxydir = np.atleast_1d(galaxy), np.atleast_1d(galaxydir)
args = list()
for ii in range(ngal):
args.append((galaxy[ii], galaxydir[ii], phot[ii], minerr, snrmin, nrad_uniform, ii))
# Divide the sample by cores.
if nproc > 1:
pool = multiprocessing.Pool(nproc)
out = pool.map(_integrate_one, args)
else:
out = list()
for _args in args:
out.append(_integrate_one(_args))
results = vstack(out)
out = hstack((sample[columns], results))
if verbose:
print('Writing {}'.format(integratedfile))
out.write(integratedfile, overwrite=True)
return out
| 39.905172 | 135 | 0.61536 | 1,222 | 9,258 | 4.52455 | 0.181669 | 0.006149 | 0.030566 | 0.044674 | 0.303491 | 0.275276 | 0.258636 | 0.252125 | 0.240912 | 0.218846 | 0 | 0.022197 | 0.236012 | 9,258 | 231 | 136 | 40.077922 | 0.759508 | 0.1469 | 0 | 0.138889 | 0 | 0 | 0.065184 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.069444 | 0 | 0.194444 | 0.020833 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17de608e3382696ff0757cf2047b4123e91fcfb4 | 8,251 | py | Python | examples/topcam.py | k-space-ee/zoidberg | 46eaef75db7caac95a6c0089f04c720fd1936e5b | [
"MIT"
] | 2 | 2018-07-19T11:48:53.000Z | 2020-02-20T11:42:30.000Z | examples/topcam.py | k-space-ee/zoidberg | 46eaef75db7caac95a6c0089f04c720fd1936e5b | [
"MIT"
] | null | null | null | examples/topcam.py | k-space-ee/zoidberg | 46eaef75db7caac95a6c0089f04c720fd1936e5b | [
"MIT"
] | 2 | 2018-09-15T10:11:46.000Z | 2020-02-20T11:42:51.000Z | #!/usr/bin/env python3
from v4l2 import *
import logging
import fcntl
import mmap
import os
import select
import time
import cv2
import numpy
logger = logging.getLogger("grabber")
class Grabber(object):
def __init__(self, device, fps=30, exposure=None, gain=None, saturation=None, name=None, vflip=False, hflip=False):
logger.info("Starting grabber for:", device)
self.path = device if device.startswith("/") else os.path.join("/dev/v4l/by-path", device)
self.fps = fps
self.exposure = exposure
self.gain = gain
self.saturation = saturation
self.vflip = vflip
self.hflip = hflip
self.handle = None
def open(self):
self.handle = open(self.path, 'rb+', buffering=0)
cp = v4l2_capability()
fcntl.ioctl(self.handle, VIDIOC_QUERYCAP, cp)
fmt = v4l2_format()
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE
fcntl.ioctl(self.handle, VIDIOC_G_FMT, fmt) # get current settings
print("width:", fmt.fmt.pix.width, "height", fmt.fmt.pix.height)
print("pxfmt:", "V4L2_PIX_FMT_YUYV" if fmt.fmt.pix.pixelformat == V4L2_PIX_FMT_YUYV else fmt.fmt.pix.pixelformat)
print("bytesperline:", fmt.fmt.pix.bytesperline)
print("sizeimage:", fmt.fmt.pix.sizeimage)
fcntl.ioctl(self.handle, VIDIOC_S_FMT, fmt) # set whatever default settings we got before
parm = v4l2_streamparm()
parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE
parm.parm.capture.capability = V4L2_CAP_TIMEPERFRAME
fcntl.ioctl(self.handle, VIDIOC_G_PARM, parm) # get current camera settings
# Set a provisional frame rate of 1/5 s per frame, i.e. 5 fps
# (overridden below when self.fps is given)
parm.parm.capture.timeperframe.numerator = 1
parm.parm.capture.timeperframe.denominator = 5
print("parm.capture.timeperframe: 1/5")
fcntl.ioctl(self.handle, VIDIOC_S_PARM, parm) # change camera capture settings
logger.info("Disabling auto white balance for %s", self.path)
ctrl = v4l2_control()
ctrl.id = V4L2_CID_AUTO_WHITE_BALANCE
ctrl.value = 0
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
if self.saturation is not None:
logger.info("Setting saturation for %s to %d", self.path, self.saturation)
ctrl = v4l2_control()
ctrl.id = V4L2_CID_SATURATION
ctrl.value = self.saturation
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
if self.exposure is not None:
logger.info("Setting exposure for %s to %d", self.path, self.exposure)
# Disable auto exposure
ctrl = v4l2_control()
ctrl.id = V4L2_CID_EXPOSURE_AUTO
ctrl.value = V4L2_EXPOSURE_MANUAL
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
# Set exposure manually
ctrl = v4l2_control()
ctrl.id = V4L2_CID_EXPOSURE
ctrl.value = self.exposure
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
else:
# Enable auto exposure
logger.info("Setting auto exposure for %s", self.path)
ctrl = v4l2_control()
ctrl.id = V4L2_CID_EXPOSURE_AUTO
ctrl.value = V4L2_EXPOSURE_AUTO
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
# Flip camera horizontally
ctrl = v4l2_control()
ctrl.id = V4L2_CID_HFLIP
ctrl.value = self.hflip
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
# Flip camera vertically
ctrl = v4l2_control()
ctrl.id = V4L2_CID_VFLIP
ctrl.value = self.vflip
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
if self.gain is not None:
# Disable autogain
logger.info("Setting gain for %s to %d", self.path, self.gain)
ctrl = v4l2_control()
ctrl.id = V4L2_CID_AUTOGAIN
ctrl.value = 0
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
# Set gain manually
ctrl = v4l2_control()
ctrl.id = V4L2_CID_GAIN
ctrl.value = self.gain
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
else:
# Enable autogain
logger.info("Setting autogain for %s", self.path)
ctrl = v4l2_control()
ctrl.id = V4L2_CID_AUTOGAIN
ctrl.value = 1
fcntl.ioctl(self.handle, VIDIOC_S_CTRL, ctrl)
if self.fps is not None:
# Set framerate
parm = v4l2_streamparm()
parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE
parm.parm.capture.capability = V4L2_CAP_TIMEPERFRAME
fcntl.ioctl(self.handle, VIDIOC_G_PARM, parm) # get current camera settings
parm.parm.capture.timeperframe.numerator = 1
parm.parm.capture.timeperframe.denominator = self.fps
fcntl.ioctl(self.handle, VIDIOC_S_PARM, parm) # change camera capture settings
req = v4l2_requestbuffers()
req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE
req.memory = V4L2_MEMORY_MMAP
req.count = 2 # nr of buffer frames
fcntl.ioctl(self.handle, VIDIOC_REQBUFS, req) # tell the driver that we want some buffers
self.buffers = []
for ind in range(req.count):
buf = v4l2_buffer()
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE
buf.memory = V4L2_MEMORY_MMAP
buf.index = ind
fcntl.ioctl(self.handle, VIDIOC_QUERYBUF, buf)
mm = mmap.mmap(self.handle.fileno(), buf.length, mmap.MAP_SHARED, mmap.PROT_READ | mmap.PROT_WRITE, offset=buf.m.offset)
self.buffers.append(mm)
fcntl.ioctl(self.handle, VIDIOC_QBUF, buf)
buf_type = v4l2_buf_type(V4L2_BUF_TYPE_VIDEO_CAPTURE)
fcntl.ioctl(self.handle, VIDIOC_STREAMON, buf_type)
t0 = time.time()
max_t = 1
ready_to_read, ready_to_write, in_error = ([], [], [])
while len(ready_to_read) == 0 and time.time() - t0 < max_t:
ready_to_read, ready_to_write, in_error = select.select([self.handle], [], [], max_t)
def pop(self):
buf = v4l2_buffer()
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE
buf.memory = V4L2_MEMORY_MMAP
fcntl.ioctl(self.handle, VIDIOC_DQBUF, buf)
mm = self.buffers[buf.index]
uv = numpy.asarray(mm, numpy.uint8)[1::2].reshape((480, 320, 2))
uv = numpy.repeat(uv, 2, axis=1) # kills perf but fixes aspect ratio
#blurred_uv = cv2.blur(uv, (4,4)) # kills perf but smooths the picture
blurred_uv = uv
mask = cv2.inRange(blurred_uv, (60, 160), (90, 255)) ## FILTER THE COLORS!!
#mask = cv2.dilate(mask, None, iterations=2) # kills perf, removes sparkling
frame = numpy.asarray(mm, numpy.uint8).reshape((480, 640, 2))
frame = cv2.cvtColor(frame, cv2.COLOR_YUV2BGR_YUYV)
fcntl.ioctl(self.handle, VIDIOC_QBUF, buf) # requeue the buffer
return frame
def close(self):
buf_type = v4l2_buf_type(V4L2_BUF_TYPE_VIDEO_CAPTURE)  # the streamoff ioctl needs the same buffer type used in open()
fcntl.ioctl(self.handle, VIDIOC_STREAMOFF, buf_type)
self.handle.close()
self.handle = None
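# Minimal single-camera usage sketch (device path and settings are illustrative):
#   grabber = Grabber("/dev/video0", fps=30, exposure=100)
#   grabber.open()
#   frame = grabber.pop()             # BGR image, shape (480, 640, 3)
#   cv2.imwrite("frame.png", frame)
#   grabber.close()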
class QuadGrabber(object):
def __init__(self, a="/dev/video0", b="/dev/video1", c="/dev/video2", d="/dev/video3"):
self.grabbers = Grabber(c), Grabber(b), Grabber(d), Grabber(a)
def open(self):
for grabber in self.grabbers:
grabber.open()
def pop(self):
return numpy.vstack([
numpy.hstack([self.grabbers[0].pop(), self.grabbers[1].pop()]),
numpy.hstack([self.grabbers[2].pop(), self.grabbers[3].pop()])
])
def close(self):
for grabber in self.grabbers:
grabber.close()
from flask import Flask, Response
app = Flask(__name__)
@app.route('/')
def hello_world():
grabber = QuadGrabber()
grabber.open()
def generator():
while True:
frame = grabber.pop()  # grab a fresh frame on every iteration
ret, jpeg = cv2.imencode('.jpg', frame, (cv2.IMWRITE_JPEG_QUALITY, 50))
buf = jpeg.tobytes()
yield b'--frame\r\nContent-Type: image/jpeg\r\n\r\n'
yield buf
yield b'\r\n\r\n'
return Response(generator(), mimetype='multipart/x-mixed-replace; boundary=frame')
app.run(debug=True, host="0.0.0.0")
| 36.834821 | 132 | 0.616531 | 1,077 | 8,251 | 4.563603 | 0.225627 | 0.061038 | 0.068362 | 0.09766 | 0.456765 | 0.418311 | 0.396541 | 0.339573 | 0.311089 | 0.302747 | 0 | 0.028419 | 0.274997 | 8,251 | 223 | 133 | 37 | 0.793213 | 0.083747 | 0 | 0.35503 | 0 | 0 | 0.061065 | 0.010089 | 0 | 0 | 0 | 0 | 0 | 1 | 0.059172 | false | 0 | 0.053254 | 0.005917 | 0.142012 | 0.029586 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17df80bfe8ac7de1cd53a259c5a56afbdd793b23 | 8,430 | py | Python | python/vineyard/deploy/local.py | linlih/v6d | b53cb648cd797d583ab28c88c2e3b6b45f6acf4c | [
"Apache-2.0",
"CC0-1.0"
] | 417 | 2020-10-23T12:35:27.000Z | 2021-04-15T09:37:00.000Z | python/vineyard/deploy/local.py | linlih/v6d | b53cb648cd797d583ab28c88c2e3b6b45f6acf4c | [
"Apache-2.0",
"CC0-1.0"
] | 160 | 2020-10-27T16:27:12.000Z | 2021-04-19T01:35:29.000Z | python/vineyard/deploy/local.py | linlih/v6d | b53cb648cd797d583ab28c88c2e3b6b45f6acf4c | [
"Apache-2.0",
"CC0-1.0"
] | 28 | 2020-10-27T15:40:48.000Z | 2021-04-16T08:03:16.000Z | #! /usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright 2020-2021 Alibaba Group Holding Limited.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import atexit
import contextlib
import logging
import os
import shutil
import subprocess
import sys
import tempfile
import textwrap
import time
from .etcd import start_etcd
from .utils import find_vineyardd_path, check_socket
from .._C import connect
logger = logging.getLogger('vineyard')
@contextlib.contextmanager
def start_vineyardd(etcd_endpoints=None,
etcd_prefix=None,
vineyardd_path=None,
size='256M',
socket=None,
rpc=True,
rpc_socket_port=9600,
debug=False):
''' Launch a local vineyard cluster.
Parameters:
etcd_endpoints: str
Launching vineyard using specified etcd endpoints. If not specified, vineyard
will launch its own etcd instance.
etcd_prefix: str
Specify a common prefix to establish a local vineyard cluster.
vineyardd_path: str
Location of vineyard server program. If not specified, vineyard will use its
own bundled vineyardd binary.
size: int
The memory size limit for vineyard's shared memory. The memory size can be a plain
integer or as a fixed-point number using one of these suffixes:
.. code::
E, P, T, G, M, K.
You can also use the power-of-two equivalents: Ei, Pi, Ti, Gi, Mi, Ki.
For example, the following represent roughly the same value:
.. code::
128974848, 129k, 129M, 123Mi, 1G, 10Gi, ...
socket: str
The UNIX domain socket socket path that vineyard server will listen on.
Default is None.
When the socket parameter is None, a random path under temporary directory will be
generated and used.
rpc_socket_port: int
The port that vineyard will use to provide RPC service.
debug: bool
Whether print debug logs.
Returns:
(proc, socket):
Yields a tuple with the subprocess as the first element and the UNIX-domain
IPC socket as the second element.
'''
if not vineyardd_path:
vineyardd_path = find_vineyardd_path()
if not vineyardd_path:
raise RuntimeError('Unable to find the "vineyardd" executable')
if not socket:
socketfp = tempfile.NamedTemporaryFile(delete=True, prefix='vineyard-', suffix='.sock')
socket = socketfp.name
socketfp.close()
if etcd_endpoints is None:
etcd_ctx = start_etcd()
etcd_proc, etcd_endpoints = etcd_ctx.__enter__() # pylint: disable=no-member
else:
etcd_ctx = None
env = os.environ.copy()
if debug:
env['GLOG_v'] = '11'  # subprocess requires string-valued environment variables
# yapf: disable
command = [
vineyardd_path,
'--deployment', 'local',
'--size', str(size),
'--socket', socket,
'--rpc' if rpc else '--norpc',
'--rpc_socket_port', str(rpc_socket_port),
'--etcd_endpoint', etcd_endpoints
]
# yapf: enable
if etcd_prefix is not None:
command.extend(('--etcd_prefix', etcd_prefix))
proc = None
try:
proc = subprocess.Popen(command,
env=env,
stdout=subprocess.PIPE,
stderr=sys.__stderr__,
universal_newlines=True,
encoding='utf-8')
# wait for vineyardd ready: check the rpc port and ipc sockets
rc = proc.poll()
while rc is None:
if check_socket(socket) and ((not rpc) or check_socket(('0.0.0.0', rpc_socket_port))):
break
time.sleep(1)
rc = proc.poll()
if rc is not None:
err = textwrap.indent(proc.stdout.read(), ' ' * 4)
raise RuntimeError('vineyardd exited unexpectedly with code %d, error is:\n%s' % (rc, err))
logger.debug('vineyardd is ready.............')
yield proc, socket, etcd_endpoints
finally:
logger.debug('Local vineyardd being killed')
if proc is not None and proc.poll() is None:
proc.terminate()
proc.wait()
try:
shutil.rmtree(socket)
except:
pass
if etcd_ctx is not None:
etcd_ctx.__exit__(None, None, None) # pylint: disable=no-member
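# Usage sketch (size is illustrative); the daemon and any bundled etcd are
# torn down when the block exits:
#   with start_vineyardd(size='512M') as (proc, ipc_socket, etcd_endpoints):
#       client = connect(ipc_socket)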
__default_instance_contexts = {}
def init(num_instances=1, **kw):
'''
Launching a local vineyardd instance and get a client as easy as possible
In a clean environment, simply use:
.. code:: python
vineyard.init()
It will launch a local vineyardd and return a connected client to the vineyardd.
It will also setup the environment variable :code:`VINEYARD_IPC_SOCKET`.
For the case to establish a local vineyard cluster consists of multiple vineyardd
instances, using the :code:`num_instances` parameter:
.. code:: python
client1, client2, client3 = vineyard.init(num_instances=3)
In this case, three vineyardd instances will be launched.
The init method can only be called once in a process, to get the established
sockets or clients later in the process, use :code:`get_current_socket` or
:code:`get_current_client` respectively.
'''
assert __default_instance_contexts == {}
if 'VINEYARD_IPC_SOCKET' in os.environ:
raise ValueError("VINEYARD_IPC_SOCKET has already been set: %s, which "
"means there might be a vineyard daemon already running "
"locally" % os.environ['VINEYARD_IPC_SOCKET'])
etcd_endpoints = None
etcd_prefix = f'vineyard_init_at_{time.time()}'
for idx in range(num_instances):
ctx = start_vineyardd(etcd_endpoints=etcd_endpoints, etcd_prefix=etcd_prefix, rpc=False, **kw)
_, ipc_socket, etcd_endpoints = ctx.__enter__()
client = connect(ipc_socket)
__default_instance_contexts[ipc_socket] = (ctx, client)
if idx == 0:
os.environ['VINEYARD_IPC_SOCKET'] = ipc_socket
return get_current_client()
def get_current_client():
'''
Get current vineyard IPC clients established by :code:`vineyard.init()`.
Raises:
ValueError if vineyard is not initialized.
'''
if not __default_instance_contexts:
raise ValueError("Vineyard has not been initialized, use vineyard.init() to launch vineyard instances")
clients = [__default_instance_contexts[k][1] for k in __default_instance_contexts]
return clients if len(clients) > 1 else clients[0]
def get_current_socket():
'''
Get current vineyard UNIX-domain socket established by :code:`vineyard.init()`.
Raises:
ValueError if vineyard is not initialized.
'''
if not __default_instance_contexts:
raise ValueError("Vineyard has not been initialized, use vineyard.init() to launch vineyard instances")
sockets = list(__default_instance_contexts.keys())
return sockets if len(sockets) > 1 else sockets[0]
def shutdown():
'''
Shutdown the vineyardd instances launched by previous :code:`vineyard.init()`.
'''
global __default_instance_contexts
if __default_instance_contexts:
for ipc_socket in reversed(__default_instance_contexts):
__default_instance_contexts[ipc_socket][0].__exit__(None, None, None)
# NB: don't pop a pre-existing env var if we didn't launch the instances ourselves
os.environ.pop('VINEYARD_IPC_SOCKET', None)
__default_instance_contexts = {}
@atexit.register
def __shutdown_handler():
try:
shutdown()
except Exception: # pylint: disable=broad-except
pass
__all__ = ['start_vineyardd']
| 32.801556 | 111 | 0.636299 | 1,045 | 8,430 | 4.969378 | 0.321531 | 0.02253 | 0.057578 | 0.012132 | 0.125554 | 0.082804 | 0.070479 | 0.070479 | 0.070479 | 0.070479 | 0 | 0.010248 | 0.282325 | 8,430 | 256 | 112 | 32.929688 | 0.848099 | 0.400949 | 0 | 0.123967 | 0 | 0 | 0.145199 | 0.006304 | 0 | 0 | 0 | 0 | 0.008264 | 1 | 0.049587 | false | 0.016529 | 0.107438 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17e0b34e3c63a0466f597b4096fb50d5727d52ad | 8,101 | py | Python | reinvent_scoring/scoring/score_components/rocs/parallel_rocs_similarity.py | MolecularAI/reinvent-scoring | f7e052ceeffd29e17e1672c33607189873c82a45 | [
"MIT"
] | null | null | null | reinvent_scoring/scoring/score_components/rocs/parallel_rocs_similarity.py | MolecularAI/reinvent-scoring | f7e052ceeffd29e17e1672c33607189873c82a45 | [
"MIT"
] | 2 | 2021-11-01T23:19:42.000Z | 2021-11-22T23:41:39.000Z | reinvent_scoring/scoring/score_components/rocs/parallel_rocs_similarity.py | MolecularAI/reinvent-scoring | f7e052ceeffd29e17e1672c33607189873c82a45 | [
"MIT"
] | 2 | 2021-11-18T13:14:22.000Z | 2022-03-16T07:52:57.000Z | import os
import multiprocessing
from multiprocessing import Pool
from pathlib import Path
import numpy as np
from openeye import oechem, oeomega, oeshape, oequacpac
from reinvent_scoring.scoring.component_parameters import ComponentParameters
from reinvent_scoring.scoring.enums import ROCSSimilarityMeasuresEnum, ROCSInputFileTypesEnum, ROCSSpecificParametersEnum
from reinvent_scoring.scoring.score_components.rocs import oehelper, oefuncs
from reinvent_scoring.scoring.score_components.rocs.base_rocs_component import BaseROCSComponent
from reinvent_scoring.scoring.score_components.rocs.default_values import ROCS_DEFAULT_VALUES
class ParallelRocsSimilarity(BaseROCSComponent):
def __init__(self, parameters: ComponentParameters):
super().__init__(parameters)
avail_cpus = multiprocessing.cpu_count()
oechem.OEThrow.SetLevel(10000)
self.sim_measure_enum = ROCSSimilarityMeasuresEnum()
self.input_types_enum = ROCSInputFileTypesEnum()
self.param_names_enum = ROCSSpecificParametersEnum()
self.num_cpus = min(avail_cpus, self._specific_param("MAX_CPUS"))
self._set_omega_parameters()
self._set_rocs_parameters()
self.shape_weight = self._specific_param("SHAPE_WEIGHT")
self.color_weight = self._specific_param("COLOR_WEIGHT")
self.sim_func_name_set = oefuncs.get_similarity_name_set(parameters, self.param_names_enum,
self.sim_measure_enum)
def _set_omega_parameters(self):
self.max_confs = self._specific_param("MAX_CONFS")
self.erange = self._specific_param("EWINDOW")
self.enum_stereo = self._specific_param("ENUM_STEREO")
self.max_stereo = self._specific_param("MAX_STEREO")
if self.max_stereo == 0:
self.enum_stereo = False
self.setup_omega(self.erange, self.max_confs)
def _set_rocs_parameters(self):
self.file_path = self._specific_param("ROCS_INPUT")
self.file_type = self._specific_param("INPUT_TYPE")
self.cff_path = self._specific_param("CUSTOM_CFF")
self.save_overlays = self._specific_param("SAVE_ROCS_OVERLAYS")
if self.save_overlays:
self.dir_name = self._specific_param("ROCS_OVERLAYS_DIR")
self.overlay_prefix = self._specific_param("ROCS_OVERLAYS_PREFIX")
Path(self.dir_name).mkdir(parents=True, exist_ok=True)
self.protein_file = ""
self.ligand_file = ""
self.neg_vol = self._specific_param("NEGATIVE_VOLUME")
if self.neg_vol:
self.protein_file = self._specific_param("PROTEIN_NEG_VOL_FILE")
self.ligand_file = self._specific_param("LIGAND_NEG_VOL_FILE")
def _calculate_omega_score(self, smiles, step) -> np.array:
inputs = []
if len(smiles) == 0:
return np.array(())
self._prepare_overlay()
ind = str(step).zfill(4)
for smile in smiles:
input = {"smile": smile, "shape_weight": self.shape_weight, "color_weight": self.color_weight,
"sim_func_name_set": self.sim_func_name_set, "batch_id": ind,
"enum_stereo": self.enum_stereo, "max_stereo": self.max_stereo, "save_overlays": self.save_overlays,
"neg_vol_file": self.protein_file, "neg_vol_lig": self.ligand_file
}
inputs.append(input)
with Pool(processes=min(self.num_cpus, len(inputs))) as pool:
results = pool.map(self._unfold, inputs)
scores = []
if self.save_overlays:
overlay_filename = self.overlay_prefix + ind + ".sdf"
overlay_file_path = os.path.join(self.dir_name, overlay_filename)
outfs = oechem.oemolostream(overlay_file_path)
for result in results:
score, outmol = result
scores.append(score)
if self.save_overlays:
oechem.OEWriteMolecule(outfs, outmol)
return np.array(scores)
def _prepare_overlay(self):
overlay_function_types = {
self.input_types_enum.SHAPE_QUERY: self.setup_reference_molecule_with_shape_query,
self.input_types_enum.SDF_QUERY: self.setup_reference_molecule
}
overlay_function = overlay_function_types.get(self.file_type)
overlay_function(self.file_path, self.cff_path)
def _unfold(self, args):
return self.parallel_scoring(**args)
def _specific_param(self, key_enum):
key = self.param_names_enum.__getattribute__(key_enum)
default = ROCS_DEFAULT_VALUES[key_enum]
ret = self.parameters.specific_parameters.get(key, default)
if ret is not None:
return ret
raise KeyError(f"specific parameter \'{key}\' was not set")
@classmethod
def setup_reference_molecule_with_shape_query(cls, shape_query, cff_path):
cls.prep = oeshape.OEOverlapPrep()
qry = oeshape.OEShapeQuery()
oefuncs.init_cff(cls.prep, cff_path)
cls.rocs_overlay = oeshape.OEOverlay()
if oeshape.OEReadShapeQuery(shape_query, qry):
cls.rocs_overlay.SetupRef(qry)
else:
raise FileNotFoundError("A ROCS shape query file was not found")
@classmethod
def setup_reference_molecule(cls, file_path, cff_path):
cls.prep = oeshape.OEOverlapPrep()
input_stream = oechem.oemolistream()
input_stream.SetFormat(oechem.OEFormat_SDF)
input_stream.SetConfTest(oechem.OEAbsoluteConfTest(compTitles=False))
refmol = oechem.OEMol()
if input_stream.open(file_path):
oechem.OEReadMolecule(input_stream, refmol)
else:
raise FileNotFoundError("A ROCS reference sdf file was not found")
oefuncs.init_cff(cls.prep, cff_path)
cls.prep.Prep(refmol)
cls.rocs_overlay = oeshape.OEMultiRefOverlay()
cls.rocs_overlay.SetupRef(refmol)
@classmethod
def setup_omega(cls, erange, max_confs):
omegaOpts = oeomega.OEOmegaOptions()
omegaOpts.SetStrictStereo(False)
omegaOpts.SetEnergyWindow(erange)
omegaOpts.SetMaxConfs(max_confs)
cls.omega = oeomega.OEOmega(omegaOpts)
return cls.omega
@classmethod
def parallel_scoring(cls, smile, shape_weight, color_weight, sim_func_name_set, batch_id, enum_stereo=False,
max_stereo=0, save_overlays=False, neg_vol_file="", neg_vol_lig=""):
predicate = getattr(oeshape, sim_func_name_set.predicate)()
imol = oechem.OEMol()
outmol = oechem.OEMol()
best_score = 0.0
if oechem.OESmilesToMol(imol, smile):
oequacpac.OEGetReasonableProtomer(imol)
omega_success, imol = oehelper.get_omega_confs(imol, cls.omega, enum_stereo, max_stereo)
if omega_success:
cls.prep.Prep(imol)
score = oeshape.OEBestOverlayScore()
cls.rocs_overlay.BestOverlay(score, imol, predicate)
outmol = oechem.OEGraphMol(imol.GetConf(oechem.OEHasConfIdx(score.GetFitConfIdx())))
best_score, best_score_shape, best_score_color, neg_score = oehelper.get_score(outmol, score,
sim_func_name_set,
shape_weight,
color_weight,
neg_vol_file,
neg_vol_lig)
if save_overlays:
oeshape.OERemoveColorAtoms(outmol)
oehelper.prep_sdf_file(outmol, score, smile, batch_id, best_score_shape, best_score_color,
neg_score)
return best_score, outmol
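# Hedged sketch of the weighting implied by shape_weight/color_weight (the
# actual combination lives in oehelper.get_score, which is not shown here):
#   combined = (shape_weight * shape_sim + color_weight * color_sim) \
#              / (shape_weight + color_weight)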
| 47.934911 | 121 | 0.636465 | 889 | 8,101 | 5.470191 | 0.211474 | 0.045445 | 0.055933 | 0.017273 | 0.164713 | 0.107958 | 0.055316 | 0.027555 | 0 | 0 | 0 | 0.001891 | 0.281817 | 8,101 | 168 | 122 | 48.220238 | 0.833964 | 0 | 0 | 0.086667 | 0 | 0 | 0.054191 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.073333 | false | 0 | 0.073333 | 0.006667 | 0.193333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17e18ea3c92aa4447785c69712c4085be827c0a3 | 4,214 | py | Python | tensor2tensor/insights/graph.py | sivaramakrishna7/tensor2tensor | eb0118d3f459913133e3d68a96944480a928bff1 | [
"Apache-2.0"
] | 44 | 2018-11-07T18:52:33.000Z | 2019-07-06T12:48:18.000Z | tensor2tensor/insights/graph.py | sivaramakrishna7/tensor2tensor | eb0118d3f459913133e3d68a96944480a928bff1 | [
"Apache-2.0"
] | 63 | 2017-12-19T20:29:10.000Z | 2021-08-04T21:49:36.000Z | tensor2tensor/insights/graph.py | sivaramakrishna7/tensor2tensor | eb0118d3f459913133e3d68a96944480a928bff1 | [
"Apache-2.0"
] | 44 | 2018-11-09T21:04:52.000Z | 2019-06-24T07:40:28.000Z | # coding=utf-8
# Copyright 2018 The Tensor2Tensor Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Graph representation for building decoding graph visualizations."""
class Vertex(object):
"""Vertex stores in and out edge connections to other Vertex instances.
The Vertex class supports serialization to a JSON data format expected by the
client side representation. When serializing, it generates the following
fields:
in_edge_index: The list of directed edge indices into the Vertex.
out_edge_index: The list of directed edge indices from the Vertex.
"""
def __init__(self, idx):
"""Initialize the Vertex.
Args:
idx: The index of the vertex.
"""
self.idx = idx
self.in_edges = []
self.out_edges = []
def to_dict(self):
"""Returns a simplified dictionary representing the Vertex.
Returns:
A dictionary that can easily be serialized to JSON.
"""
return {
"in_edge_index": self.in_edges,
"out_edge_index": self.out_edges,
}
class Edge(object):
"""Edge stores edge details connecting two Vertex instances.
The Edge class supports serialization to a JSON data format expected by the
client side representation. When serializing, it generates the following
fields:
source_index: The source Vertex index for this Edge.
target_index: The target Vertex index for this Edge.
data: Arbitrary data for this Edge.
"""
def __init__(self, idx):
"""Initialize the Edge.
Args:
idx: The index of the Edge.
"""
self.idx = idx
self.source = -1
self.target = -1
self.data = {}
def to_dict(self):
"""Returns a simplified dictionary representing the Vertex.
Returns:
A dictionary that can easily be serialized to JSON.
"""
return {
"source_index": self.source,
"target_index": self.target,
"data": self.data,
}
def __str__(self):
return str(self.to_dict())
class Graph(object):
"""A directed graph that can easily be JSON serialized for visualization.
When serializing, it generates the following fields:
edge: The list of all serialized Edge instances.
node: The list of all serialized Vertex instances.
"""
def __init__(self):
self.vertices = []
self.edges = []
self.vertex_map = {}
def new_vertex(self):
"""Creates and returns a new vertex.
Returns:
A new Vertex instance with a unique index.
"""
vertex = Vertex(len(self.vertices))
self.vertices.append(vertex)
return vertex
def get_vertex(self, key):
"""Returns or Creates a Vertex mapped by key.
Args:
key: A string reference for a vertex. May refer to a new Vertex in which
case it will be created.
Returns:
The Vertex mapped to by key.
"""
if key in self.vertex_map:
return self.vertex_map[key]
vertex = self.new_vertex()
self.vertex_map[key] = vertex
return vertex
def add_edge(self, source, target):
"""Returns a new edge connecting source and target vertices.
Args:
source: The source Vertex.
target: The target Vertex.
Returns:
A new Edge linking source to target.
"""
edge = Edge(len(self.edges))
self.edges.append(edge)
source.out_edges.append(edge.idx)
target.in_edges.append(edge.idx)
edge.source = source.idx
edge.target = target.idx
return edge
def to_dict(self):
"""Returns a simplified dictionary representing the Graph.
Returns:
A dictionary that can easily be serialized to JSON.
"""
return {
"node": [v.to_dict() for v in self.vertices],
"edge": [e.to_dict() for e in self.edges]
}
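# Usage sketch: build a two-vertex graph and serialize it for the client.
#   g = Graph()
#   a, b = g.get_vertex("start"), g.get_vertex("end")
#   g.add_edge(a, b).data["label"] = "step"
#   json.dumps(g.to_dict())  # {"node": [...], "edge": [...]} (needs `import json`)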
| 27.012821 | 79 | 0.676554 | 587 | 4,214 | 4.775128 | 0.258944 | 0.031395 | 0.012843 | 0.021406 | 0.334285 | 0.287192 | 0.253657 | 0.237959 | 0.211559 | 0.211559 | 0 | 0.003742 | 0.238965 | 4,214 | 155 | 80 | 27.187097 | 0.870284 | 0.603227 | 0 | 0.230769 | 0 | 0 | 0.043964 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.192308 | false | 0 | 0 | 0.019231 | 0.403846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17e3400ace244a552493cbfa796324f0cc464500 | 445 | py | Python | vivisect/analysis/ms/msvc.py | rnui2k/vivisect | b7b00f2d03defef28b4b8c912e3a8016e956c5f7 | [
"ECL-2.0",
"Apache-2.0"
] | 716 | 2015-01-01T14:41:11.000Z | 2022-03-28T06:51:50.000Z | vivisect/analysis/ms/msvc.py | rnui2k/vivisect | b7b00f2d03defef28b4b8c912e3a8016e956c5f7 | [
"ECL-2.0",
"Apache-2.0"
] | 266 | 2015-01-01T15:07:27.000Z | 2022-03-30T15:19:26.000Z | vivisect/analysis/ms/msvc.py | rnui2k/vivisect | b7b00f2d03defef28b4b8c912e3a8016e956c5f7 | [
"ECL-2.0",
"Apache-2.0"
] | 159 | 2015-01-01T16:19:44.000Z | 2022-03-21T21:55:34.000Z |
"""
An emulation module to detect SEH setup and apply structs where possible.
"""
import vivisect.vamp.msvc as v_msvc
vs = v_msvc.VisualStudioVamp()
def analyzeFunction(vw, funcva):
offset, bytes = vw.getByteDef(funcva)
sig = vs.getSignature(bytes, offset)
if sig is not None:
fname = sig.split(".")[-1]
vw.makeName(funcva, "%s_%.8x" % (fname, funcva), filelocal=True)
vw.makeFunctionThunk(funcva, sig)
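# Usage sketch: vivisect calls analyzeFunction per discovered function; a
# signature hit like "msvcrt.memcpy" (illustrative) names the function
# "memcpy_<funcva as 8 hex digits>" and marks it as a thunk.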
| 23.421053 | 73 | 0.669663 | 59 | 445 | 5 | 0.711864 | 0.033898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005634 | 0.202247 | 445 | 18 | 74 | 24.722222 | 0.825352 | 0.164045 | 0 | 0 | 0 | 0 | 0.022039 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17e3dbff3ac2937b06f18b4395565d33ed3ab633 | 2,624 | py | Python | developing/node/libs/gpio/gpiodef.py | Pyro4Bot-RoboLab/PYRobot | 917ec23f2abf483a29f652cd2b43e1eaa49b82be | [
"MIT"
] | null | null | null | developing/node/libs/gpio/gpiodef.py | Pyro4Bot-RoboLab/PYRobot | 917ec23f2abf483a29f652cd2b43e1eaa49b82be | [
"MIT"
] | null | null | null | developing/node/libs/gpio/gpiodef.py | Pyro4Bot-RoboLab/PYRobot | 917ec23f2abf483a29f652cd2b43e1eaa49b82be | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
# lock().acquire()
# ____________developed by paco andres____________________
# _________collaboration with cristian vazquez____________
from node.libs.gpio.Platform import HARDWARE
if HARDWARE == "RASPBERRY_PI":
import RPi.GPIO as GPIO
# Modes
BCM = GPIO.BCM
BOARD = GPIO.BOARD
UNSET = -1
# _dir_mapping
OUT = GPIO.OUT
IN = GPIO.IN
HIGH = GPIO.HIGH
LOW = GPIO.LOW
# _edge_mapping
RISING = GPIO.RISING
FALLING = GPIO.FALLING
BOTH = GPIO.BOTH
# _pud_mapping
PUD_OFF = GPIO.PUD_OFF
PUD_DOWN = GPIO.PUD_DOWN
PUD_UP = GPIO.PUD_UP
modes = {"Unset": -1,
"BCM": 11,
"BOARD": 10
}
status = {0: "OUT",
1: "IN",
10: "PWM",
11: "EVENT",
40: "SERIAL",
41: "SPI",
42: "I2C",
43: "HARD_PWM",
-1: "UNKNOWN",
-2: "READ_ERROR",
-3: "IN USE BY PYRO4BOT OBJECT"
}
""" gpioport "is a dict that define 40 physical pins bus gpio and his correlation with BOARD,BCM and wiringPI
specifications pin/BOARD: [description,BCM,wiringPI]"""
gpioport = {1: ["3.3v", None, None], 2: ["5v", None, None],
3: ["SDA.1", 2, 8], 4: ["5v", None, None],
5: ["SCL.1", 3, 9], 6: ["0v", None, None],
7: ["GPIO.", 4, 7], 8: ["TxD", 14, 15],
9: ["0v", None, None], 10: ["RxD", 15, 16],
11: ["GPIO.", 17, 0], 12: ["GPIO.", 18, 1],
13: ["GPIO.", 27, 2], 14: ["0v", None, None],
15: ["GPIO.", 22, 3], 16: ["GPIO.", 23, 4],
17: ["3.3v", None, None], 18: ["GPIO.", 24, 5],
19: ["MOSI", 10, 12], 20: ["0v", None, None],
21: ["MISO", 9, 13], 22: ["GPIO.", 25, 6],
23: ["SCLK.", 11, 14], 24: ["CE0", 8, 10],
25: ["0v", None, None], 26: ["CE1", 7, 18],
27: ["SDA.0", 0, 30], 28: ["SCL.0", 1, 31],
29: ["GPIO.", 5, 21], 30: ["0v", None, None],
31: ["GPIO.", 6, 22], 32: ["GPIO.", 12, 26],
33: ["GPIO.", 13, 23], 34: ["0v", None, None],
35: ["GPIO.", 19, 24], 36: ["GPIO.", 16, 27],
37: ["GPIO.", 26, 25], 38: ["GPIO.", 20, 28],
39: ["0v", None, None], 40: ["GPIO.", 21, 29],
}
max_pwm = 20
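# Example lookup using the table above: physical pin 7 maps to
#   gpioport[7] -> ["GPIO.", 4, 7], i.e. BCM 4 / wiringPi 7.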
if HARDWARE == "BEAGLEBONE_BLACK":
pass
if HARDWARE == "MINNOWBOARD":
pass
if HARDWARE == "UNKNOWN":
pass
| 32 | 113 | 0.444741 | 328 | 2,624 | 3.344512 | 0.417683 | 0.087511 | 0.072926 | 0.020055 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12822 | 0.363948 | 2,624 | 81 | 114 | 32.395062 | 0.529059 | 0.083841 | 0 | 0.05 | 0 | 0 | 0.131472 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.05 | 0.033333 | 0 | 0.033333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17e8e7cf14e590f4cc40f2c234bd7beeebfa001e | 1,042 | py | Python | code/generate_metrics.py | waingram/npc-analysis | a955bb9cba27d2c5b83559e36c36cfd5672541f4 | [
"BSD-3-Clause"
] | 1 | 2021-07-16T17:08:24.000Z | 2021-07-16T17:08:24.000Z | code/generate_metrics.py | waingram/npc-analysis | a955bb9cba27d2c5b83559e36c36cfd5672541f4 | [
"BSD-3-Clause"
] | null | null | null | code/generate_metrics.py | waingram/npc-analysis | a955bb9cba27d2c5b83559e36c36cfd5672541f4 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python3
"""
Evaluate citation extraction.
"""
from sklearn.metrics import confusion_matrix, classification_report
__author__ = "William A. Ingram"
__version__ = "0.1.0"
__license__ = "BSD3"
def main():
""" """
filename = '../results/npc_analysis.txt'
f = open(filename, 'r')
labels = ['author',
'booktitle',
'date',
'editor',
'institution',
'journal',
'location',
'note',
'pages',
'publisher',
'title',
'volume']
y_true = []
y_pred = []
for line in f:
a = line.strip().split()
if len(a) == 0:
continue
y_pred.append(a[-2])
y_true.append(a[-1])
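# Each non-empty line is whitespace-separated with the predicted label as the
# second-to-last token and the gold label last, e.g. (illustrative):
#   Smith J. author author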
print('Confusion Matrix')
print(confusion_matrix(y_true, y_pred, labels=labels))
print('\nClassification Report')
print(classification_report(y_true, y_pred, labels=labels, ))
if __name__ == "__main__":
""" """
main()
| 20.038462 | 67 | 0.516315 | 104 | 1,042 | 4.855769 | 0.596154 | 0.039604 | 0.035644 | 0.059406 | 0.087129 | 0.087129 | 0 | 0 | 0 | 0 | 0 | 0.011561 | 0.335893 | 1,042 | 51 | 68 | 20.431373 | 0.718208 | 0.047985 | 0 | 0 | 0 | 0 | 0.186983 | 0.027893 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030303 | false | 0 | 0.030303 | 0 | 0.060606 | 0.121212 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17ed7d71591efd84b58b5c7512ccc5185afb565b | 14,403 | py | Python | mtp_api/apps/credit/tests/test_notices.py | uk-gov-mirror/ministryofjustice.money-to-prisoners-api | fdf74298284804779e95294cf418ce97e5ea8666 | [
"MIT"
] | null | null | null | mtp_api/apps/credit/tests/test_notices.py | uk-gov-mirror/ministryofjustice.money-to-prisoners-api | fdf74298284804779e95294cf418ce97e5ea8666 | [
"MIT"
] | null | null | null | mtp_api/apps/credit/tests/test_notices.py | uk-gov-mirror/ministryofjustice.money-to-prisoners-api | fdf74298284804779e95294cf418ce97e5ea8666 | [
"MIT"
] | null | null | null | import collections
import contextlib
import datetime
import functools
import itertools
import os
import unittest
from unittest import mock
from django.core import mail
from django.core.management import call_command
from faker import Faker
from credit.constants import LOG_ACTIONS as CREDIT_ACTIONS
from credit.models import Credit, Log as CreditLog
from credit.notices import Canvas
from credit.notices.prisoner_credits import PrisonerCreditNoticeBundle
from credit.tests.test_base import BaseCreditViewTestCase
from disbursement.constants import LOG_ACTIONS as DISBURSEMENT_ACTIONS
from disbursement.models import Disbursement, Log as DisbursementLog
from prison.models import Prison, PrisonerCreditNoticeEmail
fake = Faker(locale='en_GB')
sample_location = {
'description': 'LEI-A-2-002',
'levels': [
{'type': 'Wing', 'value': 'A'},
{'type': 'Landing', 'value': '2'},
{'type': 'Cell', 'value': '002'}
],
}
credit_cls = collections.namedtuple('Credit', ('amount', 'sender_name'))
disbursement_cls = collections.namedtuple('Disbursement', 'amount method recipient_first_name recipient_last_name')
class PrisonerCreditNoticeTestCase(unittest.TestCase):
image_per_template = 3
text_per_template = 6
text_per_update = 3
text_per_message = 2
def assertPageUpdates(self, show_page, draw_string, updates_per_page): # noqa: N802
self.assertEqual(show_page.call_count, len(updates_per_page))
self.assertEqual(draw_string.call_count, (
self.text_per_template * len(updates_per_page) +
self.text_per_update * functools.reduce(lambda updates, page: updates + len(page), updates_per_page, 0) +
self.text_per_message * sum(itertools.chain.from_iterable(updates_per_page))
))
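# Worked example: for updates_per_page == [[9], [2]] (two pages with 9 and 2
# messages in one update group each) the expected drawString count is
# 6*2 + 3*2 + 2*11 == 40, and showPage is asserted to run twice.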
@mock.patch.object(Canvas, 'drawImage')
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_one_prisoner_one_credit(self, canvas_save, canvas_show_page, canvas_draw_string, canvas_draw_image):
canvas_save.return_value = None
prisoners = [('JAMES HALLS', 'A1409AE', sample_location, [credit_cls(1000, 'Mrs. Halls')], [])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[1]])
self.assertEqual(canvas_draw_image.call_count, self.image_per_template)
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_one_prisoner_two_credits(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
prisoners = [('JAMES HALLS', 'A1409AE', sample_location, [credit_cls(1000, 'Mrs. Halls'),
credit_cls(2000, 'Mrs. Halls')], [])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[2]])
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_one_prisoner_two_disbursements(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
prisoners = [('JAMES HALLS', 'A1409AE', sample_location, [], [
disbursement_cls(2000, 'cheque', 'Rose', 'Johnson'),
disbursement_cls(3000, 'bank_transfer', 'Janet', 'Johnson')
])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[2]])
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_one_prisoner_two_different_updates(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
prisoners = [('JAMES HALLS', 'A1409AE', sample_location,
[credit_cls(1000, 'Mrs. Halls')],
[disbursement_cls(2000, 'cheque', 'Mary', 'Johnson')])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[1, 1]])
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_two_prisoners(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
prisoners = [('JAMES HALLS', 'A1409AE', sample_location, [credit_cls(1000, 'Mrs. Halls')], []),
('RICKIE RIPPIN', 'A1617FY', sample_location, [credit_cls(2500, 'JOHNSON & ASSOCIATES')], [])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[1], [1]])
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_one_prisoner_many_credits(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
prisoners = [('JAMES HALLS', 'A1409AE', sample_location, [credit_cls(1000, 'Mrs. Halls')] * 11, [])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[9], [2]])
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_one_prisoner_many_updates(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
prisoners = [('JAMES HALLS', 'A1409AE', sample_location,
[credit_cls(1000, 'Mrs. Halls')] * 11,
[disbursement_cls(2000, 'bank_transfer', 'Mary', 'Johnson')] * 11)]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[9], [2, 3], [8]])
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_long_text(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
prisoners = [('NKFUVMY PMNDINERGGPGL-UMR-X-YFMESG', 'A1234AA', sample_location, [
credit_cls(3035011, 'X' * 100)
], [])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[1]])
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_location_malformed(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
malformed_location = {
'description': 'LEIA2',
}
prisoners = [('JAMES HALLS', 'A1409AE', malformed_location, [credit_cls(1000, 'Mrs. Halls')], [])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[1]])
expected_string = 'Location: LEIA2'
drawn_strings = [call[0][2] for call in canvas_draw_string.call_args_list]
self.assertTrue(any(expected_string in drawn_string for drawn_string in drawn_strings))
@mock.patch.object(Canvas, 'drawString')
@mock.patch.object(Canvas, 'showPage')
@mock.patch.object(Canvas, 'save')
def test_location_complete(self, canvas_save, canvas_show_page, canvas_draw_string):
canvas_save.return_value = None
prisoners = [('JAMES HALLS', 'A1409AE', sample_location, [credit_cls(1000, 'Mrs. Halls')], [])]
bundle = PrisonerCreditNoticeBundle('INB', prisoners, datetime.date(2017, 6, 16))
bundle.render(None)
self.assertPageUpdates(canvas_show_page, canvas_draw_string, [[1]])
expected_string = 'Wing: A Landing: 2 Cell: 002'
drawn_strings = [call[0][2] for call in canvas_draw_string.call_args_list]
self.assertTrue(any(expected_string in drawn_string for drawn_string in drawn_strings))
class NoticesCommandTestCase(BaseCreditViewTestCase):
def assign_email_addresses(self):
for prison in Prison.objects.all():
PrisonerCreditNoticeEmail.objects.create(
prison=prison,
email='%s@mtp.local' % fake.user_name(),
)
class CreatePrisonerNoticesTestCase(NoticesCommandTestCase):
def setUp(self):
super().setUp()
self.assign_email_addresses()
DisbursementLog.objects.filter(action=DISBURSEMENT_ACTIONS.SENT).delete()
credited_logs = CreditLog.objects.filter(action=CREDIT_ACTIONS.CREDITED).order_by('-created')
self.latest_log = credited_logs.first()
credited_logs.exclude(pk=self.latest_log.pk).delete()
self.latest_credit = self.latest_log.credit
# leave only 1 credit as credited and no sent disbursements
@mock.patch(
'credit.management.commands.create_prisoner_credit_notices.can_access_nomis',
mock.Mock(return_value=True),
)
@mock.patch('credit.management.commands.create_prisoner_credit_notices.PrisonerCreditNoticeBundle')
@mock.patch('credit.management.commands.create_prisoner_credit_notices.nomis_get_location')
def call_command(self, housing_response, expected_location, mock_get_location, bundle_class):
credited_date = self.latest_credit.modified.date()
location_response = {
'nomis_id': self.latest_credit.prison.nomis_id,
'name': self.latest_credit.prison.name,
}
location_response.update(housing_response)
mock_get_location.return_value = location_response
call_command(
'create_prisoner_credit_notices',
'/tmp/fake-path',
self.latest_credit.prison.nomis_id,
verbosity=0,
date=credited_date.strftime('%Y-%m-%d')
)
bundle_class.assert_called_once_with(
self.latest_credit.prison.name,
[(
self.latest_credit.prisoner_name,
self.latest_credit.prisoner_number,
expected_location,
[self.latest_credit],
[],
)],
self.latest_log.created.date()
)
def test_location_missing(self):
self.call_command({}, None)
def test_location_present(self):
self.call_command(
{
'housing_location': {
'description': 'LEI-A-2-002',
'levels': [
{'type': 'Wing', 'value': 'A'},
{'type': 'Landing', 'value': '2'},
{'type': 'Cell', 'value': '002'},
],
},
},
{
'description': 'LEI-A-2-002',
'levels': [
{'type': 'Wing', 'value': 'A'},
{'type': 'Landing', 'value': '2'},
{'type': 'Cell', 'value': '002'},
],
},
)
def test_location_long_form(self):
self.call_command(
{
'housing_location': {
'description': 'HEI-1-1-A-001',
'levels': [
{'type': 'Block', 'value': '1'},
{'type': 'Tier', 'value': '1'},
{'type': 'Spur', 'value': 'A'},
{'type': 'Cell', 'value': '001'},
],
},
},
{
'description': 'HEI-1-1-A-001',
'levels': [
{'type': 'Block', 'value': '1'},
{'type': 'Tier', 'value': '1'},
{'type': 'Spur', 'value': 'A'},
{'type': 'Cell', 'value': '001'},
],
},
)
class SendPrisonerCreditNoticeTestCase(NoticesCommandTestCase):
@mock.patch('credit.management.commands.create_prisoner_credit_notices.nomis_get_location')
def test_no_emails_sent_if_prisons_have_addresses(self, nomis_get_location):
nomis_get_location.side_effect = NotImplementedError
with open(os.devnull, 'w') as devnull, contextlib.redirect_stderr(devnull):
call_command('send_prisoner_credit_notices', verbosity=0)
self.assertEqual(len(mail.outbox), 0)
@mock.patch('credit.management.commands.create_prisoner_credit_notices.nomis_get_location')
def test_nothing_credited_sends_no_email(self, nomis_get_location):
nomis_get_location.side_effect = NotImplementedError
self.assign_email_addresses()
Credit.objects.credited().delete()
Disbursement.objects.sent().delete()
call_command('send_prisoner_credit_notices', verbosity=0)
self.assertEqual(len(mail.outbox), 0)
@mock.patch('credit.management.commands.create_prisoner_credit_notices.nomis_get_location')
def test_one_email_per_prison(self, nomis_get_location):
nomis_get_location.return_value = None
self.assign_email_addresses()
Disbursement.objects.sent().delete()
credited_logs = CreditLog.objects.filter(action=CREDIT_ACTIONS.CREDITED).order_by('-created')
latest = credited_logs.first().created.date()
credited_logs = CreditLog.objects.filter(
action=CREDIT_ACTIONS.CREDITED,
created__date__range=(latest, latest + datetime.timedelta(days=1))
)
prison_set = {credited_log.credit.prison_id for credited_log in credited_logs}
call_command('send_prisoner_credit_notices', date=latest.strftime('%Y-%m-%d'), verbosity=0)
self.assertEqual(len(mail.outbox), len(prison_set))
| 43.911585 | 117 | 0.645768 | 1,590 | 14,403 | 5.595597 | 0.149686 | 0.037428 | 0.052265 | 0.073171 | 0.642801 | 0.620321 | 0.603237 | 0.588176 | 0.588176 | 0.567832 | 0 | 0.024507 | 0.229397 | 14,403 | 327 | 118 | 44.045872 | 0.777097 | 0.004721 | 0 | 0.454874 | 0 | 0 | 0.133757 | 0.042004 | 0 | 0 | 0 | 0 | 0.072202 | 1 | 0.072202 | false | 0 | 0.068592 | 0 | 0.169675 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17ee0476722f82448075b67b45c7c42ae6e8c917 | 3,833 | py | Python | quantlplot/examples/embed.py | shamazkhan/qtlplot | b1099a9cd75fc688f9f43d378bdc46a987e112ab | [
"MIT"
] | 1 | 2021-11-17T05:50:13.000Z | 2021-11-17T05:50:13.000Z | quantlplot/examples/embed.py | shamazkhan/qtlplot | b1099a9cd75fc688f9f43d378bdc46a987e112ab | [
"MIT"
] | null | null | null | quantlplot/examples/embed.py | shamazkhan/qtlplot | b1099a9cd75fc688f9f43d378bdc46a987e112ab | [
"MIT"
] | 1 | 2021-09-19T09:34:54.000Z | 2021-09-19T09:34:54.000Z | import json
import bson
import pymongo
from pymongo import MongoClient
from collections import defaultdict
from polygon_rest import RESTClient
import datetime
import pandas as pd
from pandas.io.json import json_normalize
import quantlplot as qplt
from functools import lru_cache
from PyQt5.QtWidgets import QApplication, QGridLayout, QGraphicsView, QComboBox, QLabel
from threading import Thread
import warnings
warnings.simplefilter(action='ignore', category=FutureWarning)
def _connect_mongo(host, port, username, password, db):
'''function to establish a connection with MongoDB'''
if username and password:
mongo_uri = "mongodb+srv://skhan:A330airbus@cluster0.f4uut.mongodb.net/POLYGON_STOCKS_EOD?retryWrites=true&w=majority"
conn = MongoClient(mongo_uri)
else:
''' Change this part of the code when MongoDB is deployed as a service. Host/port configuration is defined in
the <config> file. Until then keep it as it is; however, this isn't the most efficient way to do this.
'''
#conn = MongoClient(host, port)
mongo_uri = "mongodb+srv://skhan:A330airbus@cluster0.f4uut.mongodb.net/POLYGON_STOCKS_EOD?retryWrites=true&w=majority"
conn = MongoClient(mongo_uri)
return conn[db]
def read_mongo(db, collection, query={}, host='localhost', port=27017, username='skhan', password='A330airbus', no_id=True):
""" Read from Mongo and Store into DataFrame """
# Connect to MongoDB
db = _connect_mongo(host=host, port=port, username=username, password=password, db=db)
# Make a query to the specific DB and Collection
cursor = db[collection].find(query)
# Expand the cursor and construct the DataFrame
imported_data = list(cursor)
df = pd.DataFrame(imported_data)
# Delete the _id
if no_id:
del df['_id']
'''The MongoDB cursor had whitespace in its column names, hence we must rename them. TO BE FIXED LATER'''
df = df.rename(columns={' Date': 'Date', ' Open': 'Open', ' High': 'High', ' Low': 'Low', ' Close': 'Close',
' Volume': 'Volume'})
df = df.astype({'Date': 'datetime64[ns]'})
df['time'] = df['Date']
df.set_index('Date', inplace=True)
print(df)
return df
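# Example: pull end-of-day bars for one symbol (the collection name matches
# the combo-box entries below):
#   df = read_mongo('POLYGON_STOCKS_EOD', 'AAPL')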
app = QApplication([])
win = QGraphicsView()
win.setWindowTitle('Quantl AI Technical Analysis')
layout = QGridLayout()
win.setLayout(layout)
win.resize(600, 500)
combo = QComboBox()
combo.setEditable(True)
[combo.addItem(i) for i in 'AAPL SHOP ZI'.split()]
layout.addWidget(combo, 0, 0, 1, 1)
info = QLabel()
layout.addWidget(info, 0, 1, 1, 1)
ax = qplt.create_plot(init_zoom_periods=100)
win.axs = [ax] # quantlplot requires this property
axo = ax.overlay()
layout.addWidget(ax.vb.win, 1, 0, 1, 2)
@lru_cache(maxsize=15)
def download(symbol):
return read_mongo('POLYGON_STOCKS_EOD',symbol)
#@lru_cache(maxsize=100)
def get_name(symbol):
return read_mongo('POLYGON_STOCKS_EOD',symbol)
plots = []
def update(txt):
df = download(txt)
if len(df) < 20: # symbol does not exist
return
#info.setText('Loading symbol name...')
price = df['Open Close High Low'.split()]
ma20 = df.Close.rolling(20).mean()
ma50 = df.Close.rolling(50).mean()
volume = df['Open Close Volume'.split()]
ax.reset() # remove previous plots
axo.reset() # remove previous plots
qplt.candlestick_ochl(price)
qplt.plot(ma20, legend='MA-20')
qplt.plot(ma50, legend='MA-50')
qplt.volume_ocv(volume, ax=axo)
qplt.refresh() # refresh autoscaling when all plots complete
Thread(target=lambda: info.setText(get_name(txt))).start() # slow, so use thread
combo.currentTextChanged.connect(update)
update(combo.currentText())
if __name__ == '__main__':
qplt.show(qt_exec=False) # prepares plots when they're all setup
win.show()
app.exec_()
| 31.677686 | 126 | 0.695539 | 529 | 3,833 | 4.94896 | 0.440454 | 0.012223 | 0.024446 | 0.013751 | 0.123759 | 0.123759 | 0.123759 | 0.123759 | 0.090909 | 0.090909 | 0 | 0.020734 | 0.182103 | 3,833 | 120 | 127 | 31.941667 | 0.814354 | 0.132794 | 0 | 0.074074 | 0 | 0.024691 | 0.155831 | 0.069705 | 0 | 0 | 0 | 0 | 0 | 1 | 0.061728 | false | 0.049383 | 0.197531 | 0.024691 | 0.320988 | 0.012346 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17f9141f86bf886ac513d7ce4ca7db30669ed304 | 450 | py | Python | tests/loaders/test_csv_loader.py | BazaroZero/DBeditor | ba487c0b4f6e7f7f1551a24a1ff02290103c6323 | [
"MIT"
] | 11 | 2021-11-16T16:42:36.000Z | 2021-12-16T21:33:20.000Z | tests/loaders/test_csv_loader.py | BazaroZero/DBeditor | ba487c0b4f6e7f7f1551a24a1ff02290103c6323 | [
"MIT"
] | null | null | null | tests/loaders/test_csv_loader.py | BazaroZero/DBeditor | ba487c0b4f6e7f7f1551a24a1ff02290103c6323 | [
"MIT"
] | 3 | 2021-11-22T19:49:58.000Z | 2022-02-02T12:07:30.000Z | from io import StringIO
import pytest
from dbeditor.loaders.csv_loader import CSVLoader
ANSWER = [{"a": "a", "b": "123", "c": "b"}, {"a": "c", "b": "456", "c": "d"}]
INPUT = "a,b,c\n" + "\n".join(map(lambda x: ",".join(x.values()), ANSWER))
@pytest.fixture
def loader() -> CSVLoader:
data = StringIO(INPUT)
return CSVLoader(data)
def test_load_next(loader: CSVLoader) -> None:
for r, a in zip(loader, ANSWER):
assert r == a
| 23.684211 | 77 | 0.604444 | 68 | 450 | 3.955882 | 0.544118 | 0.01487 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016349 | 0.184444 | 450 | 18 | 78 | 25 | 0.716621 | 0 | 0 | 0 | 0 | 0 | 0.057778 | 0 | 0.083333 | 0 | 0 | 0 | 0.083333 | 1 | 0.166667 | false | 0 | 0.25 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17fd372197284ae9da4e1713a21e896dd0c6b4ee | 21,843 | py | Python | testcases/functional_testcases/test_variables.py | singnet/snet-converter-services | 346b26f8281944a9f47d4bdd1eba54c8fb43e799 | [
"MIT"
] | null | null | null | testcases/functional_testcases/test_variables.py | singnet/snet-converter-services | 346b26f8281944a9f47d4bdd1eba54c8fb43e799 | [
"MIT"
] | 1 | 2022-03-21T04:43:48.000Z | 2022-03-21T04:43:48.000Z | testcases/functional_testcases/test_variables.py | singnet/snet-converter-services | 346b26f8281944a9f47d4bdd1eba54c8fb43e799 | [
"MIT"
] | 4 | 2021-11-30T04:32:59.000Z | 2022-03-23T07:20:53.000Z | import json
from constants.status import TransactionVisibility, TransactionOperation, TransactionStatus, \
ConversionTransactionStatus, ConversionStatus
from infrastructure.models import BlockChainDBModel, TokenDBModel, TokenPairDBModel, ConversionFeeDBModel, \
ConversionTransactionDBModel, ConversionDBModel, WalletPairDBModel, TransactionDBModel
DAPP_AS_CREATED_BY = "DApp"
def create_blockchain_record(row_id, id, name, description, symbol, logo, chain_id, block_confirmation,
is_extension_available,
created_by, created_at, updated_at):
return BlockChainDBModel(row_id=row_id, id=id, name=name, description=description, symbol=symbol,
logo=logo, chain_id=chain_id, block_confirmation=block_confirmation,
is_extension_available=is_extension_available, created_by=created_by,
created_at=created_at, updated_at=updated_at)
def create_token_record(row_id, id, name, description, symbol, logo, blockchain_id, allowed_decimal, token_address,
created_by, created_at, updated_at):
return TokenDBModel(row_id=row_id, id=id, name=name, description=description, symbol=symbol,
logo=logo, blockchain_id=blockchain_id, allowed_decimal=allowed_decimal,
token_address=token_address, created_by=created_by, created_at=created_at,
updated_at=updated_at)
def create_token_pair_record(row_id, id, from_token_id, to_token_id, min_value, max_value, contract_address,
conversion_fee_id, is_enabled, created_by, created_at, updated_at):
return TokenPairDBModel(row_id=row_id, id=id, from_token_id=from_token_id, to_token_id=to_token_id,
min_value=min_value, max_value=max_value, contract_address=contract_address,
conversion_fee_id=conversion_fee_id, is_enabled=is_enabled, created_by=created_by,
created_at=created_at, updated_at=updated_at)
def create_conversion_fee(row_id, id, percentage_from_source, created_by, created_at, updated_at):
return ConversionFeeDBModel(row_id=row_id, id=id, percentage_from_source=percentage_from_source,
created_by=created_by, created_at=created_at, updated_at=updated_at)
def create_wallet_pair(row_id, id, token_pair_id, from_address, to_address, deposit_address, deposit_address_detail,
signature,
signature_metadata, signature_expiry, created_by, created_at, updated_at):
return WalletPairDBModel(row_id=row_id, id=id, token_pair_id=token_pair_id, from_address=from_address,
to_address=to_address, deposit_address=deposit_address,
deposit_address_detail=deposit_address_detail, signature=signature,
signature_metadata=signature_metadata, signature_expiry=signature_expiry,
created_by=created_by, created_at=created_at, updated_at=updated_at)
def create_conversion(row_id, id, wallet_pair_id, deposit_amount, claim_amount, fee_amount, status, claim_signature,
created_by, created_at, updated_at):
return ConversionDBModel(row_id=row_id, id=id, wallet_pair_id=wallet_pair_id, deposit_amount=deposit_amount,
claim_amount=claim_amount, fee_amount=fee_amount, status=status,
claim_signature=claim_signature, created_by=created_by, created_at=created_at,
updated_at=updated_at)
def create_conversion_transaction(row_id, id, conversion_id, status, created_by, created_at, updated_at):
return ConversionTransactionDBModel(row_id=row_id, id=id, conversion_id=conversion_id, status=status,
created_by=created_by, created_at=created_at, updated_at=updated_at)
def create_transaction(row_id, id, conversion_transaction_id, token_id, transaction_visibility,
transaction_operation, transaction_hash, transaction_amount, confirmation, status, created_by,
created_at, updated_at):
return TransactionDBModel(row_id=row_id, id=id, conversion_transaction_id=conversion_transaction_id,
token_id=token_id, transaction_visibility=transaction_visibility,
transaction_operation=transaction_operation, transaction_hash=transaction_hash,
transaction_amount=transaction_amount, confirmation=confirmation, status=status,
created_by=created_by,
created_at=created_at, updated_at=updated_at)
class TestVariables:
def __init__(self):
created_at = "2022-01-12 04:10:54"
created_at1 = "2022-01-11 04:10:54"
created_at2 = "2022-01-10 04:10:54"
updated_at = "2022-01-12 04:10:54"
self.blockchain_row_id_1 = 1
self.blockchain_row_id_2 = 2
self.token_row_id_1 = 1
self.token_row_id_2 = 2
self.token_pair_row_id_1 = 1
self.token_pair_row_id_2 = 2
self.conversion_fee_row_id_1 = 1
self.conversion_fee_row_id_2 = 2
self.wallet_pair_id_1 = 1
self.wallet_pair_id_2 = 2
self.conversion_id_1 = 1
self.conversion_id_2 = 2
self.conversion_id_3 = 3
self.conversion_transaction_id_1 = 1
self.conversion_transaction_id_2 = 2
self.transaction_id_1 = 1
self.transaction_id_2 = 2
self.transaction_id_3 = 3
self.transaction_id_4 = 4
self.blockchain = [
create_blockchain_record(row_id=self.blockchain_row_id_1, id="a38b4038c3a04810805fb26056dfabdd",
name="Ethereum",
description="Connect with your wallet",
symbol="ETH", logo="www.ethereum.com/image.png", chain_id=42,
block_confirmation=25,
is_extension_available=True, created_by=DAPP_AS_CREATED_BY, created_at=created_at,
updated_at=updated_at),
create_blockchain_record(row_id=self.blockchain_row_id_2, id="5b21294fe71a4145a40f6ab918a50f96",
name="Cardano",
description="Add your wallet address",
symbol="ADA", logo="www.cardano.com/image.png", chain_id=2,
block_confirmation=23,
is_extension_available=False, created_by=DAPP_AS_CREATED_BY, created_at=created_at,
updated_at=updated_at)
]
self.token_record_1 = create_token_record(row_id=self.token_row_id_1, id="53ceafdb42ad4f3d81eeb19c674437f9",
name="Singularity Ethereum",
description="We are crazy on blockchain",
symbol="AGIX", logo="www.findOurUrl.com/image.png",
blockchain_id=self.blockchain_row_id_1, allowed_decimal=5,
token_address="0xA1e841e8F770E5c9507E2f8cfd0aA6f73009715d",
created_by=DAPP_AS_CREATED_BY,
created_at=created_at, updated_at=updated_at)
self.token_record_2 = create_token_record(row_id=self.token_row_id_2, id="aa5763de861e4a52ab24464790a5c017",
name="Singularity Cardano",
description="We are crazy on blockchain",
symbol="AGIX", logo="www.findOurUrl.com/image.png",
blockchain_id=self.blockchain_row_id_2, allowed_decimal=10,
token_address="ae8a0b54484418a3db56f4e9b472d51cbc860667489366ba6e150c8a",
created_by=DAPP_AS_CREATED_BY,
created_at=created_at, updated_at=updated_at)
self.token = [self.token_record_1, self.token_record_2]
self.token_pair_record_1 = create_token_pair_record(row_id=self.token_pair_row_id_1,
id="22477fd4ea994689a04646cbbaafd133",
from_token_id=self.token_row_id_1,
to_token_id=self.token_row_id_2,
min_value=10, max_value=1000000000000000000,
contract_address="0xacontractaddress",
conversion_fee_id=self.conversion_fee_row_id_1,
is_enabled=True,
created_by=DAPP_AS_CREATED_BY,
created_at=created_at, updated_at=updated_at)
self.token_pair_record_2 = create_token_pair_record(row_id=self.token_pair_row_id_2,
id="fdd6a416d8414154bcdd95f82b6ab239",
from_token_id=self.token_row_id_2,
to_token_id=self.token_row_id_1,
min_value=100, max_value=100000000000000000000000,
contract_address="0xacontractaddress",
conversion_fee_id=None, is_enabled=True,
created_by=DAPP_AS_CREATED_BY,
created_at=created_at, updated_at=updated_at)
self.token_pair = [self.token_pair_record_1, self.token_pair_record_2]
self.conversion_fee_record_1 = create_conversion_fee(row_id=self.conversion_fee_row_id_1,
id="ccd10383bd434bd7b1690754f8b98df3",
percentage_from_source=1.5,
created_by=DAPP_AS_CREATED_BY, created_at=created_at,
updated_at=updated_at)
self.conversion_fee_record_2 = create_conversion_fee(row_id=self.conversion_fee_row_id_2,
id="099b90e8f60540228e3ccb948a1a708f",
percentage_from_source=2.23,
created_by=DAPP_AS_CREATED_BY, created_at=created_at,
updated_at=updated_at)
self.conversion_fee = [self.conversion_fee_record_1, self.conversion_fee_record_2]
self.wallet_pair = [
create_wallet_pair(row_id=self.wallet_pair_id_1, id="1b0c8e059600478ca9de05e5fbb559b1",
token_pair_id=self.token_pair_row_id_1,
from_address="0xa18b95A9371Ac18C233fB024cdAC5ef6300efDa1",
to_address="addr_test1qza8485avt2xn3vy63plawqt0gk3ykpf98wusc4qrml2avu0pkm5rp3pkz6q4n3kf8znlf3y749lll8lfmg5x86kgt8qju7vx8",
deposit_address=None, deposit_address_detail=None,
signature="0xd4159d88ccc844ced5f0fa19b2975877813ab82f5c260d8cbacc1c11e9d61e8c776db78473a052ee02da961e98c7326f70c5e37e9caa2240dbb17baea2d4c69c1b",
signature_metadata={"amount": "1333.05",
"to_address": "addr_test1qza8485avt2xn3vy63plawqt0gk3ykpf98wusc4qrml2avu0pkm5rp3pkz6q4n3kf8znlf3y749lll8lfmg5x86kgt8qju7vx8",
"block_number": 12345678,
"from_address": "0xa18b95A9371Ac18C233fB024cdAC5ef6300efDa1",
"token_pair_id": "22477fd4ea994689a04646cbbaafd133"},
signature_expiry=None,
created_by=DAPP_AS_CREATED_BY, created_at=created_at, updated_at=updated_at),
create_wallet_pair(row_id=self.wallet_pair_id_2, id="f8cff5ec5fd04d41afc32443117d2284",
token_pair_id=self.token_pair_row_id_2,
from_address="addr_test1qza8485avt2xn3vy63plawqt0gk3ykpf98wusc4qrml2avu0pkm5rp3pkz6q4n3kf8znlf3y749lll8lfmg5x86kgt8qju7vx8",
to_address="0xa18b95A9371Ac18C233fB024cdAC5ef6300efDa1",
deposit_address="addr_test1qza8485avt2xn3vy63plawqt0gk3ykpf98wusc4qrml2avu0pkm5rp3pkz6q4n3kf8znlf3y749lll8lfmg5x86kgt8qju7vx8",
deposit_address_detail={
"derived_address": "addr_test1qza8485avt2xn3vy63plawqt0gk3ykpf98wusc4qrml2avu0pkm5rp3pkz6q4n3kf8znlf3y749lll8lfmg5x86kgt8qju7vx8",
"index": 1, "role": 0},
signature="0x84cad9a7adbd444f156906a44381135ae2d81140fb4a0a0ea286287706c36eda643268252c6760f18309aa6f8396b53a48d1ffa9784f326b880758b8f11f03d21b",
signature_metadata={"amount": "1333.05",
"to_address": "0xa18b95A9371Ac18C233fB024cdAC5ef6300efDa1",
"block_number": 12345678,
"from_address": "addr_test1qza8485avt2xn3vy63plawqt0gk3ykpf98wusc4qrml2avu0pkm5rp3pkz6q4n3kf8znlf3y749lll8lfmg5x86kgt8qju7vx8",
"token_pair_id": "fdd6a416d8414154bcdd95f82b6ab239"},
signature_expiry=None,
created_by=DAPP_AS_CREATED_BY, created_at=created_at, updated_at=updated_at)
]
self.conversion = [create_conversion(row_id=self.conversion_id_1, id="7298bce110974411b260cac758b37ee0",
wallet_pair_id=self.wallet_pair_id_1, deposit_amount=133305000,
claim_amount=(133305000 - 1999575), fee_amount=1999575,
status=ConversionStatus.USER_INITIATED.value,
claim_signature=None, created_by=DAPP_AS_CREATED_BY,
created_at=created_at,
updated_at=updated_at),
create_conversion(row_id=self.conversion_id_2, id="5086b5245cd046a68363d9ca8ed0027e",
wallet_pair_id=self.wallet_pair_id_2,
deposit_amount=1333050000000000000,
claim_amount=1333050000000000000, fee_amount=0,
status=ConversionStatus.USER_INITIATED.value,
claim_signature=None, created_by=DAPP_AS_CREATED_BY,
created_at=created_at1,
updated_at=updated_at),
create_conversion(row_id=self.conversion_id_3, id="51769f201e46446fb61a9c197cb0706b",
wallet_pair_id=self.wallet_pair_id_1,
deposit_amount=1663050000000000000,
claim_amount=1638104000000000000, fee_amount=24946000000000000,
status=ConversionStatus.PROCESSING.value,
claim_signature=None, created_by=DAPP_AS_CREATED_BY,
created_at=created_at2,
updated_at=updated_at)
]
self.conversion_transaction = [
create_conversion_transaction(row_id=self.conversion_transaction_id_1,
id="a33d4c759f884cd58b471b302c192fc6", conversion_id=self.conversion_id_3,
status=ConversionTransactionStatus.FAILED.value,
created_by=DAPP_AS_CREATED_BY, created_at=created_at, updated_at=updated_at),
create_conversion_transaction(row_id=self.conversion_transaction_id_2,
id="a942ea29b2ee4400ad9597443ca24645", conversion_id=self.conversion_id_3,
status=ConversionTransactionStatus.PROCESSING.value,
created_by=DAPP_AS_CREATED_BY, created_at=created_at, updated_at=updated_at)
]
self.transaction = [
create_transaction(row_id=self.transaction_id_1, id="391be6385abf4b608bdd20a44acd6abc",
conversion_transaction_id=self.conversion_transaction_id_2,
token_id=self.token_row_id_1,
transaction_visibility=TransactionVisibility.EXTERNAL.value,
transaction_operation=TransactionOperation.TOKEN_RECEIVED.value,
transaction_hash="22477fd4ea994689a04646cbbaafd133",
transaction_amount=1663050000000000000, confirmation=10,
status=TransactionStatus.SUCCESS.value,
created_by=DAPP_AS_CREATED_BY,
created_at=created_at, updated_at=updated_at),
create_transaction(row_id=self.transaction_id_3, id="1df60a2369f34247a5dc3ed29a8eef67",
conversion_transaction_id=self.conversion_transaction_id_2,
token_id=self.token_row_id_2,
transaction_visibility=TransactionVisibility.EXTERNAL.value,
transaction_operation=TransactionOperation.TOKEN_RECEIVED.value,
transaction_hash="22477fd4ea994689a04646cbbaafd133",
transaction_amount=1663050000000000000, confirmation=10,
status=TransactionStatus.WAITING_FOR_CONFIRMATION.value,
created_by=DAPP_AS_CREATED_BY, created_at=created_at, updated_at=updated_at)
]
def prepare_consumer_cardano_event_format(message):
records = []
body = {"Message": json.dumps(message)}
records.append({"body": json.dumps(body)})
input_event = {"Records": records}
return input_event
def prepare_consumer_ethereum_event_format(message):
records = [{"body": json.dumps(message)}]
input_event = {"Records": records}
return input_event
def prepare_converter_bridge_event_format(message):
records = [{"body": json.dumps(message)}]
input_event = {"Records": records}
return input_event
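# Hedged illustration of the wrapped shapes produced above (SQS-style events):
#     prepare_consumer_ethereum_event_format({"k": 1})
#     # -> {"Records": [{"body": '{"k": 1}'}]}
#     prepare_consumer_cardano_event_format({"k": 1})
#     # -> {"Records": [{"body": '{"Message": "{\\"k\\": 1}"}'}]}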
consumer_token_received_event_message = {'id': '3f998ad2acd5427da9dcee73c9043b2f',
'tx_hash': '1667dce54e1729aec07ab11342f2464335d6542530102e64f7dc47847f669449',
'event_type': 'TOKEN_TRANSFER',
'address': 'addr_test1qza8485avt2xn3vy63plawqt0gk3ykpf98wusc4qrml2avu0pkm5rp3pkz6q4n3kf8znlf3y749lll8lfmg5x86kgt8qju7vx8',
'event_status': None, 'updated_at': '2022-02-20 10:39:19',
'asset': {'id': 'bc3e8590103d4d5cb2701f2faa2d5927',
'asset': '34d1adbf3a7e95b253fd0999fb85e2d41d4121b36b834b83ac069ebb41474958',
'policy_id': '34d1adbf3a7e95b253fd0999fb85e2d41d4121b36b834b83ac069ebb',
'asset_name': '41474958',
'allowed_decimal': 8,
'updated_at': '2022-02-20 10:39:15'},
'transaction_detail': {'id': '6f1b5f9f7e654433baf6941986ec7e7d',
'tx_type': 'TOKEN_RECEIVED',
'assurance_level': 'HIGH', 'confirmations': 71514,
'tx_amount': '1E+8',
'tx_fee': '172101', 'block_number': 3263733,
'block_time': 1643042281,
'tx_metadata': {},
'updated_at': '2022-02-20 10:40:14'}}
| 71.149837 | 178 | 0.553999 | 1,831 | 21,843 | 6.181868 | 0.112507 | 0.030038 | 0.054422 | 0.052478 | 0.620284 | 0.505787 | 0.441558 | 0.363283 | 0.333598 | 0.29932 | 0 | 0.123047 | 0.387584 | 21,843 | 306 | 179 | 71.382353 | 0.723107 | 0 | 0 | 0.263359 | 0 | 0 | 0.14357 | 0.107357 | 0 | 0 | 0.0217 | 0 | 0 | 1 | 0.045802 | false | 0 | 0.01145 | 0.030534 | 0.103053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
17ff9ea597e379227350c6f98938c3f871970f93 | 1,858 | py | Python | symneqsys/gsl/interface.py | bjodah/symneqsys | 677307d6b94e452262f7ffe944ec2bed6314d34b | [
"BSD-2-Clause"
] | 1 | 2015-01-10T09:00:04.000Z | 2015-01-10T09:00:04.000Z | symneqsys/gsl/interface.py | bjodah/symneqsys | 677307d6b94e452262f7ffe944ec2bed6314d34b | [
"BSD-2-Clause"
] | null | null | null | symneqsys/gsl/interface.py | bjodah/symneqsys | 677307d6b94e452262f7ffe944ec2bed6314d34b | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import (absolute_import, division, print_function)
import os
import cython_gsl
from pycodeexport.codeexport import C_Code
from symneqsys.codeexport import BinarySolver, NEQSys_Code
class GSL_Code(NEQSys_Code, C_Code):
build_files = [
'solvers.c',
'prebuilt/solvers.o',
'prebuilt/_solvers.o',
'solvers.h', 'neqsys.h', 'Makefile',
]
obj_files = ['neqsys.o', 'solvers.o', '_solvers.o']
templates = [
'neqsys_template.c',
'main_ex_template.c',
]
source_files = ['neqsys.c'] # other are precompiled
so_file = '_solvers.so'
extension_name = 'solvers'
compile_kwargs = {
'std': 'c99',
'options': ['fast', 'warn', 'pic'],
'defmacros': ['GSL_RANGE_CHECK_OFF', 'HAVE_INLINE'],
'libs': cython_gsl.get_libraries(),
'inc_dirs': [cython_gsl.get_include(),
cython_gsl.get_cython_include_dir()],
'lib_dirs': [cython_gsl.get_library_dir()]
}
v_tok = 'y' # see neqsys_template.c
v_offset = None
param_tok = 'k' # see neqsys_template.c
param_offset = None
def __init__(self, *args, **kwargs):
self.basedir = os.path.dirname(__file__)
super(GSL_Code, self).__init__(*args, **kwargs)
class GSL_Solver(BinarySolver):
"""
Used to solve systems with equal number of expressions
as variables.
"""
CodeClass = GSL_Code
solve_args = {'fdfsolver_type': (
'newton', 'gnewton', 'hybridj', 'hybridsj'), }
def run(self, x0, params, itermax=100, **kwargs):
self.num_result = self.binary_mod.solve(
x0, params, atol=self.abstol,
itermax=itermax, **kwargs)
class GSL_Multifit_Nlin_Solver(BinarySolver):
pass
class GSL_MultiRoot_Solver(BinarySolver):
pass
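# Hedged usage sketch (illustrative only; BinarySolver's constructor arguments
# and the shapes of x0/params are assumptions, not defined in this file):
#
#     solver = GSL_Solver()   # would also need a NEQSys problem bound to it
#     solver.run(x0, params, itermax=100, fdfsolver_type='hybridsj')
#     print(solver.num_result)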
| 23.225 | 66 | 0.622713 | 220 | 1,858 | 4.936364 | 0.518182 | 0.041436 | 0.044199 | 0.029466 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005698 | 0.244349 | 1,858 | 79 | 67 | 23.518987 | 0.767806 | 0.084499 | 0 | 0.041667 | 0 | 0 | 0.170441 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0.041667 | 0.104167 | 0 | 0.5 | 0.020833 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa0274ea750d3a9326ce59c37381b624f57e2f45 | 2,024 | py | Python | trip_features.py | kurtrm/gas_usage_calculator | fce3e1b10dfcc6257e6e055097f2d2266dbc5b84 | [
"MIT"
] | null | null | null | trip_features.py | kurtrm/gas_usage_calculator | fce3e1b10dfcc6257e6e055097f2d2266dbc5b84 | [
"MIT"
] | null | null | null | trip_features.py | kurtrm/gas_usage_calculator | fce3e1b10dfcc6257e6e055097f2d2266dbc5b84 | [
"MIT"
] | null | null | null | """
Module containing functions to calculate routes via the Google Maps
API, and return the total number of miles on the route.
"""
import os
import googlemaps
import requests
from bs4 import BeautifulSoup
def get_maps_data(start_point: str, end_point: str, api_key: str=None) -> str:
"""
"""
if api_key is None:
try:
api_key = os.environ['GMAPS_API_KEY']
except KeyError:
raise ValueError('No API key is available in the environment, '
'please pass in a valid api key.')
gmaps = googlemaps.Client(key=api_key)
directions_json = gmaps.directions(start_point, end_point, mode='driving')
total_dist = directions_json[0]['legs'][0]['distance']['text']
return total_dist
def parse_dist_text(text: str) -> float:
"""
"""
try:
return float(text[:-3])
except ValueError:
for i in range(0, -11, -1):
try:
return float(text[:i])
except ValueError:
continue
else:
raise ValueError('Unable to parse distance from string')
def get_gas_mileage(year: str, make: str, model: str) -> tuple:
"""Scrape fueleconomy.gov for a vehicle's highway, city, and combined gas mileage."""
fueleconomy_car_menu = ('https://www.fueleconomy.gov/'
'ws/rest/vehicle/menu/options?year={}&make={}&model={}'.format(year, make, model))
fueleconomy_car_info = 'https://www.fueleconomy.gov/ws/rest/vehicle/{}'
menu_response = requests.get(fueleconomy_car_menu)
menu_soup = BeautifulSoup(menu_response.content, 'html.parser')
car_id = menu_soup.find('value').text
car_response = requests.get(fueleconomy_car_info.format(car_id))
car_soup = BeautifulSoup(car_response.content, 'html.parser')
car_mileage_highway = car_soup.find('highway08').text
car_mileage_city = car_soup.find('city08').text
car_mileage_combined = car_soup.find('comb08').text
return car_mileage_highway, car_mileage_city, car_mileage_combined
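# Hedged usage sketch (illustrative; the places, car details and API key are
# placeholders, not values from this module):
#
#     miles = parse_dist_text(get_maps_data('Seattle, WA', 'Portland, OR', api_key='...'))
#     hwy, city, combined = get_gas_mileage('2015', 'Toyota', 'Corolla')
#     gallons = miles / float(combined)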
| 32.645161 | 86 | 0.638834 | 258 | 2,024 | 4.810078 | 0.406977 | 0.033844 | 0.026591 | 0.029009 | 0.16116 | 0.062853 | 0.062853 | 0.062853 | 0 | 0 | 0 | 0.009186 | 0.247036 | 2,024 | 61 | 87 | 33.180328 | 0.805118 | 0.060771 | 0 | 0.128205 | 0 | 0 | 0.178988 | 0.029461 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0.025641 | 0.102564 | 0 | 0.282051 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa04225e3101a62704cfeb6bf33f5b2ef594c4ae | 9,828 | py | Python | watcher/camera_holders/camera_holder.py | framaz/eye_control | 2b4a15b95b4e1f2e9e8c7359416747fd4d26d4a9 | [
"MIT"
] | 2 | 2020-07-19T08:04:03.000Z | 2021-02-03T14:16:04.000Z | watcher/camera_holders/camera_holder.py | framaz/eye_control | 2b4a15b95b4e1f2e9e8c7359416747fd4d26d4a9 | [
"MIT"
] | 3 | 2020-01-31T11:15:06.000Z | 2022-03-25T19:10:47.000Z | watcher/camera_holders/camera_holder.py | framaz/eye_control | 2b4a15b95b4e1f2e9e8c7359416747fd4d26d4a9 | [
"MIT"
] | null | null | null | import base64
import signal
import subprocess
import typing
from io import BytesIO
import PIL
import cv2
import gevent
import numpy as np
import zerorpc
from PIL import Image
import predictor_module
from from_internet_or_for_from_internet import PNP_solver as PNP_solver
from utilities import get_world_to_camera_matrix
from .camera_system_factory import CameraSystemFactory
class CameraHolder:
"""This class is specified to store all logic and information about one camera
:ivar _camera: (cv2.VideoCapture) real-world camera obj
:ivar _l_eye: (Eye) used for remembering and smoothing the left eye's gaze vector.
:ivar _r_eye: (Eye) used for remembering and smoothing the right eye's gaze vector.
:ivar _solver: (from_internet_or_for_from_internet.PNP_solver.PoseEstimator)
used to store unrotated face position
:ivar _factory: (CameraSystemFactory) used to create full camera system tick by tick.
:ivar _screen: (Screen) used to match eye gaze vectors and onscreen target position
:ivar _head: (Head) used for remembering and smoothing head position(rotation and translation)
"""
def __init__(self, camera: cv2.VideoCapture, calibration_needed: bool = True):
"""The constructor of CameraHolder
if calibration_needed is true then an electron app is run for calibrating the camera.
:param camera: stores real-world camera object
:param calibration_needed: flag that specifies whether camera calibration(brightness etc)
should be used
"""
self._camera = camera
self._l_eye = None
self._r_eye = None
self._screen = None
self._head = None
width = camera.get(cv2.CAP_PROP_FRAME_WIDTH)
height = camera.get(cv2.CAP_PROP_FRAME_HEIGHT)
self._solver = PNP_solver.PoseEstimator((height, width))
self._factory = CameraSystemFactory(self._solver)
if calibration_needed:
self._camera_calibration_server = CameraCalibrationServer(camera)
zpc = zerorpc.Server(self._camera_calibration_server)
self._camera_calibration_server.add_server(zpc)
zpc.bind('tcp://127.0.0.1:4243')
self._electron = subprocess.Popen(
["./frontend/node_modules/.bin/electron", "./frontend", "camera_calibrator"])
zpc.run()
def calibration_tick(self, time_now: float,
predictor: predictor_module.BasicPredictor) -> str:
"""Just call a tick of _factory
:param time_now: current time to remember
:param predictor: which predictor to use
:return: current head position and gazes
"""
img = [self.get_picture()]
return self._factory.calibrate_remember(img, time_now, predictor)
def calibration_corner_end(self, corner: typing.Union[int, str]) -> None:
"""Just call a tick series end for one corner of _factory
:param corner: corner_number, may be [1, 2...] or ["TL", "TR"...]
:return: nothing is returned
"""
self._factory.calibration_end(corner)
def calibration_end(self) -> None:
"""Just call an end of _factory
:return: nothing
"""
self._l_eye, self._r_eye, self._head = self._factory.calibration_final()
def get_picture(self) -> PIL.Image.Image:
"""Takes a photo with camera and turns it to grayscale"""
ret, img = self._camera.read()
img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
img = PIL.Image.fromarray(img)
return img
def get_screen_positions(self,
eye_one_vector: np.ndarray(shape=(3,)),
eye_two_vector: np.ndarray(shape=(3,)),
np_points: np.ndarray(shape=(68, 2)),
time_now: float) -> (np.ndarray, np.ndarray):
"""Remembers current gazes and head pos and finds pixel gaze target.
:param eye_one_vector: gaze vector from first eye
:param eye_two_vector: gaze vector from the second eye
:param np_points: 68 face markers detected by dlib
:param time_now: current time
:return: the on-screen (pixel) gaze point estimated from each eye
"""
head_rotation, head_translation = self._head.get_smoothed_position(np_points, time_now)
world_to_camera = get_world_to_camera_matrix(head_rotation, head_translation)
left_eye = sum(self._solver.model_points_68[36:41]) / 6
right_eye = sum(self._solver.model_points_68[42:47]) / 6
left_eye = np.array([*left_eye, 1])
left_eye = np.matmul(world_to_camera, left_eye)
right_eye = np.array([*right_eye, 1])
right_eye = np.matmul(world_to_camera, right_eye)
eye_one_screen = self._l_eye.get_screen_point(eye_one_vector, time_now, right_eye)
eye_two_screen = self._r_eye.get_screen_point(eye_two_vector, time_now, left_eye)
return eye_one_screen, eye_two_screen
class CameraCalibrationServer:
"""This class is for camera calibration(brightness etc)
To use it, manually create and run a zerorpc server (see the example in CameraHolder).
:ivar _server: (zerorpc.Server) the server to communicate between electron and watcher
:ivar _camera: (cv2.VideoCapture) the camera being calibrated
:ivar _attributes: (dict) cv2 property id -> (current value, property name) for every supported property
:ivar _attribute_names: (dict) property name -> cv2.CAP_PROP_* id for every probed property
"""
def __init__(self, camera: cv2.VideoCapture):
"""Contsruct object and check camera for parameter availability
:param camera:
"""
self._server = None
self._camera = camera
self._attributes = {}
self._attribute_names = {
"CAP_PROP_POS_MSEC": cv2.CAP_PROP_POS_MSEC,
"CAP_PROP_POS_FRAMES": cv2.CAP_PROP_POS_FRAMES,
"CAP_PROP_POS_AVI_RATIO": cv2.CAP_PROP_POS_AVI_RATIO,
"CAP_PROP_FRAME_WIDTH": cv2.CAP_PROP_FRAME_WIDTH,
"CAP_PROP_FRAME_HEIGHT": cv2.CAP_PROP_FRAME_HEIGHT,
"CAP_PROP_FPS": cv2.CAP_PROP_FPS,
"CAP_PROP_FOURCC": cv2.CAP_PROP_FOURCC,
"CAP_PROP_FRAME_COUNT": cv2.CAP_PROP_FRAME_COUNT,
"CAP_PROP_FORMAT": cv2.CAP_PROP_FORMAT,
"CAP_PROP_MODE": cv2.CAP_PROP_MODE,
"CAP_PROP_BRIGHTNESS": cv2.CAP_PROP_BRIGHTNESS,
"CAP_PROP_CONTRAST": cv2.CAP_PROP_CONTRAST,
"CAP_PROP_SATURATION": cv2.CAP_PROP_SATURATION,
"CAP_PROP_HUE": cv2.CAP_PROP_HUE,
"CAP_PROP_GAIN": cv2.CAP_PROP_GAIN,
"CAP_PROP_EXPOSURE": cv2.CAP_PROP_EXPOSURE,
"CAP_PROP_CONVERT_RGB": cv2.CAP_PROP_CONVERT_RGB,
"CAP_PROP_WHITE_BALANCE_BLUE_U": cv2.CAP_PROP_WHITE_BALANCE_BLUE_U,
"CAP_PROP_RECTIFICATION": cv2.CAP_PROP_RECTIFICATION,
"CAP_PROP_MONOCHROME": cv2.CAP_PROP_MONOCHROME,
"CAP_PROP_SHARPNESS": cv2.CAP_PROP_SHARPNESS,
"CAP_PROP_AUTO_EXPOSURE": cv2.CAP_PROP_AUTO_EXPOSURE,
"CAP_PROP_GAMMA": cv2.CAP_PROP_GAMMA,
"CAP_PROP_TEMPERATURE": cv2.CAP_PROP_TEMPERATURE,
"CAP_PROP_TRIGGER": cv2.CAP_PROP_TRIGGER,
"CAP_PROP_TRIGGER_DELAY": cv2.CAP_PROP_TRIGGER_DELAY,
"CAP_PROP_WHITE_BALANCE_RED_V": cv2.CAP_PROP_WHITE_BALANCE_RED_V,
"CAP_PROP_ZOOM": cv2.CAP_PROP_ZOOM,
"CAP_PROP_FOCUS": cv2.CAP_PROP_FOCUS,
"CAP_PROP_GUID": cv2.CAP_PROP_GUID,
"CAP_PROP_ISO_SPEED": cv2.CAP_PROP_ISO_SPEED,
"CAP_PROP_BACKLIGHT": cv2.CAP_PROP_BACKLIGHT,
"CAP_PROP_PAN": cv2.CAP_PROP_PAN,
"CAP_PROP_TILT": cv2.CAP_PROP_TILT,
"CAP_PROP_ROLL": cv2.CAP_PROP_ROLL,
"CAP_PROP_IRIS": cv2.CAP_PROP_IRIS,
"CAP_PROP_SETTINGS": cv2.CAP_PROP_SETTINGS,
"CAP_PROP_BUFFERSIZE": cv2.CAP_PROP_BUFFERSIZE,
"CAP_PROP_AUTOFOCUS": cv2.CAP_PROP_AUTOFOCUS,
"CAP_PROP_SAR_NUM": cv2.CAP_PROP_SAR_NUM,
"CAP_PROP_SAR_DEN": cv2.CAP_PROP_SAR_DEN,
"CAP_PROP_BACKEND": cv2.CAP_PROP_BACKEND,
"CAP_PROP_CHANNEL": cv2.CAP_PROP_CHANNEL,
"CAP_PROP_AUTO_WB": cv2.CAP_PROP_AUTO_WB,
"CAP_PROP_WB_TEMPERATURE": cv2.CAP_PROP_WB_TEMPERATURE,
}
for attr_name in self._attribute_names:
i = self._attribute_names[attr_name]
res = self._camera.get(i)
if res != -1:
self._attributes[i] = res, attr_name
def add_server(self, server: zerorpc.Server) -> None:
"""Remember the zerorpc server and states its stop signal to gevent
:param server:
:return:
"""
self._server = server
gevent.signal(signal.SIGTERM, self._server.stop)
def exit(self) -> None:
"""FOR RPC, Stop the zerorpc server"""
self._server.stop()
def get_attributes(self) -> typing.Dict[int, typing.Tuple[int, str]]:
"""FOR RPC, Get attributes and values
:return:
"""
return self._attributes
def set_attribute(self,
attribute: typing.Union[str, int],
value: typing.Union[str, int]) -> None:
"""Set attribute value by its number
:param attribute: attribute number, part of cv2.CAP_PROP...
:param value: integer value
:return:
"""
attribute = int(attribute)
value = int(value)
self._camera.set(attribute, value)
def get_frame(self) -> str:
"""Take a picture from cam, encode it to base64
:return: (str) jpg picture in base64 encode
"""
ret, image = self._camera.read()
image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
buffered = BytesIO()
image = Image.fromarray(image)
image.save(buffered, format="JPEG")
img_str = base64.b64encode(buffered.getvalue()).decode('ascii')  # decode bytes -> str, matching the documented return type
return img_str
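# Hedged usage sketch (not part of the original module): constructing a holder
# without the electron calibration UI. Camera index 0 is an assumption.
#
#     cam = cv2.VideoCapture(0)
#     holder = CameraHolder(cam, calibration_needed=False)
#     frame = holder.get_picture()  # grayscale PIL image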
| 39.312 | 98 | 0.649369 | 1,266 | 9,828 | 4.704581 | 0.221959 | 0.109302 | 0.080591 | 0.012592 | 0.139691 | 0.079919 | 0.027535 | 0.017797 | 0.017797 | 0.017797 | 0 | 0.014359 | 0.263024 | 9,828 | 249 | 99 | 39.46988 | 0.807953 | 0.24186 | 0 | 0.014184 | 0 | 0 | 0.124094 | 0.032125 | 0 | 0 | 0 | 0 | 0 | 1 | 0.085106 | false | 0 | 0.106383 | 0 | 0.241135 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa084f77afe736d9793155a878ee9ea7a00e19de | 10,398 | py | Python | bruhat/sedenions.py | punkdit/bruhat | 3231eacc49fd3464542f7eb72684751371d9876c | [
"MIT"
] | 3 | 2020-04-07T13:21:30.000Z | 2020-07-15T02:07:20.000Z | bruhat/sedenions.py | punkdit/bruhat | 3231eacc49fd3464542f7eb72684751371d9876c | [
"MIT"
] | null | null | null | bruhat/sedenions.py | punkdit/bruhat | 3231eacc49fd3464542f7eb72684751371d9876c | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
"""
Implement the Cayley-Dickson construction to
get complex numbers, quaternions, octonions, sedenions, etc.
"""
import math, os
from isomorph import Point, Graph, search
from argv import argv
class Number(object):
def __init__(self, a):
self.a = a
self.shape = ()
def __str__(self):
return str(self.a)
def __repr__(self):
return "%s(%s)"%(self.__class__.__name__, self.a)
def __hash__(self):
return hash(str(self))
def promote(self, a):
if not isinstance(a, Number):
a = Number(a) # wrap it up
return a
def __add__(self, other):
assert self.__class__ is other.__class__
return self.a + other.a
def __sub__(self, other):
assert self.__class__ is other.__class__
return self.a - other.a
def __mul__(self, other):
assert self.__class__ is other.__class__
return self.a * other.a
def __eq__(self, other):
assert self.__class__ is other.__class__
return self.a == other.a
def __ne__(self, other):
assert self.__class__ is other.__class__
return self.a != other.a
def __neg__(self):
return Number(-self.a)
def conj(self):
return self
def is_real(self):
return True
def is_zero(self):
return self.a == 0
def get_zero(self):
return Number(0)
class Double(Number):
def __init__(self, a, b):
if not isinstance(a, Number):
a = Number(a)
if not isinstance(b, Number):
b = Number(b)
assert a.shape == b.shape
self.shape = (a.shape, b.shape)
assert isinstance(a, Number)
self.a = a
self.b = b
def get_zero(self):
return Double(self.a.get_zero(), self.b.get_zero())
def promote(self, a):
if not isinstance(a, Number):
a = Number(a)
if a.shape == self.shape:
return a
assert str(a.shape) in str(self.shape)
a = self.a.promote(a)
return Double(a, self.b.get_zero())
def __repr__(self):
#a, b = self.pair
return "%s(%s, %s)"%(self.__class__.__name__, self.a, self.b)
def __str__(self):
return "(%s, %s)"%(self.a, self.b)
def __add__(self, other):
other = self.promote(other)
assert self.__class__ is other.__class__
assert self.shape == other.shape
a = self.a + other.a
b = self.b + other.b
return self.__class__(a, b)
def __sub__(self, other):
other = self.promote(other)
assert self.__class__ is other.__class__
assert self.shape == other.shape
a = self.a - other.a
b = self.b - other.b
return self.__class__(a, b)
def __mul__(self, other):
other = self.promote(other)
assert self.__class__ is other.__class__
assert self.shape == other.shape
a, b = self.a, self.b
c, d = other.a, other.b
x = self.__class__(a*c - d.conj()*b, d*a + b*c.conj())
return x
def __eq__(self, other):
other = self.promote(other)
assert self.__class__ is other.__class__
assert self.shape == other.shape
return self.a == other.a and self.b == other.b
def __ne__(self, other):
other = self.promote(other)
assert self.__class__ is other.__class__
assert self.shape == other.shape
return self.a != other.a or self.b != other.b
def __neg__(self):
return self.__class__(-self.a, -self.b)
def conj(self):
return self.__class__(self.a.conj(), -self.b)
def norm2(self):
return self.conj() * self
def is_real(self):
return self.a.is_real() and self.b.is_zero()
def is_zero(self):
return self.a.is_zero() and self.b.is_zero()
def is_commutative(items):
for a in items:
for b in items:
if a*b != b*a:
return False
return True
def is_anticommutative(items):
for a in items:
for b in items:
if a!=b and a*b != -b*a:
return False
return True
def is_associative(items):
for a in items:
for b in items:
for c in items:
if a*(b*c) != (a*b)*c:
return False
return True
def is_alternative(items):
for a in items:
for b in items:
if a*(b*b) != (a*b)*b:
return False
return True
def get_geometry(imag):
graph = Graph()
N = len(imag)
triples = set()
cycles = []
for idx in range(N):
graph.add('p')
for idx in range(N):
for jdx in range(N):
if idx==jdx:
continue
k = imag[idx]*imag[jdx]
if k not in imag:
continue
kdx = imag.index(k)
key = [idx, jdx, kdx]
cycle = list(key)
key.sort()
key = tuple(key)
if key in triples:
continue
triples.add(key)
cycles.append(cycle)
p = graph.add('l')
graph.join(idx, p)
graph.join(jdx, p)
graph.join(kdx, p)
return graph, cycles
def test_structure(imag):
bag0, cycles = get_geometry(imag)
bag1, cycles = get_geometry(imag)
#print(cycles)
N = 3
struct = []
for cycle in cycles:
items = []
for i in range(N):
items.append(tuple(cycle[(i+j)%N] for j in range(N)))
items.sort()
#print(items)
struct.append(items)
struct.sort()
for cycle in cycles:
cycle = [bag0[i] for i in cycle]
nbd = set(cycle[0].nbd)
for point in cycle:
nbd = nbd.intersection(point.nbd)
assert len(nbd)==1
#print(struct)
count = 0
total = 0
for f in search(bag0, bag1):
_struct = [[tuple(f[i] for i in cycle) for cycle in items] for items in struct ]
for items in _struct:
items.sort()
_struct.sort()
#print(_struct)
if struct==_struct:
count += 1
#print("*")
total += 1
return count, total
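# Hedged reading of the return values: `total` counts the incidence-graph
# automorphisms found by `search`, while `count` is how many of them also
# preserve the cyclic orientation of every triple e_i*e_j = e_k.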
def test():
x = Number(2)
y = Double(2, 0)
assert x==y
# ----------- double: complex --------------------
one = Double(1, 0)
i = Double(0, 1)
assert i*i == -1
cplex = [one, i]
assert is_commutative(cplex)
assert is_associative(cplex)
# ----------- double: quaternions --------------------
zero = Double(Double(0, 0), Double(0, 0))
one = Double(Double(1, 0), Double(0, 0))
i = Double(Double(0, 1), Double(0, 0))
j = Double(Double(0, 0), Double(1, 0))
k = Double(Double(0, 0), Double(0, 1))
for x in [i, j, k]:
assert x*x == -1
for y in [i, j, k]:
if x==y:
continue
assert x*y == -y*x
assert i*j == -j*i
assert i*j*k == -1
quaternions = [one, i, j, k]
assert not is_commutative(quaternions)
assert is_anticommutative(quaternions[1:])
assert is_associative(quaternions)
count, total = test_structure(quaternions[1:])
assert count==3
assert total==6 # GL(2, 2) = S_3
# ----------- double: octonions --------------------
octonions = [
Double(one, zero), Double(zero, one),
Double(i, zero), Double(j, zero), Double(k, zero),
Double(zero, i), Double(zero, j), Double(zero, k)]
imag = octonions[1:]
for i in imag:
assert i*i == -1
assert not is_commutative(octonions)
assert not is_associative(octonions)
assert is_anticommutative(octonions[1:])
assert is_alternative(octonions)
count, total = test_structure(imag)
assert count == 21
assert total == 168 # GL(3, 2)
# ----------- double: sedenions --------------------
one = Double(one, zero)
zero = Double(zero, zero)
sedenions = [Double(one, zero), Double(zero, one)]
for i in octonions[1:]:
sedenions.append(Double(i, zero))
sedenions.append(Double(zero, i))
assert not is_commutative(sedenions)
assert not is_associative(sedenions)
assert is_anticommutative(sedenions[1:])
assert is_alternative(sedenions) # um...
# try some more sedenions here:
items = list(sedenions)
for a in sedenions:
for b in sedenions:
items.append(a+b)
assert not is_alternative(items)
N = len(sedenions)
for idx in range(1, N):
for jdx in range(1, N):
if idx==jdx:
continue
e = sedenions[idx] * sedenions[jdx]
if e in sedenions:
kdx = sedenions.index(e)
#print("e_%d * e_%d = e_%d" % (idx, jdx, kdx))
for idx in range(1, N):
for jdx in range(idx+1, N):
e = sedenions[idx] * sedenions[jdx]
count, total = test_structure(sedenions[1:])
assert count == 21 # does this make any sense?
assert total == 20160 # GL(4, 2)
def test_quaternion():
x = Number(2)
y = Double(2, 0)
assert x==y
# ----------- double: complex --------------------
one = Double(1, 0)
i = Double(0, 1)
assert i*i == -1
cplex = [one, i]
assert is_commutative(cplex)
assert is_associative(cplex)
# ----------- double: quaternions --------------------
zero = Double(Double(0, 0), Double(0, 0))
one = Double(Double(1, 0), Double(0, 0))
i = Double(Double(0, 1), Double(0, 0))
j = Double(Double(0, 0), Double(1, 0))
k = Double(Double(0, 0), Double(0, 1))
for x in [i, j, k]:
assert x*x == -1
for y in [i, j, k]:
if x==y:
continue
assert x*y == -y*x
assert i*j == -j*i
assert i*j*k == -1
quaternions = [one, i, j, k]
assert not is_commutative(quaternions)
assert is_anticommutative(quaternions[1:])
assert is_associative(quaternions)
count, total = test_structure(quaternions[1:])
assert count==3
assert total==6 # GL(2, 2) = S_3
# from element import Q
# from poly import Poly
#
# ring = Q
# r_zero = Poly({}, ring)
# r_one = Poly({():1}, ring)
# x = Poly("x", ring)
# y = Poly("y", ring)
# z = Poly("z", ring)
# u = Poly("u", ring)
# v = Poly("v", ring)
# w = Poly("w", ring)
#
# print(x*i) # argh, this is just not going to work...
#
if __name__ == "__main__":
test()
#test_quaternion()
| 24.013857 | 88 | 0.544816 | 1,449 | 10,398 | 3.731539 | 0.1049 | 0.026817 | 0.017755 | 0.036989 | 0.526355 | 0.459774 | 0.427964 | 0.405585 | 0.405585 | 0.394304 | 0 | 0.016359 | 0.312175 | 10,398 | 432 | 89 | 24.069444 | 0.739653 | 0.096653 | 0 | 0.513514 | 0 | 0 | 0.003639 | 0 | 0 | 0 | 0 | 0 | 0.199324 | 1 | 0.128378 | false | 0 | 0.010135 | 0.054054 | 0.277027 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa121f9aaebadf45573ca3d20d2d3ec4962d7ab9 | 4,250 | py | Python | app/routes/v1/application.py | brionmario/cssi-api | 2bf4673afd5baa81d230b5e7da730b694cb67a09 | [
"MIT"
] | 2 | 2019-07-04T16:57:07.000Z | 2019-07-09T16:21:12.000Z | app/routes/v1/application.py | project-cssi/cssi-api | 2bf4673afd5baa81d230b5e7da730b694cb67a09 | [
"MIT"
] | null | null | null | app/routes/v1/application.py | project-cssi/cssi-api | 2bf4673afd5baa81d230b5e7da730b694cb67a09 | [
"MIT"
] | 3 | 2019-05-31T06:05:15.000Z | 2019-06-27T19:02:54.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# (c) Copyright 2019 CSSI.
# (c) This file is part of the CSSI REST API and is made available under MIT license.
# (c) For more information, see https://github.com/project-cssi/cssi-api/blob/master/LICENSE.md
# (c) Please forward any queries to the given email address. email: opensource@apareciumlabs.com
"""Application routes module
This module contains all the different routes to interact with applications.
Authors:
Brion Mario
"""
import logging
import uuid
import traceback
from flask_cors import cross_origin
from flask import Blueprint, jsonify, request
from app.models import Application, ApplicationType, ApplicationTypeSchema, ApplicationSchema, Genre, GenreSchema
from app import db
logger = logging.getLogger('cssi.api')
application = Blueprint('application', __name__)
application_schema = ApplicationSchema(strict=True)
applications_schema = ApplicationSchema(many=True, strict=True)
application_types_schema = ApplicationTypeSchema(many=True, strict=True)
application_genres_schema = GenreSchema(many=True, strict=True)
@application.route('/', methods=['GET'])
@cross_origin(supports_credentials=True)
def get_application_list():
"""Get a list of all the Applications"""
applications = Application.query.all()
result = applications_schema.dump(applications).data
return jsonify({'status': 'success', 'message': None, 'data': result}), 200
@application.route('/<int:id>', methods=['GET'])
@cross_origin(supports_credentials=True)
def get_application(id):
"""Get info on an Applications when an id is passed in"""
application = Application.query.get(id)
result = application_schema.dump(application).data
return jsonify({'status': 'success', 'message': None, 'data': result}), 200
@application.route('/types', methods=['GET'])
@cross_origin(supports_credentials=True)
def get_application_types():
"""Get all the available application types"""
application_types = ApplicationType.query.all()
result = application_types_schema.dump(application_types).data
return jsonify({'status': 'success', 'message': None, 'data': result}), 200
@application.route('/genres', methods=['GET'])
@cross_origin(supports_credentials=True)
def get_application_genres():
"""Get all the available application genres"""
application_genres = Genre.query.all()
result = application_genres_schema.dump(application_genres).data
return jsonify({'status': 'success', 'message': None, 'data': result}), 200
@application.route('/', methods=['POST'])
@cross_origin(supports_credentials=True)
def create_application():
"""Create a new Application"""
name = request.json['name']
identifier = str(uuid.uuid4().hex)
developer = request.json['developer']
app_type = ApplicationType.query.filter_by(id=request.json['type']).first()  # avoid shadowing the builtin 'type'
description = request.json['description']
genre = Genre.query.filter_by(id=request.json['genre']).first()
# validate application type
if not app_type:
return {'status': 'error', 'message': 'Invalid Application Type'}, 400
# validate genre
if not genre:
return {'status': 'error', 'message': 'Invalid Genre Type'}, 400
new_application = Application(name=name, identifier=identifier,
developer=developer, type=app_type, description=description, genre=genre)
db.session.add(new_application)
db.session.commit()
result = application_schema.dump(new_application).data
return jsonify({'status': 'success', 'message': 'Created new application {}.'.format(name), 'data': result}), 201
@application.after_request
def after_request(response):
"""Logs a debug message on every successful request."""
logger.debug('%s %s %s %s %s', request.remote_addr, request.method,
request.scheme, request.full_path, response.status)
return response
@application.errorhandler(Exception)
def exceptions(e):
"""Logs an error message and stacktrace if a request ends in error."""
tb = traceback.format_exc()
logger.error('%s %s %s %s 5xx INTERNAL SERVER ERROR\n%s', request.remote_addr,
request.method, request.scheme, request.full_path, tb)
return e.status_code
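# Hedged wiring sketch (the Flask app object and URL prefix are assumptions;
# registration normally happens in the project's app factory):
#
#     from flask import Flask
#     app = Flask(__name__)
#     app.register_blueprint(application, url_prefix='/v1/applications')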
| 36.324786 | 117 | 0.721412 | 521 | 4,250 | 5.78119 | 0.318618 | 0.004648 | 0.031541 | 0.049801 | 0.331009 | 0.262284 | 0.232736 | 0.213147 | 0.213147 | 0.213147 | 0 | 0.007743 | 0.149176 | 4,250 | 116 | 118 | 36.637931 | 0.825221 | 0.192941 | 0 | 0.134328 | 0 | 0 | 0.110059 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.104478 | false | 0 | 0.104478 | 0 | 0.343284 | 0.029851 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa136170de9b0bad52c19245436ca17d8a9d7bac | 3,327 | py | Python | pypy/module/sys/version.py | woodrow/pyoac | b5dc59e6a38e7912db47f26fb23ffa4764a3c0e7 | [
"MIT"
] | 1 | 2019-05-27T00:58:46.000Z | 2019-05-27T00:58:46.000Z | pypy/module/sys/version.py | woodrow/pyoac | b5dc59e6a38e7912db47f26fb23ffa4764a3c0e7 | [
"MIT"
] | null | null | null | pypy/module/sys/version.py | woodrow/pyoac | b5dc59e6a38e7912db47f26fb23ffa4764a3c0e7 | [
"MIT"
] | null | null | null | """
Version numbers exposed by PyPy through the 'sys' module.
"""
import os
CPYTHON_VERSION = (2, 5, 2, "beta", 42)
CPYTHON_API_VERSION = 1012
# release 1.1.0
PYPY_VERSION = (1, 1, 0, "beta", '?')
# the last item is replaced by the svn revision ^^^
TRIM_URL_UP_TO = 'svn/pypy/'
SVN_URL = "$HeadURL: http://codespeak.net/svn/pypy/dist/pypy/module/sys/version.py $"[10:-28]
REV = "$LastChangedRevision: 64770 $"[22:-2]
import pypy
pypydir = os.path.dirname(os.path.abspath(pypy.__file__))
del pypy
import time as t
gmtime = t.gmtime()
date = t.strftime("%b %d %Y", gmtime)
time = t.strftime("%H:%M:%S", gmtime)
del t
# ____________________________________________________________
def get_api_version(space):
return space.wrap(CPYTHON_API_VERSION)
def get_version_info(space):
return space.wrap(CPYTHON_VERSION)
def get_version(space):
return space.wrap("%d.%d.%d (%d, %s, %s)\n[PyPy %d.%d.%d]" % (
CPYTHON_VERSION[0],
CPYTHON_VERSION[1],
CPYTHON_VERSION[2],
svn_revision(),
date,
time,
PYPY_VERSION[0],
PYPY_VERSION[1],
PYPY_VERSION[2]))
def get_hexversion(space):
return space.wrap(tuple2hex(CPYTHON_VERSION))
def get_pypy_version_info(space):
ver = PYPY_VERSION
ver = ver[:-1] + (svn_revision(),)
return space.wrap(ver)
def get_svn_url(space):
return space.wrap((SVN_URL, svn_revision()))
def get_subversion_info(space):
svnbranch = SVN_URL
if TRIM_URL_UP_TO in svnbranch:
svnbranch = svnbranch.split(TRIM_URL_UP_TO, 1)[1]
svnbranch = svnbranch.strip('/')
return space.newtuple([space.wrap('PyPy'),
space.wrap(svnbranch),
space.wrap(str(svn_revision()))])
def tuple2hex(ver):
d = {'alpha': 0xA,
'beta': 0xB,
'candidate': 0xC,
'final': 0xF,
}
subver = ver[4]
if not (0 <= subver <= 9):
subver = 0
return (ver[0] << 24 |
ver[1] << 16 |
ver[2] << 8 |
d[ver[3]] << 4 |
subver)
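# Worked example: tuple2hex((2, 5, 2, 'beta', 42)) packs 2<<24 | 5<<16 | 2<<8
# | 0xB<<4, and drops the serial 42 (outside 0..9), giving 0x020502B0.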
def svn_revision():
"Return the last-changed svn revision number."
# NB. we hack the number directly out of the .svn directory to avoid
# to depend on an external 'svn' executable in the path.
rev = int(REV)
try:
f = open(os.path.join(pypydir, '.svn', 'format'), 'r')
format = int(f.readline().strip())
f.close()
if format <= 6: # Old XML-format
f = open(os.path.join(pypydir, '.svn', 'entries'), 'r')
for line in f:
line = line.strip()
if line.startswith('committed-rev="') and line.endswith('"'):
rev = int(line[15:-1])
break
f.close()
else: # New format
f = open(os.path.join(pypydir, '.svn', 'entries'), 'r')
format = int(f.readline().strip())
for entry in f.read().split('\f'):
lines = entry.split('\n')
name, kind, revstr = lines[:3]
if name == '' and kind == 'dir': # The current directory
rev = int(revstr)
break
f.close()
except (IOError, OSError):
pass
return rev
| 28.930435 | 93 | 0.554253 | 427 | 3,327 | 4.063232 | 0.337237 | 0.046686 | 0.051873 | 0.057637 | 0.137176 | 0.086455 | 0.059366 | 0.044957 | 0.044957 | 0.044957 | 0 | 0.026225 | 0.300872 | 3,327 | 114 | 94 | 29.184211 | 0.71969 | 0.119627 | 0 | 0.102273 | 0 | 0.022727 | 0.102737 | 0.007097 | 0 | 0 | 0.004055 | 0 | 0 | 1 | 0.102273 | false | 0.011364 | 0.034091 | 0.056818 | 0.238636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa157705eec6f5775bf633fc7592d0bbc2b635b9 | 1,863 | py | Python | fifo_parser/messageslistener.py | ThibF/G-youmus | db57226ff3d787c03fece32d224ea03493b70618 | [
"MIT"
] | null | null | null | fifo_parser/messageslistener.py | ThibF/G-youmus | db57226ff3d787c03fece32d224ea03493b70618 | [
"MIT"
] | null | null | null | fifo_parser/messageslistener.py | ThibF/G-youmus | db57226ff3d787c03fece32d224ea03493b70618 | [
"MIT"
] | null | null | null | import json
import logging
from enum import Enum
import boto3 as boto3
from config import config
from fifo_parser.facebookmessage import FacebookMessage
from fifo_parser.googlemessage import GoogleMessage
class Source(Enum):
GOOGLE = 1
FACEBOOK = 2
def is_payload_from(payload):
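# SQS payloads can arrive double-encoded, so decode up to twice before inspecting.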
if type(payload) == str:
payload = json.loads(payload)
if type(payload) == str:
payload = json.loads(payload)
try:
sender_id = payload["entry"][0]["messaging"][0]["sender"]["id"]
return Source.FACEBOOK
except KeyError as e:
return Source.GOOGLE
class MessagesListener:
sqs = None
def __init__(self):
logging.info("Service starting")
self.sqs = boto3.resource('sqs', aws_access_key_id=config["access_key"],
aws_secret_access_key=config["secret_access_key"], region_name=config["region_name"],
endpoint_url=config["endpoint_url"])
self.queue = self.sqs.get_queue_by_name(QueueName=config["QueueName"])
logging.info("sqs successfully accessed")
def receive_messages(self):
logging.info("Waiting for message")
logging.getLogger().setLevel(level=logging.ERROR)
for msg in self.queue.receive_messages():
logging.getLogger().setLevel(level=logging.INFO)
logging.info("Received =" + str(msg.body))
msg.delete()
return MessagesListener.wrap_raw_msg(msg)
@staticmethod
def wrap_raw_msg(msg):
source = is_payload_from(msg.body)
if source == Source.FACEBOOK:
msg_wrapper = FacebookMessage(msg.body)
return msg_wrapper
elif source == Source.GOOGLE:
msg_wrapper = GoogleMessage(msg.body)
return msg_wrapper
else:
raise NotImplementedError
| 31.05 | 119 | 0.642512 | 214 | 1,863 | 5.420561 | 0.373832 | 0.047414 | 0.024138 | 0.034483 | 0.175 | 0.073276 | 0.073276 | 0.073276 | 0.073276 | 0 | 0 | 0.005084 | 0.26087 | 1,863 | 59 | 120 | 31.576271 | 0.837328 | 0 | 0 | 0.125 | 0 | 0 | 0.082662 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.145833 | 0 | 0.4375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa15ea424cf4404c5a4c6c262081d709093a5249 | 3,719 | py | Python | src/mp_znorm_np.py | Tyred/BigData | f5d106fac0580082acaa8db958d6d890afdab881 | [
"MIT"
] | 2 | 2021-07-16T07:30:17.000Z | 2021-09-20T10:01:53.000Z | src/mp_znorm_np.py | Tyred/BigData | f5d106fac0580082acaa8db958d6d890afdab881 | [
"MIT"
] | null | null | null | src/mp_znorm_np.py | Tyred/BigData | f5d106fac0580082acaa8db958d6d890afdab881 | [
"MIT"
] | 1 | 2020-12-11T08:56:05.000Z | 2020-12-11T08:56:05.000Z | import numpy as np
import random
import argparse
import sys
import matplotlib.pyplot as plt
_EPS = 1e-14
def mpself(seq, subseq_len):
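"""Self-join matrix profile of an (ndim, seq_len) array using squared z-normalized distances; returns (matrix_profile, index_profile)."""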
# prerequisites
exclusion_zone = int(np.round(subseq_len/2))
ndim = seq.shape[0]
seq_len = seq.shape[1]
matrix_profile_len = seq_len - subseq_len + 1
first_subseq = np.flip(seq[:,0:subseq_len],1)
# windowed cumulative sum of the sequence
seq_cum_sum = np.hstack((np.zeros((ndim,1)), np.cumsum(seq,1)))
seq_cum_sum = seq_cum_sum[:,subseq_len:]-seq_cum_sum[:,0:seq_len - subseq_len + 1]
seq_cum_sum2 = np.hstack((np.zeros((ndim,1)), np.cumsum(np.square(seq),1)))
seq_cum_sum2 = seq_cum_sum2[:,subseq_len:]-seq_cum_sum2[:,0:seq_len - subseq_len + 1]
# mean and standard deviations (necessary for z-norm)
mu_all = seq_cum_sum / subseq_len
# TODO improve this equation
sigma_all = np.sqrt((seq_cum_sum2 + seq_cum_sum*seq_cum_sum/subseq_len - 2 * seq_cum_sum * mu_all) / subseq_len)
# sliding dot product
prods = np.full([ndim,seq_len+subseq_len-1], np.inf)
for i_dim in range(0,ndim):
prods[i_dim,:] = np.convolve(first_subseq[i_dim,:],seq[i_dim,:])
prods = prods[:, subseq_len-1:seq_len] # only the interesting products
prods_inv = np.copy(prods)
# first distance profile
# DP^2 = 2m * {1 - [(QT - m*mu_q*mu_t) / (m*sigma_q*sigma_t)] }
dist_profile = np.sum(2*subseq_len*(1-((prods - subseq_len*mu_all[:,0:1]*mu_all)/(subseq_len*sigma_all[:,0:1]*sigma_all))), axis=0)
dist_profile[0:exclusion_zone] = np.inf
matrix_profile = np.full(matrix_profile_len, np.inf)
matrix_profile[0] = np.min(dist_profile)
mp_index = -np.ones((matrix_profile_len), dtype=int)
mp_index[0] = np.argmin(dist_profile)
# for all the other values of the profile
for i_subseq in range(1,matrix_profile_len):
sub_value = seq[:,i_subseq-1, np.newaxis] * seq[:,0:prods.shape[1]-1]
add_value = seq[:,i_subseq+subseq_len-1, np.newaxis] * seq[:, subseq_len:subseq_len+prods.shape[1]-1]
prods[:,1:] = prods[:,0:prods.shape[1]-1] - sub_value + add_value
prods[:,0] = prods_inv[:,i_subseq]
# dist_profile
dist_profile = np.sum(2*subseq_len*(1-((prods - subseq_len*mu_all[:,i_subseq:i_subseq+1]*mu_all)/(subseq_len*sigma_all[:,i_subseq:i_subseq+1]*sigma_all))), axis=0)
# excluding trivial matches
dist_profile[max(0,i_subseq-exclusion_zone+1):min(matrix_profile_len,i_subseq+exclusion_zone)] = np.inf
matrix_profile[i_subseq] = np.min(dist_profile)
mp_index[i_subseq] = np.argmin(dist_profile)
return matrix_profile, mp_index
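# Hedged sanity check (synthetic data, not part of the original script):
#
#     rng = np.random.default_rng(0)
#     toy = rng.standard_normal((1, 1000))
#     mp_toy, mpi_toy = mpself(toy, 64)
#     assert mp_toy.shape == (1000 - 64 + 1,)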
def parser_args(cmd_args):
parser = argparse.ArgumentParser(sys.argv[0], description="", formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument("-d", "--dataset", type=str, action="store", default="PigCVP", help="Dataset for evaluation")
return parser.parse_args(cmd_args)
# obtaining arguments from command line
args = parser_args(sys.argv[1:])
dataset = args.dataset
coded_data = np.genfromtxt('../data/matrix_profile/' + dataset + '/' + dataset + '_coded.txt', delimiter=" ")
print(coded_data.shape)
coded_data = coded_data.flatten()
coded_data.shape = 1, coded_data.shape[0]
print(coded_data.shape)
mp, mpi = mpself(coded_data, 128)
print("Motif", np.min(mp))
print("Motif index", np.argmin(mp))
raw_data = np.genfromtxt('../data/matrix_profile/' + dataset + '/' + dataset +'_test.txt', delimiter=" ")
print(raw_data.shape)
raw_data = raw_data.flatten()  # mirror the coded-data path so the reshape below also works for 2-D input
raw_data.shape = 1, raw_data.shape[0]
print(raw_data.shape)
mp, mpi = mpself(raw_data, 1024)
print("Motif", np.min(mp))
print("Motif Index:", np.argmin(mp)) | 36.821782 | 171 | 0.684862 | 593 | 3,719 | 4.042159 | 0.237774 | 0.082603 | 0.037547 | 0.025031 | 0.319566 | 0.249061 | 0.177722 | 0.158532 | 0.073425 | 0.073425 | 0 | 0.023166 | 0.164291 | 3,719 | 101 | 172 | 36.821782 | 0.74807 | 0.103254 | 0 | 0.1 | 0 | 0 | 0.043936 | 0.013843 | 0 | 0 | 0 | 0.009901 | 0 | 1 | 0.033333 | false | 0 | 0.083333 | 0 | 0.15 | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa199e1014a75c0283c21a4e8e65fc34ba1e71af | 4,652 | py | Python | src/ambiance/_doc.py | bws428/ambiance | 8cbc5fe38f34e1ce8ccf568d0961ad6573f7b612 | [
"Apache-2.0"
] | 18 | 2020-03-06T14:54:29.000Z | 2022-03-21T20:20:42.000Z | src/ambiance/_doc.py | bws428/ambiance | 8cbc5fe38f34e1ce8ccf568d0961ad6573f7b612 | [
"Apache-2.0"
] | 7 | 2020-04-19T15:21:54.000Z | 2022-03-05T14:27:38.000Z | src/ambiance/_doc.py | bws428/ambiance | 8cbc5fe38f34e1ce8ccf568d0961ad6573f7b612 | [
"Apache-2.0"
] | 7 | 2019-12-30T16:22:24.000Z | 2021-09-08T07:36:23.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
List of properties for documentation purposes
"""
class V:
def __init__(self, *, symb='', name='', unit=''):
self.symb = symb
self.name = name
self.unit = unit
class P:
def __init__(self, name, unit='', name_long='', *, log=False, symb='', eq=''):
self.name = name
self.unit = unit
self.name_long = name_long if name_long else self.name.replace('_', ' ').capitalize()
self.log = log
self.symb = symb
self.eq = eq
vars_const = (
V(
symb='g_0',
name='Standard gravitational acceleration',
unit='m/s²',
),
V(
symb='M_0',
name='Sea level mean molar mass',
unit='kg/mol',
),
V(
symb='N_A',
name='Avogadro constant',
unit='mol⁻¹',
),
V(
symb='P_0',
name='Sea level atmospheric pressure',
unit='Pa',
),
V(
symb='R^{*}',
name='Universal gas constant',
unit='J/(K·mol)',
),
V(
symb='R',
name='Specific gas constant',
unit='J/(K·kg)',
),
V(
symb='S',
name='Sutherland\'s empirical constant in the equation for dynamic viscosity',
unit='K',
),
V(
symb='T_i',
name='Temperature of the ice point at mean sea level',
unit='K',
),
V(
symb='T_0',
name='Sea level temperature',
unit='K',
),
V(
symb='t_i',
name='Celsius temperature of the ice point at mean sea level',
unit='°C',
),
V(
symb='t_0',
name='Celsius sea level temperature',
unit='°C',
),
V(
symb='\\beta_s',
name='Sutherland\'s empirical constant in the equation for dynamic viscosity',
unit='kg/(m·s·K^(1/2))',
),
V(
symb='\\kappa',
name='Adiabatic index',
unit='1',
),
V(
symb='\\rho_0',
name='Sea level atmospheric density',
unit='kg/m³',
),
V(
symb='\\sigma',
name='Effective collision diameter of an air molecule',
unit='m',
),
V(
symb='r',
name='Nominal Earth\'s radius',
unit='m',
),
)
props = (
P(
'collision_frequency',
'Hz',
log=True,
symb='\\omega',
eq='\\omega = 4 \\sigma^2 N_A \\left( \\frac{\\pi}{R^{*} M_0} \\right)^{1/2} \\frac{p}{\\sqrt{T}}',
),
P(
'density',
'kg/m³',
log=True,
symb='\\rho',
eq='\\rho = \\frac{p}{R T}'
),
P(
'dynamic_viscosity',
'Pa·s',
symb='\\mu',
eq='\\mu = \\frac{\\beta_s T^{3/2}}{T + S}',
),
P(
'grav_accel',
'm/s²',
'Gravitational acceleration',
symb='g',
eq='g = g_0 \\left( \\frac{r}{r + h} \\right)^2'
),
P(
'kinematic_viscosity',
'm²/s',
log=True,
symb='\\nu',
eq='\\nu = \\frac{\\mu}{\\rho}',
),
P(
'mean_free_path',
'm',
log=True,
symb='l',
eq='l = \\frac{1}{\\sqrt{2} \\pi \\sigma^2 n}',
),
P(
'mean_particle_speed',
'm/s',
symb='\\bar{\\nu}',
eq='\\bar{\\nu} = \\left( \\frac{8}{\\pi} R T \\right)^{1/2}',
),
P(
'number_density',
'm⁻³',
log=True,
symb='n',
eq='n = \\frac{N_A p}{R^{*} T}',
),
P(
'pressure',
'Pa',
log=True,
symb='p',
eq=(
'p = p_b \\exp \\left[ - \\frac{g_0}{R T} (H - H_b) \\right] \\quad \\text{for} \\quad \\beta = 0',
'p = p_b \\left[ 1 + \\frac{\\beta}{T_b} (H - H_b) \\right]^{-g_0 \\beta / R} \\quad \\text{for} \\quad \\beta \\neq 0',
),
),
P(
'pressure_scale_height',
'm',
symb='H_p',
eq='H_p = \\frac{R T}{g}',
),
P(
'specific_weight',
'N/m³',
log=True,
symb='\\gamma',
eq='\\gamma = \\rho g',
),
P(
'speed_of_sound',
'm/s',
symb='a',
eq='a = \\sqrt{\\kappa R T}',
),
P(
'temperature',
'K',
symb='T',
eq='T = T_b + \\beta (H - H_b)',
),
P(
'temperature_in_celsius',
'°C',
'Temperature (Celsius)',
symb='t',
eq='t = T - T_i',
),
P(
'thermal_conductivity',
'W/(m·K)',
symb='\\lambda',
eq='\\lambda = \\frac{2.648151 \\cdot 10^{-3} T^{3/2}}{T + (245.4 \\cdot 10^{-12/T})}'
),
)
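# Illustrative only (not part of the original module): the tuples above are
# presumably iterated by the docs build; a minimal rendering pass might look
# like
#
#     for v in vars_const:
#         print(f':math:`{v.symb}` -- {v.name} [{v.unit}]')
#     for p in props:
#         print(f':math:`{p.symb}` -- {p.name_long} [{p.unit}]')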
| 21.637209 | 130 | 0.421109 | 584 | 4,652 | 3.273973 | 0.243151 | 0.041841 | 0.040272 | 0.027197 | 0.247908 | 0.174686 | 0.130753 | 0.116109 | 0.116109 | 0.116109 | 0 | 0.020205 | 0.372313 | 4,652 | 214 | 131 | 21.738318 | 0.630822 | 0.019132 | 0 | 0.447236 | 0 | 0 | 0.366711 | 0.009442 | 0 | 0 | 0 | 0 | 0 | 1 | 0.01005 | false | 0 | 0 | 0 | 0.020101 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
aa19b00d12f617259c72936b6cb06e9a2293073b | 9,830 | py | Python | fancyscraper.py | meltedlilacs/Banner-Scraper | a79f185f93bdda61bfbf6cd7e7e97429a36d8b82 | [
"MIT"
] | null | null | null | fancyscraper.py | meltedlilacs/Banner-Scraper | a79f185f93bdda61bfbf6cd7e7e97429a36d8b82 | [
"MIT"
] | null | null | null | fancyscraper.py | meltedlilacs/Banner-Scraper | a79f185f93bdda61bfbf6cd7e7e97429a36d8b82 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import wx
import wx.adv
import scrapy # web scraping
from scrapy import signals
from scrapy.crawler import CrawlerProcess
import json  # dumping and loading variables
import os  # portable path joining
import re  # regex
from pandas import DataFrame # excel
from pandas import ExcelWriter
from bs4 import BeautifulSoup
# Results shared between the Scrapy spiders and the GUI (set once per run)
df1 = False
df2 = False
prof = False
# Parses text pasted from Banner: student names (which appear near 9-digit ID
# numbers) plus the course's term, CRN, subject, number and section.
class BannerParser:
raw = ''
students_list = []
students_stripped = []
subject = ''
number = ''
term = ''
crn = ''
section = ''
def __init__(self, text):
self.raw = text
        # Student names sit on the line following each 9-digit ID number
        self.students_list = re.findall(r'\d{9}\n(.+?(?=,).+)', self.raw)
        self.students_stripped = list(map(self.FormatName, self.students_list))
        # The "Term ..." header line carries the course identifiers
        info_line = re.search(r'Term.+\n', self.raw).group(0).split(' ')
self.subject = info_line[5]
self.number = info_line[6]
self.term = info_line[1]
self.crn = info_line[4]
self.section = info_line[7]
print([self.subject, self.number, self.term, self.crn, self.section])
def FormatName(self, name):
name = name.split(',')
name.reverse()
name = ' '.join(name)
name = name.strip()
        name = re.sub(r'\(.+\)', '', name)  # drop parenthesised nicknames
        name = re.sub(r'\s+', ' ', name)    # collapse runs of whitespace
        name = re.sub(r'\.', '', name)      # drop periods from initials
return name
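    # Illustrative only (hypothetical input): FormatName('Doe, Jane A (Janie)')
    # reverses around the comma, drops the parenthesised nickname, collapses
    # whitespace and returns 'Jane A Doe'.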
# Looks up each student in the public UVM directory API, recording
# name/email/department-year, or an error when no unique match is found.
class StudentSpider(scrapy.Spider):
name = 'uvm'
start_urls = []
names = []
emails = []
depyears = []
error_names = []
error_errors = []
name_from_url = {}
def parse(self, response):
        # The directory API returns LDAP-style records (cn / mail / ou)
        responses = response.json()['data']
        people = list(map(lambda x: {'name': x['cn']['0'], 'email': x['mail']['0'], 'depyears': x['ou']['0']}, responses))
        # Dicts are unhashable, so dedupe by serialising each record with a
        # stable key order into a set, then decoding back
        pset = set()
        for p in people:
            pset.add(json.dumps(p, sort_keys=True))
        people = [json.loads(p) for p in pset]
        if len(people) == 0:
self.error_names.append(self.name_from_url[response.url])
self.error_errors.append('email not found')
        elif len(people) > 1:
self.error_names.append(self.name_from_url[response.url])
self.error_errors.append('too many matches for name')
else:
self.names.append(self.name_from_url[response.url])
self.emails.append(people[0]['email'])
self.depyears.append(people[0]['depyears'])
    # Scrapy calls closed() when the spider finishes; stash the results in the
    # module-level DataFrames that the GUI reads after process.start() returns
    def closed(self, reason):
global df1
df1 = DataFrame({'Name': self.names, 'Email': self.emails, 'Department/Year': self.depyears})
df1.sort_values(by=['Name'], inplace=True)
global df2
df2 = DataFrame({'Name': self.error_names, 'Error': self.error_errors})
df2.sort_values(by=['Name'], inplace=True)
# Scrapes the UVM course directory search page for the instructor's surname
# in the table row matching the course's CRN.
class ProfSpider(scrapy.Spider):
name = 'prof'
crn = ''
start_urls = []
def parse(self, response):
print(self.start_urls)
print(self.crn)
print(response)
        # response.body is bytes (str() would yield its b'...' repr);
        # use response.text and name a parser explicitly
        soup = BeautifulSoup(response.text, 'html.parser')
table = soup.find('table', id='sections')
for row in table.find_all('tr'):
if self.crn in str(row):
                global prof
                # keep only the instructor's surname: the last whitespace token of the cell text
                prof = row.select_one('td[data-label="Instructor"]').get_text().rstrip('\\t').split()[-1]
print(prof)
# wxPython main window: a multiline text box for pasted Banner output plus
# Scrape / Reset / Info buttons.
class GuiManager(wx.Frame):
def __init__(self, parent, title):
super(GuiManager, self).__init__(parent, title=title)
self.InitUI()
self.Centre()
def InitUI(self):
font = wx.Font(9, wx.FONTFAMILY_DEFAULT, wx.FONTSTYLE_NORMAL, wx.FONTWEIGHT_NORMAL)
font_title = wx.Font(12, wx.FONTFAMILY_DEFAULT, wx.FONTSTYLE_NORMAL, wx.FONTWEIGHT_BOLD)
panel = wx.Panel(self)
sizer = wx.GridBagSizer(5, 4)
title = wx.StaticText(panel, label="Banner Scraper by Lilac Damon")
title.SetFont(font_title)
sizer.Add(title, pos=(0, 0), span=(1, 10), flag=wx.TOP|wx.LEFT|wx.BOTTOM, border=5)
text = wx.StaticText(panel, label="Paste text from Banner")
text.SetFont(font)
sizer.Add(text, pos=(1, 0), flag=wx.TOP|wx.LEFT|wx.BOTTOM, border=5)
staticIcon = wx.StaticBitmap(panel, id=wx.ID_INFO, bitmap=wx.ArtProvider.GetBitmap(wx.ART_QUESTION, size=(16, 16)),
size=(24, 24), style=0, name='')
staticIcon.SetToolTip('Ctrl+A on Banner then Ctrl+V here. If there are multiple pages on Banner, paste them sequentially here.')
sizer.Add(staticIcon, pos=(1, 1), border=0, flag=wx.CENTER)
tc = wx.TextCtrl(panel, style=wx.TE_MULTILINE)
sizer.Add(tc, pos=(2, 0), span=(5, 5),
flag=wx.EXPAND|wx.LEFT|wx.RIGHT, border=5)
self.tc = tc
buttonSave = wx.Button(panel, wx.ID_SAVE, label="Scrape", size=(90, 28))
buttonClose = wx.Button(panel, wx.ID_CLEAR, label="Reset", size=(90, 28))
buttonInfo = wx.Button(panel, wx.ID_INFO, label="Info and Privacy", size=(110, 28))
sizer.Add(buttonInfo, pos=(7, 0), flag=wx.LEFT|wx.BOTTOM, border=5)
sizer.Add(buttonSave, pos=(7, 3))
sizer.Add(buttonClose, pos=(7, 4), flag=wx.RIGHT|wx.BOTTOM, border=5)
sizer.AddGrowableCol(1)
sizer.AddGrowableRow(2)
panel.SetSizer(sizer)
self.Bind(wx.EVT_BUTTON, self.OnInfo, id=wx.ID_INFO)
self.Bind(wx.EVT_BUTTON, self.OnSaveAs, id=wx.ID_SAVE)
self.Bind(wx.EVT_BUTTON, self.OnClear, id=wx.ID_CLEAR)
def Scrape(self, banner_parser):
base_url = 'https://www.uvm.edu/directory/api/query_results.php?name='
urls = list(map(lambda x: base_url + x.replace(" ", "%20"), banner_parser.students_stripped))
name_from_url = dict(zip(urls, banner_parser.students_list))
process = CrawlerProcess(settings={})
process.crawl(StudentSpider, start_urls=urls, name_from_url=name_from_url)
        prof_url = 'https://www.uvm.edu/coursedirectory/search.php?subject=' + banner_parser.subject + \
                   '&number=' + banner_parser.number + \
                   '&term=' + banner_parser.term + '&section'
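        # e.g. (hypothetical values):
        #   .../search.php?subject=CS&number=1210&term=202109&section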
print(prof_url)
process.crawl(ProfSpider, start_urls=[prof_url], crn=banner_parser.crn)
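        # Note: start() blocks until both spiders finish, and Scrapy's Twisted
        # reactor cannot be restarted, so Scrape can run only once per process
        # lifetime.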
process.start()
def OnSaveAs(self, event):
banner_parser = BannerParser(self.tc.GetValue())
self.Scrape(banner_parser)
print(df1)
print(df2)
print(prof)
with wx.DirDialog(None, "Choose save directory", "",
wx.DD_DEFAULT_STYLE | wx.DD_DIR_MUST_EXIST) as dirDialog:
if dirDialog.ShowModal() == wx.ID_CANCEL:
return # the user changed their mind
            # build the output filename and save results/errors as an Excel workbook
            pathname = os.path.join(dirDialog.GetPath(),
                                    banner_parser.subject + ' ' + banner_parser.number + ' ' +
                                    banner_parser.section + ' (' + prof + ').xlsx')
print(pathname)
try:
with ExcelWriter(pathname) as writer:
df1.to_excel(writer, sheet_name="Results", index=False)
df2.to_excel(writer, sheet_name="Errors", index=False)
except IOError:
wx.LogError("Cannot save current data in file '%s'." % pathname)
def OnClear(self, event):
self.tc.Clear()
def OnInfo(self, event):
description = """Created by Lilac Damon http://www.meltedlilacs.com
This program uses pasted info from Banner in order to compile a list of student emails and other details.
The source code is freely available at the URL below. The repository is unlisted but publicly accessible.
If you have privacy concerns, please note:
1) This program never directly connects to Banner. It therefore has no way to access or manipulate privileged information.
2) This program only connects to public websites (UVM directories).
3) The source code does not reveal any information about Banner other than basic information
about where items are located on the page (ex: student names appear near 9-digit numbers).
4) All of these claims may be verified by reading the source code, which is available at the URL below."""
licence = """Copyright 2021 Lilac Damon
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."""
info = wx.adv.AboutDialogInfo()
info.SetName('Banner Scraper')
info.SetVersion('1.0')
info.SetDescription(description)
info.SetWebSite('[url]')
info.SetLicence(licence)
wx.adv.AboutBox(info)
def main():
app = wx.App()
ex = GuiManager(None, title='Banner Scraper')
ex.Show()
app.MainLoop()
if __name__ == '__main__':
main() | 40.619835 | 463 | 0.633672 | 1,308 | 9,830 | 4.674312 | 0.334862 | 0.023553 | 0.012594 | 0.009814 | 0.12545 | 0.077691 | 0.054138 | 0.054138 | 0.038436 | 0.021917 | 0 | 0.013281 | 0.241709 | 9,830 | 242 | 464 | 40.619835 | 0.806949 | 0.032553 | 0 | 0.053476 | 0 | 0.032086 | 0.264659 | 0.002842 | 0 | 0 | 0 | 0 | 0 | 1 | 0.064171 | false | 0 | 0.053476 | 0 | 0.251337 | 0.053476 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |