hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
437021d671825e959375a0374106a655349dffb0 | 7,803 | py | Python | PassWord.py | IQUBE-X/passGenerator | a56a5928c1e8ee503d2757ecf0ab4108a52ec677 | [
"MIT"
] | 1 | 2020-07-11T07:59:54.000Z | 2020-07-11T07:59:54.000Z | PassWord.py | dhruvaS-hub/passGenerator | a56a5928c1e8ee503d2757ecf0ab4108a52ec677 | [
"MIT"
] | null | null | null | PassWord.py | dhruvaS-hub/passGenerator | a56a5928c1e8ee503d2757ecf0ab4108a52ec677 | [
"MIT"
] | 1 | 2021-06-02T10:11:19.000Z | 2021-06-02T10:11:19.000Z | # PassWord - The Safe Password Generator App!
# importing the tkinter module for GUI
from tkinter import *
# importing the message box widget from tkinter
from tkinter import messagebox
# importing sqlite3 for database
import sqlite3
# importing random for password generation
import random
# creating fonts
font = ('Fixedsys', 10)
font2 = ('Comic Sans MS', 9)
font3 = ('System', 9)
font4 = ('Tw Cen MT', 9)
# creating a database and establishing a connection
conn = sqlite3.connect('password.db')
# creating a cursor to navigate through database
c = conn.cursor()
# creating the table
'''
c.execute("""CREATE TABLE passwords (
password text
)""")
'''
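# a safer one-time setup (sketch, not part of the original app): creating the table
# only if it is missing lets the script run repeatedly without raising "table already exists"
# c.execute("CREATE TABLE IF NOT EXISTS passwords (password text)")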
# defining the root variable
root = Tk()
# Naming the app
root.title('PassWord')
# creating a label frame to organize content
label_frame = LabelFrame(root, padx=10, pady=10, text='Password Generator', font=font)
# printing the label frame onto the screen or window
label_frame.grid(row=0, column=0, columnspan=1, padx=10, pady=10, sticky=E + W)
# creating a separate label frame to perform delete functions
delete_labelframe = LabelFrame(root, text='Delete Password', padx=10, pady=10, font=font4)
# printing delete labelframe onto the screen
delete_labelframe.grid(row=5, column=0, columnspan=1, padx=10, pady=10, sticky=E + W)
# making the text box where password is going to be displayed
e = Entry(label_frame, fg='black', bg='white')
# printing the text box to the screen
e.grid(row=0, column=0, padx=10, pady=10, columnspan=1)
# (for the delete function) to give information on input for delete function
info = Label(delete_labelframe, text='Password ID', fg='black', font=font2)
# printing the label onto the screen
info.grid(row=6, column=0, pady=10)
# making the entry for user to input which password
e2 = Entry(delete_labelframe, fg='black', bg='white')
# printing the entry onto the screen
e2.grid(row=6, column=1, pady=10)
# making the password generate function
def generate():
# creating lists
lowercase_letters = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's',
't',
'u', 'v', 'w', 'x', 'y', 'z']
# creating lists
uppercase_letters = ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S',
'T', 'U', 'V', 'W', 'X', 'Y', 'Z']
# creating lists
symbols_list = ['-', '@', '!', '$', '%', '&', '?', '#', '^']
# creating lists
numbers_list = ['1', '2', '3', '4', '5', '6', '7', '8', '9', '0']
# generating a random value from the lists
lowercase_letter = random.choice(lowercase_letters)
# generating a random value from the lists
lowercase_letter2 = random.choice(lowercase_letters)
# generating a random value from the lists
uppercase_letter = random.choice(uppercase_letters)
# generating a random value from the lists
uppercase2_letter = random.choice(uppercase_letters)
# generating a random value from the lists
symbol = random.choice(symbols_list)
# generating a random value from the lists
symbol2 = random.choice(symbols_list)
# generating a random value from the lists
number = random.choice(numbers_list)
# generating a random value from the lists
number2 = random.choice(numbers_list)
# creating a password list made of random values from previous lists
password = [lowercase_letter, uppercase_letter, uppercase2_letter, lowercase_letter2, symbol, symbol2, number,
number2]
# shuffling password list
password1 = random.sample(password, 8)
# joining the shuffled characters into the final password string
final_password = ''.join(password1)
# deleting previous item from entry
e.delete(0, END)
# inserting the final password
e.insert(0, final_password)
# making a function to save the password into the database
def save_password():
conn = sqlite3.connect('password.db')
c = conn.cursor()
c.execute("INSERT INTO passwords VALUES (?)", (e.get(),))
e.delete(0, END)
conn.commit()
conn.close()
# making a function to show all the saved passwords
def show_password():
global passcode_label
conn = sqlite3.connect('password.db')
c = conn.cursor()
c.execute("SELECT rowid, * FROM passwords")
passcodes = c.fetchall()
print_code = ''
for passcode in passcodes:
print_code += str(passcode[0]) + '.' + ' ' + str(passcode[1]) + '\n'
passcode_label = Text(label_frame, height=15, width=25)
passcode_label.configure(state='normal')
passcode_label.insert(1.0, print_code)
passcode_label.grid(row=5, column=0, padx=10, pady=10)
passcode_label.configure(state='disabled')
conn.commit()
conn.close()
# making a function to hide the saved passwords
def hide_password():
passcode_label.destroy()
# making a function to delete passwords from database
def delete():
conn = sqlite3.connect('password.db')
c = conn.cursor()
c.execute("DELETE from passwords WHERE oid = (?)", (e2.get(),))
e2.delete(0, END)
passcode_label.destroy()
conn.commit()
conn.close()
# making a function to delete all the passwords in the database
def delete_all():
global number_of_passwords
conn = sqlite3.connect('password.db')
c = conn.cursor()
c.execute("SELECT rowid FROM passwords")
number_of_passwords = c.fetchall()
num_of_passwords = len(number_of_passwords)
confirmation = messagebox.askyesno('Delete All Passwords?',
'You have chosen to delete ' + str(
num_of_passwords) + ' passwords. This action cannot be reversed. Do you wish to proceed?')
if confirmation == 1:
c.execute("DELETE FROM passwords")
conn.commit()
conn.close()
# button for generating password
generate_password = Button(label_frame, text='Generate Strong Password', command=generate, font=font2)
# printing the button onto the screen
generate_password.grid(row=1, padx=10, pady=10, column=0)
# button to save password
save = Button(label_frame, text='Save Password', command=save_password, font=font2)
# printing the button onto the screen
save.grid(row=2, padx=10, pady=10, column=0)
# making a button to show all the passwords
show = Button(label_frame, text='Show Passwords', command=show_password, font=font2)
# printing the button onto the screen
show.grid(row=4, padx=10, pady=10, column=0)
# making a button to hide the shown passwords
hide = Button(label_frame, text='Hide Passwords', command=hide_password, font=font2)
# printing the button onto the screen
hide.grid(row=6, column=0, padx=10, pady=10)
# making a button to delete a password
delete = Button(delete_labelframe, text='Delete Password', command=delete, font=font2)
# printing the button onto the screen
delete.grid(row=8, padx=10, pady=10, column=1)
# making a button to delete all the passwords
delete_all = Button(delete_labelframe, text='Delete All', command=delete_all, fg='dark red', width=20, anchor=CENTER,
font=font3)
# printing the button onto the screen
delete_all.grid(row=9, column=1, padx=10, pady=10, ipadx=15)
# committing the changes to the database
conn.commit()
# closing the connection with database
conn.close()
# making the final loop
root.mainloop()
| 32.648536 | 134 | 0.656927 | 1,065 | 7,803 | 4.748357 | 0.213146 | 0.016611 | 0.023729 | 0.028475 | 0.371564 | 0.307099 | 0.282183 | 0.273878 | 0.2193 | 0.191418 | 0 | 0.026251 | 0.22376 | 7,803 | 238 | 135 | 32.785714 | 0.808651 | 0.301679 | 0 | 0.222222 | 0 | 0 | 0.12163 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0.407407 | 0.037037 | 0 | 0.092593 | 0.027778 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
437727aaebd2b60da03893cf1960a1dac044f4b8 | 14,215 | py | Python | train.py | MEfeTiryaki/trpo | e1c7bc25165730afa60d9733555398e078a13e67 | [
"MIT"
] | 2 | 2020-03-26T23:36:41.000Z | 2020-03-27T03:04:27.000Z | train.py | MEfeTiryaki/trpo | e1c7bc25165730afa60d9733555398e078a13e67 | [
"MIT"
] | null | null | null | train.py | MEfeTiryaki/trpo | e1c7bc25165730afa60d9733555398e078a13e67 | [
"MIT"
] | 1 | 2020-03-27T03:04:28.000Z | 2020-03-27T03:04:28.000Z | import argparse
from itertools import count
import signal
import sys
import os
import time
import numpy as np
import gym
import torch
import torch.autograd as autograd
from torch.autograd import Variable
import scipy.optimize
import matplotlib.pyplot as plt
from value import Value
from policy import Policy
from utils import *
from trpo import trpo_step
parser = argparse.ArgumentParser(description='PyTorch actor-critic example')
# Algorithm Parameters
parser.add_argument('--gamma', type=float, default=0.995, metavar='G', help='discount factor (default: 0.995)')
parser.add_argument('--lambda-', type=float, default=0.97, metavar='G', help='generalized advantage estimation (GAE) lambda (default: 0.97)')
# Value Function Learning Parameters
parser.add_argument('--l2-reg', type=float, default=1e-3, metavar='G', help='(NOT USED) l2 regularization for regression (default: 1e-3)')
parser.add_argument('--val-opt-iter', type=int, default=200, metavar='G', help='iteration number for value function learning(default: 200)')
parser.add_argument('--lr', type=float, default=1e-3, metavar='G', help='learning rate for value function (default: 1e-3)')
parser.add_argument('--value-memory', type=int, default=1, metavar='G', help='ratio of past value to be used to batch size (default: 1)')
parser.add_argument('--value-memory-shuffle', action='store_true', help='if not shuffled, the latest memory stays') # TODO: implement
# Policy Optimization parameters
parser.add_argument('--max-kl', type=float, default=1e-2, metavar='G', help='max kl value (default: 1e-2)')
parser.add_argument('--damping', type=float, default=1e-1, metavar='G', help='damping (default: 1e-1)')
parser.add_argument('--fisher-ratio', type=float, default=1, metavar='G', help='ratio of data to calculate fisher vector product (default: 1)')
# Environment parameters
parser.add_argument('--env-name', default="Pendulum-v0", metavar='G', help='name of the environment to run')
parser.add_argument('--seed', type=int, default=543, metavar='N', help='random seed (default: 543)')
# Training length
parser.add_argument('--batch-size', type=int, default=5000, metavar='N', help='number of steps per iteration')
parser.add_argument('--episode-length', type=int, default=1000, metavar='N', help='max step size for one episode')
parser.add_argument('--max-iteration-number', type=int, default=200, metavar='N', help='max policy iteration number')
# Rendering
parser.add_argument('--render', action='store_true', help='render the environment')
# Logging
parser.add_argument('--log-interval', type=int, default=1, metavar='N', help='interval between training status logs (default: 1)')
parser.add_argument('--log', action='store_true', help='log the results at the end')
parser.add_argument('--log-dir', type=str, default=".", metavar='N', help='log directory')
parser.add_argument('--log-prefix', type=str, default="log", metavar='N', help='log file prefix')
# Load
parser.add_argument('--load', action='store_true', help='load models')
parser.add_argument('--save', action='store_true', help='save models')
parser.add_argument('--load-dir', type=str, default=".", metavar='N', help='')
args = parser.parse_args()
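# example invocation (illustrative values only; all flags are defined above):
#   python train.py --env-name Pendulum-v0 --batch-size 5000 --max-iteration-number 200 --log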
env = gym.make(args.env_name)
env.seed(args.seed)
num_inputs = env.observation_space.shape[0]
num_actions = env.action_space.shape[0]
torch.set_printoptions(profile="full")
if args.load:
policy_net = Policy(num_inputs, num_actions,30)
value_net = Value(num_inputs,30)
set_flat_params_to(value_net, loadParameterCsv(args.load_dir+"/ValueNet"))
set_flat_params_to(policy_net, loadParameterCsv(args.load_dir+"/PolicyNet"))
print("Networks are loaded from "+args.load_dir+"/")
else:
policy_net = Policy(num_inputs, num_actions,30)
value_net = Value(num_inputs,30)
def signal_handler(sig, frame):
""" Signal Handler to save the networks when shutting down via ctrl+C
Parameters:
Returns:
"""
if(args.save):
valueParam = get_flat_params_from(value_net)
policyParam = get_flat_params_from(policy_net)
saveParameterCsv(valueParam,args.load_dir+"/ValueNet")
saveParameterCsv(policyParam,args.load_dir+"/PolicyNet")
print("Networks are saved in "+args.load_dir+"/")
print('Closing!!')
env.close()
sys.exit(0)
def prepare_data(batch,valueBatch,previousBatch):
""" Get the batch data and calculate value,return and generalized advantage
Detail: TODO
Parameters:
batch (dict of arrays of numpy) : TODO
valueBatch (dict of arrays of numpy) : TODO
previousBatch (dict of arrays of numpy) : TODO
Returns:
"""
# TODO : more description above
stateList = [ torch.from_numpy(np.concatenate(x,axis=0)) for x in batch["states"]]
actionsList = [torch.from_numpy(np.concatenate(x,axis=0)) for x in batch["actions"]]
for states in stateList:
value = value_net.forward(states)
batch["values"].append(value)
advantagesList = []
returnsList = []
rewardsList = []
for rewards,values,masks in zip(batch["rewards"],batch["values"],batch["mask"]):
returns = torch.Tensor(len(rewards),1)
advantages = torch.Tensor(len(rewards),1)
deltas = torch.Tensor(len(rewards),1)
prev_return = 0
prev_value = 0
prev_advantage = 0
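# generalized advantage estimation (GAE), computed backwards over the episode:
#   delta_t = r_t + gamma * V(s_{t+1}) * mask_t - V(s_t)
#   A_t     = delta_t + gamma * lambda * A_{t+1} * mask_t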
for i in reversed(range(len(rewards))):
returns[i] = rewards[i] + args.gamma * prev_value * masks[i] # TD
# returns[i] = rewards[i] + args.gamma * prev_return * masks[i] # Monte Carlo
deltas[i] = rewards[i] + args.gamma * prev_value * masks[i] - values.data[i]
advantages[i] = deltas[i] + args.gamma * args.lambda_* prev_advantage* masks[i]
prev_return = returns[i, 0]
prev_value = values.data[i, 0]
prev_advantage = advantages[i, 0]
returnsList.append(returns)
advantagesList.append(advantages)
rewardsList.append(torch.Tensor(rewards))
batch["states"] = torch.cat(stateList,0)
batch["actions"] = torch.cat(actionsList,0)
batch["rewards"] = torch.cat(rewardsList,0)
batch["returns"] = torch.cat(returnsList,0)
advantagesList = torch.cat(advantagesList,0)
batch["advantages"] = (advantagesList- advantagesList.mean()) / advantagesList.std()
valueBatch["states"] = torch.cat(( previousBatch["states"],batch["states"]),0)
valueBatch["targets"] = torch.cat((previousBatch["returns"],batch["returns"]),0)
def update_policy(batch):
""" Get advantage , states and action and calls trpo step
Parameters:
batch (dict of arrays of numpy) : TODO (batch is different than prepare_data by structure)
Returns:
"""
advantages = batch["advantages"]
states = batch["states"]
actions = batch["actions"]
trpo_step(policy_net, states,actions,advantages , args.max_kl, args.damping)
def update_value(valueBatch):
""" Get valueBatch and run adam optimizer to learn value function
Parameters:
valueBatch (dict of arrays of numpy) : TODO
Returns:
"""
# shuffle the data
dataSize = valueBatch["targets"].size()[0]
permutation = torch.randperm(dataSize)
input = valueBatch["states"][permutation]
target = valueBatch["targets"][permutation]
iter = args.val_opt_iter
batchSize = int(dataSize/ iter)
loss_fn = torch.nn.MSELoss(reduction='sum')
optimizer = torch.optim.Adam(value_net.parameters(), lr=args.lr)
for t in range(iter):
prediction = value_net(input[t*batchSize:t*batchSize+batchSize])
loss = loss_fn(prediction, target[t*batchSize:t*batchSize+batchSize])
# XXX : Comment out for debug
# if t%100==0:
# print("\t%f"%loss.data)
optimizer.zero_grad()
loss.backward()
optimizer.step()
def save_to_previousBatch(previousBatch,batch):
""" Save previous batch to use in future value optimization
Details: TODO
Parameters:
Returns:
"""
if args.value_memory < 0:
print("Value memory should be greater than or equal to zero")
elif args.value_memory>0:
if previousBatch["returns"].numel() == 0:
previousBatch["states"] = batch["states"]
previousBatch["returns"] = batch["returns"]
else:
previous_size = previousBatch["returns"].size()[0]
size = batch["returns"].size()[0]
if previous_size/size == args.value_memory:
previousBatch["states"] = torch.cat([previousBatch["states"][size:],batch["states"]],0)
previousBatch["returns"] = torch.cat([previousBatch["returns"][size:],batch["returns"]],0)
else:
previousBatch["states"] = torch.cat([previousBatch["states"],batch["states"]],0)
previousBatch["returns"] = torch.cat([previousBatch["returns"],batch["returns"]],0)
if args.value_memory_shuffle:
permutation = torch.randperm(previousBatch["returns"].size()[0])
previousBatch["states"] = previousBatch["states"][permutation]
previousBatch["returns"] = previousBatch["returns"][permutation]
def calculate_loss(reward_sum_mean,reward_sum_std,test_number = 10):
""" Calculate mean cummulative reward for test_nubmer of trials
Parameters:
reward_sum_mean (list): holds the history of the means.
reward_sum_std (list): holds the history of the std.
Returns:
list: new value appended means
list: new value appended stds
"""
rewardSum = []
for i in range(test_number):
state = env.reset()
rewardSum.append(0)
for t in range(args.episode_length):
state, reward, done, _ = env.step(policy_net.get_action(state)[0] )
state = np.transpose(state)
rewardSum[-1] += reward
if done:
break
reward_sum_mean.append(np.array(rewardSum).mean())
reward_sum_std.append(np.array(rewardSum).std())
return reward_sum_mean, reward_sum_std
def log(rewards):
""" Saves mean and std over episodes in log file
Parameters:
Returns:
"""
# TODO : add duration to log
filename = args.log_dir+"/"+ args.log_prefix \
+ "_env_" + args.env_name \
+ "_maxIter_" + str(args.max_iteration_number) \
+ "_batchSize_" + str(args.batch_size) \
+ "_gamma_" + str(args.gamma) \
+ "_lambda_" + str(args.lambda_) \
+ "_lr_" + str(args.lr) \
+ "_valOptIter_" + str(args.val_opt_iter)
if os.path.exists(filename + "_index_0.csv"):
id = 0
file = filename + "_index_" + str(id)
while os.path.exists(file + ".csv"):
id = id +1
file = filename + "_index_" + str(id)
filename = file
else:
filename = filename + "_index_0"
import csv
filename = filename+ ".csv"
pythonVersion = sys.version_info[0]
if pythonVersion == 3:
with open(filename, 'w', newline='') as csvfile:
spamwriter = csv.writer(csvfile, delimiter=' ',
quotechar='|', quoting=csv.QUOTE_MINIMAL)
spamwriter.writerow(rewards)
elif pythonVersion == 2:
with open(filename, 'w', ) as csvfile:
spamwriter = csv.writer(csvfile, delimiter=' ',
quotechar='|', quoting=csv.QUOTE_MINIMAL)
spamwriter.writerow(rewards)
def main():
"""
Parameters:
Returns:
"""
signal.signal(signal.SIGINT, signal_handler)
time_start = time.time()
reward_sum_mean,reward_sum_std = [], []
previousBatch= {"states":torch.Tensor(0) ,
"returns":torch.Tensor(0)}
reward_sum_mean,reward_sum_std = calculate_loss(reward_sum_mean,reward_sum_std)
print("Initial loss \n\tloss | mean : %6.4f / std : %6.4f"%(reward_sum_mean[-1],reward_sum_std[-1]) )
for i_episode in range(args.max_iteration_number):
time_episode_start = time.time()
# reset batches
batch = {"states":[] ,
"actions":[],
"next_states":[] ,
"rewards":[],
"returns":[],
"values":[],
"advantages":[],
"mask":[]}
valueBatch = {"states" :[],
"targets" : []}
num_steps = 0
while num_steps < args.batch_size:
state = env.reset()
reward_sum = 0
states,actions,rewards,next_states,masks = [],[],[],[],[]
steps = 0
for t in range(args.episode_length):
action = policy_net.get_action(state)[0] # agent
next_state, reward, done, info = env.step(action)
next_state = np.transpose(next_state)
mask = 0 if done else 1
masks.append(mask)
states.append(state)
actions.append(action)
next_states.append(next_state)
rewards.append(reward)
state = next_state
reward_sum += reward
steps+=1
if args.render:
env.render()
if done:
break
batch["states"].append(np.expand_dims(states, axis=1) )
batch["actions"].append(actions)
batch["next_states"].append(np.expand_dims(next_states, axis=1))
batch["rewards"].append(rewards)
batch["mask"].append(masks)
num_steps += steps
prepare_data(batch,valueBatch,previousBatch)
update_policy(batch) # update policy before the value function to avoid overfitting
update_value(valueBatch)
save_to_previousBatch(previousBatch,batch)
print("episode %d | total: %.4f "%( i_episode, time.time()-time_episode_start))
reward_sum_mean,reward_sum_std = calculate_loss(reward_sum_mean,reward_sum_std)
print("\tloss | mean : %6.4f / std : %6.4f"%(reward_sum_mean[-1],reward_sum_std[-1]) )
if args.log:
print("Data is logged in "+args.log_dir+"/")
log(reward_sum_mean)
print("Total training duration: %.4f "%(time.time()-time_start))
env.close()
if __name__ == '__main__':
main()
| 38.838798 | 143 | 0.636722 | 1,758 | 14,215 | 5.010808 | 0.191695 | 0.025542 | 0.044386 | 0.014531 | 0.281644 | 0.221137 | 0.181746 | 0.138949 | 0.10353 | 0.071972 | 0 | 0.012913 | 0.226381 | 14,215 | 365 | 144 | 38.945205 | 0.788124 | 0.105241 | 0 | 0.112903 | 0 | 0 | 0.153175 | 0.003519 | 0 | 0 | 0 | 0.027397 | 0 | 1 | 0.032258 | false | 0 | 0.072581 | 0 | 0.108871 | 0.040323 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
437c6a6a6d5abf3db9e497007b852df839401638 | 2,075 | py | Python | setup.py | phaustin/MyST-Parser | 181e921cea2794f10ca612df6bf2a2057b66c372 | [
"MIT"
] | null | null | null | setup.py | phaustin/MyST-Parser | 181e921cea2794f10ca612df6bf2a2057b66c372 | [
"MIT"
] | null | null | null | setup.py | phaustin/MyST-Parser | 181e921cea2794f10ca612df6bf2a2057b66c372 | [
"MIT"
] | null | null | null | """myst-parser package setup."""
from importlib import import_module
from setuptools import find_packages, setup
setup(
name="myst-parser",
version=import_module("myst_parser").__version__,
description=(
"An extended commonmark compliant parser, " "with bridges to docutils & sphinx."
),
long_description=open("README.md").read(),
long_description_content_type="text/markdown",
url="https://github.com/executablebooks/MyST-Parser",
project_urls={"Documentation": "https://myst-parser.readthedocs.io"},
author="Chris Sewell",
author_email="chrisj_sewell@hotmail.com",
license="MIT",
packages=find_packages(),
entry_points={
"console_scripts": ["myst-benchmark = myst_parser.cli.benchmark:main"]
},
classifiers=[
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.3",
"Programming Language :: Python :: 3.4",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Text Processing :: Markup",
"Framework :: Sphinx :: Extension",
],
keywords="markdown lexer parser development docutils sphinx",
python_requires=">=3.6",
install_requires=["markdown-it-py~=0.4.5"],
extras_require={
"sphinx": ["pyyaml", "docutils>=0.15", "sphinx>=2,<3"],
"code_style": ["flake8<3.8.0,>=3.7.0", "black", "pre-commit==1.17.0"],
"testing": [
"coverage",
"pytest>=3.6,<4",
"pytest-cov",
"pytest-regressions",
"beautifulsoup4",
],
"rtd": ["sphinxcontrib-bibtex", "ipython", "sphinx-book-theme", "sphinx_tabs"],
},
zip_safe=True,
)
| 37.727273 | 88 | 0.61012 | 215 | 2,075 | 5.772093 | 0.567442 | 0.107172 | 0.141015 | 0.104754 | 0.043513 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021848 | 0.227952 | 2,075 | 54 | 89 | 38.425926 | 0.752809 | 0.01253 | 0 | 0.039216 | 0 | 0 | 0.554577 | 0.0372 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.058824 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
437d5b7a20ce44c03f3d4a4f70ef524faf474a1a | 554 | py | Python | python/tests/extractor/refmt.py | kho/cdec | d88186af251ecae60974b20395ce75807bfdda35 | [
"BSD-3-Clause-LBNL",
"Apache-2.0"
] | 114 | 2015-01-11T05:41:03.000Z | 2021-08-31T03:47:12.000Z | python/tests/extractor/refmt.py | kho/cdec | d88186af251ecae60974b20395ce75807bfdda35 | [
"BSD-3-Clause-LBNL",
"Apache-2.0"
] | 29 | 2015-01-09T01:00:09.000Z | 2019-09-25T06:04:02.000Z | python/tests/extractor/refmt.py | kho/cdec | d88186af251ecae60974b20395ce75807bfdda35 | [
"BSD-3-Clause-LBNL",
"Apache-2.0"
] | 50 | 2015-02-13T13:48:39.000Z | 2019-08-07T09:45:11.000Z | #!/usr/bin/env python
import collections, sys
lines = []
f = collections.defaultdict(int)
fe = collections.defaultdict(lambda: collections.defaultdict(int))
for line in sys.stdin:
tok = [x.strip() for x in line.split('|||')]
count = int(tok[4])
f[tok[1]] += count
fe[tok[1]][tok[2]] += count
lines.append(tok)
for tok in lines:
feat = 'IsSingletonF={0}.0 IsSingletonFE={1}.0'.format(
0 if f[tok[1]] > 1 else 1,
0 if fe[tok[1]][tok[2]] > 1 else 1)
print(' ||| '.join((tok[0], tok[1], tok[2], feat, tok[3])))
| 26.380952 | 66 | 0.590253 | 90 | 554 | 3.633333 | 0.4 | 0.061162 | 0.06422 | 0.073395 | 0.061162 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047727 | 0.205776 | 554 | 20 | 67 | 27.7 | 0.695455 | 0.036101 | 0 | 0 | 0 | 0 | 0.086304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.066667 | null | null | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
437e1e0973bde8b1e251b37ffc137a684d4dc2b8 | 436 | py | Python | blog/models.py | tomitokko/django-blog-with-astradb | 236aaf625ceb854345b6d6bbdd6d17b81e0e3c4f | [
"Apache-2.0"
] | 3 | 2021-12-13T21:40:32.000Z | 2022-03-28T08:08:36.000Z | blog/models.py | tomitokko/django-blog-with-astradb | 236aaf625ceb854345b6d6bbdd6d17b81e0e3c4f | [
"Apache-2.0"
] | null | null | null | blog/models.py | tomitokko/django-blog-with-astradb | 236aaf625ceb854345b6d6bbdd6d17b81e0e3c4f | [
"Apache-2.0"
] | 1 | 2022-02-11T20:49:08.000Z | 2022-02-11T20:49:08.000Z | from django.db import models
import uuid
from datetime import datetime
from cassandra.cqlengine import columns
from django_cassandra_engine.models import DjangoCassandraModel
# Create your models here.
class PostModel(DjangoCassandraModel):
id = columns.UUID(primary_key=True, default=uuid.uuid4)
title = columns.Text(required=True)
body = columns.Text(required=True)
created_at = columns.DateTime(default=datetime.now) | 36.333333 | 63 | 0.802752 | 56 | 436 | 6.178571 | 0.535714 | 0.057803 | 0.109827 | 0.132948 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002618 | 0.123853 | 436 | 12 | 64 | 36.333333 | 0.903141 | 0.055046 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
43844440dd179ab3f122498113b16b020a8f05b8 | 15,375 | py | Python | xverse/transformer/_woe.py | gb-andreygsouza/XuniVerse | 74f4b9112c32a8f1411ae0c5a6de906f8d2e895a | [
"MIT"
] | null | null | null | xverse/transformer/_woe.py | gb-andreygsouza/XuniVerse | 74f4b9112c32a8f1411ae0c5a6de906f8d2e895a | [
"MIT"
] | null | null | null | xverse/transformer/_woe.py | gb-andreygsouza/XuniVerse | 74f4b9112c32a8f1411ae0c5a6de906f8d2e895a | [
"MIT"
] | null | null | null | import pandas as pd
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
import scipy.stats.stats as stats
import pandas.core.algorithms as algos
#from sklearn.utils.validation import check_is_fitted
from sklearn.utils import check_array
from ..transformer import MonotonicBinning
pd.options.mode.chained_assignment = None
class WOE(BaseEstimator, TransformerMixin):
"""Weight of evidence transformation for categorical variables. For numeric variables,
monotonic operation is provided as default with this package.
Parameters
----------
feature_names: 'all' or list (default='all')
list of features to perform WOE transformation.
- 'all' (default): All categorical features in the dataset will be used
- list of features: ['age', 'income',......]
exclude_features: list (default=None)
list of features to be excluded from WOE transformation.
- Example - ['age', 'income', .......]
woe_prefix: string (default=None)
Variable prefix to be used for the column created by WOE transformer. The default value is set 'None'.
treat_missing: {'separate', 'mode', 'least_frequent'} (default='separate')
This parameter setting is used to handle missing values in the dataset.
'separate' - Missing values are treated as a own group (category)
'mode' - Missing values are combined with the highest frequent item in the dataset
'least_frequent' - Missing values are combined with the least frequent item in the dataset
woe_bins: dict of dicts(default=None)
This feature is added as part of future WOE transformations or scoring. If this value is set,
then WOE values provided for each of the features here will be used for transformation.
Applicable only in the transform method.
Dictionary structure - {'feature_name': float list}
Example - {'education': {'primary': 0.1, 'tertiary': 0.5, 'secondary': 0.7}}
monotonic_binning: bool (default=True)
This parameter is used to perform monotonic binning on numeric variables. If set to False,
numeric variables would be ignored.
mono_feature_names: 'all' or list (default='all')
list of features to perform monotonic binning operation.
- 'all' (default): All features in the dataset will be used
- list of features: ['age', 'income',......]
mono_max_bins: int (default=20)
Maximum number of bins that can be created for any given variable. The final number of bins
created will be less than or equal to this number.
mono_force_bins: int (default=3)
It forces the module to create bins for a variable, when it cannot find monotonic relationship
using "max_bins" option. The final number of bins created will be equal to the number specified.
mono_cardinality_cutoff: int (default=5)
Cutoff to determine if a variable is eligible for monotonic binning operation. Any variable
which has unique levels less than this number will be treated as character variables.
At this point no binning operation will be performed on the variable and it will return the
unique levels as bins for these variable.
mono_prefix: string (default=None)
Variable prefix to be used for the column created by monotonic binning.
mono_custom_binning: dict (default=None)
Using this parameter, the user can perform custom binning on variables. This parameter is also
used to apply previously computed bins for each feature (Score new data).
Dictionary structure - {'feature_name': float list}
Example - {'age': [0., 1., 2., 3.]}
"""
# Initialize the parameters for the function
def __init__(self, feature_names='all', exclude_features=None, woe_prefix=None,
treat_missing='separate', woe_bins=None, monotonic_binning=True,
mono_feature_names='all', mono_max_bins=20, mono_force_bins=3,
mono_cardinality_cutoff=5, mono_prefix=None, mono_custom_binning=None):
self.feature_names = feature_names
self.exclude_features = exclude_features
self.woe_prefix = woe_prefix
self.treat_missing = treat_missing
self.woe_bins = woe_bins #only used for future transformations
#these features below are for monotonic operations on numeric variables.
#It uses MonotonicBinning class from binning package.
self.monotonic_binning = monotonic_binning
self.mono_feature_names = mono_feature_names
self.mono_max_bins = mono_max_bins
self.mono_force_bins = mono_force_bins
self.mono_cardinality_cutoff = mono_cardinality_cutoff
self.mono_prefix = mono_prefix
self.mono_custom_binning = mono_custom_binning #only used for monotonic transformations
# check input data type - Only Pandas Dataframe allowed
def check_datatype(self, X):
if not isinstance(X, pd.DataFrame):
raise ValueError("The input data must be pandas dataframe. But the input provided is " + str(type(X)))
return self
# the fit function for WOE transformer
def fit(self, X, y):
#if the function is used as part of pipeline, then try to unpack tuple values
#produced in the previous step. Added as a part of pipeline feature.
try:
X, y = X
except:
pass
#check datatype of X
self.check_datatype(X)
#The length of X and Y should be equal
if X.shape[0] != y.shape[0]:
raise ValueError("Mismatch in input lengths. Length of X is " + str(X.shape[0]) + " \
but length of y is " + str(y.shape[0]) + ".")
# The label must be binary with values {0,1}
unique = np.unique(y)
if len(unique) != 2:
raise ValueError("The target column y must be binary. But the target contains " + str(len(unique)) + \
" unique value(s).")
#apply monotonic binning operation
if self.monotonic_binning:
self.mono_bin_clf = MonotonicBinning(feature_names=self.mono_feature_names,
max_bins=self.mono_max_bins, force_bins=self.mono_force_bins,
cardinality_cutoff=self.mono_cardinality_cutoff,
prefix=self.mono_prefix, custom_binning=self.mono_custom_binning)
if self.mono_custom_binning:
X = self.mono_bin_clf.transform(X)
self.mono_custom_binning = self.mono_bin_clf.bins
else:
X = self.mono_bin_clf.fit_transform(X, y)
self.mono_custom_binning = self.mono_bin_clf.bins
#identify the variables to transform and assign the bin mapping dictionary
self.woe_bins = {} #bin mapping
if not self.mono_custom_binning:
self.mono_custom_binning= {}
else:
for i in self.mono_custom_binning:
X[i] = X[i].astype('object')
numerical_features = list(X._get_numeric_data().columns)
categorical_features = list(X.columns.difference(numerical_features))
#Identifying the features to perform fit
if self.feature_names == 'all':
self.transform_features = categorical_features
else:
self.transform_features = list(set(self.feature_names))
#Exclude variables provided in the exclusion list
if self.exclude_features:
self.transform_features = list(set(self.transform_features) - set(self.exclude_features))
temp_X = X[self.transform_features] #subset data only on features to fit
temp_X = temp_X.astype('object') #convert categorical columns to object columns
temp_X = self.treat_missing_values(temp_X) #treat missing values function
#apply the WOE train function on dataset
temp_X.apply(lambda x: self.train(x, y), axis=0)
#provide Information value for each variable as a separate dataset
self.iv_df = pd.DataFrame({'Information_Value':self.woe_df.groupby('Variable_Name').Information_Value.max()})
self.iv_df = self.iv_df.reset_index()
self.iv_df = self.iv_df.sort_values('Information_Value', ascending=False)
return self
#treat missing values based on the 'treat_missing' option provided by user
def treat_missing_values(self, X):
"""
treat_missing: {'separate', 'mode', 'least_frequent'} (default='separate')
This parameter setting is used to handle missing values in the dataset.
'separate' - Missing values are treated as a own group (category)
'mode' - Missing values are combined with the highest frequent item in the dataset
'least_frequent' - Missing values are combined with the least frequent item in the dataset
"""
if self.treat_missing == 'separate':
X = X.fillna('NA')
elif self.treat_missing == 'mode':
X = X.fillna(X.mode().iloc[0])
elif self.treat_missing == 'least_frequent':
for i in X:
X[i] = X[i].fillna(X[i].value_counts().index[-1])
else:
raise ValueError("Missing values could be treated with one of these three options - \
'separate', 'mode', 'least_frequent'. \
The provided option is - " + str(self.treat_missing))
return X
#WOE binning - The function is applied on each columns identified in the fit function.
#Here, the input X is a Pandas Series type.
def train(self, X, y):
# Assign values
woe_mapping = {} #dictionary mapping for the current feature
temp_woe = pd.DataFrame({},index=[])
temp_df = pd.DataFrame({'X': X, "Y":y})
grouped_df = temp_df.groupby('X', as_index=True)
#calculate stats for variable and store it in temp_woe
target_sum = grouped_df.Y.sum()
temp_woe['Count'] = grouped_df.Y.count()
temp_woe['Category'] = target_sum.index
temp_woe['Event'] = target_sum
temp_woe['Non_Event'] = temp_woe['Count'] - temp_woe['Event']
temp_woe['Event_Rate'] = temp_woe['Event']/temp_woe['Count']
temp_woe['Non_Event_Rate'] = temp_woe['Non_Event']/temp_woe['Count']
#calculate distributions and woe
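# WOE = ln(P(x | event) / P(x | non_event)); IV = sum over categories of
# (P(x | event) - P(x | non_event)) * WOE, matching the columns computed below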
total_event = temp_woe['Event'].sum()
total_non_event = temp_woe['Non_Event'].sum()
temp_woe['Event_Distribution'] = temp_woe['Event']/total_event
temp_woe['Non_Event_Distribution'] = temp_woe['Non_Event']/total_non_event
temp_woe['WOE'] = np.log(temp_woe['Event_Distribution']/temp_woe['Non_Event_Distribution'])
temp_woe['Information_Value'] = (temp_woe['Event_Distribution']- \
temp_woe['Non_Event_Distribution'])*temp_woe['WOE']
temp_woe['Variable_Name'] = X.name
temp_woe = temp_woe[['Variable_Name', 'Category', 'Count', 'Event', 'Non_Event', \
'Event_Rate', 'Non_Event_Rate', 'Event_Distribution', 'Non_Event_Distribution', \
'WOE', 'Information_Value']]
temp_woe = temp_woe.replace([np.inf, -np.inf], 0)
temp_woe['Information_Value'] = temp_woe['Information_Value'].sum()
temp_woe = temp_woe.reset_index(drop=True)
woe_mapping[str(X.name)] = dict(zip(temp_woe['Category'], temp_woe['WOE']))
#assign computed values to class variables
try:
self.woe_df = self.woe_df.append(temp_woe, ignore_index=True)
self.woe_bins.update(woe_mapping)
except:
self.woe_df = temp_woe
self.woe_bins = woe_mapping
return self
#Transform new data or existing data based on the fit identified or custom transformation provided by user
def transform(self, X, y=None):
#if the function is used as part of pipeline, then try to unpack tuple values
#produced in the previous step. Added as a part of pipeline feature.
try:
X, y = X
except:
pass
self.check_datatype(X) #check input datatype.
outX = X.copy(deep=True)
#identify the features on which the transformation should be performed
try:
if self.transform_features:
transform_features = self.transform_features
except:
if self.woe_bins:
transform_features = list(self.woe_bins.keys())
else:
raise ValueError("Estimator has to be fitted to make WOE transformations")
#final list of features to be transformed
transform_features = list(set(transform_features) & set(outX.columns))
#raise error if the list is empty
if not transform_features:
raise ValueError("Empty list for WOE transformation. \
Estimator has to be fitted to make WOE transformations")
#use the custom bins provided by user for numeric variables
if self.mono_custom_binning:
try:
if self.mono_bin_clf:
pass
except:
self.mono_bin_clf = MonotonicBinning(feature_names=self.mono_feature_names,
max_bins=self.mono_max_bins, force_bins=self.mono_force_bins,
cardinality_cutoff=self.mono_cardinality_cutoff,
prefix=self.mono_prefix, custom_binning=self.mono_custom_binning)
outX = self.mono_bin_clf.transform(outX)
outX = outX.astype('object') #convert categorical columns to object columns
outX = self.treat_missing_values(outX) #treat missing values function
#iterate through the dataframe and apply the bins
for i in transform_features:
tempX = outX[i] #pandas Series
original_column_name = str(i)
#create the column name based on user provided prefix
if self.woe_prefix:
new_column_name = str(self.woe_prefix) + '_' + str(i)
else:
new_column_name = original_column_name
#check if the bin mapping is present
#check_is_fitted(self, 'woe_bins')
if not self.woe_bins:
raise ValueError("woe_bins variable is not present. \
Estimator has to be fitted to apply transformations.")
outX[new_column_name] = tempX.replace(self.woe_bins[original_column_name])
#transformed dataframe
return outX
#Method that describes what we need this transformer to do
def fit_transform(self, X, y):
return self.fit(X, y).transform(X)
| 47.748447 | 117 | 0.625821 | 1,934 | 15,375 | 4.806101 | 0.158738 | 0.030124 | 0.023776 | 0.022593 | 0.308768 | 0.27262 | 0.242711 | 0.223346 | 0.205056 | 0.186767 | 0 | 0.00268 | 0.296195 | 15,375 | 321 | 118 | 47.897196 | 0.856298 | 0.145171 | 0 | 0.227545 | 0 | 0 | 0.090713 | 0.009515 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.017964 | 0.041916 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
438e9f1f07ffd73b9b9fd9f25c52f215537b1381 | 1,358 | py | Python | NumPy/Array Basics/Random Shuffle/tests/test_task.py | jetbrains-academy/Python-Libraries-NumPy | 7ce0f2d08f87502d5d97bbc6921f0566184d4ebb | [
"MIT"
] | null | null | null | NumPy/Array Basics/Random Shuffle/tests/test_task.py | jetbrains-academy/Python-Libraries-NumPy | 7ce0f2d08f87502d5d97bbc6921f0566184d4ebb | [
"MIT"
] | 4 | 2022-01-14T10:40:47.000Z | 2022-02-14T13:01:13.000Z | NumPy/Array Basics/Random Shuffle/tests/test_task.py | jetbrains-academy/Python-Libraries-NumPy | 7ce0f2d08f87502d5d97bbc6921f0566184d4ebb | [
"MIT"
] | null | null | null | import unittest
import numpy as np
from task import arr, permuted_2d, fully_random
class TestCase(unittest.TestCase):
def test_shape(self):
self.assertEqual((5, 20), arr.shape, msg="Wrong shape of the array 'arr'.")
self.assertEqual((5, 20), permuted_2d.shape, msg="Wrong shape of the array 'permuted_2d'.")
self.assertEqual((5, 20), fully_random.shape, msg="Wrong shape of the array 'fully_random'.")
def test_arr(self):
for i in arr:
# This test checks if in each row the minimum element goes first and maximum - last.
self.assertTrue(i[0] == min(i) and i[-1] == max(i), msg="'arr' should be shuffled along the 0th axis.")
def test_two_d(self):
for i in permuted_2d:
# This test checks that differences between all neighboring elements in rows of the array
# are not equal to 1 (in non-shuffled rows they would be).
self.assertFalse(all([(x - i[i.tolist().index(x) - 1]) == 1 for x in i if i.tolist().index(x) > 0]),
msg="'permuted_2d' should be shuffled along the 1st axis.")
def test_random(self):
# This test checks if elements were also randomized between the rows.
for i in fully_random:
self.assertTrue(max(i) - min(i) > 19, "'fully_random' needs to be fully shuffled.")
| 46.827586 | 115 | 0.635493 | 209 | 1,358 | 4.057416 | 0.373206 | 0.058962 | 0.04717 | 0.063679 | 0.15566 | 0.099057 | 0.099057 | 0 | 0 | 0 | 0 | 0.023692 | 0.25405 | 1,358 | 28 | 116 | 48.5 | 0.813425 | 0.217231 | 0 | 0 | 0 | 0 | 0.234405 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.222222 | false | 0 | 0.166667 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4392cd17a2182a5ad123dad587354133d5fbcf62 | 3,471 | py | Python | open/users/serializers.py | lawrendran/open | d136f694bafab647722c78be6f39ec79d589f774 | [
"MIT"
] | 105 | 2019-06-01T08:34:47.000Z | 2022-03-15T11:48:36.000Z | open/users/serializers.py | lawrendran/open | d136f694bafab647722c78be6f39ec79d589f774 | [
"MIT"
] | 111 | 2019-06-04T15:34:14.000Z | 2022-03-12T21:03:20.000Z | open/users/serializers.py | lawrendran/open | d136f694bafab647722c78be6f39ec79d589f774 | [
"MIT"
] | 26 | 2019-09-04T06:06:12.000Z | 2022-01-03T03:40:11.000Z | import pytz
from rest_auth.serializers import TokenSerializer
from rest_framework.authtoken.models import Token
from rest_framework.exceptions import ValidationError
from rest_framework.fields import (
CharField,
CurrentUserDefault,
HiddenField,
UUIDField,
ChoiceField,
)
from rest_framework.serializers import ModelSerializer, Serializer
from rest_framework.validators import UniqueValidator
from django.contrib.auth.hashers import check_password
from open.users.models import User
class SimpleUserReadSerializer(ModelSerializer):
class Meta:
model = User
fields = (
"name",
"uuid",
)
class UserReadSerializer(ModelSerializer):
class Meta:
model = User
fields = (
"name",
"uuid",
"signed_up_from",
"date_joined",
"username",
"email",
"created",
"modified",
)
class UserTokenSerializer(TokenSerializer):
user = UserReadSerializer()
class Meta:
model = Token
fields = ["key", "user"]
# TODO - this view and serializer is on hold as you figure out registration (later)
class UserCreateSerializer(ModelSerializer):
username = CharField(validators=[UniqueValidator(queryset=User.objects.all())])
# need to make email optional ... prob should think through signup form a little
email = CharField(
validators=[UniqueValidator(queryset=User.objects.all())], required=False
)
password = CharField(write_only=True, min_length=8)
signed_up_from = CharField(
write_only=True, min_length=8, required=False, default="", trim_whitespace=True
)
timezone_string = ChoiceField(
choices=pytz.all_timezones, required=False, default="US/Eastern"
)
class Meta:
model = User
fields = ["username", "email", "password", "signed_up_from", "timezone_string"]
# TODO test - does this work with just username / no email, etc.
def create(self, validated_data):
username = validated_data.pop("username")
password = validated_data.pop("password")
is_betterself_user = False
if validated_data["signed_up_from"] == "betterself":
is_betterself_user = True
validated_data["is_betterself_user"] = is_betterself_user
user = User.objects.create(username=username, **validated_data)
user.set_password(password)
user.save()
return user
class UserDeleteSerializer(Serializer):
# most of this is actually redundant, i don't need to have a validation step, but i do this
# out of paranoia reasons that someone may delete their account by mistake
password = CharField()
user = HiddenField(default=CurrentUserDefault())
uuid = UUIDField()
def validate(self, data):
user = data["user"]
validated_password = check_password(data["password"], user.password)
if not validated_password:
raise ValidationError("Invalid Password Entered")
validated_uuid = str(user.uuid) == str(data["uuid"])
if not validated_uuid:
raise ValidationError("Invalid UUID", str(user.uuid))
validate_user = user.username != "demo-testing@senrigan.io"
if not validate_user:
raise ValidationError(
f"This is a protected user and cannot be deleted. {user.username}"
)
return data
| 30.182609 | 95 | 0.661481 | 374 | 3,471 | 6.016043 | 0.40107 | 0.021333 | 0.037778 | 0.024 | 0.130667 | 0.12 | 0.12 | 0.041778 | 0 | 0 | 0 | 0.000769 | 0.25036 | 3,471 | 114 | 96 | 30.447368 | 0.863951 | 0.111207 | 0 | 0.154762 | 0 | 0 | 0.108152 | 0.007795 | 0 | 0 | 0 | 0.008772 | 0 | 1 | 0.02381 | false | 0.107143 | 0.107143 | 0 | 0.369048 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
43956cd7582f0725f3e08ed11af962dc403ba2f7 | 402 | py | Python | archetype/settings/local_stg.py | kingsdigitallab/archetype-django | 6315c8f38e873e2d3b2d99fcfd47d01ce0ae35bc | [
"MIT"
] | 1 | 2018-11-18T22:42:09.000Z | 2018-11-18T22:42:09.000Z | archetype/settings/local_stg.py | kingsdigitallab/archetype-django | 6315c8f38e873e2d3b2d99fcfd47d01ce0ae35bc | [
"MIT"
] | null | null | null | archetype/settings/local_stg.py | kingsdigitallab/archetype-django | 6315c8f38e873e2d3b2d99fcfd47d01ce0ae35bc | [
"MIT"
] | null | null | null | from .base import * # noqa
CACHE_REDIS_DATABASE = '1'
CACHES['default']['LOCATION'] = '127.0.0.1:6379:' + CACHE_REDIS_DATABASE
INTERNAL_IPS = INTERNAL_IPS + ('', )
ALLOWED_HOSTS = ['']
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql_psycopg2',
'NAME': 'app_archetype_stg',
'USER': 'app_archetype',
'PASSWORD': '',
'HOST': ''
},
}
| 22.333333 | 72 | 0.58209 | 42 | 402 | 5.309524 | 0.761905 | 0.089686 | 0.161435 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03871 | 0.228856 | 402 | 17 | 73 | 23.647059 | 0.680645 | 0.00995 | 0 | 0 | 0 | 0 | 0.333333 | 0.09596 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.071429 | 0.071429 | 0 | 0.071429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
439aafbb1c6af8fc6a5c2fcb3a71f36930de52f2 | 605 | py | Python | authors/apps/profiles/renderers.py | MuhweziDeo/Ah-backend-xmen | 60c830977fa39a7eea9ab978a9ba0c3beb0c4d88 | [
"BSD-3-Clause"
] | 4 | 2019-01-07T09:15:17.000Z | 2020-11-09T09:58:54.000Z | authors/apps/profiles/renderers.py | MuhweziDeo/Ah-backend-xmen | 60c830977fa39a7eea9ab978a9ba0c3beb0c4d88 | [
"BSD-3-Clause"
] | 34 | 2019-01-07T15:30:14.000Z | 2019-03-06T08:23:34.000Z | authors/apps/profiles/renderers.py | MuhweziDeo/Ah-backend-xmen | 60c830977fa39a7eea9ab978a9ba0c3beb0c4d88 | [
"BSD-3-Clause"
] | 10 | 2018-12-18T14:43:52.000Z | 2020-02-07T08:27:50.000Z | from authors.apps.utils.renderers import AppJSONRenderer
import json
from rest_framework.renderers import JSONRenderer
class UserProfileJSONRenderer(AppJSONRenderer):
name = 'profile'
class UserProfileListRenderer(JSONRenderer):
"""
Returns profiles of existing users
"""
charset = 'utf-8'
def render(self, data, media_type=None, renderer_context=None):
""" present a list of
user profiles in json format
"""
return json.dumps({
'profiles':data
})
class ReadStatsJsonRenderer(AppJSONRenderer):
name = 'read_stats'
| 23.269231 | 67 | 0.679339 | 61 | 605 | 6.672131 | 0.721311 | 0.07371 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002165 | 0.236364 | 605 | 25 | 68 | 24.2 | 0.878788 | 0.135537 | 0 | 0 | 0 | 0 | 0.061983 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.230769 | 0 | 0.846154 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
439c484fa1d9a64793cf4da644af68eabbc13295 | 13,932 | py | Python | omtk/models/model_avar_surface_lips.py | CDufour909/omtk_unreal | 64ae76a7b0a3f73a4b32d3b330f3174d02c54234 | [
"MIT"
] | null | null | null | omtk/models/model_avar_surface_lips.py | CDufour909/omtk_unreal | 64ae76a7b0a3f73a4b32d3b330f3174d02c54234 | [
"MIT"
] | null | null | null | omtk/models/model_avar_surface_lips.py | CDufour909/omtk_unreal | 64ae76a7b0a3f73a4b32d3b330f3174d02c54234 | [
"MIT"
] | null | null | null | import math
import pymel.core as pymel
from omtk.core.classNode import Node
from omtk.libs import libAttr
from omtk.libs import libRigging
from . import model_avar_surface
class SplitterNode(Node):
"""
A splitter is a node network that takes the parameterV that is normally sent through the follicles and
splits it between two destinations: the follicles and the jaw ref constraint.
The more the jaw is opened, the more we'll transfer to the jaw ref before sending to the follicle.
This is mainly used to ensure that any lip movement created by the jaw is canceled when the
animator tries to correct the lips while the jaw is open. Otherwise, since the jaw space and the
surface space differ, the correction would not cancel out the jaw-driven motion.
To compute the displacement caused by the jaw, we'll use the circumference around the jaw pivot.
This creates an approximation that might be wrong if some translation also occurs in the jaw.
todo: test with corrective jaw translation
"""
def __init__(self):
super(SplitterNode, self).__init__() # useless
self.attr_inn_jaw_pt = None
self.attr_inn_jaw_radius = None
self.attr_inn_surface_v = None
self.attr_inn_surface_range_v = None
self.attr_inn_jaw_default_ratio = None
self.attr_out_surface_v = None
self.attr_out_jaw_ratio = None
def build(self, nomenclature_rig, **kwargs):
super(SplitterNode, self).build(**kwargs)
#
# Create inn and out attributes.
#
grp_splitter_inn = pymel.createNode(
'network',
name=nomenclature_rig.resolve('udSplitterInn')
)
# The jaw opening amount in degree.
self.attr_inn_jaw_pt = libAttr.addAttr(grp_splitter_inn, 'innJawOpen')
# The relative uv coordinates normally sent to the follicles.
# Note that this value is expected to change at the output of the SplitterNode (see outSurfaceU and outSurfaceV)
self.attr_inn_surface_u = libAttr.addAttr(grp_splitter_inn, 'innSurfaceU')
self.attr_inn_surface_v = libAttr.addAttr(grp_splitter_inn, 'innSurfaceV')
        # Use this switch to completely disable the splitter.
self.attr_inn_bypass = libAttr.addAttr(grp_splitter_inn, 'innBypassAmount')
# The arc length in world space of the surface controlling the follicles.
self.attr_inn_surface_range_v = libAttr.addAttr(grp_splitter_inn,
                                                        'innSurfaceRangeV')  # How many degrees does the jaw take to create 1 unit of surface deformation? (ex: 20)
# How much inn percent is the lips following the jaw by default.
# Note that this value is expected to change at the output of the SplitterNode (see attr_out_jaw_ratio)
self.attr_inn_jaw_default_ratio = libAttr.addAttr(grp_splitter_inn, 'jawDefaultRatio')
        # The radius of the influence circle, normally resolved by using the distance between the jaw and the avar as the radius.
self.attr_inn_jaw_radius = libAttr.addAttr(grp_splitter_inn, 'jawRadius')
grp_splitter_out = pymel.createNode(
'network',
name=nomenclature_rig.resolve('udSplitterOut')
)
self.attr_out_surface_u = libAttr.addAttr(grp_splitter_out, 'outSurfaceU')
self.attr_out_surface_v = libAttr.addAttr(grp_splitter_out, 'outSurfaceV')
self.attr_out_jaw_ratio = libAttr.addAttr(grp_splitter_out,
'outJawRatio') # How much percent this influence follow the jaw after cancellation.
#
# Connect inn and out network nodes so they can easily be found from the SplitterNode.
#
attr_inn = libAttr.addAttr(grp_splitter_inn, longName='inn', attributeType='message')
attr_out = libAttr.addAttr(grp_splitter_out, longName='out', attributeType='message')
pymel.connectAttr(self.node.message, attr_inn)
pymel.connectAttr(self.node.message, attr_out)
#
# Create node networks
# Step 1: Get the jaw displacement in uv space (parameterV only).
#
attr_jaw_circumference = libRigging.create_utility_node(
'multiplyDivide',
name=nomenclature_rig.resolve('getJawCircumference'),
input1X=self.attr_inn_jaw_radius,
input2X=(math.pi * 2.0)
).outputX
attr_jaw_open_circle_ratio = libRigging.create_utility_node(
'multiplyDivide',
name=nomenclature_rig.resolve('getJawOpenCircleRatio'),
operation=2, # divide
input1X=self.attr_inn_jaw_pt,
input2X=360.0
).outputX
attr_jaw_active_circumference = libRigging.create_utility_node(
'multiplyDivide',
name=nomenclature_rig.resolve('getJawActiveCircumference'),
input1X=attr_jaw_circumference,
input2X=attr_jaw_open_circle_ratio
).outputX
attr_jaw_v_range = libRigging.create_utility_node(
'multiplyDivide',
name=nomenclature_rig.resolve('getActiveJawRangeInSurfaceSpace'),
operation=2, # divide
input1X=attr_jaw_active_circumference,
input2X=self.attr_inn_surface_range_v
).outputX
#
# Step 2: Resolve the output jaw_ratio
#
# Note that this can throw a zero division warning in Maya.
# To prevent that we'll use some black-magic-ugly-ass-trick.
attr_jaw_ratio_cancelation = libRigging.create_safe_division(
self.attr_inn_surface_v,
attr_jaw_v_range,
nomenclature_rig,
'getJawRatioCancellation'
)
attr_jaw_ratio_out_raw = libRigging.create_utility_node(
'plusMinusAverage',
name=nomenclature_rig.resolve('getJawRatioOutUnlimited'),
            operation=2,  # subtraction
input1D=(
self.attr_inn_jaw_default_ratio,
attr_jaw_ratio_cancelation
)
).output1D
attr_jaw_ratio_out_limited = libRigging.create_utility_node(
'clamp',
name=nomenclature_rig.resolve('getJawRatioOutLimited'),
inputR=attr_jaw_ratio_out_raw,
minR=0.0,
maxR=1.0
).outputR
#
# Step 3: Resolve attr_out_surface_u & attr_out_surface_v
#
attr_inn_jaw_default_ratio_inv = libRigging.create_utility_node(
'reverse',
name=nomenclature_rig.resolve('getJawDefaultRatioInv'),
inputX=self.attr_inn_jaw_default_ratio
).outputX
util_jaw_uv_default_ratio = libRigging.create_utility_node(
'multiplyDivide',
name=nomenclature_rig.resolve('getJawDefaultRatioUvSpace'),
input1X=self.attr_inn_jaw_default_ratio,
input1Y=attr_inn_jaw_default_ratio_inv,
input2X=attr_jaw_v_range,
input2Y=attr_jaw_v_range
)
attr_jaw_uv_default_ratio = util_jaw_uv_default_ratio.outputX
attr_jaw_uv_default_ratio_inv = util_jaw_uv_default_ratio.outputY
attr_jaw_uv_limit_max = libRigging.create_utility_node(
'plusMinusAverage',
name=nomenclature_rig.resolve('getJawSurfaceLimitMax'),
            operation=2,  # subtract
input1D=(attr_jaw_v_range, attr_jaw_uv_default_ratio_inv)
).output1D
attr_jaw_uv_limit_min = libRigging.create_utility_node(
'plusMinusAverage',
name=nomenclature_rig.resolve('getJawSurfaceLimitMin'),
            operation=2,  # subtract
input1D=(attr_jaw_uv_default_ratio, attr_jaw_v_range)
).output1D
attr_jaw_cancel_range = libRigging.create_utility_node(
'clamp',
name=nomenclature_rig.resolve('getJawCancelRange'),
inputR=self.attr_inn_surface_v,
minR=attr_jaw_uv_limit_min,
maxR=attr_jaw_uv_limit_max
).outputR
attr_out_surface_v_cancelled = libRigging.create_utility_node(
'plusMinusAverage',
name=nomenclature_rig.resolve('getCanceledUv'),
            operation=2,  # subtraction
input1D=(self.attr_inn_surface_v, attr_jaw_cancel_range)
).output1D
#
# Connect output attributes
#
attr_inn_bypass_inv = libRigging.create_utility_node(
'reverse',
name=nomenclature_rig.resolve('getBypassInv'),
inputX=self.attr_inn_bypass
).outputX
# Connect output jaw_ratio
attr_output_jaw_ratio = libRigging.create_utility_node(
'blendWeighted',
input=(attr_jaw_ratio_out_limited, self.attr_inn_jaw_default_ratio),
weight=(attr_inn_bypass_inv, self.attr_inn_bypass)
).output
pymel.connectAttr(attr_output_jaw_ratio, self.attr_out_jaw_ratio)
# Connect output surface u
pymel.connectAttr(self.attr_inn_surface_u, self.attr_out_surface_u)
# Connect output surface_v
attr_output_surface_v = libRigging.create_utility_node(
'blendWeighted',
input=(attr_out_surface_v_cancelled, self.attr_inn_surface_v),
weight=(attr_inn_bypass_inv, self.attr_inn_bypass)
).output
pymel.connectAttr(attr_output_surface_v, self.attr_out_surface_v)
class AvarSurfaceLipModel(model_avar_surface.AvarSurfaceModel):
"""
Custom avar model for the complex situation that is the lips.
    This ensures that we are moving according to the jaw before sliding on the surface.
"""
def __init__(self, *args, **kwargs):
super(AvarSurfaceLipModel, self).__init__(*args, **kwargs)
self._attr_inn_jaw_bindpose = None
self._attr_inn_jaw_pitch = None
self._attr_inn_jaw_ratio_default = None
self._attr_inn_bypass_splitter = None
self._attr_out_jaw_ratio = None
def _create_interface(self):
super(AvarSurfaceLipModel, self)._create_interface()
self._attr_inn_jaw_bindpose = libAttr.addAttr(self.grp_rig, 'innJawBindPose', dataType='matrix')
self._attr_inn_jaw_pitch = libAttr.addAttr(self.grp_rig, 'innJawPitch', defaultValue=0)
self._attr_inn_jaw_ratio_default = libAttr.addAttr(self.grp_rig, 'innJawRatioDefault', defaultValue=0)
self._attr_inn_bypass_splitter = libAttr.addAttr(self.grp_rig, 'innBypassSplitter')
self._attr_inn_ud_bypass = libAttr.addAttr(self.grp_rig, 'innBypassUD')
# self._attr_inn_surface_length_u = libAttr.addAttr(self.grp_rig, 'innSurfaceLengthU', defaultValue=0)
# self._attr_inn_surface_length_v = libAttr.addAttr(self.grp_rig, 'innSurfaceLengthV', defaultValue=0)
self._attr_out_jaw_ratio = libAttr.addAttr(self.grp_rig, 'outJawRatio')
def connect_avar(self, avar):
super(AvarSurfaceLipModel, self).connect_avar(avar)
# Note: We expect a FaceLipAvar
pymel.connectAttr(avar._attr_jaw_bind_tm, self._attr_inn_jaw_bindpose)
pymel.connectAttr(avar._attr_jaw_pitch, self._attr_inn_jaw_pitch)
pymel.connectAttr(avar._attr_inn_jaw_ratio_default, self._attr_inn_jaw_ratio_default)
pymel.connectAttr(avar._attr_bypass_splitter, self._attr_inn_bypass_splitter)
pymel.connectAttr(avar.attr_ud_bypass, self._attr_inn_ud_bypass)
def _get_follicle_relative_uv_attr(self, **kwargs):
nomenclature_rig = self.get_nomenclature_rig()
attr_u, attr_v = super(AvarSurfaceLipModel, self)._get_follicle_relative_uv_attr(**kwargs)
util_decompose_jaw_bind_tm = libRigging.create_utility_node(
'decomposeMatrix',
inputMatrix=self._attr_inn_jaw_bindpose,
)
#
# Create and connect Splitter Node
#
splitter = SplitterNode()
splitter.build(
nomenclature_rig,
name=nomenclature_rig.resolve('splitter')
)
splitter.setParent(self.grp_rig)
# Resolve the radius of the jaw influence. Used by the splitter.
attr_jaw_radius = libRigging.create_utility_node(
'distanceBetween',
name=nomenclature_rig.resolve('getJawRadius'),
point1=self.grp_offset.translate,
point2=util_decompose_jaw_bind_tm.outputTranslate
).distance
# Resolve the jaw pitch. Used by the splitter.
attr_jaw_pitch = self._attr_inn_jaw_pitch
# Connect the splitter inputs
pymel.connectAttr(attr_u, splitter.attr_inn_surface_u)
pymel.connectAttr(attr_v, splitter.attr_inn_surface_v)
pymel.connectAttr(self._attr_inn_jaw_ratio_default, splitter.attr_inn_jaw_default_ratio)
pymel.connectAttr(self._attr_length_v, splitter.attr_inn_surface_range_v)
pymel.connectAttr(attr_jaw_radius, splitter.attr_inn_jaw_radius)
pymel.connectAttr(attr_jaw_pitch, splitter.attr_inn_jaw_pt)
pymel.connectAttr(self._attr_inn_bypass_splitter, splitter.attr_inn_bypass)
attr_u = splitter.attr_out_surface_u
attr_v = splitter.attr_out_surface_v
        # Create constraint to control the jaw reference
pymel.connectAttr(splitter.attr_out_jaw_ratio, self._attr_out_jaw_ratio)
#
# Implement the 'bypass' avars.
        # Those avars bypass the splitter; used in corner cases only.
#
attr_attr_ud_bypass_adjusted = libRigging.create_utility_node(
'multiplyDivide',
name=nomenclature_rig.resolve('getAdjustedUdBypass'),
input1X=self._attr_inn_ud_bypass,
input2X=self.multiplier_ud
).outputX
attr_v = libRigging.create_utility_node(
'addDoubleLinear',
name=nomenclature_rig.resolve('addBypassAvar'),
input1=attr_v,
input2=attr_attr_ud_bypass_adjusted
).output
return attr_u, attr_v
| 42.090634 | 162 | 0.680879 | 1,672 | 13,932 | 5.313397 | 0.184809 | 0.04964 | 0.059433 | 0.037821 | 0.497636 | 0.293224 | 0.208802 | 0.159838 | 0.146105 | 0.079244 | 0 | 0.004972 | 0.249282 | 13,932 | 330 | 163 | 42.218182 | 0.84444 | 0.196454 | 0 | 0.205607 | 0 | 0 | 0.07727 | 0.020942 | 0 | 0 | 0 | 0.00303 | 0 | 1 | 0.028037 | false | 0.079439 | 0.028037 | 0 | 0.070093 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
439e1a09f9246f51a2f4aa291d6172d1d6ae55e7 | 808 | py | Python | DQM/L1TMonitor/python/L1TGCT_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 852 | 2015-01-11T21:03:51.000Z | 2022-03-25T21:14:00.000Z | DQM/L1TMonitor/python/L1TGCT_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 30,371 | 2015-01-02T00:14:40.000Z | 2022-03-31T23:26:05.000Z | DQM/L1TMonitor/python/L1TGCT_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 3,240 | 2015-01-02T05:53:18.000Z | 2022-03-31T17:24:21.000Z | import FWCore.ParameterSet.Config as cms
from DQMServices.Core.DQMEDAnalyzer import DQMEDAnalyzer
l1tGct = DQMEDAnalyzer('L1TGCT',
gctCentralJetsSource = cms.InputTag("gctDigis","cenJets"),
gctForwardJetsSource = cms.InputTag("gctDigis","forJets"),
gctTauJetsSource = cms.InputTag("gctDigis","tauJets"),
gctIsoTauJetsSource = cms.InputTag("gctDigis","fake"),
gctEnergySumsSource = cms.InputTag("gctDigis"),
gctIsoEmSource = cms.InputTag("gctDigis","isoEm"),
gctNonIsoEmSource = cms.InputTag("gctDigis","nonIsoEm"),
monitorDir = cms.untracked.string("L1T/L1TGCT"),
verbose = cms.untracked.bool(False),
stage1_layer2_ = cms.bool(False),
DQMStore = cms.untracked.bool(True),
disableROOToutput = cms.untracked.bool(True),
filterTriggerType = cms.int32(1)
)
| 38.47619 | 62 | 0.72896 | 79 | 808 | 7.43038 | 0.518987 | 0.131175 | 0.226576 | 0.068143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012821 | 0.131188 | 808 | 20 | 63 | 40.4 | 0.823362 | 0 | 0 | 0 | 0 | 0 | 0.136476 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.117647 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
439e723ba661ca0696137f422b31b51f63930e6a | 387 | py | Python | OLD/karma_module/text.py | alentoghostflame/StupidAlentoBot | c024bfb79a9ecb0d9fda5ddc4e361a0cb878baba | [
"MIT"
] | 1 | 2021-12-12T02:50:20.000Z | 2021-12-12T02:50:20.000Z | OLD/karma_module/text.py | alentoghostflame/StupidAlentoBot | c024bfb79a9ecb0d9fda5ddc4e361a0cb878baba | [
"MIT"
] | 17 | 2020-02-07T23:40:36.000Z | 2020-12-22T16:38:44.000Z | OLD/karma_module/text.py | alentoghostflame/StupidAlentoBot | c024bfb79a9ecb0d9fda5ddc4e361a0cb878baba | [
"MIT"
] | null | null | null | ADDED_KARMA_TO_MEMBER = "Gave {} karma to {}, their karma is now at {}."
REMOVED_KARMA_FROM_MEMBER = "Removed {} karma from {}, their karma is now at {}."
LIST_KARMA_OWN = "You currently have {} karma."
LIST_KARMA_OBJECT = "\"{}\" currently has {} karma."
LIST_KARMA_MEMBER = "{} currently has {} karma."
KARMA_TOP_START = "Top karma in server:\n"
KARMA_TOP_FORMAT = "{}. {} \\| {}\n"
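# Example usage (hypothetical values):
#   ADDED_KARMA_TO_MEMBER.format(5, "Alice", 42)
#   -> "Gave 5 karma to Alice, their karma is now at 42."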
| 38.7 | 81 | 0.669251 | 55 | 387 | 4.418182 | 0.436364 | 0.111111 | 0.098765 | 0.123457 | 0.139918 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157623 | 387 | 9 | 82 | 43 | 0.745399 | 0 | 0 | 0 | 0 | 0 | 0.550388 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43a255b174f2f6995694a3ff518d32d995c17049 | 981 | py | Python | setup.py | sdu-cfei/modest-py | dc14091fb8c20a8b3fa5ab33bbf597c0b566ba0a | [
"BSD-2-Clause"
] | 37 | 2017-06-21T19:09:11.000Z | 2022-03-13T09:26:07.000Z | setup.py | sdu-cfei/modest-py | dc14091fb8c20a8b3fa5ab33bbf597c0b566ba0a | [
"BSD-2-Clause"
] | 51 | 2017-06-21T17:40:42.000Z | 2021-10-31T09:16:21.000Z | setup.py | sdu-cfei/modest-py | dc14091fb8c20a8b3fa5ab33bbf597c0b566ba0a | [
"BSD-2-Clause"
] | 12 | 2017-10-02T12:32:50.000Z | 2022-03-13T09:26:15.000Z | from setuptools import setup
setup(
name='modestpy',
version='0.1',
description='FMI-compliant model identification package',
url='https://github.com/sdu-cfei/modest-py',
keywords='fmi fmu optimization model identification estimation',
author='Krzysztof Arendt, Center for Energy Informatics SDU',
author_email='krzysztof.arendt@gmail.com, veje@mmmi.sdu.dk',
license='BSD',
platforms=['Windows', 'Linux'],
packages=[
'modestpy',
'modestpy.estim',
'modestpy.estim.ga_parallel',
'modestpy.estim.ga',
'modestpy.estim.ps',
'modestpy.estim.scipy',
'modestpy.fmi',
'modestpy.utilities',
'modestpy.test'],
include_package_data=True,
install_requires=[
'fmpy[complete]',
'scipy',
'pandas',
'matplotlib',
'numpy',
'pyDOE',
'modestga'
],
classifiers=[
'Programming Language :: Python :: 3'
]
)
| 26.513514 | 68 | 0.59633 | 96 | 981 | 6.041667 | 0.71875 | 0.112069 | 0.051724 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004138 | 0.260958 | 981 | 36 | 69 | 27.25 | 0.795862 | 0 | 0 | 0 | 0 | 0 | 0.494393 | 0.054027 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.028571 | 0 | 0.028571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43a6cf6a117a9bd891a315706e175a03b6175d39 | 51,390 | py | Python | python/ccxt/async_support/uex.py | victor95pc/ccxt | 5c3e606296a1b15852a35f1330b645f451fa08d6 | [
"MIT"
] | 1 | 2019-03-17T22:44:30.000Z | 2019-03-17T22:44:30.000Z | python/ccxt/async_support/uex.py | Lara-Bell/ccxt | e09230b4b60d5c33e3f6ebc044002bab6f733553 | [
"MIT"
] | null | null | null | python/ccxt/async_support/uex.py | Lara-Bell/ccxt | e09230b4b60d5c33e3f6ebc044002bab6f733553 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# PLEASE DO NOT EDIT THIS FILE, IT IS GENERATED AND WILL BE OVERWRITTEN:
# https://github.com/ccxt/ccxt/blob/master/CONTRIBUTING.md#how-to-contribute-code
from ccxt.async_support.base.exchange import Exchange
# -----------------------------------------------------------------------------
try:
basestring # Python 3
except NameError:
basestring = str # Python 2
import json
from ccxt.base.errors import ExchangeError
from ccxt.base.errors import AuthenticationError
from ccxt.base.errors import PermissionDenied
from ccxt.base.errors import ArgumentsRequired
from ccxt.base.errors import InsufficientFunds
from ccxt.base.errors import InvalidAddress
from ccxt.base.errors import InvalidOrder
from ccxt.base.errors import OrderNotFound
from ccxt.base.errors import ExchangeNotAvailable
class uex (Exchange):
def describe(self):
return self.deep_extend(super(uex, self).describe(), {
'id': 'uex',
'name': 'UEX',
'countries': ['SG', 'US'],
'version': 'v1.0.3',
'rateLimit': 1000,
'certified': False,
# new metainfo interface
'has': {
'CORS': False,
'fetchMyTrades': True,
'fetchOHLCV': True,
'fetchOrder': True,
'fetchOpenOrders': True,
'fetchClosedOrders': True,
'fetchDepositAddress': True,
'fetchDeposits': True,
'fetchWithdrawals': True,
'withdraw': True,
},
'timeframes': {
'1m': '1',
'5m': '5',
'15m': '15',
'30m': '30',
'1h': '60',
'2h': '120',
'3h': '180',
'4h': '240',
'6h': '360',
'12h': '720',
'1d': '1440',
},
'urls': {
'logo': 'https://user-images.githubusercontent.com/1294454/43999923-051d9884-9e1f-11e8-965a-76948cb17678.jpg',
'api': 'https://open-api.uex.com/open/api',
'www': 'https://www.uex.com',
'doc': 'https://download.uex.com/doc/UEX-API-English-1.0.3.pdf',
'fees': 'https://www.uex.com/footer/ufees.html',
'referral': 'https://www.uex.com/signup.html?code=VAGQLL',
},
'api': {
'public': {
'get': [
'common/coins', # funding limits
'common/symbols',
'get_records', # ohlcvs
'get_ticker',
'get_trades',
'market_dept', # dept here is not a typo... they mean depth
],
},
'private': {
'get': [
'deposit_address_list',
'withdraw_address_list',
'deposit_history',
'withdraw_history',
'user/account',
'market', # an assoc array of market ids to corresponding prices traded most recently(prices of last trades per market)
'order_info',
'new_order', # a list of currently open orders
'all_order',
'all_trade',
],
'post': [
'create_order',
'cancel_order',
'create_withdraw',
],
},
},
'fees': {
'trading': {
'tierBased': False,
'percentage': True,
'maker': 0.0010,
'taker': 0.0010,
},
},
'exceptions': {
# descriptions from ↓ exchange
# '0': 'no error', # succeed
'4': InsufficientFunds, # {"code":"4","msg":"余额不足:0E-16","data":null}
'5': InvalidOrder, # fail to order {"code":"5","msg":"Price fluctuates more than1000.0%","data":null}
'6': InvalidOrder, # the quantity value less than the minimum one {"code":"6","msg":"数量小于最小值:0.001","data":null}
'7': InvalidOrder, # the quantity value more than the maximum one {"code":"7","msg":"数量大于最大值:10000","data":null}
'8': InvalidOrder, # fail to cancel order
'9': ExchangeError, # transaction be frozen
'13': ExchangeError, # Sorry, the program made an error, please contact with the manager.
'19': InsufficientFunds, # Available balance is insufficient.
'22': OrderNotFound, # The order does not exist. {"code":"22","msg":"not exist order","data":null}
'23': InvalidOrder, # Lack of parameters of numbers of transaction
'24': InvalidOrder, # Lack of parameters of transaction price
'100001': ExchangeError, # System is abnormal
'100002': ExchangeNotAvailable, # Update System
'100004': ExchangeError, # {"code":"100004","msg":"request parameter illegal","data":null}
'100005': AuthenticationError, # {"code":"100005","msg":"request sign illegal","data":null}
'100007': PermissionDenied, # illegal IP
'110002': ExchangeError, # unknown currency code
'110003': AuthenticationError, # fund password error
'110004': AuthenticationError, # fund password error
'110005': InsufficientFunds, # Available balance is insufficient.
'110020': AuthenticationError, # Username does not exist.
'110023': AuthenticationError, # Phone number is registered.
'110024': AuthenticationError, # Email box is registered.
'110025': PermissionDenied, # Account is locked by background manager
'110032': PermissionDenied, # The user has no authority to do self operation.
'110033': ExchangeError, # fail to recharge
'110034': ExchangeError, # fail to withdraw
'-100': ExchangeError, # {"code":"-100","msg":"Your request path is not exist or you can try method GET/POST.","data":null}
'-1000': ExchangeNotAvailable, # {"msg":"System maintenancenot ","code":"-1000","data":null}
},
'requiredCredentials': {
'apiKey': True,
'secret': True,
},
'options': {
'createMarketBuyOrderRequiresPrice': True,
'limits': {
'BTC/USDT': {'amount': {'min': 0.001}, 'price': {'min': 0.01}},
'ETH/USDT': {'amount': {'min': 0.001}, 'price': {'min': 0.01}},
'BCH/USDT': {'amount': {'min': 0.001}, 'price': {'min': 0.01}},
'ETH/BTC': {'amount': {'min': 0.001}, 'price': {'min': 0.000001}},
'BCH/BTC': {'amount': {'min': 0.001}, 'price': {'min': 0.000001}},
'LEEK/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'CTXC/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'COSM/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'MANA/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'LBA/BTC': {'amount': {'min': 10}, 'price': {'min': 10}},
'OLT/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'DTA/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'KNT/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'REN/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'LBA/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'EXC/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'ZIL/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'RATING/ETH': {'amount': {'min': 100}, 'price': {'min': 100}},
'CENNZ/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
'TTC/ETH': {'amount': {'min': 10}, 'price': {'min': 10}},
},
},
})
def calculate_fee(self, symbol, type, side, amount, price, takerOrMaker='taker', params={}):
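        # Worked example (hypothetical numbers): amount=2, rate=0.001, price=5000.
        # cost = 2 * 0.001 = 0.002; a sell converts it to the quote currency
        # (0.002 * 5000 = 10), while a buy keeps 0.002 in the base currency.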
market = self.markets[symbol]
key = 'quote'
rate = market[takerOrMaker]
cost = float(self.cost_to_precision(symbol, amount * rate))
if side == 'sell':
cost *= price
else:
key = 'base'
return {
'type': takerOrMaker,
'currency': market[key],
'rate': rate,
'cost': float(self.currency_to_precision(market[key], cost)),
}
async def fetch_markets(self, params={}):
response = await self.publicGetCommonSymbols()
#
# {code: "0",
# msg: "suc",
# data: [{ symbol: "btcusdt",
# count_coin: "usdt",
# amount_precision: 3,
# base_coin: "btc",
# price_precision: 2 },
# { symbol: "ethusdt",
# count_coin: "usdt",
# amount_precision: 3,
# base_coin: "eth",
# price_precision: 2 },
# { symbol: "ethbtc",
# count_coin: "btc",
# amount_precision: 3,
# base_coin: "eth",
# price_precision: 6 }]}
#
result = []
markets = response['data']
for i in range(0, len(markets)):
market = markets[i]
id = market['symbol']
baseId = market['base_coin']
quoteId = market['count_coin']
base = baseId.upper()
quote = quoteId.upper()
base = self.common_currency_code(base)
quote = self.common_currency_code(quote)
symbol = base + '/' + quote
precision = {
'amount': market['amount_precision'],
'price': market['price_precision'],
}
active = True
defaultLimits = self.safe_value(self.options['limits'], symbol, {})
limits = self.deep_extend({
'amount': {
'min': None,
'max': None,
},
'price': {
'min': None,
'max': None,
},
'cost': {
'min': None,
'max': None,
},
}, defaultLimits)
result.append({
'id': id,
'symbol': symbol,
'base': base,
'quote': quote,
'baseId': baseId,
'quoteId': quoteId,
'active': active,
'info': market,
'precision': precision,
'limits': limits,
})
return result
async def fetch_balance(self, params={}):
await self.load_markets()
response = await self.privateGetUserAccount(params)
#
# {code: "0",
# msg: "suc",
# data: {total_asset: "0.00000000",
# coin_list: [{ normal: "0.00000000",
# btcValuatin: "0.00000000",
# locked: "0.00000000",
# coin: "usdt" },
# { normal: "0.00000000",
# btcValuatin: "0.00000000",
# locked: "0.00000000",
# coin: "btc" },
# { normal: "0.00000000",
# btcValuatin: "0.00000000",
# locked: "0.00000000",
# coin: "eth" },
# { normal: "0.00000000",
# btcValuatin: "0.00000000",
# locked: "0.00000000",
# coin: "ren" }]}}
#
balances = response['data']['coin_list']
result = {'info': balances}
for i in range(0, len(balances)):
balance = balances[i]
currencyId = balance['coin']
code = currencyId.upper()
if currencyId in self.currencies_by_id:
code = self.currencies_by_id[currencyId]['code']
else:
code = self.common_currency_code(code)
account = self.account()
free = float(balance['normal'])
used = float(balance['locked'])
total = self.sum(free, used)
account['free'] = free
account['used'] = used
account['total'] = total
result[code] = account
return self.parse_balance(result)
async def fetch_order_book(self, symbol, limit=None, params={}):
await self.load_markets()
response = await self.publicGetMarketDept(self.extend({
'symbol': self.market_id(symbol),
'type': 'step0', # step1, step2 from most detailed to least detailed
}, params))
#
# {code: "0",
# msg: "suc",
# data: {tick: {asks: [["0.05824200", 9.77],
# ["0.05830000", 7.81],
# ["0.05832900", 8.59],
# ["0.10000000", 0.001] ],
# bids: [["0.05780000", 8.25],
# ["0.05775000", 8.12],
# ["0.05773200", 8.57],
# ["0.00010000", 0.79] ],
# time: 1533412622463 }} }
#
timestamp = self.safe_integer(response['data']['tick'], 'time')
return self.parse_order_book(response['data']['tick'], timestamp)
def parse_ticker(self, ticker, market=None):
#
# {code: "0",
# msg: "suc",
# data: {symbol: "ETHBTC",
# high: 0.058426,
# vol: 19055.875,
# last: 0.058019,
# low: 0.055802,
# change: 0.03437271,
# buy: "0.05780000",
# sell: "0.05824200",
# time: 1533413083184} }
#
timestamp = self.safe_integer(ticker, 'time')
symbol = None
if market is None:
marketId = self.safe_string(ticker, 'symbol')
marketId = marketId.lower()
if marketId in self.markets_by_id:
market = self.markets_by_id[marketId]
if market is not None:
symbol = market['symbol']
last = self.safe_float(ticker, 'last')
change = self.safe_float(ticker, 'change')
percentage = change * 100
return {
'symbol': symbol,
'timestamp': timestamp,
'datetime': self.iso8601(timestamp),
'high': self.safe_float(ticker, 'high'),
'low': self.safe_float(ticker, 'low'),
'bid': self.safe_float(ticker, 'buy'),
'bidVolume': None,
'ask': self.safe_float(ticker, 'sell'),
'askVolume': None,
'vwap': None,
'open': None,
'close': last,
'last': last,
'previousClose': None,
'change': None,
'percentage': percentage,
'average': None,
'baseVolume': self.safe_float(ticker, 'vol'),
'quoteVolume': None,
'info': ticker,
}
async def fetch_ticker(self, symbol, params={}):
await self.load_markets()
market = self.market(symbol)
response = await self.publicGetGetTicker(self.extend({
'symbol': market['id'],
}, params))
#
# {code: "0",
# msg: "suc",
# data: {symbol: "ETHBTC",
# high: 0.058426,
# vol: 19055.875,
# last: 0.058019,
# low: 0.055802,
# change: 0.03437271,
# buy: "0.05780000",
# sell: "0.05824200",
# time: 1533413083184} }
#
return self.parse_ticker(response['data'], market)
def parse_trade(self, trade, market=None):
#
# public fetchTrades
#
# { amount: 0.88,
# create_time: 1533414358000,
# price: 0.058019,
# id: 406531,
# type: "sell" },
#
# private fetchMyTrades, fetchOrder, fetchOpenOrders, fetchClosedOrders
#
# { volume: "0.010",
# side: "SELL",
# feeCoin: "BTC",
# price: "0.05816200",
# fee: "0.00000029",
# ctime: 1533616674000,
# deal_price: "0.00058162",
# id: 415779,
# type: "卖出",
# bid_id: 3669539, # only in fetchMyTrades
# ask_id: 3669583, # only in fetchMyTrades
# }
#
timestamp = self.safe_integer_2(trade, 'create_time', 'ctime')
if timestamp is None:
timestring = self.safe_string(trade, 'created_at')
if timestring is not None:
timestamp = self.parse8601('2018-' + timestring + ':00Z')
side = self.safe_string_2(trade, 'side', 'type')
if side is not None:
side = side.lower()
id = self.safe_string(trade, 'id')
symbol = None
if market is not None:
symbol = market['symbol']
price = self.safe_float(trade, 'price')
amount = self.safe_float_2(trade, 'volume', 'amount')
cost = self.safe_float(trade, 'deal_price')
if cost is None:
if amount is not None:
if price is not None:
cost = amount * price
fee = None
feeCost = self.safe_float_2(trade, 'fee', 'deal_fee')
if feeCost is not None:
feeCurrency = self.safe_string(trade, 'feeCoin')
if feeCurrency is not None:
currencyId = feeCurrency.lower()
if currencyId in self.currencies_by_id:
feeCurrency = self.currencies_by_id[currencyId]['code']
fee = {
'cost': feeCost,
'currency': feeCurrency,
}
orderIdField = 'ask_id' if (side == 'sell') else 'bid_id'
orderId = self.safe_string(trade, orderIdField)
return {
'id': id,
'info': trade,
'timestamp': timestamp,
'datetime': self.iso8601(timestamp),
'symbol': symbol,
'order': orderId,
'type': None,
'side': side,
'price': price,
'amount': amount,
'cost': cost,
'fee': fee,
}
async def fetch_trades(self, symbol, since=None, limit=None, params={}):
await self.load_markets()
market = self.market(symbol)
response = await self.publicGetGetTrades(self.extend({
'symbol': market['id'],
}, params))
#
# {code: "0",
# msg: "suc",
# data: [{ amount: 0.88,
# create_time: 1533414358000,
# price: 0.058019,
# id: 406531,
# type: "sell" },
# { amount: 4.88,
# create_time: 1533414331000,
# price: 0.058019,
# id: 406530,
# type: "buy" },
# { amount: 0.5,
# create_time: 1533414311000,
# price: 0.058019,
# id: 406529,
# type: "sell" }]}
#
return self.parse_trades(response['data'], market, since, limit)
def parse_ohlcv(self, ohlcv, market=None, timeframe='1d', since=None, limit=None):
return [
ohlcv[0] * 1000, # timestamp
ohlcv[1], # open
ohlcv[2], # high
ohlcv[3], # low
ohlcv[4], # close
ohlcv[5], # volume
]
async def fetch_ohlcv(self, symbol, timeframe='1m', since=None, limit=None, params={}):
await self.load_markets()
market = self.market(symbol)
request = {
'symbol': market['id'],
'period': self.timeframes[timeframe], # in minutes
}
response = await self.publicGetGetRecords(self.extend(request, params))
#
# {code: '0',
# msg: 'suc',
# data:
# [[1533402420, 0.057833, 0.057833, 0.057833, 0.057833, 18.1],
# [1533402480, 0.057833, 0.057833, 0.057833, 0.057833, 29.88],
# [1533402540, 0.057833, 0.057833, 0.057833, 0.057833, 29.06] ]}
#
return self.parse_ohlcvs(response['data'], market, timeframe, since, limit)
async def create_order(self, symbol, type, side, amount, price=None, params={}):
if type == 'market':
# for market buy it requires the amount of quote currency to spend
if side == 'buy':
if self.options['createMarketBuyOrderRequiresPrice']:
if price is None:
raise InvalidOrder(self.id + " createOrder() requires the price argument with market buy orders to calculate total order cost(amount to spend), where cost = amount * price. Supply a price argument to createOrder() call if you want the cost to be calculated for you from price and amount, or, alternatively, add .options['createMarketBuyOrderRequiresPrice'] = False to supply the cost in the amount argument(the exchange-specific behaviour)")
else:
amount = amount * price
await self.load_markets()
market = self.market(symbol)
orderType = '1' if (type == 'limit') else '2'
orderSide = side.upper()
amountToPrecision = self.amount_to_precision(symbol, amount)
request = {
'side': orderSide,
'type': orderType,
'symbol': market['id'],
'volume': amountToPrecision,
# An excerpt from their docs:
# side required Trading Direction
# type required pending order types,1:Limit-price Delegation 2:Market- price Delegation
# volume required
# Purchase Quantity(polysemy,multiplex field)
# type=1: Quantity of buying and selling
# type=2: Buying represents gross price, and selling represents total number
# Trading restriction user/me-user information
# price optional Delegation Price:type=2:self parameter is no use.
# fee_is_user_exchange_coin optional
# 0,when making transactions with all platform currencies,
# self parameter represents whether to use them to pay
# fees or not and 0 is no, 1 is yes.
}
priceToPrecision = None
if type == 'limit':
priceToPrecision = self.price_to_precision(symbol, price)
request['price'] = priceToPrecision
response = await self.privatePostCreateOrder(self.extend(request, params))
#
# {code: '0',
# msg: 'suc',
# data: {'order_id' : 34343} }
#
result = self.parse_order(response['data'], market)
return self.extend(result, {
'info': response,
'symbol': symbol,
'type': type,
'side': side,
'status': 'open',
'price': float(priceToPrecision),
'amount': float(amountToPrecision),
})
async def cancel_order(self, id, symbol=None, params={}):
await self.load_markets()
market = self.market(symbol)
request = {
'order_id': id,
'symbol': market['id'],
}
response = await self.privatePostCancelOrder(self.extend(request, params))
order = self.safe_value(response, 'data', {})
return self.extend(self.parse_order(order), {
'id': id,
'symbol': symbol,
'status': 'canceled',
})
def parse_order_status(self, status):
statuses = {
'0': 'open', # INIT(0,"primary order,untraded and not enter the market")
'1': 'open', # NEW_(1,"new order,untraded and enter the market ")
'2': 'closed', # FILLED(2,"complete deal")
'3': 'open', # PART_FILLED(3,"partial deal")
'4': 'canceled', # CANCELED(4,"already withdrawn")
'5': 'canceled', # PENDING_CANCEL(5,"pending withdrawak")
'6': 'canceled', # EXPIRED(6,"abnormal orders")
}
if status in statuses:
return statuses[status]
return status
def parse_order(self, order, market=None):
#
# createOrder
#
# {"order_id":34343}
#
# fetchOrder, fetchOpenOrders, fetchClosedOrders
#
# { side: "BUY",
# total_price: "0.10000000",
# created_at: 1510993841000,
# avg_price: "0.10000000",
# countCoin: "btc",
# source: 1,
# type: 1,
# side_msg: "买入",
# volume: "1.000",
# price: "0.10000000",
# source_msg: "WEB",
# status_msg: "完全成交",
# deal_volume: "1.00000000",
# id: 424,
# remain_volume: "0.00000000",
# baseCoin: "eth",
# tradeList: [{ volume: "1.000",
# feeCoin: "YLB",
# price: "0.10000000",
# fee: "0.16431104",
# ctime: 1510996571195,
# deal_price: "0.10000000",
# id: 306,
# type: "买入" }],
# status: 2 }
#
# fetchOrder
#
# {trade_list: [{ volume: "0.010",
# feeCoin: "BTC",
# price: "0.05816200",
# fee: "0.00000029",
# ctime: 1533616674000,
# deal_price: "0.00058162",
# id: 415779,
# type: "卖出" }],
# order_info: { side: "SELL",
# total_price: "0.010",
# created_at: 1533616673000,
# avg_price: "0.05816200",
# countCoin: "btc",
# source: 3,
# type: 2,
# side_msg: "卖出",
# volume: "0.010",
# price: "0.00000000",
# source_msg: "API",
# status_msg: "完全成交",
# deal_volume: "0.01000000",
# id: 3669583,
# remain_volume: "0.00000000",
# baseCoin: "eth",
# tradeList: [{ volume: "0.010",
# feeCoin: "BTC",
# price: "0.05816200",
# fee: "0.00000029",
# ctime: 1533616674000,
# deal_price: "0.00058162",
# id: 415779,
# type: "卖出" }],
# status: 2 }}
#
side = self.safe_string(order, 'side')
if side is not None:
side = side.lower()
status = self.parse_order_status(self.safe_string(order, 'status'))
symbol = None
if market is None:
baseId = self.safe_string(order, 'baseCoin')
quoteId = self.safe_string(order, 'countCoin')
marketId = baseId + quoteId
if marketId in self.markets_by_id:
market = self.markets_by_id[marketId]
else:
if (baseId is not None) and(quoteId is not None):
base = baseId.upper()
quote = quoteId.upper()
base = self.common_currency_code(base)
quote = self.common_currency_code(quote)
symbol = base + '/' + quote
if market is not None:
symbol = market['symbol']
timestamp = self.safe_integer(order, 'created_at')
if timestamp is None:
timestring = self.safe_string(order, 'created_at')
if timestring is not None:
timestamp = self.parse8601('2018-' + timestring + ':00Z')
lastTradeTimestamp = None
fee = None
average = self.safe_float(order, 'avg_price')
price = self.safe_float(order, 'price')
if price == 0:
price = average
amount = self.safe_float(order, 'volume')
filled = self.safe_float(order, 'deal_volume')
remaining = self.safe_float(order, 'remain_volume')
cost = self.safe_float(order, 'total_price')
id = self.safe_string_2(order, 'id', 'order_id')
trades = None
tradeList = self.safe_value(order, 'tradeList', [])
feeCurrencies = {}
feeCost = None
for i in range(0, len(tradeList)):
trade = self.parse_trade(tradeList[i], market)
if feeCost is None:
feeCost = 0
feeCost = feeCost + trade['fee']['cost']
tradeFeeCurrency = trade['fee']['currency']
feeCurrencies[tradeFeeCurrency] = trade['fee']['cost']
if trades is None:
trades = []
lastTradeTimestamp = trade['timestamp']
trades.append(self.extend(trade, {
'order': id,
}))
if feeCost is not None:
feeCurrency = None
keys = list(feeCurrencies.keys())
numCurrencies = len(keys)
if numCurrencies == 1:
feeCurrency = keys[0]
fee = {
'cost': feeCost,
'currency': feeCurrency,
}
result = {
'info': order,
'id': id,
'timestamp': timestamp,
'datetime': self.iso8601(timestamp),
'lastTradeTimestamp': lastTradeTimestamp,
'symbol': symbol,
'type': 'limit',
'side': side,
'price': price,
'cost': cost,
'average': average,
'amount': amount,
'filled': filled,
'remaining': remaining,
'status': status,
'fee': fee,
'trades': trades,
}
return result
async def fetch_orders_with_method(self, method, symbol=None, since=None, limit=None, params={}):
if symbol is None:
raise ArgumentsRequired(self.id + ' fetchOrdersWithMethod() requires a symbol argument')
await self.load_markets()
market = self.market(symbol)
request = {
# pageSize optional page size
# page optional page number
'symbol': market['id'],
}
if limit is not None:
request['pageSize'] = limit
response = await getattr(self, method)(self.extend(request, params))
#
# {code: "0",
# msg: "suc",
# data: { count: 1,
# orderList: [{ side: "SELL",
# total_price: "0.010",
# created_at: 1533616673000,
# avg_price: "0.05816200",
# countCoin: "btc",
# source: 3,
# type: 2,
# side_msg: "卖出",
# volume: "0.010",
# price: "0.00000000",
# source_msg: "API",
# status_msg: "完全成交",
# deal_volume: "0.01000000",
# id: 3669583,
# remain_volume: "0.00000000",
# baseCoin: "eth",
# tradeList: [{ volume: "0.010",
# feeCoin: "BTC",
# price: "0.05816200",
# fee: "0.00000029",
# ctime: 1533616674000,
# deal_price: "0.00058162",
# id: 415779,
# type: "卖出" }],
# status: 2 }]} }
#
# privateGetNewOrder returns resultList, privateGetAllOrder returns orderList
orders = self.safe_value_2(response['data'], 'orderList', 'resultList', [])
return self.parse_orders(orders, market, since, limit)
async def fetch_open_orders(self, symbol=None, since=None, limit=None, params={}):
return self.fetch_orders_with_method('privateGetNewOrder', symbol, since, limit, params)
async def fetch_closed_orders(self, symbol=None, since=None, limit=None, params={}):
return self.fetch_orders_with_method('privateGetAllOrder', symbol, since, limit, params)
async def fetch_order(self, id, symbol=None, params={}):
await self.load_markets()
market = self.market(symbol)
request = {
'order_id': id,
'symbol': market['id'],
}
response = await self.privateGetOrderInfo(self.extend(request, params))
#
# {code: "0",
# msg: "suc",
# data: {trade_list: [{ volume: "0.010",
# feeCoin: "BTC",
# price: "0.05816200",
# fee: "0.00000029",
# ctime: 1533616674000,
# deal_price: "0.00058162",
# id: 415779,
# type: "卖出" }],
# order_info: { side: "SELL",
# total_price: "0.010",
# created_at: 1533616673000,
# avg_price: "0.05816200",
# countCoin: "btc",
# source: 3,
# type: 2,
# side_msg: "卖出",
# volume: "0.010",
# price: "0.00000000",
# source_msg: "API",
# status_msg: "完全成交",
# deal_volume: "0.01000000",
# id: 3669583,
# remain_volume: "0.00000000",
# baseCoin: "eth",
# tradeList: [{ volume: "0.010",
# feeCoin: "BTC",
# price: "0.05816200",
# fee: "0.00000029",
# ctime: 1533616674000,
# deal_price: "0.00058162",
# id: 415779,
# type: "卖出" }],
# status: 2 }} }
#
return self.parse_order(response['data']['order_info'], market)
async def fetch_my_trades(self, symbol=None, since=None, limit=None, params={}):
if symbol is None:
raise ArgumentsRequired(self.id + ' fetchMyTrades requires a symbol argument')
await self.load_markets()
market = self.market(symbol)
request = {
# pageSize optional page size
# page optional page number
'symbol': market['id'],
}
if limit is not None:
request['pageSize'] = limit
response = await self.privateGetAllTrade(self.extend(request, params))
#
# {code: "0",
# msg: "suc",
# data: { count: 1,
# resultList: [{ volume: "0.010",
# side: "SELL",
# feeCoin: "BTC",
# price: "0.05816200",
# fee: "0.00000029",
# ctime: 1533616674000,
# deal_price: "0.00058162",
# id: 415779,
# type: "卖出",
# bid_id: 3669539,
# ask_id: 3669583 }]} }
#
trades = self.safe_value(response['data'], 'resultList', [])
return self.parse_trades(trades, market, since, limit)
async def fetch_deposit_address(self, code, params={}):
await self.load_markets()
currency = self.currency(code)
request = {
'coin': currency['id'],
}
# https://github.com/UEX-OpenAPI/API_Docs_en/wiki/Query-deposit-address-of-assigned-token
response = await self.privateGetDepositAddressList(self.extend(request, params))
#
# {
# "code": "0",
# "msg": "suc",
# "data": {
# "addressList": [
# {
# "address": "0x198803ef8e0df9e8812c0105421885e843e6d2e2",
# "tag": "",
# },
# ],
# },
# }
#
data = self.safe_value(response, 'data')
if data is None:
raise InvalidAddress(self.id + ' privateGetDepositAddressList() returned no data')
addressList = self.safe_value(data, 'addressList')
if addressList is None:
raise InvalidAddress(self.id + ' privateGetDepositAddressList() returned no address list')
numAddresses = len(addressList)
if numAddresses < 1:
raise InvalidAddress(self.id + ' privatePostDepositAddresses() returned no addresses')
firstAddress = addressList[0]
address = self.safe_string(firstAddress, 'address')
tag = self.safe_string(firstAddress, 'tag')
self.check_address(address)
return {
'currency': code,
'address': address,
'tag': tag,
'info': response,
}
async def fetch_transactions_by_type(self, type, code=None, since=None, limit=None, params={}):
if code is None:
raise ArgumentsRequired(self.id + ' fetchWithdrawals requires a currency code argument')
currency = self.currency(code)
request = {
'coin': currency['id'],
}
if limit is not None:
request['pageSize'] = limit # default 10
transactionType = 'deposit' if (type == 'deposit') else 'withdraw' # instead of withdrawal...
method = 'privateGet' + self.capitalize(transactionType) + 'History'
# https://github.com/UEX-OpenAPI/API_Docs_en/wiki/Query-deposit-record-of-assigned-token
# https://github.com/UEX-OpenAPI/API_Docs_en/wiki/Query-withdraw-record-of-assigned-token
response = await getattr(self, method)(self.extend(request, params))
#
# {code: "0",
# msg: "suc",
# data: {depositList: [{ createdAt: 1533615955000,
# amount: "0.01",
# updateAt: 1533616311000,
# txid: "0x0922fde6ab8270fe6eb31cb5a37dc732d96dc8193f81cf46c4ab29fde…",
# tag: "",
# confirmations: 30,
# addressTo: "0x198803ef8e0df9e8812c0105421885e843e6d2e2",
# status: 1,
# coin: "ETH" }]} }
#
# {
# "code": "0",
# "msg": "suc",
# "data": {
# "withdrawList": [{
# "updateAt": 1540344965000,
# "createdAt": 1539311971000,
# "status": 0,
# "addressTo": "tz1d7DXJXU3AKWh77gSmpP7hWTeDYs8WF18q",
# "tag": "100128877",
# "id": 5,
# "txid": "",
# "fee": 0.0,
# "amount": "1",
# "symbol": "XTZ"
# }]
# }
# }
#
transactions = self.safe_value(response['data'], transactionType + 'List')
return self.parse_transactions_by_type(type, transactions, code, since, limit)
async def fetch_deposits(self, code=None, since=None, limit=None, params={}):
return await self.fetch_transactions_by_type('deposit', code, since, limit, params)
async def fetch_withdrawals(self, code=None, since=None, limit=None, params={}):
return await self.fetch_transactions_by_type('withdrawal', code, since, limit, params)
def parse_transactions_by_type(self, type, transactions, code=None, since=None, limit=None):
result = []
for i in range(0, len(transactions)):
transaction = self.parse_transaction(self.extend({
'type': type,
}, transactions[i]))
result.append(transaction)
return self.filterByCurrencySinceLimit(result, code, since, limit)
def parse_transaction(self, transaction, currency=None):
#
# deposits
#
# { createdAt: 1533615955000,
# amount: "0.01",
# updateAt: 1533616311000,
# txid: "0x0922fde6ab8270fe6eb31cb5a37dc732d96dc8193f81cf46c4ab29fde…",
# tag: "",
# confirmations: 30,
# addressTo: "0x198803ef8e0df9e8812c0105421885e843e6d2e2",
# status: 1,
# coin: "ETH" }]} }
#
# withdrawals
#
# {
# "updateAt": 1540344965000,
# "createdAt": 1539311971000,
# "status": 0,
# "addressTo": "tz1d7DXJXU3AKWh77gSmpP7hWTeDYs8WF18q",
# "tag": "100128877",
# "id": 5,
# "txid": "",
# "fee": 0.0,
# "amount": "1",
# "symbol": "XTZ"
# }
#
id = self.safe_string(transaction, 'id')
txid = self.safe_string(transaction, 'txid')
timestamp = self.safe_integer(transaction, 'createdAt')
updated = self.safe_integer(transaction, 'updateAt')
code = None
currencyId = self.safe_string_2(transaction, 'symbol', 'coin')
currency = self.safe_value(self.currencies_by_id, currencyId)
if currency is not None:
code = currency['code']
else:
code = self.common_currency_code(currencyId)
address = self.safe_string(transaction, 'addressTo')
tag = self.safe_string(transaction, 'tag')
amount = self.safe_float(transaction, 'amount')
status = self.parse_transaction_status(self.safe_string(transaction, 'status'))
type = self.safe_string(transaction, 'type') # injected from the outside
feeCost = self.safe_float(transaction, 'fee')
if (type == 'deposit') and(feeCost is None):
feeCost = 0
return {
'info': transaction,
'id': id,
'currency': code,
'amount': amount,
'address': address,
'tag': tag,
'status': status,
'type': type,
'updated': updated,
'txid': txid,
'timestamp': timestamp,
'datetime': self.iso8601(timestamp),
'fee': {
'currency': code,
'cost': feeCost,
},
}
def parse_transaction_status(self, status):
statuses = {
'0': 'pending', # unaudited
'1': 'ok', # audited
'2': 'failed', # audit failed
'3': 'pending', # "payment"
'4': 'failed', # payment failed
'5': 'ok',
'6': 'canceled',
}
return self.safe_string(statuses, status, status)
async def withdraw(self, code, amount, address, tag=None, params={}):
await self.load_markets()
fee = self.safe_float(params, 'fee')
if fee is None:
            raise ArgumentsRequired(self.id + ' requires a "fee" extra parameter in its last argument')
self.check_address(address)
currency = self.currency(code)
request = {
'coin': currency['id'],
'address': address, # only supports existing addresses in your withdraw address list
'amount': amount,
'fee': fee, # balance >= self.sum(amount, fee)
}
if tag is not None:
request['tag'] = tag
# https://github.com/UEX-OpenAPI/API_Docs_en/wiki/Withdraw
response = await self.privatePostCreateWithdraw(self.extend(request, params))
id = None
return {
'info': response,
'id': id,
}
def sign(self, path, api='public', method='GET', params={}, headers=None, body=None):
url = self.urls['api'] + '/' + self.implode_params(path, params)
if api == 'public':
if params:
url += '?' + self.urlencode(params)
else:
self.check_required_credentials()
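            # Signing sketch: sort all query params by key, concatenate each
            # key and value into one string, append the API secret, and hash
            # it (ccxt's Exchange.hash defaults to MD5 hex) to build 'sign'.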
timestamp = str(self.seconds())
auth = ''
query = self.keysort(self.extend(params, {
'api_key': self.apiKey,
'time': timestamp,
}))
keys = list(query.keys())
for i in range(0, len(keys)):
key = keys[i]
auth += key
auth += str(query[key])
signature = self.hash(self.encode(auth + self.secret))
if query:
if method == 'GET':
url += '?' + self.urlencode(query) + '&sign=' + signature
else:
body = self.urlencode(query) + '&sign=' + signature
headers = {
'Content-Type': 'application/x-www-form-urlencoded',
}
return {'url': url, 'method': method, 'body': body, 'headers': headers}
def handle_errors(self, httpCode, reason, url, method, headers, body, response):
if not isinstance(body, basestring):
return # fallback to default error handler
if len(body) < 2:
return # fallback to default error handler
if (body[0] == '{') or (body[0] == '['):
response = json.loads(body)
#
# {"code":"0","msg":"suc","data":{}}
#
code = self.safe_string(response, 'code')
# message = self.safe_string(response, 'msg')
feedback = self.id + ' ' + self.json(response)
exceptions = self.exceptions
if code != '0':
if code in exceptions:
raise exceptions[code](feedback)
else:
raise ExchangeError(feedback)
| 44.882096 | 465 | 0.432088 | 4,205 | 51,390 | 5.206421 | 0.158383 | 0.021559 | 0.015347 | 0.007537 | 0.390993 | 0.342347 | 0.30681 | 0.280181 | 0.245056 | 0.221532 | 0 | 0.075185 | 0.448978 | 51,390 | 1,144 | 466 | 44.921329 | 0.697708 | 0.340008 | 0 | 0.281907 | 1 | 0.004208 | 0.12438 | 0.00828 | 0 | 0 | 0 | 0 | 0 | 1 | 0.01683 | false | 0 | 0.015428 | 0.002805 | 0.078541 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43a8bd9cb32de8f8138b7b033dc19e078566fbea | 426 | py | Python | src/enum/__init__.py | NazarioJL/faker_enum | c2703cae232b229b4d4ab2b73757102453d541ab | [
"MIT"
] | 5 | 2019-08-02T17:59:10.000Z | 2021-05-14T08:30:55.000Z | src/enum/__init__.py | NazarioJL/faker_enum | c2703cae232b229b4d4ab2b73757102453d541ab | [
"MIT"
] | 4 | 2018-10-26T06:52:05.000Z | 2022-01-31T20:31:17.000Z | src/enum/__init__.py | NazarioJL/faker_enum | c2703cae232b229b4d4ab2b73757102453d541ab | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from enum import Enum
from typing import TypeVar, Type, List, Iterable, cast
from faker.providers import BaseProvider
TEnum = TypeVar("TEnum", bound=Enum)
class EnumProvider(BaseProvider):
"""
A Provider for enums.
"""
def enum(self, enum_cls: Type[TEnum]) -> TEnum:
members: List[TEnum] = list(cast(Iterable[TEnum], enum_cls))
return self.random_element(members)
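# Example (hypothetical): after `fake.add_provider(EnumProvider)`,
# `fake.enum(Color)` returns a random member of the Color enum.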
| 22.421053 | 68 | 0.676056 | 54 | 426 | 5.277778 | 0.555556 | 0.049123 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002924 | 0.197183 | 426 | 18 | 69 | 23.666667 | 0.830409 | 0.103286 | 0 | 0 | 0 | 0 | 0.013661 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
43a90c6754ed5d7199ff6f282438c86387b7e8d9 | 1,485 | py | Python | usuarios/views.py | alvarocneto/alura_django | da2d3619b30c9d1c8767fa910eb7253bc20eeb90 | [
"MIT"
] | 1 | 2017-04-25T10:46:24.000Z | 2017-04-25T10:46:24.000Z | usuarios/views.py | alvarocneto/alura_django | da2d3619b30c9d1c8767fa910eb7253bc20eeb90 | [
"MIT"
] | null | null | null | usuarios/views.py | alvarocneto/alura_django | da2d3619b30c9d1c8767fa910eb7253bc20eeb90 | [
"MIT"
] | null | null | null | from django.shortcuts import redirect
from django.shortcuts import render
from django.contrib.auth.models import User
from django.views.generic.base import View
from perfis.models import Perfil
from usuarios.forms import RegistrarUsuarioForm
class RegistrarUsuarioView(View):
template_name = 'registrar.html'
def get(self, request):
return render(request, self.template_name)
def post(self, request):
        # populate the form
form = RegistrarUsuarioForm(request.POST)
        # check whether the form is valid
if form.is_valid():
dados_form = form.data
            # create the user
usuario = User.objects.create_user(dados_form['nome'],
dados_form['email'],
dados_form['senha'])
            # create the profile
perfil = Perfil(nome=dados_form['nome'],
telefone=dados_form['telefone'],
nome_empresa=dados_form['nome_empresa'],
usuario=usuario)
            # save it to the database
perfil.save()
            # redirect to the index
return redirect('index')
        # we only get here if the form is not valid;
        # return the form so the filled-in data is shown again
return render(request, self.template_name, {'form': form})
| 31.595745 | 72 | 0.546801 | 147 | 1,485 | 5.428571 | 0.469388 | 0.078947 | 0.048872 | 0.062657 | 0.087719 | 0.087719 | 0 | 0 | 0 | 0 | 0 | 0 | 0.382492 | 1,485 | 46 | 73 | 32.282609 | 0.870229 | 0.131313 | 0 | 0 | 0 | 0 | 0.047656 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.25 | 0.041667 | 0.541667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
43af38bca718e0df9a99877c70ee02be87e0d3d0 | 842 | py | Python | shopping_cart_test/shoppingcart2.py | Simbadeveloper/studious-octo-waddle.io | 7ace6bb93e3b87c97d59df858e3079ec7a2db30e | [
"MIT"
] | null | null | null | shopping_cart_test/shoppingcart2.py | Simbadeveloper/studious-octo-waddle.io | 7ace6bb93e3b87c97d59df858e3079ec7a2db30e | [
"MIT"
] | null | null | null | shopping_cart_test/shoppingcart2.py | Simbadeveloper/studious-octo-waddle.io | 7ace6bb93e3b87c97d59df858e3079ec7a2db30e | [
"MIT"
] | null | null | null | class ShoppingCart(object):
def __init__(self):
self.total = 0
self.items = dict()
def add_item(self, item_name, quantity, price):
if item_name != None and quantity >= 1:
self.items.update({item_name: quantity})
if quantity and price >= 1:
self.total += (quantity * price)
def remove_item(self, item_name, quantity, price):
if item_name in self.items:
if quantity < self.items[item_name] and quantity > 0:
self.items[item_name] -= quantity
self.total -= price*quantity
def checkout(self, cash_paid):
balance = 0
if cash_paid < self.total:
return "Cash paid not enough"
balance = cash_paid - self.total
return balance
class Shop(ShoppingCart):
    def __init__(self):
        super(Shop, self).__init__()  # initialize total/items inherited from ShoppingCart
        self.quantity = 100
def remove_item(self):
self.quantity -= 1
| 26.3125 | 59 | 0.644893 | 114 | 842 | 4.578947 | 0.27193 | 0.10728 | 0.122605 | 0.057471 | 0.237548 | 0.149425 | 0.149425 | 0.149425 | 0.149425 | 0 | 0 | 0.014241 | 0.249406 | 842 | 31 | 60 | 27.16129 | 0.811709 | 0 | 0 | 0.08 | 0 | 0 | 0.023753 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.24 | false | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43b32db495f046dd61a5bbd3592b8806b465b229 | 785 | py | Python | LEVEL2/다리를지나는트럭/solution.py | seunghwanly/CODING-TEST | a820da950c163d399594770199aa2e782d1fbbde | [
"MIT"
] | null | null | null | LEVEL2/다리를지나는트럭/solution.py | seunghwanly/CODING-TEST | a820da950c163d399594770199aa2e782d1fbbde | [
"MIT"
] | null | null | null | LEVEL2/다리를지나는트럭/solution.py | seunghwanly/CODING-TEST | a820da950c163d399594770199aa2e782d1fbbde | [
"MIT"
] | null | null | null | def solution(bridge_length, weight, truck_weights):
answer = 0
# { weight, time }
wait = truck_weights[:]
bridge = []
passed = 0
currWeight = 0
while True:
if passed == len(truck_weights) and len(wait) == 0: return answer
answer += 1
# sth needs to be passed
if bridge:
if bridge[0]['t'] + bridge_length == answer:
front = bridge.pop(0)
currWeight -= front['w']
passed += 1
# add new truck
if wait:
if currWeight + wait[0] <= weight:
bridge.append({ 'w' : wait[0], 't' : answer })
currWeight += wait[0]
wait.pop(0)
# print(solution(2, 10, [7, 4, 5, 6]))
print(solution(100, 100, [10]))
| 28.035714 | 73 | 0.49172 | 93 | 785 | 4.096774 | 0.419355 | 0.052493 | 0.07874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055102 | 0.375796 | 785 | 27 | 74 | 29.074074 | 0.722449 | 0.11465 | 0 | 0 | 0 | 0 | 0.005797 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0.15 | 0 | 0 | 0.05 | 0.05 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
43b5471678e7c510bd2a55fdced1140414dcd734 | 440 | py | Python | device_geometry.py | AstroShen/fpga21-scaled-tech | 8a7016913c18d71844f733bc80a3ceaa2d033ac2 | [
"MIT"
] | 2 | 2021-09-02T13:13:35.000Z | 2021-12-19T11:35:03.000Z | device_geometry.py | AstroShen/fpga21-scaled-tech | 8a7016913c18d71844f733bc80a3ceaa2d033ac2 | [
"MIT"
] | null | null | null | device_geometry.py | AstroShen/fpga21-scaled-tech | 8a7016913c18d71844f733bc80a3ceaa2d033ac2 | [
"MIT"
] | 2 | 2021-09-29T02:53:03.000Z | 2022-03-27T09:55:35.000Z | """Holds the device gemoetry parameters (Table 5), taken from Wu et al.,
>> A Predictive 3-D Source/Drain Resistance Compact Model and the Impact on 7 nm and Scaled FinFets<<, 2020, with interpolation for 4nm. 16nm is taken from PTM HP.
"""
node_names = [16, 7, 5, 4, 3]
GP = [64, 56, 48, 44, 41]
FP = [40, 30, 28, 24, 22]
GL = [20, 18, 16, 15, 14]
FH = [26, 35, 45, 50, 55]
FW = [12, 6.5, 6, 5.5, 5.5]
vdd = [0.85, 0.75, 0.7, 0.65, 0.65]
| 36.666667 | 163 | 0.615909 | 92 | 440 | 2.934783 | 0.782609 | 0.022222 | 0.022222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.226361 | 0.206818 | 440 | 11 | 164 | 40 | 0.547278 | 0.529545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43bbc2ac72a79eec23f8c2578bc9f103ba32b758 | 8,684 | py | Python | hivwholeseq/sequencing/check_pipeline.py | neherlab/hivwholeseq | 978ce4060362e4973f92b122ed5340a5314d7844 | [
"MIT"
] | 3 | 2016-09-13T12:15:47.000Z | 2021-07-03T01:28:56.000Z | hivwholeseq/sequencing/check_pipeline.py | iosonofabio/hivwholeseq | d504c63b446c3a0308aad6d6e484ea1666bbe6df | [
"MIT"
] | null | null | null | hivwholeseq/sequencing/check_pipeline.py | iosonofabio/hivwholeseq | d504c63b446c3a0308aad6d6e484ea1666bbe6df | [
"MIT"
] | 3 | 2016-01-17T03:43:46.000Z | 2020-03-25T07:00:11.000Z | #!/usr/bin/env python
# vim: fdm=marker
'''
author: Fabio Zanini
date: 15/06/14
content: Check the status of the pipeline for one or more sequencing samples.
'''
# Modules
import os
import sys
from itertools import izip
import argparse
from Bio import SeqIO
from hivwholeseq.utils.generic import getchar
from hivwholeseq.sequencing.samples import SampleSeq, load_sequencing_run
from hivwholeseq.patients.patients import load_samples_sequenced as lssp
from hivwholeseq.patients.patients import SamplePat
from hivwholeseq.sequencing.samples import load_samples_sequenced as lss
from hivwholeseq.utils.mapping import get_number_reads
from hivwholeseq.cluster.fork_cluster import fork_check_pipeline as fork_self
# Globals
len_fr = 8
len_msg = 6
spacing_fragments = 4
# Functions
def check_status(sample, step, detail=1):
'''Check for a sample a certain step of the pipeline at a certain detail'''
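    # detail=1 reports file existence only (plus an 'OLD' flag for stale
    # mapped_filtered files); detail=2 upgrades 'filtered'/'consensus' to
    # detail 3; detail=3 reports read counts (or consensus length) instead.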
if detail == 1:
if step == 'premapped':
return [os.path.isfile(sample.get_premapped_filename())]
elif step == 'divided':
return [(fr, os.path.isfile(sample.get_divided_filename(fr)))
for fr in sample.regions_complete]
elif step == 'consensus':
return [(fr, os.path.isfile(sample.get_consensus_filename(fr)))
for fr in sample.regions_generic]
elif step == 'mapped':
return [(fr, os.path.isfile(sample.get_mapped_filename(fr, filtered=False)))
for fr in sample.regions_generic]
elif step == 'filtered':
return [(fr, os.path.isfile(sample.get_mapped_filename(fr, filtered=True)))
for fr in sample.regions_generic]
elif step == 'mapped_initial':
return [(fr, os.path.isfile(sample.get_mapped_to_initial_filename(fr)))
for fr in sample.regions_generic]
elif step == 'mapped_filtered':
# Check whether the mapped filtered is older than the mapped_initial
from hivwholeseq.utils.generic import modification_date
out = []
for fr in sample.regions_generic:
fn_mi = sample.get_mapped_to_initial_filename(fr)
fn_mf = sample.get_mapped_filtered_filename(fr)
if not os.path.isfile(fn_mf):
out.append((fr, False))
continue
if not os.path.isfile(fn_mi):
out.append((fr, True))
continue
md_mi = modification_date(fn_mi)
md_mf = modification_date(fn_mf)
if md_mf < md_mi:
out.append((fr, 'OLD'))
else:
out.append((fr, True))
return out
elif detail == 2:
if step in ('filtered', 'consensus'):
return check_status(sample, step, detail=3)
else:
return check_status(sample, step, detail=1)
elif detail == 3:
if step == 'premapped':
if os.path.isfile(sample.get_premapped_filename()):
return [get_number_reads(sample.get_premapped_filename())]
else:
return [False]
elif step == 'divided':
stati = []
for fr in sample.regions_complete:
fn = sample.get_divided_filename(fr)
if os.path.isfile(fn):
status = (fr, get_number_reads(fn))
else:
status = (fr, False)
stati.append(status)
return stati
elif step == 'consensus':
stati = []
for fr in sample.regions_generic:
fn = sample.get_consensus_filename(fr)
if os.path.isfile(fn):
status = (fr, len(SeqIO.read(fn, 'fasta')))
else:
status = (fr, False)
stati.append(status)
return stati
elif step == 'mapped':
stati = []
for fr in sample.regions_generic:
fn = sample.get_mapped_filename(fr, filtered=False)
if os.path.isfile(fn):
status = (fr, get_number_reads(fn))
else:
status = (fr, False)
stati.append(status)
return stati
elif step == 'filtered':
stati = []
for fr in sample.regions_generic:
fn = sample.get_mapped_filename(fr, filtered=True)
if os.path.isfile(fn):
status = (fr, get_number_reads(fn))
else:
status = (fr, False)
stati.append(status)
return stati
# TODO: add mapped_to_initial and downstream
elif step in ('mapped_initial', 'mapped_filtered'):
return check_status(sample, step, detail=1)
def print_info(name, status, detail=1):
'''Print info on these files'''
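    # Prints one aligned '<fragment>: <status>' cell per fragment (OK/MISS or a
    # count at higher detail), padded according to len_fr and len_msg.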
print '{:<20s}'.format(name+':'),
if name.lower() in ['premapped']:
status = status[0]
        if status is True:
            print 'OK'
        elif status is False:
print 'MISS'
else:
print str(status)
else:
stati = list(status)
msg = []
for (fr, status) in stati:
ms = ('{:<'+str(len_fr)+'s}').format(fr+':')
            if status is True:
                msg.append(ms+('{:>'+str(len_msg)+'}').format('OK'))
            elif status is False:
msg.append(ms+('{:>'+str(len_msg)+'}').format('MISS'))
else:
msg.append(ms+('{:>'+str(len_msg)+'}').format(str(status)))
print (' ' * spacing_fragments).join(msg)
# Script
if __name__ == '__main__':
# Parse input args
parser = argparse.ArgumentParser(description='Check sequencing run for missing parts of the analysis',
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument('--runs', required=True, nargs='+',
help='Seq runs to analyze (e.g. Tue28, test_tiny)')
parser.add_argument('--adaIDs', nargs='+',
help='Adapter IDs to analyze (e.g. TS2)')
parser.add_argument('--nopatients', action='store_false', dest='use_pats',
help='Include non-patient samples (e.g. reference strains)')
parser.add_argument('--interactive', action='store_true',
help='Interactive mode')
parser.add_argument('--detail', type=int, default=1,
help='Include details on number of reads, length of consensus')
parser.add_argument('--submit', action='store_true',
help='Execute the script in parallel on the cluster')
args = parser.parse_args()
seq_runs = args.runs
adaIDs = args.adaIDs
use_pats = args.use_pats
use_interactive = args.interactive
detail = args.detail
submit = args.submit
if submit:
fork_self(seq_runs, adaIDs=adaIDs,
pats=use_pats,
detail=detail)
sys.exit()
samples_pat = lssp(include_wrong=True)
samples = lss()
samples = samples.loc[samples['seq run'].isin(seq_runs)]
if adaIDs is not None:
samples = samples.loc[samples.adapter.isin(adaIDs)]
if len(seq_runs) >= 2:
samples.sort(columns=['patient sample', 'seq run'], inplace=True)
for isa, (samplename, sample) in enumerate(samples.iterrows()):
sample = SampleSeq(sample)
print sample.name, 'seq:', sample['seq run'], sample.adapter,
if sample['patient sample'] == 'nan':
print 'not a patient sample',
if use_pats:
print '(skip)'
continue
else:
print ''
else:
sample_pat = samples_pat.loc[sample['patient sample']]
print 'patient: '+sample_pat.patient
steps = ['premapped', 'divided', 'consensus', 'mapped', 'filtered',
'mapped_initial', 'mapped_filtered']
for step in steps:
status = check_status(sample, step, detail=detail)
print_info(step.capitalize(), status, detail=detail)
if (isa != len(samples) - 1):
print ''
if use_interactive and (isa != len(samples) - 1):
print 'Press q to exit',
sys.stdout.flush()
ch = getchar()
if ch.lower() in ['q']:
print 'stopped'
break
else:
sys.stdout.write("\x1b[1A")
print ''
| 36.033195 | 106 | 0.554353 | 979 | 8,684 | 4.777324 | 0.215526 | 0.02694 | 0.033355 | 0.027796 | 0.378448 | 0.300192 | 0.270259 | 0.184734 | 0.169767 | 0.151593 | 0 | 0.005029 | 0.33602 | 8,684 | 240 | 107 | 36.183333 | 0.806105 | 0.02257 | 0 | 0.354167 | 0 | 0 | 0.098798 | 0 | 0 | 0 | 0 | 0.004167 | 0 | 0 | null | null | 0 | 0.067708 | null | null | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43c4fed77cd489496d3337fe3e83cfcc13582afb | 2,390 | py | Python | api/files/api/app/monthly_report.py | trackit/trackit-legacy | 76cfab7941eddb9d390dd6c7b9a408a9ad4fc8da | [
"Apache-2.0"
] | 2 | 2018-02-01T09:18:05.000Z | 2020-03-12T18:11:11.000Z | api/files/api/app/monthly_report.py | trackit/trackit-legacy | 76cfab7941eddb9d390dd6c7b9a408a9ad4fc8da | [
"Apache-2.0"
] | null | null | null | api/files/api/app/monthly_report.py | trackit/trackit-legacy | 76cfab7941eddb9d390dd6c7b9a408a9ad4fc8da | [
"Apache-2.0"
] | 5 | 2018-05-11T10:32:52.000Z | 2021-05-26T12:09:47.000Z | import jinja2
import json
from send_email import send_email
from app.models import User, MyResourcesAWS, db
from app.es.awsdetailedlineitem import AWSDetailedLineitem
from sqlalchemy import desc
import subprocess
import datetime
from flask import render_template
def monthly_html_template():
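    # For every user whose report is at least 30 days old, render the monthly
    # cost report per AWS key and refresh report_last_emailed_at; mails are
    # currently only sent to msolution.io addresses.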
template_dir = '/usr/trackit/templates'
loader = jinja2.FileSystemLoader(template_dir)
env = jinja2.Environment(loader=loader)
template = env.get_template('emailPDFreport.html')
now = datetime.datetime.now()
try:
users = User.query.all()
for user in users:
if user.report_last_emailed_at == None:
user.report_last_emailed_at = datetime.datetime.utcnow()
db.session.add(user)
db.session.commit()
last_emailed_days = (now - user.report_last_emailed_at).days
if last_emailed_days >= 30:
for key in user.aws_keys:
date = "{} {}".format(now.strftime("%B"), now.year)
pretty_key = user.get_aws_key(key.key).pretty + ' ' + key.key
monthly_cost = AWSDetailedLineitem.get_monthly_cost_by_product(key.get_aws_user_id())
estimation_hour, estimation_month = get_estimation(user, key)
total = sum(float(i.get("cost")) for i in monthly_cost['products'])
email_template = template.render(email=user.email, date=date, key=pretty_key, products=monthly_cost['products'], total=total, hourly_cost=estimation_hour, monthly_cost=estimation_month)
if user.email.endswith("msolution.io"):
send_email(user.email, 'Trackit monthly report', email_template.encode('utf-8').strip(), True)
user.report_last_emailed_at = datetime.datetime.utcnow()
db.session.add(user)
db.session.commit()
except Exception, e:
print("ERROR " + str(e))
def get_estimation(user, key):
estimation = MyResourcesAWS.query.filter(MyResourcesAWS.key == key.key).order_by(desc(MyResourcesAWS.date)).first()
estimation = [] if not estimation else estimation.json()
cost = sum(estimation_cost(e) for e in estimation)
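    # 720 = 24 h * 30 days: scale the summed hourly cost to a monthly estimate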
return cost, cost*720
def estimation_cost(estimation):
return sum(item['cost'] for item in estimation['prices'] if item['name'] == 'aws')
| 47.8 | 205 | 0.65523 | 292 | 2,390 | 5.181507 | 0.335616 | 0.043622 | 0.037013 | 0.055519 | 0.130866 | 0.100463 | 0.100463 | 0.100463 | 0.100463 | 0.100463 | 0 | 0.004951 | 0.239331 | 2,390 | 49 | 206 | 48.77551 | 0.827283 | 0 | 0 | 0.133333 | 0 | 0 | 0.054812 | 0.009205 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.2 | null | null | 0.022222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43cc7a30161b57bb1e1d6f7efc6e267ff0a84af5 | 471 | py | Python | myhoodApp/migrations/0002_healthfacilities_hospital_image.py | MutuaFranklin/MyHood | 6ddd21c4a67936c8926d6f5a8665a06edf81f39e | [
"MIT"
] | null | null | null | myhoodApp/migrations/0002_healthfacilities_hospital_image.py | MutuaFranklin/MyHood | 6ddd21c4a67936c8926d6f5a8665a06edf81f39e | [
"MIT"
] | null | null | null | myhoodApp/migrations/0002_healthfacilities_hospital_image.py | MutuaFranklin/MyHood | 6ddd21c4a67936c8926d6f5a8665a06edf81f39e | [
"MIT"
] | null | null | null | # Generated by Django 3.2.7 on 2021-09-23 20:01
import cloudinary.models
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('myhoodApp', '0001_initial'),
]
operations = [
migrations.AddField(
model_name='healthfacilities',
name='hospital_image',
field=cloudinary.models.CloudinaryField(blank=True, max_length=255, verbose_name='Hospital Image'),
),
]
| 23.55 | 111 | 0.647558 | 50 | 471 | 6 | 0.78 | 0.106667 | 0.113333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061798 | 0.244161 | 471 | 19 | 112 | 24.789474 | 0.780899 | 0.095541 | 0 | 0 | 1 | 0 | 0.153302 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43cee9ce3aeb6af7cef400c841ab802c88461d4b | 8,148 | py | Python | gslib/tests/test_stet_util.py | ttobisawa/gsutil | ef665b590aa8e6cecfe251295bce8bf99ea69467 | [
"Apache-2.0"
] | null | null | null | gslib/tests/test_stet_util.py | ttobisawa/gsutil | ef665b590aa8e6cecfe251295bce8bf99ea69467 | [
"Apache-2.0"
] | null | null | null | gslib/tests/test_stet_util.py | ttobisawa/gsutil | ef665b590aa8e6cecfe251295bce8bf99ea69467 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright 2021 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for stet_util.py."""
from __future__ import absolute_import
from __future__ import print_function
from __future__ import division
from __future__ import unicode_literals
import os
import shutil
from gslib import storage_url
from gslib.tests import testcase
from gslib.tests import util
from gslib.tests.util import unittest
from gslib.utils import execution_util
from gslib.utils import stet_util
import mock
class TestStetUtil(testcase.GsUtilUnitTestCase):
"""Test STET utils."""
@mock.patch.object(execution_util, 'ExecuteExternalCommand')
def test_stet_upload_uses_binary_and_config_from_boto(
self, mock_execute_external_command):
fake_config_path = self.CreateTempFile()
mock_execute_external_command.return_value = ('stdout', 'stderr')
mock_logger = mock.Mock()
source_url = storage_url.StorageUrlFromString('in')
destination_url = storage_url.StorageUrlFromString('gs://bucket/obj')
with util.SetBotoConfigForTest([
('GSUtil', 'stet_binary_path', 'fake_binary_path'),
('GSUtil', 'stet_config_path', fake_config_path),
]):
out_file_url = stet_util.encrypt_upload(source_url, destination_url,
mock_logger)
self.assertEqual(out_file_url,
storage_url.StorageUrlFromString('in_.stet_tmp'))
mock_execute_external_command.assert_called_once_with([
'fake_binary_path',
'encrypt',
'--config-file={}'.format(fake_config_path),
'--blob-id=gs://bucket/obj',
'in',
'in_.stet_tmp',
])
mock_logger.debug.assert_called_once_with('stderr')
@mock.patch.object(execution_util, 'ExecuteExternalCommand')
def test_stet_upload_runs_with_binary_from_path_with_correct_settings(
self, mock_execute_external_command):
fake_config_path = self.CreateTempFile()
temporary_path_directory = self.CreateTempDir()
fake_stet_binary_path = self.CreateTempFile(tmpdir=temporary_path_directory,
file_name='stet')
previous_path = os.getenv('PATH')
os.environ['PATH'] += os.path.pathsep + temporary_path_directory
mock_execute_external_command.return_value = ('stdout', 'stderr')
mock_logger = mock.Mock()
source_url = storage_url.StorageUrlFromString('in')
destination_url = storage_url.StorageUrlFromString('gs://bucket/obj')
with util.SetBotoConfigForTest([
('GSUtil', 'stet_binary_path', None),
('GSUtil', 'stet_config_path', fake_config_path),
]):
out_file_url = stet_util.encrypt_upload(source_url, destination_url,
mock_logger)
self.assertEqual(out_file_url,
storage_url.StorageUrlFromString('in_.stet_tmp'))
mock_execute_external_command.assert_called_once_with([
fake_stet_binary_path,
'encrypt',
'--config-file={}'.format(fake_config_path),
'--blob-id=gs://bucket/obj',
'in',
'in_.stet_tmp',
])
mock_logger.debug.assert_called_once_with('stderr')
os.environ['PATH'] = previous_path
@mock.patch.object(execution_util, 'ExecuteExternalCommand')
def test_stet_upload_uses_config_from_default_path_with_correct_settings(
self, mock_execute_external_command):
mock_execute_external_command.return_value = ('stdout', 'stderr')
mock_logger = mock.Mock()
source_url = storage_url.StorageUrlFromString('in')
destination_url = storage_url.StorageUrlFromString('gs://bucket/obj')
with util.SetBotoConfigForTest([
('GSUtil', 'stet_binary_path', 'fake_binary_path'),
('GSUtil', 'stet_config_path', None),
]):
with mock.patch.object(os.path,
'exists',
new=mock.Mock(return_value=True)):
out_file_url = stet_util.encrypt_upload(source_url, destination_url,
mock_logger)
self.assertEqual(out_file_url,
storage_url.StorageUrlFromString('in_.stet_tmp'))
mock_execute_external_command.assert_called_once_with([
'fake_binary_path',
'encrypt',
'--config-file={}'.format(
os.path.expanduser(stet_util.DEFAULT_STET_CONFIG_PATH)),
'--blob-id=gs://bucket/obj',
'in',
'in_.stet_tmp',
])
mock_logger.debug.assert_called_once_with('stderr')
@mock.patch.object(shutil, 'move')
@mock.patch.object(execution_util, 'ExecuteExternalCommand')
def test_stet_download_runs_binary_and_replaces_temp_file(
self, mock_execute_external_command, mock_move):
fake_config_path = self.CreateTempFile()
mock_execute_external_command.return_value = ('stdout', 'stderr')
mock_logger = mock.Mock()
source_url = storage_url.StorageUrlFromString('gs://bucket/obj')
destination_url = storage_url.StorageUrlFromString('out')
with util.SetBotoConfigForTest([
('GSUtil', 'stet_binary_path', 'fake_binary_path'),
('GSUtil', 'stet_config_path', fake_config_path),
]):
stet_util.decrypt_download(source_url, destination_url, mock_logger)
mock_execute_external_command.assert_called_once_with([
'fake_binary_path', 'decrypt',
'--config-file={}'.format(fake_config_path),
'--blob-id=gs://bucket/obj', 'out', 'out_.stet_tmp'
])
mock_logger.debug.assert_called_once_with('stderr')
mock_move.assert_called_once_with('out_.stet_tmp', 'out')
@mock.patch.object(stet_util,
'_get_stet_binary_from_path',
new=mock.Mock(return_value=None))
def test_stet_util_errors_if_no_binary(self):
fake_config_path = self.CreateTempFile()
source_url = storage_url.StorageUrlFromString('in')
destination_url = storage_url.StorageUrlFromString('gs://bucket/obj')
with util.SetBotoConfigForTest([
('GSUtil', 'stet_binary_path', None),
('GSUtil', 'stet_config_path', fake_config_path),
]):
with self.assertRaises(KeyError):
stet_util.encrypt_upload(source_url, destination_url, None)
def test_stet_util_errors_if_no_config(self):
source_url = storage_url.StorageUrlFromString('in')
destination_url = storage_url.StorageUrlFromString('gs://bucket/obj')
with util.SetBotoConfigForTest([
('GSUtil', 'stet_binary_path', 'fake_binary_path'),
('GSUtil', 'stet_config_path', None),
]):
with mock.patch.object(os.path,
'exists',
new=mock.Mock(return_value=False)):
with self.assertRaises(KeyError):
stet_util.encrypt_upload(source_url, destination_url, None)
@mock.patch.object(os.path, 'expanduser', autospec=True)
@mock.patch.object(execution_util,
'ExecuteExternalCommand',
new=mock.Mock(return_value=('stdout', 'stderr')))
def test_stet_util_expands_home_directory_symbol(self, mock_expanduser):
fake_config_path = self.CreateTempFile()
source_url = storage_url.StorageUrlFromString('in')
destination_url = storage_url.StorageUrlFromString('gs://bucket/obj')
with util.SetBotoConfigForTest([
('GSUtil', 'stet_binary_path', 'fake_binary_path'),
('GSUtil', 'stet_config_path', fake_config_path),
]):
stet_util.encrypt_upload(source_url, destination_url, mock.Mock())
mock_expanduser.assert_has_calls(
[mock.call('fake_binary_path'),
mock.call(fake_config_path)])
| 40.74 | 80 | 0.689494 | 962 | 8,148 | 5.468815 | 0.179834 | 0.041817 | 0.042007 | 0.106634 | 0.710511 | 0.689793 | 0.667554 | 0.665463 | 0.655199 | 0.621555 | 0 | 0.00138 | 0.199435 | 8,148 | 199 | 81 | 40.944724 | 0.805151 | 0.077688 | 0 | 0.68125 | 0 | 0 | 0.151649 | 0.031504 | 0 | 0 | 0 | 0 | 0.09375 | 1 | 0.04375 | false | 0 | 0.08125 | 0 | 0.13125 | 0.00625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43d8185a62fc1d316a49c5b7d44a50853bf56a88 | 9,682 | py | Python | upload.py | snymainn/tools- | af57a1a4d0f1aecff33ab28c6f27acc893f37fbc | [
"MIT"
] | null | null | null | upload.py | snymainn/tools- | af57a1a4d0f1aecff33ab28c6f27acc893f37fbc | [
"MIT"
] | null | null | null | upload.py | snymainn/tools- | af57a1a4d0f1aecff33ab28c6f27acc893f37fbc | [
"MIT"
] | null | null | null | #!/usr/bin/python
import sys
from loglib import SNYLogger
import ftplib
import argparse
import re
import os
import calendar
import time
def read_skipfile(infile, log):
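    # The skipfile contains one regular expression per line; local entries whose
    # names match any of the patterns are excluded from the sync (see sync_files).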
skiplines = list()
skipfile = open(infile, 'r')
for line in skipfile:
newline = line.rstrip('\r\n')
linelength = len(newline)
if linelength>0:
log.debug("Adding "+newline+" to skiplines")
tmpobjects = re.compile(newline)
skiplines.append(tmpobjects)
skipfile.close()
return skiplines
#GET LOCAL FILELIST
def get_local_files(localpath,log):
locallist = list()
os.chdir(localpath)
log.debug("*** GETTING LOCAL FILELIST ***")
for name in os.listdir("."):
if (not name.startswith('.')):
statinfo = os.stat(name)
if (statinfo.st_mode>=32768):
entrytype = "file"
else:
entrytype = "dir"
size = statinfo.st_size
date = statinfo.st_mtime
log.debug("Date:"+str(int(date))+" type:"+entrytype+", name:"+name+" size:"+str(size))
locallist.append({'name':name,'type':entrytype,'modify':int(date),'size':size})
return locallist
#
# login to ftp server
#
def ftp_login(args, log):
ftp = ftplib.FTP()
port = 21
ftp.connect(args.host, port)
try:
log.debug("Logging in...")
ftp.login(args.user, args.password)
log.debug(ftp.getwelcome())
except ftplib.error_perm, resp:
log.logprint(str(resp))
except:
log.logprint("Login section failed..")
return ftp
#
# get remote files
#
def get_remote_files(ftp, remotepath, args, log):
# LIST CONTENTS
contents = list()
dirlist = list()
log.debug("*** GET REMOTE FILELIST ***")
try:
ftp.cwd(remotepath)
        # MLSD returns one machine-readable line of facts per directory entry
ftp.retrlines('MLSD', contents.append)
for line in contents:
# log.debug(line)
entry = line.split(";")
size = "0" #Set this because directories does not report size
for item in entry:
cell = item.split("=")
if (cell[0]=="modify"):
date = cell[1]
modify=calendar.timegm(time.strptime(str(date), "%Y%m%d%H%M%S"))
                    #for/if do not create a new scope in Python, so `modify` assigned here stays visible below
if (cell[0]=="type"):
entrytype=cell[1]
if (cell[0]=="size"):
size = cell[1]
if (len(cell[0])>0) and cell[0].startswith(' '):
#If string does not contain =, cell[1] will not be defined
#and first entry in cell[0] string will be whitespace
name = cell[0].lstrip()
log.debug("Date:"+str(modify)+" type:"+entrytype+" Name:"+name+" size:"+size)
if (entrytype=='file' or entrytype=='dir'): #Do not include current and parent dir entries
dirlist.append({'name':name,'type':entrytype,'modify':int(modify),'size':size})
except ftplib.error_perm, resp:
log.logprint(str(resp))
exit(1)
return dirlist
def touch(fname):
try:
os.utime(fname, None)
except:
log.logprint("Updating mtime failed, "+fname+" does not exist")
def sync_files(ftp, args, skiplines, localpath, remotepath, log):
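    # Mirror localpath onto remotepath: upload files that are new or newer,
    # create missing remote directories, recurse into subdirectories, and
    # delete remote entries that no longer exist locally.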
locallist = get_local_files(localpath,log)
remotelist = get_remote_files(ftp, remotepath, args, log)
#Create dictionaries for easy lookup
localdict = {}
index = 0
for lfile in locallist:
localdict[lfile['name']]=index
index+=1
remotedict = {}
index = 0
for rfile in remotelist:
remotedict[rfile['name']]=index
index+=1
# Traverse local filelist and
# check if local file is present on remote
for lfile in locallist:
#Check if file is present in skipfile
#If present in skipfile, skip to next file in locallist
skiptonext = False
for p in skiplines:
m=p.match(lfile['name'])
if (m):
#log.logprint(lfile['name']+" match "+m.group()+", thus present in skipfile "+args.skipfile)
log.logprint("Skipping: "+lfile['name'])
skiptonext = True
break
if skiptonext: continue
#
#Check if remote has the local file
#if present remote, type file and modify time is older than local file, set upload flag
#
        upload = False #Start as False; it is flipped to True below whenever the
        #remote copy is missing or older than the local file
if lfile['name'] in remotedict:
rfile = remotelist[remotedict[lfile['name']]] #Get fileinfo from remotelist using index
if lfile['type']=="file":
log.debug(lfile['name']+" is present remote : "+rfile['name'])
if (lfile['modify']>rfile['modify']):
log.debug("Local file is newer by "+str(lfile['modify']-rfile['modify'])+" seconds, try to upload...")
upload = True
elif lfile['type']=="dir":
log.debug(lfile['name']+" is present remote and is directory: "+rfile['name'])
sync_files(ftp, args, skiplines, lfile['name'], rfile['name'], log)
elif lfile['type']=="dir":
log.debug(lfile['name']+" is NOT present remote and is directory: ")
try:
ftp.mkd(lfile['name'])
log.logprint("CREATED DIR : "+lfile['name'])
sync_files(ftp, args, skiplines, lfile['name'], lfile['name'], log)
except ftplib.all_errors, resp:
log.logprint("ERROR: Failed to create directory "+lfile['name']+" - "+str(resp))
elif lfile['type']=="file":
log.debug(lfile['name']+" is NOT present remote and is file")
upload = True
#Handle upload flag
if (upload and lfile['type']=="file"):
try:
touch(lfile['name']) #Touch local file to set modify time to approx the same as the remote will get
ftp.storbinary('STOR '+lfile['name'], open(lfile['name'], 'rb'))
log.logprint("UPLOADED : "+lfile['name'])
except ftplib.all_errors, resp:
log.logprint("ERROR: Failed to upload "+lfile['name']+" - "+str(resp))
#Make sure locally deleted items are deleted remotely
for rfile in remotelist:
if rfile['name'] not in localdict:
if rfile['type']=="file":
#Remote file is not present locally=>Delete it
try:
ftp.delete(rfile['name'])
log.logprint("DELETED: "+rfile['name'])
except ftplib.all_errors, resp:
log.logprint("ERROR: Failed to delete "+rfile['name']+" - "+str(resp))
elif rfile['type']=="dir":
log.debug("Remote dir "+rfile['name']+" not present locally, delete it recursively")
                #Remote dir is not present locally: descend and recursively delete everything
                delete_recursive(ftp, args, rfile['name'], log)
ftp.cwd("..")
os.chdir("..")
def delete_recursive(ftp, args, remotepath, log):
remotelist = get_remote_files(ftp, remotepath, args, log)
#Make sure locally deleted items are deleted remotely
for rfile in remotelist:
if rfile['type']=="file":
try:
ftp.delete(rfile['name'])
log.logprint("DELETED: "+rfile['name'])
except ftplib.all_errors, resp:
log.logprint("ERROR: Failed to delete "+rfile['name']+" - "+str(resp))
elif rfile['type']=="dir":
log.debug("Remote dir "+rfile['name']+" not present locally, delete it recursively")
delete_recursive(ftp, args, rfile['name'], log)
ftp.cwd("..")
try:
ftp.rmd(remotepath)
log.logprint("DELETED DIR: "+remotepath)
except ftplib.all_errors, resp:
log.logprint("ERROR: Failed to delete directory "+remotepath+" - "+str(resp))
parser = argparse.ArgumentParser()
parser.add_argument("-o", "--host", help="ftp hostname", required=True)
parser.add_argument("-u", "--user", help="username on ftp server", required=True)
parser.add_argument("-p", "--password", help="password", required=True)
parser.add_argument("-d", "--debug",
help="print debug to terminal, default 0, use multiple times to increase verbosity, i.e. -d -d",
action="count")
parser.add_argument("-b", "--basedir", help="Toplevel directory on ftp server, default www")
parser.add_argument("-t", "--path", help="Local toplevel directory, default ., i.e. current dir")
parser.add_argument("-s", "--skipfile", help="Do not upload files in <skipfile>, default name upload.skip")
parser.set_defaults(debug=0)
parser.set_defaults(skipfile="upload.skip")
parser.set_defaults(basedir="www")
parser.set_defaults(path=".")
args = parser.parse_args()
log = SNYLogger(basename="upload", size_limit=10, no_logfiles=2, stdout=args.debug)
skiplines = read_skipfile(args.skipfile, log)
ftp = ftp_login(args, log)
sync_files(ftp, args, skiplines, args.path, args.basedir, log)
ftp.quit()
| 37.968627 | 122 | 0.567858 | 1,152 | 9,682 | 4.732639 | 0.22309 | 0.034666 | 0.019259 | 0.019259 | 0.307227 | 0.245598 | 0.245598 | 0.221387 | 0.199193 | 0.14215 | 0 | 0.004564 | 0.298492 | 9,682 | 254 | 123 | 38.11811 | 0.798145 | 0.144185 | 0 | 0.271739 | 0 | 0.005435 | 0.172811 | 0 | 0 | 0 | 0 | 0.003937 | 0 | 0 | null | null | 0.01087 | 0.043478 | null | null | 0.086957 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43d92304705312e029e4656dd5bbcccaf8cbee7d | 861 | py | Python | data/models/svm_benchmark.py | Laurenhut/Machine_Learning_Final | 4fca33754ef42acde504cc64e6bbe4e463caadf8 | [
"MIT"
] | null | null | null | data/models/svm_benchmark.py | Laurenhut/Machine_Learning_Final | 4fca33754ef42acde504cc64e6bbe4e463caadf8 | [
"MIT"
] | null | null | null | data/models/svm_benchmark.py | Laurenhut/Machine_Learning_Final | 4fca33754ef42acde504cc64e6bbe4e463caadf8 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
from sklearn import svm
from sklearn.cross_validation import cross_val_score
import numpy as np
import csv_io
def main():
training, target = csv_io.read_data("../Data/train.csv")
training = [x[1:] for x in training]
target = [float(x) for x in target]
test, throwaway = csv_io.read_data("../Data/test.csv")
test = [x[1:] for x in test]
svc = svm.SVC(kernel='poly', degree=2)
    scores = cross_val_score(svc, training, target, cv=10)
print np.mean(scores)
# svc.fit(training, target)
# predicted_probs = svc.predict_proba(test)
# predicted_probs = [[min(max(x,0.001),0.999) for x in y]
# for y in predicted_probs]
# predicted_probs = [["%f" % x for x in y] for y in predicted_probs]
# csv_io.write_delimited_file("../Submissions/svm_benchmark.csv",
# predicted_probs)
if __name__=="__main__":
main()
| 31.888889 | 72 | 0.615563 | 128 | 861 | 3.9375 | 0.453125 | 0.166667 | 0.059524 | 0.051587 | 0.206349 | 0.107143 | 0.107143 | 0.107143 | 0.107143 | 0 | 0 | 0.019908 | 0.24158 | 861 | 26 | 73 | 33.115385 | 0.751914 | 0.423926 | 0 | 0 | 0 | 0 | 0.092213 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.153846 | null | null | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43ddbd75df809ab6f556d3498600ef7c94a80521 | 16,408 | py | Python | bootstrap.py | tqchen/yarn-ec2 | 303f3980ad41770011b72532ed9f7c6bbe876508 | [
"Apache-2.0"
] | 35 | 2016-02-23T19:15:46.000Z | 2021-01-01T02:57:43.000Z | bootstrap.py | tqchen/cloud-scripts | 303f3980ad41770011b72532ed9f7c6bbe876508 | [
"Apache-2.0"
] | 4 | 2016-11-12T16:49:16.000Z | 2018-11-02T21:20:23.000Z | bootstrap.py | tqchen/yarn-ec2 | 303f3980ad41770011b72532ed9f7c6bbe876508 | [
"Apache-2.0"
] | 25 | 2016-02-26T20:28:13.000Z | 2020-07-26T12:02:34.000Z | #!/usr/bin/env python
# encoding: utf-8
"""
Script to install all the necessary things
for working on a bare Linux machine with nothing preinstalled.
Installs the minimum dependencies.
"""
import sys
import os
import logging
import subprocess
import xml.etree.ElementTree as ElementTree
import xml.dom.minidom as minidom
import socket
import time
import pwd
###---------------------------------------------------##
# Configuration Section, will be modified by script #
###---------------------------------------------------##
node_apt_packages = [
'emacs',
'git',
'g++',
'make',
'python-numpy',
'libprotobuf-dev',
'libcurl4-openssl-dev']
# master only packages
master_apt_packages = [
'protobuf-compiler']
# List of r packages to be installed in master
master_r_packages = [
'r-base-dev',
'r-base',
'r-cran-statmod',
'r-cran-RCurl',
'r-cran-rjson'
]
# download link of hadoop.
hadoop_url = 'http://apache.claz.org/hadoop/common/hadoop-2.8.0/hadoop-2.8.0.tar.gz'
hadoop_dir = 'hadoop-2.8.0'
# customized installation script.
# See optional installation scripts for options.
def custom_master_install():
#install_spark()
#install_r()
pass
# customized installation script for all nodes.
def custom_all_nodes_install():
install_gcc()
pass
###---------------------------------------------------##
# Automatically set by script #
###---------------------------------------------------##
USER_NAME = 'ubuntu'
# setup variables
MASTER = os.getenv('MY_MASTER_DNS', '')
# node type the type of current node
NODE_TYPE = os.getenv('MY_NODE_TYPE', 'm3.xlarge')
NODE_VMEM = int(os.getenv('MY_NODE_VMEM', str(1024*15)))
NODE_VCPU = int(os.getenv('MY_NODE_VCPU', '4'))
AWS_ID = os.getenv('AWS_ACCESS_KEY_ID', 'undefined')
AWS_KEY = os.getenv('AWS_SECRET_ACCESS_KEY', 'undefined')
JAVA_HOME = os.getenv('JAVA_HOME')
HADOOP_HOME = os.getenv('HADOOP_HOME')
DISK_LIST = [('xvd' + chr(ord('b') + i)) for i in range(10)]
ENVIRON = os.environ.copy()
###--------------------------------##
# Optional installation scripts. #
###--------------------------------##
def install_r():
if master_r_packages:
sudo("apt-key adv --keyserver keyserver.ubuntu.com --recv-keys E084DAB9")
sudo("echo deb https://cran.r-project.org/bin/linux/ubuntu trusty/ >>/etc/apt/sources.list")
sudo('apt-get -y update')
sudo('apt-get -y install %s' % (' '.join(master_r_packages)))
def install_spark():
run('wget https://www.apache.org/dist/spark/spark-2.1.1/spark-2.1.1-bin-hadoop2.7.tgz')
run('tar xf spark-2.1.1-bin-hadoop2.7.tgz')
run('rm -rf spark-2.1.1-bin-hadoop2.7.tgz')
with open('.bashrc', 'a') as fo:
fo.write('\nexport PATH=${PATH}:spark-2.1.1-bin-hadoop2.7\n')
def install_xgboost():
run('git clone --recursive https://github.com/dmlc/xgboost')
run('cd xgboost; cp make/config.mk .; echo USE_S3=1 >> config.mk; make -j4')
### Script section ###
def run(cmd):
try:
print cmd
logging.info(cmd)
proc = subprocess.Popen(cmd, shell=True, env = ENVIRON,
stdout=subprocess.PIPE, stderr = subprocess.PIPE)
out, err = proc.communicate()
retcode = proc.poll()
if retcode != 0:
logging.error('Command %s returns %d' % (cmd,retcode))
logging.error(out)
logging.error(err)
else:
print out
except Exception as e:
print(str(e))
logging.error('Exception running: %s' % cmd)
logging.error(str(e))
pass
def sudo(cmd):
run('sudo %s' % cmd)
### Installation helpers ###
def install_packages(pkgs):
sudo('apt-get -y update')
sudo('apt-get -y install %s' % (' '.join(pkgs)))
# install g++4.9, needed for regex match.
def install_gcc():
sudo('add-apt-repository -y ppa:ubuntu-toolchain-r/test')
sudo('apt-get -y update')
sudo('apt-get -y install g++-4.9')
def install_java():
"""
    install java and set up environment variables
    Returns the environment variables that need to be exported
"""
if not os.path.exists('jdk1.8.0_131'):
run('wget --no-check-certificate --no-cookies'\
' --header \"Cookie: oraclelicense=accept-securebackup-cookie\"'\
' http://download.oracle.com/otn-pub/java/jdk/8u131-b11/d54c1d3a095b4ff2b6607d096fa80163/jdk-8u131-linux-x64.tar.gz')
run('tar xf jdk-8u131-linux-x64.tar.gz')
run('rm -f jdk-8u131-linux-x64.tar.gz')
global JAVA_HOME
if JAVA_HOME is None:
JAVA_HOME = os.path.abspath('jdk1.8.0_131')
return [('JAVA_HOME', JAVA_HOME)]
def install_hadoop(is_master):
def update_site(fname, rmap):
"""
        update (or create) a Hadoop XML site file, setting each name in rmap to its value
"""
try:
tree = ElementTree.parse(fname)
root = tree.getroot()
except Exception:
cfg = ElementTree.Element("configuration")
tree = ElementTree.ElementTree(cfg)
root = tree.getroot()
rset = set()
for prop in root.getiterator('property'):
prop = dict((p.tag, p) for p in prop)
name = prop['name'].text.strip()
if name in rmap:
prop['value'].text = str(rmap[name])
rset.add(name)
for name, text in rmap.iteritems():
if name in rset:
continue
prop = ElementTree.SubElement(root, 'property')
ElementTree.SubElement(prop, 'name').text = name
ElementTree.SubElement(prop, 'value').text = str(text)
rough_string = ElementTree.tostring(root, 'utf-8')
reparsed = minidom.parseString(rough_string)
pretty = reparsed.toprettyxml(indent='\t')
fo = open(fname, 'w')
fo.write(pretty)
fo.close()
def setup_hadoop_site(master, hadoop_dir, hdfs_dir, vcpu, vmem):
"""
        setup the hadoop site files given the parameters
        Parameters
        ----------
        master: the DNS name of the master
        hadoop_dir: the directories to store temp files in
        hdfs_dir: the directories for hdfs
        vcpu: the number of CPUs the current machine has
        vmem: the memory (MB) the current machine has
"""
if vmem < 4 * 1024:
reserved_ram = 256
elif vmem < 8 * 1024:
reserved_ram = 1 * 1024
elif vmem < 24 * 1024 :
reserved_ram = 2 * 1024
elif vmem < 48 * 1024:
reserved_ram = 2 * 1024
elif vmem < 64 * 1024:
reserved_ram = 6 * 1024
else:
reserved_ram = 8 * 1024
ram_per_container = (vmem - reserved_ram) / vcpu
if is_master:
vcpu = vcpu - 2
tmp_dir = hadoop_dir[0]
core_site = {
'fs.defaultFS': 'hdfs://%s:9000/' % master,
'fs.s3n.impl': 'org.apache.hadoop.fs.s3native.NativeS3FileSystem',
'hadoop.tmp.dir': tmp_dir
}
if AWS_ID != 'undefined':
core_site['fs.s3n.awsAccessKeyId'] = AWS_ID
core_site['fs.s3n.awsSecretAccessKey'] = AWS_KEY
update_site('%s/etc/hadoop/core-site.xml' % HADOOP_HOME, core_site)
hdfs_site = {
'dfs.data.dir': ','.join(['%s/data' % d for d in hdfs_dir]),
'dfs.permissions': 'false',
'dfs.replication': '1'
}
update_site('%s/etc/hadoop/hdfs-site.xml' % HADOOP_HOME, hdfs_site)
yarn_site = {
'yarn.resourcemanager.resource-tracker.address': '%s:8025' % master,
'yarn.resourcemanager.scheduler.address': '%s:8030' % master,
'yarn.resourcemanager.address': '%s:8032' % master,
'yarn.scheduler.minimum-allocation-mb': 512,
'yarn.scheduler.maximum-allocation-mb': 640000,
'yarn.scheduler.minimum-allocation-vcores': 1,
'yarn.scheduler.maximum-allocation-vcores': 32,
'yarn.nodemanager.resource.memory-mb': vcpu * ram_per_container,
'yarn.nodemanager.resource.cpu-vcores': vcpu,
'yarn.log-aggregation-enable': 'true',
'yarn.nodemanager.vmem-check-enabled': 'false',
'yarn.nodemanager.aux-services': 'mapreduce_shuffle',
'yarn.nodemanager.aux-services.mapreduce.shuffle.class': 'org.apache.hadoop.mapred.ShuffleHandler',
'yarn.nodemanager.remote-app-log-dir': os.path.join(tmp_dir, 'logs'),
'yarn.nodemanager.log-dirs': os.path.join(tmp_dir, 'userlogs'),
'yarn.nodemanager.local-dirs': ','.join(['%s/yarn/nm-local-dir' % d for d in hadoop_dir])
}
update_site('%s/etc/hadoop/yarn-site.xml' % HADOOP_HOME, yarn_site)
mapred_site = {
'mapreduce.application.classpath' : ':'.join(['$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*',
'$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*',
'$HADOOP_MAPRED_HOME/share/hadoop/tools/lib/*']),
'yarn.app.mapreduce.am.resource.mb': 2 * ram_per_container,
'yarn.app.mapreduce.am.command-opts': '-Xmx%dm' % int(0.8 * 2 * ram_per_container),
'mapreduce.framework.name': 'yarn',
'mapreduce.map.cpu.vcores': 1,
'mapreduce.map.memory.mb': ram_per_container,
'mapreduce.map.java.opts': '-Xmx%dm' % int(0.8 * ram_per_container),
'mapreduce.reduce.cpu.vcores': 1,
'mapreduce.reduce.memory.mb': 2 * ram_per_container,
'mapreduce.reduce.java.opts': '-Xmx%dm' % int(0.8 * ram_per_container)
}
update_site('%s/etc/hadoop/mapred-site.xml' % HADOOP_HOME, mapred_site)
capacity_site = {
'yarn.scheduler.capacity.resource-calculator': 'org.apache.hadoop.yarn.util.resource.DominantResourceCalculator'
}
update_site('%s/etc/hadoop/capacity-scheduler.xml' % HADOOP_HOME, capacity_site)
fo = open('%s/etc/hadoop/hadoop-env.sh' % HADOOP_HOME, 'w')
fo.write('export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HADOOP_PREFIX/share/hadoop/tools/lib/*\n')
fo.write('export HADOOP_LOG_DIR=%s/log\n' % tmp_dir)
fo.write('export YARN_LOG_DIR=%s/log\n' % tmp_dir)
fo.write('export JAVA_HOME=\"%s\"\n' % JAVA_HOME)
fo.close()
fo = open('%s/etc/hadoop/slaves' % HADOOP_HOME, 'w')
fo.write(master + '\n')
fo.close()
def run_install():
if not os.path.exists('hadoop-2.8.0'):
run('wget %s' % hadoop_url)
run('tar xf hadoop-2.8.0.tar.gz')
run('rm -f hadoop-2.8.0.tar.gz')
global HADOOP_HOME
if HADOOP_HOME is None:
HADOOP_HOME = os.path.abspath('hadoop-2.8.0')
env = [('HADOOP_HOME', HADOOP_HOME)]
env += [('HADOOP_PREFIX', HADOOP_HOME)]
env += [('HADOOP_MAPRED_HOME', HADOOP_HOME)]
env += [('HADOOP_COMMON_HOME', HADOOP_HOME)]
env += [('HADOOP_HDFS_HOME', HADOOP_HOME)]
env += [('YARN_HOME', HADOOP_HOME)]
env += [('YARN_CONF_DIR', '%s/etc/hadoop' % HADOOP_HOME)]
env += [('HADOOP_CONF_DIR', '%s/etc/hadoop' % HADOOP_HOME)]
disks = ['/disk/%s' % d for d in DISK_LIST if os.path.exists('/dev/%s' % d)]
setup_hadoop_site(MASTER,
['%s/hadoop' % d for d in disks],
['%s/hadoop/dfs' % d for d in disks],
NODE_VCPU, NODE_VMEM)
return env
return run_install()
def regsshkey(fname):
for dns in (open(fname).readlines() + ['localhost', '0.0.0.0']):
try:
run('ssh-keygen -R %s' % dns.strip())
except:
pass
run('ssh-keyscan %s >> ~/.ssh/known_hosts' % dns.strip())
# main script to install all dependencies
def install_main(is_master):
if is_master:
install_packages(master_apt_packages + node_apt_packages)
else:
install_packages(node_apt_packages)
env = []
env += install_java()
env += install_hadoop(is_master)
path = ['$HADOOP_HOME/bin', '$HADOOP_HOME/sbin', '$JAVA_HOME/bin']
env += [('LD_LIBRARY_PATH', '$HADOOP_HOME/native/lib')]
env += [('LD_LIBRARY_PATH', '${LD_LIBRARY_PATH}:$HADOOP_HDFS_HOME/lib/native:$JAVA_HOME/jre/lib/amd64/server')]
env += [('LD_LIBRARY_PATH', '${LD_LIBRARY_PATH}:/usr/local/lib')]
env += [('LIBHDFS_OPTS', '--Xmx128m')]
env += [('MY_MASTER_DNS', MASTER)]
env += [('MY_NODE_TYPE', NODE_TYPE)]
env += [('MY_NODE_VMEM', str(NODE_VMEM))]
env += [('MY_NODE_VCPU', str(NODE_VCPU))]
if AWS_ID != 'undefined':
env += [('AWS_ACCESS_KEY_ID', AWS_ID)]
if AWS_KEY != 'undefined':
env += [('AWS_SECRET_ACCESS_KEY', AWS_KEY)]
# setup environments
fo = open('.hadoop_env', 'w')
for k, v in env:
fo.write('export %s=%s\n' % (k,v))
ENVIRON[k] = v
fo.write('export PATH=$PATH:%s\n' % (':'.join(path)))
fo.write('export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/usr/local/lib\n')
fo.close()
for l in open('.bashrc'):
if l.find('.hadoop_env') != -1:
return
run('echo source ~/.hadoop_env >> ~/.bashrc')
# allow ssh, if they already share the key.
key_setup = """
[ -f ~/.ssh/id_rsa ] ||
(ssh-keygen -q -t rsa -N '' -f ~/.ssh/id_rsa &&
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys)
"""
run(key_setup)
regsshkey('%s/etc/hadoop/slaves' % HADOOP_HOME)
# end of instalation.
# Make startup script for bulding
def make_startup_script(is_master):
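    # Writes and runs startup.sh: format and mount the ephemeral disks, recreate
    # the Hadoop/HDFS directories on them, then start the daemons (the full
    # stack on the master, only the YARN node manager on workers).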
assert JAVA_HOME is not None
assert HADOOP_HOME is not None
assert NODE_VCPU is not None
assert NODE_VMEM is not None
disks = []
cmds = []
if is_master:
cmds.append('$HADOOP_HOME/sbin/stop-all.sh')
for d in DISK_LIST:
if os.path.exists('/dev/%s' % d):
cmds.append('sudo umount /dev/%s' % d)
cmds.append('sudo mkfs -t ext4 /dev/%s' % d)
cmds.append('sudo mkdir -p /disk/%s' % d)
cmds.append('sudo mount /dev/%s /disk/%s' % (d, d))
disks.append('/disk/%s' % d)
for d in disks:
cmds.append('sudo mkdir -p %s/hadoop' %d)
cmds.append('sudo chown ubuntu:ubuntu %s/hadoop' % d)
cmds.append('sudo mkdir -p %s/tmp' %d)
cmds.append('sudo chown ubuntu:ubuntu %s/tmp' % d)
cmds.append('rm -rf %s/hadoop/dfs' % d)
cmds.append('mkdir %s/hadoop/dfs' % d)
cmds.append('mkdir %s/hadoop/dfs/name' % d)
cmds.append('mkdir %s/hadoop/dfs/data' % d)
# run command
if is_master:
cmds.append('$HADOOP_HOME/bin/hadoop namenode -format')
cmds.append('$HADOOP_HOME/sbin/start-all.sh')
else:
cmds.append('export HADOOP_LIBEXEC_DIR=$HADOOP_HOME/libexec &&'\
' $HADOOP_HOME/sbin/yarn-daemon.sh --config $HADOOP_HOME/etc/hadoop start nodemanager')
with open('startup.sh', 'w') as fo:
fo.write('#!/bin/bash\n')
fo.write('set -v\n')
fo.write('\n'.join(cmds))
run('chmod +x startup.sh')
run('./startup.sh')
def main():
global MASTER
logging.basicConfig(filename = 'bootstrap.log', level = logging.INFO,
format='%(asctime)s %(levelname)s %(message)s')
if MASTER == '':
is_master = True
MASTER = socket.getfqdn()
logging.info('assuming master is myself as %s' % MASTER)
else:
is_master = socket.getfqdn() == MASTER
tstart = time.time()
install_main(is_master)
tmid = time.time()
logging.info('installation finishes in %g secs' % (tmid - tstart))
make_startup_script(is_master)
ENVIRON['HADOOP_HOME'] = HADOOP_HOME
ENVIRON['JAVA_HOME'] = JAVA_HOME
tend = time.time()
if is_master:
custom_master_install()
custom_all_nodes_install()
logging.info('boostrap finishes in %g secs' % (tend - tmid))
logging.info('all finishes in %g secs' % (tend - tstart))
if __name__ == '__main__':
pw_record = pwd.getpwnam(USER_NAME)
user_name = pw_record.pw_name
user_home_dir = pw_record.pw_dir
user_uid = pw_record.pw_uid
user_gid = pw_record.pw_gid
env = os.environ.copy()
cwd = user_home_dir
ENVIRON['HOME'] = user_home_dir
os.setgid(user_gid)
os.setuid(user_uid)
os.chdir(user_home_dir)
main()
| 37.461187 | 133 | 0.585629 | 2,146 | 16,408 | 4.332246 | 0.21575 | 0.036571 | 0.013015 | 0.006776 | 0.216952 | 0.13897 | 0.094224 | 0.062493 | 0.053028 | 0.03969 | 0 | 0.019511 | 0.250305 | 16,408 | 437 | 134 | 37.546911 | 0.736282 | 0.059727 | 0 | 0.086455 | 0 | 0.017291 | 0.342273 | 0.149036 | 0 | 0 | 0 | 0 | 0.011527 | 0 | null | null | 0.011527 | 0.025937 | null | null | 0.008646 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
43e2d67fdf43b1951abb85a9aaab6711fb8852be | 1,132 | py | Python | tests/core/test_plugins.py | franalgaba/nile | f771467f27f03c8d20b8032bac64b3ab60436d3c | [
"MIT"
] | null | null | null | tests/core/test_plugins.py | franalgaba/nile | f771467f27f03c8d20b8032bac64b3ab60436d3c | [
"MIT"
] | null | null | null | tests/core/test_plugins.py | franalgaba/nile | f771467f27f03c8d20b8032bac64b3ab60436d3c | [
"MIT"
] | null | null | null | """
Tests for plugins in core module.
Only unit tests for now.
"""
from unittest.mock import patch
import click
from nile.core.plugins import get_installed_plugins, load_plugins, skip_click_exit
def test_skip_click_exit():
def dummy_method(a, b):
return a + b
dummy_result = dummy_method(1, 2)
decorated = skip_click_exit(dummy_method)
decorated_result = decorated(1, 2)
assert callable(decorated)
assert dummy_result == decorated_result
def testget_installed_plugins():
class Dummy:
value = "nile.core.plugins.get_installed_plugins"
name = "get_installed_plugins"
with patch("nile.core.plugins.entry_points", return_value=[Dummy()]):
installed_plugins = get_installed_plugins()
assert "get_installed_plugins" in installed_plugins
def test_load_plugins():
@click.group()
def cli():
"""Nile CLI group."""
pass
def dummy():
print("dummy_result")
with patch(
"nile.core.plugins.get_installed_plugins", return_value={"dummy": dummy}
):
app = load_plugins(cli)
assert callable(app)
| 22.64 | 82 | 0.681095 | 144 | 1,132 | 5.090278 | 0.319444 | 0.196453 | 0.155525 | 0.106412 | 0.13779 | 0.092769 | 0 | 0 | 0 | 0 | 0 | 0.00453 | 0.219965 | 1,132 | 49 | 83 | 23.102041 | 0.825595 | 0.066254 | 0 | 0 | 0 | 0 | 0.159962 | 0.143678 | 0 | 0 | 0 | 0 | 0.137931 | 1 | 0.206897 | false | 0.034483 | 0.103448 | 0.034483 | 0.448276 | 0.034483 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
78d8d23f31a9ec6e42dd56f7cc23f8c31fbd70c2 | 376 | py | Python | django_git_info/management/commands/get_git_info.py | spapas/django-git | a62215d315263bce5d5d0afcfa14152601f76901 | [
"MIT"
] | 1 | 2019-03-15T10:32:21.000Z | 2019-03-15T10:32:21.000Z | django_git_info/management/commands/get_git_info.py | spapas/django-git | a62215d315263bce5d5d0afcfa14152601f76901 | [
"MIT"
] | null | null | null | django_git_info/management/commands/get_git_info.py | spapas/django-git | a62215d315263bce5d5d0afcfa14152601f76901 | [
"MIT"
] | 1 | 2016-03-25T03:57:49.000Z | 2016-03-25T03:57:49.000Z | # -*- coding: utf-8 -*-
from django.core.management.base import BaseCommand, CommandError
from django_git_info import get_git_info
class Command(BaseCommand):
help = 'Gets git info'
#@transaction.commit_manually
def handle(self, *args, **options):
info = get_git_info()
for key in info.keys():
print '{0}={1}'.format(key, info[key]) | 26.857143 | 65 | 0.656915 | 50 | 376 | 4.8 | 0.68 | 0.116667 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010101 | 0.210106 | 376 | 14 | 66 | 26.857143 | 0.79798 | 0.130319 | 0 | 0 | 0 | 0 | 0.061538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.25 | null | null | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
78db0363110019cfe555b18f1fdc95de024b7945 | 19,306 | py | Python | mevis/_internal/conversion.py | robert-haas/mevis | 1bbf8dfb56aa8fc52b8f38c570ee7b2d2a9d3327 | [
"Apache-2.0"
] | 2 | 2022-01-12T23:08:52.000Z | 2022-01-12T23:21:23.000Z | mevis/_internal/conversion.py | robert-haas/mevis | 1bbf8dfb56aa8fc52b8f38c570ee7b2d2a9d3327 | [
"Apache-2.0"
] | null | null | null | mevis/_internal/conversion.py | robert-haas/mevis | 1bbf8dfb56aa8fc52b8f38c570ee7b2d2a9d3327 | [
"Apache-2.0"
] | null | null | null | from collections.abc import Callable as _Callable
import networkx as _nx
from opencog.type_constructors import AtomSpace as _AtomSpace
from .args import check_arg as _check_arg
def convert(data, graph_annotated=True, graph_directed=True,
node_label=None, node_color=None, node_opacity=None, node_size=None, node_shape=None,
node_border_color=None, node_border_size=None,
node_label_color=None, node_label_size=None, node_hover=None, node_click=None,
node_image=None, node_properties=None,
edge_label=None, edge_color=None, edge_opacity=None, edge_size=None,
edge_label_color=None, edge_label_size=None, edge_hover=None, edge_click=None):
"""Convert an Atomspace or list of Atoms to a NetworkX graph with annotations.
Several arguments accept a Callable.
- In case of node annotations, the Callable gets an Atom as input,
which the node represents in the graph.
The Callable needs to return one of the other types accepted by the argument,
e.g. ``str`` or ``int``/``float``.
- In case of edge annotations, the Callable gets two Atoms as input,
which the edge connects in the graph.
The Callable needs to return one of the other types accepted by the argument,
e.g. ``str`` or ``int``/``float``.
Several arguments accept a color, which can be in following formats:
- Name: ``"black"``, ``"red"``, ``"green"``, ...
- Color code
- 6 digit hex RGB code: ``"#05ac05"``
- 3 digit hex RGB code: ``"#0a0"`` (equivalent to ``"#00aa00"``)
Parameters
----------
    data : AtomSpace, list of Atoms
Input that gets converted to a graph.
graph_annotated : bool
If ``False``, no annotations are added to the graph. This could be used for
converting large AtomSpaces quickly to graphs that use less RAM and can
be exported to smaller files (e.g. also compressed as gml.gz) for inspection
with other tools.
graph_directed : bool
If ``True``, a NetworkX DiGraph is created. If ``False``, a NetworkX Graph is created.
node_label : str, Callable
Set a label for each node, which is shown as text below it.
node_color : str, Callable
Set a color for each node, which becomes the fill color of its shape.
node_opacity : float between 0.0 and 1.0
Set an opacity for each node, which becomes the opacity of its shape.
Caution: This is only supported by d3.
node_size : int, float, Callable
Set a size for each node, which becomes the height and width of its shape.
node_shape : str, Callable
Set a shape for each node, which is some geometrical form that has the
node coordinates in its center.
Possible values: ``"circle"``, ``"rectangle"``, ``"hexagon"``
node_border_color : str, Callable
Set a border color for each node, which influences the border drawn around its shape.
node_border_size : int, float, Callable
Set a border size for each node, which influences the border drawn around its shape.
node_label_color : str, Callable
Set a label color for each node, which determines the font color
of the text below the node.
node_label_size : int, float, Callable
Set a label size for each node, which determines the font size
of the text below the node.
node_hover : str, Callable
        Set a hover text for each node, which shows up beside the mouse cursor
when hovering over a node.
node_click : str, Callable
Set a click text for each node, which shows up in a div element below the plot
when clicking on a node and can easily be copied and pasted.
node_image : str, Callable
Set an image for each node, which appears within its shape.
Possible values:
- URL pointing to an image
- Data URL encoding the image
node_properties : str, dict, Callable
Set additional properties for each node, which may not immediately be translated
into a visual element, but can be chosen in the data selection menu in the
interactive HTML visualizations to map them on some plot element.
These properties also appear when exporting a graph to a file in a format
such as GML and may be recognized by external visualization tools.
Note that a Callable needs to return a dict in this case, and each key becomes
a property, which is equivalent to the other properties such as node_size and
node_color.
Special cases:
- ``node_properties="tv"`` is a shortcut for using a function that returns
``{"mean": atom.tv.mean, "confidence": atom.tv.confidence}``
        - The keys ``"x"``, ``"y"`` and ``"z"`` are translated into node coordinates.
Examples:
- ``dict(x=0.0)``: This fixes the x coordinate of each node to 0.0, so that the
JavaScript layout algorithm does not influence it, but the nodes remain
free to move in the y and z directions.
- ``lambda atom: dict(x=2.0) if atom.is_node() else None``:
This fixes the x coordinate of each Atom of type Node to 2.0
but allows each Atom of type Link to move freely.
- ``lambda atom: dict(y=-len(atom.out)*100) if atom.is_link() else dict(y=0)``
This fixes the y coordinates of Atoms at different heights. Atoms of type Node
are put at the bottom and Atoms of type Link are ordered by the number of their
outgoing edges. The results is a hierarchical visualization that has some
similarity with the "dot" layout.
- ``lambda atom: dict(x=-100) if atom.is_node() else dict(x=100)``:
This fixes the x coordinate of Node Atoms at -100 and of Link Atoms at 100.
The results is a visualization with two lines of nodes that has some
similarity with the "bipartite" layout.
edge_label : str, Callable
Set a label for each edge, which becomes the text plotted in the middle of the edge.
edge_color : str, Callable
Set a color for each edge, which becomes the color of the line representing the edge.
edge_opacity : int, float, Callable
        Set an opacity for each edge, which allows making it transparent to some degree.
edge_size : int, float, Callable
Set a size for each edge, which becomes the width of the line representing the edge.
edge_label_color : str, Callable
Set a color for each edge label, which becomes the color of the text in the midpoint
of the edge.
edge_label_size : int, float, Callable
Set a size for each edge label, which becomes the size of the text in the midpoint
of the edge.
edge_hover : str, Callable
edge_click : str, Callable
Returns
-------
graph : NetworkX Graph or DiGraph
Whether an undirected or directed graph is created depends on the argument "directed".
"""
# Argument processing
_check_arg(data, 'data', (list, _AtomSpace))
_check_arg(graph_annotated, 'graph_annotated', bool)
_check_arg(graph_directed, 'graph_directed', bool)
_check_arg(node_label, 'node_label', (str, _Callable), allow_none=True)
_check_arg(node_color, 'node_color', (str, _Callable), allow_none=True)
_check_arg(node_opacity, 'node_opacity', (int, float, _Callable), allow_none=True)
_check_arg(node_size, 'node_size', (int, float, _Callable), allow_none=True)
_check_arg(node_shape, 'node_shape', (str, _Callable), allow_none=True)
_check_arg(node_border_color, 'node_border_color', (str, _Callable), allow_none=True)
_check_arg(node_border_size, 'node_border_size', (int, float, _Callable), allow_none=True)
_check_arg(node_label_color, 'node_label_color', (str, _Callable), allow_none=True)
_check_arg(node_label_size, 'node_label_size', (int, float, _Callable), allow_none=True)
_check_arg(node_hover, 'node_hover', (str, _Callable), allow_none=True)
_check_arg(node_click, 'node_click', (str, _Callable), allow_none=True)
_check_arg(node_image, 'node_image', (str, _Callable), allow_none=True)
_check_arg(node_properties, 'node_properties', (str, dict, _Callable), allow_none=True)
_check_arg(edge_label, 'edge_label', (str, _Callable), allow_none=True)
_check_arg(edge_color, 'edge_color', (str, _Callable), allow_none=True)
_check_arg(edge_opacity, 'edge_opacity', (int, float, _Callable), allow_none=True)
_check_arg(edge_size, 'edge_size', (int, float, _Callable), allow_none=True)
_check_arg(edge_label_color, 'edge_label_color', (str, _Callable), allow_none=True)
_check_arg(edge_label_size, 'edge_label_size', (int, float, _Callable), allow_none=True)
_check_arg(edge_hover, 'edge_hover', (str, _Callable), allow_none=True)
_check_arg(edge_click, 'edge_click', (str, _Callable), allow_none=True)
    # Prepare annotation functions
if graph_annotated:
node_ann = prepare_node_func(
node_label, node_color, node_opacity, node_size, node_shape, node_border_color,
node_border_size, node_label_color, node_label_size, node_hover, node_click,
node_image, node_properties)
edge_ann = prepare_edge_func(
edge_label, edge_color, edge_opacity, edge_size,
edge_label_color, edge_label_size, edge_hover, edge_click)
else:
empty = dict()
def node_ann(atom):
return empty
def edge_ann(atom1, atom2):
return empty
# Create the NetworkX graph
graph = _nx.DiGraph() if graph_directed else _nx.Graph()
# 0) Set graph annotations
graph.graph['node_click'] = '$hover' # node_click will by default show content of node_hover
# 1) Add vertices and their annotations
for atom in data:
graph.add_node(to_uid(atom), **node_ann(atom))
# 2) Add edges and their annotations (separate step to exclude edges to filtered vertices)
for atom in data:
uid = to_uid(atom)
if atom.is_link():
# for all that is incoming to the Atom
for atom2 in atom.incoming:
uid2 = to_uid(atom2)
if uid2 in graph.nodes:
graph.add_edge(uid2, uid, **edge_ann(atom2, atom))
# for all that is outgoing of the Atom
for atom2 in atom.out:
uid2 = to_uid(atom2)
if uid2 in graph.nodes:
graph.add_edge(uid, uid2, **edge_ann(atom, atom2))
return graph
def prepare_node_func(node_label, node_color, node_opacity, node_size, node_shape,
node_border_color, node_border_size, node_label_color, node_label_size,
node_hover, node_click, node_image, node_properties):
"""Prepare a function that calculates all annoations for a node representing an Atom."""
# individual node annotation functions
node_label = use_node_def_or_str(node_label, node_label_default)
node_color = use_node_def_or_str(node_color, node_color_default)
node_opacity = use_node_def_or_num(node_opacity, node_opacity_default)
node_size = use_node_def_or_num(node_size, node_size_default)
node_shape = use_node_def_or_str(node_shape, node_shape_default)
node_border_color = use_node_def_or_str(node_border_color, node_border_color_default)
node_border_size = use_node_def_or_num(node_border_size, node_border_size_default)
node_label_color = use_node_def_or_str(node_label_color, node_label_color_default)
node_label_size = use_node_def_or_num(node_label_size, node_label_size_default)
node_hover = use_node_def_or_str(node_hover, node_hover_default)
node_click = use_node_def_or_str(node_click, node_click_default)
node_image = use_node_def_or_str(node_image, node_image_default)
# special case: additional user-defined node properties by a function that returns a dict
if node_properties is None:
node_properties = node_properties_default
elif isinstance(node_properties, dict):
val = node_properties
def node_properties(atom):
return val
elif node_properties == 'tv':
node_properties = node_properties_tv
# combined node annotation function: calls each of the individual ones
name_func = (
('label', node_label),
('color', node_color),
('opacity', node_opacity),
('size', node_size),
('shape', node_shape),
('border_color', node_border_color),
('border_size', node_border_size),
('label_color', node_label_color),
('label_size', node_label_size),
('hover', node_hover),
('click', node_click),
('image', node_image),
)
def func(atom):
data = {}
for n, f in name_func:
val = f(atom)
if val is not None:
data[n] = val
        try:
            # node_properties may return None (the default) or a non-dict;
            # dict.update then raises and the extra annotations are skipped
            data.update(node_properties(atom))
        except Exception:
            pass
return data
return func
def prepare_edge_func(edge_label, edge_color, edge_opacity, edge_size, edge_label_color,
edge_label_size, edge_hover, edge_click):
"""Prepare a function that calculates all annoations for an edge between Atoms."""
# individual edge annotation functions
edge_label = use_edge_def_or_str(edge_label, edge_label_default)
edge_color = use_edge_def_or_str(edge_color, edge_color_default)
edge_opacity = use_edge_def_or_num(edge_opacity, edge_opacity_default)
edge_size = use_edge_def_or_num(edge_size, edge_size_default)
edge_label_color = use_edge_def_or_str(edge_label_color, edge_label_color_default)
edge_label_size = use_edge_def_or_num(edge_label_size, edge_label_size_default)
edge_hover = use_edge_def_or_str(edge_hover, edge_hover_default)
edge_click = use_edge_def_or_str(edge_click, edge_click_default)
# combined edge annotation function: calls each of the individual ones
name_func = (
('label', edge_label),
('color', edge_color),
('opacity', edge_opacity),
('size', edge_size),
('label_color', edge_label_color),
('label_size', edge_label_size),
('hover', edge_hover),
('click', edge_click),
)
def func(atom1, atom2):
data = {}
for n, f in name_func:
val = f(atom1, atom2)
if val is not None:
data[n] = val
return data
return func
def use_node_def_or_str(given_value, default_func):
"""Transform a value of type (None, str, Callable) to a node annotation function."""
# Default: use pre-defined function from this module
if given_value is None:
func = default_func
# Transform: value to function that returns the value
elif isinstance(given_value, str):
given_value = str(given_value)
def func(atom):
return given_value
# Passthrough: value itself is a function
else:
func = given_value
return func
def use_node_def_or_num(given_value, default_func):
"""Transform a value of type (None, int, float, Callable) to a node annotation function."""
# Default: use pre-defined function from this module
if given_value is None:
func = default_func
# Transform: value to function that returns the value
elif isinstance(given_value, (int, float)):
given_value = float(given_value)
def func(atom):
return given_value
# Passthrough: value itself is a function
else:
func = given_value
return func
def use_edge_def_or_str(given_value, default_func):
"""Transform a value of type (None, str, Callable) to an edge annotation function."""
# Default: use pre-defined function from this module
if given_value is None:
func = default_func
# Transform: value to function that returns the value
elif isinstance(given_value, str):
given_value = str(given_value)
def func(atom1, atom2):
return given_value
# Passthrough: value itself is a function
else:
func = given_value
return func
def use_edge_def_or_num(given_value, default_func):
"""Transform a value of type (None, int, float, Callable) to an edge annotation function."""
# Default: use pre-defined function from this module
if given_value is None:
func = default_func
# Transform: value to function that returns the value
elif isinstance(given_value, (int, float)):
given_value = float(given_value)
def func(atom1, atom2):
return given_value
# Passthrough: value itself is a function
else:
func = given_value
return func
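# Illustrative sketch of how the use_*_def_or_* helpers above normalize the
# three accepted argument styles (None, plain value, callable) to a callable;
# "atom" stands for a hypothetical Atom instance:
#
#   use_node_def_or_str(None, node_color_default)     # -> node_color_default
#   f = use_node_def_or_str('green', node_color_default)
#   f(atom)                                           # -> 'green'
#   g = use_node_def_or_num(5, node_size_default)
#   g(atom)                                           # -> 5.0 (cast to float)
#   use_node_def_or_str(lambda a: 'blue', node_color_default)  # passthrough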
def to_uid(atom):
"""Return a unique identifier for an Atom."""
return atom.id_string()
# Default functions for node annotations
# - "return None" means that the attribute and value won't be included
# to the output data, so that defaults of the JS library are used and files get smaller
# - A return of a value in some cases and None in other cases means that the
# default value of the JS library is used in None cases and again files get smaller
def node_label_default(atom):
# None => no node labels
return '{} "{}"'.format(atom.type_name, atom.name) if atom.is_node() else atom.type_name
def node_color_default(atom):
# None => black
return 'red' if atom.is_node() else None
def node_opacity_default(atom):
# None => 1.0
return None
def node_size_default(atom):
# None => 10
return None
def node_shape_default(atom):
# None => circle
return 'rectangle' if atom.is_node() else None
def node_border_color_default(atom):
# None => black
return None
def node_border_size_default(atom):
# None => 0.0
return None
def node_label_color_default(atom):
# None => black
return None
def node_label_size_default(atom):
# None => 12.0
return None
def node_hover_default(atom):
# None => no hover text
return atom.short_string()
def node_click_default(atom):
# None => no click text (in addition to always shown "Node: <id>" in header)
return None
def node_image_default(atom):
# None => no image inside node
return None
def node_properties_default(atom):
# None => no extra node annotations
return None
def node_properties_tv(atom):
return dict(mean=atom.tv.mean, confidence=atom.tv.confidence)
# Default functions for edge annotations
def edge_label_default(atom1, atom2):
# None => no edge label
return None
def edge_color_default(atom1, atom2):
# None => black
return None if atom1.is_link() and atom2.is_link() else 'red'
def edge_opacity_default(atom1, atom2):
# None => 1.0
return None
def edge_size_default(atom1, atom2):
# None => 1.0
return None
def edge_label_color_default(atom1, atom2):
# None => black
return None
def edge_label_size_default(atom1, atom2):
# None => 8.0
return None
def edge_hover_default(atom1, atom2):
# None => no hover text
return None
def edge_click_default(atom1, atom2):
# None => no click text (in addition to always shown "Edge: <id>" in header)
return None
| 38.923387 | 97 | 0.682897 | 2,803 | 19,306 | 4.474492 | 0.123439 | 0.025833 | 0.028464 | 0.035162 | 0.521049 | 0.460533 | 0.400175 | 0.345718 | 0.286557 | 0.232977 | 0 | 0.006731 | 0.238113 | 19,306 | 495 | 98 | 39.00202 | 0.845945 | 0.43919 | 0 | 0.337719 | 0 | 0 | 0.044872 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.171053 | false | 0.004386 | 0.017544 | 0.127193 | 0.359649 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
78df4f62738c15a3903b9ac814a118e7bd487166 | 1,214 | py | Python | test/tests.py | gzu300/Linear_Algebra | 437a285b0230f4da8b0573b04da32ee965b09233 | [
"MIT"
] | null | null | null | test/tests.py | gzu300/Linear_Algebra | 437a285b0230f4da8b0573b04da32ee965b09233 | [
"MIT"
] | null | null | null | test/tests.py | gzu300/Linear_Algebra | 437a285b0230f4da8b0573b04da32ee965b09233 | [
"MIT"
] | null | null | null | import unittest
from pkg import Linear_Algebra
import numpy as np
class TestLU(unittest.TestCase):
def setUp(self):
self.U_answer = np.around(np.array([[2,1,0],[0,3/2,1],[0,0,4/3]], dtype=float), decimals=2).tolist()
self.L_answer = np.around(np.array([[1,0,0],[1/2,1,0],[0,2/3,1]], dtype=float), decimals=2).tolist()
def test_perm(self):
answer = np.array([[0,1,0], [1,0,0], [0,0,1]], dtype=float).tolist()
result = Linear_Algebra.make_perm_mx(3, 0, 1).tolist()
self.assertEqual(result, answer)
def test_LU(self):
L_result, U_result = np.around(Linear_Algebra.LU(np.array([[2,1,0],[1,2,1],[0,1,2]], dtype=float)), decimals=2).tolist()
self.assertEqual(U_result, self.U_answer)
self.assertEqual(L_result, self.L_answer)
class TestDet(unittest.TestCase):
def setUp(self):
self.input_mx = np.array([[2,-1,0,0],[-1,2,-1,0],[0,-1,2,-1],[0,0,-1,2]], dtype=float)
def test_find_det(self):
result = np.around(Linear_Algebra.find_det(A = self.input_mx), decimals=2).tolist()
answer = np.around(5, decimals=2).tolist()
self.assertEqual(result, answer)
if __name__ == '__main__':
unittest.main() | 41.862069 | 128 | 0.629325 | 205 | 1,214 | 3.585366 | 0.219512 | 0.029932 | 0.032653 | 0.032653 | 0.530612 | 0.223129 | 0.032653 | 0.032653 | 0.021769 | 0.021769 | 0 | 0.064484 | 0.169687 | 1,214 | 29 | 129 | 41.862069 | 0.664683 | 0 | 0 | 0.166667 | 0 | 0 | 0.006584 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.208333 | false | 0 | 0.125 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
78e27b1810b0eb666d13182e83f2f3c881794f6e | 17,296 | py | Python | Cartwheel/lib/Python26/Lib/site-packages/wx-2.8-msw-unicode/wx/lib/filebrowsebutton.py | MontyThibault/centre-of-mass-awareness | 58778f148e65749e1dfc443043e9fc054ca3ff4d | [
"MIT"
] | 27 | 2020-11-12T19:24:54.000Z | 2022-03-27T23:10:45.000Z | Cartwheel/lib/Python26/Lib/site-packages/wx-2.8-msw-unicode/wx/lib/filebrowsebutton.py | MontyThibault/centre-of-mass-awareness | 58778f148e65749e1dfc443043e9fc054ca3ff4d | [
"MIT"
] | 2 | 2020-11-02T06:30:39.000Z | 2022-02-23T18:39:55.000Z | Cartwheel/lib/Python26/Lib/site-packages/wx-2.8-msw-unicode/wx/lib/filebrowsebutton.py | MontyThibault/centre-of-mass-awareness | 58778f148e65749e1dfc443043e9fc054ca3ff4d | [
"MIT"
] | 3 | 2021-08-16T00:21:08.000Z | 2022-02-23T19:19:36.000Z | #----------------------------------------------------------------------
# Name: wxPython.lib.filebrowsebutton
# Purpose: Composite controls that provide a Browse button next to
# either a wxTextCtrl or a wxComboBox. The Browse button
# launches a wxFileDialog and loads the result into the
# other control.
#
# Author: Mike Fletcher
#
# RCS-ID: $Id: filebrowsebutton.py 59674 2009-03-20 21:00:16Z RD $
# Copyright: (c) 2000 by Total Control Software
# Licence: wxWindows license
#----------------------------------------------------------------------
# 12/02/2003 - Jeff Grimmett (grimmtooth@softhome.net)
#
# o 2.5 Compatibility changes
#
import os
import types
import wx
#----------------------------------------------------------------------
class FileBrowseButton(wx.Panel):
"""
A control to allow the user to type in a filename or browse with
    the standard file dialog to select a file
"""
def __init__ (self, parent, id= -1,
pos = wx.DefaultPosition,
size = wx.DefaultSize,
style = wx.TAB_TRAVERSAL,
labelText= "File Entry:",
buttonText= "Browse",
toolTip= "Type filename or click browse to choose file",
# following are the values for a file dialog box
dialogTitle = "Choose a file",
startDirectory = ".",
initialValue = "",
fileMask = "*.*",
fileMode = wx.OPEN,
# callback for when value changes (optional)
changeCallback= lambda x:x,
labelWidth = 0,
name = 'fileBrowseButton',
):
"""
:param labelText: Text for label to left of text field
:param buttonText: Text for button which launches the file dialog
:param toolTip: Help text
:param dialogTitle: Title used in file dialog
:param startDirectory: Default directory for file dialog startup
:param fileMask: File mask (glob pattern, such as *.*) to use in file dialog
:param fileMode: wx.OPEN or wx.SAVE, indicates type of file dialog to use
:param changeCallback: Optional callback called for all changes in value of the control
:param labelWidth: Width of the label
"""
# store variables
self.labelText = labelText
self.buttonText = buttonText
self.toolTip = toolTip
self.dialogTitle = dialogTitle
self.startDirectory = startDirectory
self.initialValue = initialValue
self.fileMask = fileMask
self.fileMode = fileMode
self.changeCallback = changeCallback
self.callCallback = True
self.labelWidth = labelWidth
# create the dialog
self.createDialog(parent, id, pos, size, style, name )
# Setting a value causes the changeCallback to be called.
# In this case that would be before the return of the
# constructor. Not good. So a default value on
# SetValue is used to disable the callback
self.SetValue( initialValue, 0)
def createDialog( self, parent, id, pos, size, style, name ):
"""Setup the graphic representation of the dialog"""
wx.Panel.__init__ (self, parent, id, pos, size, style, name)
self.SetMinSize(size) # play nice with sizers
box = wx.BoxSizer(wx.HORIZONTAL)
self.label = self.createLabel( )
box.Add( self.label, 0, wx.CENTER )
self.textControl = self.createTextControl()
box.Add( self.textControl, 1, wx.LEFT|wx.CENTER, 5)
self.browseButton = self.createBrowseButton()
box.Add( self.browseButton, 0, wx.LEFT|wx.CENTER, 5)
# add a border around the whole thing and resize the panel to fit
outsidebox = wx.BoxSizer(wx.VERTICAL)
outsidebox.Add(box, 1, wx.EXPAND|wx.ALL, 3)
outsidebox.Fit(self)
self.SetAutoLayout(True)
self.SetSizer( outsidebox )
self.Layout()
if type( size ) == types.TupleType:
size = apply( wx.Size, size)
self.SetDimensions(-1, -1, size.width, size.height, wx.SIZE_USE_EXISTING)
# if size.width != -1 or size.height != -1:
# self.SetSize(size)
def SetBackgroundColour(self,color):
wx.Panel.SetBackgroundColour(self,color)
self.label.SetBackgroundColour(color)
def createLabel( self ):
"""Create the label/caption"""
label = wx.StaticText(self, -1, self.labelText, style =wx.ALIGN_RIGHT )
font = label.GetFont()
w, h, d, e = self.GetFullTextExtent(self.labelText, font)
if self.labelWidth > 0:
label.SetSize((self.labelWidth+5, h))
else:
label.SetSize((w+5, h))
return label
def createTextControl( self):
"""Create the text control"""
textControl = wx.TextCtrl(self, -1)
textControl.SetToolTipString( self.toolTip )
if self.changeCallback:
textControl.Bind(wx.EVT_TEXT, self.OnChanged)
textControl.Bind(wx.EVT_COMBOBOX, self.OnChanged)
return textControl
def OnChanged(self, evt):
if self.callCallback and self.changeCallback:
self.changeCallback(evt)
def createBrowseButton( self):
"""Create the browse-button control"""
button =wx.Button(self, -1, self.buttonText)
button.SetToolTipString( self.toolTip )
button.Bind(wx.EVT_BUTTON, self.OnBrowse)
return button
def OnBrowse (self, event = None):
""" Going to browse for file... """
current = self.GetValue()
directory = os.path.split(current)
if os.path.isdir( current):
directory = current
current = ''
elif directory and os.path.isdir( directory[0] ):
current = directory[1]
directory = directory [0]
else:
directory = self.startDirectory
current = ''
dlg = wx.FileDialog(self, self.dialogTitle, directory, current,
self.fileMask, self.fileMode)
if dlg.ShowModal() == wx.ID_OK:
self.SetValue(dlg.GetPath())
dlg.Destroy()
def GetValue (self):
"""
retrieve current value of text control
"""
return self.textControl.GetValue()
def SetValue (self, value, callBack=1):
"""set current value of text control"""
save = self.callCallback
self.callCallback = callBack
self.textControl.SetValue(value)
self.callCallback = save
def Enable (self, value=True):
""" Convenient enabling/disabling of entire control """
self.label.Enable (value)
self.textControl.Enable (value)
return self.browseButton.Enable (value)
def Disable (self,):
""" Convenient disabling of entire control """
self.Enable(False)
def GetLabel( self ):
""" Retrieve the label's current text """
return self.label.GetLabel()
def SetLabel( self, value ):
""" Set the label's current text """
rvalue = self.label.SetLabel( value )
self.Refresh( True )
return rvalue
class FileBrowseButtonWithHistory( FileBrowseButton ):
"""
with following additions:
__init__(..., history=None)
history -- optional list of paths for initial history drop-down
(must be passed by name, not a positional argument)
        If history is callable it must return a list used
        for the history drop-down
changeCallback -- as for FileBrowseButton, but with a work-around
for win32 systems which don't appear to create wx.EVT_COMBOBOX
events properly. There is a (slight) chance that this work-around
will cause some systems to create two events for each Combobox
selection. If you discover this condition, please report it!
As for a FileBrowseButton.__init__ otherwise.
GetHistoryControl()
Return reference to the control which implements interfaces
required for manipulating the history list. See GetHistoryControl
documentation for description of what that interface is.
GetHistory()
Return current history list
SetHistory( value=(), selectionIndex = None )
Set current history list, if selectionIndex is not None, select that index
"""
def __init__( self, *arguments, **namedarguments):
self.history = namedarguments.get( "history" )
if self.history:
del namedarguments["history"]
self.historyCallBack=None
if callable(self.history):
self.historyCallBack=self.history
self.history=None
name = namedarguments.get('name', 'fileBrowseButtonWithHistory')
namedarguments['name'] = name
FileBrowseButton.__init__(self, *arguments, **namedarguments)
def createTextControl( self):
"""Create the text control"""
textControl = wx.ComboBox(self, -1, style = wx.CB_DROPDOWN )
textControl.SetToolTipString( self.toolTip )
textControl.Bind(wx.EVT_SET_FOCUS, self.OnSetFocus)
if self.changeCallback:
textControl.Bind(wx.EVT_TEXT, self.OnChanged)
textControl.Bind(wx.EVT_COMBOBOX, self.OnChanged)
if self.history:
history=self.history
self.history=None
self.SetHistory( history, control=textControl)
return textControl
def GetHistoryControl( self ):
"""
Return a pointer to the control which provides (at least)
the following methods for manipulating the history list:
Append( item ) -- add item
Clear() -- clear all items
Delete( index ) -- 0-based index to delete from list
SetSelection( index ) -- 0-based index to select in list
Semantics of the methods follow those for the wxComboBox control
"""
return self.textControl
def SetHistory( self, value=(), selectionIndex = None, control=None ):
"""Set the current history list"""
if control is None:
control = self.GetHistoryControl()
if self.history == value:
return
self.history = value
# Clear history values not the selected one.
tempValue=control.GetValue()
# clear previous values
control.Clear()
control.SetValue(tempValue)
# walk through, appending new values
for path in value:
control.Append( path )
if selectionIndex is not None:
control.SetSelection( selectionIndex )
def GetHistory( self ):
"""Return the current history list"""
if self.historyCallBack != None:
return self.historyCallBack()
elif self.history:
return list( self.history )
else:
return []
def OnSetFocus(self, event):
"""When the history scroll is selected, update the history"""
if self.historyCallBack != None:
self.SetHistory( self.historyCallBack(), control=self.textControl)
event.Skip()
if wx.Platform == "__WXMSW__":
def SetValue (self, value, callBack=1):
""" Convenient setting of text control value, works
around limitation of wx.ComboBox """
save = self.callCallback
self.callCallback = callBack
self.textControl.SetValue(value)
self.callCallback = save
# Hack to call an event handler
class LocalEvent:
def __init__(self, string):
self._string=string
def GetString(self):
return self._string
if callBack==1:
# The callback wasn't being called when SetValue was used ??
# So added this explicit call to it
self.changeCallback(LocalEvent(value))
class DirBrowseButton(FileBrowseButton):
def __init__(self, parent, id = -1,
pos = wx.DefaultPosition, size = wx.DefaultSize,
style = wx.TAB_TRAVERSAL,
labelText = 'Select a directory:',
buttonText = 'Browse',
toolTip = 'Type directory name or browse to select',
dialogTitle = '',
startDirectory = '.',
changeCallback = None,
dialogClass = wx.DirDialog,
newDirectory = False,
name = 'dirBrowseButton'):
FileBrowseButton.__init__(self, parent, id, pos, size, style,
labelText, buttonText, toolTip,
dialogTitle, startDirectory,
changeCallback = changeCallback,
name = name)
self.dialogClass = dialogClass
self.newDirectory = newDirectory
#
def OnBrowse(self, ev = None):
style=0
if not self.newDirectory:
style |= wx.DD_DIR_MUST_EXIST
dialog = self.dialogClass(self,
message = self.dialogTitle,
defaultPath = self.startDirectory,
style = style)
if dialog.ShowModal() == wx.ID_OK:
self.SetValue(dialog.GetPath())
dialog.Destroy()
#
#----------------------------------------------------------------------
if __name__ == "__main__":
#from skeletonbuilder import rulesfile
class SimpleCallback:
def __init__( self, tag ):
self.tag = tag
def __call__( self, event ):
print self.tag, event.GetString()
class DemoFrame( wx.Frame ):
def __init__(self, parent):
wx.Frame.__init__(self, parent, -1, "File entry with browse", size=(500,260))
self.Bind(wx.EVT_CLOSE, self.OnCloseWindow)
panel = wx.Panel (self,-1)
innerbox = wx.BoxSizer(wx.VERTICAL)
control = FileBrowseButton(
panel,
initialValue = "z:\\temp",
)
innerbox.Add( control, 0, wx.EXPAND )
middlecontrol = FileBrowseButtonWithHistory(
panel,
labelText = "With History",
initialValue = "d:\\temp",
history = ["c:\\temp", "c:\\tmp", "r:\\temp","z:\\temp"],
changeCallback= SimpleCallback( "With History" ),
)
innerbox.Add( middlecontrol, 0, wx.EXPAND )
middlecontrol = FileBrowseButtonWithHistory(
panel,
labelText = "History callback",
initialValue = "d:\\temp",
history = self.historyCallBack,
changeCallback= SimpleCallback( "History callback" ),
)
innerbox.Add( middlecontrol, 0, wx.EXPAND )
self.bottomcontrol = control = FileBrowseButton(
panel,
labelText = "With Callback",
style = wx.SUNKEN_BORDER|wx.CLIP_CHILDREN ,
changeCallback= SimpleCallback( "With Callback" ),
)
innerbox.Add( control, 0, wx.EXPAND)
self.bottommostcontrol = control = DirBrowseButton(
panel,
labelText = "Simple dir browse button",
style = wx.SUNKEN_BORDER|wx.CLIP_CHILDREN)
innerbox.Add( control, 0, wx.EXPAND)
ID = wx.NewId()
innerbox.Add( wx.Button( panel, ID,"Change Label", ), 1, wx.EXPAND)
self.Bind(wx.EVT_BUTTON, self.OnChangeLabel , id=ID)
ID = wx.NewId()
innerbox.Add( wx.Button( panel, ID,"Change Value", ), 1, wx.EXPAND)
self.Bind(wx.EVT_BUTTON, self.OnChangeValue, id=ID )
panel.SetAutoLayout(True)
panel.SetSizer( innerbox )
self.history={"c:\\temp":1, "c:\\tmp":1, "r:\\temp":1,"z:\\temp":1}
def historyCallBack(self):
keys=self.history.keys()
keys.sort()
return keys
def OnFileNameChangedHistory (self, event):
self.history[event.GetString ()]=1
def OnCloseMe(self, event):
self.Close(True)
def OnChangeLabel( self, event ):
self.bottomcontrol.SetLabel( "Label Updated" )
def OnChangeValue( self, event ):
self.bottomcontrol.SetValue( "r:\\somewhere\\over\\the\\rainbow.htm" )
def OnCloseWindow(self, event):
self.Destroy()
class DemoApp(wx.App):
def OnInit(self):
wx.InitAllImageHandlers()
frame = DemoFrame(None)
frame.Show(True)
self.SetTopWindow(frame)
return True
def test( ):
app = DemoApp(0)
app.MainLoop()
print 'Creating dialog'
test( )
| 36.721868 | 95 | 0.566142 | 1,738 | 17,296 | 5.581128 | 0.235328 | 0.01701 | 0.008351 | 0.010309 | 0.184742 | 0.139691 | 0.112371 | 0.083505 | 0.083505 | 0.065155 | 0 | 0.007505 | 0.329729 | 17,296 | 470 | 96 | 36.8 | 0.829207 | 0.089963 | 0 | 0.183391 | 0 | 0 | 0.043657 | 0.005146 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.010381 | null | null | 0.00692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
78e51986ef4ee9e7c7af6f2a83426baeaab981b9 | 1,426 | py | Python | pentest-scripts/learning-python-for-forensics/Chapter 6/rot13.py | paulveillard/cybersecurity-penetration-testing | a5afff13ec25afd0cf16ef966d35bddb91518af4 | [
"Apache-2.0"
] | 6 | 2021-12-07T21:02:12.000Z | 2022-03-03T12:08:14.000Z | pentest-scripts/learning-python-for-forensics/Chapter 6/rot13.py | paulveillard/cybersecurity-penetration-testing | a5afff13ec25afd0cf16ef966d35bddb91518af4 | [
"Apache-2.0"
] | null | null | null | pentest-scripts/learning-python-for-forensics/Chapter 6/rot13.py | paulveillard/cybersecurity-penetration-testing | a5afff13ec25afd0cf16ef966d35bddb91518af4 | [
"Apache-2.0"
] | 1 | 2022-01-15T23:57:36.000Z | 2022-01-15T23:57:36.000Z | def rotCode(data):
"""
The rotCode function encodes/decodes data using string indexing
:param data: A string
:return: The rot-13 encoded/decoded string
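    Example:
        >>> rotCode('Jul, EBG-13?')
        'Why, ROT-13?'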
"""
rot_chars = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm',
'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']
substitutions = []
# Walk through each individual character
for c in data:
        # Check whether the character is uppercase
if c.isupper():
try:
# Find the position of the character in rot_chars list
index = rot_chars.index(c.lower())
except ValueError:
substitutions.append(c)
continue
# Calculate the relative index that is 13 characters away from the index
substitutions.append((rot_chars[(index-13)]).upper())
else:
try:
# Find the position of the character in rot_chars list
index = rot_chars.index(c)
except ValueError:
substitutions.append(c)
continue
substitutions.append(rot_chars[((index-13))])
return ''.join(substitutions)
if __name__ == '__main__':
    print(rotCode('Jul, EBG-13?'))
| 33.162791 | 90 | 0.47756 | 151 | 1,426 | 4.410596 | 0.523179 | 0.084084 | 0.078078 | 0.075075 | 0.531532 | 0.429429 | 0.195195 | 0.195195 | 0.195195 | 0.195195 | 0 | 0.011737 | 0.402525 | 1,426 | 42 | 91 | 33.952381 | 0.769953 | 0.178121 | 0 | 0.363636 | 0 | 0 | 0.045726 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
78ed1b7fc24c0d300d3ad14111db8c17f3c020fd | 5,401 | py | Python | app/routes/router.py | nityagautam/ReportDashboard-backend | d23fe008cb0df6a703fcd665181897a75b71d5b2 | [
"MIT"
] | 1 | 2021-05-06T09:48:46.000Z | 2021-05-06T09:48:46.000Z | app/routes/router.py | nityagautam/ReportDashboard | d23fe008cb0df6a703fcd665181897a75b71d5b2 | [
"MIT"
] | 2 | 2021-09-09T05:34:33.000Z | 2021-12-13T15:31:36.000Z | app/routes/router.py | nityagautam/ReportDashboard | d23fe008cb0df6a703fcd665181897a75b71d5b2 | [
"MIT"
] | null | null | null | #===============================================================
# @author: nityanarayan44@live.com
# @written: 08 December 2021
# @desc: Routes for the Backend server
#===============================================================
# Import section with referecne of entry file or main file;
from __main__ import application
from flask import jsonify, render_template, url_for, request, redirect
# Local sample data import
from app.config.uiconfig import app_ui_config
from app import sample_data
# ==============================================================
# App Routes/Gateways
# ==============================================================
@application.route('/test', methods=['GET'])
def test():
return '<h4>HELLO WORLD!</h4><hr/> it works!'
@application.route('/', methods=['GET'])
@application.route('/home', methods=['GET'])
@application.route('/dashboard', methods=['GET'])
def root():
return render_template("dashboard.html", app_data=app_ui_config, data=sample_data.latest_data)
@application.route('/history', methods=['GET'])
def history():
return render_template("history.html", app_data=app_ui_config, data=sample_data.history_data)
@application.route('/about', methods=['GET'])
def about():
return render_template("about.html", app_data=app_ui_config, data=sample_data.latest_data)
@application.route('/get-notes', methods=['POST'])
def get_todo():
print("KEY :: VALUE (from the received form data)")
print([(key, val) for key, val in zip(request.form.keys(), request.form.values())])
return redirect("/notes", code=302)
@application.route('/notes')
def info():
return render_template("notes.html", app_data=app_ui_config)
@application.route('/sample-data')
def get_sample_data():
return jsonify(app_ui_config)
# ==============================================================
# Error Handlers Starts
# ==============================================================
# 404 Handler; We can also pass the specific request errors codes to the decorator;
@application.errorhandler(404)
def not_found(err):
return render_template("error.html", app_data=app_ui_config, error_data=err), 400
# Exception/Error handler; We can also pass the specific errors to the decorator;
@application.errorhandler(TypeError)
def server_error(err):
application.logger.exception(err)
return render_template("error.html", app_data=app_ui_config, error_data=err), 500
# Exception/Error handler; We can also pass the specific errors to the decorator;
@application.errorhandler(Exception)
def server_error(err):
application.logger.exception(err)
return render_template("error.html", app_data=app_ui_config, error_data=err), 500
# ==============================================================
# Error Handlers Ends
# ==============================================================
# Route For Sample data
@application.route('/data')
def get_data():
data = {
"reports": [
{
"build": "build_no",
"created": "Imported 05052021T11:30:00:00IST",
"platform": "Imported Win/Unix/Mac",
"project_name": "project_name_1",
"report_location_path": "path/to/report/location/index.html",
"report_summary": {"pass": "50", "fail": "0", "ignored": "0", "skipped": "0"},
"total_time": "35 min."
},
{
"build": "build_no",
"created": "Imported 05052021T11:30:00:00IST",
"platform": "Imported Win/Unix/Mac",
"project_name": "project_name_2",
"report_location_path": "path/to/report/location/index.html",
"report_summary": {"pass": "10", "fail": "2", "ignored": "0", "skipped": "0"},
"total_time": "0.2345 secs."
},
{
"build": "build_no",
"created": "Imported 05052021T11:30:00:00IST",
"platform": "Imported Win/Unix/Mac",
"project_name": "project_name_3",
"report_location_path": "path/to/report/location/index.html",
"report_summary": {"pass": "100", "fail": "5", "ignored": "0", "skipped": "0"},
"total_time": "5 days"
}
]
}
return jsonify(data)
# ==============================================================
# Extra routes starts
# ==============================================================
@application.route('/sample1')
def sample1():
return render_template("web-analytics-overview.html")
@application.route('/sample2')
def sample2():
return render_template("web-analytics-real-time.html")
@application.route('/logo')
def get_logo():
"""
Queries the snapshot data for both Serenity and JMeter projects from the MongoDB.
Renders the Snapshot view of html
:return: N/A
"""
# set template directory of the Flask App to the path set by the user as command line arg.
return f'<html><head><title>Root</title><head><body><hr/> Welcome to the main page <hr/> ' \
f'Building image from static public location: <br/> ' \
f'<img src=\'{url_for("static", filename="images/logo.svg")}\' /> </body></html>'
| 38.035211 | 99 | 0.549713 | 581 | 5,401 | 4.967298 | 0.311532 | 0.072072 | 0.034304 | 0.033957 | 0.45738 | 0.422384 | 0.388773 | 0.378032 | 0.378032 | 0.365558 | 0 | 0.025018 | 0.20811 | 5,401 | 141 | 100 | 38.304965 | 0.649755 | 0.255323 | 0 | 0.211765 | 0 | 0.011765 | 0.306288 | 0.071484 | 0 | 0 | 0 | 0 | 0 | 1 | 0.164706 | false | 0.035294 | 0.117647 | 0.105882 | 0.447059 | 0.023529 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
78eed98843af7c2acb54d95dbb60b3f984e9337b | 15,624 | py | Python | idaes/generic_models/properties/core/examples/ASU_PR.py | carldlaird/idaes-pse | cc7a32ca9fa788f483fa8ef85f3d1186ef4a596f | [
"RSA-MD"
] | 112 | 2019-02-11T23:16:36.000Z | 2022-03-23T20:59:57.000Z | idaes/generic_models/properties/core/examples/ASU_PR.py | carldlaird/idaes-pse | cc7a32ca9fa788f483fa8ef85f3d1186ef4a596f | [
"RSA-MD"
] | 621 | 2019-03-01T14:44:12.000Z | 2022-03-31T19:49:25.000Z | idaes/generic_models/properties/core/examples/ASU_PR.py | carldlaird/idaes-pse | cc7a32ca9fa788f483fa8ef85f3d1186ef4a596f | [
"RSA-MD"
] | 154 | 2019-02-01T23:46:33.000Z | 2022-03-23T15:07:10.000Z | #################################################################################
# The Institute for the Design of Advanced Energy Systems Integrated Platform
# Framework (IDAES IP) was produced under the DOE Institute for the
# Design of Advanced Energy Systems (IDAES), and is copyright (c) 2018-2021
# by the software owners: The Regents of the University of California, through
# Lawrence Berkeley National Laboratory, National Technology & Engineering
# Solutions of Sandia, LLC, Carnegie Mellon University, West Virginia University
# Research Corporation, et al. All rights reserved.
#
# Please see the files COPYRIGHT.md and LICENSE.md for full copyright and
# license information.
#################################################################################
"""
Air separation phase equilibrium package using Peng-Robinson EoS.
Example property package using the Generic Property Package Framework.
This example shows how to set up a property package to do air separation
phase equilibrium in the generic framework using Peng-Robinson equation
along with methods drawn from the pre-built IDAES property libraries.
The example includes two dictionaries.
1. The dictionary named configuration contains parameters obtained from
The Properties of Gases and Liquids (1987) 4th edition and NIST.
2. The dictionary named configuration_Dowling_2015 contains parameters used in
A framework for efficient large scale equation-oriented flowsheet optimization
(2015) Dowling. The Antoine vapor pressure coefficients and acentric factors
are extracted from The Properties of Gases and Liquids (1977) 3rd edition,
and the heat capacity parameters are values from the same 3rd edition
converted to J.
"""
# Import Python libraries
import logging
# Import Pyomo units
from pyomo.environ import units as pyunits
# Import IDAES cores
from idaes.core import LiquidPhase, VaporPhase, Component
from idaes.generic_models.properties.core.state_definitions import FTPx
from idaes.generic_models.properties.core.eos.ceos import Cubic, CubicType
from idaes.generic_models.properties.core.phase_equil import SmoothVLE
from idaes.generic_models.properties.core.phase_equil.bubble_dew import \
LogBubbleDew
from idaes.generic_models.properties.core.phase_equil.forms import log_fugacity
from idaes.generic_models.properties.core.pure import RPP4
from idaes.generic_models.properties.core.pure import NIST
from idaes.generic_models.properties.core.pure import RPP3
# Set up logger
_log = logging.getLogger(__name__)
# ---------------------------------------------------------------------
# Configuration dictionary for a Peng-Robinson Oxygen-Argon-Nitrogen system
# Data Sources:
# [1] The Properties of Gases and Liquids (1987)
# 4th edition, Chemical Engineering Series - Robert C. Reid
# [2] NIST, https://webbook.nist.gov/
# Retrieved 16th August, 2020
# [3] The Properties of Gases and Liquids (1987)
# 3rd edition, Chemical Engineering Series - Robert C. Reid
#     Cp parameters were converted to J in Dowling 2015
# [4] A framework for efficient large scale equation-oriented flowsheet optimization (2015)
# Computers and Chemical Engineering - Alexander W. Dowling
configuration = {
# Specifying components
"components": {
"nitrogen": {"type": Component,
"enth_mol_ig_comp": RPP4,
"entr_mol_ig_comp": RPP4,
"pressure_sat_comp": NIST,
"phase_equilibrium_form": {("Vap", "Liq"): log_fugacity},
"parameter_data": {
"mw": (28.0135E-3, pyunits.kg/pyunits.mol), # [1]
"pressure_crit": (34e5, pyunits.Pa), # [1]
"temperature_crit": (126.2, pyunits.K), # [1]
"omega": 0.037, # [1]
"cp_mol_ig_comp_coeff": {
"A": (3.115E1,
pyunits.J/pyunits.mol/pyunits.K), # [1]
"B": (-1.357E-2,
pyunits.J/pyunits.mol/pyunits.K**2),
"C": (2.680E-5,
pyunits.J/pyunits.mol/pyunits.K**3),
"D": (-1.168E-8,
pyunits.J/pyunits.mol/pyunits.K**4)},
"enth_mol_form_vap_comp_ref": (
0.0, pyunits.J/pyunits.mol), # [2]
"entr_mol_form_vap_comp_ref": (
191.61, pyunits.J/pyunits.mol/pyunits.K), # [2]
"pressure_sat_comp_coeff": {
"A": (3.7362, None), # [2]
"B": (264.651, pyunits.K),
"C": (-6.788, pyunits.K)}}},
"argon": {"type": Component,
"enth_mol_ig_comp": RPP4,
"entr_mol_ig_comp": RPP4,
"pressure_sat_comp": NIST,
"phase_equilibrium_form": {("Vap", "Liq"): log_fugacity},
"parameter_data": {
"mw": (39.948E-3, pyunits.kg/pyunits.mol), # [1]
"pressure_crit": (48.98e5, pyunits.Pa), # [1]
"temperature_crit": (150.86, pyunits.K), # [1]
"omega": 0.001, # [1]
"cp_mol_ig_comp_coeff": {
"A": (2.050E1,
pyunits.J/pyunits.mol/pyunits.K), # [1]
"B": (0.0, pyunits.J/pyunits.mol/pyunits.K**2),
"C": (0.0, pyunits.J/pyunits.mol/pyunits.K**3),
"D": (0.0, pyunits.J/pyunits.mol/pyunits.K**4)},
"enth_mol_form_vap_comp_ref": (
0.0, pyunits.J/pyunits.mol), # [2]
"entr_mol_form_vap_comp_ref": (
154.8, pyunits.J/pyunits.mol/pyunits.K), # [2]
"pressure_sat_comp_coeff": {"A": (3.29555, None), # [2]
"B": (215.24, pyunits.K),
"C": (-22.233, pyunits.K)}}},
"oxygen": {"type": Component,
"enth_mol_ig_comp": RPP4,
"entr_mol_ig_comp": RPP4,
"pressure_sat_comp": NIST,
"phase_equilibrium_form": {("Vap", "Liq"): log_fugacity},
"parameter_data": {
"mw": (31.999E-3, pyunits.kg/pyunits.mol), # [1]
"pressure_crit": (50.43e5, pyunits.Pa), # [1]
"temperature_crit": (154.58, pyunits.K), # [1]
"omega": 0.025, # [1]
"cp_mol_ig_comp_coeff": {
"A": (2.811E1, pyunits.J/pyunits.mol/pyunits.K),
"B": (-3.680E-6,
pyunits.J/pyunits.mol/pyunits.K**2),
"C": (1.746E-5, pyunits.J/pyunits.mol/pyunits.K**3),
"D": (-1.065E-8,
pyunits.J/pyunits.mol/pyunits.K**4)},
"enth_mol_form_vap_comp_ref": (
0.0, pyunits.J/pyunits.mol), # [2]
"entr_mol_form_vap_comp_ref": (
205.152, pyunits.J/pyunits.mol/pyunits.K), # [2]
"pressure_sat_comp_coeff": {
"A": (3.85845, None), # [2]
"B": (325.675, pyunits.K),
"C": (-5.667, pyunits.K)}}}},
# Specifying phases
"phases": {"Liq": {"type": LiquidPhase,
"equation_of_state": Cubic,
"equation_of_state_options": {
"type": CubicType.PR}},
"Vap": {"type": VaporPhase,
"equation_of_state": Cubic,
"equation_of_state_options": {
"type": CubicType.PR}}},
# Set base units of measurement
"base_units": {"time": pyunits.s,
"length": pyunits.m,
"mass": pyunits.kg,
"amount": pyunits.mol,
"temperature": pyunits.K},
# Specifying state definition
"state_definition": FTPx,
"state_bounds": {"flow_mol": (0, 100, 1000, pyunits.mol/pyunits.s),
"temperature": (10, 300, 350, pyunits.K),
"pressure": (5e4, 1e5, 1e7, pyunits.Pa)},
"pressure_ref": (101325, pyunits.Pa),
"temperature_ref": (298.15, pyunits.K),
# Defining phase equilibria
"phases_in_equilibrium": [("Vap", "Liq")],
"phase_equilibrium_state": {("Vap", "Liq"): SmoothVLE},
"bubble_dew_method": LogBubbleDew,
"parameter_data": {"PR_kappa": {("nitrogen", "nitrogen"): 0.000,
("nitrogen", "argon"): -0.26e-2,
("nitrogen", "oxygen"): -0.119e-1,
("argon", "nitrogen"): -0.26e-2,
("argon", "argon"): 0.000,
("argon", "oxygen"): 0.104e-1,
("oxygen", "nitrogen"): -0.119e-1,
("oxygen", "argon"): 0.104e-1,
("oxygen", "oxygen"): 0.000}}}
configuration_Dowling_2015 = {
# Specifying components
"components": {
"nitrogen": {"type": Component,
"enth_mol_ig_comp": RPP4,
"entr_mol_ig_comp": RPP4,
"pressure_sat_comp": RPP3,
"phase_equilibrium_form": {("Vap", "Liq"): log_fugacity},
"parameter_data": {
"mw": (28.0135E-3, pyunits.kg/pyunits.mol), # [3]
"pressure_crit": (33.943875e5, pyunits.Pa), # [4]
"temperature_crit": (126.2, pyunits.K), # [4]
"omega": 0.04, # [3]
"cp_mol_ig_comp_coeff": {
'A': (3.112896E1, pyunits.J/pyunits.mol/pyunits.K), # [3]
'B': (-1.356E-2, pyunits.J/pyunits.mol/pyunits.K**2),
'C': (2.6878E-5, pyunits.J/pyunits.mol/pyunits.K**3),
'D': (-1.167E-8, pyunits.J/pyunits.mol/pyunits.K**4)},
"enth_mol_form_vap_comp_ref": (
0.0, pyunits.J/pyunits.mol), # [2]
"entr_mol_form_vap_comp_ref": (
191.61, pyunits.J/pyunits.mol/pyunits.K), # [2]
"pressure_sat_comp_coeff": {
'A': (14.9342, None), # [3]
'B': (588.72, pyunits.K),
'C': (-6.60, pyunits.K)}}},
"argon": {"type": Component,
"enth_mol_ig_comp": RPP4,
"entr_mol_ig_comp": RPP4,
"pressure_sat_comp": RPP3,
"phase_equilibrium_form": {("Vap", "Liq"): log_fugacity},
"parameter_data": {
"mw": (39.948E-3, pyunits.kg/pyunits.mol), # [3]
"pressure_crit": (48.737325e5, pyunits.Pa), # [4]
"temperature_crit": (150.86, pyunits.K), # [4]
"omega": -0.004, # [1]
"cp_mol_ig_comp_coeff": {
'A': (2.0790296E1, pyunits.J/pyunits.mol/pyunits.K), # [3]
'B': (-3.209E-05, pyunits.J/pyunits.mol/pyunits.K**2),
'C': (5.163E-08, pyunits.J/pyunits.mol/pyunits.K**3),
'D': (0.0, pyunits.J/pyunits.mol/pyunits.K**4)},
"enth_mol_form_vap_comp_ref": (
0.0, pyunits.J/pyunits.mol), # [3]
"entr_mol_form_vap_comp_ref": (
154.8, pyunits.J/pyunits.mol/pyunits.K), # [3]
"pressure_sat_comp_coeff": {
'A': (15.2330, None), # [3]
'B': (700.51, pyunits.K),
'C': (-5.84, pyunits.K)}}},
"oxygen": {"type": Component,
"enth_mol_ig_comp": RPP4,
"entr_mol_ig_comp": RPP4,
"pressure_sat_comp": RPP3,
"phase_equilibrium_form": {("Vap", "Liq"): log_fugacity},
"parameter_data": {
"mw": (31.999E-3, pyunits.kg/pyunits.mol), # [3]
"pressure_crit": (50.45985e5, pyunits.Pa), # [4]
"temperature_crit": (154.58, pyunits.K), # [4]
"omega": 0.021, # [1]
"cp_mol_ig_comp_coeff": {
'A': (2.8087192E1, pyunits.J/pyunits.mol/pyunits.K), # [3]
'B': (-3.678E-6, pyunits.J/pyunits.mol/pyunits.K**2),
'C': (1.745E-5, pyunits.J/pyunits.mol/pyunits.K**3),
'D': (-1.064E-8, pyunits.J/pyunits.mol/pyunits.K**4)},
"enth_mol_form_vap_comp_ref": (
0.0, pyunits.J/pyunits.mol), # [2]
"entr_mol_form_vap_comp_ref": (
205.152, pyunits.J/pyunits.mol/pyunits.K), # [2]
"pressure_sat_comp_coeff": {
'A': (15.4075, None), # [3]
'B': (734.55, pyunits.K),
'C': (-6.45, pyunits.K)}}}},
# Specifying phases
"phases": {"Liq": {"type": LiquidPhase,
"equation_of_state": Cubic,
"equation_of_state_options": {
"type": CubicType.PR}},
"Vap": {"type": VaporPhase,
"equation_of_state": Cubic,
"equation_of_state_options": {
"type": CubicType.PR}}},
# Set base units of measurement
"base_units": {"time": pyunits.s,
"length": pyunits.m,
"mass": pyunits.kg,
"amount": pyunits.mol,
"temperature": pyunits.K},
# Specifying state definition
"state_definition": FTPx,
"state_bounds": {"flow_mol": (0, 100, 1000, pyunits.mol/pyunits.s),
"temperature": (10, 300, 350, pyunits.K),
"pressure": (5e4, 1e5, 1e7, pyunits.Pa)},
"pressure_ref": (101325, pyunits.Pa),
"temperature_ref": (298.15, pyunits.K),
# Defining phase equilibria
"phases_in_equilibrium": [("Vap", "Liq")],
"phase_equilibrium_state": {("Vap", "Liq"): SmoothVLE},
"bubble_dew_method": LogBubbleDew,
"parameter_data": {"PR_kappa": {("nitrogen", "nitrogen"): 0.000,
("nitrogen", "argon"): -0.26e-2,
("nitrogen", "oxygen"): -0.119e-1,
("argon", "nitrogen"): -0.26e-2,
("argon", "argon"): 0.000,
("argon", "oxygen"): 0.104e-1,
("oxygen", "nitrogen"): -0.119e-1,
("oxygen", "argon"): 0.104e-1,
("oxygen", "oxygen"): 0.000}}}
| 51.394737 | 91 | 0.473374 | 1,624 | 15,624 | 4.395936 | 0.198892 | 0.060513 | 0.075641 | 0.090769 | 0.738339 | 0.724051 | 0.683569 | 0.660877 | 0.556661 | 0.53621 | 0 | 0.069528 | 0.379544 | 15,624 | 303 | 92 | 51.564356 | 0.666907 | 0.171595 | 0 | 0.686099 | 0 | 0 | 0.184698 | 0.060673 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.049327 | 0 | 0.049327 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
78f33bf3b80a0a0d98e998f783441284fa1b3068 | 3,503 | py | Python | invenio_madmp/views.py | FAIR-Data-Austria/invenio-madmp | 74372ee794f81666f5e9cf08ef448c21b2e428be | [
"MIT"
] | 1 | 2022-03-02T10:37:29.000Z | 2022-03-02T10:37:29.000Z | invenio_madmp/views.py | FAIR-Data-Austria/invenio-madmp | 74372ee794f81666f5e9cf08ef448c21b2e428be | [
"MIT"
] | 9 | 2020-08-25T12:03:08.000Z | 2020-10-20T11:45:32.000Z | invenio_madmp/views.py | FAIR-Data-Austria/invenio-madmp | 74372ee794f81666f5e9cf08ef448c21b2e428be | [
"MIT"
] | null | null | null | """Blueprint definitions for maDMP integration."""
from flask import Blueprint, jsonify, request
from invenio_db import db
from .convert import convert_dmp
from .models import DataManagementPlan
def _summarize_dmp(dmp: DataManagementPlan) -> dict:
"""Create a summary dictionary for the given DMP."""
res = {"dmp_id": dmp.dmp_id, "datasets": []}
for ds in dmp.datasets:
dataset = {"dataset_id": ds.dataset_id, "record": None}
if ds.record:
dataset["record"] = ds.record.model.json
res["datasets"].append(dataset)
return res
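# Illustrative shape of the summary returned above (values are placeholders):
#   {"dmp_id": "dmp-1",
#    "datasets": [{"dataset_id": "ds-1", "record": {...record JSON...}}]}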
def create_rest_blueprint(app) -> Blueprint:
"""Create the blueprint for the REST endpoints using the current app extensions."""
# note: using flask.current_app isn't directly possible, because Invenio-MaDMP is
# registered as an extension in the API app, not the "normal" app
# (which is the one usually returned by current_app)
rest_blueprint = Blueprint("invenio_madmp", __name__)
auth = app.extensions["invenio-madmp"].auth
@rest_blueprint.route("/dmps", methods=["GET"])
@auth.login_required
def list_dmps():
"""Give a summary of all stored DMPs."""
dmps = DataManagementPlan.query.all()
res = [_summarize_dmp(dmp) for dmp in dmps]
return jsonify(res)
@rest_blueprint.route("/dmps", methods=["POST"])
@auth.login_required
def create_dmp():
"""Create a new DMP from the maDMP JSON in the request body."""
if request.json is None:
return jsonify({"error": "no json body supplied"}), 400
elif request.json.get("dmp") is None:
return jsonify({"error": "dmp not found in the body"}), 400
dmp_json = request.json.get("dmp", {})
dmp_json_id = dmp_json.get("dmp_id", {}).get("identifier")
if DataManagementPlan.get_by_dmp_id(dmp_json_id) is not None:
return jsonify({"error": "dmp with the same id already exists"}), 409
dmp = convert_dmp(dmp_json)
db.session.add(dmp)
db.session.commit()
# TODO change the returned value
return jsonify(_summarize_dmp(dmp)), 201
@rest_blueprint.route("/dmps/<dmp_id>", methods=["PATCH"])
@auth.login_required
def update_dmp(dmp_id: str = None):
"""Update the specified DMP using the maDMP JSON in the request body."""
hard_sync = request.args.get("sync", "soft") == "hard"
if request.json is None:
return jsonify({"error": "no json body supplied"}), 400
elif request.json.get("dmp") is None:
return jsonify({"error": "dmp not found in the body"}), 400
dmp_json = request.json.get("dmp", {})
dmp_json_id = dmp_json.get("dmp_id", {}).get("identifier")
if dmp_id and dmp_json_id and dmp_id != dmp_json_id:
return jsonify({"error": "mismatch between dmp id from url and body"}), 400
dmp_id = dmp_id or dmp_json_id
if DataManagementPlan.get_by_dmp_id(dmp_id) is None:
return jsonify({"error": "dmp not found"}), 404
dmp = convert_dmp(dmp_json, hard_sync)
db.session.commit()
# TODO change the returned value
return jsonify(_summarize_dmp(dmp))
@rest_blueprint.route("/dmps", methods=["PATCH"])
@auth.login_required
def update_dmp_without_id():
"""Update the specified DMP using the maDMP JSON in the request body."""
return update_dmp(None)
return rest_blueprint
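# Illustrative request against the endpoints above once the blueprint is
# registered on the API app (host, URL prefix and credentials are placeholders
# and depend on the deployment):
#
#   curl -X POST -H "Content-Type: application/json" \
#        -d '{"dmp": {"dmp_id": {"identifier": "dmp-1"}}}' \
#        https://<host>/dmps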
| 35.744898 | 87 | 0.643163 | 475 | 3,503 | 4.581053 | 0.24 | 0.032169 | 0.057904 | 0.060662 | 0.45864 | 0.382813 | 0.382813 | 0.340533 | 0.286765 | 0.286765 | 0 | 0.008989 | 0.237796 | 3,503 | 97 | 88 | 36.113402 | 0.805993 | 0.190123 | 0 | 0.305085 | 0 | 0 | 0.13872 | 0 | 0 | 0 | 0 | 0.010309 | 0 | 1 | 0.101695 | false | 0 | 0.067797 | 0 | 0.389831 | 0.135593 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
78f83610f02792ce2cf026a72886ebff9b5ef71f | 579 | py | Python | assistance_bot/app.py | reakfog/personal_computer_voice_assistant | 3483f633c57cd2e930f94bcbda9739cde34525aa | [
"BSD-3-Clause"
] | null | null | null | assistance_bot/app.py | reakfog/personal_computer_voice_assistant | 3483f633c57cd2e930f94bcbda9739cde34525aa | [
"BSD-3-Clause"
] | null | null | null | assistance_bot/app.py | reakfog/personal_computer_voice_assistant | 3483f633c57cd2e930f94bcbda9739cde34525aa | [
"BSD-3-Clause"
] | 2 | 2021-07-26T20:22:31.000Z | 2021-07-29T12:58:03.000Z | import sys
sys.path = ['', '..'] + sys.path[1:]
import daemon
from assistance_bot import core
from functionality.voice_processing import speaking, listening
from functionality.commands import *
if __name__ == '__main__':
speaking.setup_assistant_voice(core.ttsEngine, core.assistant)
while True:
# start speech recording and speech recognition
recognized_speech = listening.get_listening_and_recognition_result(
core.recognizer,
core.microphone)
# executing the given command
execute_command(recognized_speech)
| 32.166667 | 75 | 0.723661 | 64 | 579 | 6.25 | 0.59375 | 0.035 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002165 | 0.202073 | 579 | 17 | 76 | 34.058824 | 0.863636 | 0.126079 | 0 | 0 | 0 | 0 | 0.019881 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.384615 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
78fc7bd4cfef4c55a9ccedee325481258419cb94 | 11,929 | py | Python | ee/clickhouse/sql/person.py | wanderlog/posthog | a88b81d44ab31d262be07e84a85d045c4e28f2a3 | [
"MIT"
] | null | null | null | ee/clickhouse/sql/person.py | wanderlog/posthog | a88b81d44ab31d262be07e84a85d045c4e28f2a3 | [
"MIT"
] | null | null | null | ee/clickhouse/sql/person.py | wanderlog/posthog | a88b81d44ab31d262be07e84a85d045c4e28f2a3 | [
"MIT"
] | null | null | null | from ee.clickhouse.sql.clickhouse import KAFKA_COLUMNS, STORAGE_POLICY, kafka_engine
from ee.clickhouse.sql.table_engines import CollapsingMergeTree, ReplacingMergeTree
from ee.kafka_client.topics import KAFKA_PERSON, KAFKA_PERSON_DISTINCT_ID, KAFKA_PERSON_UNIQUE_ID
from posthog.settings import CLICKHOUSE_CLUSTER, CLICKHOUSE_DATABASE
TRUNCATE_PERSON_TABLE_SQL = f"TRUNCATE TABLE IF EXISTS person ON CLUSTER '{CLICKHOUSE_CLUSTER}'"
DROP_PERSON_TABLE_SQL = f"DROP TABLE IF EXISTS person ON CLUSTER '{CLICKHOUSE_CLUSTER}'"
TRUNCATE_PERSON_DISTINCT_ID_TABLE_SQL = f"TRUNCATE TABLE IF EXISTS person_distinct_id ON CLUSTER '{CLICKHOUSE_CLUSTER}'"
TRUNCATE_PERSON_DISTINCT_ID2_TABLE_SQL = (
f"TRUNCATE TABLE IF EXISTS person_distinct_id2 ON CLUSTER '{CLICKHOUSE_CLUSTER}'"
)
PERSONS_TABLE = "person"
PERSONS_TABLE_BASE_SQL = """
CREATE TABLE IF NOT EXISTS {table_name} ON CLUSTER '{cluster}'
(
id UUID,
created_at DateTime64,
team_id Int64,
properties VARCHAR,
is_identified Int8,
is_deleted Int8 DEFAULT 0
{extra_fields}
) ENGINE = {engine}
"""
PERSONS_TABLE_ENGINE = lambda: ReplacingMergeTree(PERSONS_TABLE, ver="_timestamp")
PERSONS_TABLE_SQL = lambda: (
PERSONS_TABLE_BASE_SQL
+ """Order By (team_id, id)
{storage_policy}
"""
).format(
table_name=PERSONS_TABLE,
cluster=CLICKHOUSE_CLUSTER,
engine=PERSONS_TABLE_ENGINE(),
extra_fields=KAFKA_COLUMNS,
storage_policy=STORAGE_POLICY(),
)
KAFKA_PERSONS_TABLE_SQL = lambda: PERSONS_TABLE_BASE_SQL.format(
table_name="kafka_" + PERSONS_TABLE, cluster=CLICKHOUSE_CLUSTER, engine=kafka_engine(KAFKA_PERSON), extra_fields="",
)
# You must include the database here because of a bug in clickhouse
# related to https://github.com/ClickHouse/ClickHouse/issues/10471
PERSONS_TABLE_MV_SQL = """
CREATE MATERIALIZED VIEW {table_name}_mv ON CLUSTER '{cluster}'
TO {database}.{table_name}
AS SELECT
id,
created_at,
team_id,
properties,
is_identified,
is_deleted,
_timestamp,
_offset
FROM {database}.kafka_{table_name}
""".format(
table_name=PERSONS_TABLE, cluster=CLICKHOUSE_CLUSTER, database=CLICKHOUSE_DATABASE,
)
GET_LATEST_PERSON_SQL = """
SELECT * FROM person JOIN (
SELECT id, max(_timestamp) as _timestamp, max(is_deleted) as is_deleted
FROM person
WHERE team_id = %(team_id)s
GROUP BY id
) as person_max ON person.id = person_max.id AND person._timestamp = person_max._timestamp
WHERE team_id = %(team_id)s
AND person_max.is_deleted = 0
{query}
"""
GET_LATEST_PERSON_ID_SQL = """
(select id from (
{latest_person_sql}
))
""".format(
latest_person_sql=GET_LATEST_PERSON_SQL
)
#
# person_distinct_id table - still used in queries, but this will eventually get removed.
#
PERSONS_DISTINCT_ID_TABLE = "person_distinct_id"
PERSONS_DISTINCT_ID_TABLE_BASE_SQL = """
CREATE TABLE IF NOT EXISTS {table_name} ON CLUSTER '{cluster}'
(
distinct_id VARCHAR,
person_id UUID,
team_id Int64,
_sign Int8 DEFAULT 1,
is_deleted Int8 ALIAS if(_sign==-1, 1, 0)
{extra_fields}
) ENGINE = {engine}
"""
PERSONS_DISTINCT_ID_TABLE_SQL = lambda: (
PERSONS_DISTINCT_ID_TABLE_BASE_SQL
+ """Order By (team_id, distinct_id, person_id)
{storage_policy}
"""
).format(
table_name=PERSONS_DISTINCT_ID_TABLE,
cluster=CLICKHOUSE_CLUSTER,
engine=CollapsingMergeTree(PERSONS_DISTINCT_ID_TABLE, ver="_sign"),
extra_fields=KAFKA_COLUMNS,
storage_policy=STORAGE_POLICY(),
)
# :KLUDGE: We default is_deleted to 0 for backwards compatibility for when we drop `is_deleted` from message schema.
# Can't make DEFAULT if(_sign==-1, 1, 0) because Cyclic aliases error.
KAFKA_PERSONS_DISTINCT_ID_TABLE_SQL = lambda: """
CREATE TABLE {table_name} ON CLUSTER '{cluster}'
(
distinct_id VARCHAR,
person_id UUID,
team_id Int64,
_sign Nullable(Int8),
is_deleted Nullable(Int8)
) ENGINE = {engine}
""".format(
table_name="kafka_" + PERSONS_DISTINCT_ID_TABLE,
cluster=CLICKHOUSE_CLUSTER,
engine=kafka_engine(KAFKA_PERSON_UNIQUE_ID),
)
# You must include the database here because of a bug in clickhouse
# related to https://github.com/ClickHouse/ClickHouse/issues/10471
PERSONS_DISTINCT_ID_TABLE_MV_SQL = """
CREATE MATERIALIZED VIEW {table_name}_mv ON CLUSTER '{cluster}'
TO {database}.{table_name}
AS SELECT
distinct_id,
person_id,
team_id,
coalesce(_sign, if(is_deleted==0, 1, -1)) AS _sign,
_timestamp,
_offset
FROM {database}.kafka_{table_name}
""".format(
table_name=PERSONS_DISTINCT_ID_TABLE, cluster=CLICKHOUSE_CLUSTER, database=CLICKHOUSE_DATABASE,
)
#
# person_distinct_id2 - new table!
#
PERSON_DISTINCT_ID2_TABLE = "person_distinct_id2"
PERSON_DISTINCT_ID2_TABLE_BASE_SQL = """
CREATE TABLE IF NOT EXISTS {table_name} ON CLUSTER '{cluster}'
(
team_id Int64,
distinct_id VARCHAR,
person_id UUID,
is_deleted Int8,
version Int64 DEFAULT 1
{extra_fields}
) ENGINE = {engine}
"""
PERSON_DISTINCT_ID2_TABLE_ENGINE = lambda: ReplacingMergeTree(PERSON_DISTINCT_ID2_TABLE, ver="version")
PERSON_DISTINCT_ID2_TABLE_SQL = lambda: (
PERSON_DISTINCT_ID2_TABLE_BASE_SQL
+ """
ORDER BY (team_id, distinct_id)
SETTINGS index_granularity = 512
"""
).format(
table_name=PERSON_DISTINCT_ID2_TABLE,
cluster=CLICKHOUSE_CLUSTER,
engine=PERSON_DISTINCT_ID2_TABLE_ENGINE(),
extra_fields=KAFKA_COLUMNS + "\n, _partition UInt64",
)
KAFKA_PERSON_DISTINCT_ID2_TABLE_SQL = lambda: PERSON_DISTINCT_ID2_TABLE_BASE_SQL.format(
table_name="kafka_" + PERSON_DISTINCT_ID2_TABLE,
cluster=CLICKHOUSE_CLUSTER,
engine=kafka_engine(KAFKA_PERSON_DISTINCT_ID),
extra_fields="",
)
# You must include the database here because of a bug in clickhouse
# related to https://github.com/ClickHouse/ClickHouse/issues/10471
PERSON_DISTINCT_ID2_MV_SQL = """
CREATE MATERIALIZED VIEW {table_name}_mv ON CLUSTER '{cluster}'
TO {database}.{table_name}
AS SELECT
team_id,
distinct_id,
person_id,
is_deleted,
version,
_timestamp,
_offset,
_partition
FROM {database}.kafka_{table_name}
""".format(
table_name=PERSON_DISTINCT_ID2_TABLE, cluster=CLICKHOUSE_CLUSTER, database=CLICKHOUSE_DATABASE,
)
#
# Static Cohort
#
PERSON_STATIC_COHORT_TABLE = "person_static_cohort"
PERSON_STATIC_COHORT_BASE_SQL = """
CREATE TABLE IF NOT EXISTS {table_name} ON CLUSTER '{cluster}'
(
id UUID,
person_id UUID,
cohort_id Int64,
team_id Int64
{extra_fields}
) ENGINE = {engine}
"""
PERSON_STATIC_COHORT_TABLE_ENGINE = lambda: ReplacingMergeTree(PERSON_STATIC_COHORT_TABLE, ver="_timestamp")
PERSON_STATIC_COHORT_TABLE_SQL = lambda: (
PERSON_STATIC_COHORT_BASE_SQL
+ """Order By (team_id, cohort_id, person_id, id)
{storage_policy}
"""
).format(
table_name=PERSON_STATIC_COHORT_TABLE,
cluster=CLICKHOUSE_CLUSTER,
engine=PERSON_STATIC_COHORT_TABLE_ENGINE(),
storage_policy=STORAGE_POLICY(),
extra_fields=KAFKA_COLUMNS,
)
TRUNCATE_PERSON_STATIC_COHORT_TABLE_SQL = (
f"TRUNCATE TABLE IF EXISTS {PERSON_STATIC_COHORT_TABLE} ON CLUSTER '{CLICKHOUSE_CLUSTER}'"
)
INSERT_PERSON_STATIC_COHORT = (
f"INSERT INTO {PERSON_STATIC_COHORT_TABLE} (id, person_id, cohort_id, team_id, _timestamp) VALUES"
)
#
# Other queries
#
GET_TEAM_PERSON_DISTINCT_IDS = """
SELECT distinct_id, argMax(person_id, _timestamp) as person_id
FROM (
SELECT distinct_id, person_id, max(_timestamp) as _timestamp
FROM person_distinct_id
WHERE team_id = %(team_id)s %(extra_where)s
GROUP BY person_id, distinct_id, team_id
HAVING max(is_deleted) = 0
)
GROUP BY distinct_id
"""
# Queries distinct ids using the new table; used once the 0003_fill_person_distinct_id2 migration is complete
GET_TEAM_PERSON_DISTINCT_IDS_NEW_TABLE = """
SELECT distinct_id, argMax(person_id, version) as person_id
FROM person_distinct_id2
WHERE team_id = %(team_id)s %(extra_where)s
GROUP BY distinct_id
HAVING argMax(is_deleted, version) = 0
"""
GET_PERSON_IDS_BY_FILTER = """
SELECT DISTINCT p.id
FROM ({latest_person_sql}) AS p
INNER JOIN ({GET_TEAM_PERSON_DISTINCT_IDS}) AS pdi ON p.id = pdi.person_id
WHERE team_id = %(team_id)s
{distinct_query}
{limit}
{offset}
""".format(
latest_person_sql=GET_LATEST_PERSON_SQL,
distinct_query="{distinct_query}",
limit="{limit}",
offset="{offset}",
GET_TEAM_PERSON_DISTINCT_IDS="{GET_TEAM_PERSON_DISTINCT_IDS}",
)
INSERT_PERSON_SQL = """
INSERT INTO person (id, created_at, team_id, properties, is_identified, _timestamp, _offset, is_deleted) SELECT %(id)s, %(created_at)s, %(team_id)s, %(properties)s, %(is_identified)s, %(_timestamp)s, 0, 0
"""
INSERT_PERSON_DISTINCT_ID = """
INSERT INTO person_distinct_id SELECT %(distinct_id)s, %(person_id)s, %(team_id)s, %(_sign)s, now(), 0 VALUES
"""
INSERT_PERSON_DISTINCT_ID2 = """
INSERT INTO person_distinct_id2 (distinct_id, person_id, team_id, is_deleted, version, _timestamp, _offset, _partition) SELECT %(distinct_id)s, %(person_id)s, %(team_id)s, 0, %(version)s, now(), 0, 0 VALUES
"""
DELETE_PERSON_BY_ID = """
INSERT INTO person (id, created_at, team_id, properties, is_identified, _timestamp, _offset, is_deleted) SELECT %(id)s, %(created_at)s, %(team_id)s, %(properties)s, %(is_identified)s, %(_timestamp)s, 0, 1
"""
DELETE_PERSON_EVENTS_BY_ID = """
ALTER TABLE events DELETE
WHERE distinct_id IN (
SELECT distinct_id FROM person_distinct_id WHERE person_id=%(id)s AND team_id = %(team_id)s
)
AND team_id = %(team_id)s
"""
INSERT_COHORT_ALL_PEOPLE_THROUGH_PERSON_ID = """
INSERT INTO {cohort_table} SELECT generateUUIDv4(), actor_id, %(cohort_id)s, %(team_id)s, %(_timestamp)s, 0 FROM (
SELECT actor_id FROM ({query})
)
"""
INSERT_COHORT_ALL_PEOPLE_SQL = """
INSERT INTO {cohort_table} SELECT generateUUIDv4(), id, %(cohort_id)s, %(team_id)s, %(_timestamp)s, 0 FROM (
SELECT id FROM (
{latest_person_sql}
) as person INNER JOIN (
SELECT person_id, distinct_id FROM ({GET_TEAM_PERSON_DISTINCT_IDS}) WHERE person_id IN ({content_sql})
) as pdi ON person.id = pdi.person_id
WHERE team_id = %(team_id)s
GROUP BY id
)
"""
GET_DISTINCT_IDS_BY_PROPERTY_SQL = """
SELECT distinct_id
FROM (
{GET_TEAM_PERSON_DISTINCT_IDS}
)
WHERE person_id IN
(
SELECT id
FROM (
SELECT id, argMax(properties, person._timestamp) as properties, max(is_deleted) as is_deleted
FROM person
WHERE team_id = %(team_id)s
GROUP BY id
HAVING is_deleted = 0
)
WHERE {filters}
)
"""
GET_DISTINCT_IDS_BY_PERSON_ID_FILTER = """
SELECT distinct_id
FROM ({GET_TEAM_PERSON_DISTINCT_IDS})
WHERE {filters}
"""
GET_PERSON_PROPERTIES_COUNT = """
SELECT tupleElement(keysAndValues, 1) as key, count(*) as count
FROM person
ARRAY JOIN JSONExtractKeysAndValuesRaw(properties) as keysAndValues
WHERE team_id = %(team_id)s
GROUP BY tupleElement(keysAndValues, 1)
ORDER BY count DESC, key ASC
"""
GET_ACTORS_FROM_EVENT_QUERY = """
SELECT
{id_field} AS actor_id
{matching_events_select_statement}
FROM ({events_query})
GROUP BY actor_id
{limit}
{offset}
"""
COMMENT_DISTINCT_ID_COLUMN_SQL = (
lambda: f"ALTER TABLE person_distinct_id ON CLUSTER '{CLICKHOUSE_CLUSTER}' COMMENT COLUMN distinct_id 'skip_0003_fill_person_distinct_id2'"
)
SELECT_PERSON_PROP_VALUES_SQL = """
SELECT
value,
count(value)
FROM (
SELECT
{property_field} as value
FROM
person
WHERE
team_id = %(team_id)s AND
is_deleted = 0 AND
{property_field} IS NOT NULL AND
{property_field} != ''
ORDER BY id DESC
LIMIT 100000
)
GROUP BY value
ORDER BY count(value) DESC
LIMIT 20
"""
SELECT_PERSON_PROP_VALUES_SQL_WITH_FILTER = """
SELECT
value,
count(value)
FROM (
SELECT
{property_field} as value
FROM
person
WHERE
team_id = %(team_id)s AND
is_deleted = 0 AND
{property_field} ILIKE %(value)s
ORDER BY id DESC
LIMIT 100000
)
GROUP BY value
ORDER BY count(value) DESC
LIMIT 20
"""
| 28.200946 | 206 | 0.748093 | 1,696 | 11,929 | 4.893868 | 0.107311 | 0.033976 | 0.043012 | 0.034458 | 0.641566 | 0.569157 | 0.483133 | 0.433253 | 0.35494 | 0.315663 | 0 | 0.012078 | 0.15324 | 11,929 | 422 | 207 | 28.267773 | 0.809623 | 0.071255 | 0 | 0.460452 | 0 | 0.016949 | 0.610925 | 0.06566 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.011299 | 0 | 0.011299 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60099617ee434da572759077c7ea7be632ca1953 | 2,020 | py | Python | alipay/aop/api/domain/AlipayEbppInvoiceAuthSignModel.py | snowxmas/alipay-sdk-python-all | 96870ced60facd96c5bce18d19371720cbda3317 | [
"Apache-2.0"
] | 1 | 2022-03-07T06:11:10.000Z | 2022-03-07T06:11:10.000Z | alipay/aop/api/domain/AlipayEbppInvoiceAuthSignModel.py | snowxmas/alipay-sdk-python-all | 96870ced60facd96c5bce18d19371720cbda3317 | [
"Apache-2.0"
] | null | null | null | alipay/aop/api/domain/AlipayEbppInvoiceAuthSignModel.py | snowxmas/alipay-sdk-python-all | 96870ced60facd96c5bce18d19371720cbda3317 | [
"Apache-2.0"
] | 1 | 2021-10-05T03:01:09.000Z | 2021-10-05T03:01:09.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import json
from alipay.aop.api.constant.ParamConstants import *
class AlipayEbppInvoiceAuthSignModel(object):
def __init__(self):
self._authorization_type = None
self._m_short_name = None
self._user_id = None
@property
def authorization_type(self):
return self._authorization_type
@authorization_type.setter
def authorization_type(self, value):
self._authorization_type = value
@property
def m_short_name(self):
return self._m_short_name
@m_short_name.setter
def m_short_name(self, value):
self._m_short_name = value
@property
def user_id(self):
return self._user_id
@user_id.setter
def user_id(self, value):
self._user_id = value
def to_alipay_dict(self):
params = dict()
if self.authorization_type:
if hasattr(self.authorization_type, 'to_alipay_dict'):
params['authorization_type'] = self.authorization_type.to_alipay_dict()
else:
params['authorization_type'] = self.authorization_type
if self.m_short_name:
if hasattr(self.m_short_name, 'to_alipay_dict'):
params['m_short_name'] = self.m_short_name.to_alipay_dict()
else:
params['m_short_name'] = self.m_short_name
if self.user_id:
if hasattr(self.user_id, 'to_alipay_dict'):
params['user_id'] = self.user_id.to_alipay_dict()
else:
params['user_id'] = self.user_id
return params
@staticmethod
def from_alipay_dict(d):
if not d:
return None
o = AlipayEbppInvoiceAuthSignModel()
if 'authorization_type' in d:
o.authorization_type = d['authorization_type']
if 'm_short_name' in d:
o.m_short_name = d['m_short_name']
if 'user_id' in d:
o.user_id = d['user_id']
return o
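# Round-trip usage sketch (field value hypothetical), derived from the accessors above:
# m = AlipayEbppInvoiceAuthSignModel()
# m.user_id = '2088000000000000'
# assert AlipayEbppInvoiceAuthSignModel.from_alipay_dict(m.to_alipay_dict()).user_id == m.user_id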
| 28.450704 | 87 | 0.614851 | 253 | 2,020 | 4.557312 | 0.173913 | 0.221162 | 0.130095 | 0.084996 | 0.323504 | 0.267997 | 0.084996 | 0.052038 | 0 | 0 | 0 | 0.0007 | 0.293069 | 2,020 | 70 | 88 | 28.857143 | 0.806723 | 0.020792 | 0 | 0.109091 | 0 | 0 | 0.096251 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.163636 | false | 0 | 0.036364 | 0.054545 | 0.327273 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
600a51b33a2bb56f14ea62360d44dde1324b6215 | 1,968 | py | Python | anonlink-entity-service/backend/entityservice/tasks/solver.py | Sam-Gresh/linkage-agent-tools | f405c7efe3fa82d99bc047f130c0fac6f3f5bf82 | [
"Apache-2.0"
] | 1 | 2020-05-19T07:29:31.000Z | 2020-05-19T07:29:31.000Z | backend/entityservice/tasks/solver.py | hardbyte/anonlink-entity-service | 3c1815473bc8169ca571532c18e0913a45c704de | [
"Apache-2.0"
] | null | null | null | backend/entityservice/tasks/solver.py | hardbyte/anonlink-entity-service | 3c1815473bc8169ca571532c18e0913a45c704de | [
"Apache-2.0"
] | null | null | null | import anonlink
from anonlink.candidate_generation import _merge_similarities
from entityservice.object_store import connect_to_object_store
from entityservice.async_worker import celery, logger
from entityservice.settings import Config as config
from entityservice.tasks.base_task import TracedTask
from entityservice.tasks.permutation import save_and_permute
@celery.task(base=TracedTask, ignore_result=True, args_as_tags=('project_id', 'run_id'))
def solver_task(similarity_scores_filename, project_id, run_id, dataset_sizes, parent_span):
log = logger.bind(pid=project_id, run_id=run_id)
mc = connect_to_object_store()
solver_task.span.log_kv({'datasetSizes': dataset_sizes,
'filename': similarity_scores_filename})
score_file = mc.get_object(config.MINIO_BUCKET, similarity_scores_filename)
log.debug("Creating python sparse matrix from bytes data")
candidate_pairs_with_duplicates = anonlink.serialization.load_candidate_pairs(score_file)
similarity_scores, (dset_is0, dset_is1), (rec_is0, rec_is1) = candidate_pairs_with_duplicates
log.info(f"Number of candidate pairs before deduplication: {len(candidate_pairs_with_duplicates[0])}")
if len(candidate_pairs_with_duplicates[0]) > 0:
# TODO use public interface when available
# https://github.com/data61/anonlink/issues/271
candidate_pairs = _merge_similarities([zip(similarity_scores, dset_is0, dset_is1, rec_is0, rec_is1)], k=None)
log.info(f"Number of candidate pairs after deduplication: {len(candidate_pairs[0])}")
log.info("Calculating the optimal mapping from similarity matrix")
groups = anonlink.solving.greedy_solve(candidate_pairs)
else:
groups = []
log.info("Entity groups have been computed")
res = {
"groups": groups,
"datasetSizes": dataset_sizes
}
save_and_permute.delay(res, project_id, run_id, solver_task.get_serialized_span())
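# Dispatch sketch (argument values hypothetical): the run pipeline enqueues this task with
# the MinIO filename of the serialized candidate pairs and the per-party dataset sizes, e.g.
# solver_task.delay('similarity-scores.bin', project_id, run_id, [10000, 10000], parent_span)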
| 48 | 117 | 0.760163 | 258 | 1,968 | 5.496124 | 0.434109 | 0.098731 | 0.024683 | 0.039492 | 0.146685 | 0.146685 | 0.101551 | 0.059238 | 0.059238 | 0.059238 | 0 | 0.010198 | 0.152947 | 1,968 | 40 | 118 | 49.2 | 0.840432 | 0.043699 | 0 | 0 | 0 | 0 | 0.184141 | 0.035125 | 0 | 0 | 0 | 0.025 | 0 | 1 | 0.032258 | false | 0 | 0.225806 | 0 | 0.258065 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6015330e90658ef9cb434f3116ddc5c99e3f87e7 | 6,403 | py | Python | vitcloud/views.py | biocross/VITCloud | 9656bd489c6d05717bf529d0661e07da0cd2551a | [
"MIT"
] | 2 | 2016-10-09T09:16:39.000Z | 2017-12-30T10:04:24.000Z | vitcloud/views.py | biocross/VITCloud | 9656bd489c6d05717bf529d0661e07da0cd2551a | [
"MIT"
] | 1 | 2015-03-28T12:10:24.000Z | 2015-03-28T19:19:00.000Z | vitcloud/views.py | biocross/VITCloud | 9656bd489c6d05717bf529d0661e07da0cd2551a | [
"MIT"
] | null | null | null | from django.views.generic import View
from django.http import HttpResponse
import os, json, datetime
from django.shortcuts import redirect
from django.shortcuts import render_to_response
from vitcloud.models import File
from django.views.decorators.csrf import csrf_exempt
from listingapikeys import findResult
import sys # sys.setdefaultencoding is cancelled by site.py
reload(sys) # to re-enable sys.setdefaultencoding()
sys.setdefaultencoding('utf-8')
#Custom Functions:
def repeated(fname, fsize, fblock, froom):
if(File.objects.filter(name=fname, size=fsize, block=fblock, room=froom)):
return True
else:
return False
#**Not for Production** Views
def clean(request):
    q = File.objects.filter(block__iexact="L")
    q.delete()
    # a Django view must return an HttpResponse; previously this returned None
    return HttpResponse("Cleaned")
#Views:
def home(request):
no_of_files = len(File.objects.values_list('name').distinct())
no_of_blocks = len(File.objects.values_list('block').distinct())
no_of_rooms = len(File.objects.values_list('room').distinct())
file_sizes = File.objects.all()
total_file_size = 0
for x in file_sizes:
total_file_size = total_file_size + int(x.size)
total_file_size = (total_file_size/1024)
return render_to_response('home/home.htm' , {'x' : no_of_files, 'y' : no_of_blocks, 'z' : no_of_rooms, 'w' : total_file_size })
def pageSearch(request):
return render_to_response('home/search.htm')
def pageHowitworks(request):
return render_to_response('home/howitworks.htm')
def pageTopsharers(request):
return render_to_response('home/topsharers.htm')
def pageGettheapp(request):
return render_to_response('home/gettheapp.htm')
def search(request):
if request.method == "GET":
if (request.GET['thesearchbox'] == ""):
if ('latest' in request.GET):
results = File.objects.all().order_by("-id")
no = len(results)
paragraph = True
return render_to_response('home/results.htm', {'results' : results, 'paragraph' : paragraph, 'no' : no })
else:
return redirect('/blockwise')
else:
filename = str(request.GET['thesearchbox'])
paragraph = False
results = File.objects.filter(name__icontains=filename).order_by("-id")
no = len(results)
return render_to_response('home/results.htm', {'results' : results, 'paragraph': paragraph, 'no' : no })
def blockwise(request):
blockNames = File.objects.values_list('block').distinct()
for x in blockNames:
print str(x[0])
return render_to_response('home/blockwise.htm', {'blocks' : blockNames})
def blockwiseFeeder(request):
if request.method == "GET":
block = request.GET['block']
blockFiles = File.objects.filter(block__iexact=block).order_by("-id")
return render_to_response('home/blockwiseFeeder.htm', {'block': block, 'results': blockFiles})
def suggestions(request):
if request.method == "GET":
filename = str(request.GET['q'])
results = File.objects.filter(name__icontains=filename)
length = len(results)
suggestions = [filename, []]
for x in range(0, length, 1):
suggestions[1].append(results[x].name)
return HttpResponse(json.dumps(suggestions))
class Block:
name = ""
total = ""
def statistics(request):
finalArray = []
blockNames = []
blockSizes = []
blocks = File.objects.values_list('block').distinct()
for x in blocks:
blockName = str(str(x[0]).upper())
blockName = blockName + " Block"
blockNames.append(str(blockName).encode('utf-8'))
blockFiles = File.objects.filter(block__iexact=x[0])
totalSize = 0
for y in blockFiles:
totalSize = totalSize + int(y.size)
blockSizes.append(totalSize/1024)
return render_to_response('home/stats.htm', { 'blockNames' : blockNames, 'blockSizes' : blockSizes })
def apiFeed(request):
if request.method == "GET":
if("q" in request.GET):
filename = str(request.GET['q'])
result = findResult(filename)
return HttpResponse(json.dumps(result))
else:
return HttpResponse("Need The Required Parameters to work!")
def fileDetails(request):
if request.method == "GET":
filename = str(request.GET['q'])
results = File.objects.filter(name__icontains=filename)
filen = "NOTFOUND.404"
for x in results:
filen = x.name
return render_to_response('home/file.htm', {'results' : results, 'filen': filen })
def submitOne(request):
error = False
if 'filename' in request.GET:
filename = request.GET['filename']
filesize = 100000
fileblock = "MN"
fileroom = "447"
if not filename:
error = True
else:
now = datetime.datetime.now()
p1 = File.objects.create(name=filename, size = filesize, block = fileblock, room = fileroom, date = now)
results = File.objects.all()
return render_to_response('home/success.htm', { 'results': results })
return render_to_response('home/submitone.htm', { 'error': error })
@csrf_exempt
def interface(request):
if request.method == "POST":
data = json.loads(request.body)
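        # Expected POST body sketch (inferred from the keys parsed below; values hypothetical):
        # {"Block": "MN", "Room": "447", "Hostel": "M",
        #  "Files": ["movie.mp4", 734003200, "song.mp3", 5242880]}  # alternating name, size in bytes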
currentBlock = str(data['Block'])
currentRoom = str(data['Room'])
currentHostelType = str(data['Hostel'])
no = len(data['Files'])
inserted = 0
        data = data['Files']
for x in range(0, no, 2):
data[x+1] = int(data[x+1])
data[x+1] = (data[x+1]/1048576)
if not repeated(fname = data[x], fsize = str(data[x+1]), fblock=currentBlock, froom = currentRoom):
now = datetime.datetime.now()
temp = File.objects.create(name=data[x], size=str(data[x+1]), block = currentBlock, room = currentRoom, date = now)
inserted = (inserted + 1)
files_inserted = inserted
result = "inserted files: \n\n" + str(files_inserted)
return HttpResponse(result)
else:
return HttpResponse("<h2>VITCloud</h2> <h4>Desktop App Interface</h4><br/><br/><strong>Current Status:</strong> Listening at /interface...<br/><br/>Copyright 2012-2013<br/>Siddharth Gupta<br/>Saurabh Joshi")
| 37.444444 | 215 | 0.629705 | 764 | 6,403 | 5.175393 | 0.235602 | 0.047294 | 0.056651 | 0.072332 | 0.306272 | 0.247597 | 0.114568 | 0.103187 | 0.103187 | 0.082954 | 0 | 0.012308 | 0.238638 | 6,403 | 170 | 216 | 37.664706 | 0.798769 | 0.021084 | 0 | 0.153846 | 0 | 0.006993 | 0.117694 | 0.018844 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.062937 | null | null | 0.006993 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
601651a2b4d6d062db448e75989e40e985eb13df | 1,661 | py | Python | migrations/versions/e86dd3bc539c_change_admin_to_boolean.py | jonzxz/project-piscator | 588c8b1ac9355f9a82ac449fdbeaa1ef7eb441ef | [
"MIT"
] | null | null | null | migrations/versions/e86dd3bc539c_change_admin_to_boolean.py | jonzxz/project-piscator | 588c8b1ac9355f9a82ac449fdbeaa1ef7eb441ef | [
"MIT"
] | null | null | null | migrations/versions/e86dd3bc539c_change_admin_to_boolean.py | jonzxz/project-piscator | 588c8b1ac9355f9a82ac449fdbeaa1ef7eb441ef | [
"MIT"
] | 1 | 2021-02-18T03:08:21.000Z | 2021-02-18T03:08:21.000Z | """change admin to boolean
Revision ID: e86dd3bc539c
Revises: 6f63ef516cdc
Create Date: 2020-11-11 22:32:00.707936
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'e86dd3bc539c'
down_revision = '6f63ef516cdc'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('email_address', sa.Column('active', sa.Boolean(), nullable=False))
op.add_column('email_address', sa.Column('email_password', sa.String(length=255), nullable=False))
op.add_column('email_address', sa.Column('last_mailbox_size', sa.Integer(), nullable=True))
op.add_column('email_address', sa.Column('last_updated', sa.DateTime(), nullable=True))
op.add_column('email_address', sa.Column('phishing_mail_detected', sa.Integer(), nullable=True))
op.add_column('user', sa.Column('is_active', sa.Boolean(), nullable=False))
op.add_column('user', sa.Column('is_admin', sa.Boolean(), nullable=True))
op.add_column('user', sa.Column('last_logged_in', sa.DateTime(), nullable=True))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('user', 'last_logged_in')
op.drop_column('user', 'is_admin')
op.drop_column('user', 'is_active')
op.drop_column('email_address', 'phishing_mail_detected')
op.drop_column('email_address', 'last_updated')
op.drop_column('email_address', 'last_mailbox_size')
op.drop_column('email_address', 'email_password')
op.drop_column('email_address', 'active')
# ### end Alembic commands ###
| 38.627907 | 102 | 0.712222 | 226 | 1,661 | 5.017699 | 0.300885 | 0.106702 | 0.15873 | 0.070547 | 0.547619 | 0.452381 | 0.402998 | 0.347443 | 0.153439 | 0 | 0 | 0.032481 | 0.128838 | 1,661 | 42 | 103 | 39.547619 | 0.751209 | 0.183624 | 0 | 0 | 0 | 0 | 0.289613 | 0.033359 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0.083333 | 0.083333 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
601677ed1a2084da8bff806075ddd7b027330006 | 388 | py | Python | school/migrations/0010_alter_sala_unique_together.py | adrianomqsmts/django-escola | a69541bceb3f30bdd2e9f0f41aa9c2da6081a1d1 | [
"MIT"
] | null | null | null | school/migrations/0010_alter_sala_unique_together.py | adrianomqsmts/django-escola | a69541bceb3f30bdd2e9f0f41aa9c2da6081a1d1 | [
"MIT"
] | null | null | null | school/migrations/0010_alter_sala_unique_together.py | adrianomqsmts/django-escola | a69541bceb3f30bdd2e9f0f41aa9c2da6081a1d1 | [
"MIT"
] | null | null | null | # Generated by Django 4.0.3 on 2022-03-16 03:09
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('school', '0009_rename_periodo_semestre_alter_semestre_options_and_more'),
]
operations = [
migrations.AlterUniqueTogether(
name='sala',
unique_together={('porta', 'predio')},
),
]
| 21.555556 | 83 | 0.634021 | 41 | 388 | 5.780488 | 0.853659 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065517 | 0.252577 | 388 | 17 | 84 | 22.823529 | 0.751724 | 0.115979 | 0 | 0 | 1 | 0 | 0.237537 | 0.175953 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
601c017654bfba5b4012ac4932fefa02ad294c7b | 912 | py | Python | account/admin.py | RichardLeeH/invoce_sys | 42a6f5750f45b25e0d7282114ccb7f9f72ee1761 | [
"Apache-2.0"
] | null | null | null | account/admin.py | RichardLeeH/invoce_sys | 42a6f5750f45b25e0d7282114ccb7f9f72ee1761 | [
"Apache-2.0"
] | null | null | null | account/admin.py | RichardLeeH/invoce_sys | 42a6f5750f45b25e0d7282114ccb7f9f72ee1761 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.contrib import admin
from django.contrib.auth.admin import UserAdmin
from django.contrib.auth.models import User
from rest_framework.authtoken.models import Token
from account.models import Profile
admin.site.site_header = 'invoce'
class TokenAdmin(admin.ModelAdmin):
list_display = ('key', 'uid', 'user', 'created')
fields = ('user',)
ordering = ('-created',)
def uid(self, obj):
return obj.user.id
uid.short_description = u'用户ID'
admin.site.unregister(Token)
admin.site.register(Token, TokenAdmin)
class ProfileInline(admin.StackedInline):
model = Profile
class UserCustomAdmin(UserAdmin):
list_display = ('id', 'username', 'email', 'is_active', 'is_staff')
inlines = (ProfileInline, )
ordering = ('-id', )
admin.site.unregister(User)
admin.site.register(User, UserCustomAdmin)
| 24 | 71 | 0.718202 | 111 | 912 | 5.792793 | 0.495496 | 0.069984 | 0.079316 | 0.065319 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001292 | 0.151316 | 912 | 37 | 72 | 24.648649 | 0.829457 | 0.023026 | 0 | 0 | 0 | 0 | 0.08324 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.25 | 0.041667 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
601e1228f0fc5110925548eceed16ee0fac450d1 | 3,654 | py | Python | pynyzo/pynyzo/keyutil.py | EggPool/pynyzo | 7f3b86f15caa51a975e6a428f4dff578a1f24bcb | [
"MIT"
] | 6 | 2019-02-09T02:46:18.000Z | 2021-03-29T04:15:15.000Z | pynyzo/pynyzo/keyutil.py | EggPool/pynyzo | 7f3b86f15caa51a975e6a428f4dff578a1f24bcb | [
"MIT"
] | 1 | 2020-05-17T18:29:20.000Z | 2020-05-18T08:31:33.000Z | pynyzo/pynyzo/keyutil.py | EggPool/pynyzo | 7f3b86f15caa51a975e6a428f4dff578a1f24bcb | [
"MIT"
] | 5 | 2019-02-09T02:46:19.000Z | 2021-01-08T06:49:50.000Z | """
Eddsa Ed25519 key handling
From
https://github.com/n-y-z-o/nyzoVerifier/blob/b73bc25ba3094abe3470ec070ce306885ad9a18f/src/main/java/co/nyzo/verifier/KeyUtil.java
plus
https://github.com/n-y-z-o/nyzoVerifier/blob/17509f03a7f530c0431ce85377db9b35688c078e/src/main/java/co/nyzo/verifier/util/SignatureUtil.java
"""
# Uses https://github.com/warner/python-ed25519 , c binding, fast
import ed25519
import hashlib
from pynyzo.byteutil import ByteUtil
class KeyUtil:
@staticmethod
def main():
"""Temp test, not to be used"""
signing_key, verifying_key = ed25519.create_keypair()
print("Original private key", ByteUtil.bytes_as_string_with_dashes(signing_key.to_bytes()[:32]))
# signing key has signing + verifying, we keep the first 32 to only get the private part.
print("Original public key", ByteUtil.bytes_as_string_with_dashes(verifying_key.to_bytes()))
@staticmethod
def generateSeed(hashable_keyword: str='') -> bytes:
"""Generate a private key, with optional keyword to get reproducible tests results or later HD Wallet."""
if len(hashable_keyword):
seed = hashlib.sha256(hashable_keyword).digest()
signing_key = ed25519.SigningKey(seed)
else:
signing_key, _ = ed25519.create_keypair()
return signing_key.to_bytes()[:32]
@staticmethod
def private_to_public(private: str) -> str:
"""Temp Test"""
keydata = bytes.fromhex(private)
signing_key = ed25519.SigningKey(keydata)
verifying_key = signing_key.get_verifying_key()
vkey_hex = verifying_key.to_ascii(encoding="hex")
return vkey_hex.decode('utf-8')
@staticmethod
def get_from_private_seed_file(filename: str):
"""returns priv and pub key - as object - from the stored nyzo text id format"""
with open(filename) as f:
nyzo = f.read(80).replace('-', '').encode('utf-8').strip()
signing_key = ed25519.SigningKey(nyzo, encoding="hex")
verifying_key = signing_key.get_verifying_key()
return signing_key, verifying_key
@staticmethod
def get_from_private_seed(seed: str):
"""returns priv and pub key - as object - from an hex seed"""
seed = seed.replace('-', '').encode('utf-8').strip()
signing_key = ed25519.SigningKey(seed, encoding="hex")
verifying_key = signing_key.get_verifying_key()
return signing_key, verifying_key
@staticmethod
def save_to_private_seed_file(filename: str, key: bytes) -> None:
"""Saves the privkey to the nyzo formatted file"""
nyzo_format = ByteUtil.bytes_as_string_with_dashes(key)
with open(filename, 'w') as f:
f.write(nyzo_format)
@staticmethod
def sign_bytes(bytes_to_sign: bytes, private_key: ed25519.SigningKey) -> bytes:
sig = private_key.sign(bytes_to_sign)
return sig
@staticmethod
def signature_is_valid(signature: bytes, signed_bytes: bytes, public_id: bytes) -> bool:
verifying_key = ed25519.VerifyingKey(public_id)
# todo: cache key from id, see https://github.com/n-y-z-o/nyzoVerifier/blob/17509f03a7f530c0431ce85377db9b35688c078e/src/main/java/co/nyzo/verifier/util/SignatureUtil.java
try:
verifying_key.verify(signature, signed_bytes)
# print("signature is good")
return True
except ed25519.BadSignatureError:
# print("signature is bad!")
return False
if __name__ == "__main__":
KeyUtil.main()
# KeyUtil.private_to_public('nyzo-formatted-private-key'.replace('-', ''))
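    # Sign/verify round-trip sketch (uses only the ed25519 package imported above):
    # seed = KeyUtil.generateSeed('demo')
    # sk, vk = KeyUtil.get_from_private_seed(seed.hex())
    # sig = KeyUtil.sign_bytes(b'hello nyzo', sk)
    # assert KeyUtil.signature_is_valid(sig, b'hello nyzo', vk.to_bytes())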
| 38.463158 | 179 | 0.678982 | 458 | 3,654 | 5.218341 | 0.31441 | 0.058577 | 0.035565 | 0.045188 | 0.396234 | 0.349791 | 0.305021 | 0.261088 | 0.261088 | 0.176569 | 0 | 0.054073 | 0.210454 | 3,654 | 94 | 180 | 38.87234 | 0.77435 | 0.29283 | 0 | 0.232143 | 0 | 0 | 0.029145 | 0 | 0 | 0 | 0 | 0.010638 | 0 | 1 | 0.142857 | false | 0 | 0.053571 | 0 | 0.339286 | 0.035714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
601f307a31ada0a1b790c747cfc5310721f08839 | 724 | py | Python | python code/influxdb_worker.py | thongnbui/MIDS_251_project | 8eee0f4569268e11c2d1d356024dbdc10f180b10 | [
"Apache-2.0"
] | null | null | null | python code/influxdb_worker.py | thongnbui/MIDS_251_project | 8eee0f4569268e11c2d1d356024dbdc10f180b10 | [
"Apache-2.0"
] | null | null | null | python code/influxdb_worker.py | thongnbui/MIDS_251_project | 8eee0f4569268e11c2d1d356024dbdc10f180b10 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
import json
import argparse
from influxdb import InfluxDBClient
parser = argparse.ArgumentParser(description = 'pull data for softlayer queue' )
parser.add_argument( 'measurement' , help = 'measurement001' )
args = parser.parse_args()
client_influxdb = InfluxDBClient('50.23.117.76', '8086', 'cricket', 'cricket', 'cricket_data')
query = 'SELECT "data_center", "device", "value" FROM "cricket_data"."cricket_retention".'+args.measurement+' WHERE time > now() - 10m order by time'
result = client_influxdb.query(query)
for r in result:
i = 0
for data_center, device, value, time in r:
print args.measurement,'\t',r[i][data_center],'\t',r[i][device],'\t',r[i][time],'\t',r[i][value]
i += 1
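# Run sketch: `python influxdb_worker.py measurement001`; the measurement name is the
# single positional argument defined by the argparse setup above.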
| 34.47619 | 149 | 0.705801 | 103 | 724 | 4.864078 | 0.495146 | 0.015968 | 0.023952 | 0.083832 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031696 | 0.128453 | 724 | 20 | 150 | 36.2 | 0.762282 | 0.022099 | 0 | 0 | 0 | 0 | 0.315417 | 0.049505 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.214286 | null | null | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6021d213fcca1b9fd94f8cf2d534f74eefae66dc | 3,522 | py | Python | src/python/pants/backend/docker/lint/hadolint/subsystem.py | xyzst/pants | d6a357fe67ee7e8e1aefeae625e107f5609f1717 | [
"Apache-2.0"
] | null | null | null | src/python/pants/backend/docker/lint/hadolint/subsystem.py | xyzst/pants | d6a357fe67ee7e8e1aefeae625e107f5609f1717 | [
"Apache-2.0"
] | 28 | 2021-12-27T15:53:46.000Z | 2022-03-23T11:01:42.000Z | src/python/pants/backend/docker/lint/hadolint/subsystem.py | riisi/pants | b33327389fab67c47b919710ea32f20ca284b1a6 | [
"Apache-2.0"
] | null | null | null | # Copyright 2021 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
from __future__ import annotations
from typing import cast
from pants.core.util_rules.config_files import ConfigFilesRequest
from pants.core.util_rules.external_tool import TemplatedExternalTool
from pants.option.custom_types import file_option, shell_str
class Hadolint(TemplatedExternalTool):
options_scope = "hadolint"
name = "hadolint"
help = "A linter for Dockerfiles."
default_version = "v2.8.0"
# TODO: https://github.com/hadolint/hadolint/issues/411 tracks building and releasing
# hadolint for Linux ARM64.
default_known_versions = [
"v2.8.0|macos_x86_64|27985f257a216ecab06a16e643e8cb0123e7145b5d526cfcb4ce7a31fe99f357|2428944",
"v2.8.0|macos_arm64 |27985f257a216ecab06a16e643e8cb0123e7145b5d526cfcb4ce7a31fe99f357|2428944", # same as mac x86
"v2.8.0|linux_x86_64|9dfc155139a1e1e9b3b28f3de9907736b9dfe7cead1c3a0ae7ff0158f3191674|5895708",
]
default_url_template = (
"https://github.com/hadolint/hadolint/releases/download/{version}/hadolint-{platform}"
)
default_url_platform_mapping = {
"macos_arm64": "Darwin-x86_64",
"macos_x86_64": "Darwin-x86_64",
"linux_x86_64": "Linux-x86_64",
}
@classmethod
def register_options(cls, register):
super().register_options(register)
register(
"--skip",
type=bool,
default=False,
help="Don't use Hadolint when running `./pants lint`.",
)
register(
"--args",
type=list,
member_type=shell_str,
help=(
"Arguments to pass directly to Hadolint, e.g. `--hadolint-args='--format json'`.'"
),
)
register(
"--config",
type=file_option,
default=None,
advanced=True,
help=(
"Path to an YAML config file understood by Hadolint "
"(https://github.com/hadolint/hadolint#configure).\n\n"
f"Setting this option will disable `[{cls.options_scope}].config_discovery`. Use "
"this option if the config is located in a non-standard location."
),
)
register(
"--config-discovery",
type=bool,
default=True,
advanced=True,
help=(
"If true, Pants will include all relevant config files during runs "
"(`.hadolint.yaml` and `.hadolint.yml`).\n\n"
f"Use `[{cls.options_scope}].config` instead if your config is in a "
"non-standard location."
),
)
@property
def skip(self) -> bool:
return cast(bool, self.options.skip)
@property
def args(self) -> tuple[str, ...]:
return tuple(self.options.args)
@property
def config(self) -> str | None:
return cast("str | None", self.options.config)
def config_request(self) -> ConfigFilesRequest:
# Refer to https://github.com/hadolint/hadolint#configure for how config files are
# discovered.
return ConfigFilesRequest(
specified=self.config,
specified_option_name=f"[{self.options_scope}].config",
discovery=cast(bool, self.options.config_discovery),
check_existence=[".hadolint.yaml", ".hadolint.yml"],
)
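# Example pants.toml stanza these registered options map to (paths and values hypothetical):
#   [hadolint]
#   args = ["--format", "json"]
#   config = "build-support/.hadolint.yaml"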
| 35.938776 | 122 | 0.615559 | 373 | 3,522 | 5.686327 | 0.399464 | 0.016502 | 0.007544 | 0.04149 | 0.11834 | 0.036775 | 0 | 0 | 0 | 0 | 0 | 0.077074 | 0.277967 | 3,522 | 97 | 123 | 36.309278 | 0.75698 | 0.097956 | 0 | 0.2125 | 0 | 0 | 0.364583 | 0.127525 | 0 | 0 | 0 | 0.010309 | 0 | 1 | 0.0625 | false | 0.0125 | 0.0625 | 0.05 | 0.275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6026e4bb115c40518d8be86f2973d4fb63be08f1 | 2,019 | py | Python | hanlp/pretrained/tok.py | chen88358323/HanLP | ee9066c3b7aad405dfe0ccffb7f66c59017169ae | [
"Apache-2.0"
] | 2 | 2022-03-23T08:50:39.000Z | 2022-03-23T08:50:48.000Z | hanlp/pretrained/tok.py | kingfan1998/HanLP | ee9066c3b7aad405dfe0ccffb7f66c59017169ae | [
"Apache-2.0"
] | null | null | null | hanlp/pretrained/tok.py | kingfan1998/HanLP | ee9066c3b7aad405dfe0ccffb7f66c59017169ae | [
"Apache-2.0"
] | null | null | null | # -*- coding:utf-8 -*-
# Author: hankcs
# Date: 2019-12-28 21:12
from hanlp_common.constant import HANLP_URL
SIGHAN2005_PKU_CONVSEG = HANLP_URL + 'tok/sighan2005-pku-convseg_20200110_153722.zip'
'Conv model (:cite:`wang-xu-2017-convolutional`) trained on sighan2005 pku dataset.'
SIGHAN2005_MSR_CONVSEG = HANLP_URL + 'tok/convseg-msr-nocrf-noembed_20200110_153524.zip'
'Conv model (:cite:`wang-xu-2017-convolutional`) trained on sighan2005 msr dataset.'
CTB6_CONVSEG = HANLP_URL + 'tok/ctb6_convseg_nowe_nocrf_20200110_004046.zip'
'Conv model (:cite:`wang-xu-2017-convolutional`) trained on CTB6 dataset.'
PKU_NAME_MERGED_SIX_MONTHS_CONVSEG = HANLP_URL + 'tok/pku98_6m_conv_ngram_20200110_134736.zip'
'Conv model (:cite:`wang-xu-2017-convolutional`) trained on pku98 six months dataset with familiy name and given name merged into one unit.'
LARGE_ALBERT_BASE = HANLP_URL + 'tok/large_corpus_cws_albert_base_20211228_160926.zip'
'ALBERT model (:cite:`Lan2020ALBERT:`) trained on the largest CWS dataset in the world.'
SIGHAN2005_PKU_BERT_BASE_ZH = HANLP_URL + 'tok/sighan2005_pku_bert_base_zh_20201231_141130.zip'
'BERT model (:cite:`devlin-etal-2019-bert`) trained on sighan2005 pku dataset.'
COARSE_ELECTRA_SMALL_ZH = HANLP_URL + 'tok/coarse_electra_small_20220220_013548.zip'
'Electra (:cite:`clark2020electra`) small model trained on coarse-grained CWS corpora. Its performance is P=96.97% R=96.87% F1=96.92%, which is ' \
'much higher than that of the MTL model.'
FINE_ELECTRA_SMALL_ZH = HANLP_URL + 'tok/fine_electra_small_20220217_190117.zip'
'Electra (:cite:`clark2020electra`) small model trained on fine-grained CWS corpora. Its performance is P=97.44% R=97.40% F1=97.42%, which is ' \
'much higher than that of the MTL model.'
CTB9_TOK_ELECTRA_SMALL = HANLP_URL + 'tok/ctb9_electra_small_20220215_205427.zip'
'Electra (:cite:`clark2020electra`) small model trained on CTB9. Its performance is P=97.15% R=97.36% F1=97.26%, which is ' \
'much higher than that of the MTL model.'
# Will be filled up during runtime
ALL = {}
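# Usage sketch (standard HanLP loading pattern; output shown is illustrative):
# import hanlp
# tok = hanlp.load(COARSE_ELECTRA_SMALL_ZH)
# tok('商品和服务')  # -> ['商品', '和', '服务']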
| 67.3 | 146 | 0.788014 | 323 | 2,019 | 4.696594 | 0.362229 | 0.052736 | 0.06526 | 0.047462 | 0.476599 | 0.383652 | 0.350692 | 0.305867 | 0.208965 | 0.13975 | 0 | 0.142303 | 0.105498 | 2,019 | 29 | 147 | 69.62069 | 0.697674 | 0.045072 | 0 | 0.130435 | 0 | 0.173913 | 0.75923 | 0.360374 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.043478 | 0 | 0.043478 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6029de67c839bfcae337c354721a055f1b81107e | 2,452 | py | Python | model_selection.py | HrishikV/ineuron_inceome_prediction_internship | 4a97a7f29d80198f394fcfd880cc5250fe2a0d1e | [
"MIT"
] | null | null | null | model_selection.py | HrishikV/ineuron_inceome_prediction_internship | 4a97a7f29d80198f394fcfd880cc5250fe2a0d1e | [
"MIT"
] | null | null | null | model_selection.py | HrishikV/ineuron_inceome_prediction_internship | 4a97a7f29d80198f394fcfd880cc5250fe2a0d1e | [
"MIT"
] | null | null | null | from featur_selection import df,race,occupation,workclass,country
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score,KFold
from sklearn.linear_model import LogisticRegression
from imblearn.pipeline import Pipeline
from sklearn.compose import ColumnTransformer
from imblearn.combine import SMOTETomek
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier,AdaBoostClassifier
from sklearn.neighbors import KNeighborsClassifier
from catboost import CatBoostClassifier
from xgboost import XGBClassifier
from sklearn.svm import SVC
from matplotlib import pyplot as plt
import seaborn as sns
df1 = df.copy()
salary = df1['salary'].reset_index(drop=True)
df1 = df1.drop(['salary'], axis=1)
def concat_dataframes(data):
    # Re-attach the one-hot encoded categorical frames by index, then drop rows lost to NaNs
    dataframe = pd.concat([data, workclass.iloc[data.index, :], race.iloc[data.index, :], occupation.iloc[data.index, :], country.iloc[data.index, :]], axis=1)
    dataframe = dataframe.dropna()
    dataframe = dataframe.reset_index(drop=True)
    return dataframe
df1 = concat_dataframes(df1)
features = ['age_logarthmic', 'hours_per_week']
scaler = ColumnTransformer(transformers = [('scale_num_features', StandardScaler(), features)], remainder='passthrough')
models = [LogisticRegression(), SVC(), AdaBoostClassifier(), RandomForestClassifier(), XGBClassifier(), DecisionTreeClassifier(), KNeighborsClassifier(), CatBoostClassifier()]
model_labels = ['LogisticReg.', 'SVC', 'AdaBoost', 'RandomForest', 'Xgboost', 'DecisionTree', 'KNN', 'CatBoost']
mean_validation_f1_scores = []
# Cross-validate each model in a pipeline that scales the numeric features and
# resamples with SMOTETomek to balance the classes before fitting
for model in models:
data_pipeline = Pipeline(steps = [
('scaler', scaler),
('resample', SMOTETomek()),
('model', model)
])
mean_validation_f1 = float(cross_val_score(data_pipeline, df1, salary, cv=KFold(n_splits=10), scoring='f1',n_jobs=-1).mean())
mean_validation_f1_scores.append(mean_validation_f1)
print(mean_validation_f1_scores)
fig, axes = plt.subplots(nrows = 2, ncols = 1, figsize = (15,8))
sns.set_style('dark')
sns.barplot(y = model_labels ,x = mean_validation_f1_scores, ax=axes[0])
axes[0].grid(True, color='k')
sns.set_style('whitegrid')
sns.lineplot(x = model_labels, y = mean_validation_f1_scores)
axes[1].grid(True, color='k')
fig.show() | 45.407407 | 240 | 0.722675 | 288 | 2,452 | 6 | 0.423611 | 0.050926 | 0.064815 | 0.063657 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013612 | 0.161093 | 2,452 | 54 | 241 | 45.407407 | 0.826446 | 0 | 0 | 0 | 0 | 0 | 0.069303 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021277 | false | 0.021277 | 0.340426 | 0 | 0.382979 | 0.021277 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
602de82ea89f13dcd9f29b60fb46750634f30aed | 7,711 | py | Python | app/auth/views.py | MainaKamau92/apexselftaught | 9f9a3bd1ba23e57a12e173730917fb9bb7003707 | [
"MIT"
] | 4 | 2019-01-02T19:52:00.000Z | 2022-02-21T11:07:34.000Z | app/auth/views.py | MainaKamau92/apexselftaught | 9f9a3bd1ba23e57a12e173730917fb9bb7003707 | [
"MIT"
] | 2 | 2019-12-04T13:36:54.000Z | 2019-12-04T13:49:21.000Z | app/auth/views.py | MainaKamau92/apexselftaught | 9f9a3bd1ba23e57a12e173730917fb9bb7003707 | [
"MIT"
] | 1 | 2021-11-28T13:23:14.000Z | 2021-11-28T13:23:14.000Z | # app/auth/views.py
import os
from flask import flash, redirect, render_template, url_for, request
from flask_login import login_required, login_user, logout_user, current_user
from . import auth
from .forms import (LoginForm, RegistrationForm,
RequestResetForm, ResetPasswordForm)
from .. import db, mail
from ..models import User
from flask_mail import Message
from werkzeug.security import generate_password_hash
@auth.route('/register/', methods=['GET', 'POST'])
def register():
"""
Handle requests to the /register route
Add an user to the database through the registration form
"""
logout_user()
form = RegistrationForm()
if form.validate_on_submit():
user = User(first_name=form.first_name.data,
last_name=form.last_name.data,
email=form.email.data,
username=form.username.data,
password=form.password.data,
is_freelancer=form.freelancer.data,
is_employer=form.employer.data)
# add user to the database
db.session.add(user)
db.session.commit()
        flash(f'You have successfully registered! You may now log in.', 'success')
# redirect to the login page
return redirect(url_for('auth.login'))
# load registration form
return render_template('auth/register.html', form=form, title='Register')
@auth.route('/login/', methods=['GET', 'POST'])
def login():
"""
Handle requests to the /login route
Log an employee in through the login form
"""
if current_user.is_authenticated:
if current_user.is_freelancer == True and current_user.is_employer == False:
# redirect to the freelancer dashboard page after login
return redirect(url_for('freelancer.dashboard'))
elif current_user.is_employer == True and current_user.is_freelancer == False:
# redirect to the employer dashboard page after login
return redirect(url_for('employer.dashboard'))
elif current_user.is_employer and current_user.is_freelancer:
# redirect to the employer dashboard page after login
return redirect(url_for('employer.dashboard'))
else:
# redirect to the admin dashboard
return redirect(url_for('admin.admin_dashboard'))
form = LoginForm()
if form.validate_on_submit():
# check whether user exists in the database
# the password entered matches the password in the database
user = User.query.filter_by(email=form.email.data).first()
if user is not None and user.verify_password(form.password.data):
login_user(user, remember=form.remember.data)
#flash(f'Logged In', 'success')
if user.is_freelancer == True and user.is_employer == False:
# redirect to the freelancer dashboard page after login
return redirect(url_for('freelancer.dashboard'))
elif user.is_employer == True and user.is_freelancer == False:
# redirect to the employer dashboard page after login
return redirect(url_for('employer.dashboard'))
elif user.is_employer and user.is_freelancer:
# redirect to the employer dashboard page after login
return redirect(url_for('employer.dashboard'))
else:
# redirect to the admin dashboard
return redirect(url_for('admin.admin_dashboard'))
flash(f'Invalid Credentials', 'danger')
# load login template
return render_template('auth/login.html', form=form, title='Login')
@auth.route('/logout/', methods=['GET', 'POST'])
@login_required
def logout():
"""
Handle requests to the /logout route
Log an employee out through the logout link
"""
logout_user()
flash(f'You have been logged out', 'success')
# redirect to the login page
return redirect(url_for('auth.login'))
def send_reset_email(user):
try:
token = user.get_reset_token()
msg = Message('Password Reset Request',
sender='activecodar@gmail.com',
recipients=[user.email])
        msg.body = f'''To reset your password, visit the following link:
{url_for('auth.reset_password', token=token, _external=True)}
If you did not make this request, ignore this email.
'''
mail.send(msg)
except Exception as e:
print(e)
@auth.route('/reset-password', methods=['GET', 'POST'])
def request_reset():
if current_user.is_authenticated:
next_page = request.args.get('next')
if current_user.is_freelancer == True and current_user.is_employer == False:
# redirect to the freelancer dashboard page after login
return redirect(next_page) if next_page else redirect(url_for('freelancer.dashboard'))
elif current_user.is_employer == True and current_user.is_freelancer == False:
# redirect to the employer dashboard page after login
return redirect(next_page) if next_page else redirect(url_for('employer.dashboard'))
elif current_user.is_employer and current_user.is_freelancer:
# redirect to the employer dashboard page after login
return redirect(next_page) if next_page else redirect(url_for('employer.dashboard'))
else:
# redirect to the admin dashboard
return redirect(next_page) if next_page else redirect(url_for('admin.admin_dashboard'))
form = RequestResetForm()
if form.validate_on_submit():
user = User.query.filter_by(email=form.email.data).first()
send_reset_email(user)
flash(f'Email has been sent with password reset instructions', 'info')
return redirect(url_for('auth.login'))
return render_template('auth/reset_request.html', form=form, title='Request Reset Password')
@auth.route('/reset-password/<token>', methods=['GET', 'POST'])
def reset_password(token):
if current_user.is_authenticated:
next_page = request.args.get('next')
if current_user.is_freelancer == True and current_user.is_employer == False:
# redirect to the freelancer dashboard page after login
return redirect(next_page) if next_page else redirect(url_for('freelancer.dashboard'))
elif current_user.is_employer == True and current_user.is_freelancer == False:
# redirect to the employer dashboard page after login
return redirect(next_page) if next_page else redirect(url_for('employer.dashboard'))
elif current_user.is_employer and current_user.is_freelancer:
# redirect to the employer dashboard page after login
return redirect(next_page) if next_page else redirect(url_for('employer.dashboard'))
else:
# redirect to the admin dashboard
return redirect(next_page) if next_page else redirect(url_for('admin.admin_dashboard'))
user = User.verify_reset_token(token)
if user is None:
flash(f'Invalid token or expired token', 'warning')
return redirect(url_for('auth.request_reset'))
form = ResetPasswordForm()
if form.validate_on_submit():
# add user to the database
hashed_password = generate_password_hash(form.password.data)
user.password_hash = hashed_password
db.session.commit()
flash(
f'Your password has been reset successfully! You may now login', 'success')
return redirect(url_for('auth.login'))
return render_template('auth/reset_password.html', form=form, title='Reset Password')
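# Sketch of the token helpers these views assume on the User model (a common
# itsdangerous-based pattern; names, serializer, and expiry are assumptions):
# def get_reset_token(self, expires_sec=1800):
#     s = Serializer(current_app.config['SECRET_KEY'], expires_sec)
#     return s.dumps({'user_id': self.id}).decode('utf-8')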
| 43.8125 | 99 | 0.664765 | 968 | 7,711 | 5.143595 | 0.14876 | 0.034947 | 0.059048 | 0.052219 | 0.569994 | 0.515565 | 0.515565 | 0.503515 | 0.503515 | 0.503515 | 0 | 0 | 0.244586 | 7,711 | 175 | 100 | 44.062857 | 0.854764 | 0.170017 | 0 | 0.413793 | 0 | 0 | 0.174925 | 0.03261 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051724 | false | 0.146552 | 0.077586 | 0 | 0.344828 | 0.008621 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
602fc03ac149fa50fb90ef1d0ffd3dc3832e7d14 | 5,054 | py | Python | cleaning.py | jhamrick/cogsci-proceedings-analysis | c3c8b0abd8b9ce639f6de0aea52aec46c2c8abca | [
"MIT"
] | null | null | null | cleaning.py | jhamrick/cogsci-proceedings-analysis | c3c8b0abd8b9ce639f6de0aea52aec46c2c8abca | [
"MIT"
] | null | null | null | cleaning.py | jhamrick/cogsci-proceedings-analysis | c3c8b0abd8b9ce639f6de0aea52aec46c2c8abca | [
"MIT"
] | 1 | 2020-05-11T10:38:38.000Z | 2020-05-11T10:38:38.000Z | import re
import difflib
import pandas as pd
import numpy as np
from nameparser import HumanName
from nameparser.config import CONSTANTS
CONSTANTS.titles.remove("gen")
CONSTANTS.titles.remove("prin")
def parse_paper_type(section_name):
section_name = section_name.strip().lower()
if section_name == '':
paper_type = None
elif re.match('.*workshop.*', section_name):
paper_type = 'workshop'
elif re.match('.*symposi.*', section_name):
paper_type = 'symposium'
elif re.match('.*poster.*', section_name):
paper_type = 'poster'
elif re.match('.*tutorial.*', section_name):
paper_type = 'workshop'
elif re.match('.*abstract.*', section_name):
paper_type = 'poster'
elif re.match('.*addenda.*', section_name):
paper_type = 'other'
else:
paper_type = 'talk'
return paper_type
def clean_authors(authors):
cleaned_authors = []
authors = authors.lower()
# get rid of commas where there are suffixes, like Jr. or III
authors = authors.replace(", jr.", " jr.")
authors = authors.replace(", iii", " iii")
authors = authors.replace(", ph.d", "")
# special cases
authors = authors.replace("organizer:", "")
authors = authors.replace("roel m,", "roel m.")
if authors == 'kozue miyashiro, etsuko harada, t.':
author_list = ['kozue miyashiro', 'etsuko harada, t.']
else:
author_list = authors.split(",")
for author in author_list:
author = HumanName(author.lower())
if author.first == '' or author.last == '':
raise ValueError("invalid author name: {}".format(author))
author.capitalize()
author.string_format = u"{last}, {title} {first} {middle}, {suffix}"
cleaned_authors.append(unicode(author))
return cleaned_authors
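# e.g. clean_authors("john a. smith, jr., jane doe") yields two entries formatted
# roughly as "Smith, John A., Jr." and "Doe, Jane" (exact spacing depends on nameparser).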
def extract_authors(papers):
author_papers = []
for i, paper in papers.iterrows():
authors = clean_authors(paper['authors'])
for author in authors:
entry = paper.copy().drop('authors')
entry['author'] = author
author_papers.append(entry)
author_papers = pd.DataFrame(author_papers)
return author_papers
def fix_author_misspellings(papers, G):
authors = np.sort(papers['author'].unique())
for i in xrange(len(authors)):
window = 20
lower = i + 1
upper = min(i + 1 + window, len(authors) - 1)
for j in xrange(len(authors[lower:upper])):
author1 = authors[i]
author2 = authors[lower + j]
if author1 == author2:
continue
author1_hn = HumanName(author1)
author2_hn = HumanName(author2)
same_first = author1_hn.first == author2_hn.first
same_last = author1_hn.last == author2_hn.last
if same_first and same_last:
replace = True
else:
ratio = difflib.SequenceMatcher(None, author1, author2).ratio()
if ratio > 0.9:
coauthors = set(G[author1].keys()) & set(G[author2].keys())
if len(coauthors) > 0:
replace = True
else:
print u"\nPossible match: '{}' vs '{}' (r={})".format(
author1, author2, ratio)
print sorted(G[author1].keys())
print sorted(G[author2].keys())
accept = ""
while accept not in ("y", "n"):
accept = raw_input("Accept? (y/n) ")
replace = accept == "y"
else:
replace = False
if replace:
num1 = len(papers.groupby('author').get_group(author1))
num2 = len(papers.groupby('author').get_group(author2))
if num1 > num2:
oldname = author2
newname = author1
else:
oldname = author1
newname = author2
print u"Replacing '{}' with '{}'".format(oldname, newname)
papers.loc[papers['author'] == oldname, 'author'] = newname
authors[authors == oldname] = newname
for neighbor in G[oldname]:
if neighbor not in G[newname]:
G.add_edge(newname, neighbor)
G[newname][neighbor]['weight'] = 0
weight = G[oldname][neighbor]['weight']
G[newname][neighbor]['weight'] += weight
G.remove_node(oldname)
return papers, G
if __name__ == "__main__":
import graph
papers = pd.read_csv("cogsci_proceedings_raw.csv")
papers['type'] = papers['section'].apply(parse_paper_type)
papers = extract_authors(papers)
G = graph.make_author_graph(papers)
papers, G = fix_author_misspellings(papers, G)
papers.to_csv("cogsci_proceedings.csv", encoding='utf-8')
| 34.616438 | 79 | 0.551049 | 541 | 5,054 | 5.012939 | 0.297597 | 0.036504 | 0.041298 | 0.051622 | 0.118732 | 0.078171 | 0.056047 | 0.056047 | 0 | 0 | 0 | 0.011747 | 0.326276 | 5,054 | 145 | 80 | 34.855172 | 0.784728 | 0.014444 | 0 | 0.10084 | 0 | 0 | 0.104259 | 0.009642 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.058824 | null | null | 0.033613 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
602fe47995203be2cbe5445ca36c210c61dfb7a1 | 384 | py | Python | quem_foi_para_mar_core/migrations/0004_auto_20200811_1945.py | CamilaBodack/template-projeto-selecao | b0a0cf6070bf8abab626a17af5c315c82368b010 | [
"MIT"
] | 1 | 2020-09-01T23:04:07.000Z | 2020-09-01T23:04:07.000Z | quem_foi_para_mar_core/migrations/0004_auto_20200811_1945.py | CamilaBodack/template-projeto-selecao | b0a0cf6070bf8abab626a17af5c315c82368b010 | [
"MIT"
] | 4 | 2020-10-07T18:04:41.000Z | 2020-10-07T18:07:58.000Z | quem_foi_para_mar_core/migrations/0004_auto_20200811_1945.py | CamilaBodack/template-projeto-selecao | b0a0cf6070bf8abab626a17af5c315c82368b010 | [
"MIT"
] | null | null | null | # Generated by Django 3.1 on 2020-08-11 19:45
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('quem_foi_para_mar_core', '0003_auto_20200811_1944'),
]
operations = [
migrations.RenameField(
model_name='contato',
old_name='pescador_id',
new_name='pescador',
),
]
| 20.210526 | 62 | 0.609375 | 43 | 384 | 5.186047 | 0.837209 | 0.107623 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109489 | 0.286458 | 384 | 18 | 63 | 21.333333 | 0.70438 | 0.111979 | 0 | 0 | 1 | 0 | 0.20944 | 0.132743 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6030d536392f2700f6b4fca762988c6115c81681 | 268 | py | Python | examples/tinytag/fuzz.py | MJ-SEO/py_fuzz | 789fbfea21bf644ba4d00554fe4141694b0a190a | [
"Apache-2.0"
] | null | null | null | examples/tinytag/fuzz.py | MJ-SEO/py_fuzz | 789fbfea21bf644ba4d00554fe4141694b0a190a | [
"Apache-2.0"
] | null | null | null | examples/tinytag/fuzz.py | MJ-SEO/py_fuzz | 789fbfea21bf644ba4d00554fe4141694b0a190a | [
"Apache-2.0"
] | null | null | null | from pythonfuzz.main import PythonFuzz
from tinytag import TinyTag
import io
@PythonFuzz
def fuzz(buf):
    try:
        # write the fuzzed buffer and close the file so the bytes are flushed
        # before TinyTag re-opens it by name (seek alone does not flush)
        with open('temp.mp4', "wb") as f:
            f.write(buf)
        tag = TinyTag.get('temp.mp4')
except UnicodeDecodeError:
pass
if __name__ == '__main__':
fuzz()
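# Run sketch: `python fuzz.py` (optionally with a corpus directory); pythonfuzz
# repeatedly invokes fuzz() with mutated byte buffers.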
| 14.105263 | 38 | 0.69403 | 39 | 268 | 4.564103 | 0.641026 | 0.146067 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009091 | 0.179104 | 268 | 18 | 39 | 14.888889 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.067669 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0.071429 | 0.214286 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
60338466dc34f8421b1477264c6d62ca84ee2404 | 36,939 | py | Python | payments/models.py | wahuneke/django-stripe-payments | 5d4b26b025fc3fa75d3a0aeaafd67fb825325c94 | [
"BSD-3-Clause"
] | null | null | null | payments/models.py | wahuneke/django-stripe-payments | 5d4b26b025fc3fa75d3a0aeaafd67fb825325c94 | [
"BSD-3-Clause"
] | null | null | null | payments/models.py | wahuneke/django-stripe-payments | 5d4b26b025fc3fa75d3a0aeaafd67fb825325c94 | [
"BSD-3-Clause"
] | null | null | null | import datetime
import decimal
import json
import traceback
from django.conf import settings
from django.core.mail import EmailMessage
from django.db import models
from django.utils import timezone
from django.template.loader import render_to_string
from django.contrib.sites.models import Site
import stripe
from jsonfield.fields import JSONField
from .managers import CustomerManager, ChargeManager, TransferManager
from .settings import (
DEFAULT_PLAN,
INVOICE_FROM_EMAIL,
PAYMENTS_PLANS,
plan_from_stripe_id,
SEND_EMAIL_RECEIPTS,
TRIAL_PERIOD_FOR_USER_CALLBACK,
PLAN_QUANTITY_CALLBACK
)
from .signals import (
cancelled,
card_changed,
subscription_made,
webhook_processing_error,
WEBHOOK_SIGNALS,
)
from .utils import convert_tstamp
stripe.api_key = settings.STRIPE_SECRET_KEY
stripe.api_version = getattr(settings, "STRIPE_API_VERSION", "2012-11-07")
class StripeObject(models.Model):
stripe_id = models.CharField(max_length=255, unique=True)
created_at = models.DateTimeField(default=timezone.now)
class Meta: # pylint: disable=E0012,C1001
abstract = True
class EventProcessingException(models.Model):
event = models.ForeignKey("Event", null=True)
data = models.TextField()
message = models.CharField(max_length=500)
traceback = models.TextField()
created_at = models.DateTimeField(default=timezone.now)
@classmethod
def log(cls, data, exception, event):
cls.objects.create(
event=event,
data=data or "",
message=str(exception),
traceback=traceback.format_exc()
)
def __unicode__(self):
return u"<%s, pk=%s, Event=%s>" % (self.message, self.pk, self.event)
class Event(StripeObject):
kind = models.CharField(max_length=250)
livemode = models.BooleanField()
customer = models.ForeignKey("Customer", null=True)
webhook_message = JSONField()
validated_message = JSONField(null=True)
valid = models.NullBooleanField(null=True)
processed = models.BooleanField(default=False)
stripe_connect = models.ForeignKey('ConnectUser', null=True)
@property
def message(self):
return self.validated_message
def __unicode__(self):
return "%s - %s" % (self.kind, self.stripe_id)
def link_customer(self):
cus_id = None
customer_crud_events = [
"customer.created",
"customer.updated",
"customer.deleted"
]
if self.kind in customer_crud_events:
cus_id = self.message["data"]["object"]["id"]
else:
cus_id = self.message["data"]["object"].get("customer", None)
if cus_id is not None:
try:
self.customer = Customer.objects.get(stripe_id=cus_id)
self.save()
except Customer.DoesNotExist:
pass
def link_stripe_connect(self):
connect_id = self.message["data"]["object"].get("user_id", None)
if connect_id is not None:
try:
self.stripe_connect = ConnectUser.objects.get(account_id=connect_id)
self.save()
except ConnectUser.DoesNotExist:
pass
def validate(self):
evt = stripe.Event.retrieve(self.stripe_id)
self.validated_message = json.loads(
json.dumps(
evt.to_dict(),
sort_keys=True,
cls=stripe.StripeObjectEncoder
)
)
if self.webhook_message["data"] == self.validated_message["data"]:
self.valid = True
else:
self.valid = False
self.save()
def process(self):
"""
"account.updated",
"account.application.deauthorized",
"charge.succeeded",
"charge.failed",
"charge.refunded",
"charge.dispute.created",
"charge.dispute.updated",
"chagne.dispute.closed",
"customer.created",
"customer.updated",
"customer.deleted",
"customer.subscription.created",
"customer.subscription.updated",
"customer.subscription.deleted",
"customer.subscription.trial_will_end",
"customer.discount.created",
"customer.discount.updated",
"customer.discount.deleted",
"invoice.created",
"invoice.updated",
"invoice.payment_succeeded",
"invoice.payment_failed",
"invoiceitem.created",
"invoiceitem.updated",
"invoiceitem.deleted",
"plan.created",
"plan.updated",
"plan.deleted",
"coupon.created",
"coupon.updated",
"coupon.deleted",
"transfer.created",
"transfer.updated",
"transfer.failed",
"ping"
"""
if self.valid and not self.processed:
try:
if not self.kind.startswith("plan.") and \
not self.kind.startswith("transfer."):
self.link_customer()
if not self.stripe_connect:
self.link_stripe_connect()
if self.kind.startswith("invoice."):
Invoice.handle_event(self)
elif self.kind.startswith("charge."):
if not self.customer:
self.link_customer()
self.customer.record_charge(
self.message["data"]["object"]["id"]
)
elif self.kind.startswith("transfer."):
Transfer.process_transfer(
self,
self.message["data"]["object"]
)
elif self.kind.startswith("customer.subscription."):
if not self.customer:
self.link_customer()
if self.customer:
self.customer.sync_current_subscription()
elif self.kind == "customer.deleted":
if not self.customer:
self.link_customer()
self.customer.purge()
self.send_signal()
self.processed = True
self.save()
except stripe.StripeError as e:
EventProcessingException.log(
data=e.http_body,
exception=e,
event=self
)
webhook_processing_error.send(
sender=Event,
data=e.http_body,
exception=e
)
def send_signal(self):
signal = WEBHOOK_SIGNALS.get(self.kind)
if signal:
return signal.send(sender=Event, event=self)
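# Example receiver (hypothetical; assumes WEBHOOK_SIGNALS maps the
# "invoice.payment_succeeded" kind to a django Signal, as the import above suggests):
# from django.dispatch import receiver
# @receiver(WEBHOOK_SIGNALS["invoice.payment_succeeded"], sender=Event)
# def on_invoice_paid(sender, event, **kwargs):
#     pass  # event.message holds the validated webhook payload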
class Transfer(StripeObject):
# pylint: disable=C0301
event = models.ForeignKey(Event, related_name="transfers")
amount = models.DecimalField(decimal_places=2, max_digits=9)
status = models.CharField(max_length=25)
date = models.DateTimeField()
description = models.TextField(null=True, blank=True)
adjustment_count = models.IntegerField(null=True)
adjustment_fees = models.DecimalField(decimal_places=2, max_digits=7, null=True)
adjustment_gross = models.DecimalField(decimal_places=2, max_digits=7, null=True)
charge_count = models.IntegerField(null=True)
charge_fees = models.DecimalField(decimal_places=2, max_digits=7, null=True)
charge_gross = models.DecimalField(decimal_places=2, max_digits=9, null=True)
collected_fee_count = models.IntegerField(null=True)
collected_fee_gross = models.DecimalField(decimal_places=2, max_digits=7, null=True)
net = models.DecimalField(decimal_places=2, max_digits=9, null=True)
refund_count = models.IntegerField(null=True)
refund_fees = models.DecimalField(decimal_places=2, max_digits=7, null=True)
refund_gross = models.DecimalField(decimal_places=2, max_digits=7, null=True)
validation_count = models.IntegerField(null=True)
validation_fees = models.DecimalField(decimal_places=2, max_digits=7, null=True)
stripe_connect = models.ForeignKey('ConnectUser', null=True)
objects = TransferManager()
def update_status(self):
self.status = stripe.Transfer.retrieve(self.stripe_id).status
self.save()
@classmethod
def process_transfer(cls, event, transfer):
defaults = {
"amount": transfer["amount"] / decimal.Decimal("100"),
"status": transfer["status"],
"date": convert_tstamp(transfer, "date"),
"description": transfer.get("description", "")
}
summary = transfer.get("summary")
if summary:
defaults.update({
"adjustment_count": summary.get("adjustment_count"),
"adjustment_fees": summary.get("adjustment_fees"),
"adjustment_gross": summary.get("adjustment_gross"),
"charge_count": summary.get("charge_count"),
"charge_fees": summary.get("charge_fees"),
"charge_gross": summary.get("charge_gross"),
"collected_fee_count": summary.get("collected_fee_count"),
"collected_fee_gross": summary.get("collected_fee_gross"),
"refund_count": summary.get("refund_count"),
"refund_fees": summary.get("refund_fees"),
"refund_gross": summary.get("refund_gross"),
"validation_count": summary.get("validation_count"),
"validation_fees": summary.get("validation_fees"),
"net": summary.get("net") / decimal.Decimal("100")
})
for field in defaults:
if field.endswith("fees") or field.endswith("gross"):
defaults[field] = defaults[field] / decimal.Decimal("100")
if event.kind == "transfer.paid":
defaults.update({"event": event})
obj, created = Transfer.objects.get_or_create(
stripe_id=transfer["id"],
defaults=defaults
)
else:
obj, created = Transfer.objects.get_or_create(
stripe_id=transfer["id"],
event=event,
defaults=defaults
)
if event.stripe_connect:
obj.stripe_connect = event.stripe_connect
if created and summary:
for fee in summary.get("charge_fee_details", []):
obj.charge_fee_details.create(
amount=fee["amount"] / decimal.Decimal("100"),
application=fee.get("application", ""),
description=fee.get("description", ""),
kind=fee["type"]
)
else:
obj.status = transfer["status"]
obj.save()
if event.kind == "transfer.updated":
obj.update_status()
class TransferChargeFee(models.Model):
transfer = models.ForeignKey(Transfer, related_name="charge_fee_details")
amount = models.DecimalField(decimal_places=2, max_digits=7)
application = models.TextField(null=True, blank=True)
description = models.TextField(null=True, blank=True)
kind = models.CharField(max_length=150)
created_at = models.DateTimeField(default=timezone.now)
class Customer(StripeObject):
user = models.OneToOneField(
getattr(settings, "AUTH_USER_MODEL", "auth.User"),
null=True
)
card_fingerprint = models.CharField(max_length=200, blank=True)
card_last_4 = models.CharField(max_length=4, blank=True)
card_kind = models.CharField(max_length=50, blank=True)
date_purged = models.DateTimeField(null=True, editable=False)
objects = CustomerManager()
def __unicode__(self):
return unicode(self.user)
@property
def stripe_customer(self):
return stripe.Customer.retrieve(self.stripe_id)
def purge(self):
try:
self.stripe_customer.delete()
except stripe.InvalidRequestError as e:
if e.message.startswith("No such customer:"):
# The exception was thrown because the customer was already
# deleted on the stripe side, ignore the exception
pass
else:
# The exception was raised for another reason, re-raise it
raise
self.user = None
self.card_fingerprint = ""
self.card_last_4 = ""
self.card_kind = ""
self.date_purged = timezone.now()
self.save()
def delete(self, using=None):
# Only way to delete a customer is to use SQL
self.purge()
def can_charge(self):
return self.card_fingerprint and \
self.card_last_4 and \
self.card_kind and \
self.date_purged is None
def has_active_subscription(self):
try:
return self.current_subscription.is_valid()
except CurrentSubscription.DoesNotExist:
return False
def cancel(self, at_period_end=True):
try:
current = self.current_subscription
except CurrentSubscription.DoesNotExist:
return
sub = self.stripe_customer.cancel_subscription(
at_period_end=at_period_end
)
current.status = sub.status
current.cancel_at_period_end = sub.cancel_at_period_end
current.current_period_end = convert_tstamp(sub, "current_period_end")
current.save()
cancelled.send(sender=self, stripe_response=sub)
@classmethod
def create(cls, user, card=None, plan=None, charge_immediately=True):
if card and plan:
plan = PAYMENTS_PLANS[plan]["stripe_plan_id"]
elif DEFAULT_PLAN:
plan = PAYMENTS_PLANS[DEFAULT_PLAN]["stripe_plan_id"]
else:
plan = None
trial_end = None
if TRIAL_PERIOD_FOR_USER_CALLBACK and plan:
trial_days = TRIAL_PERIOD_FOR_USER_CALLBACK(user)
trial_end = datetime.datetime.utcnow() + datetime.timedelta(
days=trial_days
)
stripe_customer = stripe.Customer.create(
email=user.email,
card=card,
plan=plan or DEFAULT_PLAN,
trial_end=trial_end
)
if stripe_customer.active_card:
cus = cls.objects.create(
user=user,
stripe_id=stripe_customer.id,
card_fingerprint=stripe_customer.active_card.fingerprint,
card_last_4=stripe_customer.active_card.last4,
card_kind=stripe_customer.active_card.type
)
else:
cus = cls.objects.create(
user=user,
stripe_id=stripe_customer.id,
)
if plan:
if stripe_customer.subscription:
cus.sync_current_subscription(cu=stripe_customer)
if charge_immediately:
cus.send_invoice()
return cus
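# Typical call (illustrative; "pro" is a hypothetical PAYMENTS_PLANS key):
# Customer.create(user, card=token, plan="pro")
# creates the Stripe-side customer, caches the card fingerprint locally,
# syncs the subscription, and invoices immediately unless charge_immediately=False.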
def update_card(self, token):
cu = self.stripe_customer
cu.card = token
cu.save()
self.save_card(cu)
def save_card(self, cu=None):
cu = cu or self.stripe_customer
active_card = cu.active_card
self.card_fingerprint = active_card.fingerprint
self.card_last_4 = active_card.last4
self.card_kind = active_card.type
self.save()
card_changed.send(sender=self, stripe_response=cu)
def retry_unpaid_invoices(self):
self.sync_invoices()
for inv in self.invoices.filter(paid=False, closed=False):
try:
inv.retry() # Always retry unpaid invoices
except stripe.InvalidRequestError as error:
if error.message != "Invoice is already paid":
raise error
def send_invoice(self):
try:
invoice = stripe.Invoice.create(customer=self.stripe_id)
if invoice.amount_due > 0:
invoice.pay()
return True
except stripe.InvalidRequestError:
return False # There was nothing to invoice
def sync(self, cu=None):
cu = cu or self.stripe_customer
updated = False
if hasattr(cu, "active_card") and cu.active_card:
# Test to make sure the card has changed, otherwise do not update it
# (i.e. refrain from sending any signals)
if (self.card_last_4 != cu.active_card.last4 or
self.card_fingerprint != cu.active_card.fingerprint or
self.card_kind != cu.active_card.type):
updated = True
self.card_last_4 = cu.active_card.last4
self.card_fingerprint = cu.active_card.fingerprint
self.card_kind = cu.active_card.type
else:
updated = True
self.card_fingerprint = ""
self.card_last_4 = ""
self.card_kind = ""
if updated:
self.save()
card_changed.send(sender=self, stripe_response=cu)
def sync_invoices(self, cu=None):
cu = cu or self.stripe_customer
for invoice in cu.invoices().data:
Invoice.sync_from_stripe_data(invoice, send_receipt=False)
def sync_charges(self, cu=None):
cu = cu or self.stripe_customer
for charge in cu.charges().data:
self.record_charge(charge.id)
def sync_current_subscription(self, cu=None):
cu = cu or self.stripe_customer
sub = getattr(cu, "subscription", None)
if sub is None:
try:
self.current_subscription.delete()
except CurrentSubscription.DoesNotExist:
pass
else:
try:
sub_obj = self.current_subscription
sub_obj.plan = plan_from_stripe_id(sub.plan.id)
sub_obj.current_period_start = convert_tstamp(
sub.current_period_start
)
sub_obj.current_period_end = convert_tstamp(
sub.current_period_end
)
sub_obj.amount = (sub.plan.amount / decimal.Decimal("100"))
sub_obj.status = sub.status
sub_obj.cancel_at_period_end = sub.cancel_at_period_end
sub_obj.start = convert_tstamp(sub.start)
sub_obj.quantity = sub.quantity
sub_obj.save()
except CurrentSubscription.DoesNotExist:
sub_obj = CurrentSubscription.objects.create(
customer=self,
plan=plan_from_stripe_id(sub.plan.id),
current_period_start=convert_tstamp(
sub.current_period_start
),
current_period_end=convert_tstamp(
sub.current_period_end
),
amount=(sub.plan.amount / decimal.Decimal("100")),
status=sub.status,
cancel_at_period_end=sub.cancel_at_period_end,
start=convert_tstamp(sub.start),
quantity=sub.quantity
)
if sub.trial_start and sub.trial_end:
sub_obj.trial_start = convert_tstamp(sub.trial_start)
sub_obj.trial_end = convert_tstamp(sub.trial_end)
sub_obj.save()
return sub_obj
def update_plan_quantity(self, quantity, charge_immediately=False):
self.subscribe(
plan=plan_from_stripe_id(
self.stripe_customer.subscription.plan.id
),
quantity=quantity,
charge_immediately=charge_immediately
)
def subscribe(self, plan, quantity=None, trial_days=None,
charge_immediately=True, token=None, coupon=None):
if quantity is None:
if PLAN_QUANTITY_CALLBACK is not None:
quantity = PLAN_QUANTITY_CALLBACK(self)
else:
quantity = 1
cu = self.stripe_customer
subscription_params = {}
if trial_days:
subscription_params["trial_end"] = \
datetime.datetime.utcnow() + datetime.timedelta(days=trial_days)
if token:
subscription_params["card"] = token
subscription_params["plan"] = PAYMENTS_PLANS[plan]["stripe_plan_id"]
subscription_params["quantity"] = quantity
subscription_params["coupon"] = coupon
resp = cu.update_subscription(**subscription_params)
if token:
# Refetch the stripe customer so we have the updated card info
cu = self.stripe_customer
self.save_card(cu)
self.sync_current_subscription(cu)
if charge_immediately:
self.send_invoice()
subscription_made.send(sender=self, plan=plan, stripe_response=resp)
return resp
def charge(self, amount, currency="usd", description=None,
send_receipt=True, application_fee=None,
stripe_connect_user=None):
"""
This method expects `amount` and 'application_fee' to be a Decimal type representing a
dollar amount. It will be converted to cents so any decimals beyond
two will be ignored.
"""
if not isinstance(amount, decimal.Decimal) or (not application_fee is None and not isinstance(application_fee, decimal.Decimal)):
raise ValueError(
"You must supply a decimal value representing dollars for amount and for application_fee (if supplied)."
)
charge_args = {
'amount': int(amount * 100),
'currency': currency,
'description': description,
}
if stripe_connect_user and isinstance(stripe_connect_user, ConnectUser):
charge_args['card'] = stripe.Token.create(customer=self.stripe_id, api_key=stripe_connect_user.stripe_access_token)
charge_args['api_key'] = stripe_connect_user.stripe_access_token
else:
charge_args['customer'] = self.stripe_id
if application_fee:
charge_args['application_fee'] = int(application_fee * 100)
resp = stripe.Charge.create(**charge_args)
obj = self.record_charge(resp["id"], stripe_connect_user)
if send_receipt:
obj.send_receipt()
return obj
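# Illustrative call (hypothetical customer instance `cus`):
# cus.charge(decimal.Decimal("19.99"), description="Monthly fee")
# int(amount * 100) above truncates anything past two decimal places,
# so Decimal("19.999") is billed as 1999 cents.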
def record_charge(self, charge_id, stripe_connect_user=None):
if stripe_connect_user and isinstance(stripe_connect_user, ConnectUser):
data = stripe.Charge.retrieve(charge_id, api_key=stripe_connect_user.stripe_access_token)
else:
data = stripe.Charge.retrieve(charge_id)
return Charge.sync_from_stripe_data(data)
class ConnectUser(models.Model):
"""
A user in your system who you may be routing payments to through "Stripe Connect"
"""
user = models.OneToOneField(
getattr(settings, "AUTH_USER_MODEL", "auth.User"),
null=True
)
# when a webhook is received for an action related to a ConnectUser, a 'user_id' will be provided
# This is the same as an account id
account_id = models.CharField(max_length=100)
stripe_access_token = models.CharField(max_length=100)
stripe_publishable_key = models.CharField(max_length=100)
@staticmethod
def account_id_lookup(stripe_access_token):
data = stripe.Account.retrieve(api_key=stripe_access_token)
return data.get('id', None)
def __unicode__(self):
return unicode(self.user)
class CurrentSubscription(models.Model):
customer = models.OneToOneField(
Customer,
related_name="current_subscription",
null=True
)
plan = models.CharField(max_length=100)
quantity = models.IntegerField()
start = models.DateTimeField()
# trialing, active, past_due, canceled, or unpaid
status = models.CharField(max_length=25)
cancel_at_period_end = models.BooleanField(default=False)
canceled_at = models.DateTimeField(blank=True, null=True)
current_period_end = models.DateTimeField(blank=True, null=True)
current_period_start = models.DateTimeField(blank=True, null=True)
ended_at = models.DateTimeField(blank=True, null=True)
trial_end = models.DateTimeField(blank=True, null=True)
trial_start = models.DateTimeField(blank=True, null=True)
amount = models.DecimalField(decimal_places=2, max_digits=7)
created_at = models.DateTimeField(default=timezone.now)
@property
def total_amount(self):
return self.amount * self.quantity
def plan_display(self):
return PAYMENTS_PLANS[self.plan]["name"]
def status_display(self):
return self.status.replace("_", " ").title()
def is_period_current(self):
return self.current_period_end > timezone.now()
def is_status_current(self):
return self.status in ["trialing", "active"]
def is_valid(self):
if not self.is_status_current():
return False
if self.cancel_at_period_end and not self.is_period_current():
return False
return True
def delete(self, using=None): # pylint: disable=E1002
"""
Set values to None while deleting the object so that any lingering
references will not show previous values (such as when an Event
signal is triggered after a subscription has been deleted)
"""
super(CurrentSubscription, self).delete(using=using)
self.plan = None
self.status = None
self.quantity = 0
self.amount = 0
class Invoice(models.Model):
stripe_id = models.CharField(max_length=255)
customer = models.ForeignKey(Customer, related_name="invoices")
attempted = models.NullBooleanField()
attempts = models.PositiveIntegerField(null=True)
closed = models.BooleanField(default=False)
paid = models.BooleanField(default=False)
period_end = models.DateTimeField()
period_start = models.DateTimeField()
subtotal = models.DecimalField(decimal_places=2, max_digits=7)
total = models.DecimalField(decimal_places=2, max_digits=7)
date = models.DateTimeField()
charge = models.CharField(max_length=50, blank=True)
created_at = models.DateTimeField(default=timezone.now)
stripe_connect = models.ForeignKey(ConnectUser, null=True)
class Meta: # pylint: disable=E0012,C1001
ordering = ["-date"]
def retry(self):
if not self.paid and not self.closed:
inv = stripe.Invoice.retrieve(self.stripe_id)
inv.pay()
return True
return False
def status(self):
if self.paid:
return "Paid"
return "Open"
@classmethod
def sync_from_stripe_data(cls, stripe_invoice, send_receipt=True, stripe_connect=None):
c = Customer.objects.get(stripe_id=stripe_invoice["customer"])
period_end = convert_tstamp(stripe_invoice, "period_end")
period_start = convert_tstamp(stripe_invoice, "period_start")
date = convert_tstamp(stripe_invoice, "date")
invoice, created = cls.objects.get_or_create(
stripe_id=stripe_invoice["id"],
defaults=dict(
customer=c,
attempted=stripe_invoice["attempted"],
attempts=stripe_invoice["attempt_count"],
closed=stripe_invoice["closed"],
paid=stripe_invoice["paid"],
period_end=period_end,
period_start=period_start,
subtotal=stripe_invoice["subtotal"] / decimal.Decimal("100"),
total=stripe_invoice["total"] / decimal.Decimal("100"),
date=date,
charge=stripe_invoice.get("charge") or "",
stripe_connect=stripe_connect
)
)
if not created:
# pylint: disable=C0301
invoice.attempted = stripe_invoice["attempted"]
invoice.attempts = stripe_invoice["attempt_count"]
invoice.closed = stripe_invoice["closed"]
invoice.paid = stripe_invoice["paid"]
invoice.period_end = period_end
invoice.period_start = period_start
invoice.subtotal = stripe_invoice["subtotal"] / decimal.Decimal("100")
invoice.total = stripe_invoice["total"] / decimal.Decimal("100")
invoice.date = date
invoice.charge = stripe_invoice.get("charge") or ""
invoice.stripe_connect = stripe_connect
invoice.save()
for item in stripe_invoice["lines"].get("data", []):
period_end = convert_tstamp(item["period"], "end")
period_start = convert_tstamp(item["period"], "start")
if item.get("plan"):
plan = plan_from_stripe_id(item["plan"]["id"])
else:
plan = ""
inv_item, inv_item_created = invoice.items.get_or_create(
stripe_id=item["id"],
defaults=dict(
amount=(item["amount"] / decimal.Decimal("100")),
currency=item["currency"],
proration=item["proration"],
description=item.get("description") or "",
line_type=item["type"],
plan=plan,
period_start=period_start,
period_end=period_end,
quantity=item.get("quantity")
)
)
if not inv_item_created:
inv_item.amount = (item["amount"] / decimal.Decimal("100"))
inv_item.currency = item["currency"]
inv_item.proration = item["proration"]
inv_item.description = item.get("description") or ""
inv_item.line_type = item["type"]
inv_item.plan = plan
inv_item.period_start = period_start
inv_item.period_end = period_end
inv_item.quantity = item.get("quantity")
inv_item.save()
if stripe_invoice.get("charge"):
obj = c.record_charge(stripe_invoice["charge"])
obj.invoice = invoice
obj.save()
if send_receipt:
obj.send_receipt()
return invoice
@classmethod
def handle_event(cls, event, send_receipt=SEND_EMAIL_RECEIPTS):
valid_events = ["invoice.payment_failed", "invoice.payment_succeeded"]
if event.kind in valid_events:
invoice_data = event.message["data"]["object"]
stripe_invoice = stripe.Invoice.retrieve(invoice_data["id"])
cls.sync_from_stripe_data(stripe_invoice, send_receipt=send_receipt, stripe_connect=event.stripe_connect)
class InvoiceItem(models.Model):
stripe_id = models.CharField(max_length=255)
created_at = models.DateTimeField(default=timezone.now)
invoice = models.ForeignKey(Invoice, related_name="items")
amount = models.DecimalField(decimal_places=2, max_digits=7)
currency = models.CharField(max_length=10)
period_start = models.DateTimeField()
period_end = models.DateTimeField()
proration = models.BooleanField(default=False)
line_type = models.CharField(max_length=50)
description = models.CharField(max_length=200, blank=True)
plan = models.CharField(max_length=100, blank=True)
quantity = models.IntegerField(null=True)
def plan_display(self):
return PAYMENTS_PLANS[self.plan]["name"]
class Charge(StripeObject):
customer = models.ForeignKey(Customer, related_name="charges", null=True)
invoice = models.ForeignKey(Invoice, null=True, related_name="charges")
card_last_4 = models.CharField(max_length=4, blank=True)
card_kind = models.CharField(max_length=50, blank=True)
amount = models.DecimalField(decimal_places=2, max_digits=7, null=True)
amount_refunded = models.DecimalField(
decimal_places=2,
max_digits=7,
null=True
)
description = models.TextField(blank=True)
paid = models.NullBooleanField(null=True)
disputed = models.NullBooleanField(null=True)
refunded = models.NullBooleanField(null=True)
fee = models.DecimalField(decimal_places=2, max_digits=7, null=True)
receipt_sent = models.BooleanField(default=False)
charge_created = models.DateTimeField(null=True, blank=True)
stripe_connect = models.ForeignKey(ConnectUser, null=True)
objects = ChargeManager()
def calculate_refund_amount(self, amount=None):
eligible_to_refund = self.amount - (self.amount_refunded or 0)
if amount:
amount_to_refund = min(eligible_to_refund, amount)
else:
amount_to_refund = eligible_to_refund
return int(amount_to_refund * 100)
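# Worked example: amount=10.00, amount_refunded=2.50 -> eligible 7.50;
# calculate_refund_amount() returns 750, calculate_refund_amount(amount=5) returns 500.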
def refund(self, amount=None):
# pylint: disable=E1121
charge_obj = stripe.Charge.retrieve(
self.stripe_id
).refund(
amount=self.calculate_refund_amount(amount=amount)
)
Charge.sync_from_stripe_data(charge_obj)
@classmethod
def sync_from_stripe_data(cls, data):
obj, _ = Charge.objects.get_or_create(
stripe_id=data["id"]
)
customer_id = data.get("customer", None);
customer = Customer.objects.get(stripe_id=customer_id) if customer_id else None
obj.customer = customer
invoice_id = data.get("invoice", None)
if Invoice.objects.filter(stripe_id=invoice_id).exists():
obj.invoice = obj.customer.invoices.get(stripe_id=invoice_id)
obj.card_last_4 = data["card"]["last4"]
obj.card_kind = data["card"]["type"]
obj.amount = (data["amount"] / decimal.Decimal("100"))
obj.paid = data["paid"]
obj.refunded = data["refunded"]
obj.fee = (data["fee"] / decimal.Decimal("100"))
obj.disputed = data["dispute"] is not None
obj.charge_created = convert_tstamp(data, "created")
if data.get("description"):
obj.description = data["description"]
if data.get("amount_refunded"):
# pylint: disable=C0301
obj.amount_refunded = (data["amount_refunded"] / decimal.Decimal("100"))
if data["refunded"]:
obj.amount_refunded = (data["amount"] / decimal.Decimal("100"))
user_id = data.get("user_id", None)
if user_id and ConnectUser.objects.filter(account_id=user_id).exists():
obj.stripe_connect = ConnectUser.objects.get(account_id=user_id)
obj.save()
return obj
def send_receipt(self):
if not self.receipt_sent and self.customer:
site = Site.objects.get_current()
protocol = getattr(settings, "DEFAULT_HTTP_PROTOCOL", "http")
ctx = {
"charge": self,
"site": site,
"protocol": protocol,
}
subject = render_to_string("payments/email/subject.txt", ctx)
subject = subject.strip()
message = render_to_string("payments/email/body.txt", ctx)
num_sent = EmailMessage(
subject,
message,
to=[self.customer.user.email],
from_email=INVOICE_FROM_EMAIL
).send()
self.receipt_sent = num_sent > 0
self.save()
@classmethod
def create(cls, card, amount, currency="usd", description=None, application_fee=None, stripe_connect_user=None):
"""
This method expects `amount` and 'application_fee' to be a Decimal type representing a
dollar amount. It will be converted to cents so any decimals beyond
two will be ignored.
"""
if not isinstance(amount, decimal.Decimal) or (not application_fee is None and not isinstance(application_fee, decimal.Decimal)):
raise ValueError(
"You must supply a decimal value representing dollars for amount and for application_fee (if supplied)."
)
charge_args = {
'amount': int(amount * 100),
'currency': currency,
'description': description,
'card': card,
}
if stripe_connect_user and isinstance(stripe_connect_user, ConnectUser):
charge_args['api_key'] = stripe_connect_user.stripe_access_token
elif stripe_connect_user:
charge_args['api_key'] = stripe_connect_user
if application_fee:
charge_args['application_fee'] = int(application_fee * 100)
resp = stripe.Charge.create(**charge_args)
return Charge.sync_from_stripe_data(resp)
| 37.654434 | 137 | 0.609491 | 4,048 | 36,939 | 5.358202 | 0.096591 | 0.016966 | 0.018257 | 0.024343 | 0.383264 | 0.325496 | 0.278377 | 0.210742 | 0.187275 | 0.141586 | 0 | 0.0084 | 0.294242 | 36,939 | 980 | 138 | 37.692857 | 0.82359 | 0.021035 | 0 | 0.262755 | 0 | 0 | 0.061557 | 0.004071 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.005102 | 0.020408 | null | null | 0.010204 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
603be24384736b5da4440432a56324e5b621091a | 260 | py | Python | examples/test_yield_8.py | MateuszG/django_auth | 4cda699c1b6516ffaa26329f545a674a7c849a16 | [
"MIT"
] | 2 | 2015-01-12T09:43:59.000Z | 2015-01-12T10:39:31.000Z | examples/test_yield_8.py | MateuszG/django_auth | 4cda699c1b6516ffaa26329f545a674a7c849a16 | [
"MIT"
] | null | null | null | examples/test_yield_8.py | MateuszG/django_auth | 4cda699c1b6516ffaa26329f545a674a7c849a16 | [
"MIT"
] | null | null | null | import pytest
@pytest.yield_fixture
def passwd():
print ("\nsetup before yield")
f = open("/etc/passwd")
yield f.readlines()
print ("teardown after yield")
f.close()
def test_has_lines(passwd):
print ("test called")
assert passwd
| 18.571429 | 34 | 0.653846 | 34 | 260 | 4.911765 | 0.617647 | 0.107784 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.215385 | 260 | 13 | 35 | 20 | 0.818627 | 0 | 0 | 0 | 0 | 0 | 0.238462 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 1 | 0.181818 | false | 0.363636 | 0.090909 | 0 | 0.272727 | 0.272727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
603d47f5b923ece6ffdc97d38998dad6e0f866c8 | 2,022 | py | Python | examples/api-samples/inc_samples/convert_callback.py | groupdocs-legacy-sdk/python | 80e5ef5a9a14ac4a7815c6cf933b5b2997381455 | [
"Apache-2.0"
] | null | null | null | examples/api-samples/inc_samples/convert_callback.py | groupdocs-legacy-sdk/python | 80e5ef5a9a14ac4a7815c6cf933b5b2997381455 | [
"Apache-2.0"
] | null | null | null | examples/api-samples/inc_samples/convert_callback.py | groupdocs-legacy-sdk/python | 80e5ef5a9a14ac4a7815c6cf933b5b2997381455 | [
"Apache-2.0"
] | null | null | null | import os
import json
import shutil
import time
from pyramid.renderers import render_to_response
from pyramid.response import Response
from groupdocs.ApiClient import ApiClient
from groupdocs.AsyncApi import AsyncApi
from groupdocs.StorageApi import StorageApi
from groupdocs.GroupDocsRequestSigner import GroupDocsRequestSigner
# Check that a value is neither None nor empty
def IsNotNull(value):
return value is not None and len(value) > 0
def convert_callback(request):
currentDir = os.path.dirname(os.path.realpath(__file__))
if os.path.exists(currentDir + '/../user_info.txt'):
f = open(currentDir + '/../user_info.txt')
lines = f.readlines()
f.close()
clientId = lines[0].replace("\r\n", "")
privateKey = lines[1]
if IsNotNull(request.json_body):
jsonPostData = request.json_body
jobId = jsonPostData['SourceId']
# Create signer object
signer = GroupDocsRequestSigner(privateKey)
# Create apiClient object
apiClient = ApiClient(signer)
# Create AsyncApi object
async_api = AsyncApi(apiClient)
# Create Storage object
api = StorageApi(apiClient)
if jobId != '':
time.sleep(5)
# Make request to api for get document info by job id
jobs = async_api.GetJobDocuments(clientId, jobId)
if jobs.status == 'Ok':
# Get file guid
resultGuid = jobs.result.inputs[0].outputs[0].guid
name = jobs.result.inputs[0].outputs[0].name
currentDir = os.path.dirname(os.path.realpath(__file__))
downloadFolder = currentDir + '/../downloads/'
if not os.path.isdir(downloadFolder):
os.makedirs(downloadFolder)
# Download the file
fs = api.GetFile(clientId, resultGuid)
if fs:
filePath = downloadFolder + name
with open(filePath, 'wb') as fp:
shutil.copyfileobj(fs.inputStream, fp)
| 34.271186 | 68 | 0.62908 | 220 | 2,022 | 5.713636 | 0.440909 | 0.02864 | 0.025457 | 0.036595 | 0.105012 | 0.105012 | 0.065235 | 0.065235 | 0 | 0 | 0 | 0.005491 | 0.279426 | 2,022 | 58 | 69 | 34.862069 | 0.857241 | 0.097428 | 0 | 0.047619 | 0 | 0 | 0.035242 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.238095 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60416481c613049aa881c1d91f118e1ecab9fdbf | 1,194 | py | Python | create_augmented_versions.py | jakobabesser/piano_aug | 37f78c77465749c80d7aa91d9e804b89024eb278 | [
"MIT"
] | null | null | null | create_augmented_versions.py | jakobabesser/piano_aug | 37f78c77465749c80d7aa91d9e804b89024eb278 | [
"MIT"
] | null | null | null | create_augmented_versions.py | jakobabesser/piano_aug | 37f78c77465749c80d7aa91d9e804b89024eb278 | [
"MIT"
] | null | null | null | from pedalboard import Reverb, Compressor, Gain, LowpassFilter, Pedalboard
import soundfile as sf
if __name__ == '__main__':
# replace with the path of the unprocessed piano file if necessary
fn_wav_source = 'live_grand_piano.wav'
# augmentation settings using Pedalboard library
settings = {'rev-': [Reverb(room_size=.4)],
'rev+': [Reverb(room_size=.8)],
'comp+': [Compressor(threshold_db=-15, ratio=20)],
'comp-': [Compressor(threshold_db=-10, ratio=10)],
'gain+': [Gain(gain_db=15)], # clipping
'gain-': [Gain(gain_db=5)],
'lpf-': [LowpassFilter(cutoff_frequency_hz=50)],
'lpf+': [LowpassFilter(cutoff_frequency_hz=250)]}
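# Each entry above is a single-effect chain; a Pedalboard can also hold
# several effects applied in order, e.g. (illustrative):
# settings['comp+rev'] = [Compressor(threshold_db=-15, ratio=20), Reverb(room_size=.8)]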
# create augmented versions
for s in settings.keys():
# load unprocessed piano recording
audio, sample_rate = sf.read(fn_wav_source)
# create Pedalboard object
board = Pedalboard(settings[s])
# create augmented audio
effected = board(audio, sample_rate)
# save it
fn_target = fn_wav_source.replace('.wav', f'_{s}.wav')
sf.write(fn_target, effected, sample_rate)
| 34.114286 | 74 | 0.61139 | 138 | 1,194 | 5.057971 | 0.514493 | 0.045845 | 0.047278 | 0.048711 | 0.094556 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020525 | 0.265494 | 1,194 | 34 | 75 | 35.117647 | 0.775371 | 0.187605 | 0 | 0 | 0 | 0 | 0.079084 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.166667 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6043f0f0c5013421d3026505d50e50aa5fb67097 | 9,333 | py | Python | src/python/Vector2_TEST.py | clalancette/ign-math | 84eb1bfe470d00d335c048f102b56c49a15b56be | [
"ECL-2.0",
"Apache-2.0"
] | 43 | 2019-08-21T20:50:05.000Z | 2022-03-27T11:48:25.000Z | src/python/Vector2_TEST.py | clalancette/ign-math | 84eb1bfe470d00d335c048f102b56c49a15b56be | [
"ECL-2.0",
"Apache-2.0"
] | 277 | 2020-04-16T23:38:50.000Z | 2022-03-31T11:11:58.000Z | src/python/Vector2_TEST.py | clalancette/ign-math | 84eb1bfe470d00d335c048f102b56c49a15b56be | [
"ECL-2.0",
"Apache-2.0"
] | 48 | 2020-04-15T21:15:43.000Z | 2022-03-14T19:29:04.000Z | # Copyright (C) 2021 Open Source Robotics Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest
import math
from ignition.math import Vector2d
from ignition.math import Vector2f
class TestVector2(unittest.TestCase):
def test_construction(self):
v = Vector2d()
self.assertAlmostEqual(0.0, v.x())
self.assertAlmostEqual(0.0, v.y())
vec = Vector2d(1, 0)
self.assertEqual(vec.x(), 1)
self.assertEqual(vec.y(), 0)
vec2 = Vector2d(vec)
self.assertEqual(vec2, vec)
# Copy
vec3 = vec
self.assertEqual(vec3, vec)
# Inequality
vec4 = Vector2d()
self.assertNotEqual(vec, vec4)
def test_vector2(self):
v = Vector2d(1, 2)
# Distance
self.assertAlmostEqual(2.236, v.distance(Vector2d()), delta=1e-2)
# Normalize
v.normalize()
self.assertTrue(v.equal(Vector2d(0.447214, 0.894427), 1e-4))
# Set
v.set(4, 5)
self.assertTrue(v.equal(Vector2d(4, 5), 1e-4))
# Abs
v.set(-1, -2)
self.assertTrue(v.abs().equal(Vector2d(1, 2), 1e-4))
# __eq__
v = Vector2d(6, 7)
self.assertTrue(v.equal(Vector2d(6, 7), 1e-4))
# __add__
v = v + Vector2d(1, 2)
self.assertTrue(v.equal(Vector2d(7, 9), 1e-4))
v += Vector2d(5, 6)
self.assertTrue(v.equal(Vector2d(12, 15), 1e-4))
# __sub__
v = v - Vector2d(2, 4)
self.assertTrue(v.equal(Vector2d(10, 11), 1e-4))
v.set(2, 4)
v -= Vector2d(1, 6)
self.assertTrue(v.equal(Vector2d(1, -2), 1e-4))
# __truediv__
v.set(10, 6)
v = v / Vector2d(2, 3)
self.assertTrue(v.equal(Vector2d(5, 2), 1e-4))
v.set(10, 6)
v /= Vector2d(2, 3)
self.assertTrue(v.equal(Vector2d(5, 2), 1e-4))
# __truediv__ int
v.set(10, 6)
v = v / 2
self.assertTrue(v.equal(Vector2d(5, 3), 1e-4))
v.set(10, 6)
v /= 2
self.assertTrue(v.equal(Vector2d(5, 3), 1e-4))
# __mul__
v.set(10, 6)
v = v * Vector2d(2, 4)
self.assertTrue(v.equal(Vector2d(20, 24), 1e-4))
v.set(10, 6)
v *= Vector2d(2, 4)
self.assertTrue(v.equal(Vector2d(20, 24), 1e-4))
# __mul__ int
v.set(10, 6)
v = v * 2
self.assertTrue(v.equal(Vector2d(20, 12), 1e-4))
v.set(10, 6)
v *= 2
self.assertTrue(v.equal(Vector2d(20, 12), 1e-4))
# is_finite
self.assertTrue(v.is_finite())
def test_max(self):
vec1 = Vector2d(0.1, 0.2)
vec2 = Vector2d(0.3, 0.5)
vec3 = Vector2d(0.4, 0.2)
self.assertAlmostEqual(vec1.max(), 0.2)
self.assertAlmostEqual(vec3.max(), 0.4)
vec1.max(vec2)
self.assertAlmostEqual(vec1, Vector2d(0.3, 0.5))
vec1.max(vec3)
self.assertAlmostEqual(vec1, Vector2d(0.4, 0.5))
def test_min(self):
vec1 = Vector2d(0.3, 0.5)
vec2 = Vector2d(0.1, 0.2)
vec3 = Vector2d(0.05, 0.1)
self.assertAlmostEqual(vec1.min(), 0.3)
self.assertAlmostEqual(vec3.min(), 0.05)
vec1.min(vec2)
self.assertAlmostEqual(vec1, Vector2d(0.1, 0.2))
vec1.min(vec3)
self.assertAlmostEqual(vec1, Vector2d(0.05, 0.1))
def test_equal_tolerance(self):
# Test Equal function with specified tolerance
self.assertFalse(Vector2d.ZERO.equal(Vector2d.ONE, 1e-6))
self.assertFalse(Vector2d.ZERO.equal(Vector2d.ONE, 1e-3))
self.assertFalse(Vector2d.ZERO.equal(Vector2d.ONE, 1e-1))
self.assertTrue(Vector2d.ZERO.equal(Vector2d.ONE, 1))
self.assertTrue(Vector2d.ZERO.equal(Vector2d.ONE, 1.1))
def test_dot(self):
v = Vector2d(1, 2)
self.assertAlmostEqual(v.dot(Vector2d(3, 4)), 11.0)
self.assertAlmostEqual(v.dot(Vector2d(0, 0)), 0.0)
self.assertAlmostEqual(v.dot(Vector2d(1, 0)), 1.0)
self.assertAlmostEqual(v.dot(Vector2d(0, 1)), 2.0)
def test_correct(self):
vec1 = Vector2d(0, float("nan"))
vec2 = Vector2d(float("inf"), -1)
vec3 = Vector2d(10, -2)
vec1.correct()
vec2.correct()
vec3.correct()
self.assertAlmostEqual(vec1, Vector2d(0, 0))
self.assertAlmostEqual(vec2, Vector2d(0, -1))
self.assertAlmostEqual(vec3, Vector2d(10, -2))
def test_abs_dot(self):
v = Vector2d(1, -2)
self.assertAlmostEqual(v.abs_dot(Vector2d(3, 4)), 11.0)
self.assertAlmostEqual(v.abs_dot(Vector2d(0, 0)), 0.0)
self.assertAlmostEqual(v.abs_dot(Vector2d(1, 0)), 1.0)
self.assertAlmostEqual(v.abs_dot(Vector2d(0, 1)), 2.0)
def test_add(self):
vec1 = Vector2d(0.1, 0.2)
vec2 = Vector2d(1.1, 2.2)
vec3 = vec1
vec3 += vec2
self.assertAlmostEqual(vec1 + vec2, Vector2d(1.2, 2.4))
self.assertAlmostEqual(vec3, Vector2d(1.2, 2.4))
# Add zero
# Scalar right
self.assertEqual(vec1 + 0, vec1)
# Vector left and right
self.assertAlmostEqual(Vector2d.ZERO + vec1, vec1)
self.assertAlmostEqual(vec1 + Vector2d.ZERO, vec1)
# Addition assignment
vec4 = Vector2d(vec1)
vec4 += 0
self.assertEqual(vec4, vec1)
vec4 += Vector2d.ZERO
self.assertAlmostEqual(vec4, vec1)
# Add non-trivial scalar values left and right
self.assertEqual(vec1 + 2.5, Vector2d(2.6, 2.7))
vec1 = vec4
vec4 += 2.5
self.assertEqual(vec4, Vector2d(2.6, 2.7))
def test_sub(self):
vec1 = Vector2d(0.1, 0.2)
vec2 = Vector2d(1.1, 2.2)
vec3 = vec2
vec3 -= vec1
self.assertAlmostEqual(vec2 - vec1, Vector2d(1.0, 2.0))
self.assertAlmostEqual(vec3, Vector2d(1.0, 2.0))
# Subtraction with zeros
# Scalar right
self.assertEqual(vec1 - 0, vec1)
# Vector left and right
self.assertAlmostEqual(Vector2d.ZERO - vec1, -vec1)
self.assertAlmostEqual(vec1 - Vector2d.ZERO, vec1)
# Subtraction assignment
vec4 = Vector2d(vec1)
vec4 -= 0
self.assertEqual(vec4, vec1)
vec4 -= Vector2d.ZERO
self.assertAlmostEqual(vec4, vec1)
# Subtract non-trivial scalar values left and right
self.assertEqual(vec1 - 2.5, -Vector2d(2.4, 2.3))
vec4 = vec1
vec4 -= 2.5
self.assertEqual(vec4, -Vector2d(2.4, 2.3))
def test_multiply(self):
v = Vector2d(0.1, -4.2)
vec2 = v * 2.0
self.assertEqual(vec2, Vector2d(0.2, -8.4))
vec2 *= 4.0
self.assertEqual(vec2, Vector2d(0.8, -33.6))
# Multiply by zero
# Scalar right
self.assertEqual(v * 0, Vector2d.ZERO)
# Element-wise vector multiplication
self.assertEqual(v * Vector2d.ZERO, Vector2d.ZERO)
# Multiply by one
# Scalar right
self.assertEqual(v * 1, v)
# Element-wise vector multiplication
self.assertEqual(v * Vector2d.ONE, v)
# Multiply by non-trivial scalar value
scalar = 2.5
expect = Vector2d(0.25, -10.5)
self.assertEqual(v * scalar, expect)
# Multiply by itself element-wise
v.set(0.1, 0.5)
self.assertAlmostEqual(v * v, Vector2d(0.01, 0.25))
def test_length(self):
# Zero vector
self.assertAlmostEqual(Vector2d.ZERO.length(), 0.0)
self.assertAlmostEqual(Vector2d.ZERO.squared_length(), 0.0)
# One vector
self.assertAlmostEqual(Vector2d.ONE.length(),
math.sqrt(2), delta=1e-10)
self.assertAlmostEqual(Vector2d.ONE.squared_length(), 2.0)
# Arbitrary vector
v = Vector2d(0.1, -4.2)
self.assertAlmostEqual(v.length(), 4.20119030752, delta=1e-10)
self.assertAlmostEqual(v.squared_length(), 17.65)
# Integer vector
v = Vector2d(3, 4)
self.assertAlmostEqual(v.length(), 5)
self.assertAlmostEqual(v.squared_length(), 25)
def test_nan(self):
nanVec = Vector2d.NAN
self.assertFalse(nanVec.is_finite())
self.assertTrue(math.isnan(nanVec.x()))
self.assertTrue(math.isnan(nanVec.y()))
nanVec.correct()
self.assertEqual(Vector2d.ZERO, nanVec)
self.assertTrue(nanVec.is_finite())
nanVecF = Vector2f.NAN
self.assertFalse(nanVecF.is_finite())
self.assertTrue(math.isnan(nanVecF.x()))
self.assertTrue(math.isnan(nanVecF.y()))
nanVecF.correct()
self.assertEqual(Vector2f.ZERO, nanVecF)
self.assertTrue(nanVecF.is_finite())
if __name__ == '__main__':
unittest.main()
| 29.165625 | 74 | 0.585985 | 1,241 | 9,333 | 4.357776 | 0.142627 | 0.159209 | 0.047152 | 0.055473 | 0.536982 | 0.426775 | 0.346709 | 0.346154 | 0.276257 | 0.195451 | 0 | 0.09168 | 0.280081 | 9,333 | 319 | 75 | 29.257053 | 0.713201 | 0.131898 | 0 | 0.160622 | 0 | 0 | 0.001739 | 0 | 0 | 0 | 0 | 0 | 0.481865 | 1 | 0.067358 | false | 0 | 0.020725 | 0 | 0.093264 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
604b36210d2f64d1a79dd2e280534e5bf39ec7cb | 4,737 | py | Python | facerec-master/py/facerec/distance.py | ArianeFire/HaniCam | 8a940486a613d680a0b556209a596cdf3eb71f53 | [
"MIT"
] | 776 | 2015-01-01T11:34:42.000Z | 2022-02-26T10:25:51.000Z | facerec-master/py/facerec/distance.py | ArianeFire/HaniCam | 8a940486a613d680a0b556209a596cdf3eb71f53 | [
"MIT"
] | 43 | 2015-03-17T07:48:38.000Z | 2019-08-21T05:16:36.000Z | facerec-master/py/facerec/distance.py | ArianeFire/HaniCam | 8a940486a613d680a0b556209a596cdf3eb71f53 | [
"MIT"
] | 479 | 2015-01-01T12:34:38.000Z | 2022-02-28T23:57:26.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (c) Philipp Wagner. All rights reserved.
# Licensed under the BSD license. See LICENSE file in the project root for full license information.
import numpy as np
class AbstractDistance(object):
def __init__(self, name):
self._name = name
def __call__(self,p,q):
raise NotImplementedError("Every AbstractDistance must implement the __call__ method.")
@property
def name(self):
return self._name
def __repr__(self):
return self._name
class EuclideanDistance(AbstractDistance):
def __init__(self):
AbstractDistance.__init__(self,"EuclideanDistance")
def __call__(self, p, q):
p = np.asarray(p).flatten()
q = np.asarray(q).flatten()
return np.sqrt(np.sum(np.power((p-q),2)))
class CosineDistance(AbstractDistance):
"""
Negated Mahalanobis Cosine Distance.
Literature:
"Studies on sensitivity of face recognition performance to eye location accuracy.". Master Thesis (2004), Wang
"""
def __init__(self):
AbstractDistance.__init__(self,"CosineDistance")
def __call__(self, p, q):
p = np.asarray(p).flatten()
q = np.asarray(q).flatten()
return -np.dot(p.T,q) / (np.sqrt(np.dot(p,p.T)*np.dot(q,q.T)))
class NormalizedCorrelation(AbstractDistance):
"""
Calculates the NormalizedCorrelation Coefficient for two vectors.
Literature:
"Multi-scale Local Binary Pattern Histogram for Face Recognition". PhD (2008). Chi Ho Chan, University Of Surrey.
"""
def __init__(self):
AbstractDistance.__init__(self,"NormalizedCorrelation")
def __call__(self, p, q):
p = np.asarray(p).flatten()
q = np.asarray(q).flatten()
pmu = p.mean()
qmu = q.mean()
pm = p - pmu
qm = q - qmu
return 1.0 - (np.dot(pm, qm) / (np.sqrt(np.dot(pm, pm)) * np.sqrt(np.dot(qm, qm))))
class ChiSquareDistance(AbstractDistance):
"""
Calculates the Chi-Square distance between two histograms.
"""
def __init__(self):
AbstractDistance.__init__(self,"ChiSquareDistance")
def __call__(self, p, q):
p = np.asarray(p).flatten()
q = np.asarray(q).flatten()
bin_dists = (p-q)**2 / (p+q+np.finfo('float').eps)
return np.sum(bin_dists)
class HistogramIntersection(AbstractDistance):
def __init__(self):
AbstractDistance.__init__(self,"HistogramIntersection")
def __call__(self, p, q):
p = np.asarray(p).flatten()
q = np.asarray(q).flatten()
return np.sum(np.minimum(p,q))
class BinRatioDistance(AbstractDistance):
"""
Calculates the Bin Ratio Dissimilarity.
Literature:
"Use Bin-Ratio Information for Category and Scene Classification" (2010), Xie et.al.
"""
def __init__(self):
AbstractDistance.__init__(self,"BinRatioDistance")
def __call__(self, p, q):
p = np.asarray(p).flatten()
q = np.asarray(q).flatten()
a = np.abs(1-np.dot(p,q.T)) # use np.dot for the scalar product; * is element-wise
b = ((p-q)**2 + 2*a*(p*q))/((p+q)**2+np.finfo('float').eps)
return np.abs(np.sum(b))
class L1BinRatioDistance(AbstractDistance):
"""
Calculates the L1-Bin Ratio Dissimilarity.
Literature:
"Use Bin-Ratio Information for Category and Scene Classification" (2010), Xie et.al.
"""
def __init__(self):
AbstractDistance.__init__(self,"L1-BinRatioDistance")
def __call__(self, p, q):
p = np.asarray(p, dtype=np.float64).flatten()
q = np.asarray(q, dtype=np.float64).flatten()
a = np.abs(1-np.dot(p,q.T)) # use np.dot for the scalar product; * is element-wise
b = ((p-q)**2 + 2*a*(p*q)) * abs(p-q) / ((p+q)**2+np.finfo('float').eps)
return np.abs(np.sum(b))
class ChiSquareBRD(AbstractDistance):
"""
Calculates the ChiSquare-Bin Ratio Dissimilarity.
Literature:
"Use Bin-Ratio Information for Category and Scene Classification" (2010), Xie et.al.
"""
def __init__(self):
AbstractDistance.__init__(self,"ChiSquare-BinRatioDistance")
def __call__(self, p, q):
p = np.asarray(p, dtype=np.float64).flatten()
q = np.asarray(q, dtype=np.float64).flatten()
a = np.abs(1-np.dot(p,q.T)) # use np.dot for the scalar product; * is element-wise
b = ((p-q)**2 + 2*a*(p*q)) * (p-q)**2 / ((p+q)**3+np.finfo('float').eps)
return np.abs(np.sum(b))
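# Illustrative usage:
# d = EuclideanDistance()
# d([0, 0], [3, 4]) # -> 5.0
# ChiSquareDistance()([1, 2], [1, 2]) # -> 0.0 for identical histograms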
| 33.595745 | 125 | 0.617479 | 617 | 4,737 | 4.551053 | 0.222042 | 0.019231 | 0.011752 | 0.038462 | 0.629274 | 0.622151 | 0.601496 | 0.565171 | 0.565171 | 0.565171 | 0 | 0.012538 | 0.242347 | 4,737 | 140 | 126 | 33.835714 | 0.769852 | 0.278657 | 0 | 0.525641 | 0 | 0 | 0.07103 | 0.021092 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25641 | false | 0 | 0.012821 | 0.025641 | 0.512821 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
604b9cab87abdc5ce52f2c470f0e9885781ed2dd | 7,162 | py | Python | pgyer_uploader.py | elina8013/android_demo | d8cef19d06a4f21f7cf2c277bbabba8cf10a8608 | [
"Apache-2.0"
] | 666 | 2015-03-18T02:09:34.000Z | 2021-08-25T06:24:27.000Z | pgyer_uploader.py | shanjiaxiang/android_demo | d1afa66c30ae5b3c09a39f4c36c61640615177bb | [
"Apache-2.0"
] | 7 | 2017-04-26T07:06:49.000Z | 2019-07-08T08:05:13.000Z | pgyer_uploader.py | shanjiaxiang/android_demo | d1afa66c30ae5b3c09a39f4c36c61640615177bb | [
"Apache-2.0"
] | 371 | 2015-03-18T02:09:33.000Z | 2021-09-10T02:41:05.000Z | #!/usr/bin/python
#coding=utf-8
import os
import requests
import time
import re
from datetime import datetime
import urllib2
import json
import mimetypes
import smtplib
from email.MIMEText import MIMEText
from email.MIMEMultipart import MIMEMultipart
# configuration for pgyer
USER_KEY = "f605b7c7826690f796078e3dd23a60d5"
API_KEY = "8bdd05df986d598f01456914e51fc889"
PGYER_UPLOAD_URL = "https://www.pgyer.com/apiv1/app/upload"
repo_path = 'C:/Users/Administrator/.jenkins/workspace/Demo/app'
repo_url = 'https://github.com/r17171709/iite_test'
ipa_path = "C:/Users/Administrator/.jenkins/workspace/Demo/app/build/outputs/apk/app-release.apk"
update_description = "版本更新测试"
def parseUploadResult(jsonResult):
print 'post response: %s' % jsonResult
resultCode = jsonResult['code']
send_Email(jsonResult)
if resultCode != 0:
print "Upload Fail!"
raise Exception("Reason: %s" % jsonResult['message'])
print "Upload Success"
appKey = jsonResult['data']['appKey']
appDownloadPageURL = "https://www.pgyer.com/%s" % appKey
print "appDownloadPage: %s" % appDownloadPageURL
return appDownloadPageURL
def uploadIpaToPgyer(ipaPath, updateDescription):
print "Begin to upload ipa to Pgyer: %s" % ipaPath
headers = {'enctype': 'multipart/form-data'}
payload = {
'uKey': USER_KEY,
'_api_key': API_KEY,
'publishRange': '2', # publish immediately
'isPublishToPublic': '2', # do not publish to the public plaza
'updateDescription': updateDescription # version update description
}
try_times = 0
while try_times < 5:
try:
print "uploading ... %s" % datetime.now()
ipa_file = {'file': open(ipaPath, 'rb')}
r = requests.post(PGYER_UPLOAD_URL,
headers = headers,
files = ipa_file,
data = payload
)
assert r.status_code == requests.codes.ok
result = r.json()
appDownloadPageURL = parseUploadResult(result)
return appDownloadPageURL
except requests.exceptions.ConnectionError:
print "requests.exceptions.ConnectionError occured!"
time.sleep(60)
print "try again ... %s" % datetime.now()
try_times += 1
except Exception as e:
print "Exception occured: %s" % str(e)
time.sleep(60)
print "try again ... %s" % datetime.now()
try_times += 1
if try_times >= 5:
raise Exception("Failed to upload ipa to Pgyer, retried 5 times.")
def parseQRCodeImageUrl(appDownloadPageURL):
try_times = 0
while try_times < 3:
try:
response = requests.get(appDownloadPageURL)
regex = '<img src=\"(.*?)\" style='
m = re.search(regex, response.content)
assert m is not None
appQRCodeURL = m.group(1)
print "appQRCodeURL: %s" % appQRCodeURL
return appQRCodeURL
except AssertionError:
try_times += 1
time.sleep(60)
print "Can not locate QRCode image. retry ... %s: %s" % (try_times, datetime.now())
if try_times >= 3:
raise Exception("Failed to locate QRCode image in download page, retried 3 times.")
def saveQRCodeImage(appDownloadPageURL, output_folder):
appQRCodeURL = parseQRCodeImageUrl(appDownloadPageURL)
response = requests.get(appQRCodeURL)
qr_image_file_path = os.path.join(output_folder, 'QRCode.png')
if response.status_code == 200:
with open(qr_image_file_path, 'wb') as f:
f.write(response.content)
print 'Save QRCode image to file: %s' % qr_image_file_path
def main():
appDownloadPageURL = uploadIpaToPgyer(ipa_path, update_description)
try:
output_folder = os.path.dirname(ipa_path)
saveQRCodeImage(appDownloadPageURL, output_folder)
except Exception as e:
print "Exception occured: %s" % str(e)
# Get the most recent git commit info
def getCommitInfo():
# Option 1: use a python git library (requires the current branch to exist on the remote)
# repo = Gittle(repo_path, origin_uri=repo_url)
# commitInfo = repo.commit_info(start=0, end=1)
# lastCommitInfo = commitInfo[0]
# Option 2: cd into the repo and print the last commit with `git log -1`
os.chdir(repo_path)
lastCommitInfo = run_cmd('git log -1')
return lastCommitInfo
# Send the notification email
def send_Email(json_result):
print '*******start to send mail****'
appName = json_result['data']['appName']
appKey = json_result['data']['appKey']
appVersion = json_result['data']['appVersion']
appBuildVersion = json_result['data']['appBuildVersion']
appShortcutUrl = json_result['data']['appShortcutUrl']
# mail recipients
mail_receiver = ['yyy@qq.com']
# configure host, user and password for your mail provider
mail_host = 'smtp.139.com'
mail_port = 465
mail_user = 'xxx@139.com'
mail_pwd = 'xxx'
mail_to = ','.join(mail_receiver)
msg = MIMEMultipart()
environsString = '<p><h3>Information about this build</h3><p>'
# environsString += '<p>ipa download URL : ' + 'wudizhi' + '<p>'
environsString += '<p>Pgyer install URL : ' + 'http://www.pgyer.com/' + str(appShortcutUrl) + '<p><p><p><p>'
# environsString += '<li><a href="itms-services://?action=download-manifest&url=https://ssl.pgyer.com/app/plist/' + str(appKey) + '"></a>Tap to install directly</li>'
environsString += '<p><h3>Information about the latest git commit</h3><p>'
# get the last git commit info
lastCommitInfo = getCommitInfo()
# committer
# committer = lastCommitInfo['committer']['raw']
# commit description
# description = lastCommitInfo['description']
environsString += '<p>' + '<font color="red">' + lastCommitInfo + '</font>' + '<p>'
# environsString += '<p>Description:' + '<font color="red">' + description + '</font>' + '<p>'
message = environsString
body = MIMEText(message, _subtype='html', _charset='utf-8')
msg["Accept-Language"]="zh-CN"
msg["Accept-Charset"]="ISO-8859-1,utf-8"
msg.attach(body)
msg['To'] = mail_to
msg['From'] = 'xxxx@139.com'
msg['Subject'] = 'Latest Android APP build'
try:
# Port 465 expects an implicit-SSL connection, so connect with
# SMTP_SSL on the configured port instead of SMTP() + starttls().
s = smtplib.SMTP_SSL(mail_host, mail_port)
# debug mode: print the SMTP session dialogue
s.set_debuglevel(1)
s.login(mail_user, mail_pwd)
s.sendmail(mail_user, mail_receiver, msg.as_string())
s.close()
print '*******mail send ok****'
except Exception as e:
print e
def run_cmd(cmd):
try:
import subprocess
except ImportError:
_, result_f, error_f = os.popen3(cmd)
else:
process = subprocess.Popen(cmd, shell = True,
stdout = subprocess.PIPE, stderr = subprocess.PIPE)
result_f, error_f = process.stdout, process.stderr
errors = error_f.read()
if errors: pass
result_str = result_f.read().strip()
if result_f : result_f.close()
if error_f : error_f.close()
return result_str
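# e.g. run_cmd('git log -1') returns the latest commit (author, date,
# message) as a single string, which getCommitInfo() embeds in the email.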
if __name__ == '__main__':
main()
| 34.76699 | 154 | 0.604021 | 794 | 7,162 | 5.324937 | 0.358942 | 0.018921 | 0.016556 | 0.011353 | 0.082781 | 0.074267 | 0.06386 | 0.06386 | 0.0421 | 0.0421 | 0 | 0.021069 | 0.271014 | 7,162 | 205 | 155 | 34.936585 | 0.788738 | 0.106116 | 0 | 0.133758 | 0 | 0.006369 | 0.210697 | 0.045705 | 0.012739 | 0 | 0 | 0 | 0.019108 | 0 | null | null | 0.006369 | 0.082803 | null | null | 0.10828 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
604ecb6f7cdc9275682b21b948b61c6eab42174d | 2,988 | py | Python | src/dispatch/incident_cost/views.py | vj-codes/dispatch | f9354781956380cac290be02fb987eb50ddc1a5d | [
"Apache-2.0"
] | 1 | 2021-06-16T17:02:35.000Z | 2021-06-16T17:02:35.000Z | src/dispatch/incident_cost/views.py | dilbwagsingh/dispatch | ca7c9730dea64e196c6653321552d570dfdad069 | [
"Apache-2.0"
] | 10 | 2021-07-17T04:28:07.000Z | 2022-02-05T00:40:59.000Z | src/dispatch/incident_cost/views.py | dilbwagsingh/dispatch | ca7c9730dea64e196c6653321552d570dfdad069 | [
"Apache-2.0"
] | null | null | null | from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session

from dispatch.database.core import get_db
from dispatch.database.service import common_parameters, search_filter_sort_paginate
from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency

from .models import (
    IncidentCostCreate,
    IncidentCostPagination,
    IncidentCostRead,
    IncidentCostUpdate,
)
from .service import create, delete, get, update

router = APIRouter()


@router.get("", response_model=IncidentCostPagination)
def get_incident_costs(*, common: dict = Depends(common_parameters)):
    """
    Get all incident costs, or only those matching a given search term.
    """
    return search_filter_sort_paginate(model="IncidentCost", **common)


@router.get("/{incident_cost_id}", response_model=IncidentCostRead)
def get_incident_cost(*, db_session: Session = Depends(get_db), incident_cost_id: int):
    """
    Get an incident cost by id.
    """
    incident_cost = get(db_session=db_session, incident_cost_id=incident_cost_id)
    if not incident_cost:
        raise HTTPException(status_code=404, detail="An incident cost with this id does not exist.")
    return incident_cost


@router.post(
    "",
    response_model=IncidentCostRead,
    dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
)
def create_incident_cost(
    *, db_session: Session = Depends(get_db), incident_cost_in: IncidentCostCreate
):
    """
    Create an incident cost.
    """
    incident_cost = create(db_session=db_session, incident_cost_in=incident_cost_in)
    return incident_cost


@router.put(
    "/{incident_cost_id}",
    response_model=IncidentCostRead,
    dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
)
def update_incident_cost(
    *,
    db_session: Session = Depends(get_db),
    incident_cost_id: int,
    incident_cost_in: IncidentCostUpdate,
):
    """
    Update an incident cost by id.
    """
    incident_cost = get(db_session=db_session, incident_cost_id=incident_cost_id)
    if not incident_cost:
        raise HTTPException(status_code=404, detail="An incident cost with this id does not exist.")
    incident_cost = update(
        db_session=db_session,
        incident_cost=incident_cost,
        incident_cost_in=incident_cost_in,
    )
    return incident_cost


@router.delete(
    "/{incident_cost_id}",
    dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
)
def delete_incident_cost(*, db_session: Session = Depends(get_db), incident_cost_id: int):
    """
    Delete an incident cost, returning only an HTTP 200 OK if successful.
    """
    incident_cost = get(db_session=db_session, incident_cost_id=incident_cost_id)
    if not incident_cost:
        raise HTTPException(status_code=404, detail="An incident cost with this id does not exist.")
    delete(db_session=db_session, incident_cost_id=incident_cost_id)
| 32.835165 | 100 | 0.749665 | 358 | 2,988 | 5.980447 | 0.209497 | 0.246614 | 0.091546 | 0.050444 | 0.583839 | 0.548809 | 0.499766 | 0.499766 | 0.402616 | 0.402616 | 0 | 0.004779 | 0.159639 | 2,988 | 90 | 101 | 33.2 | 0.847869 | 0.073963 | 0 | 0.344262 | 0 | 0 | 0.075808 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081967 | false | 0 | 0.114754 | 0 | 0.262295 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
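# A sketch of exercising the router above with FastAPI's TestClient. The
# mount prefix and the stubbed get_db override are assumptions made for this
# example; the real dispatch app wires these up elsewhere.
from fastapi import FastAPI
from fastapi.testclient import TestClient

from dispatch.database.core import get_db
from dispatch.incident_cost.views import router

app = FastAPI()
app.include_router(router, prefix="/incident_costs")
app.dependency_overrides[get_db] = lambda: None  # swap in a real test session

client = TestClient(app)
response = client.get("/incident_costs")
print(response.status_code)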
60535516e66bf2f9d907ac1cbd0eeb26881ca2c7 | 2,728 | py | Python | tests/tests.py | desdelgado/rheology-data-toolkit | 054b1659c914b8eed86239d27a746e26404395ec | [
"MIT"
] | null | null | null | tests/tests.py | desdelgado/rheology-data-toolkit | 054b1659c914b8eed86239d27a746e26404395ec | [
"MIT"
] | 18 | 2020-04-10T15:06:50.000Z | 2020-06-23T20:57:49.000Z | tests/tests.py | desdelgado/rheology-data-toolkit | 054b1659c914b8eed86239d27a746e26404395ec | [
"MIT"
] | null | null | null | import sys, os
sys.path.append("C:/Users/Delgado/Documents/Research/rheology-data-toolkit/rheodata/extractors")
import h5py
import pandas as pd
from antonpaar import AntonPaarExtractor as APE
from ARES_G2 import ARES_G2Extractor

# %%
sys.path.append("C:/Users/Delgado/Documents/Research/rheology-data-toolkit/rheodata")
from data_converter import rheo_data_transformer
import unittest

extractor = APE()
#converter = data_converter()


class TestAntonPaar(unittest.TestCase):

    def setUp(self):
        self.multi_file_test = "C:/Users/Delgado/Documents/Research/rheology-data-toolkit/tests/test_data/Anton_Paar/excel_test_data/two_tests_Steady State Viscosity Curve-LO50C_excel.xlsx"
        self.modified_dict, self.raw_data_dict, self.cols, self.units = extractor.import_rheo_data(self.multi_file_test)

        # Initialize the class to convert
        self.converter = rheo_data_transformer(self.modified_dict, self.raw_data_dict, self.cols, self.units)
        self.converter.load_to_hdf("test")

    def test_modified_output_isdictionary(self):
        self.assertIsInstance(self.modified_dict, dict)

    def test_modified_output_dictionary_contains_pandas(self):
        """Test if the output is a dictionary of pandas dataframes."""
        for value in self.modified_dict.values():
            self.assertIsInstance(value, pd.DataFrame)

    def test_raw_output_isdictionary(self):
        self.assertIsInstance(self.raw_data_dict, dict)

    def test_raw_output_dictionary_contains_pandas(self):
        """Test if the output is a dictionary of pandas dataframes."""
        for value in self.raw_data_dict.values():
            self.assertIsInstance(value, pd.DataFrame)

    def test_project_name_added_raw_data(self):
        """Test if each raw dataframe is tagged with the project label."""
        for df in self.raw_data_dict.values():
            self.assertEqual(df.iloc[0, 0], "Project:")

    def test_hdf5_created(self):
        name, ext = os.path.splitext("test.hdf5")
        self.assertEqual(ext, ".hdf5")

    def test_project_subfolders_added(self):
        f = h5py.File('test.hdf5', "r")
        project_keys = list(f['Project'].keys())
        f.close()
        self.assertListEqual(project_keys, ['Steady State Viscosity Curve-75C', 'Steady State Viscosity Curve-LO80C'])

    def test_analyze_cols(self):
        temp_df = extractor.make_analyze_dataframes(self.multi_file_test)
        for test_key in temp_df.keys():
            test_cols = list(temp_df[test_key].columns)
            parsed_cols = list(self.cols[test_key])
            self.assertListEqual(test_cols, parsed_cols)

    # TODO Write test for saving a file


if __name__ == '__main__':
    unittest.main()
| 36.373333 | 189 | 0.712243 | 366 | 2,728 | 5.065574 | 0.295082 | 0.030205 | 0.029666 | 0.040453 | 0.407228 | 0.407228 | 0.357605 | 0.339266 | 0.312837 | 0.255663 | 0 | 0.00721 | 0.186584 | 2,728 | 74 | 190 | 36.864865 | 0.828301 | 0.096041 | 0 | 0.044444 | 0 | 0.022222 | 0.170282 | 0.115432 | 0 | 0 | 0 | 0.013514 | 0.177778 | 1 | 0.2 | false | 0 | 0.177778 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
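# The test suite above hard-codes absolute C:/Users/... paths, so it only
# runs on one machine. A sketch of deriving the same paths from the test
# file's own location with pathlib (the repository layout is assumed):
import sys
from pathlib import Path

TESTS_DIR = Path(__file__).resolve().parent
REPO_ROOT = TESTS_DIR.parent

sys.path.append(str(REPO_ROOT / "rheodata" / "extractors"))
sys.path.append(str(REPO_ROOT / "rheodata"))

DATA_FILE = (TESTS_DIR / "test_data" / "Anton_Paar" / "excel_test_data" /
             "two_tests_Steady State Viscosity Curve-LO50C_excel.xlsx")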
60582c7b916077e6db28ad364408137dc3ff3825 | 784 | py | Python | setup.py | mintmachine/arweave-python-client | 69e8e2d32090de5fd276efdb9b9103d91b4182f6 | [
"MIT"
] | 63 | 2020-01-22T23:43:53.000Z | 2022-03-24T23:18:13.000Z | setup.py | mintmachine/arweave-python-client | 69e8e2d32090de5fd276efdb9b9103d91b4182f6 | [
"MIT"
] | 17 | 2020-01-22T23:41:07.000Z | 2022-01-04T11:43:30.000Z | setup.py | mintmachine/arweave-python-client | 69e8e2d32090de5fd276efdb9b9103d91b4182f6 | [
"MIT"
] | 25 | 2020-08-12T05:00:25.000Z | 2022-03-31T01:43:25.000Z | from distutils.core import setup
setup(
    name="arweave-python-client",
    packages=['arweave'],  # this must be the same as the name above
    version="1.0.15.dev0",
    description="Client interface for sending transactions on the Arweave permaweb",
    author="Mike Hibbert",
    author_email="mike@hibbertitsolutions.co.uk",
    url="https://github.com/MikeHibbert/arweave-python-client",
    download_url="https://github.com/MikeHibbert/arweave-python-client",
    keywords=['arweave', 'crypto'],
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
    install_requires=[
        'arrow',
        'python-jose',
        'pynacl',
        'pycryptodome',
        'cryptography',
        'requests',
        'psutil'
    ],
)
| 28 | 82 | 0.678571 | 89 | 784 | 5.94382 | 0.730337 | 0.073724 | 0.10775 | 0.064272 | 0.177694 | 0.177694 | 0.177694 | 0.177694 | 0 | 0 | 0 | 0.009217 | 0.169643 | 784 | 27 | 83 | 29.037037 | 0.803379 | 0.049745 | 0 | 0.076923 | 0 | 0 | 0.577389 | 0.067295 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.038462 | 0 | 0.038462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
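# distutils was deprecated by PEP 632 and removed in Python 3.12. A sketch of
# the same metadata expressed with setuptools (values carried over from the
# setup.py above, trimmed to the essentials):
from setuptools import setup

setup(
    name="arweave-python-client",
    packages=["arweave"],
    version="1.0.15.dev0",
    install_requires=["arrow", "python-jose", "pynacl", "pycryptodome",
                      "cryptography", "requests", "psutil"],
)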
60685790ed05ececd1e05f44f06a9d6c6808d671 | 1,380 | py | Python | theonionbox/tob/credits.py | ralphwetzel/theonionbox | 9812fce48153955e179755ea7a58413c3bee182f | [
"MIT"
] | 120 | 2015-12-30T09:41:56.000Z | 2022-03-23T02:30:05.000Z | theonionbox/tob/credits.py | nwithan8/theonionbox | 9e51fe0b4d07fc89a8a133fdeceb5f97d5d58713 | [
"MIT"
] | 57 | 2015-12-29T21:55:14.000Z | 2022-01-07T09:48:51.000Z | theonionbox/tob/credits.py | nwithan8/theonionbox | 9e51fe0b4d07fc89a8a133fdeceb5f97d5d58713 | [
"MIT"
] | 17 | 2018-02-05T08:57:46.000Z | 2022-02-28T16:44:41.000Z | Credits = [
    ('Bootstrap', 'https://getbootstrap.com', 'The Bootstrap team', 'MIT'),
    ('Bottle', 'http://bottlepy.org', 'Marcel Hellkamp', 'MIT'),
    ('Cheroot', 'https://github.com/cherrypy/cheroot', 'CherryPy Team', 'BSD 3-Clause "New" or "Revised" License'),
    ('Click', 'https://github.com/pallets/click', 'Pallets', 'BSD 3-Clause "New" or "Revised" License'),
    ('ConfigUpdater', 'https://github.com/pyscaffold/configupdater', 'Florian Wilhelm', 'MIT'),
    ('Glide', 'https://github.com/glidejs/glide', '@jedrzejchalubek', 'MIT'),
    ('JQuery', 'https://jquery.com', 'The jQuery Foundation', 'MIT'),
    ('jquery.pep.js', 'http://pep.briangonzalez.org', '@briangonzalez', 'MIT'),
    ('js-md5', 'https://github.com/emn178/js-md5', '@emn178', 'MIT'),
    ('PySocks', 'https://github.com/Anorov/PySocks', '@Anorov', 'Custom DAN HAIM'),
    ('RapydScript-NG', 'https://github.com/kovidgoyal/rapydscript-ng', '@kovidgoyal',
     'BSD 2-Clause "Simplified" License'),
    ('Requests', 'https://requests.kennethreitz.org', 'Kenneth Reitz', 'Apache License, Version 2.0'),
    ('scrollMonitor', 'https://github.com/stutrek/scrollmonitor', '@stutrek', 'MIT'),
    ('Smoothie Charts', 'https://github.com/joewalnes/smoothie', '@drewnoakes', 'MIT'),
    ('stem', 'https://stem.torproject.org', 'Damian Johnson and The Tor Project', 'GNU LESSER GENERAL PUBLIC LICENSE')
]
| 69 | 118 | 0.647101 | 158 | 1,380 | 5.651899 | 0.468354 | 0.110862 | 0.141097 | 0.029115 | 0.06495 | 0.06495 | 0.06495 | 0 | 0 | 0 | 0 | 0.010708 | 0.12029 | 1,380 | 19 | 119 | 72.631579 | 0.724876 | 0 | 0 | 0 | 0 | 0 | 0.747101 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
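# Each entry above is a (name, url, author, license) tuple; a small sketch
# that renders the list as a plain-text acknowledgements block:
for name, url, author, licence in Credits:
    print('%s by %s -- %s license -- %s' % (name, author, licence, url))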
6069d2236a48fbb04ece52eba51a580571ed12ab | 2,756 | py | Python | backend/app/migrations/0001_initial.py | juniorosorio47/client-order | ec429436d822d07d0ec1e0be0c2615087eec6e65 | [
"MIT"
] | null | null | null | backend/app/migrations/0001_initial.py | juniorosorio47/client-order | ec429436d822d07d0ec1e0be0c2615087eec6e65 | [
"MIT"
] | null | null | null | backend/app/migrations/0001_initial.py | juniorosorio47/client-order | ec429436d822d07d0ec1e0be0c2615087eec6e65 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.7 on 2021-10-18 23:21
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Client',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=120)),
            ],
        ),
        migrations.CreateModel(
            name='Order',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('total', models.DecimalField(decimal_places=2, default=0.0, max_digits=20)),
                ('timestamp', models.DateTimeField(auto_now_add=True)),
                ('client', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='app.client')),
            ],
        ),
        migrations.CreateModel(
            name='Product',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=120)),
                ('price', models.DecimalField(decimal_places=2, default=0.0, max_digits=20)),
                ('inventory', models.IntegerField(default=0)),
                ('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
            ],
        ),
        migrations.CreateModel(
            name='OrderProduct',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('quantity', models.IntegerField(default=1)),
                ('order', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app.order')),
                ('product', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app.product')),
            ],
            options={
                'unique_together': {('order', 'product')},
            },
        ),
        migrations.AddField(
            model_name='order',
            name='products',
            field=models.ManyToManyField(through='app.OrderProduct', to='app.Product'),
        ),
        migrations.AddField(
            model_name='order',
            name='user',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL),
        ),
    ]
| 41.757576 | 142 | 0.588534 | 284 | 2,756 | 5.577465 | 0.295775 | 0.035354 | 0.05303 | 0.083333 | 0.602273 | 0.602273 | 0.602273 | 0.54798 | 0.54798 | 0.54798 | 0 | 0.016467 | 0.272859 | 2,756 | 65 | 143 | 42.4 | 0.773952 | 0.016328 | 0 | 0.482759 | 1 | 0 | 0.080473 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.051724 | 0 | 0.12069 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
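# The migration above corresponds roughly to the following models.py. This is
# a reconstruction for illustration, not the project's actual source:
from django.conf import settings
from django.db import models

class Client(models.Model):
    name = models.CharField(max_length=120)

class Product(models.Model):
    name = models.CharField(max_length=120)
    price = models.DecimalField(max_digits=20, decimal_places=2, default=0.0)
    inventory = models.IntegerField(default=0)
    user = models.ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True,
                             on_delete=models.SET_NULL)

class Order(models.Model):
    total = models.DecimalField(max_digits=20, decimal_places=2, default=0.0)
    timestamp = models.DateTimeField(auto_now_add=True)
    client = models.ForeignKey(Client, null=True, blank=True,
                               on_delete=models.SET_NULL)
    user = models.ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True,
                             on_delete=models.SET_NULL)
    products = models.ManyToManyField(Product, through='OrderProduct')

class OrderProduct(models.Model):
    order = models.ForeignKey(Order, on_delete=models.CASCADE)
    product = models.ForeignKey(Product, on_delete=models.CASCADE)
    quantity = models.IntegerField(default=1)

    class Meta:
        unique_together = {('order', 'product')}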
606baf8442117d47dded1db451079748f48b00b6 | 4,291 | py | Python | nemo/collections/nlp/models/machine_translation/mt_enc_dec_config.py | vadam5/NeMo | 3c5db09539293c3c19a6bb7437011f91261119af | [
"Apache-2.0"
] | 1 | 2021-04-13T20:34:16.000Z | 2021-04-13T20:34:16.000Z | nemo/collections/nlp/models/machine_translation/mt_enc_dec_config.py | vadam5/NeMo | 3c5db09539293c3c19a6bb7437011f91261119af | [
"Apache-2.0"
] | null | null | null | nemo/collections/nlp/models/machine_translation/mt_enc_dec_config.py | vadam5/NeMo | 3c5db09539293c3c19a6bb7437011f91261119af | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2020, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from dataclasses import dataclass
from typing import Any, Optional, Tuple

from omegaconf.omegaconf import MISSING

from nemo.collections.nlp.data.machine_translation.machine_translation_dataset import TranslationDataConfig
from nemo.collections.nlp.models.enc_dec_nlp_model import EncDecNLPModelConfig
from nemo.collections.nlp.modules.common.token_classifier import TokenClassifierConfig
from nemo.collections.nlp.modules.common.tokenizer_utils import TokenizerConfig
from nemo.collections.nlp.modules.common.transformer.transformer import (
    NeMoTransformerConfig,
    NeMoTransformerEncoderConfig,
)
from nemo.core.config.modelPT import ModelConfig, OptimConfig, SchedConfig


@dataclass
class MTSchedConfig(SchedConfig):
    name: str = 'InverseSquareRootAnnealing'
    warmup_ratio: Optional[float] = None
    last_epoch: int = -1


# TODO: Refactor this dataclass to support more optimizers (it pins the optimizer to Adam-like optimizers).
@dataclass
class MTOptimConfig(OptimConfig):
    name: str = 'adam'
    lr: float = 1e-3
    betas: Tuple[float, float] = (0.9, 0.98)
    weight_decay: float = 0.0
    sched: Optional[MTSchedConfig] = MTSchedConfig()


@dataclass
class MTEncDecModelConfig(EncDecNLPModelConfig):
    # machine translation configurations
    num_val_examples: int = 3
    num_test_examples: int = 3
    max_generation_delta: int = 10
    label_smoothing: Optional[float] = 0.0
    beam_size: int = 4
    len_pen: float = 0.0
    src_language: str = 'en'
    tgt_language: str = 'en'
    find_unused_parameters: Optional[bool] = True
    shared_tokenizer: Optional[bool] = True
    preproc_out_dir: Optional[str] = None

    # network architecture configuration
    encoder_tokenizer: Any = MISSING
    encoder: Any = MISSING
    decoder_tokenizer: Any = MISSING
    decoder: Any = MISSING
    head: TokenClassifierConfig = TokenClassifierConfig(log_softmax=True)

    # dataset configurations
    train_ds: Optional[TranslationDataConfig] = TranslationDataConfig(
        src_file_name=MISSING,
        tgt_file_name=MISSING,
        tokens_in_batch=512,
        clean=True,
        shuffle=True,
        cache_ids=False,
        use_cache=False,
    )
    validation_ds: Optional[TranslationDataConfig] = TranslationDataConfig(
        src_file_name=MISSING,
        tgt_file_name=MISSING,
        tokens_in_batch=512,
        clean=False,
        shuffle=False,
        cache_ids=False,
        use_cache=False,
    )
    test_ds: Optional[TranslationDataConfig] = TranslationDataConfig(
        src_file_name=MISSING,
        tgt_file_name=MISSING,
        tokens_in_batch=512,
        clean=False,
        shuffle=False,
        cache_ids=False,
        use_cache=False,
    )
    optim: Optional[OptimConfig] = MTOptimConfig()


@dataclass
class AAYNBaseConfig(MTEncDecModelConfig):
    # "Attention Is All You Need" base configuration
    encoder_tokenizer: TokenizerConfig = TokenizerConfig(library='yttm')
    decoder_tokenizer: TokenizerConfig = TokenizerConfig(library='yttm')

    encoder: NeMoTransformerEncoderConfig = NeMoTransformerEncoderConfig(
        library='nemo',
        model_name=None,
        pretrained=False,
        hidden_size=512,
        inner_size=2048,
        num_layers=6,
        num_attention_heads=8,
        ffn_dropout=0.1,
        attn_score_dropout=0.1,
        attn_layer_dropout=0.1,
    )

    decoder: NeMoTransformerConfig = NeMoTransformerConfig(
        library='nemo',
        model_name=None,
        pretrained=False,
        inner_size=2048,
        num_layers=6,
        num_attention_heads=8,
        ffn_dropout=0.1,
        attn_score_dropout=0.1,
        attn_layer_dropout=0.1,
    )
| 32.022388 | 110 | 0.718714 | 499 | 4,291 | 6.02004 | 0.41483 | 0.019973 | 0.02996 | 0.036618 | 0.299601 | 0.266312 | 0.222703 | 0.196738 | 0.196738 | 0.196738 | 0 | 0.01847 | 0.20508 | 4,291 | 133 | 111 | 32.263158 | 0.862211 | 0.193428 | 0 | 0.42268 | 0 | 0 | 0.014526 | 0.007554 | 0 | 0 | 0 | 0.007519 | 0 | 1 | 0 | true | 0 | 0.092784 | 0 | 0.463918 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
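# Configs like these are normally materialised through OmegaConf; a minimal
# sketch using the standalone optimizer sub-config defined above:
from omegaconf import OmegaConf

cfg = OmegaConf.structured(MTOptimConfig())
cfg.lr = 2e-4  # override a default
print(OmegaConf.to_yaml(cfg))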
606d8521b79694dc8353d8aa111b68f4f20bff71 | 9,321 | py | Python | examples/Python 2.7/Client_Complete.py | jcjveraa/EDDN | d0cbae6b7a2cac180dd414cbc324c2d84c867cd8 | [
"BSD-3-Clause"
] | 100 | 2017-07-19T10:11:04.000Z | 2020-07-05T22:07:39.000Z | examples/Python 2.7/Client_Complete.py | Assasinnys/EDDN | f03a0857cf4a2b95903d332697b07cc055c6cee7 | [
"BSD-3-Clause"
] | 61 | 2020-09-28T19:05:10.000Z | 2022-03-22T09:16:08.000Z | examples/Python 2.7/Client_Complete.py | Assasinnys/EDDN | f03a0857cf4a2b95903d332697b07cc055c6cee7 | [
"BSD-3-Clause"
] | 28 | 2020-07-19T22:37:44.000Z | 2022-03-22T09:01:39.000Z | import zlib
import zmq
import simplejson
import sys, os, datetime, time

"""
 " Configuration
"""
__relayEDDN = 'tcp://eddn.edcd.io:9500'
#__timeoutEDDN = 600000 # 10 minutes
__timeoutEDDN = 60000 # 1 minute

# Set False to listen to production stream; True to listen to debug stream
__debugEDDN = False

# Set to False if you do not want verbose logging
__logVerboseFile = os.path.dirname(__file__) + '/Logs_Verbose_EDDN_%DATE%.htm'
#__logVerboseFile = False

# Set to False if you do not want JSON logging
__logJSONFile = os.path.dirname(__file__) + '/Logs_JSON_EDDN_%DATE%.log'
#__logJSONFile = False

# A sample list of authorised softwares
__authorisedSoftwares = [
    "EDCE",
    "ED-TD.SPACE",
    "EliteOCR",
    "Maddavo's Market Share",
    "RegulatedNoise",
    "RegulatedNoise__DJ",
    "E:D Market Connector [Windows]"
]

# Use this to exclude yourself, for example if you don't want to handle your own messages ^^
__excludedSoftwares = [
    'My Awesome Market Uploader'
]


"""
 " Start
"""
def date(__format):
    d = datetime.datetime.utcnow()
    return d.strftime(__format)


__oldTime = False

def echoLog(__str):
    global __oldTime, __logVerboseFile

    if __logVerboseFile != False:
        __logVerboseFileParsed = __logVerboseFile.replace('%DATE%', str(date('%Y-%m-%d')))

    if __logVerboseFile != False and not os.path.exists(__logVerboseFileParsed):
        f = open(__logVerboseFileParsed, 'w')
        f.write('<style type="text/css">html { white-space: pre; font-family: Courier New,Courier,Lucida Sans Typewriter,Lucida Typewriter,monospace; }</style>')
        f.close()

    if (__oldTime == False) or (__oldTime != date('%H:%M:%S')):
        __oldTime = date('%H:%M:%S')
        __str = str(__oldTime) + ' | ' + str(__str)
    else:
        __str = ' ' + ' | ' + str(__str)

    print __str
    sys.stdout.flush()

    if __logVerboseFile != False:
        f = open(__logVerboseFileParsed, 'a')
        f.write(__str + '\n')
        f.close()


def echoLogJSON(__json):
    global __logJSONFile

    if __logJSONFile != False:
        __logJSONFileParsed = __logJSONFile.replace('%DATE%', str(date('%Y-%m-%d')))
        f = open(__logJSONFileParsed, 'a')
        f.write(str(__json) + '\n')
        f.close()


def main():
    echoLog('Starting EDDN Subscriber')
    echoLog('')

    context = zmq.Context()
    subscriber = context.socket(zmq.SUB)

    subscriber.setsockopt(zmq.SUBSCRIBE, "")
    subscriber.setsockopt(zmq.RCVTIMEO, __timeoutEDDN)

    while True:
        try:
            subscriber.connect(__relayEDDN)
            echoLog('Connect to ' + __relayEDDN)
            echoLog('')
            echoLog('')

            poller = zmq.Poller()
            poller.register(subscriber, zmq.POLLIN)

            while True:
                socks = dict(poller.poll(__timeoutEDDN))
                if socks:
                    if socks.get(subscriber) == zmq.POLLIN:
                        __message = subscriber.recv(zmq.NOBLOCK)
                        __message = zlib.decompress(__message)
                        __json = simplejson.loads(__message)
                        __converted = False

                        # Handle commodity v1
                        if __json['$schemaRef'] == 'https://eddn.edcd.io/schemas/commodity/1' + ('/test' if (__debugEDDN == True) else ''):
                            echoLogJSON(__message)
                            echoLog('Receiving commodity-v1 message...')
                            echoLog(' - Converting to v3...')

                            __temp = {}
                            __temp['$schemaRef'] = 'https://eddn.edcd.io/schemas/commodity/3' + ('/test' if (__debugEDDN == True) else '')
                            __temp['header'] = __json['header']
                            __temp['message'] = {}
                            __temp['message']['systemName'] = __json['message']['systemName']
                            __temp['message']['stationName'] = __json['message']['stationName']
                            __temp['message']['timestamp'] = __json['message']['timestamp']
                            __temp['message']['commodities'] = []

                            __commodity = {}
                            if 'itemName' in __json['message']:
                                __commodity['name'] = __json['message']['itemName']
                            if 'buyPrice' in __json['message']:
                                __commodity['buyPrice'] = __json['message']['buyPrice']
                            if 'stationStock' in __json['message']:
                                __commodity['supply'] = __json['message']['stationStock']
                            if 'supplyLevel' in __json['message']:
                                __commodity['supplyLevel'] = __json['message']['supplyLevel']
                            if 'sellPrice' in __json['message']:
                                __commodity['sellPrice'] = __json['message']['sellPrice']
                            if 'demand' in __json['message']:
                                __commodity['demand'] = __json['message']['demand']
                            if 'demandLevel' in __json['message']:
                                __commodity['demandLevel'] = __json['message']['demandLevel']
                            __temp['message']['commodities'].append(__commodity)

                            __json = __temp
                            del __temp, __commodity
                            __converted = True

                        # Handle commodity v3
                        if __json['$schemaRef'] == 'https://eddn.edcd.io/schemas/commodity/3' + ('/test' if (__debugEDDN == True) else ''):
                            if __converted == False:
                                echoLogJSON(__message)
                                echoLog('Receiving commodity-v3 message...')

                            __authorised = False
                            __excluded = False

                            if __json['header']['softwareName'] in __authorisedSoftwares:
                                __authorised = True
                            if __json['header']['softwareName'] in __excludedSoftwares:
                                __excluded = True

                            echoLog(' - Software: ' + __json['header']['softwareName'] + ' / ' + __json['header']['softwareVersion'])
                            echoLog(' - ' + 'AUTHORISED' if (__authorised == True) else
                                    ('EXCLUDED' if (__excluded == True) else 'UNAUTHORISED')
                            )

                            if __authorised == True and __excluded == False:
                                # Do what you want with the data...
                                # Have fun !

                                # For example
                                echoLog(' - Timestamp: ' + __json['message']['timestamp'])
                                echoLog(' - Uploader ID: ' + __json['header']['uploaderID'])
                                echoLog(' - System Name: ' + __json['message']['systemName'])
                                echoLog(' - Station Name: ' + __json['message']['stationName'])
                                for __commodity in __json['message']['commodities']:
                                    echoLog(' - Name: ' + __commodity['name'])
                                    echoLog(' - Buy Price: ' + str(__commodity['buyPrice']))
                                    echoLog(' - Supply: ' + str(__commodity['supply'])
                                            + ((' (' + __commodity['supplyLevel'] + ')') if 'supplyLevel' in __commodity else '')
                                    )
                                    echoLog(' - Sell Price: ' + str(__commodity['sellPrice']))
                                    echoLog(' - Demand: ' + str(__commodity['demand'])
                                            + ((' (' + __commodity['demandLevel'] + ')') if 'demandLevel' in __commodity else '')
                                    )
                                # End example

                            del __authorised, __excluded
                            echoLog('')
                            echoLog('')

                        del __converted
                else:
                    print 'Disconnect from ' + __relayEDDN + ' (After timeout)'
                    echoLog('')
                    echoLog('')

                    sys.stdout.flush()
                    subscriber.disconnect(__relayEDDN)
                    break

        except zmq.ZMQError, e:
            subscriber.disconnect(__relayEDDN)
            echoLog('')
            echoLog('Disconnect from ' + __relayEDDN + ' (After receiving ZMQError)')
            echoLog('ZMQSocketException: ' + str(e))
            echoLog('')
            time.sleep(10)


if __name__ == '__main__':
    main()
| 40.526087 | 161 | 0.466259 | 703 | 9,321 | 5.761024 | 0.302987 | 0.057037 | 0.025679 | 0.038025 | 0.126173 | 0.069136 | 0.069136 | 0.058765 | 0.058765 | 0.031605 | 0 | 0.005158 | 0.417659 | 9,321 | 229 | 162 | 40.703057 | 0.740973 | 0.055681 | 0 | 0.155844 | 0 | 0.006494 | 0.202196 | 0.011322 | 0.006494 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.025974 | null | null | 0.012987 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
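# The client above is Python 2.7 (print statements, 'except ..., e'). The
# core receive loop ports to Python 3 roughly as below; a sketch of the
# ZeroMQ plumbing only, without the logging and schema conversion above:
import json
import zlib
import zmq

RELAY = 'tcp://eddn.edcd.io:9500'
TIMEOUT_MS = 60000

context = zmq.Context()
subscriber = context.socket(zmq.SUB)
subscriber.setsockopt_string(zmq.SUBSCRIBE, '')
subscriber.setsockopt(zmq.RCVTIMEO, TIMEOUT_MS)
subscriber.connect(RELAY)

while True:
    try:
        message = json.loads(zlib.decompress(subscriber.recv()))
    except zmq.error.Again:
        # timeout: reconnect, as the Python 2 client does
        subscriber.disconnect(RELAY)
        subscriber.connect(RELAY)
        continue
    print(message['$schemaRef'])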
606fff8ef61161b6a42c08c99645ebf6b02a454f | 3,618 | py | Python | tests/test_button.py | MSLNZ/msl-qt | 33abbb4807b54e3a06dbe9c0f9b343802ece9b97 | [
"MIT"
] | 1 | 2018-07-14T23:02:24.000Z | 2018-07-14T23:02:24.000Z | tests/test_button.py | MSLNZ/msl-qt | 33abbb4807b54e3a06dbe9c0f9b343802ece9b97 | [
"MIT"
] | 1 | 2019-08-12T04:52:33.000Z | 2019-08-21T00:06:59.000Z | tests/test_button.py | MSLNZ/msl-qt | 33abbb4807b54e3a06dbe9c0f9b343802ece9b97 | [
"MIT"
] | 1 | 2019-08-05T05:22:47.000Z | 2019-08-05T05:22:47.000Z | import os
import sys

import pytest

from msl.qt import convert, Button, QtWidgets, QtCore, Qt


def test_text():
    b = Button(text='hello')
    assert b.text() == 'hello'
    assert b.icon().isNull()
    assert b.toolButtonStyle() == Qt.ToolButtonTextOnly


def test_icon():
    path = os.path.dirname(__file__) + '/gamma.png'
    gamma_size = QtCore.QSize(191, 291)

    int_val = QtWidgets.QStyle.SP_DriveNetIcon
    icon = convert.to_qicon(int_val)
    sizes = icon.availableSizes()
    if sys.platform == 'win32':
        assert len(sizes) > 1

    b = Button(icon=int_val)
    assert b.text() == ''
    assert not b.icon().isNull()
    assert b.iconSize() == sizes[0]
    assert b.toolButtonStyle() == Qt.ToolButtonIconOnly

    b = Button(icon=path)
    assert b.text() == ''
    assert not b.icon().isNull()
    assert b.iconSize() == gamma_size
    assert b.toolButtonStyle() == Qt.ToolButtonIconOnly

    b = Button(icon=convert.icon_to_base64(convert.to_qicon(path)))
    assert b.text() == ''
    assert not b.icon().isNull()
    assert b.iconSize() == gamma_size
    assert b.toolButtonStyle() == Qt.ToolButtonIconOnly


def test_icon_size():
    int_val = QtWidgets.QStyle.SP_DriveNetIcon
    icon = convert.to_qicon(int_val)
    sizes = icon.availableSizes()
    if sys.platform == 'win32':
        assert len(sizes) > 1

    #
    # specify the size to the get_icon function
    #

    b = Button(icon=convert.to_qicon(int_val))
    assert b.text() == ''
    assert b.toolButtonStyle() == Qt.ToolButtonIconOnly
    assert b.iconSize() == sizes[0]

    b = Button(icon=convert.to_qicon(int_val, size=789))
    assert b.iconSize() == QtCore.QSize(789, 789)

    b = Button(icon=convert.to_qicon(int_val, size=3.0))
    # specifying a scale factor will use the largest available size
    assert b.iconSize() == QtCore.QSize(3 * sizes[-1].width(), 3 * sizes[-1].height())

    b = Button(icon=convert.to_qicon(int_val, size=QtCore.QSize(50, 50)))
    assert b.iconSize() == QtCore.QSize(50, 50)

    for size in [(256,), (256, 256, 256)]:
        with pytest.raises(ValueError, match='(width, height)'):
            Button(icon=convert.to_qicon(int_val, size=size))

    #
    # use the icon_size kwarg
    #

    b = Button(icon=convert.to_qicon(int_val), icon_size=1234)
    assert b.iconSize() == QtCore.QSize(1234, 1234)

    b = Button(icon=convert.to_qicon(int_val), icon_size=3.0)
    # specifying a scale factor will use the largest available size
    assert b.iconSize() == QtCore.QSize(3 * sizes[-1].width(), 3 * sizes[-1].height())

    b = Button(icon=convert.to_qicon(int_val), icon_size=(312, 312))
    assert b.iconSize() == QtCore.QSize(312, 312)

    b = Button(icon=convert.to_qicon(int_val), icon_size=QtCore.QSize(500, 500))
    assert b.iconSize() == QtCore.QSize(500, 500)

    for size in [(256,), (256, 256, 256)]:
        with pytest.raises(ValueError, match='(width, height)'):
            Button(icon=convert.to_qicon(int_val), icon_size=size)


def test_text_and_icon():
    b = Button(text='hello', icon=QtWidgets.QStyle.SP_DriveNetIcon)
    assert b.text() == 'hello'
    assert not b.icon().isNull()
    assert b.toolButtonStyle() == Qt.ToolButtonTextUnderIcon

    b = Button(text='world', icon=QtWidgets.QStyle.SP_DriveNetIcon, is_text_under_icon=False)
    assert b.text() == 'world'
    assert not b.icon().isNull()
    assert b.toolButtonStyle() == Qt.ToolButtonTextBesideIcon


def test_tooltip():
    b = Button(tooltip='hello')
    assert b.text() == ''
    assert b.icon().isNull()
    assert b.toolTip() == 'hello'
    assert b.toolButtonStyle() == Qt.ToolButtonIconOnly
| 31.46087 | 93 | 0.657546 | 499 | 3,618 | 4.649299 | 0.170341 | 0.090517 | 0.078448 | 0.093103 | 0.759483 | 0.611207 | 0.588362 | 0.573276 | 0.530603 | 0.459052 | 0 | 0.035897 | 0.191542 | 3,618 | 114 | 94 | 31.736842 | 0.757265 | 0.052239 | 0 | 0.506494 | 0 | 0 | 0.026316 | 0 | 0 | 0 | 0 | 0 | 0.480519 | 1 | 0.064935 | false | 0 | 0.051948 | 0 | 0.116883 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60700c79a8bc5322f89b51ceab88a89b92a4de5b | 499 | py | Python | Exercicios/ex028.py | MateusBarboza99/Python-03- | 9c6df88aaa8ba83d385b92722ed1df5873df3a77 | [
"MIT"
] | null | null | null | Exercicios/ex028.py | MateusBarboza99/Python-03- | 9c6df88aaa8ba83d385b92722ed1df5873df3a77 | [
"MIT"
] | null | null | null | Exercicios/ex028.py | MateusBarboza99/Python-03- | 9c6df88aaa8ba83d385b92722ed1df5873df3a77 | [
"MIT"
] | null | null | null | from random import randint
from time import sleep
computador = randint(0, 5)  # Makes the computer "THINK" of a number
print('-=-' * 20)
print('I will think of a number between 0 and 5. Try to guess it, Paçoca...')
print('-=-' * 20)
jogador = int(input('Which number did I think of? '))  # The player tries to guess
print('PROCESSING........')
sleep(3)
if jogador == computador:
    print('CONGRATULATIONS! You managed to beat me, Paçoca')
else:
    print('I WIN! I thought of the number {}, not {}!'.format(computador, jogador))
| 35.642857 | 84 | 0.687375 | 71 | 499 | 4.830986 | 0.605634 | 0.040816 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021429 | 0.158317 | 499 | 13 | 85 | 38.384615 | 0.795238 | 0.098196 | 0 | 0.153846 | 0 | 0 | 0.438479 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.153846 | 0.461538 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
60730c8aa9c4f71059543c43f3553248624a3054 | 2,322 | py | Python | code/loader/lock.py | IBCNServices/StardogStreamReasoning | 646db9cec7bd06ac8bfa75952b9a41773f35544d | [
"Apache-2.0"
] | 5 | 2020-04-08T22:55:17.000Z | 2021-07-30T12:12:45.000Z | code/loader/lock.py | IBCNServices/StardogStreamReasoning | 646db9cec7bd06ac8bfa75952b9a41773f35544d | [
"Apache-2.0"
] | 1 | 2021-04-29T21:58:32.000Z | 2021-05-04T08:05:11.000Z | code/loader/lock.py | IBCNServices/StardogStreamReasoning | 646db9cec7bd06ac8bfa75952b9a41773f35544d | [
"Apache-2.0"
] | 3 | 2020-06-12T13:48:08.000Z | 2021-07-23T11:24:27.000Z | import threading
class RWLock:
    """Synchronization object used in a solution of so-called second
    readers-writers problem. In this problem, many readers can simultaneously
    access a share, and a writer has an exclusive access to this share.
    Additionally, the following constraints should be met:
    1) no reader should be kept waiting if the share is currently opened for
       reading unless a writer is also waiting for the share,
    2) no writer should be kept waiting for the share longer than absolutely
       necessary.

    The implementation is based on [1, secs. 4.2.2, 4.2.6, 4.2.7]
    with a modification -- adding an additional lock (C{self.__readers_queue})
    -- in accordance with [2].

    Sources:
    [1] A.B. Downey: "The little book of semaphores", Version 2.1.5, 2008
    [2] P.J. Courtois, F. Heymans, D.L. Parnas:
        "Concurrent Control with 'Readers' and 'Writers'",
        Communications of the ACM, 1971 (via [3])
    [3] http://en.wikipedia.org/wiki/Readers-writers_problem
    """

    def __init__(self):
        self.__read_switch = _LightSwitch()
        self.__write_switch = _LightSwitch()
        self.__no_readers = threading.Lock()
        self.__no_writers = threading.Lock()
        self.__readers_queue = threading.Lock()
        """A lock giving an even higher priority to the writer in certain
        cases (see [2] for a discussion)"""

    def reader_acquire(self):
        self.__readers_queue.acquire()
        self.__no_readers.acquire()
        self.__read_switch.acquire(self.__no_writers)
        self.__no_readers.release()
        self.__readers_queue.release()

    def reader_release(self):
        self.__read_switch.release(self.__no_writers)

    def writer_acquire(self):
        self.__write_switch.acquire(self.__no_readers)
        self.__no_writers.acquire()

    def writer_release(self):
        self.__no_writers.release()
        self.__write_switch.release(self.__no_readers)


class _LightSwitch:
    """An auxiliary "light switch"-like object. The first thread turns on the
    "switch", the last one turns it off (see [1, sec. 4.2.2] for details)."""

    def __init__(self):
        self.__counter = 0
        self.__mutex = threading.Lock()

    def acquire(self, lock):
        self.__mutex.acquire()
        self.__counter += 1
        if self.__counter == 1:
            lock.acquire()
        self.__mutex.release()

    def release(self, lock):
        self.__mutex.acquire()
        self.__counter -= 1
        if self.__counter == 0:
            lock.release()
        self.__mutex.release()
| 31.808219 | 75 | 0.736434 | 348 | 2,322 | 4.637931 | 0.373563 | 0.037175 | 0.040273 | 0.023544 | 0.060719 | 0.060719 | 0.060719 | 0.060719 | 0.060719 | 0.060719 | 0 | 0.019329 | 0.153316 | 2,322 | 72 | 76 | 32.25 | 0.801628 | 0.466408 | 0 | 0.157895 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.026316 | 0 | 0.289474 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
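# A minimal sketch of the intended RWLock usage: several readers and one
# writer sharing a counter. The names here are illustrative only.
import threading

rw_lock = RWLock()
shared = {'value': 0}

def reader():
    rw_lock.reader_acquire()
    try:
        print('read', shared['value'])
    finally:
        rw_lock.reader_release()

def writer():
    rw_lock.writer_acquire()
    try:
        shared['value'] += 1
    finally:
        rw_lock.writer_release()

threads = [threading.Thread(target=reader) for _ in range(3)]
threads.append(threading.Thread(target=writer))
for t in threads:
    t.start()
for t in threads:
    t.join()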
6073572f31a3babd2ff3a1985183c4320d96810e | 5,627 | py | Python | src/pyfmodex/channel_group.py | Loodoor/UnamedPy | 7d154c3a652992b3c1f28050f0353451f57b2a2d | [
"MIT"
] | 1 | 2017-02-21T16:46:21.000Z | 2017-02-21T16:46:21.000Z | src/pyfmodex/channel_group.py | Loodoor/UnamedPy | 7d154c3a652992b3c1f28050f0353451f57b2a2d | [
"MIT"
] | 1 | 2017-02-21T17:57:05.000Z | 2017-02-22T11:28:51.000Z | src/pyfmodex/channel_group.py | Loodoor/UnamedPy | 7d154c3a652992b3c1f28050f0353451f57b2a2d | [
"MIT"
] | null | null | null | from .fmodobject import *
from .globalvars import dll as _dll
from .globalvars import get_class


class ChannelGroup(FmodObject):

    def add_dsp(self, dsp):
        check_type(dsp, get_class("DSP"))
        c_ptr = c_void_p()
        self._call_fmod("FMOD_ChannelGroup_AddDSP", dsp._ptr, byref(c_ptr))
        return get_class("DSPConnection")(c_ptr)

    def add_group(self, group):
        check_type(group, ChannelGroup)
        self._call_fmod("FMOD_ChannelGroup_AddGroup", group._ptr)

    @property
    def _occlusion(self):
        direct = c_float()
        reverb = c_float()
        self._call_fmod("FMOD_ChannelGroup_Get3DOcclusion", byref(direct), byref(reverb))
        return direct.value, reverb.value

    @_occlusion.setter
    def _occlusion(self, occs):
        self._call_fmod("FMOD_ChannelGroup_Set3DOcclusion", c_float(occs[0]), c_float(occs[1]))

    @property
    def direct_occlusion(self):
        return self._occlusion[0]

    @direct_occlusion.setter
    def direct_occlusion(self, occ):
        self._occlusion = (occ, self._occlusion[1])

    @property
    def reverb_occlusion(self):
        return self._occlusion[1]

    @reverb_occlusion.setter
    def reverb_occlusion(self, occ):
        self._occlusion = (self._occlusion[0], occ)

    def get_channel(self, idx):
        c_ptr = c_void_p()
        self._call_fmod("FMOD_ChannelGroup_GetChannel", idx, byref(c_ptr))
        return channel.Channel(c_ptr)

    @property
    def dsp_head(self):
        dsp_ptr = c_void_p()
        self._call_fmod("FMOD_ChannelGroup_GetDSPHead", byref(dsp_ptr))
        return get_class("DSP")(dsp_ptr)

    def get_group(self, idx):
        grp_ptr = c_void_p()
        self._call_fmod("FMOD_ChannelGroup_GetGroup", idx, byref(grp_ptr))
        return ChannelGroup(grp_ptr)

    @property
    def mute(self):
        mute = c_bool()
        self._call_fmod("FMOD_ChannelGroup_GetMute", byref(mute))
        return mute.value

    @mute.setter
    def mute(self, m):
        self._call_fmod("FMOD_ChannelGroup_SetMute", m)

    @property
    def name(self):
        buf = create_string_buffer(512)
        self._call_fmod("FMOD_ChannelGroup_GetName", buf, 512)
        return buf.value

    @property
    def num_channels(self):
        num = c_int()
        self._call_fmod("FMOD_ChannelGroup_GetNumChannels", byref(num))
        return num.value

    @property
    def num_groups(self):
        num = c_int()
        self._call_fmod("FMOD_ChannelGroup_GetNumGroups", byref(num))
        return num.value

    @property
    def parent_group(self):
        grp_ptr = c_void_p()
        self._call_fmod("FMOD_ChannelGroup_GetParentGroup", byref(grp_ptr))
        return ChannelGroup(grp_ptr)

    @property
    def paused(self):
        paused = c_bool()
        self._call_fmod("FMOD_ChannelGroup_GetPaused", byref(paused))
        return paused.value

    @paused.setter
    def paused(self, p):
        self._call_fmod("FMOD_ChannelGroup_SetPaused", p)

    @property
    def pitch(self):
        pitch = c_float()
        self._call_fmod("FMOD_ChannelGroup_GetPitch", byref(pitch))
        return pitch.value

    @pitch.setter
    def pitch(self, p):
        self._call_fmod("FMOD_ChannelGroup_SetPitch", p)

    def get_spectrum(self, numvalues, channeloffset, window):
        arr = c_float * numvalues
        arri = arr()
        self._call_fmod("FMOD_ChannelGroup_GetSpectrum", byref(arri), numvalues, channeloffset, window)
        return list(arri)

    @property
    def system_object(self):
        sptr = c_void_p()
        self._call_fmod("FMOD_ChannelGroup_GetSystemObject", byref(sptr))
        return get_class("System")(sptr, False)

    @property
    def volume(self):
        vol = c_float()
        self._call_fmod("FMOD_ChannelGroup_GetVolume", byref(vol))
        return vol.value

    @volume.setter
    def volume(self, vol):
        self._call_fmod("FMOD_ChannelGroup_SetVolume", c_float(vol))

    def get_wave_data(self, numvalues, channeloffset):
        arr = c_float * numvalues
        arri = arr()
        self._call_fmod("FMOD_ChannelGroup_GetWaveData", byref(arri), numvalues, channeloffset)
        return list(arri)

    def override_3d_attributes(self, pos=0, vel=0):
        self._call_fmod("FMOD_ChannelGroup_Override3DAttributes", pos, vel)

    def override_frequency(self, freq):
        self._call_fmod("FMOD_ChannelGroup_OverrideFrequency", c_float(freq))

    def override_pan(self, pan):
        self._call_fmod("FMOD_ChannelGroup_OverridePan", c_float(pan))

    def override_reverb_properties(self, props):
        check_type(props, REVERB_CHANNELPROPERTIES)
        self._call_fmod("FMOD_ChannelGroup_OverrideReverbProperties", props)

    def override_speaker_mix(self, frontleft, frontright, center, lfe, backleft, backright, sideleft, sideright):
        self._call_fmod("FMOD_ChannelGroup_OverrideSpeakerMix", frontleft, frontright, center, lfe,
                        backleft, backright, sideleft, sideright)

    def override_volume(self, vol):
        self._call_fmod("FMOD_ChannelGroup_OverrideVolume", c_float(vol))

    def release(self):
        self._call_fmod("FMOD_ChannelGroup_Release")

    def stop(self):
        self._call_fmod("FMOD_ChannelGroup_Stop")

    @property
    def reverb_properties(self):
        props = REVERB_CHANNELPROPERTIES()
        ckresult(_dll.FMOD_ChannelGroup_GetReverbProperties(self._ptr, byref(props)))
        return props

    @reverb_properties.setter
    def reverb_properties(self, props):
        check_type(props, REVERB_CHANNELPROPERTIES)
        ckresult(_dll.FMOD_ChannelGroup_SetReverbProperties(self._ptr, byref(props)))
| 31.61236 | 120 | 0.675671 | 674 | 5,627 | 5.305638 | 0.186944 | 0.138702 | 0.100671 | 0.134228 | 0.455537 | 0.35151 | 0.314038 | 0.207215 | 0.140101 | 0.073266 | 0 | 0.004112 | 0.222143 | 5,627 | 177 | 121 | 31.79096 | 0.812886 | 0 | 0 | 0.235714 | 0 | 0 | 0.159055 | 0.151057 | 0 | 0 | 0 | 0 | 0 | 1 | 0.257143 | false | 0 | 0.021429 | 0.014286 | 0.421429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
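# A sketch of typical use of the wrapper above. `group` is assumed to come
# from a pyfmodex System object, which this module does not create itself.
group.volume = 0.5   # calls FMOD_ChannelGroup_SetVolume under the hood
group.mute = False
print(group.name, group.num_channels)
group.stop()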
6076505608a2c5b23c1f2f1d1636e9aacc345050 | 1,105 | py | Python | examples/plot_graph.py | huyvo/gevent-websocket-py3.5 | b2eb3b5cfb020ac976ac0970508589020dce03ad | [
"Apache-2.0"
] | null | null | null | examples/plot_graph.py | huyvo/gevent-websocket-py3.5 | b2eb3b5cfb020ac976ac0970508589020dce03ad | [
"Apache-2.0"
] | null | null | null | examples/plot_graph.py | huyvo/gevent-websocket-py3.5 | b2eb3b5cfb020ac976ac0970508589020dce03ad | [
"Apache-2.0"
] | null | null | null | from __future__ import print_function
"""
This example generates random data and plots a graph in the browser.
Run it using Gevent directly using:
$ python plot_graph.py
Or with an Gunicorn wrapper:
$ gunicorn -k "geventwebsocket.gunicorn.workers.GeventWebSocketWorker" \
plot_graph:resource
"""
import gevent
import random
from geventwebsocket import WebSocketServer, WebSocketApplication, Resource
from geventwebsocket._compat import range_type
class PlotApplication(WebSocketApplication):
def on_open(self):
for i in range_type(10000):
self.ws.send("0 %s %s\n" % (i, random.random()))
gevent.sleep(0.1)
def on_close(self, reason):
print("Connection Closed!!!", reason)
def static_wsgi_app(environ, start_response):
start_response("200 OK", [("Content-Type", "text/html")])
return open("plot_graph.html").readlines()
resource = Resource([
('/', static_wsgi_app),
('/data', PlotApplication)
])
if __name__ == "__main__":
server = WebSocketServer(('', 8000), resource, debug=True)
server.serve_forever()
| 25.113636 | 76 | 0.700452 | 133 | 1,105 | 5.609023 | 0.616541 | 0.036193 | 0.034853 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016611 | 0.182805 | 1,105 | 43 | 77 | 25.697674 | 0.809524 | 0 | 0 | 0 | 1 | 0 | 0.10241 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.136364 | false | 0 | 0.227273 | 0 | 0.454545 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6077af38c7d93f92258618a8fca241844841e829 | 3,636 | py | Python | src/ansible_navigator/ui_framework/content_defs.py | goneri/ansible-navigator | 59c5c4e9758404bcf363face09cf46c325b01ad3 | [
"Apache-2.0"
] | null | null | null | src/ansible_navigator/ui_framework/content_defs.py | goneri/ansible-navigator | 59c5c4e9758404bcf363face09cf46c325b01ad3 | [
"Apache-2.0"
] | null | null | null | src/ansible_navigator/ui_framework/content_defs.py | goneri/ansible-navigator | 59c5c4e9758404bcf363face09cf46c325b01ad3 | [
"Apache-2.0"
] | null | null | null | """Definitions of UI content objects."""
from dataclasses import asdict
from dataclasses import dataclass
from enum import Enum
from typing import Dict
from typing import Generic
from typing import TypeVar

from ..utils.compatibility import TypeAlias
from ..utils.serialize import SerializationFormat


class ContentView(Enum):
    """The content view."""

    FULL = "full"
    NORMAL = "normal"


T = TypeVar("T")  # pylint:disable=invalid-name  # https://github.com/PyCQA/pylint/pull/5221

DictType: TypeAlias = Dict[str, T]


@dataclass
class ContentBase(Generic[T]):
    r"""The base class for all content dataclasses presented in the UI.

    It should be noted that, while the return type is defined as ``T``
    for the serialization functions below, mypy will not catch an incorrect
    definition of ``T`` at this time. This is because of how ``asdict()``
    is typed:

    @overload
    def asdict(obj: Any) -> dict[str, Any]: ...
    @overload
    def asdict(obj: Any, \*, dict_factory: Callable[[list[tuple[str, Any]]], _T]) -> _T: ...

    This results in mypy believing the outcome of asdict is dict[str, Any] and letting an
    incorrect ``T`` silently pass through. Mypy identifies this as a known issue:
    https://mypy.readthedocs.io/en/stable/additional_features.html#caveats-known-issues
    """

    def asdict(
        self,
        content_view: ContentView,
        serialization_format: SerializationFormat,
    ) -> DictType:
        """Convert thy self into a dictionary.

        :param content_view: The content view
        :param serialization_format: The serialization format
        :returns: A dictionary created from self
        """
        converter_map = {
            (ContentView.FULL, SerializationFormat.JSON): self.serialize_json_full,
            (ContentView.FULL, SerializationFormat.YAML): self.serialize_yaml_full,
            (ContentView.NORMAL, SerializationFormat.JSON): self.serialize_json_normal,
            (ContentView.NORMAL, SerializationFormat.YAML): self.serialize_yaml_normal,
        }
        try:
            dump_self_as_dict = converter_map[content_view, serialization_format]
        except KeyError:
            return asdict(self)
        else:
            return dump_self_as_dict()

    def serialize_json_full(self) -> DictType:
        """Provide dictionary for ``JSON`` with all attributes.

        :returns: A dictionary created from self
        """
        return asdict(self)

    def serialize_json_normal(self) -> DictType:
        """Provide dictionary for ``JSON`` with curated attributes.

        :returns: A dictionary created from self
        """
        return asdict(self)

    def serialize_yaml_full(self) -> DictType:
        """Provide dictionary for ``YAML`` with all attributes.

        :returns: A dictionary created from self
        """
        return asdict(self)

    def serialize_yaml_normal(self) -> DictType:
        """Provide dictionary for ``YAML`` with curated attributes.

        :returns: A dictionary created from self
        """
        return asdict(self)

    def get(self, attribute: str):
        """Allow this dataclass to be treated like a dictionary.

        This is a workaround until the UI fully supports dataclasses,
        at which time it can be removed.

        A default is intentionally not implemented as a safeguard, to ensure
        this is not more work than necessary to remove in the future;
        it will only return attributes that actually exist.

        :param attribute: The attribute to get
        :returns: The gotten attribute
        """
        return getattr(self, attribute)
| 32.756757 | 98 | 0.667217 | 439 | 3,636 | 5.448747 | 0.346241 | 0.032191 | 0.037625 | 0.052258 | 0.303512 | 0.236622 | 0.183528 | 0.168478 | 0.168478 | 0.168478 | 0 | 0.001459 | 0.245875 | 3,636 | 110 | 99 | 33.054545 | 0.870897 | 0.487624 | 0 | 0.116279 | 0 | 0 | 0.006871 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.139535 | false | 0 | 0.186047 | 0 | 0.581395 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
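# A sketch of a concrete content dataclass built on ContentBase; the
# ImageEntry class is hypothetical, made up for illustration only.
from dataclasses import dataclass

@dataclass
class ImageEntry(ContentBase):
    name: str = "cat.png"
    size: int = 1024

    def serialize_json_normal(self) -> DictType:
        """Curated view: expose only the name."""
        return {"name": self.name}

entry = ImageEntry()
print(entry.asdict(ContentView.NORMAL, SerializationFormat.JSON))  # {'name': 'cat.png'}
print(entry.get("size"))  # 1024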
608acb157c66cbf8e5a515ca2945d9dde9e8ed86 | 337 | py | Python | morphocut_server/extensions.py | madelinetharp/morphocut-server | a82ad5916adbd168816f7b26432b4a98d978c299 | [
"MIT"
] | null | null | null | morphocut_server/extensions.py | madelinetharp/morphocut-server | a82ad5916adbd168816f7b26432b4a98d978c299 | [
"MIT"
] | 1 | 2019-08-14T20:07:53.000Z | 2019-08-14T20:07:53.000Z | morphocut_server/extensions.py | madelinetharp/morphocut-server | a82ad5916adbd168816f7b26432b4a98d978c299 | [
"MIT"
] | 2 | 2019-11-28T13:10:28.000Z | 2021-11-19T20:37:19.000Z |
from flask_sqlalchemy import SQLAlchemy
from flask_redis import FlaskRedis
from flask_migrate import Migrate
# from flask_rq2 import RQ
from rq import Queue
from morphocut_server.worker import redis_conn
database = SQLAlchemy()
redis_store = FlaskRedis()
migrate = Migrate()
redis_queue = Queue(connection=redis_conn)
flask_rq = None
| 22.466667 | 46 | 0.824926 | 47 | 337 | 5.702128 | 0.382979 | 0.134328 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003401 | 0.127596 | 337 | 14 | 47 | 24.071429 | 0.908163 | 0.071217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
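# These module-level singletons follow the standard Flask extension pattern:
# created unbound here, then bound inside an application factory. A sketch of
# the factory side (create_app is an assumption, not part of this module):
from flask import Flask

def create_app():
    app = Flask(__name__)
    database.init_app(app)
    redis_store.init_app(app)
    migrate.init_app(app, database)
    return app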
6093e41de9ab4c93dd30e9d6abbd45ef1d80f2ad | 214 | py | Python | Chapter11/publish_horoscope1_in_another_ipns.py | HowToBeCalculated/Hands-On-Blockchain-for-Python-Developers | f9634259dd3dc509f36a5ccf3a5182c0d2ec79c4 | [
"MIT"
] | 62 | 2019-03-18T04:41:41.000Z | 2022-03-31T05:03:13.000Z | Chapter11/publish_horoscope1_in_another_ipns.py | HowToBeCalculated/Hands-On-Blockchain-for-Python-Developers | f9634259dd3dc509f36a5ccf3a5182c0d2ec79c4 | [
"MIT"
] | 2 | 2020-06-14T21:56:03.000Z | 2022-01-07T05:32:01.000Z | Chapter11/publish_horoscope1_in_another_ipns.py | HowToBeCalculated/Hands-On-Blockchain-for-Python-Developers | f9634259dd3dc509f36a5ccf3a5182c0d2ec79c4 | [
"MIT"
] | 42 | 2019-02-22T03:10:36.000Z | 2022-02-20T04:47:04.000Z | import ipfsapi
c = ipfsapi.connect()
peer_id = c.key_list()['Keys'][1]['Id']
c.name_publish('QmYjYGKXqo36GDt6f6qvp9qKAsrc72R9y88mQSLvogu8Ub', key='another_key')
result = c.cat('/ipns/' + peer_id)
print(result)
| 19.454545 | 83 | 0.728972 | 28 | 214 | 5.392857 | 0.642857 | 0.07947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061856 | 0.093458 | 214 | 10 | 84 | 21.4 | 0.716495 | 0 | 0 | 0 | 0 | 0 | 0.32243 | 0.214953 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.166667 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6093fc893ebcd37ea362c87f16107050ab4d6124 | 1,179 | py | Python | tests/optimize/test_newton_raphson_hypo.py | dwillmer/fastats | 5915423714b32ed7e953e1e3a311fe50c3f30943 | [
"MIT"
] | 26 | 2017-07-17T09:19:53.000Z | 2021-11-30T01:36:56.000Z | tests/optimize/test_newton_raphson_hypo.py | dwillmer/fastats | 5915423714b32ed7e953e1e3a311fe50c3f30943 | [
"MIT"
] | 320 | 2017-09-02T16:26:25.000Z | 2021-07-28T05:19:49.000Z | tests/optimize/test_newton_raphson_hypo.py | dwillmer/fastats | 5915423714b32ed7e953e1e3a311fe50c3f30943 | [
"MIT"
] | 13 | 2017-07-06T19:02:29.000Z | 2020-01-22T11:36:34.000Z |
from hypothesis import given, assume, settings
from hypothesis.strategies import floats
from numpy import cos
from pytest import approx

from fastats.optimise.newton_raphson import newton_raphson


def func(x):
    return x**3 - x - 1


def less_or_equal(x, compared_to, rel=1e-6):
    return ((x < compared_to)
            or ((x - compared_to) == approx(0.0, rel=rel))
            or (x == approx(x, rel=rel)))


nr_func = newton_raphson(1, 1e-6, root=func, return_callable=True)


@given(floats(min_value=0.01, max_value=3.5))
def test_minimal(x):
    """
    Tests that the value output from the solver
    is less than or equal to the value of the
    objective.
    """
    eps = 1e-12
    value = nr_func(x, eps)
    assume(func(x) > 0.0)
    assert less_or_equal(value, compared_to=func(x))


def cos_func(x):
    return cos(x) - 2 * x


nr_cos = newton_raphson(0.5, 1e-6, root=cos_func, return_callable=True)


@given(floats(min_value=0.3, max_value=0.8))
@settings(deadline=None)
def test_cos_minus_2x(x):
    value = nr_cos(x, 1e-6)
    assert less_or_equal(value, compared_to=cos_func(x))


if __name__ == '__main__':
    import pytest
    pytest.main([__file__])
| 21.436364 | 71 | 0.676845 | 197 | 1,179 | 3.827411 | 0.324873 | 0.039788 | 0.043767 | 0.058355 | 0.196286 | 0.196286 | 0.196286 | 0.111406 | 0.111406 | 0 | 0 | 0.032839 | 0.199321 | 1,179 | 54 | 72 | 21.833333 | 0.76589 | 0.081425 | 0 | 0 | 0 | 0 | 0.007561 | 0 | 0 | 0 | 0 | 0 | 0.068966 | 1 | 0.172414 | false | 0 | 0.206897 | 0.103448 | 0.482759 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
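# Both roots exercised above can be checked independently of fastats:
# x**3 - x - 1 = 0 has its real root near 1.32472 (the plastic number), and
# cos(x) = 2*x is solved near 0.45018. A hand-rolled Newton iteration sketch:
from math import cos, sin

def newton(f, fprime, x, eps=1e-12):
    while abs(f(x)) > eps:
        x -= f(x) / fprime(x)
    return x

print(newton(lambda x: x**3 - x - 1, lambda x: 3 * x**2 - 1, 1.5))   # ~1.32472
print(newton(lambda x: cos(x) - 2 * x, lambda x: -sin(x) - 2, 0.5))  # ~0.45018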
60a325d8698400a67c71d72f6db2f8d62fcf36f0 | 1,798 | py | Python | challenges/Backend Challenge/pendulum_sort.py | HernandezDerekJ/Interview | 202d78767d452ecfe8c6220180d7ed53b1104231 | [
"MIT"
] | null | null | null | challenges/Backend Challenge/pendulum_sort.py | HernandezDerekJ/Interview | 202d78767d452ecfe8c6220180d7ed53b1104231 | [
"MIT"
] | null | null | null | challenges/Backend Challenge/pendulum_sort.py | HernandezDerekJ/Interview | 202d78767d452ecfe8c6220180d7ed53b1104231 | [
"MIT"
] | null | null | null | """
Coderpad solution
"""
def pend(arr):
    # e.g. arr = [2,3,5,1,4]
    #      var = [0,0,0,0,0]
    var = [0] * len(arr)
    mid = (len(var) - 1) // 2

    # sort_arr = [1,2,3,4,5]
    # var      = [0,0,1,0,0]
    sort_arr = sorted(arr)
    var[mid] = sort_arr[0]

    # The focus shouldn't be at the beginning of the sorted array;
    # it should be at the middle of var, so the smallest value sits there.
    # From the middle, values can be flipped right and left on every increment.
    arr_increment = 1
    for i in range(1, mid + 1):
        # By now the mid is the only position that is correct.
        # As we parse through var[], we also parse through sort_arr[]
        # and flip values from least to greatest.
        var[mid + i] = sort_arr[arr_increment]
        arr_increment += 1
        var[mid - i] = sort_arr[arr_increment]
        arr_increment += 1

    # Even number of elements: the rightmost slot is still unfilled
    if (len(sort_arr) - 1) % 2 == 1:
        var[len(arr) - 1] = sort_arr[len(arr) - 1]

    print(var)


if __name__ == '__main__':
    arr = [5, 1, 3, 6, 2, 4]
    pend(arr)
    arr = [5, 1, 3, 2, 4]
    pend(arr)
    arr = [10, 4, 1, 5, 4, 3, 7, 9]
    pend(arr)
| 31 | 102 | 0.375417 | 252 | 1,798 | 2.583333 | 0.27381 | 0.052227 | 0.032258 | 0.043011 | 0.334869 | 0.282642 | 0.270353 | 0.25192 | 0.231951 | 0.231951 | 0 | 0.118717 | 0.479978 | 1,798 | 57 | 103 | 31.54386 | 0.57754 | 0.527809 | 0 | 0.238095 | 0 | 0 | 0.010165 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
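# For reference, the three calls above print [5, 3, 1, 2, 4, 6],
# [5, 3, 1, 2, 4] and [9, 5, 4, 1, 3, 4, 7, 10]: the minimum sits in the
# middle and values grow while alternating right, then left. The same
# arrangement via slice assignment, as a compact sketch:
def pend_compact(arr):
    s = sorted(arr)
    out = [0] * len(s)
    mid = (len(s) - 1) // 2
    out[mid] = s[0]
    out[mid + 1:] = s[1::2]      # rightwards: 2nd, 4th, ... smallest
    out[mid - 1::-1] = s[2::2]   # leftwards: 3rd, 5th, ... smallest
    return out

print(pend_compact([5, 1, 3, 6, 2, 4]))  # [5, 3, 1, 2, 4, 6]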
60a44344bf403a9af637f117e79a2de34161a730 | 8,141 | py | Python | ptrace/oop/math_tests.py | xann16/py-path-tracing | 609dbe6b80580212bd9d8e93afb6902091040d7a | [
"MIT"
] | null | null | null | ptrace/oop/math_tests.py | xann16/py-path-tracing | 609dbe6b80580212bd9d8e93afb6902091040d7a | [
"MIT"
] | null | null | null | ptrace/oop/math_tests.py | xann16/py-path-tracing | 609dbe6b80580212bd9d8e93afb6902091040d7a | [
"MIT"
] | null | null | null | """Unit tests for math-oriented common classes."""
import unittest
import math
import numpy as np
from .vector import Vec3, OrthonormalBasis
from .raycast_base import Ray
from .camera import Camera
class Vec3Tests(unittest.TestCase):
"""Test for Vec3 class."""
def test_vec3_basic(self):
"""Basic creation, access and manipulation of vector components."""
zero = Vec3()
vvv = Vec3(1, 2, 3)
x_arr = np.array([.1, .2, .3], dtype='double')
xxx = Vec3.from_array(x_arr)
ones = Vec3.full(1.0)
i_hat = Vec3.versor(0)
self.assertEqual(zero[0], 0.0)
self.assertEqual(zero[1], 0.0)
self.assertEqual(zero[2], 0.0)
self.assertEqual(vvv[0], 1.0)
self.assertEqual(vvv[1], 2.0)
self.assertEqual(vvv[2], 3.0)
vvv[2] = 10
self.assertEqual(vvv[2], 10.0)
self.assertEqual(str(vvv), '[ 1. 2. 10.]')
self.assertEqual(xxx[0], .1)
self.assertEqual(xxx[1], .2)
self.assertEqual(xxx[2], .3)
self.assertEqual(ones[0], 1)
self.assertEqual(ones[1], 1)
self.assertEqual(ones[2], 1)
self.assertEqual(i_hat[0], 1)
self.assertEqual(i_hat[1], 0)
self.assertEqual(i_hat[2], 0)
is_v_eq = np.allclose(vvv.data(), np.array([1, 2, 10]))
self.assertEqual(is_v_eq, True)
is_x_eq = np.allclose(xxx.data(), x_arr)
self.assertEqual(is_x_eq, True)
self.assertEqual(vvv.copy(), vvv)
def test_vec3_arithmetic_and_comparisons(self):
"""Testing methods and operators used for arithmentic and comparisons.
"""
xxx = Vec3(1, 2, 3)
yyy = Vec3(1, 2, 3)
zzz = Vec3(1, 0, -1)
self.assertEqual(xxx == yyy, True)
self.assertEqual(xxx != yyy, False)
self.assertEqual(xxx != zzz, True)
self.assertEqual(xxx == zzz, False)
self.assertEqual(yyy != zzz, True)
self.assertEqual(yyy == zzz, False)
yyy += zzz
self.assertEqual(yyy, Vec3.full(2))
self.assertEqual(yyy + xxx, Vec3(3, 4, 5))
yyy -= zzz
self.assertEqual(yyy, xxx)
self.assertEqual(yyy - xxx, Vec3())
self.assertEqual(+xxx, xxx)
self.assertEqual(-xxx, Vec3(-1, -2, -3))
yyy *= -1
self.assertEqual(yyy, -xxx)
self.assertEqual(yyy * -1.0, xxx)
zzz /= 2
self.assertEqual(zzz, Vec3(.5, 0, -.5))
self.assertEqual(zzz / 2, Vec3(.25, 0, -.25))
vvv = Vec3(3, 1, -2)
vvv *= Vec3(2, .5, -1)
self.assertEqual(vvv, Vec3(6, .5, 2))
self.assertEqual(vvv * Vec3.full(2), Vec3(12, 1, 4))
www = Vec3.full(10)
www /= Vec3(10, 5, 2)
self.assertEqual(www, Vec3(1, 2, 5))
self.assertEqual(www / 2, Vec3(.5, 1, 2.5))
self.assertAlmostEqual(www.dot(Vec3()), 0)
self.assertAlmostEqual(Vec3(1, 2, 4).dot(Vec3(1, -2, 1)), 1)
self.assertEqual(Vec3.versor(0).cross(Vec3.versor(1)), Vec3.versor(2))
self.assertEqual(Vec3.versor(1).cross(Vec3.versor(2)), Vec3.versor(0))
self.assertEqual(Vec3.versor(2).cross(Vec3.versor(0)), Vec3.versor(1))
self.assertEqual(Vec3.versor(1).cross(Vec3.versor(0)), -Vec3.versor(2))
self.assertEqual(Vec3.versor(1).cross(Vec3.versor(1)), Vec3())
self.assertEqual(Vec3(1, 2, 3).isclose(Vec3(1, 2, 3)), True)
self.assertEqual(Vec3(1, 2, 3).isclose(Vec3(1, 2.0001, 3), 0.1), True)
def test_vec3_normalization(self):
"""Testing length calculations and normalisation."""
self.assertAlmostEqual(Vec3().sqr_length(), 0.0)
self.assertAlmostEqual(Vec3().length(), 0.0)
self.assertAlmostEqual(Vec3.versor(0).sqr_length(), 1.0)
self.assertAlmostEqual(Vec3.versor(1).length(), 1.0)
self.assertAlmostEqual(abs(Vec3.versor(2)), 1.0)
self.assertEqual(Vec3.versor(0).normalised(), Vec3.versor(0))
self.assertEqual(Vec3.versor(1).normalised(), Vec3.versor(1))
self.assertEqual(Vec3.versor(2).normalised(), Vec3.versor(2))
sqrt3 = math.sqrt(3)
v_sqrt3_inv = Vec3.full(1. / sqrt3)
self.assertAlmostEqual(Vec3.full(1).sqr_length(), 3)
self.assertAlmostEqual(Vec3.full(1).length(), sqrt3)
self.assertEqual(Vec3.full(1).normalised(), v_sqrt3_inv)
def test_vec3_reflection(self):
"""Testing reflection with respect to given normal vector."""
nnn = Vec3.versor(2)
self.assertEqual(nnn.reflect(Vec3.versor(0)), Vec3.versor(0))
self.assertEqual(nnn.reflect(Vec3.versor(2)), -Vec3.versor(2))
diag = Vec3(1, 1, 1).normalised()
diag_refl = diag.copy()
diag_refl[2] = -diag_refl[2]
self.assertEqual(nnn.reflect(diag), diag_refl)
class OrthonormalBasisTests(unittest.TestCase):
"""Tests for OrthonormalBasis class."""
def test_onb_basic(self):
"""Basic test reconstructing natural ONB."""
nat = OrthonormalBasis(Vec3.versor(0), Vec3.versor(1), Vec3.versor(2))
nat_alt = OrthonormalBasis.from_two('xy', Vec3.versor(0), Vec3.versor(1))
vvv = Vec3(1, 2, 3)
self.assertEqual(nat.transform(vvv), vvv)
self.assertEqual(nat_alt.transform(vvv), vvv)
def test_onb_factories(self):
"""Testing factory methods for creating ONBs from one or two vectors."""
        onb1 = OrthonormalBasis.from_two('xy', Vec3(1, 2, 4).normalised(),
                                         Vec3(0, 0, -7).normalised())
self.assertAlmostEqual(abs(onb1.x_axis), 1.0)
self.assertAlmostEqual(abs(onb1.y_axis), 1.0)
self.assertAlmostEqual(abs(onb1.z_axis), 1.0)
self.assertAlmostEqual(onb1.x_axis.dot(onb1.y_axis), 0.0)
self.assertAlmostEqual(onb1.x_axis.dot(onb1.z_axis), 0.0)
self.assertAlmostEqual(onb1.y_axis.dot(onb1.z_axis), 0.0)
        onb2 = OrthonormalBasis.from_two('zx', Vec3(-1, -1, -1).normalised(),
                                         Vec3(1, 1, -1).normalised())
self.assertAlmostEqual(abs(onb2.x_axis), 1.0)
self.assertAlmostEqual(abs(onb2.y_axis), 1.0)
self.assertAlmostEqual(abs(onb2.z_axis), 1.0)
self.assertAlmostEqual(onb2.x_axis.dot(onb2.y_axis), 0.0)
self.assertAlmostEqual(onb2.x_axis.dot(onb2.z_axis), 0.0)
self.assertAlmostEqual(onb2.y_axis.dot(onb2.z_axis), 0.0)
onb3 = OrthonormalBasis.from_z_axis(Vec3.versor(0))
self.assertAlmostEqual(abs(onb3.x_axis), 1.0)
self.assertAlmostEqual(abs(onb3.y_axis), 1.0)
self.assertAlmostEqual(abs(onb3.z_axis), 1.0)
self.assertAlmostEqual(onb3.x_axis.dot(onb3.y_axis), 0.0)
self.assertAlmostEqual(onb3.x_axis.dot(onb3.z_axis), 0.0)
self.assertAlmostEqual(onb3.y_axis.dot(onb3.z_axis), 0.0)
class RayTests(unittest.TestCase):
"""Tests for Ray class."""
def test_ray_basic(self):
"""Basic tests chcecking ray creation and probing their points."""
ox_axis = Ray(Vec3(), Vec3.versor(0))
self.assertEqual(ox_axis.point_at(4), Vec3(4, 0, 0))
direction = Vec3(1, -1, 0).normalised()
ray1 = Ray(Vec3(0, 2, 0), direction)
ray2 = Ray.from_points(Vec3(0, 2, 0), Vec3(2, 0, 0))
self.assertEqual(ray1.direction, direction)
self.assertEqual(ray2.direction, direction)
for i in range(10):
self.assertEqual(ray1.point_at(i), ray2.point_at(i))
self.assertEqual(ray1.point_at(0), ray1.origin)
self.assertEqual(ray2.point_at(0), ray2.origin)
class CameraTests(unittest.TestCase):
"""Tests for Camera class."""
def test_cam_basic(self):
"""Basic test checking if camera casts rays in correct direction."""
cam = Camera(Vec3(), Vec3.versor(0), Vec3.versor(2), 10, 10, 120)
cam.set_focus(Vec3.versor(0), 1.0)
for px_x in range(10):
for px_y in range(10):
ray = cam.get_ray(px_x, px_y)
self.assertGreaterEqual(ray.direction.dot(Vec3.versor(0)), 0.0)
if __name__ == '__main__':
unittest.main()
| 35.550218 | 81 | 0.607051 | 1,141 | 8,141 | 4.241893 | 0.134969 | 0.192149 | 0.095455 | 0.052273 | 0.430579 | 0.269215 | 0.191116 | 0.091529 | 0.03595 | 0.03595 | 0 | 0.069187 | 0.23658 | 8,141 | 228 | 82 | 35.70614 | 0.709574 | 0.076035 | 0 | 0.012987 | 0 | 0 | 0.004426 | 0 | 0 | 0 | 0 | 0 | 0.584416 | 1 | 0.051948 | false | 0 | 0.038961 | 0 | 0.116883 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60a44dedd81ec9521c352eb02db132ee4f4bcd33 | 2,336 | py | Python | tests/integration/condition__browser__have_url_test.py | kianku/selene | 5361938e4f34d6cfae6df3aeca80e06a3e657d8c | [
"MIT"
] | null | null | null | tests/integration/condition__browser__have_url_test.py | kianku/selene | 5361938e4f34d6cfae6df3aeca80e06a3e657d8c | [
"MIT"
] | null | null | null | tests/integration/condition__browser__have_url_test.py | kianku/selene | 5361938e4f34d6cfae6df3aeca80e06a3e657d8c | [
"MIT"
] | null | null | null | # MIT License
#
# Copyright (c) 2015-2020 Iakiv Kramarenko
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
import os
import pytest
from selene import have
from selene.core.exceptions import TimeoutException
start_page = 'file://' + os.path.abspath(os.path.dirname(__file__)) + '/../resources/start_page.html'
def test_have_url(session_browser):
session_browser.open(start_page)
session_browser.should(have.url(session_browser.driver.current_url))
session_browser.should(have.no.url(session_browser.driver.current_url[:-1]))
def test_have_url_containing(session_browser):
session_browser.open(start_page)
session_browser.should(have.url_containing('start_page.html'))
session_browser.should(have.no.url_containing('start_page.xhtml'))
def test_fails_on_timeout_during_waiting_for_exact_url(session_browser):
browser = session_browser.with_(timeout=0.1)
browser.open(start_page)
with pytest.raises(TimeoutException):
browser.should(have.url('xttp:/'))
# TODO: check message too
def test_fails_on_timeout_during_waiting_for_part_of_url(session_browser):
browser = session_browser.with_(timeout=0.1)
browser.open(start_page)
with pytest.raises(TimeoutException):
browser.should(have.url_containing('xttp:/'))
# TODO: check message too
| 37.079365 | 101 | 0.771404 | 337 | 2,336 | 5.183976 | 0.41543 | 0.112192 | 0.058386 | 0.045793 | 0.366342 | 0.328563 | 0.261019 | 0.261019 | 0.218661 | 0.218661 | 0 | 0.006523 | 0.146832 | 2,336 | 62 | 102 | 37.677419 | 0.870045 | 0.47988 | 0 | 0.347826 | 0 | 0 | 0.066331 | 0.024349 | 0 | 0 | 0 | 0.016129 | 0 | 1 | 0.173913 | false | 0 | 0.173913 | 0 | 0.347826 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60aacb1ea53997b926d5fc68404773f2cbe78055 | 852 | py | Python | rooms/models.py | Neisvestney/SentSyncServer | 45e9572b6c9b274ed2cbad28749fcb2154c98757 | [
"MIT"
] | null | null | null | rooms/models.py | Neisvestney/SentSyncServer | 45e9572b6c9b274ed2cbad28749fcb2154c98757 | [
"MIT"
] | null | null | null | rooms/models.py | Neisvestney/SentSyncServer | 45e9572b6c9b274ed2cbad28749fcb2154c98757 | [
"MIT"
] | null | null | null | from django.db import models
class Room(models.Model):
code = models.CharField('Code', max_length=128)
tab_url = models.CharField('Tab url', max_length=512, default='', blank=True)
def to_dict(self):
return {
'users': [u.to_dict() for u in self.users.all()],
'tabUrl': self.tab_url
}
def __str__(self):
return f'Room {self.code}'
class RoomUser(models.Model):
room = models.ForeignKey(Room, related_name='users', on_delete=models.CASCADE)
username = models.CharField('Username', max_length=128, default="user")
host = models.BooleanField('Is host')
def to_dict(self):
return {
'id': self.id,
'username': self.username,
'isHost': self.host,
}
def __str__(self):
return f'{self.username} ({self.id})'
| 25.818182 | 82 | 0.597418 | 107 | 852 | 4.588785 | 0.429907 | 0.081466 | 0.04888 | 0.052953 | 0.14664 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014241 | 0.258216 | 852 | 32 | 83 | 26.625 | 0.762658 | 0 | 0 | 0.26087 | 0 | 0 | 0.123239 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.173913 | false | 0 | 0.043478 | 0.173913 | 0.695652 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
60afd06ba545d034d33e457e4d731a2d4625fc23 | 340 | py | Python | dynts/lib/fallback/simplefunc.py | quantmind/dynts | 21ac57c648bfec402fa6b1fe569496cf098fb5e8 | [
"BSD-3-Clause"
] | 57 | 2015-02-10T13:42:06.000Z | 2022-03-28T14:48:36.000Z | dynts/lib/fallback/simplefunc.py | quantmind/dynts | 21ac57c648bfec402fa6b1fe569496cf098fb5e8 | [
"BSD-3-Clause"
] | 1 | 2016-11-01T07:43:05.000Z | 2016-11-01T07:43:05.000Z | dynts/lib/fallback/simplefunc.py | quantmind/dynts | 21ac57c648bfec402fa6b1fe569496cf098fb5e8 | [
"BSD-3-Clause"
] | 17 | 2015-05-08T04:09:19.000Z | 2021-08-02T19:24:52.000Z |
from .common import *
def tsminmax(v):
    mv = NaN
    xv = NaN
    for x in v:
        if x == x:  # NaN != NaN, so this skips missing values
            if mv == mv:
                mv = min(mv, x)
            else:
                mv = x
            if xv == xv:
                xv = max(xv, x)
            else:
                xv = x
    return (mv, xv)
 | 20 | 31 | 0.297059 | 40 | 340 | 2.525 | 0.425 | 0.089109 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.608824 | 340 | 17 | 32 | 20 | 0.759399 | 0 | 0 | 0.133333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.066667 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
60b09b84c56168890b3b6bd6e6a345dc11ac1a37 | 3,552 | py | Python | week9/finance/application.py | lcsm29/edx-harvard-cs50 | 283f49bd6a9e4b8497e2b397d766b64527b4786b | [
"MIT"
] | null | null | null | week9/finance/application.py | lcsm29/edx-harvard-cs50 | 283f49bd6a9e4b8497e2b397d766b64527b4786b | [
"MIT"
] | null | null | null | week9/finance/application.py | lcsm29/edx-harvard-cs50 | 283f49bd6a9e4b8497e2b397d766b64527b4786b | [
"MIT"
] | null | null | null | import os
from cs50 import SQL
from flask import Flask, flash, redirect, render_template, request, session
from flask_session import Session
from tempfile import mkdtemp
from werkzeug.exceptions import default_exceptions, HTTPException, InternalServerError
from werkzeug.security import check_password_hash, generate_password_hash
from helpers import apology, login_required, lookup, usd
# Configure application
app = Flask(__name__)
# Ensure templates are auto-reloaded
app.config["TEMPLATES_AUTO_RELOAD"] = True
# Ensure responses aren't cached
@app.after_request
def after_request(response):
response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
response.headers["Expires"] = 0
response.headers["Pragma"] = "no-cache"
return response
# Custom filter
app.jinja_env.filters["usd"] = usd
# Configure session to use filesystem (instead of signed cookies)
app.config["SESSION_FILE_DIR"] = mkdtemp()
app.config["SESSION_PERMANENT"] = False
app.config["SESSION_TYPE"] = "filesystem"
Session(app)
# Configure CS50 Library to use SQLite database
db = SQL("sqlite:///finance.db")
# Make sure API key is set
if not os.environ.get("API_KEY"):
raise RuntimeError("API_KEY not set")
@app.route("/")
@login_required
def index():
"""Show portfolio of stocks"""
return apology("TODO")
@app.route("/buy", methods=["GET", "POST"])
@login_required
def buy():
"""Buy shares of stock"""
return apology("TODO")
@app.route("/history")
@login_required
def history():
"""Show history of transactions"""
return apology("TODO")
@app.route("/login", methods=["GET", "POST"])
def login():
"""Log user in"""
# Forget any user_id
session.clear()
# User reached route via POST (as by submitting a form via POST)
if request.method == "POST":
# Ensure username was submitted
if not request.form.get("username"):
return apology("must provide username", 403)
# Ensure password was submitted
elif not request.form.get("password"):
return apology("must provide password", 403)
# Query database for username
rows = db.execute("SELECT * FROM users WHERE username = ?", request.form.get("username"))
# Ensure username exists and password is correct
if len(rows) != 1 or not check_password_hash(rows[0]["hash"], request.form.get("password")):
return apology("invalid username and/or password", 403)
# Remember which user has logged in
session["user_id"] = rows[0]["id"]
# Redirect user to home page
return redirect("/")
# User reached route via GET (as by clicking a link or via redirect)
else:
return render_template("login.html")
@app.route("/logout")
def logout():
"""Log user out"""
# Forget any user_id
session.clear()
# Redirect user to login form
return redirect("/")
@app.route("/quote", methods=["GET", "POST"])
@login_required
def quote():
"""Get stock quote."""
return apology("TODO")
@app.route("/register", methods=["GET", "POST"])
def register():
"""Register user"""
return apology("TODO")
@app.route("/sell", methods=["GET", "POST"])
@login_required
def sell():
"""Sell shares of stock"""
return apology("TODO")
def errorhandler(e):
"""Handle error"""
if not isinstance(e, HTTPException):
e = InternalServerError()
return apology(e.name, e.code)
# Listen for errors
for code in default_exceptions:
app.errorhandler(code)(errorhandler)
| 24.839161 | 100 | 0.675113 | 457 | 3,552 | 5.170678 | 0.36105 | 0.055015 | 0.043165 | 0.042319 | 0.161659 | 0.115954 | 0 | 0 | 0 | 0 | 0 | 0.005927 | 0.192568 | 3,552 | 142 | 101 | 25.014085 | 0.817992 | 0.228322 | 0 | 0.208333 | 0 | 0 | 0.174823 | 0.007845 | 0 | 0 | 0 | 0 | 0 | 1 | 0.138889 | false | 0.069444 | 0.111111 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
60b4ecd85396e388011990857be8fae61e376084 | 517 | py | Python | Cursoemvideo/desafios/desafio008.py | gentildf/Python | bb333e55b197492eac1294619ca7b13ef57bb631 | [
"MIT"
] | 1 | 2021-03-16T13:07:29.000Z | 2021-03-16T13:07:29.000Z | Cursoemvideo/desafios/desafio008.py | gentildf/Python | bb333e55b197492eac1294619ca7b13ef57bb631 | [
"MIT"
] | null | null | null | Cursoemvideo/desafios/desafio008.py | gentildf/Python | bb333e55b197492eac1294619ca7b13ef57bb631 | [
"MIT"
] | null | null | null | #Escreva um programa que leia um valor em metros e o exiba convertido em centimetros e milimetros.
n = float(input('\033[32mDigite o numero:\033[m'))
print('O número digitado é \033[33m{0:.0f}m\033[m.\n'
      'Ele apresentado em centimetros fica \033[33m{1:.2f}cm\033[m.\n'
      'Apresentado em milímetros fica \033[33m{2:.3f}mm\033[m'
      .format(n, n * 100, n * 1000))
| 47 | 98 | 0.686654 | 95 | 517 | 3.736842 | 0.442105 | 0.04507 | 0.059155 | 0.061972 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110855 | 0.162476 | 517 | 10 | 99 | 51.7 | 0.709007 | 0.483559 | 0 | 0 | 0 | 0.2 | 0.729008 | 0.282443 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60b50f0934ca9581fa086a588f7ded362261de6d | 795 | py | Python | publish_fanout.py | Dordoloy/BachelorDIM-Lectures-Algorithms-2019 | 14e3e7132eecf62e3476dbea4d9db32adf1544ff | [
"MIT"
] | null | null | null | publish_fanout.py | Dordoloy/BachelorDIM-Lectures-Algorithms-2019 | 14e3e7132eecf62e3476dbea4d9db32adf1544ff | [
"MIT"
] | null | null | null | publish_fanout.py | Dordoloy/BachelorDIM-Lectures-Algorithms-2019 | 14e3e7132eecf62e3476dbea4d9db32adf1544ff | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Mon Oct 21 08:47:08 2019
@author: dordoloy
"""
import os
import pika
import config
def publish_fanout():
amqp_url=config.amqp_url
# Parse CLODUAMQP_URL (fallback to localhost)
url = os.environ.get('CLOUDAMQP_URL',amqp_url)
params = pika.URLParameters(url)
params.socket_timeout = 5
connection = pika.BlockingConnection(params) # Connect to CloudAMQP
channel = connection.channel()
channel.exchange_declare(exchange='posts',
exchange_type='fanout')
channel.basic_publish(exchange='posts',
routing_key='',
body='message')
print("send")
publish_fanout()
 | 22.714286 | 71 | 0.622642 | 86 | 795 | 5.616279 | 0.616279 | 0.043478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02418 | 0.271698 | 795 | 35 | 72 | 22.714286 | 0.810017 | 0.178616 | 0 | 0 | 0 | 0 | 0.062112 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.052632 | 0.210526 | 0 | 0.263158 | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1
60b55902d0598b3b2fa55eabb666b4e4d52752c7 | 16,112 | py | Python | orbit/utils.py | xjx0524/models | 99be973aa8168a0d2275d475883b3256b193251f | [
"Apache-2.0"
] | null | null | null | orbit/utils.py | xjx0524/models | 99be973aa8168a0d2275d475883b3256b193251f | [
"Apache-2.0"
] | null | null | null | orbit/utils.py | xjx0524/models | 99be973aa8168a0d2275d475883b3256b193251f | [
"Apache-2.0"
] | null | null | null | # Lint as: python3
# Copyright 2020 The Orbit Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Some layered modules/functions to help users writing custom training loop."""
import abc
import contextlib
import functools
import inspect
import os
import numpy as np
import tensorflow as tf
def create_loop_fn(step_fn):
"""Creates a multiple steps function driven by the python while loop.
Args:
step_fn: A function which takes `iterator` as input.
Returns:
    A callable matching the `loop_fn` definition below.
"""
def loop_fn(iterator, num_steps, state=None, reduce_fn=None):
"""A loop function with multiple steps.
Args:
iterator: A nested structure of tf.data `Iterator` or
`DistributedIterator`.
num_steps: The number of steps in the loop. If `num_steps==-1`, will
        iterate until exhausting the iterator.
state: An optional initial state before running the loop.
reduce_fn: a callable defined as `def reduce_fn(state, value)`, where
`value` is the outputs from `step_fn`.
Returns:
The updated state.
"""
try:
step = 0
# To make sure the OutOfRangeError exception can be handled well with
# async remote eager, we need to wrap the loop body in a `async_scope`.
with tf.experimental.async_scope():
while (num_steps == -1 or step < num_steps):
outputs = step_fn(iterator)
if reduce_fn is not None:
state = reduce_fn(state, outputs)
step += 1
return state
except (StopIteration, tf.errors.OutOfRangeError):
tf.experimental.async_clear_error()
return state
return loop_fn
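# Example usage (illustrative sketch only; `my_step_fn` and `train_iter` are
# hypothetical names, not defined in this module):
#   loop_fn = create_loop_fn(my_step_fn)
#   state = loop_fn(train_iter, num_steps=100, state=0.0,
#                   reduce_fn=lambda state, outputs: state + outputs)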
def create_tf_while_loop_fn(step_fn):
"""Create a multiple steps function driven by tf.while_loop on the host.
Args:
step_fn: A function which takes `iterator` as input.
Returns:
    A callable matching the `loop_fn` definition below.
"""
def loop_fn(iterator, num_steps):
"""A loop function with multiple steps.
Args:
iterator: A nested structure of tf.data `Iterator` or
`DistributedIterator`.
num_steps: The number of steps in the loop. Must be a tf.Tensor.
"""
if not isinstance(num_steps, tf.Tensor):
      raise ValueError("`num_steps` should be a `tf.Tensor`. Python objects "
"may cause retracing.")
for _ in tf.range(num_steps):
step_fn(iterator)
return loop_fn
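# Example usage (sketch; `my_step_fn` and `train_iter` are hypothetical, and
# the loop is typically wrapped in `tf.function` so the tf.range loop is
# staged as a graph):
#   loop_fn = tf.function(create_tf_while_loop_fn(my_step_fn))
#   loop_fn(train_iter, tf.constant(100, dtype=tf.int32))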
def create_global_step() -> tf.Variable:
"""Creates a `tf.Variable` suitable for use as a global step counter.
Creating and managing a global step variable may be necessary for
`AbstractTrainer` subclasses that perform multiple parameter updates per
`Controller` "step", or use different optimizers on different steps.
In these cases, an `optimizer.iterations` property generally can't be used
directly, since it would correspond to parameter updates instead of iterations
in the `Controller`'s training loop. Such use cases should simply call
`step.assign_add(1)` at the end of each step.
Returns:
A non-trainable scalar `tf.Variable` of dtype `tf.int64`, with only the
first replica's value retained when synchronizing across replicas in
a distributed setting.
"""
return tf.Variable(
0,
dtype=tf.int64,
trainable=False,
aggregation=tf.VariableAggregation.ONLY_FIRST_REPLICA)
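# Example (sketch): create the counter once, then increment it at the end of
# each training step.
#   global_step = create_global_step()
#   ...
#   global_step.assign_add(1)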
def make_distributed_dataset(strategy, dataset_or_fn, *args, **kwargs):
"""A helper function to create distributed dataset.
Args:
strategy: An instance of `tf.distribute.Strategy`.
    dataset_or_fn: An instance of `tf.data.Dataset` or a function which takes an
`tf.distribute.InputContext` as input and returns a `tf.data.Dataset`. If
it is a function, it could optionally have an argument named
`input_context` which is `tf.distribute.InputContext` argument type.
*args: The list of arguments to be passed to dataset_or_fn.
**kwargs: Any keyword arguments to be passed.
Returns:
A distributed Dataset.
"""
if strategy is None:
strategy = tf.distribute.get_strategy()
if isinstance(dataset_or_fn, tf.data.Dataset):
return strategy.experimental_distribute_dataset(dataset_or_fn)
if not callable(dataset_or_fn):
raise ValueError("`dataset_or_fn` should be either callable or an instance "
"of `tf.data.Dataset`")
def dataset_fn(ctx):
"""Wrapped dataset function for creating distributed dataset.."""
# If `dataset_or_fn` is a function and has `input_context` as argument
# names, pass `ctx` as the value of `input_context` when calling
# `dataset_or_fn`. Otherwise `ctx` will not be used when calling
# `dataset_or_fn`.
argspec = inspect.getfullargspec(dataset_or_fn)
args_names = argspec.args
if "input_context" in args_names:
kwargs["input_context"] = ctx
ds = dataset_or_fn(*args, **kwargs)
return ds
return strategy.experimental_distribute_datasets_from_function(dataset_fn)
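# Example usage (sketch; `make_dataset` is a hypothetical factory whose
# optional `input_context` argument is filled in automatically):
#   def make_dataset(batch_size, input_context=None):
#     ds = tf.data.Dataset.range(1000)
#     if input_context:
#       ds = ds.shard(input_context.num_input_pipelines,
#                     input_context.input_pipeline_id)
#     return ds.batch(batch_size)
#   dist_ds = make_distributed_dataset(strategy, make_dataset, 32)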
class SummaryManager:
"""A class manages writing summaries."""
def __init__(self, summary_dir, summary_fn, global_step=None):
"""Construct a summary manager object.
Args:
summary_dir: the directory to write summaries.
summary_fn: A callable defined as `def summary_fn(name, tensor,
step=None)`, which describes the summary operation.
global_step: A `tf.Variable` instance for the global step.
"""
self._enabled = (summary_dir is not None)
self._summary_dir = summary_dir
self._summary_fn = summary_fn
    self._summary_writers = {}
if global_step is None:
self._global_step = tf.summary.experimental.get_step()
else:
self._global_step = global_step
def summary_writer(self, relative_path=""):
"""Returns the underlying summary writer.
Args:
relative_path: The current path in which to write summaries, relative to
the summary directory. By default it is empty, which specifies the root
directory.
"""
if self._summary_writers and relative_path in self._summary_writers:
return self._summary_writers[relative_path]
if self._enabled:
self._summary_writers[relative_path] = tf.summary.create_file_writer(
os.path.join(self._summary_dir, relative_path))
else:
self._summary_writers[relative_path] = tf.summary.create_noop_writer()
return self._summary_writers[relative_path]
def flush(self):
"""Flush the underlying summary writers."""
if self._enabled:
tf.nest.map_structure(tf.summary.flush, self._summary_writers)
def write_summaries(self, summary_dict):
"""Write summaries for the given values.
This recursively creates subdirectories for any nested dictionaries
provided in `summary_dict`, yielding a hierarchy of directories which will
then be reflected in the TensorBoard UI as different colored curves.
    E.g. users may evaluate on multiple datasets and return `summary_dict` as a
nested dictionary.
```
{
"dataset": {
"loss": loss,
"accuracy": accuracy
},
"dataset2": {
"loss": loss2,
"accuracy": accuracy2
},
}
```
This will create two subdirectories "dataset" and "dataset2" inside the
summary root directory. Each directory will contain event files including
both "loss" and "accuracy" summaries.
Args:
summary_dict: A dictionary of values. If any value in `summary_dict` is
itself a dictionary, then the function will recursively create
subdirectories with names given by the keys in the dictionary. The
Tensor values are summarized using the summary writer instance specific
to the parent relative path.
"""
if not self._enabled:
return
self._write_summaries(summary_dict)
def _write_summaries(self, summary_dict, relative_path=""):
for name, value in summary_dict.items():
if isinstance(value, dict):
self._write_summaries(
value, relative_path=os.path.join(relative_path, name))
else:
with self.summary_writer(relative_path).as_default():
self._summary_fn(name, value, step=self._global_step)
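# Example usage (sketch; `log_dir` is a hypothetical directory path):
#   manager = SummaryManager(log_dir, tf.summary.scalar,
#                            global_step=create_global_step())
#   manager.write_summaries({"train": {"loss": 0.25}})
#   manager.flush()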
class Trigger(metaclass=abc.ABCMeta):
"""An abstract class representing a "trigger" for some event."""
@abc.abstractmethod
def __call__(self, value: float, force_trigger=False):
"""Maybe trigger the event based on the given value.
Args:
value: the value for triggering.
force_trigger: Whether the trigger is forced triggered.
Returns:
`True` if the trigger is triggered on the given `value`, and
`False` otherwise.
"""
@abc.abstractmethod
def reset(self):
"""Reset states in the trigger."""
class IntervalTrigger(Trigger):
"""Triggers on every fixed interval."""
def __init__(self, interval, start=0):
"""Constructs the IntervalTrigger.
Args:
interval: The triggering interval.
start: An initial value for the trigger.
"""
self._interval = interval
self._last_trigger_value = start
def __call__(self, value, force_trigger=False):
"""Maybe trigger the event based on the given value.
Args:
value: the value for triggering.
force_trigger: If True, the trigger will be forced triggered unless the
last trigger value is equal to `value`.
Returns:
`True` if the trigger is triggered on the given `value`, and
`False` otherwise.
"""
if force_trigger and value != self._last_trigger_value:
self._last_trigger_value = value
return True
if self._interval and self._interval > 0:
if value >= self._last_trigger_value + self._interval:
self._last_trigger_value = value
return True
return False
def reset(self):
"""See base class."""
self._last_trigger_value = 0
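# Example (sketch): fires once per 1000 units of the monitored value.
#   trigger = IntervalTrigger(interval=1000)
#   trigger(500)   # False
#   trigger(1000)  # True; the next trigger fires at value >= 2000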
class EpochHelper:
"""A Helper class to handle epochs in Customized Training Loop."""
def __init__(self, epoch_steps, global_step):
"""Constructs the EpochHelper.
Args:
epoch_steps: An integer indicates how many steps in an epoch.
global_step: A `tf.Variable` instance indicates the current global step.
"""
self._epoch_steps = epoch_steps
self._global_step = global_step
self._current_epoch = None
self._epoch_start_step = None
self._in_epoch = False
def epoch_begin(self):
"""Returns whether a new epoch should begin."""
if self._in_epoch:
return False
current_step = self._global_step.numpy()
self._epoch_start_step = current_step
self._current_epoch = current_step // self._epoch_steps
self._in_epoch = True
return True
def epoch_end(self):
"""Returns whether the current epoch should end."""
if not self._in_epoch:
raise ValueError("`epoch_end` can only be called inside an epoch")
current_step = self._global_step.numpy()
epoch = current_step // self._epoch_steps
if epoch > self._current_epoch:
self._in_epoch = False
return True
return False
@property
def batch_index(self):
"""Index of the next batch within the current epoch."""
return self._global_step.numpy() - self._epoch_start_step
@property
def current_epoch(self):
return self._current_epoch
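# Example usage (sketch; `global_step` is the training loop's step variable):
#   helper = EpochHelper(epoch_steps=1000, global_step=global_step)
#   if helper.epoch_begin():
#     ...  # per-epoch setup
#   if helper.epoch_end():
#     ...  # per-epoch teardown, e.g. checkpointing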
@contextlib.contextmanager
def _soft_device_placement():
"""Context manager for soft device placement, allowing summaries on CPU."""
original_setting = tf.config.get_soft_device_placement()
try:
tf.config.set_soft_device_placement(True)
yield
finally:
tf.config.set_soft_device_placement(original_setting)
def train_function_with_summaries(*args, **kwargs):
"""Utility function to support TPU summaries via multiple `tf.function`s.
This permits interleaving summaries inside TPU-compatible code, but without
any performance impact on steps that do not write summaries.
Usage is as a decorator, similar to `tf.function`, and any `tf.function`
arguments will be passed through if supplied:
@trainer.train_function_with_summaries
def train(self, num_steps):
...
The decorated function is assumed to be a loop method accepting a `num_steps`
parameter, as for instance would be called within the `Controller`'s outer
train loop. The implementation here assumes that `summary_frequency` is
divisible by `steps_per_loop`. The decorated method should accept two
arguments, `self` and `num_steps`.
Two `tf.function` versions of `train_fn` are created: one inside a summary
writer scope with soft device placement enabled (used on steps that require
summary writing), and one with no summary writer present and soft device
placement disabled (used on all other steps).
Args:
*args: Arguments to pass through to `tf.function`.
**kwargs: Keyword arguments to pass through to `tf.function`.
Returns:
If the first argument is a callable, returns the decorated callable.
Otherwise, returns a decorator.
"""
def decorator(train_fn):
# TODO(dhr): Validate the signature of train_fn?
train_fn_with_summaries = tf.function(train_fn, *args, **kwargs)
train_fn_without_summaries = tf.function(train_fn, *args, **kwargs)
@functools.wraps(train_fn)
def wrapper(self, num_steps):
if tf.summary.should_record_summaries():
with _soft_device_placement():
output = train_fn_with_summaries(self, tf.constant(1))
num_steps -= 1
if num_steps >= 1:
with tf.summary.record_if(False):
output = train_fn_without_summaries(self, num_steps)
return output
return wrapper
if args and callable(args[0]):
train_fn, args = args[0], args[1:]
return decorator(train_fn)
return decorator
def get_value(x) -> np.ndarray:
"""Returns the value of a variable/tensor.
Args:
x: input variable.
Returns:
    A Numpy array or number.
"""
if not tf.is_tensor(x):
return x
return x.numpy()
| 32.288577 | 80 | 0.696996 | 2,143 | 16,112 | 5.071862 | 0.205786 | 0.026313 | 0.012145 | 0.011041 | 0.220075 | 0.177569 | 0.123194 | 0.097065 | 0.075628 | 0.075628 | 0 | 0.014807 | 0.211954 | 16,112 | 498 | 81 | 32.353414 | 0.841222 | 0.078265 | 0 | 0.303483 | 0 | 0 | 0.030228 | 0 | 0 | 0 | 0 | 0.004016 | 0 | 0 | null | null | 0 | 0.034826 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60b585e86ccc82ac6a2eddf76170bf681c5dfe4e | 364 | py | Python | python/word-count/word_count.py | whitepeaony/exercism-python | 5c947ba5c589f1969bcb754e88969420262c457e | [
"MIT"
] | null | null | null | python/word-count/word_count.py | whitepeaony/exercism-python | 5c947ba5c589f1969bcb754e88969420262c457e | [
"MIT"
] | 20 | 2019-08-02T13:59:21.000Z | 2022-01-22T10:08:23.000Z | python/word-count/word_count.py | whitepeaony/exercism-python | 5c947ba5c589f1969bcb754e88969420262c457e | [
"MIT"
] | null | null | null | def count_words(sentence):
sentence = sentence.lower()
words = {}
    punctuation = ',\n:!&@$%^&._'
    for s in punctuation:
sentence = sentence.replace(s, ' ')
for w in sentence.split():
if w.endswith('\''):
w = w[:-1]
if w.startswith('\''):
w = w[1:]
words[w] = words.get(w, 0) + 1
return words
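# Example: count_words("one fish two fish.") -> {'one': 1, 'fish': 2, 'two': 1}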
| 20.222222 | 43 | 0.453297 | 44 | 364 | 3.704545 | 0.454545 | 0.294479 | 0.03681 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017094 | 0.357143 | 364 | 17 | 44 | 21.411765 | 0.679487 | 0 | 0 | 0 | 0 | 0 | 0.044077 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60b7ecc0410fa366c1d483830d2676a99b8210cd | 1,509 | py | Python | ficheros/CSV/prueba csv (lm)/alumnos.py | txtbits/daw-python | 5dde1207e2791e90aa5e9ce2b6afc4116129efab | [
"MIT"
] | null | null | null | ficheros/CSV/prueba csv (lm)/alumnos.py | txtbits/daw-python | 5dde1207e2791e90aa5e9ce2b6afc4116129efab | [
"MIT"
] | null | null | null | ficheros/CSV/prueba csv (lm)/alumnos.py | txtbits/daw-python | 5dde1207e2791e90aa5e9ce2b6afc4116129efab | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
'''
Created on 02/12/2011
@author: chra
'''
import csv
from operator import itemgetter
# ----- Function: average of the students' grades ----------
def media(alumno):
    # returns the average grade from a dictionary with one student's data
nota1 = int(alumno['Nota1'])
nota2 = int(alumno.get('Nota2'))
nota3 = int(alumno.get('Nota3'))
return (nota1+nota2+nota3) / 3.
# ----------------------------------------------------------
fin = open('alumnos.csv')
lector = csv.DictReader(fin, delimiter=",")  # if delimiter is omitted, the comma is used by default // returns dictionaries
# lector = csv.reader(fin, delimiter=",") <-- returns lists
alumnos = []
for linea in lector:
alumnos.append((linea['Alumno'], media(linea)))
# -------- Sort by student name -----------
alumnos.sort()
print 'Orden por nombre de alumno'
for al in alumnos:
print "%-10s %6.2f" % al #10 espacios entre cadena (nombre - nota media) y permite 6 digitos, 2 de ellos decimales.
# --------------------------------------------------
# --------- Sort by grade -----------------------
print '\nOrden por nota'
alumnos.sort(key=itemgetter(1),reverse=True)
for al in alumnos:
print "%-10s %6.2f" % al
#---------------------------------------------------
# Creates the file 'lista_ordenada_notas.csv' and writes the list sorted by grade
fw = open('lista_ordenada_notas.csv', 'w')
csvwriter = csv.writer(fw)
for al in alumnos:
csvwriter.writerow(al)
fw.close()
 | 29.588235 | 120 | 0.581842 | 193 | 1,509 | 4.528497 | 0.476684 | 0.030892 | 0.024027 | 0.048055 | 0.061785 | 0.061785 | 0.061785 | 0.061785 | 0.061785 | 0 | 0 | 0.025177 | 0.15772 | 1,509 | 51 | 121 | 29.588235 | 0.662471 | 0.474486 | 0 | 0.2 | 0 | 0 | 0.165536 | 0.032564 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.08 | null | null | 0.16 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
60bb3a202511a6b1b6fac3f7f1b3970bd6850c3c | 367 | py | Python | setup.py | monkey2000/pygazetteer | 3eb6026b1473f773817a81ebc0060ec455482739 | [
"MIT"
] | 1 | 2018-01-28T09:38:56.000Z | 2018-01-28T09:38:56.000Z | setup.py | monkey2000/pygazetteer | 3eb6026b1473f773817a81ebc0060ec455482739 | [
"MIT"
] | null | null | null | setup.py | monkey2000/pygazetteer | 3eb6026b1473f773817a81ebc0060ec455482739 | [
"MIT"
] | null | null | null | from setuptools import setup
setup(name='pygazetteer',
version='0.1.0',
description='Location extractor by looking up gazetteer',
url='https://github.com/monkey2000/pygazetteer',
license='MIT',
packages=['pygazetteer'],
install_requires=[
'pyahocorasick'
],
zip_safe=False,
include_package_data=True)
| 24.466667 | 63 | 0.643052 | 38 | 367 | 6.105263 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024911 | 0.234332 | 367 | 14 | 64 | 26.214286 | 0.800712 | 0 | 0 | 0 | 0 | 0 | 0.344262 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60cbf96ec164888707eb2ca83b141b44664c1449 | 821 | py | Python | bouncer/blacklist/signals.py | sourcelair/bouncer-api | 132c63bbd470b0635054ad71656d0303b39ee421 | [
"MIT"
] | null | null | null | bouncer/blacklist/signals.py | sourcelair/bouncer-api | 132c63bbd470b0635054ad71656d0303b39ee421 | [
"MIT"
] | 25 | 2018-07-17T10:58:23.000Z | 2018-08-30T11:33:58.000Z | bouncer/blacklist/signals.py | sourcelair/bouncer-api | 132c63bbd470b0635054ad71656d0303b39ee421 | [
"MIT"
] | 2 | 2018-07-18T10:57:20.000Z | 2020-05-23T04:43:27.000Z | from django.db.models.signals import pre_save
from django.dispatch import receiver
from blacklist import models
from hashlib import sha256
@receiver(pre_save, sender=models.EmailEntry)
def email_entry_handler(sender, instance, **kwargs):
"""
    Assigns entry_value.lower() to lower_case_entry_value and stores its
    SHA-256 hash in hashed_value.
"""
instance.lower_case_entry_value = instance.entry_value.lower()
email_hasher = sha256(instance.lower_case_entry_value.encode())
instance.hashed_value = email_hasher.hexdigest().lower()
@receiver(pre_save, sender=models.IPEntry)
@receiver(pre_save, sender=models.EmailHostEntry)
def entry_handler(instance, **kwargs):
"""
    Assigns entry_value.lower() to lower_case_entry_value.
"""
instance.lower_case_entry_value = instance.entry_value.lower()
| 31.576923 | 74 | 0.774665 | 109 | 821 | 5.568807 | 0.311927 | 0.14827 | 0.115321 | 0.156507 | 0.576606 | 0.398682 | 0.398682 | 0.398682 | 0.398682 | 0.398682 | 0 | 0.00838 | 0.127893 | 821 | 25 | 75 | 32.84 | 0.839385 | 0.171742 | 0 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.307692 | 0 | 0.461538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
60ccc770ca96a11d7b161998731a553fd9e80f74 | 971 | py | Python | app/boardgames/migrations/0001_initial.py | collaer/boardgames | 51dd9ec257e5d5bab23a88b10a5f91fdd7fe4210 | [
"MIT"
] | null | null | null | app/boardgames/migrations/0001_initial.py | collaer/boardgames | 51dd9ec257e5d5bab23a88b10a5f91fdd7fe4210 | [
"MIT"
] | null | null | null | app/boardgames/migrations/0001_initial.py | collaer/boardgames | 51dd9ec257e5d5bab23a88b10a5f91fdd7fe4210 | [
"MIT"
] | null | null | null | # Generated by Django 3.1 on 2020-08-22 17:48
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='BoardGame',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('title', models.CharField(max_length=50)),
                ('edition_year', models.IntegerField()),
('designer', models.CharField(max_length=30)),
('game_duration_min', models.IntegerField()),
('player_number', models.IntegerField()),
('rating', models.IntegerField(choices=[(1, 'Very bad'), (2, 'Bad'), (3, 'Regular'), (4, 'Good'), (5, 'Very good')])),
('played', models.BooleanField()),
('acquisition_date', models.DateField()),
],
),
]
| 33.482759 | 134 | 0.547889 | 92 | 971 | 5.673913 | 0.695652 | 0.137931 | 0.068966 | 0.091954 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033528 | 0.293512 | 971 | 28 | 135 | 34.678571 | 0.727405 | 0.044284 | 0 | 0 | 1 | 0 | 0.137149 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.047619 | 0 | 0.238095 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60ccdf38e3f7af2b3782dcf2b849f63822752423 | 2,181 | py | Python | matcher/utils.py | BlueRidgeLabs/slack-meetups | aca850d4265f30f6d76bed1b1baeb2973c60c83d | [
"MIT"
] | 12 | 2019-05-23T03:55:06.000Z | 2022-02-25T07:11:27.000Z | matcher/utils.py | BlueRidgeLabs/slack-meetups | aca850d4265f30f6d76bed1b1baeb2973c60c83d | [
"MIT"
] | 11 | 2020-03-27T18:43:58.000Z | 2021-06-10T18:00:50.000Z | matcher/utils.py | BlueRidgeLabs/slack-meetups | aca850d4265f30f6d76bed1b1baeb2973c60c83d | [
"MIT"
] | 9 | 2019-11-15T23:18:44.000Z | 2022-02-10T19:27:37.000Z | import re
# regex for a user or channel mention at the beginning of a message
# example matches: " <@UJQ07L30Q> ", "<#C010P8N1ABB|interns>"
# interactive playground: https://regex101.com/r/2Z7eun/2
MENTION_PATTERN = r"(?:^\s?<@(.*?)>\s?)|(?:^\s?<#(.*?)\|.*?>\s?)"
def get_set_element(_set):
"""get the element from the set to which the iterator points; returns an
arbitrary item
"""
for element in _set:
return element
def get_person_from_match(user_id, match):
"""given a Match, return the Person corresponding to the passed user ID
"""
if match.person_1.user_id == user_id:
return match.person_1
elif match.person_2.user_id == user_id:
return match.person_2
else:
raise Exception(f"Person with user ID \"{user_id}\" is not part of "
f"the passed match ({match}).")
def get_other_person_from_match(user_id, match):
"""given a Match, return the Person corresponding to the user who is NOT
the passed user ID (i.e. the other Person)
"""
if match.person_1.user_id == user_id:
return match.person_2
elif match.person_2.user_id == user_id:
return match.person_1
else:
raise Exception(f"Person with user ID \"{user_id}\" is not part of "
f"the passed match ({match}).")
def blockquote(message):
"""return `message` with markdown blockquote formatting (start each line
with "> ")
"""
if message:
return re.sub(r"^", "> ", message, flags=re.MULTILINE)
else:
return None
def get_mention(message):
"""get the user or channel ID mentioned at the beginning of a message, if
any
"""
match = re.search(MENTION_PATTERN, message)
if match:
# return the first not-None value in the match group tuple, be it a
# user or channel mention
# https://stackoverflow.com/a/18533669
return next(group for group in match.group(1, 2) if group is not None)
else:
return None
def remove_mention(message):
"""remove the user or channel mention from the beginning of a message, if
any
"""
return re.sub(MENTION_PATTERN, "", message, count=1)
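# Example results (sketch, following MENTION_PATTERN above):
#   get_mention("<@UJQ07L30Q> hello")     # -> "UJQ07L30Q"
#   remove_mention("<@UJQ07L30Q> hello")  # -> "hello"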
| 30.291667 | 78 | 0.643283 | 318 | 2,181 | 4.298742 | 0.279874 | 0.070227 | 0.043892 | 0.05267 | 0.433065 | 0.402341 | 0.383321 | 0.343819 | 0.340892 | 0.340892 | 0 | 0.020581 | 0.242549 | 2,181 | 71 | 79 | 30.71831 | 0.806901 | 0.371848 | 0 | 0.529412 | 0 | 0 | 0.13587 | 0.034161 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0.058824 | 0.029412 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
60d29eaedc8f1767e8af8cea16dd1cbc33c8db22 | 2,539 | py | Python | smaregipy/pos/customer_groups.py | shabaraba/SmaregiPy | 5447e5da1f21b38c0da1a759ee50b982de1522f7 | [
"MIT"
] | null | null | null | smaregipy/pos/customer_groups.py | shabaraba/SmaregiPy | 5447e5da1f21b38c0da1a759ee50b982de1522f7 | [
"MIT"
] | 1 | 2021-08-18T21:08:06.000Z | 2021-08-18T21:08:06.000Z | smaregipy/pos/customer_groups.py | shabaraba/SmaregiPy | 5447e5da1f21b38c0da1a759ee50b982de1522f7 | [
"MIT"
] | null | null | null | import datetime
from pydantic import Field
from typing import (
ClassVar,
List,
Dict,
Optional,
)
from smaregipy.base_api import (
BaseServiceRecordApi,
BaseServiceCollectionApi,
)
from smaregipy.utils import NoData, DictUtil
class CustomerGroup(BaseServiceRecordApi):
RECORD_NAME = 'customer_groups'
ID_PROPERTY_NAME: ClassVar[str] = 'customer_group_id'
REQUEST_EXCLUDE_KEY: ClassVar[List[str]] = ['customer_group_id']
customer_group_id: Optional[int] = Field(default_factory=NoData)
customer_group_section_id: Optional[int] = Field(default_factory=NoData)
label: Optional[str] = Field(default_factory=NoData)
display_flag: Optional[bool] = Field(default_factory=NoData)
display_sequence: Optional[int] = Field(default_factory=NoData)
ins_date_time: Optional[datetime.datetime] = Field(default_factory=NoData)
upd_date_time: Optional[datetime.datetime] = Field(default_factory=NoData)
class CustomerGroupCollection(BaseServiceCollectionApi[CustomerGroup]):
RECORD_NAME = 'customer_groups'
COLLECT_MODEL = CustomerGroup
WITH: ClassVar[List[str]] = []
class CustomerGroupSection(BaseServiceRecordApi):
RECORD_NAME = 'customer_group_sections'
ID_PROPERTY_NAME: ClassVar[str] = 'customer_group_section_id'
REQUEST_EXCLUDE_KEY: ClassVar[List[str]] = ['customer_group_section_id']
customer_group_section_id: Optional[int] = Field(default_factory=NoData)
customer_group_section_label: Optional[str] = Field(default_factory=NoData)
ins_date_time: Optional[datetime.datetime] = Field(default_factory=NoData)
upd_date_time: Optional[datetime.datetime] = Field(default_factory=NoData)
async def save(self: 'CustomerGroupSection') -> 'CustomerGroupSection':
"""
        Updates the customer group section.
        Overrides the save method because this endpoint uses a PUT request.
"""
uri = self._get_uri(self._path_params)
header = self._get_header()
response = self._api_put(uri, header, self.to_api_request_body())
response_data: Dict = DictUtil.convert_key_to_snake(response[self.Response.KEY_DATA])
response_model = self.__class__(**response_data)
self.copy_all_fields(response_model)
self.id(getattr(self, self.ID_PROPERTY_NAME))
self._status=self.DataStatus.SAVED
return self
class CustomerGroupSectionCollection(BaseServiceCollectionApi[CustomerGroupSection]):
RECORD_NAME = 'customer_group_sections'
COLLECT_MODEL = CustomerGroupSection
WITH: ClassVar[List[str]] = []
| 37.338235 | 93 | 0.747538 | 281 | 2,539 | 6.430605 | 0.27758 | 0.073049 | 0.115661 | 0.152186 | 0.453237 | 0.388489 | 0.382402 | 0.308799 | 0.308799 | 0.256779 | 0 | 0 | 0.159118 | 2,539 | 67 | 94 | 37.895522 | 0.84637 | 0 | 0 | 0.24 | 0 | 0 | 0.0814 | 0.039072 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.66 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
60d863fc0a0729ad229e95ea37c2cad1dd4545ab | 2,165 | py | Python | toontown/fishing/FishCollection.py | TheFamiliarScoot/open-toontown | 678313033174ea7d08e5c2823bd7b473701ff547 | [
"BSD-3-Clause"
] | 99 | 2019-11-02T22:25:00.000Z | 2022-02-03T03:48:00.000Z | toontown/fishing/FishCollection.py | TheFamiliarScoot/open-toontown | 678313033174ea7d08e5c2823bd7b473701ff547 | [
"BSD-3-Clause"
] | 42 | 2019-11-03T05:31:08.000Z | 2022-03-16T22:50:32.000Z | toontown/fishing/FishCollection.py | TheFamiliarScoot/open-toontown | 678313033174ea7d08e5c2823bd7b473701ff547 | [
"BSD-3-Clause"
] | 57 | 2019-11-03T07:47:37.000Z | 2022-03-22T00:41:49.000Z | from . import FishBase
from . import FishGlobals
class FishCollection:
def __init__(self):
self.fishList = []
def __len__(self):
return len(self.fishList)
def getFish(self):
return self.fishList
def makeFromNetLists(self, genusList, speciesList, weightList):
self.fishList = []
for genus, species, weight in zip(genusList, speciesList, weightList):
self.fishList.append(FishBase.FishBase(genus, species, weight))
def getNetLists(self):
genusList = []
speciesList = []
weightList = []
for fish in self.fishList:
genusList.append(fish.getGenus())
speciesList.append(fish.getSpecies())
weightList.append(fish.getWeight())
return [genusList, speciesList, weightList]
def hasFish(self, genus, species):
for fish in self.fishList:
if fish.getGenus() == genus and fish.getSpecies() == species:
return 1
return 0
def hasGenus(self, genus):
for fish in self.fishList:
if fish.getGenus() == genus:
return 1
return 0
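    # Compares `newFish` against the stored list: a known genus/species entry
    # is replaced by a heavier catch (when updateCollection is set), while an
    # unseen species is appended as a new entry.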
def __collect(self, newFish, updateCollection):
for fish in self.fishList:
if fish.getGenus() == newFish.getGenus() and fish.getSpecies() == newFish.getSpecies():
if fish.getWeight() < newFish.getWeight():
if updateCollection:
fish.setWeight(newFish.getWeight())
return FishGlobals.COLLECT_NEW_RECORD
else:
return FishGlobals.COLLECT_NO_UPDATE
if updateCollection:
self.fishList.append(newFish)
return FishGlobals.COLLECT_NEW_ENTRY
def collectFish(self, newFish):
return self.__collect(newFish, updateCollection=1)
def getCollectResult(self, newFish):
return self.__collect(newFish, updateCollection=0)
def __str__(self):
numFish = len(self.fishList)
txt = 'Fish Collection (%s fish):' % numFish
for fish in self.fishList:
txt += '\n' + str(fish)
return txt
| 30.069444 | 99 | 0.597229 | 214 | 2,165 | 5.929907 | 0.247664 | 0.113475 | 0.035461 | 0.051221 | 0.297084 | 0.171001 | 0.171001 | 0.090623 | 0.063042 | 0 | 0 | 0.004011 | 0.309007 | 2,165 | 71 | 100 | 30.492958 | 0.844251 | 0 | 0 | 0.240741 | 0 | 0 | 0.012933 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.203704 | false | 0 | 0.037037 | 0.074074 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60d8c3596ef6ebab1b92b84940522e5139154989 | 1,550 | py | Python | lead/strategies/strategy_base.py | M4gicT0/Distribute | af903cdf6ae271f4b1152007ea4ba3928af57936 | [
"MIT"
] | null | null | null | lead/strategies/strategy_base.py | M4gicT0/Distribute | af903cdf6ae271f4b1152007ea4ba3928af57936 | [
"MIT"
] | null | null | null | lead/strategies/strategy_base.py | M4gicT0/Distribute | af903cdf6ae271f4b1152007ea4ba3928af57936 | [
"MIT"
] | null | null | null | #! /usr/bin/env python
# -*- coding: utf-8 -*-
# vim:fenc=utf-8
#
#
# Distributed under terms of the MIT license.
"""
Strategy base class
"""
from abc import ABCMeta, abstractmethod
from tinydb import TinyDB, Query
from node import Node
import json
class Strategy(object):
def __init__(self, this_controller, this_description=None):
self.description = this_description
self.controller = this_controller
self.ledger = TinyDB("ledger.json")
self.db = TinyDB("nodes.json")
self.nodes = []
@abstractmethod
def store_file(self, file_bytes, file_name):
pass
@abstractmethod
def retrieve_file(self, file_name, locations):
pass
@abstractmethod
def get_time(self):
pass
def getNodes(self):
self.nodes = []
for item in self.db:
node = Node(item['mac'],item['ip'],item['port'],item['units'])
self.nodes.append(node)
return self.nodes
def getNodesWithFile(self,filename):
macs = self.ledger.search(Query().file_name == filename)
self.nodes = []
for item in macs:
mac = item["location"]
dbnode = self.db.get(Query().mac == mac)
            if dbnode is None:
continue
node = Node(dbnode['mac'],dbnode['ip'],dbnode['port'],dbnode['units'])
self.nodes.append(node)
return self.nodes
def getFileSize(self, filename):
        entry = self.ledger.get(Query().file_name == filename)
        return entry['size']
| 25.833333 | 82 | 0.60129 | 184 | 1,550 | 4.978261 | 0.369565 | 0.068777 | 0.026201 | 0.034935 | 0.131004 | 0.091703 | 0.091703 | 0.091703 | 0.091703 | 0 | 0 | 0.001771 | 0.271613 | 1,550 | 59 | 83 | 26.271186 | 0.809566 | 0.07871 | 0 | 0.325 | 0 | 0 | 0.04311 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.175 | false | 0.075 | 0.1 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
60d9b92e67e9782a6608f736efa327e02d317901 | 702 | py | Python | Easy/233/233.py | lw7360/dailyprogrammer | 7a0a4bdf20dd22ca96713958b479d963a15b09f5 | [
"MIT"
] | null | null | null | Easy/233/233.py | lw7360/dailyprogrammer | 7a0a4bdf20dd22ca96713958b479d963a15b09f5 | [
"MIT"
] | null | null | null | Easy/233/233.py | lw7360/dailyprogrammer | 7a0a4bdf20dd22ca96713958b479d963a15b09f5 | [
"MIT"
] | null | null | null | # https://www.reddit.com/r/dailyprogrammer/comments/3ltee2/20150921_challenge_233_easy_the_house_that_ascii/
import random
import sys
def main():
    data = open(sys.argv[1]).read().splitlines()[1:]  # drop the first line of the input
door = random.randrange(len(data[-1]))
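    # `door` picks a random column on the bottom row for a door, but this
    # version of the solution never actually places it below.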
wideData = []
for row in data:
curStr = ''
for ast in row:
if ast == '*':
curStr += '*****'
else:
curStr += ' '
wideData.append(curStr)
longData = []
for row in wideData:
longData.append(row[:])
longData.append(row[:])
longData.append(row[:])
for row in longData:
print row
if __name__ == "__main__":
main()
| 21.9375 | 108 | 0.541311 | 78 | 702 | 4.679487 | 0.538462 | 0.049315 | 0.065753 | 0.136986 | 0.139726 | 0.139726 | 0 | 0 | 0 | 0 | 0 | 0.033264 | 0.314815 | 702 | 31 | 109 | 22.645161 | 0.725572 | 0.150997 | 0 | 0.130435 | 0 | 0 | 0.031987 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.086957 | null | null | 0.043478 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60dac391ea8e7b79572392e5f5a6e6cef2504848 | 11,103 | py | Python | usr/callbacks/action/tools.py | uwitec/LEHome | a959a2fe64a23c58de7c0ff3254eae8c27732320 | [
"Apache-2.0"
] | 151 | 2015-01-25T10:25:29.000Z | 2022-03-15T10:04:09.000Z | usr/callbacks/action/tools.py | legendmohe/LEHome | a959a2fe64a23c58de7c0ff3254eae8c27732320 | [
"Apache-2.0"
] | null | null | null | usr/callbacks/action/tools.py | legendmohe/LEHome | a959a2fe64a23c58de7c0ff3254eae8c27732320 | [
"Apache-2.0"
] | 70 | 2015-02-02T02:35:48.000Z | 2021-05-13T09:51:08.000Z | #!/usr/bin/env python
# encoding: utf-8
from __future__ import division
from decimal import Decimal
import subprocess
import threading
import urllib2
import urllib
import httplib
import json
import re
import hashlib
import base64
# import zlib
from lib.command.runtime import UserInput
from lib.helper.CameraHelper import CameraHelper
from lib.sound import Sound
from util import Util
from util.Res import Res
from util.log import *
from lib.model import Callback
class timer_callback(Callback.Callback):
def callback(self, cmd, action, target, msg):
if msg is None:
self._home.publish_msg(cmd, u"时间格式错误")
return False, None
if msg.endswith(u'点') or \
msg.endswith(u'分'):
t = Util.gap_for_timestring(msg)
elif msg.endswith(u"秒"):
t = int(Util.cn2dig(msg[:-1]))
elif msg.endswith(u"分钟"):
t = int(Util.cn2dig(msg[:-2]))*60
elif msg.endswith(u"小时"):
t = int(Util.cn2dig(msg[:-2]))*60*60
else:
self._home.publish_msg(cmd, u"时间格式错误")
            return False, None
if t is None:
self._home.publish_msg(cmd, u"时间格式错误")
return False, None
DEBUG("thread wait for %d sec" % (t, ))
self._home.publish_msg(cmd, action + target + msg)
threading.current_thread().waitUtil(t)
if threading.current_thread().stopped():
return False
self._home.setResume(True)
count = 7
Sound.notice( Res.get_res_path("sound/com_bell"), True, count)
self._home.setResume(False)
return True
class translate_callback(Callback.Callback):
base_url = "http://fanyi.youdao.com/openapi.do"
def callback(self, cmd, msg):
if Util.empty_str(msg):
cancel_flag = u"取消"
finish_flag = u"完成"
self._home.publish_msg(
cmd
, u"请输入内容, 输入\"%s\"或\"%s\"结束:" % (finish_flag, cancel_flag)
, cmd_type="input"
)
msg = UserInput(self._home).waitForInput(
finish=finish_flag,
cancel=cancel_flag)
if msg is None:
self._home.publish_msg(cmd, u"无翻译内容")
elif len(msg) > 200:
self._home.publish_msg(cmd, u"翻译内容过长(<200字)")
else:
try:
values = {
"keyfrom":"11111testt111",
"key":"2125866912",
"type":"data",
"doctype":"json",
"version":"1.1",
"q":msg.encode("utf-8")
}
url = translate_callback.base_url + "?" + urllib.urlencode(values)
res = urllib2.urlopen(url).read()
res = " ".join(json.loads(res)["translation"])
self._home.publish_msg(cmd, u"翻译结果:\n" + res)
except Exception, ex:
ERROR("request error:", ex)
self._home.publish_msg(cmd, u"翻译失败")
return True
return True
class baidu_wiki_callback(Callback.Callback):
base_url = "http://wapbaike.baidu.com"
def searchWiki(self, word, time=10):
value = {"word": word.encode("utf-8")}
url = baidu_wiki_callback.base_url + \
"/search?" + urllib.urlencode(value)
try:
response = urllib2.urlopen(url, timeout=time)
html = response.read().encode("utf-8")
response.close()
real_url = None
content = None
m = re.compile(r"URL=(.+)'>").search(html)
if m:
real_url = m.group(1)
else:
return None, None
real_url = real_url[:real_url.index("?")]
if not real_url is None:
url = baidu_wiki_callback.base_url + real_url
response = urllib2.urlopen(url, timeout=time)
html = response.read()
response.close()
m = re.compile(
r'<p class="summary"><p>(.+)<div class="card-info">',
re.DOTALL
).search(html)
if m:
content = m.group(1)
return Util.strip_tags(content), url
else:
return None, None
except Exception, ex:
ERROR("wiki error: ", ex)
return None, None
def callback(self, cmd, msg):
if Util.empty_str(msg):
cancel_flag = u"取消"
finish_flag = u"完成"
self._home.publish_msg(
cmd
, u"请输入内容, 输入\"%s\"或\"%s\"结束:" % (finish_flag, cancel_flag)
, cmd_type="input"
)
msg = UserInput(self._home).waitForInput(
finish=finish_flag,
cancel=cancel_flag)
        if msg is not None:
self._home.publish_msg(cmd, u"正在搜索...")
res, url = self.searchWiki(msg)
if res is None:
self._home.publish_msg(cmd, u"无百科内容")
else:
res = res.decode("utf-8")
if len(res) > 140:
res = res[:140]
msg = u"百度百科:\n %s...\n%s" \
% (res, url)
self._home.publish_msg(cmd, msg)
else:
self._home.publish_msg(cmd, u"无百科内容")
return True
class cal_callback(Callback.Callback):
_ops = {
u'加':'+',
u'减':'-',
u'乘':'*',
u'除':'/',
u'+':'+',
u'-':'-',
u'*':'*',
u'/':'/',
u'(':'(',
u'(':'(',
u')':')',
u')':')',
}
def _parse_tokens(self, src):
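        # Split the input into number and operator tokens; operators may be
        # Chinese words (加/减/乘/除) or ASCII/fullwidth symbols (see _ops).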
tokens = []
cur_t = u''
for term in src:
if term in cal_callback._ops:
if cur_t != u'':
tokens.append(cur_t)
cur_t = u''
tokens.append(term)
else:
cur_t += term
if cur_t != u'':
tokens.append(cur_t)
return tokens
def _parse_expression(self, tokens):
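        # Rebuild an arithmetic expression string, converting Chinese
        # numerals with Util.cn2dig.  The eval() below only ever sees
        # digits and the whitelisted operators, so no arbitrary code runs.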
expression = u''
for token in tokens:
if token in cal_callback._ops:
expression += cal_callback._ops[token]
else:
num = Util.cn2dig(token)
if num is None:
return None
expression += str(num)
res = None
INFO("expression: " + expression)
try:
res = eval(expression)
res = Decimal.from_float(res).quantize(Decimal('0.00'))
except Exception, ex:
ERROR("cal expression error:", ex)
return res
def callback(self, cmd, msg):
if Util.empty_str(msg):
cancel_flag = u"取消"
finish_flag = u"完成"
self._home.publish_msg(
cmd
, u"请输入公式, 输入\"%s\"或\"%s\"结束:" % (finish_flag, cancel_flag)
, cmd_type="input"
)
msg = UserInput(self._home).waitForInput(
finish=finish_flag,
cancel=cancel_flag)
if msg is None:
self._home.publish_msg(cmd, u"无公式内容")
else:
tokens = self._parse_tokens(msg)
            if tokens is not None:
res = self._parse_expression(tokens)
                if res is not None:
self._home.publish_msg(cmd, u"%s = %s" % (msg, str(res)))
return True, res
else:
self._home.publish_msg(cmd, u"计算出错")
return True, None
else:
self._home.publish_msg(cmd, u"格式有误")
return True, None
class camera_quickshot_callback(Callback.Callback):
IMAGE_SERVER_URL = "http://lehome.sinaapp.com/image"
IMAGE_HOST_URL = "http://lehome-image.stor.sinaapp.com/"
def _upload_image(self, img_src, thumbnail_src):
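        # Upload the capture and its thumbnail with the OpenStack "swift"
        # CLI, then map the object names swift echoes back to public URLs.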
if img_src is None or len(img_src) == 0:
return None, None
INFO("uploading: %s %s" % (img_src, thumbnail_src))
# swift --insecure upload image data/capture/2015_05_23_001856.jpg
proc = subprocess.Popen(
[
"swift",
"--insecure",
"upload",
"image",
thumbnail_src,
img_src
],
stdout=subprocess.PIPE
)
read_img = None
read_thumbnail = None
        for i in range(2):
            try:
                data = proc.stdout.readline().strip()  # blocks until swift prints a line
INFO("swift readline: %s" % data)
if data.endswith(".thumbnail.jpg"):
INFO("save to storage:%s" % data)
read_thumbnail = camera_quickshot_callback.IMAGE_HOST_URL + data
elif data.endswith(".jpg"):
INFO("save to storage:%s" % data)
read_img = camera_quickshot_callback.IMAGE_HOST_URL + data
                if read_img is not None and read_thumbnail is not None:
return read_img, read_thumbnail
except (KeyboardInterrupt, SystemExit):
raise
except Exception, ex:
ERROR(ex)
break
return None, None
def callback(self, cmd, msg):
self._home.publish_msg(cmd, u"正在截图...")
Sound.notice(Res.get_res_path("sound/com_shoot"))
        save_path = "data/capture/"
save_name, thumbnail_name = CameraHelper().take_a_photo(save_path)
# for test
# save_name = "2015_05_02_164052.jpg"
if save_name is None:
self._home.publish_msg(cmd, u"截图失败")
INFO("capture faild.")
return True
img_url, thumbnail_url = self._upload_image(
save_path + save_name,
save_path + thumbnail_name,
)
if img_url is None:
self._home.publish_msg(cmd, u"截图失败")
INFO("upload capture faild.")
return True
else:
self._home.publish_msg(
cmd,
msg=img_url,
cmd_type="capture"
)
return True
class push_info_callback(Callback.Callback):
def callback(self, cmd, target, msg):
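        # With no explicit target, simply echo "msg" back to the client.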
if target is None or len(target) == 0:
if msg is None or len(msg) == 0:
self._home.publish_msg(cmd, u"请输入内容")
return True, None
self._home.publish_msg(cmd, msg)
DEBUG("show_callback: %s" % msg)
return True, msg
return True, "push"
| 33.442771 | 84 | 0.482482 | 1,213 | 11,103 | 4.252267 | 0.203627 | 0.04653 | 0.072703 | 0.087243 | 0.37786 | 0.367197 | 0.310586 | 0.246801 | 0.204731 | 0.158976 | 0 | 0.014601 | 0.407818 | 11,103 | 331 | 85 | 33.543807 | 0.769886 | 0.015311 | 0 | 0.337884 | 0 | 0 | 0.075599 | 0.002471 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.061433 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60db218e039d083d014a00cf31bcf97c6c95ef51 | 286 | py | Python | src/fix_code_1.py | Athenian-Computer-Science/numeric-operations-1-practice-template | cdfe370ced98c5e4c770bb89bb7faf0da31ea985 | [
"Apache-2.0"
] | null | null | null | src/fix_code_1.py | Athenian-Computer-Science/numeric-operations-1-practice-template | cdfe370ced98c5e4c770bb89bb7faf0da31ea985 | [
"Apache-2.0"
] | null | null | null | src/fix_code_1.py | Athenian-Computer-Science/numeric-operations-1-practice-template | cdfe370ced98c5e4c770bb89bb7faf0da31ea985 | [
"Apache-2.0"
] | null | null | null | #############################
# Collaborators: (enter people or resources who/that helped you)
# If none, write none
#
#
#############################
base = float(input('Enter the base: '))
height = float(input('Enter the height: '))
area = base * height / 2  # Calculate the area of the triangle
print("The area of the triangle is (area).") | 23.833333 | 64 | 0.548951 | 34 | 286 | 4.617647 | 0.647059 | 0.089172 | 0.11465 | 0.152866 | 0.254777 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.160839 | 286 | 12 | 65 | 23.833333 | 0.654167 | 0.409091 | 0 | 0 | 0 | 0 | 0.330189 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60def5370c10bc67986c4becccbe96488e2d1a66 | 3,924 | py | Python | venv/Lib/site-packages/pafy/g.py | DavidJohnKelly/YoutubeDownloader | 29cb4aa90946803474959f60d7b7e2f07c6e4de2 | [
"MIT"
] | 2 | 2021-10-05T03:09:57.000Z | 2022-01-12T09:50:31.000Z | venv/Lib/site-packages/pafy/g.py | DavidJohnKelly/YoutubeDownloader | 29cb4aa90946803474959f60d7b7e2f07c6e4de2 | [
"MIT"
] | null | null | null | venv/Lib/site-packages/pafy/g.py | DavidJohnKelly/YoutubeDownloader | 29cb4aa90946803474959f60d7b7e2f07c6e4de2 | [
"MIT"
] | 1 | 2021-05-23T17:10:20.000Z | 2021-05-23T17:10:20.000Z | import sys
if sys.version_info[:2] >= (3, 0):
# pylint: disable=E0611,F0401,I0011
from urllib.request import build_opener
else:
from urllib2 import build_opener
from . import __version__
urls = {
'gdata': "https://www.googleapis.com/youtube/v3/",
'watchv': "http://www.youtube.com/watch?v=%s",
'playlist': ('http://www.youtube.com/list_ajax?'
'style=json&action_get_list=1&list=%s'),
'thumb': "http://i.ytimg.com/vi/%s/default.jpg",
'bigthumb': "http://i.ytimg.com/vi/%s/mqdefault.jpg",
'bigthumbhd': "http://i.ytimg.com/vi/%s/hqdefault.jpg",
# For internal backend
'vidinfo': ('https://www.youtube.com/get_video_info?video_id=%s&'
'eurl=https://youtube.googleapis.com/v/%s&sts=%s'),
'embed': "https://youtube.com/embed/%s"
}
api_key = "AIzaSyCIM4EzNqi1in22f4Z3Ru3iYvLaY8tc3bo"
user_agent = "pafy " + __version__
lifespan = 60 * 60 * 5 # 5 hours
opener = build_opener()
opener.addheaders = [('User-Agent', user_agent)]
cache = {}
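# Default options handed through to the youtube-dl backend.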
def_ydl_opts = {'quiet': True, 'prefer_insecure': False, 'no_warnings': True}
# The following are specific to the internal backend
UEFSM = 'url_encoded_fmt_stream_map'
AF = 'adaptive_fmts'
jsplayer = r';ytplayer\.config\s*=\s*({.*?});'
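# Maps YouTube itag codes to (resolution or bitrate, container, stream type, note).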
itags = {
'5': ('320x240', 'flv', "normal", ''),
'17': ('176x144', '3gp', "normal", ''),
'18': ('640x360', 'mp4', "normal", ''),
'22': ('1280x720', 'mp4', "normal", ''),
'34': ('640x360', 'flv', "normal", ''),
'35': ('854x480', 'flv', "normal", ''),
'36': ('320x240', '3gp', "normal", ''),
'37': ('1920x1080', 'mp4', "normal", ''),
'38': ('4096x3072', 'mp4', "normal", '4:3 hi-res'),
'43': ('640x360', 'webm', "normal", ''),
'44': ('854x480', 'webm', "normal", ''),
'45': ('1280x720', 'webm', "normal", ''),
'46': ('1920x1080', 'webm', "normal", ''),
'82': ('640x360-3D', 'mp4', "normal", ''),
'83': ('640x480-3D', 'mp4', 'normal', ''),
'84': ('1280x720-3D', 'mp4', "normal", ''),
'100': ('640x360-3D', 'webm', "normal", ''),
'102': ('1280x720-3D', 'webm', "normal", ''),
'133': ('426x240', 'm4v', 'video', ''),
'134': ('640x360', 'm4v', 'video', ''),
'135': ('854x480', 'm4v', 'video', ''),
'136': ('1280x720', 'm4v', 'video', ''),
'137': ('1920x1080', 'm4v', 'video', ''),
'138': ('4096x3072', 'm4v', 'video', ''),
'139': ('48k', 'm4a', 'audio', ''),
'140': ('128k', 'm4a', 'audio', ''),
'141': ('256k', 'm4a', 'audio', ''),
'160': ('256x144', 'm4v', 'video', ''),
'167': ('640x480', 'webm', 'video', ''),
'168': ('854x480', 'webm', 'video', ''),
'169': ('1280x720', 'webm', 'video', ''),
'170': ('1920x1080', 'webm', 'video', ''),
'171': ('128k', 'ogg', 'audio', ''),
'172': ('192k', 'ogg', 'audio', ''),
'218': ('854x480', 'webm', 'video', 'VP8'),
'219': ('854x480', 'webm', 'video', 'VP8'),
'242': ('360x240', 'webm', 'video', 'VP9'),
'243': ('480x360', 'webm', 'video', 'VP9'),
'244': ('640x480', 'webm', 'video', 'VP9 low'),
'245': ('640x480', 'webm', 'video', 'VP9 med'),
'246': ('640x480', 'webm', 'video', 'VP9 high'),
'247': ('720x480', 'webm', 'video', 'VP9'),
'248': ('1920x1080', 'webm', 'video', 'VP9'),
'249': ('48k', 'opus', 'audio', 'Opus'),
'250': ('56k', 'opus', 'audio', 'Opus'),
'251': ('128k', 'opus', 'audio', 'Opus'),
'256': ('192k', 'm4a', 'audio', '6-channel'),
'258': ('320k', 'm4a', 'audio', '6-channel'),
'264': ('2560x1440', 'm4v', 'video', ''),
'266': ('3840x2160', 'm4v', 'video', 'AVC'),
'271': ('1920x1280', 'webm', 'video', 'VP9'),
'272': ('3414x1080', 'webm', 'video', 'VP9'),
'278': ('256x144', 'webm', 'video', 'VP9'),
'298': ('1280x720', 'm4v', 'video', '60fps'),
'299': ('1920x1080', 'm4v', 'video', '60fps'),
'302': ('1280x720', 'webm', 'video', 'VP9'),
'303': ('1920x1080', 'webm', 'video', 'VP9'),
}
| 41.305263 | 77 | 0.509429 | 438 | 3,924 | 4.495434 | 0.46347 | 0.082275 | 0.073134 | 0.019807 | 0.024378 | 0.024378 | 0 | 0 | 0 | 0 | 0 | 0.181139 | 0.181193 | 3,924 | 94 | 78 | 41.744681 | 0.431684 | 0.028797 | 0 | 0 | 0 | 0 | 0.464004 | 0.034945 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.045977 | 0 | 0.045977 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60e50a2d4456ade56ee11c29fccdf4c73a092641 | 5,590 | py | Python | sympy/polys/tests/test_monomialtools.py | ichuang/sympy | 18afbcc7df2ebf2280ea5a88fde8ece34182ae71 | [
"BSD-3-Clause"
] | 1 | 2015-10-06T21:51:51.000Z | 2015-10-06T21:51:51.000Z | sympy/polys/tests/test_monomialtools.py | ichuang/sympy | 18afbcc7df2ebf2280ea5a88fde8ece34182ae71 | [
"BSD-3-Clause"
] | null | null | null | sympy/polys/tests/test_monomialtools.py | ichuang/sympy | 18afbcc7df2ebf2280ea5a88fde8ece34182ae71 | [
"BSD-3-Clause"
] | null | null | null | """Tests for tools and arithmetics for monomials of distributed polynomials. """
from sympy.polys.monomialtools import (
monomials, monomial_count,
monomial_key, lex, grlex, grevlex,
monomial_mul, monomial_div,
monomial_gcd, monomial_lcm,
monomial_max, monomial_min,
monomial_divides,
Monomial,
InverseOrder, ProductOrder
)
from sympy.polys.polyerrors import ExactQuotientFailed
from sympy.abc import a, b, c, x, y, z
from sympy.utilities.pytest import raises
def test_monomials():
assert sorted(monomials([], 0)) == [1]
assert sorted(monomials([], 1)) == [1]
assert sorted(monomials([], 2)) == [1]
assert sorted(monomials([], 3)) == [1]
assert sorted(monomials([x], 0)) == [1]
assert sorted(monomials([x], 1)) == [1, x]
assert sorted(monomials([x], 2)) == [1, x, x**2]
assert sorted(monomials([x], 3)) == [1, x, x**2, x**3]
assert sorted(monomials([x, y], 0)) == [1]
assert sorted(monomials([x, y], 1)) == [1, x, y]
assert sorted(monomials([x, y], 2)) == [1, x, y, x**2, y**2, x*y]
assert sorted(monomials([x, y], 3)) == [1, x, y, x**2, x**3, y**2, y**3, x*y, x*y**2, y*x**2]
def test_monomial_count():
assert monomial_count(2, 2) == 6
assert monomial_count(2, 3) == 10
def test_lex_order():
assert lex((1,2,3)) == (1,2,3)
assert str(lex) == 'lex'
assert lex((1,2,3)) == lex((1,2,3))
assert lex((2,2,3)) > lex((1,2,3))
assert lex((1,3,3)) > lex((1,2,3))
assert lex((1,2,4)) > lex((1,2,3))
assert lex((0,2,3)) < lex((1,2,3))
assert lex((1,1,3)) < lex((1,2,3))
assert lex((1,2,2)) < lex((1,2,3))
def test_grlex_order():
assert grlex((1,2,3)) == (6, (1,2,3))
assert str(grlex) == 'grlex'
assert grlex((1,2,3)) == grlex((1,2,3))
assert grlex((2,2,3)) > grlex((1,2,3))
assert grlex((1,3,3)) > grlex((1,2,3))
assert grlex((1,2,4)) > grlex((1,2,3))
assert grlex((0,2,3)) < grlex((1,2,3))
assert grlex((1,1,3)) < grlex((1,2,3))
assert grlex((1,2,2)) < grlex((1,2,3))
assert grlex((2,2,3)) > grlex((1,2,4))
assert grlex((1,3,3)) > grlex((1,2,4))
assert grlex((0,2,3)) < grlex((1,2,2))
assert grlex((1,1,3)) < grlex((1,2,2))
def test_grevlex_order():
assert grevlex((1,2,3)) == (6, (-3,-2,-1))
assert str(grevlex) == 'grevlex'
assert grevlex((1,2,3)) == grevlex((1,2,3))
assert grevlex((2,2,3)) > grevlex((1,2,3))
assert grevlex((1,3,3)) > grevlex((1,2,3))
assert grevlex((1,2,4)) > grevlex((1,2,3))
assert grevlex((0,2,3)) < grevlex((1,2,3))
assert grevlex((1,1,3)) < grevlex((1,2,3))
assert grevlex((1,2,2)) < grevlex((1,2,3))
assert grevlex((2,2,3)) > grevlex((1,2,4))
assert grevlex((1,3,3)) > grevlex((1,2,4))
assert grevlex((0,2,3)) < grevlex((1,2,2))
assert grevlex((1,1,3)) < grevlex((1,2,2))
assert grevlex((0,1,1)) > grevlex((0,0,2))
assert grevlex((0,3,1)) < grevlex((2,2,1))
def test_InverseOrder():
ilex = InverseOrder(lex)
igrlex = InverseOrder(grlex)
assert ilex((1,2,3)) > ilex((2, 0, 3))
assert igrlex((1, 2, 3)) < igrlex((0, 2, 3))
assert str(ilex) == "ilex"
assert str(igrlex) == "igrlex"
def test_ProductOrder():
P = ProductOrder((grlex, lambda m: m[:2]), (grlex, lambda m: m[2:]))
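    # P compares the first two exponents grlex-first, then the remaining ones.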
assert P((1, 3, 3, 4, 5)) > P((2, 1, 5, 5, 5))
assert str(P) == "ProductOrder(grlex, grlex)"
def test_monomial_key():
assert monomial_key() == lex
assert monomial_key('lex') == lex
assert monomial_key('grlex') == grlex
assert monomial_key('grevlex') == grevlex
raises(ValueError, "monomial_key('foo')")
raises(ValueError, "monomial_key(1)")
def test_monomial_mul():
assert monomial_mul((3,4,1), (1,2,0)) == (4,6,1)
def test_monomial_div():
assert monomial_div((3,4,1), (1,2,0)) == (2,2,1)
def test_monomial_gcd():
assert monomial_gcd((3,4,1), (1,2,0)) == (1,2,0)
def test_monomial_lcm():
assert monomial_lcm((3,4,1), (1,2,0)) == (3,4,1)
def test_monomial_max():
assert monomial_max((3,4,5), (0,5,1), (6,3,9)) == (6,5,9)
def test_monomial_min():
assert monomial_min((3,4,5), (0,5,1), (6,3,9)) == (0,3,1)
def test_monomial_divides():
assert monomial_divides((1,2,3), (4,5,6)) is True
assert monomial_divides((1,2,3), (0,5,6)) is False
def test_Monomial():
m = Monomial((3, 4, 1), (x, y, z))
n = Monomial((1, 2, 0), (x, y, z))
assert m.as_expr() == x**3*y**4*z
assert n.as_expr() == x**1*y**2
assert m.as_expr(a, b, c) == a**3*b**4*c
assert n.as_expr(a, b, c) == a**1*b**2
assert m.exponents == (3, 4, 1)
assert m.gens == (x, y, z)
assert n.exponents == (1, 2, 0)
assert n.gens == (x, y, z)
assert m == (3, 4, 1)
assert n != (3, 4, 1)
assert m != (1, 2, 0)
assert n == (1, 2, 0)
assert m[0] == m[-3] == 3
assert m[1] == m[-2] == 4
assert m[2] == m[-1] == 1
assert n[0] == n[-3] == 1
assert n[1] == n[-2] == 2
assert n[2] == n[-1] == 0
assert m[:2] == (3, 4)
assert n[:2] == (1, 2)
assert m*n == Monomial((4, 6, 1))
assert m/n == Monomial((2, 2, 1))
assert m*(1, 2, 0) == Monomial((4, 6, 1))
assert m/(1, 2, 0) == Monomial((2, 2, 1))
assert m.gcd(n) == Monomial((1, 2, 0))
assert m.lcm(n) == Monomial((3, 4, 1))
assert m.gcd((1, 2, 0)) == Monomial((1, 2, 0))
assert m.lcm((1, 2, 0)) == Monomial((3, 4, 1))
assert m**0 == Monomial((0, 0, 0))
assert m**1 == m
assert m**2 == Monomial((6, 8, 2))
assert m**3 == Monomial((9,12, 3))
raises(ExactQuotientFailed, "m/Monomial((5, 2, 0))")
| 29.114583 | 97 | 0.549911 | 998 | 5,590 | 3.023046 | 0.073146 | 0.041763 | 0.032814 | 0.065628 | 0.433875 | 0.327478 | 0.241299 | 0.191913 | 0.129267 | 0.042426 | 0 | 0.102831 | 0.210197 | 5,590 | 191 | 98 | 29.267016 | 0.580521 | 0.013059 | 0 | 0 | 0 | 0 | 0.02196 | 0 | 0 | 0 | 0 | 0 | 0.731884 | 1 | 0.115942 | false | 0 | 0.028986 | 0 | 0.144928 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
60e763dc935a4e4a40e12663f5266d53075d9cc0 | 732 | py | Python | utils/arrival_overlaps.py | davmre/sigvisa | 91a1f163b8f3a258dfb78d88a07f2a11da41bd04 | [
"BSD-3-Clause"
] | null | null | null | utils/arrival_overlaps.py | davmre/sigvisa | 91a1f163b8f3a258dfb78d88a07f2a11da41bd04 | [
"BSD-3-Clause"
] | null | null | null | utils/arrival_overlaps.py | davmre/sigvisa | 91a1f163b8f3a258dfb78d88a07f2a11da41bd04 | [
"BSD-3-Clause"
] | null | null | null | import sigvisa.database.db
from sigvisa.database.dataset import *
import sigvisa.utils.geog
cursor = database.db.connect().cursor()
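# 1237680000 + 168 * 3600 spans one week (168 hours) of LEB arrivals.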
detections, arid2num = read_detections(cursor, 1237680000, 1237680000 + 168 * 3600, arrival_table="leb_arrival", noarrays=False)
last_det = dict()
overlaps = 0
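# Count consecutive detections at the same site arriving less than 5 s apart.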
for det in detections:
site = det[0]
time = det[2]
if site in last_det:
gap = time - last_det[site]
if gap < 5:
print " arrival %d at siteid %d occured %f seconds after previous at %f : phase %s" % (det[1], site, gap, last_det[site], det[DET_PHASE_COL])
overlaps = overlaps + 1
last_det[site] = time
print "total overlaps: ", overlaps, " out of ", len(detections), " detections"
| 31.826087 | 153 | 0.669399 | 103 | 732 | 4.660194 | 0.495146 | 0.072917 | 0.06875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.059028 | 0.213115 | 732 | 22 | 154 | 33.272727 | 0.774306 | 0 | 0 | 0 | 0 | 0 | 0.165301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.176471 | null | null | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |