hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ba13957d1dfe8a9dad60a0b98fdc76bc2c051de1 | 2,133 | py | Python | zignet/api.py | Rabbit-Development/Zignet | 341ac2a5e411c385898a060bb41a825e1142625e | [
"MIT"
] | null | null | null | zignet/api.py | Rabbit-Development/Zignet | 341ac2a5e411c385898a060bb41a825e1142625e | [
"MIT"
] | null | null | null | zignet/api.py | Rabbit-Development/Zignet | 341ac2a5e411c385898a060bb41a825e1142625e | [
"MIT"
] | null | null | null | from zignet import controller, db
from zignet.models import *
from flask import request, abort, make_response


@controller.route('/login', methods=['POST'])
def login():
    # Flask's request object has no .get(); POST form fields live in request.form
    username = request.form.get('username')
    password = request.form.get('password')
    rfid = request.form.get('Rfid')
    pincode = request.form.get('pincode')
    if username and password:
        user = db.session.query(User).filter(User.username == username).first()
        if user is not None and user.verify_password(password):
            return make_response('true')
        abort(401)
    elif rfid and pincode:
        user = db.session.query(User).filter(User.Rfid == rfid).first()
        if user is not None and user.verify_pincode(pincode):
            return make_response('true')
        abort(401)
    else:
        abort(400)


@controller.route('/create_user', methods=['POST'])
def create_user():
    username = request.form.get('username')
    password = request.form.get('password')
    email = request.form.get('email')
    user = User(email=email, password=password, username=username)
    db.session.add(user)
    db.session.commit()
    return make_response('user created')


@controller.route('/update_user', methods=['POST'])
def update_user():
    username = request.form.get('username')
    password = request.form.get('password')
    email = request.form.get('email')
    pincode = request.form.get('pincode')
    first_name = request.form.get('first_name')
    last_name = request.form.get('last_name')
    phone = request.form.get('phone')
    address = request.form.get('address')
    zip_code = request.form.get('zip_code')
    country = request.form.get('country')
    user = db.session.query(User).filter(User.username == username).first()
    if user is None or not user.verify_password(password):
        abort(401)
    if email:
        user.email = email
    if username:
        user.username = username
    if password:
        user.password = password
    if pincode:
        user.pincode = pincode
    if first_name:
        user.first_name = first_name
    if last_name:
        user.last_name = last_name
    if phone:
        user.phone = phone
    if address:
        user.address = address
    if zip_code:
        user.zip_code = zip_code
    if country:
        user.country = country
    # persist the field updates
    db.session.commit()
    return make_response('user updated')
| 22.935484 | 73 | 0.69339 | 286 | 2,133 | 5.076923 | 0.174825 | 0.11708 | 0.035813 | 0.053719 | 0.320937 | 0.320937 | 0.320937 | 0.252066 | 0.216253 | 0.216253 | 0 | 0.006696 | 0.159869 | 2,133 | 92 | 74 | 23.184783 | 0.803571 | 0 | 0 | 0.333333 | 0 | 0 | 0.091976 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0.130435 | 0.043478 | 0 | 0.144928 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
ba16fa59961b765da18297dbd32815b00e0a3cbc | 3,628 | py | Python | SudaAutoLogin.py | a386881/- | 7ff1d2725d4e9691f4aabd46c928ed1a554375d2 | [
"Apache-2.0"
] | 1 | 2018-03-28T16:35:46.000Z | 2018-03-28T16:35:46.000Z | SudaAutoLogin.py | a386881/- | 7ff1d2725d4e9691f4aabd46c928ed1a554375d2 | [
"Apache-2.0"
] | null | null | null | SudaAutoLogin.py | a386881/- | 7ff1d2725d4e9691f4aabd46c928ed1a554375d2 | [
"Apache-2.0"
] | null | null | null | # -*- coding:utf-8 -*-
VERSION = r'2016/04/02'

# Configuration
URL = "http://wg.suda.edu.cn/indexn.aspx"
STUDENT_ID = 'XXXXXXXXXXXXXXXXXXXX'
STUDENT_PASSWORD = 'XXXXXXXXXXXXXXXXXXXX'

'''
Created on 2016-04-02

@author: XenoAmess
'''
import urllib.parse
import urllib.request
import time


def get_html(url):
    '''Fetch the page and return its HTML as a string.'''
    html_byte = urllib.request.urlopen(url).read()
    html_str = str(html_byte, "utf-8")
    return html_str


def txt_wrap_by(start_str, end_str, html_str):
    '''Return the substring of html_str enclosed by start_str and end_str.'''
    start = html_str.find(start_str)
    if start >= 0:
        start += len(start_str)
        end = html_str.find(end_str, start)
        if end >= 0:
            return html_str[start:end].strip()


def read_html_for_money(html_str):
    '''Parse the fetched HTML and return the account balance.'''
    # These markers must match the gateway page verbatim, so they stay untranslated.
    reg__L = r"</font><br/><br/><font color='#000'>您的帐户余额是<font color='#ff0000'><b>"
    reg__R = r"</b></font>元。</font><br><br>"
    money = txt_wrap_by(reg__L, reg__R, html_str)
    return money


def read_html_for_keys(html_str):
    '''Parse the fetched HTML and return the two ASP.NET state tokens.'''
    reg__VIEWSTATE = r'<input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="'
    reg__EVENTVALIDATION = r'<input type="hidden" name="__EVENTVALIDATION" id="__EVENTVALIDATION" value="'
    reg__right = r'" />'
    viewstate = txt_wrap_by(reg__VIEWSTATE, reg__right, html_str)
    eventvalidation = txt_wrap_by(reg__EVENTVALIDATION, reg__right, html_str)
    return viewstate, eventvalidation


# file_input = open('data.txt')
# html_str = file_input.read()
# file_input.close()
# read_html(html_str)


def reget_html(url, string_student_id, string_student_password, VIEWSTATE, EVENTVALIDATION):
    '''POST the login form back to the gateway.'''
    # The form values must match what the gateway expects, so they stay untranslated.
    search = urllib.parse.urlencode([
        ('__EVENTTARGET', ''),
        ('__EVENTARGUMENT', ''),
        ('__VIEWSTATE', VIEWSTATE),
        ('__EVENTVALIDATION', EVENTVALIDATION),
        ('TextBox1', string_student_id),
        ('TextBox2', string_student_password),
        ('nw', 'RadioButton2'),
        ('tm', 'RadioButton8'),
        ('Button1', '登录网关'),
    ])
    search = bytes(search, encoding="utf8")
    html_byte = urllib.request.urlopen(url, search).read()
    html_str = str(html_byte, "utf-8")
    return html_str


def output_to_file(file_name_str, file_str):
    file_output = open(file_name_str, 'w')
    file_output.write(file_str)
    file_output.close()


def main():
    html_str = get_html(URL)
    output_to_file('before.html', html_str)
    VIEWSTATE, EVENTVALIDATION = read_html_for_keys(html_str)
    html_str = reget_html(URL, STUDENT_ID, STUDENT_PASSWORD, VIEWSTATE, EVENTVALIDATION)
    output_to_file('after.html', html_str)
    print(time.strftime("%c"))
    money = read_html_for_money(html_str)
    if money is not None:
        print('Login successful. Your account balance is ' + money + ' yuan.')
    else:
        print('Please check whether you are already logged in. If there is still no network connection, this program no longer works.')


def runforever():
    '''Log in automatically every ten minutes.'''
    while True:
        main()
        time.sleep(10 * 60)


if __name__ == "__main__":
    print('Soochow University Gateway Auto-Login')
    print()
    print('Version:')
    print(VERSION)
    print()
    print('User ID:')
    print(STUDENT_ID)
    print()
    print('User password:')
    print(STUDENT_PASSWORD)
    print()
    print('Gateway URL:')
    print(URL)
    print()
    print('Author:')
    print('XenoAmess')
    print()
    print('Notes:')
    print(r'The gateway seems to drop connections frequently, so this program logs in again every 10 minutes. Just minimize it and leave it running. Whichever button you pick, the session ends after 2 hours, but the simulated request still selects the 10-hour button. If you do not need repeated logins, edit the source and change runforever() on the next line to main().')
    runforever()
| 29.495935 | 146 | 0.628997 | 427 | 3,628 | 5.025761 | 0.334895 | 0.0685 | 0.016775 | 0.016775 | 0.128611 | 0.109972 | 0.039143 | 0.039143 | 0.039143 | 0.039143 | 0 | 0.014963 | 0.226295 | 3,628 | 122 | 147 | 29.737705 | 0.749555 | 0.06753 | 0 | 0.114943 | 0 | 0.011494 | 0.219635 | 0.08905 | 0 | 0 | 0 | 0 | 0 | 1 | 0.091954 | false | 0.057471 | 0.034483 | 0 | 0.183908 | 0.252874 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
ba174b62313fc6bd666f2be4dd49654e375a89a1 | 8,797 | py | Python | src/dev/gym-tensegrity/gym_tensegrity/envs/jumper_test.py | hany606/Tensegrity_Robotics | 60b34ded95fb641842d46add450e398149d7ec92 | [
"Apache-2.0"
] | null | null | null | src/dev/gym-tensegrity/gym_tensegrity/envs/jumper_test.py | hany606/Tensegrity_Robotics | 60b34ded95fb641842d46add450e398149d7ec92 | [
"Apache-2.0"
] | null | null | null | src/dev/gym-tensegrity/gym_tensegrity/envs/jumper_test.py | hany606/Tensegrity_Robotics | 60b34ded95fb641842d46add450e398149d7ec92 | [
"Apache-2.0"
] | null | null | null | import gym
import gym_tensegrity
import numpy as np
import os
from time import sleep


# Discrete action space functions testing
def main(port_num=10042):
    def print_observation(obs):
        print("Observations {:}".format(obs))

    env = gym.make('gym_tensegrity:jumper-v0')
    # action = randint(0,15)
    action = 14
    # print("Action: {:}".format(action))
    init_obs, _, _, _ = env.step(action)
    # print_observation(init_obs)
    # print(env.env.actions_json)
    # print("")
    # input("-> check point: WAIT for INPUT !!!!")

    # for i in range(50):
    #     input("-> check point: WAIT for INPUT !!!!")
    #     observation, reward, done, _ = env.step(action)
    #     print_observation(observation)
    #     print("Done:???:{:}".format(done))
    #     input("-> check point: WAIT for INPUT !!!!")

    # for i in range(1, 1001):
    #     action = env.action_space.sample()
    #     # action = 2
    #     input("-> check point: WAIT for INPUT !!!!")
    #     print("--------------- ({:}) ---------------".format(i))
    #     print("######\nAction: {:}\n######".format(action))
    #     observation, reward, done, _ = env.step(action)
    #     print_observation(observation)
    #     print("Done:???:{:}".format(done))
    #     input("-> check point: WAIT for INPUT !!!!")

    # for i in range(50):
    #     observation, reward, done, _ = env.step(2)
    #     input("-> check point: WAIT for INPUT !!!!")

    flag = 0
    # i = 0
    while True:
        # i += 1
        # print(i)
        # if(i > 100):
        #     i = 0
        #     env.reset()
        inp = "d"
        # inp = input("~~~~~~input: ")
        if inp == "w":
            flag = 1
        elif inp == "s":
            flag = -1
        elif inp == "d":
            flag = 0
        if flag <= 0:
            observation, reward, done, _ = env.step(4)
            observation, reward, done, _ = env.step(5)
        if flag >= 0:
            observation, reward, done, _ = env.step(12)
            observation, reward, done, _ = env.step(13)
        print(observation)
        print("angle:{:}".format(observation[-1] * 180 / np.pi))


def forked_process_main():
    port_num_base = 10042
    num_threads = 2
    for i in range(num_threads):
        pid = os.fork()
        print("fork {:}".format(pid))
        if pid > 0:
            print("Child: {:} -> on port: {:}".format(pid, port_num_base + i))
            config = {"port_num": port_num_base + i}
            main(config)


def threaded_main():
    import threading
    port_num_base = 10042
    num_threads = 10
    threads_list = []
    for i in range(num_threads):
        config = {"port_num": port_num_base + i}
        threads_list.append(threading.Thread(target=main, args=(config,)))
    for i in range(num_threads):
        threads_list[i].start()


# Continuous action space for lengths function testing
def main_cont_lengths(port_num=10042):
    def print_observation(obs):
        print("Observations {:}".format(obs))

    env = gym.make('gym_tensegrity:jumper-v0')
    # action = randint(0,15)
    action = [7.95 for i in range(8)]
    # action[0] = 5
    print("Action: {:}".format(action))
    # input("-> check point: WAIT for INPUT !!!!")
    init_obs, _, _, _ = env.step(action)
    print_observation(init_obs)
    # print(env.env.actions_json)
    # print("")
    # input("-> check point: WAIT for INPUT !!!!")
    flag = 0
    # i = 0
    while True:
        observation, reward, done, _ = env.step(init_obs[:-1])
        print(observation)
        print("angle:{:}".format(observation[-1] * 180 / np.pi))


# Continuous action space for delta lengths function testing
def main_cont_dlengths(config):
    def print_observation(obs):
        print("Observations {:}".format(obs))

    tot_reward = 0
    env = gym.make('gym_tensegrity:jumper-v0', config=config)
    # action = randint(0,15)
    action = np.array([0. for i in range(8)])
    # action[0] = 1.7
    print("Action: {:}".format(action))
    # input("-> check point: WAIT for INPUT !!!!")
    init_obs, tot_reward, done, _ = env.step(action)
    print_observation(init_obs)
    action[0] = 0
    # print(env.env.actions_json)
    # print("")
    input("-> check point: WAIT for INPUT !!!!")
    while not done:
        action = env.action_space.sample()
        observation, reward, done, _ = env.step(action)
        tot_reward += reward
        print("Action: {:}".format(action))
        # input("-> check point: WAIT for INPUT !!!!")
        print("Reward: {:}, Done: {:}".format(reward, done))
        print("Time: {:}".format(env.env.getTime()))
        print_observation(observation)
        print("angle:{:}".format(env.env.getLegAngle() * 180 / np.pi))
        print("Total Reward: {:}".format(tot_reward))
        # sleep(0.01)
    input("-> check point: WAIT for INPUT !!!!")
    flag = 0  # initialize before the keyboard loop below reads it
    while True:
        inp = "d"
        inp = input("~~~~~~input: ")
        # action = env.action_space.sample()
        # observation, reward, done, _ = env.step(action)
        if inp == "w":
            flag = 1
        elif inp == "s":
            flag = -1
        elif inp == "d":
            flag = 0
        if flag < 0:
            action[0] = -0.1
            observation, reward, done, _ = env.step(action)
            # action[0] = 0
            # observation, reward, done, _ = env.step(action)
        if flag > 0:
            action[0] = 0.1
            observation, reward, done, _ = env.step(action)
            # action[0] = 0
            # observation, reward, done, _ = env.step(action)
        if flag == 0:
            action[0] = 0
            observation, reward, done, _ = env.step(action)
        print(observation)
        print("angle:{:}".format(env.env.getLegAngle() * 180 / np.pi))


def test(config=None):
    def print_observation(obs):
        # This printing is for the default observation
        print("Observations: ")
        for i in range(6):
            print("#{:} End point: {:}".format(i + 1, [obs[3 * i:3 * (i + 1)]]))
        print("---")
        for i in range(6):
            print("#{:} End point velocity: {:}".format(i + 1, [obs[3 * (i + 6):3 * (i + 1 + 6)]]))
        print("Leg angle:{:}".format(env.env.getLegAngle() * 180 / np.pi))
        squre_sides_angles = env.env.getSquareSidesAngles()
        print("Square side angle1:{:}".format(squre_sides_angles[0] * 180 / np.pi))
        print("Square side angle2:{:}".format(squre_sides_angles[1] * 180 / np.pi))
        print("----------------------------------")

    if config is not None:
        env = gym.make('gym_tensegrity:jumper-v0', config=config)
    if config is None:
        env = gym.make('gym_tensegrity:jumper-v0')
    observation = env.reset()
    print_observation(observation)
    tot_reward = 0
    action = np.array([0. for i in range(8)])
    done = False
    input("-> check point: WAIT for INPUT !!!!")
    while not done:
        # inp = input("INPUT")
        # action = env.action_space.sample()
        print("Action: {:}".format(action))
        observation, reward, done, _ = env.step(action)
        tot_reward += reward
        print("Reward: {:}, Done: {:}".format(reward, done))
        print("Time: {:}".format(env.env.getTime()))
        print_observation(observation)
        print("angle:{:}".format(env.env.getLegAngle() * 180 / np.pi))
        print("Total Reward: {:}".format(tot_reward))
        # input("-> check point: WAIT for INPUT !!!!")
        # sleep(0.01)
    input("-> check point: WAIT for INPUT !!!!")
    flag = 0
    while True:
        inp = 'd'
        # inp = input("~~~~~~input: ")
        # action = env.action_space.sample()
        # observation, reward, done, _ = env.step(action)
        if inp == "w":
            flag = 1
        elif inp == "s":
            flag = -1
        elif inp == "d":
            flag = 0
        if flag < 0:
            action[0] = -0.1
        if flag > 0:
            action[0] = 0.1
        if flag == 0:
            action[0] = 0
        observation, reward, done, _ = env.step(action)
        print(action)
        print_observation(observation)


if __name__ == "__main__":
    # test({'starting_coordinates':(0,10,0), "max_num_steps":1000, "starting_angle":(1.0001*np.pi/180,0)})
    # test({'starting_coordinates':(0,100,0), "max_num_steps":10000, "starting_angle":(0,0), "starting_leg_angle": (0,0), "randomized_starting": False})
    # test({"max_num_steps":10000, "randomized_starting": {"angle":[False], "height":[True, 10,100]}})
    # test({'starting_coordinates':[0,10,0], "max_num_steps":10000, "randomized_starting": {"angle":[[True, True], [4,1],[10,3]], "height":[False]}})
    test({'starting_coordinates': [0, 10, 0], "max_num_steps": 10000, 'starting_leg_angle': [1, 2],
          'observation_noise': {"uncorrelated": {"mean": 0, "stdev": 1}, "correlated": {"mean": 0, "stdev": 1}}})
| 33.32197 | 152 | 0.545982 | 1,070 | 8,797 | 4.364486 | 0.13271 | 0.049251 | 0.052891 | 0.069165 | 0.752677 | 0.71606 | 0.653105 | 0.606424 | 0.541542 | 0.477944 | 0 | 0.037296 | 0.268501 | 8,797 | 263 | 153 | 33.448669 | 0.688423 | 0.282824 | 0 | 0.66879 | 0 | 0 | 0.134822 | 0.024747 | 0 | 0 | 0 | 0 | 0 | 1 | 0.063694 | false | 0 | 0.038217 | 0 | 0.101911 | 0.267516 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ba25057c6b3ececb19513f93d3dbe767448d9d9b | 458 | py | Python | Python Fundamentals/Regular Expressions/More Exercise/Task02_03.py | IvanTodorovBG/SoftUni | 7b667f6905d9f695ab1484efbb02b6715f6d569e | [
"MIT"
] | 1 | 2022-03-16T10:23:04.000Z | 2022-03-16T10:23:04.000Z | Python Fundamentals/Regular Expressions/More Exercise/Task02_03.py | IvanTodorovBG/SoftUni | 7b667f6905d9f695ab1484efbb02b6715f6d569e | [
"MIT"
] | null | null | null | Python Fundamentals/Regular Expressions/More Exercise/Task02_03.py | IvanTodorovBG/SoftUni | 7b667f6905d9f695ab1484efbb02b6715f6d569e | [
"MIT"
] | null | null | null | import re

data = input()
pattern = r"%([A-Z][a-z]+)%([^|$%.]+)?<(\w+)>([^|$%.]+)?\|(\d+)\|([^|$%.0-9]+)?([0-9]+(\.[0-9]+)?)\$"
total_income = 0
while data != "end of shift":
    for match in re.finditer(pattern, data):
        print(f"{match.group(1)}: {match.group(3)} - {int(match.group(5)) * float(match.group(7)):.2f}")
        total_income += int(match.group(5)) * float(match.group(7))
    data = input()
print(f"Total income: {total_income:.2f}")
| 30.533333 | 104 | 0.528384 | 69 | 458 | 3.463768 | 0.463768 | 0.251046 | 0.025105 | 0.033473 | 0.251046 | 0.251046 | 0.251046 | 0.251046 | 0 | 0 | 0 | 0.038363 | 0.146288 | 458 | 14 | 105 | 32.714286 | 0.57289 | 0 | 0 | 0.2 | 0 | 0.2 | 0.475983 | 0.248908 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.1 | 0.2 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ba26e1fd3bf707b269e72c095f6fc249a64b024e | 319 | py | Python | sky/migrations/0008_remove_news_label.py | eethan1/IMnight2018_Backend | 39780f737e57763fdfb171c4687a375d3c5a4bb0 | [
"Apache-2.0"
] | null | null | null | sky/migrations/0008_remove_news_label.py | eethan1/IMnight2018_Backend | 39780f737e57763fdfb171c4687a375d3c5a4bb0 | [
"Apache-2.0"
] | null | null | null | sky/migrations/0008_remove_news_label.py | eethan1/IMnight2018_Backend | 39780f737e57763fdfb171c4687a375d3c5a4bb0 | [
"Apache-2.0"
] | 4 | 2018-01-27T06:01:41.000Z | 2018-02-21T12:18:35.000Z | # Generated by Django 2.0 on 2018-02-24 11:21
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('sky', '0007_auto_20180224_1120'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='news',
            name='label',
        ),
    ]
| 17.722222 | 45 | 0.583072 | 34 | 319 | 5.352941 | 0.852941 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 0.304075 | 319 | 17 | 46 | 18.764706 | 0.684685 | 0.134796 | 0 | 0 | 1 | 0 | 0.127737 | 0.083942 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ba2ae5c00bc2049ab803532fa3e9a36db8f45a24 | 288 | py | Python | authenticationApp/EmailHandler.py | George-Okumu/IReporter-Django | 5962984ce0069cdf048dbf91686377568a7cf55b | [
"MIT"
] | null | null | null | authenticationApp/EmailHandler.py | George-Okumu/IReporter-Django | 5962984ce0069cdf048dbf91686377568a7cf55b | [
"MIT"
] | 1 | 2021-10-06T20:15:11.000Z | 2021-10-06T20:15:11.000Z | authenticationApp/EmailHandler.py | George-Okumu/IReporter-Django | 5962984ce0069cdf048dbf91686377568a7cf55b | [
"MIT"
] | null | null | null | from django.core.mail import EmailMessage, message
class EmailHandlerClass:
@staticmethod
def sendEmail(data):
email = EmailMessage(subject=data['email_subject'], body=data['email_body'], to=[data['email_to']])
email.send()
# email.send()
| 32 | 107 | 0.638889 | 31 | 288 | 5.83871 | 0.580645 | 0.198895 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.232639 | 288 | 9 | 108 | 32 | 0.819005 | 0.041667 | 0 | 0 | 0 | 0 | 0.112727 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ba2b10e7983f50e892222f03799fc1c092cbda9b | 495 | py | Python | checkdns.py | delcacho/DataSciencePlatform | c19ac4c1aba54bafc0fed05cc534bb447ab3b631 | [
"BSD-3-Clause"
] | null | null | null | checkdns.py | delcacho/DataSciencePlatform | c19ac4c1aba54bafc0fed05cc534bb447ab3b631 | [
"BSD-3-Clause"
] | null | null | null | checkdns.py | delcacho/DataSciencePlatform | c19ac4c1aba54bafc0fed05cc534bb447ab3b631 | [
"BSD-3-Clause"
] | null | null | null | from area53 import route53
from boto.route53.exception import DNSServerError
from kubernetes import client, config
from datetime import datetime
import socket
import time

# Ensure cluster is running
consec = 0
while consec < 10:
    try:
        # gethostbyname() takes a bare hostname, not a URL
        ip = socket.gethostbyname("api.k8s.dev.bayescluster.com")
        consec += 1
        print("Successful resolution! :)")
    except Exception as e:
        consec = 0
        print(e)
        print("Error in DNS lookup. Gonna sleep for a while...")
        time.sleep(10)
| 23.571429 | 68 | 0.717172 | 68 | 495 | 5.220588 | 0.676471 | 0.078873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035 | 0.191919 | 495 | 20 | 69 | 24.75 | 0.8525 | 0.050505 | 0 | 0.117647 | 0 | 0 | 0.228632 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.352941 | 0 | 0.352941 | 0.176471 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
ba2b2f46854a6061db7ab4dc16a1519a9222534e | 2,339 | py | Python | config/urls.py | ISI-MIP/isimip | c2a78c727337e38f3695031e00afd607da7d6dcb | [
"MIT"
] | null | null | null | config/urls.py | ISI-MIP/isimip | c2a78c727337e38f3695031e00afd607da7d6dcb | [
"MIT"
] | null | null | null | config/urls.py | ISI-MIP/isimip | c2a78c727337e38f3695031e00afd607da7d6dcb | [
"MIT"
] | null | null | null | from __future__ import unicode_literals
from django.conf import settings
from django.conf.urls import include, url
from django.conf.urls.static import static
from django.contrib import admin
from django.views import defaults as default_views
from wagtail.admin import urls as wagtailadmin_urls
from wagtail.core import urls as wagtail_urls
from wagtail.documents import urls as wagtaildocs_urls
from wagtail.contrib.sitemaps.views import sitemap
from isi_mip.climatemodels import urls as climatemodels_urls
from isi_mip.invitation import urls as invitations_urls
from isi_mip.contrib.views import export_users
urlpatterns = [
url(r'^styleguide/', include("isi_mip.styleguide.urls", namespace="styleguide")),
url(r'^sitemap\.xml$', sitemap),
url(r'^admin/export/users/$', export_users, name='export_users'),
url(r'^admin/', include(admin.site.urls)),
url(r'^auth/', include('django.contrib.auth.urls')),
url(r'^blog/', include('blog.urls', namespace="blog")),
url(r'^cms/', include(wagtailadmin_urls)),
url(r'^documents/', include(wagtaildocs_urls)),
url(r'^models/', include(climatemodels_urls, namespace='climatemodels')),
url(r'^accounts/', include(invitations_urls, namespace='accounts')),
] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
if settings.DEBUG:
from django.conf.urls.static import static
from django.contrib.staticfiles.urls import staticfiles_urlpatterns
# Serve static and media files from development server
urlpatterns += staticfiles_urlpatterns()
urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
# This allows the error pages to be debugged during development, just visit
# these url in browser to see how these error pages look like.
urlpatterns += [
url(r'^400/$', default_views.bad_request, kwargs={'exception': Exception("Bad Request!")}),
url(r'^403/$', default_views.permission_denied, kwargs={'exception': Exception("Permission Denied")}),
url(r'^404/$', default_views.page_not_found, kwargs={'exception': Exception("Page not Found")}),
url(r'^500/$', default_views.server_error),
]
import debug_toolbar
urlpatterns += [
url(r'^__debug__/', include(debug_toolbar.urls)),
]
urlpatterns += [
url(r'', include(wagtail_urls)),
] | 41.767857 | 110 | 0.734929 | 306 | 2,339 | 5.46732 | 0.284314 | 0.038255 | 0.035864 | 0.032277 | 0.124328 | 0.124328 | 0.124328 | 0.124328 | 0.124328 | 0.063359 | 0 | 0.005961 | 0.139376 | 2,339 | 56 | 111 | 41.767857 | 0.825137 | 0.079949 | 0 | 0.116279 | 0 | 0 | 0.143322 | 0.031643 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.372093 | 0 | 0.372093 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
e8404984e29c624bea38b2217947473bb1186bbd | 4,035 | py | Python | forgetPass.py | goyal705/Hotel-Management-System | 8ea3598915f4062a7ae65c634e656d0f25b961f0 | [
"Apache-2.0"
] | 1 | 2021-09-10T11:39:23.000Z | 2021-09-10T11:39:23.000Z | forgetPass.py | goyal705/Hotel-Management-System | 8ea3598915f4062a7ae65c634e656d0f25b961f0 | [
"Apache-2.0"
] | null | null | null | forgetPass.py | goyal705/Hotel-Management-System | 8ea3598915f4062a7ae65c634e656d0f25b961f0 | [
"Apache-2.0"
] | null | null | null | from tkinter import *
from tkinter import ttk
from tkinter import messagebox
import mysql.connector
from login_new import Small_login_win
class Forget_pass_win:
def __init__(self, root):
self.root = root
self.root.title("Forget Password")
self.root.geometry('300x400+520+130')
self.root.maxsize(300, 400)
self.root.minsize(300, 400)
self.user_security_ques = StringVar()
self.user_security_ans = StringVar()
self.new_pass = StringVar()
self.user_name = StringVar()
label = Label(self.root, text="Forgot Password", font=("arial", 20, "bold"), fg="red")
label.place(x=38, y=10)
label_security = Label(self.root, text="Enter UserName", font=("times new roman", 15, "bold"))
label_security.place(x=70, y=50)
entry_name = ttk.Entry(self.root, font=("times new roman", 15, "bold"), textvariable=self.user_name)
entry_name.place(x=45, y=85)
label_security = Label(self.root, text="Enter Security Question", font=("times new roman", 15, "bold"))
label_security.place(x=45, y=120)
entry_security_question = ttk.Combobox(self.root, font=("times new roman", 15, "bold"), width=18,
state="readonly", textvariable=self.user_security_ques)
entry_security_question["values"] = ("Your Petname", "Your Favourite Hobby", "Your Favourite Subject")
entry_security_question.current(0)
entry_security_question.place(x=50, y=155)
label_security_answer = Label(self.root, text="Enter Security Answer", font=("times new roman", 15, "bold"))
label_security_answer.place(x=45, y=195)
entry_name = ttk.Entry(self.root, font=("times new roman", 15, "bold"), textvariable=self.user_security_ans)
entry_name.place(x=45, y=230)
label_security = Label(self.root, text="Enter New Password", font=("times new roman", 15, "bold"))
label_security.place(x=54, y=265)
entry_name = ttk.Entry(self.root, font=("times new roman", 15, "bold"), textvariable=self.new_pass, show="*")
entry_name.place(x=45, y=300)
login_btn = Button(self.root, cursor="circle", command=self.forget, fg="white", bg="red", width=20,
text="Submit Now", font=("times new roman", 10, "bold"))
login_btn.place(x=70, y=360)
    def forget(self):
        if self.user_security_ans.get() == "":
            messagebox.showerror("Error", "Please answer the security question", parent=self.root)
        elif self.new_pass.get() == "":
            messagebox.showerror("Error", "Please enter your new password", parent=self.root)
        else:
            connection = mysql.connector.connect(host="localhost", username="root", password="1234",
                                                 database="tushar")
            my_cur = connection.cursor()
            query = "select * from user_register where UserName=%s and SecurityQuestion=%s and SecurityAnswer=%s"
            value = (self.user_name.get(), self.user_security_ques.get(), self.user_security_ans.get())
            my_cur.execute(query, value)
            row = my_cur.fetchone()
            if row is None:
                messagebox.showerror("Error", "Please enter the correct values", parent=self.root)
            else:
                query = "update user_register set UserPassword=%s where UserName=%s"
                value = (self.new_pass.get(), self.user_name.get())
                my_cur.execute(query, value)
                connection.commit()
                connection.close()
                messagebox.showinfo("Information", "Your password has been reset, please log in again", parent=self.root)
                self.new_window = Toplevel(self.root)
                self.app = Small_login_win(self.new_window)


if __name__ == '__main__':
    root = Tk()
    obj = Forget_pass_win(root)
    root.mainloop()
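The parameterized-query pattern used in `forget()` above can be exercised without a MySQL server. This is a minimal sketch using the stdlib `sqlite3` module instead of `mysql.connector` (sqlite3 uses `?` placeholders where mysql.connector uses `%s`); the table and column names match the code above, while the sample credentials are made up:

```python
import sqlite3

# In-memory database standing in for the "tushar" MySQL database above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_register (UserName TEXT, UserPassword TEXT)")
conn.execute("INSERT INTO user_register VALUES (?, ?)", ("tushar", "old_pass"))

# Parameterized UPDATE, same shape as the query in forget().
conn.execute("UPDATE user_register SET UserPassword=? WHERE UserName=?",
             ("new_pass", "tushar"))
conn.commit()

row = conn.execute("SELECT UserPassword FROM user_register WHERE UserName=?",
                   ("tushar",)).fetchone()
print(row)  # ('new_pass',)
```

Passing values as a separate tuple, rather than interpolating them into the SQL string, is what protects the code above from SQL injection.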
| 49.207317 | 122 | 0.607187 | 504 | 4,035 | 4.708333 | 0.28373 | 0.070796 | 0.045512 | 0.064475 | 0.310999 | 0.269701 | 0.209861 | 0.165613 | 0.137379 | 0.137379 | 0 | 0.033681 | 0.264188 | 4,035 | 81 | 123 | 49.814815 | 0.765578 | 0 | 0 | 0.058824 | 0 | 0 | 0.187658 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029412 | false | 0.191176 | 0.073529 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
e840e5bc9763698d91cf0375b1df8b3f61aea666 | 1,449 | py | Python | docs/podstawy/przyklady/ocenyfun.py | damiankarol7/python101 | 1978a9402a8fb0f20c4ca7bd542cb8d7d4501b9b | [
"MIT"
] | 44 | 2015-02-11T19:10:37.000Z | 2021-11-11T09:45:43.000Z | docs/podstawy/przyklady/ocenyfun.py | damiankarol7/python101 | 1978a9402a8fb0f20c4ca7bd542cb8d7d4501b9b | [
"MIT"
] | 9 | 2015-02-06T21:26:25.000Z | 2022-03-31T10:44:22.000Z | docs/podstawy/przyklady/ocenyfun.py | damiankarol7/python101 | 1978a9402a8fb0f20c4ca7bd542cb8d7d4501b9b | [
"MIT"
] | 172 | 2015-06-13T07:16:24.000Z | 2022-03-30T20:41:11.000Z | #! /usr/bin/env python3
# -*- coding: utf-8 -*-

"""
The ocenyfun module contains functions used in the file 05_oceny_03.py
"""

import math  # import the math module


def drukuj(co, kom="The sequence contains: "):
    print(kom)
    for i in co:
        print(i, end=" ")


def srednia(oceny):
    suma = sum(oceny)
    return suma / float(len(oceny))


def mediana(oceny):
    """
    If the number of grades is even, the median is the arithmetic mean
    of the two middle grades. If the number is odd, the median is the
    middle element of the list of grades sorted in ascending order.
    """
    oceny.sort()
    if len(oceny) % 2 == 0:  # even number of grades
        half = int(len(oceny) / 2)
        # either like this:
        # return float(oceny[half - 1] + oceny[half]) / 2.0
        # or like this:
        return float(sum(oceny[half - 1:half + 1])) / 2.0
    else:  # odd number of grades
        return oceny[len(oceny) // 2]


def wariancja(oceny, srednia):
    """
    The variance is the sum of the squared differences between each
    grade and the mean, divided by the number of grades:
    sigma = ((o1-s)**2 + (o2-s)**2 + ... + (on-s)**2) / n, where:
    o1, o2, ..., on - the consecutive grades,
    s - the mean of the grades,
    n - the number of grades.
    """
    sigma = 0.0
    for ocena in oceny:
        sigma += (ocena - srednia)**2
    return sigma / len(oceny)


def odchylenie(oceny, srednia):  # square root of the variance
    w = wariancja(oceny, srednia)
    return math.sqrt(w)
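The mean, median and population-variance formulas used by these helpers can be cross-checked against Python's `statistics` module; the sample grades below are made up for illustration:

```python
import statistics

oceny = [3, 4, 4, 5, 2, 5]            # made-up sample grades
m = sum(oceny) / float(len(oceny))    # same formula as srednia()
var = sum((o - m) ** 2 for o in oceny) / len(oceny)  # same as wariancja()

assert m == statistics.mean(oceny)
assert abs(var - statistics.pvariance(oceny)) < 1e-12
assert statistics.median(oceny) == 4  # even count: mean of the two middle grades
```

Note that `statistics.pvariance` is the population variance (divide by `n`), matching the docstring above, not the sample variance (`n - 1`).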
| 25.421053 | 72 | 0.619738 | 192 | 1,449 | 4.666667 | 0.484375 | 0.044643 | 0.030134 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022305 | 0.257419 | 1,449 | 56 | 73 | 25.875 | 0.810409 | 0.47343 | 0 | 0 | 0 | 0 | 0.029197 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.217391 | false | 0 | 0.043478 | 0 | 0.478261 | 0.086957 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e84512b0da871bd7cbe1284df91d047dbdd1a5e5 | 303 | py | Python | tmdb/three/reviews.py | Cologler/ezapi-tmdb | 6a8a1a0a3da99cb18d11f47f1b40bbffb2a16be6 | [
"MIT"
] | 4 | 2017-05-16T02:30:52.000Z | 2021-07-01T13:21:27.000Z | tmdb/three/reviews.py | Cologler/ezapi-tmdb | 6a8a1a0a3da99cb18d11f47f1b40bbffb2a16be6 | [
"MIT"
] | 4 | 2020-09-03T03:19:49.000Z | 2021-12-21T05:24:04.000Z | tmdb/three/reviews.py | Cologler/ezapi-tmdb | 6a8a1a0a3da99cb18d11f47f1b40bbffb2a16be6 | [
"MIT"
] | 3 | 2021-02-15T18:13:08.000Z | 2021-04-10T03:53:58.000Z | from .base import ENDPOINT, process_response
class ReviewsMixin:
    @process_response
    def get_review_details(self, review_id, **kwargs):
        """
        GET /review/{review_id}
        """
        url = f"{ENDPOINT}/3/review/{review_id}"
        return self.make_request("GET", url, kwargs)
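The mixin relies on the final client class supplying `make_request`. A sketch of that composition pattern, with `FakeClient` and its echoing `make_request` invented for this example (the `ENDPOINT` value here is a stand-in for whatever the real `.base` module defines; no HTTP call is made):

```python
ENDPOINT = "https://api.themoviedb.org"  # assumed value, for illustration only

class ReviewsMixin:
    def get_review_details(self, review_id, **kwargs):
        url = f"{ENDPOINT}/3/review/{review_id}"
        return self.make_request("GET", url, kwargs)

class FakeClient(ReviewsMixin):
    # Stand-in: a real client would perform the HTTP request here.
    def make_request(self, method, url, params):
        return method, url, params

method, url, params = FakeClient().get_review_details(42, language="en")
print(url)  # https://api.themoviedb.org/3/review/42
```

Keeping each endpoint group in its own mixin lets the full client be assembled from many small files like this one.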
| 23.307692 | 54 | 0.630363 | 36 | 303 | 5.083333 | 0.583333 | 0.131148 | 0.153005 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004367 | 0.244224 | 303 | 12 | 55 | 25.25 | 0.79476 | 0.075908 | 0 | 0 | 0 | 0 | 0.132813 | 0.121094 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
e84f7ec1d7eb75e5b190a746b5e84c8c1dc54889 | 241 | py | Python | C21/test.py | jpch89/learningpython | 47e78041e519ecd2e00de1b32f6416b56ce2616c | [
"MIT"
] | 2 | 2020-10-20T10:18:48.000Z | 2020-12-02T09:41:18.000Z | C21/test.py | jpch89/learningpython | 47e78041e519ecd2e00de1b32f6416b56ce2616c | [
"MIT"
] | null | null | null | C21/test.py | jpch89/learningpython | 47e78041e519ecd2e00de1b32f6416b56ce2616c | [
"MIT"
] | 1 | 2020-12-02T10:03:29.000Z | 2020-12-02T10:03:29.000Z | '求平方根'
from math import sqrt

li = [2, 4, 9, 16, 25]

# for-loop version
res = []
for i in li:
    res.append(sqrt(i))
print(res)

# map version
print(list(map(sqrt, li)))

# list comprehension version
print([sqrt(i) for i in li])

# generator version
print(list(sqrt(i) for i in li))
| 12.05 | 32 | 0.605809 | 48 | 241 | 3.041667 | 0.479167 | 0.082192 | 0.123288 | 0.164384 | 0.178082 | 0.178082 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0.215768 | 241 | 19 | 33 | 12.684211 | 0.73545 | 0.145228 | 0 | 0 | 0 | 0 | 0.019324 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.1 | 0.4 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e85331900c6459687504959fb943368a54bbdd9f | 2,908 | py | Python | modules/MessageWatcher/MessageWatcher.py | ediril/BCI | f211ba70d6d75a9badff6872f86416b065f6192b | [
"BSD-2-Clause"
] | 6 | 2016-12-30T03:43:49.000Z | 2020-04-19T16:04:37.000Z | modules/MessageWatcher/MessageWatcher.py | hongweimao/BCI | 49b7e8137bd5f9d18e3efdbd94a112cde5d16c4c | [
"BSD-2-Clause"
] | 1 | 2022-03-08T09:16:10.000Z | 2022-03-08T09:16:10.000Z | modules/MessageWatcher/MessageWatcher.py | ediril/BCI | f211ba70d6d75a9badff6872f86416b065f6192b | [
"BSD-2-Clause"
] | 2 | 2015-06-16T02:46:03.000Z | 2018-12-20T20:07:59.000Z | import numpy as np
import Dragonfly_config as rc
from argparse import ArgumentParser
from ConfigParser import SafeConfigParser
from PyDragonfly import Dragonfly_Module, CMessage, copy_to_msg, copy_from_msg
from time import time


class MessageWatcher(object):
    # msg_types = ['GROBOT_RAW_FEEDBACK',
    #              'GROBOT_FEEDBACK',
    #              'SAMPLE_GENERATED',
    #              'SPM_SPIKECOUNT',
    #              'EM_MOVEMENT_COMMAND',
    #              'COMPOSITE_MOVEMENT_COMMAND'
    #              ]

    def __init__(self, config_file):
        self.load_config(config_file)
        self.msg_nums = [eval('rc.MT_%s' % (x)) for x in self.msg_types]
        self.count = np.zeros((len(self.msg_nums)), dtype=int)
        self.last_time = time()
        self.setup_Dragonfly()
        self.run()

    def load_config(self, config_file):
        self.config = SafeConfigParser()
        self.config.read(config_file)
        self.msg_types = [x.upper() for x in self.config.options('messages')]
        self.msg_types.sort()

    def setup_Dragonfly(self):
        server = self.config.get('Dragonfly', 'server')
        self.mod = Dragonfly_Module(0, 0)
        self.mod.ConnectToMMM(server)
        for i in self.msg_types:
            self.mod.Subscribe(eval('rc.MT_%s' % (i)))
        self.mod.SendModuleReady()
        print "Connected to Dragonfly at", server

    def run(self):
        while True:
            msg = CMessage()
            rcv = self.mod.ReadMessage(msg, 0.1)
            if rcv == 1:
                self.process_message(msg)
            this_time = time()
            self.diff_time = this_time - self.last_time
            if self.diff_time > 1.:
                self.last_time = this_time
                self.write()
                self.count[:] = 0

    def process_message(self, in_msg):
        msg_type = in_msg.GetHeader().msg_type
        if msg_type not in self.msg_nums:
            return
        msg_idx = self.msg_nums.index(msg_type)
        self.count[msg_idx] += 1

    def write(self):
        for msg_type, c in zip(self.msg_types, self.count):
            rate = c / self.diff_time
            print "%40s %5.2f Hz" % (msg_type, rate)
            if (('GROBOT_RAW_FEEDBACK' in msg_type) and (rate < 48.0)):
                print "Raw feedback rate is too low!"
                print "Raw feedback rate is too low!"
                print "Raw feedback rate is too low!"
                print "Raw feedback rate is too low!"
        print "window was %0.3f seconds\n" % (self.diff_time)
        print ""


if __name__ == "__main__":
    parser = ArgumentParser(description="Display information about message flow")
    parser.add_argument('config', metavar='config_file', type=str)
    args = parser.parse_args()
    print("Using config file %s" % args.config)
    mw = MessageWatcher(args.config)
| 36.810127 | 83 | 0.584594 | 363 | 2,908 | 4.473829 | 0.330579 | 0.038793 | 0.036946 | 0.049261 | 0.110222 | 0.072044 | 0.072044 | 0.072044 | 0.072044 | 0.072044 | 0 | 0.008483 | 0.310867 | 2,908 | 78 | 84 | 37.282051 | 0.801896 | 0.077029 | 0 | 0.064516 | 0 | 0 | 0.119955 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.096774 | null | null | 0.145161 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e85c2c9c74527b46aef466680cba313a99e7950d | 2,762 | py | Python | gameNode_test.py | rcolomina/pythonchess | 1b12ea4a1668da6c47dd39ff16d1e48af33ea2f5 | [
"MIT"
] | null | null | null | gameNode_test.py | rcolomina/pythonchess | 1b12ea4a1668da6c47dd39ff16d1e48af33ea2f5 | [
"MIT"
] | 2 | 2016-11-01T09:57:36.000Z | 2016-11-01T10:05:50.000Z | gameNode_test.py | rcolomina/pythonchess | 1b12ea4a1668da6c47dd39ff16d1e48af33ea2f5 | [
"MIT"
] | null | null | null | #!/usr/bin/python
from piece import Piece
from gameNode import GameNode
from functions import *
from minmax import MinMax
## TEST 1: GameNode creation. Generating list of movements on a game node by piece
print "TEST 1: Building a Game Node"

listPiecesWhite=[]
listPiecesBlack=[]
p1=Piece('Q',[2,1])
p2=Piece('N',[2,2])
p3=Piece('q',[1,4])
listPiecesWhite.append(p1)
listPiecesWhite.append(p2)
listPiecesBlack.append(p3)
gameNode=GameNode(listPiecesWhite,listPiecesBlack,"white")

assert(genListMovsPiece(gameNode,p3)==[[2,4],[3,4],[4,4],[1,1],[1,2],[1,3],[2,3],[3,2],[4,1]])

assert(not checkPieceMovValid(gameNode,p3,[4,2]))  # Black Queen cannot move to [4,2], illegal move
print "Checked movements for black queen: Black Queen cannot move to [4,2], illegal move"

assert(not checkPieceMovValid(gameNode,p2,[4,4]))  # White Knight cannot move to [4,4], illegal move
print "Checked movements for white knight: White Knight cannot move to [4,4], illegal move"

assert(not checkPieceMovValid(gameNode,p1,[2,4]))  # White Queen cannot move to [2,4], White Knight in the middle
print "Checked movements for white queen: White Queen cannot move to [2,4], because the white Knight is in the middle"

assert(not checkPieceMovValid(gameNode,p1,[2,2]))  # White Queen cannot move to [2,2], White Knight on it
print "Checked movements for white queen: White Queen cannot move to [2,2], because the white Knight is over it"
# Game Successors
print "--"
print "TEST 2: Check whether white wins after a queen do an eat movement"
####
listPiecesWhite=[]
listPiecesBlack=[]
p1=Piece('Q',[2,1])
listPiecesWhite.append(p1)
p2=Piece('q',[1,4])
listPiecesBlack.append(p2)
gameNode=GameNode(listPiecesWhite,listPiecesBlack,"white")
print gameNode.draw()
mov=[1,1]
piece=p1
gameNodeChild=gameNode.succesGameActionMove(piece,mov)
print gameNodeChild.draw()
assert(not gameNode.checkWhiteWin())
print "Num black pieces:", gameNodeChild.numBlackPieces()
assert(gameNodeChild.numBlackPieces()==1)
#gameNodeChild.draw()
assert(not gameNodeChild.checkWhiteWin())
print "Checked that white not win"
assert(gameNodeChild.numBlackPieces()==1)
mov=[1,4] # position of black queen
piece=p2 # capturing black queen by white knight
gameNodeChild=gameNode.succesGameActionMove(piece,mov)
#gameNodeChild.draw()
#assert(gameNodeChild.checkWhiteWin())
#assert(gameNodeChild.numBlackPieces()==0)
print "--"
print "TEST 3: Generate all games nodes since a given game"
listGameNodes=genListNextGameNodesForcingColor(gameNode,"black")
assert(len(listGameNodes)==9)
print "Number for black choises is 9"
listGameNodes=genListNextGameNodesForcingColor(gameNode,"white")
assert(len(listGameNodes)==9)
print "Number for white choises is 9"
listGameNodes=genListNextGameNodes(gameNode)
#assert(len(listGameNodes)==10)
| 36.342105 | 112 | 0.763939 | 393 | 2,762 | 5.368957 | 0.231552 | 0.037915 | 0.045498 | 0.048341 | 0.409479 | 0.291469 | 0.216114 | 0.119431 | 0.119431 | 0.054028 | 0 | 0.034719 | 0.103186 | 2,762 | 75 | 113 | 36.826667 | 0.817117 | 0.191166 | 0 | 0.327273 | 0 | 0.036364 | 0.286489 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | null | null | 0 | 0.072727 | null | null | 0.272727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e8608fa57e48149116847118e359bdf6450a3512 | 2,155 | py | Python | utils/Deque.py | FedePeralta/ASVs_Deep_Reinforcement_Learning_with_CNNs | 23b9b181499a4b06f2ca2951c002359c1959e727 | [
"MIT"
] | 4 | 2021-03-22T12:42:55.000Z | 2021-12-13T03:03:52.000Z | utils/Deque.py | FedePeralta/ASVs_Deep_Reinforcement_Learning_with_CNNs | 23b9b181499a4b06f2ca2951c002359c1959e727 | [
"MIT"
] | null | null | null | utils/Deque.py | FedePeralta/ASVs_Deep_Reinforcement_Learning_with_CNNs | 23b9b181499a4b06f2ca2951c002359c1959e727 | [
"MIT"
] | 1 | 2021-03-22T12:48:21.000Z | 2021-03-22T12:48:21.000Z | import numpy as np
from utils.Node import Node
class Deque(object):
    """Generic deque object"""

    def __init__(self, max_size, dimension_of_value_attribute):
        self.max_size = max_size
        self.dimension_of_value_attribute = dimension_of_value_attribute
        self.deque = self.initialise_deque()
        self.deque_index_to_overwrite_next = 0
        self.reached_max_capacity = False
        self.number_experiences_in_deque = 0

    def initialise_deque(self):
        """Initialises a queue of Nodes of length self.max_size"""
        deque = np.array([Node(0, tuple([None for _ in range(self.dimension_of_value_attribute)])) for _ in range(self.max_size)])
        return deque

    def add_element_to_deque(self, new_key, new_value):
        """Adds an element to the deque and then updates the index of the next element to be overwritten and also the
        amount of elements in the deque"""
        self.update_deque_node_key_and_value(self.deque_index_to_overwrite_next, new_key, new_value)
        self.update_number_experiences_in_deque()
        self.update_deque_index_to_overwrite_next()

    def update_deque_node_key_and_value(self, index, new_key, new_value):
        self.update_deque_node_key(index, new_key)
        self.update_deque_node_value(index, new_value)

    def update_deque_node_key(self, index, new_key):
        self.deque[index].update_key(new_key)

    def update_deque_node_value(self, index, new_value):
        self.deque[index].update_value(new_value)

    def update_deque_index_to_overwrite_next(self):
        """Updates the deque index that we should write over next. When the buffer gets full we begin writing over
        older experiences"""
        if self.deque_index_to_overwrite_next < self.max_size - 1:
            self.deque_index_to_overwrite_next += 1
        else:
            self.reached_max_capacity = True
            self.deque_index_to_overwrite_next = 0

    def update_number_experiences_in_deque(self):
        """Keeps track of how many experiences there are in the buffer"""
        if not self.reached_max_capacity:
            self.number_experiences_in_deque += 1
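The circular overwrite-index logic of `update_deque_index_to_overwrite_next` can be traced in isolation, without the `Node` class; the `max_size` of 3 and the five insertions below are arbitrary illustration values:

```python
# Simulate which slot each of five successive additions would overwrite
# in a 3-slot buffer, mirroring the index bookkeeping above.
max_size = 3
index, reached_max = 0, False
overwritten_slots = []
for _ in range(5):
    overwritten_slots.append(index)
    if index < max_size - 1:
        index += 1
    else:
        reached_max = True
        index = 0

print(overwritten_slots)  # [0, 1, 2, 0, 1]
```

Once the index wraps, `reached_max_capacity` stays `True`, so the element count stops growing while old experiences keep being overwritten.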
e8642331b936e08e2dbe11e2424740cd37d2e4b8 | 1,014 | py | Python | release/stubs.min/System/Windows/Forms/__init___parts/DataGridViewCellStateChangedEventArgs.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | release/stubs.min/System/Windows/Forms/__init___parts/DataGridViewCellStateChangedEventArgs.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | release/stubs.min/System/Windows/Forms/__init___parts/DataGridViewCellStateChangedEventArgs.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | class DataGridViewCellStateChangedEventArgs(EventArgs):
"""
Provides data for the System.Windows.Forms.DataGridView.CellStateChanged event.
DataGridViewCellStateChangedEventArgs(dataGridViewCell: DataGridViewCell,stateChanged: DataGridViewElementStates)
"""
@staticmethod
def __new__(self, dataGridViewCell, stateChanged):
""" __new__(cls: type,dataGridViewCell: DataGridViewCell,stateChanged: DataGridViewElementStates) """
pass
Cell = property(lambda self: object(), lambda self, v: None, lambda self: None)
"""Gets the System.Windows.Forms.DataGridViewCell that has a changed state.
Get: Cell(self: DataGridViewCellStateChangedEventArgs) -> DataGridViewCell
"""
StateChanged = property(
lambda self: object(), lambda self, v: None, lambda self: None
)
"""Gets the state that has changed on the cell.
Get: StateChanged(self: DataGridViewCellStateChangedEventArgs) -> DataGridViewElementStates
"""
| 26.684211 | 115 | 0.719921 | 86 | 1,014 | 8.395349 | 0.430233 | 0.083102 | 0.044321 | 0.058172 | 0.166205 | 0.166205 | 0.166205 | 0.166205 | 0.166205 | 0.166205 | 0 | 0 | 0.192308 | 1,014 | 37 | 116 | 27.405405 | 0.881563 | 0.285996 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.125 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
e876bb50d8ee0021ee27d7402634cce9a0b1b389 | 756 | py | Python | lista1/156_Ananagrams/156_Ananagrams.py | L30Bola/mab606 | c29a781752b1d12b0df308d604496c7ffa0c5b6e | [
"BSL-1.0"
] | null | null | null | lista1/156_Ananagrams/156_Ananagrams.py | L30Bola/mab606 | c29a781752b1d12b0df308d604496c7ffa0c5b6e | [
"BSL-1.0"
] | null | null | null | lista1/156_Ananagrams/156_Ananagrams.py | L30Bola/mab606 | c29a781752b1d12b0df308d604496c7ffa0c5b6e | [
"BSL-1.0"
] | null | null | null | #!/usr/bin/env python
import sys
import collections
listas = list(map(str.split, sys.stdin.readlines()))
entrada = [item for sublist in listas for item in sublist]  # flatten the nested lists to simplify processing
contador = collections.Counter()
palavras = []
for palavra in entrada:
    if palavra == "#":
        break
    palavra_simplificada = "".join(sorted(palavra.lower()))
    if len(palavra) >= 1:
        contador[palavra_simplificada] += 1
        palavras.append((palavra, palavra_simplificada))

lista_auxiliar_palavras = []
for palavra, palavra_simplificada in palavras:
    if palavra_simplificada not in contador or contador[palavra_simplificada] < 2:
        lista_auxiliar_palavras.append(palavra)

for resultado in sorted(lista_auxiliar_palavras):
    print(resultado)
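The same signature-counting idea, condensed and run on a tiny word list invented for illustration: a word survives only if no other word (case-insensitively) is an anagram of it.

```python
from collections import Counter

words = ["ladder", "came", "together", "derail", "acme", "soon"]

def signature(w):
    # Sorted lowercase letters: identical for all anagrams of a word.
    return "".join(sorted(w.lower()))

counts = Counter(signature(w) for w in words)
ananagrams = sorted(w for w in words if counts[signature(w)] < 2)
print(ananagrams)  # ['derail', 'ladder', 'soon', 'together']
```

Here "came" and "acme" share the signature "acem", so both are excluded; every other word's signature occurs exactly once.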
e879a021a202e01effe229f08a5ace1da13c48a5 | 1,030 | py | Python | viz/main.py | YoniSchirris/SimCLR-1 | 535472ac76d24d368d3bc08c17987df315e0b657 | [
"Apache-2.0"
] | 1 | 2021-12-03T12:59:39.000Z | 2021-12-03T12:59:39.000Z | viz/main.py | YoniSchirris/SimCLR-1 | 535472ac76d24d368d3bc08c17987df315e0b657 | [
"Apache-2.0"
] | null | null | null | viz/main.py | YoniSchirris/SimCLR-1 | 535472ac76d24d368d3bc08c17987df315e0b657 | [
"Apache-2.0"
] | null | null | null | import torch
from viz.visualizer import Visualizer
from modules.deepmil import Attention
from msidata.dataset_msi_features_with_patients import PreProcessedMSIFeatureDataset
from testing.logistic_regression import get_precomputed_dataloader
import argparse

from experiment import ex
from utils import post_config_hook


@ex.automain
def main(_run, _log):
    args = argparse.Namespace(**_run.config)
    args = post_config_hook(args, _run)
    args.device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

    # Get test data to be visualized
    _, test_loader = get_precomputed_dataloader(args, args.use_precomputed_features_id)

    # Load model to be used
    model = Attention()

    # Initialize visualizer.. not necessary?
    viz = Visualizer()
    viz.visualize_first_patient(test_loader, model, method='deepmil')
    print('done')

    # for step, data in enumerate(loader):
    #     optimizer.zero_grad()
    #     x = data[0]
    #     y = data[1]


if __name__ == "__main__":
    main()
e879fd71ea6d131c8a7d4e47e9c565b330dabbe2 | 340 | py | Python | projects/golem_e2e/tests/login/login_missing_password.py | kangchenwei/keyautotest2 | f980d46cabfc128b2099af3d33968f236923063f | [
"MIT"
] | null | null | null | projects/golem_e2e/tests/login/login_missing_password.py | kangchenwei/keyautotest2 | f980d46cabfc128b2099af3d33968f236923063f | [
"MIT"
] | null | null | null | projects/golem_e2e/tests/login/login_missing_password.py | kangchenwei/keyautotest2 | f980d46cabfc128b2099af3d33968f236923063f | [
"MIT"
] | null | null | null |
description = 'Verify the user cannot log in if password value is missing'
pages = ['login']
def test(data):
    navigate(data.env.url)
    send_keys(login.username_input, 'admin')
    click(login.login_button)
    capture('Verify the correct error message is shown')
    verify_text_in_element(login.error_list, 'Password is required')
| 28.333333 | 74 | 0.735294 | 49 | 340 | 4.959184 | 0.734694 | 0.074074 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164706 | 340 | 11 | 75 | 30.909091 | 0.855634 | 0 | 0 | 0 | 0 | 0 | 0.380531 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.25 | 0 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
e87d57d0c478e441bd8e2fe9c189947046ddeca4 | 39,545 | py | Python | fsleyes/layouts.py | pauldmccarthy/fsleyes | 453a6b91ec7763c39195814d635257e3766acf83 | [
"Apache-2.0"
] | 12 | 2018-05-05T01:36:25.000Z | 2021-09-23T20:44:08.000Z | fsleyes/layouts.py | pauldmccarthy/fsleyes | 453a6b91ec7763c39195814d635257e3766acf83 | [
"Apache-2.0"
] | 97 | 2018-05-05T02:17:23.000Z | 2022-03-29T14:58:42.000Z | fsleyes/layouts.py | pauldmccarthy/fsleyes | 453a6b91ec7763c39195814d635257e3766acf83 | [
"Apache-2.0"
] | 6 | 2017-12-09T09:02:00.000Z | 2021-03-05T18:55:13.000Z | #!/usr/bin/env python
#
# layout.py - The layout API (previously called "perspectives").
#
# Author: Paul McCarthy <pauldmccarthy@gmail.com>
#
"""This module provides functions for managing *layouts* - stored view and
control panel layouts for *FSLeyes*. Layouts may be persisted using the
:mod:`.settings` module. A few layouts are also *built in*, and are
defined in the :attr:`BUILT_IN_LAYOUTS` dictionary.
.. note:: Prior to FSLeyes 0.24.0, *layouts* were called *perspectives*.
The ``layouts`` module provides the following functions. These are intended
for use by the :class:`.FSLeyesFrame`, but can be used in other ways too:
.. autosummary::
:nosignatures:
getAllLayouts
loadLayout
applyLayout
saveLayout
removeLayout
serialiseLayout
deserialiseLayout
A layout defines a layout for a :class:`.FSLeyesFrame`. It specifies the
type and layout of one or more *views* (defined in the :mod:`.views` module)
and, within each view, the type and layout of one or more *controls* (defined
in the :mod:`.controls` module). See the :mod:`fsleyes` documentation for
an overview of views and controls.
All of this information is stored as a string - see the
:func:`serialiseLayout` function for details on its storage format.
"""
import functools as ft
import logging
import pkgutil
import textwrap
import importlib
import collections
import fsl.utils.settings as fslsettings
import fsleyes_widgets.utils.status as status
import fsleyes.strings as strings
import fsleyes.plugins as plugins
import fsleyes.controls as controls
import fsleyes.views as views
import fsleyes.views.viewpanel as viewpanel
import fsleyes.views.canvaspanel as canvaspanel
import fsleyes.views.plotpanel as plotpanel
import fsleyes.controls.controlpanel as controlpanel
log = logging.getLogger(__name__)
def getAllLayouts():
    """Returns a list containing the names of all saved layouts. The
    returned list does not include built-in layouts - these are
    accessible in the :attr:`BUILT_IN_LAYOUTS` dictionary.
    """
    layouts = fslsettings.read('fsleyes.layouts', []) + \
              fslsettings.read('fsleyes.perspectives', [])

    uniq = []
    for l in layouts:
        if l not in uniq:
            uniq.append(l)

    return uniq
def loadLayout(frame, name, **kwargs):
    """Load the named layout, and apply it to the given
    :class:`.FSLeyesFrame`. The ``kwargs`` are passed through to the
    :func:`applyLayout` function.
    """
    if name in BUILT_IN_LAYOUTS.keys():
        log.debug('Loading built-in layout {}'.format(name))
        layout = BUILT_IN_LAYOUTS[name]
    else:
        log.debug('Loading saved layout {}'.format(name))
        layout = fslsettings.read('fsleyes.layouts.{}'.format(name), None)
        if layout is None:
            layout = fslsettings.read('fsleyes.perspectives.{}'.format(name), None)

    if layout is None:
        raise ValueError('No layout named "{}" exists'.format(name))

    log.debug('Applying layout:\n{}'.format(layout))
    applyLayout(frame, name, layout, **kwargs)
def applyLayout(frame, name, layout, message=None):
    """Applies the given serialised layout string to the given
    :class:`.FSLeyesFrame`.

    :arg frame:   The :class:`.FSLeyesFrame` instance.
    :arg name:    The layout name.
    :arg layout:  The serialised layout string.
    :arg message: A message to display (using the :mod:`.status` module).
    """

    import fsleyes.views.canvaspanel as canvaspanel

    layout        = deserialiseLayout(layout)
    frameChildren = layout[0]
    frameLayout   = layout[1]
    vpChildrens   = layout[2]
    vpLayouts     = layout[3]
    vpPanelProps  = layout[4]
    vpSceneProps  = layout[5]

    # Show a message while re-configuring the frame
    if message is None:
        message = strings.messages[
            'layout.applyingLayout'].format(
                strings.layouts.get(name, name))
    status.update(message)

    # Clear all existing view
    # panels from the frame
    frame.removeAllViewPanels()

    # Add all of the view panels
    # specified in the layout
    for vp in frameChildren:
        log.debug('Adding view panel {} to frame'.format(vp.__name__))
        frame.addViewPanel(vp, defaultLayout=False)

    # Apply the layout to those view panels
    frame.auiManager.LoadPerspective(frameLayout)

    # For each view panel, add all of the
    # control panels, and lay them out
    viewPanels = frame.viewPanels
    for i in range(len(viewPanels)):

        vp         = viewPanels[  i]
        children   = vpChildrens[ i]
        vpLayout   = vpLayouts[   i]
        panelProps = vpPanelProps[i]
        sceneProps = vpSceneProps[i]

        for child in children:
            log.debug('Adding control panel {} to {}'.format(
                child.__name__, type(vp).__name__))
            _addControlPanel(vp, child)

        vp.auiManager.LoadPerspective(vpLayout)

        # Apply saved property values
        # to the view panel.
        for name, val in panelProps.items():
            log.debug('Setting {}.{} = {}'.format(
                type(vp).__name__, name, val))
            vp.deserialise(name, val)

        # And to its SceneOpts instance if
        # it is a CanvasPanel, or its
        # PlotCanvas if it is a PlotPanel
        if   isinstance(vp, canvaspanel.CanvasPanel): aux = vp.sceneOpts
        elif isinstance(vp, plotpanel.PlotPanel):     aux = vp.canvas

        for name, val in sceneProps.items():
            log.debug('Setting {}.{} = {}'.format(
                type(aux).__name__, name, val))
            aux.deserialise(name, val)
def saveLayout(frame, name):
    """Serialises the layout of the given :class:`.FSLeyesFrame` and saves
    it as a layout with the given name.
    """
    if name in BUILT_IN_LAYOUTS.keys():
        raise ValueError('A built-in layout named "{}" '
                         'already exists'.format(name))

    log.debug('Saving current layout with name {}'.format(name))

    layout = serialiseLayout(frame)
    fslsettings.write('fsleyes.layouts.{}'.format(name), layout)

    _addToLayoutList(name)

    log.debug('Serialised layout:\n{}'.format(layout))


def removeLayout(name):
    """Deletes the named layout. """

    log.debug('Deleting layout with name {}'.format(name))
    fslsettings.delete('fsleyes.layouts.{}' .format(name))
    fslsettings.delete('fsleyes.perspectives.{}'.format(name))
    _removeFromLayoutList(name)
def serialiseLayout(frame):
"""Serialises the layout of the given :class:`.FSLeyesFrame`, and returns
it as a string.
.. note:: This function was written against wx.lib.agw.aui.AuiManager as
it exists in wxPython 3.0.2.0.
*FSLeyes* uses a hierarchy of ``wx.lib.agw.aui.AuiManager`` instances for
its layout - the :class:`.FSLeyesFrame` uses an ``AuiManager`` to lay out
:class:`.ViewPanel` instances, and each of these ``ViewPanels`` use their
own ``AuiManager`` to lay out control panels.
The layout for a single ``AuiManager`` can be serialised to a string via
the ``AuiManager.SavePerspective`` and ``AuiManager.SavePaneInfo``
methods. One of these strings consists of:
- A name, `'layout1'` or `'layout2'`, specifying the AUI version
(this will always be at least `'layout2'` for *FSLeyes*).
- A set of key-value set of key-value pairs defining the top level
panel layout.
- A set of key-value pairs for each pane, defining its layout. the
``AuiManager.SavePaneInfo`` method returns this for a single pane.
These are all encoded in a single string, with the above components
separated with '|' characters, and the pane-level key-value pairs
separated with a ';' character. For example:
layout2|key1=value1|name=Pane1;caption=Pane 1|\
name=Pane2;caption=Pane 2|doc_size(5,0,0)=22|
This function queries each of the AuiManagers, and extracts the following:
1. A layout string for the :class:`.FSLeyesFrame`.
2. A string containing a comma-separated list of :class:`.ViewPanel`
class names, in the same order as they are specified in the frame
layout string.
3. For each ``ViewPanel``:
- A layout string for the ``ViewPanel``
- A string containing a comma-separated list of control panel
class names, in the same order as specified in the
``ViewPanel`` layout string.
Each of these pieces of information are then concatenated into a single
newline separated string.
In FSLeyes 0.35.0, the list of ``ViewPanel`` and ``ControlPanel`` class
names was changed from containing just the class names
(e.g. ``'OrthoPanel'``) to containing the fully resolved class paths
(e.g. ``'fsleyes.views.orthopanel.OrthoPanel'``). The
:func:`deserialiseLayout` function is compatible with both formats.
"""
    # We'll start by defining this silly function, which
    # takes an ``AuiManager`` layout string, and a list
    # of the children which are being managed by the
    # AuiManager, and makes sure that the order of the
    # child pane layout specifications in the string is
    # the same as the order of the children in the list.
    #
    # If the 'rename' argument is True, this function
    # performs an additional step.
    #
    # The FSLeyesFrame gives each of its view panels a
    # unique name of the form "ClassName index", where
    # the 'index' is a sequentially increasing identifier
    # number (so that multiple views of the same type can
    # be differentiated). If the 'rename' argument to
    # this function is True, these names are adjusted so
    # that they begin at 1 and increase sequentially. This
    # is done by the patchPanelName function, defined
    # below.
    #
    # This name adjustment is required to handle
    # situations where the indices of existing view panels
    # are not sequential, as when a layout is applied, the
    # view panel names given by the FSLeyesFrame must
    # match the names that are specified in the layout
    # string.
    #
    # In addition to patching the name of each panel,
    # the 'rename' argument will also cause the panel
    # caption (its display title) to be adjusted so that
    # it is in line with the name.
    def patchLayoutString(auiMgr, panels, rename=False):

        layoutStr = auiMgr.SavePerspective()

        # The different sections of the string
        # returned by SavePerspective are
        # separated with a '|' character.
        sections = layoutStr.split('|')
        sections = [s.strip() for s in sections]
        sections = [s for s in sections if s != '']

        # Here, we identify sections which specify
        # the layout of a child pane, remove them,
        # and patch them back in, in the order that
        # the child panels are specified in the list.
        pi = 0
        for si, s in enumerate(sections):
            if s.find('name=') > -1:
                panel       = panels[pi]
                panelInfo   = auiMgr.GetPane(panel)
                panelLayout = auiMgr.SavePaneInfo(panelInfo)
                pi         += 1

                sections[si] = panelLayout

                if rename:
                    sections[si] = patchPanelName(sections[si], pi)

        # Now the panel layouts in our layout string
        # are in the same order as our list of view
        # panels - we can re-join the layout string
        # sections, and we're done.
        return '|'.join(sections) + '|'
    # The purpose of this function is described above.
    def patchPanelName(layoutString, index):

        # In each AUI layout section, 'key=value'
        # pairs are separated with a semi-colon
        kvps = layoutString.split(';')

        # And each 'key=value' pair is separated
        # with an equals character
        kvps = [kvp.split('=') for kvp in kvps]
        kvps = collections.OrderedDict(kvps)

        # We need to update the indices contained
        # in the 'name' and 'caption' values
        name    = kvps['name']
        caption = kvps['caption']

        # Strip off the old index
        name    = ' '.join(name   .split()[:-1])
        caption = ' '.join(caption.split()[:-1])

        # Patch in the new index
        name    = '{} {}'.format(name,    index)
        caption = '{} {}'.format(caption, index)

        kvps['name']    = name
        kvps['caption'] = caption

        # Reconstruct the layout string
        kvps = ['='.join((k, v)) for k, v in kvps.items()]
        kvps = ';'.join(kvps)

        return kvps
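    # As an illustration, with hypothetical values, calling
    #
    #     patchPanelName('name=OrthoPanel 3;caption=Ortho View 3', 1)
    #
    # would return:
    #
    #     'name=OrthoPanel 1;caption=Ortho View 1'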
    # Now we can start extracting the layout information.
    # We start with the FSLeyesFrame layout.
    auiMgr     = frame.auiManager
    viewPanels = frame.viewPanels

    # Generate the frame layout string, and a
    # list of the children of the frame
    frameLayout   = patchLayoutString(auiMgr, viewPanels, True)
    frameChildren = ['.'.join((type(vp).__module__, type(vp).__qualname__))
                     for vp in viewPanels]
    frameChildren = ','.join(frameChildren)

    # We are going to build a list of layout strings,
    # one for each ViewPanel, and a corresponding list
    # of control panels displayed on each ViewPanel.
    vpLayouts = []
    vpConfigs = []

    for vp in viewPanels:

        # Get the auiManager and layout for this view panel.
        # This is a little bit complicated, as ViewPanels
        # differentiate between the main 'centre' panel, and
        # all other secondary (control) panels. The layout
        # string needs to contain layout information for
        # all of these panels, but we only care about the
        # control panels.
        vpAuiMgr    = vp.auiManager
        ctrlPanels  = vp.getPanels()
        centrePanel = vp.centrePanel

        # As above for the frame, generate a layout
        # string and a list of control panels - the
        # children of the view panel.
        vpLayout   = patchLayoutString(vpAuiMgr, [centrePanel] + ctrlPanels)
        vpChildren = ['.'.join((type(cp).__module__, type(cp).__qualname__))
                      for cp in ctrlPanels]
        vpChildren = ','.join(vpChildren)

        # Get the panel and scene settings
        panelProps, sceneProps = _getPanelProps(vp)

        # And turn them into comma-separated key-value pairs.
        panelProps = ['{}={}'.format(k, v) for k, v in panelProps.items()]
        sceneProps = ['{}={}'.format(k, v) for k, v in sceneProps.items()]
        panelProps = ','.join(panelProps)
        sceneProps = ','.join(sceneProps)

        # Build the config string - the children,
        # the panel settings and the scene settings.
        vpConfig = ';'.join([vpChildren, panelProps, sceneProps])

        vpLayouts.append(vpLayout)
        vpConfigs.append(vpConfig)

    # We serialise all of these pieces of information
    # as a single newline-separated string.
    layout = [frameChildren, frameLayout]

    for vpConfig, vpLayout in zip(vpConfigs, vpLayouts):
        layout.append(vpConfig)
        layout.append(vpLayout)

    # And we're done!
    return '\n'.join(layout)
def deserialiseLayout(layout):
    """Deserialises a layout string which was created by the
    :func:`serialiseLayout` function.

    :returns: A tuple containing the following:

               - A list of :class:`.ViewPanel` class types - the
                 children of the :class:`.FSLeyesFrame`.

               - An ``aui`` layout string for the :class:`.FSLeyesFrame`

               - A list of lists, one for each ``ViewPanel``, with each
                 list containing a collection of control panel class
                 types - the children of the corresponding ``ViewPanel``.

               - A list of strings, one ``aui`` layout string for each
                 ``ViewPanel``.

               - A list of dictionaries, one for each ``ViewPanel``,
                 containing property ``{name : value}`` pairs to be
                 applied to the ``ViewPanel``.

               - A list of dictionaries, one for each ``ViewPanel``,
                 containing property ``{name : value}`` pairs to be applied
                 to the :class:`.SceneOpts` instance associated with the
                 ``ViewPanel``, if it is a :class:`.CanvasPanel`, or the
                 :class:`.PlotCanvas` instance associated with the
                 ``ViewPanel``, if it is a :class:`.PlotPanel`.
    """
    # Versions of FSLeyes prior to 1.0.0 would just
    # save the view/control class name. This was
    # changed in 1.0.0 so that the full path to the
    # class is saved. This function aims to be
    # compatible with both formats - given a class
    # name, or a fully resolved class name, it will
    # return the corresponding type object.
    def findViewOrControl(panelname, paneltype):

        # new format
        if '.' in panelname:
            mod, cls = panelname.rsplit('.', maxsplit=1)
            mod      = importlib.import_module(mod)
            return getattr(mod, cls)

        # make a list of all candidate types,
        # then search through them for a match
        panels = []

        # builtins
        if paneltype == 'control':
            basemod  = controls
            basetype = (controlpanel.ControlPanel,
                        controlpanel.ControlToolBar)
        else:
            basemod  = views
            basetype = viewpanel.ViewPanel

        mods = pkgutil.iter_modules(basemod.__path__, basemod.__name__ + '.')

        for _, mod, _ in mods:
            mod = importlib.import_module(mod)
            for att in dir(mod):
                att = getattr(mod, att)
                if isinstance(att, type) and issubclass(att, basetype):
                    panels.append(att)

        # plugins
        if paneltype == 'control':
            panels.extend(plugins.listControls().values())
        else:
            panels.extend(plugins.listViews().values())

        for panel in panels:
            if panel.__name__ == panelname:
                return panel

        raise ValueError('Unknown FSLeyes panel type: {}'.format(panelname))

    findView    = ft.partial(findViewOrControl, paneltype='view')
    findControl = ft.partial(findViewOrControl, paneltype='control')
    lines = layout.split('\n')
    lines = [line.strip() for line in lines]
    lines = [line for line in lines if line != '']

    frameChildren = lines[0]
    frameLayout   = lines[1]

    # The children strings are comma-separated
    # class names. The frame children are ViewPanels,
    # which are all defined in the fsleyes.views
    # package.
    frameChildren = frameChildren.split(',')
    frameChildren = [fc.strip() for fc in frameChildren]
    frameChildren = [fc for fc in frameChildren if fc != '']
    frameChildren = [findView(fc) for fc in frameChildren]

    # Collate the children/layouts for each view panel
    vpChildren   = []
    vpLayouts    = []
    vpPanelProps = []
    vpSceneProps = []

    for i in range(len(frameChildren)):

        linei  = (i * 2) + 2
        config = lines[linei]
        layout = lines[linei + 1]

        children, panelProps, sceneProps = config.split(';')

        vpChildren  .append(children)
        vpLayouts   .append(layout)
        vpPanelProps.append(panelProps)
        vpSceneProps.append(sceneProps)

    # The ViewPanel children string is a comma-separated
    # list of control panel class names. All control panels
    # should be defined in the fsleyes.controls package.
    for i in range(len(vpChildren)):
        children      = vpChildren[i].split(',')
        children      = [vpc.strip() for vpc in children]
        children      = [vpc for vpc in children if vpc != '']
        children      = [findControl(vpc) for vpc in children]
        vpChildren[i] = children

    # The panel props and scene props strings are
    # comma-separated lists of 'prop=value' pairs.
    # We'll turn them into a dict for convenience.
    for i in range(len(vpPanelProps)):
        props           = vpPanelProps[i].split(',')
        props           = [p for p in props if p != '']
        props           = [p.split('=') for p in props]
        vpPanelProps[i] = collections.OrderedDict(props)

    for i in range(len(vpSceneProps)):
        props           = vpSceneProps[i].split(',')
        props           = [p for p in props if p != '']
        props           = [p.split('=') for p in props]
        vpSceneProps[i] = collections.OrderedDict(props)

    return (frameChildren,
            frameLayout,
            vpChildren,
            vpLayouts,
            vpPanelProps,
            vpSceneProps)
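# As a sketch of typical usage of the two functions above (``frame`` is
# assumed to be an existing FSLeyesFrame):
#
#     layout = serialiseLayout(frame)
#     (views, frameLayout, controls,
#      vpLayouts, panelProps, sceneProps) = deserialiseLayout(layout)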
def _addToLayoutList(layout):
    """Adds the given layout name to the list of saved layouts. """

    layout  = layout.strip()
    layouts = getAllLayouts()

    if layout not in layouts:
        layouts.append(layout)

    log.debug('Updating stored layout list: {}'.format(layout))
    fslsettings.write('fsleyes.layouts', layouts)


def _removeFromLayoutList(layout):
    """Removes the given layout name from the list of saved layouts.
    """

    layouts = getAllLayouts()

    try:               layouts.remove(layout)
    except ValueError: return

    log.debug('Updating stored layout list: {}'.format(layouts))
    fslsettings.write('fsleyes.layouts', layouts)
def _addControlPanel(viewPanel, panelType):
    """Adds a control panel to the given :class:`.ViewPanel`.

    :arg viewPanel: A :class:`.ViewPanel` instance.
    :arg panelType: A control panel type.
    """
    viewPanel.togglePanel(panelType)


def _getPanelProps(panel):
    """Creates and returns two dictionaries, containing properties of the given
    :class:`.ViewPanel` (and its associated :class:`.SceneOpts` instance, if
    it is a :class:`.CanvasPanel`, or :class:`.PlotCanvas`, if it is a
    :class:`.PlotPanel`), which are to be saved as part of a serialised
    *FSLeyes* layout. The properties to be saved are listed in the
    :data:`VIEWPANEL_PROPS` dictionary.
    """

    if not isinstance(panel, (canvaspanel.CanvasPanel, plotpanel.PlotPanel)):
        return {}, {}

    panelType              = type(panel).__name__
    panelProps, sceneProps = VIEWPANEL_PROPS.get(panelType, ({}, {}))

    if isinstance(panel, canvaspanel.CanvasPanel):
        aux = panel.sceneOpts
    elif isinstance(panel, plotpanel.PlotPanel):
        aux = panel.canvas

    panelProps = {name : panel.serialise(name) for name in panelProps}
    sceneProps = {name : aux  .serialise(name) for name in sceneProps}

    return panelProps, sceneProps
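# As an illustrative sketch: for an OrthoPanel, _getPanelProps returns
# two dicts, each mapping the names listed in VIEWPANEL_PROPS['OrthoPanel']
# (e.g. 'syncLocation' for the panel, 'showCursor' for its SceneOpts)
# to their corresponding serialised values.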
VIEWPANEL_PROPS = {
    'OrthoPanel'         : [['syncLocation',
                             'syncOverlayOrder',
                             'syncOverlayDisplay',
                             'syncOverlayVolume',
                             'movieRate',
                             'movieAxis'],
                            ['showCursor',
                             'bgColour',
                             'fgColour',
                             'cursorColour',
                             'cursorGap',
                             'showColourBar',
                             'colourBarLocation',
                             'colourBarLabelSide',
                             'showXCanvas',
                             'showYCanvas',
                             'showZCanvas',
                             'showLabels',
                             'labelSize',
                             'layout',
                             'xzoom',
                             'yzoom',
                             'zzoom',
                             'highDpi']],
    'LightBoxPanel'      : [['syncLocation',
                             'syncOverlayOrder',
                             'syncOverlayDisplay',
                             'syncOverlayVolume',
                             'movieRate',
                             'movieAxis'],
                            ['showCursor',
                             'bgColour',
                             'fgColour',
                             'cursorColour',
                             'showColourBar',
                             'colourBarLocation',
                             'colourBarLabelSide',
                             'zax',
                             'showGridLines',
                             'highlightSlice',
                             'highDpi']],
    'Scene3DPanel'       : [['syncLocation',
                             'syncOverlayOrder',
                             'syncOverlayDisplay',
                             'syncOverlayVolume'],
                            ['showCursor',
                             'bgColour',
                             'fgColour',
                             'cursorColour',
                             'showColourBar',
                             'colourBarLocation',
                             'colourBarLabelSide',
                             'occlusion',
                             'light',
                             'lightPos',
                             'offset',
                             'rotation',
                             'showLegend']],
    'TimeSeriesPanel'    : [['usePixdim',
                             'plotMode',
                             'plotMelodicICs'],
                            ['legend',
                             'xAutoScale',
                             'yAutoScale',
                             'xLogScale',
                             'yLogScale',
                             'ticks',
                             'grid',
                             'gridColour',
                             'bgColour',
                             'smooth']],
    'HistogramPanel'     : [['histType',
                             'plotType'],
                            ['legend',
                             'xAutoScale',
                             'yAutoScale',
                             'xLogScale',
                             'yLogScale',
                             'ticks',
                             'grid',
                             'gridColour',
                             'bgColour',
                             'smooth']],
    'PowerSpectrumPanel' : [['plotMelodicICs',
                             'plotFrequencies'],
                            ['legend',
                             'xAutoScale',
                             'yAutoScale',
                             'xLogScale',
                             'yLogScale',
                             'ticks',
                             'grid',
                             'gridColour',
                             'bgColour',
                             'smooth']]}
# The order in which properties are defined in
# a layout is the order in which they will
# be applied. This is important to remember when
# considering properties that have side effects
# (e.g. setting SceneOpts.bgColour will clobber
# SceneOpts.fgColour).
BUILT_IN_LAYOUTS = collections.OrderedDict((
    ('default',
     textwrap.dedent("""
fsleyes.views.orthopanel.OrthoPanel
layout2|name=OrthoPanel 1;caption=Ortho View 1;state=67376064;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|
fsleyes.controls.orthotoolbar.OrthoToolBar,fsleyes.controls.overlaydisplaytoolbar.OverlayDisplayToolBar,fsleyes.controls.overlaylistpanel.OverlayListPanel,fsleyes.controls.locationpanel.LocationPanel;syncOverlayOrder=True,syncLocation=True,syncOverlayDisplay=True,movieRate=400;colourBarLocation=top,showCursor=True,bgColour=#000000ff,layout=horizontal,colourBarLabelSide=top-left,cursorGap=False,fgColour=#ffffffff,cursorColour=#00ff00ff,showXCanvas=True,showYCanvas=True,showColourBar=False,showZCanvas=True,showLabels=True
layout2|name=Panel;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OrthoToolBar;caption=Ortho view toolbar;state=67382012;dir=1;layer=10;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OverlayDisplayToolBar;caption=Display toolbar;state=67382012;dir=1;layer=11;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OverlayListPanel;caption=Overlay list;state=67373052;dir=3;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=1;minh=1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=LocationPanel;caption=Location;state=67373052;dir=3;layer=0;row=0;pos=1;prop=100000;bestw=-1;besth=-1;minw=1;minh=1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|dock_size(3,0,0)=176|dock_size(1,10,0)=49|dock_size(1,11,0)=67|
""")), # noqa
('melodic',
textwrap.dedent("""
fsleyes.views.lightboxpanel.LightBoxPanel,fsleyes.views.timeseriespanel.TimeSeriesPanel,fsleyes.views.powerspectrumpanel.PowerSpectrumPanel
layout2|name=LightBoxPanel 1;caption=Lightbox View 1;state=67377088;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=TimeSeriesPanel 2;caption=Time series 2;state=67377148;dir=3;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=PowerSpectrumPanel 3;caption=Power spectra 3;state=67377148;dir=3;layer=0;row=0;pos=1;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|dock_size(3,0,0)=224|
fsleyes.controls.locationpanel.LocationPanel,fsleyes.controls.overlaylistpanel.OverlayListPanel,fsleyes.plugins.controls.melodicclassificationpanel.MelodicClassificationPanel,fsleyes.controls.lightboxtoolbar.LightBoxToolBar,fsleyes.controls.overlaydisplaytoolbar.OverlayDisplayToolBar;syncLocation=True,syncOverlayOrder=True,movieRate=750,syncOverlayDisplay=True;bgColour=#000000ff,fgColour=#ffffffff,showCursor=True,cursorColour=#00ff00ff,highlightSlice=False,zax=2,showColourBar=False,showGridLines=False,colourBarLocation=top
layout2|name=Panel;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=LocationPanel;caption=Location;state=67373052;dir=3;layer=0;row=0;pos=1;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OverlayListPanel;caption=Overlay list;state=67373052;dir=3;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=MelodicClassificationPanel;caption=Melodic IC classification;state=67373052;dir=2;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=LightBoxToolBar;caption=Lightbox view toolbar;state=67382012;dir=1;layer=10;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OverlayDisplayToolBar;caption=Display toolbar;state=67382012;dir=1;layer=11;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|dock_size(3,0,0)=130|dock_size(1,10,0)=45|dock_size(1,11,0)=51|dock_size(2,0,0)=402|
TimeSeriesToolBar;;
layout2|name=FigureCanvasWxAgg;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=TimeSeriesToolBar;caption=Time series toolbar;state=67382012;dir=1;layer=10;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=642|dock_size(1,10,0)=36|
PowerSpectrumToolBar;;
layout2|name=FigureCanvasWxAgg;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=PowerSpectrumToolBar;caption=Plot toolbar;state=67382012;dir=1;layer=10;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=642|dock_size(1,10,0)=36|
""")), # noqa
('feat',
textwrap.dedent("""
fsleyes.views.orthopanel.OrthoPanel,fsleyes.views.timeseriespanel.TimeSeriesPanel
layout2|name=OrthoPanel 1;caption=Ortho View 1;state=67377088;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=TimeSeriesPanel 2;caption=Time series 2;state=67377148;dir=3;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|dock_size(3,0,0)=282|
fsleyes.controls.overlaylistpanel.OverlayListPanel,fsleyes.controls.overlaydisplaytoolbar.OverlayDisplayToolBar,fsleyes.controls.orthotoolbar.OrthoToolBar,fsleyes.controls.locationpanel.LocationPanel,fsleyes.plugins.controls.clusterpanel.ClusterPanel;syncLocation=True,syncOverlayOrder=True,movieRate=750,syncOverlayDisplay=True;layout=horizontal,showLabels=True,bgColour=#000000ff,fgColour=#ffffffff,showCursor=True,showZCanvas=True,cursorColour=#00ff00ff,showColourBar=False,showYCanvas=True,showXCanvas=True,colourBarLocation=top
layout2|name=Panel;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OverlayListPanel;caption=Overlay list;state=67373052;dir=3;layer=2;row=0;pos=0;prop=87792;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OverlayDisplayToolBar;caption=Display toolbar;state=67382012;dir=1;layer=10;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OrthoToolBar;caption=Ortho view toolbar;state=67382012;dir=1;layer=10;row=1;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=LocationPanel;caption=Location;state=67373052;dir=3;layer=2;row=0;pos=1;prop=98544;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=ClusterPanel;caption=Cluster browser;state=67373052;dir=2;layer=1;row=0;pos=0;prop=114760;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=10|dock_size(2,1,0)=566|dock_size(1,10,0)=51|dock_size(1,10,1)=36|dock_size(3,2,0)=130|
OverlayListPanel,TimeSeriesToolBar;;
layout2|name=FigureCanvasWxAgg;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=OverlayListPanel;caption=Overlay list;state=67373052;dir=4;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|name=TimeSeriesToolBar;caption=Time series toolbar;state=67382012;dir=1;layer=10;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=642|dock_size(1,10,0)=36|dock_size(4,0,0)=206|
""")), # noqa
('ortho',
textwrap.dedent("""
fsleyes.views.orthopanel.OrthoPanel
layout2|name=OrthoPanel 1;caption=Ortho View 1;state=67376064;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|
;syncLocation=True,syncOverlayOrder=True,syncOverlayDisplay=True;layout=horizontal,showLabels=True,bgColour=#000000ff,fgColour=#ffffffff,showCursor=True,showZCanvas=True,cursorColour=#00ff00ff,showColourBar=False,showYCanvas=True,showXCanvas=True,colourBarLocation=top
layout2|name=Panel;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|
""")), # noqa
('3d',
textwrap.dedent("""
fsleyes.views.scene3dpanel.Scene3DPanel
layout2|name=Scene3DPanel 1;caption=3D View 1;state=67376064;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=24|
;syncOverlayOrder=True,syncOverlayDisplay=True,syncLocation=True;showColourBar=False,showLegend=True,cursorColour=#00ff00ff,colourBarLocation=top,showCursor=True,colourBarLabelSide=top-left,bgColour=#9999c0ff,fgColour=#00ff00ff
layout2|name=Panel;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|
""")), # noqa
('lightbox',
textwrap.dedent("""
fsleyes.views.lightboxpanel.LightBoxPanel
layout2|name=LightBoxPanel 1;caption=Lightbox View 1;state=67376064;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=22|
;syncLocation=True,syncOverlayOrder=True,syncOverlayDisplay=True;bgColour=#000000ff,fgColour=#ffffffff,showCursor=True,cursorColour=#00ff00ff,highlightSlice=False,zax=2,showColourBar=False,showGridLines=False,colourBarLocation=top
layout2|name=Panel;caption=;state=768;dir=5;layer=0;row=0;pos=0;prop=100000;bestw=-1;besth=-1;minw=-1;minh=-1;maxw=-1;maxh=-1;floatx=-1;floaty=-1;floatw=-1;floath=-1;notebookid=-1;transparent=255|dock_size(5,0,0)=10|
""")))) # noqa


# File: kwickstart/templates/flask/app.py (repo: TxConvergentAdmin/convergent-kwickstart, MIT)
from flask import Flask, jsonify, request

PORT = 5000

app = Flask(__name__)


@app.route('/')
def index():
    return 'Hello World'


@app.route('/data')
def data():
    return jsonify({'error': False, 'data': 123})


if __name__ == "__main__":
    print('Running on http://127.0.0.1:' + str(PORT))
    app.run('0.0.0.0', PORT)


# File: player/migrations/0002_remove_music_thumbnail.py (repo: Amoki/Amoki-Music, MIT)
from __future__ import unicode_literals
from django.db import models, migrations


class Migration(migrations.Migration):

    dependencies = [
        ('player', '0001_initial'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='music',
            name='thumbnail',
        ),
    ]


# File: flaskapp/models.py (repo: guillermosainz/instareplic, MIT)
from mongoengine import StringField, EmailField, BooleanField
from flask.ext.login import UserMixin
import requests
import json
from mongoengine import Document
from social.apps.flask_app.me.models import FlaskStorage


class User(Document, UserMixin):
    username = StringField(max_length=200)
    password = StringField(max_length=200, default='')
    name = StringField(max_length=100)
    fullname = StringField(max_length=100)
    first_name = StringField(max_length=100)
    last_name = StringField(max_length=100)
    email = EmailField()
    active = BooleanField(default=True)

    def facebook_api(self, url, fields=None):
        params = {
            'access_token': self.get_social_auth("facebook").extra_data['access_token']
        }
        if fields:
            params["fields"] = ",".join(fields)
        res = requests.get(url, params=params)
        if res.status_code != 200:
            raise Exception("Status was %s" % res.status_code)
        return json.loads(res.content)

    def get_facebook_albums(self):
        return self.facebook_api("https://graph.facebook.com/v2.2/me/albums",
                                 fields=["id", "name"])["data"]

    def get_facebook_photos(self, album_id):
        photos = []
        url = "https://graph.facebook.com/v2.2/%s/photos" % album_id
        while url:
            ret = self.facebook_api(url, fields=[
                "id", "created_time", "from", "height", "width", "name", "source"
            ])
            photos += ret["data"]
            url = ret.get("paging", {}).get("next")
        return photos

    def get_social_auth(self, provider):
        return FlaskStorage.user.get_social_auth_for_user(
            self, provider=provider).get()

    def is_active(self):
        return self.active


# File: notes/reference/tutorials/an-introduction-to-asynch-programming-and-twisted/exercises/part3/ex2.py
# (repo: aav789/study-notes, MIT; Python 2 / Twisted)
from twisted.internet import reactor, task
class CounterManager(object):
    counters = []

    @classmethod
    def add_counter(cls, counter):
        cls.counters.append(counter)

    @classmethod
    def has_active_counters(cls):
        # Note: despite the name, this returns True when *no*
        # counters remain active (all have finished counting).
        return all([not c.is_active for c in cls.counters])


class Counter(object):

    def __init__(self, name, between_time, counter=5):
        self.name = name
        self.between_time = between_time
        self.counter = counter
        self.is_active = True
        CounterManager.add_counter(self)

    def start(self):
        self.loop_handler = task.LoopingCall(self.count)
        self.loop_handler.start(self.between_time)

    def count(self):
        if self.counter == 0:
            self.is_active = False
            self.loop_handler.stop()
            if CounterManager.has_active_counters():
                print 'No counters active. Stopping!'
                reactor.stop()
        else:
            print self.name + ':', self.counter
            self.counter -= 1


print 'Start'
Counter('1', 0.5).start()
Counter('2', 1).start()
Counter('3', 0.1).start()
reactor.run()


# File: app/api_docs/__init__.py (repo: linrong/flask-server, MIT)
# _*_ coding: utf-8 _*_
"""
Created by lr on 2019/08/29.
This module holds the detailed operation docs for the API endpoints listed in flasgger.
"""
from app.api_docs.v1 import user, client, token, \
banner, theme, product, category, \
address, order, pay
from app.api_docs.cms import cms_user, file
__author__ = 'lr' | 24.909091 | 50 | 0.69708 | 38 | 274 | 4.736842 | 0.789474 | 0.077778 | 0.111111 | 0.155556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044643 | 0.182482 | 274 | 11 | 51 | 24.909091 | 0.758929 | 0.29562 | 0 | 0 | 0 | 0 | 0.01087 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
e89be0e46a4e4f06c678ed62e880b446af83ee7d | 4,294 | py | Python | src/thresholding/Utilities.py | dsp-uga/Team-kieffer | f71ebcea3928d00496fb32156ebe990083795d29 | [
"MIT"
] | null | null | null | src/thresholding/Utilities.py | dsp-uga/Team-kieffer | f71ebcea3928d00496fb32156ebe990083795d29 | [
"MIT"
] | null | null | null | src/thresholding/Utilities.py | dsp-uga/Team-kieffer | f71ebcea3928d00496fb32156ebe990083795d29 | [
"MIT"
] | null | null | null | """
Author: Narinder Singh Project: Cilia Segmentation Date: 27 Feb 2019
Course: CSCI 8360 @ UGA Semester: Spring 2019 Module: Utilities.py
Description: This module contains methods and classes that make life easier.
"""
import os
import sys
import numpy as np
import matplotlib.pyplot as matplot
from imageio import imwrite as imsave  # scipy.misc.imsave was removed in SciPy 1.2
from PIL import Image
from Config import *
MASKS_PATH = os.path.join(DATA_FILES_PATH, "masks/")
LIT_MASKS_PATH = os.path.join(MASKS_PATH, "lit/")
FRAMES_PATH = os.path.join(DATA_FILES_PATH, "data/frames")
# Stretching constant for masks to scale the range of grayscales from [0, 2] to [0, 255]
MASK_STRETCHING_CONSTANT = 127.5
class UtilitiesError(Exception): pass
class BadHashError(UtilitiesError): pass
class ProgressBar:
"""
A handrolled implementation of a progress bar. The bar displays the progress as a ratio like this: (1/360).
"""
def __init__(self, max = 100, message = "Initiating ....."):
"""
Initialize the bar with the total number of units (scale).
"""
self.max = max
self.current = 0
        print(message + '\n')
def update(self, add = 1):
"""
Record progress.
"""
self.current += add
self._clear()
self._display()
def _display(self):
"""
Print the completion ratio on the screen.
"""
        print("(" + str(self.current) + "/" + str(self.max) + ")")
def _clear(self):
"""
Erase the old ratio from the console.
"""
sys.stdout.write("\033[F")
sys.stdout.flush()
def flen(filename):
"""
    File LENgth computes and returns the number of lines in a file. @filename <string> is the path to a file. This is an expensive method to call, since the whole file must be read to determine the number of lines.
returns: <integer> line count
"""
# Read and count lines.
with open(filename, 'r') as infile:
return sum((1 for line in infile))
def isImageFile(fpath):
"""
Returns whether or not the given path or filename is for an image file. The method is crude at the moment and just checks for some popular formats.
"""
path, fname = os.path.split(fpath)
    return fname.lower().endswith(("png", "jpg", "jpeg", "gif", "tiff", "bmp"))
def invertMask(mask):
"""
Inverts a numpy binary mask.
"""
    return np.logical_not(mask)
def readMask(hash, binarize=True):
"""
Reads the mask for the given hash and if binarize flag is set, makes the mask binary (True/False : Cilia/Not-cilia)
"""
fpath = os.path.join(MASKS_PATH, hash + ".png")
if not os.path.isfile(fpath): raise BadHashError("Hash: " + hash + " does not exist OR does not have a mask against it.")
img = Image.open(fpath)
    # np.array (unlike np.asarray) copies the image data, so the result
    # is writable without poking at the write flag.
    mat = np.array(img, np.int32)
if binarize:
ciliaMask = mat == CILIA_GRAYSCALE
backgroundMask = invertMask(ciliaMask)
mat[ciliaMask] = True
mat[backgroundMask] = False
return mat
def displayMask(hash, binarize=True):
"""
Displays the cilia mask against the given hash value.
"""
mask = readMask(hash, binarize)
if binarize: im = Image.fromarray(mask * 255)
else: im = Image.fromarray(mask * MASK_STRETCHING_CONSTANT)
im.show()
def displayHeatMap(mat):
"""
    Displays the heat map for the given matrix.
"""
matplot.imshow(mat, cmap='hot')
matplot.show()
def readLines(filepath):
"""
Reads and returns the lines of the given file as a list.
"""
lines = []
with open(filepath, 'r') as infile:
for line in infile:
lines.append(line.strip())
return lines
def getVideoFramesDirectory(hash):
"""
Returns the video frames directory for the given hash.
"""
dir = os.path.join(FRAMES_PATH, hash)
if not os.path.isdir(dir): raise BadHashError("No frame directory found against the hash: " + hash)
else: return dir
def mean(collection):
"""
Mean for a numeric collection
"""
return sum(collection) / (len(collection) or 1)
def stretchAndSaveMasks(hashes):
"""
    This method stretches the contrast of the masks by rescaling them to 0-255 grayscale, so that the white regions in the masks are the cilia cells.
"""
# Read each mask and hash
for hash in hashes:
mask = readMask(hash, binarize=False)
result = mask * MASK_STRETCHING_CONSTANT
imsave(os.path.join(LIT_MASKS_PATH, hash + ".png"), result)
if __name__ == '__main__':
# Quick testing etc.
hashes = readLines(TRAIN_FILE)
stretchAndSaveMasks(hashes)
| 24.678161 | 199 | 0.698882 | 625 | 4,294 | 4.7424 | 0.384 | 0.018219 | 0.020243 | 0.01417 | 0.035762 | 0.018219 | 0.018219 | 0 | 0 | 0 | 0 | 0.013742 | 0.186539 | 4,294 | 173 | 200 | 24.820809 | 0.834812 | 0.035165 | 0 | 0 | 0 | 0 | 0.072345 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.026316 | 0.092105 | null | null | 0.026316 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e89e57eee1002924427fb9529bc74acb259d5736 | 5,895 | py | Python | zm-jython/jylibs/ldap.py | hernad/zimbra9 | cf61ffa40d9600ab255ef4516ca25029fff6603b | [
"Apache-2.0"
] | null | null | null | zm-jython/jylibs/ldap.py | hernad/zimbra9 | cf61ffa40d9600ab255ef4516ca25029fff6603b | [
"Apache-2.0"
] | null | null | null | zm-jython/jylibs/ldap.py | hernad/zimbra9 | cf61ffa40d9600ab255ef4516ca25029fff6603b | [
"Apache-2.0"
] | null | null | null | #
# ***** BEGIN LICENSE BLOCK *****
# Zimbra Collaboration Suite Server
# Copyright (C) 2010, 2012, 2013, 2014, 2015, 2016 Synacor, Inc.
#
# This program is free software: you can redistribute it and/or modify it under
# the terms of the GNU General Public License as published by the Free Software Foundation,
# version 2 of the License.
#
# This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
# You should have received a copy of the GNU General Public License along with this program.
# If not, see <https://www.gnu.org/licenses/>.
# ***** END LICENSE BLOCK *****
#
import re

import conf
from com.zimbra.cs.ldap.LdapServerConfig import GenericLdapConfig
from com.zimbra.cs.ldap import LdapClient
from com.zimbra.cs.ldap import LdapUsage
from com.zimbra.cs.ldap import ZAttributes
from com.zimbra.cs.ldap import ZLdapContext
from com.zimbra.cs.ldap import ZLdapFilter
from com.zimbra.cs.ldap import ZLdapFilterFactory
from com.zimbra.cs.ldap.ZLdapFilterFactory import FilterId
from com.zimbra.cs.ldap import ZSearchControls
from com.zimbra.cs.ldap import ZSearchResultEntry
from com.zimbra.cs.ldap import ZMutableEntry
from com.zimbra.cs.ldap import ZSearchResultEnumeration
from com.zimbra.cs.ldap import ZSearchScope
from com.zimbra.cs.ldap.LdapException import LdapSizeLimitExceededException
from logmsg import *
# (Key, DN, requires_master)
keymap = {
"ldap_common_loglevel" : ("olcLogLevel", "cn=config", False),
"ldap_common_threads" : ("olcThreads", "cn=config", False),
"ldap_common_toolthreads" : ("olcToolThreads", "cn=config", False),
"ldap_common_require_tls" : ("olcSecurity", "cn=config", False),
"ldap_common_writetimeout" : ("olcWriteTimeout", "cn=config", False),
"ldap_common_tlsdhparamfile" : ("olcTLSDHParamFile", "cn=config", False),
"ldap_common_tlsprotocolmin" : ("olcTLSProtocolMin", "cn=config", False),
"ldap_common_tlsciphersuite" : ("olcTLSCipherSuite", "cn=config", False),
"ldap_db_maxsize" : ("olcDbMaxsize", "olcDatabase={3}mdb,cn=config", False),
"ldap_db_envflags" : ("olcDbEnvFlags", "olcDatabase={3}mdb,cn=config", False),
"ldap_db_rtxnsize" : ("olcDbRtxnSize", "olcDatabase={3}mdb,cn=config", False),
"ldap_accesslog_maxsize" : ("olcDbMaxsize", "olcDatabase={2}mdb,cn=config", True),
"ldap_accesslog_envflags" : ("olcDbEnvFlags", "olcDatabase={2}mdb,cn=config", True),
"ldap_overlay_syncprov_checkpoint" : ("olcSpCheckpoint", "olcOverlay={0}syncprov,olcDatabase={3}mdb,cn=config", True),
"ldap_overlay_syncprov_sessionlog" : ("olcSpSessionlog", "olcOverlay={0}syncprov,olcDatabase={3}mdb,cn=config", True),
"ldap_overlay_accesslog_logpurge" : ("olcAccessLogPurge", "olcOverlay={1}accesslog,olcDatabase={3}mdb,cn=config", True)
}
class Ldap:
master = False
mLdapConfig = None
@classmethod
def initLdap(cls, c = None):
if c:
cls.cf = c
Log.logMsg(5, "Creating ldap context")
ldapUrl = "ldapi:///"
bindDN = "cn=config"
try:
cls.mLdapConfig = GenericLdapConfig(ldapUrl, cls.cf.ldap_starttls_required, bindDN, cls.cf.ldap_root_password)
        except Exception as e:
Log.logMsg(1, "LDAP CONFIG FAILURE (%s)" % e)
else:
cls.cf = conf.Config()
@classmethod
def modify_attribute(cls, key, value):
if cls.cf.ldap_is_master:
atbase = "cn=accesslog"
atfilter = "(objectClass=*)"
atreturn = ['1.1']
zfilter = ZLdapFilterFactory.getInstance().fromFilterString(FilterId.ZMCONFIGD, atfilter)
searchControls = ZSearchControls.createSearchControls(ZSearchScope.SEARCH_SCOPE_BASE, ZSearchControls.SIZE_UNLIMITED, atreturn)
mLdapContext = LdapClient.getContext(cls.mLdapConfig, LdapUsage.SEARCH)
try:
ne = mLdapContext.searchDir(atbase, zfilter, searchControls)
except:
cls.master = False
else:
cls.master = True
Log.logMsg(5, "Ldap config is master")
LdapClient.closeContext(mLdapContext)
(attr, dn, xform) = Ldap.lookupKey(key)
if attr is not None:
v = xform % (value,)
atreturn = [attr]
searchControls = ZSearchControls.createSearchControls(ZSearchScope.SEARCH_SCOPE_BASE, ZSearchControls.SIZE_UNLIMITED, atreturn)
mLdapContext = LdapClient.getContext(cls.mLdapConfig, LdapUsage.SEARCH)
ne = mLdapContext.searchDir(dn, zfilter, searchControls)
entry = ne.next()
entryAttrs = entry.getAttributes()
origValue = entryAttrs.getAttrString(attr)
attrPresent = entryAttrs.hasAttribute(attr)
LdapClient.closeContext(mLdapContext)
if origValue != v:
if attr == "olcSpSessionlog" and not attrPresent:
Log.logMsg(4, "olcSpSessionlog attribute is not present and can't replace it")
else:
Log.logMsg(4, "Setting %s to %s" % (key, v))
mLdapContext = LdapClient.getContext(cls.mLdapConfig, LdapUsage.MOD)
mEntry = LdapClient.createMutableEntry()
mEntry.setAttr(attr, v)
try:
mLdapContext.replaceAttributes(dn, mEntry.getAttributes())
LdapClient.closeContext(mLdapContext)
except:
                        return 1
@classmethod
def lookupKey(cls, key):
if key in keymap:
(attr, dn, requires_master) = keymap[key]
if re.match("ldap_db_", key) and not cls.master:
dn = "olcDatabase={2}mdb,cn=config"
xform = "%s"
if key == "ldap_common_require_tls":
xform = "ssf=%s"
if requires_master and not cls.master:
Log.logMsg(5, "LDAP: Trying to modify key: %s when not a master" % (key,))
return (None, None, None)
else:
Log.logMsg(5, "Found key %s and dn %s for %s (%s)" % (attr, dn, key, cls.master))
return (attr, dn, xform)
else:
Log.logMsg(1, "UNKNOWN KEY %s" % (key,))
raise Exception("Key error")
Ldap.initLdap()
| 40.9375 | 130 | 0.709754 | 725 | 5,895 | 5.692414 | 0.317241 | 0.034892 | 0.0441 | 0.050884 | 0.32566 | 0.256361 | 0.156288 | 0.127938 | 0.111461 | 0.111461 | 0 | 0.009788 | 0.168109 | 5,895 | 143 | 131 | 41.223776 | 0.83177 | 0.126887 | 0 | 0.18018 | 0 | 0 | 0.26126 | 0.123416 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.009009 | 0.144144 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e8a44e30435268e66fe7dcfc673cc83f8a07c67f | 5,387 | py | Python | features/steps/common.py | PolySync/kevlar-laces | a5f36d1e4955963b3f1be3fd0ad8d3839a0bee3b | [
"MIT"
] | 3 | 2017-10-05T23:25:40.000Z | 2018-06-06T10:03:56.000Z | features/steps/common.py | PolySync/kevlar-laces | a5f36d1e4955963b3f1be3fd0ad8d3839a0bee3b | [
"MIT"
] | 11 | 2017-10-05T23:22:30.000Z | 2018-07-10T21:13:40.000Z | features/steps/common.py | PolySync/kevlar-laces | a5f36d1e4955963b3f1be3fd0ad8d3839a0bee3b | [
"MIT"
] | 3 | 2018-06-22T21:43:07.000Z | 2021-02-01T12:02:43.000Z | from behave import *
from hamcrest import *
import subprocess
import shlex
import os
import tempfile
import utils
@given('a local copy of the repo on the {branch} branch')
def step_impl(context, branch):
context.mock_developer_dir = tempfile.mkdtemp(prefix='kevlar')
utils.shell_command('git -C {0} clone -q file:///{1} . -b {2}'.format(context.mock_developer_dir, context.mock_github_dir, branch))
utils.shell_command('git -C {0} checkout -q {1}'.format(context.mock_developer_dir, branch))
utils.shell_command('git -C {0} config --local user.signingkey 794267AC'.format(context.mock_developer_dir))
utils.shell_command('git -C {0} config --local user.name "Local Test"'.format(context.mock_developer_dir))
utils.shell_command('git -C {0} config --local user.email "donut-reply@polysync.io"'.format(context.mock_developer_dir))
utils.shell_command('git -C {0} config --local gpg.program gpg2'.format(context.mock_developer_dir))
@given('I create a new {branch} branch')
def step_impl(context, branch):
command = 'git -C {0} checkout -b {1}'.format(context.mock_developer_dir, branch)
utils.run_with_project_in_path(command, context)
@given('the {release_tag} release tag already exists')
def step_impl(context, release_tag):
command = 'git -C {0} tag -s {1} -m {1}'.format(context.mock_github_dir, release_tag)
utils.run_with_project_in_path(command, context)
@given('the GPG signing key is not available')
def step_impl(context):
command = 'git -C {0} config --local user.signingkey 00000000'.format(context.mock_developer_dir)
utils.run_with_project_in_path(command, context)
@given('I have done some work on the repo')
def step_impl(context):
utils.shell_command('cp -a {0}/features/test_file.txt {1}/test_file.txt'.format(os.getcwd(), context.mock_developer_dir))
@given('the project contains subdirectory {directory}')
def step_impl(context, directory):
wd = '{0}/{1}'.format(context.mock_developer_dir, directory)
utils.run_with_project_in_path('mkdir {0}/{1}'.format(context.mock_developer_dir, directory), context)
context.wd = wd
@given('the {branch} branch contains unsigned commits')
def step_impl(context, branch):
utils.run_with_project_in_path('git -C {0} commit --allow-empty --no-gpg-sign -m "creating an unsigned commit"'.format(context.mock_developer_dir), context)
@given('the {tag} tag is unsigned')
def step_impl(context, tag):
utils.run_with_project_in_path('git -C {0} tag -a {1} -m {1}'.format(context.mock_developer_dir, tag), context)
@given('the {tag} tag contains unsigned commits')
def step_impl(context, tag):
utils.run_with_project_in_path('git -C {0} commit --allow-empty --no-gpg-sign -m "creating an unsigned commit"'.format(context.mock_developer_dir), context)
utils.run_with_project_in_path('git -C {0} tag -s {1} -m {1}'.format(context.mock_developer_dir, tag), context)
@when('the {command} command is run with the -h flag')
def step_impl(context, command):
cmd = 'git -C {0} {1} -h'.format(context.mock_developer_dir, command)
context.out, context.err, context.rc = utils.run_with_project_in_path(cmd, context)
@when('I run git {action} from the {directory} directory')
def step_impl(context, action, directory):
command = 'git -C {0} {1}'.format(context.wd, action)
context.out, context.err, context.rc = utils.run_with_project_in_path(command, context)
@when('I run git-{action}')
def step_impl(context, action):
command = 'git -C {0} {1}'.format(context.mock_developer_dir, action)
context.out, context.err, context.rc = utils.run_with_project_in_path(command, context)
@then('the script should return {exit_code}')
def step_impl(context, exit_code):
assert_that(context.rc, equal_to(int(exit_code)))
@then('the merge commit should be signed')
def step_impl(context):
command = "git -C {0} verify-commit {1}".format(context.mock_github_dir, context.sha_hash)
unused, verify_output, rc = utils.run_with_project_in_path(command, context)
assert_that(verify_output, contains_string('Signature made'))
@then('the repo should be returned to the state it was in before I ran the script')
def step_impl(context):
exists = False
original_string = 'A working file with some text'
with open('{0}/test_file.txt'.format(context.mock_developer_dir), 'r') as check_file:
for line in check_file:
if original_string in line:
exists = True
    assert_that(exists, equal_to(True))
@then('the repo should be returned to the {branch} branch when I am done')
def step_impl(context, branch):
out, err, rc = utils.run_with_project_in_path('git -C {0} branch'.format(context.mock_developer_dir), context)
assert_that(out, contains_string(branch))
@then('the {directory} directory should exist when I am done')
def step_impl(context, directory):
    out, err, rc = utils.shell_command('ls {0}/{1}'.format(context.mock_developer_dir, directory))
    assert_that(rc, equal_to(0))
@then('the terminal displays usage options for the {command} command')
def step_impl(context, command):
assert_that(context.out, contains_string('usage:'))
@then('the terminal prints an error')
def step_impl(context):
assert_that(context.out, contains_string('ERROR:'))
@then('the script exits with status 0')
def step_impl(context):
assert_that(context.rc, equal_to(0))
| 46.439655 | 160 | 0.733247 | 834 | 5,387 | 4.545564 | 0.188249 | 0.069639 | 0.110789 | 0.127407 | 0.657874 | 0.569771 | 0.490108 | 0.40517 | 0.289106 | 0.249011 | 0 | 0.012623 | 0.132356 | 5,387 | 115 | 161 | 46.843478 | 0.79846 | 0 | 0 | 0.271739 | 0 | 0.032609 | 0.30982 | 0.009467 | 0 | 0 | 0 | 0 | 0.086957 | 1 | 0.217391 | false | 0 | 0.076087 | 0 | 0.293478 | 0.01087 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e8b070cce38d20c7a2e6fd86cc0080c388e9443f | 310 | py | Python | Module8/inheritance/02_task_IterInt.py | xm4dn355x/specialist_python3_2nd_lvl | 4ea8c82eb0f32aa92c82914f6599c2c47a2f7032 | [
"MIT"
] | null | null | null | Module8/inheritance/02_task_IterInt.py | xm4dn355x/specialist_python3_2nd_lvl | 4ea8c82eb0f32aa92c82914f6599c2c47a2f7032 | [
"MIT"
] | null | null | null | Module8/inheritance/02_task_IterInt.py | xm4dn355x/specialist_python3_2nd_lvl | 4ea8c82eb0f32aa92c82914f6599c2c47a2f7032 | [
"MIT"
] | null | null | null | # Разработать класс IterInt, который наследует функциональность стандартного типа int, но добавляет
# возможность итерировать по цифрам числа
class IterInt(int):
    def __iter__(self):
        # Iterate over the digits of the number.
        return (int(digit) for digit in str(self))
n = IterInt(12346)
for digit in n:
print("digit = ", digit)
# Output:
# digit = 1
# digit = 2
# digit = 3
# digit = 4
# digit = 6 | 17.222222 | 99 | 0.693548 | 41 | 310 | 5.243902 | 0.731707 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040984 | 0.212903 | 310 | 18 | 100 | 17.222222 | 0.840164 | 0.632258 | 0 | 0 | 0 | 0 | 0.075472 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.2 | 0 | 0 | 0.2 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
e8b2ef11a1743e630591f449491e6c4cb3e345c2 | 1,100 | py | Python | base/backprop_perceptron.py | tardatio/granary_ai | 1af8efe0f2f971d25763b9b457e9025a73bdfb0d | [
"MIT"
] | null | null | null | base/backprop_perceptron.py | tardatio/granary_ai | 1af8efe0f2f971d25763b9b457e9025a73bdfb0d | [
"MIT"
] | null | null | null | base/backprop_perceptron.py | tardatio/granary_ai | 1af8efe0f2f971d25763b9b457e9025a73bdfb0d | [
"MIT"
] | null | null | null |
def forward(w, x, b, y):
    # Linear model plus squared-error loss.
    Yhat = w * x + b
    output = (Yhat - y) ** 2
    return output, Yhat
def derivative_W(x, output, Yhat, y):
    # dL/dw for L = (Yhat - y)**2 with Yhat = w*x + b.
    # (output stays in the signature for call-site compatibility.)
    return 2 * (Yhat - y) * x

def derivative_B(b, output, Yhat, y):
    # dL/db: the bias enters Yhat with coefficient 1, not b.
    return 2 * (Yhat - y)
def main():
w = 1.0 #weight
x = 2.0 #sample
b = 1.0 #bias
y = 2.0*x #rule
learning = 1e-1
epoch = 3
for i in range(epoch+1):
output, Yhat = forward(w,x,b,y)
print("-----------------------------------------------------------------------------")
print("w:",w)
print("\tw*b:",w*x)
print("x:",x,"\t\tsum:", w*x+b)
print("\tb:",b,"\t\t\tg1:",abs(Yhat-y),"\tg2:",abs(Yhat-y)**2,"\tloss:",output)
print("\t\tY=2*x:", y)
print("-----------------------------------------------------------------------------")
if output == 0.0:
break
gw = derivative_W(x, output, Yhat, y)
gb = derivative_B(b, output, Yhat, y)
w -= learning * gw
b -= learning * gb
if __name__ == '__main__':
main()
| 25 | 94 | 0.408182 | 151 | 1,100 | 2.89404 | 0.278146 | 0.20595 | 0.176201 | 0.08238 | 0.292906 | 0.292906 | 0.132723 | 0.132723 | 0 | 0 | 0 | 0.026415 | 0.277273 | 1,100 | 43 | 95 | 25.581395 | 0.52327 | 0.022727 | 0 | 0.0625 | 0 | 0 | 0.2015 | 0.14433 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0.0625 | 0.21875 | 0.21875 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e8bd7184378822445ccc65b6e322ca8a912026c5 | 385 | py | Python | src/cart/migrations/0005_orderitem_quantity.py | Bakhtiyar-Habib/CSE327-Project | 4126b40eb398e4cf13b49136e552775c5f3b0635 | [
"bzip2-1.0.6"
] | null | null | null | src/cart/migrations/0005_orderitem_quantity.py | Bakhtiyar-Habib/CSE327-Project | 4126b40eb398e4cf13b49136e552775c5f3b0635 | [
"bzip2-1.0.6"
] | null | null | null | src/cart/migrations/0005_orderitem_quantity.py | Bakhtiyar-Habib/CSE327-Project | 4126b40eb398e4cf13b49136e552775c5f3b0635 | [
"bzip2-1.0.6"
] | null | null | null | # Generated by Django 2.0.7 on 2020-05-18 05:54
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('cart', '0004_auto_20200518_1139'),
]
operations = [
migrations.AddField(
model_name='orderitem',
name='quantity',
field=models.IntegerField(default=1),
),
]
| 20.263158 | 49 | 0.597403 | 41 | 385 | 5.512195 | 0.829268 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116788 | 0.288312 | 385 | 18 | 50 | 21.388889 | 0.708029 | 0.116883 | 0 | 0 | 1 | 0 | 0.130178 | 0.068047 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e8bfced731c80842e64d71e3b64a2ac8f77ae7b9 | 566 | py | Python | icpc/2019-10-4/F-gen.py | Riteme/test | b511d6616a25f4ae8c3861e2029789b8ee4dcb8d | [
"BSD-Source-Code"
] | 3 | 2018-08-30T09:43:20.000Z | 2019-12-03T04:53:43.000Z | icpc/2019-10-4/F-gen.py | Riteme/test | b511d6616a25f4ae8c3861e2029789b8ee4dcb8d | [
"BSD-Source-Code"
] | null | null | null | icpc/2019-10-4/F-gen.py | Riteme/test | b511d6616a25f4ae8c3861e2029789b8ee4dcb8d | [
"BSD-Source-Code"
] | null | null | null | #!/usr/bin/pypy
from sys import *
from random import *
n, m, CMAX, d1, d2 = map(int, argv[1:])
#print randint(0, n)
print 1
x0, y0 = randint(-20, -10), randint(-20, -10)
dx, dy = randint(-d1, -1), randint(1, d1)
x1, y1 = x0 + dx, y0 + dy
dx, dy = -dy, dx
print x0, y0
print x1, y1
print x1 + dx, y1 + dy
print x0 + dx, y0 + dy
print n
for i in xrange(n):
print m
x, y = randint(0, CMAX), randint(0, CMAX)
for j in xrange(m):
print x, y, 0
x += randint(-d2, d2)
y += randint(-d2, d2)
x = max(0, x)
y = max(0, y)
| 20.214286 | 45 | 0.533569 | 108 | 566 | 2.796296 | 0.324074 | 0.07947 | 0.072848 | 0.05298 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.099502 | 0.289753 | 566 | 27 | 46 | 20.962963 | 0.651741 | 0.058304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.090909 | null | null | 0.363636 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e8c50788ce4892af43a9be0933cf50a03caaf69d | 2,376 | py | Python | app/app.py | CLARIAH/wp6-missieven | 67e7d0123d37cce36be5353801b90010d4ad4be0 | [
"MIT"
] | null | null | null | app/app.py | CLARIAH/wp6-missieven | 67e7d0123d37cce36be5353801b90010d4ad4be0 | [
"MIT"
] | null | null | null | app/app.py | CLARIAH/wp6-missieven | 67e7d0123d37cce36be5353801b90010d4ad4be0 | [
"MIT"
] | null | null | null | import types
from tf.advanced.app import App
MODIFIERS = """
remark folio note ref emph und super special q num den
""".strip().split()
def fmt_layoutFull(app, n, **kwargs):
return app._wrapHtml(n, ("",))
def fmt_layoutRemarks(app, n, **kwargs):
return app._wrapHtml(n, ("r",))
def fmt_layoutNotes(app, n, **kwargs):
return app._wrapHtml(n, ("n",))
def fmt_layoutOrig(app, n, **kwargs):
return app._wrapHtml(n, ("o",))
def fmt_layoutNoRemarks(app, n, **kwargs):
return app._wrapHtml(n, ("o", "n"))
def fmt_layoutNoNotes(app, n, **kwargs):
return app._wrapHtml(n, ("o", "r"))
def fmt_layoutNonOrig(app, n, **kwargs):
return app._wrapHtml(n, ("r", "n"))
NOTE = "note"
WORD = "word"
class TfApp(App):
def __init__(app, *args, **kwargs):
app.fmt_layoutFull = types.MethodType(fmt_layoutFull, app)
app.fmt_layoutRemarks = types.MethodType(fmt_layoutRemarks, app)
app.fmt_layoutNotes = types.MethodType(fmt_layoutNotes, app)
app.fmt_layoutOrig = types.MethodType(fmt_layoutOrig, app)
app.fmt_layoutNoRemarks = types.MethodType(fmt_layoutNoRemarks, app)
app.fmt_layoutNoNotes = types.MethodType(fmt_layoutNoNotes, app)
app.fmt_layoutNonOrig = types.MethodType(fmt_layoutNonOrig, app)
super().__init__(*args, **kwargs)
def _wrapHtml(app, n, kinds):
api = app.api
F = api.F
Fs = api.Fs
L = api.L
preNote = ""
postNote = ""
if "" in kinds or "n" in kinds:
notes = L.u(n, otype=NOTE)
if notes:
note = notes[0]
mark = F.mark.v(note)
noteWords = L.d(note, otype=WORD)
firstWord = noteWords[0]
lastWord = noteWords[-1]
if firstWord == n:
preNote = f"«{mark}= "
if lastWord == n:
postNote = f" ={mark}»"
material = "".join(Fs(f"trans{kind}").v(n) or "" for kind in kinds)
after = "".join(Fs(f"punc{kind}").v(n) or "" for kind in kinds)
material = f"{preNote}{material}{after}{postNote}"
clses = " ".join(
cf for cf in MODIFIERS if (fscf := Fs(f"is{cf}")) and fscf.v(n)
)
if clses:
material = f'<span class="{clses}">{material}</span>'
return material
| 28.285714 | 76 | 0.570286 | 301 | 2,376 | 4.385382 | 0.255814 | 0.024242 | 0.05303 | 0.084848 | 0.185606 | 0.185606 | 0.185606 | 0.143182 | 0 | 0 | 0 | 0.001756 | 0.281145 | 2,376 | 83 | 77 | 28.626506 | 0.769906 | 0 | 0 | 0 | 0 | 0 | 0.083754 | 0.02904 | 0 | 0 | 0 | 0 | 0 | 1 | 0.152542 | false | 0 | 0.033898 | 0.118644 | 0.338983 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
e8c5b2acc01726c22676df5fc3f9d17efb91642c | 1,424 | py | Python | password-cracking/leatspeak.py | cyberprogrammer/ctf-stuff | 6814ff819c51bb1cddae76ff611bede3909aa129 | [
"MIT"
] | null | null | null | password-cracking/leatspeak.py | cyberprogrammer/ctf-stuff | 6814ff819c51bb1cddae76ff611bede3909aa129 | [
"MIT"
] | null | null | null | password-cracking/leatspeak.py | cyberprogrammer/ctf-stuff | 6814ff819c51bb1cddae76ff611bede3909aa129 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import sys
from string import ascii_letters
import itertools
def includeDefault(charSet):
for i in range(0,256):
charSet[i] = set([i])
def includeInvertedCases(charSet):
for c in ascii_letters:
charSet[ord(c)] |= set([ord(c.lower()) if c.isupper() else ord(c.upper())])
def includeLeetSpeak(charSet):
# TODO: make this more beautiful
charSet[ord("a")] |= set([ord("@"),ord("4")])
charSet[ord("A")] |= set([ord("@"),ord("4")])
charSet[ord("e")] |= set([ord("3")])
charSet[ord("E")] |= set([ord("3")])
charSet[ord("s")] |= set([ord("$"),ord("5")])
charSet[ord("S")] |= set([ord("$"),ord("5")])
charSet[ord("l")] |= set([ord("1")])
charSet[ord("L")] |= set([ord("1")])
charSet[ord("i")] |= set([ord("!"),ord("1")])
charSet[ord("I")] |= set([ord("!"),ord("1")])
def findCombinations(word, charSet):
wordCombinations = []
# Transform string to list of set of characters
    for c in word:
        wordCombinations.append(charSet[c])
for newWord in itertools.product(*wordCombinations):
sys.stdout.buffer.write(bytearray(newWord))
sys.stdout.flush()
if __name__ == "__main__":
charSet = dict()
includeDefault(charSet)
#includeInvertedCases(charSet)
includeLeetSpeak(charSet)
for line in sys.stdin.buffer:
findCombinations(line, charSet)
| 25.428571 | 83 | 0.587079 | 179 | 1,424 | 4.614525 | 0.357542 | 0.133172 | 0.065375 | 0.050847 | 0.239709 | 0.239709 | 0.239709 | 0.239709 | 0.181598 | 0 | 0 | 0.013181 | 0.200843 | 1,424 | 55 | 84 | 25.890909 | 0.712654 | 0.089888 | 0 | 0 | 0 | 0 | 0.026316 | 0 | 0 | 0 | 0 | 0.018182 | 0 | 1 | 0.121212 | false | 0 | 0.090909 | 0 | 0.212121 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e8c9149709f8faee9ce978ba8f02bea4f6d7bb11 | 375 | py | Python | fbscraper.py | Woahisme/final_project | f94fe272f32bc7c1cac3c6939054c0aee6edb71a | [
"MIT"
] | null | null | null | fbscraper.py | Woahisme/final_project | f94fe272f32bc7c1cac3c6939054c0aee6edb71a | [
"MIT"
] | null | null | null | fbscraper.py | Woahisme/final_project | f94fe272f32bc7c1cac3c6939054c0aee6edb71a | [
"MIT"
] | null | null | null |
import json
from facebook_scraper import get_profile
#pass through name from webpage once issue is fixed
json_data = get_profile("passparam", cookies="./fbcookies.json")
json_formatted = json.dumps(json_data, indent = 2)
print(json_formatted)
with open("fb_info.json", "w") as outfile:
json.dump(json_data, outfile, indent = 2) | 25 | 64 | 0.76 | 57 | 375 | 4.807018 | 0.596491 | 0.116788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006135 | 0.130667 | 375 | 15 | 65 | 25 | 0.834356 | 0.229333 | 0 | 0 | 0 | 0 | 0.132404 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.142857 | 0.285714 | 0 | 0.285714 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
e8cf436b3f7cff830940245da46ee0d3379aa73a | 435 | py | Python | tinder/__init__.py | mzdravkov/elsys-ci-flask-example | 6543e1cc7e40db0535df6c4fe78120fd529d5cbb | [
"Apache-2.0"
] | 2 | 2019-12-30T13:26:55.000Z | 2020-01-18T14:03:25.000Z | tinder/__init__.py | mzdravkov/elsys-ci-flask-example | 6543e1cc7e40db0535df6c4fe78120fd529d5cbb | [
"Apache-2.0"
] | 3 | 2019-11-05T16:47:54.000Z | 2020-10-31T18:50:31.000Z | tinder/__init__.py | mzdravkov/elsys-ci-flask-example | 6543e1cc7e40db0535df6c4fe78120fd529d5cbb | [
"Apache-2.0"
] | 24 | 2019-10-10T19:17:40.000Z | 2020-10-25T10:42:00.000Z | from flask import Flask
from flask_socketio import SocketIO
from flask_sqlalchemy import SQLAlchemy
UPLOAD_FOLDER = 'uploads'
ALLOWED_EXTENSIONS = {'png', 'jpg', 'jpeg', 'gif'}
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////tmp/dev.db'
app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER
app.secret_key = "eusehuccuhosn23981pcgid1xth4dn"
socketio = SocketIO(app)
db = SQLAlchemy(app)
from tinder.routes import *
| 25.588235 | 63 | 0.770115 | 55 | 435 | 5.854545 | 0.509091 | 0.083851 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018041 | 0.108046 | 435 | 16 | 64 | 27.1875 | 0.811856 | 0 | 0 | 0 | 0 | 0 | 0.245977 | 0.170115 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
2cd504ebd0e349ea8bbd9fdb492b7ad1e64929f7 | 194 | py | Python | online assessment interview/SE Big Data role/mongoTest1.py | NirmalSilwal/Python- | 6d23112db8366360f0b79bdbf21252575e8eab3e | [
"MIT"
] | 32 | 2020-04-05T08:29:40.000Z | 2022-01-08T03:10:00.000Z | online assessment interview/SE Big Data role/mongoTest1.py | NirmalSilwal/Python- | 6d23112db8366360f0b79bdbf21252575e8eab3e | [
"MIT"
] | 3 | 2021-06-02T04:09:11.000Z | 2022-03-02T14:55:03.000Z | online assessment interview/SE Big Data role/mongoTest1.py | NirmalSilwal/Python- | 6d23112db8366360f0b79bdbf21252575e8eab3e | [
"MIT"
] | 3 | 2020-07-13T05:44:04.000Z | 2021-03-03T07:07:58.000Z | import pymongo
connection = pymongo.MongoClient("localhost", 27017)
database = connection['mydb_01']
collection = database['mycol_01']
data = {'Name' : "Akshay"}
collection.insert_one(data) | 17.636364 | 52 | 0.742268 | 22 | 194 | 6.409091 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052326 | 0.113402 | 194 | 11 | 53 | 17.636364 | 0.767442 | 0 | 0 | 0 | 0 | 0 | 0.174359 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2cd7bb87b2723af7e7c3023b2524d9f7f046ca6e | 722 | py | Python | BurstPaperWallet/initialize.py | MrPilotMan/BurstPaperWallet | 5a98a646487d6049f455680fe26ff10185b6d097 | [
"Apache-2.0"
] | 5 | 2018-07-21T09:05:35.000Z | 2018-09-18T16:36:52.000Z | BurstPaperWallet/initialize.py | MrPilotMan/Burst-Paper-Wallet | 5a98a646487d6049f455680fe26ff10185b6d097 | [
"MIT"
] | null | null | null | BurstPaperWallet/initialize.py | MrPilotMan/Burst-Paper-Wallet | 5a98a646487d6049f455680fe26ff10185b6d097 | [
"MIT"
] | null | null | null | from BurstPaperWallet.api import brs_api
from BurstPaperWallet.api import passphrase_url_transform as transform
def initialize(account, old_passphrase, fee=735000):
url = "sendMoney&recipient={}&secretPhrase={}&amountNQT=1&feeNQT={}&recipientPublicKey={}&deadline=1440"\
.format(account["reed solomon"], transform(old_passphrase), fee, account["public key"])
print(brs_api(url))
def check_balance(reed_solomon):
url = "getGuaranteedBalance&account={}".format(reed_solomon)
balance = brs_api(url)
return balance["guaranteedBalanceNQT"]
def adjust_fee(balance, fee):
if fee is None:
fee = 735000
if int(balance) >= fee:
return fee
else:
return balance
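The fee logic above defaults to the standard fee and caps it at the available balance. A standalone restatement (copied out so it runs without the BurstPaperWallet package; the sample values are made up):

```python
def adjust_fee(balance, fee):
    # Default Burst fee in NQT when none is supplied
    if fee is None:
        fee = 735000
    # Never spend more than the account actually holds
    return fee if int(balance) >= fee else balance

print(adjust_fee("1000000", None))  # prints 735000
```

Note that when the balance is too small the balance itself is returned unchanged, whatever its type, which mirrors the function above.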
| 27.769231 | 109 | 0.710526 | 85 | 722 | 5.905882 | 0.482353 | 0.035857 | 0.091633 | 0.115538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028476 | 0.17313 | 722 | 25 | 110 | 28.88 | 0.812395 | 0 | 0 | 0 | 0 | 0 | 0.234072 | 0.1759 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0.176471 | 0.117647 | 0 | 0.470588 | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
2cda4cc79b80b9a4194f66d4309cdecba7f0a2e3 | 608 | py | Python | PycharmProjects/untitled1/printGraph.py | jiankangliu/baseOfPython | a10e81c79bc6fc3807ca8715fb1be56df527742c | [
"MIT"
] | null | null | null | PycharmProjects/untitled1/printGraph.py | jiankangliu/baseOfPython | a10e81c79bc6fc3807ca8715fb1be56df527742c | [
"MIT"
] | null | null | null | PycharmProjects/untitled1/printGraph.py | jiankangliu/baseOfPython | a10e81c79bc6fc3807ca8715fb1be56df527742c | [
"MIT"
] | null | null | null | # 第一行一个*,第二行两个*........ 共十行
# 打印乘法口诀
n = 1
while n <= 10:
n1 = n
while n1:
print("*", end = "")
n1 -= 1
print()
n += 1
n2 = 1
n3 = 1
while n2 < 10:
while n3 <= n2:
print(f"{n3}*{n2}={n3*n2}",end="\t")
n3 += 1
print()
n2 += 1
n3 = 1
n4 = 0
while n4 < 40:
n5 = 0
if n4 < 20:
while n5 < n4:
print(' ',end='')
n5 += 1
print('*')
else:
n5 = 39 - n4
while n5:
print(' ',end='')
n5 -= 1
print('*')
n4 += 1
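The multiplication-table section above can be written more compactly with for loops; a sketch for comparison (same output, different control flow):

```python
# Build the classic 9x9 table row by row
rows = []
for n in range(1, 10):
    rows.append("\t".join("%d*%d=%d" % (m, n, m * n) for m in range(1, n + 1)))
print("\n".join(rows))
```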
| 16 | 45 | 0.320724 | 76 | 608 | 2.565789 | 0.302632 | 0.123077 | 0.051282 | 0.061538 | 0.164103 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163934 | 0.498355 | 608 | 37 | 46 | 16.432432 | 0.47541 | 0.054276 | 0 | 0.25 | 0 | 0 | 0.044944 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2cdd770cb59585fff2f8363916717357018b2efd | 34,050 | py | Python | hcpre/duke_siemens/util_dicom_siemens.py | beOn/hcpre | 8c56d4f72c06abcb5d2d2b64e7e37fee040f2be4 | [
"BSD-3-Clause"
] | 10 | 2016-09-17T09:28:16.000Z | 2019-07-31T18:40:12.000Z | hcpre/duke_siemens/util_dicom_siemens.py | beOn/hcpre | 8c56d4f72c06abcb5d2d2b64e7e37fee040f2be4 | [
"BSD-3-Clause"
] | 4 | 2017-10-30T19:02:40.000Z | 2018-01-14T00:28:46.000Z | hcpre/duke_siemens/util_dicom_siemens.py | beOn/hcpre | 8c56d4f72c06abcb5d2d2b64e7e37fee040f2be4 | [
"BSD-3-Clause"
] | 5 | 2015-03-30T17:41:32.000Z | 2020-10-15T13:17:22.000Z | """
Routines for extracting data from Siemens DICOM files.
The simplest way to read a file is to call read(filename). If you like you
can also call lower level functions like read_data().
Except for the map of internal data types to numpy type strings (which
doesn't require an import of numpy), this code is deliberately ignorant of
numpy. It returns native Python types that are easy to convert into
numpy types.
"""
# Python modules
from __future__ import division
import struct
import math
# 3rd party modules
import dicom
# Our modules
import util_mrs_file
import constants
TYPE_NONE = 0
TYPE_IMAGE = 1
TYPE_SPECTROSCOPY = 2
# Change to True to enable the assert() statements sprinkled through the code
ASSERTIONS_ENABLED = False
# These are some Siemens-specific tags
TAG_CONTENT_TYPE = (0x0029, 0x1008)
TAG_SPECTROSCOPY_DATA = (0x7fe1, 0x1010)
# I (Philip) ported much of the private tag parsing code from the IDL routines
# dicom_fill_rsp.pro and dicom_fill_util.pro, except for the CSA header
# parsing which is a port of C++ code in the GDCM project.
# Since a lot (all?) of the Siemens format is undocumented, there are magic
# numbers and logic in here that I can't explain. Sorry! Where appropriate
# I have copied or paraphrased comments from the IDL code; they're marked
# with [IDL]. Unmarked comments are mine. Where ambiguous, I labelled my
# comments with [PS] (Philip Semanchuk).
def read(filename, ignore_data=False):
""" This is the simplest (and recommended) way for our code to read a
Siemens DICOM file.
It returns a tuple of (parameters, data). The parameters are a dict.
The data is in a Python list.
"""
    # Since a DICOM file is params + data together, it's not so simple to
    # ignore the data part. The best we can do is tell PyDicom to apply
    # lazy evaluation which is probably less efficient in the long run.
    defer_size = 4096 if ignore_data else None
    dataset = dicom.read_file(filename, defer_size=defer_size)
params = read_parameters_from_dataset(dataset)
data = read_data_from_dataset(dataset)
return params, data
def read_parameters(filename):
return read_parameters_from_dataset(dicom.read_file(filename))
def read_data(filename):
return read_data_from_dataset(dicom.read_file(filename))
def read_data_from_dataset(dataset):
"""Given a PyDicom dataset, returns the data in the Siemens DICOM
spectroscopy data tag (0x7fe1, 0x1010) as a list of complex numbers.
"""
data = _get(dataset, TAG_SPECTROSCOPY_DATA)
if data:
# Big simplifying assumptions --
# 1) Data is a series of complex numbers organized as ririri...
# where r = real and i = imaginary.
# 2) Each real & imaginary number is a 4 byte float.
# 3) Data is little endian.
        data = struct.unpack("<%df" % (len(data) // 4), data)
data = util_mrs_file.collapse_complexes(data)
else:
data = [ ]
return data
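The unpack-and-pair step above can be sketched standalone. The helper below stands in for util_mrs_file.collapse_complexes(), which is assumed to pair consecutive (real, imaginary) floats:

```python
import struct

def collapse_complexes(values):
    # Pair consecutive floats: [r0, i0, r1, i1, ...] -> [complex, ...]
    return [complex(r, i) for r, i in zip(values[::2], values[1::2])]

# Fake tag payload: two complex samples as little endian 4-byte floats
payload = struct.pack("<4f", 1.0, 2.0, 3.0, 4.0)
floats = struct.unpack("<%df" % (len(payload) // 4), payload)
data = collapse_complexes(floats)
```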
def read_parameters_from_dataset(dataset):
"""Given a PyDicom dataset, returns a fairly extensive subset of the
parameters therein as a dictionary.
"""
params = { }
# The code below refers to slice_index as a variable, but here it is
# hardcoded to one. It could vary, in theory, but in practice I don't
# know how it would actually be used. How would the slice index or
# indices be passed? How would the data be returned? For now, I'll
# leave the slice code active but hardcode the index to 1.
slice_index = 1
# [PS] - Even after porting this code I still can't figure out what
# ptag_img and ptag_ser stand for, so I left the names as is.
    ptag_img = { }
    ptag_ser = { }
    prot_ser = { }
# (0x0029, 0x__10) is one of several possibilities
# - SIEMENS CSA NON-IMAGE, CSA Data Info
# - SIEMENS CSA HEADER, CSA Image Header Info
# - SIEMENS CSA ENVELOPE, syngo Report Data
# - SIEMENS MEDCOM HEADER, MedCom Header Info
# - SIEMENS MEDCOM OOG, MedCom OOG Info (MEDCOM Object Oriented Graphics)
# Pydicom identifies it as "CSA Image Header Info"
for tag in ( (0x0029, 0x1010), (0x0029, 0x1210), (0x0029, 0x1110) ):
tag_data = dataset.get(tag, None)
if tag_data:
break
if tag_data:
ptag_img = _parse_csa_header(tag_data.value)
# [IDL] Access the SERIES Shadow Data
# [PS] I don't know what makes this "shadow" data.
for tag in ( (0x0029, 0x1020), (0x0029, 0x1220), (0x0029, 0x1120) ):
tag_data = dataset.get(tag, None)
if tag_data:
break
if tag_data:
ptag_ser = _parse_csa_header(tag_data.value)
# [IDL] "MrProtocol" (VA25) and "MrPhoenixProtocol" (VB13) are special
# elements that contain many parameters.
if ptag_ser.get("MrProtocol", ""):
prot_ser = _parse_protocol_data(ptag_ser["MrProtocol"])
if ptag_ser.get("MrPhoenixProtocol", ""):
prot_ser = _parse_protocol_data(ptag_ser["MrPhoenixProtocol"])
# [IDL] Determine if file is SVS,SI,EPSI, or OTHER
# [PS] IDL code doesn't match comments. Possibilities appear to
# include EPSI, SVS, CSI, JPRESS and SVSLIP2. "OTHER" isn't
# considered.
# EPSI = Echo-Planar Spectroscopic Imaging
# SVS = Single voxel spectroscopy
# CSI = Chemical Shift Imaging
# JPRESS = J-resolved spectroscopy
# SVSLIP2 = No idea!
is_epsi = False
is_svs = False
is_csi = False
is_jpress = False
is_svslip2 = False
# [IDL] Protocol name
parameter_filename = _extract_from_quotes(prot_ser.get("tProtocolName", ""))
parameter_filename = parameter_filename.strip()
# [IDL] Sequence file name
sequence_filename = _extract_from_quotes(prot_ser.get("tSequenceFileName", ""))
sequence_filename = sequence_filename.strip()
sequence_filename2 = ptag_img.get("SequenceName", "")
sequence_filename2 = sequence_filename2.strip()
parameter_filename_lower = parameter_filename.lower()
sequence_filename_lower = sequence_filename.lower()
sequence_filename2_lower = sequence_filename2.lower()
is_epsi = ("epsi" in (parameter_filename_lower, sequence_filename_lower))
is_svs = ("svs" in (parameter_filename_lower, sequence_filename_lower,
sequence_filename2_lower))
if "fid" in (parameter_filename_lower, sequence_filename_lower):
if "csi" in (parameter_filename_lower, sequence_filename_lower):
is_csi = True
else:
is_svs = True
if "csi" in (parameter_filename_lower, sequence_filename_lower):
is_csi = True
is_jpress = ("jpress" in (parameter_filename_lower,
sequence_filename_lower))
is_svslip2 = ("svs_li2" in (parameter_filename_lower,
sequence_filename2_lower))
# Patient Info
params["patient_name"] = _get(dataset, (0x0010, 0x0010), "")
params["patient_id"] = _get(dataset, (0x0010, 0x0020))
params["patient_birthdate"] = _get(dataset, (0x0010, 0x0030))
params["patient_sex"] = _get(dataset, (0x0010, 0x0040), "")
# [PS] Siemens stores the age as nnnY where 'n' is a digit, e.g. 042Y
params["patient_age"] = \
int(_get(dataset, (0x0010, 0x1010), "000Y")[:3])
params["patient_weight"] = round(_get(dataset, (0x0010, 0x1030), 0))
params["study_code"] = _get(dataset, (0x0008, 0x1030), "")
# Identification info
params["bed_move_fraction"] = 0.0
s = _get(dataset, (0x0008, 0x0080), "")
if s:
s = " " + s
s += _get(dataset, (0x0008, 0x1090), "")
params["institution_id"] = s
params["parameter_filename"] = parameter_filename
params["study_type"] = "spec"
# DICOM date format is YYYYMMDD
params["bed_move_date"] = _get(dataset, (0x0008, 0x0020), "")
params["measure_date"] = params["bed_move_date"]
# DICOM time format is hhmmss.fraction
params["bed_move_time"] = _get(dataset, (0x0008, 0x0030), "")
params["comment_1"] = _get(dataset, (0x0008, 0x0031), "")
if not params["comment_1"]:
params["comment_1"] = _get(dataset, (0x0020, 0x4000), "")
# DICOM time format is hhmmss.fraction
params["measure_time"] = _get(dataset, (0x0008, 0x0032), "")
params["sequence_filename"] = ptag_img.get("SequenceName", "")
params["sequence_type"] = ptag_img.get("SequenceName", "")
# Measurement info
params["echo_position"] = "0.0"
params["image_contrast_mode"] = "unknown"
params["kspace_mode"] = "unknown"
params["measured_slices"] = "1"
params["saturation_bands"] = "0"
# Seems to me that a quantity called "NumberOfAverages" would be an
# int, but it is stored as a float, e.g. "128.0000" which makes
# Python's int() choke unless I run it through float() first.
params["averages"] = int(_float(ptag_img.get("NumberOfAverages", "")))
params["flip_angle"] = _float(ptag_img.get("FlipAngle", ""))
# [PS] DICOM stores frequency as MHz, we store it as Hz. Mega = 1x10(6)
params["frequency"] = float(ptag_img.get("ImagingFrequency", 0)) * 1e6
inversion_time = float(ptag_img.get("InversionTime", 0))
params["inversion_time_1"] = inversion_time
params["number_inversions"] = 1 if inversion_time else 0
params["measured_echoes"] = ptag_img.get("EchoTrainLength", "1")
params["nucleus"] = ptag_img.get("ImagedNucleus", "")
params["prescans"] = prot_ser.get("sSpecPara.lPreparingScans", 0)
# Gain
gain = prot_ser.get("sRXSPEC.lGain", None)
if gain == 0:
gain = "-20.0"
elif gain == 1:
gain = "0.0"
else:
gain = ""
params["receiver_gain"] = gain
params["ft_scale_factor"] = \
float(prot_ser.get("sRXSPEC.aFFT_SCALE[0].flFactor", 0))
# Receiver Coil
coil = prot_ser.get("sCOIL_SELECT_MEAS.asList[0].sCoilElementID.tCoilID", "")
params["receiver_coil"] = _extract_from_quotes(coil)
# [IDL] differs in EPSI
params["repetition_time_1"] = float(prot_ser.get("alTR[0]", 0)) * 0.001
sweep_width = ""
remove_oversample_flag = prot_ser.get("sSpecPara.ucRemoveOversampling", "")
remove_oversample_flag = (remove_oversample_flag.strip() == "0x1")
readout_os = float(ptag_ser.get("ReadoutOS", 1.0))
dwelltime = float(ptag_img.get("RealDwellTime", 1.0)) * 1e-9
if dwelltime:
sweep_width = 1 / dwelltime
if not remove_oversample_flag:
sweep_width *= readout_os
sweep_width = str(sweep_width)
params["transmitter_voltage"] = \
prot_ser.get("sTXSPEC.asNucleusInfo[0].flReferenceAmplitude", "0.0")
params["total_duration"] = \
prot_ser.get("lTotalScanTimeSec", "0.0")
prefix = "sSliceArray.asSlice[%d]." % slice_index
image_parameters = (
("image_dimension_line", "dPhaseFOV"),
("image_dimension_column", "dReadoutFOV"),
("image_dimension_partition", "dThickness"),
("image_position_sagittal", "sPosition.dSag"),
("image_position_coronal", "sPosition.dCor"),
("image_position_transverse", "sPosition.dTra"),
)
for key, name in image_parameters:
params[key] = float(prot_ser.get(prefix + name, "0.0"))
# [IDL] Image Normal/Column
image_orientation = ptag_img.get("ImageOrientationPatient", "")
if not image_orientation:
slice_orientation_pitch = ""
slice_distance = ""
else:
# image_orientation is a list of strings, e.g. --
# ['-1.00000000', '0.00000000', '0.00000000', '0.00000000',
# '1.00000000', '0.00000000']
# [IDL] If the data we are processing is a Single Voxel
# Spectroscopy data, interchange rows and columns. Due to an error
# in the protocol used.
if is_svs:
image_orientation = image_orientation[3:] + image_orientation[:3]
# Convert the values to float and discard ones smaller than 1e-4
f = lambda value: 0.0 if abs(value) < 1e-4 else value
image_orientation = [f(float(value)) for value in image_orientation]
row = image_orientation[:3]
column = image_orientation[3:6]
normal = ( ((row[1] * column[2]) - (row[2] * column[1])),
((row[2] * column[0]) - (row[0] * column[2])),
((row[0] * column[1]) - (row[1] * column[0])),
)
params["image_normal_sagittal"] = normal[0]
params["image_normal_coronal"] = normal[1]
params["image_normal_transverse"] = normal[2]
        params["image_column_sagittal"] = column[0]
        params["image_column_coronal"] = column[1]
        params["image_column_transverse"] = column[2]
# Second part of the return tuple is orientation; we don't use it.
slice_orientation_pitch, _ = _dicom_orientation_string(normal)
# Slice distance
# http://en.wikipedia.org/wiki/Dot_product
keys = ("image_position_sagittal", "image_position_coronal",
"image_position_transverse")
a = [params[key] for key in keys]
b = normal
bb = math.sqrt(sum([value ** 2 for value in normal]))
slice_distance = ((a[0] * b[0]) + (a[1] * b[1]) + (a[2] * b[2])) / bb
params["slice_orientation_pitch"] = slice_orientation_pitch
params["slice_distance"] = slice_distance
regions = ( ("region_dimension_line", "dPhaseFOV"),
("region_dimension_column", "dReadoutFOV"),
("region_dimension_partition", "dThickness"),
("region_position_sagittal", "sPosition.dSag"),
("region_position_coronal", "sPosition.dCor"),
("region_position_transverse", "sPosition.dTra"),
)
for key, name in regions:
name = "sSpecPara.sVoI." + name
params[key] = float(prot_ser.get(name, 0))
# 'DATA INFORMATION'
params["measure_size_spectral"] = \
long(prot_ser.get('sSpecPara.lVectorSize', 0))
params["slice_thickness"] = _float(ptag_img.get("SliceThickness", 0))
params["current_slice"] = "1"
params["number_echoes"] = "1"
params["number_slices"] = "1"
params["data_size_spectral"] = params["measure_size_spectral"]
# ;------------------------------------------------------
# [IDL] Sequence Specific Changes
if not is_epsi:
# [IDL] Echo time - JPRESS handling added by Dragan
echo_time = 0.0
if is_jpress:
# [IDL] Yingjian saves echo time in a private 'echotime' field
# [PS] The IDL code didn't use a dict to store these values
# but instead did a brute force case-insensitive search over
# an array of strings. In that context, key case didn't matter
# but here it does.
keys = prot_ser.keys()
for key in keys:
if key.upper() == "ECHOTIME":
echo_time = float(prot_ser[key])
if is_svslip2:
# [IDL] BJS found TE value set in ICE to be updated in
# 'echotime' field
# [PS] The IDL code didn't use a dict to store these values
# but instead did a brute force case-insensitive search over
# an array of strings. In that context, key case didn't matter
# but here it does.
keys = ptag_img.keys()
for key in keys:
if key.upper() == "ECHOTIME":
echo_time = float(ptag_img[key])
if not echo_time:
# [IDL] still no echo time - try std place
echo_time = float(prot_ser.get('alTE[0]', 0.0))
echo_time /= 1000
params["echo_time"] = echo_time
params["data_size_line"] = \
int(prot_ser.get('sSpecPara.lFinalMatrixSizePhase', 1))
params["data_size_column"] = \
int(prot_ser.get('sSpecPara.lFinalMatrixSizeRead', 1))
params["data_size_partition"] = \
int(prot_ser.get('sSpecPara.lFinalMatrixSizeSlice', 1))
if is_svs:
# [IDL] For Single Voxel Spectroscopy data (SVS) only
params["image_dimension_line"] = \
params["region_dimension_line"]
params["image_dimension_column"] = \
params["region_dimension_column"]
params["image_dimension_partition"] = \
params["region_dimension_partition"]
# [IDL] For SVS data the following three parameters cannot be
# anything other than 1
params["measure_size_line"] = 1
params["measure_size_column"] = 1
params["measure_size_partition"] = 1
else:
# Not SVS
# ;--------------------------------------------------
# ; [IDL] For CSI or OTHER Spectroscopy data only
# ;--------------------------------------------------
measure_size_line = int(prot_ser.get('sKSpace.lPhaseEncodingLines', 1))
params["measure_size_line"] = str(measure_size_line)
measure_size_column = int(prot_ser.get('sKSpace.lPhaseEncodingLines', 0))
params["measure_size_column"] = str(measure_size_column)
measure_size_partition = int(prot_ser.get('sKSpace.lPartitions', '0'))
kspace_dimension = prot_ser.get('sKSpace.ucDimension', '')
if kspace_dimension.strip() == "0x2":
measure_size_partition = 1
params["data_size_partition"] = 1
data_size_partition = 1
params["measure_size_partition"] = measure_size_partition
if sequence_filename in ("svs_cp_press", "svs_se_ir", "svs_tavg"):
# [IDL] Inversion Type 0-Volume,1-None
s = prot_ser.get("SPREPPULSES.UCINVERSION", "")
if s == "0x1":
params["number_inversions"] = 1
elif s == "0x2":
params["number_inversions"] = 0
# else:
# params["number_inversions"] doesn't get set at all.
# This matches the behavior of the IDL code. Note that
# params["number_inversions"] is also populated
# unconditionally in code many lines above.
if sequence_filename in ("svs_se", "svs_st", "fid", "fid3", "fid_var",
"csi_se", "csi_st", "csi_fid", "csi_fidvar",
"epsi"):
# [IDL] FOR EPSI Measure_size and Data_size parameters are the same
params["region_dimension_line"] = \
params["image_dimension_line"]
params["region_dimension_column"] = \
params["image_dimension_column"]
params["ft_scale_factor"] = "1.0"
params["data_size_line"] = \
int(prot_ser.get('sKSpace.lPhaseEncodingLines', 0))
params["data_size_column"] = \
int(prot_ser.get('sKSpace.lBaseResolution', 0)) * readout_os
params["data_size_partition"] = \
int(prot_ser.get('sKSpace.lPartitions', 0))
params["measure_size_line"] = params["data_size_line"]
measure_size_column = params["data_size_column"]
measure_size_partition = params["data_size_partition"]
index = 0 if ((int(dataset.get("InstanceNumber", 0)) % 2) == 1) else 1
echo_time = float(prot_ser.get('alTE[%d]' % index, 0)) / 1000
repetition_time_1 = float(prot_ser.get('alTR[%d]' % index, 0)) / 1000
params["echo_time"] = str(echo_time)
params["repetition_time_1"] = str(repetition_time_1)
        dwelltime = float(ptag_img.get("RealDwellTime", 0.0))
        base_resolution = int(prot_ser.get('sKSpace.lBaseResolution', 0))
        if dwelltime and base_resolution:
            sweep_width = 1 / (dwelltime * base_resolution * readout_os)
        else:
            sweep_width = ""
params["sweep_width"] = sweep_width
# Added by BTA
ip_rot = prot_ser.get("sSliceArray.asSlice[0].dInPlaneRot", None)
pol_swap = prot_ser.get("sWipMemBlock.alFree[40]", None)
    if ip_rot:
        try:
            params["in_plane_rotation"] = float(ip_rot)
        except Exception:
            pass
    if pol_swap:
        try:
            params["polarity_swap"] = int(pol_swap)
        except Exception:
            raise
return params
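The normal-vector and slice-distance arithmetic in the function above amounts to a cross product followed by a normalized dot product; a self-contained sketch with made-up row/column/position values:

```python
import math

# Hypothetical ImageOrientationPatient row/column direction cosines
row = [1.0, 0.0, 0.0]
column = [0.0, 1.0, 0.0]
# Cross product row x column gives the slice normal
normal = [row[1] * column[2] - row[2] * column[1],
          row[2] * column[0] - row[0] * column[2],
          row[0] * column[1] - row[1] * column[0]]
# Slice distance is the image position projected onto the unit normal
position = [0.0, 0.0, 40.0]
norm_len = math.sqrt(sum(v ** 2 for v in normal))
slice_distance = sum(a * b for a, b in zip(position, normal)) / norm_len
```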
def _my_assert(expression):
if ASSERTIONS_ENABLED:
assert(expression)
def _dicom_orientation_string(normal):
"""Given a 3-item list (or other iterable) that represents a normal vector
to the "imaging" plane, this function determines the orientation of the
vector in 3-dimensional space. It returns a tuple of (angle, orientation)
in which angle is e.g. "Tra" or "Tra>Cor -6" or "Tra>Sag 14.1 >Cor 9.3"
and orientation is e.g. "Sag" or "Cor-Tra".
For double angulation, errors in secondary angle occur that may be due to
rounding errors in internal Siemens software, which calculates row and
column vectors.
"""
# docstring paraphrases IDL comments
TOLERANCE = 1.e-4
orientations = ('Sag', 'Cor', 'Tra')
final_angle = ""
final_orientation = ""
# [IDL] evaluate orientation of normal vector:
#
# Find principal direction of normal vector (i.e. axis with its largest
# component)
# Find secondary direction (second largest component)
# Calc. angle btw. projection of normal vector into the plane that
# includes both principal and secondary directions on the one hand
# and the principal direction on the other hand ==> 1st angulation:
# "principal>secondary = angle"
# Calc. angle btw. projection into plane perpendicular to principal
# direction on the one hand and secondary direction on the other
# hand ==> 2nd angulation: "secondary>third dir. = angle"
# get principal, secondary and ternary directions
sorted_normal = sorted(normal)
for i, value in enumerate(normal):
if value == sorted_normal[2]:
# [IDL] index of principal direction
principal = i
if value == sorted_normal[1]:
# [IDL] index of secondary direction
secondary = i
if value == sorted_normal[0]:
# [IDL] index of ternary direction
ternary = i
# [IDL] calc. angle between projection into third plane (spawned by
# principle & secondary directions) and principal direction:
angle_1 = math.atan2(normal[secondary], normal[principal]) * \
constants.RADIANS_TO_DEGREES
# [IDL] calc. angle btw. projection on rotated principle direction and
# secondary direction:
# projection on rotated principle dir.
new_normal_ip = math.sqrt((normal[principal] ** 2) + (normal[secondary] ** 2))
angle_2 = math.atan2(normal[ternary], new_normal_ip) * \
constants.RADIANS_TO_DEGREES
# [IDL] SIEMENS notation requires modifications IF principal dir. indxs SAG !
# [PS] In IDL, indxs is the name of the variable that is "secondary" here.
# Even with that substitution, I don't understand the comment above.
if not principal:
if abs(angle_1) > 0:
sign1 = angle_1 / abs(angle_1)
else:
sign1 = 1.0
angle_1 -= (sign1 * 180.0)
angle_2 *= -1
if (abs(angle_2) < TOLERANCE) or (abs(abs(angle_2) - 180) < TOLERANCE):
if (abs(angle_1) < TOLERANCE) or (abs(abs(angle_1) - 180) < TOLERANCE):
# [IDL] NON-OBLIQUE:
final_angle = orientations[principal]
            final_orientation = orientations[principal]
else:
# [IDL] SINGLE-OBLIQUE:
final_angle = "%s>%s %.3f" % \
(orientations[principal], orientations[secondary],
(-1 * angle_1)
)
final_orientation = orientations[principal] + '-' + orientations[secondary]
else:
# [IDL] DOUBLE-OBLIQUE:
final_angle = "%s>%s %.3f >%s %f" % \
(orientations[principal], orientations[secondary],
(-1 * angle_1), orientations[ternary], (-1 * angle_2))
final_orientation = "%s-%s-%s" % \
(orientations[principal], orientations[secondary],
orientations[ternary])
return final_angle, final_orientation
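The angulation math above boils down to atan2 of the secondary component over the principal one. A sketch using math.degrees in place of constants.RADIANS_TO_DEGREES, with a hypothetical normal tilted about 10 degrees from transverse toward coronal (the real function also applies sign flips for the Siemens convention, which this sketch omits):

```python
import math

normal = [0.0, 0.17365, 0.98481]  # roughly (0, sin 10deg, cos 10deg)
# Rank axes by component size: largest = principal, next = secondary
order = sorted(range(3), key=lambda i: normal[i], reverse=True)
principal, secondary = order[0], order[1]
angle_1 = math.degrees(math.atan2(normal[secondary], normal[principal]))
orientations = ('Sag', 'Cor', 'Tra')
label = "%s>%s %.1f" % (orientations[principal], orientations[secondary], angle_1)
```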
def _float(value):
"""Attempts to return value as a float. No different from Python's
built-in float(), except that it accepts None and "" (for which it
returns 0.0).
"""
return float(value) if value else 0.0
def _extract_from_quotes(s):
"""Given a string, returns the portion between the first and last
double quote (ASCII 34). If there aren't at least two quote characters,
the original string is returned."""
start = s.find('"')
end = s.rfind('"')
if (start != -1) and (end != -1):
s = s[start + 1 : end]
return s
def _null_truncate(s):
"""Given a string, returns a version truncated at the first '\0' if
there is one. If not, the original string is returned."""
i = s.find(chr(0))
if i != -1:
s = s[:i]
return s
def _scrub(item):
"""Given a string, returns a version truncated at the first '\0' and
stripped of leading/trailing whitespace. If the param is not a string,
it is returned unchanged."""
if isinstance(item, basestring):
return _null_truncate(item).strip()
else:
return item
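The truncate-and-strip cleanup matters because CSA element names are padded with noise after the first NUL. A standalone restatement (using str rather than the Python 2 basestring of the module above):

```python
def null_truncate(s):
    # Cut at the first NUL byte, if any
    i = s.find(chr(0))
    return s[:i] if i != -1 else s

def scrub(item):
    # Strings get truncated and stripped; everything else passes through
    return null_truncate(item).strip() if isinstance(item, str) else item
```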
def _get_chunks(tag, index, format, little_endian=True):
"""Given a CSA tag string, an index into that string, and a format
specifier compatible with Python's struct module, returns a tuple
of (size, chunks) where size is the number of bytes read and
chunks are the data items returned by struct.unpack(). Strings in the
list of chunks have been run through _scrub().
"""
# The first character of the format string indicates endianness.
format = ('<' if little_endian else '>') + format
size = struct.calcsize(format)
chunks = struct.unpack(format, tag[index:index + size])
chunks = [_scrub(item) for item in chunks]
return (size, chunks)
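_get_chunks is a thin cursor over struct.unpack; the same pattern can be shown reading the "SV10" preamble from a fake buffer (the magic string and count below are fabricated for illustration):

```python
import struct

def get_chunks(buf, index, fmt, little_endian=True):
    # Prefix controls endianness, exactly as in _get_chunks above
    fmt = ('<' if little_endian else '>') + fmt
    size = struct.calcsize(fmt)
    return size, struct.unpack(fmt, buf[index:index + size])

# Hypothetical header start: 4-byte magic string then an element count
buf = struct.pack("<4sL", b"SV10", 7)
size, chunks = get_chunks(buf, 0, "4sL")
```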
def _parse_protocol_data(protocol_data):
"""Returns a dictionary containing the name/value pairs inside the
"ASCCONV" section of the MrProtocol or MrPhoenixProtocol elements
of a Siemens CSA Header tag.
"""
# Protocol_data is a large string (e.g. 32k) that lists a lot of
# variables in a JSONish format with which I'm not familiar. Following
# that there's another chunk of data delimited by the strings you see
# below.
# That chunk is a list of name=value pairs, INI file style. We
# ignore everything outside of the ASCCONV delimiters. Everything inside
# we parse and return as a dictionary.
start = protocol_data.find("### ASCCONV BEGIN ###")
end = protocol_data.find("### ASCCONV END ###")
_my_assert(start != -1)
_my_assert(end != -1)
start += len("### ASCCONV BEGIN ###")
protocol_data = protocol_data[start:end]
lines = protocol_data.split('\n')
# The two lines of code below turn the 'lines' list into a list of
# (name, value) tuples in which name & value have been stripped and
# all blank lines have been discarded.
f = lambda pair: (pair[0].strip(), pair[1].strip())
lines = [f(line.split('=')) for line in lines if '=' in line]
return dict(lines)
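A tiny synthetic protocol blob shows the ASCCONV extraction end to end; the delimiters and key styles below mimic the real data but the values are invented:

```python
blob = ("binary noise here\n"
        "### ASCCONV BEGIN ###\n"
        'tProtocolName = "svs_se"\n'
        "alTR[0] = 2000000\n"
        "### ASCCONV END ###\n"
        "more noise")
# Keep only the text between the ASCCONV delimiters
start = blob.find("### ASCCONV BEGIN ###") + len("### ASCCONV BEGIN ###")
end = blob.find("### ASCCONV END ###")
# Parse INI-style name = value lines into a dict
pairs = (line.split('=') for line in blob[start:end].split('\n') if '=' in line)
params = {name.strip(): value.strip() for name, value in pairs}
```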
def _get(dataset, tag, default=None):
"""Returns the value of a dataset tag, or the default if the tag isn't
in the dataset.
PyDicom datasets already have a .get() method, but it returns a
dicom.DataElement object. In practice it's awkward to call dataset.get()
and then figure out if the result is the default or a DataElement,
and if it is the latter _get the .value attribute. This function allows
me to avoid all that mess.
It is also a workaround for this bug (which I submitted) which should be
fixed in PyDicom > 0.9.3:
http://code.google.com/p/pydicom/issues/detail?id=72
"""
return default if tag not in dataset else dataset[tag].value
def _parse_csa_header(tag, little_endian = True):
"""The CSA header is a Siemens private tag that should be passed as
a string. Any of the following tags should work: (0x0029, 0x1010),
(0x0029, 0x1210), (0x0029, 0x1110), (0x0029, 0x1020), (0x0029, 0x1220),
(0x0029, 0x1120).
The function returns a dictionary keyed by element name.
"""
# Let's have a bit of fun, shall we? A Siemens CSA header is a mix of
# binary glop, ASCII, binary masquerading as ASCII, and noise masquerading
# as signal. It's also undocumented, so there's no specification to which
# to refer.
# The format is a good one to show to anyone who complains about XML being
# verbose or hard to read. Spend an afternoon with this and XML will
# look terse and read like a Shakespearean sonnet.
# The algorithm below is a translation of the GDCM project's
# CSAHeader::LoadFromDataElement() inside gdcmCSAHeader.cxx. I don't know
# how that code's author figured out what's in a CSA header, but the
# code works.
# I added comments and observations, but they're inferences. I might
# be wrong. YMMV.
# Some observations --
# - If you need to debug this code, a hexdump of the tag data will be
# your best friend.
# - The data in the tag is a list of elements, each of which contains
# zero or more subelements. The subelements can't be further divided
# and are either empty or contain a string.
# - Everything begins on four byte boundaries.
# - This code will break on big endian data. I don't know if this data
# can be big endian, and if that's possible I don't know what flag to
# read to indicate that. However, it's easy to pass an endianness flag
# to _get_chunks() should the need to parse big endian data arise.
# - Delimiters are thrown in here and there; they are 0x4d = 77 which is
# ASCII 'M' and 0xcd = 205 which has no ASCII representation.
# - Strings in the data are C-style NULL terminated.
# I sometimes read delimiters as strings and sometimes as longs.
DELIMITERS = ("M", "\xcd", 0x4d, 0xcd)
# This dictionary of elements is what this function returns
elements = { }
# I march through the tag data byte by byte (actually a minimum of four
# bytes at a time), and current points to my current position in the tag
# data.
current = 0
# The data starts with "SV10" followed by 0x04, 0x03, 0x02, 0x01.
# It's meaningless to me, so after reading it, I discard it.
size, chunks = _get_chunks(tag, current, "4s4s")
current += size
_my_assert(chunks[0] == "SV10")
_my_assert(chunks[1] == "\4\3\2\1")
# get the number of elements in the outer list
size, chunks = _get_chunks(tag, current, "L")
current += size
element_count = chunks[0]
    # Eat a delimiter (should be 77, i.e. 0x4d = ASCII 'M')
size, chunks = _get_chunks(tag, current, "4s")
current += size
_my_assert(chunks[0] in DELIMITERS)
for i in range(element_count):
# Each element looks like this:
# - (64 bytes) Element name, e.g. ImagedNucleus, NumberOfFrames,
# VariableFlipAngleFlag, MrProtocol, etc. Only the data up to the
# first 0x00 is important. The rest is helpfully populated with
# noise that has enough pattern to make it look like something
# other than the garbage that it is.
# - (4 bytes) VM
# - (4 bytes) VR
# - (4 bytes) syngo_dt
# - (4 bytes) # of subelements in this element (often zero)
# - (4 bytes) a delimiter (0x4d or 0xcd)
size, chunks = _get_chunks(tag, current,
"64s" + "4s" + "4s" + "4s" + "L" + "4s")
current += size
name, vm, vr, syngo_dt, subelement_count, delimiter = chunks
_my_assert(delimiter in DELIMITERS)
# The subelements hold zero or more strings. Those strings are stored
# temporarily in the values list.
values = [ ]
for j in range(subelement_count):
# Each subelement looks like this:
            # - (4 x 4 = 16 bytes) Four longs; call them A, B, C and D. For
# some strange reason, C is always a delimiter, while A, B and
# D are always equal to one another. They represent the length
# of the associated data string.
# - (n bytes) String data, the length of which is defined by
# A (and A == B == D).
# - (m bytes) Padding if length is not an even multiple of four.
size, chunks = _get_chunks(tag, current, "4L")
current += size
_my_assert(chunks[0] == chunks[1])
_my_assert(chunks[1] == chunks[3])
_my_assert(chunks[2] in DELIMITERS)
length = chunks[0]
# get a chunk-o-stuff, length indicated by code above.
# Note that length can be 0.
size, chunks = _get_chunks(tag, current, "%ds" % length)
current += size
if chunks[0]:
values.append(chunks[0])
# If we're not at a 4 byte boundary, move.
# Clever modulus code below swiped from GDCM
current += (4 - (length % 4)) % 4
# The value becomes a single string item (possibly "") or a list
# of strings
if len(values) == 0:
values = ""
if len(values) == 1:
values = values[0]
_my_assert(name not in elements)
elements[name] = values
return elements
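# The parser above leans on a _get_chunks() helper defined elsewhere in this
# module. For reference, a minimal struct-based sketch consistent with the
# calls above (the name is suffixed _sketch because it is ours, not the
# module's; shown for Python 3 bytes, while the original targets Python 2):

```python
import struct

def _get_chunks_sketch(tag, index, fmt, little_endian=True):
    """Unpack struct format `fmt` from `tag` at `index`.

    Returns (bytes consumed, tuple of unpacked values).
    """
    fmt = ("<" if little_endian else ">") + fmt
    size = struct.calcsize(fmt)
    return size, struct.unpack(fmt, tag[index:index + size])

# The 16-byte subelement header is four little endian longs A, B, C, D:
# C is a delimiter, and A == B == D give the length of the string data.
header = struct.pack("<4L", 5, 5, 0xCD, 5)
size, chunks = _get_chunks_sketch(header, 0, "4L")
```

# Here size == 16 and chunks == (5, 5, 0xCD, 5), matching the assertions
# made in the inner subelement loop above.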
| 39.048165 | 87 | 0.622173 | 4,414 | 34,050 | 4.65836 | 0.205936 | 0.011234 | 0.014104 | 0.00569 | 0.192053 | 0.143809 | 0.116574 | 0.081023 | 0.04484 | 0.036281 | 0 | 0.02889 | 0.272129 | 34,050 | 871 | 88 | 39.092997 | 0.800759 | 0.29301 | 0 | 0.130952 | 0 | 0 | 0.157882 | 0.060561 | 0 | 0 | 0.013938 | 0 | 0.033333 | 0 | null | null | 0.002381 | 0.016667 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2ce31456540ab00747dedfd50b8822dd96f15a36 | 5,176 | py | Python | instrument_plugins/EGandG_Model5209.py | sourav-majumder/qtlab | 96b2a127b1df7b45622c90229bd5ef8a4083614e | [
"MIT"
] | null | null | null | instrument_plugins/EGandG_Model5209.py | sourav-majumder/qtlab | 96b2a127b1df7b45622c90229bd5ef8a4083614e | [
"MIT"
] | null | null | null | instrument_plugins/EGandG_Model5209.py | sourav-majumder/qtlab | 96b2a127b1df7b45622c90229bd5ef8a4083614e | [
"MIT"
] | null | null | null | # EGandG_Model5209.py class, to perform the communication between the Wrapper and the device
# Martijn Schaafsma <qtlab@mcschaafsma.nl>, 2010
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
from instrument import Instrument
import types
import logging
import numpy as np
from time import sleep
import visa
class EGandG_Model5209(Instrument):
'''
This is the driver for the Lockin
Usage:
Initialize with
    <name> = instruments.create('<name>', 'EGandG_Model5209', address='<GPIB address>', reset=<bool>)
'''
def __init__(self, name, address, reset=False):
logging.info(__name__ + ' : Initializing instrument EG&G Model 5209')
Instrument.__init__(self, name, tags=['physical'])
self._address = address
#>>>>>>>>>>>>>>
assert False, "pyvisa syntax has changed, tweak the line below according to the instructions in qtlab/instrument_plugins/README_PYVISA_API_CHANGES"
#self._visainstrument = visa.instrument(self._address)
#<<<<<<<<<<<<<<
#self.init_default()
# Sensitivity
self._sen = 1.0
# Add functions
self.add_function('init_default')
self.add_function ('get_all')
self.add_function ('auto_measure')
self.add_function ('auto_phase')
# Add parameters
self.add_parameter('value',
flags=Instrument.FLAG_GET, units='V', type=types.FloatType,tags=['measure'])
self.add_parameter('frequency',
flags=Instrument.FLAG_GET, units='mHz', type=types.FloatType)
self.add_parameter('sensitivity',
flags=Instrument.FLAG_GETSET, units='', minval=1, maxval=15, type=types.IntType)
self.add_parameter('timeconstant',
flags=Instrument.FLAG_GETSET, units='', minval=1, maxval=15, type=types.IntType)
self.add_parameter('sensitivity_v',
flags=Instrument.FLAG_GETSET, units='V', minval=0.0, maxval=15.0, type=types.FloatType)
self.add_parameter('timeconstant_t',
flags=Instrument.FLAG_GETSET, units='s', minval=0.0, maxval=15.0, type=types.FloatType)
self.add_parameter('filter',
flags=Instrument.FLAG_GETSET, units='', minval=0, maxval=3, type=types.IntType)
if reset:
self.init_default()
#self.get_all()
self.get_sensitivity_v()
def _write(self, letter):
self._visainstrument.write(letter)
sleep(0.1)
def _ask(self, question):
return self._visainstrument.ask(question)
def get_all(self):
self.get_value()
self.get_frequency()
self.get_sensitivity()
self.get_timeconstant()
self.get_sensitivity_v()
self.get_timeconstant_t()
def init_default(self):
# self._write("ASM")
self._write("SEN 7")
self._write("XTC 3")
self._write("FLT 3")
def auto_measure(self):
self._write("ASM")
def auto_phase(self):
self._write("AQN")
def do_get_frequency(self):
stringval = self._ask("FRQ?")
return float(stringval)
def do_get_value(self):
stringval = self._ask("OUT?")
sd = stringval.split()
if len(sd)==2:
s=sd[0]
v = float(sd[1])
if (s=='-'):
v = -v
else:
v = float(sd[0])
return v*self._sen/10000.0
def do_get_sensitivity(self):
stringval = self._ask("SEN?")
self.get_sensitivity_v()
return int(stringval)
def do_set_sensitivity(self,val):
self._write("SEN %d"%val)
self.get_sensitivity()
def do_get_filter(self):
stringval = self._ask("FLT?")
print stringval
return int(stringval)
def do_set_filter(self,val):
self._write("FLT %d"%val)
def do_get_timeconstant(self):
stringval = self._ask("XTC?")
return int(stringval)
def do_set_timeconstant(self,val):
self._write("XTC %d"%val)
def do_get_sensitivity_v(self):
stringval = self._ask("SEN?")
n = int(stringval)
self._sen = pow(10,(int(n/2)-7+np.log10(3)*np.mod(n,2)))
return self._sen
def do_set_sensitivity_v(self,val):
n = np.log10(val)*2.0+13.99
if (np.mod(n,2) > 0.9525) & (np.mod(n,2) < 1.1):
n = n+0.1
self._write("SEN %d"%n)
self.get_sensitivity_v()
def do_get_timeconstant_t(self):
stringval = self._ask("XTC?")
n = int(stringval)
        sen = pow(10,(int(n/2)-3+np.log10(3)*np.mod(n,2)))
return sen
def do_set_timeconstant_t(self,val):
n = np.log10(val)*2.0+5.99
        if (np.mod(n,2) > 0.9525) & (np.mod(n,2) < 1.1):
n = n+0.1
self._write("XTC %d"%n)
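# Illustration (ours, not part of the driver): the SEN code <-> volts
# conversion in do_get_sensitivity_v() above follows a 1-3-10 sequence --
# even codes are powers of ten, odd codes add a factor of three. A
# standalone sketch of the same mapping:

```python
import numpy as np

def sen_code_to_volts(n):
    """Map a Model 5209 SEN code (0-15) to full-scale volts."""
    return pow(10, int(n / 2) - 7 + np.log10(3) * np.mod(n, 2))
```

# e.g. code 0 -> 1e-07 V, code 13 -> 0.3 V, code 15 -> 3.0 V.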
| 30.269006 | 153 | 0.651468 | 731 | 5,176 | 4.445964 | 0.27907 | 0.018462 | 0.034462 | 0.043077 | 0.272615 | 0.174154 | 0.103385 | 0.103385 | 0.091077 | 0.091077 | 0 | 0.029593 | 0.216577 | 5,176 | 170 | 154 | 30.447059 | 0.771887 | 0.191461 | 0 | 0.17757 | 0 | 0 | 0.096734 | 0.012563 | 0 | 0 | 0 | 0 | 0.009346 | 0 | null | null | 0 | 0.056075 | null | null | 0.009346 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2ce817a6e849abb570f9ff5f54594335a171ed3d | 2,918 | py | Python | ext/testlib/suite.py | mandaltj/gem5_chips | b9c0c602241ffda7851c1afb32fa01f295bb98fd | [
"BSD-3-Clause"
] | 135 | 2016-10-21T03:31:49.000Z | 2022-03-25T01:22:20.000Z | ext/testlib/suite.py | mandaltj/gem5_chips | b9c0c602241ffda7851c1afb32fa01f295bb98fd | [
"BSD-3-Clause"
] | 35 | 2017-03-10T17:57:46.000Z | 2022-02-18T17:34:16.000Z | ext/testlib/suite.py | mandaltj/gem5_chips | b9c0c602241ffda7851c1afb32fa01f295bb98fd | [
"BSD-3-Clause"
] | 48 | 2016-12-08T12:03:13.000Z | 2022-02-16T09:16:13.000Z | # Copyright (c) 2017 Mark D. Hill and David A. Wood
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met: redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer;
# redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution;
# neither the name of the copyright holders nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# Authors: Sean Wilson
import helper
import runner as runner_mod
class TestSuite(object):
'''
An object grouping a collection of tests. It provides tags which enable
filtering during list and run selection. All tests held in the suite must
have a unique name.
..note::
The :func:`__new__` method enables collection of test cases, it must
be called in order for test cases to be collected.
..note::
To reduce test definition boilerplate, the :func:`init` method is
forwarded all `*args` and `**kwargs`. This means derived classes can
define init without boilerplate super().__init__(*args, **kwargs).
'''
runner = runner_mod.SuiteRunner
collector = helper.InstanceCollector()
fixtures = []
tests = []
tags = set()
def __new__(klass, *args, **kwargs):
obj = super(TestSuite, klass).__new__(klass, *args, **kwargs)
TestSuite.collector.collect(obj)
return obj
def __init__(self, name=None, fixtures=tuple(), tests=tuple(),
tags=tuple(), **kwargs):
self.fixtures = self.fixtures + list(fixtures)
self.tags = self.tags | set(tags)
self.tests = self.tests + list(tests)
if name is None:
name = self.__class__.__name__
self.name = name
def __iter__(self):
return iter(self.tests) | 42.289855 | 77 | 0.718986 | 398 | 2,918 | 5.18593 | 0.474874 | 0.017442 | 0.016473 | 0.022287 | 0.089147 | 0.065891 | 0.065891 | 0.065891 | 0.065891 | 0.065891 | 0 | 0.001736 | 0.210418 | 2,918 | 69 | 78 | 42.289855 | 0.894097 | 0.697395 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.136364 | false | 0 | 0.090909 | 0.045455 | 0.590909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
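# The TestSuite docstring above notes that __new__ performs instance
# collection. That mechanism reduces to the pattern below -- a
# self-contained sketch (class names are ours, not testlib's):

```python
class InstanceCollector(object):
    """Stand-in for helper.InstanceCollector: remembers every instance."""
    def __init__(self):
        self.collected = []

    def collect(self, obj):
        self.collected.append(obj)

class Suite(object):
    collector = InstanceCollector()

    def __new__(klass, *args, **kwargs):
        # Register the instance before __init__ runs, as TestSuite does.
        obj = super(Suite, klass).__new__(klass)
        Suite.collector.collect(obj)
        return obj

    def __init__(self, name=None, **kwargs):
        self.name = name if name is not None else self.__class__.__name__
```

# Merely instantiating Suite(name='smoke') makes the suite discoverable
# later via Suite.collector.collected.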
2cecc4379034e1c8d91bd34f47fbd53d6988aac0 | 857 | py | Python | crossvalidation_pipeline.py | ktian08/6784-drugs | 7c3ae9f65ce60b031008b0026bb9b954575315fa | [
"MIT"
] | 1 | 2020-06-13T00:40:21.000Z | 2020-06-13T00:40:21.000Z | crossvalidation_pipeline.py | ktian08/6784-drugs | 7c3ae9f65ce60b031008b0026bb9b954575315fa | [
"MIT"
] | null | null | null | crossvalidation_pipeline.py | ktian08/6784-drugs | 7c3ae9f65ce60b031008b0026bb9b954575315fa | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Andrew D. Rouillard
Computational Biologist
Target Sciences
GSK
andrew.d.rouillard@gsk.com
"""
import sys
import get_generalizable_features
import get_merged_features
import get_useful_features
def main(validation_rep=0, validation_fold=0):
    print('VALIDATION_REP: {0!s}, VALIDATION_FOLD: {1!s}'.format(validation_rep, validation_fold), flush=True)
print('GETTING GENERALIZABLE FEATURES...', flush=True)
get_generalizable_features.main(validation_rep, validation_fold)
print('GETTING MERGED FEATURES...', flush=True)
get_merged_features.main(validation_rep, validation_fold)
print('GETTING USEFUL FEATURES...', flush=True)
get_useful_features.main(validation_rep, validation_fold)
if __name__ == '__main__':
main(validation_rep=int(sys.argv[1]), validation_fold=int(sys.argv[2]))
| 28.566667 | 109 | 0.753792 | 112 | 857 | 5.464286 | 0.330357 | 0.148693 | 0.138889 | 0.176471 | 0.230392 | 0.230392 | 0.166667 | 0.166667 | 0 | 0 | 0 | 0.009333 | 0.124854 | 857 | 29 | 110 | 29.551724 | 0.806667 | 0.131855 | 0 | 0 | 0 | 0 | 0.186141 | 0.028533 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.285714 | 0 | 0.357143 | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2cee64c765350c049d3f0289910d5b8f629efbd1 | 16,464 | py | Python | darling_ansible/python_venv/lib/python3.7/site-packages/oci/waas/models/health_check.py | revnav/sandbox | f9c8422233d093b76821686b6c249417502cf61d | [
"Apache-2.0"
] | null | null | null | darling_ansible/python_venv/lib/python3.7/site-packages/oci/waas/models/health_check.py | revnav/sandbox | f9c8422233d093b76821686b6c249417502cf61d | [
"Apache-2.0"
] | null | null | null | darling_ansible/python_venv/lib/python3.7/site-packages/oci/waas/models/health_check.py | revnav/sandbox | f9c8422233d093b76821686b6c249417502cf61d | [
"Apache-2.0"
] | 1 | 2020-06-25T03:12:58.000Z | 2020-06-25T03:12:58.000Z
# coding: utf-8
# Copyright (c) 2016, 2020, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
from oci.util import formatted_flat_dict, NONE_SENTINEL, value_allowed_none_or_none_sentinel # noqa: F401
from oci.decorators import init_model_state_from_kwargs
@init_model_state_from_kwargs
class HealthCheck(object):
"""
Health checks monitor the status of your origin servers and only route traffic to the origins that pass the health check. If the health check fails, origin is automatically removed from the load balancing.
There is roughly one health check per EDGE POP per period. Any checks that pass will be reported as \"healthy\".
"""
#: A constant which can be used with the method property of a HealthCheck.
#: This constant has a value of "GET"
METHOD_GET = "GET"
#: A constant which can be used with the method property of a HealthCheck.
#: This constant has a value of "HEAD"
METHOD_HEAD = "HEAD"
#: A constant which can be used with the method property of a HealthCheck.
#: This constant has a value of "POST"
METHOD_POST = "POST"
#: A constant which can be used with the expected_response_code_group property of a HealthCheck.
#: This constant has a value of "2XX"
EXPECTED_RESPONSE_CODE_GROUP_2_XX = "2XX"
#: A constant which can be used with the expected_response_code_group property of a HealthCheck.
#: This constant has a value of "3XX"
EXPECTED_RESPONSE_CODE_GROUP_3_XX = "3XX"
#: A constant which can be used with the expected_response_code_group property of a HealthCheck.
#: This constant has a value of "4XX"
EXPECTED_RESPONSE_CODE_GROUP_4_XX = "4XX"
#: A constant which can be used with the expected_response_code_group property of a HealthCheck.
#: This constant has a value of "5XX"
EXPECTED_RESPONSE_CODE_GROUP_5_XX = "5XX"
def __init__(self, **kwargs):
"""
Initializes a new HealthCheck object with values from keyword arguments.
The following keyword arguments are supported (corresponding to the getters/setters of this class):
:param is_enabled:
The value to assign to the is_enabled property of this HealthCheck.
:type is_enabled: bool
:param method:
The value to assign to the method property of this HealthCheck.
Allowed values for this property are: "GET", "HEAD", "POST", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:type method: str
:param path:
The value to assign to the path property of this HealthCheck.
:type path: str
:param headers:
The value to assign to the headers property of this HealthCheck.
:type headers: dict(str, str)
:param expected_response_code_group:
The value to assign to the expected_response_code_group property of this HealthCheck.
Allowed values for items in this list are: "2XX", "3XX", "4XX", "5XX", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:type expected_response_code_group: list[str]
:param is_response_text_check_enabled:
The value to assign to the is_response_text_check_enabled property of this HealthCheck.
:type is_response_text_check_enabled: bool
:param expected_response_text:
The value to assign to the expected_response_text property of this HealthCheck.
:type expected_response_text: str
:param interval_in_seconds:
The value to assign to the interval_in_seconds property of this HealthCheck.
:type interval_in_seconds: int
:param timeout_in_seconds:
The value to assign to the timeout_in_seconds property of this HealthCheck.
:type timeout_in_seconds: int
:param healthy_threshold:
The value to assign to the healthy_threshold property of this HealthCheck.
:type healthy_threshold: int
:param unhealthy_threshold:
The value to assign to the unhealthy_threshold property of this HealthCheck.
:type unhealthy_threshold: int
"""
self.swagger_types = {
'is_enabled': 'bool',
'method': 'str',
'path': 'str',
'headers': 'dict(str, str)',
'expected_response_code_group': 'list[str]',
'is_response_text_check_enabled': 'bool',
'expected_response_text': 'str',
'interval_in_seconds': 'int',
'timeout_in_seconds': 'int',
'healthy_threshold': 'int',
'unhealthy_threshold': 'int'
}
self.attribute_map = {
'is_enabled': 'isEnabled',
'method': 'method',
'path': 'path',
'headers': 'headers',
'expected_response_code_group': 'expectedResponseCodeGroup',
'is_response_text_check_enabled': 'isResponseTextCheckEnabled',
'expected_response_text': 'expectedResponseText',
'interval_in_seconds': 'intervalInSeconds',
'timeout_in_seconds': 'timeoutInSeconds',
'healthy_threshold': 'healthyThreshold',
'unhealthy_threshold': 'unhealthyThreshold'
}
self._is_enabled = None
self._method = None
self._path = None
self._headers = None
self._expected_response_code_group = None
self._is_response_text_check_enabled = None
self._expected_response_text = None
self._interval_in_seconds = None
self._timeout_in_seconds = None
self._healthy_threshold = None
self._unhealthy_threshold = None
@property
def is_enabled(self):
"""
Gets the is_enabled of this HealthCheck.
Enables or disables the health checks.
:return: The is_enabled of this HealthCheck.
:rtype: bool
"""
return self._is_enabled
@is_enabled.setter
def is_enabled(self, is_enabled):
"""
Sets the is_enabled of this HealthCheck.
Enables or disables the health checks.
:param is_enabled: The is_enabled of this HealthCheck.
:type: bool
"""
self._is_enabled = is_enabled
@property
def method(self):
"""
Gets the method of this HealthCheck.
An HTTP verb (i.e. HEAD, GET, or POST) to use when performing the health check.
Allowed values for this property are: "GET", "HEAD", "POST", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:return: The method of this HealthCheck.
:rtype: str
"""
return self._method
@method.setter
def method(self, method):
"""
Sets the method of this HealthCheck.
An HTTP verb (i.e. HEAD, GET, or POST) to use when performing the health check.
:param method: The method of this HealthCheck.
:type: str
"""
allowed_values = ["GET", "HEAD", "POST"]
if not value_allowed_none_or_none_sentinel(method, allowed_values):
method = 'UNKNOWN_ENUM_VALUE'
self._method = method
@property
def path(self):
"""
Gets the path of this HealthCheck.
Path to visit on your origins when performing the health check.
:return: The path of this HealthCheck.
:rtype: str
"""
return self._path
@path.setter
def path(self, path):
"""
Sets the path of this HealthCheck.
Path to visit on your origins when performing the health check.
:param path: The path of this HealthCheck.
:type: str
"""
self._path = path
@property
def headers(self):
"""
Gets the headers of this HealthCheck.
HTTP header fields to include in health check requests, expressed as `\"name\": \"value\"` properties. Because HTTP header field names are case-insensitive, any use of names that are case-insensitive equal to other names will be rejected. If Host is not specified, requests will include a Host header field with value matching the policy's protected domain. If User-Agent is not specified, requests will include a User-Agent header field with value \"waf health checks\".
**Note:** The only currently-supported header fields are Host and User-Agent.
:return: The headers of this HealthCheck.
:rtype: dict(str, str)
"""
return self._headers
@headers.setter
def headers(self, headers):
"""
Sets the headers of this HealthCheck.
HTTP header fields to include in health check requests, expressed as `\"name\": \"value\"` properties. Because HTTP header field names are case-insensitive, any use of names that are case-insensitive equal to other names will be rejected. If Host is not specified, requests will include a Host header field with value matching the policy's protected domain. If User-Agent is not specified, requests will include a User-Agent header field with value \"waf health checks\".
**Note:** The only currently-supported header fields are Host and User-Agent.
:param headers: The headers of this HealthCheck.
:type: dict(str, str)
"""
self._headers = headers
@property
def expected_response_code_group(self):
"""
Gets the expected_response_code_group of this HealthCheck.
The HTTP response codes that signify a healthy state.
- **2XX:** Success response code group.
- **3XX:** Redirection response code group.
- **4XX:** Client errors response code group.
- **5XX:** Server errors response code group.
Allowed values for items in this list are: "2XX", "3XX", "4XX", "5XX", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:return: The expected_response_code_group of this HealthCheck.
:rtype: list[str]
"""
return self._expected_response_code_group
@expected_response_code_group.setter
def expected_response_code_group(self, expected_response_code_group):
"""
Sets the expected_response_code_group of this HealthCheck.
The HTTP response codes that signify a healthy state.
- **2XX:** Success response code group.
- **3XX:** Redirection response code group.
- **4XX:** Client errors response code group.
- **5XX:** Server errors response code group.
:param expected_response_code_group: The expected_response_code_group of this HealthCheck.
:type: list[str]
"""
allowed_values = ["2XX", "3XX", "4XX", "5XX"]
if expected_response_code_group:
expected_response_code_group[:] = ['UNKNOWN_ENUM_VALUE' if not value_allowed_none_or_none_sentinel(x, allowed_values) else x for x in expected_response_code_group]
self._expected_response_code_group = expected_response_code_group
@property
def is_response_text_check_enabled(self):
"""
Gets the is_response_text_check_enabled of this HealthCheck.
Enables or disables additional check for predefined text in addition to response code.
:return: The is_response_text_check_enabled of this HealthCheck.
:rtype: bool
"""
return self._is_response_text_check_enabled
@is_response_text_check_enabled.setter
def is_response_text_check_enabled(self, is_response_text_check_enabled):
"""
Sets the is_response_text_check_enabled of this HealthCheck.
Enables or disables additional check for predefined text in addition to response code.
:param is_response_text_check_enabled: The is_response_text_check_enabled of this HealthCheck.
:type: bool
"""
self._is_response_text_check_enabled = is_response_text_check_enabled
@property
def expected_response_text(self):
"""
Gets the expected_response_text of this HealthCheck.
Health check will search for the given text in a case-sensitive manner within the response body and will fail if the text is not found.
:return: The expected_response_text of this HealthCheck.
:rtype: str
"""
return self._expected_response_text
@expected_response_text.setter
def expected_response_text(self, expected_response_text):
"""
Sets the expected_response_text of this HealthCheck.
Health check will search for the given text in a case-sensitive manner within the response body and will fail if the text is not found.
:param expected_response_text: The expected_response_text of this HealthCheck.
:type: str
"""
self._expected_response_text = expected_response_text
@property
def interval_in_seconds(self):
"""
Gets the interval_in_seconds of this HealthCheck.
Time between health checks of an individual origin server, in seconds.
:return: The interval_in_seconds of this HealthCheck.
:rtype: int
"""
return self._interval_in_seconds
@interval_in_seconds.setter
def interval_in_seconds(self, interval_in_seconds):
"""
Sets the interval_in_seconds of this HealthCheck.
Time between health checks of an individual origin server, in seconds.
:param interval_in_seconds: The interval_in_seconds of this HealthCheck.
:type: int
"""
self._interval_in_seconds = interval_in_seconds
@property
def timeout_in_seconds(self):
"""
Gets the timeout_in_seconds of this HealthCheck.
Response timeout represents wait time until request is considered failed, in seconds.
:return: The timeout_in_seconds of this HealthCheck.
:rtype: int
"""
return self._timeout_in_seconds
@timeout_in_seconds.setter
def timeout_in_seconds(self, timeout_in_seconds):
"""
Sets the timeout_in_seconds of this HealthCheck.
Response timeout represents wait time until request is considered failed, in seconds.
:param timeout_in_seconds: The timeout_in_seconds of this HealthCheck.
:type: int
"""
self._timeout_in_seconds = timeout_in_seconds
@property
def healthy_threshold(self):
"""
Gets the healthy_threshold of this HealthCheck.
Number of successful health checks after which the server is marked up.
:return: The healthy_threshold of this HealthCheck.
:rtype: int
"""
return self._healthy_threshold
@healthy_threshold.setter
def healthy_threshold(self, healthy_threshold):
"""
Sets the healthy_threshold of this HealthCheck.
Number of successful health checks after which the server is marked up.
:param healthy_threshold: The healthy_threshold of this HealthCheck.
:type: int
"""
self._healthy_threshold = healthy_threshold
@property
def unhealthy_threshold(self):
"""
Gets the unhealthy_threshold of this HealthCheck.
Number of failed health checks after which the server is marked down.
:return: The unhealthy_threshold of this HealthCheck.
:rtype: int
"""
return self._unhealthy_threshold
@unhealthy_threshold.setter
def unhealthy_threshold(self, unhealthy_threshold):
"""
Sets the unhealthy_threshold of this HealthCheck.
Number of failed health checks after which the server is marked down.
:param unhealthy_threshold: The unhealthy_threshold of this HealthCheck.
:type: int
"""
self._unhealthy_threshold = unhealthy_threshold
def __repr__(self):
return formatted_flat_dict(self)
def __eq__(self, other):
if other is None:
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not self == other
| 37.589041 | 479 | 0.672133 | 2,094 | 16,464 | 5.073066 | 0.122732 | 0.031629 | 0.088017 | 0.068248 | 0.72296 | 0.649252 | 0.595971 | 0.512002 | 0.454674 | 0.414478 | 0 | 0.004114 | 0.261723 | 16,464 | 437 | 480 | 37.675057 | 0.869848 | 0.579082 | 0 | 0.085271 | 0 | 0 | 0.121739 | 0.039038 | 0 | 0 | 0 | 0 | 0 | 1 | 0.20155 | false | 0 | 0.015504 | 0.015504 | 0.395349 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2cf1316f9ee5f955fd15c0e34a1d960d2f6a156f | 303 | py | Python | mysite/SocialApp/migrations/0003_delete_remotefollow.py | asmao7/Cmput404W2021 | 82c1f42492c93048d5f144e2bbb416764d78013b | [
"MIT"
] | 3 | 2021-01-20T18:23:14.000Z | 2021-02-22T19:38:46.000Z | mysite/SocialApp/migrations/0003_delete_remotefollow.py | asmao7/Cmput404W2021 | 82c1f42492c93048d5f144e2bbb416764d78013b | [
"MIT"
] | 24 | 2021-02-18T19:28:46.000Z | 2021-04-14T17:12:21.000Z | mysite/SocialApp/migrations/0003_delete_remotefollow.py | asmao7/Cmput404W2021 | 82c1f42492c93048d5f144e2bbb416764d78013b | [
"MIT"
] | 1 | 2021-05-13T04:43:00.000Z | 2021-05-13T04:43:00.000Z | # Generated by Django 3.1.6 on 2021-04-12 08:45
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('SocialApp', '0002_auto_20210411_2237'),
]
operations = [
migrations.DeleteModel(
name='RemoteFollow',
),
]
| 17.823529 | 49 | 0.613861 | 32 | 303 | 5.71875 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141553 | 0.277228 | 303 | 16 | 50 | 18.9375 | 0.694064 | 0.148515 | 0 | 0 | 1 | 0 | 0.171875 | 0.089844 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2cf4f7aa04d5ae102d82d9c9bd6e398a5f525f06 | 5,230 | py | Python | src/export_blueprints.py | nutanixdev/export_blueprints | 5100dc3342c4b7d01b7fd4276fd69fc2ff150c5a | [
"MIT"
] | null | null | null | src/export_blueprints.py | nutanixdev/export_blueprints | 5100dc3342c4b7d01b7fd4276fd69fc2ff150c5a | [
"MIT"
] | null | null | null | src/export_blueprints.py | nutanixdev/export_blueprints | 5100dc3342c4b7d01b7fd4276fd69fc2ff150c5a | [
"MIT"
] | null | null | null | #!/usr/bin/env python3.8
"""
export_blueprints.py
Connect to a Nutanix Prism Central instance, grab all Calm blueprints and export them to JSON files.
You would need to *heavily* modify this script for use in a production environment so that it contains appropriate error-checking and exception handling.
"""
__author__ = "Chris Rasmussen @ Nutanix"
__version__ = "1.1"
__maintainer__ = "Chris Rasmussen @ Nutanix"
__email__ = "crasmussen@nutanix.com"
__status__ = "Development/Demo"
# default modules
import json
import getpass
import argparse
from time import localtime, strftime
import urllib3
# custom modules
import apiclient
def set_options():
global ENTITY_RESPONSE_LENGTH
"""
set ENTITY_RESPONSE_LENGTH to the maximum number of blueprints you want
to export
this is only required since the v3 list APIs will only return 20
entities by default
"""
ENTITY_RESPONSE_LENGTH = 50
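# The note above about the v3 list APIs returning only 20 entities by
# default is why ENTITY_RESPONSE_LENGTH exists. A small illustrative helper
# (ours, not part of the original script) that makes the length/offset
# paging parameters of a list request body explicit:

```python
import json

def build_list_payload(kind, length=50, offset=0):
    """Build a JSON body for a v3 <kind>s/list request (illustrative)."""
    return json.dumps({"kind": kind, "length": length, "offset": offset})
```

# e.g. build_list_payload("blueprint", length=ENTITY_RESPONSE_LENGTH)
# once set_options() has run.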
def get_options():
global cluster_ip
global username
global password
# process the command-line arguments
parser = argparse.ArgumentParser(
description="Export all Calm blueprints to JSON files"
)
parser.add_argument("pc_ip", help="Prism Central IP address")
parser.add_argument("-u", "--username", help="Prism Central username")
parser.add_argument("-p", "--password", help="Prism Central password")
args = parser.parse_args()
# validate the arguments to make sure all required info has been supplied
if args.username:
username = args.username
else:
username = input("Please enter your Prism Central username: ")
if args.password:
password = args.password
else:
password = getpass.getpass()
cluster_ip = args.pc_ip
def main():
# set the global options
set_options()
# get the cluster connection info
get_options()
"""
disable insecure connection warnings
please be advised and aware of the implications
in a production environment!
"""
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
# make sure all required info has been provided
if not cluster_ip:
raise Exception("Cluster IP is required.")
elif not username:
raise Exception("Username is required.")
elif not password:
raise Exception("Password is required.")
else:
"""
do a preliminary check to see if this is AOS or CE
        not used in this script, but could be useful for
later modifications
"""
client = apiclient.ApiClient(
"post",
cluster_ip,
"clusters/list",
'{ "kind": "cluster" }',
username,
password,
)
results = client.get_info()
is_ce = False
for cluster in results["entities"]:
if (
"-ce-"
in cluster["status"]["resources"]["config"]["build"]["full_version"]
):
is_ce = True
endpoints = {}
endpoints["blueprints"] = ["blueprint", (f'"length":{ENTITY_RESPONSE_LENGTH}')]
# get all blueprints
for endpoint in endpoints:
if endpoints[endpoint][1] != "":
client = apiclient.ApiClient(
"post",
cluster_ip,
(f"{endpoints[endpoint][0]}s/list"),
(
f'{{ "kind": "{endpoints[endpoint][0]}", {endpoints[endpoint][1]} }}'
),
username,
password,
)
else:
client = apiclient.ApiClient(
"post",
cluster_ip,
(f"{endpoints[endpoint][0]}s/list"),
(f'{{ "kind": "{endpoints[endpoint][0]}" }}'),
username,
password,
)
results = client.get_info()
# make sure the user knows what's happening ... ;-)
print(f"\n{len(results['entities'])} blueprints collected from {cluster_ip}\n")
'''
go through all the blueprints and export them to appropriately named files
        filename will match the blueprint name and should work fine if the blueprint name contains spaces (tested on Ubuntu Linux)
'''
for blueprint in results["entities"]:
day = strftime("%d-%b-%Y", localtime())
time = strftime("%H%M%S", localtime())
blueprint_filename = f"{day}_{time}_{blueprint['status']['name']}.json"
client = apiclient.ApiClient(
"get",
cluster_ip,
f"blueprints/{blueprint['status']['uuid']}/export_file",
'{ "kind": "cluster" }',
username,
password,
)
exported_json = client.get_info()
with open(f"./{blueprint_filename}", "w") as f:
json.dump(exported_json, f)
print(
f"Successfully exported blueprint '{blueprint['status']['name']}'"
)
print("\nFinished!\n")
if __name__ == "__main__":
main()
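The module docstring notes that the v3 list APIs return only 20 entities by default, which the script works around by raising `ENTITY_RESPONSE_LENGTH`. A more robust approach is to page through results. A minimal sketch of building paged request bodies (the `offset` field name is an assumption about the v3 list API; verify it against your Prism Central version):

```python
import json

def list_payloads(kind, total, page_size=20):
    """Build one /list request body per page needed to cover `total` entities."""
    payloads = []
    for offset in range(0, total, page_size):
        payloads.append(json.dumps(
            {"kind": kind, "length": page_size, "offset": offset}))
    return payloads

pages = list_payloads("blueprint", total=45)
print(len(pages))  # 3 request bodies cover 45 blueprints at 20 per page
```

Each payload could then be passed to `apiclient.ApiClient` in place of the single fixed-length body above.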
#!/usr/bin/env python
# File: scripts/media_to_wp.py (benjaminaschultz/pypress, MIT)
import os,re,sys
import mimetypes as mt
import argparse
import wordpress_xmlrpc as wp
from pypress import *
def main(argv,client=None):
parser = argparse.ArgumentParser()
parser.add_argument('-b','--blog', help='url of wordpress blog to which you want to post',dest='url')
parser.add_argument('-u','--user', help="username with which you'd like to post to the blog",dest='username')
parser.add_argument('-p','--password', help="password with which you'd like to post to the blog",dest='password')
    parser.add_argument('files', help='files to be uploaded to the blog',
nargs='+')
args = parser.parse_args(argv)
    conf = WPConfig(url=args.url, username=args.username, password=args.password)
    if client is None:
        client = conf.getDefaultClient()
    wmpu = WPMediaUploader(client)
    wmpu.upload(args.files)
if __name__=="__main__":
main(sys.argv[1:])
# File: tp/log/es.py (chinapnr/agbot, MIT)
import re
import time
from datetime import datetime
from enum import Enum
from fishbase.fish_logger import logger
from .elk_connector import Es
from ..base.tp_base import TpBase, TestStatus, Conf, VerticalContext
from ..base.tp_base import get_params_dict
# LogTestPoint
class LogESTestPoint(TpBase):
    # Class initialization
# 2018.6.11 create by yanan.wu #748921
def __init__(self, tp_conf, vertical_context: VerticalContext):
TpBase.__init__(self, tp_conf, vertical_context)
self.conf_enum = LogESTestPointEnum
self.__tc_start_time = ''
self.vertical_context = vertical_context
    # Prepare the request parameters
    # 2018.6.11 create by yanan.wu #748921
    def build_request(self):
        tc_ctx = self.vertical_context.tc_context
        try:
            # Get the request parameters
            self.req_param = {'index': self.tp_conf.get('index'),
                              'key_word': self.tp_conf.get('key_word')}
            # Get the tc execution start time
time_struct = time.mktime(tc_ctx.start_time.timetuple())
self.__tc_start_time = datetime.utcfromtimestamp(
time_struct).strftime('%Y-%m-%dT%H:%M:%S')
return self.req_param
except RuntimeError as e:
logger.error('tp->log:get req params error: {}'.format(str(e)))
raise Exception(str(e))
    # Execute the test case
    # 2018.6.11 create by yanan.wu #748921
    def execute(self, request):
        try:
            # NOTE: the connection settings (server_ip, server_port,
            # auth_user, auth_password) must be filled in before this runs
            es_conf = {}
            # Send the API request and receive the response
es = Es(es_conf['server_ip'], es_conf['server_port'],
es_conf['auth_user'], es_conf['auth_password'])
tp_utc_time = datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%S')
resp = es.search_match(request.get(LogESTestPointEnum.index.key),
self.__tc_start_time,
tp_utc_time, 100,
request.get(LogESTestPointEnum.key_word.key))
return resp, ''
except Exception as e:
logger.error('tp->log: execute error: {}'.format(str(e)))
raise Exception(str(e))
    # Verify the expected result
    # 2018.6.11 create by yanan.wu #748921
    def test_status(self):
        tc_ctx = self.vertical_context.tc_context
        # Get the expected return parameters
if self.tp_conf.get('expect_data'):
params_name_list = self.tp_conf.get('expect_data').split(',')
self.expect_dict = get_params_dict(params_name_list, tc_ctx.tc_detail.data)
if self.tp_conf.get('check_type') == LogCheckType.ROWS_CHECK.value:
if self.expect_dict.get(LogESTestPointEnum.expect_data.key) == str(tc_ctx.current_tp_context.response.content['hits']['total']):
return TestStatus.PASSED
else:
return TestStatus.NOT_PASSED
if self.tp_conf.get('check_type') == LogCheckType.REG_CHECK.value:
for hit in tc_ctx.current_tp_context.response.content['hits']['hits']:
match_obj = re.search(
self.expect_dict.get(LogESTestPointEnum.expect_data.key),
hit['_source']['message'])
if match_obj:
return TestStatus.PASSED
return TestStatus.NOT_PASSED
    # Post-processing
    def post_handler(self):
        pass
# Log check type
# 2018.6.12 create by yanan.wu #806640
class LogCheckType(Enum):
    # Row-count check
    ROWS_CHECK = '01'
    # Regex check
    REG_CHECK = '02'
# Log configuration file enum
class LogESTestPointEnum(Conf):
    tp_name = 'tp_name', 'tp name', True, ''
    key_word = 'key_word', 'query keyword', False, ''
    index = 'index', 'query index', True, ''
    check_type = 'check_type', 'check type', True, '01'
    expect_data = 'expect_data', 'expected result', True, ''
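The two `LogCheckType` modes in `test_status` reduce to a small pure function over the Elasticsearch response dict. A self-contained sketch (the fake response mirrors only the `hits`/`total`/`_source` fields the class reads; the message text is invented):

```python
import re

def check_hits(response, check_type, expected):
    """Reduce LogESTestPoint.test_status to a pure function."""
    if check_type == '01':  # ROWS_CHECK: compare expected row count
        return expected == str(response['hits']['total'])
    if check_type == '02':  # REG_CHECK: regex over each hit's message
        return any(re.search(expected, hit['_source']['message']) is not None
                   for hit in response['hits']['hits'])
    return False

fake = {'hits': {'total': 2,
                 'hits': [{'_source': {'message': 'order 123 accepted'}},
                          {'_source': {'message': 'order 456 rejected'}}]}}
print(check_hits(fake, '01', '2'))                    # True
print(check_hits(fake, '02', r'order \d+ rejected'))  # True
```

Factoring the check out this way also makes it easy to unit-test without a live Elasticsearch instance.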
# File: main.py (nmanzini/flashcardipy, MIT)
import sqlite3, random, os
import time
name = 'test01.db'
filename = "grelist.txt"
conn = sqlite3.connect(name)
c = conn.cursor()
'''word, definition, example, history, time, seen, right, wrong, streak, reported'''
class Word(object):
def __init__(self, row_id):
"""
Initialize a Word object with all that stuff
:param row_id: integer value of the row
:type row_id: integer
"""
c.execute('SELECT * FROM words WHERE rowid = ' + str(row_id))
word_data = c.fetchone()
self.row_id = row_id
self.word = word_data[0]
self.definition = word_data[1]
self.example = word_data[2]
self.history = word_data[3]
self.time_h = word_data[4]
self.seen = word_data[5]
self.right = word_data[6]
self.wrong = word_data[7]
self.streak = word_data[8]
self.reported = word_data[9]
def show(self):
"""
        shows the previously selected word and reacts to the input
:return:
:rtype:
"""
# TODO: polish the console gui by adding an introduction at the beginning
# TODO: polish the visualization of words, showing history and last time seen.
positive = ("yes", "y", "Y", "Yes", "YES", "1", " ")
negative = ("no", "n", "N", "No", "NO", "0", "")
exit_answers = ("exit", "e")
report = ("report", "r")
print()
print(" " + self.word.upper())
print()
print()
answer = input('do you remember this word?')
print()
if answer in positive:
print("DEFINITION:")
print(self.definition)
print()
print()
print("good!")
input('press enter when done')
self.opened_edit()
self.positive_edit()
self.streak_edit(1)
elif answer in negative:
print("DEFINITION:")
print(self.definition)
print()
print()
print("you will remember next time")
input('press enter when done')
self.opened_edit()
self.negative_edit()
self.streak_edit(0)
elif answer in exit_answers:
print("Ok, see you soon!")
return True
elif answer in report:
print("sorry the word was incorrect")
self.report_edit()
else:
print("invalid input")
self.update()
os.system('cls')
return
def opened_edit(self):
"""
        reacts to the word being opened: increments seen and appends a timestamp
:return:
:rtype:
"""
if self.time_h:
self.time_h += " , " + str(int(time.time()))
else:
self.time_h = str((int(time.time())))
self.seen += 1
    # TODO: merge opened(self, input) with positive and negative; the input shall be 1 or 0 for right or wrong
def positive_edit(self):
"""
        reacts to a positive answer, updating history and the right counter
:return:
:rtype:
"""
if self.history:
self.history += 1
else:
self.history = 1
self.right +=1
def negative_edit(self):
"""
        reacts to a negative answer, updating history and the wrong counter
:return:
:rtype:
"""
if self.history:
self.history += 0
else:
self.history = 0
self.wrong += 1
def report_edit(self):
self.reported = 1
def streak_edit(self, value):
"""
update the streak, positive means the user is on a positive streak for the word and vice versa
:param value: integer (1 or 0)
:type value: int
"""
if value == 1:
if self.streak >= 0:
self.streak += 1
else:
self.streak = 1
if value == 0:
if self.streak >= 0:
self.streak = -1
else:
self.streak += -1
def update(self):
variables = ['history', 'time', 'seen', 'right', 'wrong', 'streak', 'reported']
marks = ["?"]*len(variables)
values = [self.history, self.time_h, self.seen, self.right, self.wrong, self.streak, self.reported]
output_list = [a+" = "+b for a, b in zip(variables, marks)]
output_line = " , ".join(output_list)
        c.execute('UPDATE words SET ' + output_line + ' WHERE rowid = ' + str(self.row_id), values)
conn.commit()
def chooser():
case = random.random()
if case < 0.70:
c.execute('SELECT rowid FROM words WHERE reported = 0 ORDER BY RANDOM() LIMIT 1;')
elif case < 0.95:
c.execute('SELECT rowid FROM words WHERE reported = 0 and streak < 0 ORDER BY RANDOM() LIMIT 1;')
else:
c.execute('SELECT rowid FROM words WHERE reported = 0 and streak > 0 ORDER BY RANDOM() LIMIT 1;')
row_id = c.fetchone()
if not row_id:
c.execute('SELECT rowid FROM words WHERE reported = 0 ORDER BY RANDOM() LIMIT 1;')
row_id = c.fetchone()
return row_id[0]
if __name__ == "__main__":
while True:
test_word = Word(chooser())
result = test_word.show()
if result:
break
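`streak_edit` above encodes the rule "extend a streak of the same sign, otherwise restart at plus or minus one". The same rule as a pure function, for clarity:

```python
def next_streak(streak, correct):
    """Pure version of Word.streak_edit: a correct answer extends a
    non-negative streak (else restarts at 1); a wrong answer extends a
    non-positive streak (else restarts at -1)."""
    if correct:
        return streak + 1 if streak >= 0 else 1
    return streak - 1 if streak < 0 else -1

assert next_streak(3, True) == 4
assert next_streak(-2, True) == 1
assert next_streak(2, False) == -1
assert next_streak(-2, False) == -3
```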
# File: Leetcode Practice/strStr.py (falconcode16/pythonprogramming, MIT)
# Link - https://leetcode.com/problems/implement-strstr/
"""
28. Implement strStr()
Implement strStr().
Return the index of the first occurrence of needle in haystack, or -1 if needle is not part of haystack.
Clarification:
What should we return when needle is an empty string? This is a great question to ask during an interview.
For the purpose of this problem, we will return 0 when needle is an empty string. This is consistent to C's strstr() and Java's indexOf().
Example 1:
Input: haystack = "hello", needle = "ll"
Output: 2
Example 2:
Input: haystack = "aaaaa", needle = "bba"
Output: -1
Example 3:
Input: haystack = "", needle = ""
Output: 0
Constraints:
0 <= haystack.length, needle.length <= 5 * 10^4
haystack and needle consist of only lower-case English characters.
"""
class Solution:
def strStr(self, haystack: str, needle: str) -> int:
if len(needle) == 0:
return 0
else:
try:
return haystack.index(needle)
except ValueError:
return -1
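For comparison with the `str.index` approach above, the same search written as an explicit sliding window (an illustrative alternative, not part of the original solution):

```python
def str_str(haystack, needle):
    """Return the first index of needle in haystack, or -1."""
    n, m = len(haystack), len(needle)
    if m == 0:
        return 0
    # Compare needle against every window of length m.
    for i in range(n - m + 1):
        if haystack[i:i + m] == needle:
            return i
    return -1

print(str_str("hello", "ll"))   # 2
print(str_str("aaaaa", "bba"))  # -1
print(str_str("", ""))          # 0
```

This is O(n * m) in the worst case; the built-in `str.index` is typically faster in practice.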
# File: skaio/scheduler.py (cipriantarta/skaio, BSD-3-Clause)
import importlib.util
import inspect
from skaio import log
from skaio.core.publisher import Publisher
from skaio.core.base.task import BaseTask
from skaio.utils.common import get_loop
tasks = ['samples.simple_tasks']
class Scheduler:
def start(self):
publisher = Publisher()
loop = get_loop()
for task_mod in tasks:
m = importlib.import_module(task_mod)
task_classes = filter(lambda x: inspect.isclass(x[1])
and x[1].__name__ != 'BaseTask'
and issubclass(x[1], BaseTask),
inspect.getmembers(m))
for name, task in task_classes:
log.info(f'Sending tasks for {name}')
loop.run_until_complete(publisher.publish(task))
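The inspect-based filter in `start()` can be exercised in isolation. A sketch with a stand-in `BaseTask` (the class names here are placeholders, not the real skaio classes):

```python
import inspect

class BaseTask:
    pass

class SimpleTask(BaseTask):
    pass

def find_tasks(members):
    """Apply the same filter as Scheduler.start(): concrete subclasses
    of BaseTask, excluding BaseTask itself."""
    return [obj for name, obj in members
            if inspect.isclass(obj)
            and obj.__name__ != 'BaseTask'
            and issubclass(obj, BaseTask)]

members = [('BaseTask', BaseTask), ('SimpleTask', SimpleTask), ('answer', 42)]
print(find_tasks(members))  # only SimpleTask survives the filter
```

In the real scheduler, `members` comes from `inspect.getmembers(module)` on each entry in `tasks`.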
# File: Analytics_Deployment/amls/model_deployment/download_model.py (dciborow/Azure-Synapse-Retail-Recommender-Solution-Accelerator, MIT)
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
import os, uuid, sys, pickle, shutil, io, logging
from azure.storage.filedatalake import DataLakeServiceClient
from azure.core._match_conditions import MatchConditions
from azure.storage.filedatalake._models import ContentSettings
from utility_functions.az_storage_reader import *
# Enter the name of the Azure Data Lake Storage Gen2 Account
DATA_LAKE_NAME=""
# Enter the name of the filesystem
DATA_LAKE_FILE_SYSTEM_NAME=""
# Enter the Primary Key of the Data Lake Account
DATA_LAKE_PRIMARY_KEY=""
file_system_client = connect_to_adls(DATA_LAKE_NAME, DATA_LAKE_PRIMARY_KEY, DATA_LAKE_FILE_SYSTEM_NAME)
dirs_to_write = ["itemFactors", "metadata", "userFactors"]
prep_dirs_for_write(dirs_to_write, "retailai_recommendation_model")
for directory in dirs_to_write:
copy_files_from_directory(file_system_client, "user/trusted-service-user/retailai_recommendation_model/"+directory, directory, "retailai_recommendation_model")
shutil.make_archive("retailai_recommendation_model", 'zip', "model\\retailai_recommendation_model")
# File: tests/conftest.py (tohanss/repobee-sanitizer, MIT)
"""Global fixtures and setup code for the test suite."""
import sys
import pathlib
import pytest
import repobee
sys.path.append(str(pathlib.Path(__file__).parent / "helpers"))
@pytest.fixture(autouse=True)
def unregister_plugins():
"""Fixture that automatically unregisters all plugins after each test
function. This is important for the end-to-end tests.
"""
repobee.unregister_all_plugins()
@pytest.fixture
def sanitizer_config(tmpdir):
"""Config file which only specifies sanitizer as a plugin."""
config_file = pathlib.Path(tmpdir) / "sanitizer_config.cnf"
config_file.write_text("[DEFAULTS]\nplugins = sanitizer\n")
yield config_file
# File: backend/research_note/migrations/0006_remove_researchnote_is_written.py (andy23512/research-note-system, MIT)
# Generated by Django 2.2.7 on 2019-11-12 16:03
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('research_note', '0005_auto_20191112_2255'),
]
operations = [
migrations.RemoveField(
model_name='researchnote',
name='is_written',
),
]
#!/usr/bin/python
# File: web.py (dujinle/AccountByTornado, Apache-2.0)
import sys, os
import tornado.ioloop
import tornado.web
import tornado.httpserver
import logging
import logging.handlers
import re
from urllib import unquote
import config
from travellers import *
reload(sys)
sys.setdefaultencoding('utf8')
def deamon(chdir = False):
try:
if os.fork() > 0:
os._exit(0)
except OSError, e:
print 'fork #1 failed: %d (%s)' % (e.errno, e.strerror)
os._exit(1)
def init():
pass
class DefaultHandler(tornado.web.RequestHandler):
def get(self):
self.write('Travellers Say Hello! (v%s)' % config.VERSION)
class LogHandler(tornado.web.RequestHandler):
def get(self):
log_filename = 'logs/logging'
if not os.path.exists(log_filename):
self.write('The log file is empty.')
return
log_file = None
log_file_lines = None
try:
log_file = open(log_filename, 'r')
if log_file is None:
raise Exception('log_file is None')
log_file_lines = log_file.readlines()
if log_file_lines is None:
raise Exception('log_file_lines is None')
except Exception, e:
logger = logging.getLogger('web')
logger.error('Failed to read the log file (logs/logging), error: %s' % e)
finally:
if log_file is not None:
log_file.close()
if log_file_lines is None:
self.write('Failed to read the log file.')
line_limit = 500
for _ in log_file_lines[::-1]:
line_limit -= 1
if line_limit > 0:
self.write(unquote(_) + '<BR/>')
settings = {
"static_path": os.path.join(os.path.dirname(__file__), "static"),
"cookie_secret": "SAB8LF2sGBflryMb6eXFkX#ou@CNta9V",
}
routes = [
(r"/", DefaultHandler),
(r"/api/user/authkey", AuthKeyHandler), # Send AuthKey (POST)(JWT)
(r"/api/user/register", RegisterHandler), # Register (POST)(JWT)
(r"/api/user/login", LoginHandler), # Login (POST)(JWT)
(r"/api/user/logout", LogoutHandler), # Logout (POST)
(r"/api/user/reset", ResetHandler), # Reset password (POST)(JWT)
(r"/api/user/forget", ForgetHandler), # Forget password (POST)(JWT)
(r"/api/user/update", UUpdateHandler), # UserUpdate (GET/POST)
(r"/api/user/getuser", GetUserHandler), # GetUser (GET/POST)
(r"/api/user/getall", GetAllMbersHandler), # GetAllUsers (GET/POST)
(r"/api/user/icon", AvatarHandler), # UpdateIcon (GET/POST)
(r"/api/user/geticon", GetIconHandler), # GetIcon (GET/POST)
(r"/api/user/pos", PostionHandler), # Update Pos info (GET/POST)
(r"/api/group/create", AddGroupHandler), # AddGroup (GET/POST)
(r"/api/group/destroy", DelGroupHandler), # DelGroup (GET/POST)
(r"/api/group/join", AddMemberHandler), # AddMember (GET/POST)
(r"/api/group/quit", DelMemberHandler), # DelMember (GET/POST)
(r"/api/group/getgroup", GetMembersHandler), # GerMembers (GET/POST)
(r"/api/group/rename", RenameGroupHandler), # GerMembers (GET/POST)
(r"/api/group/setshare",SetPosShareHandler), # GerMembers (GET/POST)
]
if config.Mode == 'DEBUG':
routes.append((r"/log", LogHandler))
application = tornado.web.Application(routes, **settings)
if __name__ == "__main__":
if '-d' in sys.argv:
deamon()
logdir = 'logs'
if not os.path.exists(logdir):
os.makedirs(logdir)
fmt = '%(asctime)s - %(filename)s:%(lineno)s - %(name)s - %(message)s'
formatter = logging.Formatter(fmt)
handler = logging.handlers.TimedRotatingFileHandler(
'%s/logging' % logdir, 'M', 20, 360)
handler.suffix = '%Y%m%d%H%M%S.log'
handler.extMatch = re.compile(r'^\d{4}\d{2}\d{2}\d{2}\d{2}\d{2}')
handler.setFormatter(formatter)
logger = logging.getLogger('web')
logger.addHandler(handler)
if config.Mode == 'DEBUG':
logger.setLevel(logging.DEBUG)
else:
logger.setLevel(logging.ERROR)
init()
if '-P' in sys.argv:
http_server = tornado.httpserver.HTTPServer(application)
http_server.bind(8080, '0.0.0.0')
		http_server.start() # TODO: start multiple processes based on CPU core count
		print 'Server is running, listening on port 8080....'
tornado.ioloop.IOLoop.instance().start()
else:
http_server = tornado.httpserver.HTTPServer(application)
application.listen(8080)
		print 'Server is running, listening on port 8080....'
tornado.ioloop.IOLoop.instance().start()
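`LogHandler`'s reversed-lines loop has a subtle off-by-one: the counter is decremented before the `> 0` check, so at most 499 of the intended 500 lines are shown. A pure sketch of that behavior:

```python
def tail_lines(lines, limit=500):
    """Reproduce LogHandler's loop: lines newest-first, capped at
    limit - 1 entries because the counter is decremented before the
    > 0 check."""
    out = []
    for line in reversed(lines):
        limit -= 1
        if limit > 0:
            out.append(line)
    return out

assert tail_lines(['a', 'b', 'c'], limit=3) == ['c', 'b']
assert len(tail_lines([str(i) for i in range(600)])) == 499
```

Checking the limit before decrementing (or slicing with `lines[-limit:][::-1]`) would show the full 500.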
# File: fluiddb/data/user.py (fluidinfo/fluiddb, Apache-2.0)
import crypt
import random
import re
from string import ascii_letters, digits
from uuid import uuid4
from storm.locals import (
Storm, DateTime, Int, Unicode, UUID, Reference, AutoReload, RawStr)
from fluiddb.data.exceptions import DuplicateUserError, MalformedUsernameError
from fluiddb.data.store import getMainStore
from fluiddb.util.constant import Constant, ConstantEnum, EnumBase
class Role(EnumBase):
"""User roles.
@cvar ANONYMOUS: A user with the anonymous role only has read-only access
to data in Fluidinfo, unless a permission specifically grants write
access to a particular entity.
@cvar SUPERUSER: A user with the superuser role has read-write access to
all data in Fluidinfo and is not subject to permission checks.
@cvar USER: A user with the user role has read-write access to some data
in Fluidinfo, based on the rules defined by the permission system.
@cvar USER_MANAGER: A user with the user manager role is the same as a
C{USER}, except they can create, update and delete L{User}s.
"""
ANONYMOUS = Constant(1, 'ANONYMOUS')
SUPERUSER = Constant(2, 'SUPERUSER')
USER = Constant(3, 'USER')
USER_MANAGER = Constant(4, 'USER_MANAGER')
DOT_ATOM = r"(^[-!#$%&'*+/=?^_`{}|~0-9A-Z]+(\.[-!#$%&'*+/=?^_`{}|~0-9A-Z]+)*"
QUOTED_STRING = (r"|^\"([\001-\010\013\014\016-\037!#-\[\]-\177]|\\[\001-\011"
r"\013\014\016-\177])*\"")
DOMAIN_STRING = r")@(?:[A-Z0-9-]+\.)+[A-Z]{2,6}$"
EMAIL_REGEXP = re.compile(DOT_ATOM + QUOTED_STRING + DOMAIN_STRING,
re.IGNORECASE)
def validateEmail(obj, attribute, value):
"""Validate a L{User.email} value before storing it in the database.
@param obj: The L{User} instance being updated.
@param attribute: The name of the attribute being set.
@param value: The email address being stored.
@raise ValueError: Raised if the value isn't a valid email address.
@return: The value to store.
"""
if value is not None and not EMAIL_REGEXP.match(value):
raise ValueError('%r is not a valid email address.' % value)
return value
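The email pattern can be exercised on its own. The snippet below rebuilds `EMAIL_REGEXP` exactly as defined above and checks a few invented sample addresses, including the quoted-local-part form the `QUOTED_STRING` branch accepts:

```python
import re

# Same three fragments as in fluiddb/data/user.py, copied verbatim.
DOT_ATOM = r"(^[-!#$%&'*+/=?^_`{}|~0-9A-Z]+(\.[-!#$%&'*+/=?^_`{}|~0-9A-Z]+)*"
QUOTED_STRING = (r"|^\"([\001-\010\013\014\016-\037!#-\[\]-\177]|\\[\001-\011"
                 r"\013\014\016-\177])*\"")
DOMAIN_STRING = r")@(?:[A-Z0-9-]+\.)+[A-Z]{2,6}$"
EMAIL_REGEXP = re.compile(DOT_ATOM + QUOTED_STRING + DOMAIN_STRING,
                          re.IGNORECASE)

assert EMAIL_REGEXP.match('alice@example.com')            # plain dot-atom
assert EMAIL_REGEXP.match('"weird@local"@example.com')    # quoted local part
assert EMAIL_REGEXP.match('no-at-sign.example.com') is None
```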
class User(Storm):
"""A user of Fluidinfo.
@param username: The username of the user.
@param passwordHash: The hashed password of the user.
@param fullname: The name of the user.
@param email: The email address for the user.
@param role: The L{Role} for the user.
"""
__storm_table__ = 'users'
id = Int('id', primary=True, allow_none=False, default=AutoReload)
objectID = UUID('object_id', allow_none=False)
role = ConstantEnum('role', enum_class=Role, allow_none=False)
username = Unicode('username', allow_none=False)
passwordHash = RawStr('password_hash', allow_none=False)
fullname = Unicode('fullname', allow_none=False)
email = Unicode('email', validator=validateEmail)
namespaceID = Int('namespace_id')
creationTime = DateTime('creation_time', default=AutoReload)
namespace = Reference(namespaceID, 'Namespace.id')
def __init__(self, username, passwordHash, fullname, email, role):
self.objectID = uuid4()
self.username = username
self.passwordHash = passwordHash
self.fullname = fullname
self.email = email
self.role = role
def isAnonymous(self):
"""Returns C{True} if this user has the anonymous role."""
return self.role == Role.ANONYMOUS
def isSuperuser(self):
"""Returns C{True} if this user has the super user role."""
return self.role == Role.SUPERUSER
def isUser(self):
"""Returns C{True} if this user has the regular user role."""
return self.role == Role.USER
def createUser(username, password, fullname, email=None, role=None):
"""Create a L{User} called C{name} with C{role}.
@param username: A C{unicode} username for the user.
@param password: A C{unicode} password in plain text for the user. The
password will be hashed before being stored. The password will be
disabled if C{None} is provided.
    @param fullname: A C{unicode} full name for the user.
    @param email: Optionally, an email address for the user.
@param role: Optionally, a role for the user, defaults to L{Role.USER}.
@raise MalformedUsernameError: Raised if C{username} is not valid.
@raise DuplicateUserError: Raised if a user with the given C{username}
already exists.
@return: A new L{User} instance persisted in the main store.
"""
if not isValidUsername(username):
raise MalformedUsernameError(username)
store = getMainStore()
if store.find(User.id, User.username == username).any():
raise DuplicateUserError([username])
passwordHash = '!' if password is None else hashPassword(password)
role = role if role is not None else Role.USER
return store.add(User(username, passwordHash, fullname, email, role))
def getUsers(usernames=None, ids=None, objectIDs=None):
"""Get L{User}s.
@param usernames: Optionally, a sequence of L{User.username}s to filter
the results with.
@param ids: Optionally, a sequence of L{User.id}s to filter the results
with.
@param objectIDs: Optionally, a sequence of L{User.objectID}s to filter the
        results with.
@return: A C{ResultSet} with matching L{User}s.
"""
store = getMainStore()
where = []
if ids:
where.append(User.id.is_in(ids))
if usernames:
where.append(User.username.is_in(usernames))
if objectIDs:
where.append(User.objectID.is_in(objectIDs))
return store.find(User, *where)
# Password hashing code used by the low-level functions for creating users
ALPHABET = ascii_letters + digits
SALT_LENGTH = 8
def hashPassword(password, salt=None):
"""Convert a password string into a secure hash.
    This function generates a C{crypt}-style hashed password (MD5 by
    default), which consists of four fields separated by a C{$} symbol:
1. The status of the password. If this field is empty, the user is
enabled, otherwise it's disabled. The C{!} character should be used
when specifying that a user is disabled.
2. The mechanism (1 for MD5, 2a for Blowfish, 5 for SHA-256 and 6 for
SHA-512).
3. The salt.
4. The hashed password.
@param password: The C{unicode} password to be hashed.
    @param salt: Optionally, a salt to be passed to the L{crypt.crypt}
        function to secure the hash against brute-force attacks. Defaults to
        a random string that selects the MD5 hashing algorithm.
@return: A C{str} hash of C{password} generated with C{crypt} algorithm.
"""
    # crypt.crypt needs a byte string, so encode the password as UTF-8
password = password.encode('utf-8')
if salt is None:
salt = '$1$' + ''.join(random.choice(ALPHABET)
for _ in xrange(SALT_LENGTH))
return crypt.crypt(password, salt)
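The C{$}-separated layout described in the docstring above can be unpacked with a plain string split. A minimal standalone sketch (the hash value below is invented for illustration, not a real MD5 digest):

```python
# Hypothetical crypt-style hash: empty status field, mechanism 1 (MD5),
# an 8-character salt, then the hashed password.
sample = '$1$Ab3dEf9h$XXXXXXXXXXXXXXXXXXXXXX'
status, mechanism, salt, digest = sample.split('$')
assert status == ''       # empty status field: the user is enabled
assert mechanism == '1'   # 1 selects the MD5 mechanism
assert salt == 'Ab3dEf9h'
```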
USERNAME_REGEXP = re.compile(r'^[\:\.\-\w]{1,128}$', re.UNICODE)
def isValidUsername(username):
"""Determine if C{username} is valid.
A username may only contain letters, numbers, and colon, dash, dot and
underscore characters. It can't contain more than 128 characters.
    @param username: A C{unicode} username to validate.
@return: C{True} if C{username} is valid, otherwise C{False}.
"""
return (USERNAME_REGEXP.match(username) is not None)
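As a quick illustration of the username rule, the same regular expression can be exercised on its own (a standalone sketch; the example usernames are invented):

```python
import re

USERNAME_REGEXP = re.compile(r'^[\:\.\-\w]{1,128}$', re.UNICODE)

assert USERNAME_REGEXP.match(u'alice.smith-01:test') is not None
assert USERNAME_REGEXP.match(u'bad name') is None   # spaces are not allowed
assert USERNAME_REGEXP.match(u'x' * 129) is None    # longer than 128 characters
```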
class TwitterUser(Storm):
"""The Twitter UID for a Fluidinfo user.
@param userID: The L{User.id} to link to Twitter.
    @param uid: The Twitter UID to link to the Fluidinfo user.
"""
__storm_table__ = 'twitter_users'
userID = Int('user_id', primary=True, allow_none=False)
uid = Int('uid', allow_none=False)
creationTime = DateTime('creation_time', default=AutoReload)
user = Reference(userID, User.id)
def __init__(self, userID, uid):
self.userID = userID
self.uid = uid
def createTwitterUser(user, uid):
"""Create a L{TwitterUser}.
@param user: The L{User} to link to a Twitter account.
@param uid: The Twitter UID for the user.
@return: A new L{TwitterUser} instance persisted in the main store.
"""
store = getMainStore()
return store.add(TwitterUser(user.id, uid))
def getTwitterUsers(uids=None):
"""Get C{(User, TwitterUser)} 2-tuples matching specified Twitter UIDs.
@param uids: Optionally, a sequence of L{TwitterUser.uid}s to filter the
results with.
@return: A C{ResultSet} with matching C{(User, TwitterUser)} 2-tuples.
"""
store = getMainStore()
where = []
if uids:
where.append(TwitterUser.uid.is_in(uids))
return store.find((User, TwitterUser),
User.id == TwitterUser.userID,
*where)
| 36.11157 | 79 | 0.669642 | 1,206 | 8,739 | 4.804312 | 0.235489 | 0.015706 | 0.01933 | 0.010356 | 0.166552 | 0.119261 | 0.048326 | 0.016569 | 0.016569 | 0 | 0 | 0.012084 | 0.223481 | 8,739 | 241 | 80 | 36.261411 | 0.841733 | 0.463897 | 0 | 0.08 | 0 | 0 | 0.078241 | 0.026389 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12 | false | 0.09 | 0.09 | 0 | 0.54 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
fa3a63f302b46c02ff39a3b03b267bd4f406883d | 241 | py | Python | Books/Book/urls.py | qq292/Books | d3b85829592bcbeb87eeccc568e22c510a289487 | [
"MIT"
] | null | null | null | Books/Book/urls.py | qq292/Books | d3b85829592bcbeb87eeccc568e22c510a289487 | [
"MIT"
] | null | null | null | Books/Book/urls.py | qq292/Books | d3b85829592bcbeb87eeccc568e22c510a289487 | [
"MIT"
] | null | null | null | from django.contrib import admin
from django.urls import path
from django.views.generic import TemplateView
from .views import MainPage
urlpatterns = [
path('admin/', admin.site.urls),
path('', MainPage.as_view(), name='books'),
]
| 21.909091 | 47 | 0.73029 | 32 | 241 | 5.46875 | 0.53125 | 0.171429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145228 | 241 | 10 | 48 | 24.1 | 0.849515 | 0 | 0 | 0 | 0 | 0 | 0.045643 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
fa3cbfd8b261691f0db27a183a13227b321d7c35 | 1,782 | py | Python | neopixel/main.py | morgulbrut/wemos-mupy | fbe425c6a383c9c28f36f48a23c70aae98bf62cd | [
"MIT"
] | null | null | null | neopixel/main.py | morgulbrut/wemos-mupy | fbe425c6a383c9c28f36f48a23c70aae98bf62cd | [
"MIT"
] | null | null | null | neopixel/main.py | morgulbrut/wemos-mupy | fbe425c6a383c9c28f36f48a23c70aae98bf62cd | [
"MIT"
] | null | null | null | from machine import Pin
import neopixel
import time
class NeoMatrix:
    def __init__(self, x, y):
        self.width = x
        self.height = y
        # one Color per pixel, all off initially
        self.colors = [[Color(0, 0, 0) for _ in range(y)] for _ in range(x)]
        self.np = neopixel.NeoPixel(Pin(4, Pin.OUT), x * y)
        self.np.write()

    def set_pixel(self, x, y, r=0, g=0, b=0):
        self.colors[x][y] = Color(r, g, b)
        # update the matching pixel in the strip buffer and push it out
        self.np[x * self.height + y] = (r, g, b)
        self.np.write()
class Color:
def __init__(self,r,g,b):
self.r = r
self.g = g
self.b = b
# def clear(np):
# set(np)
# def cycle(np,r=0,g=0,b=0,delay=25,cycles=1,invert=False):
# for i in range(cycles):
# bounce(np,r,g,b,delay,1,invert)
# def bounce(np,r=0,g=0,b=0,delay=25,cycles=2,invert=False):
# for i in range(cycles * np.n):
# if invert:
# for j in range(np.n):
# np[j] = (r, g, b)
# if (i // np.n) % 2 == 0:
# np[i % np.n] = (0, 0, 0)
# else:
# np[np.n - 1 - (i % np.n)] = (0, 0, 0)
# else:
# for j in range(np.n):
# np[j] = (0, 0, 0)
# if (i // np.n) % 2 == 0:
# np[i % np.n] = (r, g, b)
# else:
# np[np.n - 1 - (i % np.n)] = (r, g, b)
# np.write()
# time.sleep_ms(delay)
# clear(np)
# def fade_in(np,r=0,g=0,b=0,delay=25,cycles=1):
# for i in range(0,256,8):
# rn = my_map(r,in_max=i,out_max=r)
# gn = my_map(g,in_max=i,out_max=g)
# bn = my_map(b,in_max=i,out_max=b)
# set(np,rn,gn,bn)
# print(rn)
# def my_map(val,in_min=0,in_max=255,out_min=0,out_max=255):
# try:
# return int((val - in_min) * (out_max - out_min) / (in_max - in_min) + out_min)
# except ZeroDivisionError:
# return 0
# def demo(np):
# n = np.n
# # fade in/out
# for i in range(0, 4 * 256, 8):
# for j in range(n):
# if (i // 256) % 2 == 0:
# val = i & 0xff
# else:
# val = 255 - (i & 0xff)
# np[j] = (val, 0, 0)
# np.write()
# # clear
# clear(np)
| 20.964706 | 82 | 0.537037 | 362 | 1,782 | 2.558011 | 0.179558 | 0.042117 | 0.019438 | 0.059395 | 0.316415 | 0.25162 | 0.238661 | 0.167387 | 0.100432 | 0.100432 | 0 | 0.054034 | 0.241863 | 1,782 | 84 | 83 | 21.214286 | 0.631384 | 0.691919 | 0 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.157895 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
fa3d7b7b58127ec40c852f915a22ddc033b65684 | 377 | py | Python | tests/unit/request/request_builders/recurring_get_schedule_builder_test.py | Zhenay/python-sdk | 53161cc591fe2ec54ecfefad4fc4d5625a97afd9 | [
"MIT"
] | 3 | 2018-06-08T12:21:28.000Z | 2020-04-07T12:54:04.000Z | tests/unit/request/request_builders/recurring_get_schedule_builder_test.py | Zhenay/python-sdk | 53161cc591fe2ec54ecfefad4fc4d5625a97afd9 | [
"MIT"
] | 2 | 2017-09-21T15:43:46.000Z | 2019-07-22T08:20:50.000Z | tests/unit/request/request_builders/recurring_get_schedule_builder_test.py | Zhenay/python-sdk | 53161cc591fe2ec54ecfefad4fc4d5625a97afd9 | [
"MIT"
] | 4 | 2020-09-21T07:11:22.000Z | 2022-03-21T09:42:51.000Z | import unittest
from platron.request.request_builders.recurring_get_schedule_builder import RecurringGetScheduleBuilder
class RecurringGetScheduleBuilderTest(unittest.TestCase):
def test_get_params(self):
builder = RecurringGetScheduleBuilder('12345')
params = builder.get_params()
self.assertEquals('12345', params.get('pg_recurring_profile'))
| 31.416667 | 103 | 0.787798 | 37 | 377 | 7.783784 | 0.594595 | 0.0625 | 0.090278 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030581 | 0.132626 | 377 | 11 | 104 | 34.272727 | 0.850153 | 0 | 0 | 0 | 0 | 0 | 0.079576 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
fa401c88954339276304ae936fc6f58c6bc88be6 | 2,372 | py | Python | malleus/api/domain/protos/timings_pb2.py | joelgerard/malleus | 763850ef270a449829b89a998cdce8febf5020ef | [
"Apache-2.0"
] | null | null | null | malleus/api/domain/protos/timings_pb2.py | joelgerard/malleus | 763850ef270a449829b89a998cdce8febf5020ef | [
"Apache-2.0"
] | 2 | 2021-02-08T20:22:50.000Z | 2021-06-01T22:07:40.000Z | malleus/api/domain/protos/timings_pb2.py | joelgerard/malleus | 763850ef270a449829b89a998cdce8febf5020ef | [
"Apache-2.0"
] | null | null | null | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: malleus/api/domain/protos/timings.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from malleus.api.domain.protos import timing_pb2 as malleus_dot_api_dot_domain_dot_protos_dot_timing__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='malleus/api/domain/protos/timings.proto',
package='malleus.api.domain',
syntax='proto3',
serialized_pb=_b('\n\'malleus/api/domain/protos/timings.proto\x12\x12malleus.api.domain\x1a&malleus/api/domain/protos/timing.proto\"6\n\x07Timings\x12+\n\x07timings\x18\x01 \x03(\x0b\x32\x1a.malleus.api.domain.Timingb\x06proto3')
,
dependencies=[malleus_dot_api_dot_domain_dot_protos_dot_timing__pb2.DESCRIPTOR,])
_TIMINGS = _descriptor.Descriptor(
name='Timings',
full_name='malleus.api.domain.Timings',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='timings', full_name='malleus.api.domain.Timings.timings', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=103,
serialized_end=157,
)
_TIMINGS.fields_by_name['timings'].message_type = malleus_dot_api_dot_domain_dot_protos_dot_timing__pb2._TIMING
DESCRIPTOR.message_types_by_name['Timings'] = _TIMINGS
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
Timings = _reflection.GeneratedProtocolMessageType('Timings', (_message.Message,), dict(
DESCRIPTOR = _TIMINGS,
__module__ = 'malleus.api.domain.protos.timings_pb2'
# @@protoc_insertion_point(class_scope:malleus.api.domain.Timings)
))
_sym_db.RegisterMessage(Timings)
# @@protoc_insertion_point(module_scope)
| 32.493151 | 231 | 0.778246 | 314 | 2,372 | 5.570064 | 0.343949 | 0.06175 | 0.100629 | 0.075472 | 0.246998 | 0.191538 | 0.133219 | 0.133219 | 0.085192 | 0.085192 | 0 | 0.023102 | 0.105818 | 2,372 | 72 | 232 | 32.944444 | 0.801509 | 0.102024 | 0 | 0.113208 | 1 | 0 | 0.09887 | 0.06403 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.132075 | 0 | 0.132075 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
fa483b3f10489cad64a57007f1b02b4ab7eddb87 | 445 | py | Python | python_scripts/countries_dates.py | tuxskar/elpythonista | ac0cd45e97dffcf6a40d1566fdee5b01380f535a | [
"MIT"
] | 2 | 2021-09-13T16:26:26.000Z | 2021-10-04T04:40:11.000Z | python_scripts/countries_dates.py | tuxskar/elpythonista | ac0cd45e97dffcf6a40d1566fdee5b01380f535a | [
"MIT"
] | null | null | null | python_scripts/countries_dates.py | tuxskar/elpythonista | ac0cd45e97dffcf6a40d1566fdee5b01380f535a | [
"MIT"
] | null | null | null | from datetime import datetime
import pytz
if __name__ == '__main__':
    places_tz = ['Asia/Tokyo', 'Europe/Madrid', 'America/Argentina/Buenos_Aires', 'US/Eastern', 'US/Pacific', 'UTC']
cities_name = ['Tokyo', 'Madrid', 'Buenos Aires', 'New York', 'California', 'UTC']
for place_tz, city_name in zip(places_tz, cities_name):
city_time = datetime.now(pytz.timezone(place_tz))
        print(f'Date in {city_name} - {city_time}')
| 44.5 | 116 | 0.678652 | 61 | 445 | 4.639344 | 0.606557 | 0.09894 | 0.084806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.159551 | 445 | 9 | 117 | 49.444444 | 0.756684 | 0 | 0 | 0 | 0 | 0 | 0.364045 | 0.067416 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
fa48f487c0f61d98b503010843e5e1dd19fce9a8 | 5,871 | py | Python | algebra_utilities/structures/semigroup.py | computational-group-the-golden-ticket/AlgebraUtilities | d5c7c2806b6bd394564ae4146a2c5164f4ebe882 | [
"MIT"
] | null | null | null | algebra_utilities/structures/semigroup.py | computational-group-the-golden-ticket/AlgebraUtilities | d5c7c2806b6bd394564ae4146a2c5164f4ebe882 | [
"MIT"
] | null | null | null | algebra_utilities/structures/semigroup.py | computational-group-the-golden-ticket/AlgebraUtilities | d5c7c2806b6bd394564ae4146a2c5164f4ebe882 | [
"MIT"
] | null | null | null | from algebra_utilities.objects.baseobjects import *
from algebra_utilities.structures.baseobjects import Printable
from algebra_utilities.utils.errors import UnexpectedTypeError
from algebra_utilities.utils.errors import NonAssociativeSetError
from algebra_utilities.utils.errors import ElementsOverflow
class SemiGroup(Printable):
"""
    This class represents a semigroup: a non-empty set S together with an
    associative binary multiplication operation (*): S x S -> S.

    Attributes
    ---------
    generators: list of the generating elements of the semigroup
    name: name of the semigroup
"""
def __init__(self, generators, name='G'):
super(SemiGroup, self).__init__()
        # name of the semigroup
self.name = name
        # every generator must inherit from SemiAlgebraicObject or from
        # AlgebraicObject, to guarantee that the basic operations are defined
for generator in generators:
if not isinstance(generator, SemiAlgebraicObject) and \
not isinstance(generator, AlgebraicObject):
                raise UnexpectedTypeError('The object has an invalid type in SemiGroup initialization')
        # list of the generators of the semigroup; this may coincide with
        # the list of all the elements of the semigroup
self.generators = generators
        # TODO: discuss whether to store the elements in a "list" or in a "set"
self.elements = self.generate_elements(generators)
        # a semigroup is a set with a binary operation that, in addition,
        # is associative
if not self.check_associativity():
raise NonAssociativeSetError('The operation defined is not associative')
def __len__(self):
return len(self.elements)
def generate_orbit(self, element):
"""
        This method iteratively generates all the powers (under the
        operation) of a given element until a power repeats
"""
orbit = []
pow_element = element
while pow_element not in orbit:
orbit.append(pow_element)
            # powers
pow_element *= element
return orbit
def remove_repeating_elements(self, elements):
"""
        This method removes repeated elements from the list passed as an
        argument
"""
dummy = []
for element in elements:
if element not in dummy:
dummy.append(element)
return dummy
def all_posible_multiplication(self, elements, limit=-1):
"""
        This method performs every possible multiplication between the
        elements passed as arguments and the ones generated along the way
"""
old_length = -1
current_length = len(elements)
        # the length of the list changes whenever new elements appear
while old_length != current_length:
            # TODO: this is a bottleneck and must be reimplemented
for i in range(current_length):
for j in range(current_length):
                    # the product is not always commutative
left_multiplication = elements[i] * elements[j]
right_multiplication = elements[j] * elements[i]
if left_multiplication not in elements:
elements.append(left_multiplication)
if right_multiplication not in elements:
elements.append(right_multiplication)
            # update the value of the length
current_length, old_length = len(elements), current_length
if limit > 0 and current_length > limit:
raise ElementsOverflow('Limit of allowed elements exceeded in the generation of elements')
return elements
def generate_elements(self, generators):
"""
        This method generates all the elements of the semigroup
"""
elements = []
        # generate the orbit of each generator
for generator in generators:
elements.extend(self.generate_orbit(generator))
        # remove repeated elements and perform all the possible
        # multiplications
elements = self.remove_repeating_elements(elements)
elements = self.all_posible_multiplication(elements)
return elements
def add_element(self, element):
"""
        This method adds a new generator to the semigroup and generates the
        new elements
"""
        # type check
        if not isinstance(element, SemiAlgebraicObject) and \
                not isinstance(element, AlgebraicObject):
            raise UnexpectedTypeError('The object has an invalid type in element aggregation')
if element not in self.elements:
self.elements.extend(self.generate_orbit(element))
self.elements = self.all_posible_multiplication(self.elements)
def check_associativity(self):
        # TODO: look for an algorithm of order lower than n^3
for a in self.elements:
for b in self.elements:
for c in self.elements:
if a * (b * c) != (a * b) * c:
return False
return True
def get_cayley_table(self, a=None, b=None):
"""
        This method prints all the possible multiplications
"""
if a is None:
a = self.elements
if b is None:
b = self.elements
for i in range(len(a)):
for j in range(len(b)):
c = a[i] * b[j]
print('%s * %s = %s' % (a[i], b[j], c))
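The orbit idea used by `generate_orbit` above can be seen in isolation with plain integers under multiplication modulo 10, so no algebraic-object classes are required (a standalone sketch, not part of the module):

```python
def orbit(element, mod):
    """Collect successive powers of `element` modulo `mod` until one repeats."""
    seen = []
    power = element
    while power not in seen:
        seen.append(power)
        power = (power * element) % mod
    return seen

# the powers of 2 modulo 10 cycle through 2, 4, 8, 6 and then repeat
assert orbit(2, 10) == [2, 4, 8, 6]
```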
| 35.79878 | 107 | 0.607392 | 645 | 5,871 | 5.44031 | 0.305426 | 0.044457 | 0.028498 | 0.021374 | 0.196922 | 0.131946 | 0.076945 | 0.076945 | 0.076945 | 0.076945 | 0 | 0.00102 | 0.332141 | 5,871 | 163 | 108 | 36.018405 | 0.893905 | 0.269119 | 0 | 0.102564 | 0 | 0 | 0.058959 | 0 | 0 | 0 | 0 | 0.055215 | 0 | 1 | 0.115385 | false | 0 | 0.064103 | 0.012821 | 0.282051 | 0.012821 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
fa4c5e2c67d5f8e3cdc1d31a4e00b2fd7c6a269b | 6,301 | py | Python | tests/unit/lib/logs/test_formatter.py | OscarVanL/aws-sam-cli | 13e02ae53ac3a4484acd74c944123921e27823f3 | [
"Apache-2.0"
] | null | null | null | tests/unit/lib/logs/test_formatter.py | OscarVanL/aws-sam-cli | 13e02ae53ac3a4484acd74c944123921e27823f3 | [
"Apache-2.0"
] | null | null | null | tests/unit/lib/logs/test_formatter.py | OscarVanL/aws-sam-cli | 13e02ae53ac3a4484acd74c944123921e27823f3 | [
"Apache-2.0"
] | null | null | null | import json
from unittest import TestCase
from mock import Mock, patch, call
from nose_parameterized import parameterized
from samcli.lib.logs.formatter import LogsFormatter, LambdaLogMsgFormatters, KeywordHighlighter, JSONMsgFormatter
from samcli.lib.logs.event import LogEvent
class TestLogsFormatter_pretty_print_event(TestCase):
def setUp(self):
self.colored_mock = Mock()
self.group_name = "group name"
self.stream_name = "stream name"
self.message = "message"
self.event_dict = {"timestamp": 1, "message": self.message, "logStreamName": self.stream_name}
def test_must_serialize_event(self):
colored_timestamp = "colored timestamp"
colored_stream_name = "colored stream name"
self.colored_mock.yellow.return_value = colored_timestamp
self.colored_mock.cyan.return_value = colored_stream_name
event = LogEvent(self.group_name, self.event_dict)
expected = " ".join([colored_stream_name, colored_timestamp, self.message])
result = LogsFormatter._pretty_print_event(event, self.colored_mock)
self.assertEquals(expected, result)
self.colored_mock.yellow.has_calls()
self.colored_mock.cyan.assert_called_with(self.stream_name)
def _passthru_formatter(event, colored):
return event
class TestLogsFormatter_do_format(TestCase):
def setUp(self):
self.colored_mock = Mock()
# Set formatter chain method to return the input unaltered.
self.chain_method1 = Mock(wraps=_passthru_formatter)
self.chain_method2 = Mock(wraps=_passthru_formatter)
self.chain_method3 = Mock(wraps=_passthru_formatter)
self.formatter_chain = [self.chain_method1, self.chain_method2, self.chain_method3]
@patch.object(LogsFormatter, "_pretty_print_event", wraps=_passthru_formatter)
def test_must_map_formatters_sequentially(self, pretty_print_mock):
events_iterable = [1, 2, 3]
expected_result = [1, 2, 3]
expected_call_order = [
call(1, colored=self.colored_mock),
call(2, colored=self.colored_mock),
call(3, colored=self.colored_mock),
]
formatter = LogsFormatter(self.colored_mock, self.formatter_chain)
result_iterable = formatter.do_format(events_iterable)
self.assertEquals(list(result_iterable), expected_result)
self.chain_method1.assert_has_calls(expected_call_order)
self.chain_method2.assert_has_calls(expected_call_order)
self.chain_method3.assert_has_calls(expected_call_order)
pretty_print_mock.assert_has_calls(expected_call_order) # Pretty Printer must always be called
@patch.object(LogsFormatter, "_pretty_print_event", wraps=_passthru_formatter)
def test_must_work_without_formatter_chain(self, pretty_print_mock):
events_iterable = [1, 2, 3]
expected_result = [1, 2, 3]
expected_call_order = [
call(1, colored=self.colored_mock),
call(2, colored=self.colored_mock),
call(3, colored=self.colored_mock),
]
# No formatter chain.
formatter = LogsFormatter(self.colored_mock)
result_iterable = formatter.do_format(events_iterable)
self.assertEquals(list(result_iterable), expected_result)
# Pretty Print is always called, even if there are no other formatters in the chain.
pretty_print_mock.assert_has_calls(expected_call_order)
self.chain_method1.assert_not_called()
self.chain_method2.assert_not_called()
self.chain_method3.assert_not_called()
class TestLambdaLogMsgFormatters_colorize_crashes(TestCase):
@parameterized.expand(
[
"Task timed out",
            "Something happened. Task timed out. Something else happened",
"Process exited before completing request",
]
)
def test_must_color_crash_messages(self, input_msg):
        color_result = "colored message"
colored = Mock()
colored.red.return_value = color_result
event = LogEvent("group_name", {"message": input_msg})
result = LambdaLogMsgFormatters.colorize_errors(event, colored)
self.assertEquals(result.message, color_result)
colored.red.assert_called_with(input_msg)
def test_must_ignore_other_messages(self):
colored = Mock()
event = LogEvent("group_name", {"message": "some msg"})
result = LambdaLogMsgFormatters.colorize_errors(event, colored)
self.assertEquals(result.message, "some msg")
colored.red.assert_not_called()
class TestKeywordHighlight_highlight_keyword(TestCase):
def test_must_highlight_all_keywords(self):
input_msg = "this keyword some keyword other keyword"
keyword = "keyword"
color_result = "colored"
expected_msg = "this colored some colored other colored"
colored = Mock()
colored.underline.return_value = color_result
event = LogEvent("group_name", {"message": input_msg})
result = KeywordHighlighter(keyword).highlight_keywords(event, colored)
self.assertEquals(result.message, expected_msg)
colored.underline.assert_called_with(keyword)
def test_must_ignore_if_keyword_is_absent(self):
colored = Mock()
input_msg = "this keyword some keyword other keyword"
event = LogEvent("group_name", {"message": input_msg})
result = KeywordHighlighter().highlight_keywords(event, colored)
self.assertEquals(result.message, input_msg)
colored.underline.assert_not_called()
class TestJSONMsgFormatter_format_json(TestCase):
def test_must_pretty_print_json(self):
data = {"a": "b"}
input_msg = '{"a": "b"}'
expected_msg = json.dumps(data, indent=2)
event = LogEvent("group_name", {"message": input_msg})
result = JSONMsgFormatter.format_json(event, None)
self.assertEquals(result.message, expected_msg)
@parameterized.expand(["this is not json", '{"not a valid json"}'])
def test_ignore_non_json(self, input_msg):
event = LogEvent("group_name", {"message": input_msg})
result = JSONMsgFormatter.format_json(event, None)
self.assertEquals(result.message, input_msg)
| 38.187879 | 113 | 0.702746 | 736 | 6,301 | 5.722826 | 0.179348 | 0.04962 | 0.060541 | 0.031339 | 0.476021 | 0.433048 | 0.407407 | 0.398623 | 0.311491 | 0.270893 | 0 | 0.006403 | 0.206793 | 6,301 | 164 | 114 | 38.420732 | 0.836335 | 0.031265 | 0 | 0.361345 | 0 | 0 | 0.091326 | 0 | 0 | 0 | 0 | 0 | 0.184874 | 1 | 0.10084 | false | 0.05042 | 0.05042 | 0.008403 | 0.201681 | 0.07563 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
fa4c6f431c760c50f8930890bae505b1e114beff | 3,921 | py | Python | openconnect-cli.py | tcpipuk/OpenConnect-FE | d7d213cedb16b5eb30c7e100848e7e3b53467dd3 | [
"Apache-2.0"
] | null | null | null | openconnect-cli.py | tcpipuk/OpenConnect-FE | d7d213cedb16b5eb30c7e100848e7e3b53467dd3 | [
"Apache-2.0"
] | null | null | null | openconnect-cli.py | tcpipuk/OpenConnect-FE | d7d213cedb16b5eb30c7e100848e7e3b53467dd3 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
import argparse, pexpect
from getpass import getpass
from time import sleep
# Set up argument parser
parser = argparse.ArgumentParser(prog='openconnect-cli', description='Automate logins to the OpenConnect SSL VPN client')
# Type of VPN to initiate
parser_type = parser.add_mutually_exclusive_group(required=False)
parser_type.add_argument('--anyconnect', action='store_true', default=False, help='Cisco AnyConnect SSL VPN')
parser_type.add_argument('--fortinet', action='store_true', default=False, help='Fortinet FortiClient SSL VPN')
parser_type.add_argument('--pulsesecure', action='store_true', default=False, help='Juniper Network Connect / Pulse Secure SSL VPN')
parser_type.add_argument('--paloalto', action='store_true', default=False, help='Palo Alto Networks (PAN) GlobalProtect SSL VPN')
# VPN server details
parser_dst = parser.add_argument_group('VPN Server Details', 'Any missing fields will be prompted on launch')
parser_dst.add_argument('--host', type=str, default=False, help='DNS hostname of SSL VPN server')
parser_dst.add_argument('--user', type=str, default=False, help='Username for SSL VPN account')
parser_dst.add_argument('--pw', type=str, default=False, help='Password for SSL VPN account')
# Import options, output help if none provided
args = vars(parser.parse_args())
#args = vars(parser.parse_args(args=None if sys.argv[1:] else ['--help']))
def vpnTypePrompt():
try:
print('Please enter one of the following and press enter:')
print('1 for Cisco AnyConnect')
print('2 for Fortinet FortiClient')
print('3 for Pulse Secure or Juniper Network Connect')
print('4 for Palo Alto Networks GlobalProtect')
protocol = int(input('SSL VPN Type: '))
if protocol == 1:
return 'anyconnect'
elif protocol == 2:
return 'fortinet'
elif protocol == 3:
return 'nc'
elif protocol == 4:
return 'gp'
else:
return False
except:
return False
if 'anyconnect' in args and args['anyconnect']:
args['protocol'] = 'anyconnect'
elif 'fortinet' in args and args['fortinet']:
args['protocol'] = 'fortinet'
elif 'pulsesecure' in args and args['pulsesecure']:
args['protocol'] = 'nc'
elif 'paloalto' in args and args['paloalto']:
args['protocol'] = 'gp'
else:
args['protocol'] = False
while args['protocol'] == False:
args['protocol'] = vpnTypePrompt()
# Fields to prompt for when False
prompt_for = {
'host': 'DNS hostname of SSL VPN server: ',
'user': 'Username for SSL VPN account: ',
'pw': 'Password for SSL VPN account: '
}
# Interate through fields and prompt for missing ones
if 'help' not in args:
for field,prompt in prompt_for.items():
if str(field) not in args or not args[field]:
while args[field] == False:
try:
if field == 'pw' and args['protocol'] != 'gp':
args[field] = 'N/A'
elif field == 'pw':
args[field] = getpass(prompt)
else:
args[field] = input(prompt)
except:
pass
# Collate arguments for command
command = [
'sudo openconnect',
'--interface=vpn0',
'--script=/usr/share/vpnc-scripts/vpnc-script',
'--protocol="' + args['protocol'] + '"',
'--user="' + args['user'] + '"',
args['host']
]
# Compile command
command = ' '.join(command)
# Start process
process = pexpect.spawnu('/bin/bash', ['-c', command])
# Automate login process for Palo Alto GlobalProtect
if args['protocol'] == 'gp':
process.expect('Password: ')
process.sendline(args['pw'])
process.expect('GATEWAY: ')
process.sendline('Primary GP')
process.expect('anything else to view:')
process.sendline('yes')
process.expect('Password: ')
process.sendline(args['pw'])
# Clear remaining private data
args = None
command = None
# Hand over input to user, wait for process to end if interactive mode ends
process.interact()
while process.isalive():
sleep(5)
| 33.228814 | 132 | 0.683499 | 522 | 3,921 | 5.078544 | 0.314176 | 0.02716 | 0.042248 | 0.031686 | 0.208978 | 0.146737 | 0.031686 | 0 | 0 | 0 | 0 | 0.003722 | 0.177761 | 3,921 | 117 | 133 | 33.512821 | 0.818548 | 0.128284 | 0 | 0.147727 | 0 | 0 | 0.339013 | 0.012926 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011364 | false | 0.079545 | 0.034091 | 0 | 0.113636 | 0.056818 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
fa4c79eb3a2bcc3c5bb9874b8d41fc99facf7f36 | 1,047 | py | Python | unittestdemo/unittestdemo01.py | caoyp2/PyProject01 | 1a7d022894908b400fa24da1bb74b47a87ec04e4 | [
"Apache-2.0"
] | null | null | null | unittestdemo/unittestdemo01.py | caoyp2/PyProject01 | 1a7d022894908b400fa24da1bb74b47a87ec04e4 | [
"Apache-2.0"
] | null | null | null | unittestdemo/unittestdemo01.py | caoyp2/PyProject01 | 1a7d022894908b400fa24da1bb74b47a87ec04e4 | [
"Apache-2.0"
] | null | null | null | import unittest
#4. Define the test class; its parent class is unittest.TestCase.
#   Methods of unittest.TestCase such as setUp and tearDown are inherited and
#   can be overridden in the subclass to replace the parent implementations.
#   The various assertion methods of unittest.TestCase are inherited as well.
class Test(unittest.TestCase):
    #5. Define setUp() for initialization work before each test case runs.
    #   Note: every method takes self as its first parameter, and variables
    #   shared between methods must be written as "self.variable".
    def setUp(self):
        print("Starting........")
        self.number = 10

    #6. Define test cases: methods whose names start with "test_".
    #   Note: each method takes self as its parameter.
    #   The assertion methods of unittest.TestCase can be used to check results.
    #   Multiple test cases may be defined; this is the most important part.
    def test_case1(self):
        self.assertEqual(10, 10)

    def test_case2(self):
        # self.number is the expected value, 20 is the actual value
        self.assertEqual(self.number, 20, msg="your input is not 20")

    @unittest.skip('Temporarily skip the test for case 3')
    def test_case3(self):
        self.assertEqual(self.number, 30, msg='Your input is not 30')

    #7. Define tearDown() for cleanup work after each test case runs.
    #   Note: the method takes self as its parameter.
    def tearDown(self):
        print("Finished........")

#8. If this file is run directly (__name__ equals __main__), run the statements
#   below; commonly used to check that the test script runs correctly.
if __name__ == '__main__':
    #8.1 Option one for running the test cases:
    #    unittest.main() searches this module for all test methods whose names
    #    start with "test" and runs them automatically.
    #    Execution follows name order: test_case1 runs before test_case2.
    unittest.main() | 26.175 | 67 | 0.726839 | 120 | 1,047 | 6.158333 | 0.591667 | 0.040595 | 0.051421 | 0.067659 | 0.046008 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032044 | 0.135626 | 1,047 | 40 | 68 | 26.175 | 0.78453 | 0.479465 | 0 | 0 | 0 | 0 | 0.147448 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 1 | 0.3125 | false | 0 | 0.0625 | 0 | 0.4375 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
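Besides `unittest.main()`, the demo above can be driven by loading a suite explicitly with `TestLoader`; a minimal standalone sketch (the `Demo` class here is illustrative and stands in for the `Test` class above):

```python
import unittest

class Demo(unittest.TestCase):
    """Stands in for the Test class above."""
    def test_case1(self):
        self.assertEqual(10, 10)

# Build a suite from the TestCase class and run it with a text runner.
suite = unittest.TestLoader().loadTestsFromTestCase(Demo)
result = unittest.TextTestRunner(verbosity=0).run(suite)
# result.wasSuccessful() is True and result.testsRun == 1
```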
fa4d864c5607687ad1cc9e70a90a7da7550acda4 | 351 | py | Python | pytools/unit.py | ry-shika/Geister-cpp-lib | d8630185e19fe06c5b2bf63ee0f37d665d7f357b | [
"BSL-1.0"
] | 8 | 2021-03-12T00:06:44.000Z | 2022-01-15T20:09:51.000Z | pytools/unit.py | ry-shika/Geister-cpp-lib | d8630185e19fe06c5b2bf63ee0f37d665d7f357b | [
"BSL-1.0"
] | 40 | 2019-06-19T04:54:55.000Z | 2020-10-25T17:58:31.000Z | pytools/unit.py | ry-shika/Geister-cpp-lib | d8630185e19fe06c5b2bf63ee0f37d665d7f357b | [
"BSL-1.0"
] | 3 | 2021-05-25T08:26:26.000Z | 2021-06-22T08:26:39.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
class Unit:
    def __init__(self, x, y, color, name):
        self.x = x
        self.y = y
        self.color = color
        self.name = name
        self.taken = False

class OpUnit(Unit):
    def __init__(self, x, y, color, name):
        super().__init__(x, y, color, name)
        self.blue = 0.0 | 21.9375 | 43 | 0.532764 | 51 | 351 | 3.431373 | 0.431373 | 0.085714 | 0.12 | 0.188571 | 0.405714 | 0.297143 | 0.297143 | 0.297143 | 0 | 0 | 0 | 0.0125 | 0.316239 | 351 | 16 | 44 | 21.9375 | 0.716667 | 0.119658 | 0 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
fa4e2e368a0acbcb5f65a50b8b60acca846de6d5 | 1,752 | py | Python | pull_scp/config.py | FNNDSC/pl-pull_scp | 99b310e5ad88f1afdb1114f14d8cf10bb9b3657d | [
"MIT"
] | null | null | null | pull_scp/config.py | FNNDSC/pl-pull_scp | 99b310e5ad88f1afdb1114f14d8cf10bb9b3657d | [
"MIT"
] | null | null | null | pull_scp/config.py | FNNDSC/pl-pull_scp | 99b310e5ad88f1afdb1114f14d8cf10bb9b3657d | [
"MIT"
] | null | null | null | """Remote host configuration."""
from os import getenv, path
from dotenv import load_dotenv
from .log import LOGGER
import pudb
# Load environment variables from .env
# Originally set to the "installation" directory of the app...
BASE_DIR = path.abspath(path.dirname(__file__))
# But using /tmp might just be easier.
BASE_DIR = '/tmp'
load_dotenv(path.join(BASE_DIR, ".env"))
# SSH Connection Variables
ENVIRONMENT = getenv("ENVIRONMENT")
SSH_REMOTE_HOST = getenv("SSH_REMOTE_HOST")
SSH_USERNAME = getenv("SSH_USERNAME")
SSH_PASSWORD = getenv("SSH_PASSWORD")
SSH_KEY_FILEPATH = getenv("SSH_KEY_FILEPATH")
SCP_DESTINATION_FOLDER = getenv("SCP_DESTINATION_FOLDER")
SSH_CONFIG_VALUES = [
    {"host": SSH_REMOTE_HOST},
    {"user": SSH_USERNAME},
    {"password": SSH_PASSWORD},
    {"ssh": SSH_KEY_FILEPATH},
    {"path": SCP_DESTINATION_FOLDER},
]
# Kerberos
KERBEROS_USER = getenv("KERBEROS_USER")
# Database config
DATABASE_HOSTS = [
    {"hdprod": getenv("DATABASE_HDPROD_URI")},
    {"sdprod": getenv("DATABASE_SDPROD_URI")},
    {"gamedata": getenv("DATABASE_GAMEDATA_URI")},
    {"gameentry": getenv("DATABASE_GAMEENTRY_URI")},
    {"boxfile": getenv("DATABASE_BOXFILE_URI")},
]
# EC2 instances in a devstack
DEVSTACK_BOXES = ["web", "api", "app", "state", ""]
# Verify all config values are present
for config in SSH_CONFIG_VALUES + DATABASE_HOSTS:  # check SSH and database settings alike
    if None in config.values():
        LOGGER.warning(f"Config value not set: {config.popitem()}")
        raise Exception("Please set your environment variables via a `.env` file.")
# Local file directory (no trailing slashes)
LOCAL_FILE_DIRECTORY = f"{BASE_DIR}"
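The validation loop above follows a simple pattern — scan a list of single-entry dicts for unset (None) values; a standalone sketch of that pattern (names here are illustrative, not part of this module):

```python
def missing_keys(configs):
    """Return the keys of any config entries whose value is unset (None)."""
    return [key for cfg in configs for key, value in cfg.items() if value is None]

print(missing_keys([{"host": "example.com"}, {"user": None}]))  # -> ['user']
```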
| 31.854545 | 83 | 0.67637 | 213 | 1,752 | 5.300469 | 0.413146 | 0.053144 | 0.034544 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000714 | 0.200342 | 1,752 | 54 | 84 | 32.444444 | 0.805139 | 0.182648 | 0 | 0 | 0 | 0 | 0.27433 | 0.045839 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.058824 | 0.117647 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
fa50b1c117a73b73cdb56047b0035eccafe1e85c | 343 | py | Python | project-a/labels.py | achon22/cs231nLung | 37a1d46a36f737a8d8461f7ef9237f818a74493f | [
"MIT"
] | 2 | 2018-05-06T12:45:03.000Z | 2019-03-31T07:01:41.000Z | project-a/labels.py | achon22/cs231nLung | 37a1d46a36f737a8d8461f7ef9237f818a74493f | [
"MIT"
] | null | null | null | project-a/labels.py | achon22/cs231nLung | 37a1d46a36f737a8d8461f7ef9237f818a74493f | [
"MIT"
] | 1 | 2018-05-06T12:56:05.000Z | 2018-05-06T12:56:05.000Z | #!/usr/bin/env python
def main():
    f = open('stage1_solution.csv')
    ones = 0
    zeros = 0
    total = 0
    for line in f:
        if line[:3] == 'id,':
            continue
        line = line.strip().split(',')
        label = int(line[1])
        if label == 1:
            ones += 1
        total += 1
    zeros = total - ones
    print float(zeros)/total
    f.close()

if __name__ == '__main__':
    main() | 16.333333 | 32 | 0.586006 | 54 | 343 | 3.555556 | 0.574074 | 0.104167 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033962 | 0.227405 | 343 | 21 | 33 | 16.333333 | 0.690566 | 0.058309 | 0 | 0 | 0 | 0 | 0.095975 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
fa601fe1d2a43b58da3a5908f30d4dd1e67e4207 | 2,507 | py | Python | pyethereum/config.py | CJentzsch/pyethereum | 35c67e28ea1279b63ac40f3a987a741d6d994022 | [
"MIT"
] | null | null | null | pyethereum/config.py | CJentzsch/pyethereum | 35c67e28ea1279b63ac40f3a987a741d6d994022 | [
"MIT"
] | null | null | null | pyethereum/config.py | CJentzsch/pyethereum | 35c67e28ea1279b63ac40f3a987a741d6d994022 | [
"MIT"
] | 2 | 2020-09-09T20:01:12.000Z | 2021-09-01T15:47:10.000Z |
import os
import uuid
import StringIO
import ConfigParser
from pyethereum.utils import data_dir
from pyethereum.packeter import Packeter
from pyethereum.utils import sha3
def default_data_dir():
    data_dir._set_default()
    return data_dir.path

def default_config_path():
    return os.path.join(default_data_dir(), 'config.txt')

def default_client_version():
    return Packeter.CLIENT_VERSION  # FIXME

def default_node_id():
    return sha3(str(uuid.uuid1())).encode('hex')
config_template = \
"""
# NETWORK OPTIONS ###########
[network]
# Connect to remote host/port
# poc-6.ethdev.com:30300
remote_host = 207.12.89.101
remote_port = 30303
# Listen on the given host/port for incoming connections
listen_host = 0.0.0.0
listen_port = 30303
# Number of peer to connections to establish
num_peers = 10
# unique id of this node
node_id = {0}
# API OPTIONS ###########
[api]
# Serve the restful json api on the given host/port
listen_host = 0.0.0.0
listen_port = 30203
# path to server the api at
api_path = /api/v02a
# MISC OIPTIONS #########
[misc]
# Load database from path
data_dir = {1}
# percent cpu devoted to mining 0=off
mining = 30
# how verbose should the client be (1-3)
verbosity = 3
# set log level and filters (WARM, INFO, DEBUG)
# examples:
# get every log message from every module
# :DEBUG
# get every warning from every module
# :WARN
# get every message from module chainmanager and all warnings
# pyethereum.chainmanager:DEBUG,:WARN
logging = pyethereum.chainmanager:DEBUG,pyethereum.synchronizer:DEBUG,:INFO
# WALLET OPTIONS ##################
[wallet]
# Set the coinbase (mining payout) address
coinbase = 6c386a4b26f73c802f34673f7248bb118f97424a
""".format(default_node_id(), default_data_dir())
def get_default_config():
    f = StringIO.StringIO()
    f.write(config_template)
    f.seek(0)
    config = ConfigParser.ConfigParser()
    config.readfp(f)
    config.set('network', 'client_version', default_client_version())
    return config

def read_config(cfg_path=default_config_path()):
    # create default if not existent
    if not os.path.exists(cfg_path):
        open(cfg_path, 'w').write(config_template)
    # extend on the default config
    config = get_default_config()
    config.read(cfg_path)
    return config

def dump_config(config):
    r = ['']
    for section in config.sections():
        for a, v in config.items(section):
            r.append('[%s] %s = %r' % (section, a, v))
    return '\n'.join(r)
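`dump_config` above flattens a parsed config into `[section] key = value` lines; the same idea in Python 3 spelling (`configparser` instead of the Python 2 `ConfigParser` used by this module), as a standalone sketch:

```python
import configparser

def dump(cfg):
    # Flatten every section into "[section] key = value" lines.
    lines = []
    for section in cfg.sections():
        for key, value in cfg.items(section):
            lines.append('[%s] %s = %r' % (section, key, value))
    return '\n'.join(lines)

cfg = configparser.ConfigParser()
cfg.read_string("[network]\nremote_port = 30303\n")
print(dump(cfg))  # -> [network] remote_port = '30303'
```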
| 21.991228 | 75 | 0.70004 | 354 | 2,507 | 4.822034 | 0.392655 | 0.028705 | 0.00703 | 0.029291 | 0.049209 | 0.02812 | 0.02812 | 0.02812 | 0 | 0 | 0 | 0.040956 | 0.181891 | 2,507 | 113 | 76 | 22.185841 | 0.791321 | 0.025927 | 0 | 0.052632 | 0 | 0 | 0.040197 | 0 | 0 | 0 | 0 | 0.00885 | 0 | 1 | 0.184211 | false | 0 | 0.184211 | 0.078947 | 0.552632 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
fa6881de69c9929008a699b6ce522f86648d2221 | 403 | py | Python | input/migrations/0016_auto_20190524_0055.py | JianaXu/productivity | f7478ce2f154538185f071d58a0e4a2cc0b17ee8 | [
"bzip2-1.0.6"
] | null | null | null | input/migrations/0016_auto_20190524_0055.py | JianaXu/productivity | f7478ce2f154538185f071d58a0e4a2cc0b17ee8 | [
"bzip2-1.0.6"
] | 10 | 2019-05-15T06:25:36.000Z | 2022-02-10T08:46:38.000Z | input/migrations/0016_auto_20190524_0055.py | JianaXu/productivity | f7478ce2f154538185f071d58a0e4a2cc0b17ee8 | [
"bzip2-1.0.6"
] | 1 | 2019-05-21T03:01:48.000Z | 2019-05-21T03:01:48.000Z | # Generated by Django 2.1.1 on 2019-05-24 00:55
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ('input', '0015_auto_20190524_0052'),
    ]

    operations = [
        migrations.AlterField(
            model_name='post',
            name='ctime',
            field=models.DateTimeField(auto_now=True),
        ),
    ]
| 21.210526 | 55 | 0.57072 | 42 | 403 | 5.357143 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113139 | 0.320099 | 403 | 18 | 56 | 22.388889 | 0.708029 | 0.111663 | 0 | 0 | 1 | 0 | 0.109467 | 0.068047 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
3af8f1e275cd19adaa5bb2394843a71b553551e5 | 758 | py | Python | py/py_0685_inverse_digit_sum_ii.py | lcsm29/project-euler | fab794ece5aa7a11fc7c2177f26250f40a5b1447 | [
"MIT"
] | null | null | null | py/py_0685_inverse_digit_sum_ii.py | lcsm29/project-euler | fab794ece5aa7a11fc7c2177f26250f40a5b1447 | [
"MIT"
] | null | null | null | py/py_0685_inverse_digit_sum_ii.py | lcsm29/project-euler | fab794ece5aa7a11fc7c2177f26250f40a5b1447 | [
"MIT"
] | null | null | null | # Solution of;
# Project Euler Problem 685: Inverse Digit Sum II
# https://projecteuler.net/problem=685
#
# Writing down the numbers which have a digit sum of 10 in ascending order, we
# get:$19, 28, 37, 46,55,64,73,82,91,109, 118,\dots$Let $f(n,m)$ be the
# $m^{\text{th}}$ occurrence of the digit sum $n$. For example, $f(10,1)=19$,
# $f(10,10)=109$ and $f(10,100)=1423$. Let $\displaystyle S(k)=\sum_{n=1}^k
# f(n^3,n^4)$. For example $S(3)=7128$ and $S(10)\equiv 32287064 \mod
# 1\,000\,000\,007$. Find $S(10\,000)$ modulo $1\,000\,000\,007$.
#
# by lcsm29 http://github.com/lcsm29/project-euler
import timed
def dummy(n):
    pass
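The sequence f(n, m) described in the header comment — the m-th number whose digit sum is n — can be sanity-checked by brute force for small inputs (sketch only; the full problem needs a counting approach):

```python
def digit_sum_f(n, m):
    """Return the m-th positive integer whose digit sum equals n (brute force)."""
    count = 0
    k = 0
    while True:
        k += 1
        if sum(int(d) for d in str(k)) == n:
            count += 1
            if count == m:
                return k

# Matches the examples in the header: f(10, 1) = 19, f(10, 10) = 109.
```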
if __name__ == '__main__':
    n = 1000
    i = 10000
    prob_id = 685
    timed.caller(dummy, n, i, prob_id)
| 30.32 | 79 | 0.633245 | 141 | 758 | 3.326241 | 0.595745 | 0.051173 | 0.029851 | 0.042644 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178628 | 0.172823 | 758 | 24 | 80 | 31.583333 | 0.569378 | 0.76781 | 0 | 0 | 0 | 0 | 0.04908 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.125 | 0.125 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
3afa222274ac5c4cc6f82d844f11e23c6709df36 | 2,196 | py | Python | test_scripts/json_over_tcp.py | luozh05/Doc | 3471a9c8578d6530abf5c3ee3100488b0dc1e447 | [
"Apache-2.0"
] | null | null | null | test_scripts/json_over_tcp.py | luozh05/Doc | 3471a9c8578d6530abf5c3ee3100488b0dc1e447 | [
"Apache-2.0"
] | null | null | null | test_scripts/json_over_tcp.py | luozh05/Doc | 3471a9c8578d6530abf5c3ee3100488b0dc1e447 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
import socket
import sys
import json
#HOST="localhost"
HOST = "172.20.1.11"
PORT = 37568
def send_and_receive_msg(msg):
    # Create a socket (SOCK_STREAM means a TCP socket)
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Connect host:port
    sock.connect((HOST, PORT))
    # Connect to server and send data
    sock.send(msg)
    # Receive data from the server and shut down
    recv_message = []
    time = 100
    while time > 0:
        time = time - 1
        d = sock.recv(1024)
        if d:
            recv_message.append(d)
        else:
            break
    sock.close()
    print "\n"
    print "Sent: {}".format(msg)
    print "Received: {}".format(recv_message)
    json_recv = json.loads(recv_message[0])
    for key, value in json_recv.items():
        if cmp(json_recv[key], "succeed") == 0:
            return True
        elif cmp(json_recv[key], "failed") == 0:
            return False

if __name__ == '__main__':
    # test set cases: right input, case 0
    cmd = {}
    cmd["cmd-set-parameters"] = {}
    cmd["cmd-set-parameters"]["evt-version"] = "evt-version-2.0"
    cmd["cmd-set-parameters"]["camera-name"] = "remote-camera-avm-synthesis"
    cmd_string = json.dumps(cmd)
    result = send_and_receive_msg(cmd_string)
    if result == False:
        print "case 0 failed!"
        exit()
    print "case 0 succeed!"

    # test get cases: right input, case 1
    cmd = {}
    cmd["cmd-get-parameters"] = {}
    cmd["cmd-get-parameters"]["camera-name"] = "remote-camera-avm-synthesis"
    cmd_string = json.dumps(cmd)
    result = send_and_receive_msg(cmd_string)
    if result == False:
        print "case 1 failed!"
        exit()
    print "case 1 succeed!"

    # test set cases: right input, case 2
    cmd = {}
    cmd["cmd-set-parameters"] = {}
    cmd["cmd-set-parameters"]["preview-format"] = "preview-format-h264"
    cmd["cmd-set-parameters"]["preview-size"] = "1280x800"
    cmd["cmd-set-parameters"]["preview-fps-values"] = "30"
    cmd_string = json.dumps(cmd)
    result = send_and_receive_msg(cmd_string)
    if result == False:
        print "case 2 failed!"
        exit()
    print "case 2 succeed!"
| 26.780488 | 76 | 0.607013 | 299 | 2,196 | 4.334448 | 0.324415 | 0.055556 | 0.048611 | 0.102623 | 0.421296 | 0.375772 | 0.35571 | 0.304784 | 0.304784 | 0.241512 | 0 | 0.029179 | 0.250911 | 2,196 | 81 | 77 | 27.111111 | 0.758663 | 0.127505 | 0 | 0.288136 | 0 | 0 | 0.252753 | 0.028317 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.050847 | null | null | 0.152542 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
3afc9808039be0c4906a8ad82c7a46e1f069e435 | 5,628 | py | Python | test_api.py | atifar/mini-key-value | 0efe621260aa0a1aaa22ed721d05b4173b4171dd | [
"MIT"
] | null | null | null | test_api.py | atifar/mini-key-value | 0efe621260aa0a1aaa22ed721d05b4173b4171dd | [
"MIT"
] | null | null | null | test_api.py | atifar/mini-key-value | 0efe621260aa0a1aaa22ed721d05b4173b4171dd | [
"MIT"
] | null | null | null | def test_check_sanity(client):
    resp = client.get('/sanity')
    assert resp.status_code == 200
    assert 'Sanity check passed.' == resp.data.decode()


# 'list collections' tests
def test_get_api_root(client):
    resp = client.get('/', content_type='application/json')
    assert resp.status_code == 200
    resp_data = resp.get_json()
    assert 'keys_url' in resp_data
    assert len(resp_data) == 1
    assert resp_data['keys_url'][-5:] == '/keys'


def test_delete_api_root_not_allowed(client):
    resp = client.delete('/', content_type='application/json')
    assert resp.status_code == 405


# 'list keys' tests
def test_get_empty_keys_list(client):
    resp = client.get('/keys', content_type='application/json')
    assert resp.status_code == 200
    resp_data = resp.get_json()
    assert len(resp_data) == 0


def test_get_nonempty_keys_list(client, keys, add_to_keys):
    add_to_keys({'key': 'babboon', 'value': 'Larry'})
    add_to_keys({'key': 'bees', 'value': ['Ann', 'Joe', 'Dee']})
    resp = client.get('/keys', content_type='application/json')
    assert resp.status_code == 200
    resp_data = resp.get_json()
    assert isinstance(resp_data, list)
    assert len(resp_data) == 2
    for doc_idx in (0, 1):
        for k in ('key', 'http_url'):
            assert k in resp_data[doc_idx]
        if resp_data[doc_idx]['key'] == 'babboon':
            assert resp_data[doc_idx]['http_url'][-13:] == '/keys/babboon'
        else:
            assert resp_data[doc_idx]['http_url'][-10:] == '/keys/bees'


def test_delete_on_keys_not_allowed(client):
    resp = client.delete('/keys', content_type='application/json')
    assert resp.status_code == 405


# 'get a key' tests
def test_get_existing_key(client, keys, add_to_keys):
    add_to_keys({'key': 'babboon', 'value': 'Larry'})
    add_to_keys({'key': 'bees', 'value': ['Ann', 'Joe', 'Dee']})
    resp = client.get('/keys/bees', content_type='application/json')
    assert resp.status_code == 200
    resp_data = resp.get_json()
    assert isinstance(resp_data, dict)
    for k in ('key', 'http_url', 'value'):
        assert k in resp_data
    assert resp_data['key'] == 'bees'
    assert resp_data['http_url'][-10:] == '/keys/bees'
    assert resp_data['value'] == ['Ann', 'Joe', 'Dee']


def test_get_nonexisting_key(client, keys):
    resp = client.get('/keys/bees', content_type='application/json')
    assert resp.status_code == 404


def test_post_on_a_key_not_allowed(client):
    resp = client.post('/keys/bees', content_type='application/json')
    assert resp.status_code == 405


# 'create a key' tests
def test_create_new_key(client, keys):
    new_doc = {'key': 'oscillator', 'value': 'Colpitts'}
    resp = client.post(
        '/keys',
        json=new_doc,
        content_type='application/json'
    )
    assert resp.status_code == 201
    resp_data = resp.get_json()
    assert isinstance(resp_data, dict)
    for k in ('key', 'http_url', 'value'):
        assert k in resp_data
    assert resp_data['key'] == new_doc['key']
    assert resp_data['value'] == new_doc['value']
    assert resp_data['http_url'][-16:] == '/keys/oscillator'


def test_create_duplicate_key(client, keys, add_to_keys):
    new_doc = {'key': 'oscillator', 'value': 'Colpitts'}
    add_to_keys(new_doc.copy())
    resp = client.post(
        '/keys',
        json=new_doc,
        content_type='application/json'
    )
    assert resp.status_code == 400
    resp_data = resp.get_json()
    assert 'error' in resp_data
    assert resp_data['error'] == "Can't create duplicate key (oscillator)."


def test_create_new_key_missing_key(client, keys):
    new_doc = {'value': 'Colpitts'}
    resp = client.post(
        '/keys',
        json=new_doc,
        content_type='application/json'
    )
    assert resp.status_code == 400
    resp_data = resp.get_json()
    assert 'error' in resp_data
    assert resp_data['error'] == 'Please provide the missing "key" parameter!'


def test_create_new_key_missing_value(client, keys):
    new_doc = {'key': 'oscillator'}
    resp = client.post(
        '/keys',
        json=new_doc,
        content_type='application/json'
    )
    assert resp.status_code == 400
    resp_data = resp.get_json()
    assert 'error' in resp_data
    assert resp_data['error'] == 'Please provide the missing "value" ' \
        'parameter!'


# 'update a key' tests
def test_update_a_key_existing(client, keys, add_to_keys):
    add_to_keys({'key': 'oscillator', 'value': 'Colpitts'})
    update_value = {'value': ['Pierce', 'Hartley']}
    resp = client.put(
        '/keys/oscillator',
        json=update_value,
        content_type='application/json'
    )
    assert resp.status_code == 204


def test_update_a_key_nonexisting(client, keys, add_to_keys):
    add_to_keys({'key': 'oscillator', 'value': 'Colpitts'})
    update_value = {'value': ['Pierce', 'Hartley']}
    resp = client.put(
        '/keys/gadget',
        json=update_value,
        content_type='application/json'
    )
    assert resp.status_code == 400
    resp_data = resp.get_json()
    assert 'error' in resp_data
    assert resp_data['error'] == 'Update failed.'


# 'delete a key' tests
def test_delete_a_key_existing(client, keys, add_to_keys):
    add_to_keys({'key': 'oscillator', 'value': 'Colpitts'})
    resp = client.delete(
        '/keys/oscillator',
        content_type='application/json'
    )
    assert resp.status_code == 204


def test_delete_a_key_nonexisting(client, keys):
    resp = client.delete(
        '/keys/oscillator',
        content_type='application/json'
    )
    assert resp.status_code == 404
| 31.266667 | 78 | 0.643923 | 762 | 5,628 | 4.489501 | 0.115486 | 0.088863 | 0.079509 | 0.099386 | 0.797135 | 0.708857 | 0.653902 | 0.620871 | 0.620871 | 0.588424 | 0 | 0.014567 | 0.207178 | 5,628 | 179 | 79 | 31.441341 | 0.752129 | 0.021855 | 0 | 0.568345 | 0 | 0 | 0.192434 | 0 | 0 | 0 | 0 | 0 | 0.323741 | 1 | 0.122302 | false | 0.007194 | 0 | 0 | 0.122302 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d70045662563ef27515bc31c292104985de68053 | 2,733 | py | Python | top_book/book/views.py | mfarjami/DRF-top | 914ed9068ed73d7d8a692c8ed2e77cba96cee5b1 | [
"MIT"
] | null | null | null | top_book/book/views.py | mfarjami/DRF-top | 914ed9068ed73d7d8a692c8ed2e77cba96cee5b1 | [
"MIT"
] | null | null | null | top_book/book/views.py | mfarjami/DRF-top | 914ed9068ed73d7d8a692c8ed2e77cba96cee5b1 | [
"MIT"
] | null | null | null | from rest_framework.decorators import api_view
from rest_framework.views import APIView
from rest_framework import status
from rest_framework.response import Response
from .models import Book
from .serializers import BookSerializer
# Create your views here.
class GetAllData(APIView):
    def get(self, request):
        query = Book.objects.all().order_by('-create_at')
        serializers = BookSerializer(query, many=True)
        return Response(serializers.data, status=status.HTTP_200_OK)


@api_view(['GET'])
def allApi(request):
    if request.method == 'GET':
        query = Book.objects.all().order_by('-create_at')
        serializer = BookSerializer(query, many=True)
        return Response(serializer.data, status=status.HTTP_200_OK)


@api_view(['POST'])
def SetData(request):
    if request.method == 'POST':
        serializer = BookSerializer(data=request.data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data, status=status.HTTP_201_CREATED)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)


class GetFavData(APIView):
    def get(self, request):
        query = Book.objects.filter(fav=True)
        serializer = BookSerializer(query, many=True)
        return Response(serializer.data, status=status.HTTP_200_OK)


class UpdateFavData(APIView):
    def get(self, request, pk):
        query = Book.objects.get(pk=pk)
        serializer = BookSerializer(query)
        return Response(serializer.data, status=status.HTTP_200_OK)

    def put(self, request, pk):
        query = Book.objects.get(pk=pk)
        serializer = BookSerializer(query, data=request.data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data, status=status.HTTP_201_CREATED)
        else:
            return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)


class PostModelData(APIView):
    def post(self, request):
        serializer = BookSerializer(data=request.data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data, status=status.HTTP_201_CREATED)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)


class SearchData(APIView):
    def get(self, request):
        search = request.GET['name']
        query = Book.objects.filter(store_name__contains=search)
        serializer = BookSerializer(query, many=True)
        return Response(serializer.data, status=status.HTTP_200_OK)


class DeleteData(APIView):
    def delete(self, request, pk):
        query = Book.objects.get(pk=pk)
        query.delete()
        return Response(status=status.HTTP_204_NO_CONTENT) | 35.960526 | 82 | 0.691914 | 327 | 2,733 | 5.64526 | 0.211009 | 0.091008 | 0.104009 | 0.086674 | 0.688516 | 0.668472 | 0.646262 | 0.646262 | 0.553629 | 0.507584 | 0 | 0.016582 | 0.205635 | 2,733 | 76 | 83 | 35.960526 | 0.833717 | 0.008416 | 0 | 0.459016 | 0 | 0 | 0.014027 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.147541 | false | 0 | 0.098361 | 0 | 0.540984 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d7051e18592cfcaabdc61f04c12bac4c5a8c6910 | 423 | py | Python | prophet-gpl/tools/return_counter.py | jyi/ITSP | 0553f683f99403efb5ef440af826c1d229a52376 | [
"MIT"
] | 18 | 2017-06-14T07:55:45.000Z | 2022-03-24T09:32:43.000Z | prophet-gpl/tools/return_counter.py | jyi/ITSP | 0553f683f99403efb5ef440af826c1d229a52376 | [
"MIT"
] | 2 | 2017-07-25T13:44:39.000Z | 2018-03-16T06:43:40.000Z | prophet-gpl/tools/return_counter.py | jyi/ITSP | 0553f683f99403efb5ef440af826c1d229a52376 | [
"MIT"
] | 5 | 2017-07-29T19:09:37.000Z | 2021-04-10T16:39:48.000Z | #!/usr/bin/env python
f = open("repair.log", "r")
lines = f.readlines()
cnt = 0
for line in lines:
    tokens = line.strip().split()
    if (len(tokens) > 3):
        if (tokens[0] == "Total") and (tokens[1] == "return"):
            cnt += int(tokens[3])
        if (tokens[0] == "Total") and (tokens[2] == "different") and (tokens[3] == "repair"):
            cnt += int(tokens[1])
print "Total size: " + str(cnt)
| 32.538462 | 93 | 0.520095 | 59 | 423 | 3.728814 | 0.542373 | 0.095455 | 0.081818 | 0.136364 | 0.272727 | 0.272727 | 0.272727 | 0.272727 | 0 | 0 | 0 | 0.028125 | 0.243499 | 423 | 12 | 94 | 35.25 | 0.659375 | 0.047281 | 0 | 0 | 0 | 0 | 0.134328 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d70bfa071ce979b042b3019dc6f0957d42a9409f | 808 | py | Python | src/genome/visibles/sphere_gene.py | stu-smith/blender-evolution | 644b92d4dc30e82708b2e78da4f274c9ab142704 | [
"MIT"
] | null | null | null | src/genome/visibles/sphere_gene.py | stu-smith/blender-evolution | 644b92d4dc30e82708b2e78da4f274c9ab142704 | [
"MIT"
] | null | null | null | src/genome/visibles/sphere_gene.py | stu-smith/blender-evolution | 644b92d4dc30e82708b2e78da4f274c9ab142704 | [
"MIT"
] | null | null | null | from ..gene import Gene
from ..scalar_gene_property import ScalarGeneProperty
from ..color_gene_property import ColorGeneProperty
from ...visible_objects.sphere import Sphere
class SphereGene(Gene):
    def __init__(self):
        self._size_property = ScalarGeneProperty(min=0.1, max=20, value=2)
        self._color_property = ColorGeneProperty(hsv=[0.0, 0.0, 1.0])

    def all_properties(self):
        return [
            self._size_property,
            self._color_property
        ]

    def express(self, genome_expression):
        location = [0, 0, 0]  # TODO genome_expression stack position
        size = self._size_property.value
        hsv = self._color_property.value
        sphere = Sphere(location=location, size=size, hsv=hsv)
        genome_expression.add_visible_object(sphere)
| 29.925926 | 74 | 0.689356 | 99 | 808 | 5.353535 | 0.373737 | 0.018868 | 0.090566 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022222 | 0.220297 | 808 | 26 | 75 | 31.076923 | 0.819048 | 0.045792 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038462 | 0 | 1 | 0.157895 | false | 0 | 0.210526 | 0.052632 | 0.473684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d7141ae014ac95863434646bbd6943b07b157f83 | 216 | py | Python | Ex-15.py | gilmartins83/Guanabara-Python | 43128c35fcd601db1f72c80a9c76f4b4f4085c7f | [
"MIT"
] | null | null | null | Ex-15.py | gilmartins83/Guanabara-Python | 43128c35fcd601db1f72c80a9c76f4b4f4085c7f | [
"MIT"
] | null | null | null | Ex-15.py | gilmartins83/Guanabara-Python | 43128c35fcd601db1f72c80a9c76f4b4f4085c7f | [
"MIT"
] | null | null | null | dias = int(input("quantos dias voce deseja alugar o carro? "))
km = float(input("Quantos kilometros você andou? "))
pago = dias * 60 + (km * 0.15)
print("o valor do aluguel do carro foi de R$ {:.2f}" .format(pago))
| 36 | 67 | 0.662037 | 36 | 216 | 3.972222 | 0.75 | 0.167832 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033708 | 0.175926 | 216 | 5 | 68 | 43.2 | 0.769663 | 0 | 0 | 0 | 0 | 0 | 0.539535 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d720f208a3df6385750ad1742504776a36069d68 | 6,151 | py | Python | Latest/venv/Lib/site-packages/apptools/permissions/default/user_manager.py | adamcvj/SatelliteTracker | 49a8f26804422fdad6f330a5548e9f283d84a55d | [
"Apache-2.0"
] | 1 | 2022-01-09T20:04:31.000Z | 2022-01-09T20:04:31.000Z | Latest/venv/Lib/site-packages/apptools/permissions/default/user_manager.py | adamcvj/SatelliteTracker | 49a8f26804422fdad6f330a5548e9f283d84a55d | [
"Apache-2.0"
] | 1 | 2022-02-15T12:01:57.000Z | 2022-03-24T19:48:47.000Z | Latest/venv/Lib/site-packages/apptools/permissions/default/user_manager.py | adamcvj/SatelliteTracker | 49a8f26804422fdad6f330a5548e9f283d84a55d | [
"Apache-2.0"
] | null | null | null | #------------------------------------------------------------------------------
# Copyright (c) 2008, Riverbank Computing Limited
# All rights reserved.
#
# This software is provided without warranty under the terms of the BSD
# license included in enthought/LICENSE.txt and may be redistributed only
# under the conditions described in the aforementioned license. The license
# is also available online at http://www.enthought.com/licenses/BSD.txt
# Thanks for using Enthought open source!
#
# Author: Riverbank Computing Limited
# Description: <Enthought permissions package component>
#------------------------------------------------------------------------------
# Enthought library imports.
from pyface.action.api import Action
from traits.api import Bool, Event, HasTraits, provides, \
    Instance, List, Unicode
# Local imports.
from apptools.permissions.i_user import IUser
from apptools.permissions.i_user_manager import IUserManager
from apptools.permissions.package_globals import get_permissions_manager
from apptools.permissions.permission import ManageUsersPermission
from .i_user_database import IUserDatabase
@provides(IUserManager)
class UserManager(HasTraits):
    """The default user manager implementation."""

    #### 'IUserManager' interface #############################################

    management_actions = List(Instance(Action))

    user = Instance(IUser)

    user_actions = List(Instance(Action))

    user_authenticated = Event(IUser)

    #### 'UserManager' interface ##############################################

    # The user database.
    user_db = Instance(IUserDatabase)

    ###########################################################################
    # 'IUserManager' interface.
    ###########################################################################

    def bootstrapping(self):
        """Return True if we are bootstrapping, i.e. no users have been
        defined.
        """

        return self.user_db.bootstrapping()

    def authenticate_user(self):
        """Authenticate the user."""

        if self.user_db.authenticate_user(self.user):
            self.user.authenticated = True

            # Tell the policy manager before everybody else.
            get_permissions_manager().policy_manager.load_policy(self.user)

            self.user_authenticated = self.user

    def unauthenticate_user(self):
        """Unauthenticate the user."""

        if self.user.authenticated and self.user_db.unauthenticate_user(self.user):
            self.user.authenticated = False

            # Tell the policy manager before everybody else.
            get_permissions_manager().policy_manager.load_policy(None)

            self.user_authenticated = None

    def matching_user(self, name):
        """Select a user."""

        return self.user_db.matching_user(name)
    ###########################################################################
    # Trait handlers.
    ###########################################################################

    def _management_actions_default(self):
        """Return the list of management actions."""

        from apptools.permissions.secure_proxy import SecureProxy

        user_db = self.user_db
        actions = []
        perm = ManageUsersPermission()

        if user_db.can_add_user:
            act = Action(name="&Add a User...", on_perform=user_db.add_user)
            actions.append(SecureProxy(act, permissions=[perm], show=False))

        if user_db.can_modify_user:
            act = Action(name="&Modify a User...",
                    on_perform=user_db.modify_user)
            actions.append(SecureProxy(act, permissions=[perm], show=False))

        if user_db.can_delete_user:
            act = Action(name="&Delete a User...",
                    on_perform=user_db.delete_user)
            actions.append(SecureProxy(act, permissions=[perm], show=False))

        return actions

    def _user_actions_default(self):
        """Return the list of user actions."""

        actions = []

        if self.user_db.can_change_password:
            actions.append(_ChangePasswordAction())

        return actions

    def _user_default(self):
        """Return the default current user."""

        return self.user_db.user_factory()

    def _user_db_default(self):
        """Return the default user database."""

        # Defer to an external user database if there is one.
        try:
            from apptools.permissions.external.user_database import UserDatabase
        except ImportError:
            from apptools.permissions.default.user_database import UserDatabase

        return UserDatabase()

class _ChangePasswordAction(Action):
    """An action that allows the current user to change their password.  It
    isn't exported through actions/api.py because it is specific to this user
    manager implementation."""

    #### 'Action' interface ###################################################

    enabled = Bool(False)

    name = Unicode("&Change Password...")

    ###########################################################################
    # 'object' interface.
    ###########################################################################

    def __init__(self, **traits):
        """Initialise the object."""

        super(_ChangePasswordAction, self).__init__(**traits)

        get_permissions_manager().user_manager.on_trait_event(
                self._refresh_enabled, 'user_authenticated')

    ###########################################################################
    # 'Action' interface.
    ###########################################################################

    def perform(self, event):
        """Perform the action."""

        um = get_permissions_manager().user_manager
        um.user_db.change_password(um.user)

    ###########################################################################
    # Private interface.
    ###########################################################################

    def _refresh_enabled(self, user):
        """Invoked whenever the current user's authorisation state changes."""

        self.enabled = user is not None
| 33.612022 | 106 | 0.558121 | 583 | 6,151 | 5.722127 | 0.295026 | 0.030576 | 0.048261 | 0.023981 | 0.263789 | 0.163369 | 0.1256 | 0.105815 | 0.105815 | 0.089329 | 0 | 0.000809 | 0.196228 | 6,151 | 182 | 107 | 33.796703 | 0.673948 | 0.263047 | 0 | 0.1 | 0 | 0 | 0.024066 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.157143 | false | 0.085714 | 0.157143 | 0 | 0.528571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
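The `_user_default` and `_user_db_default` methods above rely on Traits' convention that a method named `_<trait>_default` lazily supplies a trait's default value on first access. A minimal plain-Python sketch of that lazy-default pattern (illustrative only; Traits implements this machinery internally, and the class names here are hypothetical):

```python
# Plain-Python sketch of the lazy "_<name>_default" pattern the Traits
# classes above rely on (illustrative; not Traits itself).
class LazyDefaults(object):
    def __getattr__(self, name):
        # Called only when 'name' was never assigned: look up a matching
        # _<name>_default() factory, cache its result, and return it.
        factory = object.__getattribute__(self, '_' + name + '_default')
        value = factory()
        setattr(self, name, value)
        return value


class Manager(LazyDefaults):
    def _user_default(self):
        return 'anonymous'


m = Manager()
print(m.user)  # -> anonymous (created on first access, then cached)
```

After the first access the value lives in the instance dictionary, so the factory runs at most once per attribute.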
d72a11a2c8bf6250447fd4fa6eb0f797bb01fbab | 6,184 | py | Python | autoPyTorch/core/autonet_classes/autonet_feature_data.py | thomascherickal/Auto-PyTorch | 9e25a3bdef8e836e63979229eef77830cd64bb53 | [
"BSD-3-Clause"
] | 1 | 2019-09-02T00:37:52.000Z | 2019-09-02T00:37:52.000Z | autoPyTorch/core/autonet_classes/autonet_feature_data.py | thomascherickal/Auto-PyTorch | 9e25a3bdef8e836e63979229eef77830cd64bb53 | [
"BSD-3-Clause"
] | null | null | null | autoPyTorch/core/autonet_classes/autonet_feature_data.py | thomascherickal/Auto-PyTorch | 9e25a3bdef8e836e63979229eef77830cd64bb53 | [
"BSD-3-Clause"
] | 1 | 2019-09-02T00:40:30.000Z | 2019-09-02T00:40:30.000Z |
__author__ = "Max Dippel, Michael Burkart and Matthias Urban"
__version__ = "0.0.1"
__license__ = "BSD"

from autoPyTorch.core.api import AutoNet


class AutoNetFeatureData(AutoNet):

    @classmethod
    def get_default_pipeline(cls):
        from autoPyTorch.pipeline.base.pipeline import Pipeline
        from autoPyTorch.pipeline.nodes.autonet_settings import AutoNetSettings
        from autoPyTorch.pipeline.nodes.optimization_algorithm import OptimizationAlgorithm
        from autoPyTorch.pipeline.nodes.cross_validation import CrossValidation
        from autoPyTorch.pipeline.nodes.imputation import Imputation
        from autoPyTorch.pipeline.nodes.normalization_strategy_selector import NormalizationStrategySelector
        from autoPyTorch.pipeline.nodes.one_hot_encoding import OneHotEncoding
        from autoPyTorch.pipeline.nodes.preprocessor_selector import PreprocessorSelector
        from autoPyTorch.pipeline.nodes.resampling_strategy_selector import ResamplingStrategySelector
        from autoPyTorch.pipeline.nodes.embedding_selector import EmbeddingSelector
        from autoPyTorch.pipeline.nodes.network_selector import NetworkSelector
        from autoPyTorch.pipeline.nodes.optimizer_selector import OptimizerSelector
        from autoPyTorch.pipeline.nodes.lr_scheduler_selector import LearningrateSchedulerSelector
        from autoPyTorch.pipeline.nodes.log_functions_selector import LogFunctionsSelector
        from autoPyTorch.pipeline.nodes.metric_selector import MetricSelector
        from autoPyTorch.pipeline.nodes.loss_module_selector import LossModuleSelector
        from autoPyTorch.pipeline.nodes.train_node import TrainNode

        # build the pipeline
        pipeline = Pipeline([
            AutoNetSettings(),
            OptimizationAlgorithm([
                CrossValidation([
                    Imputation(),
                    NormalizationStrategySelector(),
                    OneHotEncoding(),
                    PreprocessorSelector(),
                    ResamplingStrategySelector(),
                    EmbeddingSelector(),
                    NetworkSelector(),
                    OptimizerSelector(),
                    LearningrateSchedulerSelector(),
                    LogFunctionsSelector(),
                    MetricSelector(),
                    LossModuleSelector(),
                    TrainNode()
                ])
            ])
        ])

        cls._apply_default_pipeline_settings(pipeline)
        return pipeline
    @staticmethod
    def _apply_default_pipeline_settings(pipeline):
        from autoPyTorch.pipeline.nodes.normalization_strategy_selector import NormalizationStrategySelector
        from autoPyTorch.pipeline.nodes.preprocessor_selector import PreprocessorSelector
        from autoPyTorch.pipeline.nodes.embedding_selector import EmbeddingSelector
        from autoPyTorch.pipeline.nodes.network_selector import NetworkSelector
        from autoPyTorch.pipeline.nodes.optimizer_selector import OptimizerSelector
        from autoPyTorch.pipeline.nodes.lr_scheduler_selector import LearningrateSchedulerSelector
        from autoPyTorch.pipeline.nodes.train_node import TrainNode

        from autoPyTorch.components.networks.feature import MlpNet, ResNet, ShapedMlpNet, ShapedResNet
        from autoPyTorch.components.optimizer.optimizer import AdamOptimizer, SgdOptimizer
        from autoPyTorch.components.lr_scheduler.lr_schedulers import SchedulerCosineAnnealingWithRestartsLR, SchedulerNone, \
            SchedulerCyclicLR, SchedulerExponentialLR, SchedulerReduceLROnPlateau, SchedulerStepLR
        from autoPyTorch.components.networks.feature import LearnedEntityEmbedding

        from sklearn.preprocessing import MinMaxScaler, StandardScaler, MaxAbsScaler
        from autoPyTorch.components.preprocessing.feature_preprocessing import \
            TruncatedSVD, FastICA, RandomKitchenSinks, KernelPCA, Nystroem

        from autoPyTorch.training.early_stopping import EarlyStopping
        from autoPyTorch.training.mixup import Mixup

        pre_selector = pipeline[PreprocessorSelector.get_name()]
        pre_selector.add_preprocessor('truncated_svd', TruncatedSVD)
        pre_selector.add_preprocessor('fast_ica', FastICA)
        pre_selector.add_preprocessor('kitchen_sinks', RandomKitchenSinks)
        pre_selector.add_preprocessor('kernel_pca', KernelPCA)
        pre_selector.add_preprocessor('nystroem', Nystroem)

        norm_selector = pipeline[NormalizationStrategySelector.get_name()]
        norm_selector.add_normalization_strategy('minmax', MinMaxScaler)
        norm_selector.add_normalization_strategy('standardize', StandardScaler)
        norm_selector.add_normalization_strategy('maxabs', MaxAbsScaler)

        emb_selector = pipeline[EmbeddingSelector.get_name()]
        emb_selector.add_embedding_module('learned', LearnedEntityEmbedding)

        net_selector = pipeline[NetworkSelector.get_name()]
        net_selector.add_network('mlpnet', MlpNet)
        net_selector.add_network('shapedmlpnet', ShapedMlpNet)
        net_selector.add_network('resnet', ResNet)
        net_selector.add_network('shapedresnet', ShapedResNet)

        opt_selector = pipeline[OptimizerSelector.get_name()]
        opt_selector.add_optimizer('adam', AdamOptimizer)
        opt_selector.add_optimizer('sgd', SgdOptimizer)

        lr_selector = pipeline[LearningrateSchedulerSelector.get_name()]
        lr_selector.add_lr_scheduler('cosine_annealing', SchedulerCosineAnnealingWithRestartsLR)
        lr_selector.add_lr_scheduler('cyclic', SchedulerCyclicLR)
        lr_selector.add_lr_scheduler('exponential', SchedulerExponentialLR)
        lr_selector.add_lr_scheduler('step', SchedulerStepLR)
        lr_selector.add_lr_scheduler('plateau', SchedulerReduceLROnPlateau)
        lr_selector.add_lr_scheduler('none', SchedulerNone)

        train_node = pipeline[TrainNode.get_name()]
        train_node.add_training_technique("early_stopping", EarlyStopping)
        train_node.add_batch_loss_computation_technique("mixup", Mixup)
| 52.854701 | 126 | 0.733991 | 535 | 6,184 | 8.229907 | 0.257944 | 0.109017 | 0.125369 | 0.146264 | 0.348853 | 0.267091 | 0.246196 | 0.246196 | 0.228935 | 0.228935 | 0 | 0.000611 | 0.205692 | 6,184 | 116 | 127 | 53.310345 | 0.895765 | 0.002911 | 0 | 0.178947 | 0 | 0 | 0.039916 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021053 | false | 0 | 0.347368 | 0 | 0.389474 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
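The pipeline above is configured by registering named components on selector nodes (`add_network`, `add_optimizer`, `add_lr_scheduler`, ...). A minimal, self-contained sketch of that name-to-component registry pattern (the `Selector` class, its method names, and `MlpNet` stand-in below are hypothetical, not autoPyTorch's actual implementation):

```python
# Minimal sketch of the name -> component registry pattern used by the
# selector nodes above (hypothetical stand-in, not autoPyTorch code).
class Selector:
    def __init__(self):
        self._components = {}

    def add_component(self, name, component):
        # Register a component class under a configuration-space name.
        self._components[name] = component

    def build(self, name, **kwargs):
        # Instantiate the component chosen by the optimizer.
        return self._components[name](**kwargs)


class MlpNet:
    def __init__(self, layers=2):
        self.layers = layers


net_selector = Selector()
net_selector.add_component('mlpnet', MlpNet)
net = net_selector.build('mlpnet', layers=3)
print(net.layers)  # -> 3
```

Registering classes rather than instances lets the search procedure construct a fresh component per trial with trial-specific hyperparameters.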
d7315024b9feb9895ac394720425de68d6266009 | 12,033 | py | Python | StreamPy/StreamPy-UI/src/root/nested/temp/MakeNetwork(1).py | AnomalyInc/StreamPy | 94abca276b2857de48259f4f42ef95efbdf5f6d1 | [
"Apache-2.0"
] | 2 | 2017-04-27T11:04:27.000Z | 2019-02-07T21:03:32.000Z | StreamPy/StreamPy-UI/src/root/nested/temp/MakeNetwork(1).py | StreamPy/StreamPy | 94abca276b2857de48259f4f42ef95efbdf5f6d1 | [
"Apache-2.0"
] | null | null | null | StreamPy/StreamPy-UI/src/root/nested/temp/MakeNetwork(1).py | StreamPy/StreamPy | 94abca276b2857de48259f4f42ef95efbdf5f6d1 | [
"Apache-2.0"
] | null | null | null | from Stream import Stream
from Stream import _no_value, _multivalue
from Agent import Agent
from root.nested.OperatorsTestNew import stream_agent
def make_network(stream_names_tuple, agent_descriptor_dict):
    """ This function makes a network of agents given the names
    of the streams in the network and a description of the
    agents in the network.

    Parameters
    ----------
    stream_names_tuple: tuple of str
        A tuple consisting of names of streams in the network.
        Each stream in the network must have a unique name.
    agent_descriptor_dict: dict of tuples
        The key is an agent name
        The value is a tuple:
            in_list, out_list, f, f_type, f_args, state
        where:
            in_list: list of input stream names
            out_list: list of output stream names
            f: function associated with the agent
            f_type: 'element', 'list', 'window', etc
            f_args: tuple of arguments for functions f
            state: the state associated with this agent.

    Local Variables
    ---------------
    stream_dict: dict
        key: stream name
        value: Stream
    agent_dict: dict
        key: agent name
        value: agent with the specified description:
            in_list, out_list, f, f_type, f_args, state,
            call_streams=[timer_stream]
            where one timer stream is associated with
            each agent.
    agent_timer_dict: dict
        key: agent_name
        value: Stream
        The value is the timer stream associated with the
        agent.  When the timer stream has a message, the
        agent is made to execute a step.

    """
    # Create streams and insert streams into stream_dict.
    stream_dict = dict()
    for stream_name in stream_names_tuple:
        stream_dict[stream_name] = Stream(stream_name)

    ## # Only for debugging
    ## for key, value in stream_dict.items():
    ##     print 'stream_name: ', key
    ##     print 'stream:', value

    agent_dict = dict()
    agent_timer_dict = dict()
    # Create agents with the specified description
    # and put the agents into agent_dict.
    for agent_name in agent_descriptor_dict.keys():
        # print 'agent_name:', agent_name
        in_list, out_list, f, f_type, f_args, state = \
            agent_descriptor_dict[agent_name]

        ## # Only for debugging
        ## print 'in_list', in_list
        ## print 'out_list', out_list
        ## print 'f', f
        ## print 'f_args', f_args
        ## print 'f_type', f_type
        ## print 'state', state

        # Replace a list consisting of a single input stream
        # by the stream itself.
        if len(in_list) == 1:
            single_input_stream_name = in_list[0]
            inputs = stream_dict[single_input_stream_name]
        else:
            inputs = list()
            for input_stream_name in in_list:
                inputs.append(stream_dict[input_stream_name])

        # Replace a list consisting of a single output stream
        # by the stream itself.
        if len(out_list) == 1:
            single_output_stream_name = out_list[0]
            outputs = stream_dict[single_output_stream_name]
        else:
            outputs = list()
            for output_stream_name in out_list:
                outputs.append(stream_dict[output_stream_name])

        # Create timer streams and insert them into agent_timer_dict
        agent_timer_dict[agent_name] = Stream(
            agent_name + ':timer')

        # Create agents and insert them into agent_dict
        agent_dict[agent_name] = stream_agent(
            inputs, outputs, f_type, f, f_args, state,
            call_streams=[agent_timer_dict[agent_name]])
        # Set the name for this agent.
        agent_dict[agent_name].name = agent_name

    return (stream_dict, agent_dict, agent_timer_dict)

def network_data_structures(stream_names_tuple, agent_descriptor_dict):
    """Builds data structures that improve the efficiency
    of driving networks for animation or command-line
    execution of network nodes.

    Parameters
    ----------
    Same as for make_network.

    Return Values
    -------------
    (stream_to_agent_list_dict,
     agent_to_stream_dict,
     agent_to_agent_list_dict)

    stream_to_agent_list_dict
        key: stream_name
        value: list of agent_name.
        The stream is an input stream of each agent
        with the agent name in the list.
    agent_to_stream_dict
        key: stream_name
        value: str. A single agent_name.
        The stream is the output stream of this agent.
    agent_to_agent_list_dict
        key: agent_name
        value: list of agent names
        The agent with name in key has an output stream to
        each agent whose name is in value.

    """
    stream_to_agent_list_dict = dict()
    for stream_name in stream_names_tuple:
        stream_to_agent_list_dict[stream_name] = list()
    agent_to_stream_dict = dict()

    # Construct stream_to_agent_list_dict and agent_to_stream_dict
    # from agent_descriptor_dict
    for agent_name, descriptor in agent_descriptor_dict.iteritems():
        input_stream_list = descriptor[0]
        output_stream_list = descriptor[1]
        for stream_name in input_stream_list:
            stream_to_agent_list_dict[stream_name].append(agent_name)
        for stream_name in output_stream_list:
            if stream_name in agent_to_stream_dict:
                raise Exception(
                    stream_name + ' output by ' + agent_to_stream_dict[stream_name] + ' and ' + agent_name)
            agent_to_stream_dict[stream_name] = agent_name

    # Construct agent_to_agent_list_dict from
    # agent_descriptor_dict, stream_to_agent_list_dict, and
    # agent_to_stream_dict.
    agent_to_agent_list_dict = dict()
    # Initialize agent_to_agent_list_dict
    for agent_name in agent_descriptor_dict.keys():
        agent_to_agent_list_dict[agent_name] = list()
    # Compute agent_to_agent_list_dict
    # If a stream is output of agent x and input to agents y, z
    # then agent x outputs to [y,z]
    for stream_name, agent_name in agent_to_stream_dict.iteritems():
        agent_to_agent_list_dict[agent_name].extend(
            stream_to_agent_list_dict[stream_name])

    # Construct agent_from_agent_list_dict from
    # agent_descriptor_dict, stream_to_agent_list_dict, and
    # agent_to_stream_dict.
    agent_from_agent_list_dict = dict()
    # Initialize agent_from_agent_list_dict
    for agent_name in agent_descriptor_dict.keys():
        agent_from_agent_list_dict[agent_name] = list()
    # Compute agent_from_agent_list_dict
    # If a stream is an input of agent x and is an output of agents y, z
    # then agents [y, z] output to agent x.
    for stream_name, agent_name_list in stream_to_agent_list_dict.iteritems():
        for receiving_agent_name in agent_name_list:
            agent_from_agent_list_dict[receiving_agent_name].append(
                agent_to_stream_dict[stream_name])

    return (stream_to_agent_list_dict, agent_to_stream_dict,
            agent_to_agent_list_dict, agent_from_agent_list_dict)

def main():
    # STEP 1
    # PROVIDE CODE OR IMPORT PURE (NON-STREAM) FUNCTIONS
    from random import randint

    def rand(f_args):
        max_integer = f_args[0]
        return randint(0, max_integer)

    def split(m, f_args):
        divisor = f_args[0]
        return [_no_value, m] if m % divisor else [m, _no_value]

    def print_value(v, index):
        # print name + '[', index, '] = ', v
        print '[', index, '] = ', v
        return (index + 1)

    # STEP 2
    # SPECIFY THE NETWORK.
    # Specify names of all the streams.
    stream_names_tuple = ('random_stream', 'multiples_stream', 'non_multiples_stream')
    # Specify the agents:
    #   key: agent name
    #   value: list of input streams, list of output streams, function, function type,
    #          tuple of arguments, state
    agent_descriptor_dict = {
        'generate_random': [
            [], ['random_stream'], rand, 'element', (100,), None],
        'split': [
            ['random_stream'], ['multiples_stream', 'non_multiples_stream'],
            split, 'element', (2,), None],
        'print_random': [
            ['random_stream'], [], print_value, 'element', None, 0],
        'print_multiples': [['multiples_stream'], [], print_value, 'element', None, 0],
        'print_non_multiples': [['non_multiples_stream'], [], print_value, 'element', None, 0]
        }

    # STEP 3: MAKE THE NETWORK
    stream_dict, agent_dict, agent_timer_dict = make_network(
        stream_names_tuple, agent_descriptor_dict)
    (stream_to_agent_list_dict, agent_to_stream_dict,
     agent_to_agent_list_dict, agent_from_agent_list_dict) = \
        network_data_structures(stream_names_tuple, agent_descriptor_dict)

    ## # Only for debugging
    ## for key, value in stream_to_agent_list_dict.iteritems():
    ##     print 'stream_to_agent_list_dict key: ', key
    ##     print 'stream_to_agent_list_dict value: ', value
    ## for key, value in agent_to_stream_dict.iteritems():
    ##     print 'agent_to_stream_dict key: ', key
    ##     print 'agent_to_stream_dict value: ', value
    ## for key, value in agent_to_agent_list_dict.iteritems():
    ##     print 'agent_to_agent_list_dict key: ', key
    ##     print 'agent_to_agent_list_dict value: ', value
    ## for key, value in s_dict.items():
    ##     print 'stream name', key
    ##     print 'stream', value
    ## for key, value in a_dict.items():
    ##     print 'agent name', key
    ##     print 'agent', value
    ## for key, value in agent_timer_dict.items():
    ##     print 'timer name is', key
    ##     print 'timer', value

    # STEP 4: DRIVE THE NETWORK BY APPENDING
    # VALUES TO TIMER STREAMS
    for t in range(5):
        print
        print '--------- time step: ', t
        # Append t to each of the timer streams
        for agent_name, timer_stream in agent_timer_dict.iteritems():
            print
            print 'Execute single step of agent with name', agent_name
            timer_stream.append(t)
            ## # for debugging
            ## for stream in stream_dict.values():
            ##     stream.print_recent()
            for receiving_agent_name in agent_to_agent_list_dict[agent_name]:
                descriptor = agent_descriptor_dict[receiving_agent_name]
                receiving_agent = agent_dict[receiving_agent_name]
                input_stream_list = descriptor[0]
                for stream_name in input_stream_list:
                    stream = stream_dict[stream_name]
                    print 'from', agent_name, 'on', stream_name, 'to', receiving_agent_name,
                    print stream.recent[stream.start[receiving_agent]:stream.stop]
            descriptor = agent_descriptor_dict[agent_name]
            agent = agent_dict[agent_name]
            input_stream_list = descriptor[0]
            for stream_name in input_stream_list:
                stream = stream_dict[stream_name]
                sending_agent_name = agent_to_stream_dict[stream_name]
                print 'from', sending_agent_name, 'on', stream_name, 'to', agent_name,
                print stream.recent[stream.start[agent]:stream.stop]

    ## # Print messages in transit to the input port
    ## # of each agent.
    ## for agent_name, agent in agent_dict.iteritems():
    ##     descriptor = agent_descriptor_dict[agent_name]
    ##     input_stream_list = descriptor[0]
    ##     for stream_name in input_stream_list:
    ##         stream = stream_dict[stream_name]
    ##         print "messages in ", stream_name, "to", agent.name
    ##         print stream.recent[stream.start[agent]:stream.stop]


if __name__ == '__main__':
    main()
| 37.25387 | 95 | 0.634588 | 1,577 | 12,033 | 4.519341 | 0.103995 | 0.061877 | 0.06749 | 0.061036 | 0.520836 | 0.441841 | 0.360039 | 0.259015 | 0.210467 | 0.169075 | 0 | 0.002911 | 0.286379 | 12,033 | 322 | 96 | 37.369565 | 0.827064 | 0.252722 | 0 | 0.145299 | 0 | 0 | 0.057014 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.042735 | null | null | 0.119658 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
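The dictionaries built by `network_data_structures` are straightforward inversions of `agent_descriptor_dict`. A small standalone illustration of the same bookkeeping on a toy descriptor (Python 3, descriptor values trimmed to just input/output stream name lists; not the StreamPy classes themselves):

```python
# Toy illustration of the stream/agent index dictionaries that
# network_data_structures builds (descriptor values trimmed to
# [input stream names, output stream names]).
descriptors = {
    'generate': [[], ['s']],
    'double':   [['s'], ['t']],
    'printer':  [['t'], []],
}

stream_to_agents = {}       # stream name -> agents reading it
agent_writing_stream = {}   # stream name -> the single agent writing it
for agent, (ins, outs) in descriptors.items():
    for s in ins:
        stream_to_agents.setdefault(s, []).append(agent)
    for s in outs:
        agent_writing_stream[s] = agent

# Agent x sends to agent y iff some stream is an output of x and an input of y.
agent_to_agents = {a: [] for a in descriptors}
for s, writer in agent_writing_stream.items():
    agent_to_agents[writer].extend(stream_to_agents.get(s, []))

print(agent_to_agents['generate'])  # -> ['double']
```

Precomputing these indexes lets the driver find an agent's downstream readers in O(1) per step instead of rescanning every descriptor.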
d73b450d21d773f70afe8701a469d62e69a7223b | 720 | py | Python | pydis_site/apps/api/migrations/0055_reminder_mentions.py | Numerlor/site | e4cec0aeb2a791e622be8edd94fb4e82d150deab | [
"MIT"
] | 700 | 2018-11-17T15:56:51.000Z | 2022-03-30T22:53:17.000Z | pydis_site/apps/api/migrations/0055_reminder_mentions.py | Numerlor/site | e4cec0aeb2a791e622be8edd94fb4e82d150deab | [
"MIT"
] | 542 | 2018-11-17T13:39:42.000Z | 2022-03-31T11:24:00.000Z | pydis_site/apps/api/migrations/0055_reminder_mentions.py | Numerlor/site | e4cec0aeb2a791e622be8edd94fb4e82d150deab | [
"MIT"
] | 178 | 2018-11-21T09:06:56.000Z | 2022-03-31T07:43:28.000Z | # Generated by Django 2.2.14 on 2020-07-15 07:37

import django.contrib.postgres.fields
import django.core.validators
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('api', '0054_user_invalidate_unknown_role'),
    ]

    operations = [
        migrations.AddField(
            model_name='reminder',
            name='mentions',
            field=django.contrib.postgres.fields.ArrayField(base_field=models.BigIntegerField(validators=[django.core.validators.MinValueValidator(limit_value=0, message='Mention IDs cannot be negative.')]), blank=True, default=list, help_text='IDs of roles or users to ping with the reminder.', size=None),
        ),
    ]
| 34.285714 | 307 | 0.704167 | 88 | 720 | 5.670455 | 0.738636 | 0.048096 | 0.084168 | 0.108216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035775 | 0.184722 | 720 | 20 | 308 | 36 | 0.81431 | 0.063889 | 0 | 0 | 1 | 0 | 0.19494 | 0.049107 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.214286 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
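The migration attaches `MinValueValidator(limit_value=0)` to the array's base field, so Django validates each element of `mentions` individually. A plain-Python sketch of the per-element check that validator performs (illustrative only; in the real app Django runs this inside `ArrayField` validation, and `validate_mentions` is a hypothetical helper):

```python
# Plain-Python sketch of the per-element check performed by
# MinValueValidator(limit_value=0) on the mentions array
# (hypothetical helper, not Django code).
def validate_mentions(mentions, limit_value=0):
    for value in mentions:
        if value < limit_value:
            raise ValueError('Mention IDs cannot be negative.')
    return mentions

print(validate_mentions([123, 0]))  # -> [123, 0]
```

Any negative element raises with the same message the migration configures.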
d746595f48844a5e6d0dbe01d637017879e8559b | 7,064 | py | Python | app/run.py | mourgaya/iscsi_ihm | aadf09590a02f6f1e6b16b000c2a4380e015f840 | [
"Apache-2.0"
] | null | null | null | app/run.py | mourgaya/iscsi_ihm | aadf09590a02f6f1e6b16b000c2a4380e015f840 | [
"Apache-2.0"
] | null | null | null | app/run.py | mourgaya/iscsi_ihm | aadf09590a02f6f1e6b16b000c2a4380e015f840 | [
"Apache-2.0"
] | null | null | null | #author : eric mourgya
#
import commands
from flask import jsonify
from flask import Flask, Response, request, redirect, session, url_for, abort
from flask.ext.login import LoginManager, UserMixin,login_required, login_user, logout_user
#@app.after_request
#def treat_as_plain_text(response):
# response.headers["content-type"] = "text/plain; charset=utf-8"
# return response
app = Flask(__name__)
app.secret_key="gloubiboulga"
login_manager = LoginManager()
login_manager.setup_app(app)
login_manager.login_view = "login"

class User(UserMixin):
    def __init__(self, id):
        self.id = id
        self.name = "user" + str(id)
        self.password = self.name + "_secret"

    def __repr__(self):
        return "%d/%s/%s" % (self.id, self.name, self.password)

@app.route('/', methods=['GET'])
@login_required
def help():
    """Welcome page and help page."""
    func_list = {}
    for rule in app.url_map.iter_rules():
        if rule.endpoint != 'static':
            func_list[rule.rule] = app.view_functions[rule.endpoint].__doc__
    return jsonify(func_list)


def cmdline(cmd):
    # Run cmd as a shell command on the system and return its output.
    status, output = commands.getstatusoutput(cmd)
    if status != 0:
        error_str = cmd + ": command failed! : " + str(status) + " " + output
        print error_str
        return error_str
    else:
        print cmd + " done"
        return output

@app.route('/show/discovery')
@login_required
def showdiscovery():
    """------------------------Show discovery portals."""
    cmdshow = "iscsiadm -m discovery -P1"
    res = cmdline(cmdshow)
    return Response(response=res, status=200, mimetype="text/plain")


@app.route('/show/nodes')
@login_required
def shownodes():
    """Show nodes."""
    cmdshow = "iscsiadm -m node -P1"
    res = cmdline(cmdshow)
    return Response(response=res, status=200, mimetype="text/plain")


@app.route('/show/disks')
@login_required
def showdisk():
    """Show discovered disks."""
    cmdshow = "iscsiadm -m session -P3"
    res = cmdline(cmdshow)
    return Response(response=res, status=200, mimetype="text/plain")


@app.route('/show/lsblk')
@login_required
def showlsblk():
    """Show discovered sessions and disks."""
    cmdshow = "lsblk"
    res = cmdline(cmdshow)
    return Response(response=res, status=200, mimetype="text/plain")


@app.route('/show/sessiondetail')
@login_required
def showsessiondetail():
    """Show sessions in detail, without disks."""
    cmdshow = "iscsiadm -m session -P1"
    res = cmdline(cmdshow)
    return Response(response=res, status=200, mimetype="text/plain")


@app.route('/show/session')
@login_required
def showsession():
    """Show session ids."""
    cmdshow = "iscsiadm -m session"
    res = cmdline(cmdshow)
    return Response(response=res, status=200, mimetype="text/plain")

@app.route('/show/specifiquesession', methods=["GET", "POST"])
@login_required
def showspecifiquesession():
    """Show a specific session."""
    if request.method == 'POST':
        session = request.form['session']
        cmdres = "iscsiadm -m session -r" + session + " -P3"
        res = cmdline(cmdres)
        return Response(response=res, status=200, mimetype="text/plain")
    else:
        return Response('''
            <form action="" method="post">
            <p><input placeholder="session id" type=text name=session>
            <p><input type=submit value=submit>
            </form>
        ''')


@app.route('/rescan/session', methods=["GET", "POST"])
@login_required
def rescansession():
    """Rescan a specific session."""
    if request.method == 'POST':
        session = request.form['session']
        cmdres = "iscsiadm -m session -r" + session + " -R"
        res = cmdline(cmdres)
        return redirect(url_for('showspecifiquesession'), code=302)
    else:
        return Response('''
            <form action="" method="post">
            <p><input placeholder="session id" type=text name=session>
            <p><input type=submit value=submit>
            </form>
        ''')

@app.route('/make/discovery', methods=["GET", "POST"])
@login_required
def makediscovery():
    """Make a discovery."""
    if request.method == 'POST':
        ipaddr = request.form['ip']
        print ipaddr
        cmdres = "iscsiadm -m discovery -t sendtargets -p " + ipaddr + ":3260 -P 1"
        res = cmdline(cmdres)
        return Response(response=res, status=200, mimetype="text/plain")
    else:
        return Response('''
            <form action="" method="post">
            <p><input placeholder="portal ip" type=text name=ip>
            <p><input type=submit value=submit>
            </form>
        ''')


@app.route('/make/nodelogin', methods=["GET", "POST"])
@login_required
def makenodelogin():
    """Make a node login."""
    if request.method == 'POST':
        ipaddr = request.form['ip']
        iqn = request.form['iqn']
        cmdres = "iscsiadm -m node " + iqn + " -p " + ipaddr + " -o update -n node.startup -v automatic"
        res = cmdline(cmdres)
        return Response(response=res, status=200, mimetype="text/plain")
    else:
        return Response('''
            <form action="" method="post">
            <p><input placeholder="portal ip" type=text name=ip>
            <p><input placeholder="portal iqn" type=text name=iqn>
            <p><input type=submit value=submit>
            </form>
        ''')


@app.route('/make/sessionlogin', methods=["GET", "POST"])
@login_required
def makesessionlogin():
    """Make a session login."""
    if request.method == 'POST':
        ipaddr = request.form['ip']
        iqn = request.form['iqn']
        cmdres = "iscsiadm -m node " + iqn + " -p " + ipaddr + " -l"
        res = cmdline(cmdres)
        return Response(response=res, status=200, mimetype="text/plain")
    else:
        return Response('''
            <form action="" method="post">
            <p><input placeholder="portal ip" type=text name=ip>
            <p><input placeholder="portal iqn" type=text name=iqn>
            <p><input type=submit value=submit>
            </form>
        ''')

@app.route("/login", methods=["GET", "POST"])
def login():
    """Login page."""
    if request.method == 'POST':
        username = request.form['username']
        password = request.form['password']
        if password == username + "_secret":
            id = username.split('user')[1]
            user = User(id)
            login_user(user)
            return redirect(url_for('help'))
        else:
            return abort(401)
    else:
        return Response('''
            <form action="" method="post">
            <p><input placeholder="Username" type=text name=username>
            <p><input placeholder="Password" type=password name=password>
            <p><input type=submit value=Login>
            </form>
        ''')


@app.route("/logout")
@login_required
def logout():
    """Logout page."""
    logout_user()
    return Response('<p>Logged out</p>')


@app.errorhandler(401)
def page_not_found(e):
    return Response('<p>Login failed</p>')


@login_manager.user_loader
def load_user(userid):
    return User(userid)


app.run(debug=True, port=5001)
| 27.920949 | 98 | 0.614383 | 837 | 7,064 | 5.109916 | 0.204301 | 0.062193 | 0.048632 | 0.058452 | 0.491934 | 0.474398 | 0.422492 | 0.422492 | 0.413608 | 0.391162 | 0 | 0.010275 | 0.228482 | 7,064 | 252 | 99 | 28.031746 | 0.774495 | 0.028171 | 0 | 0.491713 | 0 | 0 | 0.328259 | 0.013671 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.027624 | 0.022099 | null | null | 0.016575 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
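`cmdline` above relies on the Python-2-only `commands` module. On Python 3 the direct equivalent is `subprocess.getstatusoutput`; a sketch of the same helper ported to Python 3 (not part of the original app):

```python
import subprocess

def cmdline3(cmd):
    # Python 3 equivalent of the commands-based helper in run.py:
    # run cmd through the shell and return its combined output.
    status, output = subprocess.getstatusoutput(cmd)
    if status != 0:
        error_str = "{}: command failed! : {} {}".format(cmd, status, output)
        print(error_str)
        return error_str
    print(cmd + " done")
    return output
```

Like `commands.getstatusoutput`, `subprocess.getstatusoutput` strips the trailing newline from the captured output.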
d7469f7f64e499f08213f3dd92d29492cf06e4a9 | 539 | py | Python | src/config/defaults/sc2/config.py | ewanlee/mackrl | 6dd505aa09830f16c35a022f67e255db935c807e | [
"Apache-2.0"
] | 26 | 2019-10-28T09:01:45.000Z | 2021-09-20T08:56:12.000Z | src/config/defaults/sc2/config.py | ewanlee/mackrl | 6dd505aa09830f16c35a022f67e255db935c807e | [
"Apache-2.0"
] | 1 | 2020-07-25T06:50:05.000Z | 2020-07-25T06:50:05.000Z | src/config/defaults/sc2/config.py | ewanlee/mackrl | 6dd505aa09830f16c35a022f67e255db935c807e | [
"Apache-2.0"
] | 6 | 2019-12-18T12:02:57.000Z | 2021-03-03T13:15:47.000Z | def get_cfg(existing_cfg, _log):
"""
generates
"""
_sanity_check(existing_cfg, _log)
import ntpath, os, yaml
with open(os.path.join(os.path.dirname(__file__), "{}.yml".format(ntpath.basename(__file__).split(".")[0])),
'r') as stream:
try:
ret = yaml.load(stream)
except yaml.YAMLError as exc:
assert "Default config yaml for '{}' not found!".format(os.path.splitext(__file__)[0])
return ret
def _sanity_check(existing_cfg, _log):
"""
"""
    return
| 29.944444 | 112 | 0.595547 | 67 | 539 | 4.447761 | 0.597015 | 0.110738 | 0.14094 | 0.147651 | 0.167785 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004963 | 0.252319 | 539 | 18 | 113 | 29.944444 | 0.734491 | 0.016698 | 0 | 0 | 1 | 0 | 0.093254 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.166667 | false | 0 | 0.083333 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
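`get_cfg` derives the YAML filename from the module's own filename, so `config.py` loads a sibling `config.yml`. A standalone sketch of that path derivation (`sibling_yml` is a hypothetical helper name, not part of the original file):

```python
import ntpath
import os

def sibling_yml(py_path):
    # Mirror get_cfg's path logic: same directory, same basename,
    # with a .yml extension instead of .py.
    base = ntpath.basename(py_path).split(".")[0]
    return os.path.join(os.path.dirname(py_path), "{}.yml".format(base))

print(sibling_yml("src/config/defaults/sc2/config.py"))
# -> src/config/defaults/sc2/config.yml (on POSIX path separators)
```

This keeps each default config module paired with a YAML file of the same name in the same directory.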
d74f99009f7b23a8259c513299a6bb144a926283 | 4,881 | py | Python | src/old/api_server.py | ssupdoc/k8-simulation | 7834d3faaed3e86b547554c6228540c316621011 | [
"CC0-1.0"
] | null | null | null | src/old/api_server.py | ssupdoc/k8-simulation | 7834d3faaed3e86b547554c6228540c316621011 | [
"CC0-1.0"
] | null | null | null | src/old/api_server.py | ssupdoc/k8-simulation | 7834d3faaed3e86b547554c6228540c316621011 | [
"CC0-1.0"
] | null | null | null | from src.deployment import Deployment
from src.end_point import EndPoint
from src.etcd import Etcd
from src.pod import Pod
from src.pid_controller import PIDController
from src.request import Request
from src.worker_node import WorkerNode
import threading
import random


# The APIServer handles the communication between controllers and the cluster. It houses
# the methods that can be called for cluster management
class APIServer:
    def __init__(self, ctrlValues=[0, 0, 0]):
        self.etcd = Etcd()
        self.etcdLock = threading.Lock()
        self.kubeletList = []
        self.requestWaiting = threading.Event()
        self.controller = PIDController(ctrlValues[0], ctrlValues[1], ctrlValues[2])  # Tune your controller

    # GetDeployments method returns the list of deployments stored in etcd
    def GetDeployments(self):
        return self.etcd.deploymentList.copy()

    # GetDepByLabel returns the deployment whose label matches, or None
    def GetDepByLabel(self, label):
        return next(filter(lambda deployment: deployment.deploymentLabel == label, self.etcd.deploymentList), None)

    # GetWorkers method returns the list of WorkerNodes stored in etcd
    def GetWorkers(self):
        return self.etcd.nodeList.copy()

    # GetPending method returns the list of PendingPods stored in etcd
    def GetPending(self):
        return self.etcd.pendingPodList.copy()

    # GetEndPoints method returns the list of EndPoints stored in etcd
    def GetEndPoints(self):
        return self.etcd.endPointList.copy()

    # CreateWorker creates a WorkerNode from a list of arguments and adds it to the etcd nodeList
    def CreateWorker(self, info):
        worker = WorkerNode(info)
        self.etcd.nodeList.append(worker)
        print("Worker_Node " + worker.label + " created")

    # CreateDeployment creates a Deployment object from a list of arguments and adds it to the etcd deploymentList
    def CreateDeployment(self, info):
        deployment = Deployment(info)
        self.etcd.deploymentList.append(deployment)
        print("Deployment " + deployment.deploymentLabel + " created")

    # RemoveDeployment deletes the associated Deployment object from etcd and sets the status of all associated pods to 'TERMINATING'
    def RemoveDeployment(self, info):
        for deployment in self.etcd.deploymentList:
            if deployment.deploymentLabel == info[0]:
                deployment.expectedReplicas = 0

    # CreateEndpoint creates an EndPoint object using information from a provided Pod and Node and appends it
    # to the endPointList in etcd
    def CreateEndPoint(self, pod, worker):
        endPoint = EndPoint(pod, pod.deploymentLabel, worker)
        self.etcd.endPointList.append(endPoint)
        print("New Endpoint for " + endPoint.deploymentLabel + " - NODE: " + endPoint.node.label + " POD: " + endPoint.pod.podName)
# GetEndPointsByLabel returns a list of EndPoints associated with a given deployment
def GetEndPointsByLabel(self, deploymentLabel):
endPoints = []
for endPoint in self.etcd.endPointList:
if endPoint.deploymentLabel == deploymentLabel:
endPoints.append(endPoint)
return endPoints
#RemoveEndPoint removes the EndPoint from the list within etcd
def RemoveEndPoint(self, endPoint):
endPoint.node.available_cpu+=endPoint.pod.assigned_cpu
print("Removing EndPoint for: "+endPoint.deploymentLabel)
self.etcd.endPointList.remove(endPoint)
#GeneratePodName creates a random label for a pod
def GeneratePodName(self):
label = random.randint(111,999)
for pod in self.etcd.runningPodList:
if pod.podName == label:
label = self.GeneratePodName()
for pod in self.etcd.pendingPodList:
if pod.podName == label:
label = self.GeneratePodName()
return label
# CreatePod finds the resource allocations associated with a deployment and creates a pod using those metrics
def CreatePod(self, deployment):
podName = deployment.deploymentLabel + "_" + str(self.GeneratePodName())
pod = Pod(podName, deployment.cpuCost, deployment.deploymentLabel)
print("Pod " + pod.podName + " created")
self.etcd.pendingPodList.append(pod)
# GetPod returns the pod object associated with an EndPoint
def GetPod(self, endPoint):
return endPoint.pod
#TerminatePod gracefully shuts down a Pod
def TerminatePod(self, endPoint):
pod = endPoint.pod
pod.status="TERMINATING"
self.RemoveEndPoint(endPoint)
print("Removing Pod "+pod.podName)
# CrashPod finds a pod from a given deployment and sets its status to 'FAILED'
# Any resource utilisation on the pod will be reset to the base 0
def CrashPod(self, info):
endPoints = self.GetEndPointsByLabel(info[0])
if len(endPoints) == 0:
print("No Pods to crash")
else:
print("GETTING PODS")
pod = self.GetPod(endPoints[0])
pod.status = "FAILED"
pod.crash.set()
print ("Pod "+pod.podName+" crashed")
    # Alter these methods so that the requests are pushed to Deployments instead of etcd
def PushReq(self, info):
self.etcd.reqCreator.submit(self.ReqPusher, info)
def ReqPusher(self, info):
self.etcd.pendingReqs.append(Request(info))
self.requestWaiting.set()
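The label-based lookup in `GetEndPointsByLabel` is plain list filtering over the etcd-backed endpoint list. A minimal, self-contained sketch of the same filtering, using `SimpleNamespace` stand-ins for the `EndPoint` objects (the stand-ins are an assumption here, not the real class):

```python
from types import SimpleNamespace

# Hypothetical stand-ins for EndPoint objects stored in etcd.
end_points = [
    SimpleNamespace(deploymentLabel="web"),
    SimpleNamespace(deploymentLabel="db"),
    SimpleNamespace(deploymentLabel="web"),
]

def get_endpoints_by_label(end_point_list, deployment_label):
    # Same filtering as APIServer.GetEndPointsByLabel, as a comprehension.
    return [ep for ep in end_point_list if ep.deploymentLabel == deployment_label]

print(len(get_endpoints_by_label(end_points, "web")))  # → 2
```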
| 36.699248 | 129 | 0.763983 | 633 | 4,881 | 5.873618 | 0.259084 | 0.036579 | 0.012103 | 0.021517 | 0.074771 | 0.042496 | 0.042496 | 0.020441 | 0.020441 | 0.020441 | 0 | 0.004342 | 0.150584 | 4,881 | 132 | 130 | 36.977273 | 0.892426 | 0.309158 | 0 | 0.045455 | 0 | 0 | 0.052522 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.215909 | false | 0 | 0.102273 | 0.068182 | 0.420455 | 0.102273 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d75199ae09fb507ebd0bec385de7a0580d95d595 | 1,946 | py | Python | ngram_graphs/TextGraph/IGraphTextGraph.py | loginn/ngrams_graphs | 74fc42d3895cdbc51eec5aaf7b5505b4432619e3 | [
"MIT"
] | 8 | 2018-04-24T17:03:29.000Z | 2022-02-08T14:36:00.000Z | ngram_graphs/TextGraph/IGraphTextGraph.py | loginn/ngrams-graphs | 74fc42d3895cdbc51eec5aaf7b5505b4432619e3 | [
"MIT"
] | 1 | 2019-08-13T08:50:54.000Z | 2021-12-03T11:19:51.000Z | ngram_graphs/TextGraph/IGraphTextGraph.py | loginn/ngrams_graphs | 74fc42d3895cdbc51eec5aaf7b5505b4432619e3 | [
"MIT"
] | null | null | null | from igraph import Graph
def find_node_name(graph, node_idx):
return graph.vs[node_idx]["name"]
class IGraphTextGraph(Graph):
def __init__(self):
super().__init__(directed=True)
    def __copy__(self):
        g = IGraphTextGraph()
        for v in self.vs:
            g.add_vertex(name=v["name"])
        for e in self.es:
            source = find_node_name(self, e.source)
            target = find_node_name(self, e.target)
            # Add the edge to the copy, not to self.
            g.add_edge(source, target, weight=e["weight"], name=source + ' ' + target)
        return g
@staticmethod
def __calc_new_weight(s_edge, weight, learning_factor) -> float:
return s_edge["weight"] + ((weight - s_edge["weight"]) * learning_factor)
def __add_unknown_vertices(self, other):
for vertex in other.vs:
if vertex["name"] not in self.vs["name"]:
self.add_vertex(name=vertex["name"])
def __add_unknown_edge(self, other, o_edge, learning_factor):
source = find_node_name(other, o_edge.source)
target = find_node_name(other, o_edge.target)
self.add_edge(source, target, weight=o_edge["weight"] * learning_factor, name=source + ' ' + target)
def __update_edges(self, other: 'IGraphTextGraph', learning_factor: float):
for o_edge in other.es:
s_edge = next(iter([e for e in self.es if e["name"] == o_edge["name"]]), None)
if s_edge is not None:
s_edge["weight"] = self.__calc_new_weight(s_edge, o_edge["weight"], learning_factor)
else:
self.__add_unknown_edge(other, o_edge, learning_factor)
for s_edge in self.es:
if s_edge["name"] not in other.es["name"]:
s_edge["weight"] = self.__calc_new_weight(s_edge, 0, learning_factor)
def update(self, other: 'IGraphTextGraph', learning_factor):
self.__add_unknown_vertices(other)
self.__update_edges(other, learning_factor)
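The core of `__calc_new_weight` is an exponential-moving-average update on edge weights. A dependency-free sketch of just that rule (igraph itself is not needed to see the behaviour):

```python
def calc_new_weight(old_weight, observed_weight, learning_factor):
    # w_new = w_old + (w_obs - w_old) * L, i.e. an exponential moving average.
    return old_weight + (observed_weight - old_weight) * learning_factor

w = 1.0
for observed in (3.0, 3.0, 3.0):
    w = calc_new_weight(w, observed, 0.5)
print(w)  # → 2.75 (1.0 -> 2.0 -> 2.5 -> 2.75, converging toward 3.0)
```

With `learning_factor = 0` the old weight is kept unchanged; with `1.0` the observed weight fully replaces it.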
| 38.92 | 108 | 0.631552 | 266 | 1,946 | 4.278195 | 0.195489 | 0.04833 | 0.052724 | 0.084359 | 0.443761 | 0.212654 | 0.119508 | 0.057996 | 0.057996 | 0 | 0 | 0.000685 | 0.250257 | 1,946 | 49 | 109 | 39.714286 | 0.779301 | 0 | 0 | 0 | 0 | 0 | 0.054471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205128 | false | 0 | 0.025641 | 0.051282 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d75247a3779551deda4745fa7b2623bb3e507eaf | 619 | py | Python | pyFiDEL/utils.py | sungcheolkim78/pyFiDEL | 670067b12a2efd276e23382251ec612af678731f | [
"Apache-2.0"
] | null | null | null | pyFiDEL/utils.py | sungcheolkim78/pyFiDEL | 670067b12a2efd276e23382251ec612af678731f | [
"Apache-2.0"
] | null | null | null | pyFiDEL/utils.py | sungcheolkim78/pyFiDEL | 670067b12a2efd276e23382251ec612af678731f | [
"Apache-2.0"
] | null | null | null | '''
utils.py - utility functions
Soli Deo Gloria
'''
__author__ = 'Sung-Cheol Kim'
__version__ = '1.0.0'
import numpy as np
def fermi_l(x: np.ndarray, l1: float, l2: float) -> np.ndarray:
    ''' calculate fermi-dirac distribution with np.ndarray x with l1 and l2'''
    return 1. / (1. + np.exp(l2 * x + l1))


def fermi_b(x: np.ndarray, b: float, m: float, normalized: bool = False) -> np.ndarray:
    ''' calculate fermi-dirac distribution with np.ndarray x with beta and mu'''
    if normalized:
        return 1. / (1. + np.exp(b / float(len(x)) * (x - m * float(len(x)))))
    else:
        return 1. / (1. + np.exp(b * (x - m)))
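A quick usage sketch of the logistic form (the helper is restated inline so the snippet runs on its own):

```python
import numpy as np

def fermi_l(x, l1, l2):
    # Same logistic form as above: 1 / (1 + exp(l2*x + l1)).
    return 1. / (1. + np.exp(l2 * x + l1))

x = np.linspace(-5., 5., 11)
p = fermi_l(x, l1=0.0, l2=1.0)
# With l2 > 0 the curve is strictly decreasing, bounded in (0, 1),
# and crosses 0.5 exactly where l2*x + l1 = 0.
print(round(float(fermi_l(np.array([0.0]), 0.0, 1.0)[0]), 3))  # → 0.5
```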
| 23.807692 | 78 | 0.596123 | 101 | 619 | 3.554455 | 0.425743 | 0.097493 | 0.066852 | 0.083565 | 0.376045 | 0.339833 | 0.261838 | 0.261838 | 0.261838 | 0 | 0 | 0.031579 | 0.232633 | 619 | 25 | 79 | 24.76 | 0.724211 | 0.289176 | 0 | 0 | 0 | 0 | 0.045238 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.1 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d7599edaa481a1e260e406f973a13c62378a0dab | 389 | py | Python | tests/utils.py | amatissart/idunn | cae32024165a85ad99881c373a88ea5702631d71 | [
"Apache-2.0"
] | null | null | null | tests/utils.py | amatissart/idunn | cae32024165a85ad99881c373a88ea5702631d71 | [
"Apache-2.0"
] | null | null | null | tests/utils.py | amatissart/idunn | cae32024165a85ad99881c373a88ea5702631d71 | [
"Apache-2.0"
] | null | null | null | from contextlib import contextmanager
from copy import deepcopy
from app import settings
@contextmanager
def override_settings(overrides):
"""
A utility function used by some fixtures to override settings
"""
old_settings = deepcopy(settings._settings)
settings._settings.update(overrides)
try:
yield
finally:
settings._settings = old_settings
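The same save/patch/restore pattern, sketched against a stand-in settings object (the `_Settings` class below is hypothetical, not Idunn's real `app.settings`):

```python
from contextlib import contextmanager
from copy import deepcopy

class _Settings:
    """Hypothetical stand-in for app.settings."""
    _settings = {"TIMEOUT": 5, "ES_URL": "http://localhost:9200"}

settings = _Settings()

@contextmanager
def override_settings(overrides):
    # Snapshot, patch, and always restore — even if the body raises.
    old_settings = deepcopy(settings._settings)
    settings._settings.update(overrides)
    try:
        yield
    finally:
        settings._settings = old_settings

with override_settings({"TIMEOUT": 30}):
    inside = settings._settings["TIMEOUT"]   # 30 while the block runs
after = settings._settings["TIMEOUT"]        # restored afterwards
print(inside, after)  # → 30 5
```

The `deepcopy` matters: a shallow copy would let nested values leak mutations past the restore.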
| 24.3125 | 65 | 0.732648 | 43 | 389 | 6.488372 | 0.55814 | 0.229391 | 0.136201 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210797 | 389 | 15 | 66 | 25.933333 | 0.908795 | 0.156812 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.272727 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d759cca6aa40b22128380587e7622577a783f241 | 1,422 | py | Python | parse.py | OpenScienceFramework/citations | 3679ef4a1746a287061186078ce4b9144afe9083 | [
"Apache-2.0"
] | 4 | 2015-05-20T15:48:35.000Z | 2021-06-24T17:04:51.000Z | parse.py | OpenScienceFramework/citations | 3679ef4a1746a287061186078ce4b9144afe9083 | [
"Apache-2.0"
] | null | null | null | parse.py | OpenScienceFramework/citations | 3679ef4a1746a287061186078ce4b9144afe9083 | [
"Apache-2.0"
] | 1 | 2015-01-10T13:02:45.000Z | 2015-01-10T13:02:45.000Z | # encoding: utf-8
"""
Parse module for parsing citations into structured data. Currently this uses
the Parsley library to do this, the grammars are defined in the grammars/
folder and cycled through until one is found that works.
to_dict will convert the Reference named tuple into a dictionary, which allows
for easy transformation into JSON.
"""
import collections
import glob
import re
import parsley
DASHES = ['-', u'–']
fields = "ref names year title journal edition pages doi".split()
Reference = collections.namedtuple("Reference", ' '.join(fields))
def normalize(string):
"""Normalize whitespace."""
string = string.strip()
string = re.sub(r'\s+', ' ', string)
return string
parsers = []
for gname in glob.glob("grammars/*.parsley"):
    with open(gname) as gfile:
        grammar = gfile.read()
    parser = parsley.makeGrammar(grammar, dict(DASHES=DASHES,
                                               Reference=Reference, normalize=normalize))
    parsers.append(parser)
def parse(text):
"""
Attempt to parse data into a Reference named tuple. Returns None if it
fails.
"""
    for parser in parsers:
        try:
            return parser(text).line()
        except Exception as e:
            print(e)
    return None
def to_dict(s):
    """Turn a citation into a dictionary that can then be turned into JSON."""
    ref = parse(s)
    return ref._asdict() if ref is not None else None
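`normalize` is the one piece of this module that runs without the Parsley grammars on disk; restated here so its behaviour is easy to check:

```python
import re

def normalize(string):
    """Strip the ends and collapse internal whitespace runs (as above)."""
    string = string.strip()
    return re.sub(r'\s+', ' ', string)

print(normalize("  Smith, J.\t(2001).\n  A  title. "))  # → Smith, J. (2001). A title.
```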
| 28.44 | 79 | 0.662447 | 185 | 1,422 | 5.081081 | 0.589189 | 0.015957 | 0.040426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000922 | 0.23699 | 1,422 | 49 | 80 | 29.020408 | 0.864516 | 0.010549 | 0 | 0 | 0 | 0 | 0.091324 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.037037 | 0.148148 | null | null | 0.037037 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d75e34968052a479dcb04800c417d5b42abec672 | 3,876 | py | Python | test/clean_directory.py | adevress/gfal2 | ce8945d1c153e26c5d10ad43d1940b8dcace0579 | [
"Apache-2.0"
] | null | null | null | test/clean_directory.py | adevress/gfal2 | ce8945d1c153e26c5d10ad43d1940b8dcace0579 | [
"Apache-2.0"
] | null | null | null | test/clean_directory.py | adevress/gfal2 | ce8945d1c153e26c5d10ad43d1940b8dcace0579 | [
"Apache-2.0"
] | 1 | 2020-04-28T09:36:46.000Z | 2020-04-28T09:36:46.000Z | #!/usr/bin/env python
import gfal2
import logging
import optparse
import stat
import sys
log = logging.getLogger('gfal2.clean_directory')
class Cleaner(object):
def __init__(self, abort_on_error=False, recursive=False, only_files=False, chmod=False):
self.abort_on_error = abort_on_error
self.recursive = recursive
self.only_files = only_files
self.chmod = chmod
self.context = gfal2.creat_context()
def _get_list(self, surl):
files = []
directories = []
dh = self.context.opendir(surl)
d_entry, d_stat = dh.readpp()
while d_entry:
full_path = surl + '/' + d_entry.d_name
if stat.S_ISREG(d_stat.st_mode):
files.append((full_path, d_stat))
elif stat.S_ISDIR(d_stat.st_mode):
directories.append((full_path, d_stat))
d_entry, d_stat = dh.readpp()
return files, directories
def __call__(self, surl):
log.info("Cleaning %s" % surl)
        try:
            files, directories = self._get_list(surl)
        except gfal2.GError as e:
            if self.abort_on_error:
                raise
            logging.error("Could not list %s (%s)" % (surl, e.message))
            return 0, 0
n_files, n_directories = 0, 0
        for file, f_stat in files:
            try:
                self.context.unlink(file)
                log.info("Unlink %s" % file)
                n_files += 1
            except gfal2.GError as e:
                if self.abort_on_error:
                    raise
                log.error("Could not unlink %s (%s)" % (file, e.message))
        for directory, d_stat in directories:
            if self.chmod and not (d_stat.st_mode & 0o666):
                try:
                    self.context.chmod(directory, 0o775)
                    log.info("Chmod for %s" % directory)
                except gfal2.GError as e:
                    log.warn("Failed chmod for %s (%s)" % (directory, e.message))
sub_files, sub_directories = self(directory)
n_files += sub_files
n_directories += sub_directories
            if not self.only_files:
                try:
                    log.info("Rmdir %s" % directory)
                    self.context.rmdir(directory)
                    n_directories += 1
                except gfal2.GError as e:
                    if self.abort_on_error:
                        raise
                    log.error("Failed to rmdir %s (%s)" % (directory, e.message))
return n_files, n_directories
if __name__ == '__main__':
parser = optparse.OptionParser(usage='usage: %prog [options] surl')
parser.add_option('-x', '--abort', dest='abort_on_error', default=False,
action='store_true', help='Abort cleaning on the first error')
parser.add_option('-r', '--recursive', dest='recursive', default=False,
action='store_true', help='Traverse directories recursively')
parser.add_option('-f', '--files', dest='only_files', default=False,
action='store_true', help='Unlink only files')
parser.add_option('-c', '--chmod', dest='chmod', default=False,
action='store_true', help='Attempt a chmod when a directory is not writeable')
(options, args) = parser.parse_args()
if len(args) != 1:
parser.error('Wrong number of arguments')
stdout_handler = logging.StreamHandler(sys.stdout)
stdout_handler.setFormatter(logging.Formatter('[%(levelname)s] %(message)s'))
log.addHandler(stdout_handler)
log.setLevel(logging.INFO)
cleaner = Cleaner(options.abort_on_error, options.recursive, options.only_files, options.chmod)
n_files, n_directories = cleaner(args[0])
logging.info("Removed %d files and %d directories" % (n_files, n_directories))
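The file/directory split in `Cleaner._get_list` relies only on `stat` mode bits. A local-filesystem sketch of the same test (no gfal2 required; the throwaway tree is built just for illustration):

```python
import os
import stat
import tempfile

# Build a throwaway tree: one regular file and one sub-directory.
root = tempfile.mkdtemp()
open(os.path.join(root, "f.txt"), "w").close()
os.mkdir(os.path.join(root, "sub"))

files, directories = [], []
for name in sorted(os.listdir(root)):
    st = os.stat(os.path.join(root, name))
    if stat.S_ISREG(st.st_mode):       # regular file -> unlink candidate
        files.append(name)
    elif stat.S_ISDIR(st.st_mode):     # directory -> recurse/rmdir candidate
        directories.append(name)

print(files, directories)  # → ['f.txt'] ['sub']
```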
| 36.914286 | 100 | 0.576625 | 465 | 3,876 | 4.604301 | 0.260215 | 0.026156 | 0.044839 | 0.037366 | 0.17702 | 0.141523 | 0.065857 | 0.065857 | 0.065857 | 0.065857 | 0 | 0.008598 | 0.309856 | 3,876 | 104 | 101 | 37.269231 | 0.791776 | 0.00516 | 0 | 0.188235 | 0 | 0 | 0.136446 | 0.005447 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.058824 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d75e92e1cf6688e5a8ad55c6253bccf7855e6a50 | 2,196 | py | Python | platform/radio/efr32_multiphy_configurator/pyradioconfig/parts/viper/calculators/calc_utilities.py | PascalGuenther/gecko_sdk | 2e82050dc8823c9fe0e8908c1b2666fb83056230 | [
"Zlib"
] | 69 | 2021-12-16T01:34:09.000Z | 2022-03-31T08:27:39.000Z | platform/radio/efr32_multiphy_configurator/pyradioconfig/parts/viper/calculators/calc_utilities.py | PascalGuenther/gecko_sdk | 2e82050dc8823c9fe0e8908c1b2666fb83056230 | [
"Zlib"
] | 6 | 2022-01-12T18:22:08.000Z | 2022-03-25T10:19:27.000Z | platform/radio/efr32_multiphy_configurator/pyradioconfig/parts/viper/calculators/calc_utilities.py | PascalGuenther/gecko_sdk | 2e82050dc8823c9fe0e8908c1b2666fb83056230 | [
"Zlib"
] | 21 | 2021-12-20T09:05:45.000Z | 2022-03-28T02:52:28.000Z | from pyradioconfig.parts.bobcat.calculators.calc_utilities import Calc_Utilities_Bobcat
from pycalcmodel.core.variable import ModelVariableFormat
from enum import Enum
class Calc_Utilities_Viper(Calc_Utilities_Bobcat):
def buildVariables(self, model):
#Build all variables from the inherited class
super().buildVariables(model)
#Now also build the fefilt_selected variable
self._addModelVariable(model, 'fefilt_selected', str, ModelVariableFormat.ASCII)
def calc_fefilt_selected(self, model):
#This method calculates which FEFILT register set should be used based on demod
#Read in model variables
demod_select = model.vars.demod_select.value
        # Viper has a single FEFILT register set, so fefilt_selected is always FEFILT0
fefilt_selected = 'FEFILT0'
#Write the model variable
model.vars.fefilt_selected.value = fefilt_selected
def get_fefilt_actual(self, model, reg_name_str):
#This method queries the value of a reg name string based on the FEFILT register set in use
#Read in model variables
fefilt_selected = model.vars.fefilt_selected.value
#Get the register object
reg_name_complete = fefilt_selected+'_'+reg_name_str
reg = getattr(model.vars, reg_name_complete)
#Return the register value
return reg.value
def get_fefilt_value_forced(self, model, reg_name_str):
#This method queries the value of a reg name string based on the FEFILT register set in use
#Read in model variables
fefilt_selected = model.vars.fefilt_selected.value
#Get the register object
reg_name_complete = fefilt_selected+'_'+reg_name_str
reg = getattr(model.vars, reg_name_complete)
#Return the register value
return reg.value_forced
def write_fefilt_reg(self, model, reg_name_str, value):
#This method writes an FEFILT register field based on the FEFILT register set in use
#Read in model variables
fefilt_selected = model.vars.fefilt_selected.value
#Write the register field
self._reg_write_by_name_concat(model, fefilt_selected, reg_name_str, value)
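The register lookups above all reduce to a `getattr` on a composed `"<FEFILT set>_<reg>"` name. A stand-alone sketch with a hypothetical register name (`CF`) and a `SimpleNamespace` in place of the real model:

```python
from types import SimpleNamespace

# Hypothetical model.vars holding one FEFILT0 register object.
model_vars = SimpleNamespace(FEFILT0_CF=SimpleNamespace(value=0x1234))

def get_fefilt_actual(model_vars, fefilt_selected, reg_name_str):
    # Compose "FEFILT0_<reg>" and fetch the register object dynamically.
    reg = getattr(model_vars, fefilt_selected + '_' + reg_name_str)
    return reg.value

print(hex(get_fefilt_actual(model_vars, 'FEFILT0', 'CF')))  # → 0x1234
```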
| 37.220339 | 99 | 0.722222 | 292 | 2,196 | 5.219178 | 0.253425 | 0.146982 | 0.03937 | 0.052493 | 0.479659 | 0.433071 | 0.433071 | 0.433071 | 0.433071 | 0.433071 | 0 | 0.001176 | 0.22541 | 2,196 | 58 | 100 | 37.862069 | 0.894768 | 0.325592 | 0 | 0.291667 | 0 | 0 | 0.016393 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.208333 | false | 0 | 0.125 | 0 | 0.458333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d75ec669dfc620aaaefd9ee6ffa0b75d16e42209 | 2,029 | py | Python | tests/logger/check_logger.py | rancp/ducktape-docs | e1a3b1b7e68beedf5f8d29a4e5f196912a20e264 | [
"Apache-2.0"
] | null | null | null | tests/logger/check_logger.py | rancp/ducktape-docs | e1a3b1b7e68beedf5f8d29a4e5f196912a20e264 | [
"Apache-2.0"
] | null | null | null | tests/logger/check_logger.py | rancp/ducktape-docs | e1a3b1b7e68beedf5f8d29a4e5f196912a20e264 | [
"Apache-2.0"
] | null | null | null | # Copyright 2016 Confluent Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import logging
import os
import psutil
import shutil
import tempfile
from ducktape.tests.loggermaker import LoggerMaker, close_logger
class DummyFileLoggerMaker(LoggerMaker):
def __init__(self, log_dir, n_handles):
"""Create a logger with n_handles file handles, with files in log_dir"""
self.log_dir = log_dir
self.n_handles = n_handles
@property
def logger_name(self):
return "a.b.c"
def configure_logger(self):
for i in range(self.n_handles):
fh = logging.FileHandler(os.path.join(self.log_dir, "log-" + str(i)))
self._logger.addHandler(fh)
def open_files():
# current process
p = psutil.Process()
return p.open_files()
class CheckLogger(object):
def setup_method(self, _):
self.temp_dir = tempfile.mkdtemp()
def check_close_logger(self):
"""Check that calling close_logger properly cleans up resources."""
initial_open_files = open_files()
n_handles = 100
l = DummyFileLoggerMaker(self.temp_dir, n_handles)
# accessing logger attribute lazily triggers configuration of logger
the_logger = l.logger
assert len(open_files()) == len(initial_open_files) + n_handles
close_logger(the_logger)
assert len(open_files()) == len(initial_open_files)
def teardown_method(self, _):
if os.path.exists(self.temp_dir):
shutil.rmtree(self.temp_dir)
| 30.283582 | 81 | 0.699359 | 280 | 2,029 | 4.903571 | 0.467857 | 0.046613 | 0.032047 | 0.023307 | 0.062637 | 0.062637 | 0.062637 | 0.062637 | 0.062637 | 0 | 0 | 0.00691 | 0.215377 | 2,029 | 66 | 82 | 30.742424 | 0.855528 | 0.374569 | 0 | 0 | 0 | 0 | 0.007241 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 1 | 0.205882 | false | 0 | 0.176471 | 0.029412 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d766600ac591a726caf5047535ca5f091cf8d7ac | 10,478 | py | Python | Tutorials/SENSEI/Advection_AmrLevel/Exec/SingleVortex/sensei/render_iso_catalyst_3d.py | ylunalin/amrex | 5715b2fc8a77e0db17bfe7907982e29ec44811ca | [
"BSD-3-Clause-LBNL"
] | null | null | null | Tutorials/SENSEI/Advection_AmrLevel/Exec/SingleVortex/sensei/render_iso_catalyst_3d.py | ylunalin/amrex | 5715b2fc8a77e0db17bfe7907982e29ec44811ca | [
"BSD-3-Clause-LBNL"
] | null | null | null | Tutorials/SENSEI/Advection_AmrLevel/Exec/SingleVortex/sensei/render_iso_catalyst_3d.py | ylunalin/amrex | 5715b2fc8a77e0db17bfe7907982e29ec44811ca | [
"BSD-3-Clause-LBNL"
] | 1 | 2020-01-17T05:00:26.000Z | 2020-01-17T05:00:26.000Z |
from paraview.simple import *
from paraview import coprocessing
#--------------------------------------------------------------
# Code generated from cpstate.py to create the CoProcessor.
# ParaView 5.4.1 64 bits
#--------------------------------------------------------------
# Global screenshot output options
imageFileNamePadding=5
rescale_lookuptable=False
# ----------------------- CoProcessor definition -----------------------
def CreateCoProcessor():
def _CreatePipeline(coprocessor, datadescription):
class Pipeline:
# state file generated using paraview version 5.4.1
# ----------------------------------------------------------------
# setup views used in the visualization
# ----------------------------------------------------------------
#### disable automatic camera reset on 'Show'
paraview.simple._DisableFirstRenderCameraReset()
# Create a new 'Render View'
renderView1 = CreateView('RenderView')
renderView1.ViewSize = [1000, 700]
renderView1.AxesGrid = 'GridAxes3DActor'
renderView1.CenterOfRotation = [0.5, 0.5, 0.5]
renderView1.StereoType = 0
renderView1.CameraPosition = [0.5, 0.5, 3.2557533687070332]
renderView1.CameraFocalPoint = [0.5, 0.5, -0.09031184624419736]
renderView1.CameraParallelScale = 0.8660254037844386
renderView1.Background = [0.0, 0.0, 0.0]
# register the view with coprocessor
# and provide it with information such as the filename to use,
# how frequently to write the images, etc.
coprocessor.RegisterView(renderView1,
filename='pv_image_3d_%t.png', freq=1, fittoscreen=0, magnification=1, width=1000, height=700, cinema={})
renderView1.ViewTime = datadescription.GetTime()
# ----------------------------------------------------------------
# setup the data processing pipelines
# ----------------------------------------------------------------
# create a new 'XML UniformGrid AMR Reader'
# create a producer from a simulation input
mesh_000 = coprocessor.CreateProducer(datadescription, 'mesh')
# create a new 'Cell Data to Point Data'
cellDatatoPointData1 = CellDatatoPointData(Input=mesh_000)
# create a new 'Contour'
contour1 = Contour(Input=cellDatatoPointData1)
contour1.ContourBy = ['POINTS', 'phi']
contour1.ComputeScalars = 1
contour1.Isosurfaces = [0.99429, 1.1043655555555556, 1.214441111111111, 1.3245166666666668, 1.4345922222222223, 1.5446677777777778, 1.6547433333333332, 1.764818888888889, 1.8748944444444444, 1.98497]
contour1.PointMergeMethod = 'Uniform Binning'
# create a new 'Annotate Time'
annotateTime1 = AnnotateTime()
annotateTime1.Format = 't = %0.2f'
# ----------------------------------------------------------------
# setup color maps and opacity mapes used in the visualization
# note: the Get..() functions create a new object, if needed
# ----------------------------------------------------------------
# get color transfer function/color map for 'phi'
phiLUT = GetColorTransferFunction('phi')
phiLUT.RGBPoints = [0.99429, 0.278431372549, 0.278431372549, 0.858823529412, 1.13595724, 0.0, 0.0, 0.360784313725, 1.2766338000000002, 0.0, 1.0, 1.0, 1.41929172, 0.0, 0.501960784314, 0.0, 1.55996828, 1.0, 1.0, 0.0, 1.70163552, 1.0, 0.380392156863, 0.0, 1.84330276, 0.419607843137, 0.0, 0.0, 1.9849700000000001, 0.878431372549, 0.301960784314, 0.301960784314]
phiLUT.ColorSpace = 'RGB'
phiLUT.ScalarRangeInitialized = 1.0
# get opacity transfer function/opacity map for 'phi'
phiPWF = GetOpacityTransferFunction('phi')
phiPWF.Points = [0.99429, 0.0, 0.5, 0.0, 1.9849700000000001, 1.0, 0.5, 0.0]
phiPWF.ScalarRangeInitialized = 1
# ----------------------------------------------------------------
# setup the visualization in view 'renderView1'
# ----------------------------------------------------------------
# show data from mesh_000
mesh_000Display = Show(mesh_000, renderView1)
# trace defaults for the display properties.
mesh_000Display.Representation = 'AMR Blocks'
mesh_000Display.ColorArrayName = [None, '']
mesh_000Display.DiffuseColor = [0.0, 0.0, 0.0]
mesh_000Display.OSPRayScaleArray = 'GhostType'
mesh_000Display.OSPRayScaleFunction = 'PiecewiseFunction'
mesh_000Display.SelectOrientationVectors = 'None'
mesh_000Display.ScaleFactor = 0.1
mesh_000Display.SelectScaleArray = 'None'
mesh_000Display.GlyphType = 'Arrow'
mesh_000Display.GlyphTableIndexArray = 'None'
mesh_000Display.DataAxesGrid = 'GridAxesRepresentation'
mesh_000Display.PolarAxes = 'PolarAxesRepresentation'
mesh_000Display.ScalarOpacityUnitDistance = 0.0174438098693218
# init the 'GridAxesRepresentation' selected for 'DataAxesGrid'
mesh_000Display.DataAxesGrid.XTitle = 'X'
mesh_000Display.DataAxesGrid.YTitle = 'Y'
mesh_000Display.DataAxesGrid.ZTitle = 'Z'
mesh_000Display.DataAxesGrid.XTitleBold = 1
mesh_000Display.DataAxesGrid.XTitleFontSize = 14
mesh_000Display.DataAxesGrid.YTitleBold = 1
mesh_000Display.DataAxesGrid.YTitleFontSize = 14
mesh_000Display.DataAxesGrid.ZTitleBold = 1
mesh_000Display.DataAxesGrid.ZTitleFontSize = 14
mesh_000Display.DataAxesGrid.XLabelBold = 1
mesh_000Display.DataAxesGrid.XLabelFontSize = 14
mesh_000Display.DataAxesGrid.YLabelBold = 1
mesh_000Display.DataAxesGrid.YLabelFontSize = 14
mesh_000Display.DataAxesGrid.ZLabelBold = 1
mesh_000Display.DataAxesGrid.ZLabelFontSize = 14
# show data from contour1
contour1Display = Show(contour1, renderView1)
# trace defaults for the display properties.
contour1Display.Representation = 'Surface'
contour1Display.ColorArrayName = ['POINTS', 'phi']
contour1Display.LookupTable = phiLUT
contour1Display.OSPRayScaleArray = 'GhostType'
contour1Display.OSPRayScaleFunction = 'PiecewiseFunction'
contour1Display.SelectOrientationVectors = 'GhostType'
contour1Display.ScaleFactor = 0.0572519063949585
contour1Display.SelectScaleArray = 'GhostType'
contour1Display.GlyphType = 'Arrow'
contour1Display.GlyphTableIndexArray = 'GhostType'
contour1Display.DataAxesGrid = 'GridAxesRepresentation'
contour1Display.PolarAxes = 'PolarAxesRepresentation'
contour1Display.GaussianRadius = 0.02862595319747925
contour1Display.SetScaleArray = ['POINTS', 'GhostType']
contour1Display.ScaleTransferFunction = 'PiecewiseFunction'
contour1Display.OpacityArray = ['POINTS', 'GhostType']
contour1Display.OpacityTransferFunction = 'PiecewiseFunction'
# show color legend
contour1Display.SetScalarBarVisibility(renderView1, True)
# show data from annotateTime1
annotateTime1Display = Show(annotateTime1, renderView1)
# trace defaults for the display properties.
annotateTime1Display.Bold = 1
annotateTime1Display.FontSize = 12
annotateTime1Display.WindowLocation = 'LowerLeftCorner'
# setup the color legend parameters for each legend in this view
# get color legend/bar for phiLUT in view renderView1
phiLUTColorBar = GetScalarBar(phiLUT, renderView1)
phiLUTColorBar.WindowLocation = 'AnyLocation'
phiLUTColorBar.Position = [0.852, 0.07857142857142851]
phiLUTColorBar.Title = 'phi'
phiLUTColorBar.ComponentTitle = ''
phiLUTColorBar.TitleBold = 1
phiLUTColorBar.TitleFontSize = 24
phiLUTColorBar.LabelBold = 1
phiLUTColorBar.LabelFontSize = 18
phiLUTColorBar.ScalarBarThickness = 24
phiLUTColorBar.ScalarBarLength = 0.8357142857142857
# ----------------------------------------------------------------
# finally, restore active source
SetActiveSource(mesh_000)
# ----------------------------------------------------------------
return Pipeline()
class CoProcessor(coprocessing.CoProcessor):
def CreatePipeline(self, datadescription):
self.Pipeline = _CreatePipeline(self, datadescription)
coprocessor = CoProcessor()
# these are the frequencies at which the coprocessor updates.
freqs = {'mesh': [1, 1, 1]}
coprocessor.SetUpdateFrequencies(freqs)
return coprocessor
#--------------------------------------------------------------
# Global variable that will hold the pipeline for each timestep
# Creating the CoProcessor object, doesn't actually create the ParaView pipeline.
# It will be automatically setup when coprocessor.UpdateProducers() is called the
# first time.
coprocessor = CreateCoProcessor()
#--------------------------------------------------------------
# Enable Live-Visualizaton with ParaView and the update frequency
coprocessor.EnableLiveVisualization(False, 1)
# ---------------------- Data Selection method ----------------------
def RequestDataDescription(datadescription):
"Callback to populate the request for current timestep"
global coprocessor
if datadescription.GetForceOutput() == True:
# We are just going to request all fields and meshes from the simulation
# code/adaptor.
for i in range(datadescription.GetNumberOfInputDescriptions()):
datadescription.GetInputDescription(i).AllFieldsOn()
datadescription.GetInputDescription(i).GenerateMeshOn()
return
# setup requests for all inputs based on the requirements of the
# pipeline.
coprocessor.LoadRequestedData(datadescription)
# ------------------------ Processing method ------------------------
def DoCoProcessing(datadescription):
    "Callback to do co-processing for current timestep"
    global coprocessor

    # Update the coprocessor by providing it the newly generated simulation data.
    # If the pipeline hasn't been setup yet, this will setup the pipeline.
    coprocessor.UpdateProducers(datadescription)

    # Write output data, if appropriate.
    coprocessor.WriteData(datadescription)

    # Write image capture (Last arg: rescale lookup table), if appropriate.
    coprocessor.WriteImages(datadescription, rescale_lookuptable=rescale_lookuptable,
                            image_quality=0, padding_amount=imageFileNamePadding)

    # Live Visualization, if enabled.
    coprocessor.DoLiveVisualization(datadescription, "localhost", 22222)
# ---- performance/forms.py | linikerunk/tcc-people-analytics | MIT ----

""" This is a forms.py that helps to work on the payload of front-end """
from django import forms
from django.contrib.auth import get_user_model
from django.core.exceptions import ValidationError
from django.contrib.auth.models import User
from django.forms import ModelForm
from django.forms.models import inlineformset_factory
from django.forms.widgets import TextInput
# from extra_views import ModelFormSetView, FormSetView
from .models import EvaluationSkill, Skill, Evaluation
class SkillForm(forms.ModelForm):
    class Meta:
        model = Skill
        fields = '__all__'


class EvaluationForm(forms.ModelForm):
    class Meta:
        model = Evaluation
        exclude = ()


class EvaluationSkillForm(forms.ModelForm):
    class Meta:
        model = EvaluationSkill
        exclude = ()


EvaluationSkillFormSet = inlineformset_factory(
    Evaluation, EvaluationSkill, form=EvaluationSkillForm,
    fields=['evaluation', 'skill', 'grade'], extra=1, can_delete=True
)

# ---- preprocess.py | austinben/ECE470 | MIT ----

import numpy as np
import os
import cv2
import imutils
from keras.preprocessing import image
from matplotlib import pyplot as plt
def crop_image(img):
    # convert the image to greyscale and add a slight Gaussian blur
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # threshold the blurred image and perform erosions and dilations to remove noise
    thresh = cv2.threshold(blurred, 45, 255, cv2.THRESH_BINARY)[1]
    thresh = cv2.erode(thresh, None, iterations=2)
    thresh = cv2.dilate(thresh, None, iterations=2)

    # detect contours in the thresholded image, then keep the largest one
    cnts = cv2.findContours(thresh.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cnts = imutils.grab_contours(cnts)
    c = max(cnts, key=cv2.contourArea)

    # locate the extreme points of the contour
    extLeft = tuple(c[c[:, :, 0].argmin()][0])
    extRight = tuple(c[c[:, :, 0].argmax()][0])
    extTop = tuple(c[c[:, :, 1].argmin()][0])
    extBot = tuple(c[c[:, :, 1].argmax()][0])

    # crop the new image out of the original
    new_image = img[extTop[1]:extBot[1], extLeft[0]:extRight[0]]
    return new_image

# ---- Pcolor_Peaks.py | nchaparr/Sam_Output_Anls | MIT ----

from __future__ import division
from netCDF4 import Dataset
import glob,os.path
import numpy as np
import numpy.ma as ma
from scipy.interpolate import UnivariateSpline
from matplotlib import cm
from matplotlib import ticker
import matplotlib.pyplot as plt
#import site
#site.addsitedir('/tera/phil/nchaparr/SAM2/sam_main/python')
#from Percentiles import *
from matplotlib.patches import Patch
import sys
#sys.path.insert(0, '/tera/phil/nchaparr/python')
import nchap_fun as nc
from Make_Timelist import *
import warnings
warnings.simplefilter('ignore', np.RankWarning)
#import pywt
from scipy import stats
from datetime import datetime
import fastfit as fsft
"""
In testing phase -- get_fit() for identifying ML top
To plot gradient maxima ie BL heights, and w on a 2d horizontal domain,
and get a histogram or contour plot of BL heigths
for an individual case
added function to get ticks and labels based on mean and standard deviation
"""
#TODO: a mess right now. but can be tidied up once regression code is included
def get_ticks(mean, stddev, max, min):
    """
    gets ticks and tick labels for contour plot based on mean and standard deviation

    Arguments:
        mean, stddev, max, min

    Returns:
        ticks, tick_labels
    """
    tick_list = []
    label_list = []
    int1 = int(np.ceil((mean - min) / stddev))
    int2 = int(np.ceil((max - mean) / stddev))
    for i in range(int1):
        if int1 == 1:
            tick_list.append(min)
            label_list.append(r'$\mu - %.1f \sigma$' % ((mean - min) / stddev))
        elif i > 0:
            tick_list.append(mean - (int1 - i) * stddev)
            label_list.append(r'$\mu - %.1f \sigma$' % (int1 - i))
        #else:
            #tick_list.append(min)
            #label_list.append(r'$\mu - %.1f \sigma$' %((mean-min)/stddev))
    tick_list.append(mean)
    label_list.append(r'$\mu$')
    for i in range(int2):
        if int2 == 1:
            tick_list.append(max)
            label_list.append(r'$\mu + %.1f \sigma$' % ((max - mean) / stddev))
        elif i < int2 - 1:
            tick_list.append(mean + (i + 1) * stddev)
            label_list.append(r'$\mu + %.1f \sigma$' % (i + 1))
        #else:
            #tick_list.append(max)
            #label_list.append(r'$\mu + %.1f \sigma$' %((max-mean)/stddev))
    return label_list, tick_list
def get_fit(theta, height):
    """
    Fitting the local theta profile with three lines
    """
    fitvals = np.zeros_like(theta)
    RSS = np.empty((290, 290)) + np.nan
    print(RSS[0, 0])
    for j in range(290):
        if j > 2:
            for k in range(290):
                if k > j + 1 and k < 289:
                    b_1 = (np.sum(np.multiply(height[:j], theta[:j])) - 1/j*np.sum(height[:j])*np.sum(theta[:j]))/(np.sum(height[:j]**2) - 1/j*np.sum(height[:j])**2)
                    a_1 = np.sum(np.multiply(height[:j], theta[:j]))/np.sum(height[:j]) - b_1*np.sum(height[:j]**2)/np.sum(height[:j])
                    b_2 = (np.sum(theta[j:k]) - (k-j)*(a_1+b_1*height[j]))/(np.sum(height[j:k]) - (k-j)*height[j])
                    a_2 = np.sum(np.multiply(height[j:k], theta[j:k]))/np.sum(height[j:k]) - b_2*np.sum(height[j:k]**2)/np.sum(height[j:k])
                    b_3 = (np.sum(theta[k:290]) - (290-k)*(a_2+b_2*height[k]))/(np.sum(height[k:290]) - (290-k)*height[k])
                    a_3 = np.sum(np.multiply(height[k:290], theta[k:290]))/np.sum(height[k:290]) - b_3*np.sum(height[k:290]**2)/np.sum(height[k:290])
                    RSS[j, k] = np.sum(np.add(theta[2:j], -(a_1 + b_1*height[2:j]))**2) + np.sum(np.add(theta[j:k], -(a_2 + b_2*height[j:k]))**2) + np.sum(np.add(theta[k:290], -(a_3 + b_3*height[k:290]))**2)
    RSS = ma.masked_where(np.isnan(RSS), RSS)
    [j, k] = np.unravel_index(ma.argmin(RSS), RSS.shape)
    b_1 = (np.sum(np.multiply(height[:j], theta[:j])) - 1/j*np.sum(height[:j])*np.sum(theta[:j]))/(np.sum(height[:j]**2) - 1/j*np.sum(height[:j])**2)
    a_1 = np.sum(np.multiply(height[:j], theta[:j]))/np.sum(height[:j]) - b_1*np.sum(height[:j]**2)/np.sum(height[:j])
    b_2 = (np.sum(theta[j:k]) - (k-j)*(a_1+b_1*height[j]))/(np.sum(height[j:k]) - (k-j)*height[j])
    a_2 = np.sum(np.multiply(height[j:k], theta[j:k]))/np.sum(height[j:k]) - b_2*np.sum(height[j:k]**2)/np.sum(height[j:k])
    b_3 = (np.sum(theta[k:290]) - (290-k)*(a_2+b_2*height[k]))/(np.sum(height[k:290]) - (290-k)*height[k])
    a_3 = np.sum(np.multiply(height[k:290], theta[k:290]))/np.sum(height[k:290]) - b_3*np.sum(height[k:290]**2)/np.sum(height[k:290])
    fitvals[:j] = b_1*height[:j] + a_1
    fitvals[j:k] = b_2*height[j:k] + a_2
    fitvals[k:290] = b_3*height[k:290] + a_3
    return fitvals, RSS, j, k
# Lists of times relating to output (nc) files
dump_time_list, time_hrs = Make_Timelists(1, 600, 28800)
dump_time = dump_time_list[11]
print(dump_time)
for k in range(1):
    # getting variables from nc files
    [wvels, theta, tracer, height] = nc.Get_Var_Arrays("/tera2/nchaparr/Mar52014/runs/sam_case", "/OUT_3D/keep/NCHAPP1_testing_doscamiopdata_24_", dump_time, k+1)
    # getting points of maximum theta gradient, getting rid of this soon
    #[dvardz, grad_peaks] = nc.Domain_Grad(theta, height)
    #tops_indices=np.where(np.abs(grad_peaks - 1400)<10)
    # choosing one horizontal point
    for i in range(1):
        #top_index = [tops_indices[0][i], tops_indices[1][i]]
        #[i, j] = top_index
        [i, j] = [50, 50]
        thetavals = theta[:, i, j]
        startTime = datetime.now()
        #print 'Start', startTime#1
        top = np.where(np.abs(height-2300)<100)[0][0]
        print(top, height[top])
        RSS, J, K = fsft.get_fit(thetavals, height, top)
        #print J, height[J]
        #print 'RSS time', (datetime.now()-startTime)
        fitvals = np.zeros_like(thetavals[:top])
        b_1 = (np.sum(np.multiply(height[9:J], thetavals[9:J])) - 1.0/(J-9)*np.sum(height[9:J])*np.sum(thetavals[9:J]))/(np.sum(height[9:J]**2) - 1.0/(J-9)*np.sum(height[9:J])**2)
        #print np.sum(np.multiply(height[9:J], thetavals[9:J])), - 1.0/(J-9)*np.sum(height[9:J]*np.sum(thetavals[9:J])), np.sum(height[9:J]**2), - 1.0/(J-9)*np.sum(height[9:J])**2
        a_1 = np.sum(np.multiply(height[9:J], thetavals[9:J]))/np.sum(height[9:J]) - b_1*np.sum(height[9:J]**2)/np.sum(height[9:J])
        b_2 = (np.sum(thetavals[J:K]) - (K-J)*(a_1+b_1*height[J]))/(np.sum(height[J:K]) - (K-J)*height[J])
        a_2 = np.sum(np.multiply(height[J:K], thetavals[J:K]))/np.sum(height[J:K]) - b_2*np.sum(height[J:K]**2)/np.sum(height[J:K])
        b_3 = (np.sum(thetavals[K:top]) - (top-K)*(a_2+b_2*height[K]))/(np.sum(height[K:top]) - (top-K)*height[K])
        a_3 = np.sum(np.multiply(height[K:top], thetavals[K:top]))/np.sum(height[K:top]) - b_3*np.sum(height[K:top]**2)/np.sum(height[K:top])
        #print b_2, b_3
        fitvals[:J] = b_1*height[:J] + a_1
        fitvals[J:K] = b_2*height[J:K] + a_2
        fitvals[K:top] = b_3*height[K:top] + a_3

        # set up plot
        theFig = plt.figure(i)
        theFig.clf()
        theAx = theFig.add_subplot(121)
        theAx.set_title('Fit')
        theAx.set_xlabel(r'$\overline{\theta} (K)$')
        theAx.set_ylabel('z (m)')
        theAx1 = theFig.add_subplot(122)
        theAx1.set_title('Profile and Fit')
        theAx1.set_xlabel(r'$\overline{\theta} (K) $')
        theAx1.set_ylabel('z (m)')
        theAx1.plot(thetavals, height[:], 'wo')
        theAx.plot(fitvals[:J], height[:J], 'r-')
        theAx.plot(fitvals[J:K], height[J:K], 'b-')
        theAx.plot(fitvals[K:top], height[K:top], 'g-')
        theAx1.plot(fitvals[:top], height[:top], 'r-')
        theAx1.set_xlim(300, 320)
        theAx1.set_ylim(0, 2000)
        theAx.set_ylim(0, 2000)
        theAx.set_xlim(300, 320)
        plt.show()
# ---- plio/sqlalchemy_json/alchemy.py | kaitlyndlee/plio | Unlicense ----

# Third-party modules
try:
    import simplejson as json
except ImportError:
    import json
import sqlalchemy
from sqlalchemy.ext import mutable
# Custom modules
from . import track
class NestedMutable(mutable.MutableDict, track.TrackedDict):
    """SQLAlchemy `mutable` extension dictionary with nested change tracking."""

    def __setitem__(self, key, value):
        """Ensure that items set are converted to change-tracking types."""
        super(NestedMutable, self).__setitem__(key, self.convert(value, self))

    @classmethod
    def coerce(cls, key, value):
        """Convert plain dictionary to NestedMutable."""
        if isinstance(value, cls):
            return value
        if isinstance(value, dict):
            return cls(value)
        return super(NestedMutable, cls).coerce(key, value)
class _JsonTypeDecorator(sqlalchemy.TypeDecorator):
    """Enables JSON storage by encoding and decoding on the fly."""
    impl = sqlalchemy.String

    def process_bind_param(self, value, dialect):
        return json.dumps(value)

    def process_result_value(self, value, dialect):
        return json.loads(value)


class JsonObject(_JsonTypeDecorator):
    """JSON object type for SQLAlchemy with change tracking as base level."""


class NestedJsonObject(_JsonTypeDecorator):
    """JSON object type for SQLAlchemy with nested change tracking."""
mutable.MutableDict.associate_with(JsonObject)
NestedMutable.associate_with(NestedJsonObject)
# ---- tests/test_graph.py | nokia/PyBGL | BSD-3-Clause ----

#!/usr/bin/env pytest-3
# -*- coding: utf-8 -*-
__author__ = "Marc-Olivier Buob"
__maintainer__ = "Marc-Olivier Buob"
__email__ = "marc-olivier.buob@nokia-bell-labs.com"
__copyright__ = "Copyright (C) 2020, Nokia"
__license__ = "BSD-3"
from pybgl.graph import *
from pybgl.graphviz import graph_to_html
def test_graph_vertex():
    for G in [DirectedGraph, UndirectedGraph]:
        g = G(2)
        assert set(vertices(g)) == {0, 1}
        assert num_vertices(g) == 2
        assert set(edges(g)) == set()
        assert num_edges(g) == 0
        q = add_vertex(g)
        assert num_vertices(g) == 3
        assert num_edges(g) == 0
        assert set(vertices(g)) == {0, 1, 2}
def test_graph_edge():
    for G in [DirectedGraph, UndirectedGraph]:
        g = G(3)
        (u, v, w) = (q for q in vertices(g))
        assert set(edges(g)) == set()
        assert num_edges(g) == 0
        assert out_degree(u, g) == 0
        assert out_degree(v, g) == 0
        assert out_degree(w, g) == 0

        (e1, added) = add_edge(u, v, g)
        assert added
        assert source(e1, g) == u
        assert target(e1, g) == v
        assert set(edges(g)) == {e1}
        assert num_edges(g) == 1
        assert set(out_edges(u, g)) == {e1}
        assert set(out_edges(v, g)) == set() if is_directed(g) else {e1}
        assert set(out_edges(w, g)) == set()
        assert out_degree(u, g) == 1
        assert out_degree(v, g) == 0 if is_directed(g) else 1
        assert out_degree(w, g) == 0

        (e2, added) = add_edge(u, v, g)
        assert added
        assert source(e2, g) == u
        assert target(e2, g) == v
        assert set(edges(g)) == {e1, e2}
        assert num_edges(g) == 2
        assert set(out_edges(u, g)) == {e1, e2}
        assert set(out_edges(v, g)) == set() if is_directed(g) else {e1, e2}
        assert set(out_edges(w, g)) == set()
        assert out_degree(u, g) == 2
        assert out_degree(v, g) == 0 if is_directed(g) else 2
        assert out_degree(w, g) == 0

        (e3, added) = add_edge(u, w, g)
        assert added
        assert source(e3, g) == u
        assert target(e3, g) == w
        assert set(edges(g)) == {e1, e2, e3}
        assert num_edges(g) == 3
        assert set(out_edges(u, g)) == {e1, e2, e3}
        assert set(out_edges(v, g)) == set() if is_directed(g) else {e1, e2}
        assert set(out_edges(w, g)) == set() if is_directed(g) else {e3}
        assert out_degree(u, g) == 3
        assert out_degree(v, g) == 0 if is_directed(g) else 2
        assert out_degree(w, g) == 0 if is_directed(g) else 1
        assert num_vertices(g) == 3

        remove_edge(e2, g)
        assert num_edges(g) == 2
        assert set(edges(g)) == {e1, e3}
        assert out_degree(u, g) == 2
def test_graph_remove_vertex():
    for G in [DirectedGraph, UndirectedGraph]:
        g = G(3)
        (e1, _) = add_edge(0, 1, g)
        (e2, _) = add_edge(0, 1, g)
        (e3, _) = add_edge(0, 2, g)
        (e4, _) = add_edge(0, 2, g)
        (e5, _) = add_edge(1, 2, g)
        (e6, _) = add_edge(2, 2, g)
        assert num_vertices(g) == 3
        assert set(vertices(g)) == {0, 1, 2}
        assert num_edges(g) == 6
        assert set(edges(g)) == {e1, e2, e3, e4, e5, e6}

        remove_vertex(1, g)
        assert num_vertices(g) == 2
        assert set(vertices(g)) == {0, 2}
        assert num_edges(g) == 3
        assert set(edges(g)) == {e3, e4, e6}

        remove_vertex(2, g)
        assert num_vertices(g) == 1
        assert set(vertices(g)) == {0}
        assert num_edges(g) == 0
        assert set(edges(g)) == set()
def test_graph_is_directed():
    for G in [DirectedGraph, UndirectedGraph]:
        g = G()
        assert is_directed(g) == (G is DirectedGraph)
def test_graph_graphviz():
    for G in [DirectedGraph, UndirectedGraph]:
        g = G(3)
        (e1, _) = add_edge(0, 1, g)
        (e2, _) = add_edge(0, 1, g)
        (e3, _) = add_edge(0, 2, g)
        (e4, _) = add_edge(0, 2, g)
        (e5, _) = add_edge(1, 2, g)
        svg = graph_to_html(g)
# ---- 16_1.py | yunjung-lee/class_python_numpy | MIT ----

import numpy as np
import pandas as pd
from pandas import DataFrame, Series
import matplotlib.pyplot as plt
num = np.array(['3.14','-2.7','30'], dtype=np.string_)  # dtype=np.string_ written out to make the code easier to read
# num=num.astype(int)
# print(num)
# ValueError: invalid literal for int() with base 10: '3.14'
num=num.astype(float).astype(int)
print(num)
# [ 3 -2 30] : if a direct cast to int fails, cast to float first and then to int
num=num.astype(float)
print(num)
# [ 3.14 -2.7 30. ]
arr=np.arange(32).reshape((8,4))
print(arr)
# [[ 0 1 2 3]
# [ 4 5 6 7]
# [ 8 9 10 11]
# [12 13 14 15]
# [16 17 18 19]
# [20 21 22 23]
# [24 25 26 27]
# [28 29 30 31]]
print(arr[[1,5,7,2],[0,3,1,2]])  # fancy indexing [[rows],[cols]] ==> picks the element at each (row, col) pair
# [ 4 23 29 10]
print(arr[[1,5,7,2]][:,[0,3,1,2]])  # [[rows]][:,[cols]] means chained selection ==> rows at indices 1,5,7,2, then reorder the columns
# [[ 4 7 5 6]
# [20 23 21 22]
# [28 31 29 30]
# [ 8 11 9 10]]
print(arr[[1,5,7,2]][:,[3,1]])  # [[rows]][:,[cols]] ==> from those rows, keep only the columns at indices 3 and 1
# [[ 7 5]
# [23 21]
# [31 29]
# [11 9]]
import random
walk = []
position =0
steps=1000
for i in range(steps):
    step = 1 if random.randint(0, 1) else -1  # randint, randn, rannormal
    position += step
    walk.append(position)
print("position : ",position)
# position : 18
print("walk : ",walk)
# walk : [-1, 0, 1, 0, -1, -2, -1, -....]
print(min(walk))
# -7
print(max(walk))
# 28
# print(abs(walk))  # abs : absolute-value conversion
obj = Series([1,2,-3,4])
print(obj)
# 0 1
# 1 2
# 2 -3
# 3 4
# dtype: int64
print(obj.values)  # values : extract only the values (attribute)
# [ 1 2 -3 4]
print(obj.index)  # index : extract the index
# RangeIndex(start=0, stop=4, step=1)
# assigning an index
obj = Series([1,2,-3,4], index=['x','y','z','k'])  # give index labels directly
print(obj)
# printed with the assigned index
# x 1
# y 2
# z -3
# k 4
# dtype: int64
print(obj['y'])
# 2
obj['x']=10
print(obj)
# x 10
# y 2
# z -3
# k 4
# dtype: int64
# how to reference multiple entries
# print(obj['x','y'])
# # KeyError: ('x', 'y')
print(obj[['x','y','z']])  # use [] to reference one index, [[]] for two or more
# x 10
# y 2
# z -3
# dtype: int64
print('='*50)
print(obj>0)  # conditional expressions work
# x True
# y True
# z False
# k True
# dtype: bool
print(obj[obj>0])  # and can be used to filter
# x 10
# y 2
# k 4
# dtype: int64
print(obj*2)  # arithmetic works element-wise
# x 20
# y 4
# z -6
# k 8
# dtype: int64
print(np.exp(obj))  # exponentiation
# x 22026.465795
# y 7.389056
# z 0.049787
# k 54.598150
# dtype: float64
# null (uninitialized state), na (missing value)
print(obj)
print('a' in obj)  # in : check whether a given label exists
print('x' in obj)  # columns : features, rows : observations
print('='*50)
# key & value -> Series -> index & value conversion (key => index, value => value)
sdata = {'Ohio': 35000, 'Texas': 71000, "Oregon":16000, "Utah":5000}
obj3 = Series(sdata)  # a dictionary can also be converted to a Series
print(obj3)
# Ohio 35000
# Texas 71000
# Oregon 16000
# Utah 5000
# dtype: int64
print(type(obj3))
# <class 'pandas.core.series.Series'>
states = ['California','Ohio','Oregon','Texas']
obj99 = Series(states)  # convert a list to a Series
# print(obj99)
# # 0 California
# # 1 Ohio
# # 2 Oregon
# # 3 Texas
# # dtype: object
obj4 = Series(sdata, index=states)  # build a Series from sdata, indexed by states
print(obj4)
# California NaN
# Ohio 35000.0
# Oregon 16000.0
# Texas 71000.0
# dtype: float64
print(pd.isnull(obj4))
# California True
# Ohio False
# Oregon False
# Texas False
# dtype: bool
# in general, nan : something that is not a number (e.g. a string)
# na : a missing value, null : a value that was never initialized
# in pandas the terms are used interchangeably
# isnull : checks whether a value is na (null, nan)
print(obj4+obj3)  # only the indexes present in both Series get (non-NaN) values
obj4.name = 'population'
obj.index.name = 'state'
print(obj4)
# California NaN
# Ohio 35000.0
# Oregon 16000.0
# Texas 71000.0
# Name: population, dtype: float64
obj4.index = ['w','x','y','z']  # replace the index directly
print(obj4)
# w NaN
# x 35000.0
# y 16000.0
# z 71000.0
# Name: population, dtype: float64
data = {
    'state': ['Ohio', 'Ohio', 'Ohio', 'Nevada', 'Nevada'],
    'year': [2000, 2001, 2002, 2001, 2002],
    'pop': [1.5, 1.7, 3.6, 2.4, 2.9]}
frame = DataFrame(data)  # essentially a bundle of Series
print(frame)
# state year pop
# 0 Ohio 2000 1.5
# 1 Ohio 2001 1.7
# 2 Ohio 2002 3.6
# 3 Nevada 2001 2.4
# 4 Nevada 2002 2.9
print(DataFrame(data, columns=['year','state','pop']))  # change the column order (temporary)
# year state pop
# 0 2000 Ohio 1.5
# 1 2001 Ohio 1.7
# 2 2002 Ohio 3.6
# 3 2001 Nevada 2.4
# 4 2002 Nevada 2.9
frame = DataFrame(data, columns=['year','state','pop'])  # rebind frame to make the new order permanent
frame2= DataFrame(data, columns=['year','state','pop','debt'], index=['one','two','three','four','five'])
print(frame2)
# year state pop debt
# one 2000 Ohio 1.5 NaN
# two 2001 Ohio 1.7 NaN
# three 2002 Ohio 3.6 NaN
# four 2001 Nevada 2.4 NaN
# five 2002 Nevada 2.9 NaN
print(frame2['state'])  # print only the desired column
# one Ohio
# two Ohio
# three Ohio
# four Nevada
# five Nevada
# Name: state, dtype: object
print(frame2['year'])
# one 2000
# two 2001
# three 2002
# four 2001
# five 2002
# Name: year, dtype: int64
print(frame2.ix['three'])  # ix : reference a particular index (row) only
# extracting two or more columns or rows => use [[]]
# print(frame2[['year','state']])
#
# print(frame2.ix[['three','five']])
print(frame2)
frame2['debt']=16.5
print(frame2)
# year state pop debt
# one 2000 Ohio 1.5 16.5
# two 2001 Ohio 1.7 16.5
# three 2002 Ohio 3.6 16.5
# four 2001 Nevada 2.4 16.5
# five 2002 Nevada 2.9 16.5
# frame2['debt']=np.arange(3)
# print(frame2)
# # ValueError: Length of values does not match length of index
frame2['debt']=np.arange(5)
print(frame2)
# year state pop debt
# one 2000 Ohio 1.5 0
# two 2001 Ohio 1.7 1
# three 2002 Ohio 3.6 2
# four 2001 Nevada 2.4 3
# five 2002 Nevada 2.9 4
print('='*50)
val = Series([-1.2,-1.5,-1.7],index=['two','three','five'])
print(val)
# two -1.2
# three -1.5
# five -1.7
# dtype: float64
# to add a column whose length differs -> build a Series and assign it
frame2['debt'] = val  # values are assigned by index, so the lengths need not match
print(frame2)
# add a new column: True for Ohio, which is in the east, False for the rest (condition-based)
frame2['eastern'] = frame2.state == 'Ohio'
print(frame2)
# year state pop debt eastern
# one 2000 Ohio 1.5 NaN True
# two 2001 Ohio 1.7 -1.2 True
# three 2002 Ohio 3.6 -1.5 True
# four 2001 Nevada 2.4 NaN False
# five 2002 Nevada 2.9 -1.7 False
# removing a column
del frame2['eastern']
print(frame2)
# year state pop debt
# one 2000 Ohio 1.5 NaN
# two 2001 Ohio 1.7 -1.2
# three 2002 Ohio 3.6 -1.5
# four 2001 Nevada 2.4 NaN
# five 2002 Nevada 2.9 -1.7
print(frame2.columns)
# Index(['year', 'state', 'pop', 'debt'], dtype='object')
print(frame2.index)
# Index(['one', 'two', 'three', 'four', 'five'], dtype='object')
pop = {'Nevada' : {2001 : 2.4,2002:2.9},'Ohio' : {2000 : 1.5,2001:1.7,2002:3.6}}
frame3 = DataFrame(pop)
print(frame3)
# Nevada Ohio
# 2000 NaN 1.5
# 2001 2.4 1.7
# 2002 2.9 3.6
# swap rows and columns (transpose)
print(frame3.T)
# 2000 2001 2002
# Nevada NaN 2.4 2.9
# Ohio 1.5 1.7 3.6
# frame4 = DataFrame(pop, index=[2001,2002,2003])  # to specify an index, build from a DataFrame (a dict has no index)
# print(frame4)
# # AttributeError: 'list' object has no attribute 'astype'
frame4 = DataFrame(frame3,index=[2001,2002,2003])
print(frame4)
# Nevada Ohio
# 2001 2.4 1.7
# 2002 2.9 3.6
# 2003 NaN NaN
print(frame3)
# Nevada Ohio
# 2000 NaN 1.5
# 2001 2.4 1.7
# 2002 2.9 3.6
pdata = {'Ohio':frame3['Ohio'][:-1],'Nevada':frame3['Nevada'][:2]}  # [:-1] : drop the last row, [:2] : rows 0 and 1 only
frame5=DataFrame(pdata)
print(frame5)
# Ohio Nevada
# 2000 1.5 NaN
# 2001 1.7 2.4
pdata = {'Ohio':frame3['Ohio'][:-1],'Nevada':frame3['Nevada']}
# 'Nevada' selects every row, so 'Ohio' (cut with [:-1]) has no data for 2002 and gets NaN
frame5=DataFrame(pdata)
print(frame5)
# Ohio Nevada
# 2000 1.5 NaN
# 2001 1.7 2.4
# 2002 NaN 2.9
# ---- unittests/unintary_tests.py | OneCricketeer/pysqoop | MIT ----

import unittest
from pysqoop.SqoopImport import Sqoop
class TestStringMethods(unittest.TestCase):

    def test_empty_sqoop(self):
        try:
            Sqoop()
        except Exception as e:
            self.assertEqual(str(e), 'all parameters are empty')

    def test_properties_not_empty(self):
        try:
            Sqoop(fields_terminated_by='\"')
        except Exception as e:
            self.assertEqual(str(e), Sqoop._EMPTY_TABLE_AND_QUERY_PARAMETERS_EXCEPTION)
    def test_parameters_order(self):
        for iteration in range(0, 10000):
            sqoop = Sqoop(null_string='\'\'', fields_terminated_by='\"', table='prova')
            self.assertEqual(sqoop.command(), 'sqoop import --fields-terminated-by \" --null-string \'\' --table prova')
    def test_real_case(self):
        for iteration in range(0, 10000):
            expected = 'sqoop import -fs hdfs://remote-cluster:8020 --hive-drop-import-delims --fields-terminated-by \; --enclosed-by \'\"\' --escaped-by \\\\ --null-string \'\' --null-non-string \'\' --table sample_table --target-dir hdfs://remote-cluster/user/hive/warehouse/db/sample_table --delete-target-dir --connect jdbc:oracle:thin:@//your_ip:your_port/your_schema --username user --password pwd --num-mappers 2 --bindir /path/to/bindir/folder'
            sqoop = Sqoop(fs='hdfs://remote-cluster:8020', hive_drop_import_delims=True, fields_terminated_by='\;',
                          enclosed_by='\'"\'', escaped_by='\\\\', null_string='\'\'', null_non_string='\'\'',
                          table='sample_table',
                          target_dir='hdfs://remote-cluster/user/hive/warehouse/db/sample_table',
                          delete_target_dir=True, connect='jdbc:oracle:thin:@//your_ip:your_port/your_schema',
                          username='user', password='pwd', num_mappers=2,
                          bindir='/path/to/bindir/folder')
            self.assertEqual(expected, sqoop.command())
    def test_hbase_basic_import(self):
        expected = "sqoop import --table Rutas " \
                   "--connect 'jdbc:sqlserver://127.0.0.1:1433;DatabaseName=SQLDB;user=root;password=password' " \
                   "--incremental lastmodified --hbase-table Rutas --column-family Id_Ruta " \
                   "--hbase-row-key Id_Ruta -m 1"
        sqoop = Sqoop(
            connect="'jdbc:sqlserver://127.0.0.1:1433;DatabaseName=SQLDB;user=root;password=password'",
            table="Rutas",
            incremental="lastmodified",
            hbase_table="Rutas",
            hbase_row_key="Id_Ruta",
            column_family="Id_Ruta",
            m=1
        )
        self.assertEqual(expected, sqoop.command())
    def test_hbase_lazy_contruction(self):
        expected = "sqoop import --table Rutas " \
                   "--connect 'jdbc:sqlserver://127.0.0.1:1433;DatabaseName=SQLDB;user=root;password=password' " \
                   "--incremental lastmodified --hbase-table Rutas --column-family Id_Ruta " \
                   "--hbase-row-key Id_Ruta -m 1"
        sqoop = Sqoop()
        sqoop.set_param(param="--connect",
                        value="'jdbc:sqlserver://127.0.0.1:1433;DatabaseName=SQLDB;user=root;password=password'")
        sqoop.set_param(param="--table", value="Rutas")
        sqoop.set_param(param="--incremental", value="lastmodified")
        # sqoop.unset_param(param="--connect")
        sqoop.command()
        sqoop.set_param(param="--hbase-table", value="Rutas")
        sqoop.set_param(param="--column-family", value="Id_Ruta")
        sqoop.set_param(param="--hbase-row-key", value="Id_Ruta")
        sqoop.set_param(param="-m", value="1")
        self.assertEqual(expected, sqoop.command())
if __name__ == '__main__':
    unittest.main()
# ---- a4/decrypt/elliptic.py | fultonms/crypto | MIT ----

import argparse
parser = argparse.ArgumentParser(description="Decrypt a selection of text from a substitution cypher, with the provided key")
parser.add_argument('cryptFile', metavar='encrypted', type=str, help='Path to the encrypted text')
parser.add_argument('keyFile', metavar='key', type=str, help='Path to the key file')
args = parser.parse_args()
key = dict()