hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ec66790593d3924cae7b163c779dbbcaea214eeb | 19 | py | Python | src/__init__.py | KOLANICH/pymspack | e0af3e3427d1c48982a378d580fa6162fe75eba4 | [
"BSD-3-Clause"
] | 2 | 2016-12-31T20:45:15.000Z | 2020-02-20T07:56:12.000Z | src/__init__.py | KOLANICH/pymspack | e0af3e3427d1c48982a378d580fa6162fe75eba4 | [
"BSD-3-Clause"
] | 2 | 2016-12-31T20:45:24.000Z | 2018-04-14T15:02:25.000Z | src/__init__.py | KOLANICH/pymspack | e0af3e3427d1c48982a378d580fa6162fe75eba4 | [
"BSD-3-Clause"
] | 4 | 2017-01-04T20:16:56.000Z | 2020-02-20T07:56:14.000Z | from .ext import *
| 9.5 | 18 | 0.684211 | 3 | 19 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 19 | 1 | 19 | 19 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ec7fcc284113883918120f3b4967454905721488 | 49 | py | Python | function-server/function/handler.py | imikushin/python3-base-image | 6eb15cd0d9b0572dac9801dca171855c261be23f | [
"Apache-2.0"
] | 3 | 2018-04-27T00:07:01.000Z | 2018-06-15T21:39:21.000Z | function-server/function/handler.py | imikushin/python3-base-image | 6eb15cd0d9b0572dac9801dca171855c261be23f | [
"Apache-2.0"
] | 3 | 2018-04-20T23:36:16.000Z | 2018-05-01T17:34:31.000Z | function-server/function/handler.py | imikushin/python3-base-image | 6eb15cd0d9b0572dac9801dca171855c261be23f | [
"Apache-2.0"
] | 4 | 2018-04-03T21:57:06.000Z | 2018-07-16T22:40:40.000Z | def handle(context, payload):
return payload
| 16.333333 | 29 | 0.734694 | 6 | 49 | 6 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.183673 | 49 | 2 | 30 | 24.5 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
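The `handler.py` content in the row above is an identity handler; a minimal sketch of how a function server might invoke it (the caller and the sample payload here are illustrative, not part of any specific framework):

```python
# Identity handler from the row above: echoes the payload back unchanged.
def handle(context, payload):
    # context is accepted but unused, mirroring the original snippet
    return payload

# Illustrative invocation, as a function server might do after
# deserializing a request body:
result = handle(None, {"name": "world"})
print(result)  # -> {'name': 'world'}
```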
ec98503ca54294f23a80892dc7455f0d929c4269 | 7,589 | py | Python | figuras/PycharmKayStatisticalReport/lmmse_geometrical_interpretation_sequential.py | bor9/estudiando_el_kay | 6e07908b8b0b5a5166dadce30001e6100e8304c3 | [
"MIT"
] | null | null | null | figuras/PycharmKayStatisticalReport/lmmse_geometrical_interpretation_sequential.py | bor9/estudiando_el_kay | 6e07908b8b0b5a5166dadce30001e6100e8304c3 | [
"MIT"
] | null | null | null | figuras/PycharmKayStatisticalReport/lmmse_geometrical_interpretation_sequential.py | bor9/estudiando_el_kay | 6e07908b8b0b5a5166dadce30001e6100e8304c3 | [
"MIT"
] | 1 | 2021-11-02T05:27:27.000Z | 2021-11-02T05:27:27.000Z | import matplotlib.pyplot as plt
from matplotlib import rc
from matplotlib import rcParams
__author__ = 'ernesto'
# whether to use LaTeX or mathtext
rc('text', usetex=True)
rcParams['text.latex.preamble'] = r"\usepackage{amsmath}"
# colors
lgray = "#dddddd"  # light gray
# range of x and y axis
xmin_ax = -1
xmax_ax = 2
ymin_ax = -0.75
ymax_ax = 1
# font size
fontsize = 13
# arrow head length and width
hl = 8
hw = 5
hl_ax = 6
hw_ax = 3
fig = plt.figure(0, figsize=(9, 2.5), frameon=False)
ax = plt.subplot2grid((1, 12), (0, 0), rowspan=1, colspan=4)
plt.xlim(xmin_ax, xmax_ax)
plt.ylim(ymin_ax, ymax_ax)
# x axis
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(xmax_ax, 0), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
# z axis
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(0, ymax_ax), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
# y axis
y_e = -0.6
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(xmin_ax, y_e), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
# slope of the y axis
m = (0 - y_e) / (0 - xmin_ax)
A_x = 0.4
A_y = 0.8
# A vector
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(A_x, A_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# x[1] vector
x1_x = 1.3
x1_y = -0.3
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x1_x, x1_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# x[0] vector
x0_x = -0.8
x0_y = m * x0_x
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x0_x, x0_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# hat A
hA_x = A_x
hA_y = -0.3
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(hA_x, hA_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# projections
x0 = hA_x
y0 = hA_y
y1 = 0
x1 = x0 - y0 / m
plt.plot([x0, x1], [y0, y1], 'k--', lw=1)
plt.plot([x0, x0-x1], [y0, y0], 'k--', lw=1)
# hat A[0]
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x0-x1, y0), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# error
plt.plot([hA_x, A_x], [hA_y, A_y], 'k--', lw=1)
# labels
plt.text(x1_x+0.1, x1_y-0.1, r'$x[1]$', fontsize=fontsize, ha='left', va='bottom')
plt.text(x0_x + 0.15, x0_y - 0.05, r'$x[0]$', fontsize=fontsize, ha='left', va='center')
plt.text(A_x + 0.05, A_y, r'$A$', fontsize=fontsize, ha='left', va='bottom')
plt.text(hA_x + 0.02, hA_y - 0.15, r'$\hat{A}[1]$', fontsize=fontsize, ha='left', va='center')
plt.text(x0-x1 - 0.08, y0, r'$\hat{A}[0]$', fontsize=fontsize, ha='right', va='bottom')
plt.axis('off')
###############################################################
ax = plt.subplot2grid((1, 12), (0, 4), rowspan=1, colspan=4)
plt.xlim(xmin_ax, xmax_ax)
plt.ylim(ymin_ax, ymax_ax)
# x axis
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(xmax_ax, 0), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
# z axis
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(0, ymax_ax), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(xmin_ax, y_e), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
# A vector
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(A_x, A_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# hat A
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(hA_x, hA_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# projections
plt.plot([x0, x1], [y0, y1], 'k--', lw=1)
plt.plot([x0, x0-x1], [y0, y0], 'k--', lw=1)
# hat A[0]
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x0-x1, y0), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# Delta hat A[0]
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x1, y1), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# error
plt.plot([hA_x, A_x], [hA_y, A_y], 'k--', lw=1)
# labels
plt.text(A_x + 0.05, A_y, r'$A$', fontsize=fontsize, ha='left', va='bottom')
plt.text(hA_x + 0.02, hA_y - 0.15, r'$\hat{A}[1]$', fontsize=fontsize, ha='left', va='center')
plt.text(x0-x1 - 0.08, y0, r'$\hat{A}[0]$', fontsize=fontsize, ha='right', va='bottom')
plt.text(x1, y1+0.1, r'$\Delta\hat{A}[1]$', fontsize=fontsize, ha='center', va='baseline')
plt.axis('off')
###############################################################
ax = plt.subplot2grid((1, 12), (0, 8), rowspan=1, colspan=4)
plt.xlim(xmin_ax, xmax_ax)
plt.ylim(ymin_ax, ymax_ax)
# x axis
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(xmax_ax, 0), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
# z axis
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(0, ymax_ax), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
# y axis
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(xmin_ax, y_e), textcoords='data',
arrowprops=dict(width=0.01, headwidth=hw_ax, headlength=hl_ax, facecolor='black', shrink=0.002))
# Delta hat A[0]
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x1, y1), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# x[0] vector
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x0_x, x0_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# x[1] vector
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x1_x, x1_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# projection of x[1]
y2 = 0
x2 = (y2 - (x1_y - m * x1_x)) / m
plt.annotate("", xytext=(x2, y2), xycoords='data', xy=(x1_x, x1_y), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
plt.annotate("", xytext=(0, 0), xycoords='data', xy=(x2, y2), textcoords='data',
arrowprops=dict(width=1, headwidth=hw, headlength=hl, facecolor='black', shrink=0.002))
# labels
plt.text(x1, y1+0.1, r'$\Delta\hat{A}[1]$', fontsize=fontsize, ha='right', va='baseline')
plt.text(x0_x + 0.15, x0_y - 0.05, r'$x[0]$', fontsize=fontsize, ha='left', va='center')
plt.text(x1_x-0.15, x1_y-0.2, r'$x[1]$', fontsize=fontsize, ha='right', va='bottom')
plt.text(x1_x+0.2, x1_y-0.1, r'$\hat{x}[1|0]$', fontsize=fontsize, ha='left', va='bottom')
plt.text(x2, y2+0.1, r'$x[1]-\hat{x}[1|0]$', fontsize=fontsize, ha='center', va='baseline')
plt.axis('off')
# save as pdf image
plt.savefig('lmmse_geometrical_interpretation_sequential.pdf', bbox_inches='tight')
plt.show()
| 36.485577 | 109 | 0.629464 | 1,264 | 7,589 | 3.686709 | 0.10443 | 0.009871 | 0.083906 | 0.138197 | 0.873605 | 0.868026 | 0.859227 | 0.856652 | 0.842275 | 0.808155 | 0 | 0.062194 | 0.139808 | 7,589 | 207 | 110 | 36.661836 | 0.651654 | 0.053498 | 0 | 0.59292 | 0 | 0 | 0.104873 | 0.006697 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.026549 | 0 | 0.026549 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
01b0e0b6da2e60450cc255371fbfa823c37e9f1d | 137 | py | Python | katas/kyu_7/discover_the_original_price.py | the-zebulan/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | 40 | 2016-03-09T12:26:20.000Z | 2022-03-23T08:44:51.000Z | katas/kyu_7/discover_the_original_price.py | akalynych/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | null | null | null | katas/kyu_7/discover_the_original_price.py | akalynych/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | 36 | 2016-11-07T19:59:58.000Z | 2022-03-31T11:18:27.000Z | def discover_original_price(discounted_price, sale_percentage):
return round(discounted_price / ((100 - sale_percentage) * 0.01), 2)
| 45.666667 | 72 | 0.773723 | 18 | 137 | 5.555556 | 0.722222 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057851 | 0.116788 | 137 | 2 | 73 | 68.5 | 0.768595 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
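The kata in the row above inverts a percentage discount; a quick standalone check of the arithmetic (the sample values are illustrative):

```python
def discover_original_price(discounted_price, sale_percentage):
    # original = discounted / ((100 - pct) * 0.01), rounded to 2 decimals
    return round(discounted_price / ((100 - sale_percentage) * 0.01), 2)

# A 25% discount bringing the price down to 75 implies an original of 100:
print(discover_original_price(75, 25))  # -> 100.0
```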
01c7e5bf4ec4ed3dd6e83a22e3a2453b0457dfb5 | 165 | py | Python | application/control.py | Jhoneagle/TilastointiOhjelma | 283178202fb1ddfe6d3875417274171b2a10c723 | [
"Apache-2.0"
] | null | null | null | application/control.py | Jhoneagle/TilastointiOhjelma | 283178202fb1ddfe6d3875417274171b2a10c723 | [
"Apache-2.0"
] | 2 | 2018-04-27T17:43:39.000Z | 2018-12-10T09:31:35.000Z | application/control.py | Jhoneagle/TilastointiOhjelma | 283178202fb1ddfe6d3875417274171b2a10c723 | [
"Apache-2.0"
] | null | null | null | """ routes """
from flask import render_template
from application import app
@app.route('/')
def home():
return render_template("index.html", title='Pääsivu')
| 18.333333 | 57 | 0.709091 | 21 | 165 | 5.47619 | 0.761905 | 0.243478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139394 | 165 | 8 | 58 | 20.625 | 0.809859 | 0.036364 | 0 | 0 | 0 | 0 | 0.119205 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 6 |
bf260ef1301ec0e3da57e2e70a61d84ce51a0951 | 42 | py | Python | wapy-lib/pythons/aio/cpy/websocket.py | pmp-p/wapy-pack | dff596e126272ad9684f92a9bc5318b7e53b350f | [
"MIT"
] | null | null | null | wapy-lib/pythons/aio/cpy/websocket.py | pmp-p/wapy-pack | dff596e126272ad9684f92a9bc5318b7e53b350f | [
"MIT"
] | null | null | null | wapy-lib/pythons/aio/cpy/websocket.py | pmp-p/wapy-pack | dff596e126272ad9684f92a9bc5318b7e53b350f | [
"MIT"
] | null | null | null | from lomond import WebSocket as websocket
| 21 | 41 | 0.857143 | 6 | 42 | 6 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 42 | 1 | 42 | 42 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
174ea0a3a2a8e34cfa72137758d68c4a74b5e4c7 | 109 | py | Python | discovermetal/modules/modules.py | packethost/discover-metal | b213a9ec455dbc85f291d5941b360f2d0564b58a | [
"Apache-2.0"
] | null | null | null | discovermetal/modules/modules.py | packethost/discover-metal | b213a9ec455dbc85f291d5941b360f2d0564b58a | [
"Apache-2.0"
] | null | null | null | discovermetal/modules/modules.py | packethost/discover-metal | b213a9ec455dbc85f291d5941b360f2d0564b58a | [
"Apache-2.0"
] | null | null | null | class DiscoverModule(object):
pass
def discovery_modules():
return DiscoverModule.__subclasses__()
| 15.571429 | 42 | 0.761468 | 10 | 109 | 7.8 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155963 | 109 | 6 | 43 | 18.166667 | 0.847826 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0.25 | 0 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 6 |
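The `modules.py` row above discovers plugins via `__subclasses__()`; a self-contained sketch of the same pattern (the `DiskDiscover` subclass is a hypothetical example, not from the repository):

```python
class DiscoverModule(object):
    pass

def discovery_modules():
    # Every direct subclass of DiscoverModule is discovered automatically.
    return DiscoverModule.__subclasses__()

# Hypothetical plugin: merely defining the subclass registers it.
class DiskDiscover(DiscoverModule):
    pass

print([cls.__name__ for cls in discovery_modules()])  # -> ['DiskDiscover']
```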
bda9dd412b488f486e505a46462b817c7787e8a7 | 10,610 | py | Python | tests/test_auto_tablename.py | ramuta/sqla-wrapper | 65bb7fe94fdfd0f2709d96b6e9d93e88e9a69970 | [
"X11"
] | null | null | null | tests/test_auto_tablename.py | ramuta/sqla-wrapper | 65bb7fe94fdfd0f2709d96b6e9d93e88e9a69970 | [
"X11"
] | null | null | null | tests/test_auto_tablename.py | ramuta/sqla-wrapper | 65bb7fe94fdfd0f2709d96b6e9d93e88e9a69970 | [
"X11"
] | null | null | null | from sqlalchemy.ext.declarative import AbstractConcreteBase, declared_attr
from sqla_wrapper import SQLAlchemy
def test_jti_custom_tablename():
"""Test Joined Table Inheritance with a custom table name.
"""
db = SQLAlchemy('sqlite://')
class Person(db.Model):
__tablename__ = 'jti_custom_people'
id = db.Column(db.Integer, primary_key=True)
discriminator = db.Column('type', db.String(50))
__mapper_args__ = {'polymorphic_on': discriminator}
class Engineer(Person):
__tablename__ = 'jti_custom_engineers'
__mapper_args__ = {'polymorphic_identity': 'engineer'}
id = db.Column(db.Integer, db.ForeignKey(Person.id), primary_key=True)
primary_language = db.Column(db.String(50))
class Teacher(Person):
__tablename__ = 'jti_custom_teachers'
__mapper_args__ = {'polymorphic_identity': 'teacher'}
id = db.Column(db.Integer, db.ForeignKey(Person.id), primary_key=True)
primary_language = db.Column(db.String(50))
assert Person.__tablename__ == 'jti_custom_people'
assert Engineer.__tablename__ == 'jti_custom_engineers'
assert Teacher.__tablename__ == 'jti_custom_teachers'
db.expunge_all()
def test_jti_auto_tablename():
"""Test Joined Table Inheritance with an autonatically
asigned table name.
"""
db = SQLAlchemy('sqlite://')
class JaPerson(db.Model):
id = db.Column(db.Integer, primary_key=True)
discriminator = db.Column('type', db.String(50))
__mapper_args__ = {'polymorphic_on': discriminator}
class JaEngineer(JaPerson):
__mapper_args__ = {'polymorphic_identity': 'engineer'}
id = db.Column(db.Integer, db.ForeignKey(JaPerson.id), primary_key=True)
primary_language = db.Column(db.String(50))
class JaTeacher(JaPerson):
__tablename__ = 'jti_auto_teachers'
__mapper_args__ = {'polymorphic_identity': 'teacher'}
id = db.Column(db.Integer, db.ForeignKey(JaPerson.id), primary_key=True)
primary_language = db.Column(db.String(50))
assert JaPerson.__tablename__ == 'ja_people'
assert JaEngineer.__tablename__ == 'ja_engineers'
assert JaTeacher.__tablename__ == 'jti_auto_teachers'
db.expunge_all()
def test_sti_custom_tablename():
"""Test Single Table Inheritance with a custom table name.
"""
db = SQLAlchemy('sqlite://')
class Employee(db.Model):
__tablename__ = 'sti_custom_employee'
id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50))
manager_data = db.Column(db.String(50))
engineer_info = db.Column(db.String(50))
type = db.Column(db.String(20))
__mapper_args__ = {
'polymorphic_on': type,
'polymorphic_identity': 'employee'
}
class Manager(Employee):
__mapper_args__ = {
'polymorphic_identity': 'manager'
}
class Engineer(Employee):
__mapper_args__ = {
'polymorphic_identity': 'engineer'
}
assert Employee.__tablename__ == 'sti_custom_employee'
assert Manager.__tablename__ == 'sti_custom_employee'
assert Engineer.__tablename__ == 'sti_custom_employee'
db.expunge_all()
def test_sti_auto_tablename():
"""Test Single Table Inheritance with an autonatically
asigned table name.
"""
db = SQLAlchemy('sqlite://')
class SaEmployee(db.Model):
id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50))
manager_data = db.Column(db.String(50))
engineer_info = db.Column(db.String(50))
type = db.Column(db.String(20))
__mapper_args__ = {
'polymorphic_on': type,
'polymorphic_identity': 'employee'
}
class SaManager(SaEmployee):
__mapper_args__ = {
'polymorphic_identity': 'manager'
}
class SaEngineer(SaEmployee):
__mapper_args__ = {
'polymorphic_identity': 'engineer'
}
assert SaEmployee.__tablename__ == 'sa_employees'
assert SaManager.__tablename__ == 'sa_employees'
assert SaEngineer.__tablename__ == 'sa_employees'
db.expunge_all()
def test_cti_custom_tablename():
"""Test Concrete Table Inheritance with a custom table name.
"""
db = SQLAlchemy('sqlite://')
class Person(db.Model):
__tablename__ = 'cti_custom_people'
id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50))
class Engineer(Person):
__tablename__ = 'cti_custom_engineers'
__mapper_args__ = {'concrete': True}
id = db.Column(db.Integer, primary_key=True)
primary_language = db.Column(db.String(50))
name = db.Column(db.String(50))
assert Person.__tablename__ == 'cti_custom_people'
assert Engineer.__tablename__ == 'cti_custom_engineers'
db.expunge_all()
def test_cti_auto_tablename():
"""Test Concrete Table Inheritance with an autonatically
asigned table name.
"""
db = SQLAlchemy('sqlite://')
class Person(db.Model):
id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50))
class Engineer(Person):
__mapper_args__ = {'concrete': True}
id = db.Column(db.Integer, primary_key=True)
primary_language = db.Column(db.String(50))
name = db.Column(db.String(50))
class Teacher(Person):
__tablename__ = 'cti_auto_teachers'
__mapper_args__ = {'concrete': True}
id = db.Column(db.Integer, primary_key=True)
primary_language = db.Column(db.String(50))
name = db.Column(db.String(50))
assert Person.__tablename__ == 'people'
assert Engineer.__tablename__ == 'engineers'
assert Teacher.__tablename__ == 'cti_auto_teachers'
db.expunge_all()
def test_acti_custom_tablename():
"""Test Abstract Concrete Table Inheritance with a custom table name.
"""
db = SQLAlchemy('sqlite://')
class Employee(AbstractConcreteBase, db.Model):
pass
class Manager(Employee):
__tablename__ = 'acti_custom_managers'
employee_id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50))
manager_data = db.Column(db.String(40))
__mapper_args__ = {
'polymorphic_identity': 'manager',
'concrete': True
}
class Engineer(Employee):
__tablename__ = 'acti_custom_engineers'
employee_id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50))
engineer_info = db.Column(db.String(40))
__mapper_args__ = {
'polymorphic_identity': 'engineer',
'concrete': True
}
assert Manager.__tablename__ == 'acti_custom_managers'
assert Engineer.__tablename__ == 'acti_custom_engineers'
db.expunge_all()
def test_acti_auto_tablename():
"""Test Abstract Concrete Table Inheritance with an autonatically
asigned table name.
"""
db = SQLAlchemy('sqlite://')
class Employee(AbstractConcreteBase, db.Model):
pass
class Manager(Employee):
__tablename__ = 'acti_auto_managers'
employee_id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50))
manager_data = db.Column(db.String(40))
__mapper_args__ = {
'polymorphic_identity': 'manager',
'concrete': True
}
class AaEngineer(Employee):
employee_id = db.Column(db.Integer, primary_key=True)
name = db.Column(db.String(50))
engineer_info = db.Column(db.String(40))
__mapper_args__ = {
'polymorphic_identity': 'engineer',
'concrete': True
}
assert Manager.__tablename__ == 'acti_auto_managers'
assert AaEngineer.__tablename__ == 'aa_engineers'
db.expunge_all()
def test_mixin_tablename():
"""Test for a tablename defined in a mixin.
"""
db = SQLAlchemy('sqlite://')
class EmployeeMixin(object):
__tablename__ = 'mixin_tablename'
@declared_attr
def id(cls):
return db.Column(db.Integer, primary_key=True)
class Engineer(EmployeeMixin, db.Model):
name = db.Column(db.String(50))
assert Engineer.__tablename__ == 'mixin_tablename'
db.expunge_all()
def test_mixin_no_tablename():
"""Test for a tablename NOT defined in a mixin.
"""
db = SQLAlchemy('sqlite://')
class BaseMixin(object):
@declared_attr
def id(cls):
return db.Column(db.Integer, primary_key=True)
class MEngineer(BaseMixin, db.Model):
name = db.Column(db.String(50))
assert MEngineer.__tablename__ == 'm_engineers'
db.expunge_all()
def test_mixin_overwritten_tablename():
"""Test for a tablename defined in a mixin but overwritten.
"""
db = SQLAlchemy('sqlite://')
class EmployeeMixin(object):
__tablename__ = 'mixin_tablename'
@declared_attr
def id(cls):
return db.Column(db.Integer, primary_key=True)
class Engineer(EmployeeMixin, db.Model):
__tablename__ = 'mixin_overwritten_tablename'
name = db.Column(db.String(50))
assert Engineer.__tablename__ == 'mixin_overwritten_tablename'
db.expunge_all()
def test_declared_attr_mixin_tablename():
"""Test for a tablename defined as a @declared_attr in a mixin.
"""
db = SQLAlchemy('sqlite://')
class EmployeeMixin(object):
@declared_attr
def __tablename__(cls):
return 'declared_attr_mixin_tablename'
@declared_attr
def id(cls):
return db.Column(db.Integer, primary_key=True)
class Engineer(EmployeeMixin, db.Model):
name = db.Column(db.String(50))
assert Engineer.__tablename__ == 'declared_attr_mixin_tablename'
db.expunge_all()
def test_declared_attr_mixin_overwritten_tablename():
"""Test for a tablename defined as a @declared_attr in a mixin but overwritten
"""
db = SQLAlchemy('sqlite://')
class EmployeeMixin(object):
@declared_attr
def __tablename__(cls):
return 'declared_attr_mixin_tablename'
@declared_attr
def id(cls):
return db.Column(db.Integer, primary_key=True)
class Engineer(EmployeeMixin, db.Model):
__tablename__ = 'declared_attr_mixin_overwritten_engineers'
name = db.Column(db.String(50))
assert Engineer.__tablename__ == 'declared_attr_mixin_overwritten_engineers'
db.expunge_all()
| 30.66474 | 82 | 0.653063 | 1,191 | 10,610 | 5.423174 | 0.080605 | 0.070599 | 0.085153 | 0.081746 | 0.846416 | 0.816071 | 0.75151 | 0.711101 | 0.698715 | 0.65939 | 0 | 0.008623 | 0.234873 | 10,610 | 345 | 83 | 30.753623 | 0.787017 | 0.080773 | 0 | 0.685106 | 0 | 0 | 0.148663 | 0.027473 | 0 | 0 | 0 | 0 | 0.110638 | 1 | 0.085106 | false | 0.008511 | 0.008511 | 0.029787 | 0.629787 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
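The tests above rely on sqla-wrapper deriving a table name from the class name (CamelCase to snake_case, pluralized, e.g. `JaPerson` → `ja_people`). A rough standalone sketch of that convention — the regex and the toy pluralizer are illustrative, not sqla-wrapper's actual implementation:

```python
import re

def camel_to_snake(name):
    # 'JaPerson' -> 'ja_person': insert '_' before each interior capital
    return re.sub(r'(?<!^)(?=[A-Z])', '_', name).lower()

def naive_plural(word):
    # Toy pluralizer, just enough for the class names in the tests above
    if word.endswith('person'):
        return word[:-6] + 'people'
    return word + 's'

def auto_tablename(classname):
    return naive_plural(camel_to_snake(classname))

print(auto_tablename('JaPerson'))    # -> ja_people
print(auto_tablename('SaEmployee'))  # -> sa_employees
```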
da016bef1bf421962d4008372aa69189380b3b54 | 8,960 | py | Python | portfolio/position_test.py | efecarranza/fxCarranza | ecca346d0f078d6f812926ac6781b0059942350b | [
"Unlicense",
"MIT"
] | null | null | null | portfolio/position_test.py | efecarranza/fxCarranza | ecca346d0f078d6f812926ac6781b0059942350b | [
"Unlicense",
"MIT"
] | 1 | 2021-06-01T21:50:11.000Z | 2021-06-01T21:50:11.000Z | portfolio/position_test.py | efecarranza/fxCarranza | ecca346d0f078d6f812926ac6781b0059942350b | [
"Unlicense",
"MIT"
] | null | null | null | from decimal import Decimal
import unittest
from position import Position
class TickerMock(object):
"""
A mock object that stands in for the
ticker/pricing handler.
"""
def __init__(self):
self.pairs = ["GBPUSD", "EURUSD"]
self.prices = {
"GBPUSD": {"bid": Decimal("1.50328"), "ask": Decimal("1.50349")},
"EURUSD": {"bid": Decimal("1.07832"), "ask": Decimal("1.07847")},
"USDCAD": {"bid": Decimal("1.37200"), "ask": Decimal("1.37220")},
}
# =====================================
# USD Home Currency with GBP/USD traded
# =====================================
class TestLongGBPUSDPosition(unittest.TestCase):
"""
Unit tests that cover going long GBP/USD with an account
denominated in USD, using 2,000 units of GBP/USD.
"""
def setUp(self):
home_currency = "USD"
position_type = "long"
currency_pair = "GBPUSD"
units = Decimal("2000")
ticker = TickerMock()
take_profit = "1.50649"
stop_loss = "1.50049"
self.position = Position(
home_currency, position_type,
currency_pair, units, ticker,
take_profit, stop_loss
)
def test_calculate_init_pips(self):
pos_pips = self.position.calculate_pips()
self.assertEqual(pos_pips, Decimal("-2.100"))
def test_calculate_init_profit_base(self):
profit_base = self.position.calculate_profit_base()
self.assertEqual(profit_base, Decimal("-0.42000"))
def test_calculate_init_profit_perc(self):
profit_perc = self.position.calculate_profit_perc()
self.assertEqual(profit_perc, Decimal("-0.02100"))
def test_calculate_updated_values(self):
"""
Check that after the bid/ask prices move, that the updated
pips, profit and percentage profit calculations are correct.
"""
prices = self.position.ticker.prices
prices["GBPUSD"] = {"bid": Decimal("1.50486"), "ask": Decimal("1.50586")}
self.position.update_position_price()
# Check pips
pos_pips = self.position.calculate_pips()
self.assertEqual(pos_pips, Decimal("13.7000"))
# Check profit base
profit_base = self.position.calculate_profit_base()
self.assertEqual(profit_base, Decimal("2.74000"))
# Check profit percentage
profit_perc = self.position.calculate_profit_perc()
self.assertEqual(profit_perc, Decimal("0.1370"))
class TestShortGBPUSDPosition(unittest.TestCase):
"""
Unit tests that cover going short GBP/USD with an account
denominated currency of GBP, using 2,000 units of GBP/USD.
"""
def setUp(self):
home_currency = "USD"
position_type = "short"
currency_pair = "GBPUSD"
units = Decimal("2000")
ticker = TickerMock()
take_profit = "1.50049"
stop_loss = "1.50649"
self.position = Position(
home_currency, position_type,
currency_pair, units, ticker,
take_profit, stop_loss
)
def test_calculate_init_pips(self):
pos_pips = self.position.calculate_pips()
self.assertEqual(pos_pips, Decimal("-2.1000"))
def test_calculate_init_profit_base(self):
profit_base = self.position.calculate_profit_base()
self.assertEqual(profit_base, Decimal("-0.4200"))
def test_calculate_init_profit_perc(self):
profit_perc = self.position.calculate_profit_perc()
self.assertEqual(profit_perc, Decimal("-0.02100"))
def test_calculate_updated_values(self):
"""
Check that after the bid/ask prices move, that the updated
pips, profit and percentage profit calculations are correct.
"""
prices = self.position.ticker.prices
prices["GBPUSD"] = {"bid": Decimal("1.50486"), "ask": Decimal("1.50586")}
self.position.update_position_price()
# Check pips
pos_pips = self.position.calculate_pips()
self.assertEqual(pos_pips, Decimal("-25.80000"))
# Check profit base
profit_base = self.position.calculate_profit_base()
self.assertEqual(profit_base, Decimal("-5.16000"))
# Check profit percentage
profit_perc = self.position.calculate_profit_perc()
self.assertEqual(profit_perc, Decimal("-0.25800"))


# =====================================
# USD Home Currency with EUR/USD traded
# =====================================
class TestLongEURUSDPosition(unittest.TestCase):
    """
    Unit tests that cover going long EUR/USD with an account
    denominated currency of USD, using 2,000 units of EUR/USD.
    """
    def setUp(self):
        home_currency = "USD"
        position_type = "long"
        currency_pair = "EURUSD"
        units = Decimal("2000")
        ticker = TickerMock()
        take_profit = "1.08147"
        stop_loss = "1.07532"
        self.position = Position(
            home_currency, position_type,
            currency_pair, units, ticker,
            take_profit, stop_loss
        )

    def test_calculate_init_pips(self):
        pos_pips = self.position.calculate_pips()
        self.assertEqual(pos_pips, Decimal("-1.5000"))

    def test_calculate_init_profit_base(self):
        profit_base = self.position.calculate_profit_base()
        self.assertEqual(profit_base, Decimal("-0.30000"))

    def test_calculate_init_profit_perc(self):
        profit_perc = self.position.calculate_profit_perc()
        self.assertEqual(profit_perc, Decimal("-0.01500"))

    def test_calculate_updated_values(self):
        """
        Check that after the bid/ask prices move, the updated
        pips, profit and percentage profit calculations are correct.
        """
        prices = self.position.ticker.prices
        prices["EURUSD"] = {"bid": Decimal("1.07811"), "ask": Decimal("1.07827")}
        self.position.update_position_price()
        # Check pips
        pos_pips = self.position.calculate_pips()
        self.assertEqual(pos_pips, Decimal("-3.60000"))
        # Check profit base
        profit_base = self.position.calculate_profit_base()
        self.assertEqual(profit_base, Decimal("-0.72000"))
        # Check profit percentage
        profit_perc = self.position.calculate_profit_perc()
        self.assertEqual(profit_perc, Decimal("-0.03600"))


class TestShortEURUSDPosition(unittest.TestCase):
    """
    Unit tests that cover going short EUR/USD with an account
    denominated currency of USD, using 2,000 units of EUR/USD.
    """
    def setUp(self):
        home_currency = "USD"
        position_type = "short"
        currency_pair = "EURUSD"
        units = Decimal("2000")
        ticker = TickerMock()
        take_profit = "1.07532"
        stop_loss = "1.08147"
        self.position = Position(
            home_currency, position_type,
            currency_pair, units, ticker,
            take_profit, stop_loss
        )

    def test_calculate_init_pips(self):
        pos_pips = self.position.calculate_pips()
        self.assertEqual(pos_pips, Decimal("-1.50000"))

    def test_calculate_init_profit_base(self):
        profit_base = self.position.calculate_profit_base()
        self.assertEqual(profit_base, Decimal("-0.30000"))

    def test_calculate_init_profit_perc(self):
        profit_perc = self.position.calculate_profit_perc()
        self.assertEqual(profit_perc, Decimal("-0.01500"))

    def test_calculate_updated_values(self):
        """
        Check that after the bid/ask prices move, the updated
        pips, profit and percentage profit calculations are correct.
        """
        prices = self.position.ticker.prices
        prices["EURUSD"] = {"bid": Decimal("1.07811"), "ask": Decimal("1.07827")}
        self.position.update_position_price()
        # Check pips
        pos_pips = self.position.calculate_pips()
        self.assertEqual(pos_pips, Decimal("0.50000"))
        # Check profit base
        profit_base = self.position.calculate_profit_base()
        self.assertEqual(profit_base, Decimal("0.10000"))
        # Check profit percentage
        profit_perc = self.position.calculate_profit_perc()
        self.assertEqual(profit_perc, Decimal("0.00500"))


class TestCalculatePipValue(unittest.TestCase):
    def test_calculate_pip_value_with_USD_base(self):
        position = Position(
            "USD", "long",
            "EURUSD", 2000, TickerMock(),
            "1.08147", "1.07532"
        )
        pip_value = position.calculate_pip_value()
        self.assertEqual(pip_value, Decimal("0.2"))

    def test_calculate_pip_value_with_CAD_base(self):
        position = Position(
            "USD", "long",
            "USDCAD", 2000, TickerMock(),
            "1.3753", "1.3693"
        )
        pip_value = position.calculate_pip_value()
        self.assertEqual(pip_value, Decimal("0.15"))


if __name__ == "__main__":
    unittest.main() | 35.41502 | 81 | 0.628348 | 1,030 | 8,960 | 5.250485 | 0.126214 | 0.08432 | 0.093195 | 0.079882 | 0.860762 | 0.856509 | 0.834689 | 0.834689 | 0.802515 | 0.802515 | 0 | 0.04997 | 0.245089 | 8,960 | 253 | 82 | 35.41502 | 0.749556 | 0.162946 | 0 | 0.63354 | 0 | 0 | 0.078194 | 0 | 0 | 0 | 0 | 0 | 0.161491 | 1 | 0.142857 | false | 0 | 0.018634 | 0 | 0.198758 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6
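The expected values in the suite above follow the usual bid/ask marking convention: a long position is entered at the ask and valued at the bid, a short position is entered at the bid and valued at the ask, and one pip on a 4-decimal pair is 0.0001. A standalone sketch of that arithmetic — `pips` here is an illustrative helper using the mock ticker's quotes, not the `Position` API:

```python
from decimal import Decimal

def pips(position_type, entry_ask, entry_bid, cur_bid, cur_ask,
         pip_size=Decimal("0.0001")):
    # Long: bought at the ask, marked at the bid.
    # Short: sold at the bid, marked at the ask.
    if position_type == "long":
        return (cur_bid - entry_ask) / pip_size
    return (entry_bid - cur_ask) / pip_size

# Long GBP/USD at the mock's opening quotes: (1.50328 - 1.50349) / 0.0001
print(pips("long", Decimal("1.50349"), Decimal("1.50328"),
           Decimal("1.50328"), Decimal("1.50349")))  # -2.1
```

This reproduces the suite's initial -2.1 pip spread cost and, after the price update, the 13.7 (long) and -25.8 (short) figures.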
da074bcfab8e79c5e75445966581b7a800db2d00 | 99 | py | Python | li_hang/perceptron/__init__.py | LucienShui/HelloMachineLearning | b00a4b3791808ace3b1e45112350c2b3c539995e | [
"Apache-2.0"
] | 2 | 2019-07-28T08:25:40.000Z | 2019-07-29T05:29:10.000Z | li_hang/perceptron/__init__.py | LucienShui/HelloMachineLearning | b00a4b3791808ace3b1e45112350c2b3c539995e | [
"Apache-2.0"
] | null | null | null | li_hang/perceptron/__init__.py | LucienShui/HelloMachineLearning | b00a4b3791808ace3b1e45112350c2b3c539995e | [
"Apache-2.0"
] | null | null | null | from perceptron.perceptron import Perceptron
from perceptron.dual_perceptron import DualPerceptron
| 33 | 53 | 0.89899 | 11 | 99 | 8 | 0.454545 | 0.318182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080808 | 99 | 2 | 54 | 49.5 | 0.967033 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
da49655dca5efcb21fefe8e781a47e04c40da97d | 10,831 | py | Python | openviduconnect/client/syncclient.py | amoghmadan/openviduconnect | 799526b69c7012e5137d716c90fc762f1a9d26e4 | [
"MIT"
] | 1 | 2021-05-22T04:06:03.000Z | 2021-05-22T04:06:03.000Z | openviduconnect/client/syncclient.py | amoghmadan/openviduconnect | 799526b69c7012e5137d716c90fc762f1a9d26e4 | [
"MIT"
] | null | null | null | openviduconnect/client/syncclient.py | amoghmadan/openviduconnect | 799526b69c7012e5137d716c90fc762f1a9d26e4 | [
"MIT"
] | null | null | null | from __future__ import annotations

from urllib.parse import urljoin

from httpx import Client, Response

from .base import BaseClient
from ..exceptions import (
    SessionBodyParameterError,
    SessionExistsError,
    SessionNotFoundError,
    ConnectionBodyParameterError,
    ConnectionIPCAMError,
    SessionDoesNotExistError,
    ConnectionNotFound,
    SessionOrConnectionDoesNotExist,
    RecordingBodyParameterError,
    RecordingResolutionOrBrowserSettingsError,
    RecordingNoConnectedParticipantsError,
    RecordingNotConfiguredForMediaNodeError,
    RecordingDisabledOnServerError,
    RecordingNotFoundError,
    RecordingStartingProgressError,
    RecordingNotCompletedError,
)


class OpenViduClient(BaseClient):
    """Synchronous client for the OpenVidu Server REST API."""

    def __enter__(self):
        """Enter the runtime context and return the client."""
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        """Exit the runtime context; nothing to clean up."""
        pass

    def create_session(self: OpenViduClient, **kwargs: str) -> dict:
        """Initialize a new session."""
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.post(self._apis["sessions"], headers=self._headers, json=kwargs)
        if response.status_code == 400:
            raise SessionBodyParameterError("Problem with some body parameter")
        if response.status_code == 409:
            raise SessionExistsError("Parameter customSessionId corresponds to an existing Session")
        return response.json()

    def get_session(self: OpenViduClient, session_id: str) -> dict:
        """Retrieve a session by its SESSION_ID."""
        url: str = urljoin(self._apis["sessions"], session_id)
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.get(url, headers=self._headers)
        if response.status_code == 404:
            raise SessionNotFoundError("No Session exists for the passed SESSION_ID")
        return response.json()

    def get_sessions(self: OpenViduClient) -> dict:
        """Retrieve all active sessions."""
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.get(self._apis["sessions"], headers=self._headers)
        return response.json()

    def delete_session(self: OpenViduClient, session_id: str) -> dict:
        """Close a session by its SESSION_ID."""
        url: str = urljoin(self._apis["sessions"], session_id)
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.delete(url, headers=self._headers)
        if response.status_code == 404:
            raise SessionNotFoundError("No Session exists for the passed SESSION_ID")
        return response.json()
    def create_connection(self: OpenViduClient, session_id: str, **kwargs: str) -> dict:
        """Create a new Connection in a session."""
        session_url: str = urljoin(self._apis["sessions"], session_id)
        url: str = urljoin(session_url, "connection")
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.post(url, headers=self._headers, json=kwargs)
        if response.status_code == 400:
            raise ConnectionBodyParameterError("Problem with some body parameter")
        if response.status_code == 404:
            raise SessionNotFoundError("No session exists for the passed SESSION_ID")
        if response.status_code == 500:
            raise ConnectionIPCAMError("Unexpected error when creating the Connection object")
        return response.json()

    def get_connection(self: OpenViduClient, session_id: str, connection_id: str) -> dict:
        """Retrieve a Connection by its CONNECTION_ID."""
        session_url: str = urljoin(self._apis["sessions"], session_id)
        connection_url: str = urljoin(session_url, "connection")
        url: str = urljoin(connection_url, connection_id)
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.get(url, headers=self._headers)
        if response.status_code == 400:
            raise SessionDoesNotExistError("No Session exists for the passed SESSION_ID")
        if response.status_code == 404:
            raise ConnectionNotFound("No Connection exists for the passed CONNECTION_ID")
        return response.json()

    def get_connections(self: OpenViduClient, session_id: str) -> dict:
        """Retrieve all Connections of a session."""
        session_url: str = urljoin(self._apis["sessions"], session_id)
        url: str = urljoin(session_url, "connection")
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.get(url, headers=self._headers)
        if response.status_code == 404:
            raise SessionNotFoundError("No Session exists for the passed SESSION_ID")
        return response.json()

    def update_connection(self: OpenViduClient, session_id: str, connection_id: str, **kwargs: str) -> dict:
        """Modify the properties of a Connection."""
        session_url: str = urljoin(self._apis["sessions"], session_id)
        connection_url: str = urljoin(session_url, "connection")
        url: str = urljoin(connection_url, connection_id)
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.patch(url, headers=self._headers, json=kwargs)
        if response.status_code == 400:
            raise ConnectionBodyParameterError("Problem with some body parameter")
        if response.status_code == 404:
            raise SessionOrConnectionDoesNotExist(
                "No Session exists for the passed SESSION_ID, or no Connection exists for the passed CONNECTION_ID"
            )
        return response.json()

    def delete_connection(self: OpenViduClient, session_id: str, connection_id: str) -> dict:
        """Force the disconnection of a Connection."""
        session_url: str = urljoin(self._apis["sessions"], session_id)
        connection_url: str = urljoin(session_url, "connection")
        url: str = urljoin(connection_url, connection_id)
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.delete(url, headers=self._headers)
        if response.status_code == 400:
            raise SessionDoesNotExistError("No Session exists for the passed SESSION_ID")
        if response.status_code == 404:
            raise ConnectionNotFound("No Connection exists for the passed CONNECTION_ID")
        return response.json()
    def start_recording(self: OpenViduClient, **kwargs: str) -> dict:
        """Start the recording of a session."""
        url: str = urljoin(self._apis["recordings"], "start")
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.post(url, headers=self._headers, json=kwargs)
        if response.status_code == 400:
            raise RecordingBodyParameterError("Problem with some body parameter")
        if response.status_code == 404:
            raise SessionNotFoundError("No session exists for the passed session body parameter")
        if response.status_code == 406:
            raise RecordingNoConnectedParticipantsError("The session has no connected participants")
        if response.status_code == 409:
            raise RecordingNotConfiguredForMediaNodeError(
                "The session is not configured for using MediaMode ROUTED or it is already being recorded"
            )
        if response.status_code == 422:
            raise RecordingResolutionOrBrowserSettingsError(
                "resolution parameter exceeds acceptable values (for both width and height, min 100px and max 1999px) "
                "or trying to start a recording with both hasAudio and hasVideo to false"
            )
        if response.status_code == 501:
            raise RecordingDisabledOnServerError(
                "OpenVidu Server recording module is disabled: "
                "OPENVIDU_RECORDING configuration property is set to false"
            )
        return response.json()

    def stop_recording(self: OpenViduClient, recording_id: str) -> dict:
        """Stop the recording of a session."""
        stop_url: str = urljoin(self._apis["recordings"], "stop")
        url: str = urljoin(stop_url, recording_id)
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.post(url, headers=self._headers)
        if response.status_code == 404:
            raise RecordingNotFoundError("No recording exists for the passed RECORDING_ID")
        if response.status_code == 406:
            raise RecordingStartingProgressError(
                "Recording has starting status. Wait until started status before stopping the recording"
            )
        if response.status_code == 501:
            raise RecordingDisabledOnServerError(
                "OpenVidu Server recording module is disabled: "
                "OPENVIDU_RECORDING configuration property is set to false"
            )
        return response.json()

    def get_recording(self: OpenViduClient, recording_id: str) -> dict:
        """Retrieve a recording by its RECORDING_ID."""
        url: str = urljoin(self._apis["recordings"], recording_id)
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.get(url, headers=self._headers)
        if response.status_code == 404:
            raise RecordingNotFoundError("No recording exists for the passed RECORDING_ID")
        if response.status_code == 501:
            raise RecordingDisabledOnServerError(
                "OpenVidu Server recording module is disabled: "
                "OPENVIDU_RECORDING configuration property is set to false"
            )
        return response.json()

    def get_recordings(self: OpenViduClient) -> dict:
        """Retrieve all recordings."""
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.get(self._apis["recordings"], headers=self._headers)
        if response.status_code == 501:
            raise RecordingDisabledOnServerError(
                "OpenVidu Server recording module is disabled: "
                "OPENVIDU_RECORDING configuration property is set to false"
            )
        return response.json()

    def delete_recording(self: OpenViduClient, recording_id: str) -> dict:
        """Delete a recording by its RECORDING_ID."""
        url: str = urljoin(self._apis["recordings"], recording_id)
        with Client(verify=self._verify, timeout=self._timeout) as client:
            response: Response = client.delete(url, headers=self._headers)
        if response.status_code == 404:
            raise RecordingNotFoundError("No recording exists for the passed RECORDING_ID")
        if response.status_code == 409:
            raise RecordingNotCompletedError("The recording has started status. Stop it before deletion")
        if response.status_code == 501:
            raise RecordingDisabledOnServerError(
                "OpenVidu Server recording module is disabled: "
                "OPENVIDU_RECORDING configuration property is set to false"
            )
        return response.json()
| 40.565543 | 119 | 0.66005 | 1,125 | 10,831 | 6.201778 | 0.126222 | 0.041565 | 0.066504 | 0.08313 | 0.773972 | 0.754192 | 0.72094 | 0.711911 | 0.701304 | 0.677798 | 0 | 0.011559 | 0.249192 | 10,831 | 266 | 120 | 40.718045 | 0.846409 | 0.003047 | 0 | 0.572973 | 0 | 0 | 0.18848 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.086486 | false | 0.075676 | 0.027027 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
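Every endpoint in the client above follows the same shape: issue the request, then map the documented HTTP failure codes onto typed exceptions before returning the JSON body. A minimal, hypothetical sketch of that dispatch pattern (`OpenViduError`, `check_status`, and `session_errors` are illustrative names, not part of the library):

```python
class OpenViduError(Exception):
    """Stand-in for the library's typed exception hierarchy."""

def check_status(status_code: int, errors: dict) -> None:
    # Raise the mapped exception for a documented failure code;
    # success codes (2xx) fall through without raising.
    if status_code in errors:
        raise OpenViduError(errors[status_code])

session_errors = {
    400: "Problem with some body parameter",
    409: "Parameter customSessionId corresponds to an existing Session",
}

check_status(201, session_errors)  # success: no exception raised
```

Centralizing the table like this keeps each endpoint's error contract declarative; the client above inlines the same checks per method instead.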
da8c9a7a680cb49b1fd01931eb7ce9461f915408 | 6,896 | py | Python | ShapeShifter/SQLiteFile.py | kimballh/ShapeShifter | a9897cabec700726629466eea0159e75ba68ba91 | [
"MIT"
] | null | null | null | ShapeShifter/SQLiteFile.py | kimballh/ShapeShifter | a9897cabec700726629466eea0159e75ba68ba91 | [
"MIT"
] | null | null | null | ShapeShifter/SQLiteFile.py | kimballh/ShapeShifter | a9897cabec700726629466eea0159e75ba68ba91 | [
"MIT"
] | null | null | null | import tempfile
import gzip  # used by export_filter_results; originally imported only at the bottom of the module

import pandas as pd

import SizeExceededError  # originally imported only at the bottom of the module
from SSFile import SSFile


class SQLiteFile(SSFile):
    def read_input_to_pandas(self, columnList=[], indexCol="Sample"):
        from sqlalchemy import create_engine
        filePath = self.filePath
        if self.isGzipped:
            tempFile = super()._gunzip_to_temp_file()
            engine = create_engine('sqlite:///' + tempFile.name)
            # filePath = tempFile.name
        else:
            engine = create_engine('sqlite:///' + filePath)
        table = filePath.split('.')[0]
        tableList = table.split('/')
        table = tableList[len(tableList) - 1]
        query = "SELECT * FROM " + table
        if len(columnList) > 0:
            query = "SELECT " + ", ".join(columnList) + " FROM " + table
        df = pd.read_sql(query, engine)
        # if self.isGzipped:
        #     os.remove(filePath)
        return df
    def export_filter_results(self, inputSSFile, column_list=[], query=None, transpose=False, include_all_columns=False,
                              gzip_results=False, index_col="Sample"):
        # filePath needs to be stored separately as a string; it can't be turned into a file object
        filePath = self.filePath
        if query != None:
            query = super()._translate_null_query(query)
        if inputSSFile.isGzipped:
            inputSSFile.filePath = gzip.open(inputSSFile.filePath)
        df = inputSSFile._filter_data(columnList=column_list, query=query,
                                      includeAllColumns=include_all_columns, indexCol=index_col)
        if len(df.columns) > 999:
            raise SizeExceededError.SizeExceededError("SQLite supports a maximum of 999 columns. Your data has " + str(
                len(df.columns)) + " columns. Please use a smaller data set or consider using a different file type")
        chunksize = 999 // len(df.columns)
        from sqlalchemy import create_engine
        if gzip_results:
            tempFile = tempfile.NamedTemporaryFile(delete=False)
            engine = create_engine('sqlite:///' + tempFile.name)
            table = filePath.split('.')[0]
            tableList = table.split('/')
            table = tableList[len(tableList) - 1]
            if not transpose:
                df = df.set_index(index_col) if index_col in df.columns else df.set_index(df.columns[0])
                df.to_sql(table, engine, index=True, if_exists="replace", chunksize=chunksize)
            else:
                index_col = df.columns[0]
                df = df.set_index(index_col) if index_col in df.columns else df.set_index(df.columns[0])
                df = df.transpose()
                df.to_sql(table, engine, if_exists="replace", index=True, index_label=index_col, chunksize=chunksize)
            tempFile.close()
            super()._gzip_results(tempFile.name, filePath)
        else:
            engine = create_engine('sqlite:///' + super()._remove_gz(filePath))
            table = filePath.split('.')[0]
            tableList = table.split('/')
            table = tableList[len(tableList) - 1]
            if not transpose:
                df = df.set_index(index_col) if index_col in df.columns else df.set_index(df.columns[0])
                df.to_sql(table, engine, if_exists='replace', chunksize=chunksize)
            else:
                index_col = df.columns[0]
                df = df.set_index(index_col) if index_col in df.columns else df.set_index(df.columns[0])
                df = df.transpose()
                df.to_sql(table, engine, if_exists="replace", index=True, index_label=index_col, chunksize=chunksize)

    def write_to_file(self, df, gzipResults=False, includeIndex=False, null='NA', indexCol="Sample", transpose=False):
        filePath = self.filePath
        if len(df.columns) > 999:
            raise SizeExceededError.SizeExceededError("SQLite supports a maximum of 999 columns. Your data has " + str(
                len(df.columns)) + " columns. Please use a smaller data set or consider using a different file type")
        from sqlalchemy import create_engine
        chunksize = 999 // len(df.columns)
        if gzipResults:
            tempFile = tempfile.NamedTemporaryFile(delete=False)
            engine = create_engine('sqlite:///' + tempFile.name)
            table = filePath.split('.')[0]
            tableList = table.split('/')
            table = tableList[len(tableList) - 1]
            if not transpose:
                df = df.set_index(indexCol) if indexCol in df.columns else df.set_index(df.columns[0])
                df.to_sql(table, engine, index=True, if_exists="replace", chunksize=chunksize)
            else:
                indexCol = df.columns[0]
                df = df.set_index(indexCol) if indexCol in df.columns else df.set_index(df.columns[0])
                df = df.transpose()
                df.to_sql(table, engine, if_exists="replace", index=True, index_label=indexCol, chunksize=chunksize)
            tempFile.close()
            super()._gzip_results(tempFile.name, filePath)
        else:
            engine = create_engine('sqlite:///' + super()._remove_gz(filePath))
            table = filePath.split('.')[0]
            tableList = table.split('/')
            table = tableList[len(tableList) - 1]
            if not transpose:
                df = df.set_index(indexCol) if indexCol in df.columns else df.set_index(df.columns[0])
                df.to_sql(table, engine, if_exists='replace', chunksize=chunksize)
            else:
                indexCol = df.columns[0]
                df = df.set_index(indexCol) if indexCol in df.columns else df.set_index(df.columns[0])
                df = df.transpose()
                df.to_sql(table, engine, if_exists="replace", index=True, index_label=indexCol, chunksize=chunksize)
import gzip
import SizeExceededError | 49.257143 | 120 | 0.595998 | 798 | 6,896 | 5.022556 | 0.141604 | 0.060629 | 0.03992 | 0.035928 | 0.756238 | 0.713323 | 0.704341 | 0.704341 | 0.704341 | 0.704341 | 0 | 0.010608 | 0.289153 | 6,896 | 140 | 121 | 49.257143 | 0.807018 | 0.141096 | 0 | 0.728155 | 0 | 0 | 0.075737 | 0 | 0.009709 | 0 | 0 | 0 | 0 | 1 | 0.029126 | false | 0 | 0.07767 | 0 | 0.126214 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
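Both writer paths above derive `chunksize = 999 // len(df.columns)` from SQLite's historical 999-bound-variable limit per prepared statement: each row that `to_sql` inserts binds one parameter per column, so the chunk size is the largest row count whose parameters fit in one `INSERT`. A small sketch of that arithmetic (the function name is illustrative):

```python
# Default SQLITE_MAX_VARIABLE_NUMBER in older SQLite builds
SQLITE_MAX_VARS = 999

def rows_per_insert(n_columns: int, max_vars: int = SQLITE_MAX_VARS) -> int:
    # Largest number of rows whose bound parameters (one per column)
    # fit within a single multi-row INSERT statement.
    if n_columns > max_vars:
        raise ValueError("SQLite supports a maximum of %d columns" % max_vars)
    return max_vars // n_columns

print(rows_per_insert(10))   # 99 rows per chunk
print(rows_per_insert(999))  # 1 row per chunk
```

This is also why the class rejects frames wider than 999 columns outright: a single row would already exceed the variable limit.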
e546a5294e18aad1b19fdb6f8fdde581beb69a3c | 106 | py | Python | sst_gmt_relationship/calc_cell_weights_cluster_script.py | rmiddelanis/harvey_scaling | a94064996fb200c26a90482cc63804dcdc3cf6dd | [
"MIT"
] | null | null | null | sst_gmt_relationship/calc_cell_weights_cluster_script.py | rmiddelanis/harvey_scaling | a94064996fb200c26a90482cc63804dcdc3cf6dd | [
"MIT"
] | null | null | null | sst_gmt_relationship/calc_cell_weights_cluster_script.py | rmiddelanis/harvey_scaling | a94064996fb200c26a90482cc63804dcdc3cf6dd | [
"MIT"
] | null | null | null | from sst_gmt_relationship.grid_weights import calc_grid_weights
calc_grid_weights('north_atlantic_ocean') | 35.333333 | 63 | 0.90566 | 16 | 106 | 5.4375 | 0.6875 | 0.37931 | 0.344828 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04717 | 106 | 3 | 64 | 35.333333 | 0.861386 | 0 | 0 | 0 | 0 | 0 | 0.186916 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
e58ee17b34f44a467d8aa4d7e0d9284ccd3bf8e2 | 22 | py | Python | keyboards/__init__.py | Vladvlad9/moderator_bot | 080d4f60d3492514a9da52ee9787320d788b83ed | [
"MIT"
] | 7 | 2017-01-31T02:27:56.000Z | 2021-11-08T13:01:43.000Z | keyboards/__init__.py | Vladvlad9/moderator_bot | 080d4f60d3492514a9da52ee9787320d788b83ed | [
"MIT"
] | null | null | null | keyboards/__init__.py | Vladvlad9/moderator_bot | 080d4f60d3492514a9da52ee9787320d788b83ed | [
"MIT"
] | 9 | 2017-04-14T11:45:45.000Z | 2019-06-26T16:05:55.000Z | from .default import * | 22 | 22 | 0.772727 | 3 | 22 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 22 | 1 | 22 | 22 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e5a5bd866fdb4eec037cef9b542711bbad5158b3 | 118 | py | Python | python/testData/intentions/PyAnnotateVariableTypeIntentionTest/AnnotationCollectionsNamedTupleInOtherFile/lib.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/intentions/PyAnnotateVariableTypeIntentionTest/AnnotationCollectionsNamedTupleInOtherFile/lib.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/intentions/PyAnnotateVariableTypeIntentionTest/AnnotationCollectionsNamedTupleInOtherFile/lib.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | from collections import namedtuple
MyTuple = namedtuple('MyTuple', ['foo'])
def func():
return MyTuple(foo=42)
| 14.75 | 40 | 0.70339 | 14 | 118 | 5.928571 | 0.714286 | 0.409639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020202 | 0.161017 | 118 | 7 | 41 | 16.857143 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0.084746 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
e5bb20f030bfa43c6f62ba984429c8d407fbd375 | 18,737 | py | Python | server/patient/utils.py | SanahSidhu/DementiaCare | e0a1f7edb2f68f7d9336d2771969189cb8a461fd | [
"MIT"
] | 1 | 2022-03-22T17:24:21.000Z | 2022-03-22T17:24:21.000Z | server/patient/utils.py | NVombat/DementiaCare | bb95c6b547c8942894a13ab302217e6025b6ff2c | [
"MIT"
] | null | null | null | server/patient/utils.py | NVombat/DementiaCare | bb95c6b547c8942894a13ab302217e6025b6ff2c | [
"MIT"
] | 1 | 2022-02-25T15:28:11.000Z | 2022-02-25T15:28:11.000Z | from rest_framework import status
from django.http import response
import datetime as d

from core.settings import AWS_BUCKET_FOLDER, AWS_OBJECT_URL_PREFIX

from .errors import (
    InvalidUserCredentialsError,
    UserDoesNotExistError,
    InvalidInsertionError,
    DataInsertionError,
    DataFetchingError,
    InvalidFieldError,
    DataRemovalError,
    UserExistsError,
)

from . import userdb, s3


def signup_user(request, **kwargs) -> response.JsonResponse:
    try:
        print("POST REQUEST SIGNUP")
        print("Request Object DATA:", request.data)
        name = request.data.get("Name")["value"]
        email = request.data.get("Email")["value"]
        password = request.data.get("Password")["value"]
        phone_num = request.data.get("PhoneNumber")["value"]
        print(name, email, password, phone_num)
        userdb.insert_user(name, email, password, phone_num)
        return response.JsonResponse(
            data={"success_status": True},
            status=status.HTTP_201_CREATED,
        )
    except UserExistsError as uee:
        return response.JsonResponse(
            {"error": str(uee)},
            status=status.HTTP_400_BAD_REQUEST,
        )
    except Exception as e:
        print(e)
        return response.JsonResponse(
            {
                "error": "Error Occured While Receiving Registration Data",
            },
            status=status.HTTP_500_INTERNAL_SERVER_ERROR,
        )


def login_user(request, **kwargs) -> response.JsonResponse:
    try:
        print("POST REQUEST LOGIN")
        print("Request Object DATA:", request.data)
        email = request.data.get("Email")["value"]
        password = request.data.get("Password")["value"]
        print(email, password)
        if userdb.check_hash(email, password):
            return response.JsonResponse(
                data={"success_status": True},
                status=status.HTTP_200_OK,
            )
        # Respond explicitly instead of returning None if check_hash
        # ever returns False without raising
        return response.JsonResponse(
            {"error": "Invalid User Credentials", "success_status": False},
            status=status.HTTP_401_UNAUTHORIZED,
        )
    except InvalidUserCredentialsError as ice:
        return response.JsonResponse(
            {"error": str(ice)},
            status=status.HTTP_401_UNAUTHORIZED,
        )
    except UserDoesNotExistError as udne:
        return response.JsonResponse(
            {"error": str(udne)},
            status=status.HTTP_404_NOT_FOUND,
        )
    except Exception as e:
        print(e)
        return response.JsonResponse(
            {
                "error": "Error Occured While Receiving Login Data",
            },
            status=status.HTTP_500_INTERNAL_SERVER_ERROR,
        )


def recv_checklist_data(request, **kwargs) -> response.JsonResponse:
    try:
        print("POST REQUEST CHECKLIST")
        print("request.data:", request.data)
        email = request.data.get("Email")
        text = request.data.get("Text")
        function = request.data.get("Function")
        print(email)
        print("TEXT: ", text)
        print("FUNCTION: ", function)
        if function == "Add":
            userdb.insert_cl_nt_data(email, text, add=True, cl=True)
        elif function == "Remove":
            userdb.insert_cl_nt_data(email, text[0], remove=True, cl=True)
        return response.JsonResponse(
            {"success_status": True},
            status=status.HTTP_200_OK,
        )
    except DataRemovalError as dre:
        return response.JsonResponse(
            {"error": str(dre)},
            status=status.HTTP_503_SERVICE_UNAVAILABLE,
        )
    except DataInsertionError as die:
        return response.JsonResponse(
            {"error": str(die)},
            status=status.HTTP_503_SERVICE_UNAVAILABLE,
        )
    except InvalidInsertionError as iie:
        return response.JsonResponse(
            {"error": str(iie)},
            status=status.HTTP_405_METHOD_NOT_ALLOWED,
        )
    except UserDoesNotExistError as udne:
        return response.JsonResponse(
            {"error": str(udne)},
            status=status.HTTP_404_NOT_FOUND,
        )
    except Exception as e:
        print(e)
        return response.JsonResponse(
            {"error": "Error Occured While Receiving Data", "success_status": False},
            status=status.HTTP_500_INTERNAL_SERVER_ERROR,
        )


def send_checklist_data(request, **kwargs) -> response.JsonResponse:
    try:
        print("GET REQUEST CHECKLIST")
        print("request.data:", request.data)
        email = request.data.get("Email")
        record = userdb.get_cl_nt_data(email, cl=True)
        return record
    except InvalidFieldError as ife:
        return response.JsonResponse(
            {"error": str(ife)},
            status=status.HTTP_405_METHOD_NOT_ALLOWED,
        )
    except DataFetchingError as dfe:
        return response.JsonResponse(
            {"error": str(dfe), "success_status": False},
            status=status.HTTP_404_NOT_FOUND,
        )
    except Exception as e:
        print(e)
        return response.JsonResponse(
            {"error": "Error Occured While Sending Data", "success_status": False},
            status=status.HTTP_500_INTERNAL_SERVER_ERROR,
        )


def recv_notes_data(request, **kwargs) -> response.JsonResponse:
    try:
        print("POST REQUEST NOTES")
        print("request.data:", request.data)
        email = request.data.get("Email")["value"]
        note = request.data.get("Note")["value"]
        function = request.data.get("Function")["value"]
        if function == "Add":
            userdb.insert_cl_nt_data(email, note, add=True, nt=True)
        elif function == "Remove":
            userdb.insert_cl_nt_data(email, note, remove=True, nt=True)
        return response.JsonResponse(
            {"success_status": True},
            status=status.HTTP_200_OK,
        )
    except DataRemovalError as dre:
        return response.JsonResponse(
            {"error": str(dre)},
            status=status.HTTP_503_SERVICE_UNAVAILABLE,
        )
    except DataInsertionError as die:
        return response.JsonResponse(
            {"error": str(die)},
            status=status.HTTP_503_SERVICE_UNAVAILABLE,
        )
    except InvalidInsertionError as iie:
        return response.JsonResponse(
            {"error": str(iie)},
            status=status.HTTP_405_METHOD_NOT_ALLOWED,
        )
    except UserDoesNotExistError as udne:
        return response.JsonResponse(
            {"error": str(udne)},
            status=status.HTTP_404_NOT_FOUND,
        )
    except Exception as e:
        print(e)
        return response.JsonResponse(
            {"error": "Error Occured While Receiving Data", "success_status": False},
            status=status.HTTP_500_INTERNAL_SERVER_ERROR,
        )


def send_notes_data(request, **kwargs) -> response.JsonResponse:
    try:
        print("GET REQUEST NOTES")
        print("request.data:", request.data)
        email = request.data.get("Email")["value"]
        record = userdb.get_cl_nt_data(email, nt=True)
        return record
    except InvalidFieldError as ife:
        return response.JsonResponse(
            {"error": str(ife)},
            status=status.HTTP_405_METHOD_NOT_ALLOWED,
        )
    except DataFetchingError as dfe:
        return response.JsonResponse(
            {"error": str(dfe), "success_status": False},
            status=status.HTTP_404_NOT_FOUND,
        )
    except Exception as e:
        print(e)
        return response.JsonResponse(
            {"error": "Error Occured While Sending Data", "success_status": False},
            status=status.HTTP_500_INTERNAL_SERVER_ERROR,
        )


def recv_medlist_data(request, **kwargs) -> response.JsonResponse:
    try:
        print("POST REQUEST MEDLIST")
        print("request.data:", request.data)
        email = request.data.get("Email")["value"]
        medicine = request.data.get("Medicine")["value"]
        time = request.data.get("Time")["value"]
        purpose = request.data.get("Purpose")["value"]
        function = request.data.get("Function")["value"]
        time_lst = time.split(",")
        med_data = {
            "Medicine": medicine,
            "Purpose": purpose,
"Time": time_lst,
}
if function == "Add":
userdb.insert_ml_inv_emg_data(email, med_data, add=True, ml=True)
elif function == "Remove":
userdb.insert_ml_inv_emg_data(email, med_data, remove=True, ml=True)
return response.JsonResponse(
{"success_status": True},
status=status.HTTP_200_OK,
)
except DataRemovalError as dre:
return response.JsonResponse(
{"error": str(dre)},
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
except DataInsertionError as die:
return response.JsonResponse(
{"error": str(die)},
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
except InvalidInsertionError as iie:
return response.JsonResponse(
{"error": str(iie)},
status=status.HTTP_405_METHOD_NOT_ALLOWED,
)
except UserDoesNotExistError as udne:
return response.JsonResponse(
{"error": str(udne)},
status=status.HTTP_404_NOT_FOUND,
)
except Exception as e:
print(e)
return response.JsonResponse(
{"error": "Error Occured While Receiving Data", "success_status": False},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def send_medlist_data(request, **kwargs) -> response.JsonResponse:
try:
print("GET REQUEST MEDLIST")
print("request.data:", request.data)
email = request.data.get("Email")["value"]
medlist = userdb.get_ml_inv_emg_data(email, ml=True)
return medlist
except InvalidFieldError as ife:
return response.JsonResponse(
{"error": str(ife)},
status=status.HTTP_405_METHOD_NOT_ALLOWED,
)
except DataFetchingError as dfe:
return response.JsonResponse(
{"error": str(dfe), "success_status": False},
status=status.HTTP_404_NOT_FOUND,
)
except Exception as e:
print(e)
return response.JsonResponse(
{"error": "Error Occured While Sending Data", "success_status": False},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def recv_inv_data(request, **kwargs) -> response.JsonResponse:
try:
print("POST REQUEST INVENTORY")
print("request.data:", request.data)
email = request.data.get("Email")["value"]
item = request.data.get("Item")["value"]
location = request.data.get("Location")["value"]
function = request.data.get("Function")["value"]
inv_data = {
"Item": item,
"Location": location,
}
if function == "Add":
userdb.insert_ml_inv_emg_data(email, inv_data, add=True, inv=True)
elif function == "Remove":
userdb.insert_ml_inv_emg_data(email, inv_data, remove=True, inv=True)
return response.JsonResponse(
{"success_status": True},
status=status.HTTP_200_OK,
)
except DataRemovalError as dre:
return response.JsonResponse(
{"error": str(dre)},
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
except DataInsertionError as die:
return response.JsonResponse(
{"error": str(die)},
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
except InvalidInsertionError as iie:
return response.JsonResponse(
{"error": str(iie)},
status=status.HTTP_405_METHOD_NOT_ALLOWED,
)
except UserDoesNotExistError as udne:
return response.JsonResponse(
{"error": str(udne)},
status=status.HTTP_404_NOT_FOUND,
)
except Exception as e:
print(e)
return response.JsonResponse(
{"error": "Error Occured While Receiving Data", "success_status": False},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def send_inv_data(request, **kwargs) -> response.JsonResponse:
try:
print("GET REQUEST INVENTORY")
print("request.data:", request.data)
email = request.data.get("Email")["value"]
inventory = userdb.get_ml_inv_emg_data(email, inv=True)
return inventory
except InvalidFieldError as ife:
return response.JsonResponse(
{"error": str(ife)},
status=status.HTTP_405_METHOD_NOT_ALLOWED,
)
except DataFetchingError as dfe:
return response.JsonResponse(
{"error": str(dfe), "success_status": False},
status=status.HTTP_404_NOT_FOUND,
)
except Exception as e:
print(e)
return response.JsonResponse(
{"error": "Error Occured While Sending Data", "success_status": False},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def recv_emg_contact(request, **kwargs) -> response.JsonResponse:
"""
Func Desc
"""
try:
print("POST REQUEST EMERGENCY DATA")
print("request.data:", request.data)
email = request.data.get("Email")["value"]
contact_name = request.data.get("Name")["value"]
contact_num = request.data.get("PhoneNumber")["value"]
relation = request.data.get("Relation")["value"]
function = request.data.get("Function")
em_data = {
"Name": contact_name,
"Number": contact_num,
"Relation": relation,
}
if function == "Add":
userdb.insert_ml_inv_emg_data(email, em_data, add=True, emg=True)
elif function == "Remove":
userdb.insert_ml_inv_emg_data(email, em_data, remove=True, emg=True)
return response.JsonResponse(
{"success_status": True},
status=status.HTTP_200_OK,
)
except DataRemovalError as dre:
return response.JsonResponse(
{"error": str(dre)},
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
except DataInsertionError as die:
return response.JsonResponse(
{"error": str(die)},
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
except InvalidInsertionError as iie:
return response.JsonResponse(
{"error": str(iie)},
status=status.HTTP_405_METHOD_NOT_ALLOWED,
)
except UserDoesNotExistError as udne:
return response.JsonResponse(
{"error": str(udne)},
status=status.HTTP_404_NOT_FOUND,
)
except Exception as e:
print(e)
return response.JsonResponse(
{"error": "Error Occured While Receiving Data", "success_status": False},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def send_emg_contact(request, **kwargs) -> response.JsonResponse:
"""
Func Desc
"""
try:
print("GET REQUEST EMERGENCY DATA")
print("request.data:", request.data)
email = request.data.get("Email")["value"]
emg_cont = userdb.get_ml_inv_emg_data(email, emg=True)
return emg_cont
except InvalidFieldError as ife:
return response.JsonResponse(
{"error": str(ife)},
status=status.HTTP_405_METHOD_NOT_ALLOWED,
)
except DataFetchingError as dfe:
return response.JsonResponse(
{"error": str(dfe), "success_status": False},
status=status.HTTP_404_NOT_FOUND,
)
except Exception as e:
print(e)
return response.JsonResponse(
{"error": "Error Occured While Sending Data", "success_status": False},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def recv_media(request, **kwargs) -> response.JsonResponse:
try:
print("POST REQUEST MEDIA DATA")
print("request.data:", request.data)
email = request.data.get("Email")["value"]
filename = request.data.get("Filename")["value"]
fileobj = request.data.get("File")["value"]
desc = request.data.get("Description")["value"]
function = request.data.get("Function")["value"]
print(email, filename, fileobj, desc)
date = d.datetime.now()
date = date.strftime("%d/%m/%Y, %H:%M:%S")
filename = filename.lower()
subfolder = email.split("@")[0]
cloudFilename = AWS_BUCKET_FOLDER + subfolder + "/" + filename
objectURL = AWS_OBJECT_URL_PREFIX + cloudFilename
data = {
"Date": date,
"Filename": filename,
"CloudFilename": cloudFilename,
"ObjectURL": objectURL,
}
if function == "Add":
userdb.insert_media(email, data, add=True)
elif function == "Remove":
userdb.insert_media(email, data, remove=True)
s3.upload_file_to_s3(cloudFilename, fileobj)
return response.JsonResponse(
{"success_status": True},
status=status.HTTP_200_OK,
)
except DataRemovalError as dre:
return response.JsonResponse(
{"error": str(dre)},
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
except DataInsertionError as die:
return response.JsonResponse(
{"error": str(die)},
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
except UserDoesNotExistError as udne:
return response.JsonResponse(
{"error": str(udne)},
status=status.HTTP_404_NOT_FOUND,
)
except Exception as e:
print(e)
return response.JsonResponse(
{"error": "Error Occured While Receiving Data", "success_status": False},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
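recv_media derives the S3 key and public object URL from the email and filename. A minimal sketch of that construction — the constants here are illustrative stand-ins, since the real AWS_BUCKET_FOLDER and AWS_OBJECT_URL_PREFIX are defined elsewhere in this module:

```python
# Illustrative values; the real constants live elsewhere in the module.
AWS_BUCKET_FOLDER = "media/"
AWS_OBJECT_URL_PREFIX = "https://example-bucket.s3.amazonaws.com/"

email = "user@example.com"
filename = "Scan.PDF".lower()      # the handler lowercases the filename
subfolder = email.split("@")[0]    # per-user subfolder taken from the email
cloud_filename = AWS_BUCKET_FOLDER + subfolder + "/" + filename
object_url = AWS_OBJECT_URL_PREFIX + cloud_filename
print(cloud_filename)
print(object_url)
```

With these inputs the key becomes `media/user/scan.pdf`, so each user's uploads land under their own prefix in the bucket.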
def send_media(request, **kwargs) -> response.JsonResponse:
"""
Func Desc
"""
try:
print("GET REQUEST MEDIA DATA")
print("request.data:", request.data)
email = request.data.get("Email")["value"]
media = userdb.get_media(email)
return media
except InvalidFieldError as ife:
return response.JsonResponse(
{"error": str(ife)},
status=status.HTTP_405_METHOD_NOT_ALLOWED,
)
except DataFetchingError as dfe:
return response.JsonResponse(
{"error": str(dfe), "success_status": False},
status=status.HTTP_404_NOT_FOUND,
)
except Exception as e:
print(e)
return response.JsonResponse(
{"error": "Error Occured While Sending Data", "success_status": False},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
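All of these handlers expect each field of request.data to arrive wrapped as a {"value": ...} object. A minimal sketch of the payload shape recv_notes_data parses — the field names come from the code above, while the concrete values are made up for illustration:

```python
# Payload shape inferred from recv_notes_data: every field is a
# {"value": ...} wrapper, so .get("Note")["value"] unwraps it.
payload = {
    "Email": {"value": "user@example.com"},
    "Note": {"value": "Pick up prescription"},
    "Function": {"value": "Add"},
}

# Server-side extraction, mirroring what the handler does:
email = payload.get("Email")["value"]
note = payload.get("Note")["value"]
function = payload.get("Function")["value"]
print(email, note, function)
```

A field sent without the wrapper (a bare string) would raise a TypeError at the `["value"]` subscript and fall through to the generic 500 handler.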
| 31.490756 | 85 | 0.601003 | 1,945 | 18,737 | 5.615424 | 0.076093 | 0.135506 | 0.142831 | 0.147592 | 0.850394 | 0.831075 | 0.808826 | 0.780443 | 0.771928 | 0.704541 | 0 | 0.0139 | 0.289694 | 18,737 | 594 | 86 | 31.543771 | 0.806747 | 0.001548 | 0 | 0.614286 | 0 | 0 | 0.116065 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028571 | false | 0.012245 | 0.012245 | 0 | 0.17551 | 0.097959 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e5eb5e12d5f89fc9fcd0ee10b66b1614d42df684 | 19 | py | Python | tools/lint/__init__.py | shs96c/web-platform-tests | 61acad6dd9bb99d32340eb41f5146de64f542359 | [
"BSD-3-Clause"
] | 2,151 | 2020-04-18T07:31:17.000Z | 2022-03-31T08:39:18.000Z | tools/lint/__init__.py | shs96c/web-platform-tests | 61acad6dd9bb99d32340eb41f5146de64f542359 | [
"BSD-3-Clause"
] | 395 | 2020-04-18T08:22:18.000Z | 2021-12-08T13:04:49.000Z | tools/lint/__init__.py | shs96c/web-platform-tests | 61acad6dd9bb99d32340eb41f5146de64f542359 | [
"BSD-3-Clause"
] | 338 | 2020-04-18T08:03:10.000Z | 2022-03-29T12:33:22.000Z | from . import lint
| 9.5 | 18 | 0.736842 | 3 | 19 | 4.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 19 | 1 | 19 | 19 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f902c5f2ab6bd158e3f2220194f9100db528132a | 420 | py | Python | mathgenerator/funcs/computer_science/__init__.py | Sankari-K/mathgenerator | 712c74fbe34fe594c4c0f7e3b3057b01d85112ba | [
"MIT"
] | 40 | 2020-11-17T19:45:20.000Z | 2022-03-22T18:16:43.000Z | mathgenerator/funcs/computer_science/__init__.py | Sankari-K/mathgenerator | 712c74fbe34fe594c4c0f7e3b3057b01d85112ba | [
"MIT"
] | 209 | 2020-10-14T15:32:08.000Z | 2020-11-03T19:08:19.000Z | mathgenerator/funcs/computer_science/__init__.py | Sankari-K/mathgenerator | 712c74fbe34fe594c4c0f7e3b3057b01d85112ba | [
"MIT"
] | 179 | 2020-10-14T15:36:55.000Z | 2020-10-29T19:26:16.000Z | from ...__init__ import *
from .bcd_to_decimal import *
from .binary_2s_complement import *
from .binary_complement_1s import *
from .binary_to_decimal import *
from .binary_to_hex import *
from .decimal_to_bcd import *
from .decimal_to_binary import *
from .decimal_to_hexadeci import *
from .decimal_to_octal import *
from .fibonacci_series import *
from .modulo_division import *
from .nth_fibonacci_number import *
| 28 | 35 | 0.807143 | 61 | 420 | 5.131148 | 0.311475 | 0.383387 | 0.204473 | 0.242812 | 0.159744 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00545 | 0.12619 | 420 | 14 | 36 | 30 | 0.847411 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f9159ca162f3f63d1d77de8027de35c92bada434 | 5,930 | py | Python | tests/networks_tests.py | sureshanaparti/cloudstack-gcestack | de273cef53e29c40b0043761dad13b8a91239f17 | [
"Apache-2.0"
] | 6 | 2015-05-06T13:37:06.000Z | 2021-11-09T21:38:44.000Z | tests/networks_tests.py | sureshanaparti/cloudstack-gcestack | de273cef53e29c40b0043761dad13b8a91239f17 | [
"Apache-2.0"
] | 2 | 2018-07-19T07:45:59.000Z | 2021-05-20T12:35:01.000Z | tests/networks_tests.py | sureshanaparti/cloudstack-gcestack | de273cef53e29c40b0043761dad13b8a91239f17 | [
"Apache-2.0"
] | 11 | 2015-05-06T13:56:57.000Z | 2021-11-09T21:38:35.000Z | #!/usr/bin/env python
# encoding: utf-8
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
import mock
import json
from gstack.helpers import read_file
from . import GStackAppTestCase
class NetworksTestCase(GStackAppTestCase):
def test_list_networks(self):
get = mock.Mock()
get.return_value.text = read_file(
'tests/data/valid_describe_security_groups.json')
get.return_value.status_code = 200
with mock.patch('requests.get', get):
headers = {
'authorization': 'Bearer ' + str(GStackAppTestCase.access_token)}
response = self.get(
'/compute/v1/projects/exampleproject/global/networks', headers=headers)
self.assert_ok(response)
def test_get_network(self):
get = mock.Mock()
get.return_value.text = read_file(
'tests/data/valid_describe_security_group.json')
get.return_value.status_code = 200
with mock.patch('requests.get', get):
headers = {
'authorization': 'Bearer ' + str(GStackAppTestCase.access_token)}
response = self.get(
'/compute/v1/projects/exampleproject/global/networks/networkname', headers=headers)
self.assert_ok(response)
def test_get_network_network_not_found(self):
get = mock.Mock()
get.return_value.text = read_file(
'tests/data/empty_describe_security_groups.json')
get.return_value.status_code = 200
with mock.patch('requests.get', get):
headers = {
'authorization': 'Bearer ' + str(GStackAppTestCase.access_token)}
response = self.get(
'/compute/v1/projects/exampleproject/global/networks/networkname', headers=headers)
self.assert_not_found(response)
assert 'The resource \'/compute/v1/projects/exampleproject/global/networks/networkname\'' \
in response.data
def test_add_network(self):
data = {
'IPv4Range': '10.0.0.0/8',
'kind': 'compute#network',
'name': 'networkname',
'description': ''
}
data = json.dumps(data)
get = mock.Mock()
get.return_value.text = read_file(
'tests/data/valid_create_security_group.json')
get.return_value.status_code = 200
with mock.patch('requests.get', get):
headers = {
'authorization': 'Bearer ' + str(GStackAppTestCase.access_token),
}
response = self.post_json(
'/compute/v1/projects/admin/global/networks', data=data, headers=headers)
self.assert_ok(response)
def test_add_network_network_duplicate(self):
data = {
'IPv4Range': '10.0.0.0/8',
'kind': 'compute#network',
'name': 'networkname',
'description': ''
}
data = json.dumps(data)
get = mock.Mock()
get.return_value.text = read_file(
'tests/data/duplicate_create_security_group.json')
get.return_value.status_code = 200
with mock.patch('requests.get', get):
headers = {
'authorization': 'Bearer ' + str(GStackAppTestCase.access_token),
}
response = self.post_json(
'/compute/v1/projects/admin/global/networks', data=data, headers=headers)
assert 'RESOURCE_ALREADY_EXISTS' in response.data
def test_delete_network(self):
get = mock.Mock()
get.return_value.text = read_file(
'tests/data/valid_delete_security_group.json')
get.return_value.status_code = 200
get_networks = mock.Mock()
get_networks.return_value = json.loads(
read_file('tests/data/valid_get_security_group.json'))
with mock.patch('requests.get', get):
with mock.patch('gstack.controllers.get_item_with_name', get_networks):
headers = {
'authorization': 'Bearer ' + str(GStackAppTestCase.access_token),
}
response = self.delete(
'/compute/v1/projects/exampleproject/global/networks/networkname', headers=headers)
self.assert_ok(response)
def test_delete_network_network_not_found(self):
get = mock.Mock()
get.return_value.text = read_file(
'tests/data/valid_delete_security_group.json')
get.return_value.status_code = 200
get_networks = mock.Mock()
get_networks.return_value = None
with mock.patch('requests.get', get):
with mock.patch('gstack.controllers.get_item_with_name', get_networks):
headers = {
'authorization': 'Bearer ' + str(GStackAppTestCase.access_token),
}
response = self.delete(
'/compute/v1/projects/exampleproject/global/networks/invalidnetworkname', headers=headers)
self.assert_not_found(response)
assert 'The resource \'/compute/v1/projects/exampleproject/global/networks/invalidnetworkname\'' \
in response.data
| 34.678363 | 110 | 0.626138 | 661 | 5,930 | 5.452345 | 0.22239 | 0.048835 | 0.054384 | 0.037736 | 0.74889 | 0.731132 | 0.731132 | 0.718091 | 0.710599 | 0.710599 | 0 | 0.011335 | 0.270995 | 5,930 | 170 | 111 | 34.882353 | 0.822346 | 0.135413 | 0 | 0.720721 | 0 | 0 | 0.239671 | 0.165263 | 0 | 0 | 0 | 0 | 0.081081 | 1 | 0.063063 | false | 0 | 0.036036 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
005f44cb6a0cc35028e2ed8d77a97292cb86783f | 271 | py | Python | pysectprop/__init__.py | Pretsdaya/pysectprop | e01a04c13a99e5430b235d745975c27ac38de5ac | [
"MIT"
] | 1 | 2022-01-30T05:59:50.000Z | 2022-01-30T05:59:50.000Z | pysectprop/__init__.py | Pretsdaya/pysectprop | e01a04c13a99e5430b235d745975c27ac38de5ac | [
"MIT"
] | null | null | null | pysectprop/__init__.py | Pretsdaya/pysectprop | e01a04c13a99e5430b235d745975c27ac38de5ac | [
"MIT"
] | 1 | 2021-07-01T12:37:33.000Z | 2021-07-01T12:37:33.000Z | from .general.generalsection import GeneralSection
from .general.materialsection import MaterialSection
from .general.compositesection import CompositeSection
from .general.thinwalledsection import ThinWalledSection
from .general.cripplingsection import CripplingSection
| 45.166667 | 56 | 0.889299 | 25 | 271 | 9.64 | 0.32 | 0.228216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.073801 | 271 | 5 | 57 | 54.2 | 0.960159 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
00b6cfb1ce7e72bf1c1c958f68a53fe9cabde12c | 160 | py | Python | mysite/polls/admin.py | tenderghost/FlyPersonalAssistant | f9b379a42c32ff1ea73803d25cce7be04f8ec497 | [
"MIT"
] | 1 | 2018-01-07T16:45:31.000Z | 2018-01-07T16:45:31.000Z | mysite/polls/admin.py | tenderghost/FlyPersonalAssistant | f9b379a42c32ff1ea73803d25cce7be04f8ec497 | [
"MIT"
] | null | null | null | mysite/polls/admin.py | tenderghost/FlyPersonalAssistant | f9b379a42c32ff1ea73803d25cce7be04f8ec497 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Question, Choice
# add django admin site support
admin.site.register(Question)
admin.site.register(Choice) | 26.666667 | 36 | 0.81875 | 23 | 160 | 5.695652 | 0.521739 | 0.206107 | 0.259542 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10625 | 160 | 6 | 37 | 26.666667 | 0.916084 | 0.18125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
00bf82eefa0966e9f62374ff9a32f2becbe521e0 | 48 | py | Python | main4.py | sunnyjamm/second_project | 424fbcc7bf725263723e5901f117b5c7db779a1b | [
"MIT"
] | null | null | null | main4.py | sunnyjamm/second_project | 424fbcc7bf725263723e5901f117b5c7db779a1b | [
"MIT"
] | null | null | null | main4.py | sunnyjamm/second_project | 424fbcc7bf725263723e5901f117b5c7db779a1b | [
"MIT"
] | null | null | null | print('user 3')
print('user 2')
print('user 1')
| 12 | 15 | 0.625 | 9 | 48 | 3.333333 | 0.555556 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0.125 | 48 | 3 | 16 | 16 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
00da87465f44ada5769612b5b1ad383223193827 | 111 | py | Python | utils/__init__.py | AnyKeyShik/Kektor3000 | 3d1735bf82cabbeafe2593ef36fa23ae135a4940 | [
"MIT"
] | null | null | null | utils/__init__.py | AnyKeyShik/Kektor3000 | 3d1735bf82cabbeafe2593ef36fa23ae135a4940 | [
"MIT"
] | 4 | 2020-04-06T23:46:14.000Z | 2020-04-07T00:57:18.000Z | utils/__init__.py | AnyKeyShik/Kektor3000 | 3d1735bf82cabbeafe2593ef36fa23ae135a4940 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from .logger import *
from .json_handler import json_handler
from .proxy import patch
| 18.5 | 38 | 0.720721 | 16 | 111 | 4.875 | 0.625 | 0.282051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010753 | 0.162162 | 111 | 5 | 39 | 22.2 | 0.827957 | 0.189189 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
97268cbbb4a68a59944e9f79fa9dcb755e51e65a | 30 | py | Python | __init__.py | impactaky/xonsh2py | ff6d62b194bc682f3bd0e7d923270e100a3d834e | [
"MIT"
] | 2 | 2020-10-02T04:51:50.000Z | 2022-03-17T20:48:59.000Z | __init__.py | impactaky/xonsh2py | ff6d62b194bc682f3bd0e7d923270e100a3d834e | [
"MIT"
] | null | null | null | __init__.py | impactaky/xonsh2py | ff6d62b194bc682f3bd0e7d923270e100a3d834e | [
"MIT"
] | null | null | null | from .xonsh2py import convert
| 15 | 29 | 0.833333 | 4 | 30 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038462 | 0.133333 | 30 | 1 | 30 | 30 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
97438a4ef5b536293a1c7d7159f6bac8e7712762 | 5,619 | py | Python | cocluster/ccnblplt.py | ashish-code/co-clustering-visual-categorization | 0e6d97858793c6562d28b2cefd9f61bc2fa079d0 | [
"MIT"
] | null | null | null | cocluster/ccnblplt.py | ashish-code/co-clustering-visual-categorization | 0e6d97858793c6562d28b2cefd9f61bc2fa079d0 | [
"MIT"
] | null | null | null | cocluster/ccnblplt.py | ashish-code/co-clustering-visual-categorization | 0e6d97858793c6562d28b2cefd9f61bc2fa079d0 | [
"MIT"
] | null | null | null | '''
Created on 2 Aug 2011
plot error bar of mean average precision performance
@author: ag00087
'''
import matplotlib.pyplot as plt
import numpy as np
rootDir = '/vol/vssp/diplecs/ash/Data/'
outDir = '/results/'
def plotresult(dataset,xticklabels,result1,result2,figfmt='svg',title='CoClusterBagofFeatures',ccType='i'):
nXTicks = len(xticklabels)
outPath = rootDir + dataset + outDir + '%s%s%s%s%s'%(title,dataset,ccType,'.',figfmt)
plt.figure()
ax = plt.subplot(111)
plt.errorbar(np.arange(1,(nXTicks+1)), result1[0], result1[1], fmt = '-', color= 'r', ecolor='r', elinewidth=1, marker = 'x',markerfacecolor='k', label = 'CoCluster')
plt.errorbar(np.arange(1,(nXTicks+1)), result2[0], result2[1], fmt = '-', color= 'b', ecolor='b', elinewidth=1, marker = 'o',markerfacecolor='k', label = 'BoF')
plt.xlabel('Visual Categories')
plt.ylabel('Performance Metric')
plt.title('%s Performance: %s ' % (title,dataset))
plt.legend(loc="lower right")
plt.ylim([0.0,1.0])
ax.set_xticks(np.arange(1,(nXTicks+2)))
ax.set_xticklabels(xticklabels,rotation=30,size='small',ha='center')
plt.savefig(outPath,format=figfmt)
plt.show()
plt.close()
def plotClassifierResult(result1,result2,xticklabels,dataset,outPath,title,figfmt):
nXTicks = len(xticklabels)
ax = plt.subplot(111)
#plt.errorbar(np.arange(1,(nXTicks+1)), result1[0], result1[1][0], fmt = '-', color= 'r', ecolor='r', elinewidth=1, marker = 'x',markerfacecolor='k', label = 'CoCluster-SVM')
#plt.errorbar(np.arange(1,(nXTicks+1)), result2[0][0], result2[1][0], fmt = '-', color= 'b', ecolor='b', elinewidth=1, marker = 'o',markerfacecolor='k', label = 'BoF-SVM')
plt.errorbar(np.arange(1,(nXTicks+1)), result1[0], result1[1], fmt = '-', color= 'r', ecolor='r', elinewidth=1, marker = 'x',markerfacecolor='k', label = 'CoCluster-KNN')
plt.errorbar(np.arange(1,(nXTicks+1)), result2[0], result2[1], fmt = '-', color= 'b', ecolor='b', elinewidth=1, marker = 'o',markerfacecolor='k', label = 'BoF-KNN')
plt.xlabel('Visual Categories')
plt.ylabel('fBeta_score')
plt.title('%s Performance: %s ' % (title,dataset))
plt.legend(loc="lower right")
plt.ylim([0.0,1.0])
ax.set_xticks(np.arange(1,(nXTicks+2)))
ax.set_xticklabels(xticklabels,rotation=30,size='small',ha='center')
plt.savefig(outPath,format=figfmt)
plt.show()
plt.close()
def plotWordResult(result1,result2,xticklabels,dataset,outPath,title,figfmt):
nXTicks = len(xticklabels)
ax = plt.subplot(111)
#plt.errorbar(np.arange(1,(nXTicks+1)), result1[0], result1[1][0], fmt = '-', color= 'r', ecolor='r', elinewidth=1, marker = 'x',markerfacecolor='k', label = 'CoCluster-SVM')
#plt.errorbar(np.arange(1,(nXTicks+1)), result2[0][0], result2[1][0], fmt = '-', color= 'b', ecolor='b', elinewidth=1, marker = 'o',markerfacecolor='k', label = 'BoF-SVM')
plt.errorbar(np.arange(1,(nXTicks+1)), result1[0], result1[1], fmt = '-', color= 'r', ecolor='r', elinewidth=1, marker = 'x',markerfacecolor='k', label = 'Co-Cluster')
plt.errorbar(np.arange(1,(nXTicks+1)), result2[0], result2[1], fmt = '-', color= 'b', ecolor='b', elinewidth=1, marker = '+',markerfacecolor='k', label = 'Base-line')
plt.xlabel('visual categories')
plt.ylabel('F1_score')
# plt.title('%s Performance: %s ' % (title,dataset))
plt.title('%s ' % dataset)
plt.legend(loc="upper right")
plt.ylim([0.0,1.0])
ax.set_xticks(np.arange(1,(nXTicks+2)))
ax.set_xticklabels(xticklabels,rotation=30,size='small',ha='center')
plt.savefig(outPath,format=figfmt)
plt.show()
plt.close()
def plotTopicResult(result1,result2,xticklabels,dataset,outPath,figfmt):
nXTicks = len(xticklabels)
ax = plt.subplot(111)
plt.errorbar(np.arange(1,(nXTicks+1)), result1[0], result1[1], fmt = '-', color= 'r', ecolor='r', elinewidth=1, marker = 'x',markerfacecolor='k', label = 'cocluster-Topic')
plt.errorbar(np.arange(1,(nXTicks+1)), result2[0], result2[1], fmt = '-', color= 'b', ecolor='b', elinewidth=1, marker = '+',markerfacecolor='k', label = 'Bag-of-Words')
plt.xlabel('visual categories')
plt.ylabel('F1_score')
plt.title('%s ' % dataset)
plt.legend(loc="upper right")
plt.ylim([0.0,1.0])
ax.set_xticks(np.arange(1,(nXTicks+2)))
ax.set_xticklabels(xticklabels,rotation=30,size='small',ha='center')
plt.savefig(outPath,format=figfmt)
plt.show()
plt.close()
def plotKNNresult(result1,result2,xticklabels,dataset,outPath,title,figfmt):
nXTicks = len(xticklabels)
ax = plt.subplot(111)
#plt.errorbar(np.arange(1,(nXTicks+1)), result1[0], result1[1][0], fmt = '-', color= 'r', ecolor='r', elinewidth=1, marker = 'x',markerfacecolor='k', label = 'CoCluster-SVM')
#plt.errorbar(np.arange(1,(nXTicks+1)), result2[0][0], result2[1][0], fmt = '-', color= 'b', ecolor='b', elinewidth=1, marker = 'o',markerfacecolor='k', label = 'BoF-SVM')
plt.plot(np.arange(1,(nXTicks+1)), result1, fmt = '-', color= 'r', marker = 'x',markerfacecolor='k', label = 'CoCluster-KNN')
plt.plot(np.arange(1,(nXTicks+1)), result2, fmt = '-', color= 'b', marker = 'o',markerfacecolor='k', label = 'BoF-KNN')
plt.xlabel('Visual Categories')
plt.ylabel('F1_score')
plt.title('%s Performance: %s ' % (title,dataset))
plt.legend(loc="upper right")
plt.ylim([0.0,1.0])
ax.set_xticks(np.arange(1,(nXTicks+2)))
ax.set_xticklabels(xticklabels,rotation=30,size='small',ha='center')
plt.savefig(outPath,format=figfmt)
plt.show()
plt.close()
if __name__ == '__main__':
pass | 54.553398 | 178 | 0.648692 | 791 | 5,619 | 4.580278 | 0.140329 | 0.04637 | 0.052167 | 0.092741 | 0.867513 | 0.856749 | 0.8435 | 0.831907 | 0.819211 | 0.819211 | 0 | 0.039422 | 0.137747 | 5,619 | 103 | 179 | 54.553398 | 0.708359 | 0.209468 | 0 | 0.698795 | 0 | 0 | 0.121472 | 0.011063 | 0.024096 | 0 | 0 | 0 | 0 | 1 | 0.060241 | false | 0.012048 | 0.024096 | 0 | 0.084337 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9791acba5c8145133bc784739472b37ac9fc0a65 | 43 | py | Python | ch9_packages/sports/extras/celebrations.py | mikeckennedy/gk_python_demos | e9a81c5a0775f0ad368843ccec434c5b1d588d71 | [
"MIT"
] | null | null | null | ch9_packages/sports/extras/celebrations.py | mikeckennedy/gk_python_demos | e9a81c5a0775f0ad368843ccec434c5b1d588d71 | [
"MIT"
] | null | null | null | ch9_packages/sports/extras/celebrations.py | mikeckennedy/gk_python_demos | e9a81c5a0775f0ad368843ccec434c5b1d588d71 | [
"MIT"
] | null | null | null | def celebrate():
print("Yay, we win!")
| 14.333333 | 25 | 0.581395 | 6 | 43 | 4.166667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.209302 | 43 | 2 | 26 | 21.5 | 0.735294 | 0 | 0 | 0 | 0 | 0 | 0.27907 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
97926aff8483c95f0bf547d7d26dcfb753a81794 | 114 | py | Python | pacote download/lista-exercicios/ex97.py | GiseleViedenhelfen/ExerciciosPython | b616b97a15484aac4980968197955702bd7aaa6d | [
"MIT"
] | null | null | null | pacote download/lista-exercicios/ex97.py | GiseleViedenhelfen/ExerciciosPython | b616b97a15484aac4980968197955702bd7aaa6d | [
"MIT"
] | null | null | null | pacote download/lista-exercicios/ex97.py | GiseleViedenhelfen/ExerciciosPython | b616b97a15484aac4980968197955702bd7aaa6d | [
"MIT"
] | null | null | null | def escreva(txt):
    print('-' * len(txt))
print(txt)
print('-' * len(txt))
escreva('Feliz Aniversário!') | 16.285714 | 29 | 0.578947 | 14 | 114 | 4.714286 | 0.5 | 0.363636 | 0.333333 | 0.424242 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192982 | 114 | 7 | 29 | 16.285714 | 0.717391 | 0 | 0 | 0.4 | 0 | 0 | 0.173913 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.2 | 0.6 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
97bf5115ccda419d8a6b30bd64fffed365085870 | 48 | py | Python | src/papermodels/paper/__init__.py | connorferster/sdm | 517b8fabda116add7519a57c8c422c1514429a42 | [
"Apache-2.0"
] | null | null | null | src/papermodels/paper/__init__.py | connorferster/sdm | 517b8fabda116add7519a57c8c422c1514429a42 | [
"Apache-2.0"
] | null | null | null | src/papermodels/paper/__init__.py | connorferster/sdm | 517b8fabda116add7519a57c8c422c1514429a42 | [
"Apache-2.0"
] | null | null | null | from . import annotations_fdf
from . import plot | 24 | 29 | 0.8125 | 7 | 48 | 5.428571 | 0.714286 | 0.526316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145833 | 48 | 2 | 30 | 24 | 0.926829 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
97cb7f7b591293a379f7bd3b79084b29b74decdd | 353 | py | Python | validate.py | KTOmega/pyns | eaf1d2a06159c7051b78834cb6dedb9f447736c2 | [
"MIT"
] | 1 | 2019-03-18T17:33:38.000Z | 2019-03-18T17:33:38.000Z | validate.py | KTOmega/pyns | eaf1d2a06159c7051b78834cb6dedb9f447736c2 | [
"MIT"
] | null | null | null | validate.py | KTOmega/pyns | eaf1d2a06159c7051b78834cb6dedb9f447736c2 | [
"MIT"
] | null | null | null | import re
IP_REGEX = re.compile(r"^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$")
HOST_REGEX = re.compile(r"^([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9\-]*[a-zA-Z0-9])+$")
def hostname(host):
return HOST_REGEX.fullmatch(host) is not None
def ip(ip):
return IP_REGEX.fullmatch(ip) is not None
| 32.090909 | 130 | 0.569405 | 85 | 353 | 2.317647 | 0.282353 | 0.081218 | 0.060914 | 0.121827 | 0.314721 | 0.314721 | 0.314721 | 0.314721 | 0.314721 | 0.314721 | 0 | 0.146875 | 0.093484 | 353 | 10 | 131 | 35.3 | 0.46875 | 0 | 0 | 0 | 0 | 0.285714 | 0.447592 | 0.447592 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0.285714 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
c14c886416912f2997603c5a3558f6cca4a5e72f | 68 | py | Python | basic/range/range.py | mklsw/python-learning | 7923b3d80ddabc05121543c88178f51379ae910b | [
"MIT"
] | null | null | null | basic/range/range.py | mklsw/python-learning | 7923b3d80ddabc05121543c88178f51379ae910b | [
"MIT"
] | null | null | null | basic/range/range.py | mklsw/python-learning | 7923b3d80ddabc05121543c88178f51379ae910b | [
"MIT"
] | null | null | null | for i in range(5):
print(i)
for i in range(5,10):
print(i) | 11.333333 | 21 | 0.558824 | 15 | 68 | 2.533333 | 0.466667 | 0.210526 | 0.315789 | 0.578947 | 0.631579 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 0.279412 | 68 | 6 | 22 | 11.333333 | 0.693878 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
c155c1eb682c1a2241c4838c03833332c695fff8 | 90 | py | Python | libs/__init__.py | spirit1431007/qiandao-1 | e05976da4e5dfc6f790f7782b6cc97f94ee50611 | [
"MIT"
] | 763 | 2020-05-05T15:45:58.000Z | 2021-10-11T12:34:24.000Z | libs/__init__.py | spirit1431007/qiandao-1 | e05976da4e5dfc6f790f7782b6cc97f94ee50611 | [
"MIT"
] | 114 | 2020-05-16T14:10:36.000Z | 2021-10-12T15:55:46.000Z | libs/__init__.py | BlueskyClouds/qiandao | 5154c09963e1d05ce077772868cfcf4614f551c7 | [
"MIT"
] | 220 | 2020-05-06T03:04:36.000Z | 2021-10-06T11:09:19.000Z | import os,sys
sys.path.append(os.path.abspath(os.path.dirname(os.path.dirname(__file__)))) | 45 | 76 | 0.788889 | 16 | 90 | 4.1875 | 0.5 | 0.268657 | 0.38806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022222 | 90 | 2 | 76 | 45 | 0.761364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
c192691a777ea46c759cd3b82472b7e3e9dadff7 | 144 | py | Python | 2019/08/04/Organizing a Flask Project Beyond Single File/app_structure/app_structure/views.py | kenjitagawa/youtube_video_code | ef3c48b9e136b3745d10395d94be64cb0a1f1c97 | [
"Unlicense"
] | 492 | 2019-06-25T12:54:31.000Z | 2022-03-30T12:38:28.000Z | 2019/08/04/Organizing a Flask Project Beyond Single File/app_structure/app_structure/views.py | kenjitagawa/youtube_video_code | ef3c48b9e136b3745d10395d94be64cb0a1f1c97 | [
"Unlicense"
] | 23 | 2019-10-01T01:36:08.000Z | 2022-02-10T12:46:16.000Z | 2019/08/04/Organizing a Flask Project Beyond Single File/app_structure/app_structure/views.py | kenjitagawa/youtube_video_code | ef3c48b9e136b3745d10395d94be64cb0a1f1c97 | [
"Unlicense"
] | 1,734 | 2019-06-03T06:25:13.000Z | 2022-03-31T23:57:53.000Z | from flask import Blueprint
main = Blueprint('main', __name__)
@main.route('/')
def main_index():
return 'Blueprint Views.py Hello!' | 20.571429 | 38 | 0.680556 | 18 | 144 | 5.166667 | 0.722222 | 0.27957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180556 | 144 | 7 | 38 | 20.571429 | 0.788136 | 0 | 0 | 0 | 0 | 0 | 0.215827 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.2 | 0.6 | 0.6 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 6 |
c1c1faaafdc6538ac3a15816c38ad27342d85424 | 9,443 | py | Python | tests/contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/test_michelson_coding_KT1TpK.py | juztin/pytezos-1 | 7e608ff599d934bdcf129e47db43dbdb8fef9027 | [
"MIT"
] | 1 | 2021-05-20T16:52:08.000Z | 2021-05-20T16:52:08.000Z | tests/contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/test_michelson_coding_KT1TpK.py | juztin/pytezos-1 | 7e608ff599d934bdcf129e47db43dbdb8fef9027 | [
"MIT"
] | 1 | 2020-12-30T16:44:56.000Z | 2020-12-30T16:44:56.000Z | tests/contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/test_michelson_coding_KT1TpK.py | juztin/pytezos-1 | 7e608ff599d934bdcf129e47db43dbdb8fef9027 | [
"MIT"
] | 1 | 2022-03-20T19:01:00.000Z | 2022-03-20T19:01:00.000Z | from unittest import TestCase
from tests import get_data
from pytezos.michelson.micheline import michelson_to_micheline
from pytezos.michelson.formatter import micheline_to_michelson
class MichelsonCodingTestKT1TpK(TestCase):
def setUp(self):
self.maxDiff = None
def test_michelson_parse_code_KT1TpK(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/code_KT1TpK.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/code_KT1TpK.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_code_KT1TpK(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/code_KT1TpK.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/code_KT1TpK.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_code_KT1TpK(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/code_KT1TpK.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
def test_michelson_parse_storage_KT1TpK(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/storage_KT1TpK.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/storage_KT1TpK.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_storage_KT1TpK(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/storage_KT1TpK.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/storage_KT1TpK.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_storage_KT1TpK(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/storage_KT1TpK.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
def test_michelson_parse_parameter_opPXR3(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_opPXR3.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_opPXR3.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_parameter_opPXR3(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_opPXR3.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_opPXR3.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_parameter_opPXR3(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_opPXR3.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
def test_michelson_parse_parameter_ooXbxf(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooXbxf.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooXbxf.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_parameter_ooXbxf(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooXbxf.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooXbxf.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_parameter_ooXbxf(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooXbxf.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
def test_michelson_parse_parameter_ooGmSN(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooGmSN.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooGmSN.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_parameter_ooGmSN(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooGmSN.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooGmSN.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_parameter_ooGmSN(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooGmSN.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
def test_michelson_parse_parameter_oosH2o(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_oosH2o.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_oosH2o.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_parameter_oosH2o(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_oosH2o.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_oosH2o.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_parameter_oosH2o(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_oosH2o.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
def test_michelson_parse_parameter_ooMKby(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooMKby.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooMKby.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_parameter_ooMKby(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooMKby.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooMKby.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_parameter_ooMKby(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_ooMKby.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
def test_michelson_parse_parameter_onmk5E(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_onmk5E.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_onmk5E.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_parameter_onmk5E(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_onmk5E.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_onmk5E.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_parameter_onmk5E(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_onmk5E.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
def test_michelson_parse_parameter_op1yUC(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_op1yUC.json')
actual = michelson_to_micheline(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_op1yUC.tz'))
self.assertEqual(expected, actual)
def test_michelson_format_parameter_op1yUC(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_op1yUC.tz')
actual = micheline_to_michelson(get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_op1yUC.json'),
inline=True)
self.assertEqual(expected, actual)
def test_michelson_inverse_parameter_op1yUC(self):
expected = get_data(
path='contracts/KT1TpKkwKzGwMrWrGnPp9KixhraD2dtE5wE5/parameter_op1yUC.json')
actual = michelson_to_micheline(micheline_to_michelson(expected))
self.assertEqual(expected, actual)
| 46.9801 | 90 | 0.733983 | 880 | 9,443 | 7.563636 | 0.05 | 0.048377 | 0.074369 | 0.135216 | 0.963341 | 0.963341 | 0.963341 | 0.963341 | 0.947416 | 0.947416 | 0 | 0.035883 | 0.191359 | 9,443 | 200 | 91 | 47.215 | 0.835778 | 0 | 0 | 0.639053 | 0 | 0 | 0.316531 | 0.316531 | 0 | 0 | 0 | 0 | 0.159763 | 1 | 0.16568 | false | 0 | 0.023669 | 0 | 0.195266 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c1c458c8397258536a9bb4d6df3f6cd9b49fc7c2 | 47 | py | Python | commit_and_push_test/test.py | kounorimich/Samurai | f165e3ab8db998f50b2a753a5aee504a317fafd2 | [
"MIT"
] | null | null | null | commit_and_push_test/test.py | kounorimich/Samurai | f165e3ab8db998f50b2a753a5aee504a317fafd2 | [
"MIT"
] | null | null | null | commit_and_push_test/test.py | kounorimich/Samurai | f165e3ab8db998f50b2a753a5aee504a317fafd2 | [
"MIT"
] | null | null | null | print('COMMIT!!というかpublish??')
print('PUSH!!') | 15.666667 | 30 | 0.659574 | 5 | 47 | 6.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 47 | 3 | 31 | 15.666667 | 0.688889 | 0 | 0 | 0 | 0 | 0 | 0.5625 | 0.4375 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
c1c795ae7ab935a3bf0e65abbca453ab8b46632b | 44 | py | Python | config.py | 1ierro1ast/shodan-tool | 76ce7524ad2ab4ad914e01dfce251ecdcd129dd6 | [
"MIT"
] | 4 | 2019-08-17T08:53:02.000Z | 2022-01-27T10:09:01.000Z | config.py | 1ierro1ast/shodan-tool | 76ce7524ad2ab4ad914e01dfce251ecdcd129dd6 | [
"MIT"
] | null | null | null | config.py | 1ierro1ast/shodan-tool | 76ce7524ad2ab4ad914e01dfce251ecdcd129dd6 | [
"MIT"
] | null | null | null | apiKey = "PSKINdQe1GyxGgecYz2191H2JoS9qvgD"
| 22 | 43 | 0.863636 | 2 | 44 | 19 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170732 | 0.068182 | 44 | 1 | 44 | 44 | 0.756098 | 0 | 0 | 0 | 0 | 0 | 0.727273 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
de06eee650ff3338af19376b8ed6aee50ffc80e8 | 35 | py | Python | core/Modules/__init__.py | davidliyutong/Flint | 4e2552dac8d781c21e8998ad68bbf1b986b09258 | [
"MIT"
] | null | null | null | core/Modules/__init__.py | davidliyutong/Flint | 4e2552dac8d781c21e8998ad68bbf1b986b09258 | [
"MIT"
] | 1 | 2020-07-08T02:57:50.000Z | 2020-07-08T02:57:50.000Z | core/Modules/__init__.py | davidliyutong/Flint | 4e2552dac8d781c21e8998ad68bbf1b986b09258 | [
"MIT"
] | null | null | null | from .sequential import sequential
| 17.5 | 34 | 0.857143 | 4 | 35 | 7.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e7743b9762c52863e8d68f63f45693d443b9f95d | 201 | py | Python | myopenpantry/models/__init__.py | MyOpenPantry/flask-backend | e94702bfa04f36c1a6015ae3e9c37dfb7b923279 | [
"MIT"
] | null | null | null | myopenpantry/models/__init__.py | MyOpenPantry/flask-backend | e94702bfa04f36c1a6015ae3e9c37dfb7b923279 | [
"MIT"
] | 4 | 2021-03-28T19:47:04.000Z | 2021-05-04T00:59:46.000Z | myopenpantry/models/__init__.py | MyOpenPantry/flask-backend | e94702bfa04f36c1a6015ae3e9c37dfb7b923279 | [
"MIT"
] | null | null | null | from .associations import RecipeIngredient, recipe_tags # noqa
from .ingredients import Ingredient # noqa
from .recipes import Recipe # noqa
from .items import Item # noqa
from .tags import Tag # noqa
| 33.5 | 62 | 0.791045 | 27 | 201 | 5.851852 | 0.481481 | 0.202532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.154229 | 201 | 5 | 63 | 40.2 | 0.929412 | 0.119403 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e77cd0012bfbdb74273e0223a50cd2867a257447 | 680 | py | Python | config_sample.py | MIT-LCP/QueryBuilder-AWS | 20484ab95d04ce4edec41102b33e6cf91180d623 | [
"MIT"
] | 1 | 2019-05-12T23:32:14.000Z | 2019-05-12T23:32:14.000Z | config_sample.py | MIT-LCP/QueryBuilder-AWS | 20484ab95d04ce4edec41102b33e6cf91180d623 | [
"MIT"
] | 3 | 2017-11-21T14:15:41.000Z | 2019-03-08T00:04:29.000Z | config_sample.py | MIT-LCP/QueryBuilder-AWS | 20484ab95d04ce4edec41102b33e6cf91180d623 | [
"MIT"
] | 1 | 2019-09-23T13:24:52.000Z | 2019-09-23T13:24:52.000Z | class Config:
	def __init__(self, name = None, user = None, passwd = None):
		self.user = user or "username"
		self.passwd = passwd or "password"
		self.name = name or "QueryBuilder"
def getUser(self):
return self.user
def getPassword(self):
return self.passwd
def getDBName(self):
return self.name
class DBConfig:
	def __init__(self, name = None, user = None, passwd = None):
		self.user = user or "mimicusername"
		self.passwd = passwd or "mimicpassword"
		self.name = name or "mimic"
def getUser(self):
return self.user
def getPassword(self):
return self.passwd
def getDBName(self):
return self.name
| 22.666667 | 64 | 0.589706 | 76 | 680 | 5.171053 | 0.263158 | 0.122137 | 0.21374 | 0.076336 | 0.697201 | 0.697201 | 0.697201 | 0.697201 | 0.697201 | 0.697201 | 0 | 0 | 0.310294 | 680 | 29 | 65 | 23.448276 | 0.837953 | 0 | 0 | 0.636364 | 0 | 0 | 0.086765 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.363636 | false | 0.363636 | 0 | 0.272727 | 0.727273 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
99c0db2049afc52e1d708936f96b7bf3c4c57f72 | 37 | py | Python | singoutloud/__init__.py | Happy-Kunal/singOutLoud | e4733101721c435c5cf4a6b8e0edd7ac1eaab817 | [
"MIT"
] | 4 | 2021-09-05T15:05:27.000Z | 2021-09-30T13:58:48.000Z | singoutloud/__init__.py | Happy-Kunal/singOutLoud | e4733101721c435c5cf4a6b8e0edd7ac1eaab817 | [
"MIT"
] | 4 | 2021-09-06T17:01:11.000Z | 2021-09-08T13:58:49.000Z | singoutloud/__init__.py | Happy-Kunal/singOutLoud | e4733101721c435c5cf4a6b8e0edd7ac1eaab817 | [
"MIT"
] | null | null | null | import player
import killAbleThread
| 9.25 | 21 | 0.864865 | 4 | 37 | 8 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 37 | 3 | 22 | 12.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
82078d36ea9df2320701b271dabdbed1d63c7bfb | 28 | py | Python | molecules/plot/__init__.py | braceal/molecules | 6c6c7efc2b968aa42b957be4afd418da190b43dd | [
"MIT"
] | 4 | 2020-08-06T20:08:25.000Z | 2021-01-25T00:13:57.000Z | molecules/plot/__init__.py | braceal/molecules | 6c6c7efc2b968aa42b957be4afd418da190b43dd | [
"MIT"
] | 43 | 2020-05-06T04:33:19.000Z | 2021-03-17T14:47:36.000Z | molecules/plot/__init__.py | braceal/molecules | 6c6c7efc2b968aa42b957be4afd418da190b43dd | [
"MIT"
] | 2 | 2020-06-08T15:17:39.000Z | 2020-07-29T16:40:34.000Z | from .tsne import plot_tsne
| 14 | 27 | 0.821429 | 5 | 28 | 4.4 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 28 | 1 | 28 | 28 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
413963a586e70a431d65653f43658ed1af66ede4 | 19,268 | py | Python | src/picasso/gui/programs/groops_interface.py | sreimond/picasso | 89948b1707f44ca03c566da1d4424d9bc208e380 | [
"MIT"
] | null | null | null | src/picasso/gui/programs/groops_interface.py | sreimond/picasso | 89948b1707f44ca03c566da1d4424d9bc208e380 | [
"MIT"
] | null | null | null | src/picasso/gui/programs/groops_interface.py | sreimond/picasso | 89948b1707f44ca03c566da1d4424d9bc208e380 | [
"MIT"
] | 1 | 2021-02-26T18:33:33.000Z | 2021-02-26T18:33:33.000Z | # -*- coding: utf-8 -*-
"""
Created on Thu Apr 26 12:37:04 2018
@author: sreimond
"""
import os
import numpy as np
import pkg_resources
import shutil, tempfile
from picasso.utils.dates_and_time import date_functions
from picasso.gui.programs import compute_grace as cg
from picasso.gui.programs import compute_corrections as cc
from picasso.gui.programs import compute_internal_validation as civ
def make_grid_file(groops_bin,grid_file,lon,lat,h,area,*args,**kwargs):
# determine dimensions
point_count = np.size(lon)
args_count = len(args)
if args_count>5: # max. 5 args allowed
args_count=5
# determine xml file location
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/ascii2grid_args%d.xml' % args_count)
# create and fill output array
output_ascii = np.ones((point_count,4+args_count)) * np.nan
output_ascii[:,0] = lon
output_ascii[:,1] = lat
output_ascii[:,2] = h
output_ascii[:,3] = area
for ix in list(range(args_count)):
output_ascii[:,4+ix] = args[ix]
# replace nans and infs
ix_nan = np.isnan(output_ascii)
output_ascii[ix_nan] = 1e30
ix_inf = np.isinf(output_ascii)
output_ascii[ix_inf] = 1e30
# limit point count?
if 'point_count' in kwargs:
new_point_count = kwargs.get("point_count")
if new_point_count>point_count or new_point_count<1:
new_point_count = point_count
output_ascii = output_ascii[0:new_point_count,:]
# get temporary file and save
temp_dir = tempfile.mkdtemp()
ascii_file = os.path.join(temp_dir, 'ascii.txt')
np.savetxt(ascii_file,output_ascii,'%+.16e')
# call groops
sys_str = ""
sys_str += groops_bin
sys_str += " -g outputfileGriddedData=%s" % grid_file
sys_str += " -g inputfile=%s" % ascii_file
sys_str += " %s 2>/dev/null" %(xml_file)
ret = os.system(sys_str)
shutil.rmtree(temp_dir)
return ret
def build_pointmass_normals(groops_bin,mjd_start,mjd_end,grid_file,output_path,leo=False,compute_goce=False,goce_only=False):
year,month,day = date_functions.mjd2ymd(mjd_start)
gridi = np.genfromtxt(grid_file,skip_header=2)
gridi = np.array(gridi,ndmin=2)
point_count = gridi.shape[0]
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/build_pointmass_normals_grace_only.xml')
if compute_goce and goce_only:
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/build_pointmass_normals_goce_only.xml')
elif compute_goce:
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/build_pointmass_normals_grace_goce.xml')
# system string template
sys_str = ""
sys_str += "xxx_grops_bin"
sys_str += " -g timeStart=xxx_mjd_time_start"
sys_str += " -g timeEnd=xxx_mjd_time_end"
sys_str += " -g numberOfPoints=xxx_number_of_points"
sys_str += " -g inputfileGrid=xxx_grid_file"
sys_str += " -g outputfileNormalequationGraceDat=xxx_output_path_grace_xxxx-xx-normals.dat"
sys_str += " -g outputfileNormalequationGoceDat=xxx_output_path_goce_xxxx-xx-normals.dat"
sys_str += " -g groopsPath=xxx_groops_path"
sys_str += " xxx_xml_file 2>/dev/null"
# replace placeholder
sys_str_i = sys_str
sys_str_i = sys_str_i.replace('xxx_grops_bin',groops_bin)
sys_str_i = sys_str_i.replace('xxx_mjd_time_start','%08.2f' % mjd_start)
sys_str_i = sys_str_i.replace('xxx_mjd_time_end','%08.2f' % mjd_end)
sys_str_i = sys_str_i.replace('xxxx-xx',('%4d-%02d' % (year,month)))
sys_str_i = sys_str_i.replace('xxx_number_of_points','%d' % point_count)
sys_str_i = sys_str_i.replace('xxx_grid_file',grid_file)
sys_str_i = sys_str_i.replace('xxx_output_path_',output_path+'/')
sys_str_i = sys_str_i.replace('xxx_xml_file',xml_file)
sys_str_i = sys_str_i.replace('xxx_groops_path',cg.data_path)
# execute command if local, otherwise (if leo) return command string
if not leo:
ret = os.system(sys_str_i)
else:
ret = sys_str_i
return ret
def build_pointmass_normals_ga(groops_bin,mjd_start,mjd_end,grid_file,output_path,leo=False,love_enabled=False):
year,month,day = date_functions.mjd2ymd(mjd_start)
gridi = np.genfromtxt(grid_file,skip_header=2)
gridi = np.array(gridi,ndmin=2)
point_count = gridi.shape[0]
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/build_pointmass_normals_grace_only_ga.xml')
if love_enabled:
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/build_pointmass_normals_grace_only_love_enabled_ga.xml')
# system string template
sys_str = ""
sys_str += "xxx_grops_bin"
sys_str += " -g timeStart=xxx_mjd_time_start"
sys_str += " -g timeEnd=xxx_mjd_time_end"
sys_str += " -g numberOfPoints=xxx_number_of_points"
sys_str += " -g inputfileGrid=xxx_grid_file"
sys_str += " -g outputfileNormalequationGraceDat=xxx_output_path_-normals.dat"
sys_str += " -g groopsPath=xxx_groops_path"
sys_str += " xxx_xml_file 2>/dev/null"
# replace placeholder
sys_str_i = sys_str
sys_str_i = sys_str_i.replace('xxx_grops_bin',groops_bin)
sys_str_i = sys_str_i.replace('xxx_mjd_time_start','%08.2f' % mjd_start)
sys_str_i = sys_str_i.replace('xxx_mjd_time_end','%08.2f' % mjd_end)
sys_str_i = sys_str_i.replace('xxxx-xx',('%4d-%02d' % (year,month)))
sys_str_i = sys_str_i.replace('xxx_number_of_points','%d' % point_count)
sys_str_i = sys_str_i.replace('xxx_grid_file',grid_file)
sys_str_i = sys_str_i.replace('xxx_output_path_',output_path)
sys_str_i = sys_str_i.replace('xxx_xml_file',xml_file)
sys_str_i = sys_str_i.replace('xxx_groops_path',cg.data_path)
# execute command if local, otherwise (if leo) return command string
if not leo:
ret = os.system(sys_str_i)
else:
ret = sys_str_i
return ret
def combine_eliminate_solve_pointmass_normals(groops_bin,mjd_start,mjd_end,grid_file,output_path,leo=False,compute_goce=False,goce_only=False):
year,month,day = date_functions.mjd2ymd(mjd_start)
gridi = np.genfromtxt(grid_file,skip_header=2)
gridi = np.array(gridi,ndmin=2)
point_count = gridi.shape[0]
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/combine_eliminate_solve_pointmass_normals_grace_only.xml')
if compute_goce and goce_only:
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/combine_eliminate_solve_pointmass_normals_goce_only.xml')
elif compute_goce:
xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/combine_eliminate_solve_pointmass_normals.xml')
# system string template
sys_str = ""
sys_str += "xxx_grops_bin"
sys_str += " -g timeStart=xxx_mjd_time_start"
sys_str += " -g timeEnd=xxx_mjd_time_end"
sys_str += " -g numberOfPoints=xxx_number_of_points"
sys_str += " -g inputfileGrid=xxx_grid_file"
sys_str += " -g outputfileNormalequationGraceGoceDat=xxx_output_path_grace_goce_xxxx-xx-normals.dat"
sys_str += " -g outputfileNormalequationGraceGoceTxt=xxx_output_path_grace_goce_xxxx-xx-normals.txt"
sys_str += " -g outputfileNormalequationRegularizedGraceGoceTxt=xxx_output_path_grace_goce_xxxx-xx-normalsRegularized.txt"
sys_str += " -g outputfileSolutionGraceGoce=xxx_output_path_grace_goce_xxxx-xx-x.txt"
sys_str += " -g outputfileSigmaxGraceGoce=xxx_output_path_grace_goce_xxxx-xx-sigmax.txt"
sys_str += " -g outputfileNormalequationGraceDat=xxx_output_path_grace_xxxx-xx-normals.dat"
sys_str += " -g outputfileNormalequationGraceTxt=xxx_output_path_grace_xxxx-xx-normals.txt"
sys_str += " -g outputfileNormalequationRegularizedGraceTxt=xxx_output_path_grace_xxxx-xx-normalsRegularized.txt"
sys_str += " -g outputfileSolutionGrace=xxx_output_path_grace_xxxx-xx-x.txt"
sys_str += " -g outputfileSigmaxGrace=xxx_output_path_grace_xxxx-xx-sigmax.txt"
sys_str += " -g outputfileNormalequationGoceDat=xxx_output_path_goce_xxxx-xx-normals.dat"
sys_str += " -g outputfileNormalequationGoceTxt=xxx_output_path_goce_xxxx-xx-normals.txt"
sys_str += " -g outputfileNormalequationRegularizedGoceTxt=xxx_output_path_goce_xxxx-xx-normalsRegularized.txt"
sys_str += " -g outputfileSolutionGoce=xxx_output_path_goce_xxxx-xx-x.txt"
sys_str += " -g outputfileSigmaxGoce=xxx_output_path_goce_xxxx-xx-sigmax.txt"
sys_str += " -g groopsPath=xxx_groops_path"
sys_str += " xxx_xml_file 2>/dev/null"
# replace placeholder
sys_str_i = sys_str
sys_str_i = sys_str_i.replace('xxx_grops_bin',groops_bin)
sys_str_i = sys_str_i.replace('xxx_mjd_time_start','%08.2f' % mjd_start)
sys_str_i = sys_str_i.replace('xxx_mjd_time_end','%08.2f' % mjd_end)
sys_str_i = sys_str_i.replace('xxxx-xx',('%4d-%02d' % (year,month)))
sys_str_i = sys_str_i.replace('xxx_number_of_points','%d' % point_count)
sys_str_i = sys_str_i.replace('xxx_grid_file',grid_file)
sys_str_i = sys_str_i.replace('xxx_output_path_',output_path+'/')
sys_str_i = sys_str_i.replace('xxx_xml_file',xml_file)
sys_str_i = sys_str_i.replace('xxx_groops_path',cg.data_path)
# execute command if local, otherwise (if leo) return command string
if not leo:
ret = os.system(sys_str_i)
else:
ret = sys_str_i
return ret
def eliminate_solve_pointmass_normals_ga(groops_bin,mjd_start,mjd_end,grid_file,output_path,leo=False):
    year,month,day = date_functions.mjd2ymd(mjd_start)
    gridi = np.genfromtxt(grid_file,skip_header=2)
    gridi = np.array(gridi,ndmin=2)
    point_count = gridi.shape[0]
    xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/eliminate_solve_pointmass_normals_grace_only_ga.xml')
    # system string template
    sys_str = ""
    sys_str += "xxx_grops_bin"
    sys_str += " -g timeStart=xxx_mjd_time_start"
    sys_str += " -g timeEnd=xxx_mjd_time_end"
    sys_str += " -g numberOfPoints=xxx_number_of_points"
    sys_str += " -g inputfileGrid=xxx_grid_file"
    sys_str += " -g outputfileNormalequationGraceDat=xxx_output_path_-normals.dat"
    sys_str += " -g outputfileNormalequationGraceTxt=xxx_output_path_-normals.txt"
    sys_str += " -g outputfileNormalequationRegularizedGraceTxt=xxx_output_path_-normalsRegularized.txt"
    sys_str += " -g outputfileSolutionGrace=xxx_output_path_-x.txt"
    sys_str += " -g outputfileSigmaxGrace=xxx_output_path_-sigmax.txt"
    sys_str += " -g groopsPath=xxx_groops_path"
    sys_str += " xxx_xml_file 2>/dev/null"
    # replace placeholder
    sys_str_i = sys_str
    sys_str_i = sys_str_i.replace('xxx_grops_bin',groops_bin)
    sys_str_i = sys_str_i.replace('xxx_mjd_time_start','%08.2f' % mjd_start)
    sys_str_i = sys_str_i.replace('xxx_mjd_time_end','%08.2f' % mjd_end)
    sys_str_i = sys_str_i.replace('xxxx-xx',('%4d-%02d' % (year,month)))
    sys_str_i = sys_str_i.replace('xxx_number_of_points','%d' % point_count)
    sys_str_i = sys_str_i.replace('xxx_grid_file',grid_file)
    sys_str_i = sys_str_i.replace('xxx_output_path_',output_path)
    sys_str_i = sys_str_i.replace('xxx_xml_file',xml_file)
    sys_str_i = sys_str_i.replace('xxx_groops_path',cg.data_path)
    # execute command if local, otherwise (if leo) return command string
    if not leo:
        ret = os.system(sys_str_i)
    else:
        ret = sys_str_i
    return ret
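Every builder above follows the same pattern: assemble a command template containing `xxx_*` placeholders, then substitute the concrete values one `str.replace` at a time. The same idea can be expressed once with a dict-driven helper; `fill_template` and the sample paths below are illustrative sketches, not part of the original module.

```python
def fill_template(template, values):
    """Replace every placeholder key in template with its value."""
    for key, val in values.items():
        template = template.replace(key, val)
    return template

# Hypothetical values, mirroring the substitutions done above.
cmd = fill_template(
    "xxx_grops_bin -g timeStart=xxx_mjd_time_start xxx_xml_file",
    {
        "xxx_grops_bin": "/opt/groops/bin/groops",
        "xxx_mjd_time_start": "%08.2f" % 58849.0,
        "xxx_xml_file": "config.xml",
    },
)
print(cmd)  # /opt/groops/bin/groops -g timeStart=58849.00 config.xml
```

Plain string replacement keeps the template readable, but note it is order-sensitive when one placeholder is a prefix of another (the original code relies on distinct placeholder names to avoid this).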
def make_matrix_file(groops_bin,matrix_file,array):
    # determine dimensions
    array = np.array(array,ndmin=2)
    row_count = array.shape[0]
    col_count = array.shape[1]
    ele_count = row_count * col_count
    # determine xml file location
    xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/ascii2matrix.xml')
    # create and fill output array
    output_ascii = np.zeros((ele_count,3))
    ix = -1
    for m in list(range(row_count)):
        for n in list(range(col_count)):
            ix += 1
            output_ascii[ix,0] = m
            output_ascii[ix,1] = n
            output_ascii[ix,2] = array[m,n]
    # get temporary file and save
    temp_dir = tempfile.mkdtemp()
    ascii_file = os.path.join(temp_dir, 'ascii.txt')
    np.savetxt(ascii_file,output_ascii,'%d %d %+.16e')
    # call groops
    sys_str = ""
    sys_str += groops_bin
    sys_str += " -g numberColumns=%d" % col_count
    sys_str += " -g numberRows=%d" % row_count
    sys_str += " -g inputfileAscii=%s" % ascii_file
    sys_str += " -g outputfileMatrix=%s" % matrix_file
    sys_str += " %s 2>/dev/null" %(xml_file)
    ret = os.system(sys_str)
    shutil.rmtree(temp_dir)
    return ret
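The nested loop in `make_matrix_file` emits one `(row, column, value)` triplet per matrix element. The same flattening can be done vectorized with NumPy; the small array below is purely illustrative.

```python
import numpy as np

# Vectorized equivalent of the index-tracking loop above.
array = np.arange(6, dtype=float).reshape(2, 3)
rows, cols = np.indices(array.shape)
triplets = np.column_stack((rows.ravel(), cols.ravel(), array.ravel()))
print(triplets.shape)  # one row per matrix element: (6, 3)
```

`np.indices` produces the row and column coordinates in the same row-major order the loop iterates in, so `triplets` matches the file written by `np.savetxt` line for line.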
def compute_goco_grid(groops_bin,input_grid,output_grid,mjd):
    # determine xml file location
    xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/goco2grid.xml')
    # system string template
    sys_str = ""
    sys_str += "xxx_grops_bin"
    sys_str += " -g timeStart=xxx_mjd_time_start"
    sys_str += " -g inputfileGrid=xxx_input_grid_file"
    sys_str += " -g outputfileGriddedData=xxx_output_grid_file"
    sys_str += " -g groopsPath=xxx_groops_path"
    sys_str += " xxx_xml_file 2>/dev/null"
    # replace placeholder
    sys_str_i = sys_str
    sys_str_i = sys_str_i.replace('xxx_grops_bin',groops_bin)
    sys_str_i = sys_str_i.replace('xxx_mjd_time_start','%08.2f' % mjd)
    sys_str_i = sys_str_i.replace('xxx_input_grid_file',input_grid)
    sys_str_i = sys_str_i.replace('xxx_output_grid_file',output_grid)
    sys_str_i = sys_str_i.replace('xxx_xml_file',xml_file)
    sys_str_i = sys_str_i.replace('xxx_groops_path',os.path.join(cc.data_path,'IFG','raw'))
    # execute command if local
    ret = os.system(sys_str_i)
    return ret
def make_grid_in_polygon_file(groops_bin,output_grid,polygon_file):
    # determine xml file location
    xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid_in_polygon.xml')
    # call groops
    sys_str = ""
    sys_str += groops_bin
    sys_str += " -g outputfileGrid=%s" % output_grid
    sys_str += " -g inputfilePolygon=%s" % polygon_file
    sys_str += " %s 2>/dev/null" %(xml_file)
    ret = os.system(sys_str)
    return ret
def make_specific_grid_in_polygon_file(groops_bin,output_grid,polygon_file,grid_type,grid_resolution):
    # assemble groops call
    sys_str = ""
    sys_str += groops_bin
    sys_str += " -g outputfileGrid=%s" % output_grid
    sys_str += " -g inputfilePolygon=%s" % polygon_file
    # determine xml file location and resolution parameter per grid type
    if grid_type==0 or grid_type=='geographical':
        xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid_in_polygon_geographical.xml')
        sys_str += " -g delta=%.10f" % grid_resolution
    elif grid_type==1 or grid_type=='triangleVertex':
        xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid_in_polygon_triangleVertex.xml')
        sys_str += " -g level=%d" % grid_resolution
    elif grid_type==2 or grid_type=='triangleCenter':
        xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid_in_polygon_triangleCenter.xml')
        sys_str += " -g level=%d" % grid_resolution
    elif grid_type==3 or grid_type=='gauss':
        xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid_in_polygon_gauss.xml')
        sys_str += " -g parallelsCount=%d" % grid_resolution
    elif grid_type==4 or grid_type=='reuter':
        xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid_in_polygon_reuter.xml')
        sys_str += " -g gamma=%d" % grid_resolution
    elif grid_type==5 or grid_type=='corput':
        xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid_in_polygon_corput.xml')
        sys_str += " -g globalPointsCount=%d" % grid_resolution
    elif grid_type==6 or grid_type=='driscoll':
        xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid_in_polygon_driscoll.xml')
        sys_str += " -g dimension=%d" % grid_resolution
    else:
        raise ValueError("unknown grid_type: %s" % grid_type)
    # call groops
    sys_str += " %s 2>/dev/null" %(xml_file)
    ret = os.system(sys_str)
    return ret
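The branch ladder above pairs each grid type with an XML template and a resolution parameter name. For reference, the same mapping written as a lookup table (a sketch only; the original code keeps the explicit `if`/`elif` chain):

```python
# Each entry: grid type name -> (XML template file, resolution flag).
# Entries mirror the branches above.
GRID_TYPES = {
    'geographical':   ('grid_in_polygon_geographical.xml',   'delta'),
    'triangleVertex': ('grid_in_polygon_triangleVertex.xml', 'level'),
    'triangleCenter': ('grid_in_polygon_triangleCenter.xml', 'level'),
    'gauss':          ('grid_in_polygon_gauss.xml',          'parallelsCount'),
    'reuter':         ('grid_in_polygon_reuter.xml',         'gamma'),
    'corput':         ('grid_in_polygon_corput.xml',         'globalPointsCount'),
    'driscoll':       ('grid_in_polygon_driscoll.xml',       'dimension'),
}

xml_name, flag = GRID_TYPES['gauss']
print(flag)  # parallelsCount
```

A table like this also makes the set of supported grid types visible at a glance, and a `KeyError` on an unknown name replaces a silently undefined `xml_file`.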
def compute_gfc_grid(groops_bin,input_grid,output_grid,mjd,min_degree,max_degree,gauss,data_center='itsg'):
    # determine xml file location
    xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/%s2grid.xml' % data_center)
    # system string template
    sys_str = ""
    sys_str += "xxx_grops_bin"
    sys_str += " -g mjd=xxx_mjd"
    sys_str += " -g inputfileGrid=xxx_grid_file"
    sys_str += " -g outputfileGriddedData=xxx_output_grid"
    sys_str += " -g groopsPath=xxx_groops_path"
    sys_str += " -g minDegree=xxx_min_degree"
    sys_str += " -g maxDegree=xxx_max_degree"
    sys_str += " -g gaussRadius=xxx_gauss"
    sys_str += " xxx_xml_file 2>/dev/null"
    # replace placeholder
    sys_str_i = sys_str
    sys_str_i = sys_str_i.replace('xxx_grops_bin',groops_bin)
    sys_str_i = sys_str_i.replace('xxx_mjd','%08.2f' % mjd)
    sys_str_i = sys_str_i.replace('xxx_grid_file',input_grid)
    sys_str_i = sys_str_i.replace('xxx_output_grid',output_grid)
    sys_str_i = sys_str_i.replace('xxx_min_degree','%d' % min_degree)
    sys_str_i = sys_str_i.replace('xxx_max_degree','%d' % max_degree)
    sys_str_i = sys_str_i.replace('xxx_gauss','%d' % gauss)
    sys_str_i = sys_str_i.replace('xxx_xml_file',xml_file)
    sys_str_i = sys_str_i.replace('xxx_groops_path',civ.data_path)
    # execute command if local
    ret = os.system(sys_str_i)
    return ret
def compute_grid_to_gfc_to_grid(groops_bin,input_gridded_data,output_grid,input_grid,gauss,min_degree=0,max_degree=60):
    # determine xml file location
    xml_file = pkg_resources.resource_filename('picasso.data', 'GROOPS/grid2gfc2grid.xml')
    # system string template
    sys_str = ""
    sys_str += "xxx_grops_bin"
    sys_str += " -g inputfileGriddedData=xxx_gridded_data"
    sys_str += " -g outputfileGriddedData=xxx_output_grid"
    sys_str += " -g outputfilePotentialCoefficients=xxx_gfc_file"
    sys_str += " -g inputfileGrid=xxx_grid_file"
    sys_str += " -g groopsPath=xxx_groops_path"
    sys_str += " -g minDegree=xxx_min_degree"
    sys_str += " -g maxDegree=xxx_max_degree"
    sys_str += " -g gaussRadius=xxx_gauss"
    sys_str += " xxx_xml_file 2>/dev/null"
    # get temporary file and save
    temp_dir = tempfile.mkdtemp()
    gfc_file = os.path.join(temp_dir, 'tmp.gfc')
    # replace placeholder
    sys_str_i = sys_str
    sys_str_i = sys_str_i.replace('xxx_grops_bin',groops_bin)
    sys_str_i = sys_str_i.replace('xxx_gridded_data',input_gridded_data)
    sys_str_i = sys_str_i.replace('xxx_output_grid',output_grid)
    sys_str_i = sys_str_i.replace('xxx_grid_file',input_grid)
    sys_str_i = sys_str_i.replace('xxx_gfc_file',gfc_file)
    sys_str_i = sys_str_i.replace('xxx_min_degree','%d' % min_degree)
    sys_str_i = sys_str_i.replace('xxx_max_degree','%d' % max_degree)
    sys_str_i = sys_str_i.replace('xxx_gauss','%d' % gauss)
    sys_str_i = sys_str_i.replace('xxx_xml_file',xml_file)
    sys_str_i = sys_str_i.replace('xxx_groops_path',civ.data_path)
    # execute command if local
    ret = os.system(sys_str_i)
    shutil.rmtree(temp_dir)
    return ret
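`compute_grid_to_gfc_to_grid` writes its intermediate `.gfc` file into a scratch directory that is removed once the GROOPS call returns. The lifecycle in isolation (the file name and contents below are illustrative):

```python
import os
import shutil
import tempfile

# Create a scratch directory, place an intermediate file in it, then
# remove the whole directory afterwards -- as the function above does.
temp_dir = tempfile.mkdtemp()
scratch = os.path.join(temp_dir, 'tmp.gfc')
with open(scratch, 'w') as fh:
    fh.write('placeholder')
exists_before = os.path.exists(scratch)
shutil.rmtree(temp_dir)
exists_after = os.path.exists(temp_dir)
print(exists_before, exists_after)  # True False
```

One caveat of the pattern as written above: if the external call raises, `shutil.rmtree` is skipped and the directory leaks; wrapping the cleanup in `try`/`finally` (or using `tempfile.TemporaryDirectory`) avoids that.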
| 48.17 | 143 | 0.718134 | 2,938 | 19,268 | 4.313479 | 0.084071 | 0.125464 | 0.07733 | 0.053657 | 0.8324 | 0.817091 | 0.785134 | 0.776217 | 0.744023 | 0.697862 | 0 | 0.008105 | 0.167584 | 19,268 | 399 | 144 | 48.290727 | 0.782031 | 0.064771 | 0 | 0.621951 | 0 | 0 | 0.327582 | 0.199354 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033537 | false | 0 | 0.02439 | 0 | 0.091463 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
418ce8f073402db9cf2346aefde155b743bad840 | 1,212 | py | Python | sdk/python/pulumi_aws_native/rds/__init__.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 29 | 2021-09-30T19:32:07.000Z | 2022-03-22T21:06:08.000Z | sdk/python/pulumi_aws_native/rds/__init__.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 232 | 2021-09-30T19:26:26.000Z | 2022-03-31T23:22:06.000Z | sdk/python/pulumi_aws_native/rds/__init__.py | pulumi/pulumi-aws-native | 1ae4a4d9c2256b2a79ca536f8d8497b28d10e4c3 | [
"Apache-2.0"
] | 4 | 2021-11-10T19:42:01.000Z | 2022-02-05T10:15:49.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
from .. import _utilities
import typing
# Export this package's modules as members:
from ._enums import *
from .db_cluster import *
from .db_cluster_parameter_group import *
from .db_instance import *
from .db_parameter_group import *
from .db_proxy import *
from .db_proxy_endpoint import *
from .db_proxy_target_group import *
from .db_security_group import *
from .db_security_group_ingress import *
from .db_subnet_group import *
from .event_subscription import *
from .get_db_cluster import *
from .get_db_cluster_parameter_group import *
from .get_db_instance import *
from .get_db_parameter_group import *
from .get_db_proxy import *
from .get_db_proxy_endpoint import *
from .get_db_proxy_target_group import *
from .get_db_security_group import *
from .get_db_security_group_ingress import *
from .get_db_subnet_group import *
from .get_event_subscription import *
from .get_global_cluster import *
from .get_option_group import *
from .global_cluster import *
from .option_group import *
from ._inputs import *
from . import outputs
| 32.756757 | 80 | 0.797855 | 186 | 1,212 | 4.865591 | 0.290323 | 0.309392 | 0.18674 | 0.165746 | 0.560221 | 0.304972 | 0.067403 | 0 | 0 | 0 | 0 | 0.000952 | 0.133663 | 1,212 | 36 | 81 | 33.666667 | 0.860952 | 0.167492 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
41a9712ad288b41ebbbe14e22dc6a742a537679b | 361 | py | Python | setup.py | terra-submersa/coordinates-label-photos | 71604b581006cd7d054def3b5f343ecd0eb8adb2 | [
"MIT"
] | 1 | 2022-03-25T20:31:03.000Z | 2022-03-25T20:31:03.000Z | setup.py | terra-submersa/coordinates-label-photos | 71604b581006cd7d054def3b5f343ecd0eb8adb2 | [
"MIT"
] | null | null | null | setup.py | terra-submersa/coordinates-label-photos | 71604b581006cd7d054def3b5f343ecd0eb8adb2 | [
"MIT"
] | null | null | null | from setuptools import setup
setup(
entry_points={
'console_scripts': [
'coordinates-label-photos=coordinates_label_photos.scripts.label_photos:main',
'plot-gpx-tracks=coordinates_label_photos.scripts.plot_gpx_tracks:main',
'images-to-gpx=coordinates_label_photos.scripts.images_to_gpx:main',
],
}
)
| 30.083333 | 90 | 0.686981 | 41 | 361 | 5.731707 | 0.414634 | 0.234043 | 0.374468 | 0.370213 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.207756 | 361 | 11 | 91 | 32.818182 | 0.821678 | 0 | 0 | 0 | 0 | 0 | 0.620499 | 0.578947 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.1 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6b571c958ac14c8fa5ffce2675182085e9383116 | 105 | py | Python | colabcode/__init__.py | mhamri/colabcode | 3efdbd8774d74701eff2bf3639d6d2fb9126cd9b | [
"MIT"
] | 1 | 2021-11-17T13:45:23.000Z | 2021-11-17T13:45:23.000Z | colabcode/__init__.py | mhamri/colabcode | 3efdbd8774d74701eff2bf3639d6d2fb9126cd9b | [
"MIT"
] | 14 | 2020-10-09T10:51:19.000Z | 2021-08-09T14:01:40.000Z | colabcode/__init__.py | mhamri/colabcode | 3efdbd8774d74701eff2bf3639d6d2fb9126cd9b | [
"MIT"
] | 1 | 2021-09-17T05:58:40.000Z | 2021-09-17T05:58:40.000Z | """allows for `from colabcode import ColabCode`"""
from .code import ColabCode
print(ColabCode.__doc__)
| 21 | 50 | 0.771429 | 13 | 105 | 5.923077 | 0.615385 | 0.38961 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 105 | 4 | 51 | 26.25 | 0.827957 | 0.419048 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
6b6067a3e7acbac60e86f3cd40bc1fdb0f5aeb3e | 115 | py | Python | tasks/EPAM/python_course/foundation-python/l4/m4-15.py | AleksNeStu/projects | 1a4c68dfbdcb77228f0f3617e58fd18fcb1f5dbb | [
"Apache-2.0"
] | 2 | 2022-01-19T18:01:35.000Z | 2022-02-06T06:54:38.000Z | tasks/EPAM/python_course/foundation-python/l4/m4-15.py | AleksNeStu/projects | 1a4c68dfbdcb77228f0f3617e58fd18fcb1f5dbb | [
"Apache-2.0"
] | null | null | null | tasks/EPAM/python_course/foundation-python/l4/m4-15.py | AleksNeStu/projects | 1a4c68dfbdcb77228f0f3617e58fd18fcb1f5dbb | [
"Apache-2.0"
] | null | null | null | """Import."""
# Import module hello
import hello
# Call function defined in that module
hello.print_func("EPAM")
| 14.375 | 38 | 0.730435 | 16 | 115 | 5.1875 | 0.6875 | 0.26506 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147826 | 115 | 7 | 39 | 16.428571 | 0.846939 | 0.565217 | 0 | 0 | 0 | 0 | 0.093023 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
6bfc4e15489028b5c419ff11aeca6ad30bb39b55 | 1,651 | py | Python | extensions/.stubs/clrclasses/System/Collections/Generic/__init__.py | vicwjb/Pycad | 7391cd694b7a91ad9f9964ec95833c1081bc1f84 | [
"MIT"
] | 1 | 2020-03-25T03:27:24.000Z | 2020-03-25T03:27:24.000Z | extensions/.stubs/clrclasses/System/Collections/Generic/__init__.py | vicwjb/Pycad | 7391cd694b7a91ad9f9964ec95833c1081bc1f84 | [
"MIT"
] | null | null | null | extensions/.stubs/clrclasses/System/Collections/Generic/__init__.py | vicwjb/Pycad | 7391cd694b7a91ad9f9964ec95833c1081bc1f84 | [
"MIT"
] | null | null | null | from __clrclasses__.System.Collections.Generic import Comparer
from __clrclasses__.System.Collections.Generic import Dictionary
from __clrclasses__.System.Collections.Generic import EqualityComparer
from __clrclasses__.System.Collections.Generic import HashSet
from __clrclasses__.System.Collections.Generic import ICollection
from __clrclasses__.System.Collections.Generic import IComparer
from __clrclasses__.System.Collections.Generic import IDictionary
from __clrclasses__.System.Collections.Generic import IEnumerable
from __clrclasses__.System.Collections.Generic import IEnumerator
from __clrclasses__.System.Collections.Generic import IEqualityComparer
from __clrclasses__.System.Collections.Generic import IList
from __clrclasses__.System.Collections.Generic import IReadOnlyCollection
from __clrclasses__.System.Collections.Generic import IReadOnlyDictionary
from __clrclasses__.System.Collections.Generic import IReadOnlyList
from __clrclasses__.System.Collections.Generic import ISet
from __clrclasses__.System.Collections.Generic import KeyNotFoundException
from __clrclasses__.System.Collections.Generic import KeyValuePair
from __clrclasses__.System.Collections.Generic import LinkedList
from __clrclasses__.System.Collections.Generic import LinkedListNode
from __clrclasses__.System.Collections.Generic import List
from __clrclasses__.System.Collections.Generic import Queue
from __clrclasses__.System.Collections.Generic import SortedDictionary
from __clrclasses__.System.Collections.Generic import SortedList
from __clrclasses__.System.Collections.Generic import SortedSet
from __clrclasses__.System.Collections.Generic import Stack
| 63.5 | 74 | 0.894004 | 175 | 1,651 | 7.862857 | 0.177143 | 0.25436 | 0.363372 | 0.563227 | 0.799419 | 0.799419 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060569 | 1,651 | 25 | 75 | 66.04 | 0.88717 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
2e12c1f99ef8ef3e74fe21d7a19cb987e9a89db0 | 41 | py | Python | src/data_acq/download_genomes/__init__.py | axiom-of-joy/onekgenomes | c1255f85496ba5d97950bcdfc25253c46ea964b1 | [
"MIT"
] | null | null | null | src/data_acq/download_genomes/__init__.py | axiom-of-joy/onekgenomes | c1255f85496ba5d97950bcdfc25253c46ea964b1 | [
"MIT"
] | 12 | 2020-01-28T22:40:35.000Z | 2022-02-10T00:10:43.000Z | src/data_acq/download_genomes/__init__.py | axiom-of-joy/one-k-genomes | c1255f85496ba5d97950bcdfc25253c46ea964b1 | [
"MIT"
] | null | null | null | from download_genomes.download import *
| 13.666667 | 39 | 0.829268 | 5 | 41 | 6.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121951 | 41 | 2 | 40 | 20.5 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2e1e6b90ce2b80ca8a4cf8157b603d03fe7080d3 | 136 | py | Python | parser/__init__.py | choleraehyq/yuujins | dff6f7def0081f24afac30a7c1e3ca6755a5ea3f | [
"MIT"
] | null | null | null | parser/__init__.py | choleraehyq/yuujins | dff6f7def0081f24afac30a7c1e3ca6755a5ea3f | [
"MIT"
] | null | null | null | parser/__init__.py | choleraehyq/yuujins | dff6f7def0081f24afac30a7c1e3ca6755a5ea3f | [
"MIT"
] | null | null | null | from parser.parser import parse_from_string
from parser.preparser import preparse
from parser.extend import expand_list, expand_formals
| 34 | 53 | 0.875 | 20 | 136 | 5.75 | 0.55 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095588 | 136 | 3 | 54 | 45.333333 | 0.934959 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2e2a322b0f299aa2d41621d3d450a011bfb8649c | 44 | py | Python | wsgi.py | tanj/log-it | d7223af1d0216d3febe4ebc39e06e24dceb3115f | [
"BSD-3-Clause"
] | null | null | null | wsgi.py | tanj/log-it | d7223af1d0216d3febe4ebc39e06e24dceb3115f | [
"BSD-3-Clause"
] | null | null | null | wsgi.py | tanj/log-it | d7223af1d0216d3febe4ebc39e06e24dceb3115f | [
"BSD-3-Clause"
] | null | null | null | from log_it import create_app
create_app()
| 11 | 29 | 0.818182 | 8 | 44 | 4.125 | 0.75 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 44 | 3 | 30 | 14.666667 | 0.868421 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
2e3c932816be19fba65f158d122a261cad2c88d2 | 73 | py | Python | chainer_graphics/camera/__init__.py | Idein/chainer-graphics | 3646fd961003297ff7e3f5efb71360c16d5eb9f5 | [
"MIT"
] | 3 | 2019-07-01T04:38:50.000Z | 2021-12-03T06:22:58.000Z | chainer_graphics/camera/__init__.py | Idein/chainer-graphics | 3646fd961003297ff7e3f5efb71360c16d5eb9f5 | [
"MIT"
] | null | null | null | chainer_graphics/camera/__init__.py | Idein/chainer-graphics | 3646fd961003297ff7e3f5efb71360c16d5eb9f5 | [
"MIT"
] | 1 | 2021-12-03T06:22:59.000Z | 2021-12-03T06:22:59.000Z | from .basic import *
from .projection import *
from .distortion import *
| 18.25 | 25 | 0.753425 | 9 | 73 | 6.111111 | 0.555556 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164384 | 73 | 3 | 26 | 24.333333 | 0.901639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2e53e3ba5e5cbc8c70cf4c3503669ac8df749173 | 89 | py | Python | messaging_components/routers/dispatch/management/qdmanage.py | fgiorgetti/qpid-dispatch-tests | 164c609d28db87692eed53d5361aa1ee5c97375c | [
"Apache-2.0"
] | null | null | null | messaging_components/routers/dispatch/management/qdmanage.py | fgiorgetti/qpid-dispatch-tests | 164c609d28db87692eed53d5361aa1ee5c97375c | [
"Apache-2.0"
] | null | null | null | messaging_components/routers/dispatch/management/qdmanage.py | fgiorgetti/qpid-dispatch-tests | 164c609d28db87692eed53d5361aa1ee5c97375c | [
"Apache-2.0"
] | null | null | null | # @TODO QDManage implementation
class QDManage:
def __init__(self, ):
pass
| 12.714286 | 31 | 0.651685 | 9 | 89 | 6 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.269663 | 89 | 6 | 32 | 14.833333 | 0.830769 | 0.325843 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0 | 1 | 0.333333 | false | 0.333333 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
2e627ee0fb948fc16d88f23ed944b53bae8dea77 | 4,090 | py | Python | preprocess/preprocess.py | ryosukehata/severstal | cb54703b820cb27d7b93fb80a42b41f84ec8cf08 | [
"Apache-2.0"
] | null | null | null | preprocess/preprocess.py | ryosukehata/severstal | cb54703b820cb27d7b93fb80a42b41f84ec8cf08 | [
"Apache-2.0"
] | null | null | null | preprocess/preprocess.py | ryosukehata/severstal | cb54703b820cb27d7b93fb80a42b41f84ec8cf08 | [
"Apache-2.0"
] | null | null | null | import pandas as pd
import numpy as np
def dataframe_preprocess(df_filepath):
    """
    df_filepath : path to the input file.
    """
    df = pd.read_csv(df_filepath)
    df["ImageId"], df["ClassId"] = zip(*df["ImageId_ClassId"].str.split("_"))
    df["ClassId"] = df["ClassId"].astype(int)
    df = df.pivot(index="ImageId", columns="ClassId", values="EncodedPixels")
    df["defects"] = df.count(axis=1)
    return df
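The `zip(*...split("_"))` idiom above separates the combined `ImageId_ClassId` strings into two columns in one pass. In miniature (sample ids are made up):

```python
# Each entry is "<image id>_<class id>"; splitting on "_" and transposing
# with zip(*) yields one tuple of image ids and one of class ids.
pairs = ["0002cc93b.jpg_1", "0002cc93b.jpg_2"]
image_ids, class_ids = zip(*(p.split("_") for p in pairs))
print(image_ids[0], class_ids)  # 0002cc93b.jpg ('1', '2')
```

Note this relies on the image id itself containing no underscore; splitting on the last `_` via `rsplit("_", 1)` would be the defensive variant.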
def make_mask(row_id, df):
    """
    Data decoder.
    This function is intended for use with a dataframe
    produced by the "dataframe_preprocess" function above.

    Input
        row_id : a given row
        df : dataframe returned by dataframe_preprocess
    Output
        fname : image_id
        masks : numpy array (256, 1600, 4) indicating where defects are
    https://www.kaggle.com/paulorzp/rle-functions-run-lenght-encode-decode
    """
    fname = df.iloc[row_id].name
    labels = df.iloc[row_id][:4]  # 4 channels
    masks = np.zeros((256, 1600, 4), dtype=np.float32)  # float32 is V.Imp
    # 4 classes: 1~4 (ch: 0~3)
    for idx, label in enumerate(labels.values):
        if label is not np.nan:
            label = label.split(" ")
            positions = map(int, label[0::2])
            length = map(int, label[1::2])
            mask = np.zeros(256 * 1600, dtype=np.uint8)
            for pos, le in zip(positions, length):
                mask[pos: (pos + le)] = 1
            masks[:, :, idx] = mask.reshape(256, 1600, order="F")
    return fname, masks
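The inner loop of `make_mask` is a run-length decoder: the label alternates start positions and run lengths, and each `(position, length)` pair marks a run of defect pixels. A minimal standalone version on a 12-pixel mask (positions are treated as 0-based offsets here, matching how the code above indexes):

```python
import numpy as np

# Decode an RLE string of "position length" pairs into a flat binary mask.
label = "3 2 8 1"                 # runs: 2 pixels from 3, 1 pixel from 8
tokens = label.split(" ")
positions = map(int, tokens[0::2])
lengths = map(int, tokens[1::2])
mask = np.zeros(12, dtype=np.uint8)
for pos, le in zip(positions, lengths):
    mask[pos: pos + le] = 1
print(mask.tolist())  # [0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0]
```

In the full function the flat mask is then reshaped to `(256, 1600)` with `order="F"`, because the competition's RLE runs down columns rather than across rows.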
def make_mask_only3(row_id, df):
    """
    Data decoder (class 3 only).
    This function is intended for use with a dataframe
    produced by the "dataframe_preprocess" function above.

    Input
        row_id : a given row
        df : dataframe returned by dataframe_preprocess
    Output
        fname : image_id
        masks : numpy array (256, 1600, 1) indicating where class-3 defects are
    https://www.kaggle.com/paulorzp/rle-functions-run-lenght-encode-decode
    """
    fname = df.iloc[row_id].name
    labels = df.iloc[row_id][:4]  # 4 channels
    masks = np.zeros((256, 1600, 1), dtype=np.float32)  # float32 is V.Imp
    # keep only class 3 (ch: idx == 2)
    for idx, label in enumerate(labels.values):
        if label is not np.nan and idx == 2:
            label = label.split(" ")
            positions = map(int, label[0::2])
            length = map(int, label[1::2])
            mask = np.zeros(256 * 1600, dtype=np.uint8)
            for pos, le in zip(positions, length):
                mask[pos: (pos + le)] = 1
            masks[:, :, 0] = mask.reshape(256, 1600, order="F")
    return fname, masks
| 54.533333 | 175 | 0.376039 | 362 | 4,090 | 4.18232 | 0.290055 | 0.02642 | 0.023778 | 0.029062 | 0.796565 | 0.796565 | 0.796565 | 0.796565 | 0.796565 | 0.743725 | 0 | 0.052239 | 0.54132 | 4,090 | 74 | 176 | 55.27027 | 0.754797 | 0.534719 | 0 | 0.540541 | 0 | 0 | 0.050837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081081 | false | 0 | 0.054054 | 0 | 0.216216 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
cf0ae00c2f93e7f0bc034f35afa492805d65b8ba | 4,449 | py | Python | examples/python/test_iter.py | SmartEconomyWorkshop/workshop | 5961dcc8832f60b3a0407cb9a8361ba5485ac280 | [
"MIT"
] | 79 | 2017-10-22T03:35:06.000Z | 2021-12-02T10:28:06.000Z | examples/python/test_iter.py | SmartEconomyWorkshop/workshop | 5961dcc8832f60b3a0407cb9a8361ba5485ac280 | [
"MIT"
] | 122 | 2017-10-19T12:34:08.000Z | 2020-08-20T12:38:17.000Z | examples/python/test_iter.py | SmartEconomyWorkshop/workshop | 5961dcc8832f60b3a0407cb9a8361ba5485ac280 | [
"MIT"
] | 76 | 2017-10-19T05:09:55.000Z | 2020-12-08T12:03:59.000Z | from boa_test.tests.boa_test import BoaTest
from boa.compiler import Compiler
from neo.Prompt.Commands.BuildNRun import TestBuild
class TestContract(BoaTest):

    def test_aWhile1(self):
        output = Compiler.instance().load('%s/boa_test/example/WhileTest1.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 6)

    def test_aWhile2(self):
        output = Compiler.instance().load('%s/boa_test/example/WhileTest2.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 6)

    def test_aWhile3(self):
        output = Compiler.instance().load('%s/boa_test/example/WhileTest.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 24)

    def test_Iter1(self):
        output = Compiler.instance().load('%s/boa_test/example/IterTest.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 18)

    def test_Iter2(self):
        output = Compiler.instance().load('%s/boa_test/example/IterTest2.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 8)

    def test_Iter3(self):
        output = Compiler.instance().load('%s/boa_test/example/IterTest3.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 7)

    def test_Iter4(self):
        output = Compiler.instance().load('%s/boa_test/example/IterTest4.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetByteArray(), bytearray(b'abcdabcdabcd\x0c'))

    def test_Iter5(self):
        output = Compiler.instance().load('%s/boa_test/example/IterTest5.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 51)

    def test_Range1(self):
        output = Compiler.instance().load('%s/boa_test/example/RangeTest.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        res = results[0].GetArray()
        self.assertEqual(len(res), 20)

    def test_Range2(self):
        output = Compiler.instance().load('%s/boa_test/example/IterTest6.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 10)

    def test_Range3(self):
        output = Compiler.instance().load('%s/boa_test/example/IterTest7.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 12)

    def test_Range4(self):
        output = Compiler.instance().load('%s/boa_test/example/IterTest8.py' % TestContract.dirname).default
        out = output.write()
        tx, results, total_ops, engine = TestBuild(out, [], self.GetWallet1(), '', '07')
        self.assertEqual(len(results), 1)
        self.assertEqual(results[0].GetBigInteger(), 6)
| 47.83871 | 109 | 0.647336 | 521 | 4,449 | 5.454894 | 0.149712 | 0.126671 | 0.082336 | 0.109782 | 0.833216 | 0.833216 | 0.833216 | 0.833216 | 0.833216 | 0.643209 | 0 | 0.027554 | 0.192403 | 4,449 | 92 | 110 | 48.358696 | 0.763429 | 0 | 0 | 0.506494 | 0 | 0 | 0.095527 | 0.086536 | 0 | 0 | 0 | 0 | 0.311688 | 1 | 0.155844 | false | 0 | 0.038961 | 0 | 0.207792 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
cf4378651faa42f7cecd8d09973ee3d9a9678a6c | 150 | py | Python | MoleculeACE/GNN/__init__.py | molML/MoleculeACE | e831d2371a9b89f4853a03d5c04cc4bf59f64ee0 | [
"MIT"
] | 9 | 2022-03-26T17:36:03.000Z | 2022-03-29T19:50:26.000Z | MoleculeACE/GNN/__init__.py | molML/MoleculeACE | e831d2371a9b89f4853a03d5c04cc4bf59f64ee0 | [
"MIT"
] | null | null | null | MoleculeACE/GNN/__init__.py | molML/MoleculeACE | e831d2371a9b89f4853a03d5c04cc4bf59f64ee0 | [
"MIT"
] | null | null | null | from MoleculeACE.GNN.models import train_model_with_hyperparameters_optimization, init_model
from MoleculeACE.GNN.data import get_moleculecsv_dataset

# beebird/ui/__init__.py (randydu/beebird, MIT license)
''' Task UI modules '''
from .ui import TaskUI

# geek/geek1.py (lcarlin/guppe, Apache-2.0 license)
piu = 3.1456
def funcao1(a, b):
    return a + b

# filesystems/tests/common.py (Julian/Filesystems, MIT license)
# -*- coding: utf-8 -*-
import errno
import os
from pyrsistent import s
from testscenarios import multiply_scenarios, with_scenarios
from filesystems import Path, exceptions
from filesystems._path import RelativePath
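# A standalone stdlib sketch of the chained-symlink resolution that the
# realpath tests below assert (e.g. test_realpath_double_link).  This is an
# illustration only -- it is not used by the suite, and the helper name is
# made up here:
def _realpath_demo_with_stdlib():
    import os
    import shutil
    import tempfile
    # Resolve the tempdir itself first, so symlinks in /tmp don't interfere.
    d = os.path.realpath(tempfile.mkdtemp())
    try:
        os.mkdir(os.path.join(d, "0"))
        os.mkdir(os.path.join(d, "0", "1"))
        # /1 -> /0/1, mirroring the comment diagrams used in the tests below
        os.symlink(os.path.join(d, "0", "1"), os.path.join(d, "1"))
        # realpath follows the link back to the real directory:
        return os.path.realpath(os.path.join(d, "1")) == os.path.join(d, "0", "1")
    finally:
        shutil.rmtree(d)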
@with_scenarios()
class _NonExistingFileMixin(object):
scenarios = [
(
"read_bytes",
dict(act_on=lambda fs, path: fs.open(path=path, mode="rb")),
), (
"read_native",
dict(act_on=lambda fs, path: fs.open(path=path, mode="r")),
), (
"read_text",
dict(act_on=lambda fs, path: fs.open(path=path, mode="rt")),
), (
"stat",
dict(act_on=lambda fs, path: fs.stat(path=path)),
), (
"lstat",
dict(act_on=lambda fs, path: fs.lstat(path=path)),
), (
"list_directory",
dict(act_on=lambda fs, path: fs.list_directory(path=path)),
), (
"remove_empty_directory",
dict(act_on=lambda fs, path: fs.remove_empty_directory(path=path)),
), (
"remove_file",
dict(act_on=lambda fs, path: fs.remove_file(path=path)),
), (
"readlink",
dict(act_on=lambda fs, path: fs.readlink(path=path)),
),
]
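    # (A note on the pattern: testscenarios generates one concrete test per
    # (name, attrs) pair above, attaching each dict entry -- here act_on --
    # to the generated case, so test_non_existing runs once per operation.)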
def test_non_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with self.assertRaises(exceptions.FileNotFound) as e:
self.act_on(fs=fs, path=tempdir / "does not exist")
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOENT) + ": " + str(tempdir / "does not exist"),
)
class TestFS(_NonExistingFileMixin):
def test_open_read_non_existing_nested_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with self.assertRaises(exceptions.FileNotFound) as e:
fs.open(tempdir.descendant("unittesting", "file"))
self.assertEqual(
str(e.exception), (
os.strerror(errno.ENOENT) +
": " +
str(tempdir.descendant("unittesting", "file"))
)
)
def test_open_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with self.assertRaises(exceptions.IsADirectory) as e:
fs.open(tempdir)
self.assertEqual(
str(e.exception),
os.strerror(errno.EISDIR) + ": " + str(tempdir),
)
def test_open_append_binary_and_native_non_existing_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.open(tempdir / "unittesting", "ab") as f:
f.write(b"some ")
with fs.open(tempdir / "unittesting", "a") as f:
f.write("things!")
with fs.open(tempdir / "unittesting") as g:
self.assertEqual(g.read(), "some things!")
def test_open_append_native_and_binary_non_existing_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.open(tempdir / "unittesting", "a") as f:
f.write("some ")
with fs.open(tempdir / "unittesting", "ab") as f:
f.write(b"things!")
with fs.open(tempdir / "unittesting") as g:
self.assertEqual(g.read(), "some things!")
def test_open_append_non_existing_nested_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with self.assertRaises(exceptions.FileNotFound) as e:
fs.open(tempdir.descendant("unittesting", "file"), "ab")
self.assertEqual(
str(e.exception), (
os.strerror(errno.ENOENT) +
": " +
str(tempdir.descendant("unittesting", "file"))
)
)
def test_create_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.create(tempdir / "unittesting") as f:
f.write("some things!")
with fs.open(tempdir / "unittesting") as g:
self.assertEqual(g.read(), "some things!")
def test_create_file_existing_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.create(tempdir / "unittesting"):
pass
with self.assertRaises(exceptions.FileExists) as e:
fs.create(tempdir / "unittesting")
self.assertEqual(
str(e.exception), (
os.strerror(errno.EEXIST) +
": " +
str(tempdir / "unittesting")
),
)
def test_create_file_existing_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
fs.create_directory(tempdir / "unittesting")
with self.assertRaises(exceptions.FileExists) as e:
fs.create(tempdir / "unittesting")
self.assertEqual(
str(e.exception), (
os.strerror(errno.EEXIST) +
": " +
str(tempdir / "unittesting")
),
)
def test_create_file_existing_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.link(source=source, to=to)
with self.assertRaises(exceptions.FileExists) as e:
fs.create(to)
self.assertEqual(
str(e.exception), os.strerror(errno.EEXIST) + ": " + str(to),
)
def test_get_contents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.open(tempdir / "unittesting", "wb") as f:
f.write(b"some more things!")
self.assertEqual(
fs.get_contents(tempdir / "unittesting"),
"some more things!",
)
def test_get_set_contents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
fs.set_contents(tempdir / "unittesting", "foo\nbar\nbaz")
self.assertEqual(
fs.get_contents(path=tempdir / "unittesting"),
"foo\nbar\nbaz",
)
def test_get_contents_text(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.open(tempdir / "unittesting", "wb") as f:
f.write(u"שלום".encode("utf-8"))
self.assertEqual(
fs.get_contents(tempdir / "unittesting", mode="t"),
u"שלום",
)
def test_get_set_contents_text(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
fs.set_contents(tempdir / "unittesting", u"שלום", mode="t")
self.assertEqual(
fs.get_contents(path=tempdir / "unittesting", mode="t"),
u"שלום",
)
def test_set_contents_existing_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
fs.set_contents(tempdir / "unittesting", "foo\nbar\nbaz")
fs.set_contents(tempdir / "unittesting", "spam\nquux\n")
self.assertEqual(
fs.get_contents(path=tempdir / "unittesting"),
"spam\nquux\n",
)
def test_create_with_contents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
fs.create_with_contents(tempdir / "unittesting", "foo\nbar\nbaz")
self.assertEqual(
fs.get_contents(path=tempdir / "unittesting"),
"foo\nbar\nbaz",
)
def test_create_with_contents_existing_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
fs.set_contents(tempdir / "unittesting", "foo\nbar\nbaz")
with self.assertRaises(exceptions.FileExists):
fs.create_with_contents(tempdir / "unittesting", "spam\nquux\n")
self.assertEqual(
fs.get_contents(path=tempdir / "unittesting"),
"foo\nbar\nbaz",
)
def test_remove(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "directory"
fs.create_directory(directory)
a = directory / "a"
b = directory / "b"
c = directory.descendant("b", "c")
d = directory / "d"
fs.touch(path=a)
fs.create_directory(path=b)
fs.touch(path=c)
fs.touch(path=d)
fs.remove(directory)
self.assertEqual(fs.children(path=tempdir), s())
def test_removing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.removing(path=tempdir / "directory") as path:
self.assertFalse(fs.is_dir(path=path))
fs.create_directory(path=path)
self.assertTrue(fs.is_dir(path=path))
self.assertFalse(fs.is_dir(path=path))
def test_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.touch(source)
fs.link(source=source, to=to)
self.assertEqual(
dict(
exists=fs.exists(path=to),
is_dir=fs.is_dir(path=to),
is_file=fs.is_file(path=to),
is_link=fs.is_link(path=to),
), dict(
exists=True,
is_dir=False,
is_file=True,
is_link=True,
),
)
def test_link_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.create_directory(source)
fs.link(source=source, to=to)
self.assertEqual(
dict(
exists=fs.exists(path=to),
is_dir=fs.is_dir(path=to),
is_file=fs.is_file(path=to),
is_link=fs.is_link(path=to),
), dict(
exists=True,
is_dir=True,
is_file=False,
is_link=True,
),
)
def test_link_directory_link_child(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
zero, one = tempdir / "0", tempdir / "1"
fs.create_directory(path=zero)
fs.link(source=zero, to=one)
fs.create_directory(path=zero / "2")
three = one / "3"
fs.link(source=one / "2", to=three)
self.assertEqual(
dict(
exists=fs.exists(path=three),
is_dir=fs.is_dir(path=three),
is_file=fs.is_file(path=three),
is_link=fs.is_link(path=three),
),
dict(exists=True, is_dir=True, is_file=False, is_link=True),
)
def test_link_nonexisting_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.link(source=source, to=to)
self.assertEqual(
dict(
exists=fs.exists(path=to),
is_dir=fs.is_dir(path=to),
is_file=fs.is_file(path=to),
is_link=fs.is_link(path=to),
), dict(
exists=False,
is_dir=False,
is_file=False,
is_link=True,
),
)
def test_link_existing_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.link(source=source, to=to)
with self.assertRaises(exceptions.FileExists) as e:
fs.link(source=source, to=to)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(to),
)
def test_link_existing_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.touch(path=to)
with self.assertRaises(exceptions.FileExists) as e:
fs.link(source=source, to=to)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(to),
)
def test_link_existing_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.create_directory(path=to)
with self.assertRaises(exceptions.FileExists) as e:
fs.link(source=source, to=to)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(to),
)
    def test_link_nonexistent(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.link(source=source, to=to)
self.assertEqual(
dict(
exists=fs.exists(path=to),
is_dir=fs.is_dir(path=to),
is_file=fs.is_file(path=to),
is_link=fs.is_link(path=to),
), dict(
exists=False,
is_dir=False,
is_file=False,
is_link=True,
),
)
def test_multiple_links(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source = tempdir / "source"
first = tempdir / "first"
second = tempdir / "second"
third = tempdir / "third"
fs.link(source=source, to=first)
fs.link(source=first, to=second)
fs.link(source=second, to=third)
with fs.open(source, "wb") as f:
f.write(b"some things way over here!")
self.assertEqual(fs.get_contents(third), "some things way over here!")
def test_link_child(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.create_directory(source)
fs.link(source=source, to=to)
self.assertEqual(
fs.realpath(to / "child"),
source / "child",
)
def test_link_descendant_of_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source = tempdir / "source"
not_a_dir = tempdir / "dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.NotADirectory) as e:
fs.link(source=source, to=not_a_dir / "to")
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTDIR) + ": " + str(not_a_dir),
)
def test_read_from_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.link(source=source, to=to)
with fs.open(source, "wb") as f:
f.write(b"some things over here!")
self.assertEqual(fs.get_contents(to), "some things over here!")
def test_write_to_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.link(source=source, to=to)
with fs.open(to, "wb") as f:
f.write(b"some things over here!")
self.assertEqual(fs.get_contents(source), "some things over here!")
def test_write_to_created_child(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.create_directory(source)
fs.link(source=source, to=to)
child = to / "child"
with fs.create(child) as f:
f.write("some things over here!")
self.assertEqual(fs.get_contents(child), "some things over here!")
    def test_link_nonexistent_parent(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source = tempdir / "source"
        orphan = tempdir.descendant("nonexistent", "orphan")
with self.assertRaises(exceptions.FileNotFound) as e:
fs.link(source=source, to=orphan)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOENT) + ": " + str(orphan.parent()),
)
def test_realpath(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.link(source=source, to=to)
self.assertEqual(fs.realpath(to), source)
def test_realpath_relative(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = RelativePath("source", "dir"), tempdir / "to"
fs.link(source=source, to=to)
self.assertEqual(
fs.realpath(to),
to.sibling("source") / "dir",
)
def test_realpath_normal_path(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source = tempdir / "source"
self.assertEqual(fs.realpath(source), source)
def test_realpath_double_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
# /1 -> /0/1
# /1/3 -> /1/2/3
# realpath(/1/3) == /0/1/2/3
zero, one = tempdir / "0", tempdir / "1"
two = one / "2"
fs.create_directory(path=zero)
fs.create_directory(path=zero / "1")
fs.link(source=zero / "1", to=one)
fs.create_directory(path=two)
fs.create_directory(path=two / "3")
fs.link(source=two / "3", to=one / "3")
self.assertEqual(
fs.realpath(one / "3"),
zero.descendant("1", "2", "3"),
)
def test_realpath_mega_link(self):
"""
Now with even more nested links!
Make sure we don't just accidentally solve the double link case
and not the more general one.
"""
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
# /1 -> /0/1
# /2/3 -> /1/2/3
# /3 -> /2/3
# /3/5 -> /3/4/5
# realpath(/3/5) == /0/1/2/3/4/5
directories = (
tempdir / "0",
tempdir / "1",
tempdir / "2",
tempdir / "3",
)
fs.create_directory(path=directories[0])
fs.create_directory(path=directories[0] / "1")
fs.create_directory(path=directories[2])
fs.link(source=directories[0] / "1", to=directories[1])
fs.create_directory(path=directories[1] / "2")
fs.create_directory(path=directories[1] / "2" / "3")
fs.link(source=directories[1] / "2" / "3", to=directories[2] / "3")
fs.link(source=directories[2] / "3", to=directories[3])
fs.link(source=directories[3] / "4" / "5", to=directories[3] / "5")
self.assertEqual(
fs.realpath(tempdir / "3" / "5"),
directories[0].descendant("1", "2", "3", "4", "5"),
)
def test_remove_does_not_follow_directory_links(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "directory"
fs.create_directory(path=directory)
fs.touch(directory / "a")
link = tempdir / "link"
fs.link(source=directory, to=link)
self.assertTrue(fs.is_link(path=link))
fs.remove(path=link)
self.assertEqual(
fs.children(path=directory), s(directory / "a"),
)
def test_create_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
self.assertFalse(fs.is_dir(path=directory))
fs.create_directory(path=directory)
self.assertEqual(
dict(
exists=fs.exists(path=directory),
is_dir=fs.is_dir(path=directory),
is_file=fs.is_file(path=directory),
is_link=fs.is_link(path=directory),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_create_directory_with_parents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
self.assertFalse(fs.is_dir(path=directory))
fs.create_directory(path=directory, with_parents=True)
self.assertEqual(
dict(
exists=fs.exists(path=directory),
is_dir=fs.is_dir(path=directory),
is_file=fs.is_file(path=directory),
is_link=fs.is_link(path=directory),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_create_directory_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
self.assertFalse(fs.is_dir(path=directory))
fs.create_directory(path=directory, allow_existing=True)
self.assertEqual(
dict(
exists=fs.exists(path=directory),
is_dir=fs.is_dir(path=directory),
is_file=fs.is_file(path=directory),
is_link=fs.is_link(path=directory),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_create_directory_with_parents_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
self.assertFalse(fs.is_dir(path=directory))
fs.create_directory(
path=directory,
with_parents=True,
allow_existing=True,
)
self.assertEqual(
dict(
exists=fs.exists(path=directory),
is_dir=fs.is_dir(path=directory),
is_file=fs.is_file(path=directory),
is_link=fs.is_link(path=directory),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_create_deep_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
top_dir = tempdir / "dir"
directory = top_dir / "sub1" / "sub2" / "sub3"
self.assertFalse(fs.is_dir(path=top_dir))
with self.assertRaises(exceptions.FileNotFound) as e:
fs.create_directory(path=directory)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOENT) + ": " + str(directory.parent()),
)
def test_create_deep_directory_with_parents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
top_dir = tempdir / "dir"
directory = top_dir / "sub1" / "sub2" / "sub3"
self.assertFalse(fs.is_dir(path=top_dir))
fs.create_directory(path=directory, with_parents=True)
self.assertEqual(
dict(
exists=fs.exists(path=directory),
is_dir=fs.is_dir(path=directory),
is_file=fs.is_file(path=directory),
is_link=fs.is_link(path=directory),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_create_deep_directory_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
top_dir = tempdir / "dir"
directory = top_dir / "sub1" / "sub2" / "sub3"
self.assertFalse(fs.is_dir(path=top_dir))
with self.assertRaises(exceptions.FileNotFound) as e:
fs.create_directory(path=directory, allow_existing=True)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOENT) + ": " + str(directory.parent()),
)
def test_create_deep_directory_with_parents_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
top_dir = tempdir / "dir"
directory = top_dir / "sub1" / "sub2" / "sub3"
self.assertFalse(fs.is_dir(path=top_dir))
fs.create_directory(
path=directory,
with_parents=True,
allow_existing=True,
)
self.assertEqual(
dict(
exists=fs.exists(path=directory),
is_dir=fs.is_dir(path=directory),
is_file=fs.is_file(path=directory),
is_link=fs.is_link(path=directory),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_create_existing_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
fs.create_directory(path=directory)
self.assertTrue(fs.is_dir(path=directory))
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(path=directory)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(directory),
)
def test_create_existing_directory_with_parents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
fs.create_directory(path=directory)
self.assertTrue(fs.is_dir(path=directory))
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(path=directory, with_parents=True)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(directory),
)
def test_create_existing_directory_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
fs.create_directory(path=directory)
self.assertTrue(fs.is_dir(path=directory))
fs.create_directory(path=directory, allow_existing=True)
self.assertEqual(
dict(
exists=fs.exists(path=directory),
is_dir=fs.is_dir(path=directory),
is_file=fs.is_file(path=directory),
is_link=fs.is_link(path=directory),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_create_existing_directory_with_parents_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
fs.create_directory(path=directory)
self.assertTrue(fs.is_dir(path=directory))
fs.create_directory(
path=directory,
with_parents=True,
allow_existing=True,
)
self.assertEqual(
dict(
exists=fs.exists(path=directory),
is_dir=fs.is_dir(path=directory),
is_file=fs.is_file(path=directory),
is_link=fs.is_link(path=directory),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_create_existing_directory_from_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(path=not_a_dir)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(not_a_dir),
)
def test_create_existing_directory_from_file_with_parents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(path=not_a_dir, with_parents=True)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(not_a_dir),
)
def test_create_existing_directory_from_file_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(path=not_a_dir, allow_existing=True)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(not_a_dir),
)
def test_create_existing_directory_from_file_with_parents_allow_existing(
self,
):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(
path=not_a_dir,
with_parents=True,
allow_existing=True,
)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(not_a_dir),
)
def test_create_existing_directory_from_file_child(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.NotADirectory) as e:
fs.create_directory(path=not_a_dir.descendant("file_child"))
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTDIR) + ": " + str(not_a_dir),
)
def test_create_existing_directory_from_file_child_with_parents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.NotADirectory) as e:
fs.create_directory(
path=not_a_dir.descendant("file_child"),
with_parents=True,
)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTDIR) + ": " + str(not_a_dir),
)
def test_create_existing_directory_from_file_child_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.NotADirectory) as e:
fs.create_directory(
path=not_a_dir.descendant("file_child"),
allow_existing=True,
)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTDIR) + ": " + str(not_a_dir),
)
def test_create_existing_directory_from_file_child_with_parents_allow_existing( # noqa: E501
self,
):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.NotADirectory) as e:
fs.create_directory(
path=not_a_dir.descendant("file_child"),
with_parents=True,
allow_existing=True,
)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTDIR) + ": " + str(not_a_dir),
)
def test_create_existing_directory_from_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
link = tempdir / "link"
fs.link(source=tempdir, to=link)
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(path=link)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(link),
)
def test_create_existing_directory_from_link_with_parents(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
link = tempdir / "link"
fs.link(source=tempdir, to=link)
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(path=link, with_parents=True)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(link),
)
# TODO: cover links to all of:
# files
# directories
# non-looping links
# looping links
# non-existent targets
# ?
def test_create_existing_directory_from_directory_link_allow_existing(self): # noqa: E501
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "directory"
fs.create_directory(path=directory)
link = tempdir / "link"
fs.link(source=directory, to=link)
fs.create_directory(path=link, allow_existing=True)
self.assertEqual(
dict(
exists=fs.exists(path=link),
is_dir=fs.is_dir(path=link),
is_file=fs.is_file(path=link),
is_link=fs.is_link(path=link),
),
dict(exists=True, is_dir=True, is_file=False, is_link=True),
)
def test_create_existing_directory_from_directory_link_with_parents_allow_existing( # noqa: E501
self,
):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "directory"
fs.create_directory(path=directory)
link = tempdir / "link"
fs.link(source=directory, to=link)
fs.create_directory(
path=link,
with_parents=True,
allow_existing=True,
)
self.assertEqual(
dict(
exists=fs.exists(path=link),
is_dir=fs.is_dir(path=link),
is_file=fs.is_file(path=link),
is_link=fs.is_link(path=link),
),
dict(exists=True, is_dir=True, is_file=False, is_link=True),
)
def test_create_existing_directory_from_file_link_allow_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
file = tempdir / "file"
fs.touch(path=file)
link = tempdir / "link"
fs.link(source=file, to=link)
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(path=link, allow_existing=True)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(link),
)
def test_create_existing_directory_from_file_link_with_parents_allow_existing( # noqa: E501
self,
):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
file = tempdir / "file"
fs.touch(path=file)
link = tempdir / "link"
fs.link(source=file, to=link)
with self.assertRaises(exceptions.FileExists) as e:
fs.create_directory(
path=link,
with_parents=True,
allow_existing=True,
)
self.assertEqual(
str(e.exception),
os.strerror(errno.EEXIST) + ": " + str(link),
)
def test_create_directory_parent_does_not_exist(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir.descendant("some", "child", "dir")
self.assertFalse(fs.is_dir(path=directory.parent()))
with self.assertRaises(exceptions.FileNotFound) as e:
fs.create_directory(path=directory)
# Putting the first dir that doesn't exist would require some
# traversal, so just stick with the parent for now.
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOENT) + ": " + str(directory.parent()),
)
def test_create_directory_returns_the_new_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
self.assertEqual(fs.create_directory(tempdir / "dir"), directory)
def test_create_directory_link_child(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
zero, one = tempdir / "0", tempdir / "1"
fs.create_directory(path=zero)
fs.create_directory(path=zero / "1")
fs.link(source=zero / "1", to=one)
two = one / "2"
fs.create_directory(path=two)
self.assertEqual(
dict(
exists=fs.exists(path=two),
is_dir=fs.is_dir(path=two),
is_file=fs.is_file(path=two),
is_link=fs.is_link(path=two),
),
dict(exists=True, is_dir=True, is_file=False, is_link=False),
)
def test_link_link_child(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
zero, one = tempdir / "0", tempdir / "1"
fs.create_directory(path=zero)
fs.create_directory(path=zero / "1")
fs.link(source=zero / "1", to=one)
two = one / "2"
fs.create_directory(path=two)
three, four = two / "3", two / "4"
fs.touch(three)
fs.link(source=three, to=four)
self.assertEqual(
dict(
exists=fs.exists(path=four),
is_dir=fs.is_dir(path=four),
is_file=fs.is_file(path=four),
is_link=fs.is_link(path=four),
),
dict(exists=True, is_dir=False, is_file=True, is_link=True),
)
def test_remove_empty_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
fs.create_directory(path=directory)
self.assertTrue(fs.is_dir(path=directory))
fs.remove_empty_directory(path=directory)
self.assertFalse(fs.is_dir(path=directory))
def test_remove_nonempty_empty_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
nonempty = tempdir / "dir"
fs.create_directory(path=nonempty)
fs.create_directory(nonempty / "dir2")
self.assertTrue(fs.is_dir(path=nonempty))
with self.assertRaises(exceptions.DirectoryNotEmpty) as e:
fs.remove_empty_directory(path=nonempty)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTEMPTY) + ": " + str(nonempty),
)
def test_remove_empty_directory_but_its_a_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
child = tempdir / "file"
fs.touch(path=child)
self.assertTrue(fs.is_file(path=child))
with self.assertRaises(exceptions.NotADirectory) as e:
fs.remove_empty_directory(path=child)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTDIR) + ": " + str(child),
)
def test_remove_empty_directory_but_its_a_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
directory = tempdir / "dir"
fs.create_directory(path=directory)
self.assertTrue(fs.is_dir(path=directory))
link = tempdir / "link"
fs.link(source=directory, to=link)
self.assertTrue(fs.is_dir(path=link))
with self.assertRaises(exceptions.NotADirectory) as e:
fs.remove_empty_directory(path=link)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTDIR) + ": " + str(link),
)
def test_remove_on_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
child = tempdir / "child"
fs.touch(path=child)
self.assertTrue(fs.exists(path=child))
fs.remove(path=child)
self.assertFalse(fs.exists(path=child))
def test_remove_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
child = tempdir / "child"
fs.touch(path=child)
self.assertTrue(fs.exists(path=child))
fs.remove_file(path=child)
self.assertFalse(fs.exists(path=child))
def test_remove_file_on_empty_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
child = tempdir / "child"
fs.create_directory(path=child)
self.assertTrue(fs.exists(path=child))
with self.assertRaises(exceptions._UnlinkNonFileError) as e:
fs.remove_file(path=child)
self.assertEqual(
str(e.exception), (
os.strerror(exceptions._UnlinkNonFileError.errno) +
": " +
str(child)
),
)
def test_remove_file_on_nonempty_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
child = tempdir / "child"
fs.create_directory(path=child)
fs.touch(child / "grandchild")
self.assertTrue(fs.exists(path=child))
with self.assertRaises(exceptions._UnlinkNonFileError) as e:
fs.remove_file(path=child)
self.assertEqual(
str(e.exception), (
os.strerror(exceptions._UnlinkNonFileError.errno) +
": " +
str(child)
),
)
def test_remove_nonexisting_file_nonexisting_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
child = tempdir.descendant("dir", "child")
self.assertFalse(fs.is_file(path=child))
with self.assertRaises(exceptions.FileNotFound) as e:
fs.remove_file(path=child)
self.assertEqual(
str(e.exception), os.strerror(errno.ENOENT) + ": " + str(child),
)
def test_non_existing_file_types(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
nonexistent = tempdir / "solipsism"
self.assertEqual(
dict(
exists=fs.exists(path=nonexistent),
is_dir=fs.is_dir(path=nonexistent),
is_file=fs.is_file(path=nonexistent),
is_link=fs.is_link(path=nonexistent),
),
dict(exists=False, is_dir=False, is_file=False, is_link=False),
)
def test_list_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
a = tempdir / "a"
b = tempdir / "b"
c = tempdir.descendant("b", "c")
fs.touch(path=a)
fs.create_directory(path=b)
fs.touch(path=c)
self.assertEqual(set(fs.list_directory(tempdir)), {"a", "b"})
def test_list_empty_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
self.assertEqual(set(fs.list_directory(tempdir)), set())
def test_list_directory_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
# /source -> /link
# /source/{1, 2, 3}
source, link = tempdir / "source", tempdir / "link"
fs.create_directory(path=source)
fs.create_directory(path=source / "1")
fs.touch(path=source / "2")
fs.touch(path=source / "3")
fs.link(source=source, to=link)
self.assertEqual(set(fs.list_directory(link)), s("1", "2", "3"))
def test_list_directory_link_child(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
# /1 -> /0/1
# /1/3 -> /1/2/3
# realpath(/1/3) == /0/1/2/3
zero, one = tempdir / "0", tempdir / "1"
two = one / "2"
fs.create_directory(path=zero)
fs.create_directory(path=zero / "1")
fs.link(source=zero / "1", to=one)
fs.create_directory(path=two)
fs.create_directory(path=two / "3")
self.assertEqual(set(fs.list_directory(two)), {"3"})
def test_list_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
not_a_dir = tempdir / "not_a_dir"
fs.touch(not_a_dir)
with self.assertRaises(exceptions.NotADirectory) as e:
fs.list_directory(not_a_dir)
self.assertEqual(
str(e.exception),
os.strerror(errno.ENOTDIR) + ": " + str(not_a_dir),
)
def test_touch(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
child = tempdir / "a"
self.assertFalse(fs.exists(path=child))
fs.touch(path=child)
self.assertEqual(
dict(
exists=fs.exists(path=child),
is_dir=fs.is_dir(path=child),
is_file=fs.is_file(path=child),
is_link=fs.is_link(path=child),
),
dict(exists=True, is_dir=False, is_file=True, is_link=False),
)
def test_children(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
a = tempdir / "a"
b = tempdir / "b"
c = tempdir.descendant("b", "c")
d = tempdir / "d"
fs.touch(path=a)
fs.create_directory(path=b)
fs.touch(path=c)
fs.link(source=c, to=d)
self.assertEqual(fs.children(path=tempdir), s(a, b, d))
def test_glob_children(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
a = tempdir / "a"
b = tempdir / "b"
c = tempdir.descendant("b", "c")
abc = tempdir / "abc"
fedcba = tempdir / "fedcba"
fs.touch(path=a)
fs.create_directory(path=b)
fs.touch(path=c)
fs.touch(path=abc)
fs.touch(path=fedcba)
self.assertEqual(
fs.glob_children(path=tempdir, glob="*b*"),
s(b, abc, fedcba),
)
# With how crazy computers are, I'm not actually 100% sure that
# these tests for the behavior of the root directory will always be
# the case. But, onward we go.
def test_root_always_exists(self):
fs = self.FS()
self.assertTrue(fs.exists(Path.root()))
def test_realpath_root(self):
fs = self.FS()
self.assertEqual(fs.realpath(Path.root()), Path.root())
def test_readlink_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source, to = tempdir / "source", tempdir / "to"
fs.link(source=source, to=to)
self.assertEqual(fs.readlink(to), source)
def test_readlink_nested_link(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
source = tempdir / "source"
first = tempdir / "first"
second = tempdir / "second"
third = tempdir / "third"
fs.link(source=source, to=first)
fs.link(source=first, to=second)
fs.link(source=second, to=third)
self.assertEqual(fs.readlink(third), second)
def test_readlink_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
child = tempdir / "child"
fs.touch(child)
with self.assertRaises(exceptions.NotASymlink) as e:
fs.readlink(child)
self.assertEqual(
str(e.exception), os.strerror(errno.EINVAL) + ": " + str(child),
)
def test_readlink_directory(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with self.assertRaises(exceptions.NotASymlink) as e:
fs.readlink(tempdir)
self.assertEqual(
str(e.exception), os.strerror(errno.EINVAL) + ": " + str(tempdir),
)
def test_readlink_relative(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
source, to = RelativePath("source", "dir"), tempdir / "to"
fs.link(source=source, to=to)
self.assertEqual(fs.readlink(to), source)
def test_readlink_child_link_from_source(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
# /1 -> /0/1
# /1/3 -> /1/2/3
# readlink(/0/1/3) == /1/2/3
zero, one = tempdir / "0", tempdir / "1"
two = one / "2"
fs.create_directory(path=zero)
fs.create_directory(path=zero / "1")
fs.link(source=zero / "1", to=one)
fs.create_directory(path=two)
fs.create_directory(path=two / "3")
fs.link(source=two / "3", to=one / "3")
self.assertEqual(fs.readlink(zero.descendant("1", "3")), two / "3")
@with_scenarios()
class InvalidModeMixin(object):
scenarios = [
("activity", {"mode": "z"}),
("mode", {"mode": "rz"}),
("extra", {"mode": "rbz"}),
("binary_and_text", {"mode": "rbt"}),
("read_and_write", {"mode": "rwb"}),
]
def test_invalid_mode(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with self.assertRaises(exceptions.InvalidMode):
fs.open(tempdir / "unittesting", self.mode)
@with_scenarios()
class OpenFileMixin(object):
scenarios = [
(
"bytes",
{
"expected": b"some things!",
"bytes": lambda c: c,
"mode": "rb",
},
),
(
"native",
{
"expected": "some things!",
"bytes": lambda c: c.encode(),
"mode": "r",
},
),
(
"text",
{
"expected": u"some things!",
"bytes": lambda c: c.encode(),
"mode": "rt",
},
),
]
def test_open_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.open(tempdir / "unittesting", "wb") as f:
f.write(self.bytes(self.expected))
with fs.open(tempdir / "unittesting", self.mode) as g:
contents = g.read()
self.assertEqual(contents, self.expected)
self.assertIsInstance(contents, type(self.expected))
@with_scenarios()
class OpenWriteNonExistingFileMixin(object):
scenarios = [
("bytes", dict(contents=u"שלום".encode("utf-8"), mode="wb")),
("native", dict(contents="שלום", mode="w")),
("text", dict(contents=u"שלום", mode="wt")),
]
def test_open_write_non_existing_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.open(tempdir / "unittesting", self.mode) as f:
f.write(self.contents)
with fs.open(tempdir / "unittesting") as g:
self.assertEqual(g.read(), "שלום")
@with_scenarios()
class OpenAppendNonExistingFileMixin(object):
scenarios = [
("bytes", dict(first=b"some ", second=b"things!", mode="ab")),
("native", dict(first="some ", second="things!", mode="a")),
("text", dict(first=u"some ", second=u"things!", mode="at")),
]
def test_open_append_non_existing_file(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
with fs.open(tempdir / "unittesting", self.mode) as f:
f.write(self.first)
with fs.open(tempdir / "unittesting", self.mode) as f:
f.write(self.second)
with fs.open(tempdir / "unittesting") as g:
self.assertEqual(g.read(), "some things!")
@with_scenarios()
class WriteLinesMixin(object):
scenarios = [
(
"bytes",
{
"to_write": lambda text: text.encode(),
"mode": "ab",
},
),
(
"native",
{
"to_write": lambda text: text,
"mode": "a",
},
),
(
"text",
{
"to_write": lambda text: text,
"mode": "at",
},
),
]
def test_writelines(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
text = u"some\nthings!\n"
newline = self.to_write(u"\n")
to_write = self.to_write(text)
with fs.open(tempdir / "unittesting", self.mode) as f:
f.writelines(line + newline for line in to_write.splitlines())
with fs.open(tempdir / "unittesting") as g:
self.assertEqual(g.read(), text)
@with_scenarios()
class NonExistentChildMixin(object):
scenarios = multiply_scenarios(
[
(
"directory", dict(
Exception=exceptions.FileNotFound,
create=lambda fs, path: fs.create_directory(path=path),
),
), (
"file", dict(
Exception=exceptions.NotADirectory,
create=lambda fs, path: fs.touch(path=path),
),
), (
"link_to_file", dict(
Exception=exceptions.FileNotFound,
create=lambda fs, path: fs.touch( # Sorry :/
path=path.sibling("source"),
) and fs.link(source=path.sibling("source"), to=path),
),
), (
"link_to_directory", dict(
Exception=exceptions.FileNotFound,
create=lambda fs, path: fs.create_directory( # Sorry :/
path=path.sibling("source"),
) and fs.link(source=path.sibling("source"), to=path),
),
), (
"loop", dict(
Exception=exceptions.SymbolicLoop,
create=lambda fs, path: fs.link(source=path, to=path),
),
),
], [
(
"create_directory", dict(
act_on=lambda fs, path: fs.create_directory(path=path),
error_on_child=False,
),
), (
"list_directory",
dict(act_on=lambda fs, path: fs.list_directory(path=path)),
), (
"create_file",
dict(act_on=lambda fs, path: fs.create(path=path)),
), (
"remove_file",
dict(act_on=lambda fs, path: fs.remove_file(path=path)),
), (
"remove_empty_directory", dict(
act_on=lambda fs, path: fs.remove_empty_directory(
path=path,
),
),
), (
"read_bytes",
dict(act_on=lambda fs, path: fs.open(path=path, mode="rb")),
), (
"read_native",
dict(act_on=lambda fs, path: fs.open(path=path, mode="r")),
), (
"read_text",
dict(act_on=lambda fs, path: fs.open(path=path, mode="rt")),
), (
"write_bytes",
dict(act_on=lambda fs, path: fs.open(path=path, mode="wb")),
), (
"write_native",
dict(act_on=lambda fs, path: fs.open(path=path, mode="w")),
), (
"write_text",
dict(act_on=lambda fs, path: fs.open(path=path, mode="wt")),
), (
"append_bytes",
dict(act_on=lambda fs, path: fs.open(path=path, mode="ab")),
), (
"append_native",
dict(act_on=lambda fs, path: fs.open(path=path, mode="a")),
), (
"append_text",
dict(act_on=lambda fs, path: fs.open(path=path, mode="at")),
), (
"link", dict(
act_on=lambda fs, path: fs.link(source=path, to=path),
error_on_child=False,
),
), (
"readlink", dict(
act_on=lambda fs, path: fs.readlink(path=path),
),
), (
"stat", dict(
act_on=lambda fs, path: fs.stat(path=path),
),
), (
"lstat", dict(
act_on=lambda fs, path: fs.lstat(path=path),
),
),
],
)
def test_child_of_non_existing(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
existing = tempdir / "unittesting"
self.create(fs=fs, path=existing)
non_existing_child = existing.descendant("non_existing", "thing")
with self.assertRaises(self.Exception) as e:
self.act_on(fs=fs, path=non_existing_child)
path = ( # Sorry :/
non_existing_child
if getattr(self, "error_on_child", True)
else non_existing_child.parent()
)
self.assertEqual(
str(e.exception),
os.strerror(self.Exception.errno) + ": " + str(path),
)
def test_exists(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
existing = tempdir / "unittesting"
self.create(fs=fs, path=existing)
non_existing_child = existing.descendant("non_existing", "thing")
self.assertFalse(fs.exists(non_existing_child))
@with_scenarios()
class _SymbolicLoopMixin(object):
scenarios = multiply_scenarios(
[ # Size of loop
("one", dict(chain=["loop"])),
("two", dict(chain=["one", "two"])),
("many", dict(chain=["don't", "fall", "in", "the", "hole"])),
],
[ # Path to operate on
("itself", dict(path=lambda loop: loop)),
("child", dict(path=lambda loop: loop / "child")),
],
[ # Operation
(
"realpath",
dict(act_on=lambda fs, path: fs.realpath(path=path)),
), (
"list_directory",
dict(act_on=lambda fs, path: fs.list_directory(path=path)),
), (
"read_bytes",
dict(act_on=lambda fs, path: fs.open(path=path, mode="rb")),
), (
"read_native",
dict(act_on=lambda fs, path: fs.open(path=path, mode="r")),
), (
"read_text",
dict(act_on=lambda fs, path: fs.open(path=path, mode="rt")),
), (
"write_bytes",
dict(act_on=lambda fs, path: fs.open(path=path, mode="wb")),
), (
"write_native",
dict(act_on=lambda fs, path: fs.open(path=path, mode="w")),
), (
"write_text",
dict(act_on=lambda fs, path: fs.open(path=path, mode="wt")),
), (
"append_bytes",
dict(act_on=lambda fs, path: fs.open(path=path, mode="ab")),
), (
"append_native",
dict(act_on=lambda fs, path: fs.open(path=path, mode="a")),
), (
"append_text",
dict(act_on=lambda fs, path: fs.open(path=path, mode="at")),
), (
"stat",
dict(act_on=lambda fs, path: fs.stat(path=path)),
), (
"exists",
dict(act_on=lambda fs, path: fs.exists(path=path)),
), (
"is_dir",
dict(act_on=lambda fs, path: fs.is_dir(path=path)),
), (
"is_file",
dict(act_on=lambda fs, path: fs.is_file(path=path)),
),
],
)
def fs_with_loop(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
tempdir = fs.realpath(tempdir)
for source, to in zip(self.chain, self.chain[1:]):
fs.link(source=tempdir / source, to=tempdir / to)
fs.link(
source=tempdir / self.chain[-1],
to=tempdir / self.chain[0],
)
return fs, tempdir.descendant(self.chain[0])
def test_it_detects_loops(self):
fs, loop = self.fs_with_loop()
with self.assertRaises(exceptions.SymbolicLoop):
self.act_on(fs=fs, path=self.path(loop))
# FIXME: Temporarily disabled, since this is "wrong" at the minute for
# memory.FS.
# self.assertEqual(
# str(e.exception),
# os.strerror(errno.ELOOP) + ": " + str(self.path(loop)),
# )
class SymbolicLoopMixin(_SymbolicLoopMixin):
def test_create_loop_descendant(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
with self.assertRaises(exceptions.SymbolicLoop) as e:
fs.create(path=loop.descendant("child", "path"))
# We'd really like the first one, but on a real native FS, looking for
# it would be a race condition, so we allow the latter.
acceptable = {
os.strerror(errno.ELOOP) + ": " + str(loop),
os.strerror(errno.ELOOP) + ": " + str(loop / "child"),
}
self.assertIn(str(e.exception), acceptable)
def test_create_directory_loop_descendant(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
with self.assertRaises(exceptions.SymbolicLoop) as e:
fs.create_directory(path=loop.descendant("child", "path"))
# We'd really like the first one, but on a real native FS, looking for
# it would be a race condition, so we allow the latter.
acceptable = {
os.strerror(errno.ELOOP) + ": " + str(loop),
os.strerror(errno.ELOOP) + ": " + str(loop / "child"),
}
self.assertIn(str(e.exception), acceptable)
def test_remove_file_loop_descendant(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
with self.assertRaises(exceptions.SymbolicLoop) as e:
fs.remove_file(path=loop.descendant("child", "path"))
# We'd really like the first one, but on a real native FS, looking for
# it would be a race condition, so we allow the latter.
acceptable = {
os.strerror(errno.ELOOP) + ": " + str(loop),
os.strerror(errno.ELOOP) + ": " + str(loop / "child"),
}
self.assertIn(str(e.exception), acceptable)
def test_remove_empty_directory_loop_descendant(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
with self.assertRaises(exceptions.SymbolicLoop) as e:
fs.remove_empty_directory(path=loop.descendant("child", "path"))
# We'd really like the first one, but on a real native FS, looking for
# it would be a race condition, so we allow the latter.
acceptable = {
os.strerror(errno.ELOOP) + ": " + str(loop),
os.strerror(errno.ELOOP) + ": " + str(loop / "child"),
}
self.assertIn(str(e.exception), acceptable)
def test_link_loop_descendant(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
with self.assertRaises(exceptions.SymbolicLoop) as e:
fs.link(source=tempdir, to=loop.descendant("child", "path"))
# We'd really like the first one, but on a real native FS, looking for
# it would be a race condition, so we allow the latter.
acceptable = {
os.strerror(errno.ELOOP) + ": " + str(loop),
os.strerror(errno.ELOOP) + ": " + str(loop / "child"),
}
self.assertIn(str(e.exception), acceptable)
def test_readlink_loop_descendant(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
with self.assertRaises(exceptions.SymbolicLoop) as e:
fs.readlink(path=loop.descendant("child", "path"))
# We'd really like the first one, but on a real native FS, looking for
# it would be a race condition, so we allow the latter.
acceptable = {
os.strerror(errno.ELOOP) + ": " + str(loop),
os.strerror(errno.ELOOP) + ": " + str(loop / "child"),
os.strerror(errno.ELOOP) + ": " + str(
loop.descendant("child", "path"),
),
}
self.assertIn(str(e.exception), acceptable)
def test_lstat_loop(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
self.assertTrue(fs.lstat(path=loop))
def test_lstat_loop_descendant(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
with self.assertRaises(exceptions.SymbolicLoop) as e:
fs.lstat(path=loop.descendant("child", "path"))
# We'd really like the first one, but on a real native FS, looking for
# it would be a race condition, so we allow the latter.
acceptable = {
os.strerror(errno.ELOOP) + ": " + str(loop),
os.strerror(errno.ELOOP) + ": " + str(loop / "child"),
os.strerror(errno.ELOOP) + ": " + str(
loop.descendant("child", "path"),
),
}
self.assertIn(str(e.exception), acceptable)
def test_is_link_loop(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
self.assertTrue(fs.is_link(path=loop))
def test_is_link_loop_descendant(self):
fs = self.FS()
tempdir = fs.temporary_directory()
self.addCleanup(fs.remove, tempdir)
loop = tempdir / "loop"
fs.link(source=loop, to=loop)
with self.assertRaises(exceptions.SymbolicLoop) as e:
fs.is_link(path=loop.descendant("child", "path"))
# We'd really like the first one, but on a real native FS, looking for
# it would be a race condition, so we allow the latter.
acceptable = {
os.strerror(errno.ELOOP) + ": " + str(loop),
os.strerror(errno.ELOOP) + ": " + str(loop / "child"),
os.strerror(errno.ELOOP) + ": " + str(
loop.descendant("child", "path"),
),
}
self.assertIn(str(e.exception), acceptable)
# cogs/fun.py (SeldoW/Tomori, MIT license)
import os
import discord
import asyncio
import requests
import time
from datetime import datetime, date
import string
import random
import copy
import json
import asyncpg
import imghdr
from discord.ext import commands
from config.settings import settings
from PIL import Image, ImageChops, ImageFont, ImageDraw, ImageSequence, ImageFilter
from PIL.GifImagePlugin import getheader, getdata
from functools import partial
import aiohttp
from io import BytesIO
from typing import Union
from cogs.const import *
from cogs.ids import *
from cogs.locale import *
mask = Image.new('L', (364, 364), 0)
draws = ImageDraw.Draw(mask)
draws.ellipse((0, 0) + (40, 40), fill=255)
draws.ellipse((324, 0) + (364, 40), fill=255)
draws.ellipse((0, 324) + (40, 364), fill=255)
draws.ellipse((324, 324) + (364, 364), fill=255)
draws.rectangle(((0, 20), (364, 344)), fill=255)
draws.rectangle(((20, 0), (344, 364)), fill=255)
mask = mask.resize((364, 364), Image.ANTIALIAS)
mask_top = Image.new('L', (269, 269), 0)
draws_top = ImageDraw.Draw(mask_top)
draws_top.ellipse((0, 0) + (269, 269), fill=255)
mask_top = mask_top.resize((269, 269), Image.ANTIALIAS)
mask_top_back = Image.new('L', (549, 549), 0)
draws_top_back = ImageDraw.Draw(mask_top_back)
draws_top_back.rectangle((274, 0) + (474, 549), fill=255)
mask_top_back = mask_top_back.resize((549, 549), Image.ANTIALIAS)
async def f_me(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, is_me, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if lang not in locale:
em = discord.Embed(description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error",
), colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
# Base embed for the responses below; without it the feature-flag
# branch would reference `em` before assignment.
em = discord.Embed(colour=0xC5934B)
if not const or not const["is_me"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except:
pass
await client.send_typing(message.channel)
if not who:
who = message.author
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT * FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if not dat:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(who.display_name[:50]), who.id, stats_type))
dat = await conn.fetchrow("SELECT * FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
background = dat["background"]
if not background:
if not message.server.id in konoha_servers:
background = random.choice(background_list)
else:
background = random.choice(konoha_background_list)
await conn.execute("UPDATE users SET background = '{back}' WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(
back=background, # persist the background that is actually rendered
stats_type=stats_type,
id=who.id
))
cash_rank = await conn.fetchrow("SELECT COUNT(DISTINCT cash) AS qty FROM users WHERE stats_type = '{}' AND cash > {}".format(stats_type, dat["cash"]))
rep_rank = await conn.fetchrow("SELECT COUNT(DISTINCT reputation) AS qty FROM users WHERE stats_type = '{}' AND reputation > {}".format(stats_type, dat["reputation"]))
xp_rank = await conn.fetchrow("SELECT COUNT(DISTINCT xp_count) AS qty FROM users WHERE stats_type = '{}' AND xp_count > {}".format(stats_type, dat["xp_count"]))
#================================================== stats 1
xp_lvl = 0
i = 1
if dat["xp_count"] > 0:
while dat["xp_count"] >= (i * (i + 1) * 5):
xp_lvl = xp_lvl + 1
i = i + 1
xp_count = "{}/{}".format(dat["xp_count"], ((xp_lvl + 1) * (xp_lvl + 2) * 5))
xp_lvl = str(xp_lvl)
#==================================================
back = Image.open("cogs/stat/backgrounds/{}".format(background))
draw_b = ImageDraw.Draw(back)
under = Image.open("cogs/stat/backgrounds/under.png")
font_right = ImageFont.truetype("cogs/stat/Lato.ttf", 16)
font_xp_count = ImageFont.truetype("cogs/stat/WhitneyBold.ttf", 37)
ava_url = who.avatar_url
if not ava_url:
ava_url = who.default_avatar_url
response = requests.get(ava_url)
avatar = Image.open(BytesIO(response.content))
avatar = avatar.resize((364, 364))
avatar.putalpha(mask)
avatar = avatar.resize((91, 91))
back = back.filter(ImageFilter.GaussianBlur(1))
back.paste(under, (0, 0), under)
back.paste(avatar, (355, 151), avatar)
halo = Image.new('RGBA', back.size, (0, 0, 0, 0))
ImageDraw.Draw(halo).text(
(400-font_xp_count.getsize(xp_count)[0]/2, 335-font_xp_count.getsize(xp_count)[1]/2),
xp_count,
(0, 0, 0),
font=font_xp_count
)
kernel = [
0, 1, 2, 1, 0,
1, 2, 4, 2, 1,
2, 4, 8, 4, 2,
1, 2, 4, 2, 1,
0, 1, 2, 1, 0
]
kernelsum = sum(kernel)
myfilter = ImageFilter.Kernel((5, 5), kernel, scale = 0.5 * kernelsum)
blurred_halo = halo.filter(myfilter)
ImageDraw.Draw(blurred_halo).text(
(400-font_xp_count.getsize(xp_count)[0]/2, 335-font_xp_count.getsize(xp_count)[1]/2),
xp_count,
(255, 255, 255),
font=font_xp_count
)
back = Image.composite(back, blurred_halo, ImageChops.invert(blurred_halo))
draw = ImageDraw.Draw(back)
draw.text(
(703-font_right.getsize(xp_lvl)[0]/2, 71-font_right.getsize(xp_lvl)[1]/2),
xp_lvl,
(0, 0, 0),
font=font_right
)
draw.text(
(703-font_right.getsize(str(dat["reputation"]))[0]/2, 118-font_right.getsize(str(dat["reputation"]))[1]/2),
str(dat["reputation"]),
(0, 0, 0),
font=font_right
)
draw.text(
(703-font_right.getsize(str(dat["cash"]))[0]/2, 353-font_right.getsize(str(dat["cash"]))[1]/2),
str(dat["cash"]),
(0, 0, 0),
font=font_right
)
draw.text(
(140-font_right.getsize(str(dat["drink_count"]))[0], 353-font_right.getsize(str(dat["drink_count"]))[1]/2),
str(dat["drink_count"]),
(0, 0, 0),
font=font_right
)
draw.text(
(140-font_right.getsize(str(dat["fuck_count"]))[0], 306-font_right.getsize(str(dat["fuck_count"]))[1]/2),
str(dat["fuck_count"]),
(0, 0, 0),
font=font_right
)
draw.text(
(140-font_right.getsize(str(dat["wink_count"]))[0], 259-font_right.getsize(str(dat["wink_count"]))[1]/2),
str(dat["wink_count"]),
(0, 0, 0),
font=font_right
)
draw.text(
(140-font_right.getsize(str(dat["five_count"]))[0], 212-font_right.getsize(str(dat["five_count"]))[1]/2),
str(dat["five_count"]),
(0, 0, 0),
font=font_right
)
draw.text(
(140-font_right.getsize(str(dat["punch_count"]))[0], 165-font_right.getsize(str(dat["punch_count"]))[1]/2),
str(dat["punch_count"]),
(0, 0, 0),
font=font_right
)
draw.text(
(140-font_right.getsize(str(dat["hug_count"]))[0], 118-font_right.getsize(str(dat["hug_count"]))[1]/2),
str(dat["hug_count"]),
(0, 0, 0),
font=font_right
)
draw.text(
(140-font_right.getsize(str(dat["kiss_count"]))[0], 71-font_right.getsize(str(dat["kiss_count"]))[1]/2),
str(dat["kiss_count"]),
(0, 0, 0),
font=font_right
)
name = u"{}".format(who.display_name)
name_size = 1
font_name = ImageFont.truetype("cogs/stat/Lato.ttf", name_size)
while font_name.getsize(name)[0] < 150:
name_size += 1
font_name = ImageFont.truetype("cogs/stat/Lato.ttf", name_size)
if name_size == 18:
break
name_size -= 1
font_name = ImageFont.truetype("cogs/stat/Lato.ttf", name_size)
draw.text(
(400-font_name.getsize(name)[0]/2, 255-font_name.getsize(name)[1]/2),
name,
(0, 0, 0),
font=font_name
)
back.save('cogs/stat/return/{}.png'.format(message.author.id))
await client.upload("cogs/stat/return/{}.png".format(message.author.id))
os.remove("cogs/stat/return/{}.png".format(message.author.id))
return
async def f_hug(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, em_color, hug_price, is_hug, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_hug"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if await are_you_nitty(client, lang, who, message):
return
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT cash FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
dates = await conn.fetchrow("SELECT hug_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if not dates:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(who.display_name[:50]), who.id, stats_type))
dates = await conn.fetchrow("SELECT hug_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if dat:
if (const["hug_price"] > dat["cash"]):
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET cash = {cash} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(cash=dat["cash"] - const["hug_price"], stats_type=stats_type, id=message.author.id))
await conn.execute("UPDATE users SET hug_count = {count} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(count=dates["hug_count"] + 1, stats_type=stats_type, id=who.id))
em.description = locale[lang]["fun_hug"].format(message.author.mention, who.mention)
em.set_image(url=random.choice(hug_list))
else:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
await client.send_message(message.channel, embed=em)
return
async def f_kiss(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, em_color, kiss_price, is_kiss, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_kiss"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if await are_you_nitty(client, lang, who, message):
return
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT cash FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
dates = await conn.fetchrow("SELECT kiss_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if not dates:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(who.display_name[:50]), who.id, stats_type))
dates = await conn.fetchrow("SELECT kiss_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if dat:
if (const["kiss_price"] > dat["cash"]):
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET cash = {cash} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, cash=dat["cash"] - const["kiss_price"], id=message.author.id))
await conn.execute("UPDATE users SET kiss_count = {count} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, count=dates["kiss_count"] + 1, id=who.id))
em.description = locale[lang]["fun_kiss"].format(message.author.mention, who.mention)
em.set_image(url=random.choice(kiss_list))
else:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
await client.send_message(message.channel, embed=em)
return
async def f_five(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, em_color, five_price, is_five, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_five"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if await are_you_nitty(client, lang, who, message):
return
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT cash, five_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
if dat:
if (const["five_price"] > dat["cash"]):
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET cash = {cash}, five_count = {count} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, cash=dat["cash"] - const["five_price"], count=dat["five_count"]+1, id=message.author.id))
em.description = locale[lang]["fun_five"].format(message.author.mention, who.mention)
em.set_image(url=random.choice(five_list))
else:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
await client.send_message(message.channel, embed=em)
return
async def f_punch(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, em_color, punch_price, is_punch, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_punch"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if await are_you_nitty(client, lang, who, message):
return
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT cash FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
dates = await conn.fetchrow("SELECT punch_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if not dates:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(who.display_name[:50]), who.id, stats_type))
dates = await conn.fetchrow("SELECT punch_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if dat:
if (const["punch_price"] > dat["cash"]):
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET cash = {cash} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, cash=dat["cash"] - const["punch_price"], id=message.author.id))
await conn.execute("UPDATE users SET punch_count = {count} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, count=dates["punch_count"] + 1, id=who.id))
em.description = locale[lang]["fun_punch"].format(message.author.mention, who.mention)
em.set_image(url=random.choice(punch_list))
else:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
await client.send_message(message.channel, embed=em)
return
async def f_fuck(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, em_color, fuck_price, is_fuck, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_fuck"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if await are_you_nitty(client, lang, who, message):
return
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT cash, fuck_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
if dat:
if (const["fuck_price"] > dat["cash"]):
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET cash = {cash}, fuck_count = {count} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, cash=dat["cash"] - const["fuck_price"], count=dat["fuck_count"]+1, id=message.author.id))
em.description = locale[lang]["fun_fuck"].format(message.author.mention, who.mention)
em.set_image(url=random.choice(fuck_list))
else:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
await client.send_message(message.channel, embed=em)
return
async def f_wink(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, em_color, wink_price, is_wink, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_wink"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if await are_you_nitty(client, lang, who, message):
return
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT cash, wink_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
if dat:
if (const["wink_price"] > dat["cash"]):
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET cash = {cash}, wink_count = {count} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, cash=dat["cash"] - const["wink_price"], count=dat["wink_count"]+1, id=message.author.id))
em.description = locale[lang]["fun_wink"].format(message.author.mention, who.mention)
em.set_image(url=random.choice(wink_list))
else:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
await client.send_message(message.channel, embed=em)
return
async def f_drink(client, conn, context):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, em_color, drink_price, is_drink, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_drink"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT cash, drink_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
if dat:
if (const["drink_price"] > dat["cash"]):
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET cash = {cash}, drink_count = {count} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, cash=dat["cash"] - const["drink_price"], count=dat["drink_count"]+1, id=message.author.id))
em.description = locale[lang]["fun_drink"].format(message.author.mention)
em.set_image(url=random.choice(drink_list))
else:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
await client.send_message(message.channel, embed=em)
return
async def f_rep(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT rep_cooldown, em_color, is_rep, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_rep"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if await are_you_nitty(client, lang, who, message):
return
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT rep_time FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
dates = await conn.fetchrow("SELECT reputation FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if not dates:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(who.display_name[:50]), who.id, stats_type))
dates = await conn.fetchrow("SELECT reputation FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if not dat:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
dat = await conn.fetchrow("SELECT rep_time FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
tim = int(time.time()) - dat["rep_time"]
if tim < const["rep_cooldown"]:
t=const["rep_cooldown"] - tim
h=str(t//3600)
m=str((t//60)%60)
s=str(t%60)
em.description = "{later1}\n{later2}".format(
later1=locale[lang]["rep_try_again_later1"].format(
who=clear_name(message.author.display_name+"#"+message.author.discriminator)
),
later2=locale[lang]["rep_try_again_later2"].format(
hours=h,
minutes=m,
seconds=s
)
)
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET rep_time = {time} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(
time=int(time.time()),
stats_type=stats_type,
id=message.author.id
))
await conn.execute("UPDATE users SET reputation = {rep} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(
rep=dates["reputation"] + 3,
stats_type=stats_type,
id=who.id
))
em.description = locale[lang]["fun_rep_success"].format(message.author.mention, who.mention)
await client.send_message(message.channel, embed=em)
return
async def f_sex(client, conn, context, who):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, em_color, sex_price, is_sex, locale FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_sex"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
if await are_you_nitty(client, lang, who, message):
return
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT cash, sex_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=message.author.id))
dates = await conn.fetchrow("SELECT sex_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if not dates:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(who.display_name[:50]), who.id, stats_type))
dates = await conn.fetchrow("SELECT sex_count FROM users WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(stats_type=stats_type, id=who.id))
if dat:
if (const["sex_price"] > dat["cash"]):
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
else:
await conn.execute("UPDATE users SET cash = {cash} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(
cash=dat["cash"] - const["sex_price"],
stats_type=stats_type,
id=message.author.id
))
await conn.execute("UPDATE users SET sex_count = {count} WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(
count=dates["sex_count"] + 1,
stats_type=stats_type,
id=who.id
))
em.description = "{} fucked {}".format(message.author.display_name+"#"+message.author.discriminator, who.display_name+"#"+who.discriminator)
em.set_image(url=random.choice(sex_list))
else:
await conn.execute("INSERT INTO users(name, discord_id, stats_type) VALUES('{}', '{}', '{}')".format(clear_name(message.author.display_name[:50]), message.author.id, stats_type))
em.description = locale[lang]["global_dont_have_that_much_money"].format(who=message.author.display_name+"#"+message.author.discriminator, money=const["server_money"])
if not message.server.id in servers_without_follow_us:
em.add_field(
name=locale[lang]["global_follow_us"],
value=tomori_links,
inline=False
)
await client.send_message(message.channel, embed=em)
return
async def are_you_nitty(client, lang, who, message):
em = discord.Embed(colour=0xC5934B)
if not who:
em.description = locale[lang]["global_not_mention_on_user"].format(message.author.display_name+"#"+message.author.discriminator)
elif who.bot:
em.description = locale[lang]["global_bot_mentioned"].format(
who=message.author.display_name+"#"+message.author.discriminator,
bot=who.display_name[:50]
)
elif message.author == who:
em.description = locale[lang]["global_choose_someone_else"].format(message.author.display_name+"#"+message.author.discriminator)
else:
return False
await client.send_message(message.channel, embed=em)
return True
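# Added note (not in the original source): every query in this cog interpolates
# values with str.format(), so a display name containing a quote character can
# break or inject into the SQL statement. asyncpg supports positional query
# parameters, which would be the safer shape here, e.g.:
#
#     await conn.execute(
#         "INSERT INTO users(name, discord_id, stats_type) VALUES($1, $2, $3)",
#         clear_name(who.display_name[:50]), who.id, stats_type)
#
# The driver then handles quoting and escaping itself.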
async def f_top(client, conn, context, page):
message = context.message
server_id = message.server.id
const = await conn.fetchrow("SELECT server_money, is_me, locale, em_color FROM settings WHERE discord_id = '{}'".format(server_id))
lang = const["locale"]
if not lang in locale.keys():
em = discord.Embed(
description="{who}, {response}.".format(
who=message.author.display_name+"#"+message.author.discriminator,
response="localization error"),
colour=0xC5934B)
await client.send_message(message.channel, embed=em)
return
em = discord.Embed(colour=int(const["em_color"], 16) + 512)
if not const or not const["is_me"]:
em.description = locale[lang]["global_not_available"].format(who=message.author.display_name+"#"+message.author.discriminator)
await client.send_message(message.channel, embed=em)
return
try:
await client.delete_message(message)
except Exception:
pass
await client.send_typing(message.channel)
if message.server.id in local_stats_servers:
stats_type = message.server.id
else:
stats_type = "global"
dat = await conn.fetchrow("SELECT COUNT(name) FROM users WHERE stats_type = '{}'".format(stats_type))
all_count = dat[0]
pages = (((all_count - 1) // 5) + 1)
if not page:
page = 1
if all_count == 0:
em.description = locale[lang]["global_list_is_empty"]
await client.send_message(message.channel, embed=em)
return
if page > pages:
em.description = locale[lang]["global_page_not_exists"].format(who=message.author.display_name+"#"+message.author.discriminator, number=page)
await client.send_message(message.channel, embed=em)
return
dat = await conn.fetch("SELECT name, discord_id, avatar_url, xp_count FROM users WHERE stats_type = '{stats_type}' ORDER BY xp_count DESC LIMIT 5 OFFSET {offset}".format(stats_type=stats_type, offset=(page-1)*5))
#==================================================
img = Image.open("cogs/stat/top5.png")
back = Image.open("cogs/stat/top5.png")
draw = ImageDraw.Draw(img)
draws = ImageDraw.Draw(back)
font_position = ImageFont.truetype("cogs/stat/Roboto-Bold.ttf", 24)
font_xp_count = ImageFont.truetype("cogs/stat/Roboto-Regular.ttf", 16)
for i, user in enumerate(dat):
name = user["name"]
name = u"{}".format(name)
name_size = 1
font_name = ImageFont.truetype("cogs/stat/Roboto-Bold.ttf", name_size)
while font_name.getsize(name)[0] < 180:
name_size += 1
font_name = ImageFont.truetype("cogs/stat/Roboto-Bold.ttf", name_size)
if name_size == 31:
break
name_size -= 1
font_name = ImageFont.truetype("cogs/stat/Roboto-Bold.ttf", name_size)
if not name:
name = " "
ava_url = user["avatar_url"]
if not ava_url:
ava_url = client.user.default_avatar_url
for server in client.servers:
member = server.get_member(user["discord_id"])
if member and member.avatar_url:
ava_url = member.avatar_url
await conn.execute("UPDATE users SET avatar_url = '{url}' WHERE stats_type = '{stats_type}' AND discord_id = '{id}'".format(
url=ava_url,
stats_type=stats_type,
id=user["discord_id"]
))
break
response = requests.get(ava_url)
avatar = Image.open(BytesIO(response.content))
avatar_circle = avatar.resize((549, 549))
avatar_circle.putalpha(mask_top_back)
avatar_circle = avatar_circle.crop((274, 0, 474, 549))
back.paste(avatar_circle, (i*200, 0), avatar_circle)
avatar_circle = avatar.resize((269, 269))
bigsize = (avatar_circle.size[0] * 3, avatar_circle.size[1] * 3)
avatar_circle.putalpha(mask_top)
avatar_circle = avatar_circle.crop((0, 0, 134, 269))
img.paste(avatar_circle, (66+i*200, 125), avatar_circle)
position = "#{}".format(i+1+(page-1)*5)
draw.text(
(
100-font_position.getsize(position)[0]/2+i*200,
440-font_position.getsize(position)[1]/2
),
position,
(255, 255, 255),
font=font_position
)
draw.text((100-font_name.getsize(name)[0]/2+i*200, 50-font_name.getsize(name)[1]/2), name, (255, 255, 255), font=font_name)
xp_count = locale[lang]["fun_top5_xp_count"].format((user["xp_count"]))
draw.text(
(
100-font_xp_count.getsize(xp_count)[0]/2+i*200,
475-font_xp_count.getsize(xp_count)[1]/2
),
xp_count,
(255, 255, 255),
font=font_xp_count
)
back.paste(img, (0, 0), img)
back.save('cogs/stat/return/top/{}.png'.format(message.author.id))
await client.send_file(message.channel, "cogs/stat/return/top/{}.png".format(message.author.id), content=locale[lang]["fun_top5_response"])
os.remove("cogs/stat/return/top/{}.png".format(message.author.id))
return
# tests/conftest.py (jindrichsamec/kontejnery)
import pytest
import os
@pytest.fixture
def prg8_html_content():
return open(os.path.join(os.path.dirname(__file__) + '/containers/parsers/', 'prg8.html'), encoding='utf-8').read()
2b029229fda403c7a54520e8c88fedd9f7985974 | 83 | py | Python | test/test_models.py | bethgelab/robustness | aa0a6798fe3973bae5f47561721b59b39f126ab7 | [
"Apache-2.0"
] | 67 | 2020-07-01T01:13:19.000Z | 2022-03-28T15:33:20.000Z | test/test_models.py | bethgelab/robustness | aa0a6798fe3973bae5f47561721b59b39f126ab7 | [
"Apache-2.0"
] | 4 | 2021-03-04T13:24:52.000Z | 2022-03-30T22:07:40.000Z | test/test_models.py | bethgelab/robustness | aa0a6798fe3973bae5f47561721b59b39f126ab7 | [
"Apache-2.0"
] | 1 | 2021-05-25T09:41:10.000Z | 2021-05-25T09:41:10.000Z | import robusta
def test_imports():
"""API tests."""
assert robusta.models | 13.833333 | 25 | 0.662651 | 10 | 83 | 5.4 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.204819 | 83 | 6 | 25 | 13.833333 | 0.818182 | 0.120482 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.666667 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2b7a8ffc3baa23ec3b8618363ce51fedf77b5561 | 942 | py | Python | numpy_reshape_tile_repeat.py | vchatchai/python201 | 783481dbb8b2a641583f1f349f95f22126bfa9ed | [
"Apache-2.0"
] | null | null | null | numpy_reshape_tile_repeat.py | vchatchai/python201 | 783481dbb8b2a641583f1f349f95f22126bfa9ed | [
"Apache-2.0"
] | null | null | null | numpy_reshape_tile_repeat.py | vchatchai/python201 | 783481dbb8b2a641583f1f349f95f22126bfa9ed | [
"Apache-2.0"
] | null | null | null | import numpy as np
print(np.__version__)
a = np.arange(1, 13)
print( f'a => {a}')
print( f'a.reshape(4,3) => {a.reshape(4,3)}')
print(f'a.reshape(4,-1) => {a.reshape(4,-1)}')
print(f'a.reshape(-1,3) => {a.reshape(-1, 3)}')
b = a.reshape(-1,2)
print(f'b = a.reshape(-1,2) => {b}')
print(f'b.reshape(1,-1) => {b.reshape(1,-1)}')
print(f'b.ravel() => {b.ravel()}')
d = np.array([2,1,8])
print(f'd => {d}')
print(f'np.tile(d,3) => {np.tile(d, 3)}')
print(f'np.tile(d,(3,1)) => {np.tile(d,(3,1))}')
e = np.array([[1,2], [10,20]])
print(f'e => {e}')
print(f'np.tile(e, 3) => {np.tile(e, 3)}')
print(f'np.tile(e, (5,1)) => {np.tile(e, (5,1))}')
print(f'np.tile(e, (5,2)) => {np.tile(e, (5,2))}')
print(f'np.tile([1,0], 4) => {np.tile([1,0], 4)}')
print(f'np.tile([[1,0],[0,1]], (4,4)) => {np.tile([[1,0],[0,1]], (4,4))}')
print(f'np.repeat(9,3) => {np.repeat(9,3)}')
print(f'np.repeat([10, 15, 20], 2) => {np.repeat([10,15,20], 2)}') | 21.906977 | 74 | 0.505308 | 205 | 942 | 2.302439 | 0.141463 | 0.228814 | 0.152542 | 0.177966 | 0.512712 | 0.309322 | 0.144068 | 0.144068 | 0.09322 | 0.09322 | 0 | 0.1 | 0.118896 | 942 | 43 | 75 | 21.906977 | 0.468675 | 0 | 0 | 0 | 0 | 0.208333 | 0.623542 | 0.07105 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041667 | 0 | 0.041667 | 0.791667 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
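The script above exercises `np.tile` and `np.repeat` through prints; as a minimal side-by-side sketch of the difference (reusing the `d` array from above):

```python
import numpy as np

d = np.array([2, 1, 8])

# tile repeats the whole array end-to-end...
tiled = np.tile(d, 2)       # [2 1 8 2 1 8]
# ...while repeat duplicates each element in place.
repeated = np.repeat(d, 2)  # [2 2 1 1 8 8]

print(tiled)
print(repeated)
```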
99325ff4b7a8729dd4d1fd6c4c222b49525cb6ff | 134 | py | Python | csv2vcf/__init__.py | 84KaliPleXon3/csv2vcf | ec0deae41b34e92844302f10386430b39fc4a401 | [
"MIT"
] | 63 | 2017-07-05T10:02:33.000Z | 2022-02-01T00:32:39.000Z | csv2vcf/__init__.py | 84KaliPleXon3/csv2vcf | ec0deae41b34e92844302f10386430b39fc4a401 | [
"MIT"
] | 4 | 2019-09-27T12:27:32.000Z | 2020-10-15T19:04:58.000Z | csv2vcf/__init__.py | 84KaliPleXon3/csv2vcf | ec0deae41b34e92844302f10386430b39fc4a401 | [
"MIT"
] | 24 | 2017-07-05T15:37:15.000Z | 2022-01-31T13:22:25.000Z | # -*- coding: utf-8 -*-
import time
from .VCF import *
from tkinter.filedialog import askopenfilename
from tkinter import *
import os
| 19.142857 | 46 | 0.746269 | 18 | 134 | 5.555556 | 0.611111 | 0.22 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00885 | 0.156716 | 134 | 6 | 47 | 22.333333 | 0.876106 | 0.156716 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
995c92100b1405a19666de22a5e723ecf5a2315d | 5,859 | py | Python | tests/unit/controller/test_controller_jwt.py | jlamoso/petisco | bd71d28a5c0ba6ea789fa7c1529e7a2d108da53f | [
"MIT"
] | null | null | null | tests/unit/controller/test_controller_jwt.py | jlamoso/petisco | bd71d28a5c0ba6ea789fa7c1529e7a2d108da53f | [
"MIT"
] | null | null | null | tests/unit/controller/test_controller_jwt.py | jlamoso/petisco | bd71d28a5c0ba6ea789fa7c1529e7a2d108da53f | [
"MIT"
] | null | null | null | import pytest
from meiga import Success
from petisco import controller, INFO, ERROR
from petisco.controller.tokens.jwt_config import JwtConfig
from tests.unit.mocks.fake_logger import FakeLogger
from tests.unit.mocks.log_message_mother import LogMessageMother
@pytest.fixture
def given_any_token_type():
return "TOKEN"
@pytest.fixture
def given_other_token_type():
return "REQUIRED_TOKEN"
@pytest.fixture
def given_any_token_type_with_user():
return "USER_TOKEN"
@pytest.fixture
def given_any_client_id():
return "client_id"
@pytest.fixture
def given_any_user_id():
return "user_id"
@pytest.fixture
def given_any_decoded_token_info(given_any_token_type, given_any_client_id):
return {
"user_id": None,
"client_id": given_any_client_id,
"token_type": given_any_token_type,
}
@pytest.fixture
def given_any_decoded_token_info_with_user(
given_any_token_type_with_user, given_any_client_id, given_any_user_id
):
return {
"user_id": given_any_user_id,
"client_id": given_any_client_id,
"token_type": given_any_token_type_with_user,
}
@pytest.mark.unit
def test_should_execute_successfully_a_empty_controller_with_jwt_requirement_without_user(
given_any_token_type, given_any_decoded_token_info
):
logger = FakeLogger()
jwt_config = JwtConfig(token_type=given_any_token_type)
@controller(logger=logger, jwt_config=jwt_config)
def my_controller(token_info):
return Success("Hello Petisco")
http_response = my_controller(token_info=given_any_decoded_token_info)
assert http_response == ({"message": "OK"}, 200)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Start"
).to_json(),
)
assert second_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller",
message="Result[status: success | value: Hello Petisco]",
).to_json(),
)
@pytest.mark.unit
def test_should_execute_successfully_a_empty_controller_with_jwt_requirement_with_user(
given_any_token_type_with_user, given_any_decoded_token_info_with_user
):
logger = FakeLogger()
jwt_config = JwtConfig(token_type=given_any_token_type_with_user, require_user=True)
@controller(logger=logger, jwt_config=jwt_config)
def my_controller(token_info, user_id):
return Success("Hello Petisco")
http_response = my_controller(token_info=given_any_decoded_token_info_with_user)
assert http_response == ({"message": "OK"}, 200)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Start"
).to_json(),
)
assert second_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller",
message="Result[status: success | value: Hello Petisco]",
).to_json(),
)
@pytest.mark.unit
def test_should_returns_an_error_when_a_empty_controller_do_not_get_a_required_jwt_token(
given_other_token_type, given_any_decoded_token_info
):
logger = FakeLogger()
jwt_config = JwtConfig(token_type=given_other_token_type)
@controller(logger=logger, jwt_config=jwt_config)
def my_controller(token_info):
return Success("Hello Petisco")
http_response = my_controller(token_info=given_any_decoded_token_info)
assert http_response == (
{
"error": {
"message": "Access token is missing or invalid. This entry point expects a valid REQUIRED_TOKEN Token",
"type": "InvalidTokenHttpError",
}
},
401,
)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Start"
).to_json(),
)
assert second_logging_message == (
ERROR,
LogMessageMother.get_controller(
operation="my_controller",
message="Result[status: failure | value: InvalidTokenError]",
).to_json(),
)
@pytest.mark.unit
def test_should_returns_an_error_when_a_empty_controller_get_a_required_jwt_token_but_missing_user(
given_any_token_type, given_any_decoded_token_info
):
logger = FakeLogger()
jwt_config = JwtConfig(token_type=given_any_token_type, require_user=True)
@controller(logger=logger, jwt_config=jwt_config)
def my_controller(token_info):
return Success("Hello Petisco")
http_response = my_controller(token_info=given_any_decoded_token_info)
assert http_response == (
{
"error": {
"message": "Access token is missing or invalid. This entry point expects a valid TOKEN Token",
"type": "InvalidTokenHttpError",
}
},
401,
)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Start"
).to_json(),
)
assert second_logging_message == (
ERROR,
LogMessageMother.get_controller(
operation="my_controller",
message="Result[status: failure | value: InvalidTokenError]",
).to_json(),
)
| 28.580488 | 119 | 0.700632 | 701 | 5,859 | 5.416548 | 0.128388 | 0.063208 | 0.041085 | 0.053727 | 0.910982 | 0.882012 | 0.859626 | 0.820121 | 0.794838 | 0.794838 | 0 | 0.004328 | 0.211299 | 5,859 | 204 | 120 | 28.720588 | 0.817356 | 0 | 0 | 0.649682 | 0 | 0 | 0.123912 | 0.007168 | 0 | 0 | 0 | 0 | 0.076433 | 1 | 0.095541 | false | 0 | 0.038217 | 0.070064 | 0.203822 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
998b6d398613c1b4ad051e6f61dc52839e34cf02 | 125 | py | Python | eventsourcing/utils/uuids.py | scbabacus/eventsourcing | 8404c5b26719ed9d9d1d257ebba774879c7243c4 | [
"BSD-3-Clause"
] | 1 | 2020-02-10T08:12:31.000Z | 2020-02-10T08:12:31.000Z | eventsourcing/utils/uuids.py | scbabacus/eventsourcing | 8404c5b26719ed9d9d1d257ebba774879c7243c4 | [
"BSD-3-Clause"
] | null | null | null | eventsourcing/utils/uuids.py | scbabacus/eventsourcing | 8404c5b26719ed9d9d1d257ebba774879c7243c4 | [
"BSD-3-Clause"
] | null | null | null | import uuid
def uuid_from_uri(uri):
return uuid.uuid5(uuid.NAMESPACE_URL, uri.encode('utf8') if bytes == str else uri)
| 20.833333 | 86 | 0.728 | 21 | 125 | 4.190476 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018868 | 0.152 | 125 | 5 | 87 | 25 | 0.811321 | 0 | 0 | 0 | 0 | 0 | 0.032 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
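A usage sketch for `uuid_from_uri` above (the URI is illustrative): `uuid.uuid5` is name-based, so the same URI always yields the same UUID, which is what makes it useful for deriving stable entity IDs.

```python
import uuid

def uuid_from_uri(uri):
    # py2/py3 compat: encode only where str is bytes (Python 2)
    return uuid.uuid5(uuid.NAMESPACE_URL, uri.encode('utf8') if bytes == str else uri)

a = uuid_from_uri('http://example.com/events/1')
b = uuid_from_uri('http://example.com/events/1')
print(a)
assert a == b          # deterministic for equal input
assert a.version == 5  # SHA-1 name-based UUID
```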
51287592aca084585c4fb0c1810a2caeae40e11b | 99 | py | Python | src/sage/dev/__init__.py | bopopescu/sage-5 | 9d85b34956ca2edd55af307f99c5d3859acd30bf | [
"BSL-1.0"
] | null | null | null | src/sage/dev/__init__.py | bopopescu/sage-5 | 9d85b34956ca2edd55af307f99c5d3859acd30bf | [
"BSL-1.0"
] | null | null | null | src/sage/dev/__init__.py | bopopescu/sage-5 | 9d85b34956ca2edd55af307f99c5d3859acd30bf | [
"BSL-1.0"
] | null | null | null | from sagedev import SageDev
import sagedev_wrapper
dev = sagedev_wrapper.SageDevWrapper(SageDev())
| 24.75 | 47 | 0.848485 | 12 | 99 | 6.833333 | 0.5 | 0.317073 | 0.487805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 99 | 3 | 48 | 33 | 0.911111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
515cd32c4f6350e9e89f3db404d120b0dedd4dbe | 111 | py | Python | pythonidbot/error/unauthorized.py | hexatester/pythonidbot | 39964a340dca90dd64e3cd45d0513d5ae0be3986 | [
"MIT"
] | 1 | 2021-02-01T15:19:25.000Z | 2021-02-01T15:19:25.000Z | bot/error/unauthorized.py | hexatester/ptb-skeleton | f6f8b3b0dd814e223a8650a70a6749b6f208a225 | [
"MIT"
] | null | null | null | bot/error/unauthorized.py | hexatester/ptb-skeleton | f6f8b3b0dd814e223a8650a70a6749b6f208a225 | [
"MIT"
] | null | null | null | from telegram.error import Unauthorized
def unauthorized(update, context, exception: Unauthorized):
pass
| 18.5 | 59 | 0.792793 | 12 | 111 | 7.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144144 | 111 | 5 | 60 | 22.2 | 0.926316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
5a8d364823d3a4817e9c164d642ca811d084e47e | 251 | py | Python | src/icemac/ab/document/browser/startpage.py | icemac/icemac.ab.document | c5e9a68ca509b5ea59a84cf8a8c50f570a83a9eb | [
"BSD-2-Clause"
] | 1 | 2020-02-25T17:04:39.000Z | 2020-02-25T17:04:39.000Z | src/icemac/ab/document/browser/startpage.py | icemac/icemac.ab.document | c5e9a68ca509b5ea59a84cf8a8c50f570a83a9eb | [
"BSD-2-Clause"
] | null | null | null | src/icemac/ab/document/browser/startpage.py | icemac/icemac.ab.document | c5e9a68ca509b5ea59a84cf8a8c50f570a83a9eb | [
"BSD-2-Clause"
] | null | null | null | from icemac.addressbook.i18n import _
import icemac.addressbook.browser.addressbook.startpage
documents = icemac.addressbook.browser.addressbook.startpage.StartpageData(
'icemac.ab.document.interfaces.IRootFolder', 'index.html', _('Documents'))
| 35.857143 | 78 | 0.812749 | 26 | 251 | 7.769231 | 0.576923 | 0.252475 | 0.237624 | 0.346535 | 0.435644 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008584 | 0.071713 | 251 | 6 | 79 | 41.833333 | 0.858369 | 0 | 0 | 0 | 0 | 0 | 0.239044 | 0.163347 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
5a8ded4e98422d381173ade0a827cc3c1ca7708b | 42 | py | Python | dataset/src/visualization.py | hangwudy/pytorch_tutorial | 857b128253bd1e2bd30cb85e995c757e5acbb3a2 | [
"MIT"
] | null | null | null | dataset/src/visualization.py | hangwudy/pytorch_tutorial | 857b128253bd1e2bd30cb85e995c757e5acbb3a2 | [
"MIT"
] | null | null | null | dataset/src/visualization.py | hangwudy/pytorch_tutorial | 857b128253bd1e2bd30cb85e995c757e5acbb3a2 | [
"MIT"
] | null | null | null | # TODO visualize the data and annotations
| 21 | 41 | 0.809524 | 6 | 42 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 42 | 1 | 42 | 42 | 0.971429 | 0.928571 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 1 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5afc81842d4aeecd786198346dd41e9d7fffaf91 | 36 | py | Python | wagtail/core/hooks.py | stevedya/wagtail | 52e5abfe62547cdfd90ea7dfeb8bf5a52f16324c | [
"BSD-3-Clause"
] | 1 | 2022-02-09T05:25:30.000Z | 2022-02-09T05:25:30.000Z | wagtail/core/hooks.py | stevedya/wagtail | 52e5abfe62547cdfd90ea7dfeb8bf5a52f16324c | [
"BSD-3-Clause"
] | null | null | null | wagtail/core/hooks.py | stevedya/wagtail | 52e5abfe62547cdfd90ea7dfeb8bf5a52f16324c | [
"BSD-3-Clause"
] | null | null | null | from wagtail.hooks import * # noqa
| 18 | 35 | 0.722222 | 5 | 36 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.194444 | 36 | 1 | 36 | 36 | 0.896552 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5afcb94095a2e0b45ec268a23ede74f277a087e9 | 18,534 | py | Python | tests/syntax/simple_expression/test_math_ops.py | PowerOlive/mindspore | bda20724a94113cedd12c3ed9083141012da1f15 | [
"Apache-2.0"
] | 3,200 | 2020-02-17T12:45:41.000Z | 2022-03-31T20:21:16.000Z | tests/syntax/simple_expression/test_math_ops.py | zimo-geek/mindspore | 665ec683d4af85c71b2a1f0d6829356f2bc0e1ff | [
"Apache-2.0"
] | 176 | 2020-02-12T02:52:11.000Z | 2022-03-28T22:15:55.000Z | tests/syntax/simple_expression/test_math_ops.py | zimo-geek/mindspore | 665ec683d4af85c71b2a1f0d6829356f2bc0e1ff | [
"Apache-2.0"
] | 621 | 2020-03-09T01:31:41.000Z | 2022-03-30T03:43:19.000Z | # Copyright 2021 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
""" test math ops """
import numpy as np
import mindspore.context as context
import mindspore.nn as nn
from mindspore import Tensor
from mindspore.ops import operations as P
context.set_context(mode=context.GRAPH_MODE)
class Add(nn.Cell):
def __init__(self):
super(Add, self).__init__()
self.add = P.Add()
def construct(self, x, y):
z = self.add(x, y)
return z
def test_number_add_number():
input_x = 0.1
input_y = -3.2
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = -3.1
assert result1 == expect
assert result2 == expect
def test_tensor_add_tensor_int8():
input_x = Tensor(np.ones(shape=[3])).astype(np.int8)
input_y = Tensor(np.zeros(shape=[3])).astype(np.int8)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_int16():
input_x = Tensor(np.ones(shape=[3])).astype(np.int16)
input_y = Tensor(np.zeros(shape=[3])).astype(np.int16)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_int32():
input_x = Tensor(np.ones(shape=[3])).astype(np.int32)
input_y = Tensor(np.zeros(shape=[3])).astype(np.int32)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_int64():
input_x = Tensor(np.ones(shape=[3])).astype(np.int64)
input_y = Tensor(np.zeros(shape=[3])).astype(np.int64)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_uint8():
input_x = Tensor(np.ones(shape=[3])).astype(np.uint8)
input_y = Tensor(np.zeros(shape=[3])).astype(np.uint8)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_uint16():
input_x = Tensor(np.ones(shape=[3])).astype(np.uint16)
input_y = Tensor(np.zeros(shape=[3])).astype(np.uint16)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_uint32():
input_x = Tensor(np.ones(shape=[3])).astype(np.uint32)
input_y = Tensor(np.zeros(shape=[3])).astype(np.uint32)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_uint64():
input_x = Tensor(np.ones(shape=[3])).astype(np.uint64)
input_y = Tensor(np.zeros(shape=[3])).astype(np.uint64)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_float16():
input_x = Tensor(np.ones(shape=[3])).astype(np.float16)
input_y = Tensor(np.zeros(shape=[3])).astype(np.float16)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_float32():
input_x = Tensor(np.ones(shape=[3])).astype(np.float32)
input_y = Tensor(np.zeros(shape=[3])).astype(np.float32)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_tensor_float64():
input_x = Tensor(np.ones(shape=[3])).astype(np.float64)
input_y = Tensor(np.zeros(shape=[3])).astype(np.float64)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3])
assert np.all(result1.asnumpy() == expect)
assert np.all(result2.asnumpy() == expect)
def test_tensor_add_number():
input_x = Tensor(np.ones(shape=[3])).astype(np.float32)
input_y = -0.4
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = np.ones(shape=[3]) * 0.6
assert np.all(result1.asnumpy() == expect.astype(np.float32))
assert np.all(result2.asnumpy() == expect.astype(np.float32))
def test_tuple_add_tuple():
input_x = (Tensor(np.ones(shape=[3])).astype(np.float32))
input_y = (Tensor(np.ones(shape=[3])).astype(np.float32) * 2)
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = (np.ones(shape=[3]) * 3)
assert np.all(result1.asnumpy() == expect.astype(np.float32))
assert np.all(result2.asnumpy() == expect.astype(np.float32))
def test_tuple_add_tuple_shape():
input_x = (Tensor(np.ones(shape=[3])).astype(np.float32))
    input_y = (Tensor(np.ones(shape=[1])).astype(np.float32) * 2)  # shape [1] broadcasts against [3]
result1 = input_x + input_y
add_net = Add()
result2 = add_net(input_x, input_y)
expect = (np.ones(shape=[3]) * 3)
assert np.all(result1.asnumpy() == expect.astype(np.float32))
assert np.all(result2.asnumpy() == expect.astype(np.float32))
def test_string_add_string():
input_x = "string111_"
input_y = "add_string222"
result = input_x + input_y
expect = "string111_add_string222"
assert result == expect
def test_list_add_list():
input_x = [1, 3, 5, 7, 9]
input_y = ["0", "6"]
result = input_x + input_y
expect = [1, 3, 5, 7, 9, "0", "6"]
assert result == expect
class Sub(nn.Cell):
def __init__(self):
super(Sub, self).__init__()
self.sub = P.Sub()
def construct(self, x, y):
z = self.sub(x, y)
return z
def test_number_sub_number():
input_x = 10.11
input_y = 902
result1 = input_x - input_y
sub_net = Sub()
result2 = sub_net(input_x, input_y)
expect = -891.89
assert np.all(result1 == expect)
assert np.all(result2 == expect)
def test_tensor_sub_tensor():
input_x = Tensor(np.array([[2, 2], [3, 3]]))
input_y = Tensor(np.array([[1, 2], [-3, 3]]))
result1 = input_x - input_y
sub_net = Sub()
result2 = sub_net(input_x, input_y)
expect = Tensor(np.array([[1, 0], [6, 0]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_tensor_sub_number():
input_x = Tensor(np.array([[2, 2], [3, 3]]))
input_y = -2
result1 = input_x - input_y
sub_net = Sub()
result2 = sub_net(input_x, input_y)
expect = Tensor(np.array([[4, 4], [5, 5]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_number_sub_tensor():
input_x = Tensor(np.array([[2, 2], [3, 3]]))
input_y = -2
    result1 = input_y - input_x
    sub_net = Sub()
    result2 = sub_net(input_y, input_x)
expect = Tensor(np.array([[-4, -4], [-5, -5]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
class Mul(nn.Cell):
def __init__(self):
super(Mul, self).__init__()
self.mul = P.Mul()
def construct(self, x, y):
z = self.mul(x, y)
return z
def test_number_mul_number():
input_x = 4.91
input_y = 0.16
result1 = input_x * input_y
mul_net = Mul()
result2 = mul_net(input_x, input_y)
expect = 0.7856
diff1 = result1 - expect
diff2 = result2 - expect
error = 1.0e-6
assert np.all(diff1 < error)
assert np.all(-diff1 < error)
assert np.all(diff2 < error)
assert np.all(-diff2 < error)
def test_tensor_mul_tensor():
input_x = Tensor(np.array([[2, 2], [3, 3]])).astype(np.float32)
input_y = Tensor(np.array([[1, 2], [3, 1]])).astype(np.float32)
result1 = input_x * input_y
mul_net = Mul()
result2 = mul_net(input_x, input_y)
expect = Tensor(np.array([[2, 4], [9, 3]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_tensor_mul_number():
input_x = Tensor(np.array([[2, 2], [3, 3]])).astype(np.float32)
input_y = -1
result1 = input_x * input_y
mul_net = Mul()
result2 = mul_net(input_x, input_y)
expect = Tensor(np.array([[-2, -2], [-3, -3]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_number_mul_tensor():
input_x = Tensor(np.array([[2, 2], [3, 3]])).astype(np.float32)
input_y = -1
    result1 = input_y * input_x
    mul_net = Mul()
    result2 = mul_net(input_y, input_x)
expect = Tensor(np.array([[-2, -2], [-3, -3]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
class Div(nn.Cell):
def __init__(self):
super(Div, self).__init__()
self.div = P.Div()
def construct(self, x, y):
z = self.div(x, y)
return z
def test_number_div_number():
input_x = 4
input_y = -1
result1 = input_x / input_y
div_net = Div()
result2 = div_net(input_x, input_y)
expect = -4
assert np.all(result1 == expect)
assert np.all(result2 == expect)
def test_tensor_div_tensor():
input_x = Tensor(np.array([[2, 2], [3, 3]])).astype(np.float32)
input_y = Tensor(np.array([[1, 2], [3, 1]])).astype(np.float32)
result1 = input_x / input_y
div_net = Div()
result2 = div_net(input_x, input_y)
expect = Tensor(np.array([[2, 1], [1, 3]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_tensor_div_number():
input_x = Tensor(np.array([[2, 2], [3, 3]])).astype(np.float32)
input_y = 2
result1 = input_x / input_y
div_net = Div()
result2 = div_net(input_x, input_y)
expect = Tensor(np.array([[1, 1], [1.5, 1.5]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_number_div_tensor():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = 2
    result1 = input_y / input_x
    div_net = Div()
    result2 = div_net(input_y, input_x)
expect = Tensor(np.array([[1, 1], [0.5, 0.5]]))
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
class Mod(nn.Cell):
def __init__(self):
super(Mod, self).__init__()
self.mod = P.Mod()
def construct(self, x, y):
z = self.mod(x, y)
return z
def test_number_mod_number():
input_x = 19
input_y = 2
result1 = input_x % input_y
mod_net = Mod()
result2 = mod_net(input_x, input_y)
expect = 1
assert np.all(result1 == expect)
assert np.all(result2 == expect)
def test_tensor_mod_tensor():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
result1 = input_x % input_y
mod_net = Mod()
result2 = mod_net(input_x, input_y)
expect = Tensor(np.array([[0, 0], [0, 0]])).astype(np.float32)
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_tensor_mod_number():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = -1
result1 = input_x % input_y
mod_net = Mod()
result2 = mod_net(input_x, input_y)
expect = Tensor(np.array([[0, 0], [0, 0]])).astype(np.float32)
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_number_mod_tensor():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = 5
    result1 = input_y % input_x
    mod_net = Mod()
    result2 = mod_net(input_y, input_x)
expect = Tensor(np.array([[1, 1], [1, 1]])).astype(np.float32)
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
class Pow(nn.Cell):
def __init__(self):
super(Pow, self).__init__()
self.pow = P.Pow()
def construct(self, x, y):
z = self.pow(x, y)
return z
def test_number_pow_number():
input_x = 2
input_y = 5
result1 = input_x ** input_y
pow_net = Pow()
result2 = pow_net(input_x, input_y)
expect = 32
assert np.all(result1 == expect)
assert np.all(result2 == expect)
def test_tensor_pow_tensor():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
result1 = input_x ** input_y
pow_net = Pow()
result2 = pow_net(input_x, input_y)
expect = Tensor(np.array([[4, 4], [256, 256]])).astype(np.float32)
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_tensor_pow_number():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = 3
result1 = input_x ** input_y
pow_net = Pow()
result2 = pow_net(input_x, input_y)
expect = Tensor(np.array([[8, 8], [64, 64]])).astype(np.float32)
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_number_pow_tensor():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = 3
    result1 = input_y ** input_x
    pow_net = Pow()
    result2 = pow_net(input_y, input_x)
expect = Tensor(np.array([[9, 9], [81, 81]])).astype(np.float32)
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
class FloorDiv(nn.Cell):
def __init__(self):
super(FloorDiv, self).__init__()
self.floordiv = P.FloorDiv()
def construct(self, x, y):
z = self.floordiv(x, y)
return z
def test_number_floordiv_number():
input_x = 2
input_y = 5
result1 = input_x // input_y
floordiv_net = FloorDiv()
result2 = floordiv_net(input_x, input_y)
expect = 0
assert np.all(result1 == expect)
assert np.all(result2 == expect)
def test_tensor_floordiv_tensor():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = Tensor(np.array([[1, 2], [-2, 4]])).astype(np.float32)
result1 = input_x // input_y
floordiv_net = FloorDiv()
result2 = floordiv_net(input_x, input_y)
expect = Tensor(np.array([[2, 1], [-2, 1]])).astype(np.float32)
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())
def test_tensor_floordiv_number():
input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
input_y = 3
result1 = input_x // input_y
floordiv_net = FloorDiv()
result2 = floordiv_net(input_x, input_y)
expect = Tensor(np.array([[0, 0], [1, 1]])).astype(np.float32)
assert np.all(result1.asnumpy() == expect.asnumpy())
assert np.all(result2.asnumpy() == expect.asnumpy())

def test_number_floordiv_tensor():
    input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
    input_y = 3
    # number // tensor: the scalar is the dividend
    result1 = input_y // input_x
    floordiv_net = FloorDiv()
    result2 = floordiv_net(input_y, input_x)
    expect = Tensor(np.array([[1, 1], [0, 0]])).astype(np.float32)
    assert np.all(result1.asnumpy() == expect.asnumpy())
    assert np.all(result2.asnumpy() == expect.asnumpy())

class FloorMod(nn.Cell):
    def __init__(self):
        super(FloorMod, self).__init__()
        self.floormod = P.FloorMod()

    def construct(self, x, y):
        z = self.floormod(x, y)
        return z

def test_number_floormod_number():
    input_x = 2
    input_y = 5
    result1 = input_x % input_y
    floormod_net = FloorMod()
    result2 = floormod_net(input_x, input_y)
    expect = 2
    assert np.all(result1 == expect)
    assert np.all(result2 == expect)

def test_tensor_floormod_tensor():
    input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
    input_y = Tensor(np.array([[1, 2], [-2, 4]])).astype(np.float32)
    result1 = input_x % input_y
    floormod_net = FloorMod()
    result2 = floormod_net(input_x, input_y)
    # floor-mod semantics: 4 % -2 == 0, so every entry is 0 for these inputs
    expect = Tensor(np.array([[0, 0], [0, 0]])).astype(np.float32)
    assert np.all(result1.asnumpy() == expect.asnumpy())
    assert np.all(result2.asnumpy() == expect.asnumpy())

def test_tensor_floormod_number():
    input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
    input_y = 3
    result1 = input_x % input_y
    floormod_net = FloorMod()
    result2 = floormod_net(input_x, input_y)
    expect = Tensor(np.array([[2, 2], [1, 1]])).astype(np.float32)
    assert np.all(result1.asnumpy() == expect.asnumpy())
    assert np.all(result2.asnumpy() == expect.asnumpy())

def test_number_floormod_tensor():
    input_x = Tensor(np.array([[2, 2], [4, 4]])).astype(np.float32)
    input_y = 3
    # number % tensor: the scalar is the dividend
    result1 = input_y % input_x
    floormod_net = FloorMod()
    result2 = floormod_net(input_y, input_x)
    expect = Tensor(np.array([[1, 1], [3, 3]])).astype(np.float32)
    assert np.all(result1.asnumpy() == expect.asnumpy())
    assert np.all(result2.asnumpy() == expect.asnumpy())
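The expected values in the floor-division and floor-mod tests above follow floor semantics (rounding toward negative infinity), which is easy to misread for negative divisors. A quick standalone check with plain NumPy (no MindSpore required; the arrays mirror the test inputs):

```python
import numpy as np

x = np.array([[2, 2], [4, 4]], dtype=np.float32)
y = np.array([[1, 2], [-2, 4]], dtype=np.float32)

floordiv = np.floor_divide(x, y)  # [[2, 1], [-2, 1]]: 4 // -2 == -2
floormod = np.mod(x, y)           # [[0, 0], [0, 0]]: 4 % -2 == 0
```

Python's scalar `//` and `%` operators agree with these results; the divergence from C-style truncation only appears when operand signs differ, e.g. `-5 // 2 == -3`.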
# gammapy/cube/__init__.py (grburgess/gammapy, BSD-3-Clause)
# Licensed under a 3-clause BSD style license - see LICENSE.rst
from .core import *
from .exposure import *
from .utils import *

# jc_decrypter/__init__.py (cesarbruschetta/julio-cesar-decrypter, BSD-2-Clause)
import sys
import os
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from jc_decrypter.utils import logger
# SET LOGGER
logger.configure_logger()
# py_api/nest.py (ImmaculateObsession/nest, MIT)
import requests
NEST_URL = 'https://www.inkpebble.com/api/comic/'


class Nest(object):
    auth = ()
    comic_list = 'list/'
    tag_list = 'tag/list/'
    tag_detail_url = 'tag/'
    panel_list = 'panel/list/'
    panel_detail_url = 'panel/'

    def __init__(self, url=NEST_URL, user=None, password=None):
        self.url = url
        if user and password:
            self.auth = (user, password)

    def get_comics(self):
        url = "%s%s" % (self.url, self.comic_list)
        if self.auth:
            return requests.get(url, auth=self.auth).json()
        return requests.get(url).json()

    def get_tags(self):
        url = "%s%s" % (self.url, self.tag_list)
        if self.auth:
            return requests.get(url, auth=self.auth).json()
        return requests.get(url).json()

    def get_panels(self):
        url = "%s%s" % (self.url, self.panel_list)
        if self.auth:
            return requests.get(url, auth=self.auth).json()
        return requests.get(url).json()

    def comic_detail(self, id):
        url = "%s%s/" % (self.url, id)
        if self.auth:
            return requests.get(url, auth=self.auth).json()
        return requests.get(url).json()

    def comic_save(self, id, data=None):
        if not self.auth:
            return {'error': 'You do not have permissions for that'}
        url = "%s%s/" % (self.url, id)
        r = requests.patch(url, data=data, auth=self.auth)
        return r.json()

    def tag_detail(self, id):
        url = "%s%s%s/" % (self.url, self.tag_detail_url, id)
        if self.auth:
            return requests.get(url, auth=self.auth).json()
        return requests.get(url).json()

    def tag_save(self, id, data=None):
        if not self.auth:
            return {'error': 'You do not have permissions for that'}
        url = "%s%s%s/" % (self.url, self.tag_detail_url, id)
        r = requests.patch(url, data=data, auth=self.auth)
        return r.json()
# tests/conftest.py (uibiv/demo, MIT)
import os
import pytest

if os.getenv('TEST_HTTPBIN_REAL') == '1':
    @pytest.fixture
    def httpbin_both():
        return 'http://httpbin.org'
# Trans2D/models/__init__.py (urielsinger/Trans2D, Apache-2.0)
from .SequenceModel import SequenceTransformerModule
from .Tokenizer import TensorTokenizer

# radio_gyms/visualizers/__init__.py (intelek-ai/radio-gyms, MIT)
from .window import VisualizerWindow as Window
# test/hours/modify.py (sbutler/spotseeker_server, Apache-2.0)
""" Copyright 2012, 2013 UW Information Technology, University of Washington
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from django.conf import settings
from django.test import TestCase
from django.test.client import Client
from spotseeker_server.models import Spot, SpotAvailableHours
import datetime
class SpotHoursModifyTest(TestCase):
    """ Tests that when open hours are submitted that overlap with other open hours for a Spot, the previous Spot hours are adjusted rather than having multiple AvailableHours hanging around.
    """

    def setUp(self):
        spot = Spot.objects.create(name="This spot has overlapping hours")
        self.spot = spot

    def test_early_begin_time(self):
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours1.start_time = "08:00"
        hours1.save()
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(datetime.time(8, 0), new_hours.start_time, "Start time is the same")

    def test_late_begin_time(self):
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours1.start_time = "11:00"
        hours1.save()
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(datetime.time(11, 0), new_hours.start_time, "Start time is the same")

    def test_early_end_time(self):
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours1.end_time = "11:00"
        hours1.save()
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(datetime.time(11, 0), new_hours.end_time, "End time is the same")

    def test_late_end_time(self):
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours1.end_time = "13:00"
        hours1.save()
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(datetime.time(13, 0), new_hours.end_time, "End time is the same")

    def test_early_begin_early_end(self):
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours1.start_time = "08:00"
        hours1.end_time = "11:00"
        hours1.save()
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(datetime.time(8, 0), new_hours.start_time, "Start time is the same")
        self.assertEquals(datetime.time(11, 0), new_hours.end_time, "End time is the same")

    def test_early_begin_late_end(self):
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours1.start_time = "08:00"
        hours1.end_time = "13:00"
        hours1.save()
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(datetime.time(8, 0), new_hours.start_time, "Start time is the same")
        self.assertEquals(datetime.time(13, 0), new_hours.end_time, "End time is the same")

    def test_late_begin_late_end(self):
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours1.start_time = "10:00"
        hours1.end_time = "13:00"
        hours1.save()
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(datetime.time(10, 0), new_hours.start_time, "Start time is the same")
        self.assertEquals(datetime.time(13, 0), new_hours.end_time, "End time is the same")

    def test_late_begin_early_end(self):
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours1.start_time = "10:00"
        hours1.end_time = "11:00"
        hours1.save()
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(datetime.time(10, 0), new_hours.start_time, "Start time is the same")
        self.assertEquals(datetime.time(11, 0), new_hours.end_time, "End time is the same")

    def test_early_overlap(self):
        """ Tests adding SpotAvailableHours with a start time earlier than an existing start, but end within the current open hours.
        """
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours2 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="05:00", end_time="8:00")
        hours2.end_time = "10:00"
        hours2.save()
        # creating hours2 should get those times merged into hours1
        hours_obj_count = self.spot.spotavailablehours_set.values_list().count()
        self.assertEquals(hours_obj_count, 1, "Only one SpotAvailableHours object")
        # check to see that start and end times are correct
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(hours2.start_time, new_hours.start_time, "Start time is the same")
        self.assertEquals(hours1.end_time, new_hours.end_time, "End time is the same")

    def test_late_overlap(self):
        """ Tests adding SpotAvailableHours with a start time within current hours, but an end time later than the current open hours.
        """
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours2 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="13:00", end_time="15:00")
        hours2.start_time = "11:00"
        hours2.save()
        # creating hours2 should get those times merged into hours1
        hours_obj_count = self.spot.spotavailablehours_set.values_list().count()
        self.assertEquals(hours_obj_count, 1, "Only one SpotAvailableHours object")
        # check to see that start and end times are correct
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(hours1.start_time, new_hours.start_time, "Start time is the same")
        self.assertEquals(hours2.end_time, new_hours.end_time, "End time is the same")

    def test_total_overlap(self):
        """ Tests adding SpotAvailableHours with an earlier start time and a later end time.
        """
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="09:00", end_time="12:00")
        hours2 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="07:00", end_time="8:00")
        hours2.end_time = "14:00"
        hours2.save()
        # creating hours2 should get those times merged into hours1
        hours_obj_count = self.spot.spotavailablehours_set.values_list().count()
        self.assertEquals(hours_obj_count, 1, "Only one SpotAvailableHours object")
        # check to see that start and end times are correct
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(hours2.start_time, new_hours.start_time, "Start time is the same")
        self.assertEquals(hours2.end_time, new_hours.end_time, "End time is the same")

    def test_underlap(self):
        """ Tests adding SpotAvailableHours with a start and end inside of currently open hours.
        """
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="08:00", end_time="14:00")
        hours2 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="05:00", end_time="7:00")
        hours2.start_time = "09:00"
        hours2.end_time = "13:00"
        hours2.save()
        # creating hours2 should get those times merged into hours1
        hours_obj_count = self.spot.spotavailablehours_set.values_list().count()
        self.assertEquals(hours_obj_count, 1, "Only one SpotAvailableHours object")
        # check to see that start and end times are correct
        new_hours = self.spot.spotavailablehours_set.all()[0]
        self.assertEquals(hours1.start_time, new_hours.start_time, "Start time is the same")
        self.assertEquals(hours1.end_time, new_hours.end_time, "End time is the same")

    def test_no_overlap(self):
        """ Tests adding another available hours object that should not get merged into an existing one.
        """
        hours1 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="08:00", end_time="12:00")
        hours2 = SpotAvailableHours.objects.create(spot=self.spot, day="m", start_time="14:00", end_time="18:00")
        hours2.start_time = "13:00"
        hours2.end_time = "17:00"
        hours2.save()
        # creating hours2 should leave the two non-overlapping ranges separate
        hours_obj_count = self.spot.spotavailablehours_set.values_list().count()
        self.assertEquals(hours_obj_count, 2, "Two SpotAvailableHours objects")
        # check to see that start and end times are correct
        new_hours = self.spot.spotavailablehours_set.all()[0]
        new_hours2 = self.spot.spotavailablehours_set.all()[1]
        self.assertEquals(hours1.start_time, new_hours.start_time, "Start time is the same")
        self.assertEquals(hours1.end_time, new_hours.end_time, "End time is the same")
        self.assertEquals(hours2.start_time, new_hours2.start_time, "Start time is the same")
        self.assertEquals(hours2.end_time, new_hours2.end_time, "End time is the same")
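The merging behavior these tests exercise can be summarized independently of Django. Below is a minimal sketch (not the spotseeker_server implementation) of the interval-merge rule, with cases matching the tests above:

```python
def merge(intervals):
    # Collapse overlapping (start, end) intervals into single spans,
    # mirroring how overlapping SpotAvailableHours are merged.
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            # Overlap with the previous span: extend it in place.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

assert merge([(9, 12), (5, 10)]) == [(5, 12)]             # early overlap
assert merge([(9, 12), (11, 15)]) == [(9, 15)]            # late overlap
assert merge([(8, 14), (9, 13)]) == [(8, 14)]             # underlap
assert merge([(8, 12), (13, 17)]) == [(8, 12), (13, 17)]  # no overlap
```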
# nighres/laminar/__init__.py (marcobarilari/nighres, Apache-2.0)
from nighres.laminar.volumetric_layering import volumetric_layering
from nighres.laminar.profile_sampling import profile_sampling
from nighres.laminar.profile_averaging import profile_averaging
from nighres.laminar.profile_meshing import profile_meshing
from nighres.laminar.laminar_iterative_smoothing import laminar_iterative_smoothing
from nighres.laminar.laminar_regional_approximation import laminar_regional_approximation
# plugins/simple_footnotes/__init__.py (Naereen/cuisine, MIT)
from .simple_footnotes import *  # NOQA

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# example.py (rafa-rod/pytrendseries, MIT)
from random import randrange
import pandas as pd
import numpy as np
from pytrendseries import detecttrend
from pytrendseries import maxtrend
from pytrendseries import vizplot
'''Testing examples with integer random data'''


def random_integer_and_continuously_increasing_data(janela, trend, limit):  # pragma: no cover
    random_series = [i + randrange(10) for i in range(1, 101)]
    random_series = pd.DataFrame(random_series)
    random_series["date"] = sorted(pd.to_datetime(np.random.randint(1, 101, size=100), unit='d').tolist())
    random_series.columns = ["random", "date"]
    random_series = random_series[["date", "random"]]
    random_series["random"].plot()
    getTrend3, quantile = detecttrend.detecttrend(random_series, trend=trend, limit=limit,
                                                  window=janela)
    vizplot.plot_trend(random_series, getTrend3, "random", trend)
    # use a distinct local name so the imported `maxtrend` module is not shadowed
    max_trend = maxtrend.getmaxtrend(random_series, "random", trend)
    vizplot.plot_maxdrawdown(random_series, max_trend, "random", trend, style="shadow")
    return getTrend3, quantile, max_trend


janela, trend, limit = 3, "downtrend", 2
random_integer_and_continuously_increasing_data(janela, trend, limit)
janela, trend, limit = 3, "uptrend", 2
random_integer_and_continuously_increasing_data(janela, trend, limit)

def random_integer_and_continuously_decreasing_data(janela, trend, limit):  # pragma: no cover
    random_series = [i + randrange(10) for i in reversed(range(1, 101))]
    random_series = pd.DataFrame(random_series)
    random_series["date"] = sorted(pd.to_datetime(np.random.randint(1, 101, size=100), unit='d').tolist())
    random_series.columns = ["random", "date"]
    random_series = random_series[["date", "random"]]
    random_series["random"].plot()
    getTrend3, quantile = detecttrend.detecttrend(random_series, trend=trend, limit=limit,
                                                  window=janela)
    vizplot.plot_trend(random_series, getTrend3, "random", trend)
    # use a distinct local name so the imported `maxtrend` module is not shadowed
    max_trend = maxtrend.getmaxtrend(random_series, "random", trend)
    vizplot.plot_maxdrawdown(random_series, max_trend, "random", trend, style="shadow")
    return getTrend3, quantile, max_trend


'''Testing downtrend with integer values in a series with no uptrend'''
janela, trend, limit = 3, "downtrend", 2
random_integer_and_continuously_decreasing_data(janela, trend, limit)
'''Testing uptrend with integer values in a series with no uptrend'''
janela, trend, limit = 3, "uptrend", 2
random_integer_and_continuously_decreasing_data(janela, trend, limit)
'''Testing examples with float random data'''


def sine_random_values_testing(janela, trend, limit):  # pragma: no cover
    x = np.linspace(1, 101)

    def f(x):
        return np.sin(x) + np.random.normal(scale=0.1, size=len(x))

    random_series = pd.DataFrame(f(x))
    random_series["date"] = sorted(pd.to_datetime(np.random.randint(1, 101, size=50), unit='d').tolist())
    random_series.columns = ["random", "date"]
    random_series = random_series[["date", "random"]]
    random_series["random"].plot()
    getTrend3, quantile = detecttrend.detecttrend(random_series, trend=trend, limit=limit,
                                                  window=janela)
    vizplot.plot_trend(random_series, getTrend3, "random", trend)
    # use a distinct local name so the imported `maxtrend` module is not shadowed
    max_trend = maxtrend.getmaxtrend(random_series, "random", trend)
    vizplot.plot_maxdrawdown(random_series, max_trend, "random", trend, style="shadow")
    return getTrend3, quantile, max_trend


'''Testing downtrend with float values in a series without long period trend'''
janela, trend, limit = 2, "downtrend", 2
sine_random_values_testing(janela, trend, limit)
'''Testing downtrend with float values in a series without trend'''
janela, trend, limit = 20, "downtrend", 12
sine_random_values_testing(janela, trend, limit)

# sample/pytest/test_root.py (parzingis/cricket, BSD-3-Clause)

def test_at_root():
    pass


# crowdsource/handlers/validate.py (texodus/crowdsource, Apache-2.0)
import logging
import six
from ..utils import parse_body
from ..enums import CompetitionType
from ..persistence.models import Competition

def validate_competition_get(handler):
    data = parse_body(handler.request)
    data['competition_id'] = data.get('competition_id', handler.get_argument('competition_id', ()))
    data['client_id'] = data.get('client_id', handler.get_argument('client_id', ()))
    data['type'] = data.get('type', handler.get_argument('type', ()))
    if isinstance(data['competition_id'], six.string_types):
        data['competition_id'] = str(data['competition_id']).split(',')
    if isinstance(data['client_id'], six.string_types):
        data['client_id'] = str(data['client_id']).split(',')
    if isinstance(data['type'], six.string_types):
        data['type'] = list(map(lambda x: CompetitionType(x), str(data['type']).split(',')))
    logging.info("GET COMPETITIONS")
    return data

def validate_competition_post(handler):
    data = parse_body(handler.request)
    if not data.get('competition_id'):
        handler._set_400('Client no id')
    if int(data.get('competition_id', '-1')) not in handler._clients:
        handler._set_400('Client not registered')
    if data.get('spec', None) is None:
        handler._set_400('Competition malformed')
    return data

def validate_submission_get(handler):
    data = parse_body(handler.request)
    data['submission_id'] = data.get('id', handler.get_argument('submission_id', ()))
    data['client_id'] = data.get('client_id', handler.get_argument('client_id', ()))
    data['competition_id'] = data.get('competition_id', handler.get_argument('competition_id', ()))
    data['type'] = data.get('type', handler.get_argument('type', ()))
    if isinstance(data['submission_id'], six.string_types):
        data['submission_id'] = str(data['submission_id']).split(',')
    if isinstance(data['client_id'], six.string_types):
        data['client_id'] = str(data['client_id']).split(',')
    if isinstance(data['competition_id'], six.string_types):
        data['competition_id'] = str(data['competition_id']).split(',')
    if isinstance(data['type'], six.string_types):
        data['type'] = list(map(lambda x: CompetitionType(x), str(data['type']).split(',')))
    logging.info("GET SUBMISSIONS")
    return data

def validate_submission_post(handler):
    data = parse_body(handler.request)
    if not data.get('client_id'):
        handler._set_400('Client no id')
    if int(data.get('client_id', '-1')) not in handler._clients:
        handler._set_400('Client not registered')
    if not data.get('competition_id'):
        handler._set_400('Competition no id')
    with handler.session() as session:
        competition = session.query(Competition).filter_by(competition_id=int(data.get('competition_id'))).first()
    if competition is None:
        handler._set_400('Competition not registered')
    if not data.get('submission'):
        handler._set_400('Client provided no submission')
    logging.info("POST SUBMISSION %s", data.get('submission_id'))
    return data

def validate_leaderboard_get(handler):
    data = parse_body(handler.request)
    data['submission_id'] = data.get('submission_id', handler.get_argument('submission_id', ()))
    data['client_id'] = data.get('client_id', handler.get_argument('client_id', ()))
    data['competition_id'] = data.get('competition_id', handler.get_argument('competition_id', ()))
    data['type'] = data.get('type', handler.get_argument('type', ()))
    if isinstance(data['submission_id'], six.string_types):
        data['submission_id'] = str(data['submission_id']).split(',')
    if isinstance(data['client_id'], six.string_types):
        data['client_id'] = str(data['client_id']).split(',')
    if isinstance(data['competition_id'], six.string_types):
        data['competition_id'] = str(data['competition_id']).split(',')
    if isinstance(data['type'], six.string_types):
        data['type'] = list(map(lambda x: CompetitionType(x), str(data['type']).split(',')))
    logging.info("GET LEADERBOARD")
    return data
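All of the GET validators above repeat the same pattern: a query parameter may arrive as a comma-separated string and is fanned out into a list. Here is a standalone sketch of that pattern in isolation (the helper name `split_param` is illustrative, not part of the crowdsource codebase):

```python
def split_param(value):
    # Comma-separated query values such as "1,2,3" become ['1', '2', '3'];
    # anything already iterable (tuple, list) is passed through as a list.
    if isinstance(value, str):
        return value.split(',')
    return list(value)

assert split_param('1,2,3') == ['1', '2', '3']
assert split_param(()) == []
```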

# pathfinder/__init__.py (spatialthoughts/plugins, Apache-2.0)
from .pathfinder import PathFinderPlugin
def classFactory(iface):
    return PathFinderPlugin(iface)
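QGIS loads a plugin package by importing it and calling its `classFactory(iface)` entry point with the application interface object. A minimal sketch of that pattern with stand-ins; `DummyIface` and `DemoPlugin` are illustrative names, not the real `QgisInterface` or `PathFinderPlugin`.

```python
class DummyIface:
    """Stand-in for the QgisInterface object QGIS passes to plugins."""

class DemoPlugin:
    """Stand-in plugin that keeps a reference to the interface it was given."""
    def __init__(self, iface):
        self.iface = iface

def classFactory(iface):
    # QGIS calls this once per plugin load and keeps the returned instance.
    return DemoPlugin(iface)

plugin = classFactory(DummyIface())
print(type(plugin).__name__)  # DemoPlugin
```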
# flex/datastructures/__init__.py (centergy/flex_common, MIT)
from .collections import *
# from .orderedset import *
from .enum import *
# tests/conftest.py (taoluo/simpy, MIT)
import pytest
import simpy
@pytest.fixture
def log():
    return []


@pytest.fixture
def env():
    return simpy.Environment()
# AWERA/eval/__init__.py (lthUniBonn/AWERA, MIT)
from . import optimal_harvesting_height
# CGATPipelines/pipeline_docs/pipeline_proj007/trackers/macs_replicated_liver_testes_shared_intervals.py (cdrakesmith/CGATPipelines, MIT)
from CGATReport.Tracker import *
from cpgReport import *


##########################################################################
class replicatedSharedIntervals(cpgTracker):
    """Summary stats of intervals called by the peak finder. """
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        data = self.getFirstRow(
            "SELECT COUNT(*) as number, round(AVG(end-start),0) as length FROM liver_testes_shared_intervals" % locals())
        return odict(list(zip(("Shared intervals", "mean_interval_length"), data)))


##########################################################################
class replicatedsharedIntervalLengths(cpgTracker):
    """Distribution of interval length. """
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        data = self.getValues(
            "SELECT (end-start) FROM liver_testes_shared_intervals" % locals())
        return {"length": data}
##########################################################################
class replicatedSharedIntervalTSS(cpgTracker):
    """Distribution of distance to closest TSS """
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        ANNOTATIONS_NAME = P['annotations_name']
        data = self.getValues('''SELECT closest_dist FROM liver_testes_shared_intervals u,
                                 liver_testes_merged_intervals i, liver_testes_merged_%(ANNOTATIONS_NAME)s_transcript_tss_distance t
                                 WHERE u.interval_id=i.interval_id
                                 AND t.gene_id=i.interval_id''' % locals())
        return {"distance": data}


##########################################################################
class replicatedSharedIntervalCpGDensity(cpgTracker):
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        data = self.getAll('''SELECT pCpG FROM liver_testes_shared_intervals u,
                              liver_testes_merged_intervals i, liver_testes_merged_composition c
                              WHERE u.contig=i.contig
                              AND u.start=i.start
                              AND c.gene_id=i.interval_id''' % locals())
        return data


##########################################################################
class replicatedSharedIntervalCpGObsExp(cpgTracker):
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        data = self.getAll('''SELECT CpG_ObsExp FROM liver_testes_shared_intervals u,
                              liver_testes_merged_intervals i, liver_testes_merged_composition c
                              WHERE u.contig=i.contig
                              AND u.start=i.start
                              AND c.gene_id=i.interval_id''' % locals())
        return data


##########################################################################
class replicatedSharedIntervalCpGNumber(cpgTracker):
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        data = self.getAll('''SELECT nCpG FROM liver_testes_shared_intervals u,
                              liver_testes_merged_intervals i, liver_testes_merged_composition c
                              WHERE u.contig=i.contig
                              AND u.start=i.start
                              AND c.gene_id=i.interval_id''' % locals())
        return data


##########################################################################
class replicatedSharedIntervalGCContent(cpgTracker):
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        data = self.getAll('''SELECT pGC FROM liver_testes_shared_intervals u,
                              liver_testes_merged_intervals i, liver_testes_merged_composition c
                              WHERE u.contig=i.contig
                              AND u.start=i.start
                              AND c.gene_id=i.interval_id''' % locals())
        return data
##########################################################################
class replicatedSharedIntervalTranscriptOverlap(featureOverlap):
    """return overlap of interval with protein-coding transcripts """
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        ANNOTATIONS_NAME = P['annotations_name']
        data = self.getValues("""SELECT count(distinct gene_id) as intervals FROM (
                                 SELECT gene_id,
                                 CASE WHEN tss_transcript_extended_pover1 > 0 THEN 'TSS'
                                 WHEN genes_pover1 > 0 THEN 'Gene'
                                 WHEN upstream_flank_pover1 >0 THEN 'Upstream'
                                 WHEN downstream_flank_pover1 >0 THEN 'Downstream'
                                 ELSE 'Intergenic'
                                 END AS feature_class
                                 FROM liver_testes_merged_%(ANNOTATIONS_NAME)s_overlap o, liver_testes_shared_intervals u
                                 WHERE u.interval_id=o.gene_id)
                                 group by feature_class
                                 order by feature_class asc""" % locals())
        return odict(list(zip(("Downstream", "Gene", "Intergenic", "TSS", "Upstream"), data)))


##########################################################################
class replicatedSharedIntervalGeneOverlap(featureOverlap):
    """return overlap of interval with protein-coding genes """
    mPattern = "liver_testes_shared_intervals$"

    def __call__(self, track, slice=None):
        ANNOTATIONS_NAME = P['annotations_name']
        data = self.getValues("""SELECT count(distinct gene_id) as intervals FROM (
                                 SELECT gene_id,
                                 CASE WHEN tss_gene_extended_pover1 > 0 THEN 'TSS'
                                 WHEN genes_pover1 > 0 THEN 'Gene'
                                 WHEN upstream_flank_pover1 >0 THEN 'Upstream'
                                 WHEN downstream_flank_pover1 >0 THEN 'Downstream'
                                 ELSE 'Intergenic'
                                 END AS feature_class
                                 FROM liver_testes_merged_%(ANNOTATIONS_NAME)s_overlap o, liver_testes_shared_intervals u
                                 WHERE u.interval_id=o.gene_id)
                                 group by feature_class
                                 order by feature_class asc""" % locals())
        return odict(list(zip(("Downstream", "Gene", "Intergenic", "TSS", "Upstream"), data)))
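The SQL `CASE` in the two overlap trackers above assigns each interval the first matching feature class, so the ordering encodes a priority. A hypothetical pure-Python rendering of that logic, for illustration only:

```python
def classify_interval(tss, gene, upstream, downstream):
    """Mirror the CASE ordering: TSS beats Gene beats Upstream beats Downstream."""
    if tss > 0:
        return 'TSS'
    if gene > 0:
        return 'Gene'
    if upstream > 0:
        return 'Upstream'
    if downstream > 0:
        return 'Downstream'
    return 'Intergenic'

print(classify_interval(1, 1, 0, 0))  # TSS wins over Gene
print(classify_interval(0, 0, 0, 0))  # no overlap at all: Intergenic
```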
# tests/_site/apps/customer/models.py (akiyoko/oscar_sandbox, BSD-3-Clause)
from oscar.apps.customer.models import *
# mayan/apps/storage/models.py (camerondphillips/MAYAN, Apache-2.0)
from django.db import models  # NOQA
# tests/sample_test_suites/nested/app_2/tests/test_sth.py (not-raspberry/pytest_reorder, MIT)
"""Tests not matching any substring out of 'unit', 'integration' and 'ui'."""
def test_a():
    pass


def test_b():
    pass


def test_c():
    pass
# quest/foraging.py (sodapopinsky/dfk, MIT)
QUEST_CONTRACT_ADDRESS = '0x3132c76acF2217646fB8391918D28a16bD8A8Ef4'
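The constant above is an EVM contract address: `0x` followed by 40 hexadecimal digits (the mixed case carries an EIP-55 checksum). A lightweight format-only sanity check, as an illustration; it does not verify the checksum:

```python
import re

# '0x' prefix plus exactly 40 hex digits; case-insensitive on the digits.
ADDRESS_RE = re.compile(r'^0x[0-9a-fA-F]{40}$')

QUEST_CONTRACT_ADDRESS = '0x3132c76acF2217646fB8391918D28a16bD8A8Ef4'
print(bool(ADDRESS_RE.match(QUEST_CONTRACT_ADDRESS)))  # True
```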
# elabjournal/elabjournal/ExperimentImages.py (matthijsbrouwer/elabjournal-python, Apache-2.0)
from .eLABJournalPager import *
class ExperimentImages(eLABJournalPager):
    pass
# modular_bot/runbot.py (wonwooseo/modular_bot, MIT)
from modular_bot import main_bot
main_bot.main()
# src/030-digit-fifth-powers/python/solve.py (xfbs/ProjectEulerRust, MIT)
import solver
print(solver.solve(5))
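The `solver` module is not shown here; a self-contained sketch of the underlying Project Euler 30 computation ("digit fifth powers": sum of all numbers equal to the sum of the n-th powers of their digits), for illustration only:

```python
def solve(power):
    # A k-digit number is at most k * 9**power; beyond 6 digits the digit-power
    # sum can no longer keep up, so 6 * 9**power bounds the search for power=5.
    limit = 6 * 9 ** power
    return sum(n for n in range(10, limit + 1)
               if n == sum(int(d) ** power for d in str(n)))

print(solve(4))  # 19316, the example given in the problem statement
print(solve(5))  # 443839
```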
# tflib/ops/__init__.py (AlexBlack2202/EigenGAN-Tensorflow, MIT)
from tflib.ops.ops import *
# pgtools/__init__.py (Jefferson5286/PgTools, MIT)
print('pgtools version 1.0.0dev1')
# tests/test_cdist.py (rafsaf/tw-complex, MIT)
from tw_complex.cdist import CDistAndKNN
import tests.utils as utils
def test_CDistAndKNN():
    utils.run_all_tests(CDistAndKNN, "CDistAndKNN", _precision=0.8, draw=True)
# candidate_matching/libs/LightGCN_TF/evaluator/__init__.py (jimzhu/OpenCTR-benchmarks, Apache-2.0)
# import eval_score_matrix_foldout
try:
    from evaluator.cpp.evaluate_foldout import eval_score_matrix_foldout
    print("eval_score_matrix_foldout with cpp")
except ImportError:
    from evaluator.python.evaluate_foldout import eval_score_matrix_foldout
    print("eval_score_matrix_foldout with python")
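The try/except above is a common pattern: prefer a compiled (cpp) implementation and fall back to a pure-Python one when the extension is unavailable. A sketch of the same pattern using the stdlib `json` module as the fallback; `_fast_json` is a hypothetical module name chosen so the import fails:

```python
try:
    import _fast_json as json  # hypothetical compiled backend; does not exist
    backend = 'cpp'
except ImportError:
    import json  # stdlib fallback with the same interface
    backend = 'python'

print(backend)                    # 'python' here, since _fast_json is absent
print(json.loads('{"ok": true}'))
```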
# python/testData/pyi/lineMarkers/SimilarForRuntimeMethod/a.py (jnthn/intellij-community, Apache-2.0)
class A:
    def method(self):
        pass
# babilim/training/callbacks/__init__.py (penguinmenac3/babilim, MIT)
from babilim.training.callbacks.base_callback import BaseCallback
from babilim.training.callbacks.checkpoint_callback import CheckpointCallback
from babilim.training.callbacks.log_callback import LogCallback
from babilim.training.callbacks.lr_update_callback import LearningRateUpdateCallback
from babilim.training.callbacks.tensorboard_callback import TensorboardCallback
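The module above only re-exports callback classes. A minimal sketch of the callback pattern such training packages build on, where a loop notifies registered callbacks at fixed hook points; the names and signatures here are illustrative, not babilim's actual API:

```python
class BaseCallback:
    """Base hook interface; subclasses override the events they care about."""
    def on_epoch_end(self, epoch, logs):
        pass

class LogCallback(BaseCallback):
    """Records the loss reported at the end of each epoch."""
    def __init__(self):
        self.history = []

    def on_epoch_end(self, epoch, logs):
        self.history.append((epoch, logs['loss']))

def run_training(callbacks, losses):
    # Stand-in training loop: one "epoch" per loss value, notifying callbacks.
    for epoch, loss in enumerate(losses):
        for cb in callbacks:
            cb.on_epoch_end(epoch, {'loss': loss})

log_cb = LogCallback()
run_training([log_cb], [0.9, 0.5, 0.3])
print(log_cb.history)  # [(0, 0.9), (1, 0.5), (2, 0.3)]
```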