hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1420b5636f3d9df34dca6ec55321637de72a8c50 | 1,950 | py | Python | packages/modification_rules.py | VoigtLab/ripp-design | 5a83cc0e38b879d548ba836af5a9dadbe83aeadd | [
"MIT"
] | null | null | null | packages/modification_rules.py | VoigtLab/ripp-design | 5a83cc0e38b879d548ba836af5a9dadbe83aeadd | [
"MIT"
] | null | null | null | packages/modification_rules.py | VoigtLab/ripp-design | 5a83cc0e38b879d548ba836af5a9dadbe83aeadd | [
"MIT"
] | null | null | null |
aas = 'FILWVMYCPAGTSQNEDHKR'
core_rules_r = dict([
['tgnb' , r"[GFL][PHL][DS][TSVH][T][IMEH][VSR][T][EKRSAVLG][T][ITS][E][NSQM][AVRFW][D][PI][DS][EAMW][YLST][FYEP][LAFQG]"],
['plpxy', r"[AT][VTCQSNKRP][AS][A][MLN][Y][G][VATS][V][FITEKP][P]"],
['paap' , r"[E][E][NSTDKRYV][AMTGP][MVSTNHD][YF][TSAVLWYG][KRSAY][GSTEHMVL][QNTHALVPG][VLATKRW][IKQ][VING][LVAKQRS][SA]"],
['lynd' , r"[LIMVAYWFTQSNKRHEDGP][C][SNTQYWFDEKRHLIMVAGP]"],
['lasf' , r"[Q][LVYS][V][GAW][RV][RV][NEL][I]$"],
['pals' , r"[GPNSQHRVI][C][GS][GSQCH]"],
['epid' , r"[GL][SE][FVKG][NQTRY][SQRG][YLN][CV][C]$"],
['thcok', r"[YWA][S]$"],
['padek', r"[HYFWALGP][YLC][D][S]$"],
['tevp' , r"[FWMYCAGTSQNDHKR]"]])
core_rules = dict([(k,[set(cr) for cr in v.strip('$').strip(']').strip('[').split('][')]) for k, v in core_rules_r.items()])
#spacing rule tuple is arranged as:
# (optimal RS-to-mod distance,
# position of the mod in the core motif above (0 is at the beginning of the motif),
# insertion spring constant (compression, kc),
# deletion spring constant (stretching, ks))
spacing_rules = dict([
['tgnb' , (37 , 4, 140 , 40 )],
['plpxy', (6 , 5, 16 , 1700)],
['paap' , (0 , 0, 5500 , 5800)],
['lynd' , (8 , 1, 10 , 300 )],
['lasf' , (None, 7, None , None)],
['pals' , (None, 1, None , None)],
['epid' , (None, 7, None , None)],
['thcok', (None, 1, None , None)],
['padek', (None, 3, None , None)],
['tevp' , (0 , 0, 1000000000, 1000000000)]])
recognition_sites = dict([
['tgnb' , "PYIAKYVEE"],
['plpxy', "ELNEEELEAIAG"],
['paap' , "FSTLSQRISAIT"],
['lynd' , "LAELSEEAL"],
['tevp' , "ENLYFQ"]])
leaders = dict([
['tgnb' , ("YR","QTLQNSTNLVYDDITQISFINKEKNVKKINL")],
['plpxy', ("SIESAKAFYQRMTDDASFRTPFEAELSKEERQQLIKDSGYDFTAEEWQQAMTEIQAARSNE","G")],
['paap' , ("IK","")],
['lynd' , ("NKKNILPQLGQPVIRLTAGQLSSQ","GGVDAS")],
['tevp' , ("","")]]) | 40.625 | 126 | 0.551795 | 243 | 1,950 | 4.399177 | 0.604938 | 0.037418 | 0.018709 | 0.024322 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037888 | 0.174359 | 1,950 | 48 | 127 | 40.625 | 0.626087 | 0.121026 | 0 | 0 | 0 | 0.083333 | 0.460773 | 0.321429 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
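The spacing-rule tuples in the file above are documented only by the comment block; a small hypothetical helper (not part of the original module) shows how one tuple is read, with the two entries copied from the file:

```python
# Hypothetical helper for reading the spacing-rule tuples documented in
# modification_rules.py above; the two entries are copied from that file.
spacing_rules = {
    'tgnb': (37, 4, 140, 40),
    'lasf': (None, 7, None, None),
}

def describe(rule):
    # (optimal RS-to-mod distance, mod position in motif, kc, ks)
    distance, mod_pos, kc, ks = rule
    if distance is None:
        return 'mod at motif position %d, no spacing constraint' % mod_pos
    return ('mod at motif position %d, optimal distance %d '
            '(kc=%d, ks=%d)' % (mod_pos, distance, kc, ks))

print(describe(spacing_rules['tgnb']))
# mod at motif position 4, optimal distance 37 (kc=140, ks=40)
print(describe(spacing_rules['lasf']))
# mod at motif position 7, no spacing constraint
```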
142153febfd732a27d28ccaae5410a49667aa838 | 629 | py | Python | mysort/quick_sort_2.py | UMP-45/Python_Study | 75126a432230b226f27f2a7434d848bd0ce5696e | [
"Unlicense"
] | 1 | 2022-03-29T09:20:19.000Z | 2022-03-29T09:20:19.000Z | mysort/quick_sort_2.py | UMP-45/Python_Study | 75126a432230b226f27f2a7434d848bd0ce5696e | [
"Unlicense"
] | null | null | null | mysort/quick_sort_2.py | UMP-45/Python_Study | 75126a432230b226f27f2a7434d848bd0ce5696e | [
"Unlicense"
] | null | null | null | #!/usr/bin/env python3
# -*- coding:utf-8 -*-
def quick_sort(array, left, right):
    if left < right:
        mid = adjust_array(array, left, right)
        quick_sort(array, left, mid - 1)
        quick_sort(array, mid + 1, right)


def adjust_array(array, left, right):
    i = left
    j = right
    base_value = array[left]
    while i < j:
        while i < j and array[j] > base_value:
            j = j - 1
        if i < j:
            array[i] = array[j]
            i = i + 1
        while i < j and array[i] < base_value:
            i = i + 1
        if i < j:
            array[j] = array[i]
            j = j - 1
    array[i] = base_value
    return i
if __name__ == '__main__':
    testlist = [1, 4, 6, 0, 3, 5, 7, 9, 2, 8]
    n = len(testlist)
    quick_sort(testlist, 0, n - 1)
    print(testlist) | 17.472222 | 39 | 0.647059 | 124 | 629 | 3.137097 | 0.306452 | 0.030848 | 0.107969 | 0.092545 | 0.257069 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038388 | 0.171701 | 629 | 36 | 40 | 17.472222 | 0.708253 | 0.066773 | 0 | 0.222222 | 0 | 0 | 0.013652 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0 | 0 | 0 | 0.111111 | 0.037037 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
142316f2e2f7f3766a103d7cbfe23336f033018a | 534 | py | Python | requests/ungzip-data/example-2.py | whitmans-max/python-examples | 881a8f23f0eebc76816a0078e19951893f0daaaa | [
"MIT"
] | 140 | 2017-02-21T22:49:04.000Z | 2022-03-22T17:51:58.000Z | requests/ungzip-data/example-2.py | whitmans-max/python-examples | 881a8f23f0eebc76816a0078e19951893f0daaaa | [
"MIT"
] | 5 | 2017-12-02T19:55:00.000Z | 2021-09-22T23:18:39.000Z | requests/ungzip-data/example-2.py | whitmans-max/python-examples | 881a8f23f0eebc76816a0078e19951893f0daaaa | [
"MIT"
] | 79 | 2017-01-25T10:53:33.000Z | 2022-03-11T16:13:57.000Z | #!/usr/bin/env python3
'''Display data send as gzip'''
import urllib.request
import gzip
import io
url = 'http://www.stream-urls.de/webradio'
r = urllib.request.urlopen(url)
# create file-like object in memory
buf = io.BytesIO(r.read())  # gzip data is binary, so BytesIO, not StringIO
# create gzip object using file-like object instead of real file on disk
f = gzip.GzipFile(fileobj=buf)
# get data from file
html = f.read()
print('---')
print(html[:250].decode(errors='replace'), '...')
#print('Content-Type :', r.headers['Content-Type'])
#print('Content-Encoding :', r.headers['Content-Encoding'])
| 17.225806 | 72 | 0.67603 | 81 | 534 | 4.45679 | 0.604938 | 0.055402 | 0.077562 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008811 | 0.149813 | 534 | 30 | 73 | 17.8 | 0.786344 | 0.529963 | 0 | 0 | 0 | 0 | 0.168067 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.3 | 0 | 0.3 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
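The in-memory decompression trick in the example above (wrap bytes in a file-like object and hand it to `GzipFile`) can be exercised without any network access:

```python
import gzip
import io

payload = b'hello gzip'

# Compress in memory, then decompress through a file-like object,
# mirroring the GzipFile(fileobj=...) pattern used above.
buf = io.BytesIO(gzip.compress(payload))
f = gzip.GzipFile(fileobj=buf)
assert f.read() == payload
print('roundtrip ok')
```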
142c155c6fbb50f6d6e41b47affd8bd4a1cac56d | 2,792 | py | Python | type_analysis.py | XYPB/SpecVQGAN | ed3c0f86c41bc408824979305d9c4f6df0877973 | [
"MIT"
] | null | null | null | type_analysis.py | XYPB/SpecVQGAN | ed3c0f86c41bc408824979305d9c4f6df0877973 | [
"MIT"
] | null | null | null | type_analysis.py | XYPB/SpecVQGAN | ed3c0f86c41bc408824979305d9c4f6df0877973 | [
"MIT"
] | null | null | null | import json
import os
AMT_testset = json.load(open('data/AMT_test_set.json', 'r'))
AMT_trainset = json.load(open('data/greatesthit_train_2.00.json', 'r'))
AMT_validset = json.load(open('data/greatesthit_valid_2.00.json', 'r'))
greatest_testset = json.load(open('data/greatesthit_test_2.00.json', 'r'))
record_dir = 'data/greatesthit/greatesthit_processed'
match_dict = {}
type_dict = {}
overall_cnt = {True: 0, False: 0}
def process_action(name, st):
    record_path = os.path.join(record_dir, name, 'hit_record.json')
    record = json.load(open(record_path, 'r'))
    action_cnt = {}
    for (t, action) in record:
        if t >= st and t <= st + 2:
            _, act = action.split(' ')
            if act not in action_cnt.keys():
                action_cnt[act] = 0
            action_cnt[act] += 1
    return action_cnt


def process_type(name, st, drop_none=False):
    record_path = os.path.join(record_dir, name, 'hit_record.json')
    record = json.load(open(record_path, 'r'))
    action_cnt = {}
    for (t, action) in record:
        if 'None' in action:
            continue
        if t >= st and t <= st + 2:
            if action not in action_cnt.keys():
                action_cnt[action] = 0
            action_cnt[action] += 1
    return action_cnt


def check_match(cnt1, cnt2):
    type1 = list(cnt1.keys())
    if 'None' in type1:
        type1.remove('None')
    type2 = list(cnt2.keys())
    if 'None' in type2:
        type2.remove('None')
    for t in type1:
        if t not in type2:
            return False
    for t in type2:
        if t not in type1:
            return False
    return True


if __name__ == '__main__':
    # for target in AMT_testset:
    #     target_name, start_time = target.split('_')
    #     target_action_cnt = process_action(target_name, float(start_time))
    #     match_dict[target] = {}
    #     type_dict[target] = target_action_cnt
    #     for condition in AMT_testset[target]:
    #         cond_name, start_time = condition.split('_')
    #         cond_action_cnt = process_action(cond_name, float(start_time))
    #         match_dict[target][condition] = check_match(target_action_cnt, cond_action_cnt)
    #         overall_cnt[match_dict[target][condition]] += 1
    #         type_dict[condition] = cond_action_cnt
    # json.dump(match_dict, open('data/AMT_test_set_match_dict.json', 'w'))
    print(len(greatest_testset))
    for video_idx in greatest_testset:
        name, idx = video_idx.split('_')
        time = float(idx) / 22050
        action_cnt = process_type(name, time)
        if len(action_cnt.keys()) == 1:
            type_dict[video_idx] = list(action_cnt.keys())[0]
    print(len(type_dict))
    json.dump(type_dict, open('data/greatesthit_test_2.00_single_type_only.json', 'w'))
| 32.465116 | 93 | 0.621777 | 391 | 2,792 | 4.173913 | 0.191816 | 0.104779 | 0.044118 | 0.039216 | 0.343137 | 0.246324 | 0.214461 | 0.127451 | 0.127451 | 0.127451 | 0 | 0.020067 | 0.250358 | 2,792 | 85 | 94 | 32.847059 | 0.759675 | 0.229943 | 0 | 0.245614 | 0 | 0 | 0.126523 | 0.095127 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.035088 | 0 | 0.175439 | 0.035088 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
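The matching rule implemented by `check_match` above (two clips match when their counts cover the same hit types, ignoring `'None'`) can be seen on small hand-made dicts; a simplified, self-contained copy with made-up action labels:

```python
# Simplified stand-in for check_match() above: two videos "match" when
# their action-count dicts cover the same types, ignoring 'None'.
# The action labels below are invented for illustration only.
def check_match(cnt1, cnt2):
    type1 = {t for t in cnt1 if t != 'None'}
    type2 = {t for t in cnt2 if t != 'None'}
    return type1 == type2

a = {'hit wood': 3, 'None': 1}
b = {'hit wood': 1}
c = {'hit wood': 2, 'hit metal': 1}

print(check_match(a, b))  # True  -- 'None' is ignored
print(check_match(a, c))  # False -- extra type on one side
```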
142f394af6643e95062c23f47632b2841495c600 | 2,512 | py | Python | graphics/pellet_swrl_barchart.py | edlectrico/dissertation | eec342383ef4f15968e6417020681a3eb095bf08 | [
"Apache-2.0"
] | null | null | null | graphics/pellet_swrl_barchart.py | edlectrico/dissertation | eec342383ef4f15968e6417020681a3eb095bf08 | [
"Apache-2.0"
] | null | null | null | graphics/pellet_swrl_barchart.py | edlectrico/dissertation | eec342383ef4f15968e6417020681a3eb095bf08 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
import matplotlib
matplotlib.rcParams['legend.fancybox'] = True
import matplotlib.pyplot as plt
import numpy as np
#define some data
x = [5000, 10000, 15000, 20000]
pc_mean = [4.770, 6.327, 7.427, 8.147]
s3mini_mean = [96.878, 101.656, 243.981, 331.433]
s3_mean = [84.121, 74.248, 209.431, 216.005]
nexus10_mean = [22.317, 45.193, 85.543, 107.151]
#error data
pc_error = [0.141, 0.164, 0.444, 0.105]
s3mini_error = [0.109, 0.322, 0.298, 0.110]
s3_error = [0.869, 0.250, 1.699, 1.202]
nexus10_error = [0.333, 1.312, 5.490, 2.749]
ind = np.arange(4) # the x locations for the groups
width = 0.20 # the width of the bars
fig, ax = plt.subplots()
#plot data
rects1 = ax.bar(ind, pc_mean, width, color='r', yerr=pc_error)
rects2 = ax.bar(ind+width, s3mini_mean, width, color='y', yerr=s3mini_error)
rects3 = ax.bar(ind+width+width, s3_mean, width, color='b', yerr=s3_error)
rects4 = ax.bar(ind+width+width+width, nexus10_mean, width, color='g', yerr=nexus10_error)
#configure X axes
plt.xlim(0,4)
plt.xticks(ind+width)
#configure Y axes
plt.ylim(0.0, 350.0)
plt.yticks([0.0, 50.0, 100.0, 150.0, 200.0, 250.0, 300.0, 350.0])
#labels
plt.ylabel('Time (s)' + '\n')
plt.xlabel('SWRL axioms' + '\n')
#title
#plt.title('Resulting means by increasing the number of ABox axioms using Pellet and Pellet4Android' + '\n')
ax.set_xticklabels( ('5,000', '10,000', '15,000', '20,000') )
ax.legend( (rects1[0], rects2[0], rects3[0], rects4[0]), ('PC', 'Samsung Galaxy SIII Mini', 'Samsung Galaxy SIII', 'Nexus 10') )
def autolabel(rects):
    # attach some text labels
    for rect in rects:
        height = rect.get_height()
        ax.text(rect.get_x() + rect.get_width() / 2., 1.05 * height, '%d' % int(height),
                ha='center', va='bottom')
autolabel(rects1)
autolabel(rects2)
autolabel(rects3)
autolabel(rects4)
for t, a in zip(x, pc_mean):
    plt.plot([t], [a], 'b',)
    plt.annotate(round(a, 3), xy=(t, a), xytext=(t + 0.01, a + 0.01), color='black')
for t, a in zip(x, s3mini_mean):
    plt.plot([t], [a], 'b',)
    plt.annotate(round(a, 3), xy=(t, a), xytext=(t + 0.01, a + 0.01), color='black')
for t, a in zip(x, s3_mean):
    plt.plot([t], [a], 'b',)
    plt.annotate(round(a, 3), xy=(t, a), xytext=(t + 0.01, a + 0.01), color='black')
for t, a in zip(x, nexus10_mean):
    plt.plot([t], [a], 'b',)
    plt.annotate(round(a, 3), xy=(t, a), xytext=(t + 0.01, a + 0.01), color='black')
#save plot
plt.savefig('pellet_swrl.pdf')
plt.show()
| 30.26506 | 128 | 0.632166 | 462 | 2,512 | 3.383117 | 0.383117 | 0.015355 | 0.020473 | 0.017914 | 0.204734 | 0.181702 | 0.174664 | 0.174664 | 0.174664 | 0.174664 | 0 | 0.142653 | 0.162818 | 2,512 | 82 | 129 | 30.634146 | 0.600571 | 0.118232 | 0 | 0.153846 | 0 | 0 | 0.077692 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019231 | false | 0 | 0.057692 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
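The grouped bars in the script above are placed at `ind + k*width` for series `k`; that offset arithmetic can be checked without matplotlib or numpy:

```python
# Mirrors the ind/width arithmetic from the chart script above:
# 4 groups, bar width 0.20, four series shifted by 0, w, 2w, 3w.
width = 0.20
ind = [0, 1, 2, 3]

positions = [[x + k * width for x in ind] for k in range(4)]
print([round(p, 2) for p in positions[2]])  # [0.4, 1.4, 2.4, 3.4]

# Tick labels sit one bar width in from each group's left edge,
# matching plt.xticks(ind + width) above.
ticks = [round(x + width, 2) for x in ind]
print(ticks)  # [0.2, 1.2, 2.2, 3.2]
```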
14304bb0fc36ef4f288e68a1bc0d0727900e0859 | 6,039 | py | Python | transfer_learning.py | benedictquartey/Chiromancer | 9dcf2ffaf059680a8fa8c71a37c9156927ccc7ec | [
"MIT"
] | 4 | 2019-07-18T22:50:47.000Z | 2021-01-20T22:02:15.000Z | transfer_learning.py | benedictquartey/Chiromancer | 9dcf2ffaf059680a8fa8c71a37c9156927ccc7ec | [
"MIT"
] | null | null | null | transfer_learning.py | benedictquartey/Chiromancer | 9dcf2ffaf059680a8fa8c71a37c9156927ccc7ec | [
"MIT"
] | 1 | 2019-11-22T13:47:17.000Z | 2019-11-22T13:47:17.000Z | #adapted from PyimageSearch for testing accuracy between my standard NN_model architecture and a popular model
# set the matplotlib backend so figures can be saved in the background
import matplotlib
matplotlib.use("Agg")
import data_processing
from keras.preprocessing.image import ImageDataGenerator
from keras.applications import VGG16
from keras.layers.core import Dropout
from keras.layers.core import Flatten
from keras.layers.core import Dense
from keras.layers import Input
from keras.models import Model
from keras.optimizers import SGD
from sklearn.metrics import classification_report
from imutils import paths
import matplotlib.pyplot as plt
import numpy as np
import pickle
import os
import keras
from keras.callbacks import ModelCheckpoint
num_classes = 6
# seeding to enable exact reproduction of learning results
np.random.seed(0)
def preprocess():
    (X_train, X_test, Y_train, Y_test) = data_processing.prepareData()
    x_train = np.array(X_train)
    y_train = np.array(Y_train)
    x_test = np.array(X_test)
    y_test = np.array(Y_test)
    # cast to float and scale pixel values to [0, 1]
    x_train = x_train.astype('float32')
    x_test = x_test.astype('float32')
    x_train /= 255
    x_test /= 255
    print('x_train shape:', x_train.shape)
    print(x_train.shape[0], 'train samples')
    print(x_test.shape[0], 'test samples')
    return (x_train, x_test, y_train, y_test)
def plot_training(H, N, plotPath):
    # construct a plot that plots and saves the training history
    plt.style.use("ggplot")
    plt.figure()
    plt.plot(np.arange(0, N), H.history["loss"], label="train_loss")
    plt.plot(np.arange(0, N), H.history["val_loss"], label="val_loss")
    plt.plot(np.arange(0, N), H.history["acc"], label="train_acc")
    plt.plot(np.arange(0, N), H.history["val_acc"], label="val_acc")
    plt.title("Training Loss and Accuracy")
    plt.xlabel("Epoch #")
    plt.ylabel("Loss/Accuracy")
    plt.legend(loc="lower left")
    plt.savefig(plotPath)
# initialize the training data augmentation object
trainAug = ImageDataGenerator(
rotation_range=30,
zoom_range=0.15,
width_shift_range=0.2,
height_shift_range=0.2,
shear_range=0.15,
horizontal_flip=True,
fill_mode="nearest")
# initialize the validation/testing data augmentation object
valAug = ImageDataGenerator()
mean = np.array([123.68, 116.779, 103.939], dtype="float32")
trainAug.mean = mean
valAug.mean = mean
(x_train, x_test, y_train, y_test)=preprocess()
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
# initialize the training generator
trainGen = trainAug.flow(x_train,y_train,batch_size=32)
# initialize the validation generator
valGen = valAug.flow(x_test,y_test,batch_size=32)
# CNN surgury
# load the VGG16 network, ensuring the head FC layer sets are left
# off
baseModel = VGG16(weights="imagenet", include_top=False,
input_tensor=Input(shape=(245, 240, 3)))
# construct the head of the model that will be placed on top of the
# the base model
headModel = baseModel.output
headModel = Flatten(name="flatten")(headModel)
headModel = Dense(512, activation="relu")(headModel)
headModel = Dropout(0.5)(headModel)
headModel = Dense(num_classes, activation="softmax")(headModel)
# place the head FC model on top of the base model (this will become
# the actual model we will train)
model = Model(inputs=baseModel.input, outputs=headModel)
# loop over all layers in the base model and freeze them so they will
# *not* be updated during the first training process
for layer in baseModel.layers:
    layer.trainable = False
# compile our model (this needs to be done after setting our
# layers to be non-trainable)
print("[INFO] compiling model...")
opt = SGD(lr=1e-4, momentum=0.9)
model.compile(loss="categorical_crossentropy", optimizer=opt,
metrics=["accuracy"])
# train the head of the network for a few epochs (all other layers
# are frozen) -- this will allow the new FC layers to start to become
# initialized with actual "learned" values versus pure random
print("[INFO] training head...")
H = model.fit_generator(
trainGen,
steps_per_epoch=10 ,
validation_data=valGen,
validation_steps=30,
epochs=20,
verbose=1)
# reset the testing generator and evaluate the network after
# fine-tuning just the network head
print("[INFO] evaluating after fine-tuning network head...")
valGen.reset()
plot_training(H, 20, 'head_only_plot' )
#now to unfreeze some of the latter conv layers
# reset our data generators
trainGen.reset()
valGen.reset()
# now that the head FC layers have been trained/initialized, lets
# unfreeze the final set of CONV layers and make them trainable
for layer in baseModel.layers[15:]:
    layer.trainable = True
# loop over the layers in the model and show which ones are trainable
# or not
for layer in baseModel.layers:
    print("{}: {}".format(layer, layer.trainable))
# callback function to be executed after every training epoch, only saves the trsined model
# if its validation mean_squared_error is less than the model from the previoud epoch
interimModelPoint = ModelCheckpoint('model-{epoch:03d}.h5',
monitor='val_loss',
verbose=0,
save_best_only = 'true',
mode = 'auto')
# for the changes to the model to take affect we need to recompile
# the model, this time using SGD with a *very* small learning rate
print("[INFO] re-compiling model...")
opt = SGD(lr=1e-4, momentum=0.9)
model.compile(loss="categorical_crossentropy", optimizer=opt,
metrics=["accuracy"])
# train the model again, this time fine-tuning *both* the final set
# of CONV layers along with our set of FC layers
print("[INFO] training head and latter conv...")
H = model.fit_generator(
trainGen,
steps_per_epoch=10 ,
validation_data=valGen,
validation_steps=30,
epochs=10,
callbacks = [interimModelPoint],
verbose=1)
plot_training(H, 10, 'full_training_plot' )
# # serialize the model to disk
# print("[INFO] serializing network...")
# model.save('models/model.h5')
| 28.757143 | 110 | 0.741679 | 916 | 6,039 | 4.790393 | 0.338428 | 0.016408 | 0.006837 | 0.013674 | 0.184594 | 0.139471 | 0.128988 | 0.128988 | 0.10825 | 0.087967 | 0 | 0.020404 | 0.155986 | 6,039 | 209 | 111 | 28.894737 | 0.840494 | 0.336645 | 0 | 0.192982 | 0 | 0 | 0.130643 | 0.012106 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017544 | false | 0 | 0.157895 | 0 | 0.175439 | 0.078947 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
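The two-phase fine-tuning above (freeze the whole base, train the new head, then unfreeze the final CONV block from index 15 on) can be sketched without TensorFlow, with plain dicts standing in for `layer.trainable` flags; the 19-layer count is an assumption matching VGG16's headless layer list:

```python
# Toy stand-in for baseModel.layers: each dict carries a trainable flag,
# mirroring the two freeze/unfreeze loops in the script above.
layers = [{"name": "conv_%d" % i, "trainable": True} for i in range(19)]

# Phase 1: freeze the entire base so only the new FC head trains.
for layer in layers:
    layer["trainable"] = False

# Phase 2: unfreeze the final block (index 15 onward, as in the script).
for layer in layers[15:]:
    layer["trainable"] = True

print(sum(l["trainable"] for l in layers))  # 4
```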
143077d32ff1589b7aba438263ce1a1ace1fb22c | 4,174 | py | Python | app/voxity/__init__.py | voxity/vox-ui-api | 9da442a2ae8e5fec92485cf7dc4adf1a560aa8f5 | [
"MIT"
] | null | null | null | app/voxity/__init__.py | voxity/vox-ui-api | 9da442a2ae8e5fec92485cf7dc4adf1a560aa8f5 | [
"MIT"
] | null | null | null | app/voxity/__init__.py | voxity/vox-ui-api | 9da442a2ae8e5fec92485cf7dc4adf1a560aa8f5 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""Voxity api module."""
from __future__ import absolute_import, division, unicode_literals
from requests_oauthlib import OAuth2Session
from flask import current_app, session, abort
from datetime import datetime, timedelta
from app.utils import datetime_to_timestamp
from requests.models import Response
from app.voxity.error import ExceptVoxityTokenExpired
_DURATION_TOKEN = timedelta(days=7)
def check_respons(resp, esc_bad_resp=True):
    if isinstance(resp, Response):
        if resp.status_code == 401:
            session['try_refresh_token'] = 0
            session.modified = True
            raise ExceptVoxityTokenExpired()
        if esc_bad_resp and resp.status_code >= 400:
            abort(resp.status_code)
        if not esc_bad_resp and resp.status_code > 400:
            pass
        return True
    return False
def save_token(token):
    '''
    :param dict token: token object
    :rtype: None
    '''
    token['expires_in'] = int(_DURATION_TOKEN.total_seconds())
    token['expires_at'] = datetime_to_timestamp(
        datetime.now() + _DURATION_TOKEN
    )
    session['oauth_token'] = token
    session['try_refresh_token'] = 0
    session['user'] = self_user()
    session.modified = True
def bind(**kwargs):
    return OAuth2Session(client_id=current_app.config['CLIENT_ID'], **kwargs)
def refresh_token():
    '''
    :rtype: OAuth2Session
    :return: valid connector
    '''
    vox_bind = bind(
        token=session['oauth_token']
    )
    token = vox_bind.refresh_token(
        current_app.config['TOKEN_URL'],
        client_id=current_app.config['CLIENT_ID'],
        client_secret=current_app.config['CLIENT_SECRET'],
        refresh_token=session['oauth_token']['refresh_token'],
    )
    save_token(token)
    return connectors()
def connectors(**kwargs):
    """
    :param dict token: token dict, default = session[oauth_token]
    :rtype: OAuth2Session
    """
    token = kwargs.get('token', session.get('oauth_token', None))
    if isinstance(token, dict):
        return bind(
            token=token,
            auto_refresh_url=current_app.config['TOKEN_URL'],
            auto_refresh_kwargs={
                'client_id': current_app.config['CLIENT_ID'],
                'client_secret': current_app.config['CLIENT_SECRET']
            }
        )
    else:
        return None
def pager_dict(headers):
    '''
    :param headers: response headers
    :rtype: dict
    :return: pager dict built from the response headers
    '''
    return {
        'total_item': headers.get('x-paging-total-records', None),
        'max_page': headers.get('x-paging-total-pages', None),
        'curent': headers.get('x-paging-page', 1),
        'next': headers.get('x-paging-next', None),
        'previous': headers.get('x-paging-previous', None),
        'limit': headers.get('x-paging-limit', None)
    }
def oauth_status():
    con = connectors()
    if con is not None:
        try:
            return con.get(
                current_app.config['BASE_URL'] + '/oauth/status'
            ).json().get('message', 'unknow').lower()
        except Exception:
            pass
    return None
def self_user():
    con = connectors()
    if con is not None:
        resp = con.get(
            current_app.config['BASE_URL'] + '/users/self'
        )
        if check_respons(resp):
            return resp.json()['result']
    return None
def logout():
    con = connectors()
    if con is not None:
        resp = con.get(current_app.config['BASE_URL'] + "/logout")
        session['user'] = {}
        session['oauth_token'] = {}
        session['oauth_state'] = {}
        session.modified = True
        return resp
    return None
def api_proxy(uri, method, params=None, data=None):
    method = method.lower()
    con = connectors()
    if uri and uri[0] != '/':
        uri = "/" + uri
    uri = current_app.config['BASE_URL'] + uri
    if con is None:
        return None
    if method == 'get':
        if params is not None and not isinstance(params, dict):
            raise ValueError('voxity.proxy : params must be a dict')
        resp = con.get(uri, params=params)
        return resp
| 26.585987 | 77 | 0.614279 | 497 | 4,174 | 4.977867 | 0.257545 | 0.048504 | 0.07114 | 0.041229 | 0.259095 | 0.190784 | 0.166532 | 0.130962 | 0.10671 | 0.10671 | 0 | 0.006171 | 0.262338 | 4,174 | 156 | 78 | 26.75641 | 0.797337 | 0.070915 | 0 | 0.196262 | 0 | 0 | 0.132384 | 0.005802 | 0 | 0 | 0 | 0 | 0 | 1 | 0.093458 | false | 0.018692 | 0.065421 | 0.009346 | 0.299065 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
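The header-to-pager mapping in `pager_dict` above can be exercised with a plain dict standing in for the response headers, since only `.get()` is used; a self-contained copy (the `'curent'` key spelling matches the module):

```python
def pager_dict(headers):
    # Same mapping as the Voxity module above; a plain dict works
    # because only .get() is called on the headers object.
    return {
        'total_item': headers.get('x-paging-total-records', None),
        'max_page': headers.get('x-paging-total-pages', None),
        'curent': headers.get('x-paging-page', 1),
        'next': headers.get('x-paging-next', None),
        'previous': headers.get('x-paging-previous', None),
        'limit': headers.get('x-paging-limit', None),
    }

page = pager_dict({'x-paging-total-records': '42', 'x-paging-limit': '10'})
print(page['total_item'], page['curent'])  # 42 1
```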
14328f5773d881a1cba21ca73ad3e092c25a5373 | 916 | py | Python | servidor.py | robsonsilv4/socket-calculadora | a452163a5d02e2c9dbc0ecce57827010a4feb48d | [
"MIT"
] | null | null | null | servidor.py | robsonsilv4/socket-calculadora | a452163a5d02e2c9dbc0ecce57827010a4feb48d | [
"MIT"
] | null | null | null | servidor.py | robsonsilv4/socket-calculadora | a452163a5d02e2c9dbc0ecce57827010a4feb48d | [
"MIT"
] | null | null | null | import socket
def soma(a, b):
    temp = a + b
    return str(temp)


def sub(a, b):
    temp = a - b
    return str(temp)


def mult(a, b):
    temp = a * b
    return str(temp)


def div(a, b):
    if b == 0:
        return '0'  # return a string so the reply can always be encoded
    else:
        temp = a / b
        temp = int(temp)
        return str(temp)
host = ''
porta = 12003
servidor = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
servidor.bind((host, porta))
servidor.listen(1)
print('The server is waiting for an operation and two numbers...')
while True:
    conn, addr = servidor.accept()
    msg = conn.recv(1024).decode()
    op, a, b = msg.split()
    a = int(a)
    b = int(b)
    resultado = '0'  # default reply as a string so .encode() always works
    if op == 'soma':
        resultado = soma(a, b)
    elif op == 'subtração':
        resultado = sub(a, b)
    elif op == 'multiplicação':
        resultado = mult(a, b)
    elif op == 'divisão':
        resultado = div(a, b)
    else:
        print('Operation not permitted')
    conn.send(resultado.encode())
    conn.close() | 17.283019 | 65 | 0.606987 | 143 | 916 | 3.874126 | 0.398601 | 0.050542 | 0.043321 | 0.037906 | 0.129964 | 0.129964 | 0.129964 | 0.129964 | 0.129964 | 0 | 0 | 0.018625 | 0.237991 | 916 | 53 | 66 | 17.283019 | 0.775072 | 0 | 0 | 0.142857 | 0 | 0 | 0.121047 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.095238 | false | 0 | 0.02381 | 0 | 0.238095 | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
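The wire protocol above is a space-separated `'op a b'` string; the server's dispatch logic can be sketched without sockets (a hypothetical local dispatcher, with the Portuguese operation names kept to match the protocol):

```python
# Hypothetical in-process dispatcher mirroring the server loop above;
# operation names stay in Portuguese to match the wire protocol.
def dispatch(msg):
    op, a, b = msg.split()
    a, b = int(a), int(b)
    ops = {
        'soma': lambda: a + b,
        'subtração': lambda: a - b,
        'multiplicação': lambda: a * b,
        'divisão': lambda: 0 if b == 0 else int(a / b),
    }
    if op not in ops:
        return 'Operation not permitted'
    return str(ops[op]())

print(dispatch('soma 2 3'))     # 5
print(dispatch('divisão 7 2'))  # 3
```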
143684501c853bd1f0eecaff8605ff54a74eb3e2 | 1,429 | py | Python | threatstack/icon_threatstack/connection/connection.py | killstrelok/insightconnect-plugins | 911358925f4233ab273dbd8172e8b7b9188ebc01 | [
"MIT"
] | null | null | null | threatstack/icon_threatstack/connection/connection.py | killstrelok/insightconnect-plugins | 911358925f4233ab273dbd8172e8b7b9188ebc01 | [
"MIT"
] | 1 | 2021-02-23T23:57:37.000Z | 2021-02-23T23:57:37.000Z | threatstack/icon_threatstack/connection/connection.py | killstrelok/insightconnect-plugins | 911358925f4233ab273dbd8172e8b7b9188ebc01 | [
"MIT"
] | null | null | null | import insightconnect_plugin_runtime
from .schema import ConnectionSchema, Input
# Custom imports below
from threatstack import ThreatStack
from threatstack.errors import ThreatStackAPIError, ThreatStackClientError, APIRateLimitError
from insightconnect_plugin_runtime.exceptions import ConnectionTestException
import datetime


class Connection(insightconnect_plugin_runtime.Connection):

    def __init__(self):
        super(self.__class__, self).__init__(input=ConnectionSchema())
        self.client = None

    def connect(self, params):
        api_key = params.get(Input.API_KEY)["secretKey"]
        user_id = params.get(Input.USER_ID)
        org_id = params.get(Input.ORG_ID)
        timeout = params.get(Input.TIMEOUT, 120)
        self.client = ThreatStack(api_key=api_key,
                                  user_id=user_id,
                                  org_id=org_id,
                                  api_version=2,
                                  timeout=timeout)

    def test(self):
        now = datetime.datetime.now().strftime("%Y-%m-%d")
        try:
            self.client.http_request(method="get", path="agents", params={"from": now, "until": now})
        except (ThreatStackAPIError, ThreatStackClientError, APIRateLimitError) as e:
            # Pass the error message as a string, not the exception object.
            raise ConnectionTestException(cause="An error occurred!",
                                          assistance=str(e))
        return {"success": True}
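The `test()` method above queries a one-day window built with `strftime`; that date-window construction can be checked standalone (the helper name is illustrative, not from the plugin):

```python
import datetime

def one_day_window(day: datetime.date) -> dict:
    # Both ends of the window use the same YYYY-MM-DD day,
    # mirroring the {"from": now, "until": now} params in Connection.test().
    stamp = day.strftime("%Y-%m-%d")
    return {"from": stamp, "until": stamp}
```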
# --- src/kgmk/ds/gen/models/tf/base.py (kagemeka/python, MIT) ---
import tensorflow as tf
from tensorflow.keras.models import Model

tf.random.set_seed(0)

from tensorflow.keras import (
    layers,
    activations,
    regularizers,
    initializers,
    losses,
    optimizers,
    metrics,
    callbacks,
)
from typing import (
    Tuple,
    List,
)


def define_callbacks(
    save_path: str = None,
) -> List[callbacks.Callback]:
    callbacks_ = [
        callbacks.EarlyStopping(
            monitor='val_loss',
            patience=10,
        )
    ]
    if save_path:
        callbacks_.append(
            callbacks.ModelCheckpoint(
                filepath=save_path,
                monitor='val_loss',
                save_best_only=True,
                save_weights_only=True,
            )
        )
    return callbacks_


class TFBase(Model):

    def __init__(
        self,
        input_shape: Tuple[int] = ...,
        output_shape: Tuple[int] = ...,
        **kwargs,
    ):
        super(TFBase, self).__init__()
        self.input_layer = layers.InputLayer(
            input_shape=input_shape,
        )
        self.dropout = layers.Dropout(
            rate=0.3,
            seed=0,
        )

    def call(self, x):
        return self.input_layer(x)

    def train_(
        self,
        train_data=...,
        validation_data=None,
        validation_split: float = 0.2,
        save_path: str = None,
        epochs: int = ...,
        batch_size: int = ...,
    ):
        callbacks_ = define_callbacks(save_path=save_path)
        common_kwargs = {
            'x': train_data[0],
            'y': train_data[1],
            'epochs': epochs,
            'batch_size': batch_size,
            'callbacks': callbacks_,
        }
        if validation_data:
            self.fit(
                **common_kwargs,
                validation_data=validation_data,
            )
        else:
            self.fit(
                **common_kwargs,
                validation_split=validation_split,
            )
        if save_path:
            self.load_weights(save_path)


def _fully_connected(n: int = 6):
    return layers.Dense(
        units=1 << n,
        activation=activations.relu,
        kernel_regularizer=regularizers.L2(l2=1e-3),
    )


def _simple_rnn(
    n: int = 7,
    return_sequences: bool = False,
):
    return layers.SimpleRNN(
        units=1 << n,
        activation=activations.relu,
        recurrent_dropout=0.3,
        dropout=0.3,
        kernel_regularizer=regularizers.L2(l2=1e-3),
        return_sequences=return_sequences,
    )


def _lstm(
    n: int = 7,
    return_sequences: bool = False,
):
    return layers.LSTM(
        units=1 << n,
        activation=activations.tanh,
        recurrent_activation=activations.sigmoid,
        unroll=False,
        use_bias=True,
        dropout=0.3,
        recurrent_dropout=0.0,
        kernel_regularizer=regularizers.L2(l2=1e-3),
        return_sequences=return_sequences,
    )


def _gru(
    n: int = 7,
    return_sequences: bool = False,
):
    return layers.GRU(
        units=1 << n,
        recurrent_activation=activations.relu,
        activation=activations.relu,
        recurrent_dropout=0.3,
        dropout=0.3,
        kernel_regularizer=regularizers.L2(l2=1e-3),
        return_sequences=return_sequences,
    )
layers.LSTM
layers.LSTMCell
layers.Conv1D
layers.Convolution1D
layers.Conv1DTranspose
layers.Convolution1DTranspose
layers.Conv2D
layers.Conv2DTranspose
layers.Convolution2D
layers.Convolution2DTranspose
layers.Conv3D
layers.ConvLSTM2D
layers.InputLayer
layers.Lambda
layers.GlobalAveragePooling1D
layers.GlobalAvgPool1D
layers.GlobalMaxPool1D
layers.GlobalMaxPooling1D
layers.Activation
layers.Attention
layers.AlphaDropout
layers.Dropout
layers.ELU
layers.Embedding
layers.ReLU
layers.RNN
layers.SimpleRNN
layers.GRU
layers.ZeroPadding1D
layers.Reshape
layers.Softmax
layers.Bidirectional
layers.Dense
layers.Concatenate
print(losses.Reduction.AUTO)
print(type(losses.Reduction.AUTO))
# --- Project_Alibaba_workload/Alibaba_cluster_predict_Total/data_preprocess.py (sssssch/jupyter-examples, MIT) ---
# -*- coding: utf-8 -*-
import pandas as pd

dataset = pd.read_csv(
    'machine_usage_headed.csv',
    index_col=1)
print(dataset.head())
dataset.drop('m_number', axis=1, inplace=True)
dataset.drop('mem_gps', axis=1, inplace=True)
dataset.drop('mpki', axis=1, inplace=True)
print(dataset.head())
dataset.columns = [
    'cpu_util_percent',
    'mem_util_percent',
    'net_in',
    'net_out',
    'disk_usage_percent']
dataset.index.name = 'Time'
print(dataset.head())
k = dataset.groupby('Time').mean()
# Round to 4 decimal places here.
k = round(k, 4)
k.to_csv('Machine_usage_groupby.csv')
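The groupby-then-mean step above averages all machines' readings per timestamp. The same aggregation can be sketched without pandas; the column names mirror the script, but the sample values below are made up:

```python
from collections import defaultdict

def groupby_mean(rows):
    """Average the 'cpu_util_percent' readings per 'Time' key, rounded to 4 places."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[row['Time']].append(row['cpu_util_percent'])
    return {t: round(sum(v) / len(v), 4) for t, v in sorted(buckets.items())}

# Illustrative sample: two readings at Time=10, one at Time=20.
rows = [
    {'Time': 10, 'cpu_util_percent': 30.0},
    {'Time': 10, 'cpu_util_percent': 50.0},
    {'Time': 20, 'cpu_util_percent': 12.5},
]
```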
# --- torchreid/__init__.py (iankuoli/OSNet-TopDrop, MIT) ---
from __future__ import absolute_import
from __future__ import print_function
__author__ = 'Rodolfo Quispe'
__description__ = 'Top Batch Dropblock for Person Re-Identification'
__url__ = 'https://github.com/RQuispeC/top-batch-dropblock'
__based_on__ = 'https://github.com/KaiyangZhou/deep-person-reid'
from . import (
    engine,
    models,
    losses,
    metrics,
    data,
    optim,
    utils,
)
# --- glaciersat/tests/test_utils.py (jlandmann/glaciersat, MIT) ---
import pytest
from glaciersat.tests import requires_credentials
from glaciersat.utils import *
import configobj
import numpy as np  # used below; may also arrive via the wildcard import


@requires_credentials
def test_get_credentials():
    cred = get_credentials(credfile=None)
    assert isinstance(cred, configobj.ConfigObj)
    cred = get_credentials(credfile='.\\.credentials')
    assert isinstance(cred, configobj.ConfigObj)


def test_declutter():
    input = np.zeros((7, 7), dtype=int)
    input[1:6, 1:6] = 1
    # erode a lot, then leave as is
    res = declutter(input, 3, 1).astype(int)
    desired = np.zeros((7, 7), dtype=int)
    desired[2:5, 2:5] = 1
    np.testing.assert_array_equal(res, desired)
    # do not erode, but dilate
    res = declutter(input, 1, 3).astype(int)
    desired = np.ones((7, 7), dtype=int)
    np.testing.assert_array_equal(res, desired)
    # erode and dilate (the offset is unfortunate though)
    res = declutter(input, 3, 2).astype(int)
    desired = np.zeros((7, 7), dtype=int)
    desired[1:5, 1:5] = 1
    np.testing.assert_array_equal(res, desired)
    res = declutter(input, 2, 3).astype(int)
    desired = np.zeros((7, 7), dtype=int)
    desired[1:, 1:] = 1
    np.testing.assert_array_equal(res, desired)
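The `declutter` calls tested above pair an erosion (a cell survives only if its whole neighbourhood is set) with a dilation (a cell turns on if any neighbour is set). A 1-D, stdlib-only sketch of that idea — the real glaciersat `declutter` works on 2-D arrays, so this is an illustration, not its implementation:

```python
def erode(bits, size):
    # A cell survives only if every neighbour within the window is set.
    half = size // 2
    n = len(bits)
    return [
        int(all(bits[j] for j in range(max(0, i - half), min(n, i + half + 1))))
        for i in range(n)
    ]

def dilate(bits, size):
    # A cell is set if any neighbour within the window is set.
    half = size // 2
    n = len(bits)
    return [
        int(any(bits[j] for j in range(max(0, i - half), min(n, i + half + 1))))
        for i in range(n)
    ]

def declutter_1d(bits, erode_size, dilate_size):
    return dilate(erode(bits, erode_size), dilate_size)
```

Erosion with window 3 shrinks the run of ones by one cell on each side, matching the "erode a lot" test; dilation with window 3 grows it back out, matching the all-ones test.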
# --- scripts/pred_clef2018.py (rdenaux/acred, Apache-2.0) ---
#
# 2020 ExpertSystem
#
'''Script for generating predictions for Task2 of CLEF 2018 using the
acred predictor

See https://github.com/clef2018-factchecking/clef2018-factchecking
See also scripts/fetch-data.sh, which should download the v1.0 release
and place it in the `data/evaluation/` folder.
'''
import argparse
import sys
import os
import os.path as osp
import time
import requests
import json
import pandas as pd
import logging
from esiutils import dictu

logger = logging.getLogger(__name__)


def read_all_factuality_claims(folder):
    files = [f for f in os.listdir(folder) if 'README' not in f]
    columns = ['line_number', 'speaker', 'text', 'claim_number', 'normalized_claim', 'label']
    fds = {f: pd.read_csv(osp.join(folder, f), sep='\t', names=columns)
           for f in files}
    return {f: df[df['label'] != '-']
            for f, df in fds.items()}


def acred_as_clef_label(ci_cred, thresh=0.4):
    assert thresh >= 0.0
    assert thresh <= 1.0
    if '@type' in ci_cred:
        val = ci_cred['ratingValue']
    else:
        val = ci_cred['value']
    if val >= thresh:
        return 'TRUE'
    elif val <= -thresh:
        return 'FALSE'
    else:
        return 'HALF-TRUE'


def build_parser():
    parser = argparse.ArgumentParser(
        description='Generate predictions for Task2 of CLEF 2018',
        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
    parser.add_argument(
        '-inputFolder',
        help='Path to a folder with tsv files ending in .txt',
        required=True)
    parser.add_argument(
        '-outFolder',
        help='Path to a folder where the results should be written to',
        required=True)
    parser.add_argument(
        '-config',
        help='Path to a json file with configurations for calling the predictor')
    return parser


def setup_logging():
    root_logger = logging.getLogger('')
    root_logger.setLevel(logging.DEBUG)
    lformat = logging.Formatter(
        '%(asctime)s %(name)s:%(levelname)s: %(message)s')
    lsh = logging.StreamHandler(sys.stdout)
    lsh.setFormatter(lformat)
    root_logger.addHandler(lsh)


if __name__ == '__main__':
    parser = build_parser()
    setup_logging()  # do here so we can log issues during CLI parsing
    args = parser.parse_args()
    all_start = time.time()
    f2df = read_all_factuality_claims(args.inputFolder)
    assert os.path.exists(args.outFolder), 'Output folder %s must exist' % (args.outFolder)
    assert os.path.isdir(args.outFolder), 'Value for outFolder is not a folder'
    cfg = {}
    if args.config is not None:
        with open(args.config, encoding='utf8') as cf:
            cfg = json.load(cf)
    acredapi_url = cfg['acredapi_url']
    cred_thresh = float(cfg['cred_threshold'])
    cred_path = ['reviewRating']
    for f, df in f2df.items():
        clef_pred = []
        handled_ids = []
        claims = df.to_dict(orient='records')
        for ci, claim in enumerate(claims):
            logger.info('Claim %d of %d in %s' % (ci, len(claims), f))
            cid = int(claim['claim_number'])
            if cid in handled_ids:
                logger.info('Skipping as previously handled')
                continue
            url = '%s/api/v1/claim/predict/credibility?claim=%s' % (
                acredapi_url, claim['normalized_claim'])
            resp = requests.get(url, verify=False)
            resp.raise_for_status()
            claimcreds = resp.json()
            credRating = dictu.get_in(claimcreds[0], cred_path)
            clef_pred.append({
                'id': cid,
                'label': acred_as_clef_label(
                    credRating, cred_thresh)})
            handled_ids.append(cid)
            out_dir = '%s/reviews' % (args.outFolder)
            if not os.path.exists(out_dir):
                print('Creating dir %s for the reviews' % (out_dir))
                os.makedirs(out_dir)
            # write CredibilityReview to outFolder
            fname = f.replace('.txt', '_%s.json' % cid)
            with open('%s/%s' % (out_dir, fname), 'w') as f_out:
                json.dump(claimcreds[0], f_out, indent=2)
        # Finished processing all input files, now write collected ratings
        outf = '%s/%s' % (args.outFolder, f.replace('task2-en-', 'primary-en-'))
        pd.DataFrame(clef_pred).to_csv(
            outf, header=False, index=False, sep='\t')
    total_s = time.time() - all_start
    print('Finished in %.3fs i.e. %.3fclaims/s' % (
        total_s, len(clef_pred)/total_s))
    print('Finished pred_clef2018')
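The core mapping in this script is `acred_as_clef_label`: a credibility value in [-1, 1] becomes TRUE above +thresh, FALSE below -thresh, and HALF-TRUE in between. A simplified standalone re-statement of that mapping (mirroring, not replacing, the function above), which can be exercised directly:

```python
def acred_as_clef_label(ci_cred, thresh=0.4):
    # Mirrors the mapping in pred_clef2018.py: schema.org-style ratings carry
    # '@type'/'ratingValue'; plain dicts carry 'value'.
    val = ci_cred['ratingValue'] if '@type' in ci_cred else ci_cred['value']
    if val >= thresh:
        return 'TRUE'
    elif val <= -thresh:
        return 'FALSE'
    return 'HALF-TRUE'
```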
# --- vnpy_pro/data/source/tdxdata.py (zhangjf76/vnpy, MIT) ---
from datetime import timedelta, datetime
from typing import List

from vnpy.trader.constant import Exchange, Interval
from vnpy.trader.object import BarData, HistoryRequest
from vnpy_pro.data.source.dataapi import SourceDataApi
from vnpy_pro.data.tdx.tdx_future_data import TdxFutureData

INTERVAL_VT2JQ = {
    Interval.MINUTE: "1min",
    Interval.HOUR: "1hour",
    Interval.DAILY: "1day",
}

INTERVAL_ADJUSTMENT_MAP = {
    Interval.MINUTE: timedelta(minutes=1),
    Interval.HOUR: timedelta(hours=1),
    Interval.DAILY: timedelta()  # no need to adjust for daily bar
}


class TdxdataClient(SourceDataApi):
    """TDX (通达信) data source."""

    def __init__(self):
        self.tdx_api = TdxFutureData()

    def init(self, username="", password=""):
        if self.tdx_api.connection_status:
            return True
        return self.tdx_api.connect()

    def query_history(self, req: HistoryRequest):
        symbol = req.symbol
        exchange = req.exchange
        interval = req.interval
        start = req.start
        end = req.end

        tdx_interval = INTERVAL_VT2JQ.get(interval)
        if not tdx_interval:
            return None

        # For adjust timestamp from bar close point (RQData) to open point (VN Trader)
        adjustment = INTERVAL_ADJUSTMENT_MAP[interval]

        # For querying night trading period data
        end += timedelta(1)

        result, bar_data = self.tdx_api.get_bars(
            symbol=symbol,
            period=tdx_interval,
            start_dt=start,
            end_dt=end
        )

        data: List[BarData] = []
        if bar_data is not None:
            for row in bar_data:
                bar = BarData(
                    symbol=symbol,
                    exchange=exchange,
                    interval=interval,
                    datetime=row.datetime.to_pydatetime() - adjustment,
                    open_price=row.open_price,
                    high_price=row.high_price,
                    low_price=row.low_price,
                    close_price=row.close_price,
                    volume=row.volume,
                    open_interest=row.open_interest,
                    gateway_name=row.gateway_name
                )
                data.append(bar)
        return data


tdxdata_client = TdxdataClient()

if __name__ == "__main__":
    tdxdata_client = TdxdataClient()
    req = HistoryRequest(symbol='SR2009',
                         exchange=Exchange.CZCE,
                         start=datetime(2020, 4, 1, 16, 13, 49, 896628),
                         end=datetime(2020, 4, 11, 16, 13, 49, 896628),
                         interval=Interval.MINUTE)
    test_data = tdxdata_client.query_history(req)
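The timestamp shift in `query_history` boils down to subtracting one bar's duration from the close timestamp to recover the open timestamp. A stdlib-only sketch under that assumption (the interval strings and helper name are illustrative):

```python
from datetime import datetime, timedelta

# Amount to subtract from a bar's close timestamp to get its open
# timestamp; daily bars need no shift, mirroring INTERVAL_ADJUSTMENT_MAP.
ADJUSTMENT = {
    "1min": timedelta(minutes=1),
    "1hour": timedelta(hours=1),
    "1day": timedelta(),
}

def bar_open_time(close_time: datetime, interval: str) -> datetime:
    return close_time - ADJUSTMENT[interval]
```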
# --- draalcore/test_utils/upload.py (jojanper/draalcore, MIT) ---
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""File upload utility"""

# System imports
import os

TEST_FILE_IMAGE = os.path.join(os.path.dirname(__file__), 'pic.jpg')
TEST_FILE_CONTENT_HEADER = 'attachment; filename="pic.jpg"'
TEST_FILE_INVALID = os.path.join(os.path.dirname(__file__), 'test.invalid')
TEST_FILE_GIF = os.path.join(os.path.dirname(__file__), 'pic.gif')
TEST_FILE_MP3 = os.path.join(os.path.dirname(__file__), 'audio.mp3')
TEST_FILE_MP4 = os.path.join(os.path.dirname(__file__), 'video.mp4')


def upload_file(api, method='test_upload1', with_file=True, test_file='test1', **kwargs):
    if test_file == 'test1':
        upload_file = TEST_FILE_IMAGE
    elif test_file == 'test3':
        upload_file = TEST_FILE_GIF
    elif test_file == 'audio':
        upload_file = TEST_FILE_MP3
    elif test_file == 'video':
        upload_file = TEST_FILE_MP4
    else:
        upload_file = TEST_FILE_INVALID

    with open(upload_file, 'rb') as fp:
        attachment = {"name": "test upload"}
        if with_file:
            attachment['file'] = fp
        return getattr(api, method)(attachment, **kwargs)
# --- mudpi/utils.py (icyspace/mudpi-core, BSD-4-Clause) ---
import sys
import json
import socket
import inspect
import subprocess
import os  # needed by install_package below; missing in the original

from mudpi.extensions import Component, BaseExtension, BaseInterface


def get_ip():
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # doesn't even have to be reachable
        s.connect(('10.255.255.255', 1))
        IP = s.getsockname()[0]
    except Exception:
        IP = '127.0.0.1'
    finally:
        s.close()
    return IP


def get_module_classes(module_name):
    """ Get all the classes from a module """
    clsmembers = inspect.getmembers(sys.modules[module_name], inspect.isclass)
    return clsmembers


def decode_event_data(message):
    if isinstance(message, dict):
        return message
    elif isinstance(message.decode('utf-8'), str):
        try:
            temp = json.loads(message.decode('utf-8'))
            return temp
        except Exception as error:
            return message.decode('utf-8')  # {'event': 'Unknown', 'data': message}
    else:
        return {'event': 'Unknown', 'data': message}


def install_package(package, upgrade=False, target=None):
    """
    Install a PyPi package with pip in the background.
    Returns boolean.
    """
    pip_args = [sys.executable, '-m', 'pip', 'install', '--quiet', package]
    if upgrade:
        pip_args.append('--upgrade')
    if target:
        pip_args += ['--target', os.path.abspath(target)]
    try:
        return 0 == subprocess.call(pip_args)
    except subprocess.SubprocessError:
        return False


def is_package_installed(package):
    """ Check if a package is already installed """
    reqs = subprocess.check_output([sys.executable, '-m', 'pip', 'freeze'])
    if '==' not in package:
        installed_packages = [r.decode().split('==')[0].lower() for r in reqs.split()]
    else:
        installed_packages = [r.decode().lower() for r in reqs.split()]
    return package in installed_packages


def is_extension(cls):
    """ Check if a class is a MudPi Extension.
    Accepts class or instance of class
    """
    if not inspect.isclass(cls):
        if hasattr(cls, '__class__'):
            cls = cls.__class__
        else:
            return False
    return issubclass(cls, BaseExtension)


def is_interface(cls):
    """ Check if a class is a MudPi Interface.
    Accepts class or instance of class
    """
    if not inspect.isclass(cls):
        if hasattr(cls, '__class__'):
            cls = cls.__class__
        else:
            return False
    return issubclass(cls, BaseInterface)


def is_component(cls):
    """ Check if a class is a MudPi component.
    Accepts class or instance of class
    """
    if not inspect.isclass(cls):
        if hasattr(cls, '__class__'):
            cls = cls.__class__
        else:
            return False
    return issubclass(cls, Component)
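The three-way decode in `decode_event_data` (dict passthrough, JSON bytes, raw-string fallback) can be exercised standalone. This is a simplified sketch of the same behaviour, not the module's exact code:

```python
import json

def decode_event_data(message):
    # Dicts pass through untouched; bytes are tried as JSON first,
    # then returned as plain decoded text on parse failure.
    if isinstance(message, dict):
        return message
    text = message.decode('utf-8')
    try:
        return json.loads(text)
    except ValueError:
        return text
```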
# --- examples/serializers/blowfishtest.py (Jumpscale, Apache-2.0) ---
import os
import struct
from JumpScale import j
import JumpScale.baselib.serializers

j.application.start("blowfishtest")

from random import randrange

msg = ""
for i in range(1000):
    msg += chr(randrange(0, 256))

key = ""
for i in range(56):
    key += chr(randrange(0, 256))

# b means blowfish
s = j.db.serializers.getSerializerType("b", key=key)

nr = 100000
j.base.timer.start()
for i in range(nr):
    data = s.dumps(msg)
j.base.timer.stop(nr)

j.application.stop()
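The `j.base.timer` calls above time serializer throughput over `nr` dumps. The same benchmark loop can be written with the stdlib; `json.dumps` stands in here for the Blowfish serializer, which is specific to JumpScale:

```python
import json
import time

def benchmark(serialize, payload, n=1000):
    # Returns serialize operations per second over n calls.
    start = time.perf_counter()
    for _ in range(n):
        serialize(payload)
    elapsed = time.perf_counter() - start
    return n / elapsed if elapsed > 0 else float('inf')
```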
# --- plico_motor/gui/motor_control_gui.py (ArcetriAdaptiveOptics/plico_motor, MIT) ---
import sys
import plico_motor
from guietta import Gui, _, G, Exceptions


class Runner(object):

    def __init__(self):
        self.motor = None

    def _setUp(self, host='localhost', port=7200, axis=1):

        def moveby(gui):
            nsteps = int(gui.nstepsby)
            if self.motor:
                self.motor.move_by(nsteps)

        def moveto(gui):
            nsteps = int(gui.nstepsto)
            if self.motor:
                self.motor.move_to(nsteps)

        def home(gui):
            if self.motor:
                self.motor.home()

        def getstatus(gui):
            try:
                if self.motor:
                    gui.pos = self.motor.position()
                    gui.status = self.motor.status().as_dict()
                else:
                    gui.pos = '---'
                    gui.status = 'Not connected'
            except Exception as e:
                gui.pos = str(e)
                gui.status = 'Not connected'

        def connect(gui):
            host = gui.host
            port = gui.port
            axis = gui.axis
            self.motor = plico_motor.motor(host, int(port), int(axis))

        connection_gui = Gui(
            ['Host:', '__host__'],
            ['Port:', '__port__'],
            ['Axis:', '__axis__'],
            [['Connect']]
        )
        connection_gui.host = host
        connection_gui.port = port
        connection_gui.axis = axis
        connection_gui.Connect = connect

        control_gui = Gui(
            ['Pos:',      'pos',          'steps'],
            [['Move to'], '__nstepsto__', 'steps'],
            [['Move by'], '__nstepsby__', 'steps'],
            [['Home'],    _,              _],
            ['Status:',   'status',       _], exceptions=Exceptions.OFF
        )
        control_gui.Moveby = moveby
        control_gui.Moveto = moveto
        control_gui.Home = home
        control_gui.timer_start(getstatus, 0.1)

        self.gui = Gui(
            [G('Connection')],
            [G('Motor')]
        )
        self.gui.Connection = connection_gui
        self.gui.Motor = control_gui

    def run(self, argv):
        self._setUp(*argv)
        self.gui.run()

    def terminate(self, signal, frame):
        pass


if __name__ == '__main__':
    runner = Runner()
    sys.exit(runner.run(sys.argv[1:]))
# --- Manager/views.py (Brookstone-Pastoral-Management-System, MIT) ---
from django.shortcuts import render, redirect
from .forms import SeasonForm, CurrentSeasonForm
from django.contrib import messages
from .functions import InitializeOtherSeasonValues
from StudentManager.models import Seasons, Students, CurrentSeason
from django.contrib.auth.decorators import login_required
from Dashboard.decorators import admin


@login_required(login_url='login')
@admin
def createSeason(request):
    if request.method == 'POST':
        form = SeasonForm(request.POST or None, request.FILES or None)
        if form.is_valid():
            Season = str(form.cleaned_data['SeasonName'])
            if Seasons.objects.filter(SeasonName=Season).exists():
                message = "Operation Failed!: Season Name already exists. Try using another name."
            else:
                form.save()
                message = InitializeOtherSeasonValues(Season)
                message = "Season Creation " + message
            if "Failed" in message:
                messages.error(request, message)
            else:
                messages.success(request, message)
        form = SeasonForm()
        return render(request, 'createSeason.html', {'form': form})
    else:
        form = SeasonForm()
        return render(request, 'createSeason.html', {'form': form})


@login_required(login_url='login')
@admin
def changeCurrentSeason(request):
    message = ""
    currentseason = CurrentSeason.objects.get(pk=1)
    form = CurrentSeasonForm(instance=currentseason)
    if request.method == 'POST':
        form = CurrentSeasonForm(request.POST or None, request.FILES or None, instance=currentseason)
        if form.is_valid():
            form.save()
            seasonName = str(form.cleaned_data['Season'])
            message = InitializeOtherSeasonValues(seasonName)
            if "Failed" in message:
                messages.error(request, message)
            else:
                messages.success(request, message)
            return redirect("Manager-changeCurrentSeason")
    else:
        form = CurrentSeasonForm(instance=currentseason)
    return render(request, 'changeCurrentSeason.html', {'form': form})


@login_required(login_url='login')
@admin
def viewSettings(request):
    currentSeason = CurrentSeason.objects.get(pk=1)
    season = currentSeason.Season
    return render(request, 'viewSettings.html', {'season': season})
# --- easy/872-Leaf-Similar Trees.py (Davidxswang/leetcode, Apache-2.0) ---
"""
https://leetcode.com/problems/leaf-similar-trees/
Consider all the leaves of a binary tree. From left to right order, the values of those leaves form a leaf value sequence.
For example, in the given tree above, the leaf value sequence is (6, 7, 4, 9, 8).
Two binary trees are considered leaf-similar if their leaf value sequence is the same.
Return true if and only if the two given trees with head nodes root1 and root2 are leaf-similar.
Constraints:
Both of the given trees will have between 1 and 200 nodes.
Both of the given trees will have values between 0 and 200
"""
# time complexity: O(n), space complexity: O(1)
# Definition for a binary tree node.
# class TreeNode:
# def __init__(self, val=0, left=None, right=None):
# self.val = val
# self.left = left
# self.right = right
class Solution:
def leafSimilar(self, root1: TreeNode, root2: TreeNode) -> bool:
from itertools import zip_longest
return all(x == y for x, y in zip_longest(self.dfs(root1), self.dfs(root2)))
def dfs(self, root: TreeNode):
if root.left is None and root.right is None:
yield root.val
if root.left is not None:
yield from self.dfs(root.left)
if root.right is not None:
yield from self.dfs(root.right)
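A quick standalone check of the generator approach above, using the tree from the problem statement. The tree shapes and the minimal `TreeNode` below are illustrative and redefined here so the snippet runs on its own:

```python
from itertools import zip_longest

class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def leaves(root):
    # Same left-to-right leaf generator as Solution.dfs above.
    if root.left is None and root.right is None:
        yield root.val
    if root.left is not None:
        yield from leaves(root.left)
    if root.right is not None:
        yield from leaves(root.right)

# The example tree: leaf value sequence (6, 7, 4, 9, 8).
t1 = TreeNode(3,
              TreeNode(5, TreeNode(6), TreeNode(2, TreeNode(7), TreeNode(4))),
              TreeNode(1, TreeNode(9), TreeNode(8)))
# A shorter tree sharing only the first three leaves.
t2 = TreeNode(0, TreeNode(6), TreeNode(2, TreeNode(7), TreeNode(4)))

print(list(leaves(t1)))  # [6, 7, 4, 9, 8]
print(all(x == y for x, y in zip_longest(leaves(t1), leaves(t2))))  # False
```

`zip_longest` pads the shorter sequence with `None`, so trees with different leaf counts compare unequal without materialising both sequences up front.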
# Source: server/sections/views.py (repo: ShahriarDhruvo/DU_Hackathon, license: MIT)
from rest_framework.decorators import api_view
from rest_framework.response import Response
from rest_framework.exceptions import (
NotFound,
APIException,
PermissionDenied,
)
from rest_framework.generics import (
CreateAPIView,
DestroyAPIView,
ListAPIView,
UpdateAPIView
)
from .serializers import (
SectionSerializer,
SectionUpdateSerializer
)
from .models import Section
from rooms.models import Room
from django.db.models import Q
class Conflict(APIException):
status_code = 409
    default_code = 'conflict'
    default_detail = 'Item already exists.'
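The `Conflict` class above relies on DRF's pattern of class attributes acting as overridable defaults. The same pattern can be sketched without DRF (class names here are illustrative):

```python
class AppError(Exception):
    # Class attributes act as defaults that subclasses override.
    status_code = 500
    default_detail = "Internal error."

    def __init__(self, detail=None):
        super().__init__(detail or self.default_detail)

class ConflictDemo(AppError):
    status_code = 409
    default_detail = "Item already exists."

err = ConflictDemo()
print(err.status_code, err)  # 409 Item already exists.
```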
class SectionList(ListAPIView):
serializer_class = SectionSerializer
def get_queryset(self):
room_pk = self.kwargs.get('room_pk', None)
queryset = Section.objects.filter(room_id=room_pk).order_by('id')
if queryset:
return queryset
else:
raise NotFound("No section has been created yet")
class SectionCreate(CreateAPIView):
serializer_class = SectionSerializer
def create(self, request, *args, **kwargs):
user_id = request.user.id
room_pk = self.kwargs.get('room_pk', None)
#is_authenticated = request.user.is_staff
is_teacher = Room.objects.filter(teachers=user_id, id=room_pk)
# if not is_authenticated:
if not is_teacher:
raise PermissionDenied(
"You are not authorized to create this section!")
request.data._mutable = True
request.data['room'] = room_pk
request.data._mutable = False
return super(SectionCreate, self).create(request, *args, **kwargs)
class SectionDelete(DestroyAPIView):
lookup_url_kwarg = 'section_pk'
def get_queryset(self):
user_id = self.request.user.id
#section_pk = self.kwargs.get('section_pk', None)
room_pk = self.kwargs.get('room_pk', None)
is_teacher = Room.objects.filter(teachers=user_id, id=room_pk)
if not is_teacher:
raise PermissionDenied(
"You are not authorized to delete this section!")
queryset = Section.objects.filter(room_id=room_pk)#id=section_pk)
if queryset:
return queryset
else:
raise NotFound("Section not found")
class SectionUpdate(UpdateAPIView):
serializer_class = SectionUpdateSerializer
lookup_url_kwarg = 'section_pk'
def get_queryset(self):
user_id = self.request.user.id
#section_pk = self.kwargs.get('section_pk', None)
room_pk = self.kwargs.get('room_pk', None)
is_teacher = Room.objects.filter(teachers=user_id, id=room_pk)
if not is_teacher:
raise PermissionDenied(
"You are not authorized to edit this section!")
queryset = Section.objects.filter(room_id=room_pk)#id=section_pk)
if queryset:
return queryset
else:
raise NotFound("Section not found")
class SectionDetails(ListAPIView):
serializer_class = SectionSerializer
def get_queryset(self):
user_id = self.request.user.id
section_pk = self.kwargs.get('section_pk', None)
room_pk = self.kwargs.get('room_pk', None)
is_teacher = Room.objects.filter(
(Q(teachers=user_id) | Q(students=user_id)), id=room_pk)
if not is_teacher:
raise PermissionDenied(
"You are not authorized to view this section!")
queryset = Section.objects.filter(id=section_pk, room_id=room_pk)
if queryset:
return queryset
else:
raise NotFound("Section not found")
# Source: test-CapitalizeFile.py (repo: shastriUF/appveyor-test, license: MIT)
import CapitalizeFile
import os
import sys
import unittest
from unittest.mock import patch
class TestCapitalization(unittest.TestCase):
def testUnitTestAssert(self):
self.assertEqual(1, 1)
def testCapitalizeFile(self):
with open('.inputFile', 'w') as inputFile:
inputFile.write('Hello world!')
CapitalizeFile.writeToFileInAllCaps('.inputFile', '.outputFile')
with open('.outputFile', 'r') as outputFile:
self.assertEqual(outputFile.read(), 'HELLO WORLD!')
os.remove('.inputFile')
os.remove('.outputFile')
def testCallCapitalizeFileWithBadArguments(self):
with patch.object(sys, 'argv', ["CapitalizeFile.py", "InputFile"]):
with self.assertRaises(RuntimeError):
CapitalizeFile.main()
if __name__ == '__main__':
unittest.main()
# Source: tests/tests.py (repo: ipashchenko/pulses, license: MIT)
from unittest import (TestCase, skipIf)
import numpy as np
from astropy.time import Time, TimeDelta
from pulses.dsp import DSP
from pulses.dedispersion import DeDisperser, noncoherent_dedispersion
from pulses.preprocess import PreProcesser, create_ellipses
from pulses.search import Searcher, search_shear, search_ell
from pulses.pipeline import Pipeline
from pulses.candidate import Candidate
# Test shapes, test direction of increasing DM
class TestAll(TestCase):
def setUp(self):
self.cache_dir = None
self.meta_data = {'exp_code': 'test', 'freq': 'K',
'band': 'U', 'pol': 'LLRR', 'antenna': 'EF'}
self.n_nu = 64
self.n_t = 1000
self.nu_0 = 1668.
self.d_nu = 0.5
self.d_t = 0.001
self.t_0 = Time.now()
self.dsp = DSP(self.n_nu, self.n_t, self.nu_0, self.d_nu, self.d_t,
meta_data=self.meta_data, t_0=self.t_0)
self.std = 1.
self.dm = 400.
self.width = 0.003
self.amp = 1.5
self.t0 = 0.5
self.dsp.add_noise(self.std)
self.dsp.add_pulse(self.t0, self.amp, self.width, self.dm)
self.dm_grid = np.arange(0, 1000, 20, dtype=float)
ddsp = DeDisperser(noncoherent_dedispersion, self.dm_grid,
nu_max=1668, d_nu=0.5, d_t=0.001, threads=4)
self.ddsp = ddsp(self.dsp)
def test_dsp_shape(self):
self.assertEqual(self.dsp.values.shape, (self.n_nu, self.n_t))
def test_dsp_d_t(self):
self.assertEqual(self.dsp.d_t, TimeDelta(self.d_t, format='sec'))
def test_dsp_tfull(self):
self.assertEqual(self.dsp.t[-1],
self.t_0 + (self.n_t-1) * TimeDelta(self.d_t,
format='sec'))
def test_shear(self):
searcher = Searcher(search_shear, mph=3.5, mpd=50, shear=0.4)
candidates = searcher(self.ddsp)
self.assertGreaterEqual(len(candidates), 1)
if len(candidates) == 1:
candidate = candidates[0]
self.assertAlmostEqual(candidate.dm, self.dm, delta=100.)
@skipIf(True, 'Passing')
def test_ell(self):
preprocesser = PreProcesser(create_ellipses, disk_size=3,
threshold_big_perc=90., threshold_perc=97.5,
statistic='mean')
ddsp = preprocesser(self.ddsp)
searcher = Searcher(search_ell, x_stddev=10., y_to_x_stddev=0.3,
theta_lims=[130., 180.], x_cos_theta=3.)
candidates = searcher(ddsp)
self.assertEqual(len(candidates), 1)
candidate = candidates[0]
self.assertAlmostEqual(candidate.dm, self.dm, delta=100.)
class TestDB(TestCase):
def setUp(self):
pipeline = Pipeline(None, None, [None], [None], 'db.sqlite')
candidates = list()
found_dmt = list()
t_0 = 0.
d_t = 0.001
for dm, ix_t in found_dmt:
candidate = Candidate(t_0 +
ix_t * TimeDelta(d_t, format='sec'),
dm)
candidates.append(candidate)
pipeline.save_to_db(candidates)
# Source: src/unpackaged/abm/BuildingTools/model4.py (repo: agdturner/geog5990m, license: Apache-2.0)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
__version__ 1.0.0
"""
#import operator
import random
#import matplotlib.pyplot
import time
def distance_between(a, b):
""" A function to calculate the distance between agent a and agent b.
Args:
a: A list of two coordinates for orthoganol axes.
b: A list of two coordinates for the same orthoganol axes as a.
Returns:
The straight line distance between the a and b in the an plane given
by two orthoganol axes.
"""
distance = ((a[1] - b[1])**2 + (a[0] - b[0])**2)**0.5
##print("distance =", str(distance))
return distance
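A standalone sanity check of the formula (a 3-4-5 right triangle; the helper is redefined here so the snippet runs on its own):

```python
def dist(a, b):
    # Same Euclidean formula as distance_between above.
    return ((a[1] - b[1]) ** 2 + (a[0] - b[0]) ** 2) ** 0.5

print(dist([0, 0], [3, 4]))  # 5.0
```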
'''
Step 1: Initialise parameters
'''
print("Step 1: Initialise parameters")
num_of_agents = 1000
num_of_iterations = 1000
rangey = 100
rangex = 50
deltarange = 10
random_seed = 0 # Try varying this to get different results.
print("num_of_agents", num_of_agents)
print("num_of_iterations", num_of_iterations)
print("rangey", rangey)
print("rangex", rangex)
print("deltarange", deltarange)
print("random_seed", random_seed)
random.seed(random_seed)
'''
Step 2: Initialise agents.
'''
print("Step 2: Initialise agents.")
agents = [] # Create a new empty list for coordinates.
# Populate agents adding agents with random locations
for i in range(num_of_agents):
agents.append([random.randint(0,rangey),random.randint(0,rangex)])
## Print x, y locations of agents
#for i in range(num_of_agents):
# print("agents[" + str(i) + "] y =", agents[i][0], "x =", agents[i][1])
'''
Step 3: Move each agent up to a small (deltarange) random amount in x and y
directions num_of_iterations times. This implements a torus where agents moving
off the bottom move onto the top and those moving off the left move onto the
right and vice versa.
'''
start = time.perf_counter()  # time.clock() was removed in Python 3.8
print("Step 3: Move each agent up to a small (deltarange) random amount in",
"x and y directions num_of_iterations times. This implements a torus",
"where agents moving off the bottom move onto the top and those moving",
"off the left move onto the right and vice versa.")
for j in range(num_of_iterations):
for i in range(num_of_agents):
# Move y
deltay = random.randint(-deltarange, deltarange)
#print("deltay ", deltay)
agents[i][0] = (agents[i][0] + deltay) % rangey
# Move x
deltax = random.randint(-deltarange, deltarange)
#print("deltax ", deltax)
agents[i][1] = (agents[i][1] + deltax) % rangex
## Print x, y locations
#for i in range(num_of_agents):
# #print(str(i), agents[i][0])
# # str(i) is used to force i to be regarded as a string.
# print("agents[" + str(i) + "] y =", agents[i][0], "x =", agents[i][1])
end = time.perf_counter()
print("time = " + str(end - start))
'''
Step 4: Calculate maximum and minimum distance between agents.
'''
print("Step 4: Calculate maximum and minimum distance between agents.")
# Time how long this takes to calculate
start = end
maxdistance = distance_between(agents[0], agents[1])
mindistance = maxdistance
for i in range(num_of_agents):
#for j in range(num_of_agents): # Timed with and without this optimisation
    for j in range(i + 1, num_of_agents):  # skip j == i, which would force mindistance to 0
#for j in range(num_of_agents):
#if (i != j): # Faster without this if statement!
#if (i > j):
# print("i=", i,"j=", j)
distance = distance_between(agents[i], agents[j])
maxdistance = max(maxdistance, distance)
mindistance = min(mindistance, distance)
#print("maxdistance=", maxdistance)
#print("mindistance=", mindistance)
print("maxdistance=", maxdistance)
print("mindistance=", mindistance)
end = time.perf_counter()
print("time = " + str(end - start))
""" This code is commented out as this program was all about testing timings.
'''
Step 5: Calculate, store and print out the element of agents with the
largest and smallest first and second elements.
'''
print("Step 5: Calculate, store and print out the element of agents with the",
"largest and smallest first and second elements.")
maxy = max(agents, key=operator.itemgetter(0))
print("Element of agents with the largest first element", maxy)
miny = min(agents, key=operator.itemgetter(0))
print("Element of agents with the smallest first element", miny)
maxx = max(agents, key=operator.itemgetter(1))
print("Element of agents with the largest second element", maxx)
minx = min(agents, key=operator.itemgetter(1))
print("Element of agents with the smallest second element", minx)
'''
Step 6: Plot agents.
'''
print("Step 6: Plot agents.")
matplotlib.pyplot.ylim(0, rangex) # This is why I think it is odd axis order!
matplotlib.pyplot.xlim(0, rangey)
# Plot all agents
print("Plot all agents black.")
for i in range(num_of_agents):
matplotlib.pyplot.scatter(agents[i][0],agents[i][1], color='black')
# Plot agent with the maxy blue.
print("Plot agent with the maxy blue.")
matplotlib.pyplot.scatter(maxy[0], maxy[1], color='blue')
# Plot agent with the miny red.
print("Plot agent with the miny red.")
matplotlib.pyplot.scatter(miny[0], miny[1], color='red')
# Plot agent with the maxy blue.
print("Plot agent with the maxx pink.")
matplotlib.pyplot.scatter(maxx[0], maxx[1], color='pink')
# Plot agent with the miny red.
print("Plot agent with the minx green.")
matplotlib.pyplot.scatter(minx[0], minx[1], color='green')
matplotlib.pyplot.show()
""" | 36.857143 | 80 | 0.685124 | 820 | 5,418 | 4.469512 | 0.209756 | 0.041473 | 0.036016 | 0.029468 | 0.500955 | 0.459209 | 0.397271 | 0.374079 | 0.332333 | 0.303956 | 0 | 0.015664 | 0.186969 | 5,418 | 147 | 81 | 36.857143 | 0.816345 | 0.235696 | 0 | 0.142857 | 0 | 0 | 0.235324 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.020408 | false | 0 | 0.040816 | 0 | 0.081633 | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1457921a9de9e9b972d2a00acd04f6e5e653bd5f | 13,451 | py | Python | project/app.py | FelippeJC/besa | 083e62457c9c9beccc869beb89675720be0d248a | [
"MIT"
] | null | null | null | project/app.py | FelippeJC/besa | 083e62457c9c9beccc869beb89675720be0d248a | [
"MIT"
] | 2 | 2021-05-10T16:40:12.000Z | 2021-09-05T09:26:07.000Z | project/app.py | FelippeJC/besa | 083e62457c9c9beccc869beb89675720be0d248a | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
import os
from flask import Flask, render_template, current_app
from flask_migrate import Migrate
from flask_restful import Api
from config import Config, DevelopmentConfig, TestingConfig
from .database import db
from datetime import datetime
import random
import folium
import pandas as pd
from .opendata.objects import (get_barcelona_map, get_bicycle_map_layer,
get_mercats_i_fires_al_carrer_map_layer,
get_public_wifi_map_layer,
get_bus_stations_map_layer,
Data,
)
##########################################################################
# Configurations
##########################################################################
app = Flask(__name__)
# Sets the configurations
if os.environ.get('FLASK_ENV', None) == 'production':
config = Config()
elif os.environ.get('FLASK_ENV', None) == 'test':
config = TestingConfig()
else:
config = DevelopmentConfig()
app.config.from_object(config)
# Defines the API
api = Api(app)
# Sets the database
db.init_app(app)
# Sets flask-migrate
migrate = Migrate(app, db)
@app.before_first_request
def create_tables():
db.create_all()
# HTTP error handling
@app.errorhandler(404)
def not_found(error):
return render_template('404.html'), 404
##########################################################################
# Views
##########################################################################
@app.route("/")
def index():
return render_template('index.html', data=[20, 40, 40])
@app.route("/city-initiatives")
def city_initiatives():
return render_template('city_initiatives.html')
@app.route("/city-amenities")
def city_amenities():
bcn_map = get_barcelona_map()
bcn_map.add_child(get_mercats_i_fires_al_carrer_map_layer())
wifi_map_layer = get_public_wifi_map_layer()
if wifi_map_layer is not None:
bcn_map.add_child(wifi_map_layer)
folium.LayerControl().add_to(bcn_map)
return render_template('city_amenities.html', folium_map=bcn_map._repr_html_())
@app.route("/about")
def about():
return render_template('about.html')
### ENVIRONMENT ###
@app.route("/temperature")
def temperature():
try:
# temp = Data("resource_id=0e3b6840-7dff-4731-a556-44fac28a7873&limit=400")
# temp_df = temp.df
temp_df = pd.read_csv(current_app.open_resource("static/src/temperaturesbarcelonadesde1780.csv"))
temp_df = temp_df.astype(float)
        temp_df = temp_df.astype({'Any': 'int32'})
if "_id" in temp_df.columns:
temp_df.drop('_id', axis=1, inplace=True)
temp_df.set_index('Any', inplace=True)
temp_df.rename(columns={'Temp_Mitjana_Gener': 'January',
'Temp_Mitjana_Febrer': 'February',
'Temp_Mitjana_Marc': 'March',
'Temp_Mitjana_Abril': 'April',
'Temp_Mitjana_Maig': 'May',
'Temp_Mitjana_Juny': 'June',
'Temp_Mitjana_Juliol': 'July',
'Temp_Mitjana_Agost': 'August',
'Temp_Mitjana_Setembre': 'September',
'Temp_Mitjana_Octubre': 'October',
'Temp_Mitjana_Novembre': 'November',
'Temp_Mitjana_Desembre': 'December'},
inplace=True)
dataset = list()
for (columnName, columnData) in temp_df.iteritems():
color_r, color_g, color_b = (random.randint(0, 255), random.randint(0, 255), random.randint(0, 255))
dataset.append({"label": columnName,
"lineTension": 0.3,
"backgroundColor": "rgba({r}, {g}, {b}, 0.05)".format(r=color_r, g=color_g, b=color_b),
"borderColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"borderWidth": 1,
"pointRadius": 1,
"pointBackgroundColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"pointBorderColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"pointHoverRadius": 1,
"pointHoverBackgroundColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"pointHoverBorderColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"pointHitRadius": 2,
"pointBorderWidth": 1,
"data": list(columnData)
})
year_average = dict(temp_df.mean(axis=1))
yearMaxValue = max(year_average.items(), key=lambda x: x[1])
average_temperatures = dict(temp_df.mean())
itemMinValue = min(average_temperatures.items(), key=lambda x: x[1])
itemMaxValue = max(average_temperatures.items(), key=lambda x: x[1])
# radar graph
average_temperatures = sorted(average_temperatures.items(), key=lambda kv: (datetime.strptime(kv[0], '%B'), kv[1]))
average_temperatures_labels = list()
average_temperatures_data = list()
average_temperatures_data_colors = list()
for label, value in average_temperatures:
average_temperatures_data_colors.append("#" + ''.join([random.choice('0123456789ABCDEF') for j in range(6)]))
average_temperatures_labels.append(label)
average_temperatures_data.append(value)
radar_graph_data = {"labels": average_temperatures_labels,
"datasets": [{
"label": "Average temperatures over time",
"data": average_temperatures_data,
"backgroundColor": average_temperatures_data_colors,
"hoverBackgroundColor": average_temperatures_data_colors,
"hoverBorderColor": "rgba(234, 236, 244, 0.1)",
}],
}
labels = list(temp_df.index)
return render_template('temperature.html',
label=labels,
data=dataset,
radar_graph_data=radar_graph_data,
itemMinValue="{month} ({temp:.2f} °C)".format(month=itemMinValue[0], temp=itemMinValue[1]),
itemMaxValue="{month} ({temp:.2f} °C)".format(month=itemMaxValue[0], temp=itemMaxValue[1]),
amount_of_data=len(labels),
yearMaxValue="{year} with {temp:.2f} °C average".format(year=int(yearMaxValue[0]), temp=yearMaxValue[1]),)
    except Exception:
return render_template('blank.html')
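The coldest/hottest lookups in the route above use `min`/`max` over `dict.items()` keyed on the value, yielding a `(label, value)` pair; a standalone sketch with toy numbers:

```python
monthly_avg = {"January": 8.9, "July": 24.1, "October": 17.3}
coldest = min(monthly_avg.items(), key=lambda kv: kv[1])
hottest = max(monthly_avg.items(), key=lambda kv: kv[1])
print(coldest)  # ('January', 8.9)
print(hottest)  # ('July', 24.1)
```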
@app.route("/precipitation")
def precipitation():
try:
# data = Data("resource_id=6f1fb778-0767-478b-b332-c64a833d26d2&limit=400")
data_df = pd.read_csv(current_app.open_resource("static/src/precipitacionsbarcelonadesde1786.csv"))
data_df = data_df.astype(float)
        data_df = data_df.astype({'Any': 'int32'})
if "_id" in data_df.columns:
data_df.drop('_id', axis=1, inplace=True)
data_df.set_index('Any', inplace=True)
data_df.rename(columns={'Precip_Acum_Gener': 'January',
'Precip_Acum_Febrer': 'February',
'Precip_Acum_Marc': 'March',
'Precip_Acum_Abril': 'April',
'Precip_Acum_Maig': 'May',
'Precip_Acum_Juny': 'June',
'Precip_Acum_Juliol': 'July',
'Precip_Acum_Agost': 'August',
'Precip_Acum_Setembre': 'September',
'Precip_Acum_Octubre': 'October',
'Precip_Acum_Novembre': 'November',
'Precip_Acum_Desembre': 'December'},
inplace=True)
dataset = list()
for (columnName, columnData) in data_df.iteritems():
color_r, color_g, color_b = (random.randint(0, 255), random.randint(0, 255), random.randint(0, 255))
dataset.append({"label": columnName,
"lineTension": 0.3,
"backgroundColor": "rgba({r}, {g}, {b}, 0.05)".format(r=color_r, g=color_g, b=color_b),
"borderColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"borderWidth": 1,
"pointRadius": 1,
"pointBackgroundColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"pointBorderColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"pointHoverRadius": 1,
"pointHoverBackgroundColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"pointHoverBorderColor": "rgba({r}, {g}, {b}, 1)".format(r=color_r, g=color_g, b=color_b),
"pointHitRadius": 2,
"pointBorderWidth": 1,
"data": list(columnData)
})
average_precipitations = dict(data_df.mean())
itemMinValue = min(average_precipitations.items(), key=lambda x: x[1])
itemMaxValue = max(average_precipitations.items(), key=lambda x: x[1])
labels = list(data_df.index)
year_average = dict(data_df.mean(axis=1))
yearMaxValue = max(year_average.items(), key=lambda x: x[1])
# Radar
radar_labels = list()
radar_values = list()
radar_colors = list()
for label, value in sorted(year_average.items(), key=lambda x: int(x[0])):
radar_labels.append(label)
radar_values.append(value)
radar_colors.append("#" + ''.join([random.choice('0123456789ABCDEF') for j in range(6)]))
radar_graph_data = {"labels": radar_labels,
"datasets": [{
"label": "Average yearly precipitation",
"data": radar_values,
"backgroundColor": radar_colors,
"hoverBackgroundColor": radar_colors,
"hoverBorderColor": "rgba(234, 236, 244, 0.1)",
}],
}
return render_template('precipitation.html',
label=labels,
data=dataset,
radar_data=radar_graph_data,
itemMinValue="{month} ({temp:.2f} mm)".format(month=itemMinValue[0], temp=itemMinValue[1]),
itemMaxValue="{month} ({temp:.2f} mm)".format(month=itemMaxValue[0], temp=itemMaxValue[1]),
amount_of_data=len(labels),
yearMaxValue="{year} with {temp:.2f} mm average".format(year=int(yearMaxValue[0]), temp=yearMaxValue[1]))
    except Exception:
return render_template('blank.html')
@app.route("/city-trees")
def city_trees():
return render_template('city_trees.html')
@app.route("/green-spaces")
def green_spaces():
return render_template('green_spaces.html')
@app.route("/waste-management")
def waste_management():
return render_template('waste_management.html')
### MOBILITY ###
# @app.route("/city-flow")
# def city_flow():
# return render_template('city_flow.html')
@app.route("/public-transportation")
def public_transportation():
bcn_map = get_barcelona_map()
bus_map_layer = get_bus_stations_map_layer()
if bus_map_layer is not None:
bcn_map.add_child(bus_map_layer)
folium.LayerControl().add_to(bcn_map)
return render_template('public_transportation.html', folium_map=bcn_map._repr_html_())
@app.route("/bicycle")
def bicycle():
bcn_map = get_barcelona_map()
bicycle_map_layer = get_bicycle_map_layer()
if bicycle_map_layer is not None:
bcn_map.add_child(bicycle_map_layer)
folium.LayerControl().add_to(bcn_map)
return render_template('bicycle.html', folium_map=bcn_map._repr_html_())
@app.route("/car")
def car():
return render_template('car.html')
@app.route("/traffic-incidents")
def traffic_incidents():
return render_template('traffic_incidents.html')
### POPULATION ###
# @app.route("/demography")
# def demography():
# return render_template('demography.html')
# @app.route("/society-and-welfare")
# def society_and_welfare():
# return render_template('society_and_welfare.html')
@app.route("/blank")
def blank():
return render_template('blank.html')
if __name__ == '__main__':
# Bind to PORT if defined, otherwise default to 5000.
port = int(os.environ.get('PORT', 5000))
app.run(host='0.0.0.0', port=port)
# Source: orms_tools/sqlalchemy_tools/sa_combine.py (repo: biobdeveloper/ormc, license: MIT)
import datetime
from typing import List
from sqlalchemy.orm import ColumnProperty, DeclarativeMeta, class_mapper
from core.combine import AbstractModelCombine
from core.db_primitives import CoreField, CoreModel
from .sa_base import SABase, metadata, sa
class SQLAlchemyModelCombine(AbstractModelCombine):
metaclass = DeclarativeMeta
def __init__(self, *args):
super().__init__(*args)
@classmethod
def is_model(cls, model):
return issubclass(model.__class__, cls.metaclass) and hasattr(
model, "__tablename__"
)
@classmethod
def integer(cls, **kwargs):
return sa.Integer
@classmethod
def character(cls, **kwargs):
if kwargs.get("length"):
return sa.String(**kwargs)
else:
return sa.Text
@classmethod
def boolean(cls, **kwargs):
return sa.Boolean
@classmethod
def float(cls, **kwargs):
return sa.Float
@classmethod
def bytes(cls, **kwargs):
return sa.BINARY(**kwargs)
@classmethod
def decimal(cls, **kwargs):
return sa.DECIMAL(**kwargs)
@classmethod
def date(cls, **kwargs):
for key in ('auto_on_create', 'auto_on_update'):
try:
kwargs.pop(key)
except KeyError:
pass
return sa.Date(**kwargs)
@classmethod
def datetime(cls, **kwargs):
for key in ('auto_on_create', 'auto_on_update'):
try:
kwargs.pop(key)
except KeyError:
pass
return sa.DateTime(**kwargs)
def get_fields(self, model) -> List[ColumnProperty]:
fields = []
mapper = class_mapper(model)
for v in mapper.iterate_properties:
if isinstance(v, ColumnProperty):
fields.append(v)
return fields
def to_core_model(self, model: SABase) -> CoreModel:
"""Convert SQLAlchemy Model to CoreModel"""
model_kwargs = {
"tablename": model.__tablename__,
"doc": model.__doc__,
"fields": [self.to_core_field(field) for field in self.get_fields(model)],
}
unique_together = []
if hasattr(model, "__table_args__"):
for arg in model.__table_args__:
if isinstance(arg, sa.UniqueConstraint) and len(arg.columns) > 1:
unique_together.append(tuple([col.name for col in arg.columns]))
if unique_together:
model_kwargs["unique_together"] = tuple(unique_together)
return CoreModel(**model_kwargs)
def to_core_field(self, field: ColumnProperty) -> CoreField:
"""Convert CoreField to SQLAlchemy field"""
spec_params = {}
if len(field.columns) != 1:
            raise ValueError("expected exactly one column per ColumnProperty")
column = field.expression
sql_type = None
for core_type in self.type_map.keys():
if core_type == column.type.python_type:
sql_type = core_type
break
if not sql_type:
            raise TypeError(f"no core type mapped for {column.type.python_type!r}")
if column.default:
default_value = column.default.arg
else:
default_value = None
if isinstance(column.type, (sa.Float, sa.DECIMAL)):
if column.type.precision:
spec_params["precision"] = column.type.precision
if column.type.scale:
spec_params["scale"] = column.type.scale
if isinstance(column.type, (sa.Date, sa.DateTime)):
if hasattr(column.default, 'arg') and column.default.arg.__name__ == 'utcnow':
spec_params["auto_on_create"] = True
if hasattr(column.onupdate, 'arg') and column.onupdate.arg.__name__ == 'utcnow':
spec_params['auto_on_update'] = True
try:
foreign_key_column = column.foreign_keys.pop().column
foreign_key = f"{foreign_key_column.table.name.capitalize()}.{foreign_key_column.name}"
except KeyError:
foreign_key = None
core_field = CoreField(
name=column.name,
nullable=column.nullable,
primary_key=column.primary_key,
doc=column.doc,
sql_type=sql_type,
unique=column.unique,
default=default_value,
foreign_key=foreign_key,
**spec_params,
)
return core_field
def from_core_model(self, model: CoreModel) -> SABase:
"""Convert CoreModel to SQLAlchemy Model"""
fields = [self.from_core_field(field) for field in model.fields]
fields_as_dict = {f.name: f for f in fields}
unique_together = []
model_kwargs = {
"__module__": model.__module__,
"__qualname__": model.tablename.capitalize(),
"__doc__": model.__doc__,
"__tablename__": model.tablename,
**fields_as_dict,
}
for unique_seq in model.unique_together:
unique_together.append(sa.UniqueConstraint(*unique_seq))
if unique_together:
model_kwargs["__table_args__"] = tuple(unique_together)
sa_model = DeclarativeMeta(
model.tablename.capitalize(),
(SABase,),
model_kwargs,
)
return sa_model
def from_core_field(self, field: CoreField) -> sa.Column:
"""Convert SQLAlchemy field to CoreField"""
sa_type = self.type_map[field.sql_type]
sa_type_instance = sa_type(**field.spec_params)
relations = []
if field.foreign_key:
relations.append(
sa.ForeignKey(
column=field.foreign_key.lower(),
name=field.name,
),
)
column_kwargs = dict(
name=field.name,
primary_key=field.primary_key,
default=field.default,
doc=field.doc,
unique=field.unique,
nullable=field.nullable,
)
if field.spec_params.get('auto_on_create'):
column_kwargs['default'] = datetime.datetime.utcnow
if field.spec_params.get('auto_on_update'):
column_kwargs['onupdate'] = datetime.datetime.utcnow
sa_field = sa.Column(sa_type_instance, *relations, **column_kwargs)
return sa_field
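The key-scan over `self.type_map` in `to_core_field` is effectively a dict lookup on `column.type.python_type`. A stdlib-only sketch of that dispatch (the mapping values are illustrative stand-ins, not SQLAlchemy types):

```python
import datetime

# Illustrative Python-type -> SQL-type-name mapping; the real type_map
# maps Python types to SQLAlchemy type classes.
TYPE_MAP = {
    int: "INTEGER",
    str: "VARCHAR",
    float: "FLOAT",
    datetime.date: "DATE",
}


def lookup_sql_type(python_type):
    # Equivalent to scanning type_map keys and breaking on the first match.
    try:
        return TYPE_MAP[python_type]
    except KeyError:
        raise TypeError("unsupported Python type: %r" % (python_type,))
```

A direct dict lookup keeps the "unsupported type" failure explicit instead of falling through a loop with a bare `raise`.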
145e5344e9bd1be8c32e4a85709baff72526841b | 5791 bytes | py | Python | research/object_detection/dataset_tools/create_modalnet_tf_record.py | qa276390/tf-models | ["Apache-2.0"]
# Copyright 2017 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import re
import hashlib
import io
import logging
import os
from lxml import etree
import PIL.Image
import tensorflow as tf
from object_detection.utils import dataset_util
from object_detection.utils import label_map_util
class Object:
    def __init__(self, xmin, ymin, xmax, ymax, label):
        self.xmin = xmin
        self.ymin = ymin
        self.xmax = xmax
        self.ymax = ymax
        self.label = label


class Image:
    def __init__(self, filename):
        self.filename = filename
        self.Objects = []
        self.length = 0

    def append(self, Object):
        self.Objects.append(Object)
        self.length += 1


flags = tf.app.flags
flags.DEFINE_string(
    'data_dir', '', 'Root directory to raw PASCAL VOC dataset.')
flags.DEFINE_string('set', 'train', 'Convert training set, validation set or '
                    'merged set.')
flags.DEFINE_string('annotations_file', 'Annotations',
                    '(Relative) path to annotations directory.')
flags.DEFINE_string('output_path', '', 'Path to output TFRecord')
flags.DEFINE_boolean('val', False, 'Valid or not')
FLAGS = flags.FLAGS

SETS = ['train', 'val', 'trainval', 'test']
VAL = FLAGS.val


def read_anno(filepath):
    Image_list = []
    with open(filepath) as fp:
        line = fp.readline()
        while line:
            arr = line.split()
            # Raw strings avoid invalid escape-sequence warnings for \A.
            pattern2 = re.compile(r"\Atrain_images/2")
            pattern20 = re.compile(r"\Atrain_images/20")
            if VAL and not pattern20.match(arr[0]):
                line = fp.readline()
                continue
            elif not VAL and (not pattern2.match(arr[0]) or pattern20.match(arr[0])):
                line = fp.readline()
                continue
            IM = 0
            print(arr[0])
            for x in arr:
                if IM == 0:
                    IM = Image(x.split('/')[1])
                else:
                    feats = x.split(',')
                    Ob = Object(feats[0], feats[1],
                                feats[2], feats[3], feats[4])
                    IM.append(Ob)
            Image_list.append(IM)
            line = fp.readline()
    return Image_list
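The train/validation split in `read_anno` hinges on two anchored regex prefixes: paths starting with `train_images/20` go to validation, paths starting with `train_images/2` (but not `20`) go to training. That filter in isolation, separated from the file-reading loop (the helper name is mine):

```python
import re

# Anchored prefixes, as used in read_anno.
PATTERN2 = re.compile(r"\Atrain_images/2")
PATTERN20 = re.compile(r"\Atrain_images/20")


def keep_path(path, val):
    """Return True when the annotation line belongs to the requested split."""
    if val:
        return bool(PATTERN20.match(path))
    return bool(PATTERN2.match(path)) and not PATTERN20.match(path)
```

Note the asymmetry: validation is the `20...` subset, training is the `2...` superset minus that subset, so the two splits are disjoint by construction.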
def dict_to_tf_example(data, dataset_directory):
    img_path = data.filename
    full_path = os.path.join(dataset_directory, img_path)
    with tf.gfile.GFile(full_path, 'rb') as fid:
        encoded_jpg = fid.read()
    encoded_jpg_io = io.BytesIO(encoded_jpg)
    image = PIL.Image.open(encoded_jpg_io)
    if image.format != 'JPEG':
        raise ValueError('Image format not JPEG')
    key = hashlib.sha256(encoded_jpg).hexdigest()

    width = 400
    height = 600
    xmin = []
    ymin = []
    xmax = []
    ymax = []
    classes = []
    classes_text = []
    truncated = []
    poses = []
    if data.length != 0:
        for obj in data.Objects:
            xmin.append(float(obj.xmin) / width)
            ymin.append(float(obj.ymin) / height)
            xmax.append(float(obj.xmax) / width)
            ymax.append(float(obj.ymax) / height)
            classes.append(int(obj.label) + 1)
    print(data.filename + " " + str(data.length))
    example = tf.train.Example(features=tf.train.Features(feature={
        'image/height': dataset_util.int64_feature(height),
        'image/width': dataset_util.int64_feature(width),
        'image/filename': dataset_util.bytes_feature(
            data.filename.encode('utf8')),
        'image/source_id': dataset_util.bytes_feature(
            data.filename.encode('utf8')),
        'image/key/sha256': dataset_util.bytes_feature(key.encode('utf8')),
        'image/encoded': dataset_util.bytes_feature(encoded_jpg),
        'image/format': dataset_util.bytes_feature('jpeg'.encode('utf8')),
        'image/object/bbox/xmin': dataset_util.float_list_feature(xmin),
        'image/object/bbox/xmax': dataset_util.float_list_feature(xmax),
        'image/object/bbox/ymin': dataset_util.float_list_feature(ymin),
        'image/object/bbox/ymax': dataset_util.float_list_feature(ymax),
        'image/object/class/label': dataset_util.int64_list_feature(classes)}))
    return example
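`dict_to_tf_example` normalizes pixel-space boxes against a hard-coded 400x600 canvas before serialization, since the object-detection TFRecord format stores coordinates in [0, 1]. That arithmetic in isolation (the helper name is mine):

```python
WIDTH, HEIGHT = 400, 600  # hard-coded image size, as in dict_to_tf_example


def normalize_box(xmin, ymin, xmax, ymax, width=WIDTH, height=HEIGHT):
    # Divide x-coordinates by width and y-coordinates by height to get
    # the [0, 1] values the TFRecord bbox features expect.
    return (xmin / width, ymin / height, xmax / width, ymax / height)
```

Hard-coding the size only works because every ModaNet image here shares the same dimensions; for mixed sizes, `image.size` from the decoded PIL image would be the safer source.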
def main(_):
    if FLAGS.set not in SETS:
        raise ValueError('set must be in : {}'.format(SETS))
    data_dir = FLAGS.data_dir
    writer = tf.python_io.TFRecordWriter(FLAGS.output_path)
    examples_list = read_anno(FLAGS.annotations_file)
    print('start!')
    for idx, example in enumerate(examples_list):
        if idx % 10 == 0:
            logging.info('On image %d of %d', idx, len(examples_list))
        tf_example = dict_to_tf_example(example, FLAGS.data_dir)
        writer.write(tf_example.SerializeToString())
        print('On image %d of %d' % (idx, len(examples_list)))
    writer.close()


if __name__ == '__main__':
    tf.app.run()
14600f8835996a9b20a3a6f88d1ded4994fdafbf | 6752 bytes | py | Python | oseoserver/settings.py | pyoseo/oseoserver | ["Apache-2.0"]
"""Providing custom values for oseoserver's settings."""
from django.conf import settings
from . import constants
def _get_setting(parameter, default_value):
    return getattr(settings, parameter, default_value)
def get_mail_recipient_handler():
    return _get_setting("OSEOSERVER_MAIL_RECIPIENT_HANDLER", None)


def get_processing_class():
    return _get_setting(
        "OSEOSERVER_PROCESSING_CLASS",
        "oseoserver.orderpreparation.ExampleOrderProcessor"
    )


def get_max_order_items():
    return _get_setting("OSEOSERVER_MAX_ORDER_ITEMS", 200)


def get_max_active_items():
    return _get_setting("OSEOSERVER_MAX_ACTIVE_ITEMS", 400)


def get_massive_order_max_size():
    return _get_setting("OSEOSERVER_MASSIVE_ORDER_MAX_SIZE", 1000)


def get_product_order():
    return _get_setting(
        "OSEOSERVER_PRODUCT_ORDER",
        {
            "enabled": False,
            "automatic_approval": False,
            "item_processor": "oseoserver.orderpreparation."
                              "exampleorderprocessor.ExampleOrderProcessor",
            "item_availability_days": 10,
            "notifications": {
                "moderation": False,
                "item_availability": False,
                "batch_availability": None,
            }
        }
    )


def get_subscription_order():
    return _get_setting(
        "OSEOSERVER_SUBSCRIPTION_ORDER",
        {
            "enabled": False,
            "automatic_approval": False,
            "item_processor": "oseoserver.orderpreparation."
                              "exampleorderprocessor.ExampleOrderProcessor",
            "item_availability_days": 10,
            "notifications": {
                "moderation": True,
                "item_availability": False,
                "batch_availability": "daily",
            }
        }
    )


def get_tasking_order():
    return _get_setting(
        "OSEOSERVER_TASKING_ORDER",
        {
            "enabled": False,
            "automatic_approval": False,
            "item_processor": "oseoserver.orderpreparation."
                              "exampleorderprocessor.ExampleOrderProcessor",
            "item_availability_days": 10,
            "notifications": {
                "moderation": True,
                "item_availability": False,
                "batch_availability": "immediate",
            }
        }
    )


def get_massive_order():
    return _get_setting(
        "OSEOSERVER_MASSIVE_ORDER",
        {
            "enabled": False,
            "automatic_approval": False,
            "item_processor": "oseoserver.orderpreparation."
                              "exampleorderprocessor.ExampleOrderProcessor",
            "item_availability_days": 10,
            "notifications": {
                "moderation": True,
                "item_availability": False,
                "batch_availability": "immediate",
            }
        }
    )


def get_processing_options():
    return _get_setting(
        "OSEOSERVER_PROCESSING_OPTIONS",
        [
            {
                "name": "dummy option",
                "description": "A dummy option",
                "multiple_entries": False,
                "choices": ["first", "second"],
            }
        ]
    )


def get_online_data_access_options():
    return _get_setting(
        "OSEOSERVER_ONLINE_DATA_ACCESS_OPTIONS",
        [
            {
                "protocol": constants.DeliveryOptionProtocol.FTP.value,
                "fee": 0,
            },
            {
                "protocol": constants.DeliveryOptionProtocol.HTTP.value,
                "fee": 0,
            }
        ]
    )


def get_online_data_delivery_options():
    return _get_setting(
        "OSEOSERVER_ONLINE_DATA_DELIVERY_OPTIONS",
        [
            {
                "protocol": constants.DeliveryOptionProtocol.FTP.value,
                "fee": 0,
            },
        ]
    )


def get_media_delivery_options():
    return _get_setting(
        "OSEOSERVER_MEDIA_DELIVERY_OPTIONS",
        {
            "media": [
                {
                    "type": constants.DeliveryMedium.CD_ROM,
                    "fee": 0,
                }
            ],
            "shipping": [
                constants.DeliveryMethod.ALL_READY,
            ]
        }
    )


def get_payment_options():
    return _get_setting(
        "OSEOSERVER_PAYMENT_OPTIONS",
        [
            {
                "name": "dummy payment option",
                "description": "A dummy payment option",
                "multiple_entries": False,
                "choices": None,
            }
        ]
    )


def get_collections():
    return _get_setting(
        "OSEOSERVER_COLLECTIONS",
        [
            {
                "name": "dummy collection",
                "catalogue_endpoint": "http://localhost",
                "collection_identifier": "dummy_collection_id",
                "product_price": 0,
                "generation_frequency": "Once per hour",
                "product_order": {
                    "enabled": False,
                    "order_processing_fee": 0,
                    "options": [],
                    "online_data_access_options": [],
                    "online_data_delivery_options": [],
                    "media_delivery_options": [],
                    "payment_options": [],
                    "scene_selection_options": [],
                },
                "subscription_order": {
                    "enabled": False,
                    "order_processing_fee": 0,
                    "options": [],
                    "online_data_access_options": [],
                    "online_data_delivery_options": [],
                    "media_delivery_options": [],
                    "payment_options": [],
                    "scene_selection_options": [],
                },
                "tasking_order": {
                    "enabled": False,
                    "order_processing_fee": 0,
                    "options": [],
                    "online_data_access_options": [],
                    "online_data_delivery_options": [],
                    "media_delivery_options": [],
                    "payment_options": [],
                    "scene_selection_options": [],
                },
                "massive_order": {
                    "enabled": False,
                    "order_processing_fee": 0,
                    "options": [],
                    "online_data_access_options": [],
                    "online_data_delivery_options": [],
                    "media_delivery_options": [],
                    "payment_options": [],
                    "scene_selection_options": [],
                },
            },
        ]
    )
1461a6da7a9beed86ae1959d1a45ad20a0e024f7 | 90,700 | py | Python | ietf/ydk/models/ietf/iana_if_type.py | CiscoDevNet/ydk-py | 073731fea50694d0bc6cd8ebf10fec308dcc0aa9 | [
"ECL-2.0",
"Apache-2.0"
] | 177 | 2016-03-15T17:03:51.000Z | 2022-03-18T16:48:44.000Z | ietf/ydk/models/ietf/iana_if_type.py | CiscoDevNet/ydk-py | 073731fea50694d0bc6cd8ebf10fec308dcc0aa9 | [
"ECL-2.0",
"Apache-2.0"
] | 18 | 2016-03-30T10:45:22.000Z | 2020-07-14T16:28:13.000Z | ietf/ydk/models/ietf/iana_if_type.py | CiscoDevNet/ydk-py | 073731fea50694d0bc6cd8ebf10fec308dcc0aa9 | [
"ECL-2.0",
"Apache-2.0"
] | 85 | 2016-03-16T20:38:57.000Z | 2022-02-22T04:26:02.000Z | """ iana_if_type
This YANG module defines YANG identities for IANA\-registered
interface types.
This YANG module is maintained by IANA and reflects the
'ifType definitions' registry.
The latest revision of this YANG module can be obtained from
the IANA web site.
Requests for new values should be made to IANA via
email (iana@iana.org).
Copyright (c) 2014 IETF Trust and the persons identified as
authors of the code. All rights reserved.
Redistribution and use in source and binary forms, with or
without modification, is permitted pursuant to, and subject
to the license terms contained in, the Simplified BSD License
set forth in Section 4.c of the IETF Trust's Legal Provisions
Relating to IETF Documents
(http\://trustee.ietf.org/license\-info).
The initial version of this YANG module is part of RFC 7224;
see the RFC itself for full legal notices.
"""
from collections import OrderedDict
from ydk.types import Entity, EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.filters import YFilter
from ydk.errors import YError, YModelError
from ydk.errors.error_handler import handle_type_error as _handle_type_error
from ydk.models.ietf.ietf_interfaces import InterfaceType
class IanaInterfaceType(InterfaceType):
    """
    This identity is used as a base for all interface types
    defined in the 'ifType definitions' registry.
    """

    _prefix = 'ianaift'
    _revision = '2014-05-08'

    def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iana-interface-type"):
        super(IanaInterfaceType, self).__init__(ns, pref, tag)
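Every identity class below repeats one shape: fixed namespace and prefix defaults plus a per-identity tag, all forwarded to the base class constructor. A minimal stdlib sketch of that shape (the `Identity` base and the names here are stand-ins, not ydk's):

```python
class Identity:
    """Stand-in for ydk's Identity base in this sketch."""
    def __init__(self, ns, pref, tag):
        self.ns, self.pref, self.tag = ns, pref, tag


class EthernetLike(Identity):
    # Defaults mirror the pattern used by the iana-if-type classes: the
    # namespace and prefix are shared, only the tag varies per identity.
    def __init__(self, ns="urn:example:ns", pref="ex", tag="ex:ethernet-like"):
        super(EthernetLike, self).__init__(ns, pref, tag)
```

Instantiating an identity with no arguments therefore yields its registered tag, which is how these classes are typically assigned to an interface's `type` leaf.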
class Other(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:other"):
super(Other, self).__init__(ns, pref, tag)
class Regular1822(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:regular1822"):
super(Regular1822, self).__init__(ns, pref, tag)
class Hdh1822(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hdh1822"):
super(Hdh1822, self).__init__(ns, pref, tag)
class DdnX25(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ddnX25"):
super(DdnX25, self).__init__(ns, pref, tag)
class Rfc877x25(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:rfc877x25"):
super(Rfc877x25, self).__init__(ns, pref, tag)
class EthernetCsmacd(IanaInterfaceType):
"""
For all Ethernet\-like interfaces, regardless of speed,
as per RFC 3635.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ethernetCsmacd"):
super(EthernetCsmacd, self).__init__(ns, pref, tag)
class Iso88023Csmacd(IanaInterfaceType):
"""
Deprecated via RFC 3635.
Use ethernetCsmacd(6) instead.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iso88023Csmacd"):
super(Iso88023Csmacd, self).__init__(ns, pref, tag)
class Iso88024TokenBus(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iso88024TokenBus"):
super(Iso88024TokenBus, self).__init__(ns, pref, tag)
class Iso88025TokenRing(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iso88025TokenRing"):
super(Iso88025TokenRing, self).__init__(ns, pref, tag)
class Iso88026Man(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iso88026Man"):
super(Iso88026Man, self).__init__(ns, pref, tag)
class StarLan(IanaInterfaceType):
"""
Deprecated via RFC 3635.
Use ethernetCsmacd(6) instead.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:starLan"):
super(StarLan, self).__init__(ns, pref, tag)
class Proteon10Mbit(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:proteon10Mbit"):
super(Proteon10Mbit, self).__init__(ns, pref, tag)
class Proteon80Mbit(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:proteon80Mbit"):
super(Proteon80Mbit, self).__init__(ns, pref, tag)
class Hyperchannel(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hyperchannel"):
super(Hyperchannel, self).__init__(ns, pref, tag)
class Fddi(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:fddi"):
super(Fddi, self).__init__(ns, pref, tag)
class Lapb(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:lapb"):
super(Lapb, self).__init__(ns, pref, tag)
class Sdlc(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sdlc"):
super(Sdlc, self).__init__(ns, pref, tag)
class Ds1(IanaInterfaceType):
"""
DS1\-MIB.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ds1"):
super(Ds1, self).__init__(ns, pref, tag)
class E1(IanaInterfaceType):
"""
Obsolete; see DS1\-MIB.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:e1"):
super(E1, self).__init__(ns, pref, tag)
class BasicISDN(IanaInterfaceType):
"""
No longer used. See also RFC 2127.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:basicISDN"):
super(BasicISDN, self).__init__(ns, pref, tag)
class PrimaryISDN(IanaInterfaceType):
"""
No longer used. See also RFC 2127.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:primaryISDN"):
super(PrimaryISDN, self).__init__(ns, pref, tag)
class PropPointToPointSerial(IanaInterfaceType):
"""
Proprietary serial.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propPointToPointSerial"):
super(PropPointToPointSerial, self).__init__(ns, pref, tag)
class Ppp(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ppp"):
super(Ppp, self).__init__(ns, pref, tag)
class SoftwareLoopback(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:softwareLoopback"):
super(SoftwareLoopback, self).__init__(ns, pref, tag)
class Eon(IanaInterfaceType):
"""
CLNP over IP.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:eon"):
super(Eon, self).__init__(ns, pref, tag)
class Ethernet3Mbit(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ethernet3Mbit"):
super(Ethernet3Mbit, self).__init__(ns, pref, tag)
class Nsip(IanaInterfaceType):
"""
XNS over IP.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:nsip"):
super(Nsip, self).__init__(ns, pref, tag)
class Slip(IanaInterfaceType):
"""
Generic SLIP.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:slip"):
super(Slip, self).__init__(ns, pref, tag)
class Ultra(IanaInterfaceType):
"""
Ultra Technologies.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ultra"):
super(Ultra, self).__init__(ns, pref, tag)
class Ds3(IanaInterfaceType):
"""
DS3\-MIB.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ds3"):
super(Ds3, self).__init__(ns, pref, tag)
class Sip(IanaInterfaceType):
"""
SMDS, coffee.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sip"):
super(Sip, self).__init__(ns, pref, tag)
class FrameRelay(IanaInterfaceType):
"""
DTE only.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:frameRelay"):
super(FrameRelay, self).__init__(ns, pref, tag)
class Rs232(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:rs232"):
super(Rs232, self).__init__(ns, pref, tag)
class Para(IanaInterfaceType):
"""
Parallel\-port.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:para"):
super(Para, self).__init__(ns, pref, tag)
class Arcnet(IanaInterfaceType):
"""
ARCnet.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:arcnet"):
super(Arcnet, self).__init__(ns, pref, tag)
class ArcnetPlus(IanaInterfaceType):
"""
ARCnet Plus.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:arcnetPlus"):
super(ArcnetPlus, self).__init__(ns, pref, tag)
class Atm(IanaInterfaceType):
"""
ATM cells.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atm"):
super(Atm, self).__init__(ns, pref, tag)
class Miox25(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:miox25"):
super(Miox25, self).__init__(ns, pref, tag)
class Sonet(IanaInterfaceType):
"""
SONET or SDH.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sonet"):
super(Sonet, self).__init__(ns, pref, tag)
class X25ple(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:x25ple"):
super(X25ple, self).__init__(ns, pref, tag)
class Iso88022llc(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iso88022llc"):
super(Iso88022llc, self).__init__(ns, pref, tag)
class LocalTalk(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:localTalk"):
super(LocalTalk, self).__init__(ns, pref, tag)
class SmdsDxi(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:smdsDxi"):
super(SmdsDxi, self).__init__(ns, pref, tag)
class FrameRelayService(IanaInterfaceType):
"""
FRNETSERV\-MIB.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:frameRelayService"):
super(FrameRelayService, self).__init__(ns, pref, tag)
class V35(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:v35"):
super(V35, self).__init__(ns, pref, tag)
class Hssi(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hssi"):
super(Hssi, self).__init__(ns, pref, tag)
class Hippi(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hippi"):
super(Hippi, self).__init__(ns, pref, tag)
class Modem(IanaInterfaceType):
"""
Generic modem.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:modem"):
super(Modem, self).__init__(ns, pref, tag)
class Aal5(IanaInterfaceType):
"""
AAL5 over ATM.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aal5"):
super(Aal5, self).__init__(ns, pref, tag)
class SonetPath(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sonetPath"):
super(SonetPath, self).__init__(ns, pref, tag)
class SonetVT(IanaInterfaceType):
"""
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sonetVT"):
super(SonetVT, self).__init__(ns, pref, tag)
class SmdsIcip(IanaInterfaceType):
"""
SMDS InterCarrier Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:smdsIcip"):
super(SmdsIcip, self).__init__(ns, pref, tag)
class PropVirtual(IanaInterfaceType):
"""
Proprietary virtual/internal.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propVirtual"):
super(PropVirtual, self).__init__(ns, pref, tag)
class PropMultiplexor(IanaInterfaceType):
"""
Proprietary multiplexing.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propMultiplexor"):
super(PropMultiplexor, self).__init__(ns, pref, tag)
class Ieee80212(IanaInterfaceType):
"""
100BaseVG.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ieee80212"):
super(Ieee80212, self).__init__(ns, pref, tag)
class FibreChannel(IanaInterfaceType):
"""
Fibre Channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:fibreChannel"):
super(FibreChannel, self).__init__(ns, pref, tag)
class HippiInterface(IanaInterfaceType):
"""
HIPPI interfaces.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hippiInterface"):
super(HippiInterface, self).__init__(ns, pref, tag)
class FrameRelayInterconnect(IanaInterfaceType):
"""
Obsolete; use either
frameRelay(32) or frameRelayService(44).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:frameRelayInterconnect"):
super(FrameRelayInterconnect, self).__init__(ns, pref, tag)
class Aflane8023(IanaInterfaceType):
"""
ATM Emulated LAN for 802.3.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aflane8023"):
super(Aflane8023, self).__init__(ns, pref, tag)
class Aflane8025(IanaInterfaceType):
"""
ATM Emulated LAN for 802.5.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aflane8025"):
super(Aflane8025, self).__init__(ns, pref, tag)
class CctEmul(IanaInterfaceType):
"""
ATM Emulated circuit.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:cctEmul"):
super(CctEmul, self).__init__(ns, pref, tag)
class FastEther(IanaInterfaceType):
"""
Obsoleted via RFC 3635.
ethernetCsmacd(6) should be used instead.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:fastEther"):
super(FastEther, self).__init__(ns, pref, tag)
class Isdn(IanaInterfaceType):
"""
ISDN and X.25.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:isdn"):
super(Isdn, self).__init__(ns, pref, tag)
class V11(IanaInterfaceType):
"""
CCITT V.11/X.21.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:v11"):
super(V11, self).__init__(ns, pref, tag)
class V36(IanaInterfaceType):
"""
CCITT V.36.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:v36"):
super(V36, self).__init__(ns, pref, tag)
class G703at64k(IanaInterfaceType):
"""
CCITT G703 at 64Kbps.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:g703at64k"):
super(G703at64k, self).__init__(ns, pref, tag)
class G703at2mb(IanaInterfaceType):
"""
Obsolete; see DS1-MIB.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:g703at2mb"):
super(G703at2mb, self).__init__(ns, pref, tag)
class Qllc(IanaInterfaceType):
"""
SNA QLLC.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:qllc"):
super(Qllc, self).__init__(ns, pref, tag)
class FastEtherFX(IanaInterfaceType):
"""
Obsoleted via RFC 3635.
ethernetCsmacd(6) should be used instead.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:fastEtherFX"):
super(FastEtherFX, self).__init__(ns, pref, tag)
class Channel(IanaInterfaceType):
"""
Channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:channel"):
super(Channel, self).__init__(ns, pref, tag)
class Ieee80211(IanaInterfaceType):
"""
Radio spread spectrum.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ieee80211"):
super(Ieee80211, self).__init__(ns, pref, tag)
class Ibm370parChan(IanaInterfaceType):
"""
IBM System 360/370 OEMI Channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ibm370parChan"):
super(Ibm370parChan, self).__init__(ns, pref, tag)
class Escon(IanaInterfaceType):
"""
IBM Enterprise Systems Connection.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:escon"):
super(Escon, self).__init__(ns, pref, tag)
class Dlsw(IanaInterfaceType):
"""
Data Link Switching.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dlsw"):
super(Dlsw, self).__init__(ns, pref, tag)
class Isdns(IanaInterfaceType):
"""
ISDN S/T interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:isdns"):
super(Isdns, self).__init__(ns, pref, tag)
class Isdnu(IanaInterfaceType):
"""
ISDN U interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:isdnu"):
super(Isdnu, self).__init__(ns, pref, tag)
class Lapd(IanaInterfaceType):
"""
Link Access Protocol D.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:lapd"):
super(Lapd, self).__init__(ns, pref, tag)
class IpSwitch(IanaInterfaceType):
"""
IP Switching Objects.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ipSwitch"):
super(IpSwitch, self).__init__(ns, pref, tag)
class Rsrb(IanaInterfaceType):
"""
Remote Source Route Bridging.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:rsrb"):
super(Rsrb, self).__init__(ns, pref, tag)
class AtmLogical(IanaInterfaceType):
"""
ATM Logical Port.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmLogical"):
super(AtmLogical, self).__init__(ns, pref, tag)
class Ds0(IanaInterfaceType):
"""
Digital Signal Level 0.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ds0"):
super(Ds0, self).__init__(ns, pref, tag)
class Ds0Bundle(IanaInterfaceType):
"""
Group of ds0s on the same ds1.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ds0Bundle"):
super(Ds0Bundle, self).__init__(ns, pref, tag)
class Bsc(IanaInterfaceType):
"""
Bisynchronous Protocol.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:bsc"):
super(Bsc, self).__init__(ns, pref, tag)
class Async(IanaInterfaceType):
"""
Asynchronous Protocol.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:async"):
super(Async, self).__init__(ns, pref, tag)
class Cnr(IanaInterfaceType):
"""
Combat Net Radio.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:cnr"):
super(Cnr, self).__init__(ns, pref, tag)
class Iso88025Dtr(IanaInterfaceType):
"""
ISO 802.5r DTR.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iso88025Dtr"):
super(Iso88025Dtr, self).__init__(ns, pref, tag)
class Eplrs(IanaInterfaceType):
"""
Ext Pos Loc Report Sys.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:eplrs"):
super(Eplrs, self).__init__(ns, pref, tag)
class Arap(IanaInterfaceType):
"""
Appletalk Remote Access Protocol.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:arap"):
super(Arap, self).__init__(ns, pref, tag)
class PropCnls(IanaInterfaceType):
"""
Proprietary Connectionless Protocol.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propCnls"):
super(PropCnls, self).__init__(ns, pref, tag)
class HostPad(IanaInterfaceType):
"""
CCITT-ITU X.29 PAD Protocol.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hostPad"):
super(HostPad, self).__init__(ns, pref, tag)
class TermPad(IanaInterfaceType):
"""
CCITT-ITU X.3 PAD Facility.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:termPad"):
super(TermPad, self).__init__(ns, pref, tag)
class FrameRelayMPI(IanaInterfaceType):
"""
Multiproto Interconnect over FR.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:frameRelayMPI"):
super(FrameRelayMPI, self).__init__(ns, pref, tag)
class X213(IanaInterfaceType):
"""
CCITT-ITU X213.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:x213"):
super(X213, self).__init__(ns, pref, tag)
class Adsl(IanaInterfaceType):
"""
Asymmetric Digital Subscriber Loop.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:adsl"):
super(Adsl, self).__init__(ns, pref, tag)
class Radsl(IanaInterfaceType):
"""
Rate-Adapt. Digital Subscriber Loop.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:radsl"):
super(Radsl, self).__init__(ns, pref, tag)
class Sdsl(IanaInterfaceType):
"""
Symmetric Digital Subscriber Loop.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sdsl"):
super(Sdsl, self).__init__(ns, pref, tag)
class Vdsl(IanaInterfaceType):
"""
Very H-Speed Digital Subscrib. Loop.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:vdsl"):
super(Vdsl, self).__init__(ns, pref, tag)
class Iso88025CRFPInt(IanaInterfaceType):
"""
ISO 802.5 CRFP.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iso88025CRFPInt"):
super(Iso88025CRFPInt, self).__init__(ns, pref, tag)
class Myrinet(IanaInterfaceType):
"""
Myricom Myrinet.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:myrinet"):
super(Myrinet, self).__init__(ns, pref, tag)
class VoiceEM(IanaInterfaceType):
"""
Voice recEive and transMit.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceEM"):
super(VoiceEM, self).__init__(ns, pref, tag)
class VoiceFXO(IanaInterfaceType):
"""
Voice Foreign Exchange Office.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceFXO"):
super(VoiceFXO, self).__init__(ns, pref, tag)
class VoiceFXS(IanaInterfaceType):
"""
Voice Foreign Exchange Station.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceFXS"):
super(VoiceFXS, self).__init__(ns, pref, tag)
class VoiceEncap(IanaInterfaceType):
"""
Voice encapsulation.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceEncap"):
super(VoiceEncap, self).__init__(ns, pref, tag)
class VoiceOverIp(IanaInterfaceType):
"""
Voice over IP encapsulation.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceOverIp"):
super(VoiceOverIp, self).__init__(ns, pref, tag)
class AtmDxi(IanaInterfaceType):
"""
ATM DXI.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmDxi"):
super(AtmDxi, self).__init__(ns, pref, tag)
class AtmFuni(IanaInterfaceType):
"""
ATM FUNI.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmFuni"):
super(AtmFuni, self).__init__(ns, pref, tag)
class AtmIma(IanaInterfaceType):
"""
ATM IMA.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmIma"):
super(AtmIma, self).__init__(ns, pref, tag)
class PppMultilinkBundle(IanaInterfaceType):
"""
PPP Multilink Bundle.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:pppMultilinkBundle"):
super(PppMultilinkBundle, self).__init__(ns, pref, tag)
class IpOverCdlc(IanaInterfaceType):
"""
IBM ipOverCdlc.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ipOverCdlc"):
super(IpOverCdlc, self).__init__(ns, pref, tag)
class IpOverClaw(IanaInterfaceType):
"""
IBM Common Link Access to Workstn.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ipOverClaw"):
super(IpOverClaw, self).__init__(ns, pref, tag)
class StackToStack(IanaInterfaceType):
"""
IBM stackToStack.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:stackToStack"):
super(StackToStack, self).__init__(ns, pref, tag)
class VirtualIpAddress(IanaInterfaceType):
"""
IBM VIPA.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:virtualIpAddress"):
super(VirtualIpAddress, self).__init__(ns, pref, tag)
class Mpc(IanaInterfaceType):
"""
IBM multi-protocol channel support.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:mpc"):
super(Mpc, self).__init__(ns, pref, tag)
class IpOverAtm(IanaInterfaceType):
"""
IBM ipOverAtm.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ipOverAtm"):
super(IpOverAtm, self).__init__(ns, pref, tag)
class Iso88025Fiber(IanaInterfaceType):
"""
ISO 802.5j Fiber Token Ring.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:iso88025Fiber"):
super(Iso88025Fiber, self).__init__(ns, pref, tag)
class Tdlc(IanaInterfaceType):
"""
IBM twinaxial data link control.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:tdlc"):
super(Tdlc, self).__init__(ns, pref, tag)
class GigabitEthernet(IanaInterfaceType):
"""
Obsoleted via RFC 3635.
ethernetCsmacd(6) should be used instead.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:gigabitEthernet"):
super(GigabitEthernet, self).__init__(ns, pref, tag)
class Hdlc(IanaInterfaceType):
"""
HDLC.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hdlc"):
super(Hdlc, self).__init__(ns, pref, tag)
class Lapf(IanaInterfaceType):
"""
LAP F.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:lapf"):
super(Lapf, self).__init__(ns, pref, tag)
class V37(IanaInterfaceType):
"""
V.37.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:v37"):
super(V37, self).__init__(ns, pref, tag)
class X25mlp(IanaInterfaceType):
"""
Multi-Link Protocol.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:x25mlp"):
super(X25mlp, self).__init__(ns, pref, tag)
class X25huntGroup(IanaInterfaceType):
"""
X25 Hunt Group.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:x25huntGroup"):
super(X25huntGroup, self).__init__(ns, pref, tag)
class TranspHdlc(IanaInterfaceType):
"""
Transp HDLC.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:transpHdlc"):
super(TranspHdlc, self).__init__(ns, pref, tag)
class Interleave(IanaInterfaceType):
"""
Interleave channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:interleave"):
super(Interleave, self).__init__(ns, pref, tag)
class Fast(IanaInterfaceType):
"""
Fast channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:fast"):
super(Fast, self).__init__(ns, pref, tag)
class Ip(IanaInterfaceType):
"""
IP (for APPN HPR in IP networks).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ip"):
super(Ip, self).__init__(ns, pref, tag)
class DocsCableMaclayer(IanaInterfaceType):
"""
CATV Mac Layer.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:docsCableMaclayer"):
super(DocsCableMaclayer, self).__init__(ns, pref, tag)
class DocsCableDownstream(IanaInterfaceType):
"""
CATV Downstream interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:docsCableDownstream"):
super(DocsCableDownstream, self).__init__(ns, pref, tag)
class DocsCableUpstream(IanaInterfaceType):
"""
CATV Upstream interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:docsCableUpstream"):
super(DocsCableUpstream, self).__init__(ns, pref, tag)
class A12MppSwitch(IanaInterfaceType):
"""
Avalon Parallel Processor.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:a12MppSwitch"):
super(A12MppSwitch, self).__init__(ns, pref, tag)
class Tunnel(IanaInterfaceType):
"""
Encapsulation interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:tunnel"):
super(Tunnel, self).__init__(ns, pref, tag)
class Coffee(IanaInterfaceType):
"""
Coffee pot.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:coffee"):
super(Coffee, self).__init__(ns, pref, tag)
class Ces(IanaInterfaceType):
"""
Circuit Emulation Service.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ces"):
super(Ces, self).__init__(ns, pref, tag)
class AtmSubInterface(IanaInterfaceType):
"""
ATM Sub Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmSubInterface"):
super(AtmSubInterface, self).__init__(ns, pref, tag)
class L2vlan(IanaInterfaceType):
"""
Layer 2 Virtual LAN using 802.1Q.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:l2vlan"):
super(L2vlan, self).__init__(ns, pref, tag)
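# --- Hedged usage sketch (illustrative only; not part of the generated
# bindings). Each generated class above wraps a single IANA interface-type
# identity: the constructor defaults supply the YANG module namespace, its
# prefix, and the prefixed identity tag. The real IanaInterfaceType base
# class is defined elsewhere in this module, so the demo below uses a local
# stand-in with the same (ns, pref, tag) signature; _DemoBase and
# _DemoL2vlan are hypothetical names introduced only for this sketch.
if __name__ == "__main__":
    class _DemoBase(object):
        """Hypothetical stand-in mirroring the assumed base-class signature."""
        def __init__(self, ns, pref, tag):
            self.ns, self.pref, self.tag = ns, pref, tag

    class _DemoL2vlan(_DemoBase):
        """Mirrors the defaults of the generated L2vlan class above."""
        def __init__(self,
                     ns="urn:ietf:params:xml:ns:yang:iana-if-type",
                     pref="iana-if-type",
                     tag="iana-if-type:l2vlan"):
            super(_DemoL2vlan, self).__init__(ns, pref, tag)

    vlan = _DemoL2vlan()
    # The tag is the value a YANG instance document would carry in an
    # interface's "type" leaf.
    print(vlan.tag)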
class L3ipvlan(IanaInterfaceType):
"""
Layer 3 Virtual LAN using IP.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:l3ipvlan"):
super(L3ipvlan, self).__init__(ns, pref, tag)
class L3ipxvlan(IanaInterfaceType):
"""
Layer 3 Virtual LAN using IPX.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:l3ipxvlan"):
super(L3ipxvlan, self).__init__(ns, pref, tag)
class DigitalPowerline(IanaInterfaceType):
"""
IP over Power Lines.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:digitalPowerline"):
super(DigitalPowerline, self).__init__(ns, pref, tag)
class MediaMailOverIp(IanaInterfaceType):
"""
Multimedia Mail over IP.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:mediaMailOverIp"):
super(MediaMailOverIp, self).__init__(ns, pref, tag)
class Dtm(IanaInterfaceType):
"""
Dynamic synchronous Transfer Mode.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dtm"):
super(Dtm, self).__init__(ns, pref, tag)
class Dcn(IanaInterfaceType):
"""
Data Communications Network.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dcn"):
super(Dcn, self).__init__(ns, pref, tag)
class IpForward(IanaInterfaceType):
"""
IP Forwarding Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ipForward"):
super(IpForward, self).__init__(ns, pref, tag)
class Msdsl(IanaInterfaceType):
"""
Multi-rate Symmetric DSL.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:msdsl"):
super(Msdsl, self).__init__(ns, pref, tag)
class Ieee1394(IanaInterfaceType):
"""
IEEE1394 High Performance Serial Bus.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ieee1394"):
super(Ieee1394, self).__init__(ns, pref, tag)
class IfGsn(IanaInterfaceType):
"""
HIPPI-6400.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:if-gsn"):
super(IfGsn, self).__init__(ns, pref, tag)
class DvbRccMacLayer(IanaInterfaceType):
"""
DVB-RCC MAC Layer.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dvbRccMacLayer"):
super(DvbRccMacLayer, self).__init__(ns, pref, tag)
class DvbRccDownstream(IanaInterfaceType):
"""
DVB-RCC Downstream Channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dvbRccDownstream"):
super(DvbRccDownstream, self).__init__(ns, pref, tag)
class DvbRccUpstream(IanaInterfaceType):
"""
DVB-RCC Upstream Channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dvbRccUpstream"):
super(DvbRccUpstream, self).__init__(ns, pref, tag)
class AtmVirtual(IanaInterfaceType):
"""
ATM Virtual Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmVirtual"):
super(AtmVirtual, self).__init__(ns, pref, tag)
class MplsTunnel(IanaInterfaceType):
"""
MPLS Tunnel Virtual Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:mplsTunnel"):
super(MplsTunnel, self).__init__(ns, pref, tag)
class Srp(IanaInterfaceType):
"""
Spatial Reuse Protocol.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:srp"):
super(Srp, self).__init__(ns, pref, tag)
class VoiceOverAtm(IanaInterfaceType):
"""
Voice over ATM.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceOverAtm"):
super(VoiceOverAtm, self).__init__(ns, pref, tag)
class VoiceOverFrameRelay(IanaInterfaceType):
"""
Voice Over Frame Relay.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceOverFrameRelay"):
super(VoiceOverFrameRelay, self).__init__(ns, pref, tag)
class Idsl(IanaInterfaceType):
"""
Digital Subscriber Loop over ISDN.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:idsl"):
super(Idsl, self).__init__(ns, pref, tag)
class CompositeLink(IanaInterfaceType):
"""
Avici Composite Link Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:compositeLink"):
super(CompositeLink, self).__init__(ns, pref, tag)
class Ss7SigLink(IanaInterfaceType):
"""
SS7 Signaling Link.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ss7SigLink"):
super(Ss7SigLink, self).__init__(ns, pref, tag)
class PropWirelessP2P(IanaInterfaceType):
"""
Prop. P2P wireless interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propWirelessP2P"):
super(PropWirelessP2P, self).__init__(ns, pref, tag)
class FrForward(IanaInterfaceType):
"""
Frame Forward Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:frForward"):
super(FrForward, self).__init__(ns, pref, tag)
class Rfc1483(IanaInterfaceType):
"""
Multiprotocol over ATM AAL5.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:rfc1483"):
super(Rfc1483, self).__init__(ns, pref, tag)
class Usb(IanaInterfaceType):
"""
USB Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:usb"):
super(Usb, self).__init__(ns, pref, tag)
class Ieee8023adLag(IanaInterfaceType):
"""
IEEE 802.3ad Link Aggregate.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ieee8023adLag"):
super(Ieee8023adLag, self).__init__(ns, pref, tag)
class Bgppolicyaccounting(IanaInterfaceType):
"""
BGP Policy Accounting.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:bgppolicyaccounting"):
super(Bgppolicyaccounting, self).__init__(ns, pref, tag)
class Frf16MfrBundle(IanaInterfaceType):
"""
FRF.16 Multilink Frame Relay.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:frf16MfrBundle"):
super(Frf16MfrBundle, self).__init__(ns, pref, tag)
class H323Gatekeeper(IanaInterfaceType):
"""
H323 Gatekeeper.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:h323Gatekeeper"):
super(H323Gatekeeper, self).__init__(ns, pref, tag)
class H323Proxy(IanaInterfaceType):
"""
H323 Voice and Video Proxy.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:h323Proxy"):
super(H323Proxy, self).__init__(ns, pref, tag)
class Mpls(IanaInterfaceType):
"""
MPLS.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:mpls"):
super(Mpls, self).__init__(ns, pref, tag)
class MfSigLink(IanaInterfaceType):
"""
Multi-frequency signaling link.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:mfSigLink"):
super(MfSigLink, self).__init__(ns, pref, tag)
class Hdsl2(IanaInterfaceType):
"""
High Bit-Rate DSL - 2nd generation.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hdsl2"):
super(Hdsl2, self).__init__(ns, pref, tag)
class Shdsl(IanaInterfaceType):
"""
Multirate HDSL2.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:shdsl"):
super(Shdsl, self).__init__(ns, pref, tag)
class Ds1FDL(IanaInterfaceType):
"""
Facility Data Link (4Kbps) on a DS1.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ds1FDL"):
super(Ds1FDL, self).__init__(ns, pref, tag)
class Pos(IanaInterfaceType):
"""
Packet over SONET/SDH Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:pos"):
super(Pos, self).__init__(ns, pref, tag)
class DvbAsiIn(IanaInterfaceType):
"""
DVB-ASI Input.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dvbAsiIn"):
super(DvbAsiIn, self).__init__(ns, pref, tag)
class DvbAsiOut(IanaInterfaceType):
"""
DVB-ASI Output.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dvbAsiOut"):
super(DvbAsiOut, self).__init__(ns, pref, tag)
class Plc(IanaInterfaceType):
"""
Power Line Communications.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:plc"):
super(Plc, self).__init__(ns, pref, tag)
class Nfas(IanaInterfaceType):
"""
Non-Facility Associated Signaling.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:nfas"):
super(Nfas, self).__init__(ns, pref, tag)
class Tr008(IanaInterfaceType):
"""
TR008.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:tr008"):
super(Tr008, self).__init__(ns, pref, tag)
class Gr303RDT(IanaInterfaceType):
"""
Remote Digital Terminal.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:gr303RDT"):
super(Gr303RDT, self).__init__(ns, pref, tag)
class Gr303IDT(IanaInterfaceType):
"""
Integrated Digital Terminal.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:gr303IDT"):
super(Gr303IDT, self).__init__(ns, pref, tag)
class Isup(IanaInterfaceType):
"""
ISUP.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:isup"):
super(Isup, self).__init__(ns, pref, tag)
class PropDocsWirelessMaclayer(IanaInterfaceType):
"""
Cisco proprietary Maclayer.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propDocsWirelessMaclayer"):
super(PropDocsWirelessMaclayer, self).__init__(ns, pref, tag)
class PropDocsWirelessDownstream(IanaInterfaceType):
"""
Cisco proprietary Downstream.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propDocsWirelessDownstream"):
super(PropDocsWirelessDownstream, self).__init__(ns, pref, tag)
class PropDocsWirelessUpstream(IanaInterfaceType):
"""
Cisco proprietary Upstream.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propDocsWirelessUpstream"):
super(PropDocsWirelessUpstream, self).__init__(ns, pref, tag)
class Hiperlan2(IanaInterfaceType):
"""
HIPERLAN Type 2 Radio Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:hiperlan2"):
super(Hiperlan2, self).__init__(ns, pref, tag)
class PropBWAp2Mp(IanaInterfaceType):
"""
PropBroadbandWirelessAccesspt2Multipt (use of this value
for IEEE 802.16 WMAN interfaces as per IEEE Std 802.16f
is deprecated, and ieee80216WMAN(237) should be used
instead).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propBWAp2Mp"):
super(PropBWAp2Mp, self).__init__(ns, pref, tag)
class SonetOverheadChannel(IanaInterfaceType):
"""
SONET Overhead Channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sonetOverheadChannel"):
super(SonetOverheadChannel, self).__init__(ns, pref, tag)
class DigitalWrapperOverheadChannel(IanaInterfaceType):
"""
Digital Wrapper.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:digitalWrapperOverheadChannel"):
super(DigitalWrapperOverheadChannel, self).__init__(ns, pref, tag)
class Aal2(IanaInterfaceType):
"""
ATM adaptation layer 2.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aal2"):
super(Aal2, self).__init__(ns, pref, tag)
class RadioMAC(IanaInterfaceType):
"""
MAC layer over radio links.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:radioMAC"):
super(RadioMAC, self).__init__(ns, pref, tag)
class AtmRadio(IanaInterfaceType):
"""
ATM over radio links.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmRadio"):
super(AtmRadio, self).__init__(ns, pref, tag)
class Imt(IanaInterfaceType):
"""
Inter\-Machine Trunks.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:imt"):
super(Imt, self).__init__(ns, pref, tag)
class Mvl(IanaInterfaceType):
"""
Multiple Virtual Lines DSL.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:mvl"):
super(Mvl, self).__init__(ns, pref, tag)
class ReachDSL(IanaInterfaceType):
"""
Long Reach DSL.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:reachDSL"):
super(ReachDSL, self).__init__(ns, pref, tag)
class FrDlciEndPt(IanaInterfaceType):
"""
Frame Relay DLCI End Point.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:frDlciEndPt"):
super(FrDlciEndPt, self).__init__(ns, pref, tag)
class AtmVciEndPt(IanaInterfaceType):
"""
ATM VCI End Point.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmVciEndPt"):
super(AtmVciEndPt, self).__init__(ns, pref, tag)
class OpticalChannel(IanaInterfaceType):
"""
Optical Channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:opticalChannel"):
super(OpticalChannel, self).__init__(ns, pref, tag)
class OpticalTransport(IanaInterfaceType):
"""
Optical Transport.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:opticalTransport"):
super(OpticalTransport, self).__init__(ns, pref, tag)
class PropAtm(IanaInterfaceType):
"""
Proprietary ATM.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:propAtm"):
super(PropAtm, self).__init__(ns, pref, tag)
class VoiceOverCable(IanaInterfaceType):
"""
Voice Over Cable Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceOverCable"):
super(VoiceOverCable, self).__init__(ns, pref, tag)
class Infiniband(IanaInterfaceType):
"""
Infiniband.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:infiniband"):
super(Infiniband, self).__init__(ns, pref, tag)
class TeLink(IanaInterfaceType):
"""
TE Link.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:teLink"):
super(TeLink, self).__init__(ns, pref, tag)
class Q2931(IanaInterfaceType):
"""
Q.2931.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:q2931"):
super(Q2931, self).__init__(ns, pref, tag)
class VirtualTg(IanaInterfaceType):
"""
Virtual Trunk Group.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:virtualTg"):
super(VirtualTg, self).__init__(ns, pref, tag)
class SipTg(IanaInterfaceType):
"""
SIP Trunk Group.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sipTg"):
super(SipTg, self).__init__(ns, pref, tag)
class SipSig(IanaInterfaceType):
"""
SIP Signaling.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sipSig"):
super(SipSig, self).__init__(ns, pref, tag)
class DocsCableUpstreamChannel(IanaInterfaceType):
"""
CATV Upstream Channel.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:docsCableUpstreamChannel"):
super(DocsCableUpstreamChannel, self).__init__(ns, pref, tag)
class Econet(IanaInterfaceType):
"""
Acorn Econet.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:econet"):
super(Econet, self).__init__(ns, pref, tag)
class Pon155(IanaInterfaceType):
"""
    FSAN 155Mb Symmetrical PON interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:pon155"):
super(Pon155, self).__init__(ns, pref, tag)
class Pon622(IanaInterfaceType):
"""
    FSAN 622Mb Symmetrical PON interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:pon622"):
super(Pon622, self).__init__(ns, pref, tag)
class Bridge(IanaInterfaceType):
"""
Transparent bridge interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:bridge"):
super(Bridge, self).__init__(ns, pref, tag)
class Linegroup(IanaInterfaceType):
"""
Interface common to multiple lines.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:linegroup"):
super(Linegroup, self).__init__(ns, pref, tag)
class VoiceEMFGD(IanaInterfaceType):
"""
Voice E&M Feature Group D.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceEMFGD"):
super(VoiceEMFGD, self).__init__(ns, pref, tag)
class VoiceFGDEANA(IanaInterfaceType):
"""
Voice FGD Exchange Access North American.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceFGDEANA"):
super(VoiceFGDEANA, self).__init__(ns, pref, tag)
class VoiceDID(IanaInterfaceType):
"""
Voice Direct Inward Dialing.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceDID"):
super(VoiceDID, self).__init__(ns, pref, tag)
class MpegTransport(IanaInterfaceType):
"""
MPEG transport interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:mpegTransport"):
super(MpegTransport, self).__init__(ns, pref, tag)
class SixToFour(IanaInterfaceType):
"""
6to4 interface (DEPRECATED).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:sixToFour"):
super(SixToFour, self).__init__(ns, pref, tag)
class Gtp(IanaInterfaceType):
"""
GTP (GPRS Tunneling Protocol).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:gtp"):
super(Gtp, self).__init__(ns, pref, tag)
class PdnEtherLoop1(IanaInterfaceType):
"""
Paradyne EtherLoop 1.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:pdnEtherLoop1"):
super(PdnEtherLoop1, self).__init__(ns, pref, tag)
class PdnEtherLoop2(IanaInterfaceType):
"""
Paradyne EtherLoop 2.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:pdnEtherLoop2"):
super(PdnEtherLoop2, self).__init__(ns, pref, tag)
class OpticalChannelGroup(IanaInterfaceType):
"""
Optical Channel Group.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:opticalChannelGroup"):
super(OpticalChannelGroup, self).__init__(ns, pref, tag)
class Homepna(IanaInterfaceType):
"""
HomePNA ITU\-T G.989.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:homepna"):
super(Homepna, self).__init__(ns, pref, tag)
class Gfp(IanaInterfaceType):
"""
Generic Framing Procedure (GFP).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:gfp"):
super(Gfp, self).__init__(ns, pref, tag)
class CiscoISLvlan(IanaInterfaceType):
"""
Layer 2 Virtual LAN using Cisco ISL.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ciscoISLvlan"):
super(CiscoISLvlan, self).__init__(ns, pref, tag)
class ActelisMetaLOOP(IanaInterfaceType):
"""
    Actelis proprietary MetaLOOP High Speed Link.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:actelisMetaLOOP"):
super(ActelisMetaLOOP, self).__init__(ns, pref, tag)
class FcipLink(IanaInterfaceType):
"""
FCIP Link.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:fcipLink"):
super(FcipLink, self).__init__(ns, pref, tag)
class Rpr(IanaInterfaceType):
"""
Resilient Packet Ring Interface Type.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:rpr"):
super(Rpr, self).__init__(ns, pref, tag)
class Qam(IanaInterfaceType):
"""
RF Qam Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:qam"):
super(Qam, self).__init__(ns, pref, tag)
class Lmp(IanaInterfaceType):
"""
Link Management Protocol.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:lmp"):
super(Lmp, self).__init__(ns, pref, tag)
class CblVectaStar(IanaInterfaceType):
"""
Cambridge Broadband Networks Limited VectaStar.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:cblVectaStar"):
super(CblVectaStar, self).__init__(ns, pref, tag)
class DocsCableMCmtsDownstream(IanaInterfaceType):
"""
CATV Modular CMTS Downstream Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:docsCableMCmtsDownstream"):
super(DocsCableMCmtsDownstream, self).__init__(ns, pref, tag)
class Adsl2(IanaInterfaceType):
"""
Asymmetric Digital Subscriber Loop Version 2
(DEPRECATED/OBSOLETED \- please use adsl2plus(238)
instead).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:adsl2"):
super(Adsl2, self).__init__(ns, pref, tag)
class MacSecControlledIF(IanaInterfaceType):
"""
MACSecControlled.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:macSecControlledIF"):
super(MacSecControlledIF, self).__init__(ns, pref, tag)
class MacSecUncontrolledIF(IanaInterfaceType):
"""
MACSecUncontrolled.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:macSecUncontrolledIF"):
super(MacSecUncontrolledIF, self).__init__(ns, pref, tag)
class AviciOpticalEther(IanaInterfaceType):
"""
Avici Optical Ethernet Aggregate.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aviciOpticalEther"):
super(AviciOpticalEther, self).__init__(ns, pref, tag)
class Atmbond(IanaInterfaceType):
"""
atmbond.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:atmbond"):
super(Atmbond, self).__init__(ns, pref, tag)
class VoiceFGDOS(IanaInterfaceType):
"""
Voice FGD Operator Services.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceFGDOS"):
super(VoiceFGDOS, self).__init__(ns, pref, tag)
class MocaVersion1(IanaInterfaceType):
"""
MultiMedia over Coax Alliance (MoCA) Interface
as documented in information provided privately to IANA.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:mocaVersion1"):
super(MocaVersion1, self).__init__(ns, pref, tag)
class Ieee80216WMAN(IanaInterfaceType):
"""
IEEE 802.16 WMAN interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ieee80216WMAN"):
super(Ieee80216WMAN, self).__init__(ns, pref, tag)
class Adsl2plus(IanaInterfaceType):
"""
Asymmetric Digital Subscriber Loop Version 2 \-
Version 2 Plus and all variants.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:adsl2plus"):
super(Adsl2plus, self).__init__(ns, pref, tag)
class DvbRcsMacLayer(IanaInterfaceType):
"""
DVB\-RCS MAC Layer.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dvbRcsMacLayer"):
super(DvbRcsMacLayer, self).__init__(ns, pref, tag)
class DvbTdm(IanaInterfaceType):
"""
DVB Satellite TDM.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dvbTdm"):
super(DvbTdm, self).__init__(ns, pref, tag)
class DvbRcsTdma(IanaInterfaceType):
"""
DVB\-RCS TDMA.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:dvbRcsTdma"):
super(DvbRcsTdma, self).__init__(ns, pref, tag)
class X86Laps(IanaInterfaceType):
"""
LAPS based on ITU\-T X.86/Y.1323.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:x86Laps"):
super(X86Laps, self).__init__(ns, pref, tag)
class WwanPP(IanaInterfaceType):
"""
3GPP WWAN.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:wwanPP"):
super(WwanPP, self).__init__(ns, pref, tag)
class WwanPP2(IanaInterfaceType):
"""
3GPP2 WWAN.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:wwanPP2"):
super(WwanPP2, self).__init__(ns, pref, tag)
class VoiceEBS(IanaInterfaceType):
"""
Voice P\-phone EBS physical interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:voiceEBS"):
super(VoiceEBS, self).__init__(ns, pref, tag)
class IfPwType(IanaInterfaceType):
"""
Pseudowire interface type.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ifPwType"):
super(IfPwType, self).__init__(ns, pref, tag)
class Ilan(IanaInterfaceType):
"""
Internal LAN on a bridge per IEEE 802.1ap.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ilan"):
super(Ilan, self).__init__(ns, pref, tag)
class Pip(IanaInterfaceType):
"""
Provider Instance Port on a bridge per IEEE 802.1ah PBB.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:pip"):
super(Pip, self).__init__(ns, pref, tag)
class AluELP(IanaInterfaceType):
"""
Alcatel\-Lucent Ethernet Link Protection.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aluELP"):
super(AluELP, self).__init__(ns, pref, tag)
class Gpon(IanaInterfaceType):
"""
Gigabit\-capable passive optical networks (G\-PON) as per
    ITU\-T G.984.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:gpon"):
super(Gpon, self).__init__(ns, pref, tag)
class Vdsl2(IanaInterfaceType):
"""
Very high speed digital subscriber line Version 2
(as per ITU\-T Recommendation G.993.2).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:vdsl2"):
super(Vdsl2, self).__init__(ns, pref, tag)
class CapwapDot11Profile(IanaInterfaceType):
"""
WLAN Profile Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:capwapDot11Profile"):
super(CapwapDot11Profile, self).__init__(ns, pref, tag)
class CapwapDot11Bss(IanaInterfaceType):
"""
WLAN BSS Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:capwapDot11Bss"):
super(CapwapDot11Bss, self).__init__(ns, pref, tag)
class CapwapWtpVirtualRadio(IanaInterfaceType):
"""
WTP Virtual Radio Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:capwapWtpVirtualRadio"):
super(CapwapWtpVirtualRadio, self).__init__(ns, pref, tag)
class Bits(IanaInterfaceType):
"""
bitsport.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:bits"):
super(Bits, self).__init__(ns, pref, tag)
class DocsCableUpstreamRfPort(IanaInterfaceType):
"""
DOCSIS CATV Upstream RF Port.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:docsCableUpstreamRfPort"):
super(DocsCableUpstreamRfPort, self).__init__(ns, pref, tag)
class CableDownstreamRfPort(IanaInterfaceType):
"""
CATV downstream RF Port.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:cableDownstreamRfPort"):
super(CableDownstreamRfPort, self).__init__(ns, pref, tag)
class VmwareVirtualNic(IanaInterfaceType):
"""
VMware Virtual Network Interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:vmwareVirtualNic"):
super(VmwareVirtualNic, self).__init__(ns, pref, tag)
class Ieee802154(IanaInterfaceType):
"""
IEEE 802.15.4 WPAN interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ieee802154"):
super(Ieee802154, self).__init__(ns, pref, tag)
class OtnOdu(IanaInterfaceType):
"""
OTN Optical Data Unit.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:otnOdu"):
super(OtnOdu, self).__init__(ns, pref, tag)
class OtnOtu(IanaInterfaceType):
"""
OTN Optical channel Transport Unit.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:otnOtu"):
super(OtnOtu, self).__init__(ns, pref, tag)
class IfVfiType(IanaInterfaceType):
"""
VPLS Forwarding Instance Interface Type.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:ifVfiType"):
super(IfVfiType, self).__init__(ns, pref, tag)
class G9981(IanaInterfaceType):
"""
G.998.1 bonded interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:g9981"):
super(G9981, self).__init__(ns, pref, tag)
class G9982(IanaInterfaceType):
"""
G.998.2 bonded interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:g9982"):
super(G9982, self).__init__(ns, pref, tag)
class G9983(IanaInterfaceType):
"""
G.998.3 bonded interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:g9983"):
super(G9983, self).__init__(ns, pref, tag)
class AluEpon(IanaInterfaceType):
"""
Ethernet Passive Optical Networks (E\-PON).
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aluEpon"):
super(AluEpon, self).__init__(ns, pref, tag)
class AluEponOnu(IanaInterfaceType):
"""
EPON Optical Network Unit.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aluEponOnu"):
super(AluEponOnu, self).__init__(ns, pref, tag)
class AluEponPhysicalUni(IanaInterfaceType):
"""
EPON physical User to Network interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aluEponPhysicalUni"):
super(AluEponPhysicalUni, self).__init__(ns, pref, tag)
class AluEponLogicalLink(IanaInterfaceType):
"""
The emulation of a point\-to\-point link over the EPON
layer.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aluEponLogicalLink"):
super(AluEponLogicalLink, self).__init__(ns, pref, tag)
class AluGponOnu(IanaInterfaceType):
"""
GPON Optical Network Unit.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aluGponOnu"):
super(AluGponOnu, self).__init__(ns, pref, tag)
class AluGponPhysicalUni(IanaInterfaceType):
"""
GPON physical User to Network interface.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:aluGponPhysicalUni"):
super(AluGponPhysicalUni, self).__init__(ns, pref, tag)
class VmwareNicTeam(IanaInterfaceType):
"""
VMware NIC Team.
"""
_prefix = 'ianaift'
_revision = '2014-05-08'
def __init__(self, ns="urn:ietf:params:xml:ns:yang:iana-if-type", pref="iana-if-type", tag="iana-if-type:vmwareNicTeam"):
super(VmwareNicTeam, self).__init__(ns, pref, tag)
# File: TimeSeriesTools/Similarities/similarities.py
# Repo: Psicowired87/TimeSeriesTools (MIT License)
"""
Distances or similarities between time series are functions that compute a
distance or similarity measure pairwise and return a matrix of
distances between every pair of time series.
"""
import numpy as np
#import math
def general_comparison(X, method, kwargs={}):
"""Function which acts as a switcher and wraps all the comparison functions
available in this package.
Parameters
----------
X: array_like, shape(N, M)
a collection of M time-series.
method: str, optional
        measure we want to use to make the comparison.
kwargs: dict
parameters for the functions selected by the method parameter.
Returns
-------
comparisons: array_like, shape (M, M)
returns the measure of all the possible time series versus all the
others.
"""
    if type(method).__name__ == 'function':
        comparisons = method(X, **kwargs)
    elif method == 'lag_based':
        comparisons = general_lag_distance(X, **kwargs)
    elif method == 'static_based':
        comparisons = general_distance_M(X, **kwargs)
    else:
        raise ValueError("Unknown comparison method: %s" % method)
    return comparisons
def general_lag_distance(X, method_f, tlags, simmetrical=False, kwargs={}):
"""Build a 3d matrix of distance using the method_f given.
Parameters
----------
X: array_like, shape(Ntimes, Nelements)
the signals of the system.
    tlags: integer or list or array_like
        the lag or lags to be considered.
method_f: function
the function 1v1 to compute the distance or similarity desired.
    simmetrical: boolean
        whether to save computational power by computing only one
        direction of each pair.
kwargs: dict
the parameters of the selected method. The parameters needed by
method_f.
Returns
-------
M: array_like, shape (Nelements, Nelements, nlags)
        the matrix of distances for each chosen time lag.
"""
## 0. Format inputs and needed variables
# Format lags
lags = np.array([tlags]) if type(tlags) in [int, list] else tlags
lags = lags.reshape(-1)
# Elements
n = X.shape[1]
tl = lags.shape[0]
## Build the tensor
M = np.zeros((n, n, tl))
# Loop over lags
for l in range(lags.shape[0]):
tlag = lags[l]
# Loop over the possible combinations of elements
for i in range(n):
# Loop over the necessary considering simmetrical or not
if simmetrical:
pairedelements = range(i, n)
else:
pairedelements = range(n)
# Loop over the pair elements
for j in pairedelements:
M[i, j, l] = method_f(X[tlag:, i], X[:X.shape[0]-tlag, j],
**kwargs)
return M
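The slicing inside the inner loop is the heart of the lag handling: for a lag `t`, series `i` at times `[t, T)` is paired with series `j` at times `[0, T-t)`, so a positive lag tests whether `j` leads `i`. A self-contained sketch of that alignment; the dot product `_dot_sim` is only an assumed stand-in for `method_f`:

```python
# Sketch (assumed names, not part of the module): reproduces the pairing used
# by general_lag_distance's inner loop for a single (i, j, lag) triple.
import numpy as np

def _dot_sim(a, b):
    # stand-in for an arbitrary method_f(x, y)
    return float(np.dot(a, b))

T = 5
X = np.arange(2 * T, dtype=float).reshape(T, 2)  # two toy series in columns
tlag, i, j = 1, 0, 1
val = _dot_sim(X[tlag:, i], X[:T - tlag, j])
# same expression as M[i, j, l] inside general_lag_distance
```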
def general_distance_M(X, method_f, simmetrical, kwargs={}):
"""This function is an applicator of a method given by the function
method_f.
Parameters
----------
X: array_like, shape(Ntimes, Nelements)
the signals of the system.
method_f: function
the function 1v1 to compute the distance or similarity desired.
    simmetrical: boolean
        whether to save computational power by computing only one
        direction of each pair.
    kwargs: dict
        the parameters of the selected method. The parameters needed by
        method_f.

    Returns
    -------
    M: array_like, shape (Nelements, Nelements)
        the matrix of pairwise distances or similarities.
    """
# Initialization
n = X.shape[1]
M = np.zeros((n, n))
# Loop over the possible combinations of elements
for i in range(n):
# Loop over the necessary considering simmetrical or not
if simmetrical:
pairedelements = range(i, n)
else:
pairedelements = range(n)
# Loop over the pair elements
for j in pairedelements:
M[i, j] = method_f(X[:, i], X[:, j], **kwargs)
return M
def comparison_1v1(x, y, method, kwargs={}):
"""Function which acts as a switcher and wraps all the comparison functions
available in this package.
Parameters
----------
x: array_like, shape(N,)
time-serie with N times.
y: array_like, shape(N,)
time-serie with N times.
method: str, optional
        measure we want to use to make the comparison.
kwargs: dict
parameters for the functions selected by the method parameter.
Returns
-------
comparisons: float
returns the measure of comparison between the two time-series given
using the measure specified in the inputs.
"""
    if type(method).__name__ == 'function':
        comparisons = method(x, y, **kwargs)
    else:
        raise NotImplementedError("Unknown comparison method: %s" % method)
    return comparisons
def comparison_f_1v1(method, kwargs={}):
"""Function which acts as a switcher and wraps all the comparison functions
    available in this package and returns an instantiable function.
Parameters
----------
    method: str or function
        measure to use for the comparison.
kwargs: dict
parameters for the functions selected by the method parameter.
Returns
-------
comparator: function
the instantiable function which could be called to compare two time
series.
"""
    if callable(method):
        comparator = lambda x, y: method(x, y, **kwargs)
    else:
        # String method names would dispatch to the package's own
        # comparison functions here; reject anything unrecognized.
        raise ValueError("Unrecognized comparison method: %r" % (method,))
    return comparator
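A common way to implement the switcher the docstring describes is a dispatch table from method names to measure functions; a hedged sketch of that pattern (the registered names and measures here are illustrative, not the package's actual registry):

```python
def euclidean(x, y):
    return sum((a - b) ** 2 for a, b in zip(x, y)) ** 0.5

def manhattan(x, y):
    return sum(abs(a - b) for a, b in zip(x, y))

_METHODS = {'euclidean': euclidean, 'manhattan': manhattan}

def make_comparator(method, kwargs=None):
    # Accept either a callable or a registered method name and return
    # a ready-to-call 2-argument comparator
    kwargs = kwargs or {}
    if callable(method):
        return lambda x, y: method(x, y, **kwargs)
    try:
        f = _METHODS[method]
    except KeyError:
        raise ValueError("unknown method: %r" % (method,))
    return lambda x, y: f(x, y, **kwargs)

cmp_f = make_comparator('manhattan')
d = cmp_f([0, 0], [3, 4])  # 3 + 4 = 7
```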
# File: BDAI/datapredict/arandom.py (repo FYPYTHON/method, Apache-2.0)
# coding=utf-8
from random import randint
start = 1
end = 100
def check_leve(num_ori):
num = num_ori * 100 / 100
if 0 < num <= 19:
level = 1
elif 19 < num <= 19 + 17:
level = 2
elif 36 < num <= 36 + 15:
level = 3
elif 51 < num <= 51 + 13:
level = 4
elif 64 < num <= 64 + 11:
level = 5
elif 75 < num <= 75 + 9:
level = 6
elif 84 < num <= 84 + 7:
level = 7
elif 91 < num <= 91 + 5:
level = 8
elif 96 < num <= 96 + 3:
level = 9
elif 99 < num <= 100:
level = 10
else:
level = 0
return level
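The chained thresholds above split 1–100 into ten bands of widths 19, 17, 15, 13, 11, 9, 7, 5, 3 and 1; a quick sketch checking that the bands partition the range and yield the implied probabilities for uniform draws:

```python
widths = [19, 17, 15, 13, 11, 9, 7, 5, 3, 1]
assert sum(widths) == 100  # the bands cover 1..100 exactly

# Upper bound of each band, matching the elif chain in check_leve
bounds = []
total = 0
for w in widths:
    total += w
    bounds.append(total)

def level_of(num):
    # Same mapping as check_leve for num in 1..100, derived from bounds
    for lvl, hi in enumerate(bounds, start=1):
        if num <= hi:
            return lvl
    return 0

# P(level == i + 1) when drawing uniformly from 1..100
probabilities = [w / 100 for w in widths]
```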
def get_fre():
data_fre = dict()
num_data = 1000
for i in range(num_data):
rd = randint(start, end)
rkey = check_leve(rd)
if rkey in data_fre.keys():
data_fre[rkey] += 1
else:
data_fre[rkey] = 1
data_p = sorted(data_fre.items(), key=lambda d: d[1], reverse=True)
print(data_p)
for data in data_p:
# print(type(data[1]))
print(data[0], "%.2f" % (data[1] / num_data * 100))
def get_char():
cha_s = 0x4e00
cha_e = 0x9fa5
c_num = 100
for i in range(c_num):
# cchar = randint(cha_s, cha_e)
# print(chr(cchar))
i = i + 19968 + 100 * 2
print(i, chr(i))
if __name__ == "__main__":
# get_fre()
get_char()
# File: sw_stresstest/util.py (repo ox-inet-resilience/sw_stresstest, Apache-2.0)
import subprocess
import time
import matplotlib.pyplot as plt
def get_process_time():
return "%.2f mins" % (time.process_time() / 60)
def savefig(sname, bar=False):
barstr = "_bar" if bar else ""
git_rev_hash = (
subprocess.check_output("git rev-parse HEAD".split()).decode("utf-8").strip()
)
plotname = f"{sname}{barstr}_{git_rev_hash}.png"
print(plotname)
plt.savefig("plots/" + plotname)
# plt.savefig('plots/' + plotname.replace('png', 'eps'))
# end
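`savefig` above shells out to `git rev-parse HEAD` and will raise if the working directory is not a git checkout; a hedged sketch of the same lookup with a graceful fallback (the helper name `git_revision` is illustrative):

```python
import subprocess

def git_revision(default="unversioned"):
    # Return the current commit hash, or `default` when git is not
    # installed or the cwd is not inside a repository
    try:
        out = subprocess.check_output(
            ["git", "rev-parse", "HEAD"], stderr=subprocess.DEVNULL
        )
        return out.decode("utf-8").strip()
    except (OSError, subprocess.CalledProcessError):
        return default

rev = git_revision()
```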
def setup_matplotlib():
# plt.style.use('fivethirtyeight')
# plt.style.use('ggplot')
from cycler import cycler
_cmap = plt.get_cmap("tab20")
_cycler = cycler(
color=[_cmap(i / 10) for i in range(10)] + [_cmap(3 / 12), _cmap(5 / 12)]
) + cycler(marker=[4, 5, 6, 7, "d", "o", ".", 4, 5, 6, 7, "d"])
plt.rc("font", **{"size": 11}) # , 'sans-serif': ['Computer Modern Sans Serif']})
plt.rc("axes", prop_cycle=_cycler, titlesize="xx-large", grid=True, axisbelow=True)
plt.rc("grid", linestyle=":")
plt.rc("figure", titlesize="xx-large")
plt.rc("savefig", dpi=200)
return _cycler
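The prop cycle above pairs twelve colors with twelve markers in lock-step; the same round-robin pairing can be sketched with stdlib `itertools` (the color and marker values here are stand-ins, not matplotlib styles):

```python
from itertools import cycle, islice

colors = ["c0", "c1", "c2"]
markers = ["o", "s", "^"]

# Zip the two cycles so each line gets a (color, marker) pair,
# wrapping around when there are more lines than styles
styles = list(islice(zip(cycle(colors), cycle(markers)), 5))
```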
def draw_figlegend(fig, right=0.85, legend_title=None):
# Remove duplicate label by using only 1 axes
handles, labels = plt.gca().get_legend_handles_labels()
fig.legend(handles, labels, loc=7, title=legend_title)
fig.subplots_adjust(right=right)
# File: src/omniverse/omniverse/pub_sub.py (repo XinghuiTao/ros2_turtlebot, MIT)
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan
from rclpy.qos import ReliabilityPolicy, QoSProfile
from messager.msg import Date
class Bootstrap(Node):
def __init__(self):
super().__init__('Bootstrap')
self.publisher_ = self.create_publisher(Twist, 'cmd_vel', 10)
self.subscriber = self.create_subscription(LaserScan, '/scan', self.move_turtlebot, QoSProfile(depth=10, reliability=ReliabilityPolicy.BEST_EFFORT))
# prevent unused variable warning
self.subscriber
# define the timer period for 0.5 seconds
self.timer_period = 0.5
# define the variable to save the received info
self.laser_forward = 0
# create a Twist message
self.cmd = Twist()
self.timer = self.create_timer(self.timer_period, self.motion)
def move_turtlebot(self,msg):
# Save the frontal laser scan info at 0°
self.laser_forward = msg.ranges[359]
def motion(self):
# print the data
self.get_logger().info('I receive: "%s"' % str(self.laser_forward))
# Logic of move
if self.laser_forward > 5:
self.cmd.linear.x = 0.5
self.cmd.angular.z = 0.5
        elif 0.5 <= self.laser_forward < 5:
self.cmd.linear.x = 0.2
self.cmd.angular.z = 0.0
else:
self.cmd.linear.x = 0.0
self.cmd.angular.z = 0.0
        # Publish the cmd_vel values to the topic
self.publisher_.publish(self.cmd)
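The obstacle-avoidance branch above becomes easy to unit-test if the thresholds are pulled out into a pure function; a sketch mirroring the same logic without ROS (the function name `decide` is illustrative):

```python
def decide(laser_forward, near=0.5, far=5.0):
    # Map the frontal range reading to (linear.x, angular.z), using the
    # same thresholds as Bootstrap.motion
    if laser_forward > far:
        return 0.5, 0.5   # open space: cruise while turning
    if near <= laser_forward < far:
        return 0.2, 0.0   # approaching an obstacle: slow, straight
    return 0.0, 0.0       # too close (or invalid reading): stop
```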
def main(args=None):
rclpy.init(args=args)
bootstrap = Bootstrap()
rclpy.spin(bootstrap)
bootstrap.destroy_node()
rclpy.shutdown()
if __name__ == '__main__':
    main()
# File: agent/indy_catalyst_agent/messaging/basicmessage/handlers/basicmessage_handler.py
# (repo blhagadorn/indy-catalyst, Apache-2.0)
"""Basic message handler."""
from ...base_handler import BaseHandler, BaseResponder, RequestContext
from ..messages.basicmessage import BasicMessage
class BasicMessageHandler(BaseHandler):
"""Message handler class for basic messages."""
async def handle(self, context: RequestContext, responder: BaseResponder):
"""
Message handler logic for basic messages.
Args:
context: request context
responder: responder callback
"""
self._logger.debug(f"BasicMessageHandler called with context {context}")
assert isinstance(context.message, BasicMessage)
self._logger.info("Received basic message: %s", context.message.content)
content = context.message.content
if content.startswith("Reply with: "):
reply = content[12:]
reply = BasicMessage(content=reply, _l10n=context.message._l10n)
await responder.send_reply(reply)
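The `content[12:]` slice above hard-codes the length of the `"Reply with: "` prefix; a small sketch of the same parsing with the length derived from the prefix string itself:

```python
PREFIX = "Reply with: "

def parse_reply(content, prefix=PREFIX):
    # Return the text after the prefix, or None when it does not apply
    if content.startswith(prefix):
        return content[len(prefix):]
    return None

reply = parse_reply("Reply with: pong")
```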
# File: src/InversionSection.py (repo r-barnes/PIS-G, MIT)
import matplotlib
matplotlib.use("Qt5Agg")
from PyQt5.QtWidgets import *
from matplotlib.backends.backend_qt5agg import FigureCanvasQTAgg as FigureCanvas
from matplotlib.backends.backend_qt5agg import NavigationToolbar2QT as NavigationToolbar
from matplotlib.figure import Figure
import matplotlib.pyplot as plt
import numpy as np
from matplotlib import cm
class MyMplPara(FigureCanvas):
def __init__(self, parent=None):
plt.rcParams['font.family'] = ['Times New Roman']
plt.rcParams['axes.unicode_minus'] = False
plt.rcParams['savefig.dpi'] = 300
self.fig = Figure()
self.axes = self.fig.add_subplot(111, aspect='equal')
FigureCanvas.__init__(self, self.fig)
self.setParent(parent)
FigureCanvas.setSizePolicy(self, QSizePolicy.Expanding, QSizePolicy.Expanding)
FigureCanvas.updateGeometry(self)
def setTitle(self, title):
self.axes.set_title(title)
def xy_section(self, x1, x2, y1, y2, z, dx, dy, dz, deep, colorbarTitle):
xx = np.linspace(x1, x2, int((x2-x1)/dx+1))
yy = np.linspace(y1, y2, int((y2-y1)/dy+1))
X,Y = np.meshgrid(xx,yy)
CS = self.axes.contourf(X,Y,z,10, cmap=cm.jet)
cbar = self.fig.colorbar(CS, shrink=1, aspect=10,)
if colorbarTitle == "":
cbar.ax.set_ylabel("density(g/cm^3)")
else:
cbar.ax.set_ylabel(colorbarTitle)
self.axes.tick_params(labelsize=9)
self.axes.set_xlabel('x/m')
self.axes.set_ylabel('y/m')
def xz_section(self, x1,x2, z1, z2, value, dx, dy, dz, deep, colorbarTitle):
xx = np.linspace(x1, x2, int((x2-x1)/dx+1))
zz = np.linspace(z1, z2, int((z2-z1)/dz+1))
X,Z = np.meshgrid(xx,zz)
CS = self.axes.contourf(X,Z,value,10, cmap=cm.jet)
cbar = self.fig.colorbar(CS, shrink=1, aspect=10,)
if colorbarTitle == "":
cbar.ax.set_ylabel("density(g/cm^3)")
else:
cbar.ax.set_ylabel(colorbarTitle)
self.axes.tick_params(labelsize=9)
self.axes.set_xlabel('x/m')
self.axes.set_ylabel('z/m')
self.axes.invert_yaxis()
def yz_section(self, y1,y2, z1, z2, value, dx, dy, dz, deep, colorbarTitle):
yy = np.linspace(y1, y2, int((y2-y1)/dy+1))
zz = np.linspace(z1, z2, int((z2-z1)/dz+1))
Y,Z = np.meshgrid(yy,zz)
CS = self.axes.contourf(Y,Z,value,10, cmap=cm.jet)
cbar = self.fig.colorbar(CS, shrink=1, aspect=10,)
if colorbarTitle == "":
cbar.ax.set_ylabel("density(g/cm^3)")
else:
cbar.ax.set_ylabel(colorbarTitle)
self.axes.tick_params(labelsize=9)
self.axes.set_xlabel('y/m')
self.axes.set_ylabel('z/m')
self.axes.invert_yaxis()
def saveFig(self, fileName):
self.fig.savefig(fileName)
class InversionSection(QWidget):
def __init__(self, parent=None):
super(InversionSection, self).__init__(parent)
self.initUi()
def initUi(self):
self.layout = QVBoxLayout(self)
self.mpl = MyMplPara(self)
self.mpl_ntb = NavigationToolbar(self.mpl, self)
self.layout.addWidget(self.mpl)
self.layout.addWidget(self.mpl_ntb)
146dd66480416c85e1f3fbd450fd3d6014348667 | 829 | py | Python | editor_csv.py | zhixing121/python-web-spider | b696bfee2c14bf30f925e202f2c61da45d4a7c23 | [
"CNRI-Python"
] | 1 | 2017-12-26T16:01:13.000Z | 2017-12-26T16:01:13.000Z | editor_csv.py | zhixing121/python-web-spider | b696bfee2c14bf30f925e202f2c61da45d4a7c23 | [
"CNRI-Python"
] | null | null | null | editor_csv.py | zhixing121/python-web-spider | b696bfee2c14bf30f925e202f2c61da45d4a7c23 | [
"CNRI-Python"
] | null | null | null | import csv
import os
from urllib.request import urlopen
from bs4 import BeautifulSoup
html = urlopen("http://en.wikipedia.org/wiki/Comparison_of_text_editors")
bsObj = BeautifulSoup(html, "html.parser")
main_table = bsObj.findAll("table", {"class": "wikitable"})[0]  # findAll returns a list
rows = main_table.findAll("tr")
try:
csvFile = open("../files/csvFile.csv", "wt", newline='', encoding='utf-8')
except FileNotFoundError as err:
    print(err)
os.mkdir("../files")
csvFile = open("../files/csvFile.csv", "wt", newline='', encoding='utf-8')
writer = csv.writer(csvFile)
try:
for row in rows:
csvRow = []
for cell in row.findAll(["td", "th"]):
csvRow.append(cell.get_text())
writer.writerow(csvRow)
finally:
csvFile.close()
# print(rows)
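The manual open/close handling above can be expressed more robustly with context managers, which close the file even if writing fails partway; a hedged sketch of the same write path with a plain list of rows standing in for the scraped table:

```python
import csv
import os
import tempfile

rows = [["Editor", "License"], ["Vim", "Vim license"], ["Emacs", "GPL"]]

out_dir = tempfile.mkdtemp()
out_path = os.path.join(out_dir, "csvFile.csv")

# newline='' avoids blank lines on Windows; the with-block closes the file
with open(out_path, "wt", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    for row in rows:
        writer.writerow(row)

with open(out_path, newline="", encoding="utf-8") as f:
    read_back = list(csv.reader(f))
```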
# File: kstore/migrations/0002_product.py (repo KeoH/django-keoh-kstore, BSD-2-Clause)
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('kstore', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='Product',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(max_length=255, verbose_name='Product Name')),
('description', models.TextField(null=True, verbose_name='Product description', blank=True)),
('description_short', models.TextField(null=True, verbose_name='Short description', blank=True)),
('ean13', models.CharField(max_length=13, null=True, verbose_name='Codigo de barras', blank=True)),
('width', models.DecimalField(null=True, verbose_name='Width in cm', max_digits=20, decimal_places=2, blank=True)),
('height', models.DecimalField(null=True, verbose_name='Height in cm', max_digits=20, decimal_places=2, blank=True)),
('depth', models.DecimalField(null=True, verbose_name='Depth in cm', max_digits=20, decimal_places=2, blank=True)),
('weight', models.DecimalField(null=True, verbose_name='Weight in kg', max_digits=20, decimal_places=2, blank=True)),
('quantity', models.PositiveSmallIntegerField(default=0, verbose_name='Quantity')),
('price', models.DecimalField(verbose_name='Price in euros', max_digits=20, decimal_places=2)),
('cost_price', models.DecimalField(verbose_name='Cost price in euros', max_digits=20, decimal_places=2)),
('taxes', models.DecimalField(verbose_name='Taxes', max_digits=20, decimal_places=2)),
('out_of_stock', models.BooleanField(default=True, verbose_name='Out of Stock')),
('manufacturer', models.ForeignKey(verbose_name=b'Manufacturer', blank=True, to='kstore.Manufacturer', null=True)),
('supplier', models.ForeignKey(verbose_name=b'Supplier', blank=True, to='kstore.Supplier', null=True)),
],
options={
'db_table': 'ks_product',
'verbose_name': 'product',
'verbose_name_plural': 'products',
},
),
]
14717cb73f4fcafb66b24b0e05933e41c92a4ce2 | 6,194 | py | Python | lib/tensor_helper.py | jwspaeth/FAA-Project | afa9d3bec10deead48c4b17dff69df2e02691e41 | [
"MIT"
] | null | null | null | lib/tensor_helper.py | jwspaeth/FAA-Project | afa9d3bec10deead48c4b17dff69df2e02691e41 | [
"MIT"
] | 2 | 2019-10-20T00:42:40.000Z | 2019-10-30T18:06:11.000Z | lib/tensor_helper.py | jwspaeth/FAA-Project | afa9d3bec10deead48c4b17dff69df2e02691e41 | [
"MIT"
] | null | null | null |
import tensorflow as tf
from tensorflow.keras import backend
from tensorflow.keras.layers import Lambda
from tensorflow.keras import layers
def split(tensor, axis, keep_dims=False):
shape = backend.int_shape(tensor)
tensor_list = []
for i in range(shape[axis]):
        # NOTE: this slice assumes a 4-D tensor and axis == 1; binding i as a
        # default argument keeps each Lambda on its own index if re-invoked
        sliced_tensor = Lambda(lambda x, i=i: x[:, i, :, :])(tensor)
if keep_dims:
sliced_tensor = Lambda(lambda x: backend.expand_dims(x, axis=axis))(sliced_tensor)
tensor_list.append(sliced_tensor)
return tensor_list
def merge(tensor_list, axis, expand_dims=False, name=""):
if expand_dims:
for i in range(len(tensor_list)):
tensor_list[i] = Lambda(lambda x: backend.expand_dims(x, axis=axis))(tensor_list[i])
merged_tensor = Lambda(lambda x: backend.concatenate(x, axis))(tensor_list)
if name != "":
merged_tensor = Lambda(lambda x: x, name=name)(merged_tensor)
return merged_tensor
def map(func, tensor_list):
func_output_list = []
for tensor in tensor_list:
func_output_list.append(func(tensor))
return func_output_list
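`split`, `map`, and `merge` above implement a per-slice pipeline over one tensor axis; the same plumbing can be sketched with plain lists to show the data flow (a toy stand-in, not the Keras version):

```python
def split_list(matrix):
    # One "slice" per column, mimicking split(tensor, axis=1)
    n_cols = len(matrix[0])
    return [[row[j] for row in matrix] for j in range(n_cols)]

def map_slices(func, slices):
    # Apply func to every slice, mimicking map(func, tensor_list)
    return [func(s) for s in slices]

def merge_slices(slices):
    # Re-stack the processed slices back into rows, mimicking merge
    return [list(col) for col in zip(*slices)]

m = [[1, 2], [3, 4]]
doubled = merge_slices(map_slices(lambda s: [2 * v for v in s], split_list(m)))
```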
##########################################
class MyDenseStackLayer(tf.keras.layers.Layer):
def __init__(self, input_size, stack_config, name=""):
if name != "":
super(MyDenseStackLayer, self).__init__(name=name)
else:
super(MyDenseStackLayer, self).__init__()
### Collect all layers of encoder
self.stack_layer_list = []
for i in range(len(stack_config.n_layers_list)):
### Define dense layer
dense_layer = layers.Dense(units=stack_config.n_layers_list[i],
activation=stack_config.activation_type_list[i])
dense_layer.trainable = stack_config.trainable_list[i]
self.stack_layer_list.append(dense_layer)
def call(self, inputs):
pipeline = inputs
for i in range(len(self.stack_layer_list)):
pipeline = self.stack_layer_list[i](pipeline)
outputs = pipeline
return outputs
class MyEncoderLayer(tf.keras.layers.Layer):
def __init__(self, encoder_config, name=""):
if name != "":
super(MyEncoderLayer, self).__init__(name=name)
else:
super(MyEncoderLayer, self).__init__()
### Collect all layers of encoder
self.encoder_layer_list = []
for i in range(len(encoder_config.n_filters_list)):
### Define convolution layer
conv_layer = layers.Conv2D(filters=encoder_config.n_filters_list[i],
kernel_size=encoder_config.kernel_size_list[i],
strides=encoder_config.n_strides_list[i],
padding=encoder_config.padding_list[i],
activation=encoder_config.activation_type_list[i])
conv_layer.trainable = encoder_config.trainable_list[i]
self.encoder_layer_list.append(conv_layer)
def call(self, inputs):
pipeline = inputs
for i in range(len(self.encoder_layer_list)):
pipeline = self.encoder_layer_list[i](pipeline)
outputs = pipeline
return outputs
class MyDecoderLayer(tf.keras.layers.Layer):
def __init__(self, decoder_config, name=""):
if name != "":
super(MyDecoderLayer, self).__init__(name=name)
else:
super(MyDecoderLayer, self).__init__()
self.decoder_layer_list = []
for i in range(len(decoder_config.n_filters_list)):
### Define convolution layer
transconv_layer = layers.Conv2DTranspose(filters=decoder_config.n_filters_list[i],
kernel_size=decoder_config.kernel_size_list[i],
strides=decoder_config.n_strides_list[i],
padding=decoder_config.padding_list[i],
output_padding=decoder_config.output_padding[i],
activation=decoder_config.activation_type_list[i])
transconv_layer.trainable = decoder_config.trainable_list[i]
self.decoder_layer_list.append(transconv_layer)
def call(self, inputs):
pipeline = inputs
for i in range(len(self.decoder_layer_list)):
pipeline = self.decoder_layer_list[i](pipeline)
outputs = pipeline
return outputs
class MyNoiserLayer(tf.keras.layers.Layer):
def __init__(self, noiser_config, name=""):
if name != "":
super(MyNoiserLayer, self).__init__(name=name)
else:
super(MyNoiserLayer, self).__init__()
self.gaussian_noise_variable = 0
self.binarizer_layer = 0
self.multiplication_layer = Lambda(lambda x: x[0]+x[1])
    def call(self, inputs):
        # Noise injection is not implemented yet; pass inputs through unchanged
        return inputs
class MyInputLayer(tf.keras.layers.Layer):
def __init__(self, name=""):
if name != "":
super(MyInputLayer, self).__init__(name=name)
else:
super(MyInputLayer, self).__init__()
self.layer = Lambda(lambda x: x)
def build(self, input_shape):
super(MyInputLayer, self).build(input_shape)
def call(self, inputs):
return self.layer(inputs)
class MySplitLayer(tf.keras.layers.Layer):
def __init__(self, axis_size):
super(MySplitLayer, self).__init__()
self.layer_list = []
for i in range(axis_size):
            # Bind i at definition time; a bare closure would capture the loop
            # variable late and every Lambda would slice the same (last) index
            self.layer_list.append(Lambda(lambda x, i=i: x[:, i, :, :]))
def build(self, input_shape):
super(MySplitLayer, self).build(input_shape)
def call(self, inputs):
tensor_list = []
for layer in self.layer_list:
tensor_list.append(layer(inputs))
return tensor_list
class MyMergeLayer(tf.keras.layers.Layer):
def __init__(self, merge_axis, axis_size, expand_dims=False, name=""):
if name != "":
super(MyMergeLayer, self).__init__(name=name)
else:
super(MyMergeLayer, self).__init__()
self.expand_dims = expand_dims
self.axis_size = axis_size
if expand_dims:
self.expand_layer = Lambda(lambda x: backend.expand_dims(x, axis=merge_axis))
self.merge_layer = Lambda(lambda x: backend.concatenate(x, merge_axis))
def build(self, input_shape):
super(MyMergeLayer, self).build(input_shape)
def call(self, inputs):
if self.expand_dims:
for i in range(self.axis_size):
inputs[i] = self.expand_layer(inputs[i])
merged_inputs = self.merge_layer(inputs)
return merged_inputs
###############################
1471ffde947b195df3146f6179d3445bcf40ebc8 | 2,126 | py | Python | tests/unit/configuration_subsystem/test_internals.py | LaudateCorpus1/ansible-navigator | 28cdea13dba3e9039382eb993989db4b3e61b237 | [
"Apache-2.0"
] | null | null | null | tests/unit/configuration_subsystem/test_internals.py | LaudateCorpus1/ansible-navigator | 28cdea13dba3e9039382eb993989db4b3e61b237 | [
"Apache-2.0"
] | null | null | null | tests/unit/configuration_subsystem/test_internals.py | LaudateCorpus1/ansible-navigator | 28cdea13dba3e9039382eb993989db4b3e61b237 | [
"Apache-2.0"
] | null | null | null | """Test the internals of a NavigatorConfiguration."""
import os
from copy import deepcopy
from ansible_navigator.configuration_subsystem import Constants
from ansible_navigator.configuration_subsystem import NavigatorConfiguration
from ansible_navigator.initialization import parse_and_update
from .defaults import TEST_FIXTURE_DIR
def test_settings_file_path_file_none():
"""Confirm a settings file path is not stored in the internals when not present."""
args = deepcopy(NavigatorConfiguration)
parse_and_update(params=[], args=args, initial=True)
assert args.internals.settings_file_path is None
assert args.internals.settings_source == Constants.NONE
def test_settings_file_path_file_system(monkeypatch):
"""Confirm a settings file path is stored in the internals when searched.
:param monkeypatch: Fixture providing these helper methods for safely patching and mocking
functionality in tests
"""
settings_file = "ansible-navigator.yml"
settings_file_path = os.path.join(TEST_FIXTURE_DIR, settings_file)
args = deepcopy(NavigatorConfiguration)
def getcwd():
return TEST_FIXTURE_DIR
monkeypatch.setattr(os, "getcwd", getcwd)
parse_and_update(params=[], args=args, initial=True)
assert args.internals.settings_file_path == settings_file_path
assert args.internals.settings_source == Constants.SEARCH_PATH
def test_settings_file_path_environment_variable(monkeypatch):
"""Confirm a settings file path is stored in the internals when set via environment variable.
:param monkeypatch: Fixture providing these helper methods for safely patching and mocking
functionality in tests
"""
settings_file = "ansible-navigator.yml"
settings_file_path = os.path.join(TEST_FIXTURE_DIR, settings_file)
monkeypatch.setenv("ANSIBLE_NAVIGATOR_CONFIG", settings_file_path)
args = deepcopy(NavigatorConfiguration)
parse_and_update(params=[], args=args, initial=True)
assert args.internals.settings_file_path == settings_file_path
assert args.internals.settings_source == Constants.ENVIRONMENT_VARIABLE
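The test above relies on pytest's `monkeypatch` fixture to set the environment variable; the same isolation can be sketched with stdlib `unittest.mock.patch.dict`, which restores `os.environ` when the block exits:

```python
import os
from unittest import mock

# The path here is an illustrative value, not a real settings file
with mock.patch.dict(os.environ,
                     {"ANSIBLE_NAVIGATOR_CONFIG": "/tmp/ansible-navigator.yml"}):
    inside = os.environ["ANSIBLE_NAVIGATOR_CONFIG"]

# Outside the block the variable is restored (removed if it was absent)
restored = os.environ.get("ANSIBLE_NAVIGATOR_CONFIG")
```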
# File: cogs/juan/fanclub.py (repo nwunderly/RevBots, MIT)
import discord
from discord.ext import commands
from discord.ext import tasks
import datetime
import asyncio
import decimal
import yaml
import re
import copy
import collections
from utils import juan_checks as checks
from utils import db
from utils import converters
VISITOR_ROLE = 582405430274293770
DEVELOPER_ROLE = 582399407698477057
MANAGE_GUILD = 582741228274319380
DEFAULT_BOT_COLOR = 590977254008553494
BOTS_ROLE = 576258076957605889
DEV_CHANNELS_CATEGORY = 577706266768703488
MOD_BOTS_ROLE = 590977954474098700
BOT_LORDS_ROLE = 664328292522000424
pattern = re.compile(r"^= ([\w| ]+) =$")
def sort_roles(roles):
role_list = copy.copy(roles)
role_list.reverse()
_sorted = collections.defaultdict(list)
category = "top"
for role in role_list:
match = pattern.fullmatch(role.name)
if match:
category = match.group(1)
else:
_sorted[category].append(role)
return _sorted
def get_category_list(roles):
role_list = copy.copy(roles)
role_list.reverse()
categories = list()
    # Iterate the reversed copy so categories come out top-to-bottom
    for role in role_list:
match = pattern.fullmatch(role.name)
if match:
categories.append(role)
return categories
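`sort_roles` and `get_category_list` group roles under `= Header =` marker roles via the regex above; the grouping logic can be exercised with plain strings standing in for Discord role objects:

```python
import collections
import re

pattern = re.compile(r"^= ([\w| ]+) =$")

def group_by_headers(names):
    # Names arrive bottom-to-top like guild.roles; reverse to read
    # top-to-bottom, then bucket each name under the last header seen
    grouped = collections.defaultdict(list)
    category = "top"
    for name in reversed(names):
        match = pattern.fullmatch(name)
        if match:
            category = match.group(1)
        else:
            grouped[category].append(name)
    return grouped

names = ["green", "red", "= Colors =", "admin", "= Staff ="]
grouped = group_by_headers(names)
```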
class DevData:
def __init__(self, cog):
self.cog = cog
self.bot = cog.bot
self.table = cog.table
def get(self, user_id, key=None):
try:
if key:
return self.table.get([user_id, 'devData'])[key]
else:
return self.table.get([user_id, 'devData'])
except KeyError:
return None
def put(self, new_data, user_id):
try:
data = self.table.get([user_id, 'devData'])
except KeyError:
data = {}
data.update(new_data)
self.table.put(data, [user_id, 'devData'])
def get_all(self):
data = self.table.read('devData')
for item in data:
item.pop('dataType')
item['user'] = item.pop('snowflake')
return data
def get_bot(self, bot_id):
data = self.get_all()
for user_data in data:
for bot_data in user_data['bots']:
if bot_data['id'] == bot_id:
bot_data['owner'] = user_data['user']
return bot_data
return None
def whose_bot(self, bot_id):
data = self.get_all()
for user_data in data:
for bot_data in user_data['bots']:
if bot_data['id'] == bot_id:
return user_data['user']
return None
def whose_channel(self, channel_id):
data = self.get_all()
for user_data in data:
if 'devChannel' not in user_data.keys():
continue
if user_data['devChannel'] == channel_id:
return user_data['user']
return None
def whose_role(self, role_id):
data = self.get_all()
for user_data in data:
if 'botRole' not in user_data.keys():
continue
if user_data['botRole'] == role_id:
return user_data['user']
return None
class FanClub(commands.Cog):
def __init__(self, bot):
self.bot = bot
self.table = self.bot.table
self.dev_data = DevData(self)
@commands.Cog.listener()
async def on_ready(self):
self.cleanup_roles.start()
@tasks.loop(minutes=30)
async def cleanup_roles(self):
guild = self.bot.get_guild(self.bot.properties.guild)
category_roles = get_category_list(guild.roles)
bot_roles = [guild.get_role(role) for role in [BOTS_ROLE, BOT_LORDS_ROLE, MOD_BOTS_ROLE]]
dev_bot_roles = sort_roles(guild.roles).get('Special Bot Roles')
for member in guild.members:
for role in category_roles:
if role in member.roles:
await member.remove_roles(*category_roles, reason="Role cleanup")
break
if not member.bot:
for role in bot_roles:
if role in member.roles:
await member.remove_roles(*bot_roles, reason="Role cleanup")
break
# for role in dev_bot_roles:
# if role in member.roles:
# await member.remove_roles(*dev_bot_roles, reason="Role cleanup")
# break
    def get_bots(self, user):
        if isinstance(user, (int, decimal.Decimal)):
            return self.dev_data.get(user, 'bots')
        return self.dev_data.get(user.id, 'bots')

    def get_dev_channel(self, user):
        if isinstance(user, (int, decimal.Decimal)):
            return self.dev_data.get(user, 'devChannel')
        return self.dev_data.get(user.id, 'devChannel')

    def get_bot_role(self, user):
        if isinstance(user, (int, decimal.Decimal)):
            return self.dev_data.get(user, 'botRole')
        return self.dev_data.get(user.id, 'botRole')

    def update_dev_data(self, new_data, user_id, key):
        data = {key: new_data}
        self.dev_data.put(data, user_id)

    def register_bot(self, user, bot_data):
        user_id = user if isinstance(user, (int, decimal.Decimal)) else user.id
        bot_data['timestamp'] = str(datetime.datetime.now())
        bots = self.get_bots(user_id)
        if bots:
            # iterate over a copy: removing from a list while iterating it skips elements
            for bot in list(bots):
                if bot['id'] == bot_data['id']:
                    bots.remove(bot)
        else:
            bots = []
        bots.append(bot_data)
        self.update_dev_data(bots, user_id, 'bots')

    @staticmethod
    async def added_by(member):
        async for entry in member.guild.audit_logs(limit=10, action=discord.AuditLogAction.bot_add):
            if entry.target == member:
                return entry.user
        return None
    async def handle_unregistered_bot(self, member):
        added_by = await self.added_by(member)
        added_by_id = added_by.id if added_by else None
        await member.kick(reason=f"Unregistered bot, added by {added_by} ({added_by_id})")
        logs = self.bot.get_channel(self.bot.properties.channels['logs'])
        await logs.send(f"{datetime.datetime.now()}: {member} has been kicked: unregistered bot, added by {added_by} ({added_by_id}).")
        general = self.bot.get_channel(self.bot.properties.channels['general'])
        await general.send(f"{member} has been kicked." + (f" {added_by.mention}, p" if added_by else " P") +
                           "lease register your bot with me before adding it to this server.")

    async def handle_registered_bot(self, member, bot_data):
        added_by = await self.added_by(member)
        roles = []
        timestamp = bot_data.get('timestamp', "No timestamp")
        owner_id = bot_data.get('owner')
        bot_role_id = self.get_bot_role(owner_id)
        try:
            owner = await self.bot.fetch_user(owner_id)
        except discord.NotFound:
            owner = None
        added_by_id = added_by.id if added_by else None
        self.bot.dispatch('bot_add', member, added_by, owner)
        if bot_role_id:
            roles.append(member.guild.get_role(bot_role_id))
        else:
            roles.append(member.guild.get_role(DEFAULT_BOT_COLOR))
        roles.append(member.guild.get_role(BOTS_ROLE))
        if roles:
            await member.add_roles(*roles, reason=f"Bot added by {added_by} ({added_by_id}), registered to {owner} ({owner_id}) [{timestamp}]")
        logs = self.bot.get_channel(self.bot.properties.channels['logs'])
        await logs.send(f"{datetime.datetime.now()}: {member} added by {added_by} ({added_by_id}), registered to {owner} ({owner_id}) [{timestamp}]")
        general = self.bot.get_channel(self.bot.properties.channels['general'])
        await general.send(f"{added_by.mention if added_by else None} has added bot {member.mention} to the server!")
        if not owner or owner not in member.guild.members:
            return
        owner = member.guild.get_member(owner.id)
        bots = self.get_bots(owner)
        if len(bots) == 1:
            role = member.guild.get_role(DEVELOPER_ROLE)
            if role not in owner.roles:  # check the owner's roles, not the bot's
                await owner.add_roles(role, reason="First bot! 🎉")
                await general.send(f"Congrats {owner.mention} on adding your first bot! 🎉")
    @commands.Cog.listener()
    async def on_member_join(self, member):
        if member.guild.id != self.bot.properties.guild:
            return
        if not member.bot:
            general = self.bot.get_channel(self.bot.properties.channels['general'])
            visitor_role = member.guild.get_role(VISITOR_ROLE)
            await general.send(f"Welcome {member.mention}!")
            await member.add_roles(visitor_role, reason="Autorole")
            return
        bot_data = self.dev_data.get_bot(member.id)
        await asyncio.sleep(2)
        if bot_data:
            await self.handle_registered_bot(member, bot_data)
        else:
            await self.handle_unregistered_bot(member)
    @commands.command()
    async def bots(self, ctx, who: discord.Member = None):
        who = who if who else ctx.author
        bots = self.get_bots(who.id)
        for bot_info in bots:
            for key, value in bot_info.items():
                if isinstance(value, decimal.Decimal):
                    bot_info[key] = str(self.bot.get_user(int(value)))
        formatted = yaml.dump(bots)
        await ctx.send(f"```{formatted}```")

    @commands.command()
    async def bot(self, ctx, who: discord.Member):
        bot_info = self.dev_data.get_bot(who.id)  # get_bot expects an id, not a Member
        for key, value in bot_info.items():
            if isinstance(value, decimal.Decimal):
                bot_info[key] = str(self.bot.get_user(int(value)))
        formatted = yaml.dump(bot_info)
        await ctx.send(f"```{formatted}```")

    @commands.command()
    async def whosebot(self, ctx, bot: discord.Member):
        owner = self.dev_data.whose_bot(bot.id)
        await ctx.send(f"This bot is registered to {owner}.")

    @commands.command()
    async def whosechannel(self, ctx, channel: discord.TextChannel):
        owner = self.dev_data.whose_channel(channel.id)
        await ctx.send(f"This channel is registered to {owner}.")

    @commands.command()
    async def whoserole(self, ctx, role: discord.Role):
        owner = self.dev_data.whose_role(role.id)
        await ctx.send(f"This role is registered to {owner}.")
    @commands.command()
    @checks.is_admin()
    async def forceregister(self, ctx, owner_id: int, bot_id: int, prefix):
        try:
            data = {
                'id': bot_id,
                'prefix': prefix
            }
            owner = await self.bot.fetch_user(owner_id)
            self.register_bot(owner, data)
            await ctx.send(f"Registered {bot_id} to {owner_id}.")
        except Exception as e:
            await ctx.send(str(e))

    @commands.command()
    async def addbot(self, ctx, bot_id: int, prefix):
        url = discord.utils.oauth_url(str(bot_id))
        data = {
            'id': bot_id,
            'prefix': prefix
        }
        self.register_bot(ctx.author, data)
        manage_guild = ctx.guild.get_role(MANAGE_GUILD)
        await ctx.author.add_roles(manage_guild, reason=f"Perms to add bot, client id {bot_id}")

        def check(bot, added_by, owner):
            return ctx.author.id in [added_by.id if added_by else None, owner.id if owner else None]

        await ctx.send("<" + url + ">")
        try:
            await self.bot.wait_for('bot_add', check=check, timeout=600)
        except asyncio.TimeoutError:
            pass
        await ctx.author.remove_roles(manage_guild, reason="Added bot or request timed out")
    @commands.command()
    @checks.is_developer()
    async def register(self, ctx, bot: discord.Member, prefix):
        owner_id = self.dev_data.whose_bot(bot.id)  # whose_bot expects an id
        if owner_id:
            try:
                owner = await self.bot.fetch_user(owner_id)
            except discord.NotFound:
                owner = None
            await ctx.send(f"{bot} is already owned by {owner if owner else owner_id}!")
            return  # do not re-register a bot that already has an owner
        data = {
            'id': bot.id,
            'prefix': prefix
        }
        self.register_bot(ctx.author, data)
        role = self.get_bot_role(ctx.author)
        if role:
            role = ctx.guild.get_role(role)
        if role:
            try:
                timestamp = self.dev_data.get_bot(bot.id)['timestamp']
            except TypeError:
                timestamp = None
            await bot.add_roles(role, reason=f"Registered to {ctx.author} ({ctx.author.id}) [{timestamp}]")
        await ctx.send(f"Registered {bot} to {ctx.author}.")
    @commands.command()
    @checks.is_admin()
    async def unregister(self, ctx, bot_id: int):
        n = 0
        while (bot_data := self.dev_data.get_bot(bot_id)) is not None:
            user_id = bot_data.pop('owner')
            bots = self.dev_data.get(user_id, 'bots')
            # iterate over a copy: removing from a list while iterating it skips elements
            for bot in list(bots):
                if bot['id'] == bot_id:
                    n += 1
                    bots.remove(bot)
                    await ctx.send(f"Removed from bots owned by user {user_id}.")
            self.update_dev_data(bots, user_id, 'bots')
        await ctx.send(f"Done. Bot was registered {n} times.")
    @commands.command()
    @checks.is_developer()
    async def claimchannel(self, ctx, channel: discord.TextChannel):
        if channel.category.id != DEV_CHANNELS_CATEGORY:
            await ctx.send("Channel must be in \"Dev Channels\" category.")
            return
        owner = self.dev_data.whose_channel(channel.id)  # whose_channel expects an id
        if owner:
            await ctx.send(f"That channel is owned by {owner}.")
            return
        self.update_dev_data(channel.id, ctx.author.id, 'devChannel')
        await ctx.send("Done!")
    @commands.command()
    @checks.is_developer()
    async def claimrole(self, ctx, role: discord.Role):
        owner = self.dev_data.whose_role(role.id)  # whose_role expects an id
        if owner:
            await ctx.send(f"That role is owned by {owner}.")
            return
        sorted_roles = sort_roles(ctx.guild.roles)
        if role not in sorted_roles['Special Bot Roles']:
            await ctx.send("Role must be in \"Special Bot Roles\" category.")
            return  # reject roles outside the category instead of claiming them anyway
        self.update_dev_data(role.id, ctx.author.id, 'botRole')
        await ctx.send("Done!")

def setup(bot):
    bot.add_cog(FanClub(bot))
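`register_bot` above implements a replace-or-append update keyed on the bot id: any existing entry with the same id is dropped before the new record is appended. A stand-alone sketch of that idea with toy data (names and values are hypothetical):

```python
def register(bots, bot_data):
    """Replace any existing entry with the same id, then append the new record."""
    bots = [b for b in bots if b['id'] != bot_data['id']]
    bots.append(bot_data)
    return bots

bots = [{'id': 1, 'prefix': '!'}, {'id': 2, 'prefix': '?'}]
bots = register(bots, {'id': 1, 'prefix': '$'})
print(bots)  # [{'id': 2, 'prefix': '?'}, {'id': 1, 'prefix': '$'}]
```

Filtering into a new list sidesteps the remove-while-iterating pitfall entirely.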
# ==== File: lab6/program/lidar_model.py (repo: windsuzu/Robot-Navigation, license: MIT) ====
import numpy as np
import cv2
import sys
from utils import *


class LidarModel:
    def __init__(self,
                 img_map,
                 sensor_size=21,
                 start_angle=-120.0,
                 end_angle=120.0,
                 max_dist=200.0,
                 ):
        self.sensor_size = sensor_size
        self.start_angle = start_angle
        self.end_angle = end_angle
        self.max_dist = max_dist
        self.img_map = img_map

    def measure(self, pos):
        sense_data = []
        inter = (self.end_angle - self.start_angle) / (self.sensor_size - 1)
        for i in range(self.sensor_size):
            theta = pos[2] + self.start_angle + i*inter
            sense_data.append(self._ray_cast(np.array((pos[0], pos[1])), theta))
        return sense_data

    def measure_2d(self, pos):
        sdata = self.measure(pos)
        plist = EndPoint(pos, [self.sensor_size, self.start_angle, self.end_angle], sdata)
        return sdata, plist

    def _ray_cast(self, pos, theta):
        end = np.array((pos[0] + self.max_dist*np.cos(np.deg2rad(theta)),
                        pos[1] + self.max_dist*np.sin(np.deg2rad(theta))))
        x0, y0 = int(pos[0]), int(pos[1])
        x1, y1 = int(end[0]), int(end[1])
        plist = Bresenham(x0, x1, y0, y1)
        dist = self.max_dist
        for p in plist:
            if p[1] >= self.img_map.shape[0] or p[0] >= self.img_map.shape[1] or p[1] < 0 or p[0] < 0:
                continue
            if self.img_map[p[1], p[0]] < 0.5:
                tmp = np.power(float(p[0]) - pos[0], 2) + np.power(float(p[1]) - pos[1], 2)
                tmp = np.sqrt(tmp)
                if tmp < dist:
                    dist = tmp
        return dist


if __name__ == "__main__":
    img = cv2.flip(cv2.imread("Maps/map.png"), 0)
    img[img > 128] = 255
    img[img <= 128] = 0
    m = np.asarray(img)
    m = cv2.cvtColor(m, cv2.COLOR_RGB2GRAY)
    m = m.astype(float) / 255.
    img = img.astype(float) / 255.
    lmodel = LidarModel(m)
    pos = (100, 200, 0)
    sdata, plist = lmodel.measure_2d(pos)
    img_ = img.copy()
    for pts in plist:
        cv2.line(
            img_,
            (int(1*pos[0]), int(1*pos[1])),
            (int(1*pts[0]), int(1*pts[1])),
            (0.0, 1.0, 0.0), 1)
    cv2.circle(img_, (pos[0], pos[1]), 5, (0.5, 0.5, 0.5), 3)
    img_ = cv2.flip(img_, 0)
    cv2.imshow("Lidar Test", img_)
    k = cv2.waitKey(0)
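`_ray_cast` walks the Bresenham cells of a ray and keeps the distance to the nearest occupied cell, falling back to `max_dist` when nothing is hit. A stdlib-only sketch of that "minimum distance over hits" idea (the `cells` list and helper name are hypothetical):

```python
import math

def ray_min_distance(cells, origin, max_dist):
    """cells: list of ((x, y), occupied) samples along the ray."""
    dist = max_dist
    for (x, y), occupied in cells:
        if occupied:
            d = math.hypot(x - origin[0], y - origin[1])
            dist = min(dist, d)
    return dist

cells = [((1, 0), False), ((2, 0), True), ((3, 0), True)]
print(ray_min_distance(cells, (0, 0), 10.0))  # 2.0
```

Because the minimum is taken over all sampled cells, the result does not depend on the order in which the line algorithm yields them.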
# ==== File: main/tools/verification_code.py (repo: anngle/t923, license: BSD-3-Clause) ====
import random, string
from datetime import datetime
from io import BytesIO

# These imports were missing in the original; the route below uses Pillow and Flask.
from flask import current_app, make_response, session
from PIL import Image, ImageDraw, ImageFont

from ..views.public import bp as public_bp


def rndChar():
    """Random letters: a 4-character uppercase string."""
    chars = ''
    for i in range(4):
        chars += chr(random.randint(65, 90))
    return chars


def rndColor():
    """Random color 1 (background noise)."""
    return (random.randint(64, 255), random.randint(64, 255), random.randint(64, 255))


def rndColor2():
    """Random color 2 (text)."""
    return (random.randint(32, 127), random.randint(32, 127), random.randint(32, 127))


@public_bp.route('/generate_verification_code')
def generate_verification_code():
    """Generate a CAPTCHA image and store the answer in the session."""
    output = BytesIO()
    width = 70
    height = 30
    image = Image.new('RGB', (width, height), (255, 255, 255))
    # Font object
    font = ImageFont.truetype(current_app.config['VERIFICATION_CODE_FONT'], 18)
    draw = ImageDraw.Draw(image)
    for x in range(width):
        for y in range(height):
            draw.point((x, y), fill=rndColor())
    verify_str = rndChar()
    draw.text((10, 5), verify_str, font=font, fill=rndColor2())
    # Blur (disabled)
    # image = image.filter(ImageFilter.BLUR)
    image.save(output, "JPEG")
    img_data = output.getvalue()
    session['verify'] = verify_str
    response = make_response(img_data)
    response.headers['Content-Type'] = 'image/jpeg'
    return response
# ==== File: docs/conf.py (repo: purplepinapples/aniffinity, license: MIT) ====
import datetime
import sys

sys.path.insert(0, "..")

from aniffinity import __about__  # noqa: E402

year = datetime.datetime.now().year
author = __about__.__author__
project = __about__.__title__
release = __about__.__version__
copyright = "{}, {}".format(year, author)
version = ".".join(release.split(".")[:2])

extensions = ["sphinx.ext.autodoc"]
templates_path = ["_templates"]
source_suffix = ".rst"
master_doc = "index"
language = None
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
pygments_style = "sphinx"
html_theme = "sphinx_rtd_theme"
# html_static_path = ["_static"]
htmlhelp_basename = project


# Force it to document ``__init__``, because it doesn't do that by default
# https://stackoverflow.com/a/5599712
def skip(app, what, name, obj, skip, options):
    if name == "__init__":
        return False
    return skip


def setup(app):
    app.connect("autodoc-skip-member", skip)
# ==== File: staff/files.py (repo: skyydq/GreaterWMS, license: Apache-2.0) ====
from rest_framework_csv.renderers import CSVStreamingRenderer

class FileRenderCN(CSVStreamingRenderer):
    header = [
        'staff_name',
        'staff_type',
        'create_time',
        'update_time'
    ]
    labels = dict([
        ('staff_name', u'员工用户名'),
        ('staff_type', u'员工类型'),
        ('create_time', u'创建时间'),
        ('update_time', u'更新时间')
    ])


class FileRenderEN(CSVStreamingRenderer):
    header = [
        'staff_name',
        'staff_type',
        'create_time',
        'update_time'
    ]
    labels = dict([
        ('staff_name', u'Staff Name'),
        ('staff_type', u'Staff Type'),
        ('create_time', u'Create Time'),
        ('update_time', u'Update Time'),
    ])
# ==== File: scripts/utils.py (repo: AlgoveraAI/DeFi, license: MIT) ====
import pandas as pd
from fastai.tabular.all import *
from tqdm import tqdm  # used below but not imported in the original


def get_token_features(df, tokens):
    df1 = pd.DataFrame()
    for tok in tokens:
        df_tok = df[df['Token'] == tok]
        df_tok = df_tok.drop(['Token', 'Date'], axis=1)
        col_names = []
        for col in df_tok.columns:
            if col == 'Timestamp':
                col_names.append(f'{col}')
            else:
                col_names.append(f'{tok}_{col}')
        df_tok.columns = col_names
        # df_tok = df_tok.set_index('Timestamp', drop=True)
        if df1.empty:
            df1 = df_tok
        else:
            df1 = pd.merge(df1, df_tok, on='Timestamp')
    return df1  # assumed: the original relied on this merged frame implicitly


def get_target(row, df1, target_column, target_window):
    # `df1` is passed explicitly here; in the original the frame was shared implicitly.
    # Timestamps step by 1800 s, so the target is `target_window` half-hours ahead.
    try:
        target = df1[df1['Timestamp'] == row['Timestamp'] + 1800.0*target_window][target_column].values[0]
    except (IndexError, KeyError):
        target = np.nan
    return target


def get_tabpandas_singletimestep(df, tokens, target_window):
    y_names = []
    for tok in tokens:
        target = f'{tok}_Target'
        y_names.append(target)
        target_column = f'{tok}_Borrowing Rate'
        df[target] = df.apply(lambda x: get_target(x, df, target_column, target_window), axis=1)
    df = df.dropna()
    df = df.drop(['Timestamp', 'Date'], axis=1)
    df['Train'] = None
    train_index = int(len(df)*0.8)
    df.loc[:train_index, 'Train'] = True
    df.loc[train_index:, 'Train'] = False
    df = df.reset_index(drop=True)
    splits = (list(df[df['Train'] == True].index), list(df[df['Train'] == False].index))
    df = df.drop(['Train'], axis=1)
    cont_names = list(df.columns[:len(tokens)])
    procs = [Categorify, FillMissing, Normalize]
    y_block = RegressionBlock()
    to = TabularPandas(df, procs=procs, cont_names=cont_names,
                       y_names=y_names, y_block=y_block, splits=splits)
    dls = to.dataloaders(bs=128)
    return to, dls


def get_tabpandas_multi(df, target_window, n_timepoint_inp):
    df = df.reset_index(drop=True)
    feature_cols = ['DAI_Borrowing Rate', 'DAI_Deposit Rate', 'DAI_Borrow Volume', 'DAI_Supply Volume',
                    'USDC_Borrowing Rate', 'USDC_Deposit Rate', 'USDC_Borrow Volume', 'USDC_Supply Volume',
                    'USDT_Borrowing Rate', 'USDT_Deposit Rate', 'USDT_Borrow Volume', 'USDT_Supply Volume',
                    'ETH_Borrowing Rate', 'ETH_Deposit Rate', 'ETH_Borrow Volume', 'ETH_Supply Volume']
    target_columns = ['DAI_Borrowing Rate', 'USDC_Borrowing Rate', 'USDT_Borrowing Rate', 'ETH_Borrowing Rate']
    cols_names = []
    for j in range(n_timepoint_inp):
        for col in feature_cols:
            cols_names.append(f'{col}_t-{n_timepoint_inp - j - 1}')
    cols_names += target_columns
    pairs = []
    for i, row in tqdm(df.iterrows()):
        if i < (len(df) - target_window - n_timepoint_inp - 1):
            features = df.loc[i:i + n_timepoint_inp - 1, feature_cols].values
            features = [item for sublist in features for item in sublist]
            targ = list(df.loc[i + n_timepoint_inp - 1 + target_window, target_columns].values)
            features += targ
            pairs.append(features)
    df = pd.DataFrame(pairs, columns=cols_names)
    df = df.dropna()
    df = df.reset_index(drop=True)
    # train/test split
    df['Train'] = None
    train_index = int(len(df)*0.8)
    df.loc[:train_index, 'Train'] = True
    df.loc[train_index:, 'Train'] = False
    splits = (list(df[df['Train'] == True].index), list(df[df['Train'] == False].index))
    df = df.drop(['Train'], axis=1)
    cont_names = list(df.columns[:-4])
    procs = [Categorify, FillMissing, Normalize]
    y_block = RegressionBlock()
    to = TabularPandas(df, procs=procs, cont_names=cont_names, y_names=target_columns, y_block=y_block, splits=splits)
    dls = to.dataloaders(bs=128)
    return to, dls
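The target lookup above pairs each row with the value `target_window` half-hour steps ahead (timestamps advance by 1800 s). A stdlib-only sketch of that alignment on toy data (all names and values hypothetical):

```python
rows = [{'Timestamp': 1800.0 * i, 'rate': i} for i in range(6)]
index = {r['Timestamp']: r for r in rows}  # O(1) lookup instead of a frame scan

def target(row, window):
    nxt = index.get(row['Timestamp'] + 1800.0 * window)
    return nxt['rate'] if nxt else None

print(target(rows[0], 3))  # 3
print(target(rows[5], 1))  # None (beyond the end of the series)
```

Rows whose horizon falls past the end of the series get no target, which is why the frame is `dropna()`-ed before splitting.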
# ==== File: tests/readers/test_po_reader.py (repo: fabiocorneti/xlpo, license: BSD-3-Clause) ====
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from __future__ import absolute_import, unicode_literals

import os
import unittest

from xlpo.readers import POFileTranslationsReader
from tests.base import BaseTestCase


class TestPOTranslationsReader(BaseTestCase):

    def test_valid_files(self):
        po_file = os.path.join(self.FILES_DIR, 'messages.po')
        reader = POFileTranslationsReader(po_file)
        self.assertEqual(len(reader), 2)
        self.assertEqual(reader[0].message, 'Hello')
        self.assertEqual(reader[0].translation, 'Ciao')
        self.assertEqual(reader[1].message, 'Yes')
        self.assertEqual(reader[1].translation, 'Sì')

    def test_invalid_files(self):
        invalid_file = os.path.join(self.FILES_DIR, 'not_a_po.po')
        reader = POFileTranslationsReader(invalid_file)
        self.assertRaises(IOError, lambda: reader.read())

    def test_invalid_filenames(self):
        self.assertRaises(Exception, lambda: POFileTranslationsReader(None))
        reader = POFileTranslationsReader('not_here')
        self.assertRaises(IOError, lambda: reader.read())
        reader = POFileTranslationsReader(self.FILES_DIR)
        self.assertRaises(IOError, lambda: reader.read())

    def test_caching(self):
        po_file = os.path.join(self.FILES_DIR, 'messages.po')
        reader = POFileTranslationsReader(po_file)
        reader.read()
        t1 = reader._translations
        self.assertTrue(t1 is not None)
        self.assertEqual(len(reader), 2)
        reader.read()
        t2 = reader._translations
        self.assertTrue(t1 is t2)


if __name__ == '__main__':
    unittest.main()
# ==== File: pytube/contrib/channel.py (repo: JarbasAl/pytube, license: Unlicense) ====
# -*- coding: utf-8 -*-
"""Module for interacting with a user's youtube channel."""
import json
import logging
from typing import Dict, List, Optional, Tuple

from pytube import extract, Playlist, request
from pytube.helpers import uniqueify

logger = logging.getLogger(__name__)


class Channel(Playlist):
    def __init__(self, url: str, proxies: Optional[Dict[str, str]] = None):
        """Construct a :class:`Channel <Channel>`.

        :param str url:
            A valid YouTube channel URL.
        :param proxies:
            (Optional) A dictionary of proxies to use for web requests.
        """
        super().__init__(url, proxies)

        self.channel_uri = extract.channel_name(url)

        self.channel_url = (
            f"https://www.youtube.com{self.channel_uri}"
        )

        self.videos_url = self.channel_url + '/videos'
        self.playlists_url = self.channel_url + '/playlists'
        self.community_url = self.channel_url + '/community'
        self.featured_channels_url = self.channel_url + '/channels'
        self.about_url = self.channel_url + '/about'

        # Possible future additions
        self._playlists_html = None
        self._community_html = None
        self._featured_channels_html = None
        self._about_html = None

    @property
    def channel_name(self):
        """Get the name of the YouTube channel.

        :rtype: str
        """
        return self.initial_data['metadata']['channelMetadataRenderer']['title']

    @property
    def channel_id(self):
        """Get the ID of the YouTube channel.

        This will return the underlying ID, not the vanity URL.

        :rtype: str
        """
        return self.initial_data['metadata']['channelMetadataRenderer']['externalId']

    @property
    def vanity_url(self):
        """Get the vanity URL of the YouTube channel.

        Returns None if it doesn't exist.

        :rtype: str
        """
        return self.initial_data['metadata']['channelMetadataRenderer'].get('vanityChannelUrl', None)  # noqa:E501

    @property
    def html(self):
        """Get the html for the /videos page.

        :rtype: str
        """
        if self._html:
            return self._html
        self._html = request.get(self.videos_url)
        return self._html

    @property
    def playlists_html(self):
        """Get the html for the /playlists page.

        Currently unused for any functionality.

        :rtype: str
        """
        if self._playlists_html:
            return self._playlists_html
        else:
            self._playlists_html = request.get(self.playlists_url)
            return self._playlists_html

    @property
    def community_html(self):
        """Get the html for the /community page.

        Currently unused for any functionality.

        :rtype: str
        """
        if self._community_html:
            return self._community_html
        else:
            self._community_html = request.get(self.community_url)
            return self._community_html

    @property
    def featured_channels_html(self):
        """Get the html for the /channels page.

        Currently unused for any functionality.

        :rtype: str
        """
        if self._featured_channels_html:
            return self._featured_channels_html
        else:
            self._featured_channels_html = request.get(self.featured_channels_url)
            return self._featured_channels_html

    @property
    def about_html(self):
        """Get the html for the /about page.

        Currently unused for any functionality.

        :rtype: str
        """
        if self._about_html:
            return self._about_html
        else:
            self._about_html = request.get(self.about_url)
            return self._about_html

    @staticmethod
    def _extract_videos(raw_json: str) -> Tuple[List[str], Optional[str]]:
        """Extracts videos from a raw json page

        :param str raw_json: Input json extracted from the page or the last
            server response
        :rtype: Tuple[List[str], Optional[str]]
        :returns: Tuple containing a list of up to 100 video watch ids and
            a continuation token, if more videos are available
        """
        initial_data = json.loads(raw_json)
        # this is the json tree structure, if the json was extracted from
        # html
        try:
            videos = initial_data["contents"][
                "twoColumnBrowseResultsRenderer"][
                "tabs"][1]["tabRenderer"]["content"][
                "sectionListRenderer"]["contents"][0][
                "itemSectionRenderer"]["contents"][0][
                "gridRenderer"]["items"]
        except (KeyError, IndexError, TypeError):
            try:
                # this is the json tree structure, if the json was directly sent
                # by the server in a continuation response
                important_content = initial_data[1]['response']['onResponseReceivedActions'][
                    0
                ]['appendContinuationItemsAction']['continuationItems']
                videos = important_content
            except (KeyError, IndexError, TypeError):
                try:
                    # this is the json tree structure, if the json was directly sent
                    # by the server in a continuation response
                    # no longer a list and no longer has the "response" key
                    important_content = initial_data['onResponseReceivedActions'][0][
                        'appendContinuationItemsAction']['continuationItems']
                    videos = important_content
                except (KeyError, IndexError, TypeError) as p:
                    logger.info(p)
                    return [], None

        try:
            continuation = videos[-1]['continuationItemRenderer'][
                'continuationEndpoint'
            ]['continuationCommand']['token']
            videos = videos[:-1]
        except (KeyError, IndexError):
            # if there is an error, no continuation is available
            continuation = None

        # remove duplicates
        return (
            uniqueify(
                list(
                    # only extract the video ids from the video data
                    map(
                        lambda x: (
                            f"/watch?v="
                            f"{x['gridVideoRenderer']['videoId']}"
                        ),
                        videos
                    )
                ),
            ),
            continuation,
        )
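`_extract_videos` relies on YouTube's convention that the last item of a grid carries the continuation token rather than a video. A toy illustration of that trailing-item pattern with a hand-made `items` list (hypothetical data):

```python
items = [
    {'gridVideoRenderer': {'videoId': 'aaa'}},
    {'gridVideoRenderer': {'videoId': 'bbb'}},
    {'continuationItemRenderer': {
        'continuationEndpoint': {'continuationCommand': {'token': 'TOK'}}}},
]

try:
    # the last element holds the token for the next page, if any
    continuation = items[-1]['continuationItemRenderer'][
        'continuationEndpoint']['continuationCommand']['token']
    videos = items[:-1]
except (KeyError, IndexError):
    continuation = None
    videos = items

ids = [f"/watch?v={x['gridVideoRenderer']['videoId']}" for x in videos]
print(ids)           # ['/watch?v=aaa', '/watch?v=bbb']
print(continuation)  # TOK
```

When the grid ends with a plain video entry, the `KeyError` branch fires and `continuation` stays `None`, signalling the last page.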
# ==== File: lib/python3.6/site-packages/pony/orm/tests/test_getattr.py (repo: PhonPhey/Magnezi, license: MIT) ====
from pony.py23compat import basestring
import unittest

import pony  # needed for `pony.options` below; not bound by the imports that follow
from pony.orm import *
from pony import orm
from pony.utils import cached_property
from pony.orm.tests.testutils import raises_exception


class Test(unittest.TestCase):

    @cached_property
    def db(self):
        return orm.Database('sqlite', ':memory:')

    def setUp(self):
        db = self.db

        class Genre(db.Entity):
            name = orm.Required(str)
            artists = orm.Set('Artist')

        class Hobby(db.Entity):
            name = orm.Required(str)
            artists = orm.Set('Artist')

        class Artist(db.Entity):
            name = orm.Required(str)
            age = orm.Optional(int)
            hobbies = orm.Set(Hobby)
            genres = orm.Set(Genre)

        db.generate_mapping(check_tables=True, create_tables=True)

        with orm.db_session:
            pop = Genre(name='pop')
            Artist(name='Sia', age=40, genres=[pop])

        pony.options.INNER_JOIN_SYNTAX = True

    @db_session
    def test_no_caching(self):
        for attr, type in zip(['name', 'age'], [basestring, int]):
            val = select(getattr(x, attr) for x in self.db.Artist).first()
            self.assertIsInstance(val, type)

    @db_session
    def test_simple(self):
        val = select(getattr(x, 'age') for x in self.db.Artist).first()
        self.assertIsInstance(val, int)

    @db_session
    def test_expr(self):
        val = select(getattr(x, ''.join(['ag', 'e'])) for x in self.db.Artist).first()
        self.assertIsInstance(val, int)

    @db_session
    def test_external(self):
        class data:
            id = 1
        val = select(x.id for x in self.db.Artist if x.id >= getattr(data, 'id')).first()
        self.assertIsNotNone(val)

    @db_session
    def test_related(self):
        val = select(getattr(x.genres, 'name') for x in self.db.Artist).first()
        self.assertIsNotNone(val)

    @db_session
    def test_not_instance_iter(self):
        val = select(getattr(x.name, 'startswith')('S') for x in self.db.Artist).first()
        self.assertTrue(val)

    @db_session
    @raises_exception(TypeError, '`x.name` should be either external expression or constant.')
    def test_not_external(self):
        select(getattr(x, x.name) for x in self.db.Artist)

    @raises_exception(TypeError, 'getattr(x, 1): attribute name must be string. Got: 1')
    @db_session
    def test_not_string_literal(self):  # renamed: was one of two methods both called test_not_string
        select(getattr(x, 1) for x in self.db.Artist)

    @raises_exception(TypeError, 'getattr(x, name): attribute name must be string. Got: 1')
    @db_session
    def test_not_string_variable(self):  # renamed: the duplicate name silently shadowed the test above
        name = 1
        select(getattr(x, name) for x in self.db.Artist)
| 30.663043 | 95 | 0.594116 | 370 | 2,821 | 4.432432 | 0.256757 | 0.040244 | 0.032927 | 0.054878 | 0.479268 | 0.42622 | 0.39939 | 0.396951 | 0.3 | 0.3 | 0 | 0.004978 | 0.287841 | 2,821 | 91 | 96 | 31 | 0.811349 | 0 | 0 | 0.294118 | 0 | 0 | 0.083211 | 0 | 0 | 0 | 0 | 0 | 0.088235 | 1 | 0.161765 | false | 0 | 0.088235 | 0.014706 | 0.338235 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
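A plain-Python sketch of the pattern these tests exercise — resolving an attribute name chosen at runtime with `getattr` inside a comprehension. The `Artist` class here is a stand-in for illustration, not the Pony entity above:

```python
# Stand-in Artist class (not the Pony entity) to show the idea.
class Artist:
    def __init__(self, name, age):
        self.name = name
        self.age = age

artists = [Artist("Sia", 40), Artist("Prince", 57)]

# Attribute chosen at runtime, as in test_no_caching.
for attr in ("name", "age"):
    print([getattr(a, attr) for a in artists])
# ['Sia', 'Prince']
# [40, 57]
```

Pony translates the same `getattr` call inside a `select(...)` generator into SQL column access, which is what the tests verify.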
1485a31bf1f79c39d579abb85dfc18c55092eb28 | 3,545 | py | Python | fastapi_code_generator/__main__.py | allen-munsch/fastapi-code-generator | 516735f8992ab9e9b5b038a90766deffbb25e702 | [
"MIT"
] | null | null | null | fastapi_code_generator/__main__.py | allen-munsch/fastapi-code-generator | 516735f8992ab9e9b5b038a90766deffbb25e702 | [
"MIT"
] | null | null | null | fastapi_code_generator/__main__.py | allen-munsch/fastapi-code-generator | 516735f8992ab9e9b5b038a90766deffbb25e702 | [
"MIT"
] | null | null | null | from datetime import datetime, timezone
from pathlib import Path
from typing import Dict, Optional
import typer
from datamodel_code_generator import PythonVersion, chdir
from datamodel_code_generator.format import CodeFormatter
from datamodel_code_generator.parser.openapi import OpenAPIParser as OpenAPIModelParser
from jinja2 import Environment, FileSystemLoader
from fastapi_code_generator.parser import MODEL_PATH, OpenAPIParser, ParsedObject
app = typer.Typer()
BUILTIN_TEMPLATE_DIR = Path(__file__).parent / "template"
@app.command()
def main(
    input_file: typer.FileText = typer.Option(..., "--input", "-i"),
    output_dir: Path = typer.Option(..., "--output", "-o"),
    template_dir: Optional[Path] = typer.Option(None, "--template-dir", "-t"),
) -> None:
    input_name: str = input_file.name
    input_text: str = input_file.read()
    return generate_code(input_name, input_text, output_dir, template_dir)


def generate_code(
    input_name: str, input_text: str, output_dir: Path, template_dir: Optional[Path]
) -> None:
    if not output_dir.exists():
        output_dir.mkdir(parents=True)
    if not template_dir:
        template_dir = BUILTIN_TEMPLATE_DIR
    model_parser = OpenAPIModelParser(source=input_text)
    parser = OpenAPIParser(input_name, input_text, openapi_model_parser=model_parser)
    parsed_object: ParsedObject = parser.parse()
    environment: Environment = Environment(
        loader=FileSystemLoader(
            template_dir if template_dir else f"{Path(__file__).parent}/template",
            encoding="utf8",
        ),
    )
    results: Dict[Path, str] = {}
    code_formatter = CodeFormatter(PythonVersion.PY_38, Path().resolve())
    for target in template_dir.rglob("*"):
        relative_path = target.relative_to(template_dir)
        result = environment.get_template(str(relative_path)).render(
            operations=parsed_object.operations,
            imports=parsed_object.imports,
            info=parsed_object.info,
        )
        results[relative_path] = code_formatter.format_code(result)
    timestamp = datetime.now(timezone.utc).replace(microsecond=0).isoformat()
    header = f"""\
# generated by fastapi-codegen:
# filename: {Path(input_name).name}
# timestamp: {timestamp}"""
    for path, code in results.items():
        with output_dir.joinpath(path.with_suffix(".py")).open("wt") as file:
            print(header, file=file)
            print("", file=file)
            print(code.rstrip(), file=file)
    with chdir(output_dir):
        results = model_parser.parse()
    if not results:
        return
    elif isinstance(results, str):
        output = output_dir / MODEL_PATH
        modules = {output: (results, input_name)}
    else:
        raise Exception('Modular references are not supported in this version')
    header = '''\
# generated by fastapi-codegen:
# filename: {filename}'''
    # if not disable_timestamp:
    header += f'\n# timestamp: {timestamp}'
    for path, body_and_filename in modules.items():
        body, filename = body_and_filename
        if path is None:
            file = None
        else:
            if not path.parent.exists():
                path.parent.mkdir(parents=True)
            file = path.open('wt', encoding='utf8')
        print(header.format(filename=filename), file=file)
        if body:
            print('', file=file)
            print(body.rstrip(), file=file)
        if file is not None:
            file.close()


if __name__ == "__main__":
    typer.run(main)
| 33.130841 | 87 | 0.666855 | 419 | 3,545 | 5.434368 | 0.288783 | 0.057971 | 0.022398 | 0.034256 | 0.035134 | 0.035134 | 0.035134 | 0 | 0 | 0 | 0 | 0.002171 | 0.22031 | 3,545 | 106 | 88 | 33.443396 | 0.821635 | 0.007052 | 0 | 0.070588 | 0 | 0 | 0.096471 | 0.015652 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023529 | false | 0 | 0.117647 | 0 | 0.164706 | 0.070588 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
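The module above builds the generated-file header in two stages: the timestamp is interpolated immediately, while `{filename}` is left as a placeholder for the later `header.format(filename=...)` call. A minimal stdlib-only sketch of that trick (the fixed date is only so the output is deterministic):

```python
from datetime import datetime, timezone

# Stage 1: timestamp is known now; filename is filled per output file later.
timestamp = datetime(2020, 1, 1, tzinfo=timezone.utc).isoformat()
header = '# generated by fastapi-codegen:\n# filename: {filename}'
header += f'\n# timestamp: {timestamp}'

# Stage 2: per-file fill.
print(header.format(filename="main.py"))
# # generated by fastapi-codegen:
# # filename: main.py
# # timestamp: 2020-01-01T00:00:00+00:00
```

Keeping the placeholder out of the f-string (or escaping it as `{{filename}}`) is what prevents Python from trying to interpolate `filename` before it exists.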
14867dc769343133304ed2cb1cb736d1e1c8931c | 752 | py | Python | code/timewalk.py | spicecoder/Udacity_Robo_search | 682568845e23a5d6f1db8aa2423b6a3e0396a152 | [
"MIT"
] | null | null | null | code/timewalk.py | spicecoder/Udacity_Robo_search | 682568845e23a5d6f1db8aa2423b6a3e0396a152 | [
"MIT"
] | null | null | null | code/timewalk.py | spicecoder/Udacity_Robo_search | 682568845e23a5d6f1db8aa2423b6a3e0396a152 | [
"MIT"
] | null | null | null | import matplotlib.pyplot as plt
import csv
import time
x = []
y = []
# Draw a point based on the x, y axis value.
def draw_point(a, b):
    # Draw a point at the location (a, b) with size 1000
    plt.scatter(a, b, s=1000)
    # Set chart title.
    plt.title("Square Numbers", fontsize=19)
    # Set x axis label.
    plt.xlabel("Number", fontsize=10)
    # Set y axis label.
    plt.ylabel("Square of Number", fontsize=10)
    # Set size of tick labels.
    plt.tick_params(axis='both', which='major', labelsize=9)
    # Display the plot in the matplotlib's viewer.
    plt.show()


with open('path.txt', 'r') as csvfile:
    plots = csv.reader(csvfile, delimiter=',')
    for row in plots:
        time.sleep(1)
        draw_point(row[0], row[1])
| 20.888889 | 60 | 0.632979 | 121 | 752 | 3.909091 | 0.545455 | 0.008457 | 0.042283 | 0.080338 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034722 | 0.234043 | 752 | 35 | 61 | 21.485714 | 0.786458 | 0.287234 | 0 | 0 | 0 | 0 | 0.104364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.176471 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
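The CSV loop above, minus the plotting dependency: `csv.reader` yields rows of strings, so coordinates need an explicit numeric conversion before arithmetic use. The in-memory `StringIO` stands in for `path.txt`:

```python
import csv
import io

# io.StringIO stands in for open('path.txt')
data = io.StringIO("1,1\n2,4\n3,9\n")
points = [(float(x), float(y)) for x, y in csv.reader(data)]
print(points)  # [(1.0, 1.0), (2.0, 4.0), (3.0, 9.0)]
```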
148fccfed7ce693db285318059d68d89b499a64d | 334 | py | Python | exemplo_8.py | felipesantoos/pandas | 4ebecbfed9e7963ee74d9510cca9157c2fcb54c0 | [
"MIT"
] | null | null | null | exemplo_8.py | felipesantoos/pandas | 4ebecbfed9e7963ee74d9510cca9157c2fcb54c0 | [
"MIT"
] | null | null | null | exemplo_8.py | felipesantoos/pandas | 4ebecbfed9e7963ee74d9510cca9157c2fcb54c0 | [
"MIT"
] | null | null | null | import pandas as pd
import shutil as sh
langs = {
    "name": ["Go", "Python", "TypeScript", "PHP"],
    "score": [10, 9, 10, 6]
}
df = pd.DataFrame(langs, index = ["lang1", "lang2", "lang3", "lang4"])
print(df.loc["lang2"])  # Returns a pandas Series.
print("-" * sh.get_terminal_size().columns)
print(df.loc[["lang3", "lang4"]])
| 23.857143 | 70 | 0.607784 | 47 | 334 | 4.276596 | 0.680851 | 0.099502 | 0.099502 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.046263 | 0.158683 | 334 | 13 | 71 | 25.692308 | 0.669039 | 0.07485 | 0 | 0 | 0 | 0 | 0.214984 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.3 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
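What `df.loc` does in the script above, in miniature: label-based lookup against the custom index. This sketch uses a plain dict, not pandas:

```python
index = ["lang1", "lang2", "lang3", "lang4"]
names = ["Go", "Python", "TypeScript", "PHP"]
rows = dict(zip(index, names))

print(rows["lang2"])                          # Python
print([rows[k] for k in ("lang3", "lang4")])  # ['TypeScript', 'PHP']
```

A single label gives one row (a Series in pandas), a list of labels gives a sub-frame — the same distinction the two `df.loc` calls demonstrate.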
148feabf54ade022cfd30a6e3c7acb31b5014b87 | 2,563 | py | Python | webserver/python2.7/site-packages/area/__init__.py | maxr1876/Radix | bf9a5470908ea0823c8398565086b1e6b960c73b | [
"BSD-2-Clause"
] | null | null | null | webserver/python2.7/site-packages/area/__init__.py | maxr1876/Radix | bf9a5470908ea0823c8398565086b1e6b960c73b | [
"BSD-2-Clause"
] | null | null | null | webserver/python2.7/site-packages/area/__init__.py | maxr1876/Radix | bf9a5470908ea0823c8398565086b1e6b960c73b | [
"BSD-2-Clause"
] | null | null | null | from __future__ import division
import json
from math import pi, sin
__version__ = '1.1.0'
WGS84_RADIUS = 6378137
def rad(value):
    return value * pi / 180


def ring__area(coordinates):
    """
    Calculate the approximate _area of the polygon if it were projected onto
    the earth. Note that this _area will be positive if the ring is oriented
    clockwise, otherwise it will be negative.

    Reference:
    Robert. G. Chamberlain and William H. Duquette, "Some Algorithms for
    Polygons on a Sphere", JPL Publication 07-03, Jet Propulsion
    Laboratory, Pasadena, CA, June 2007 http://trs-new.jpl.nasa.gov/dspace/handle/2014/40409

    @Returns
    {float} The approximate signed geodesic _area of the polygon in square meters.
    """
    assert isinstance(coordinates, list)
    _area = 0
    coordinates_length = len(coordinates)
    if coordinates_length > 2:
        for i in range(0, coordinates_length):
            if i == (coordinates_length - 2):
                lower_index = coordinates_length - 2
                middle_index = coordinates_length - 1
                upper_index = 0
            elif i == (coordinates_length - 1):
                lower_index = coordinates_length - 1
                middle_index = 0
                upper_index = 1
            else:
                lower_index = i
                middle_index = i + 1
                upper_index = i + 2
            p1 = coordinates[lower_index]
            p2 = coordinates[middle_index]
            p3 = coordinates[upper_index]
            _area += (rad(p3[0]) - rad(p1[0])) * sin(rad(p2[1]))
        _area = _area * WGS84_RADIUS * WGS84_RADIUS / 2
    return _area


def polygon__area(coordinates):
    assert isinstance(coordinates, list)
    _area = 0
    if len(coordinates) > 0:
        _area += abs(ring__area(coordinates[0]))
        for i in range(1, len(coordinates)):
            _area -= abs(ring__area(coordinates[i]))
    return _area


def area(geometry):
    if isinstance(geometry, str):
        geometry = json.loads(geometry)
    assert isinstance(geometry, dict)
    _area = 0
    if geometry['type'] == 'Polygon':
        return polygon__area(geometry['coordinates'])
    elif geometry['type'] == 'MultiPolygon':
        for i in range(0, len(geometry['coordinates'])):
            _area += polygon__area(geometry['coordinates'][i])
    elif geometry['type'] == 'GeometryCollection':
        for i in range(0, len(geometry['geometries'])):
            _area += area(geometry['geometries'][i])
    return _area
| 26.978947 | 96 | 0.609442 | 305 | 2,563 | 4.927869 | 0.367213 | 0.090486 | 0.015968 | 0.029275 | 0.121091 | 0.07851 | 0.030605 | 0 | 0 | 0 | 0 | 0.036504 | 0.294577 | 2,563 | 94 | 97 | 27.265957 | 0.794801 | 0.203668 | 0 | 0.150943 | 0 | 0 | 0.053688 | 0 | 0 | 0 | 0 | 0 | 0.056604 | 1 | 0.075472 | false | 0 | 0.056604 | 0.018868 | 0.226415 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
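A self-contained check of the Chamberlain–Duquette sum that `ring__area` implements, applied to a 1°×1° square at the equator. The `ring_area` helper below re-derives the same formula with modular indexing (which is equivalent to the index special-casing above); it is not the module function:

```python
from math import pi, sin

WGS84_RADIUS = 6378137

def ring_area(coords):
    # Same signed spherical-excess sum as ring__area, with modular indexing.
    total = 0.0
    n = len(coords)
    for i in range(n):
        p1, p2, p3 = coords[i], coords[(i + 1) % n], coords[(i + 2) % n]
        total += (p3[0] - p1[0]) * pi / 180 * sin(p2[1] * pi / 180)
    return total * WGS84_RADIUS * WGS84_RADIUS / 2

# Closed GeoJSON-style [lon, lat] ring: 1 degree x 1 degree at the equator.
square = [[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]]
print(abs(ring_area(square)))  # roughly 1.24e10 square meters
```

One degree of arc on the WGS84 sphere is about 111.3 km, so an equatorial 1°×1° cell of roughly (111.3 km)² ≈ 1.24e10 m² is the expected order of magnitude.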
1493a2c8eff385805e1d1e0101be2be42f87f69a | 4,760 | py | Python | cryptopals/set-2-block-crypto/chal-11/ecb_cbc_detection_oracle.py | reider-roque/crypto-challenges | efbc2afe5e2ca88671d9918b1d8eca26ba042b04 | [
"MIT"
] | null | null | null | cryptopals/set-2-block-crypto/chal-11/ecb_cbc_detection_oracle.py | reider-roque/crypto-challenges | efbc2afe5e2ca88671d9918b1d8eca26ba042b04 | [
"MIT"
] | null | null | null | cryptopals/set-2-block-crypto/chal-11/ecb_cbc_detection_oracle.py | reider-roque/crypto-challenges | efbc2afe5e2ca88671d9918b1d8eca26ba042b04 | [
"MIT"
] | null | null | null | import os
from base64 import b64decode
from binascii import hexlify, unhexlify
from functools import reduce
from random import SystemRandom
from Crypto.Cipher import AES
def split_by_n(seq, n):
    """A generator to divide a sequence into chunks of n units."""
    while seq:
        yield seq[:n]
        seq = seq[n:]


def xor(str1, str2):
    """Takes two byte strings"""
    if len(str1) != len(str2):
        raise ValueError("Arguments str1 and str2 lengths differ.")
    str1 = hexlify(str1)
    str2 = hexlify(str2)
    int1 = int(str1, 16)
    int2 = int(str2, 16)
    xor_result = hex(int1 ^ int2).split('x')[1]
    # adjust string length
    xor_result = xor_result.zfill(len(str1))
    xor_result = unhexlify(xor_result)
    return xor_result


def pad(plaintext, block_size):
    if block_size < 2 or block_size > 255:
        raise ValueError("block_size cannot be less than 2 and greater than 255")
    last_block_size = len(plaintext) % block_size
    pad_size = block_size - last_block_size
    if pad_size == 0:
        pad_size = 16
    pad_byte = bytes([pad_size])
    plaintext += pad_size * pad_byte
    return plaintext


def unpad(plaintext):
    last_byte = plaintext[-1]
    padding_bytes = (-1 - i for i in range(4))
    for byte in padding_bytes:
        if plaintext[byte] != last_byte:
            return False, plaintext  # Fail padding check
    plaintext = plaintext[:-last_byte]  # Remove padding
    return True, plaintext


def aes_block_enc(key, pt):
    mode = AES.MODE_ECB
    encryptor = AES.new(key, mode)
    ct = encryptor.encrypt(pt)
    return ct


def aes_block_dec(key, ct):
    mode = AES.MODE_ECB
    decryptor = AES.new(key, mode)
    pt = decryptor.decrypt(ct)
    return pt


def cbc_mode_enc(key, pt, iv):
    pt = pad(pt, 16)
    pt_blocks = split_by_n(pt, 16)  # 1 block = 16 bytes
    ct = b""
    for pt_block in pt_blocks:
        pt_block = xor(pt_block, iv)
        ct_block = aes_block_enc(key, pt_block)
        ct += ct_block
        iv = ct_block
    return ct


def cbc_mode_dec(key, ct, iv):
    ct_blocks = split_by_n(ct, 16)  # 1 block = 16 bytes
    pt = b""
    for ct_block in ct_blocks:
        pt_block = aes_block_dec(key, ct_block)
        pt_block = xor(pt_block, iv)
        pt += pt_block
        iv = ct_block
    _, pt = unpad(pt)  # Do not care if unpadding is successful (was `unpad(plaintext)`, an undefined name)
    return pt


def ecb_mode_enc(key, pt):
    pt = pad(pt, 16)
    pt_blocks = split_by_n(pt, 16)  # 1 block = 16 bytes
    ct = b""
    for pt_block in pt_blocks:
        ct_block = aes_block_enc(key, pt_block)
        ct += ct_block
    return ct


def ecb_mode_dec(key, ct):
    ct_blocks = split_by_n(ct, 16)  # 1 block = 16 bytes
    pt = b""
    for ct_block in ct_blocks:
        pt_block = aes_block_dec(key, ct_block)
        pt += pt_block
    _, pt = unpad(pt)  # Do not care if unpadding is successful (was `unpad(plaintext)`, an undefined name)
    return pt


def generate_aes_128_key():
    return os.urandom(16)


def generate_iv(block_size):
    return os.urandom(block_size)


def pad_with_randomness(plaintext):
    int_gen = SystemRandom()  # uses os.urandom() underneath, hence secure
    prefix_len = int_gen.randrange(5, 10)
    postfix_len = int_gen.randrange(5, 10)
    prefix = os.urandom(prefix_len)
    postfix = os.urandom(postfix_len)
    plaintext = prefix + plaintext + postfix
    return plaintext


def encryption_oracle(plaintext):
    key = generate_aes_128_key()
    plaintext = pad_with_randomness(plaintext)
    ciphertext = ""
    random_byte = ord(os.urandom(1))
    if random_byte <= 127:  # Encrypting with ECB
        ciphertext = ecb_mode_enc(key, plaintext)
        mode = "ECB"
    else:  # Encrypting with CBC
        iv = generate_iv(16)
        ciphertext = cbc_mode_enc(key, plaintext, iv)
        mode = "CBC"
    # return mode, ciphertext
    return ciphertext


def distinguish_oracle_output():
    plaintext = b"abcdefghijgkmnop" * 4
    ciphertext = encryption_oracle(plaintext)
    ct_blocks = split_by_n(ciphertext, 16)
    distinguisher_blocks = list(ct_blocks)[1:-1]
    # If the ciphertext was encrypted using ECB the distinguisher_blocks
    # variable should contain three identical blocks at this point.
    # A list of identical elements converted to a set will reduce to one
    # element.
    if len(set(distinguisher_blocks)) == 1:
        mode = "ECB"
    else:
        mode = "CBC"
    print("\nCiphertext is encrypted with {} mode:\n{}".format(mode, ciphertext))
if __name__ == "__main__":
    distinguish_oracle_output()
    distinguish_oracle_output()
    distinguish_oracle_output()
    distinguish_oracle_output()
    distinguish_oracle_output()
distinguish_oracle_output() | 24.285714 | 81 | 0.65021 | 673 | 4,760 | 4.375929 | 0.236256 | 0.028523 | 0.054669 | 0.023769 | 0.274363 | 0.244482 | 0.217997 | 0.217997 | 0.217997 | 0.217997 | 0 | 0.025481 | 0.257983 | 4,760 | 196 | 82 | 24.285714 | 0.808324 | 0.126261 | 0 | 0.393701 | 0 | 0 | 0.041616 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.11811 | false | 0 | 0.047244 | 0.015748 | 0.275591 | 0.007874 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
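The distinguisher above works because ECB encrypts equal 16-byte plaintext blocks to equal ciphertext blocks. A dependency-free sketch of that block-repetition test (no AES involved; the two byte strings merely model the two cases):

```python
def count_duplicate_blocks(data, block_size=16):
    # ECB leaks repetition: count blocks that occur more than once.
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    return len(blocks) - len(set(blocks))

ecb_like = bytes(range(16)) * 4   # models ECB output of a repeating plaintext
cbc_like = bytes(range(64))       # models CBC output: all blocks distinct

print(count_duplicate_blocks(ecb_like))  # 3
print(count_duplicate_blocks(cbc_like))  # 0
```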
1494e411076fb87b64c4ba9ac057f14cdbcbc48e | 3,739 | py | Python | legacy/Box.py | JUST0M/procedural-buildings | d51add4bc769823d01f16fbe1d599d97650ec404 | [
"MIT"
] | 2 | 2021-04-13T10:46:20.000Z | 2021-07-04T09:30:49.000Z | legacy/Box.py | JUST0M/procedural-buildings | d51add4bc769823d01f16fbe1d599d97650ec404 | [
"MIT"
] | null | null | null | legacy/Box.py | JUST0M/procedural-buildings | d51add4bc769823d01f16fbe1d599d97650ec404 | [
"MIT"
] | null | null | null | import numpy as np
class Box:
    def __init__(self, minCoords, maxCoords):
        self.min = minCoords
        self.max = maxCoords
        self.planes = [*((i, self.min[i]) for i in range(3)),
                       *((i, self.max[i]) for i in range(3))]
        self.size = [c2 - c1 for c1, c2 in zip(self.min, self.max)]

    def volume(self):
        diffX = self.max[0] - self.min[0]
        diffY = self.max[1] - self.min[1]
        diffZ = self.max[2] - self.min[2]
        return diffX * diffY * diffZ

    # Return a new scope such that the new scope comes
    # before other but within self in the given direction.
    # Dimensions other than the provided axis will remain the same
    # as self's dimensions
    def before(self, other, axis):
        if self.min[axis] == other.min[axis]:
            return None
        elif axis == 0:
            return Box(self.min, (other.min[0], self.max[1], self.max[2]))
        elif axis == 1:
            return Box(self.min, (self.max[0], other.min[1], self.max[2]))
        elif axis == 2:
            return Box(self.min, (self.max[0], self.max[1], other.min[2]))

    # Like before but after
    def after(self, other, axis):
        if self.max[axis] == other.max[axis]:
            return None
        elif axis == 0:
            return Box((other.max[0], self.min[1], self.min[2]), self.max)
        elif axis == 1:
            return Box((self.min[0], other.max[1], self.min[2]), self.max)
        elif axis == 2:
            return Box((self.min[0], self.min[1], other.max[2]), self.max)

    def takeSizeFrom(self, other, axis):
        if axis == 0:
            return Box((other.min[0], self.min[1], self.min[2]),
                       (other.max[0], self.max[1], self.max[2]))
        elif axis == 1:
            return Box((self.min[0], other.min[1], self.min[2]),
                       (self.max[0], other.max[1], self.max[2]))
        elif axis == 2:
            return Box((self.min[0], self.min[1], other.min[2]),
                       (self.max[0], self.max[1], other.max[2]))

    def contains(self, other):
        for ax in (0, 1, 2):
            if self.min[ax] > other.min[ax] or self.max[ax] < other.max[ax]:
                return False
        return True

    def matToUnitSquare(self):
        sx = self.max[0] - self.min[0]
        sy = self.max[1] - self.min[1]
        sz = self.max[2] - self.min[2]
        return np.array([[1 / sx, 0, 0, -self.min[0] / sx - 0.5],
                         [0, 1 / sy, 0, -self.min[1] / sy - 0.5],
                         [0, 0, 1 / sz, -self.min[2] / sz - 0.5],
                         [0, 0, 0, 1]])

    def boundWith(self, other):
        newMin = tuple(min(pair) for pair in zip(self.min, other.min))
        newMax = tuple(max(pair) for pair in zip(self.max, other.max))
        return Box(newMin, newMax)

    def cutWith(self, plane):
        planeAxis, planeCoord = plane
        beforeMin = self.min
        beforeMax = tuple(planeCoord if i == planeAxis else self.max[i] for i in range(3))
        afterMin = tuple(planeCoord if i == planeAxis else self.min[i] for i in range(3))
        afterMax = self.max
        return Box(beforeMin, beforeMax), Box(afterMin, afterMax)

    def relationTo(self, plane):
        planeAxis, planeCoord = plane
        if self.min[planeAxis] < planeCoord:
            if self.max[planeAxis] <= planeCoord:
                return 'before'
            else:
                return 'intersect'
        else:
            return 'after'

    def __str__(self):
        return f"min: {self.min}, max: {self.max}"

    def __repr__(self):
        return f"min: {self.min}, max: {self.max}"
| 38.546392 | 89 | 0.514844 | 531 | 3,739 | 3.602637 | 0.165725 | 0.128071 | 0.033455 | 0.058547 | 0.49242 | 0.411396 | 0.357031 | 0.250915 | 0.158913 | 0.107162 | 0 | 0.03632 | 0.337256 | 3,739 | 96 | 90 | 38.947917 | 0.735674 | 0.054827 | 0 | 0.205128 | 0 | 0 | 0.024483 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.012821 | 0.025641 | 0.461538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
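The splitting logic in `cutWith`, restated standalone: a plane `(axis, coord)` clips one coordinate of the max corner for the "before" half and of the min corner for the "after" half. The `cut_box` helper is hypothetical, not part of the class:

```python
def cut_box(bmin, bmax, axis, coord):
    # Split an axis-aligned box at plane (axis, coord) into two halves.
    before_max = tuple(coord if i == axis else bmax[i] for i in range(3))
    after_min = tuple(coord if i == axis else bmin[i] for i in range(3))
    return (bmin, before_max), (after_min, bmax)

before, after = cut_box((0, 0, 0), (4, 4, 4), axis=0, coord=1)
print(before)  # ((0, 0, 0), (1, 4, 4))
print(after)   # ((1, 0, 0), (4, 4, 4))
```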
1494f6a3975880fab8630625b748fe8b61a44ed1 | 1,032 | py | Python | backend-services/mongodbtest.py | Off-Top-App/off-top-python | af51a7da0e52d6ad978835cb05986d7c2a917861 | [
"bzip2-1.0.6"
] | 3 | 2019-12-01T23:09:12.000Z | 2020-12-22T03:02:37.000Z | backend-services/mongodbtest.py | Off-Top-App/off-top-python | af51a7da0e52d6ad978835cb05986d7c2a917861 | [
"bzip2-1.0.6"
] | 5 | 2020-03-05T17:17:12.000Z | 2020-06-16T07:02:27.000Z | backend-services/mongodbtest.py | Off-Top-App/off-top-python | af51a7da0e52d6ad978835cb05986d7c2a917861 | [
"bzip2-1.0.6"
] | 1 | 2020-05-18T12:57:14.000Z | 2020-05-18T12:57:14.000Z | from pymongo import MongoClient
# pprint library is used to make the output look more pretty
from pprint import pprint
import datetime
# connect to MongoDB
#To establish a connection to MongoDB with PyMongo you use the MongoClient class
client = MongoClient('mongodb://localhost:27017/')
#create a database object referencing a new database, called “newDB”
#A single instance of MongoDB can support multiple independent databases.
#MongoDB stores flexible JSON-like documents
with client:
    db = client.newDB
    # Creating a test collection, which is a group of documents — roughly the equivalent of a table in a relational database
    collection1 = db.test_collection
    collectionList = db.list_collection_names()
    if "customers" in collectionList:
        print("The collection exists.")
    # Issue the serverStatus command and print the results
    serverStatusResult = db.command("serverStatus")
    # status = db.command("dbstats")
    print(collectionList)
    # db.collection_names() is deprecated
    pprint(serverStatusResult)
| 41.28 | 116 | 0.778101 | 135 | 1,032 | 5.918519 | 0.562963 | 0.030038 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006944 | 0.162791 | 1,032 | 24 | 117 | 43 | 0.917824 | 0.554264 | 0 | 0 | 0 | 0 | 0.153675 | 0.057906 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.230769 | 0 | 0.230769 | 0.307692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1499ce61f55da5d49e4d66d22bd91de18f3eb916 | 1,235 | py | Python | examples/python/atkinson_example.py | shenlong95/concentrationMetrics | 266cdc5465cb948e5726aff52f10bc5b51a6ed8e | [
"MIT"
] | 1 | 2022-03-02T14:38:25.000Z | 2022-03-02T14:38:25.000Z | examples/python/atkinson_example.py | shenlong95/concentrationMetrics | 266cdc5465cb948e5726aff52f10bc5b51a6ed8e | [
"MIT"
] | null | null | null | examples/python/atkinson_example.py | shenlong95/concentrationMetrics | 266cdc5465cb948e5726aff52f10bc5b51a6ed8e | [
"MIT"
] | null | null | null | # encoding: utf-8
# (c) 2016-2022 Open Risk, all rights reserved
#
# ConcentrationMetrics is licensed under the MIT license a copy of which is included
# in the source distribution of concentrationMetrics. This is notwithstanding any licenses of
# third-party software included in this distribution. You may not use this file except in
# compliance with the License.
#
# Unless required by applicable law or agreed to in writing, software distributed under
# the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied. See the License for the specific language governing permissions and
# limitations under the License.
import concentrationMetrics as cm
import pandas as pd
dataset_path = cm.source_path + "/datasets/"
# Comparison with R version in IC2 package on hhbudget dataset
# Expected Results:
# Epsilon 0: 0
# Epsilon 1: 0.3245925
# Epsilon 2: 0.4951668
# Epsilon 3: 0.6053387
# Epsilon 4: 0.6856425
hhbudgets = pd.read_csv(dataset_path + "hhbudgets.csv")
y = hhbudgets["ingreso"].values
myIndex = cm.Index()
# print(cl.atkinson(y, 0))
print(myIndex.atkinson(y, 1))
print(myIndex.atkinson(y, 2))
print(myIndex.atkinson(y, 3))
print(myIndex.atkinson(y, 4))
| 32.5 | 96 | 0.765992 | 187 | 1,235 | 5.037433 | 0.566845 | 0.047771 | 0.084926 | 0.089172 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050573 | 0.151417 | 1,235 | 37 | 97 | 33.378378 | 0.848282 | 0.695547 | 0 | 0 | 0 | 0 | 0.084507 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.4 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
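For reference, the Atkinson index the script evaluates can be computed directly from its definition. This is a hedged sketch of the textbook formula (geometric mean for ε = 1, the generalized power mean otherwise), not the concentrationMetrics implementation:

```python
from math import exp, log

def atkinson(y, epsilon):
    # Atkinson inequality index for positive incomes y and aversion epsilon.
    n = len(y)
    mean = sum(y) / n
    if epsilon == 1:
        # Equally-distributed-equivalent income is the geometric mean.
        return 1 - exp(sum(log(v) for v in y) / n) / mean
    ede = (sum(v ** (1 - epsilon) for v in y) / n) ** (1 / (1 - epsilon))
    return 1 - ede / mean

print(atkinson([1, 1, 1, 1], 1))      # 0.0 -- perfect equality
print(round(atkinson([1, 9], 1), 4))  # 0.4
```

For `[1, 9]` with ε = 1, the geometric mean is 3 and the arithmetic mean is 5, giving 1 − 3/5 = 0.4, which matches the hand calculation.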
149a2938cb8b209c78d4c945bf2f3f42bf24a389 | 1,537 | py | Python | kcorpusParse.py | passingbreeze/findAntisocial | 704274c07ae8e03b903865d60c9188443b6e34fd | [
"MIT"
] | null | null | null | kcorpusParse.py | passingbreeze/findAntisocial | 704274c07ae8e03b903865d60c9188443b6e34fd | [
"MIT"
] | null | null | null | kcorpusParse.py | passingbreeze/findAntisocial | 704274c07ae8e03b903865d60c9188443b6e34fd | [
"MIT"
] | null | null | null | # -*- encoding:utf-8 -*-
import re
## Path Config in here
readTargetFolder = 'My Path'
outTargetFolder = readTargetFolder + 'conv_'
headNumber = range(4,9)
indexNumber = range(1,100)
fileCategory = ['CM','CK','CL']
contentReg = r'<s n=\d+>(.*?)<\/s>'
def remove_tag(content):
    cleanr = re.compile(r'<.*?>')
    cleantext = re.sub(cleanr, '', content)
    specialr = re.compile(r'[\:\,\.\?\~]')
    cleantext = re.sub(specialr, '', cleantext)
    return cleantext


def parse(fileName):
    rx = re.compile(contentReg)
    result = ""
    try:
        rf = open(readTargetFolder + fileName, 'r', encoding='utf-16-le')
        wf = open(outTargetFolder + fileName, 'w', encoding='utf-8')
        lines = rf.readlines()
        cnt = 0
        for line in lines:
            cnt += 1
            m = rx.findall(line)
            for mm in m:
                curret = remove_tag(mm)
                if not curret:
                    continue
                result += curret + ' #\n'
        wf.write(result)
        rf.close()
        wf.close()
    except FileNotFoundError:
        print(fileName + ': FNF Error')
        pass


def main():
    for headnumber in headNumber:
        for filecategory in fileCategory:
            for indexnumber in indexNumber:
                numberFormat = str(indexnumber).zfill(5)
                filename = str(headnumber) + filecategory + numberFormat + '.txt'
                parse(filename)
if __name__ == '__main__':
main() | 27.446429 | 82 | 0.527651 | 156 | 1,537 | 5.128205 | 0.5 | 0.04125 | 0.03 | 0.05 | 0.0575 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012758 | 0.33702 | 1,537 | 56 | 83 | 27.446429 | 0.772326 | 0.027326 | 0 | 0 | 0 | 0 | 0.067502 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068182 | false | 0.022727 | 0.022727 | 0 | 0.113636 | 0.022727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
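The two regex passes inside `remove_tag`, demonstrated standalone — the first strips the `<s n=…>`-style corpus tags, the second removes the punctuation class:

```python
import re

text = '<s n=3>hello, world?</s>'
no_tags = re.sub(r'<.*?>', '', text)          # drop SGML-ish tags
clean = re.sub(r'[\:\,\.\?\~]', '', no_tags)  # drop listed punctuation
print(clean)  # hello world
```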
149f538f5367f0f292bb57d5e03344b3411fb3be | 3,639 | py | Python | search_service/api/table.py | fossabot/amundsensearchlibrary | dd169e486f7a9cd3233e184a416872d892f55078 | [
"Apache-2.0"
] | null | null | null | search_service/api/table.py | fossabot/amundsensearchlibrary | dd169e486f7a9cd3233e184a416872d892f55078 | [
"Apache-2.0"
] | 1 | 2019-09-21T23:59:47.000Z | 2019-09-21T23:59:47.000Z | search_service/api/table.py | fossabot/amundsensearchlibrary | dd169e486f7a9cd3233e184a416872d892f55078 | [
"Apache-2.0"
] | 1 | 2019-09-21T23:56:40.000Z | 2019-09-21T23:56:40.000Z | from http import HTTPStatus
from typing import Iterable, Any
from flask_restful import Resource, fields, marshal_with, reqparse
from search_service.proxy import get_proxy_client
table_fields = {
    "name": fields.String,
    "key": fields.String,
    # description can be empty, if no description is present in DB
    "description": fields.String,
    "cluster": fields.String,
    "database": fields.String,
    "schema_name": fields.String,
    "column_names": fields.List(fields.String),
    # tags can be empty list
    "tags": fields.List(fields.String),
    # last etl timestamp as epoch
    "last_updated_epoch": fields.Integer,
}

search_table_results = {
    "total_results": fields.Integer,
    "results": fields.Nested(table_fields, default=[])
}
TABLE_INDEX = 'table_search_index'
class SearchTableAPI(Resource):
    """
    Search Table API
    """

    def __init__(self) -> None:
        self.proxy = get_proxy_client()
        self.parser = reqparse.RequestParser(bundle_errors=True)
        self.parser.add_argument('query_term', required=True, type=str)
        self.parser.add_argument('page_index', required=False, default=0, type=int)
        self.parser.add_argument('index', required=False, default=TABLE_INDEX, type=str)
        super(SearchTableAPI, self).__init__()

    @marshal_with(search_table_results)
    def get(self) -> Iterable[Any]:
        """
        Fetch search results based on query_term.

        :return: list of table results. List can be empty if query
        doesn't match any tables
        """
        args = self.parser.parse_args(strict=True)
        try:
            results = self.proxy.fetch_table_search_results(
                query_term=args.get('query_term'),
                page_index=args.get('page_index'),
                index=args.get('index')
            )
            return results, HTTPStatus.OK
        except RuntimeError:
            err_msg = 'Exception encountered while processing search request'
            return {'message': err_msg}, HTTPStatus.INTERNAL_SERVER_ERROR


class SearchTableFieldAPI(Resource):
    """
    Search Table API with explicit field
    """

    def __init__(self) -> None:
        self.proxy = get_proxy_client()
        self.parser = reqparse.RequestParser(bundle_errors=True)
        self.parser.add_argument('query_term', required=False, type=str)
        self.parser.add_argument('page_index', required=False, default=0, type=int)
        self.parser.add_argument('index', required=False, default=TABLE_INDEX, type=str)
        super(SearchTableFieldAPI, self).__init__()

    @marshal_with(search_table_results)
    def get(self, *, field_name: str,
            field_value: str) -> Iterable[Any]:
        """
        Fetch search results based on query_term.

        :param field_name: which field we should search from (schema, tag, table)
        :param field_value: the value to search for the field
        :return: list of table results. List can be empty if query
        doesn't match any tables
        """
        args = self.parser.parse_args(strict=True)
        try:
            results = self.proxy.fetch_table_search_results_with_field(
                query_term=args.get('query_term'),
                field_name=field_name,
                field_value=field_value,
                page_index=args.get('page_index'),
                index=args.get('index')
            )
            return results, HTTPStatus.OK
        except RuntimeError:
            err_msg = 'Exception encountered while processing search request'
            return {'message': err_msg}, HTTPStatus.INTERNAL_SERVER_ERROR
| 31.643478 | 88 | 0.650179 | 435 | 3,639 | 5.225287 | 0.264368 | 0.043995 | 0.034316 | 0.055433 | 0.615926 | 0.615926 | 0.593929 | 0.593929 | 0.593929 | 0.554333 | 0 | 0.000733 | 0.250344 | 3,639 | 114 | 89 | 31.921053 | 0.832478 | 0.149766 | 0 | 0.461538 | 0 | 0 | 0.113246 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.061538 | false | 0 | 0.061538 | 0 | 0.215385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
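What `marshal_with` does for these endpoints, conceptually: project the raw proxy result onto the declared field set, defaulting missing keys. This plain-dict sketch is an analogy, not the flask_restful machinery:

```python
table_field_names = ("name", "key", "description", "cluster")

def marshal(raw, field_names):
    # Keep only declared fields; absent ones become None.
    return {k: raw.get(k) for k in field_names}

raw = {"name": "users", "key": "db://cluster.schema/users", "extra": 1}
print(marshal(raw, table_field_names))
# {'name': 'users', 'key': 'db://cluster.schema/users', 'description': None, 'cluster': None}
```

Note that undeclared keys (`extra` here) are dropped, which is why the response shape stays stable regardless of what the search proxy returns.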
149ffff8d64cab91c2f9be2e3f80f8c9c1f39873 | 6,370 | py | Python | cidan/LSSC/SpatialBox.py | Mishne-Lab/cidan | 3f579b6d5a49e17690e9aa07dfb60d3e8c05e681 | [
"MIT"
] | 2 | 2020-11-24T17:47:23.000Z | 2021-05-20T16:19:53.000Z | cidan/LSSC/SpatialBox.py | Mishne-Lab/CIDAN | 30d1176773e3ad0f236ba342cba48c89492f4e63 | [
"MIT"
] | 4 | 2020-08-18T16:42:23.000Z | 2020-08-18T20:58:12.000Z | cidan/LSSC/SpatialBox.py | Mishne-Lab/cidan | 3f579b6d5a49e17690e9aa07dfb60d3e8c05e681 | [
"MIT"
] | 1 | 2020-08-12T18:47:22.000Z | 2020-08-12T18:47:22.000Z | import logging
from typing import Tuple
import numpy as np
from dask import delayed
logger1 = logging.getLogger("cidan.LSSC.SpatialBox")
class SpatialBox:
def __init__(self, box_num: int, total_boxes: int, image_shape: Tuple[int, int],
spatial_overlap: int):
logger1.debug(
"Spatial Box creation inputs: box num {0}, total boxes {1}, image shape {2}, spatial overlap {3}".format(
box_num, total_boxes, image_shape, spatial_overlap))
# TODO implement spatial overlap
self.box_num = box_num
self.total_boxes = total_boxes
self.image_shape = image_shape
self.boxes_per_row = int(total_boxes ** .5)
self.spatial_overlap = spatial_overlap
self.y_box_num = box_num // self.boxes_per_row
self.x_box_num = box_num - (self.y_box_num * self.boxes_per_row)
self.box_cord_1 = [((image_shape[0] // self.boxes_per_row) *
self.x_box_num) - spatial_overlap,
(image_shape[1] // self.boxes_per_row) *
self.y_box_num - spatial_overlap]
self.box_cord_2 = [(image_shape[0] // self.boxes_per_row) * (
self.x_box_num + 1) + spatial_overlap,
(image_shape[1] // self.boxes_per_row) * (
self.y_box_num + 1) + spatial_overlap]
self.box_cord_1[0] = 0 if self.box_cord_1[0] < 0 else self.box_cord_1[0]
self.box_cord_1[1] = 0 if self.box_cord_1[1] < 0 \
else self.box_cord_1[1]
self.box_cord_2[0] = image_shape[0] if self.box_cord_2[0] > image_shape[0] \
else \
self.box_cord_2[0]
self.box_cord_2[1] = image_shape[1] if self.box_cord_2[1] > image_shape[1] \
else self.box_cord_2[1]
self.shape = (self.box_cord_2[0] - self.box_cord_1[0],
self.box_cord_2[
1] - self.box_cord_1[1])
logger1.debug(("Spatial box creation: Boxes per row {0}, y_box_num {1}, " +
"x_box_num" + " {2}, box cord 1 {3}, box cord 2 {4}, shape {5}"
).format(self.boxes_per_row, self.y_box_num, self.x_box_num,
self.box_cord_1, self.box_cord_2, self.shape))
@delayed
def extract_box(self, dataset):
return dataset[:, self.box_cord_1[0]:self.box_cord_2[0], self.box_cord_1[1]:
self.box_cord_2[1]]
@delayed
def redefine_spatial_cord_2d(self, cord_list):
return [(x + self.box_cord_1[0], y + self.box_cord_1[1]) for x, y in cord_list]
def pointInBox(self, point):
"""
Checks if a point is in the box
Parameters
----------
point
Returns
-------
"""
return self.box_cord_1[0] <= point[0] <= self.box_cord_2[0] \
and self.box_cord_1[1] <= point[1] <= self.box_cord_2[1]
def point_to_box_point(self, point):
"""
        Converts a point in the image to its coordinates in the box
Parameters
----------
point 2d point
Returns
-------
        tuple of new coordinates
"""
return (point[0] - self.box_cord_1[0], point[1] - self.box_cord_1[1])
@delayed
def redefine_spatial_cord_1d(self, cord_list):
box_length = self.box_cord_2[1] - self.box_cord_1[1]
def change_1_cord(x):
return ((x // box_length) + self.box_cord_1[0]) * self.image_shape[
1] + self.box_cord_1[1] + x % box_length
return list(map(change_1_cord, cord_list))
def convert_1d_to_2d(self, cord_list):
def change_1_cord(cord_1d):
return int(cord_1d // self.shape[1]), int(cord_1d - (
cord_1d // self.shape[1]) * self.shape[1])
return list(map(change_1_cord, cord_list))
def data_w_out_spatial_overlap(self, data):
"""
Parameters
----------
data 2d dataset
Returns
-------
"""
if self.total_boxes == 1:
return data
x = [0, self.shape[0]]
y = [0, self.shape[1]]
        # Use each box's column to determine which overlap margins to trim:
        # first column, last column, or any interior column
if self.box_num % self.boxes_per_row == 0:
x[1] = self.shape[0] - self.spatial_overlap
elif self.box_num % self.boxes_per_row == self.boxes_per_row - 1:
x[0] = self.spatial_overlap
else:
x[0] = self.spatial_overlap
x[1] = self.shape[0] - self.spatial_overlap
        # Use each box's row to determine which overlap margins to trim:
        # first row, last row, or any interior row
if self.box_num // self.boxes_per_row == 0:
y[1] = self.shape[1] - self.spatial_overlap
elif self.box_num // self.boxes_per_row == self.boxes_per_row - 1:
y[0] = self.spatial_overlap
else:
y[0] = self.spatial_overlap
y[1] = self.shape[1] - self.spatial_overlap
return data[x[0]:x[1], y[0]:y[1]]
def combine_images(spatial_box_list, data_list):
"""
Parameters
----------
spatial_box_list
    data_list list of reshaped eigenvectors in the correct shape for these boxes
Returns
-------
"""
    # Walk the lists in a 2D layout; the number of spatial boxes is assumed to be a perfect square
spatial_box_num = len(spatial_box_list)
spatial_box_root = int(spatial_box_num ** .5)
data_matched = []
for y in range(spatial_box_root):
temp = []
for x in range(spatial_box_root):
current_spatial_box = spatial_box_list[y * spatial_box_root + x]
current_data = data_list[y * spatial_box_root + x]
temp.append(current_spatial_box.data_w_out_spatial_overlap(current_data))
stacked = np.vstack(temp)
data_matched.append(stacked)
all_data = np.hstack(data_matched)
return all_data
if __name__ == '__main__':
test = SpatialBox(box_num=0, total_boxes=9, image_shape=[1, 9, 9],
spatial_overlap=0)
pixel_list = test.redefine_spatial_cord_1d([0, 4, 8]).compute()
zeros = np.zeros((9 * 9))
zeros[pixel_list] = 1
print(zeros.reshape((9, 9)))
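The index arithmetic inside `redefine_spatial_cord_1d` can be expressed as a standalone sketch. The function name and the box-geometry values in the assertions below are illustrative only (box 0 of a 9x9 image split 3x3 with no overlap, matching the `__main__` demo above):

```python
def local_to_global_1d(x, box_cord_1, box_length, image_width):
    """Map a 1D pixel index inside a box to a 1D index in the full image."""
    row = x // box_length + box_cord_1[0]  # global row of the pixel
    col = x % box_length + box_cord_1[1]   # global column of the pixel
    return row * image_width + col
```

With `box_cord_1 = [0, 0]` and `box_length = 3` in a 9-wide image, local indices 0, 4, 8 map to global indices 0, 10, 20 (the box's diagonal), which is exactly what the demo's printed 9x9 grid shows.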
| 36.193182 | 117 | 0.574882 | 917 | 6,370 | 3.70229 | 0.137405 | 0.086598 | 0.116642 | 0.074227 | 0.53785 | 0.369072 | 0.285714 | 0.276583 | 0.200884 | 0.133726 | 0 | 0.036827 | 0.309419 | 6,370 | 175 | 118 | 36.4 | 0.73494 | 0.10989 | 0 | 0.141509 | 0 | 0.009434 | 0.043422 | 0.003864 | 0 | 0 | 0 | 0.005714 | 0 | 1 | 0.103774 | false | 0 | 0.037736 | 0.037736 | 0.254717 | 0.009434 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
14a0c11dcb55cf743f8fa6d47c0541a4c1e0d0e5 | 6,252 | py | Python | Experiments/TrainStackedClassification.py | christymarc/raycasting-simulation | ed9b92143d3eb1c5a25900419ead517f93f8c315 | [
"MIT"
] | null | null | null | Experiments/TrainStackedClassification.py | christymarc/raycasting-simulation | ed9b92143d3eb1c5a25900419ead517f93f8c315 | [
"MIT"
] | null | null | null | Experiments/TrainStackedClassification.py | christymarc/raycasting-simulation | ed9b92143d3eb1c5a25900419ead517f93f8c315 | [
"MIT"
] | null | null | null | # ---
# jupyter:
# jupytext:
# formats: py:light
# text_representation:
# extension: .py
# format_name: light
# format_version: '1.5'
# jupytext_version: 1.11.4
# kernelspec:
# display_name: Python 3 (ipykernel)
# language: python
# name: python3
# ---
from argparse import ArgumentParser
import matplotlib.pyplot as plt
import os.path
from os import path
from fastai.vision.all import *
from fastai.callback.progress import CSVLogger
from torchvision import transforms
# Assign GPU
torch.cuda.set_device(1)
print("Running on GPU: " + str(torch.cuda.current_device()))
# Constants (same for all trials)
VALID_PCT = 0.05
NUM_REPLICATES = 4
NUM_EPOCHS = 8
DATASET_DIR = Path("/raid/clark/summer2021/datasets")
MODEL_PATH_REL_TO_DATASET = Path("stacked_models2")
DATA_PATH_REL_TO_DATASET = Path("stacked_data2")
VALID_MAZE_DIR = Path("../Mazes/validation_mazes8x8/")
compared_models = {
"alexnet": alexnet,
"xresnext50": xresnext50,
"xresnext18": xresnext18,
"densenet121": densenet121,
}
img_dir = Path("/raid/clark/summer2021/datasets/corrected-wander-full/")
img_filenames = list(img_dir.glob("*.png"))
img_filenames.sort()
def get_pair_2(o):
curr_im_num = int(Path(o).name[:6])
prev_im_num = curr_im_num if curr_im_num == 0 else curr_im_num - 1
prev_im = img_filenames[prev_im_num]
img1 = Image.open(o).convert('RGB')
img2 = Image.open(prev_im).convert('RGB')
img1_arr = np.array(img1, dtype=np.uint8)
img2_arr = np.array(img2, dtype=np.uint8)
new_shape = list(img1_arr.shape)
new_shape[-1] = new_shape[-1] * 2
img3_arr = np.zeros(new_shape, dtype=np.uint8)
img3_arr[:, :, :3] = img1_arr
img3_arr[:, :, 3:] = img2_arr
return img3_arr.T.astype(np.float32)
def get_fig_filename(prefix: str, label: str, ext: str, rep: int) -> str:
fig_filename = f"{prefix}-{label}-{rep}.{ext}"
print(label, "filename :", fig_filename)
return fig_filename
def filename_to_class(filename) -> str:
angle = float(str(filename).split("_")[1].split(".")[0].replace("p", "."))
if angle > 0:
return "left"
elif angle < 0:
return "right"
else:
return "forward"
def prepare_dataloaders(dataset_name: str, prefix: str) -> DataLoaders:
path = DATASET_DIR / dataset_name
db = DataBlock(
blocks=((ImageBlock, ImageBlock), CategoryBlock),
get_items=get_image_files,
get_x=get_pair_2,
get_y=filename_to_class,
splitter=RandomSplitter(valid_pct=VALID_PCT)
)
dls = db.dataloaders(path, bs=64)
return dls # type: ignore
def train_model(
dls: DataLoaders,
model_arch: str,
pretrained: bool,
logname: Path,
modelname: Path,
prefix: str,
rep: int,
):
learn = cnn_learner(
dls,
compared_models[model_arch],
metrics=accuracy,
pretrained=pretrained,
cbs=CSVLogger(fname=logname),
)
out_channels = learn.model[0][0][0].out_channels
kernel_size = learn.model[0][0][0].kernel_size
stride = learn.model[0][0][0].stride
padding = learn.model[0][0][0].padding
if (model_arch == "alexnet"):
learn.model[0][0][0] = nn.Conv2d(6, out_channels, kernel_size=kernel_size, stride=stride, padding=padding)
else:
bias = learn.model[0][0][0].bias
learn.model[0][0][0] = nn.Conv2d(6, out_channels, kernel_size=kernel_size, stride=stride, padding=padding, bias=bias)
if pretrained:
learn.fine_tune(NUM_EPOCHS)
else:
learn.fit_one_cycle(NUM_EPOCHS)
    # The following line is necessary for pickling
learn.remove_cb(CSVLogger)
learn.export(modelname)
"""
learn.show_results()
plt.savefig(get_fig_filename(prefix, "results", "pdf", rep))
interp = ClassificationInterpretation.from_learner(learn)
interp.plot_top_losses(9, figsize=(15, 10))
plt.savefig(get_fig_filename(prefix, "toplosses", "pdf", rep))
interp.plot_confusion_matrix(figsize=(10, 10))
plt.savefig(get_fig_filename(prefix, "confusion", "pdf", rep))"""
def main():
arg_parser = ArgumentParser("Train stacked classification networks.")
arg_parser.add_argument(
"model_arch", help="Model architecture (see code for options)"
)
arg_parser.add_argument(
"dataset_name", help="Name of dataset to use (handmade-full | corrected-wander-full)"
)
arg_parser.add_argument(
"--pretrained", action="store_true", help="Use pretrained model"
)
args = arg_parser.parse_args()
# TODO: not using this (would require replacing first layer)
# rgb_instead_of_gray = True
# Make dirs as needed
model_dir = DATASET_DIR / args.dataset_name / MODEL_PATH_REL_TO_DATASET
model_dir.mkdir(exist_ok=True)
print(f"Created model dir (or it already exists) : '{model_dir}'")
data_dir = DATASET_DIR / args.dataset_name / DATA_PATH_REL_TO_DATASET
data_dir.mkdir(exist_ok=True)
print(f"Created data dir (or it already exists) : '{data_dir}'")
file_prefix = "classification-" + args.model_arch
# file_prefix += "-rgb" if rgb_instead_of_gray else "-gray"
file_prefix += "-pretrained" if args.pretrained else "-notpretrained"
fig_filename_prefix = data_dir / file_prefix
dls = prepare_dataloaders(args.dataset_name, fig_filename_prefix)
# Train NUM_REPLICATES separate instances of this model and dataset
for rep in range(NUM_REPLICATES):
model_filename = DATASET_DIR / args.dataset_name / MODEL_PATH_REL_TO_DATASET / f"{file_prefix}-{rep}.pkl"
print("Model relative filename :", model_filename)
        # Skip if the model already exists (useful when resuming after a crash)
if path.exists(model_filename):
continue
log_filename = DATASET_DIR / args.dataset_name / DATA_PATH_REL_TO_DATASET / f"{file_prefix}-trainlog-{rep}.csv"
print("Log relative filename :", log_filename)
train_model(
dls,
args.model_arch,
args.pretrained,
log_filename,
model_filename,
fig_filename_prefix,
rep,
)
if __name__ == "__main__":
main()
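`get_pair_2` above feeds the network a 6-channel tensor built from the current and previous RGB frames. The standalone sketch below (function name is illustrative) reproduces that stacking on synthetic arrays, including the final transpose to a channels-first float tensor:

```python
import numpy as np

def stack_pair(img1_arr, img2_arr):
    # Concatenate two HxWx3 uint8 images along the channel axis -> HxWx6,
    # then transpose to channels-first float32 (as get_pair_2 does).
    new_shape = list(img1_arr.shape)
    new_shape[-1] *= 2
    stacked = np.zeros(new_shape, dtype=np.uint8)
    stacked[:, :, :3] = img1_arr
    stacked[:, :, 3:] = img2_arr
    return stacked.T.astype(np.float32)

a = np.ones((4, 4, 3), dtype=np.uint8)       # "current" frame
b = np.full((4, 4, 3), 2, dtype=np.uint8)    # "previous" frame
out = stack_pair(a, b)                       # shape (6, 4, 4)
```

Note that `.T` on a 3-D array reverses all axes, so the channel axis ends up first, which is why the first convolution in `train_model` is rebuilt with `in_channels=6`.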
| 29.630332 | 125 | 0.666667 | 833 | 6,252 | 4.762305 | 0.327731 | 0.007058 | 0.029997 | 0.021175 | 0.198135 | 0.172675 | 0.123519 | 0.10184 | 0.085707 | 0.085707 | 0 | 0.021938 | 0.212572 | 6,252 | 210 | 126 | 29.771429 | 0.783872 | 0.107006 | 0 | 0.06015 | 0 | 0 | 0.143466 | 0.042401 | 0 | 0 | 0 | 0.004762 | 0 | 1 | 0.045113 | false | 0 | 0.052632 | 0 | 0.142857 | 0.045113 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
14a1b02d39821a15e7fefd1639adef1cef0c66ac | 502 | py | Python | list-operations/Element-insertion/Python/insertion.py | Prince23598/cs-algorithms | 75c90a0603092e8d6d9c5b982beab6729c8cb516 | [
"MIT"
] | 239 | 2019-10-07T11:01:56.000Z | 2022-01-27T19:08:55.000Z | list-operations/Element-insertion/Python/insertion.py | ashfreakingoyal/cs-algorithms | 08f5aba5c3379e17d03b899fc36efcdccebd181c | [
"MIT"
] | 176 | 2019-10-07T06:59:49.000Z | 2020-09-30T08:16:22.000Z | list-operations/Element-insertion/Python/insertion.py | ashfreakingoyal/cs-algorithms | 08f5aba5c3379e17d03b899fc36efcdccebd181c | [
"MIT"
] | 441 | 2019-10-07T07:34:08.000Z | 2022-03-15T07:19:58.000Z |
# this program doesn't use built-in list functions except append
def insertion(arr): #inserting an element at a particular index of the list
global n
    x = int(input("position :"))
y = int(input("element :"))
arr.append(arr[-1])
for i in range(n-1,x,-1):
        arr[i] = arr[i-1]
arr[x] = y
    n = n + 1  # incrementing the list size variable
n = int(input('No. of elements : '))
arr = []
print("Input the elements of the list\n")
for i in range(n):
    x = int(input())
arr.append(x)
insertion(arr)
print(arr)
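The same append-and-shift insertion can be written as a reusable, input-free function (the name `insert_at` is illustrative), which makes the index logic easier to test:

```python
def insert_at(arr, pos, element):
    """Insert element at index pos using only append plus index assignment."""
    arr.append(arr[-1])                    # grow the list by one slot
    for i in range(len(arr) - 2, pos, -1):
        arr[i] = arr[i - 1]                # shift elements one step right
    arr[pos] = element
    return arr
```

For example, `insert_at([1, 2, 4, 5], 2, 3)` yields `[1, 2, 3, 4, 5]`.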
| 20.08 | 75 | 0.663347 | 91 | 502 | 3.659341 | 0.43956 | 0.096096 | 0.054054 | 0.06006 | 0.072072 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012136 | 0.179283 | 502 | 24 | 76 | 20.916667 | 0.796117 | 0.298805 | 0 | 0 | 0 | 0 | 0.195402 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0 | 0 | 0.058824 | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
14a4da818e55244ad6c1eca2a65a5ca929728ea0 | 4,652 | py | Python | windows_auth/scheduler.py | sourcery-ai-bot/django-windowsauth | 6701dcf4672e9d642185a547c3e193568ae98103 | [
"BSD-3-Clause"
] | 20 | 2020-12-18T12:24:47.000Z | 2022-03-16T12:15:08.000Z | windows_auth/scheduler.py | sourcery-ai-bot/django-windowsauth | 6701dcf4672e9d642185a547c3e193568ae98103 | [
"BSD-3-Clause"
] | 4 | 2021-01-15T16:42:18.000Z | 2021-10-30T03:38:56.000Z | windows_auth/scheduler.py | sourcery-ai-bot/django-windowsauth | 6701dcf4672e9d642185a547c3e193568ae98103 | [
"BSD-3-Clause"
] | 2 | 2021-07-23T19:25:41.000Z | 2022-03-16T12:15:10.000Z | import os
from pathlib import Path
from typing import Optional
import win32com.client
from pythoncom import com_error
from django.conf import settings
from django.utils import timezone
_PYTHON_PATH = str(Path(os.environ.get("VIRTUAL_ENV")) / "Scripts" / "python.exe")
LOCAL_SYSTEM = "NT Authority\\LocalSystem"
LOCAL_SERVICE = "NT Authority\\LocalService"
NETWORK_SERVICE = "NT Authority\\NetworkService"
APPLICATION_POOL_IDENTITY = "IIS AppPool\\DefaultAppPool"
_scheduler = win32com.client.Dispatch('Schedule.Service')
_scheduler.Connect()
def _get_absolute_command_line(command_line):
return f"{Path(settings.BASE_DIR) / 'manage.py'} {command_line}"
def create_task_definition(command_line, description: str = "", priority: int = 3,
timeout: timezone.timedelta = timezone.timedelta(hours=1)):
"""
Create a new Scheduled Task definition for a Django Management Command.
:param command_line: The management command with arguments.
:param description: Task description.
:param priority: Task priority https://docs.microsoft.com/en-us/windows/win32/taskschd/tasksettings-priority.
:param timeout: Maximum execution time.
:return: Task Definition https://docs.microsoft.com/en-us/windows/win32/taskschd/taskdefinition
"""
# create task
task_def = _scheduler.NewTask(0)
task_def.RegistrationInfo.Description = description
task_def.RegistrationInfo.Source = os.path.basename(settings.BASE_DIR)
# run as a Service Account
task_def.Principal.LogonType = 5
task_def.Principal.RunLevel = 1
# configure settings
task_def.Settings.Enabled = True
task_def.Settings.StopIfGoingOnBatteries = False
task_def.Settings.StartWhenAvailable = True
task_def.Settings.WakeToRun = True
task_def.Settings.AllowDemandStart = True
task_def.Settings.AllowHardTerminate = True
task_def.Settings.Priority = priority
task_def.Settings.ExecutionTimeLimit = f"PT{timeout.seconds}S"
task_def.Settings.RestartCount = 3
task_def.Settings.RestartInterval = "PT1M"
# create action https://docs.microsoft.com/en-us/windows/win32/taskschd/actioncollection-create
action = task_def.Actions.Create(0)
# parameters https://docs.microsoft.com/en-us/windows/win32/taskschd/execaction
action.Path = _PYTHON_PATH
action.Arguments = _get_absolute_command_line(command_line)
action.WorkingDirectory = str(settings.BASE_DIR)
return task_def
def add_schedule_trigger(task_def, interval: timezone.timedelta,
random: Optional[timezone.timedelta] = None) -> None:
"""
Add trigger for time scheduled executions.
    When interval is greater than one day, the interval is rounded to whole days.
:param task_def: Task Definition https://docs.microsoft.com/en-us/windows/win32/taskschd/taskdefinition.
:param interval: Time delta between executions.
:param random: Randomize execution inside a time span.
"""
trigger = task_def.Triggers.Create(2) # daily trigger
trigger.StartBoundary = timezone.now().isoformat()
if interval.days:
        # interval longer than a day: use whole-day granularity
trigger.DaysInterval = interval.days
else:
trigger.Repetition.Duration = "P1D"
trigger.Repetition.Interval = f"PT{interval.seconds}S"
    if random:
        trigger.RandomDelay = f"PT{random.seconds}S"
def register_task(task_def, name: str, folder: str = None,
username: str = LOCAL_SERVICE, password: Optional[str] = None):
"""
Register new task definition to Windows Task Scheduler.
:param task_def: Task Definition https://docs.microsoft.com/en-us/windows/win32/taskschd/taskdefinition.
:param name: Task name.
:param folder: Task folder (created automatically).
:param username: Principal username (for service principals)
:param password: Principal password
:return: Registered Task https://docs.microsoft.com/en-us/windows/win32/taskschd/registeredtask
"""
if folder:
# get or create folder
root_folder = _scheduler.GetFolder("\\")
try:
task_folder = root_folder.GetFolder(folder)
except com_error:
task_folder = root_folder.CreateFolder(folder)
else:
task_folder = _scheduler.GetFolder("\\")
# register task https://docs.microsoft.com/en-us/windows/win32/taskschd/taskfolder-registertaskdefinition
return task_folder.RegisterTaskDefinition(
name,
task_def,
6, # create or update
username,
password,
1 if password else 3, # password or interactive token (user is logged on)
)
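The duration strings above (`"PT3600S"`, `"P1D"`, `"PT1M"`) follow the ISO 8601 duration format that Task Scheduler expects. A minimal helper for building them from a `timedelta` might look like the sketch below (the function name is illustrative; it uses `total_seconds()` because `timedelta.seconds` holds only the sub-day remainder and silently drops the days component):

```python
from datetime import timedelta

def to_iso_duration(delta):
    """Render a timedelta as an ISO 8601 duration in whole seconds."""
    # total_seconds() includes the days component; timedelta.seconds does not.
    return f"PT{int(delta.total_seconds())}S"
```

For example, `to_iso_duration(timedelta(hours=1))` gives `"PT3600S"`, matching the `ExecutionTimeLimit` assignment in `create_task_definition`.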
| 39.423729 | 113 | 0.72055 | 560 | 4,652 | 5.8625 | 0.321429 | 0.049041 | 0.04569 | 0.051173 | 0.179714 | 0.168139 | 0.148035 | 0.148035 | 0.148035 | 0.106914 | 0 | 0.008709 | 0.185512 | 4,652 | 117 | 114 | 39.760684 | 0.857746 | 0.350817 | 0 | 0.029851 | 0 | 0 | 0.095386 | 0.054752 | 0 | 0 | 0 | 0 | 0 | 1 | 0.059701 | false | 0.044776 | 0.104478 | 0.014925 | 0.208955 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
14a4e5e4e89e6e753f18f7181ec73bb1d3a9ba73 | 5,777 | py | Python | server/app.py | sofignatova/02books | 9eed066fee5503c88359958708dfb8eba56e465a | [
"MIT"
] | 38 | 2020-12-22T01:15:38.000Z | 2021-11-09T11:01:40.000Z | server/app.py | sofignatova/02books | 9eed066fee5503c88359958708dfb8eba56e465a | [
"MIT"
] | 1 | 2020-12-21T19:11:11.000Z | 2020-12-21T19:11:11.000Z | server/app.py | sofignatova/02books | 9eed066fee5503c88359958708dfb8eba56e465a | [
"MIT"
] | 3 | 2020-12-22T04:17:50.000Z | 2020-12-22T09:03:37.000Z | import json
import os
import os.path
from typing import List, Optional
import flask
from flask import Flask, Response, jsonify, request
import api
import corpus
import correct_runs_suggestor
import filestorage
import firestore_storage
import storage
import word_suggestor
app = Flask(__name__)
if os.environ.get("GAE_APPLICATION"):
app.config["STORAGE"] = firestore_storage.FirestoreStorage()
else:
app.config["STORAGE"] = filestorage.FileStorage(os.path.join(app.root_path, "data"))
app.config["CORPORA"] = corpus.get_corpora()
def _get_storage() -> storage.Storage:
return app.config["STORAGE"]
def _get_suggestor() -> word_suggestor.WordSuggestor:
if "suggestor" in flask.g:
return flask.g.suggestor
suggestion_data = _get_storage().load_suggestion_data(
correct_runs_suggestor.CorrectRunsSuggestor.NAME, _get_user_id()
)
flask.g.suggestor = (
correct_runs_suggestor.CorrectRunsSuggestor.from_suggestion_data(
suggestion_data
)
)
return flask.g.suggestor
def _get_corpora() -> List[corpus.Corpus]:
return app.config["CORPORA"]
def _get_user_id() -> str:
user_id = request.args.get("user_id")
if user_id is None:
raise storage.UserDoesNotExist("user not specified")
store = _get_storage()
store.check_user_exists(user_id)
return user_id
def _get_default_settings() -> storage.UserSettings:
return storage.UserSettings(selected_corpus_id="curated")
def _get_settings() -> storage.UserSettings:
if "settings" in flask.g:
return flask.g.settings
settings = _get_storage().load_user_settings(_get_user_id())
if settings is None:
settings = _get_default_settings()
flask.g.settings = settings
return settings
def _get_corpus() -> corpus.Corpus:
selected_corpus_id = _get_settings().selected_corpus_id
return corpus.get_corpus(selected_corpus_id)
def _string_to_word(raw_word: str, c: Optional[corpus.Corpus] = None) -> api.Word:
settings = _get_settings()
c = c or _get_corpus()
suggestor = _get_suggestor()
level_of_mastery = suggestor.get_mastery(raw_word)
return api.Word(
word=raw_word,
level_of_mastery=level_of_mastery or 0.0,
is_new=level_of_mastery is None,
corpus_count=c.get_count(raw_word),
removed=raw_word in settings.removed_words,
)
@app.errorhandler(storage.UserDoesNotExist)
def handle_user_does_not_exist(e):
data = {"errors": [{"detail": str(e)}]}
return Response(json.dumps(data), status=400, content_type="application/json")
@app.route("/_ah/warmup")
def warmup():
"""Handle App Engine warmup requests.
See https://cloud.google.com/appengine/docs/standard/python3/configuring-warmup-requests.
"""
return "", 200, {}
@app.route("/api/users", methods=["POST"])
def handle_users():
store = _get_storage()
user_id = store.create_user()
store.save_user_settings(user_id, _get_default_settings())
return jsonify(data={"id": user_id})
@app.route("/api/corpora", methods=["GET"])
def handle_corpora():
return jsonify(
data=[
api.Corpus(
id_=c.corpus_id,
name=c.name,
description=c.description,
link=c.url,
words=[_string_to_word(s, c) for s in c.ordered_word_list],
source=api.Corpus.CorpusSource(c.source.value),
reader_level=c.reader_level,
).to_dict()
for c in _get_corpora()
],
)
@app.route("/api/settings", methods=["GET", "PATCH"])
def handle_settings():
user_settings = _get_settings()
if request.method == "PATCH":
if "removed_words" in request.json:
removed_words = request.json["removed_words"]
else:
removed_words = user_settings.removed_words
if "selected_corpus_id" in request.json:
selected_corpus_id = request.json["selected_corpus_id"]
else:
selected_corpus_id = user_settings.selected_corpus_id
user_settings = storage.UserSettings(
selected_corpus_id=selected_corpus_id,
removed_words=removed_words,
)
_get_storage().save_user_settings(_get_user_id(), user_settings)
return jsonify(
data=api.UserSettings(
selected_corpus_id=user_settings.selected_corpus_id,
removed_words=user_settings.removed_words,
).to_dict()
)
@app.route("/api/trainings", methods=["POST"])
def handle_trainings():
training = api.Training.from_dict(request.json)
store = _get_storage()
user_id = _get_user_id()
suggestion_data = _get_suggestor().update_suggestion_data(
training.sentence, [(wc.word, wc.correct) for wc in training.word_correctness]
)
store.save_suggestion_data(_get_suggestor().NAME, user_id, suggestion_data)
store.record_result(
user_id,
storage.TrainingResult(
training.sentence,
word_results=[
storage.WordResult(wc.word, wc.correct)
for wc in training.word_correctness
],
),
)
return jsonify(data={})
@app.route("/api/words", methods=["GET"])
def handle_words():
raw_words = []
if "suggestions" in request.args:
word_list = _get_corpus().ordered_word_list
settings = _get_settings()
raw_words.extend(
_get_suggestor().suggest(
word_list,
frozenset(settings.removed_words),
count=10,
)
)
for word in request.args.getlist("word"):
raw_words.append(word)
return jsonify(data=[_string_to_word(raw_word) for raw_word in raw_words])
| 28.458128 | 93 | 0.664532 | 706 | 5,777 | 5.123229 | 0.195467 | 0.026541 | 0.057506 | 0.023224 | 0.159248 | 0.080177 | 0.049212 | 0.049212 | 0.024883 | 0.024883 | 0 | 0.002462 | 0.226588 | 5,777 | 202 | 94 | 28.59901 | 0.807073 | 0.021638 | 0 | 0.103226 | 0 | 0 | 0.054462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.096774 | false | 0 | 0.083871 | 0.025806 | 0.290323 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
14a75c9a99d985c478a454429ef963da655e67d0 | 24,332 | py | Python | hdltools/abshdl/highlvl.py | brunosmmm/hdltools | a98ca8c4d168740fa229c939a7b1f31ea73eec24 | [
"MIT"
] | 2 | 2020-02-28T13:02:39.000Z | 2021-06-30T09:15:35.000Z | hdltools/abshdl/highlvl.py | brunosmmm/hdltools | a98ca8c4d168740fa229c939a7b1f31ea73eec24 | [
"MIT"
] | 1 | 2020-03-22T17:32:45.000Z | 2020-03-23T15:43:39.000Z | hdltools/abshdl/highlvl.py | brunosmmm/hdltools | a98ca8c4d168740fa229c939a7b1f31ea73eec24 | [
"MIT"
] | null | null | null | """High-level coding using python syntax to build HDL structures."""
import inspect
import ast
import textwrap
import sys
import re
from collections import deque
from hdltools.abshdl import HDLObject
from hdltools.abshdl.expr import HDLExpression
from hdltools.abshdl.signal import HDLSignal, HDLSignalSlice
from hdltools.abshdl.port import HDLModulePort
from hdltools.abshdl.assign import HDLAssignment, HDLLazyValue
from hdltools.abshdl.ifelse import HDLIfElse, HDLIfExp
from hdltools.hdllib.patterns import (
ClockedBlock,
ClockedRstBlock,
ParallelBlock,
SequentialBlock,
)
from hdltools.hdllib.fsm import FSM
from hdltools.abshdl.concat import HDLConcatenation
from hdltools.abshdl.vector import HDLVectorDescriptor
from hdltools.abshdl.macro import HDLMacroValue
class PatternNotAllowedError(Exception):
"""Code pattern not allowed."""
pass
class HDLPlaceholderSignal(HDLSignal):
"""Placeholder signal."""
def __init__(self, *args, **kwargs):
"""Initialize."""
super().__init__("other", *args, **kwargs)
class HDLBlock(HDLObject, ast.NodeVisitor):
"""Build HDL blocks from python syntax."""
_CUSTOM_TYPE_MAPPING = {}
_PATTERN_NAMES = [
"ClockedBlock",
"ClockedRstBlock",
"ParallelBlock",
"SequentialBlock",
"HDLBlock",
]
def __init__(self, mod=None, symbols=None, **kwargs):
"""Initialize."""
super().__init__()
self._init()
# build internal signal scope
self.signal_scope = {}
if mod is not None:
self._add_to_scope(**mod.get_signal_scope())
self._hdlmod = mod
self._add_to_scope(**kwargs)
if symbols is None:
self._symbols = {}
else:
self._symbols = symbols
self.fsms = {}
def _init(self):
"""Initialize or re-initialize."""
self.scope = None
self.current_scope = None
self.block = None
self.consts = None
self._current_block = deque()
self._current_block_kwargs = {}
self._verify_signal_name = True
def __call__(self, fn):
"""Decorate."""
def wrapper_BlockBuilder(*args, **kwargs):
self._init()
self._build(fn, fn_kwargs=kwargs)
return self.get()
return wrapper_BlockBuilder
def apply_on_ast(self, tree):
"""Do procedures directly on AST."""
self.tree = tree
self.visit(self.tree)
def _signal_lookup(self, sig_name):
"""Signal lookup."""
if isinstance(sig_name, int):
return sig_name
if self.signal_scope is not None:
if sig_name in self.signal_scope:
if isinstance(
self.signal_scope[sig_name], HDLPlaceholderSignal
):
# go find actual signal
# FIXME: should return a flag indicating placeholder
return self._current_block_kwargs[sig_name]
return self.signal_scope[sig_name]
else:
return None
else:
# search in globals
if sig_name in globals():
return globals()[sig_name]
else:
return None
def _build(self, target, fn_kwargs):
for kwarg in fn_kwargs.values():
if not isinstance(
kwarg, (HDLSignal, HDLSignalSlice, HDLModulePort, int)
):
raise RuntimeError(
"block kwargs must be of HDLSignal, HDLSignalSlice, "
"HDLModulePort or integer type"
)
self._current_block_kwargs = fn_kwargs
src = inspect.getsource(target)
self.tree = ast.parse(textwrap.dedent(src), mode="exec")
self.visit(self.tree)
def visit_FunctionDef(self, node):
"""Visit function declaration."""
# starting point is function declaration. Remove our own decorator.
decorator_list = [
x
for x in node.decorator_list
if x.func.id != self.__class__.__name__
]
if len(decorator_list) == 0:
raise RuntimeError(
"must be used in conjunction with a HDL block"
" decorator, like ClockedBlock, ParallelBlock"
)
for decorator in decorator_list:
try:
decorator_class = getattr(
sys.modules[__name__], decorator.func.id
)
except:
if decorator.func.id not in self._CUSTOM_TYPE_MAPPING:
decorator_class = None
else:
decorator_class = self._CUSTOM_TYPE_MAPPING[
decorator.func.id
]
if decorator.func.id == "SequentialBlock":
# sequential block.
args = []
for arg in decorator.args:
_arg = self._signal_lookup(arg.id)
if _arg is None:
continue
args.append(_arg)
block = SequentialBlock.get(*args)
if self.block is None:
self.block = block
self.scope = self.block.scope
self.current_scope = self.scope
else:
self.scope.add(block)
self.current_scope = block.scope
elif decorator.func.id in ("ClockedBlock", "ClockedRstBlock"):
# a clocked block.
# rebuild args
args = []
for arg in decorator.args:
_arg = self._signal_lookup(arg.id)
if _arg is None:
continue
args.append(_arg)
if decorator.func.id == "ClockedBlock":
block = ClockedBlock.get(*args)
else:
block = ClockedRstBlock.get(*args)
if self.block is None:
self.block = block
self.scope = self.block.scope
self.current_scope = self.scope
else:
self.scope.add(block)
self.current_scope = block.scope
elif decorator.func.id == "ParallelBlock":
block = ParallelBlock.get()
if self.block is None:
self.block = block
self.scope = self.block
self.current_scope = self.scope
else:
self.block.add(block)
self.current_scope = block
elif decorator_class is not None and issubclass(
decorator_class, FSM
):
if node.name in self.fsms:
raise PatternNotAllowedError(
"FSM '{}' already declared.".format(node.name)
)
# rebuild args
args = []
for arg in decorator.args:
_arg = self._signal_lookup(arg.id)
if _arg is None:
continue
args.append(_arg)
kwargs = {}
for kw in decorator.keywords:
if isinstance(kw.value, ast.Str):
kwargs[kw.arg] = kw.value.s
# add signal scope in the mix
kwargs["_signal_scope"] = self.signal_scope
kwargs["instance_name"] = node.name
block, const, fsm = decorator_class.get(*args, **kwargs)
# perform checks
state_var = fsm.state_var_name
for fsm_name, _fsm in self.fsms.items():
if _fsm.state_var_name.name == state_var.name:
raise PatternNotAllowedError(
"state variable '{}' re-utilized in FSM '{}'".format(
node.name
)
)
self.fsms[node.name] = fsm
# go out of tree
fsm = FSMBuilder(block, self.signal_scope)
fsm._build(decorator_class)
if self.block is None:
self.block = block
self.scope = self.block
self.current_scope = self.scope
else:
self.block.add(block)
self.current_scope = block
if self.consts is None:
self.consts = {c.name: c for c in const}
else:
self.consts.update({c.name: c for c in const})
# FIXME: this should probably come at the beginning
if node.args.args is not None:
for arg in node.args.args:
if arg.arg not in self._current_block_kwargs:
raise RuntimeError(
f"while building block: missing argument '{arg.arg}'"
)
# enforce legality of scope
if node.args.args is not None:
scope_add = {
arg.arg: HDLPlaceholderSignal(arg.arg, size=1)
for arg in node.args.args
}
self._add_to_scope(**scope_add)
# for arg in node.args.args:
# if arg.arg not in self.signal_scope:
# raise NameError(
# 'in block declaration: "{}",'
# ' signal "{}" is not available'
# " in current module scope".format(node.name, arg.arg)
# )
# push function name to stack
self._current_block.append((node.name, self._current_block_kwargs))
self.generic_visit(node)
_, self._current_block_kwargs = self._current_block.pop()
return node
def visit_If(self, node):
"""Visit If statement."""
self.visit(node.test)
ifelse = HDLIfElse(HDLExpression(ast.Expression(body=node.test)))
self.current_scope.add([ifelse])
last_scope = self.current_scope
# ordered visit, two scopes, so separe
self.current_scope = ifelse.if_scope
for _node in node.body:
self.visit(_node)
self.current_scope = ifelse.else_scope
for _node in node.orelse:
self.visit(_node)
self.current_scope = last_scope
return node
def visit_Subscript(self, node):
"""Visit Subscripts."""
if isinstance(node.value, ast.Name):
signal = self._signal_lookup(node.value.id)
if signal is None:
raise NameError(
'in "{}": signal "{}" not available in'
" current scope".format(
self._get_current_block(), node.value.id
)
)
if isinstance(node.slice, ast.Index):
index = self.visit(node.slice.value)
vec = HDLVectorDescriptor(index, index)
return HDLSignalSlice(signal, vec)
elif isinstance(node.slice, ast.Slice):
if isinstance(node.slice.upper, ast.Constant):
upper = node.slice.upper.value
else:
upper = node.slice.upper
if isinstance(node.slice.lower, ast.Constant):
lower = node.slice.lower.value
else:
lower = node.slice.lower
return HDLSignalSlice(signal, [upper, lower])
elif isinstance(node.slice, ast.Constant):
if isinstance(node.slice.value, int):
vec = HDLVectorDescriptor(
node.slice.value, node.slice.value
)
return HDLSignalSlice(signal, vec)
else:
raise TypeError(
"type {} not supported".format(
node.slice.value.__class__.__name__
)
)
else:
raise TypeError(
"type {} not supported".format(
node.slice.__class__.__name__
)
)
else:
raise TypeError(
"type {} not supported".format(node.value.__class__.__name__)
)
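The branches above mirror the node shapes `ast` produces for subscripts. A minimal standalone sketch (pure stdlib; note that `ast.Index` was removed in Python 3.9, so recent interpreters hand `visit_Subscript` an `ast.Slice` or `ast.Constant` directly — the compatibility fallback below is only needed on older versions):

```python
import ast

# A Verilog-style bit-range slice: ast names the bounds after Python
# slice syntax, so in sig[7:0] the "lower" attribute holds 7.
tree = ast.parse("sig[7:0]", mode="eval")
sub = tree.body
assert isinstance(sub, ast.Subscript)

sl = sub.slice
msb = sl.lower.value   # 7
lsb = sl.upper.value   # 0

# A plain index: ast.Constant on Python >= 3.9, ast.Index(Constant) before.
bit = ast.parse("sig[3]", mode="eval").body.slice
idx = bit.value if isinstance(bit, ast.Constant) else bit.value.value
```

This is why the visitor must branch on three slice types to stay portable across interpreter versions.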
def visit_Constant(self, node):
"""Visit Constant."""
if isinstance(node.value, int):
return HDLExpression(node.value)
return node
def visit_Name(self, node):
"""Visit Name."""
signal = self._signal_lookup(node.id)
if signal is not None:
if isinstance(signal, HDLSignalSlice):
signal_name = signal.signal.name
elif isinstance(signal, (HDLSignal, HDLModulePort)):
signal_name = signal.name
elif isinstance(signal, int):
signal_name = signal
else:
                raise RuntimeError(
                    "cannot resolve a name for signal '{}'".format(node.id)
                )
else:
signal_name = node.id
if self._verify_signal_name:
if signal is None:
raise NameError("unknown name: {}".format(node.id))
node.id = signal_name
return HDLExpression(signal_name)
def visit_Assign(self, node):
"""Visit Assignments."""
self.generic_visit(node)
assignments = []
# check assignees (targets)
assignees = []
for target in node.targets:
if isinstance(target, ast.Attribute):
# attributes are not allowed, except for self access
if target.value.id == "self":
# bypass attribute access directly,
# later on we can execute the block itself in python
# if necessary
target.id = target.attr
else:
raise PatternNotAllowedError(
"Attribute access is not allowed in HDL blocks."
)
if self._signal_lookup(target.id) is None:
if self._signal_lookup("reg_" + target.id) is None:
raise NameError(
'in "{}": signal "{}" not available in'
" current scope".format(
self._get_current_block(), target.id
)
)
else:
target.id = "reg_" + target.id
assignees.append(target.id)
# check value assigned
if isinstance(node.value, ast.Name):
if self._signal_lookup(node.value.id) is None:
raise NameError(
'in "{}": signal "{}" not available in'
" current scope".format(
self._get_current_block(), node.value.id
)
)
for assignee in assignees:
assignments.append(
HDLAssignment(
self._signal_lookup(assignee),
self._signal_lookup(node.value.id),
)
)
elif isinstance(node.value, ast.Constant):
for assignee in assignees:
assignments.append(
HDLAssignment(
self._signal_lookup(assignee),
HDLExpression(node.value.value),
)
)
elif isinstance(node.value, (ast.List, ast.Tuple)):
items = [self.visit(item) for item in node.value.elts]
for assignee in assignees:
assignments.append(
HDLAssignment(
self._signal_lookup(assignee),
HDLConcatenation(*items[::-1]),
)
)
elif isinstance(node.value, ast.Call):
for assignee in assignees:
args = [self._signal_lookup(arg.id) for arg in node.value.args]
kwargs = {
kw.arg: self._signal_lookup(kw.value.id)
for kw in node.value.keywords
}
if node.value.func.id in self._symbols:
fn = self._symbols[node.value.func.id]
# generate
ret = fn(*args, **kwargs)
else:
                    # defer resolution; wrap the call in a lazy value
                    fn = node.value.func.id
                    ret = HDLLazyValue(
                        fn,
                        fnargs=args,
                        fnkwargs=kwargs,
                    )
assignments.append(
HDLAssignment(self._signal_lookup(assignee), ret)
)
        else:
            expr = self.visit(node.value)
            for assignee in assignees:
                assignments.append(
                    HDLAssignment(self._signal_lookup(assignee), expr)
                )
# find out where to insert statement
if len(assignments) > 0:
self.current_scope.add(*assignments)
def visit_Call(self, node):
"""Visit call."""
if (
isinstance(node.func, ast.Name)
and node.func.id in self._PATTERN_NAMES
):
return
self._verify_signal_name = True
if (
isinstance(node.func, ast.Name)
and node.func.id not in self._symbols
and node.func.id not in self._CUSTOM_TYPE_MAPPING
):
# unless it is a callable object, in which case the name is here
raise NameError(
"unknown python function: '{}'".format(node.func.id)
)
# FIXME: disallow starred
args = []
for arg in node.args:
if isinstance(arg, ast.Name):
self.visit_Name(arg)
args.append(self._signal_lookup(arg.id))
else:
args.append(arg)
kwargs = {}
for kwarg in node.keywords:
if isinstance(kwarg.value, ast.Name):
self.visit_Name(kwarg.value)
kwargs[kwarg.arg] = self._signal_lookup(kwarg.value.id)
else:
kwargs[kwarg.arg] = kwarg.value
# self._verify_signal_name = False
# call?
# fn = self._symbols[node.func.id]
# ret = fn(*args, **kwargs)
# return ret
def visit_IfExp(self, node):
"""Visit If expression."""
ifexp = HDLIfExp(
self.visit(node.test),
if_value=self.visit(node.body),
else_value=self.visit(node.orelse),
)
self.generic_visit(node)
return ifexp
def visit_UnaryOp(self, node):
"""Visit Unary operations."""
if isinstance(node.op, ast.Not):
return HDLExpression(self.visit(node.operand)).bool_neg()
elif isinstance(node.op, ast.Invert):
return ~HDLExpression(self.visit(node.operand))
else:
raise TypeError(
"operator {} not supported".format(node.op.__class__.__name__)
)
def visit_BinOp(self, node):
"""Visit Binary operations."""
self.generic_visit(node)
return HDLExpression(ast.Expression(body=node))
def visit_Compare(self, node):
"""Visit Compare."""
self.generic_visit(node)
if isinstance(node.left, ast.Name):
# NOTE: Don't know why this is necessary anymore
left_sig = self._signal_lookup(node.left.id)
if left_sig is None:
                if self._signal_lookup("reg_" + str(node.left.id)) is not None:
# rename
node.left.id = "reg_" + str(node.left.id)
else:
raise NameError(
'in "{}": signal "{}" not available in'
" current scope".format(
self._get_current_block(), node.left.id
)
)
if len(node.comparators) > 1:
raise RuntimeError("only single comparison is allowed")
(comp,) = node.comparators
if isinstance(comp, ast.Name):
comp_sig = self._signal_lookup(comp.id)
if comp_sig is None:
if self._signal_lookup("reg_" + str(comp.id)) is not None:
# is state register, rename
comp.id = "reg_" + str(comp.id)
else:
raise NameError(
'in "{}": signal "{}" not available in'
" current scope".format(
self._get_current_block(), comp.id
)
)
return HDLExpression(node)
def visit_Expr(self, node):
"""Visit Expression."""
self.generic_visit(node)
return node
def get(self):
"""Get block."""
return (self.block, self.consts, self.fsms)
    def _get_current_block(self):
        """Return the name of the block currently being built, if any."""
        if not self._current_block:
            return None
        block, _ = self._current_block[-1]
        return block
def _add_to_scope(self, **kwargs):
"""Add signals to internal scope."""
for name, arg in kwargs.items():
if isinstance(arg, (HDLSignal, HDLSignalSlice)):
self.signal_scope[name] = arg
@classmethod
def add_custom_block(cls, block_class):
"""Add custom block class."""
cls._CUSTOM_TYPE_MAPPING[block_class.__name__] = block_class
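`HDLBlock` follows the `ast.NodeVisitor` dispatch pattern: each `visit_<NodeType>` method handles one AST node class, and `generic_visit` recurses into children. A self-contained sketch of the same dispatch mechanism (the `NameCollector` class here is illustrative, not part of the library):

```python
import ast

class NameCollector(ast.NodeVisitor):
    """Collect every identifier referenced in an expression."""

    def __init__(self):
        self.names = []

    def visit_Name(self, node):
        # called once per ast.Name; dispatch is by node class name
        self.names.append(node.id)
        self.generic_visit(node)

collector = NameCollector()
collector.visit(ast.parse("a + b[1] if c else d", mode="eval"))
```

Sorting the result makes the check independent of traversal order.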
class FSMBuilder(HDLBlock):
"""Helper class that builds FSMs."""
def __init__(self, fsm_block, signal_scope, **kwargs):
"""Initialize."""
self._block = fsm_block
super().__init__(**kwargs)
self.signal_scope = signal_scope
def _collect_states(self, cls):
state_methods = {}
for method_name, method in inspect.getmembers(cls):
cls_name = cls.__name__
m = re.match(
r"_{}__state_([a-zA-Z0-9_]+)".format(cls_name), method_name
)
if m is not None:
# found a state
if inspect.ismethod(method) or inspect.isfunction(method):
args = set(inspect.getfullargspec(method).args)
input_list = args - set(["self"])
state_methods[m.group(1)] = (method, input_list)
return state_methods
def _build(self, target):
self._class = target
self._states = self._collect_states(target)
super()._build(target, fn_kwargs={})
def visit_FunctionDef(self, node):
"""Visit function (state definition)."""
        # within the class body AST, state methods keep their unmangled names
        m = re.match(r"__state_([a-zA-Z0-9_]+)", node.name)
if m is not None:
            scopes = self._block.find_by_tag(
                "__autogen_case_{}".format(m.group(1))
            )
            if not scopes:
                raise RuntimeError(
                    "cannot find case scope for state '{}'".format(m.group(1))
                )
            self.current_scope = scopes[0]
self.generic_visit(node)
def visit_Str(self, node):
"""Visit strings and guess state changes."""
if self.current_scope is None:
return None
if node.s not in self._states:
raise RuntimeError("invalid state: {}".format(node.s))
return HDLMacroValue(node.s)
def visit_Name(self, node):
"""Visit a name."""
        # 'self' and the FSM base class name are not signals; skip them
if node.id in ("self", "FSM"):
return
if node.id not in self.signal_scope:
raise NameError("unknown signal in FSM: '{}'".format(node.id))
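`_collect_states` above relies on Python name mangling: a method declared as `__state_idle` inside class `MyFSM` is stored on the class as `_MyFSM__state_idle`, which the `_{cls}__state_` regex unmangles back to the bare state name. A standalone sketch of that discovery step (the `Light` class is hypothetical):

```python
import inspect
import re

class Light:
    def __state_red(self):      # stored as _Light__state_red
        pass

    def __state_green(self):    # stored as _Light__state_green
        pass

states = {}
for name, member in inspect.getmembers(Light):
    m = re.match(r"_{}__state_([a-zA-Z0-9_]+)".format(Light.__name__), name)
    if m is not None and inspect.isfunction(member):
        states[m.group(1)] = member
```

The double-underscore prefix keeps state methods from colliding with user code while still being discoverable by pattern.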
# tests.py (Dani-97/tbcnn, MIT license)
import numpy as np
import train_variants
a = np.arange(15)
print('a =', a)
print()
# ---------------------------------------
print('train_variants.create_sets(4, a, a)')
imgs, labs = train_variants.create_sets(4, np.copy(a), np.copy(a))
print(imgs)
assert len(imgs) == 4
assert len(labs) == 4
assert all([np.array_equal(i, l) for i, l in zip(imgs, labs)])
print()
# ---------------------------------------
print('train_variants.get_rotations(4, imgs, labs)')
train, test = train_variants.get_rotations(4, imgs, labs)
print(train)
assert len(train) == 4
assert len(test) == 4
assert all([len(t) == 2 and (t[0].shape[0] == 12 or t[0].shape[0] == 11) and (t[1].shape[0] == 12 or t[1].shape[0] == 11) for t in train])
assert all([len(t) == 2 and (t[0].shape[0] == 4 or t[0].shape[0] == 3) and (t[1].shape[0] == 4 or t[1].shape[0] == 3) for t in test])
for i in range(4):
    # intersection is symmetric, so a single check suffices
    assert np.intersect1d(train[i][0], test[i][0]).shape[0] == 0
assert np.union1d(train[i][0], test[i][0]).shape[0] == 15
print()
print('Success!')
# Gems/PythonAssetBuilder/Code/Tests/asset_builder_example.py (aaarsene/o3de, Apache-2.0 OR MIT)
"""
Copyright (c) Contributors to the Open 3D Engine Project. For complete copyright and license terms please see the LICENSE at the root of this distribution.
SPDX-License-Identifier: Apache-2.0 OR MIT
"""
#
# Simple example asset builder that processes *.foo files
#
import azlmbr.math
import azlmbr.asset.builder
import os, shutil
# the UUID must be unique amongst all the asset builders, Python or otherwise
busIdString = '{E4DB381B-61A0-4729-ACD9-4C8BDD2D2282}'
busId = azlmbr.math.Uuid_CreateString(busIdString, 0)
assetTypeScript = azlmbr.math.Uuid_CreateString('{82557326-4AE3-416C-95D6-C70635AB7588}', 0)
handler = None
jobKeyPrefix = 'Foo Job Key'
targetAssetFolder = 'foo_scripts'
# creates a single job to compile for a 'pc' platform
def on_create_jobs(args):
request = args[0] # azlmbr.asset.builder.CreateJobsRequest
response = azlmbr.asset.builder.CreateJobsResponse()
# note: if the asset builder is going to handle more than one file pattern it might need to check out
# the request.sourceFile to figure out what jobs need to be created
jobDescriptorList = []
for platformInfo in request.enabledPlatforms:
# for each enabled platform like 'pc' or 'ios'
platformId = platformInfo.identifier
# set up unique job key
jobKey = '{} {}'.format(jobKeyPrefix, platformId)
# create job descriptor
jobDesc = azlmbr.asset.builder.JobDescriptor()
jobDesc.jobKey = jobKey
jobDesc.set_platform_identifier(platformId)
jobDescriptorList.append(jobDesc)
        print('created a job for {} with key {}'.format(platformId, jobKey))
response.createJobOutputs = jobDescriptorList
response.result = azlmbr.asset.builder.CreateJobsResponse_ResultSuccess
return response
def get_target_name(sourceFullpath):
lua_file = os.path.basename(sourceFullpath)
lua_file = os.path.splitext(lua_file)[0]
lua_file = lua_file + '.lua'
return lua_file
def copy_foo_file(srcFile, dstFile):
try:
        dir_name = os.path.dirname(dstFile)
        os.makedirs(dir_name, exist_ok=True)
        shutil.copyfile(srcFile, dstFile)
        return True
    except OSError:
        return False
# using the incoming 'request' find the type of job via 'jobKey' to determine what to do
def on_process_job(args):
request = args[0] # azlmbr.asset.builder.ProcessJobRequest
response = azlmbr.asset.builder.ProcessJobResponse()
    # note: if possible, loop through the incoming data with 'yield' so the
    # processing cooperates with shutdown and cancel requests
if (request.jobDescription.jobKey.startswith(jobKeyPrefix)):
targetFile = os.path.join(targetAssetFolder, get_target_name(request.fullPath))
dstFile = os.path.join(request.tempDirPath, targetFile)
if (copy_foo_file(request.fullPath, dstFile)):
response.outputProducts = [azlmbr.asset.builder.JobProduct(dstFile, assetTypeScript, 0)]
response.resultCode = azlmbr.asset.builder.ProcessJobResponse_Success
response.dependenciesHandled = True
return response
def on_shutdown(args):
# note: user should attempt to close down any processing job if any running
global handler
if (handler is not None):
handler.disconnect()
handler = None
def on_cancel_job(args):
# note: user should attempt to close down any processing job if any running
print('>>> FOO asset builder - on_cancel_job <<<')
# register asset builder for source assets
def register_asset_builder():
assetPattern = azlmbr.asset.builder.AssetBuilderPattern()
assetPattern.pattern = '*.foo'
assetPattern.type = azlmbr.asset.builder.AssetBuilderPattern_Wildcard
builderDescriptor = azlmbr.asset.builder.AssetBuilderDesc()
builderDescriptor.name = "Foo Asset Builder"
builderDescriptor.patterns = [assetPattern]
builderDescriptor.busId = busId
builderDescriptor.version = 0
outcome = azlmbr.asset.builder.PythonAssetBuilderRequestBus(azlmbr.bus.Broadcast, 'RegisterAssetBuilder', builderDescriptor)
if outcome.IsSuccess():
# created the asset builder handler to hook into the notification bus
jobHandler = azlmbr.asset.builder.PythonBuilderNotificationBusHandler()
jobHandler.connect(busId)
jobHandler.add_callback('OnCreateJobsRequest', on_create_jobs)
jobHandler.add_callback('OnProcessJobRequest', on_process_job)
jobHandler.add_callback('OnShutdown', on_shutdown)
jobHandler.add_callback('OnCancel', on_cancel_job)
return jobHandler
# note: the handler has to be retained since Python retains the object ref count
# on_shutdown will clear the 'handler' to disconnect from the notification bus
handler = register_asset_builder()
14a9916ca2cb6361754e4375a1b044b446b70a15 | 12,112 | py | Python | autocert/api/bundle.py | Mozilla-GitHub-Standards/f085630706efa0d6c299c78ca54be3a7bc1bacc6818a5012c1eda6166401aebd | 26ea24a991cef080e4bd633d719185184aabf100 | [
"MIT"
] | null | null | null | autocert/api/bundle.py | Mozilla-GitHub-Standards/f085630706efa0d6c299c78ca54be3a7bc1bacc6818a5012c1eda6166401aebd | 26ea24a991cef080e4bd633d719185184aabf100 | [
"MIT"
] | null | null | null | autocert/api/bundle.py | Mozilla-GitHub-Standards/f085630706efa0d6c299c78ca54be3a7bc1bacc6818a5012c1eda6166401aebd | 26ea24a991cef080e4bd633d719185184aabf100 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
import os
import copy
import glob
import time
import tarfile
from io import BytesIO
from ruamel import yaml
from datetime import datetime, timedelta
from exceptions import AutocertError
from utils.dictionary import merge, head, body, head_body, keys_ending
from utils.yaml import yaml_format
from utils.isinstance import *
from utils import timestamp
from utils import sift
from utils import pki
from config import CFG
FILETYPE = {
'-----BEGIN RSA PRIVATE KEY-----': '.key',
'-----BEGIN CERTIFICATE REQUEST-----': '.csr',
'-----BEGIN NEW CERTIFICATE REQUEST-----': '.csr',
'-----BEGIN CERTIFICATE-----': '.crt',
}
class UnknownFileExtError(AutocertError):
def __init__(self, content):
message = f'unknown filetype for this content: {content}'
super(UnknownFileExtError, self).__init__(message)
class BundleFromObjError(AutocertError):
def __init__(self, ex):
message = 'bundle.from_obj error'
super(BundleFromObjError, self).__init__(message)
self.errors = [ex]
class BundleLoadError(AutocertError):
def __init__(self, bundle_path, bundle_name, ex):
message = f'error loading {bundle_name}.tar.gz from {bundle_path}'
super(BundleLoadError, self).__init__(message)
self.errors = [ex]
class VisitError(AutocertError):
def __init__(self, obj):
message = f'unknown type obj = {obj}'
super(VisitError, self).__init__(message)
def printit(obj):
print(obj)
return obj
def simple(obj):
if istuple(obj):
key, value = obj
if isinstance(value, str) and key[-3:] in ('crt', 'csr', 'key'):
value = key[-3:].upper()
return key, value
return obj
def abbrev(obj):
if istuple(obj):
key, value = obj
if isinstance(value, str) and key[-3:] in ('crt', 'csr', 'key'):
lines = value.split('\n')
lines = lines[:2] + ['...'] + lines[-3:]
value = '\n'.join(lines)
return key, value
return obj
def visit(obj, func=printit):
obj1 = None
if isdict(obj):
obj1 = {}
for key, value in obj.items():
if isscalar(value):
key1, value1 = visit((key, value), func=func)
else:
key1 = key
value1 = visit(value, func=func)
obj1[key1] = value1
elif islist(obj):
obj1 = []
for item in obj:
obj1.append(visit(item, func=func))
elif isscalar(obj) or istuple(obj) and len(obj) == 2:
obj1 = func(obj)
elif isinstance(obj, datetime):
obj1 = func(obj)
else:
raise VisitError(obj)
return obj1
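`visit` walks nested dicts and lists, applying `func` to scalar leaves and `(key, value)` pairs — this is how `simple` and `abbrev` redact PEM bodies before display. A simplified standalone version of the same idea (names here are illustrative):

```python
def walk(obj, fn):
    """Recursively apply fn to every scalar leaf of a dict/list tree."""
    if isinstance(obj, dict):
        return {k: walk(v, fn) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [walk(v, fn) for v in obj]
    return fn(obj)

def redact(value):
    # replace PEM payloads with a short marker, pass other leaves through
    if isinstance(value, str) and value.startswith("-----BEGIN"):
        return "<PEM>"
    return value

bundle = {
    "crt": "-----BEGIN CERTIFICATE-----\nMIIB...",
    "serial": 7,
    "sans": ["a.example.com"],
}
out = walk(bundle, redact)
```

Unlike the in-file `visit`, this sketch dispatches on the value alone rather than the `(key, value)` tuple, which is enough for redaction.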
def get_file_ext(content):
for head, ext in FILETYPE.items():
if content.startswith(head):
return ext
return '.yml'
def tarinfo(name, content):
ext = get_file_ext(content) if name != 'README' else ''
info = tarfile.TarInfo(name + ext)
info.mtime = time.time()
info.size = len(content)
return info
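`tarinfo` builds `TarInfo` headers for in-memory content; paired with `BytesIO`, this is what lets `to_disk` write a tarball without creating intermediate files. A minimal round-trip sketch of that technique:

```python
import io
import tarfile
import time

def add_text(tar, name, text):
    data = text.encode("utf-8")
    info = tarfile.TarInfo(name)
    info.size = len(data)       # size must match the payload exactly
    info.mtime = time.time()
    tar.addfile(info, io.BytesIO(data))

buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    add_text(tar, "example.key", "-----BEGIN RSA PRIVATE KEY-----\n...")

buf.seek(0)
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    restored = tar.extractfile("example.key").read().decode("utf-8")
```

Getting `info.size` right is the one sharp edge: `addfile` reads exactly that many bytes from the file object.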
class BundleProperties(type):
'''
Bundle properties
properties on classmethods https://stackoverflow.com/a/47334224
'''
zero = timedelta(0)
timestamp = timestamp.utcnow()
bundle_path = str(CFG.bundle.path)
readme = open(os.path.dirname(os.path.abspath(__file__)) + '/README.tarfile').read()
@property
def files(cls):
return glob.glob(cls.bundle_path + '/*.tar.gz')
@property
def names(cls):
def get_bundle_name(bundle_file):
ext = '.tar.gz'
if bundle_file.startswith(cls.bundle_path) and bundle_file.endswith(ext):
return os.path.basename(bundle_file)[0:-len(ext)]
return [get_bundle_name(bundle_file) for bundle_file in cls.files]
def bundles(cls, bundle_name_pns, within=None, expired=False):
bundles = []
if isint(within):
within = timedelta(within)
for bundle_name in sorted(sift.fnmatches(cls.names, bundle_name_pns)):
bundle = Bundle.from_disk(bundle_name, bundle_path=cls.bundle_path)
if bundle.sans:
bundle.sans = sorted(bundle.sans)
if within:
delta = bundle.expiry - cls.timestamp
if cls.zero < delta and delta < within:
bundles += [bundle]
elif expired:
if bundle.expiry < cls.timestamp:
bundles += [bundle]
elif bundle.expiry > cls.timestamp:
bundles += [bundle]
return bundles
class Bundle(object, metaclass=BundleProperties):
'''
Bundle class
'''
def __init__(self, common_name, modhash, key, csr, crt, bug, sans=None, expiry=None, authority=None, destinations=None, timestamp=None):
if authority:
assert isinstance(authority, dict)
self.common_name = common_name
self.modhash = modhash
self.key = key
self.csr = csr
self.crt = crt
self.bug = bug
self.sans = sans
self.expiry = expiry
self.authority = authority
self.destinations = destinations if destinations else {}
self.timestamp = timestamp if timestamp else Bundle.timestamp
def __repr__(self):
return yaml_format(self.to_obj())
def __eq__(self, bundle):
return (
self.common_name == bundle.common_name and
self.modhash == bundle.modhash and
self.key == bundle.key and
self.csr == bundle.csr and
self.crt == bundle.crt and
self.bug == bundle.bug and
self.sans == bundle.sans and
self.expiry == bundle.expiry and
self.authority == bundle.authority and
self.destinations == bundle.destinations and
self.timestamp == bundle.timestamp)
@property
def modhash_abbrev(self):
return self.modhash[:8]
@property
def friendly_common_name(self):
if self.common_name.startswith('*.'):
return 'wildcard' + self.common_name[1:]
return self.common_name
@property
def bundle_name(self):
return self.friendly_common_name + '@' + self.modhash_abbrev
@property
def bundle_tar(self):
return self.bundle_name + '.tar.gz'
@property
def serial(self):
return pki.get_serial(self.crt)
@property
def sha1(self):
return pki.get_sha1(self.crt)
@property
def sha2(self):
return pki.get_sha2(self.crt)
@property
def files(self):
files = {}
for content in (self.key, self.csr, self.crt):
if content:
ext = get_file_ext(content)
files[self.bundle_name + ext] = content
return files
def to_obj(self):
obj = {
self.bundle_name: {
'common_name': self.common_name,
'timestamp': self.timestamp,
'modhash': self.modhash,
'serial': self.serial,
'sha1': self.sha1,
'sha2': self.sha2,
'bug': self.bug,
'expiry': self.expiry,
'authority': self.authority,
'destinations': self.destinations,
'tardata': {
self.bundle_tar: self.files
},
}
}
if self.sans:
obj[self.bundle_name]['sans'] = self.sans
return obj
def to_disk(self, bundle_path=None):
        if bundle_path is None:
bundle_path = Bundle.bundle_path
authority = copy.deepcopy(self.authority)
authority.pop('key', None)
authority.pop('csr', None)
authority.pop('crt', None)
obj = {
self.bundle_name: {
'common_name': self.common_name,
'timestamp': self.timestamp,
'modhash': self.modhash,
'bug': self.bug,
'expiry': self.expiry,
                'authority': authority,
}
}
if self.sans:
obj[self.bundle_name]['sans'] = self.sans
yml = yaml_format(obj)
os.makedirs(bundle_path, exist_ok=True)
bundle_file = f'{bundle_path}/{self.bundle_name}.tar.gz'
with tarfile.open(bundle_file, 'w:gz') as tar:
tar.addfile(tarinfo('README', Bundle.readme), BytesIO(Bundle.readme.encode('utf-8')))
for content in (self.key, self.csr, self.crt, yml):
if content:
tar.addfile(tarinfo(self.bundle_name, content), BytesIO(content.encode('utf-8')))
return bundle_file
@staticmethod
def from_obj(obj):
try:
bundle_name, bundle_body = head_body(obj)
common_name = bundle_body['common_name']
modhash = bundle_body['modhash']
expiry = bundle_body['expiry']
authority = bundle_body['authority']
bug = bundle_body.get('bug', None)
sans = bundle_body.get('sans', None)
destinations = bundle_body.get('destinations', None)
timestamp = bundle_body['timestamp']
key, csr, crt = [None] * 3
tardata = bundle_body.pop('tardata', None)
if tardata:
files = tardata[bundle_name + '.tar.gz']
key = files[bundle_name + '.key']
csr = files[bundle_name + '.csr']
crt = files[bundle_name + '.crt']
except Exception as ex:
import traceback
traceback.print_exc()
raise BundleFromObjError(ex)
return common_name, modhash, key, csr, crt, bug, sans, expiry, authority, destinations, timestamp
@classmethod
def from_disk(cls, bundle_name, bundle_path=None):
        if bundle_path is None:
bundle_path = Bundle.bundle_path
bundle_file = f'{bundle_path}/{bundle_name}.tar.gz'
key, csr, crt, obj, readme = [None] * 5
with tarfile.open(bundle_file, 'r:gz') as tar:
for info in tar.getmembers():
info.mtime = time.time()
if info.name.endswith('.key'):
key = tar.extractfile(info.name).read().decode('utf-8')
elif info.name.endswith('.csr'):
csr = tar.extractfile(info.name).read().decode('utf-8')
elif info.name.endswith('.crt'):
crt = tar.extractfile(info.name).read().decode('utf-8')
elif info.name.endswith('.yml'):
yml = tar.extractfile(info.name).read().decode('utf-8')
obj = yaml.safe_load(yml)
elif info.name == 'README':
readme = tar.extractfile(info.name).read().decode('utf-8')
try:
common_name, modhash, _, _, _, bug, sans, expiry, authority, destinations, timestamp = Bundle.from_obj(obj)
except AutocertError as ae:
raise BundleLoadError(bundle_path, bundle_name, ae)
bundle = Bundle(
common_name,
modhash,
key,
csr,
crt,
bug,
sans=sans,
expiry=expiry,
authority=authority,
destinations=destinations,
timestamp=timestamp)
return bundle
def transform(self, verbosity):
json = self.to_obj()
if verbosity == 0:
json = {self.bundle_name: self.expiry}
elif verbosity == 1:
json[self.bundle_name].pop('destinations', None)
json[self.bundle_name]['tardata'] = self.bundle_tar
elif verbosity == 2:
json = visit(json, func=simple)
elif verbosity == 3:
json = visit(json, func=abbrev)
return json
# brewtils/log.py (scott-taubman/brewtils, MIT license)
# -*- coding: utf-8 -*-
"""Brewtils Logging Utilities
This module streamlines loading logging configuration from Beergarden.
Example:
To use this just call ``configure_logging`` sometime before you initialize
your Plugin object:
.. code-block:: python
from brewtils import configure_logging, get_connection_info, Plugin
# Load BG connection info from environment and command line args
connection_info = get_connection_info(sys.argv[1:])
configure_logging(system_name='systemX', **connection_info)
plugin = Plugin(
my_client,
name='systemX,
version='0.0.1',
**connection_info
)
plugin.run()
"""
import copy
import json
import os
import re
import string
import warnings
import logging.config
import brewtils
DEFAULT_LOGGERS = {
"pika": {"level": "ERROR"},
"requests.packages.urllib3.connectionpool": {"level": "WARN"},
"yapconf": {"level": "WARN"},
}
DEFAULT_FORMAT = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
DEFAULT_FORMATTERS = {"default": {"format": DEFAULT_FORMAT}}
DEFAULT_HANDLERS = {
"default": {
"class": "logging.StreamHandler",
"formatter": "default",
"stream": "ext://sys.stdout",
}
}
DEFAULT_ROOT = {"level": "INFO", "formatter": "default", "handlers": ["default"]}
DEFAULT_PLUGIN_LOGGING_TEMPLATE = {
"version": 1,
"disable_existing_loggers": False,
"loggers": DEFAULT_LOGGERS,
"formatters": DEFAULT_FORMATTERS,
"handlers": DEFAULT_HANDLERS,
"root": DEFAULT_ROOT,
}
def default_config(level="INFO"):
"""Get a basic logging configuration with the given level"""
config = copy.deepcopy(DEFAULT_PLUGIN_LOGGING_TEMPLATE)
config["root"]["level"] = level
return config
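`default_config` deep-copies the module-level template so callers can patch the root level without corrupting the shared default. A self-contained sketch of the same copy-then-patch pattern fed straight into `logging.config.dictConfig` (the `TEMPLATE`/`make_config` names are illustrative):

```python
import copy
import logging
import logging.config

TEMPLATE = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {"default": {"format": "%(levelname)s - %(message)s"}},
    "handlers": {
        "default": {"class": "logging.StreamHandler", "formatter": "default"}
    },
    "root": {"level": "INFO", "handlers": ["default"]},
}

def make_config(level="INFO"):
    config = copy.deepcopy(TEMPLATE)   # never hand out the shared template
    config["root"]["level"] = level
    return config

logging.config.dictConfig(make_config("DEBUG"))
```

Without the `deepcopy`, the first caller's level change would silently become every later caller's default.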
def configure_logging(
raw_config,
namespace=None,
system_name=None,
system_version=None,
instance_name=None,
):
"""Load and enable a logging configuration from Beergarden
WARNING: This method will modify the current logging configuration.
The configuration will be template substituted using the keyword arguments passed
to this function. For example, a handler like this:
.. code-block:: yaml
handlers:
file:
backupCount: 5
class: "logging.handlers.RotatingFileHandler"
encoding: utf8
formatter: default
level: INFO
maxBytes: 10485760
filename: "$system_name.log"
Will result in logging to a file with the same name as the given system_name.
This will also ensure that directories exist for any file-based handlers. Default
behavior for the Python logging module is to not create directories that do not
already exist, which would dramatically lower the utility of templating.
Args:
raw_config: Configuration to apply
namespace: Used for configuration templating
system_name: Used for configuration templating
system_version: Used for configuration templating
instance_name: Used for configuration templating
Returns:
None
"""
class ConfigParserTemplate(string.Template):
"""string.Template variant for ConfigParser-style interpolation
So. This exists because we want to do template substitution on the logging
configuration file. We want this to be consistent with how the logging module
itself does substitution, and since we need this to work on Python 2 that means
the ConfigParser flavor: %(variable)s
The important parts here that differ from the normal string.Template are:
- The delimiter ("%" instead of "$")
- The "delimiter and a braced identifier" part of the pattern definition. This
is needed to match %(variable)s instead of %{variable} like a normal template
- The "id" and additional field "bid" in Python 3.7 are slightly different:
r"(?a:[_a-z][_a-z0-9]*)" instead of r"[_a-z][_a-z0-9]*"
Hopefully that's not a problem.
"""
delimiter = "%"
pattern = r"""
%(delim)s(?:
(?P<escaped>%(delim)s) | # Escape sequence of two delimiters
(?P<named>%(id)s) | # delimiter and a Python identifier
\((?P<braced>%(id)s)\)s | # delimiter and a braced identifier
(?P<invalid>) # Other ill-formed delimiter exprs
)
""" % {
"delim": re.escape("%"),
"id": r"[_a-z][_a-z0-9]*",
}
templated = ConfigParserTemplate(json.dumps(raw_config)).safe_substitute(
namespace=namespace,
system_name=system_name,
system_version=system_version,
instance_name=instance_name,
)
logging_config = json.loads(templated)
# Now make sure that directories for all file handlers exist
for handler in logging_config["handlers"].values():
if "filename" in handler:
dir_name = os.path.dirname(os.path.abspath(handler["filename"]))
if not os.path.exists(dir_name):
os.makedirs(dir_name)
logging.config.dictConfig(logging_config)
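A standalone sketch (hypothetical class name, not part of brewtils) of the ConfigParser-flavor substitution that ConfigParserTemplate implements above: "%" as the delimiter and "%(name)s" as the braced form.

```python
import re
import string

class PercentTemplate(string.Template):
    delimiter = "%"
    pattern = r"""
    %(delim)s(?:
      (?P<escaped>%(delim)s)  |  # two delimiters escape to one
      (?P<named>%(id)s)       |  # delimiter and a bare identifier
      \((?P<braced>%(id)s)\)s |  # delimiter and a braced identifier
      (?P<invalid>)              # any other ill-formed use
    )
    """ % {"delim": re.escape("%"), "id": r"[_a-z][_a-z0-9]*"}

# safe_substitute leaves unknown placeholders intact instead of raising
rendered = PercentTemplate("%(system_name)s.log").safe_substitute(system_name="echo")
```

Note that string.Template compiles a string `pattern` class attribute with re.VERBOSE, which is why the comments inside the pattern are legal.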
def find_log_file():
"""Return the file name of the first file handler attached to the root logger, or None."""
for h in logging.getLogger().handlers:
if hasattr(h, "baseFilename"):
return h.baseFilename
def read_log_file(log_file, start_line=None, end_line=None):
"""Read lines from a log file
Args:
log_file: The file to read from
start_line: Starting line to read
end_line: Ending line to read
Returns:
Lines read from the file
"""
with open(log_file, "r") as f:
raw_logs = f.readlines()
return "".join(raw_logs[start_line:end_line])
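The slice in read_log_file relies on list[start:end] being end-exclusive and tolerating None bounds, so the default (None, None) yields the whole file; a quick illustration:

```python
# three fake log lines standing in for f.readlines() output
lines = ["a\n", "b\n", "c\n"]

middle = "".join(lines[1:3])          # lines 1 and 2, end-exclusive
everything = "".join(lines[None:None])  # None bounds select everything
```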
# DEPRECATED
SUPPORTED_HANDLERS = ("stdout", "file", "logstash")
def get_logging_config(system_name=None, **kwargs):
"""Retrieve a logging configuration from Beergarden
Args:
system_name: Name of the system to load
**kwargs: Beergarden connection parameters
Returns:
dict: The logging configuration for the specified system
"""
warnings.warn(
"This function is deprecated and will be removed in version "
"4.0, please consider using 'EasyClient.get_logging_config' and "
"'configure_logging' instead.",
DeprecationWarning,
stacklevel=2,
)
config = brewtils.get_easy_client(**kwargs).get_logging_config(system_name)
return convert_logging_config(config)
def convert_logging_config(logging_config):
"""Transform a LoggingConfig object into a Python logging configuration
Args:
logging_config: Beergarden logging config
Returns:
dict: The logging configuration
"""
warnings.warn(
"This function is deprecated and will be removed in version "
"4.0, please consider using 'configure_logging' instead.",
DeprecationWarning,
stacklevel=2,
)
config_to_return = copy.deepcopy(DEFAULT_PLUGIN_LOGGING_TEMPLATE)
if logging_config.handlers:
handlers = logging_config.handlers
else:
handlers = copy.deepcopy(DEFAULT_HANDLERS)
config_to_return["handlers"] = handlers
if logging_config.formatters:
formatters = logging_config.formatters
else:
formatters = copy.deepcopy(DEFAULT_FORMATTERS)
config_to_return["formatters"] = formatters
config_to_return["root"] = {
"level": logging_config.level,
"handlers": list(config_to_return["handlers"]),
}
return config_to_return
def setup_logger(
bg_host, bg_port, system_name, ca_cert=None, client_cert=None, ssl_enabled=None
):
"""DEPRECATED: Set Python logging to use configuration from Beergarden API
This method is deprecated - consider using :func:`configure_logging`
This method will overwrite the current logging configuration.
Args:
bg_host (str): Beergarden host
bg_port (int): Beergarden port
system_name (str): Name of the system
ca_cert (str): Path to CA certificate file
client_cert (str): Path to client certificate file
ssl_enabled (bool): Use SSL when connecting to Beergarden
Returns: None
"""
warnings.warn(
"This function is deprecated and will be removed in version "
"4.0, please consider using 'configure_logging' instead.",
DeprecationWarning,
stacklevel=2,
)
config = get_python_logging_config(
bg_host=bg_host,
bg_port=bg_port,
system_name=system_name,
ca_cert=ca_cert,
client_cert=client_cert,
ssl_enabled=ssl_enabled,
)
logging.config.dictConfig(config)
def get_python_logging_config(
bg_host, bg_port, system_name, ca_cert=None, client_cert=None, ssl_enabled=None
):
"""DEPRECATED: Get Beergarden's logging configuration
This method is deprecated - consider using :func:`get_logging_config`
Args:
bg_host (str): Beergarden host
bg_port (int): Beergarden port
system_name (str): Name of the system
ca_cert (str): Path to CA certificate file
client_cert (str): Path to client certificate file
ssl_enabled (bool): Use SSL when connecting to Beergarden
Returns:
dict: The logging configuration for the specified system
"""
warnings.warn(
"This function is deprecated and will be removed in version "
"4.0, please consider using 'get_logging_config' instead.",
DeprecationWarning,
stacklevel=2,
)
client = brewtils.get_easy_client(
host=bg_host,
port=bg_port,
ssl_enabled=ssl_enabled,
ca_cert=ca_cert,
client_cert=client_cert,
)
logging_config = client.get_logging_config(system_name=system_name)
return convert_logging_config(logging_config)
import numpy as np
import acc_image_utils as acc
import time
t1 = time.time()
size = 128
shape = (size, size, size, size)
n1 = np.zeros(shape, dtype=np.float32)
n2 = acc.gpu_rot4D(n1,55)
t2 = time.time()
print(t2 - t1)
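The script above times the GPU rotation with time.time(); a hedged sketch of the same stopwatch pattern using time.perf_counter(), which is monotonic and better suited to benchmarking:

```python
import time

t0 = time.perf_counter()        # monotonic, high-resolution start mark
total = sum(range(1_000_000))   # cheap stand-in for the work being timed
elapsed = time.perf_counter() - t0
```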
import os
x = set()
with open(os.environ['PBS_NODEFILE'], 'r') as fh:
lines = fh.readlines()
for line in lines:
x.add(line)
name = 'machine-' + xxxMACHINExxx  # placeholder token, expected to be substituted before this script runs
with open(name, 'w') as fh:
for xi in x:
fh.write(xi)
import requests
from pysimplestorageservice.auth import AuthSigV4
class AmazonAWSManager(object):
"""
Minimal S3 client: signs requests with SigV4 (AuthSigV4) and wraps GET,
list, and PUT operations on a bucket.
"""
def __init__(self, access_key, secret_key):
self.access_key = access_key
self.secret_key = secret_key
def get(self, prefix, filename, bucket):
"""
GET an object from the bucket; returns the response content on HTTP 200, otherwise the status code.
"""
auth = AuthSigV4(access_key=self.access_key, secret_key=self.secret_key)
headers = auth.get_headers(bucket, 'GET', canonical_uri=self.build_cannonical_uri(filename, prefix))
file_url = self.__build_endpoint(bucket, prefix, filename)
r = requests.get(file_url, headers=headers)
if r.status_code == 200:
return r.content
else:
return r.status_code
def get_file_list(self, bucket=None, prefix=None, max_keys=None):
"""
List objects under the given prefix (delimiter "/"); returns the raw response body on HTTP 200, otherwise None.
"""
params = {"delimiter": "/"}
if max_keys:
params["max-keys"] = str(max_keys)
if prefix:
params["prefix"] = prefix
auth = AuthSigV4(access_key=self.access_key, secret_key=self.secret_key)
headers = auth.get_headers(bucket, 'GET', querystring=params)
endpoint = self.__build_endpoint(bucket)
r = requests.get(endpoint, headers=headers, params=params)
if r.status_code == 200:
return r.content
else:
return None
def put(self, filename, file, prefix, bucket):
auth = AuthSigV4(access_key=self.access_key, secret_key=self.secret_key)
headers = auth.get_headers(bucket, 'PUT', payload=file, canonical_uri=self.build_cannonical_uri(filename, prefix))
endpoint = self.__build_endpoint(bucket, prefix, filename)
r = requests.put(endpoint, data=file, headers=headers)
return r
def build_cannonical_uri(self, filename, prefix):
return '/' + prefix + '/' + filename
def __build_endpoint(self, bucket, prefix=None, filename=None):
endpoint = "http://" + bucket + '.s3.amazonaws.com'
if prefix and filename:
return "/".join([endpoint, prefix, filename])
elif prefix:
return "/".join([endpoint, prefix])
else:
return endpoint
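A standalone re-statement (hypothetical module-level name) of the branching in AmazonAWSManager.__build_endpoint: prefix and filename are appended only when present, and a filename without a prefix is ignored.

```python
def build_endpoint(bucket, prefix=None, filename=None):
    endpoint = "http://" + bucket + ".s3.amazonaws.com"
    if prefix and filename:
        return "/".join([endpoint, prefix, filename])
    elif prefix:
        return "/".join([endpoint, prefix])
    else:
        return endpoint

full_url = build_endpoint("photos", "2021", "cat.jpg")
```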
# @ based on InfoCAm
import torch
import torch.nn as nn
import torch.nn.functional as F
class CAM(nn.Module):
def __init__(self, model, feature, linear, factor, ksize, padding):
super().__init__()
self.model = model
self.feature = feature
self.linear = linear
self.factor = factor
self.ksize = ksize
self.padding = padding
def forward(self, input, target=None):
"""
if target is None:
estimate the target label by running 'model(input)' and generate the cam from that estimate.
else:
generate cam based on the given target
:param input: image # b3yx, torch.float32, 0~1 value,
:param target: target label # b, torch.int64
:return: cam # b1yx, torch.float32, 0~1 value
notes on comment:
b # batch size
k=1000 # num_class
c=2048 # num of channel in the last feature
y', x': the last feature spatial size
"""
score = self.model(input) # bk
feature = self.feature(input) # bcy'x'
weight = self.linear.weight.clone().detach() # kc
channel = feature.shape[1] # c
if target is None:
_, target = score.topk(1, 1, True, True) # b1
target = target[:, 0] # b
cam_weight = weight[target] # bc
cam = cam_weight[:, :, None, None] * feature # bc11 * bcy'x'
cam_filter = torch.ones(1, channel, self.ksize, self.ksize).to(input.device)
cam = F.conv2d(cam, cam_filter, padding=self.padding)
# upsample
cam = F.interpolate(cam, size=[input.shape[3], input.shape[2]], mode="bicubic")
# normalize
cam = (cam - cam.min()) / (cam.max() - cam.min())
return cam
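CAM.forward ends by min-max normalizing the map into the 0~1 range; a plain-Python sketch of that normalization (the real code operates elementwise on a torch tensor):

```python
def minmax_normalize(values):
    # (v - min) / (max - min) maps the smallest value to 0, the largest to 1
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

scaled = minmax_normalize([2.0, 4.0, 6.0])
```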
#!/usr/bin/env python
"""
conference.py -- Udacity conference server-side Python App Engine API;
uses Google Cloud Endpoints
$Id: conference.py,v 1.25 2014/05/24 23:42:19 wesc Exp wesc $
created by wesc on 2014 apr 21
"""
__author__ = 'wesc+api@google.com (Wesley Chun)'
from datetime import datetime
from functools import wraps
import endpoints
from protorpc import messages
from protorpc import message_types
from protorpc import remote
from google.appengine.api import memcache
from google.appengine.api import taskqueue
from google.appengine.ext import ndb
from models import ConflictException
from models import StringMessage
from models import BooleanMessage
from models import Profile
from models import ProfileMiniForm
from models import ProfileForm
from models import TeeShirtSize
from models import Conference
from models import ConferenceForm
from models import ConferenceForms
from models import ConferenceQueryForm
from models import ConferenceQueryForms
from models import Speaker
from models import SpeakerForm
from models import SpeakerForms
from models import Session
from models import SessionForm
from models import SessionForms
from models import SessionMiniHardForm
from models import TypeOfSession
from settings import WEB_CLIENT_ID
from settings import ANDROID_CLIENT_ID
from settings import IOS_CLIENT_ID
from settings import ANDROID_AUDIENCE
from utils import getUserId
from utils import getSeconds
from utils import getTimeString
EMAIL_SCOPE = endpoints.EMAIL_SCOPE
API_EXPLORER_CLIENT_ID = endpoints.API_EXPLORER_CLIENT_ID
MEMCACHE_ANNOUNCEMENTS_KEY = "RECENT_ANNOUNCEMENTS"
ANNOUNCEMENT_TPL = ('Last chance to attend! The following conferences '
'are nearly sold out: %s')
MEMCACHE_FEATURED_SPEAKER_KEY = "FEATURED_SPEAKER_"
FEATURED_SPEAKER_TPL = ('Featured speaker: %s\nSessions: %s')
# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
DEFAULTS = {
"city": "Default City",
"maxAttendees": 0,
"seatsAvailable": 0,
"topics": ["Default", "Topic"],
}
OPERATORS = {
'EQ': '=',
'GT': '>',
'GTEQ': '>=',
'LT': '<',
'LTEQ': '<=',
'NE': '!='
}
FIELDS = {
'CITY': 'city',
'TOPIC': 'topics',
'MONTH': 'month',
'MAX_ATTENDEES': 'maxAttendees',
}
CONF_GET_REQUEST = endpoints.ResourceContainer(
message_types.VoidMessage,
websafeConferenceKey=messages.StringField(1),
)
CONF_POST_REQUEST = endpoints.ResourceContainer(
ConferenceForm,
websafeConferenceKey=messages.StringField(1),
)
CONF_BY_ORGANIZER_GET = endpoints.ResourceContainer(
message_types.VoidMessage,
organizer=messages.StringField(1),
)
SESS_GET_REQUEST = endpoints.ResourceContainer(
message_types.VoidMessage,
websafeConferenceKey=messages.StringField(1),
)
SESS_POST_REQUEST = endpoints.ResourceContainer(
SessionForm,
websafeConferenceKey=messages.StringField(1),
)
SESS_TYPE_GET = endpoints.ResourceContainer(
message_types.VoidMessage,
websafeConferenceKey=messages.StringField(1),
typeOfSession=messages.StringField(2),
)
SESS_SPEAKER_GET = endpoints.ResourceContainer(
message_types.VoidMessage,
websafeSpeakerKey=messages.StringField(1),
)
SESS_WISHLIST_POST = endpoints.ResourceContainer(
message_types.VoidMessage,
websafeSessionKey=messages.StringField(1),
)
SESS_WISHLIST_GET = endpoints.ResourceContainer(
message_types.VoidMessage,
websafeConferenceKey=messages.StringField(1),
)
SESS_HARD_QUERY_POST = endpoints.ResourceContainer(
SessionMiniHardForm,
websafeConferenceKey=messages.StringField(1),
)
FEATURED_SPEAKER_GET = endpoints.ResourceContainer(
message_types.VoidMessage,
websafeConferenceKey=messages.StringField(1),
)
# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
@endpoints.api(name='conference', version='v1', audiences=[ANDROID_AUDIENCE],
allowed_client_ids=[WEB_CLIENT_ID, API_EXPLORER_CLIENT_ID,
ANDROID_CLIENT_ID, IOS_CLIENT_ID],
scopes=[EMAIL_SCOPE])
class ConferenceApi(remote.Service):
"""Conference API v0.1"""
# - - - Profile objects - - - - - - - - - - - - - - - - - - -
def _copyProfileToForm(self, prof):
"""Copy relevant fields from Profile to ProfileForm."""
# copy relevant fields from Profile to ProfileForm
pf = ProfileForm()
for field in pf.all_fields():
if hasattr(prof, field.name):
# convert t-shirt string to Enum; just copy others
if field.name == 'teeShirtSize':
setattr(pf, field.name, getattr(
TeeShirtSize, getattr(prof, field.name)))
else:
setattr(pf, field.name, getattr(prof, field.name))
pf.check_initialized()
return pf
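A stand-in for models.TeeShirtSize (the real one is a protorpc enum) showing the string-to-enum conversion used in _copyProfileToForm: the stored string is looked up as a class attribute with getattr.

```python
import enum

class TeeShirtSize(enum.Enum):  # stand-in for the models.TeeShirtSize enum
    NOT_SPECIFIED = 1
    M = 2
    XL = 3

stored = "XL"                        # the string kept on the Profile entity
converted = getattr(TeeShirtSize, stored)
```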
def _getProfileFromUser(self):
"""Return user Profile from datastore, creating new one if
non-existent."""
user = endpoints.get_current_user()
if not user:
raise endpoints.UnauthorizedException('Authorization required.')
# get Profile from datastore
user_id = getUserId(user)
p_key = ndb.Key(Profile, user_id)
profile = p_key.get()
# create new Profile if not there
if not profile:
profile = Profile(
key=p_key,
displayName=user.nickname(),
mainEmail=user.email(),
teeShirtSize=str(TeeShirtSize.NOT_SPECIFIED),
)
profile.put()
return profile
def _doProfile(self, save_request=None):
"""Get user Profile and return to user, possibly updating it first."""
# get user Profile
prof = self._getProfileFromUser()
# if saveProfile(), process user-modifiable fields
if save_request:
for field in ('displayName', 'teeShirtSize'):
if hasattr(save_request, field):
val = getattr(save_request, field)
if val:
setattr(prof, field, str(val))
prof.put()
# return ProfileForm
return self._copyProfileToForm(prof)
@endpoints.method(message_types.VoidMessage, ProfileForm,
path='profile', http_method='GET', name='getProfile')
def getProfile(self, request):
"""Return user profile."""
return self._doProfile()
@endpoints.method(ProfileMiniForm, ProfileForm,
path='profile', http_method='POST', name='saveProfile')
def saveProfile(self, request):
"""Update & return user profile."""
return self._doProfile(request)
# - - - Conference objects - - - - - - - - - - - - - - - - -
def _copyConferenceToForm(self, conf, displayName):
"""Copy relevant fields from Conference to ConferenceForm."""
cf = ConferenceForm()
for field in cf.all_fields():
if hasattr(conf, field.name):
# convert Date to date string; just copy others
if field.name.endswith('Date'):
setattr(cf, field.name, str(getattr(conf, field.name)))
else:
setattr(cf, field.name, getattr(conf, field.name))
elif field.name == "websafeKey":
setattr(cf, field.name, conf.key.urlsafe())
if displayName:
setattr(cf, 'organizerDisplayName', displayName)
cf.check_initialized()
return cf
def _createConferenceObject(self, request):
"""Create or update Conference object, returning
ConferenceForm/request."""
user = endpoints.get_current_user()
if not user:
raise endpoints.UnauthorizedException('Authorization required.')
user_id = getUserId(user)
if not request.name:
raise endpoints.BadRequestException(
"Conference 'name' field required")
# copy ConferenceForm/ProtoRPC Message into dict
data = {field.name: getattr(request, field.name)
for field in request.all_fields()}
del data['websafeKey']
del data['organizerDisplayName']
# add default values for those missing
# (both data model & outbound Message)
for df in DEFAULTS:
if data[df] in (None, []):
data[df] = DEFAULTS[df]
setattr(request, df, DEFAULTS[df])
# convert dates from strings to Date objects;
# set month based on start_date
if data['startDate']:
data['startDate'] = datetime.strptime(
data['startDate'][:10], "%Y-%m-%d").date()
data['month'] = data['startDate'].month
else:
data['month'] = 0
if data['endDate']:
data['endDate'] = datetime.strptime(
data['endDate'][:10], "%Y-%m-%d").date()
# set seatsAvailable to be same as maxAttendees on creation
if data["maxAttendees"] > 0:
data["seatsAvailable"] = data["maxAttendees"]
# generate Profile Key based on user ID and Conference
# ID based on Profile key get Conference key from ID
p_key = ndb.Key(Profile, user_id)
c_id = Conference.allocate_ids(size=1, parent=p_key)[0]
c_key = ndb.Key(Conference, c_id, parent=p_key)
data['key'] = c_key
data['organizerUserId'] = request.organizerUserId = user_id
# create Conference, send email to organizer confirming
# creation of Conference & return (modified) ConferenceForm
Conference(**data).put()
taskqueue.add(params={'email': user.email(),
'conferenceInfo': repr(request)},
url='/tasks/send_confirmation_email')
return request
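The date handling above keeps only the first 10 characters ("YYYY-MM-DD") of whatever string the form supplied before parsing with strptime; a quick check of that pattern:

```python
from datetime import datetime

raw = "2015-06-01T09:00:00"                       # form value with a time suffix
start_date = datetime.strptime(raw[:10], "%Y-%m-%d").date()
month = start_date.month                          # used to fill data['month']
```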
@ndb.transactional()
def _updateConferenceObject(self, request):
user = endpoints.get_current_user()
if not user:
raise endpoints.UnauthorizedException('Authorization required.')
user_id = getUserId(user)
# copy ConferenceForm/ProtoRPC Message into dict
data = {field.name: getattr(request, field.name)
for field in request.all_fields()}
# update existing conference
wsck = request.websafeConferenceKey
conf = ndb.Key(urlsafe=wsck).get()
# check that conference exists
if not conf:
raise endpoints.NotFoundException(
'No conference found with key: %s' % wsck)
# check that user is owner
if user_id != conf.organizerUserId:
raise endpoints.ForbiddenException(
'Only the owner can update the conference.')
# Not getting all the fields, so don't create a new object; just
# copy relevant fields from ConferenceForm to Conference object
for field in request.all_fields():
data = getattr(request, field.name)
# only copy fields where we get data
if data not in (None, []):
# special handling for dates (convert string to Date)
if field.name in ('startDate', 'endDate'):
data = datetime.strptime(data, "%Y-%m-%d").date()
if field.name == 'startDate':
conf.month = data.month
# write to Conference object
setattr(conf, field.name, data)
conf.put()
prof = ndb.Key(Profile, user_id).get()
return self._copyConferenceToForm(conf, getattr(prof, 'displayName'))
@endpoints.method(ConferenceForm, ConferenceForm, path='conference',
http_method='POST', name='createConference')
def createConference(self, request):
"""Create new conference."""
return self._createConferenceObject(request)
@endpoints.method(CONF_POST_REQUEST, ConferenceForm,
path='conference/{websafeConferenceKey}',
http_method='PUT', name='updateConference')
def updateConference(self, request):
"""Update conference w/provided fields & return w/updated info."""
return self._updateConferenceObject(request)
@endpoints.method(CONF_GET_REQUEST, ConferenceForm,
path='conference/{websafeConferenceKey}',
http_method='GET', name='getConference')
def getConference(self, request):
"""Return requested conference (by websafeConferenceKey)."""
wsck = request.websafeConferenceKey
# get Conference object from request; bail if not found
conf = ndb.Key(urlsafe=wsck).get()
if not conf:
raise endpoints.NotFoundException(
'No conference found with key: %s' % wsck)
prof = conf.key.parent().get()
# return ConferenceForm
return self._copyConferenceToForm(conf, getattr(prof, 'displayName'))
@endpoints.method(message_types.VoidMessage, ConferenceForms,
path='getConferencesCreated',
http_method='POST', name='getConferencesCreated')
def getConferencesCreated(self, request):
"""Return conferences created by logged in user."""
user = endpoints.get_current_user()
if not user:
raise endpoints.UnauthorizedException('Authorization required.')
user_id = getUserId(user)
# create ancestor query for all key matches for this user
confs = Conference.query(ancestor=ndb.Key(Profile, user_id))
prof = ndb.Key(Profile, user_id).get()
# return set of ConferenceForm objects per Conference
return ConferenceForms(
items=[self._copyConferenceToForm(
conf, getattr(prof, 'displayName')) for conf in confs]
)
@endpoints.method(CONF_BY_ORGANIZER_GET, ConferenceForms,
path='getConferencesByOrganizer/{organizer}',
name='getConferencesByOrganizer')
def getConferencesByOrganizer(self, request):
"""Return conferences created by organizer."""
q = Profile.query()
q = q.filter(Profile.displayName == request.organizer)
prof = q.get()
if not prof:
raise endpoints.BadRequestException('Organizer not found.')
q = Conference.query()
q = q.filter(Conference.organizerUserId == prof.key.id())
confs = q.fetch()
# return set of ConferenceForm objects per Conference
return ConferenceForms(
items=[self._copyConferenceToForm(
conf, getattr(prof, 'displayName')) for conf in confs]
)
def _getQuery(self, request):
"""Return formatted query from the submitted filters."""
q = Conference.query()
inequality_filter, filters = self._formatFilters(request.filters)
# If exists, sort on inequality filter first
if not inequality_filter:
q = q.order(Conference.name)
else:
q = q.order(ndb.GenericProperty(inequality_filter))
q = q.order(Conference.name)
for filtr in filters:
if filtr["field"] in ["month", "maxAttendees"]:
filtr["value"] = int(filtr["value"])
formatted_query = ndb.query.FilterNode(filtr["field"],
filtr["operator"], filtr["value"])
q = q.filter(formatted_query)
return q
def _formatFilters(self, filters):
"""Parse, check validity and format user supplied filters."""
formatted_filters = []
inequality_field = None
for f in filters:
filtr = {field.name: getattr(f, field.name)
for field in f.all_fields()}
try:
filtr["field"] = FIELDS[filtr["field"]]
filtr["operator"] = OPERATORS[filtr["operator"]]
except KeyError:
raise endpoints.BadRequestException(
"Filter contains invalid field or operator.")
# Every operation except "=" is an inequality
if filtr["operator"] != "=":
# check if inequality operation has been used in previous filters
# disallow the filter if inequality was performed on a different field before
# track the field on which the inequality operation is performed
if inequality_field and inequality_field != filtr["field"]:
raise endpoints.BadRequestException(
"Inequality filter is allowed on only one field.")
else:
inequality_field = filtr["field"]
formatted_filters.append(filtr)
return (inequality_field, formatted_filters)
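A hypothetical standalone version of the rule _formatFilters enforces: Datastore allows inequality operators on at most one field per query, so a second inequality field is rejected.

```python
def check_inequality_fields(filters):
    # filters is a list of (field, operator) pairs; "=" is the only equality op
    inequality_field = None
    for field, op in filters:
        if op != "=":
            if inequality_field and inequality_field != field:
                raise ValueError("Inequality filter is allowed on only one field.")
            inequality_field = field
    return inequality_field

allowed = check_inequality_fields([("city", "="), ("maxAttendees", ">")])
```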
@endpoints.method(ConferenceQueryForms, ConferenceForms,
path='queryConferences',
http_method='POST',
name='queryConferences')
def queryConferences(self, request):
"""Query for conferences."""
conferences = self._getQuery(request)
# need to fetch organiser displayName from profiles
# get all keys and use get_multi for speed
organisers = [(ndb.Key(Profile, conf.organizerUserId))
for conf in conferences]
profiles = ndb.get_multi(organisers)
# put display names in a dict for easier fetching
names = {}
for profile in profiles:
names[profile.key.id()] = profile.displayName
# return individual ConferenceForm object per Conference
return ConferenceForms(
items=[self._copyConferenceToForm(
conf, names[conf.organizerUserId]) for conf in conferences]
)
# - - - Registration - - - - - - - - - - - - - - - - - - - -
@ndb.transactional(xg=True)
def _conferenceRegistration(self, request, reg=True):
"""Register or unregister user for selected conference."""
retval = None
prof = self._getProfileFromUser()
# check if conf exists given websafeConfKey
# get conference; check that it exists
wsck = request.websafeConferenceKey
conf = ndb.Key(urlsafe=wsck).get()
if not conf:
raise endpoints.NotFoundException(
'No conference found with key: %s' % wsck)
# register
if reg:
# check if user already registered otherwise add
if wsck in prof.conferenceKeysToAttend:
raise ConflictException(
"You have already registered for this conference")
# check if seats avail
if conf.seatsAvailable <= 0:
raise ConflictException(
"There are no seats available.")
# register user, take away one seat
prof.conferenceKeysToAttend.append(wsck)
conf.seatsAvailable -= 1
retval = True
# unregister
else:
# check if user already registered
if wsck in prof.conferenceKeysToAttend:
# unregister user, add back one seat
prof.conferenceKeysToAttend.remove(wsck)
conf.seatsAvailable += 1
retval = True
else:
retval = False
# write things back to the datastore & return
prof.put()
conf.put()
return BooleanMessage(data=retval)
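An in-memory sketch (hypothetical helper) of the seat accounting in _conferenceRegistration; the datastore writes, the cross-group transaction, and the ConflictException raises are folded into a boolean result for brevity.

```python
def toggle_registration(attending, seats_available, wsck, reg=True):
    if reg:
        # already registered, or sold out: no change
        if wsck in attending or seats_available <= 0:
            return attending, seats_available, False
        return attending | {wsck}, seats_available - 1, True
    if wsck in attending:
        # unregister: give the seat back
        return attending - {wsck}, seats_available + 1, True
    return attending, seats_available, False

attending, seats, ok = toggle_registration(set(), 1, "conf-key-1")
```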
@endpoints.method(message_types.VoidMessage, ConferenceForms,
path='conferences/attending',
http_method='GET', name='getConferencesToAttend')
def getConferencesToAttend(self, request):
"""Get list of conferences that user has registered for."""
prof = self._getProfileFromUser()
conf_keys = [ndb.Key(urlsafe=wsck)
for wsck in prof.conferenceKeysToAttend]
conferences = ndb.get_multi(conf_keys)
# get organizers
organisers = [ndb.Key(Profile, conf.organizerUserId)
for conf in conferences]
profiles = ndb.get_multi(organisers)
# put display names in a dict for easier fetching
names = {}
for profile in profiles:
names[profile.key.id()] = profile.displayName
# return set of ConferenceForm objects per Conference
return ConferenceForms(items=[
self._copyConferenceToForm(conf, names[conf.organizerUserId])
for conf in conferences]
)
@endpoints.method(CONF_GET_REQUEST, BooleanMessage,
path='conference/{websafeConferenceKey}',
http_method='POST', name='registerForConference')
def registerForConference(self, request):
"""Register user for selected conference."""
return self._conferenceRegistration(request)
@endpoints.method(CONF_GET_REQUEST, BooleanMessage,
path='conference/{websafeConferenceKey}',
http_method='DELETE', name='unregisterFromConference')
def unregisterFromConference(self, request):
"""Unregister user for selected conference."""
return self._conferenceRegistration(request, reg=False)
# - - - Speakers - - - - - - - - - - - - - - - - - - - -
def _copySpeakerToForm(self, speaker):
"""Copy fields from Speaker to SpeakerForm."""
sf = SpeakerForm()
for field in sf.all_fields():
if hasattr(speaker, field.name):
setattr(sf, field.name, getattr(speaker, field.name))
elif field.name == 'websafeKey':
setattr(sf, field.name, speaker.key.urlsafe())
sf.check_initialized()
return sf
def _createSpeakerObject(self, request):
"""Create Speaker object, returning SpeakerForm request."""
# User must be authenticated to create Speaker
user = endpoints.get_current_user()
if not user:
raise endpoints.UnauthorizedException('Authorization required.')
# 'name' is a required field
if not request.name:
raise endpoints.BadRequestException(
"Speaker 'name' field required")
# copy SessionForm/ProtoRPC Message into dict
data = {field.name: str(getattr(request, field.name))
for field in request.all_fields()}
del data['websafeKey']
# create Speaker
Speaker(**data).put()
return request
@endpoints.method(SpeakerForm, SpeakerForm,
path="speaker", http_method='POST', name='createSpeaker')
def createSpeaker(self, request):
"""Create new speaker."""
return self._createSpeakerObject(request)
@endpoints.method(message_types.VoidMessage, SpeakerForms,
path='speakers', name='getSpeakers')
def getSpeakers(self, request):
"""Get all speakers."""
speakers = Speaker.query().order(Speaker.name)
# return individual SpeakerForm object per Speaker
return SpeakerForms(
speakers=[self._copySpeakerToForm(s) for s in speakers]
)
# - - - Sessions - - - - - - - - - - - - - - - - - - - - - -
def _copySessionToForm(self, sess):
"""Copy relevant fields from Session to SessionForm."""
sf = SessionForm()
for field in sf.all_fields():
if hasattr(sess, field.name):
# Convert date to string
if field.name == 'date':
setattr(sf, field.name, str(getattr(sess, field.name)))
# Convert integer seconds to time string
                elif field.name == 'startTime' and getattr(sess, field.name) is not None:
setattr(sf, field.name,
getTimeString(getattr(sess, field.name)))
# Convert string to ENUM
elif field.name == 'typeOfSession':
setattr(sf, field.name, getattr(
TypeOfSession, getattr(sess, field.name)))
# Just copy the rest
else:
setattr(sf, field.name, getattr(sess, field.name))
elif field.name == 'websafeKey':
setattr(sf, field.name, sess.key.urlsafe())
sf.check_initialized()
return sf
def _createSessionObject(self, request):
"""Create new session object, returning SessionForm request."""
# User must be authenticated to create Session
user = endpoints.get_current_user()
if not user:
raise endpoints.UnauthorizedException('Authorization required.')
# 'name' is a required field
if not request.name:
raise endpoints.BadRequestException(
"Session 'name' field required.")
# 'websafeConferenceKey' is a required field
wsck = request.websafeConferenceKey
if not wsck:
raise endpoints.BadRequestException(
'websafeConferenceKey field required.')
# Get conference and check that it exists
conf = ndb.Key(urlsafe=wsck).get()
if not conf:
raise endpoints.NotFoundException(
'No conference found with key: %s' % (wsck,))
# check that user is owner
user_id = getUserId(user)
if user_id != conf.organizerUserId:
raise endpoints.ForbiddenException(
'Only the conference owner can add sessions.')
# copy SessionForm/ProtoRPC Message into dict
data = {field.name: getattr(request, field.name)
for field in request.all_fields()}
del data['websafeKey']
# convert date to Date object and check against conference dates
if data['date']:
data['date'] = datetime.strptime(
data['date'][:10], "%Y-%m-%d").date()
if not conf.startDate <= data['date'] <= conf.endDate:
raise endpoints.BadRequestException(
'Date does not fall within conference dates.')
# convert startTime to integer seconds
if data['startTime']:
data['startTime'] = getSeconds(data['startTime'])
# convert ENUM to string
if data['typeOfSession']:
data['typeOfSession'] = str(data['typeOfSession'])
else:
data['typeOfSession'] = str(TypeOfSession.NOT_SPECIFIED)
# Generate Session Id and Key based on Conference key
s_id = Session.allocate_ids(size=1, parent=conf.key)[0]
s_key = ndb.Key(Session, s_id, parent=conf.key)
data['key'] = s_key
# Store session object
Session(**data).put()
# If speakers were set on the session, add task to check for
# featured speaker for the conference and add to memcache.
if data['speakerKeys']:
taskqueue.add(
params={'websafeConferenceKey': wsck},
url='/tasks/set_featured_speaker'
)
return self._copySessionToForm(s_key.get())
@endpoints.method(SessionForm, SessionForm,
path='conference/newsession',
http_method='POST', name='createSession')
def createSession(self, request):
"""Create new session."""
return self._createSessionObject(request)
@endpoints.method(SESS_GET_REQUEST, SessionForms,
path='conference/{websafeConferenceKey}/sessions',
name='getConferenceSessions')
def getConferenceSessions(self, request):
"""Return all sessions for requested conference."""
q = Session.query().filter(
Session.websafeConferenceKey == request.websafeConferenceKey)
q = q.order(Session.startTime)
sessions = q.fetch()
# return individual SessionForm object per Session
return SessionForms(
sessions=[self._copySessionToForm(s) for s in sessions]
)
@endpoints.method(SESS_TYPE_GET, SessionForms,
path='conference/{websafeConferenceKey}/{typeOfSession}',
name='getConferenceSessionsByType')
def getConferenceSessionsByType(self, request):
"""Return all sessions of a given type for a given conference."""
q = Session.query().filter(
Session.websafeConferenceKey == request.websafeConferenceKey,
Session.typeOfSession == request.typeOfSession)
q = q.order(Session.startTime)
sessions = q.fetch()
# return individual SessionForm object per Session
return SessionForms(
sessions=[self._copySessionToForm(s) for s in sessions]
)
@endpoints.method(SESS_SPEAKER_GET, SessionForms,
path='sessions/{websafeSpeakerKey}',
name='getSessionsBySpeaker')
def getSessionsBySpeaker(self, request):
"""Return all sessions from all conferences featuring a given
speaker."""
q = Session.query().filter(
Session.speakerKeys == request.websafeSpeakerKey)
q = q.order(Session.websafeConferenceKey)
sessions = q.fetch()
# return individual SessionForm object per Session
return SessionForms(
sessions=[self._copySessionToForm(s) for s in sessions]
)
@endpoints.method(SESS_GET_REQUEST, SessionForms,
path='conference/{websafeConferenceKey}/sessions/popular',
name='getSessionsPopular')
def getSessionsPopular(self, request):
"""Returns top three most popular sessions for a given conference."""
# get all sessions for the conference
q = Session.query().filter(
Session.websafeConferenceKey == request.websafeConferenceKey)
sessions = q.fetch()
# Create a list of dicts that marry Session objects, their websafe keys
# and a count of how frequently they appear in user wishlists.
s_list = []
for s in sessions:
websafeKey = s.key.urlsafe()
frequency = Profile.query().\
filter(Profile.sessionWishlistKeys == websafeKey).\
count()
if frequency > 0:
s_list.append({
'session': s,
'websafeKey': websafeKey,
'frequency': frequency
})
# Sort the session list
s_list.sort(key=lambda session: session['frequency'], reverse=True)
# Find the top 3 sessions by their frequency rating
        top_three = s_list[:3]  # slicing is safe even when fewer than three exist
# return individual SessionForm object per Session
return SessionForms(
sessions=[self._copySessionToForm(s['session']) for s in top_three]
)
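The sort-then-slice ranking in `getSessionsPopular` can also be expressed with `heapq.nlargest`, which avoids sorting the whole list when only the top few entries matter. A standalone sketch with made-up session data:

```python
import heapq

sessions = [
    {"name": "Keynote", "frequency": 7},
    {"name": "Demo", "frequency": 1},
    {"name": "Panel", "frequency": 5},
    {"name": "Workshop", "frequency": 3},
]

# nlargest handles lists shorter than n, so no length check is needed
top_three = heapq.nlargest(3, sessions, key=lambda s: s["frequency"])
print([s["name"] for s in top_three])  # → ['Keynote', 'Panel', 'Workshop']
```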
@endpoints.method(SESS_HARD_QUERY_POST, SessionForms,
path='conference/{websafeConferenceKey}/sessions/hard',
http_method='POST',
name='getSessionsHardQuery')
def getSessionsHardQuery(self, request):
"""Return sessions not of certain type and before certain time."""
# Create filter node for websafeConferenceKey
wsck = request.websafeConferenceKey
confFilter = ndb.query.FilterNode('websafeConferenceKey', '=', wsck)
# Convert the passed in time string to integer seconds and store
# as a filter node object.
beforeTime = getSeconds(request.beforeTime)
timeFilter = ndb.query.FilterNode('startTime', '<', beforeTime)
# We can't use a '!=', as this will result in too many inequality
# filters due to the implementation (see README.md).
# Instead, we add equality filters for every session type except the
# one we're filtering out.
type_filters = []
for session_type in TypeOfSession:
if str(session_type) != request.notTypeOfSession:
filter_node = ndb.query.FilterNode(
'typeOfSession', '=', str(session_type))
type_filters.append(filter_node)
        # OR together one AND-branch per remaining type, so the query keeps
        # working even if the number of session types changes
        q = Session.query(ndb.OR(*[
            ndb.AND(confFilter, timeFilter, type_filter)
            for type_filter in type_filters]))
sessions = q.fetch()
# return individual SessionForm object per Session
return SessionForms(
sessions=[self._copySessionToForm(s) for s in sessions]
)
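The OR-of-equalities trick used in `getSessionsHardQuery` is easiest to see outside the Datastore. A plain-Python sketch of the same idea — the `SESSION_TYPES` list here is illustrative, not the real `TypeOfSession` enum:

```python
SESSION_TYPES = ["KEYNOTE", "LECTURE", "WORKSHOP", "DEMO", "NOT_SPECIFIED"]

def not_of_type(sessions, excluded_type, before_time):
    # The Datastore-friendly form of "type != X AND startTime < T":
    # OR together one equality filter per remaining type.
    allowed = [t for t in SESSION_TYPES if t != excluded_type]
    return [s for s in sessions
            if s["typeOfSession"] in allowed and s["startTime"] < before_time]

sessions = [
    {"name": "Intro", "typeOfSession": "LECTURE", "startTime": 9 * 3600},
    {"name": "Hands-on", "typeOfSession": "WORKSHOP", "startTime": 10 * 3600},
    {"name": "Late talk", "typeOfSession": "LECTURE", "startTime": 20 * 3600},
]
print([s["name"] for s in not_of_type(sessions, "WORKSHOP", 19 * 3600)])  # → ['Intro']
```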
# - - - Session Wishlists - - - - - - - - - - - - - - - - - -
def _addToWishlist(self, request):
"""Add a session to user's session wishlist."""
prof = self._getProfileFromUser()
# Add session key to profile object
wssk = request.websafeSessionKey
if wssk not in prof.sessionWishlistKeys:
prof.sessionWishlistKeys.append(wssk)
prof.put()
retval = True
else:
retval = False
return BooleanMessage(data=retval)
@endpoints.method(SESS_WISHLIST_POST, BooleanMessage,
path='wishlist/add/{websafeSessionKey}',
http_method='POST', name='addSessionToWishlist')
def addSessionToWishlist(self, request):
"""Add a session to user's session wishlist."""
return self._addToWishlist(request)
@endpoints.method(message_types.VoidMessage, SessionForms,
path='wishlist/all', name='getWishlistAll')
def getWishlistAll(self, request):
"""Gets all sessions in user's wishlist across all conferences."""
prof = self._getProfileFromUser()
# Convert websafe keys to Session keys and get Sessions
swl_keys = [ndb.Key(urlsafe=s) for s in prof.sessionWishlistKeys]
sessions = ndb.get_multi(swl_keys)
# return individual SessionForm object per Session
return SessionForms(
sessions=[self._copySessionToForm(s) for s in sessions]
)
@endpoints.method(SESS_WISHLIST_GET, SessionForms,
path='wishlist/{websafeConferenceKey}/sessions',
name='getSessionsInWishList')
def getSessionsInWishlist(self, request):
"""Get sessions in user's wishlist for given conference."""
prof = self._getProfileFromUser()
# Convert websafe keys to Session keys and get Sessions
swl_keys = [ndb.Key(urlsafe=s) for s in prof.sessionWishlistKeys]
sessions = ndb.get_multi(swl_keys)
# Return only sessions matching requested conference
conf_sessions = []
for s in sessions:
if s.websafeConferenceKey == request.websafeConferenceKey:
conf_sessions.append(s)
# return individual SessionForm object per Session
return SessionForms(
sessions=[self._copySessionToForm(s) for s in conf_sessions]
)
# - - - Featured Speaker - - - - - - - - - - - - - - - - - - -
@staticmethod
def _cacheFeaturedSpeaker(request):
"""Create featured speaker and sessions for a Conference;
called when new session is created with speaker(s) set.
"""
wsck = request.get('websafeConferenceKey')
# Get all sessions for conference. We're specifying an ancestor here
# to ensure our query uses "strong consistency" and includes the
# just-added session.
sessions = ndb.gql("SELECT * "
"FROM Session "
"WHERE ANCESTOR IS :1 ",
ndb.Key(urlsafe=wsck)).fetch()
speakers_sessions = {}
# Loop through sessions
for s in sessions:
if s.speakerKeys:
# Loop through speakers for the session
for s_key in s.speakerKeys:
                    if s_key in speakers_sessions:
speakers_sessions[s_key].append(s)
else:
speakers_sessions[s_key] = [s]
featured = {'sessions': [],
'num_of_sessions': 0,
'speaker_key': ""}
for s_key in speakers_sessions:
if len(speakers_sessions[s_key]) > featured['num_of_sessions']:
featured['sessions'] = speakers_sessions[s_key]
featured['num_of_sessions'] = len(speakers_sessions[s_key])
featured['speaker_key'] = s_key
if featured['num_of_sessions'] > 1:
# If there is a featured speaker (more than one session in this
# conference), then get speaker data, format message data and
# set it in memcache.
speaker = ndb.Key(urlsafe=featured['speaker_key']).get()
featured_speaker = FEATURED_SPEAKER_TPL % (
speaker.name,
', '.join(s.name for s in featured['sessions'])
)
# Memcache key consists of a text string plus a websafe Conference
# key. This allows us to store featured speakers for multiple
# conferences simultaneously.
memcache.set(
MEMCACHE_FEATURED_SPEAKER_KEY + wsck, featured_speaker)
else:
# Even if this speaker wasn't a featured speaker,
# don't delete memcache entry, as there may be another featured
# speaker already set for the conference.
featured_speaker = ""
return featured_speaker
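The featured-speaker selection above boils down to grouping sessions by speaker and keeping the speaker with the most sessions, provided they have more than one. A dictionary-based sketch with hypothetical speaker keys (`pick_featured` is an illustrative name, not part of the API):

```python
from collections import defaultdict

def pick_featured(sessions):
    # Group session names by speaker key, then keep the speaker with the
    # most sessions -- but only if they appear more than once.
    by_speaker = defaultdict(list)
    for session in sessions:
        for key in session.get("speakerKeys", []):
            by_speaker[key].append(session["name"])
    key, names = max(by_speaker.items(), key=lambda kv: len(kv[1]),
                     default=(None, []))
    return (key, names) if len(names) > 1 else (None, [])

sessions = [
    {"name": "Opening", "speakerKeys": ["alice"]},
    {"name": "Deep dive", "speakerKeys": ["alice", "bob"]},
    {"name": "Closing", "speakerKeys": ["bob"]},
]
print(pick_featured(sessions))
```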
@endpoints.method(FEATURED_SPEAKER_GET, StringMessage,
path='conference/{websafeConferenceKey}/featuredspeaker/get',
name='getFeaturedSpeaker')
def getFeaturedSpeaker(self, request):
"""Reaturn Featured Speaker and Sessions from memcache."""
wsck = request.websafeConferenceKey
memcache_key = MEMCACHE_FEATURED_SPEAKER_KEY + wsck
return StringMessage(data=memcache.get(memcache_key) or "")
# - - - Announcements - - - - - - - - - - - - - - - - - - - -
@staticmethod
def _cacheAnnouncement():
"""Create Announcement & assign to memcache; used by
memcache cron job & putAnnouncement().
"""
confs = Conference.query(ndb.AND(
Conference.seatsAvailable <= 5,
Conference.seatsAvailable > 0)
).fetch(projection=[Conference.name])
if confs:
# If there are almost sold out conferences,
# format announcement and set it in memcache
announcement = ANNOUNCEMENT_TPL % (
', '.join(conf.name for conf in confs))
memcache.set(MEMCACHE_ANNOUNCEMENTS_KEY, announcement)
else:
# If there are no sold out conferences,
# delete the memcache announcements entry
announcement = ""
memcache.delete(MEMCACHE_ANNOUNCEMENTS_KEY)
return announcement
@endpoints.method(message_types.VoidMessage, StringMessage,
path='conference/announcement/get', http_method='GET',
name='getAnnouncement')
def getAnnouncement(self, request):
"""Return Announcement from memcache."""
return StringMessage(
data=memcache.get(MEMCACHE_ANNOUNCEMENTS_KEY) or "")
api = endpoints.api_server([ConferenceApi]) # register API
14b88e26055444d53d304f58f8961c840931c2ec | 1571 | py | Python | Module_01/ex05/test.py | CristinaFdezBornay/PythonPiscine | 143968c2e26f5ddddb5114f3bcdddd0b1f00d153 | ["MIT"] | 1 | 2021-11-17T10:04:30.000Z | 2021-11-17T10:04:30.000Z
from the_bank import Account, Bank
if __name__ == "__main__":
bank = Bank()
print("==> [Bank] Adding not an account.")
bank.add(1)
print("\n==> [Bank] Adding a corrupted account.")
william_john = Account(
'William John',
zip='100-064',
value=6460.0,
ref='58ba2b9954cd278eda8a84147ca73c87',
bref="lol",
)
bank.add(william_john)
bank.fix_account(william_john)
bank.add(william_john)
print("\n==> [Bank] Adding account with repeated name.")
william_john = Account(
'William John',
value=0.0,
zip='03540'
)
bank.add(william_john)
print("\n==> [Bank] Adding a corrupted account.")
smith_jane = Account(
'Smith Jane',
zip='911-745',
value=1000.0,
ref='1044618427ff2782f0bbece0abd05f31'
)
bank.add(smith_jane)
print("\n==> [Transfer] Invalid because of the corrupted account")
    print('Failed' if bank.transfer('William John', 'Smith Jane', 1) is False else 'Success')
bank.fix_account(smith_jane)
bank.add(smith_jane)
print("\n==> [Transfer] Invalid because of name not a string")
    print('Failed' if bank.transfer(123, 'Smith Jane', 10) is False else 'Success')
print("\n==> [Transfer] Invalid because of the amount")
    print('Failed' if bank.transfer('William John', 'Smith Jane', 54500.0) is False else 'Success')
print("\n==> [Transfer] Valid")
    print('Failed' if bank.transfer('William John', 'Smith Jane', 40.0) is False else 'Success')
14bab6f4282c5f5e761e6c042b014650a31eab5b | 3188 | py | Python | latex_pdf.py | patent-python/patent-generator | 3b5e5102b04eb13913a49a0c9c42922a80d645a9 | ["MIT"] | 109 | 2015-01-12T03:23:35.000Z | 2022-02-08T22:32:52.000Z
import os
import sys
import subprocess
import shlex
from machine import *
class pdfCreator():
# declare and define all variables in the constructor
def __init__(self,dn,fn,inv):
self.invention = inv
self.file = self.create_TeX_file(dn,fn)
self.title = self.create_title()
self.abstract = self.create_abstract()
self.illustrations = self.create_illustrations()
self.description = self.create_description()
self.claims = self.create_claims()
self.file_contents = self.create_LaTeX()
    # create the output directory if needed and open the .tex file for appending
def create_TeX_file(self,dname,fname):
if not os.path.exists(dname):
os.makedirs(dname)
return open(dname + "/" + fname + ".tex","a+")
# assemble the full LaTeX text
def create_LaTeX(self):
text = "\\documentclass[english]{uspatent}\n\\begin{document}"
text += self.title
text += self.abstract
text += self.illustrations
text += self.description
text += self.claims
text += "\n\\end{document}"
return text
    # assemble the title features
def create_title(self):
title = "\n\\title{" + self.invention.title + "}"
title += "\n\\date{\\today}"
title += "\n\\inventor{First Named Inventor}"
title += "\n\\maketitle"
return title
# put the abstract together
def create_abstract(self):
abs = "\n\\patentSection{Abstract}"
abs += "\n\\patentParagraph " + self.invention.abstract
return abs
# collect image descriptions
def create_illustrations(self):
ill = "\n\\patentSection{Brief Description of the Drawings}"
for i in self.invention.illustrations:
            # separate paragraph for each, maybe not necessary
ill += "\n\\patentParagraph " + i
return ill
# put description together
def create_description(self):
desc = "\n\\patentSection{Detailed Description of the Preferred Embodiments}"
desc += "\n\\patentParagraph " + self.invention.description
return desc
# assemble the claims together
def create_claims(self):
cla = "\n\\patentClaimsStart"
for i,claim in enumerate(self.invention.claims):
cla += "\n\\beginClaim{Claim" + str(i) + "}" + claim[2:]
cla += "\n\\patentClaimsEnd"
return cla
# write the entire text to the file
def write_LaTeX_to_file(self):
# to fix paragraph formatting
self.file.write(self.file_contents.replace("\n\n","\n\\patentParagraph "))
# function to compile the LaTeX formatting, not working yet
#def compile_LaTeX(self):
#process = subprocess.call("pdflatex test/test.tex", shell=True)
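The paragraph post-processing in `write_LaTeX_to_file` relies on a single string replacement turning blank-line paragraph breaks into `\patentParagraph` markers. In isolation:

```python
# Double newlines mark paragraph breaks in the assembled LaTeX text;
# replace each with an explicit \patentParagraph command.
raw = "First paragraph.\n\nSecond paragraph.\n\nThird paragraph."
fixed = raw.replace("\n\n", "\n\\patentParagraph ")
print(fixed)
```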
if __name__ == '__main__':
import sys
text = open(sys.argv[1],"r").read().decode('ascii', errors='replace')
dir_name = sys.argv[2]
file_name = sys.argv[3]
invention = Invention(text)
pdf = pdfCreator(dir_name,file_name,invention)
pdf.write_LaTeX_to_file()
#pdf.compile_LaTeX()
| 27.721739 | 85 | 0.613551 | 371 | 3,188 | 5.15903 | 0.334232 | 0.036573 | 0.026646 | 0.030303 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001723 | 0.271957 | 3,188 | 114 | 86 | 27.964912 | 0.822921 | 0.170013 | 0 | 0.031746 | 0 | 0 | 0.177008 | 0.057099 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.095238 | 0 | 0.365079 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
14bc8be324494c94b50dffe3b5b8590dee127599 | 786 | py | Python | chapter07/test_wrappers.py | roiyeho/drl-book | 1db635fd508e5b17ef8bfecbe49a79f55503a1f1 | [
"MIT"
] | null | null | null | chapter07/test_wrappers.py | roiyeho/drl-book | 1db635fd508e5b17ef8bfecbe49a79f55503a1f1 | [
"MIT"
] | null | null | null | chapter07/test_wrappers.py | roiyeho/drl-book | 1db635fd508e5b17ef8bfecbe49a79f55503a1f1 | [
"MIT"
] | null | null | null | import matplotlib.pyplot as plt
import gym
from gym.wrappers import atari_preprocessing
from atari_wrappers import FireOnResetWrapper, FrameStackWrapper
env = gym.make('BreakoutNoFrameskip-v4')
print('State space:', env.observation_space)
print('Action space:', env.action_space)
print(env.get_action_meanings())
#env.render()
#input()
obs = env.reset()
plt.imshow(obs)
plt.show()
plt.clf()
env = atari_preprocessing.AtariPreprocessing(env)
obs = env.reset()
plt.imshow(obs, cmap='gray')
plt.show()
plt.clf()
env = FireOnResetWrapper(env)
obs = env.reset()
plt.imshow(obs, cmap='gray')
plt.show()
plt.clf()
n_frames = 3
env = FrameStackWrapper(env, n_frames=n_frames)
env.reset()
for _ in range(n_frames):
obs, _, _, _ = env.step(3) # Move left
plt.imshow(obs)
plt.show()
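The `FrameStackWrapper` used above keeps the last `n_frames` observations together. The core mechanism can be sketched with a `deque` and plain integers standing in for frames — `FrameStack` here is an illustrative stand-in, not the real wrapper:

```python
from collections import deque

class FrameStack:
    # Minimal stand-in for a frame-stacking wrapper: keeps the last
    # n observations and returns them together.
    def __init__(self, n_frames):
        self.frames = deque(maxlen=n_frames)

    def push(self, obs):
        self.frames.append(obs)  # deque drops the oldest frame automatically
        return list(self.frames)

stack = FrameStack(3)
for step in range(5):
    latest = stack.push(step)
print(latest)  # → [2, 3, 4]
```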
14be54f29ab61e31d4790aa9f64ba8ebe0ee11d8 | 3746 | py | Python | QWeb/internal/config.py | kivipe/qweb | abf5881aa67412e4a243b13a59528a3c80aa2f52 | ["Apache-2.0"] | 33 | 2021-03-16T12:26:44.000Z | 2022-03-30T17:44:57.000Z
# -*- coding: utf-8 -*-
# --------------------------
# Copyright © 2014 - Qentinel Group.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ---------------------------
import copy
class Config:
DROPPED_DELIMITER_CHARS = " _-"
def __init__(self, config_defaults):
self._config_defaults = {}
# Clean config_defaults key values before storage
_config_defaults = {}
for k, v in config_defaults.items():
_k = self._clean_string(k)
_config_defaults[_k] = copy.deepcopy(v)
self._config_defaults.update(_config_defaults)
self.config = copy.deepcopy(self._config_defaults)
def is_value(self, par):
""" Return True if parameter exists. """
_par = self._clean_string(par)
return _par in self.config
def get_value(self, par):
""" Return value(s) for given parameter,
or None if parameter doesn't exist. """
_par = self._clean_string(par)
config_value, _ = self.config.get(_par, (None, None))
return config_value
def get_all_values(self):
"""
Return all configuration values in a dictionary.
:return: configuration dict
"""
_all_configs = {}
for k, v in self.config.items():
_all_configs[k] = copy.deepcopy(v[0])
return _all_configs
def set_value(self, par, value):
""" Set value for given parameter. Setter uses pre-defined adapter function to process value
before storage. Adapter functions are set in config_defaults. Returns old value. """
_par = self._clean_string(par)
if not self.is_value(_par):
raise ValueError("Parameter {} doesn't exist".format(par))
old_val, adapter_func = self.config[_par]
stored_value = adapter_func(value) if adapter_func else value
self.config[_par] = (stored_value, adapter_func)
return old_val
def reset_value(self, par=None):
""" Reset value(s) to original. """
if par:
_par = self._clean_string(par)
self.config[_par] = copy.deepcopy(self._config_defaults[_par])
# trigger adapter func for clearkey
if "clearkey" in _par:
val, adapter_func = self.config[_par]
if adapter_func:
adapter_func(str(val))
else:
self.config = copy.deepcopy(self._config_defaults)
# handle clearkey separately
_par = self._clean_string("ClearKey")
val, adapter_func = self.config[_par]
self.set_value(_par, str(val))
def __getitem__(self, par):
""" Allow accessing parameters in dictionary like syntax."""
_par = self._clean_string(par)
config_value, _ = self.config[_par]
return config_value
def __repr__(self):
return self.config
def __str__(self):
return "{}".format(self.config)
@staticmethod
def _clean_string(string_value):
dropped_chars_dict = dict.fromkeys(Config.DROPPED_DELIMITER_CHARS)
trans_table = str.maketrans(dropped_chars_dict)
_string_value = string_value.lower().translate(trans_table)
return _string_value
| 36.368932 | 100 | 0.630539 | 462 | 3,746 | 4.852814 | 0.318182 | 0.084746 | 0.046833 | 0.048171 | 0.176182 | 0.134701 | 0.104371 | 0.037467 | 0.037467 | 0 | 0 | 0.003631 | 0.264816 | 3,746 | 102 | 101 | 36.72549 | 0.810094 | 0.318473 | 0 | 0.189655 | 0 | 0 | 0.01916 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.172414 | false | 0 | 0.017241 | 0.034483 | 0.362069 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1ad20e50d6be74dfe95babdda8fc8f12f3b7f6f7 | 1419 | py | Python | survol/sources_types/enumerate_cgroup.py | rchateauneu/survol | ba66d3ec453b2d9dd3a8dabc6d53f71aa9ba8c78 | ["BSD-3-Clause"] | 9 | 2017-10-05T23:36:23.000Z | 2021-08-09T15:40:03.000Z
#!/usr/bin/env python
"""
List of Linux cgroups
"""
import lib_util
import lib_common
from sources_types.Linux import cgroup as survol_cgroup
# cat /proc/cgroups
# #subsys_name hierarchy num_cgroups enabled
# cpuset 0 1 1
# cpu 0 1 1
# cpuacct 0 1 1
# blkio 0 1 1
# memory 0 1 1
# devices 0 1 1
# freezer 0 1 1
# net_cls 0 1 1
# perf_event 0 1 1
# net_prio 0 1 1
# pids 0 1 1
Usable = lib_util.UsableLinux
def Main():
cgiEnv = lib_common.ScriptEnvironment()
grph = cgiEnv.GetGraph()
fil_cg = open("/proc/cgroups")
prop_cgroup = lib_common.MakeProp("cgroup")
    fil_cg.readline()  # skip the header line
for lin_cg in fil_cg.readlines():
split_cg = lin_cg.split("\t")
cgroup_name = split_cg[0]
cgroup_node = survol_cgroup.MakeUri(cgroup_name)
grph.add((cgroup_node, lib_common.MakeProp("Hierarchy"), lib_util.NodeLiteral(split_cg[1])))
grph.add((cgroup_node, lib_common.MakeProp("Num cgroups"), lib_util.NodeLiteral(split_cg[2])))
grph.add((cgroup_node, lib_common.MakeProp("Enabled"), lib_util.NodeLiteral(split_cg[3])))
grph.add((lib_common.nodeMachine, prop_cgroup, cgroup_node))
    fil_cg.close()
    cgiEnv.OutCgiRdf("LAYOUT_RECT", [prop_cgroup])
if __name__ == '__main__':
Main()
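Each data line of `/proc/cgroups` is tab-separated, which is why `Main` splits on `\t`. A standalone parser sketch for one line — `parse_cgroup_line` is an illustrative helper, not part of Survol:

```python
def parse_cgroup_line(line):
    # Columns in /proc/cgroups are tab-separated:
    # subsys_name, hierarchy, num_cgroups, enabled
    name, hierarchy, num_cgroups, enabled = line.rstrip("\n").split("\t")
    return {"subsys": name, "hierarchy": int(hierarchy),
            "num_cgroups": int(num_cgroups), "enabled": enabled == "1"}

print(parse_cgroup_line("memory\t0\t1\t1\n"))
```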
1ad23688a8884db208cdf807a9190c4367a75c50 | 47490 | py | Python | cogs/server.py | Glitchii/Troy | 11070e0a6f191d53429e55047ee89ffdba5c5796 | ["MIT"] | 2 | 2022-03-26T00:45:19.000Z | 2022-03-28T18:13:16.000Z
from io import BytesIO as IoBytesIO
from colorthief import ColorThief
from discord.ext.commands import command, Cog, group
from re import search as re_search, sub as re_sub, compile as re_compile
from PIL import Image
from traceback import format_exc
from os import remove as os_rem
from datetime import datetime
from discord.utils import escape_mentions
from matplotlib.pyplot import pie as pltPie, axis as pltAxis, savefig as pltSavefig
from discord import (
Embed, Colour, Status, File,
Emoji, TextChannel, HTTPException, Member,
ActivityType, Forbidden
)
from imports import (
access_ids, colrs, botPrefixDB,
lineNum, Cmds, loading_msg, paginate,
aiohttp_request, tryInt, tstGuild, dblpy
)
class Server(Cog):
def __init__(self, bot):
self.bot = bot
@Cog.listener()
async def on_ready(self):
print("Module: Server loaded")
@command(description="Check the newest server members")
    async def newmembers(self, ctx, *, count: int = 5):
try:
if count > 25: return await ctx.send('Max count is 25')
if not ctx.guild.chunked: await self.bot.request_offline_members(ctx.guild)
embed = Embed(title=f'{count} Newest members (from the newest)', colour=Colour(0x36393f))
members = sorted(ctx.guild.members, key=lambda m: m.joined_at, reverse=True)[:count]
for member in members:
                embed.add_field(name=f'{member}', value=f'> Joined on: {member.joined_at:%d of %B, %Y [%A at %I:%M%p]}\n> Account made on: {member.created_at:%d of %B, %Y [%A]}', inline=False)
await ctx.send(embed=embed)
except Exception as e: return await ctx.send(e)
@group(invoke_without_command=True, description="Shows server commands, Info, and more")
async def server(self, ctx): await ctx.send(embed=Embed(
color=colrs[4], title="All server commands:",
description="\n".join(map(str, Cmds.server))))
@server.command(description="Change servers prefix")
async def prefix(self, ctx, *, prefix=None):
try:
if prefix == '...':
try:
if not botPrefixDB.find_one({'_id': f'{ctx.guild.id}'}): return await ctx.send('That is the current prefix')
botPrefixDB.find_one_and_delete({'_id': f'{ctx.guild.id}'})
return await ctx.send("Prefix reset to the default which is `...`")
except: print(format_exc())
            elif '\u200b' in prefix: return await ctx.send("Please use a prefix that has no zero width characters so that everyone can use it.")
elif len(prefix) > 25: return await ctx.send("That prefix is quite long, limit is 25 characters.")
elif " " in prefix: return await ctx.send("Prefix cannot include spaces")
else:
try: botPrefixDB.find_one_and_update({'_id': f'{ctx.guild.id}'}, {'$set': {'prefix': prefix, 'guildName': ctx.guild.name}}, upsert=True)
except Exception as e:
return await ctx.send(f"There was a a little problem while changing the prefix, please use the feedback command to send the error blow to my developer.\n```\n{e}\n```")
return await ctx.send(embed=Embed(
color=colrs[2],
title="<:Mark:663230689860386846> Custom prefix created:",
description=f"New Prefix: {botPrefixDB.find_one({'_id': f'{ctx.guild.id}'})['prefix']}\n**NOTE:** Bot'll nolonger respond to the default prefix in this server which is ...")
.set_footer(text="So don't forget the prefix you set 👀. But if you do forget, ping me."))
except Exception as e:
print(format_exc())
return await ctx(f"There was an error: {e}\nIf you don't know the problem please screenshot this together with your command and say `{ctx.prefix}feedback <upload your screenshot, or say what you want to say>` send the error to the bot owner.")
@server.command(description="See the roles in the server")
async def roles(self, ctx, sv: int = None):
if sv is None: server = ctx.guild
else: server = self.bot.get_guild(sv)
try:
roles = [str(a.mention) for a in server.roles]
def func(lines, chars=2000):
message, size = [], 0
for line in lines:
if len(line) + size > chars:
yield message
message, size = [], 0
message.append(line)
size += len(line)
yield message
for message in func(roles):
try: embed = Embed(color=Colour.lighter_grey(), description=', '.join(message))
except: return await ctx.send(', '.join(message))
                await ctx.send(embed=embed)  # send every chunk instead of returning after the first
except HTTPException:
await ctx.send("Error: This server probably has a high number or roles which aren't allowed on embeds")
@server.command(description="See info about the server members")
async def members(self, ctx, svID: int = None):
server = self.bot.get_guild(svID) or ctx.guild
pltPie((
sum(x.status.value == 'online' for x in server.members),
sum(x.status.value == 'offline' for x in server.members),
sum(x.status.value == 'dnd' for x in server.members),
sum(x.status.value == 'idle' for x in server.members)),
labels=None,
colors=('#43b581', '#747f8d', '#f04747', '#faa61a'),
shadow=False,
startangle=140)
pltAxis('equal')
pltSavefig('memb_piechart', transparent=True)
await ctx.send(file=File("memb_piechart.png"), embed=Embed(title="Members in the server")
.add_field(name="Members:", value=F"<:online:666002136567513088>Online: {sum(not x.bot and x.status.value == 'online' for x in server.members)}\n<:Idle:664140822672834580>Idle: {sum(not x.bot and x.status.value == 'idle' for x in server.members)}\n<:dnd:664140822295347221>DND: {sum(not x.bot and x.status.value == 'dnd' for x in server.members)}\n<:offline:664140822823829514> Offline: {sum(not x.bot and x.status.value == 'offline' for x in server.members)}\n<:Sum:663243303478886461>Sum: {sum(not x.bot for x in server.members)}")
.add_field(name="Bots:", value=F"<:online:666002136567513088>Online: {sum(x.bot and x.status.value == 'online' for x in server.members)}\n<:Idle:664140822672834580>Idle: {sum(x.bot and x.status.value == 'idle' for x in server.members)}\n<:dnd:664140822295347221>DND: {sum(x.bot and x.status.value == 'dnd' for x in server.members)}\n<:offline:664140822823829514> Offline: {sum(x.bot and x.status.value == 'offline' for x in server.members)}\n<:Sum:663243303478886461>Sum: {sum(x.bot for x in server.members)}")
.add_field(name="Information on Pie chart", value="_ _", inline=False)
.set_thumbnail(url=server.icon_url)
.set_footer(text=f"Say \"{ctx.prefix}newmembers [number]\" to see the newest members in the server")
.set_image(url="attachment://memb_piechart.png"))
os_rem('memb_piechart.png')
@server.command(name="invites", description="See servers active invites")
async def serverInvites(self, ctx, svID: int = None):
server = self.bot.get_guild(svID) or ctx.guild
try:
invites = {str(invite) for invite in await server.invites()}
if not invites: return await ctx.send("There are currently no active invite links in this server")
else: return await ctx.send(embed=Embed(colour=colrs[4])
.add_field(name=f"Total invite links: {len(invites)}", value=" <:link:663295066357497857> \n".join(map(str, invites))+" <:link:663295066357497857> ")
.set_footer(text=f"Say \"{ctx.prefix}invite info <invite>\" for info about an invite"))
except Forbidden:
return await ctx.send("I can't show invites because I don't have `Manage server` permissions")
@server.command(name="info", description="Use to see information about the server")
async def serverInfo(self, ctx, serverID: int = None):
try:
embed, loading, server = Embed(), await ctx.send(loading_msg("Gathering server information")), self.bot.get_guild(serverID) or ctx.guild
voiceC, textC, stageC, prefixes = server.voice_channels, server.text_channels, server.stage_channels, botPrefixDB.find_one({'_id': f'{ctx.guild.id}'})
if server.icon_url:
try:
color_thief = ColorThief(IoBytesIO(await aiohttp_request(str(server.icon_url_as(format='png')), 'read')))
color_thief.get_color(quality=1)
palette = color_thief.get_palette(color_count=6)
r, g, b = palette[4] if len(palette) >= 6 else palette[0]
embed.color = Colour(int('0x%02x%02x%02x' % (r, g, b), 16))
except: print(f"{format_exc()}\n - {lineNum(True)}")
embed.set_thumbnail(url=server.icon_url).set_footer(text=f"Looking for my info? The command is '{ctx.prefix}info'")
embed.add_field(name="Owner", value=server.owner, inline=True)
embed.add_field(name="ID", value=server.id, inline=True)
if type(server.region) != str:
embed.add_field(name='Region', value=server.region.value.capitalize(), inline=True)
embed.add_field(name='Name', value=server.name, inline=True)
embed.add_field(name="Created on", value=f"{server.created_at:%d/%m/%Y} ({(ctx.message.created_at - server.created_at).days} days ago)", inline=True)
embed.add_field(name="System channel", value=server.system_channel.mention if server.system_channel else None, inline=True)
embed.add_field(name="Server description", value=server.description, inline=True)
embed.add_field(name="Shard ID", value=server.shard_id, inline=True)
embed.add_field(name="Server Icon", value=f"[Click Here]({server.icon_url})" if len(server.icon_url) >= 2 else None, inline=True)
embed.add_field(name="Server features", value=(", ".join(server.features)).capitalize().replace('_', ' ') if server.features else None, inline=len(server.features) <= 5)
embed.add_field(name="Server emotes", value=len(server.emojis), inline=True)
embed.add_field(name="Roles", value=len(server.roles), inline=True)
embed.add_field(name="Premium boosters", value=server.premium_subscription_count, inline=True)
embed.add_field(name="Members", value=f"\
<:online:666002136567513088>{sum(x.status.value == 'online' and x.bot for x in server.members)} bots, {sum(x.status.value == 'online' and not x.bot for x in server.members)} {'person' if sum(x.status.value == 'online' and not x.bot for x in server.members) == 1 else 'people'}\n\
<:Idle:664140822672834580>{sum(x.status.value == 'idle' and x.bot for x in server.members)} bots, {sum(x.status.value == 'idle' and not x.bot for x in server.members)} {'person' if sum(x.status.value == 'idle' and not x.bot for x in server.members) == 1 else 'people'}\n\
<:dnd:664140822295347221>{sum(x.status.value == 'dnd' and x.bot for x in server.members)} bots, {sum(x.status.value == 'dnd' and not x.bot for x in server.members)} {'person' if sum(x.status.value == 'dnd' and not x.bot for x in server.members) == 1 else 'people'}\n\
<:offline:664140822823829514> {sum(x.status.value == 'offline' and x.bot for x in server.members)} bots, {sum(x.status.value == 'offline' and not x.bot for x in server.members)} {'person' if sum(x.status.value == 'offline' and not x.bot for x in server.members) == 1 else 'people'}\n\
<:Sum:663243303478886461>{server.member_count} in total", inline=True)
embed.add_field(name="Channels", value=f"\
<:Textchannel:776521784307875891> {len(textC)}\n\
<:Voicechannel:776521783980326944> {len(voiceC)}\n\
<:Stagechannel:866906785842200636> {len(stageC)}\n\
{len(voiceC) + len(textC) + len(stageC)} Total channels", inline=True)
embed.add_field(name="Nitro boosting level", value=server.premium_tier, inline=True)
try: embed.add_field(name="Active invites", value=len(await server.invites()), inline=True)
except: pass
try: embed.set_image(url=server.banner_url)
except: pass
embed.add_field(name="Animated icon?", value="Yes" if server.is_icon_animated() else "No", inline=True)
embed.add_field(name="My prefix", value=prefixes.get('prefix', '...') if prefixes else '...', inline=True)
await ctx.send(embed=embed)
except Exception as e:
await ctx.send(f"Looks like I fell into a little error; {e}")
print(format_exc(), lineNum(True))
finally:
try: await loading.delete()
except: print(format_exc(), lineNum(True))
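# The '0x%02x%02x%02x' hex-packing trick used for the icon colour above can be
# sketched in isolation; `rgb_to_int` is a hypothetical helper name.
def rgb_to_int(r, g, b):
    # Pack three 0-255 channels into the single integer that Colour expects.
    return int('0x%02x%02x%02x' % (r, g, b), 16)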
@server.command(aliases=("emojis",), description="See custom emojis in the server")
async def emotes(self, ctx, sv: int = None):
if sv is None: server = ctx.guild
else: server = self.bot.get_guild(sv)
try:
emotes = [str(x) for x in server.emojis]
def func(lines, chars=2000):
message, size = [], 0
for line in lines:
if len(line) + size > chars:
yield message
message, size = [], 0
message.append(line)
size += len(line)
yield message
for message in func(emotes): await ctx.send(embed=Embed(
color=Colour.lighter_grey(),
description=' '.join(message))
.set_footer(text=f"{len(server.emojis)} emojis in the server"))
except:
try:
emotes = [str(x) for x in server.emojis]
def func(lines, chars=2000):
message, size = [], 0
for line in lines:
if len(line) + size > chars:
yield message
message, size = [], 0
message.append(line)
size += len(line)
yield message
for message in func(emotes): await ctx.send(' '.join(message))
except Exception as e:
await ctx.send(e)
@server.command(hidden=True, name="list", description="See all the servers I'm in")
async def serverList(self, ctx, more = False):
if ctx.author.id not in access_ids: return
if not more:
return await ctx.send(embed=Embed(title="All servers I'm powering:", description="```\n" + ', '.join(map(str, self.bot.guilds)) + "```", colour=0x0af78a)
.add_field(name=f"<:Server:663296208537911347> {len(self.bot.guilds)} Servers", value="[Remove yours](https://i.imgur.com/I4tUSRF.png)", inline=True)
.add_field(name=f"<:Users:663295067280244776> {len(set(self.bot.get_all_members()))} users", value=f"[Invite me](https://discordapp.com/oauth2/authorize?client_id={self.bot.user.id}&scope=bot&permissions=1479928959)", inline=True))
clean, text = lambda t: re_sub(r'[\*_`]', '', str(t)), ''
for i, guild in enumerate(sorted(self.bot.guilds, key=lambda g: g.me.joined_at), 1):
text += ( f"{i} {clean(guild.name)} ({guild.id})\n"
f"{' '*9} Owner: {clean(guild.owner)} ({guild.owner.mention})\n"
f"{' '*9} Humans: {sum(not m.bot for m in guild.members)}. Bots: {sum(m.bot for m in guild.members)}\n")
for text in paginate(text):
await ctx.send(text)
@group(invoke_without_command=True, aliases=("who",), description="See number of messages, avatar, permissions, information about a server member and more")
async def user(self, ctx): await ctx.send(embed=Embed(
color=colrs[4], title="All user commands:",
description="\n".join(x for x in Cmds.server if (x.startswith('user')))))
@user.command(aliases=("msg", "messages",), description="See how many messages you or a server member has sent in a channel")
async def msgs(self, ctx, user: Member = None):
try:
user, msgs = user or ctx.author, 0
loading = await ctx.send(loading_msg(f'Counting messages from {escape_mentions(user.display_name)} in this channel. This could take a while...'))
async for elem in ctx.channel.history(limit=None):
if elem.author.id == user.id: msgs += 1
await ctx.send(escape_mentions(("I have" if user == self.bot.user else f"{user.display_name} has") + f" sent {msgs} messages in {ctx.channel.mention} so far"))
except: print(format_exc())
finally: await loading.delete()
@user.command(aliases=("img", "pfp", "av",), description="See a server member's avatar image")
async def avatar(self, ctx, user: Member = None):
ignore = (access_ids[0], 663074487335649292)
if not user: user = ctx.author
if user.id in ignore and ctx.author.id not in ignore: return await ctx.send('https://media1.tenor.com/images/17a17ae6b93faf667b39af6d8fe34d68/tenor.gif')
try: await ctx.send(embed=Embed(title=f"{user.display_name}'s avatar",
description=f"Download [ [PNG]({user.avatar_url_as(format='png')}) | [WEBP]({user.avatar_url_as(format='webp')}) | [JPEG]({user.avatar_url_as(format='jpeg')}) | [JPG]({user.avatar_url_as(format='jpg')})" + (f" | [GIF]({user.avatar_url_as(format='gif')}) ]" if user.is_avatar_animated() else " ]"))
.set_image(url=user.avatar_url)
.set_footer(text=f"As requested by {ctx.author.display_name}"))
except Exception as e:
await ctx.send(f"There was an error:\n{e}")
return print(f"User avatar command failed with error \n{format_exc()}\n - By {ctx.author.name} in the {ctx.guild.name} server")
@user.command(name="info", aliases=("is", "about",), brief="Get information about a server member", description="Information includes when they joined the server, when they created their discord account and more")
async def userInfo(self, ctx, member: Member = None): # sourcery no-metrics
try:
member, loading = member or ctx.author, await ctx.send(loading_msg("Getting information..."))
col = member.color
if member.avatar_url:
try:
img = Image.open(IoBytesIO(await aiohttp_request(str(member.avatar_url_as(format='png')), 'read')))
img = img.convert("RGB")
img = img.resize((1, 1), resample=0)
col = Colour(int('0x%02x%02x%02x' % img.getpixel((0, 0)), 16))
except:
print(f"{format_exc()}\n - {lineNum(True)}")
if str(member.color) == "#000000":
col = 0xf04747 if member.status == Status.do_not_disturb else 0x43b581 if member.status == Status.online else 0xfaa61a if member.status == Status.idle else 0x747f8d
else:
if str(member.color) == "#000000":
col = 0xf04747 if member.status == Status.do_not_disturb else 0x43b581 if member.status == Status.online else 0xfaa61a if member.status == Status.idle else 0x747f8d
embed = Embed(color=col)
embed.set_thumbnail(url=member.avatar_url)
embed.set_footer(text=f"Requested by {ctx.author.name}", icon_url=ctx.author.avatar_url)
embed.add_field(name="Name", value=f"{member.name}#{member.discriminator}", inline=True)
embed.add_field(name="ID", value=member.id, inline=True)
embed.add_field(name="Animated Avatar", value="Yes" if member.is_avatar_animated() else "No", inline=True)
embed.add_field(name="Nickname", value=member.nick, inline=True)
embed.add_field(name="Top Role", value=member.top_role.name if not member.top_role.is_default() else None, inline=True)
embed.add_field(name="Active on mobile?", value="Yes" if member.is_on_mobile() else "No", inline=True)
embed.add_field(name="Status", value=(f"<:online:666002136567513088>" if member.status==Status.online else f"<:offline:664140822823829514>" if member.status == Status.offline else f"<:dnd:664140822295347221>" if member.status==Status.do_not_disturb else f"<:Idle:664140822672834580>" if member.status==Status.idle else "") +member.status.value, inline=True)
roles = [x for x in member.roles if not x.is_default()]
try:
if str(member.color) == "#000000": embed.add_field(name="Name color", value=f"Default", inline=True)
else:
emoji = await tstGuild().create_custom_emoji(name=f"color", image=await aiohttp_request(f'http://www.colorhexa.com/{str(member.color)[1:]}.png', 'read'))
embed.add_field(name="Name color", value=f"{member.color} │ {emoji}", inline=True)
except:
if len(member.roles) >= 2: embed.add_field(name="Name color", value=f"{member.color}", inline=True)
else: embed.add_field(name="Name color", value=f"Default", inline=True)
embed.add_field(name="Avatar URL", value=f"[Click Here]({member.avatar_url})", inline=True)
embed.add_field(name="Activity", value='No activity' if not member.activity
else f"Playing {member.activity.name}" if member.activity.type == ActivityType.playing
else f"Watching {member.activity.name}" if member.activity.type == ActivityType.watching
else f"Listening to {member.activity.name}" if member.activity.type == ActivityType.listening
else member.activity.name, inline=True)
embed.add_field(name="Bot Account", value="Yes" if member.bot else "No", inline=True)
embed.add_field(name="Joined Server on", value=f"{member.joined_at:%d/%m/%Y} ({(ctx.message.created_at - member.joined_at).days} days ago)", inline=True)
embed.add_field(name="Joined Discord on", value=f"{member.created_at:%d/%m/%Y} ({(datetime.now() - member.created_at).days} days ago)", inline=True)
if member.bot:
try:
botInfo = await dblpy.http.get_bot_info(member.id)
if botInfo.get('prefix'): embed.add_field(name="Prefix:", value=botInfo['prefix'], inline=True)
if botInfo.get('lib'): embed.add_field(name="Library:", value='Unknown' if botInfo['lib'].lower() == 'other' else botInfo['lib'], inline=True)
if botInfo.get("owners"):
if len(botInfo['owners']) > 1:
try: embed.add_field(name="Developers:", value=", ".join(str(self.bot.get_user(int(ID))) for ID in botInfo['owners']), inline=True)
except:
embed.add_field(name="Developer IDs:", value=", ".join(map(str, botInfo['owners'])), inline=True)
print(format_exc())
else:
try: embed.add_field(name="Developer:", value=", ".join(str(self.bot.get_user(int(ID))) for ID in botInfo['owners']), inline=True)
except: embed.add_field(name="Developer's ID:", value=", ".join(map(str, botInfo['owners'])), inline=True)
if botInfo.get("server_count"): embed.add_field(name="Bot is in:", value=f"{botInfo['server_count']} servers", inline=True)
if botInfo.get("tags"): embed.add_field(name="Categories", value=f', '.join(str(tag).lower() for tag in botInfo['tags']), inline=True)
if botInfo.get("shortdesc"): embed.add_field(name="Short description", value=botInfo['shortdesc'], inline=True)
except: pass
embed.add_field(
name=f"Roles ({len(roles)})",
value=", ".join(role.mention for role in roles)
if roles
else 'No roles',
inline=True,
)
await ctx.send(embed=embed)
except Exception as e: await ctx.send(embed=Embed(
title="There was an error:",
description=f"```\n{e}\n - {lineNum(True)}\n```\nPlease send the above error to the bot owner if you don't know what's wrong"))
finally:
await loading.delete()
try: await emoji.delete()
except: pass
@user.command(name="perms", description="See your or another member's permissions in the server")
async def userPerms(self, ctx, member: Member = None, server_ID: int = None):
guild = self.bot.get_guild(server_ID) or ctx.guild
if not member: member = ctx.author
perms = member.guild_permissions
return await ctx.send(embed=Embed(color=colrs[4])
.add_field(name="Kick Members", value=perms.kick_members, inline=True).add_field(name="Ban Members", value=perms.ban_members, inline=True)
.add_field(name="Manage Channels", value=perms.manage_channels, inline=True).add_field(name="Manage Server", value=perms.manage_guild, inline=True)
.add_field(name="Add Reactions", value=perms.add_reactions, inline=True).add_field(name="View Audit Log", value=perms.view_audit_log, inline=True)
.add_field(name="Read Messages", value=perms.read_messages, inline=True).add_field(name="Send Messages", value=perms.send_messages, inline=True)
.add_field(name="Manage Messages", value=perms.manage_messages, inline=True).add_field(name="Mention Everyone", value=perms.mention_everyone, inline=True)
.add_field(name="Manage Nicknames", value=perms.manage_nicknames, inline=True).add_field(name="Manage Roles", value=perms.manage_roles, inline=True)
.add_field(name="Manage Webhooks", value=perms.manage_webhooks, inline=True).add_field(name="Manage Emojis", value=perms.manage_emojis, inline=True)
.add_field(name="Move Members", value=perms.move_members, inline=True).add_field(name="Mute Members", value=perms.mute_members, inline=True)
.add_field(name="Read Message History", value=perms.read_message_history, inline=True).add_field(name="Send TTS Messages", value=perms.send_tts_messages, inline=True)
.add_field(name="Change Nickname", value=perms.change_nickname, inline=True).add_field(name="Attach Files", value=perms.attach_files, inline=True)
.add_field(name="Embed Links", value=perms.embed_links, inline=True).add_field(name="Administrator", value=perms.administrator, inline=True)
.set_author(name=f"{member.name}'s permissions in {guild.name}", icon_url=member.avatar_url)
.set_footer(text=f"{member} is the server owner" if guild.owner == member else f"{member.name} is an administrator" if perms.administrator else " "))
@group(aliases=('emote',), invoke_without_command=True, description="Used to add an emoji to the server, see information about it (like who created it), or copy an emoji into any of the servers I'm in. If no parameters are given, it shows the emoji image")
async def emoji(self, ctx, *, msg=None):
if not msg:
return await ctx.send(embed=Embed(color=colrs[4], title="All emoji commands:",
description="\n".join(x for x in Cmds.server if (x.startswith('emoji')))))
def find_emoji(msg):
msg, colors, name = re_sub("<a?:(.+):([0-9]+)>", "\\2", msg), ("1f3fb", "1f3fc", "1f3fd", "1f44c", "1f3fe", "1f3ff"), None
for guild in self.bot.guilds:
for emoji in guild.emojis:
if msg.strip().lower() in emoji.name.lower():
url, id, name, guild_name = emoji.url, emoji.id, emoji.name + (".gif" if emoji.animated else ".png"), guild.name
if msg.strip() in (str(emoji.id), emoji.name):
url, name = emoji.url, emoji.name + (".gif" if emoji.animated else ".png")
return name, url, emoji.id, guild.name
if name: return name, url, id, guild_name
# Check for a stock emoji before returning a failure
codepoint_regex = re_compile(r'([\d#])?\\[xuU]0*([a-f\d]*)')
unicode_raw = msg.encode('unicode-escape').decode('ascii')
codepoints = codepoint_regex.findall(unicode_raw)
if codepoints == []: return "", "", "", ""
if len(codepoints) > 1 and codepoints[1][1] in colors: codepoints.pop(1)
if codepoints[0][0] == '#': emoji_code = '23-20e3'
elif codepoints[0][0] == '':
codepoints = [x[1] for x in codepoints]
emoji_code = '-'.join(codepoints)
else: emoji_code = f"3{codepoints[0][0]}-{codepoints[0][1]}"
url = f"https://raw.githubusercontent.com/astronautlevel2/twemoji/gh-pages/128x128/{emoji_code}.png"
name = "emoji.png"
return name, url, "N/A", "Official"
emojis = msg.split()
if msg.startswith('s '): emojis, get_guild = emojis[1:], True
else: get_guild = False
if len(emojis) > 5: return await ctx.send("Maximum of 5 emojis at a time.")
images = []
for emoji in emojis:
name, url, id, guild = find_emoji(emoji)
if not url:
await ctx.send(f"Could not find {emoji}. Skipping.")
continue
images.append((guild, str(id), url, File(IoBytesIO(await aiohttp_request(str(url), 'read')), name)))
for (guild, id, url, file) in images:
if ctx.channel.permissions_for(ctx.author).attach_files:
if get_guild: await ctx.send(content=f'**ID:** {id}\n**Server:** {guild}', file=file)
else: await ctx.send(file=file)
else:
if get_guild: await ctx.send(f'**ID:** {id}\n**Server:** {guild}\n**URL: {url}**')
else: await ctx.send(url)
file.close()
@emoji.command(description="Copy an emoji from any of the servers the bot is in.\nThe bot will look through all the servers it is in for an emoji with the given name; if found, it will copy that emoji (name and image) and add it to this server.")
async def copy(self, ctx, *, msg):
try:
loading, match = await ctx.send(loading_msg("Looking through servers for this emoji name")), None
if not ctx.author.guild_permissions.manage_emojis: return await ctx.send("You must have `Manage Emojis` permission to copy emojis")
for guild in self.bot.guilds:
for emoji in guild.emojis:
if emoji.name.lower() == msg.lower():
match = emoji
if not match: return await ctx.send('Could not find emoji.')
emoji = await ctx.guild.create_custom_emoji(name=match.name, image=(await aiohttp_request(f"{match.url}", 'read')))
await ctx.send(f"Successfully added the emoji {emoji.name} <{'a' if emoji.animated else ''}:{emoji.name}:{emoji.id}>!")
except:
print(format_exc())
await ctx.send(f"Fell into an error on line {lineNum()}, please send the following error to my developer\n {format_exc()}")
finally: await loading.delete()
@emoji.command(name="add", description="Add an emoji to the server.\nGive an emoji name, which will be used to name the emoji, and an image URL, which will be used as the emoji image. If no name is given, the bot will take the name from the URL. I must have manage_emojis permissions to do this.")
async def emojiAdd(self, ctx, url='', *, name=None):
if ctx.author.guild_permissions.manage_emojis:
if not url: return await ctx.send(f"You must provide a link. Usage: `{ctx.prefix}emoji add <link to an image> <name>`")
if not name:
name = url.split('.')[-2] if '.' in url else url
name = name.split('/')[-1]
elif " " in name: name = "".join(x.capitalize() for x in name.split())
try: response = await aiohttp_request(url, 'read')
except:
print(format_exc())
return await ctx.send(f"This url `{url}` you have provided is invalid or not well formed.")
if (await aiohttp_request(url)).status == 404: return await ctx.send("The URL link you have provided leads to a 404 (not found) page.")
emoji = await ctx.guild.create_custom_emoji(name=name, image=response)
await ctx.send(f"Successfully added the emoji \"`{emoji.name}`\" <{'a' if emoji.animated else ''}:{emoji.name}:{emoji.id}>")
else: await ctx.channel.send("You must have `Manage Emojis` permission to add an emoji")
@emoji.command(name="del", description="Remove every custom emoji in this server that matches the name you give.")
async def delete(self, ctx, name):
if not ctx.author.guild_permissions.manage_emojis: return await ctx.channel.send("You need '`Manage Emojis`' permission to use this command.")
if not ctx.guild.me.guild_permissions.manage_emojis: return await ctx.channel.send("I need '`Manage Emojis`' permission to delete an emoji.")
if search := re_search(r'<:(\w+):\d{10,}>', name): name = search[1]
else: name = name.replace(':', '')
emotes, emote_length = [x for x in ctx.guild.emojis if x.name == name], 0
if not emotes: return await ctx.send(f"I couldn't find any custom emojis with the name `{name}` in this server.")
for emote in emotes:
try:
await emote.delete()
emote_length+=1
except: return await ctx.send('There was an error deleting')
await ctx.send(f"Successfully deleted the emoji '`{emotes[0].name}`'" if emote_length == 1 else f"Successfully removed {emote_length} emojis.")
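# The emoji-name normalisation shared by the del and ren commands can be
# sketched as one helper; `parse_emoji_name` is a hypothetical name, and the
# added `a?` also accepts animated-emoji mentions, which the inline regex
# above does not.
import re

def parse_emoji_name(raw):
    # Accept a bare name, ":name:", or a full custom-emoji mention <:name:id>.
    match = re.search(r'<a?:(\w+):\d{10,}>', raw)
    return match[1] if match else raw.replace(':', '')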
@emoji.command(aliases=("rename",), description="Rename every emoji that matches the name you give.")
async def ren(self, ctx, name, new_name):
if not ctx.author.guild_permissions.manage_emojis: return await ctx.channel.send("You need '`Manage Emojis`' permission to use this command.")
if not ctx.guild.me.guild_permissions.manage_emojis: return await ctx.channel.send("I need '`Manage Emojis`' permission to rename an emoji.")
if search := re_search(r'<:(\w+):\d{10,}>', name): name = search[1]
else: name = name.replace(':', '')
emotes, emote_length = [x for x in ctx.guild.emojis if x.name == name], 0
if not emotes: return await ctx.send(f"I couldn't find any custom emojis with the name `{name}` in this server.")
for emote in emotes:
try:
await emote.edit(name=new_name)
emote_length += 1
except: return await ctx.send('There was an error renaming')
await ctx.send(f"Successfully renamed the emoji '`{name}`' to '`{new_name}`'" if emote_length == 1 else f"Successfully renamed {emote_length} emojis.")
@emoji.command(name="info", description="Shows information about an emoji.\nIncludes who added it to the server, when it was added, its image and more.")
async def emojiInfo(self, ctx, emoji:Emoji):
try:
embed = Embed()
embed.add_field(name="Name", value=emoji.name, inline=True)
embed.add_field(name="ID", value=emoji.id, inline=True)
if emoji.user is None:
try:
emoji2 = await ctx.guild.fetch_emoji(emoji.id)
embed.add_field(name="Creator", value=emoji2.user, inline=True)
except: pass
else: embed.add_field(name="Creator", value=emoji.user, inline=True)
embed.add_field(name="Animated?", value="Yes" if emoji.animated else "No", inline=True)
embed.add_field(name="Created on", value=emoji.created_at.strftime("%d/%m/%y"), inline=True)
embed.add_field(name="Requires colons?", value="Yes" if emoji.require_colons else "No", inline=True)
embed.add_field(name="Emoji Icon", value=f"[Click Here]({emoji.url})", inline=True)
embed.add_field(name="Server it's made in", value=emoji.guild, inline=True)
embed.add_field(name="Managed by a Twitch integration?", value="Yes" if emoji.managed else "No", inline=True)
if not emoji.roles: embed.add_field(name="Roles allowed to use it", value="All roles", inline=True)
else: embed.add_field(name="Roles allowed to use it", value=', '.join(map(lambda r: r.mention, emoji.roles)), inline=True)
try: embed.set_thumbnail(url=emoji.url)
except: pass
await ctx.send(embed=embed)
except Exception as e:
await ctx.send(f"There was an error:\n```\n{e}\n```\nPlease send the above error to my developer if you don't know what is wrong")
@command(invoke_without_command=True, brief="Get mine or a certain bot's invite, or get info about an invite link", description="**Examples:**\n\t<<prefix>>invite\n\t<<prefix>>invite @MEE6\n\t<<prefix>>invite info <discord server invite>")
async def invite(self, ctx, user=None, invite = None):
if not user or user == self.bot.user:
embed = Embed(description=f"<:link:663295066357497857> [Invite me](https://discord.com/oauth2/authorize?client_id={self.bot.user.id}&scope=bot&permissions=1479928959)\nBot is currently still in beta. If you have some feedback please let me know.\nYou can also get an invite link to another bot, and probably some info about it, by pinging it or saying its ID with the command e.g. `{ctx.prefix}invite @{ctx.guild.me.display_name}`", colour=Colour.green())
embed.set_thumbnail(url="https://i.imgur.com/CNYbdaV.png")
embed.set_footer(text=f"Looking for this server's invites instead? Say \"{ctx.prefix}server invites\"")
return await ctx.send(embed=embed)
if user.lower() == "info": #inviteinfo
if not invite: return await ctx.send('No Discord server invite provided')
invite = await self.bot.fetch_invite(re_sub(r"(https?:\/\/)?((discord\.gg\/)|(.+\/invite\/))", "", re_sub('[<>]', '', invite)))
if not invite: return await ctx.send(f'Invite not found, is it a discord server invite?')
data = Embed(title="**Information about Invite:** %s" % invite.id)
data.colour = Colour.red() if invite.revoked else Colour.green()
data.add_field(name="Expires", value=f"{invite.max_age} seconds" if invite.max_age else "Never")
data.add_field(name="Temp membership", value="Yes" if invite.temporary else "No")
data.add_field(name="Uses", value=invite.uses, inline=False)
if invite.guild.name:
data.add_field(name="Server", value="**Name:** " + invite.guild.name + "\n**ID:** %s" % invite.guild.id, inline=True)
if invite.guild.icon_url:
data.set_thumbnail(url=invite.guild.icon_url)
if invite.channel.name:
channel = "%s\n#%s" % (invite.channel.mention, invite.channel.name) if isinstance(invite.channel, TextChannel) else invite.channel.name
data.add_field(name="Channel", value=f"**Name:** {channel}\n**ID:** {invite.channel.id}", inline=True)
try:
data.add_field(name="Total members", value=invite.approximate_member_count, inline=True)
data.add_field(name="Active members", value=invite.approximate_presence_count, inline=True)
except: pass
if invite.inviter.name: data.set_footer(
                text="Creator: " + invite.inviter.name + '#' + invite.inviter.discriminator + " (%s)" % invite.inviter.id,
                icon_url=invite.inviter.avatar_url)
            try:
                return await ctx.send(embed=data)
            except Exception:
                await ctx.send(content="I need the `Embed links` permission to send this")
        elif user.isdigit():
            try:
                member, embed = self.bot.get_user(int(user)), Embed(color=Colour.green(), title="Bot invite")
                try:
                    botInfo = await dblpy.http.get_bot_info(member.id)
                    if botInfo.get("id") and botInfo.get("avatar"):
                        embed.set_thumbnail(url=member.avatar_url)
                    if botInfo.get("username"):
                        embed.description = f"<:link:663295066357497857> [Invite {botInfo['username']}]({botInfo['invite'] if botInfo.get('invite') else f'https://discordapp.com/oauth2/authorize?client_id={user}&scope=bot&permissions=2146958839'})"
                    else:
                        embed.description = f"Click below to invite the bot with that ID\n<:link:663295066357497857> [Bot invite](https://discordapp.com/oauth2/authorize?client_id={user}&scope=bot&permissions=2146958839)"
                    if botInfo.get('prefix'):
                        embed.add_field(name="Prefix:", value=botInfo['prefix'], inline=True)
                    if botInfo.get('lib'):
                        embed.add_field(name="Library:", value='Unknown' if botInfo['lib'].lower() == 'other' else botInfo['lib'], inline=True)
                    if botInfo.get("owners"):
                        if len(botInfo['owners']) > 1:
                            try:
                                embed.add_field(name="Developers:", value=", ".join(str(self.bot.get_user(int(ID))) for ID in botInfo['owners']), inline=True)
                            except Exception:
                                embed.add_field(name="Developer IDs:", value=", ".join(map(str, botInfo['owners'])), inline=True)
                                print(format_exc())
                        else:
                            try:
                                embed.add_field(name="Developer:", value=", ".join(str(self.bot.get_user(int(ID))) for ID in botInfo['owners']), inline=True)
                            except Exception:
                                embed.add_field(name="Developer's ID:", value=", ".join(map(str, botInfo['owners'])), inline=True)
                    if botInfo.get("server_count"):
                        embed.add_field(name="Bot is in:", value=f"{botInfo['server_count']} servers", inline=True)
                    if botInfo.get("tags"):
                        embed.add_field(name="Categories", value=', '.join(str(tag).lower() for tag in botInfo['tags']), inline=True)
                    embed.set_footer(text="If you meant to invite me say the command without an ID.")
                except Exception:
                    embed.description = f"Click below to invite the bot with that ID\n<:link:663295066357497857> [Bot invite](https://discordapp.com/oauth2/authorize?client_id={user}&scope=bot&permissions=2146958839)"
                await ctx.send(embed=embed)
            except AttributeError:
                await ctx.send(f"Error: Check if the ID is right or if the bot is in any server I'm in `{ctx.prefix}invite [bot ID OR ping a bot]`")
        elif '<@' in user and '>' in user:
            member, embed = self.bot.get_user(tryInt(user.strip(' <@!&> '))), Embed(color=Colour.green(), title="Bot invite")
            if not member:
                if r := ctx.guild.get_role(tryInt(user.strip(' <@!&> '))):
                    if r.managed and len(r.members) == 1 and sum(m.bot for m in r.members):
                        member = r.members[0]
                    else:
                        return await ctx.send(escape_mentions(f"\"{r.name}\" is neither a bot nor a bot role but a role"))
                else:
                    return await ctx.send("Member not found, perhaps they're no longer part of this server")
            if not member.bot:
                return await ctx.send(escape_mentions(f"{member.display_name} isn't a bot"))
            try:
                botInfo = await dblpy.http.get_bot_info(member.id)
                if botInfo.get("id") and botInfo.get("avatar"):
                    embed.set_thumbnail(url=f"https://images.discordapp.net/avatars/{botInfo['id']}/{botInfo['avatar']}.png")
                if botInfo.get("username"):
                    embed.description = f"<:link:663295066357497857> [Invite {botInfo['username']}]({botInfo['invite'] if botInfo.get('invite') else f'https://discordapp.com/oauth2/authorize?client_id={member.id}&scope=bot&permissions=2146958839'})"
                else:
                    embed.description = f"Click below to invite {member.display_name}\n<:link:663295066357497857> [Bot invite](https://discordapp.com/oauth2/authorize?client_id={member.id}&scope=bot&permissions=2146958839)"
                if botInfo.get('prefix'):
                    embed.add_field(name="Prefix:", value=botInfo['prefix'], inline=True)
                if botInfo.get('lib'):
                    embed.add_field(name="Library:", value='Unknown' if (botInfo['lib'].lower() == 'other') else botInfo['lib'], inline=True)
                if botInfo.get("owners"):
                    if len(botInfo['owners']) > 1:
                        try:
                            embed.add_field(name="Developers:", value=", ".join(str(self.bot.get_user(int(ID))) for ID in botInfo['owners']), inline=True)
                        except Exception:
                            embed.add_field(name="Developer IDs:", value=", ".join(map(str, botInfo['owners'])), inline=True)
                            print(format_exc())
                    else:
                        try:
                            embed.add_field(name="Developer:", value=", ".join(str(self.bot.get_user(int(ID))) for ID in botInfo['owners']), inline=True)
                        except Exception:
                            embed.add_field(name="Developer's ID:", value=", ".join(map(str, botInfo['owners'])), inline=True)
                if botInfo.get("server_count"):
                    embed.add_field(name="Bot is in:", value=f"{botInfo['server_count']} servers", inline=True)
                if botInfo.get("tags"):
                    embed.add_field(name="Categories", value=', '.join(str(tag).lower() for tag in botInfo['tags']), inline=True)
                embed.set_footer(text="If you meant to invite me say the command without pinging a bot.")
            except Exception:
                embed.description = f"Click below to invite {member.display_name}\n<:link:663295066357497857> [Bot invite](https://discordapp.com/oauth2/authorize?client_id={int(user.strip(' <@!&> '))}&scope=bot&permissions=2146958839)"
            await ctx.send(embed=embed)
        else:
            await ctx.send(escape_mentions(f"Parameter \"{user}\" is not valid, please say `{ctx.prefix}help {ctx.command}` for help with the command."))


def setup(bot):
    bot.add_cog(Server(bot))
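A quick standalone sketch of the mention handling the `elif` branch above relies on: stripping the decoration characters from a user mention such as `<@!123>` or a role mention `<@&456>` leaves the numeric ID, mirroring `user.strip(' <@!&> ')` in the cog. The helper name is hypothetical and not part of the bot.

```python
def extract_mention_id(text):
    # mirrors user.strip(' <@!&> ') in the invite command above
    stripped = text.strip(' <@!&>')
    return int(stripped) if stripped.isdigit() else None
```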


# --- pystac/validation/schema_uri_map.py (schwehr/pystac, Apache-2.0) ---
from abc import ABC, abstractmethod
from functools import lru_cache
from pystac.serialization.identify import OldExtensionShortIDs, STACVersionID
from typing import Any, Callable, Dict, List, Optional, Tuple
import pystac
from pystac.serialization import STACVersionRange
from pystac.stac_object import STACObjectType
class SchemaUriMap(ABC):
    """Abstract class defining schema URIs for STAC core objects and extensions."""

    def __init__(self) -> None:
        pass

    @abstractmethod
    def get_object_schema_uri(
        self, object_type: STACObjectType, stac_version: str
    ) -> Optional[str]:
        """Get the schema URI for the given object type and stac version.

        Args:
            object_type : STAC object type. One of
                :class:`~pystac.STACObjectType`
            stac_version : The STAC version of the schema to return.

        Returns:
            str: The URI of the schema, or None if not found.
        """
        pass


class DefaultSchemaUriMap(SchemaUriMap):
    """Implementation of SchemaUriMap that uses schemas hosted by stacspec.org

    For STAC Versions 0.9.0 or earlier this will use the schemas hosted on the
    radiantearth/stac-spec GitHub repo.
    """

    # BASE_URIS contains a list of tuples, the first element is a version range and the
    # second being the base URI for schemas for that range. The schema URI of a STAC
    # for a particular version uses the base URI associated with the version range which
    # contains it. If the version is outside of any VersionRange, there is no URI for
    # the schema.
    BASE_URIS: List[Tuple[STACVersionRange, Callable[[str], str]]] = [
        (
            STACVersionRange(min_version="1.0.0-beta.1"),
            lambda version: "https://schemas.stacspec.org/v{}".format(version),
        ),
        (
            STACVersionRange(min_version="0.8.0", max_version="0.9.0"),
            lambda version: (
                f"https://raw.githubusercontent.com/radiantearth/stac-spec/v{version}"
            ),
        ),
    ]

    # DEFAULT_SCHEMA_MAP contains a structure that matches 'core' or 'extension' schema
    # URIs based on the stac object type and the stac version, using a similar
    # technique as BASE_URIS. URIs are contained in a tuple whose first element
    # represents the URI of the latest version, so that a search through version
    # ranges is avoided if the STAC being validated
    # is the latest version. If it's a previous version, the stac_version that matches
    # the listed version range is used, or else the URI from the latest version is used
    # if there are no overrides for previous versions.
    DEFAULT_SCHEMA_MAP: Dict[str, Any] = {
        STACObjectType.CATALOG: ("catalog-spec/json-schema/catalog.json", None),
        STACObjectType.COLLECTION: (
            "collection-spec/json-schema/collection.json",
            None,
        ),
        STACObjectType.ITEM: ("item-spec/json-schema/item.json", None),
        STACObjectType.ITEMCOLLECTION: (
            None,
            [
                (
                    STACVersionRange(min_version="v0.8.0-rc1", max_version="0.9.0"),
                    "item-spec/json-schema/itemcollection.json",
                ),
            ],
        ),
    }

    @classmethod
    def _append_base_uri_if_needed(cls, uri: str, stac_version: str) -> Optional[str]:
        # Only append the base URI if it's not already an absolute URI
        if "://" not in uri:
            base_uri = None
            for version_range, f in cls.BASE_URIS:
                if version_range.contains(stac_version):
                    base_uri = f(stac_version)
                    return "{}/{}".format(base_uri, uri)

            # We don't have JSON schema validation for this version of PySTAC
            return None
        else:
            return uri

    def get_object_schema_uri(
        self, object_type: STACObjectType, stac_version: str
    ) -> Optional[str]:
        uri = None
        is_latest = stac_version == pystac.get_stac_version()

        if object_type not in self.DEFAULT_SCHEMA_MAP:
            raise KeyError("Unknown STAC object type {}".format(object_type))

        uri = self.DEFAULT_SCHEMA_MAP[object_type][0]
        if not is_latest:
            if self.DEFAULT_SCHEMA_MAP[object_type][1]:
                for version_range, range_uri in self.DEFAULT_SCHEMA_MAP[object_type][1]:
                    if version_range.contains(stac_version):
                        uri = range_uri
                        break

        return self._append_base_uri_if_needed(uri, stac_version)
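The BASE_URIS lookup above can be illustrated without pystac. In this sketch the lambdas stand in for `STACVersionRange.contains` and compare dotted version strings lexically, which is adequate for these sample values (real STACVersionRange parsing is more careful); the names below are illustrative only.

```python
def pick_base_uri(base_uris, version):
    # walk the (range-predicate, uri-builder) pairs in order, first match wins
    for in_range, make_uri in base_uris:
        if in_range(version):
            return make_uri(version)
    return None  # no schema hosted for this version


BASE_URIS_SKETCH = [
    (lambda v: v >= "1.0.0", lambda v: f"https://schemas.stacspec.org/v{v}"),
    (lambda v: "0.8.0" <= v <= "0.9.0",
     lambda v: f"https://raw.githubusercontent.com/radiantearth/stac-spec/v{v}"),
]
```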
class OldExtensionSchemaUriMap:
    """Ties old extension IDs to schemas hosted by https://schemas.stacspec.org.

    For STAC Versions 0.9.0 or earlier this will use the schemas hosted on the
    radiantearth/stac-spec GitHub repo.
    """

    # BASE_URIS contains a list of tuples, the first element is a version range and the
    # second being the base URI for schemas for that range. The schema URI of a STAC
    # for a particular version uses the base URI associated with the version range which
    # contains it. If the version is outside of any VersionRange, there is no URI for
    # the schema.
    @classmethod
    @lru_cache()
    def get_base_uris(
        cls,
    ) -> List[Tuple[STACVersionRange, Callable[[STACVersionID], str]]]:
        return [
            (
                STACVersionRange(min_version="1.0.0-beta.1"),
                lambda version: f"https://schemas.stacspec.org/v{version}",
            ),
            (
                STACVersionRange(min_version="0.8.0", max_version="0.9.0"),
                lambda version: (
                    "https://raw.githubusercontent.com/"
                    f"radiantearth/stac-spec/v{version}"
                ),
            ),
        ]

    # DEFAULT_SCHEMA_MAP contains a structure that matches extension schema URIs
    # based on the stac object type, extension ID and the stac version.
    # URIs are contained in a tuple whose first element represents the URI of the latest
    # version, so that a search through version ranges is avoided if the STAC being
    # validated is the latest version. If it's a previous version, the stac_version
    # that matches the listed version range is used, or else the URI from the latest
    # version is used if there are no overrides for previous versions.
    @classmethod
    @lru_cache()
    def get_schema_map(cls) -> Dict[str, Any]:
        return {
            OldExtensionShortIDs.CHECKSUM.value: (
                {
                    pystac.STACObjectType.CATALOG: (
                        "extensions/checksum/json-schema/schema.json"
                    ),
                    pystac.STACObjectType.COLLECTION: (
                        "extensions/checksum/json-schema/schema.json"
                    ),
                    pystac.STACObjectType.ITEM: (
                        "extensions/checksum/json-schema/schema.json"
                    ),
                },
                None,
            ),
            OldExtensionShortIDs.COLLECTION_ASSETS.value: (
                {
                    pystac.STACObjectType.COLLECTION: (
                        "extensions/collection-assets/json-schema/schema.json"
                    )
                },
                None,
            ),
            OldExtensionShortIDs.DATACUBE.value: (
                {
                    pystac.STACObjectType.COLLECTION: (
                        "extensions/datacube/json-schema/schema.json"
                    ),
                    pystac.STACObjectType.ITEM: (
                        "extensions/datacube/json-schema/schema.json"
                    ),
                },
                [
                    (
                        STACVersionRange(min_version="0.5.0", max_version="0.9.0"),
                        {
                            pystac.STACObjectType.COLLECTION: None,
                            pystac.STACObjectType.ITEM: None,
                        },
                    )
                ],
            ),
            OldExtensionShortIDs.EO.value: (
                {pystac.STACObjectType.ITEM: "extensions/eo/json-schema/schema.json"},
                None,
            ),
            OldExtensionShortIDs.ITEM_ASSETS.value: (
                {
                    pystac.STACObjectType.COLLECTION: (
                        "extensions/item-assets/json-schema/schema.json"
                    )
                },
                None,
            ),
            OldExtensionShortIDs.LABEL.value: (
                {
                    pystac.STACObjectType.ITEM: (
                        "extensions/label/json-schema/schema.json"
                    )
                },
                [
                    (
                        STACVersionRange(min_version="0.8.0-rc1", max_version="0.8.1"),
                        {pystac.STACObjectType.ITEM: "extensions/label/schema.json"},
                    )
                ],
            ),
            OldExtensionShortIDs.POINTCLOUD.value: (
                # Invalid schema
                None,
                None,
            ),
            OldExtensionShortIDs.PROJECTION.value: (
                {
                    pystac.STACObjectType.ITEM: (
                        "extensions/projection/json-schema/schema.json"
                    )
                },
                None,
            ),
            OldExtensionShortIDs.SAR.value: (
                {pystac.STACObjectType.ITEM: "extensions/sar/json-schema/schema.json"},
                None,
            ),
            OldExtensionShortIDs.SAT.value: (
                {pystac.STACObjectType.ITEM: "extensions/sat/json-schema/schema.json"},
                None,
            ),
            OldExtensionShortIDs.SCIENTIFIC.value: (
                {
                    pystac.STACObjectType.ITEM: (
                        "extensions/scientific/json-schema/schema.json"
                    ),
                    pystac.STACObjectType.COLLECTION: (
                        "extensions/scientific/json-schema/schema.json"
                    ),
                },
                None,
            ),
            OldExtensionShortIDs.SINGLE_FILE_STAC.value: (
                {
                    pystac.STACObjectType.CATALOG: (
                        "extensions/single-file-stac/json-schema/schema.json"
                    )
                },
                None,
            ),
            OldExtensionShortIDs.TILED_ASSETS.value: (
                {
                    pystac.STACObjectType.CATALOG: (
                        "extensions/tiled-assets/json-schema/schema.json"
                    ),
                    pystac.STACObjectType.COLLECTION: (
                        "extensions/tiled-assets/json-schema/schema.json"
                    ),
                    pystac.STACObjectType.ITEM: (
                        "extensions/tiled-assets/json-schema/schema.json"
                    ),
                },
                None,
            ),
            OldExtensionShortIDs.TIMESTAMPS.value: (
                {
                    pystac.STACObjectType.ITEM: (
                        "extensions/timestamps/json-schema/schema.json"
                    )
                },
                None,
            ),
            OldExtensionShortIDs.VERSION.value: (
                {
                    pystac.STACObjectType.ITEM: (
                        "extensions/version/json-schema/schema.json"
                    ),
                    pystac.STACObjectType.COLLECTION: (
                        "extensions/version/json-schema/schema.json"
                    ),
                },
                None,
            ),
            OldExtensionShortIDs.VIEW.value: (
                {pystac.STACObjectType.ITEM: "extensions/view/json-schema/schema.json"},
                None,
            ),
            # Removed or renamed extensions.
            "dtr": (None, None),  # Invalid schema
            "asset": (
                None,
                [
                    (
                        STACVersionRange(min_version="0.8.0-rc1", max_version="0.9.0"),
                        {
                            pystac.STACObjectType.COLLECTION: (
                                "extensions/asset/json-schema/schema.json"
                            )
                        },
                    )
                ],
            ),
        }

    @classmethod
    def _append_base_uri_if_needed(
        cls, uri: str, stac_version: STACVersionID
    ) -> Optional[str]:
        # Only append the base URI if it's not already an absolute URI
        if "://" not in uri:
            base_uri = None
            for version_range, f in cls.get_base_uris():
                if version_range.contains(stac_version):
                    base_uri = f(stac_version)
                    return "{}/{}".format(base_uri, uri)

            # No JSON Schema for the old extension
            return None
        else:
            return uri

    @classmethod
    def get_extension_schema_uri(
        cls, extension_id: str, object_type: STACObjectType, stac_version: STACVersionID
    ) -> Optional[str]:
        uri = None

        is_latest = stac_version == pystac.get_stac_version()
        ext_map = cls.get_schema_map()
        if extension_id in ext_map:
            if ext_map[extension_id][0] and object_type in ext_map[extension_id][0]:
                uri = ext_map[extension_id][0][object_type]

            if not is_latest:
                if ext_map[extension_id][1]:
                    for version_range, ext_uris in ext_map[extension_id][1]:
                        if version_range.contains(stac_version):
                            if object_type in ext_uris:
                                uri = ext_uris[object_type]
                                break

        if uri is None:
            return uri
        else:
            return cls._append_base_uri_if_needed(uri, stac_version)


# --- Python/mhd_utils.py (matthew-brett/diffusion_mri, MIT) ---
#!/usr/bin/env python
#coding=utf-8
#======================================================================
#Program: Diffusion Weighted MRI Reconstruction
#Module: $RCSfile: mhd_utils.py,v $
#Language: Python
#Author: $Author: bjian $
#Date: $Date: 2008/10/27 05:55:55 $
#Version: $Revision: 1.1 $
#======================================================================
import os
import numpy
import array
from utils import *
def read_meta_header(filename):
    """Return a dictionary of meta data from meta header file"""
    fileIN = open(filename, "r")
    line = fileIN.readline()

    meta_dict = {}
    tag_set1 = ['ObjectType', 'NDims', 'DimSize', 'ElementType', 'ElementDataFile']
    tag_set2 = ['BinaryData', 'BinaryDataByteOrderMSB', 'CompressedData', 'CompressedDataSize']
    tag_set3 = ['Offset', 'CenterOfRotation', 'AnatomicalOrientation', 'ElementSpacing', 'TransformMatrix']
    tag_set4 = ['Comment', 'SeriesDescription', 'AcquisitionDate', 'AcquisitionTime', 'StudyDate', 'StudyTime']
    tag_set = []
    tag_set.extend(tag_set1)
    tag_set.extend(tag_set2)
    tag_set.extend(tag_set3)
    tag_set.extend(tag_set4)
    tag_flag = [False] * len(tag_set)
    while line:
        tags = str.split(line, '=')
        # print tags[0]
        for i in range(len(tag_set)):
            tag = tag_set[i]
            if (str.strip(tags[0]) == tag) and (not tag_flag[i]):
                # print tags[1]
                meta_dict[tag] = str.strip(tags[1])
                tag_flag[i] = True
        line = fileIN.readline()
    # print comment
    fileIN.close()
    return meta_dict


def load_raw_data_with_mhd(filename):
    meta_dict = read_meta_header(filename)
    dim = int(meta_dict['NDims'])
    assert(meta_dict['ElementType'] == 'MET_FLOAT')
    arr = [int(i) for i in meta_dict['DimSize'].split()]
    volume = reduce(lambda x, y: x * y, arr[0:dim - 1], 1)
    pwd = os.path.split(filename)[0]
    if pwd:
        data_file = pwd + '/' + meta_dict['ElementDataFile']
    else:
        data_file = meta_dict['ElementDataFile']
    # print data_file
    fid = open(data_file, 'rb')
    binvalues = array.array('f')
    binvalues.read(fid, volume * arr[dim - 1])
    if is_little_endian():  # assume data in file is always big endian
        binvalues.byteswap()
    fid.close()
    data = numpy.array(binvalues, numpy.float)
    data = numpy.reshape(data, (arr[dim - 1], volume))
    return (data, meta_dict)


def write_meta_header(filename, meta_dict):
    header = ''
    # do not use tags = meta_dict.keys() because the order of tags matters
    tags = ['ObjectType', 'NDims', 'BinaryData',
            'BinaryDataByteOrderMSB', 'CompressedData', 'CompressedDataSize',
            'TransformMatrix', 'Offset', 'CenterOfRotation',
            'AnatomicalOrientation',
            'ElementSpacing',
            'DimSize',
            'ElementType',
            'ElementDataFile',
            'Comment', 'SeriesDescription', 'AcquisitionDate', 'AcquisitionTime', 'StudyDate', 'StudyTime']
    for tag in tags:
        if tag in meta_dict.keys():
            header += '%s = %s\n' % (tag, meta_dict[tag])
    f = open(filename, 'w')
    f.write(header)
    f.close()


def dump_raw_data(filename, data):
    """ Write the data into a raw format file. Big endian is always used. """
    rawfile = open(filename, 'wb')
    a = array.array('f')
    for o in data:
        a.fromlist(list(o))
    if is_little_endian():
        a.byteswap()
    a.tofile(rawfile)
    rawfile.close()


def write_mhd_file(mhdfile, data, dsize):
    assert(mhdfile[-4:] == '.mhd')
    meta_dict = {}
    meta_dict['ObjectType'] = 'Image'
    meta_dict['BinaryData'] = 'True'
    meta_dict['BinaryDataByteOrderMSB'] = 'False'
    meta_dict['ElementType'] = 'MET_FLOAT'
    meta_dict['NDims'] = str(len(dsize))
    meta_dict['DimSize'] = ' '.join([str(i) for i in dsize])
    meta_dict['ElementDataFile'] = os.path.split(mhdfile)[1].replace('.mhd', '.raw')
    write_meta_header(mhdfile, meta_dict)
    pwd = os.path.split(mhdfile)[0]
    if pwd:
        data_file = pwd + '/' + meta_dict['ElementDataFile']
    else:
        data_file = meta_dict['ElementDataFile']
    dump_raw_data(data_file, data)
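For illustration, the `Tag = value` header format handled by `read_meta_header` / `write_meta_header` above can be parsed from an in-memory string with the same split-and-strip logic; this small sketch is not part of the module.

```python
def parse_mhd_text(text):
    # same "Tag = value" convention as read_meta_header, one tag per line
    meta = {}
    for line in text.splitlines():
        if '=' in line:
            key, _, value = line.partition('=')
            meta[key.strip()] = value.strip()
    return meta
```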


# --- src/support/gan_model.py (nipdep/STGAN, MIT) ---
# To add a new cell, type '# %%'
# To add a new markdown cell, type '# %% [markdown]'
# %%
import time
import random
import tensorflow as tf
import numpy as np
from numpy import load
from numpy import zeros
from numpy import ones
from numpy.random import randint
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.initializers import RandomNormal
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D, Flatten, Dense, Conv2DTranspose, LeakyReLU, Activation, Dropout, BatchNormalization, ReLU, LeakyReLU, Concatenate
from tensorflow.keras import losses
from tensorflow.keras import metrics
from matplotlib import pyplot
from tensorflow.python.autograph.pyct import transformer
# %%
def define_encoder_block(layer_in, n_filters, batchnorm=True):
    init = RandomNormal(stddev=0.02)
    g = Conv2D(n_filters, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(layer_in)
    if batchnorm:
        g = BatchNormalization()(g, training=True)
    g = LeakyReLU(alpha=0.2)(g)
    return g


def define_decoder_block(layer_in, skip_in, n_filters, dropout=True):
    init = RandomNormal(stddev=0.02)
    g = Conv2DTranspose(n_filters, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(layer_in)
    g = BatchNormalization()(g, training=True)
    if dropout:
        g = Dropout(0.4)(g, training=True)
    g = Concatenate()([g, skip_in])
    g = ReLU()(g)
    return g
# %%
def defing_generator(image_shape=(128, 128, 3)):
    init = RandomNormal(stddev=0.02)
    content_image = Input(shape=image_shape)
    style_image = Input(shape=image_shape)
    # stack content and style images
    stacked_layer = Concatenate()([content_image, style_image])
    # encoder model
    e1 = define_encoder_block(stacked_layer, 64, batchnorm=False)
    e2 = define_encoder_block(e1, 128)
    e3 = define_encoder_block(e2, 256)
    e4 = define_encoder_block(e3, 512)
    e5 = define_encoder_block(e4, 512)
    e6 = define_encoder_block(e5, 512)
    # e7 = define_encoder_block(e6, 512)
    # bottleneck layer (kernel_initializer, not kernel_constraint)
    b = Conv2D(512, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(e6)
    b = ReLU()(b)
    # decoder model
    # d1 = define_decoder_block(b, e7, 512)
    d2 = define_decoder_block(b, e6, 512)
    d3 = define_decoder_block(d2, e5, 512)
    d4 = define_decoder_block(d3, e4, 512, dropout=False)
    d5 = define_decoder_block(d4, e3, 256, dropout=False)
    d6 = define_decoder_block(d5, e2, 128, dropout=False)
    d7 = define_decoder_block(d6, e1, 64, dropout=False)
    # output layer
    g = Conv2DTranspose(3, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(d7)
    out_image = Activation('tanh')(g)
    model = Model(inputs=[content_image, style_image], outputs=out_image)
    return model
#%%
g_model = defing_generator()
tf.keras.utils.plot_model(g_model, show_shapes=True)
# %%
def define_cnt_descriminator(image_shape=(128, 128, 3)):
    init = RandomNormal(stddev=0.02)
    # content image input
    in_cnt_image = Input(shape=image_shape)
    # transfer image input
    in_tr_image = Input(shape=image_shape)
    # concatenate images channel-wise
    merged = Concatenate()([in_cnt_image, in_tr_image])
    # c64
    d = Conv2D(64, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(merged)
    d = LeakyReLU(alpha=0.2)(d)
    # c128
    d = Conv2D(128, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(d)
    d = BatchNormalization()(d)
    d = LeakyReLU(alpha=0.2)(d)
    # c256
    d = Conv2D(256, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(d)
    d = BatchNormalization()(d)
    d = LeakyReLU(alpha=0.2)(d)
    # c512
    d = Conv2D(512, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(d)
    d = BatchNormalization()(d)
    d = LeakyReLU(alpha=0.2)(d)
    # patch output
    d = Conv2D(512, (4, 4), strides=(2, 2), padding='same', kernel_initializer=init)(d)
    patch_out = Activation('sigmoid')(d)
    # define model
    model = Model(inputs=[in_cnt_image, in_tr_image], outputs=patch_out)
    return model
dsc_model = define_cnt_descriminator()
tf.keras.utils.plot_model(dsc_model, show_shapes=True)
# %%
def define_style_descrminator(image_size=(128, 128, 3)):
    init = RandomNormal(stddev=0.02)
    input_img = Input(shape=image_size)
    # C64
    d = Conv2D(64, (4, 4), (4, 4), padding='SAME', kernel_initializer=init)(input_img)
    d = LeakyReLU(alpha=0.2)(d)
    # C128
    d = Conv2D(128, (4, 4), (4, 4), padding='SAME', kernel_initializer=init)(d)
    d = BatchNormalization()(d)
    d = LeakyReLU(alpha=0.2)(d)
    # C256
    d = Conv2D(256, (4, 4), (4, 4), padding='SAME', kernel_initializer=init)(d)
    d = BatchNormalization()(d)
    d = LeakyReLU(alpha=0.2)(d)
    # flatten
    flt = Flatten()(d)
    # linear logits layer
    output = Dense(1)(flt)
    # build and compile the model
    model = Model(inputs=input_img, outputs=output, name='style_descriminator')
    return model
dss_model = define_style_descrminator()
tf.keras.utils.plot_model(dss_model, show_shapes=True)
# %%
def define_gan(g_model, dc_model, ds_model, image_shape=(128, 128, 3)):
    for layer in dc_model.layers:
        if not isinstance(layer, BatchNormalization):
            layer.trainable = False
    for layer in ds_model.layers:
        if not isinstance(layer, BatchNormalization):
            layer.trainable = False
    # input layers for GAN model
    cnt_img = Input(shape=image_shape)
    style_img = Input(shape=image_shape)
    # generator model
    gen_out = g_model([cnt_img, style_img])
    # style descriminator model
    dss_out = ds_model(style_img)
    dst_out = ds_model(gen_out)
    # content descriminator model
    cnt_out = dc_model([cnt_img, gen_out])
    model = Model(inputs=[cnt_img, style_img], outputs=[gen_out, dss_out, dst_out, cnt_out])
    return model
gan_model = define_gan(g_model, dsc_model, dss_model)
tf.keras.utils.plot_model(gan_model, show_shapes=True)
#%%
def pairWiseRankingLoss(y_ref, y_style, label):
    m = tf.cast(tf.broadcast_to(1, shape=y_ref.shape), dtype=tf.float32)
    u = tf.cast(tf.broadcast_to(0, shape=y_ref.shape), dtype=tf.float32)
    i = tf.cast(tf.broadcast_to(1, shape=y_ref.shape), dtype=tf.float32)
    y = tf.cast(label[..., tf.newaxis], dtype=tf.float32)
    dist = tf.norm(y_ref - y_style, ord='euclidean', axis=-1, keepdims=True)
    loss = y * dist + (i - y) * tf.reduce_max(tf.stack([u, m - dist]), axis=0)
    return tf.reduce_mean(loss)
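The same margin-based ranking loss, written for a single pair of plain-Python feature vectors (margin m = 1, as in the tensor version above): matching pairs (label 1) are penalized by their distance, while non-matching pairs (label 0) are penalized only while they fall inside the margin. This is an illustrative sketch, not part of the training code.

```python
import math

def ranking_loss_pair(a, b, label, m=1.0):
    # label == 1: pull a matching pair together (loss = distance)
    # label == 0: push a non-matching pair at least margin m apart
    d = math.dist(a, b)
    return label * d + (1 - label) * max(0.0, m - d)
```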
dscLoss = tf.keras.losses.BinaryCrossentropy(from_logits=True)  # the class, not the binary_crossentropy function, takes from_logits at construction
cntLoss = tf.keras.losses.KLDivergence()
gan_opt = tf.keras.optimizers.Adamax(lr=0.002)
gan_alpha = 0.6
gan_beta = 0.5
#%%
@tf.function
def gan_train_step(ref_in, style_in, trans_in, cnt_true, style_true):
    with tf.GradientTape() as tape:
        gen_out, dss_out, dst_out, cnt_out = gan_model([ref_in, style_in])  # multi-input model expects a list
        dss_loss = pairWiseRankingLoss(dss_out, dst_out, style_true)
        dsc_loss = dscLoss(cnt_true, cnt_out)  # Keras losses take (y_true, y_pred)
        gen_loss = cntLoss(trans_in, gen_out)
        total_loss = gan_alpha * (gan_beta * dss_loss + (1 - gan_beta) * dsc_loss) + (1 - gan_alpha) * gen_loss
    grads = tape.gradient(total_loss, gan_model.trainable_weights)
    gan_opt.apply_gradients(zip(grads, gan_model.trainable_weights))
    return total_loss, dss_loss, dsc_loss
#%%
ds_model = define_style_descrminator() #TODO
ds_opt = tf.keras.optimizers.Adam(lr=0.02)
#%%
@tf.function
def ds_train_step(style_in, trans_in, label_in):
    with tf.GradientTape() as tape:
        ref_out = ds_model(style_in)
        trans_out = ds_model(trans_in)
        loss = pairWiseRankingLoss(ref_out, trans_out, label_in)
    grads = tape.gradient(loss, ds_model.trainable_weights)
    ds_opt.apply_gradients(zip(grads, ds_model.trainable_weights))
    # train_metrics.update_state(ref_out, style_out, label_in)
    return loss
#%%
dc_model = define_cnt_descriminator()
dc_opt = tf.keras.optimizers.Adam(lr=0.02)
#%%
@tf.function
def dc_train_step(cnt_in, trans_in, label_in):
    with tf.GradientTape() as tape:
        logits = dc_model([cnt_in, trans_in])  # multi-input model expects a list
        # patch-level real/fake loss; pairWiseRankingLoss takes three arguments,
        # so the binary cross-entropy used elsewhere is applied here instead
        loss = dscLoss(label_in, logits)
    grads = tape.gradient(loss, dc_model.trainable_weights)
    dc_opt.apply_gradients(zip(grads, dc_model.trainable_weights))
    # train_metrics.update_state(ref_out, style_out, label_in)
    return loss
#%%
def load_pixel_metrics(filename):
    full_mat = np.load(filename)
    style_pixels = (full_mat['style'] - 127.5) / 127.5
    content_pixels = (full_mat['cotent'] - 127.5) / 127.5  # key name as stored in the archive
    transfer_mat = (full_mat['transfers'] - 127.5) / 127.5
    return style_pixels, content_pixels, transfer_mat


def generate_real_samples(dataset, n_samples, patch_shape):
    style, content, trans = dataset
    cnt_idxs = random.sample(range(style.shape[1]), n_samples)
    style_idxs = np.random.randint(0, style.shape[0], n_samples)
    cnt_pixels = content[cnt_idxs]
    style_pixels = style[style_idxs]
    mat_pixels = trans[style_idxs, cnt_idxs, ...]
    y_dc = ones((n_samples, patch_shape, patch_shape, 1))
    y_ds = ones((n_samples))
    return [cnt_pixels, style_pixels, mat_pixels], y_dc, y_ds


def generate_fake_samples(g_model, samples, patch_shape):
    cnt_img, style_img = samples
    X = g_model.predict([cnt_img, style_img])  # multi-input model expects a list
    y_dc = zeros((len(X), patch_shape, patch_shape, 1))
    y_ds = zeros((len(X)))
    return X, y_dc, y_ds
#%%
def train(g_model, ds_model, dc_model, gan_model, dataset, n_epoch=100, batch_size=2):
    n_patch = dc_model.output_shape[1]
    batch_per_epoch = len(dataset[1]) // batch_size
    n_steps = n_epoch * batch_per_epoch
    for i in range(n_steps):
        [X_cnt, X_stl, X_trn], ydc_real, yds_real = generate_real_samples(dataset, batch_size, n_patch)
        X_fake_trn, ydc_fake, yds_fake = generate_fake_samples(g_model, [X_cnt, X_stl], n_patch)
        # train style descriminator
        ds_loss1 = ds_train_step(X_stl, X_trn, yds_real)
        ds_loss2 = ds_train_step(X_stl, X_fake_trn, yds_fake)
        # train content descriminator
        dc_loss1 = dc_train_step(X_cnt, X_trn, ydc_real)
        dc_loss2 = dc_train_step(X_cnt, X_fake_trn, ydc_fake)
        # train GAN model
        gan_total_loss, gan_dss_loss, gan_dsc_loss = gan_train_step(X_cnt, X_stl, X_fake_trn, ydc_real, yds_real)
#%%


# --- ssis/views/courses/routes.py (edenroseFR/Student-Information-System, MIT) ---
from flask import request, render_template, redirect, flash
from flask.helpers import url_for
from ssis.models.student import Student
from ssis.models.course import Course
from ssis.models.college import College
from .utils import add_course_to_db, update_course_record, check_page_limit, check_limit_validity
from . import course
from math import ceil
current_page = 1
course_limit = 5
@course.route('/courses')
def courses() -> str:
    global course_limit
    min_page = request.args.get('min_page')
    max_page = request.args.get('max_page')
    page_limit = check_page_limit(min_page, max_page)

    course_count = str(Course().get_total())
    entered_limit = request.args.get('limit-field')
    if entered_limit == course_count:
        page_limit = 'min-and-max'
    try:
        course_limit = check_limit_validity(int(entered_limit), int(course_count))
    except (TypeError, ValueError):
        course_limit = course_limit  # keep the previous limit when no valid value was entered
    students = Student().get_all(paginate=False)
    courses = Course().get_all(current_page, course_limit)
    colleges = College().get_all(paginate=False)
    return render_template('/course/courses.html',
                           data=[students, courses, colleges],
                           datacount=course_count,
                           course_limit=course_limit,
                           limit=page_limit)
@course.route('/courses/next', methods=['GET', 'POST'])
def next() -> str:
    global current_page
    course_count = Course().get_total()
    current_page += 1
    limit_page = ceil(course_count / course_limit)
    max_page_reached = current_page == limit_page
    if not max_page_reached:
        return redirect(url_for('course.courses', page_num=current_page))
    else:
        return redirect(url_for('course.courses', page_num=current_page, max_page=True))
@course.route('/courses/prev', methods=['GET', 'POST'])
def prev() -> str:
global current_page
max_page_reached = current_page == 1
if not max_page_reached:
current_page -= 1
return redirect(url_for('course.courses', page_num=current_page))
else:
current_page = 1
return redirect(url_for('course.courses', page_num=current_page, min_page=True))
@course.route('/course/add', methods=['GET', 'POST'])
def add() -> str:
if request.method == 'POST':
course = {
'code': request.form.get('course-code'),
'name': request.form.get('course-name'),
'college': request.form.get('course-college')
}
add_course_to_db(course)
flash(f'{course["code"]} added succesfully!', 'info')
return redirect(url_for('course.courses'))
else:
return redirect(url_for('course.courses'))
@course.route('/courses/search', methods=['GET', 'POST'])
def search() -> str:
user_input = request.form.get('user-input')
field = request.form.get('field')
if field == 'select':
result = Course().search(keyword=user_input)
elif field == 'code':
result = Course().search(keyword=user_input, field='code')
elif field == 'name':
result = Course().search(keyword=user_input, field='name')
elif field == 'college':
result = Course().search(keyword=user_input, field='college')
else:
result = []
if len(result) != 0:
return render_template('/course/courses.html',
data=['', result],
datacount = str(len(result)),
course_limit = '5')
else:
flash(f'No course found', 'info')
return render_template('/course/courses.html',
data=['', result],
datacount = str(len(result)),
course_limit = '5')
@course.route('/courses/delete/<string:id>')
def delete(id: str) -> str:
try:
Course().delete(id)
flash(f'{id} deleted from the database.', 'info')
return redirect(url_for('course.courses'))
except:
flash(f'{id} cannot be deleted. Students are enrolled in this program', 'info')
return redirect(url_for('course.courses'))
@course.route('/courses/update/<string:id>', methods=['GET', 'POST'])
def update(id: str) -> str:
if request.method == 'POST':
course = {
'code': id,
'name': request.form.get('course-name'),
'college': request.form.get('course-college')
}
update_course_record(course)
flash(f"{id} has been updated succesfully!", 'info')
return redirect(url_for('course.courses'))
else:
return redirect(url_for('course.courses')) | 34.614815 | 97 | 0.61438 | 563 | 4,673 | 4.920071 | 0.181172 | 0.051625 | 0.061372 | 0.072202 | 0.430686 | 0.412996 | 0.368953 | 0.274007 | 0.274007 | 0.249097 | 0 | 0.002568 | 0.249947 | 4,673 | 135 | 98 | 34.614815 | 0.787732 | 0 | 0 | 0.362832 | 0 | 0 | 0.157253 | 0.011553 | 0 | 0 | 0 | 0 | 0 | 1 | 0.061947 | false | 0 | 0.070796 | 0 | 0.247788 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
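The `next()` handler computes the last page index with `ceil(total / per_page)`; a standalone sketch of that page-count arithmetic (the counts here are hypothetical, not from the app):

```python
from math import ceil

# Hypothetical totals mirroring the /courses/next handler.
course_count = 12   # total course records
course_limit = 5    # records shown per page

# Last reachable page index, as computed in next().
limit_page = ceil(course_count / course_limit)
print(limit_page)  # 3  (pages hold 5, 5 and 2 records)
```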
# ==== src/ml-auto-sklearn/run.py (repo: altermarkive/Resurrecting-JimFleming-Numerai, MIT) ====
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import autosklearn.classification
import numpy
import os
import pandas
import sys

def ingest():
    training_data = pandas.read_csv(os.getenv('TRAINING'), header=0)
    tournament_data = pandas.read_csv(os.getenv('TESTING'), header=0)
    features = [f for f in list(training_data) if 'feature' in f]
    x = training_data[features]
    y = training_data['target']
    x_tournament = tournament_data[features]
    ids = tournament_data['id']
    return (x, y, x_tournament, ids)


def train(x, y):
    model = autosklearn.classification.AutoSklearnClassifier(
        time_left_for_this_task=int(os.getenv('TIME_LIMIT_ALL', '3600')),
        per_run_time_limit=int(os.getenv('TIME_LIMIT_PART', '360')))
    model.fit(x, y)
    print(model.show_models())
    return model


def predict(model, x_tournament, ids):
    eps = sys.float_info.epsilon
    y_prediction = model.predict_proba(x_tournament)
    results = numpy.clip(y_prediction[:, 1], 0.0 + eps, 1.0 - eps)
    results_df = pandas.DataFrame(data={'probability': results})
    joined = pandas.DataFrame(ids).join(results_df)
    joined.to_csv(os.getenv('PREDICTING'), index=False, float_format='%.16f')


def main():
    x, y, x_tournament, ids = ingest()
    model = train(x, y)
    predict(model, x_tournament.copy(), ids)


if __name__ == '__main__':
    main()
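The `predict` helper clips probabilities away from exactly 0 and 1 by machine epsilon before writing them out (log-loss scoring diverges at the endpoints). A standalone sketch of that clipping step, with hypothetical raw probabilities:

```python
import sys
import numpy

# Hypothetical raw model probabilities, including the exact endpoints 0.0 and 1.0.
raw = numpy.array([0.0, 0.25, 0.5, 1.0])

# Clip into the open interval (0, 1) by machine epsilon, as predict() does.
eps = sys.float_info.epsilon
clipped = numpy.clip(raw, 0.0 + eps, 1.0 - eps)

print(clipped.min() > 0.0, clipped.max() < 1.0)  # True True
```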
# ==== genens/gp/types.py (repo: gabrielasuchopar/genens, MIT) ====
# -*- coding: utf-8 -*-
"""
This module defines the structure of GP primitives.
The GP primitives are nodes with typed edges (parent input and child output types must match) and variable
arity (for a given type, its final arity can be chosen during the evolution process).
A ``GpPrimitive`` is a node whose types, arities and keyword arguments have been decided. To create such primitives,
it is possible to take use of templates. These contain possible values of arities, types and keyword arguments and
methods for choosing final values.
The primitive templates defined in this file are
1) functions - inner nodes of the tree, transform input into output
2) terminals - leaves of the tree, provide constant output.
"""
import functools
import random
from copy import deepcopy
from deap import base
from typing import Callable, List, Any, Dict, Union, Tuple
class GpTreeIndividual:
"""Represents a tree individual used in GP.
The individual is a tree encoded as a list of ``GpPrimitive`` nodes.
The list is a post-order representation of the tree. The tree can be
uniquely reconstructed using the arity (and types) of primitives.
"""
def __init__(self, prim_list: List['GpPrimitive'], max_height: int):
"""
Construct a GP tree from a list of primitives.
Args:
prim_list: list of GpPrimitive prim_list: Post-order representation of the tree.
max_height: Height of the tree - maximum of all node depths + 1
"""
self.primitives = prim_list
self.max_height = max_height
self.validate_tree()
def __deepcopy__(self, memo=None):
if memo is None:
memo = {}
new = object.__new__(type(self))
memo[id(self)] = new
new.__dict__.update(deepcopy(self.__dict__, memo))
return new
def __eq__(self, other):
if not isinstance(other, GpTreeIndividual):
return False
if self.primitives != other.primitives:
return False
return True
def __repr__(self):
return f'GpTreeIndividual(height={self.max_height} primitives={self.primitives.__repr__()})'
def run_tree(self, node_func, group_children: bool = False) -> Any:
"""
Applies a function with the signature ``func(node, child_list)`` on all
nodes in the tree. The tree is traversed in post-order.
The arguments of the function are a node and list of result values of its child nodes.
:param node_func: Function which is applied on all nodes of the tree.
:return: Return value of the root.
"""
stack = []
for node in self.primitives:
if node.arity == 0:
stack.append(node_func(node, []))
else:
args = stack[-node.arity:]
stack = stack[:-node.arity]
if group_children:
children_by_type = []
for t in node.in_type:
t_children, args = args[-t.arity:], args[:-t.arity]
children_by_type.append((t.name, t_children))
args = children_by_type
stack.append(node_func(node, args))
if len(stack) > 1:
raise ValueError("Bad tree - invalid primitive list.")
return stack.pop()
def subtree(self, root_ind: int) -> Tuple[int, int]:
"""
Returns the start position of the subtree with primitive `self.primitives[root_ind]` as root. Note
that the returned value is in fact the index of one of the leaves, as the node list is post-order.
As so, the whole subtree is extracted with `self.primitives[subtree(root_ind), root_ind + 1]`.
Args:
root_ind: Position of the root (index of the beginning).
Returns: A tuple `(s, h)`, where `s` is the start index and `h` is the height of the subtree.
"""
curr = self.primitives[root_ind]
arity_rem = curr.arity
init_h = curr.depth
max_h = init_h
while arity_rem > 0:
root_ind = root_ind - 1
curr = self.primitives[root_ind]
max_h = max(max_h, curr.depth)
arity_rem = arity_rem - 1 + curr.arity
return root_ind, (max_h - init_h + 1)
def validate_tree(self):
"""
Validates the tree, raises an exception if its children are invalid.
"""
def validate_node(node, child_list):
if node.arity != len(child_list):
raise ValueError("Invalid number of children.")
child_id = 0
for in_type in node.in_type:
for i in range(in_type.arity):
child = child_list[child_id + i]
if child.out_type != in_type.name:
raise ValueError("Invalid child type.")
if child.depth != node.depth + 1:
raise ValueError("Invalid child height.")
child_id += in_type.arity
return node
self.run_tree(validate_node)
if self.max_height != max(prim.depth + 1 for prim in self.primitives):
raise ValueError("Invalid tree height.")
class DeapTreeIndividual(GpTreeIndividual):
"""
Represents an individual with DEAP-compatible fitness.
"""
def __init__(self, prim_list: List['GpPrimitive'], max_height: int):
super().__init__(prim_list, max_height)
self.fitness = DeapTreeIndividual.Fitness() # (score, log(evaluation_time))
self.compiled_pipe = None
class Fitness(base.Fitness):
def __init__(self, values=()):
self.weights = (1.0, -1.0)
super().__init__(values)
def reset(self):
del self.fitness.values
self.compiled_pipe = None
class GpPrimitive:
"""
Represents a typed node in the GP tree.
Its name and keyword dictionary hold information about the function
or object, which is associated with the node.
"""
def __init__(self,
name: str,
obj_kwargs: Dict[str, Any],
in_type: List['GpPrimitive.InputType'],
out_type: str,
arity: int,
depth: int):
"""
Creates an instance of a GP tree node. The number and output types as well as the ordering of its children
is specified by `in_type`.
Args:
name: Name of the node.
obj_kwargs: Keyword arguments associated with the node.
in_type:
List of input types with arity. The subtypes are ordered - e.g. [('data', 2), ('ens', 1)] is not the
same as [('ens', 1), ('data', 2)].
arity: Sum of arity of subtypes.
depth: Depth of the node.
"""
self.name = name
self.obj_kwargs = obj_kwargs
self.in_type = in_type
self.out_type = out_type
self.arity = arity
self.depth = depth
def __deepcopy__(self, memo=None):
if memo is None:
memo = {}
new = object.__new__(type(self))
memo[id(self)] = new
new.__dict__.update(deepcopy(self.__dict__, memo))
return new
def __eq__(self, other):
if not isinstance(other, GpPrimitive):
return False
if self.name != other.name:
return False
if self.arity != other.arity or self.in_type != other.in_type or self.out_type != other.out_type:
return False
if self.obj_kwargs != other.obj_kwargs:
return False
return True
def __repr__(self):
return 'GpPrimitive(name=' + self.name + ", arity={}".format(self.arity)\
+ ", height={})".format(self.depth)
class InputType:
"""
Represents the input type of a primitive. It determines how many children with a specific output type
should the node have.
"""
def __init__(self, name: str, arity: int):
"""
Construct a new instance of input type with arity.
:param name: Name of the type.
:param arity: Arity of this type.
"""
self.name = name
self.arity = arity
def __eq__(self, other):
if not isinstance(other, GpPrimitive.InputType):
return False
if self.name != other.name or self.arity != other.arity:
return False
return True
class GpTerminalTemplate:
"""
Represents a terminal of the GP tree, or a primitive with no inputs.
The output type is fixed.
The keyword arguments are chosen from lists of possible values.
"""
def __init__(self, name: str, out_type: str, group: str = None):
"""
Creates a new instance of a terminal template.
Args:
name: Name of the node.
out_type: Name of the output type.
"""
self.name = name
self.type_arity_template = []
self.out_type = out_type
self.group = group
def __repr__(self):
return f"GpTerminalTemplate: {self.name} - {self.group}"
def create_primitive(self, curr_height: int, max_arity: int, kwargs_dict: Dict[str, List[Any]]) -> GpPrimitive:
"""
Creates an instance of a `GpPrimitive` from the template.
Selects keyword arguments from `kwargs_dict`. For every key,
the dict contains a list of possible values.
Args:
curr_height: Height at which the node is generated.
max_arity: Only for compatibility, not used for terminals.
kwargs_dict: Dictionary which contains possible keyword argument values.
Return: A new instance of `GpPrimitive`.
"""
prim_kwargs = _choose_kwargs(kwargs_dict)
return GpPrimitive(self.name, prim_kwargs, [], self.out_type, 0, curr_height)
class TypeArity:
"""
Represents a variable node arity associated with a type.
"""
def __init__(self,
prim_type: str,
arity_template: Union[int, Tuple[int, int], Tuple[int, str]]):
"""
Constructs a new instance of a type template - with a fixed type, but possibly variable arity.
Args:
prim_type: Name of the type.
arity_template: Either a fixed arity value, or a bounded interval (a, b) where a and b are integers,
or an interval (a, 'n') that has only a lower bound a ('n' is a string).
"""
self.prim_type = prim_type
self.arity_template = arity_template
# check arity range
if isinstance(self.arity_template, tuple):
lower_invalid = self.arity_template[0] < 0
upper_invalid = self.arity_template[1] != 'n' and self.arity_template[0] > self.arity_template[1]
if lower_invalid or upper_invalid:
raise ValueError("Invalid arity range.")
# check fixed arity
elif isinstance(self.arity_template, int):
if self.arity_template <= 0:
raise ValueError("Arity must be greater than 0.")
else:
raise ValueError("Invalid arity type.")
def is_valid_arity(self, arity: int):
"""
Determines whether `self.choose_arity` could possibly result in `arity`.
Args:
arity: Input arity to compare with this template.
Returns: True if `arity` can be created from this template.
"""
if isinstance(self.arity_template, tuple):
# out of range
if arity < self.arity_template[0]:
return False
if self.arity_template[1] != 'n' and arity > self.arity_template[1]:
return False
# inside range
return True
# match fixed arity
if isinstance(self.arity_template, int):
return self.arity_template == arity
return False
def choose_arity(self, max_arity: int) -> int:
"""
Chooses an integer arity from the arity range or
returns the arity if it is already a fixed value.
Args:
max_arity: The upper bound of arities. It is an upper bound for all ranges, even when the arity
upper bound is greater. If the lower bound is greater than this value, `max_arity` is not applied
and the value is chosen from the original interval.
Returns: A fixed arity value.
"""
if not isinstance(self.arity_template, tuple):
return self.arity_template
a_from, a_to = self.arity_template
# is not applied for ranges which are greater than ``max_arity`` as it could result in invalid behavior
if a_to == 'n' or a_to > max_arity >= a_from:
a_to = max(max_arity, a_from)
return random.randint(a_from, a_to)
class GpFunctionTemplate:
"""
Represents an inner node of the GP tree.
This class is a template of a function node - the input type may have variable arity which is decided when
an instance of `GpPrimitive` is created from this template. During this process, keyword arguments are decided
as well.
The template input type is an ordered list of subtypes. These may have a variable arity (a range) and the
final arity is chosen during the instantiation.
The keyword arguments are chosen from lists of possible values.
"""
def __init__(self, name: str, type_arity_template: List[TypeArity], out_type: str, group: str = None):
"""
Args:
name: Name key associated with the node.
type_arity_template: Ordered list of children subtypes with variable arity.
out_type: Name of the output type.
group: Name of group of the node.
"""
self.name = name
self.type_arity_template = type_arity_template
self.out_type = out_type
self.group = group
def __repr__(self):
return f"GpFunctionTemplate: {self.name} - {self.group}"
def create_primitive(self,
curr_height: int,
max_arity: int,
kwargs_dict: Dict[str, List],
in_type: GpPrimitive.InputType = None) -> GpPrimitive:
"""
Creates an instance of a ``GpPrimitive`` from the template.
Selects keyword arguments from ``kwargs_dict``. For every key,
the dict contains a list of possible values.
Select final arities of every TypeArity in `self.type_arities` - that is, if a function node has
a variable number of input arguments (possibly of different types), for every subtype a fixed arity
is chosen.
TODO add example
Args:
curr_height: Height at which the node is generated
max_arity: Maximum arity value which can be chosen for a single TypeArity.
kwargs_dict: Dictionary which contains possible keyword argument values.
in_type: Input type; if provided, it is used instead of generating a new random type.
Returns: A new instance of GpPrimitive
"""
prim_kwargs = _choose_kwargs(kwargs_dict)
def create_type(t):
arity = t.choose_arity(max_arity)
# do not construct a type in case of 0
if arity == 0:
return None
return GpPrimitive.InputType(t.prim_type, arity)
if in_type is None:
# ordered list of final arity of types
in_type = [create_type(t_a) for t_a in self.type_arity_template]
in_type = [t for t in in_type if t is not None]
#
# total arity of the node
arity_sum = functools.reduce(lambda s, t: s + t.arity, in_type, 0)
return GpPrimitive(self.name, prim_kwargs, in_type, self.out_type, arity_sum, curr_height)
def _choose_kwargs(kwargs_dict: Dict[str, List]) -> Dict[str, Any]:
"""
Chooses keyword argument values from the argument dictionary `kwargs_dict`.
For every keyword argument, it contains a list of possible values for every key.
Args:
kwargs_dict: Dict of possible kwarg values.
Returns: Dict with one value selected per key.
"""
if kwargs_dict is None:
return {}
return {k: random.choice(arg_list) for k, arg_list in kwargs_dict.items()}
def tree_str(gp_tree: GpTreeIndividual, with_hyperparams=False):
def get_tree(node, ch_l):
return node, ch_l
def print_node(node: GpPrimitive, ch_l, indent, str_res):
str_res += " " * indent + node.name + "\n"
if with_hyperparams:
for k, val in node.obj_kwargs.items():
str_res += " " * (indent + 1) + "| " + f"{k}: {val}\n"
for child, next_list in ch_l:
str_res = print_node(child, next_list, indent + 1, str_res)
return str_res
root, child_list = gp_tree.run_tree(get_tree)
return print_node(root, child_list, 0, "") | 34.013889 | 116 | 0.605903 | 2,241 | 17,143 | 4.481481 | 0.142347 | 0.033655 | 0.028776 | 0.010156 | 0.261277 | 0.215872 | 0.177736 | 0.158319 | 0.150752 | 0.132829 | 0 | 0.003319 | 0.314531 | 17,143 | 504 | 117 | 34.013889 | 0.851332 | 0.385522 | 0 | 0.287736 | 0 | 0 | 0.049248 | 0.010733 | 0 | 0 | 0 | 0.001984 | 0 | 1 | 0.146226 | false | 0 | 0.023585 | 0.023585 | 0.377358 | 0.014151 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
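The stack-based post-order evaluation in `GpTreeIndividual.run_tree` can be illustrated standalone. A minimal sketch, with a hypothetical `(name, arity)` tuple standing in for `GpPrimitive`:

```python
# Post-order node list for the tree f(g(a, b), c): leaves first, root last.
# Each node is a hypothetical (name, arity) pair instead of a GpPrimitive.
tree = [("a", 0), ("b", 0), ("g", 2), ("c", 0), ("f", 2)]

def run_tree(primitives, node_func):
    """Apply node_func(node, child_results) bottom-up, as run_tree does."""
    stack = []
    for node in primitives:
        _, arity = node
        if arity == 0:
            stack.append(node_func(node, []))
        else:
            # Pop this node's children (they were pushed just before it).
            args = stack[-arity:]
            stack = stack[:-arity]
            stack.append(node_func(node, args))
    if len(stack) > 1:
        raise ValueError("Bad tree - invalid primitive list.")
    return stack.pop()

# Reconstruct an s-expression string from the post-order encoding.
expr = run_tree(tree, lambda node, ch: node[0] if not ch else f"{node[0]}({', '.join(ch)})")
print(expr)  # f(g(a, b), c)
```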
# ==== pca-server/src/pca/pca-aws-sf-wait-for-transcribe-notification.py (repo: Harsh15021992/amazon-transcribe-post-call-analytics, Apache-2.0) ====
"""
This python function is part of the main processing workflow. It is called when a Transcribe job is started, and it
will create an entry in a DynamoDB table that holds some job information and the Step Functions task token. The Step
Function should then wait for another task to read this task token from DynamoDB and resume the execution.
Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
SPDX-License-Identifier: Apache-2.0
"""
import json
import boto3
import os
def lambda_handler(event, context):
    """
    Create/update the task token for the given Transcribe job,
    and the Step Function should pause until that token is sent
    back by an EventBridge Lambda trigger when the Transcribe
    job completes. If no Transcribe job exists then throw an
    exception, but we shouldn't be here if this is the case.
    """
    # Our tracking table name is an environment variable
    DDB_TRACKING_TABLE = os.environ["TableName"]

    # Extract our parameters
    jobName = event["Input"]["jobName"]
    api_mode = event["Input"]["apiMode"]
    taskToken = event["TaskToken"]

    # If the jobName is "" then that means no task was started - the Step Function
    # shouldn't have sent us here, so throw an exception to break the execution
    if jobName == "":
        raise Exception('No Transcribe job called \'{}\' exists.'.format(jobName))

    # Insert/update the tracking entry between the Transcribe job and the Step Function
    ddbClient = boto3.client("dynamodb")
    response = ddbClient.put_item(Item={
            'PKJobId': {'S': jobName},
            'SKApiMode': {'S': api_mode},
            'taskToken': {'S': taskToken},
            'taskState': {'S': json.dumps(event["Input"])}
        },
        TableName=DDB_TRACKING_TABLE)

    return event


# Main entrypoint for testing
if __name__ == "__main__":
    event = {
        "Input": {
            "bucket": "ajk-call-analytics-demo",
            "key": "audio/example-call.wav",
            "langCode": "en-US",
            "jobName": "stereo.mp3",
            "apiMode": "analytics"
        },
        "TaskToken": "tesGGDSAG3RWEF"
    }
    os.environ['TableName'] = 'cci-PCAServer-MK00H3MPFXK9-DDB-1DUUJKPYBH0LP-Table-1AOTYJNH0R9RF'
    lambda_handler(event, "")
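The tracking record written by `put_item` uses DynamoDB's low-level attribute format, where every value is wrapped in a type descriptor such as `{'S': ...}`. A standalone sketch of how that item is assembled from a sample event (no AWS call involved):

```python
import json

# Sample Step Functions input, shaped like the event this Lambda receives.
event = {
    "Input": {"jobName": "stereo.mp3", "apiMode": "analytics"},
    "TaskToken": "tesGGDSAG3RWEF",
}

# Build the DynamoDB item as lambda_handler does: string-typed ('S') attributes,
# with the whole Step Function input serialized into 'taskState'.
item = {
    'PKJobId': {'S': event["Input"]["jobName"]},
    'SKApiMode': {'S': event["Input"]["apiMode"]},
    'taskToken': {'S': event["TaskToken"]},
    'taskState': {'S': json.dumps(event["Input"])},
}

print(item['PKJobId'])  # {'S': 'stereo.mp3'}
```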
# ==== facebook_business/adobjects/nativeoffer.py (repo: enricapq/facebook-python-business-sdk, CNRI-Python) ====
# Copyright 2014 Facebook, Inc.
# You are hereby granted a non-exclusive, worldwide, royalty-free license to
# use, copy, modify, and distribute this software in source code or binary
# form for use in connection with the web services and APIs provided by
# Facebook.
# As with any software that integrates with the Facebook platform, your use
# of this software is subject to the Facebook Developer Principles and
# Policies [http://developers.facebook.com/policy/]. This copyright notice
# shall be included in all copies or substantial portions of the software.
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
from facebook_business.adobjects.abstractobject import AbstractObject
from facebook_business.adobjects.abstractcrudobject import AbstractCrudObject
from facebook_business.adobjects.objectparser import ObjectParser
from facebook_business.api import FacebookRequest
from facebook_business.typechecker import TypeChecker
"""
This class is auto-generated.
For any issues or feature requests related to this class, please let us know on
github and we'll fix in our codegen framework. We'll not be able to accept
pull request for this class.
"""
class NativeOffer(
    AbstractCrudObject,
):

    def __init__(self, fbid=None, parent_id=None, api=None):
        self._isNativeOffer = True
        super(NativeOffer, self).__init__(fbid, parent_id, api)

    class Field(AbstractObject.Field):
        barcode_photo = 'barcode_photo'
        barcode_photo_uri = 'barcode_photo_uri'
        barcode_type = 'barcode_type'
        barcode_value = 'barcode_value'
        block_reshares = 'block_reshares'
        details = 'details'
        disable_location = 'disable_location'
        discounts = 'discounts'
        expiration_time = 'expiration_time'
        id = 'id'
        instore_code = 'instore_code'
        location_type = 'location_type'
        max_save_count = 'max_save_count'
        online_code = 'online_code'
        page = 'page'
        page_set_id = 'page_set_id'
        redemption_code = 'redemption_code'
        redemption_link = 'redemption_link'
        save_count = 'save_count'
        terms = 'terms'
        title = 'title'
        total_unique_codes = 'total_unique_codes'
        unique_codes = 'unique_codes'
        unique_codes_file_code_type = 'unique_codes_file_code_type'
        unique_codes_file_name = 'unique_codes_file_name'
        unique_codes_file_upload_status = 'unique_codes_file_upload_status'

    class BarcodeType:
        code128 = 'CODE128'
        code128b = 'CODE128B'
        code93 = 'CODE93'
        databar = 'DATABAR'
        databar_expanded = 'DATABAR_EXPANDED'
        databar_expanded_stacked = 'DATABAR_EXPANDED_STACKED'
        databar_limited = 'DATABAR_LIMITED'
        datamatrix = 'DATAMATRIX'
        ean = 'EAN'
        pdf417 = 'PDF417'
        qr = 'QR'
        upc_a = 'UPC_A'
        upc_e = 'UPC_E'

    class LocationType:
        both = 'both'
        offline = 'offline'
        online = 'online'

    def api_get(self, fields=None, params=None, batch=None, success=None, failure=None, pending=False):
        from facebook_business.utils import api_utils
        if batch is None and (success is not None or failure is not None):
            api_utils.warning('`success` and `failure` callback only work for batch call.')
        param_types = {
        }
        enums = {
        }
        request = FacebookRequest(
            node_id=self['id'],
            method='GET',
            endpoint='/',
            api=self._api,
            param_checker=TypeChecker(param_types, enums),
            target_class=NativeOffer,
            api_type='NODE',
            response_parser=ObjectParser(reuse_object=self),
        )
        request.add_params(params)
        request.add_fields(fields)

        if batch is not None:
            request.add_to_batch(batch, success=success, failure=failure)
            return request
        elif pending:
            return request
        else:
            self.assure_call()
            return request.execute()

    def create_native_offer_view(self, fields=None, params=None, batch=None, success=None, failure=None, pending=False):
        from facebook_business.utils import api_utils
        if batch is None and (success is not None or failure is not None):
            api_utils.warning('`success` and `failure` callback only work for batch call.')
        param_types = {
            'ad_account': 'string',
            'ad_image_hashes': 'list<string>',
            'carousel_captions': 'list<string>',
            'carousel_data': 'list<Object>',
            'carousel_links': 'list<string>',
            'deeplinks': 'list<string>',
            'image_crops': 'list<map>',
            'message': 'string',
            'photos': 'list<string>',
            'place_data': 'Object',
            'published': 'bool',
            'published_ads': 'bool',
            'urls': 'list<string>',
            'videos': 'list<string>',
        }
        enums = {
        }
        request = FacebookRequest(
            node_id=self['id'],
            method='POST',
            endpoint='/nativeofferviews',
            api=self._api,
            param_checker=TypeChecker(param_types, enums),
            target_class=NativeOffer,
            api_type='EDGE',
            response_parser=ObjectParser(target_class=NativeOffer, api=self._api),
        )
        request.add_params(params)
        request.add_fields(fields)

        if batch is not None:
            request.add_to_batch(batch, success=success, failure=failure)
            return request
        elif pending:
            return request
        else:
            self.assure_call()
            return request.execute()

    _field_types = {
        'barcode_photo': 'string',
        'barcode_photo_uri': 'string',
        'barcode_type': 'string',
        'barcode_value': 'string',
        'block_reshares': 'bool',
        'details': 'string',
        'disable_location': 'bool',
        'discounts': 'list<NativeOfferDiscount>',
        'expiration_time': 'datetime',
        'id': 'string',
        'instore_code': 'string',
        'location_type': 'string',
        'max_save_count': 'int',
        'online_code': 'string',
        'page': 'Page',
        'page_set_id': 'string',
        'redemption_code': 'string',
        'redemption_link': 'string',
        'save_count': 'int',
        'terms': 'string',
        'title': 'string',
        'total_unique_codes': 'string',
        'unique_codes': 'string',
        'unique_codes_file_code_type': 'string',
        'unique_codes_file_name': 'string',
        'unique_codes_file_upload_status': 'string',
    }

    @classmethod
    def _get_field_enum_info(cls):
        field_enum_info = {}
        field_enum_info['BarcodeType'] = NativeOffer.BarcodeType.__dict__.values()
        field_enum_info['LocationType'] = NativeOffer.LocationType.__dict__.values()
        return field_enum_info
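`_get_field_enum_info` relies on a common SDK pattern: a plain class acts as an enum and `__dict__.values()` yields its members (plus dunder attributes such as `__module__`). A minimal standalone sketch of the same pattern, with dunder entries filtered out for clarity:

```python
# A plain class acting as an enum, as NativeOffer.BarcodeType does
# (two members only, for illustration).
class BarcodeType:
    qr = 'QR'
    ean = 'EAN'

def get_field_enum_info(cls):
    # Skip the dunder entries every class __dict__ carries (__module__, __doc__, ...).
    return [v for k, v in cls.__dict__.items() if not k.startswith('__')]

print(sorted(get_field_enum_info(BarcodeType)))  # ['EAN', 'QR']
```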
# ==== test/hvd_allreduce.py (repo: IST-DASLab/horovod, Apache-2.0) ====
import argparse
import horovod.torch as hvd
import pickle
import numpy.random as rnd
import numpy as np
import torch
import os
arrays = []
class MaxMinQuantizer():
def __init__(self, q_bits, bucket_size):
self.q = q_bits
self.num_levels = 1 << self.q
self.bucket_size = bucket_size
def compress(self, a):
if self.q == 32:
return a
numel = a.numel()
if self.bucket_size == -1:
a[:] = self.quantize_bucket(a)
else:
main_chunk_size = (numel // self.bucket_size) * self.bucket_size
if main_chunk_size > 0:
a[:main_chunk_size] = self.quantize_bucket(a[:main_chunk_size].view((-1, self.bucket_size))).view(-1)
if numel - main_chunk_size > 0:
a[main_chunk_size:] = self.quantize_bucket(a[main_chunk_size:])
return a
def quantize_bucket(self, a):
non_2 = False
if a.dim() != 2:
a = a[None, :]
non_2 = True
if a.dim() == 2:
fmin = torch.min(a, dim=1)[0]
fmax = torch.max(a, dim=1)[0]
unit = (fmax - fmin) / (self.num_levels - 1)
unit = unit[:, None]
fmin = fmin[:, None]
s = torch.Tensor([1e-11]).expand_as(unit).to(a.device)
unit = torch.max(unit, s)
a -= fmin
a /= unit
a += torch.empty(a.size(), device=a.device).uniform_(0, 1)
# log(a.cpu().numpy())
torch.floor_(a)
a *= unit
a += fmin
if non_2:
return a[0]
return a
class NormUniformQuantizer(MaxMinQuantizer):
def __init__(self, q_bits, bucket_size):
super().__init__(q_bits, bucket_size)
self.num_levels = self.num_levels // 2
def quantize_bucket(self, a):
non_2 = False
if a.dim() != 2:
a = a[None, :]
non_2 = True
if a.dim() == 2:
vnorm = torch.norm(a, p=float("inf"), dim=1)
vnorm = vnorm[:, None]
s = torch.Tensor([1e-11]).expand_as(vnorm).to(a.device)
else:
vnorm = torch.norm(a, p=float("inf"))
s = torch.Tensor([1e-11]).to(a.device)
vnorm = torch.max(vnorm, s)
sign = torch.sign(a)
        # round-trip the sign through its 1-bit encoding: {-1,0,1} -> {0,0.5,1} -> {-1,0,1}
        sign.add_(1).div_(2)
        sign.mul_(2).add_(-1)
if self.num_levels > 1:
q = torch.abs(a / vnorm)
r = torch.rand(a.shape, device=a.device)
q.mul_((self.num_levels - 1))
q.add_(r)
torch.floor_(q)
q.div_((self.num_levels - 1))
res = q * vnorm * sign
else:
res = vnorm * sign
if non_2:
return res[0]
else:
return res
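The max-min scheme above maps each value onto one of `2**q_bits` evenly spaced levels between the bucket's min and max. A pure-Python sketch of the same idea, using deterministic rounding in place of the stochastic rounding in `quantize_bucket` (illustrative only, not used by the test):

```python
def maxmin_quantize(values, q_bits):
    levels = (1 << q_bits) - 1                  # number of quantization steps
    fmin, fmax = min(values), max(values)
    unit = max((fmax - fmin) / levels, 1e-11)   # guard against a zero range
    # snap each value to the nearest level, then map back to the original scale
    return [round((v - fmin) / unit) * unit + fmin for v in values]
```

With `q_bits=2` there are 4 levels on the bucket range, so every output lands within half a step of its input.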
def log(msg):
if hvd.rank() == 0:
print(msg)
def generate_arrays(size):
    rnd.seed(43)
    global arrays
    total = np.zeros(size)  # running exact sum; appended last, so arrays[num_nodes] holds it
    for i in range(num_nodes):
        array = rnd.normal(size=size, scale=0.1)
        total = np.add(total, array)
        arrays.append(array)
    arrays.append(total)
def get_array(idx):
return arrays[idx]
def get_expected_result():
quantizer = MaxMinQuantizer(args.q, args.bucket_size)
# quantizer = NormUniformQuantizer(args.q, args.bucket_size)
rank = hvd.rank()
array = torch.tensor(get_array(rank), device="cuda")
# print(array)
quantizer.compress(array)
# print(array.cpu().numpy())
a = hvd.allgather(array, "allgather").view(-1, *array.shape)
return torch.sum(a, dim=0)
def run_allreduce(args, num, res):
    array = get_array(hvd.rank())
    # the passed-in res is ignored; arrays[hvd.size()] holds the precomputed exact sum
    res = get_array(hvd.size())
# print(array[:8])
tensors = []
    for i in range(num):
        device = 'cpu' if args.no_cuda else 'cuda'
        tensor = torch.tensor(array, device=device).float()
        if args.fp16:
            # convert before appending: half() returns a new tensor, so
            # converting only the loop variable after the loop would leave
            # the tensors in the list in fp32
            tensor = tensor.half()
        tensors.append(tensor)
    if not args.no_cuda:
        torch.cuda.synchronize()
handles = []
for i in range(num):
handles.append(hvd.allreduce_async_(tensors[i], name='test.{}'.format(i), op=hvd.Sum))
#tensors[i] = hvd.synchronize(handles[-1])
#tensors[i] = hvd.allreduce_(tensors[i], name='test.{}'.format(i), op=hvd.Sum)
#tensors[i] = hvd.allreduce(tensors[i], name='test.{}'.format(i),
# compression=hvd.Compression.fp16 if args.quantization_bits == 16 else hvd.Compression.none)
#for i in range(num_nodes):
# print(i, get_array(i)[:8])
#print("Base sum: ", result[:8])
#print("Hvd: ", avg[:8])
for i in range(num):
h = handles[i]
avg_tensor = hvd.synchronize(h)
#avg_tensor = tensors[i]
#print(avg_tensor[:8])
#avg_tensor = tensors[i]
avg = avg_tensor.cpu().numpy()
if hvd.rank() == 0:
diff = np.linalg.norm(res - avg)
if diff > res.size * 5e-2:
log("L2 error: {}".format(diff))
log("Base sum: {}".format(res[:8]))
log("Hvd: {}".format(avg[:8]))
hvd.broadcast_object(False, 0, "Result")
return False
else:
hvd.broadcast_object(True, 0, "Result")
else:
res = hvd.broadcast_object(None, 0, "Result")
if not res:
return False
return True
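The pass/fail rule above accepts the reduced tensor when the L2 distance to the exact sum stays below `size * 5e-2`. The acceptance rule in isolation, as a small standalone helper (a sketch mirroring the check, not code the script calls):

```python
def l2_within_tolerance(expected, actual, tol_per_elem=5e-2):
    # L2 norm of the elementwise difference, compared against a budget
    # that grows linearly with the array size (as in run_allreduce above).
    diff = sum((e - a) ** 2 for e, a in zip(expected, actual)) ** 0.5
    return diff <= len(expected) * tol_per_elem
```

An exact match always passes, while a single large outlier (e.g. an error of 10 in a 2-element array, against a budget of 0.1) fails.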
parser = argparse.ArgumentParser(description='Horovod allreduce test')
parser.add_argument("--array-size", type=int, default=32,
help="array size (default: 32)")
parser.add_argument('--no-cuda', action='store_true', default=False,
help='disables CUDA training')
parser.add_argument('--fp16', action='store_true', default=False,
help='Casts tensors to fp16')
parser.add_argument('-q', type=int, default=32, help="quantization bits")
parser.add_argument('--bucket-size', type=int, default=512, help="quantization bucket size")
args = parser.parse_args()
# os.environ["HOROVOD_QUANTIZATION_BITS"] = str(args.q)
# os.environ["HOROVOD_COMPRESSION_BUCKET_SIZE"] = str(args.bucket_size)
hvd.init()
num_nodes = hvd.size()
torch.cuda.set_device(hvd.rank())  # fine on a single node; hvd.local_rank() is the usual choice on multi-node runs
generate_arrays(args.array_size)
# res = get_expected_result()
# res = res.cpu().numpy()
res = None
num_layers = 1
num_batches = 100
for i in range(num_batches):
if not run_allreduce(args, num_layers, res):
log("Failed")
break
else:
log("Passed")
| 31.721951 | 123 | 0.557743 | 869 | 6,503 | 4.035673 | 0.182969 | 0.042772 | 0.025948 | 0.01882 | 0.262047 | 0.188195 | 0.159681 | 0.145994 | 0.106644 | 0.106644 | 0 | 0.019908 | 0.297094 | 6,503 | 204 | 124 | 31.877451 | 0.74732 | 0.123789 | 0 | 0.21118 | 0 | 0 | 0.050211 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.062112 | false | 0.006211 | 0.043478 | 0.006211 | 0.186335 | 0.006211 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1ae318fc2f53a52ba514b1e0986fa0f2c395b272 | 6,495 | py | Python | accelerator/shell/init.py | drougge/accelerator | f99b2550a84c79cadb032acf0d2d60bccf75bf0d | [
"Apache-2.0"
] | null | null | null | accelerator/shell/init.py | drougge/accelerator | f99b2550a84c79cadb032acf0d2d60bccf75bf0d | [
"Apache-2.0"
] | null | null | null | accelerator/shell/init.py | drougge/accelerator | f99b2550a84c79cadb032acf0d2d60bccf75bf0d | [
"Apache-2.0"
] | null | null | null | ############################################################################
# #
# Copyright (c) 2019-2020 Carl Drougge #
# Modifications copyright (c) 2020 Anders Berkeman #
# #
# Licensed under the Apache License, Version 2.0 (the "License"); #
# you may not use this file except in compliance with the License. #
# You may obtain a copy of the License at #
# #
# http://www.apache.org/licenses/LICENSE-2.0 #
# #
# Unless required by applicable law or agreed to in writing, software #
# distributed under the License is distributed on an "AS IS" BASIS, #
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. #
# See the License for the specific language governing permissions and #
# limitations under the License. #
# #
############################################################################
a_example = r"""description = r'''
This is just an example. It doesn't even try to do anything useful.
You can run it to see that your installation works.
'''
options = dict(
message=str,
)
def analysis(sliceno):
return sliceno
def synthesis(analysis_res):
print("Sum of all sliceno:", sum(analysis_res))
print("Message:", options.message)
"""
build_script = r"""def main(urd):
urd.build('example', message='Hello world!')
"""
config_template = r"""# The configuration is a collection of key value pairs.
#
# Values are specified as
# key: value
# or for several values
# key:
# value 1
# value 2
# ...
# (any leading whitespace is ok)
#
# Use ${{VAR}} or ${{VAR=DEFAULT}} to use environment variables.
slices: {slices}
workdirs:
{name} ./workdirs/{name}
# Target workdir defaults to the first workdir, but you can override it.
# (this is where jobs without a workdir override are built)
target workdir: {name}
method packages:
{name}
accelerator.standard_methods
accelerator.test_methods
urd: local # can also be URL/socket to your urd
# [host]:port or path where board will listen.
# You can also start board separately with "ax board".
board listen: .socket.dir/board
result directory: ./results
input directory: {input}
# If you want to run methods on different python interpreters you can
# specify names for other interpreters here, and put that name after
# the method in methods.conf.
# You automatically get four names for the interpreter that started
# the server: DEFAULT, {major}, {major}.{minor} and {major}.{minor}.{micro} (adjusted to the actual
# version used). You can override these here, except DEFAULT.
# interpreters:
# 2.7 /path/to/python2.7
# test /path/to/beta/python
"""
def quote(v):
from accelerator.compat import PY2
if PY2:
return '"%s"' % (v.replace('"', '\\"'),) # good enough, hopefully
else:
import shlex
return shlex.quote(v)
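`quote()` defers to `shlex.quote` on Python 3. A quick illustration of the behaviour the generated config lines rely on: plain identifiers pass through unchanged, while values containing shell metacharacters get wrapped in single quotes.

```python
import shlex

# safe names are returned as-is; anything with spaces or metacharacters
# is wrapped so the shell treats it as one token
assert shlex.quote("dev") == "dev"
assert shlex.quote("my dir") == "'my dir'"
```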
def main(argv):
from os import makedirs, listdir, chdir
from os.path import exists, join, realpath
from sys import version_info
from argparse import RawDescriptionHelpFormatter
from accelerator.compat import ArgumentParser
from accelerator.error import UserError
parser = ArgumentParser(
prog=argv.pop(0),
description=r'''
creates an accelerator project directory.
defaults to the current directory.
creates accelerator.conf, a method dir, a workdir and result dir.
both the method directory and workdir will be named <NAME>,
"dev" by default.
'''.replace('\t', ''),
formatter_class=RawDescriptionHelpFormatter,
)
parser.add_argument('--slices', default=None, type=int, help='override slice count detection')
parser.add_argument('--name', default='dev', help='name of method dir and workdir, default "dev"')
parser.add_argument('--input', default='# /some/path where you want import methods to look.', help='input directory')
parser.add_argument('--force', action='store_true', help='go ahead even though directory is not empty, or workdir exists with incompatible slice count')
parser.add_argument('directory', default='.', help='project directory to create. default "."', metavar='DIR', nargs='?')
options = parser.parse_args(argv)
assert options.name
assert '/' not in options.name
if not options.input.startswith('#'):
options.input = quote(realpath(options.input))
prefix = realpath(options.directory)
workdir = join(prefix, 'workdirs', options.name)
slices_conf = join(workdir, '.slices')
try:
with open(slices_conf, 'r') as fh:
workdir_slices = int(fh.read())
except IOError:
workdir_slices = None
if workdir_slices and options.slices is None:
options.slices = workdir_slices
if options.slices is None:
from multiprocessing import cpu_count
options.slices = cpu_count()
if workdir_slices and workdir_slices != options.slices and not options.force:
raise UserError('Workdir %r has %d slices, refusing to continue with %d slices' % (workdir, workdir_slices, options.slices,))
if not options.force and exists(options.directory) and listdir(options.directory):
raise UserError('Directory %r is not empty.' % (options.directory,))
if not exists(options.directory):
makedirs(options.directory)
chdir(options.directory)
for dir_to_make in ('.socket.dir', 'urd.db',):
if not exists(dir_to_make):
makedirs(dir_to_make, 0o750)
for dir_to_make in (workdir, 'results',):
if not exists(dir_to_make):
makedirs(dir_to_make)
with open(slices_conf, 'w') as fh:
fh.write('%d\n' % (options.slices,))
method_dir = options.name
if not exists(method_dir):
makedirs(method_dir)
with open(join(method_dir, '__init__.py'), 'w') as fh:
pass
with open(join(method_dir, 'methods.conf'), 'w') as fh:
fh.write('example\n')
with open(join(method_dir, 'a_example.py'), 'w') as fh:
fh.write(a_example)
with open(join(method_dir, 'build.py'), 'w') as fh:
fh.write(build_script)
with open('accelerator.conf', 'w') as fh:
fh.write(config_template.format(
name=quote(options.name),
slices=options.slices,
input=options.input,
major=version_info.major,
minor=version_info.minor,
micro=version_info.micro,
))
| 36.284916 | 153 | 0.646497 | 839 | 6,495 | 4.935638 | 0.330155 | 0.01956 | 0.01304 | 0.008452 | 0.06327 | 0.036223 | 0.01787 | 0.01787 | 0.01787 | 0.01787 | 0 | 0.005743 | 0.222479 | 6,495 | 178 | 154 | 36.488764 | 0.814257 | 0.174288 | 0 | 0.036496 | 0 | 0.007299 | 0.477366 | 0.0194 | 0 | 0 | 0 | 0 | 0.014599 | 1 | 0.014599 | false | 0.007299 | 0.072993 | 0 | 0.109489 | 0.014599 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1ae3f2b04641e41b7f95d3b94460b572e616223c | 2,115 | py | Python | tensorflow_probability/python/math/psd_kernels/pointwise_exponential_test.py | jakee417/probability-1 | ae7117f37ac441bc7a888167ea23e5e620c5bcde | [
"Apache-2.0"
] | 3,670 | 2018-02-14T03:29:40.000Z | 2022-03-30T01:19:52.000Z | tensorflow_probability/python/math/psd_kernels/pointwise_exponential_test.py | jakee417/probability-1 | ae7117f37ac441bc7a888167ea23e5e620c5bcde | [
"Apache-2.0"
] | 1,395 | 2018-02-24T02:28:49.000Z | 2022-03-31T16:12:06.000Z | tensorflow_probability/python/math/psd_kernels/pointwise_exponential_test.py | jakee417/probability-1 | ae7117f37ac441bc7a888167ea23e5e620c5bcde | [
"Apache-2.0"
] | 1,135 | 2018-02-14T01:51:10.000Z | 2022-03-28T02:24:11.000Z | # Copyright 2021 The TensorFlow Probability Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""Tests for exponential."""
import numpy as np
import tensorflow.compat.v2 as tf
import tensorflow_probability as tfp
from tensorflow_probability.python.internal import test_util
@test_util.test_all_tf_execution_regimes
class PointwiseExponentialTest(test_util.TestCase):
def testValuesAreCorrect(self):
original_kernel = tfp.math.psd_kernels.Parabolic()
exponential_kernel = tfp.math.psd_kernels.PointwiseExponential(
original_kernel)
x1 = [[1.0]]
x2 = [[2.0]]
original_output = original_kernel.apply(x1, x2)
exponential_output = exponential_kernel.apply(x1, x2)
self.assertAllEqual(
self.evaluate(tf.math.exp(original_output)),
self.evaluate(exponential_output))
def testBatchShape(self):
amplitude = np.random.uniform(2, 3., size=[3, 1, 2]).astype(np.float32)
length_scale = np.random.uniform(2, 3., size=[1, 3, 1]).astype(np.float32)
original_kernel = tfp.math.psd_kernels.GeneralizedMatern(
df=np.pi, amplitude=amplitude, length_scale=length_scale)
exponential_kernel = tfp.math.psd_kernels.PointwiseExponential(
original_kernel)
self.assertAllEqual(original_kernel.batch_shape,
exponential_kernel.batch_shape)
self.assertAllEqual(
self.evaluate(original_kernel.batch_shape_tensor()),
self.evaluate(exponential_kernel.batch_shape_tensor()))
if __name__ == '__main__':
test_util.main()
| 38.454545 | 78 | 0.715839 | 269 | 2,115 | 5.453532 | 0.449814 | 0.066803 | 0.035446 | 0.043626 | 0.163599 | 0.163599 | 0.092706 | 0.092706 | 0.092706 | 0 | 0 | 0.018519 | 0.157447 | 2,115 | 54 | 79 | 39.166667 | 0.804714 | 0.318203 | 0 | 0.193548 | 0 | 0 | 0.005622 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 1 | 0.064516 | false | 0 | 0.129032 | 0 | 0.225806 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1ae49887b3b0718efa0664bc88caa2cfb2fd87dd | 263 | py | Python | baekjoon/1934.py | jiyeoun/PS | 855ec59fe7844daf102fde713eab48c88cbc5419 | [
"MIT"
] | null | null | null | baekjoon/1934.py | jiyeoun/PS | 855ec59fe7844daf102fde713eab48c88cbc5419 | [
"MIT"
] | null | null | null | baekjoon/1934.py | jiyeoun/PS | 855ec59fe7844daf102fde713eab48c88cbc5419 | [
"MIT"
] | null | null | null | def gcd(a, b):
while b != 0:
x = a % b
a = b
b = x
return a
def lcm(a, b):
gcd2 = gcd(a, b)
return (a * b) // gcd2
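A quick sanity check of the identity these helpers implement, `lcm(a, b) * gcd(a, b) == a * b`, using the standard library's `math.gcd` as a reference (self-contained, so it does not reuse the functions above):

```python
import math

def lcm_ref(a, b):
    # reference lcm via the standard library gcd
    return a * b // math.gcd(a, b)
```

For example, `lcm_ref(12, 18)` is 36, matching `lcm(12, 18)` from the hand-rolled Euclidean gcd above.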
n = int(input())
for i in range(n):
    a, b = map(int, input().split())
    print(lcm(a, b))
| 13.842105 | 26 | 0.422053 | 50 | 263 | 2.22 | 0.38 | 0.162162 | 0.09009 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018634 | 0.387833 | 263 | 18 | 27 | 14.611111 | 0.670807 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0 | 0 | 0.266667 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1ae63296d699c65a0bd047816b0e603a4cff99eb | 5,841 | py | Python | hplc_analysis.py | furubayashim/hplc-analysis-hitachi | d8f2b594b577032548e860b424f55241bbe72b37 | [
"MIT"
] | null | null | null | hplc_analysis.py | furubayashim/hplc-analysis-hitachi | d8f2b594b577032548e860b424f55241bbe72b37 | [
"MIT"
] | null | null | null | hplc_analysis.py | furubayashim/hplc-analysis-hitachi | d8f2b594b577032548e860b424f55241bbe72b37 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# script to draw PDA chromatogram & spectrum figure using Hitachi HPLC Chromaster stx/ctx files
import pandas as pd
import numpy as np
import glob
import os
import sys
import matplotlib.pyplot as plt
# change this if using different user/folder
data_dir = "raw/"
# can give sample name file as argv
if len(sys.argv) >1:
samplenamefile = sys.argv[1]
else:
samplenamefile = 'sampletable.xlsx'
sample_df = pd.read_excel(samplenamefile)
### load parameter from the xls file ####################################
sample_nos = [str(s) for s in sample_df['sample no'].values]
sample_names = sample_df['name'].values
sample_dir = sorted([f + '/' for f in os.listdir(data_dir) if not os.path.isfile(os.path.join(data_dir, f))], key=lambda x: int(x[:-1]))
# Time range (x axis)
start_time = 2
end_time = 18
if 'start time' in sample_df.columns:
start_time = sample_df['start time'].values[0]
if 'end time' in sample_df.columns:
end_time = sample_df['end time'].values[0]
# which chart to draw
all_chromato = sample_df['all chromato'].values[0]
each_data = sample_df['each data'].values[0]
# output folder and name
if not os.path.exists('processed'): os.mkdir('processed')
output_name = 'all_chromato'
if 'output name' in sample_df.columns:
output_name = sample_df['output name'].values[0]
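The chromatogram files are sorted numerically by sample directory and run number rather than lexically (so `raw/10/...` comes after `raw/2/...`). The sort key, sketched as a standalone function on hypothetical `raw/<sample>/<run>.ctx` paths:

```python
def ctx_sort_key(path):
    # "raw/<sample>/<run>.ctx" -> (sample number, run number)
    parts = path.split('/')
    return int(parts[1]), int(parts[2][:-4])
```

A lexical sort would order `raw/10/2.ctx` before `raw/2/1.ctx`; this key restores the intended numeric ordering.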
### draw chromato for all samples in one fig ############################
if all_chromato == 'y':
    ctx_files = sorted(glob.glob(data_dir+'*/*.ctx'),key=lambda x: (int(x.split('/')[1]),int(x.split('/')[2][:-4])))  # sort numerically by sample dir, then by file number
chromato_dfs = [pd.read_csv(file,skiprows=38,delimiter=';',header=None,names=[sample_names[n],'NaN']).iloc[:,:1] for n,file in enumerate(ctx_files)]
chromato_df = pd.concat(chromato_dfs,axis=1)
chromato_df_cut = chromato_df.loc[start_time:end_time]
fig,axes = plt.subplots(1,2,figsize=[10,8])
    for n,(name,col) in enumerate(chromato_df_cut.iteritems()):
        time = chromato_df_cut.index.values
        absorbance = col.values - 0.1 * n  # offset each trace vertically
        axes[0].plot(time,absorbance,label=name)
axes[0].legend()
axes[0].set_ylabel('Absorbance')
axes[0].set_xlabel('Time (min)')
#axes[0].set_ylim([-0.45,0.1])
axes[0].set_xlim([start_time,end_time])
axes[0].set_title('Height as it is')
    for n,(name,col) in enumerate(chromato_df_cut.iteritems()):
        absorbance = col.values / np.nanmax(col.values) - 1.1 * n  # normalize, then offset
        time = chromato_df_cut.index.values
        axes[1].plot(time,absorbance,label=name)
axes[1].legend()
axes[1].set_ylabel('Absorbance (Normalized)')
axes[1].set_xlabel('Time (min)')
#axes[1].set_ylim([-0.45,1])
axes[1].set_xlim([start_time,end_time])
axes[1].set_title('Height Normalized')
plt.savefig("processed/{}.pdf".format(output_name),bbox_inches = "tight");
### draw chromato/spec for each sample ############################
if each_data == 'y':
    for sample_no, sample_name, sdir in zip(sample_nos, sample_names, sample_dir):
        # load chromato files. Can import several ctx files
        ctx_files = sorted(glob.glob(data_dir + sdir + '*.ctx'))
        chromato_dfs = [pd.read_csv(file,skiprows=38,delimiter=';',header=None,names=[os.path.basename(file)[:-4],'NaN']).iloc[:,:1] for file in ctx_files]
        chromato_df = pd.concat(chromato_dfs,axis=1)
        if chromato_df.index.min() < start_time:
            chromato_df_cut = chromato_df.loc[start_time:]
        else:
            chromato_df_cut = chromato_df
        if chromato_df_cut.index.max() > end_time:
            chromato_df_cut = chromato_df_cut.loc[:end_time]
        # load stx files
        stx_files = sorted(glob.glob(data_dir + sdir + '*.stx'), key=lambda x: float(os.path.basename(x[:-4])))
stx_dfs = [pd.read_csv(f,delimiter=';',skiprows=44).iloc[:,:1] for f in stx_files]
stx_df = pd.concat(stx_dfs,axis=1)
# stx_df is the dataframe of the abs spectrum of each peak.
# index = 200-650 (nm)
# column name = str of time (min)
stx_df_cut = stx_df.loc[250:600] # select 250-600 nm
# draw figure
fig = plt.figure(figsize=[6,16])
# draw chromatogram
ymax = 0
ymin = 0
        for name,col in chromato_df_cut.iteritems():
            time = chromato_df_cut.index.values
            absorbance = col.values
            plt.subplot(6,1,1)
            # note: reusing subplot(6,1,1) on every iteration draws all traces
            # on the same axes but triggers a MatplotlibDeprecationWarning;
            # creating the axes once before the loop would avoid it
            plt.plot(time,absorbance,label=name)
            ymaxtemp = chromato_df.loc[start_time:end_time,name].values.max()
            ymintemp = chromato_df.loc[start_time:end_time,name].values.min()
            if ymaxtemp > ymax: ymax = ymaxtemp
            if ymintemp < ymin: ymin = ymintemp
plt.legend()
plt.xticks(np.arange(start_time,end_time,1))
plt.xlabel('Time (min)')
plt.ylabel('Absorbance')
plt.ylim([ymin + ymin*0.05,ymax + ymax*0.05])
plt.title(sample_no + '-' + sample_name)
# draw abs spectrum
for n,(rt,series) in enumerate(stx_df_cut.iteritems()):
wavelength = series.index.values
absorbance = series.values
abs_max = str(int(series.idxmax()))
plt.subplot(12,3,7+n)
plt.plot(wavelength,absorbance,label=rt)
plt.xlim([250,600])
plt.xticks(np.arange(300,700,100))
plt.ylim([series.min(),series.max()])
plt.title('{} min (λmax: {} nm)'.format(rt[:-2],abs_max))
plt.tight_layout(pad=-0.1);
plt.savefig('processed/'+sample_no+'-'+sample_name+'.pdf',bbox_inches = "tight");
| 41.721429 | 349 | 0.640644 | 878 | 5,841 | 4.113895 | 0.247153 | 0.055371 | 0.043189 | 0.026578 | 0.266058 | 0.223699 | 0.194075 | 0.168328 | 0.129568 | 0.107973 | 0 | 0.025371 | 0.203732 | 5,841 | 139 | 350 | 42.021583 | 0.751236 | 0.171375 | 0 | 0.092784 | 0 | 0 | 0.072049 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.061856 | 0 | 0.061856 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1aeb67d97f0f649de1b16474a53ed6bd4c5ce0c0 | 3,445 | py | Python | src/pycropml/main.py | brichet/PyCrop2ML | 7177996f72a8d95fdbabb772a16f1fd87b1d033e | [
"MIT"
] | 5 | 2020-06-21T18:58:04.000Z | 2022-01-29T21:32:28.000Z | src/pycropml/main.py | brichet/PyCrop2ML | 7177996f72a8d95fdbabb772a16f1fd87b1d033e | [
"MIT"
] | 27 | 2018-12-04T15:35:44.000Z | 2022-03-11T08:25:03.000Z | src/pycropml/main.py | brichet/PyCrop2ML | 7177996f72a8d95fdbabb772a16f1fd87b1d033e | [
"MIT"
] | 7 | 2019-04-20T02:25:22.000Z | 2021-11-04T07:52:35.000Z | # -*- coding: utf-8 -*-
"""
Created on Tue Mar 19 22:59:23 2019
@author: pradal
"""
# coding: utf8
from __future__ import absolute_import
from __future__ import print_function
import sys
import os
from optparse import OptionParser
from path import Path
from pycropml.cyml import transpile_file, transpile_package, transpile_component
from pycropml.transpiler.main import languages
def main():
usage = """Usage: %prog [options] package language1 [languages]
cyml transpiler translate a cyml source code or a Crop2ML package with algo in cyml
language to target language.
Example
cyml <source_code.pyx or pkg> <target_language>
* target language must be:
py for python
cs for csharp
f90 for fortran
java for java
simplace for simplace
sirius for sirius
openAlea
cpp for C++
r
"""
#TODO
todo = """
* target language must be:
py for python
cs for csharp
cpp for c++
f90 for fortran
java for java
r for R
simplace for simplace
sirius for sirius
"""
parser = OptionParser(usage=usage)
parser.add_option("-f", "--file", dest="file", metavar="FILE",
help="cyml source code FILE to transpile")
parser.add_option("-p", "--package", dest="package",
help="package directory containing a crop2ml directory with algorithms.")
parser.add_option("-c", "--component", dest="component",
help="framework model component directory")
parser.add_option("-l", "--languages", dest="languages", action="append",
choices=languages,
help="Target languages : "+','.join(languages))
(opts, args)= parser.parse_args()
sourcef = None
pyx_filename = None
package = None
component = None
newpackage = None
langs = []
if len(parser.option_list) + len(args) < 2:
parser.error("incorrect number of arguments")
if opts.file:
sourcef = pyx_filename = opts.file
elif opts.package:
sourcef = package = opts.package
elif opts.component:
sourcef = component = opts.component
else:
sourcef = args[0]
sourcef = Path(sourcef)
if not sourcef.exists():
parser.error("Package or file does not exists")
if opts.languages:
langs = opts.languages
else:
if opts.component:
newpackage = args[0]
args = args[1:]
langs = [a for a in args if a in languages]
    for arg in args:
        if arg == sourcef:
            continue
        if arg not in languages:
            # parser.error() raises SystemExit, so no flag or early return is needed
            parser.error("%s is not a supported language" % arg)
if not langs:
parser.error("No language has been specified")
print(parser.usage)
return
if pyx_filename or len(sourcef.split(".")) == 2:
# translate from cyml code
if sourcef.split(".")[1] != "pyx":
parser.error("Source code %s is not a Cyml file (.pyx estension) "%(str(sourcef)))
return
for language in langs:
status = transpile_file(sourcef, language)
elif package:
for language in langs:
status = transpile_package(sourcef, language)
else:
for language in langs:
status = transpile_component(sourcef,newpackage,language)
if __name__ == '__main__':
main()
| 24.607143 | 94 | 0.615094 | 420 | 3,445 | 4.961905 | 0.314286 | 0.026392 | 0.028791 | 0.025912 | 0.143474 | 0.143474 | 0.040307 | 0.040307 | 0.040307 | 0.040307 | 0 | 0.011038 | 0.289985 | 3,445 | 139 | 95 | 24.784173 | 0.840965 | 0.033962 | 0 | 0.25 | 0 | 0 | 0.328912 | 0 | 0.01 | 0 | 0 | 0.007194 | 0 | 1 | 0.01 | false | 0 | 0.08 | 0 | 0.12 | 0.02 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1aef95bf283d7fd58b467eb0379f7cc4a2f9650c | 1,051 | py | Python | Yotube Downloader.py | vishal8888a8/GUI-youtube-downloader | f1454da9534e4f40e84258b9600c0fe82cd0c39f | [
"MIT"
] | null | null | null | Yotube Downloader.py | vishal8888a8/GUI-youtube-downloader | f1454da9534e4f40e84258b9600c0fe82cd0c39f | [
"MIT"
] | 1 | 2021-10-01T07:14:36.000Z | 2021-10-01T07:14:36.000Z | Yotube Downloader.py | vishal8888a8/GUI-youtube-downloader | f1454da9534e4f40e84258b9600c0fe82cd0c39f | [
"MIT"
] | null | null | null | from pytube import YouTube
from tkinter import *
#main download part
def download(link):
    obj = YouTube(link)
    dl = obj.streams.get_highest_resolution()
    print("Downloading in progress")
    dl.download()
    print("Downloading completed!")
#getting the link as string
def get_class():
link=ent.get()
download(link)
#window properties
window = Tk()
window.geometry("573x400")
window.title("YouTube Downloader")
window.configure(background="#e0db31")
#logo set-up
logo = PhotoImage(file="logo.png")
l1 = Label(window,image=logo,bg="#962383",anchor="center").pack()
#second label
l2=Label(window,text="Enter Your link below!",font="times 20 bold",bg="#4640e6")
l2.pack(pady=15,padx=10)
#taking input from the user
ent = Entry(window, textvariable=StringVar())  # StringVar must be instantiated, not passed as a class
ent.pack(padx=10, pady=14)
#the enter button
btn1 = Button(window, text="Click Me!", command=get_class)
btn1.pack(padx=10, pady=13)
#output
T = Text(window, height=5, width=52)  # output text area (created but not packed yet)
#end of the window loop
window.mainloop()
| 23.355556 | 81 | 0.687916 | 148 | 1,051 | 4.858108 | 0.601351 | 0.025035 | 0.027816 | 0.038943 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048276 | 0.172217 | 1,051 | 44 | 82 | 23.886364 | 0.778161 | 0.146527 | 0 | 0 | 0 | 0 | 0.17654 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0 | 0.08 | 0 | 0.16 | 0.08 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1af28adf5e0279198bbb4dfd5d10357dd6366699 | 693 | py | Python | CTF/home.py | mark0519/CTFplatform | 5f9bc555cdc0e07f9ba31165a44b206e1a2bd915 | [
"MIT"
] | 9 | 2021-09-26T04:04:52.000Z | 2022-03-30T16:37:38.000Z | CTF/home.py | mark0519/CTFplatform | 5f9bc555cdc0e07f9ba31165a44b206e1a2bd915 | [
"MIT"
] | null | null | null | CTF/home.py | mark0519/CTFplatform | 5f9bc555cdc0e07f9ba31165a44b206e1a2bd915 | [
"MIT"
] | null | null | null | from flask import (
Blueprint, flash, g, redirect, render_template, request, url_for,session
)
from werkzeug.exceptions import abort
from CTF import login
from CTF.models import user
bp = Blueprint('home', __name__)
@bp.route('/')
def index():
if 'id' in session:
is_admin = 0
        if user.query.filter(user.user_id == session.get('id')).first().user_teamid == 1:  # team id 1 marks an admin
is_admin = 1
        name = session.get('username')  # username stored in the session at login
return render_template('home/home.html', name=name , is_admin=is_admin)
# print(session)
# print("*********")
return render_template('home/home.html')
| 25.666667 | 110 | 0.598846 | 86 | 693 | 4.662791 | 0.511628 | 0.069825 | 0.099751 | 0.119701 | 0.159601 | 0.159601 | 0 | 0 | 0 | 0 | 0 | 0.00789 | 0.268398 | 693 | 26 | 111 | 26.653846 | 0.783037 | 0.066378 | 0 | 0 | 0 | 0 | 0.073409 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.25 | 0 | 0.4375 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1af5bda8e49289017e6590164cdb0fa51ea9e014 | 690 | py | Python | src/atcoder/abc126/f/sol_0.py | kagemeka/competitive-programming | c70fe481bcd518f507b885fc9234691d8ce63171 | [
"MIT"
] | 1 | 2021-07-11T03:20:10.000Z | 2021-07-11T03:20:10.000Z | src/atcoder/abc126/f/sol_0.py | kagemeka/competitive-programming | c70fe481bcd518f507b885fc9234691d8ce63171 | [
"MIT"
] | 39 | 2021-07-10T05:21:09.000Z | 2021-12-15T06:10:12.000Z | src/atcoder/abc126/f/sol_0.py | kagemeka/competitive-programming | c70fe481bcd518f507b885fc9234691d8ce63171 | [
"MIT"
] | null | null | null | r"""Note.
\forall{n \le 2}\ \xor_{i=0}^{2^n-1}{i} = 0
\xor_{0 \le i \lt 2^n, i \neq k (0 \le k \lt 2^n)}{i} = k
"""
import typing
import sys
# import numpy as np
# import numba as nb
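A quick check of the identity in the note above (illustrative only; the solution itself does not run this): for n >= 2 the xor of 0..2**n-1 is 0, so dropping one element k from the range leaves a xor of exactly k.

```python
from functools import reduce
from operator import xor

for n in (2, 3, 4):
    # full range xors to 0 because 2**n - 1 is congruent to 3 mod 4
    assert reduce(xor, range(2 ** n)) == 0
    k = 3
    # removing k flips the total from 0 to k
    assert reduce(xor, (i for i in range(2 ** n) if i != k)) == k
```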
def main() -> typing.NoReturn:
m, k = map(int, input().split())
n = pow(2, m)
if k >= n or m == k == 1:
print(-1)
return
if m == 1:
print(0, 0, 1, 1)
return
a = [-1] * (n << 1)
ptr = 0
for i in range(n):
if i == k: continue
a[ptr] = i
ptr += 1
a[ptr] = k
ptr += 1
for i in range(n - 1, -1, -1):
if i == k: continue
a[ptr] = i
ptr += 1
a[-1] = k
print(*a)
main() | 18.157895 | 57 | 0.417391 | 125 | 690 | 2.288 | 0.32 | 0.020979 | 0.027972 | 0.034965 | 0.237762 | 0.153846 | 0.153846 | 0.153846 | 0.153846 | 0.153846 | 0 | 0.064439 | 0.392754 | 690 | 38 | 58 | 18.157895 | 0.618138 | 0.213043 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037037 | false | 0 | 0.074074 | 0 | 0.185185 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
1af8dccc6477e942a760e2ea57872e651c435eaf | 6,750 | py | Python | entities/views.py | gve-sw/UserFeedbackSampleWebexTeamsBot | 01f62826a559cb47a6417bd677ab874e16bd41de | [
"RSA-MD"
] | 1 | 2021-08-05T21:51:03.000Z | 2021-08-05T21:51:03.000Z | entities/views.py | gve-sw/UserFeedbackSampleWebexTeamsBot | 01f62826a559cb47a6417bd677ab874e16bd41de | [
"RSA-MD"
] | 6 | 2020-06-15T21:10:38.000Z | 2021-06-04T23:26:58.000Z | entities/views.py | gve-sw/UserFeedbackSampleWebexTeamsBot | 01f62826a559cb47a6417bd677ab874e16bd41de | [
"RSA-MD"
] | null | null | null | import tempfile
import xlsxwriter
import re
from django.shortcuts import render, redirect, get_object_or_404
from django.views import View
from django.conf import settings
from django.http import HttpResponse
from django.contrib import messages
from .models import QuestionSet, Question, ReplySet, Reply, Receiver
from .forms import QuestionSetModelForm, QuestionFormSet, ReceiverModelForm, AddSubscriberForm, ChannelModelForm
from webexteamssdk import WebexTeamsAPI
from pyadaptivecards.card import AdaptiveCard
from pyadaptivecards.components import TextBlock, Choice
from pyadaptivecards.inputs import Text, Choices
# Create your views here.
def get_card_from_question_set(qs):
    body = []
    intro = TextBlock(f"## {qs.name}")
    body.append(intro)
    # Create an input for each of the questions
    for q in qs.questions.all():
        input_id = f"{qs.id}#{q.id}"
        label = TextBlock(f"**{q.text}**")
        body.append(label)
        if q.question_type == Question.TYPE_TEXT:
            field = Text(input_id)
            body.append(field)
        elif q.question_type == Question.TYPE_MC:
            # Multiple-choice options are embedded in the question text as
            # "(a,b,c)"; this raises AttributeError if the parentheses are missing.
            string_choices = re.search(r'\((.*?)\)', q.text).group(1).split(",")
            choices = [Choice(str_choice, str_choice) for str_choice in string_choices]
            field = Choices(choices, input_id)
            body.append(field)
    submit_action = {
        'type': "Action.Submit",
        'title': "Send Survey",
        'data': {
            'question_set': str(qs.id)
        }
    }
    card = AdaptiveCard(body=body, actions=[])
    ret = card.to_dict()
    ret['actions'].append(submit_action)
    return ret
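The multiple-choice branch above relies on the convention that the options are embedded in the question text inside parentheses. A standalone sketch of that parsing step (the helper name `parse_choices` is ours, not part of the app; unlike the view, it fails soft when the parentheses are absent):

```python
import re

def parse_choices(question_text):
    """Extract comma-separated choices from a '(...)' group in the question text."""
    match = re.search(r'\((.*?)\)', question_text)
    if match is None:
        return []  # no options embedded in this question
    return [c.strip() for c in match.group(1).split(",")]

print(parse_choices("How was the event? (Great,OK,Bad)"))
# → ['Great', 'OK', 'Bad']
```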
class UserView(View):
    def get(self, request):
        ctx = {
            'receiver': Receiver.objects.all(),
            'receiver_form': ReceiverModelForm(),
            'add_subscriber_form': AddSubscriberForm(),
            'add_channel_form': ChannelModelForm()
        }
        return render(request, 'entities/create_user.html', context=ctx)

    def post(self, request):
        form = ReceiverModelForm(request.POST)
        if form.is_valid():
            form.save()  # ModelForm.save() already persists the object
        return redirect('users')
class CreateChannelView(View):
    def post(self, request):
        f = ChannelModelForm(request.POST)
        if f.is_valid():
            f.save()  # ModelForm.save() already persists the object
            messages.add_message(request, messages.SUCCESS, "Channel successfully created!")
        return redirect('users')
class CreateSubscriptionView(View):
    def post(self, request):
        f = AddSubscriberForm(request.POST)
        if f.is_valid():
            c = f.cleaned_data['channel']
            for r in f.cleaned_data['receivers']:
                c.receiver.add(r)
        return redirect('users')
class ListQuestionSetView(View):
    def get(self, request):
        ctx = {
            'question_sets': QuestionSet.objects.all()
        }
        return render(request, "entities/list_questionsets.html", context=ctx)
class ReportView(View):
    def get(self, request, question_set_id):
        qs = get_object_or_404(QuestionSet, pk=question_set_id)
        # Get replies
        reply_sets = ReplySet.objects.filter(question_set=qs)
        ctx = {
            'question_set': qs,
            'reply_sets': reply_sets
        }
        return render(request, "entities/report.html", context=ctx)
class DownloadReportView(View):
    def get(self, request, question_set_id):
        qs = get_object_or_404(QuestionSet, pk=question_set_id)
        reply_sets = ReplySet.objects.filter(question_set=qs)
        with tempfile.NamedTemporaryFile(suffix='.xlsx') as temp:
            workbook = xlsxwriter.Workbook(temp.name)
            worksheet = workbook.add_worksheet()
            bold = workbook.add_format({'bold': True})
            # Header row: replier name followed by one column per question
            worksheet.write(0, 0, "Replier", bold)
            col = 1
            for q in qs.questions.all():
                worksheet.write(0, col, q.text, bold)
                col += 1
            # One row per reply set, answers aligned with the question columns
            row = 1
            for rs in reply_sets:
                worksheet.write(row, 0, str(rs.receiver))
                col = 1
                for q in qs.questions.all():
                    for r in rs.replies.all():
                        if r.question == q:
                            worksheet.write(row, col, r.text)
                    col += 1
                row += 1
            workbook.close()
            # xlsxwriter wrote to temp.name through its own handle; temp's
            # handle is still at offset 0, so read() returns the whole file.
            resp = HttpResponse(temp.read(),
                                content_type='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
        file_name = self.__sanitize_name(qs.name)
        resp['Content-Disposition'] = f"attachment; filename={file_name}.xlsx"
        return resp

    def __sanitize_name(self, name):
        return str(name).replace(" ", "_").lower()
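The filename sanitizer is simple enough to test in isolation. A module-level sketch of the same transformation (the name `sanitize_name` is ours; note it only handles spaces, so a stricter filter would be needed if question-set names could contain quotes or path separators):

```python
def sanitize_name(name):
    """Mirror of DownloadReportView.__sanitize_name: spaces to underscores, lowercased."""
    return str(name).replace(" ", "_").lower()

print(sanitize_name("Weekly Team Survey"))
# → weekly_team_survey
```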
class SendQuestionSetView(View):
    def get(self, request, question_set_id):
        qs = get_object_or_404(QuestionSet, pk=question_set_id)
        # Build a unique set of people to send the survey to
        to = set()
        for c in qs.channel.all():
            for r in c.receiver.all():
                to.add(r.mail)
        # Create the adaptive card
        card = get_card_from_question_set(qs)
        attachment = {
            "contentType": "application/vnd.microsoft.card.adaptive",
            "content": card
        }
        # Send the card to everyone in the set
        api = WebexTeamsAPI(access_token=settings.WEBEX_ACCESS_TOKEN)
        for mail in to:
            api.messages.create(toPersonEmail=mail,
                                markdown="Card. View on desktop",
                                attachments=[attachment])
        qs.was_send = True
        qs.save()
        return redirect('questions')
class CreateQuestionSetView(View):
    def get(self, request):
        ctx = {
            'question_set_form': QuestionSetModelForm(),
            'questions_form_set': QuestionFormSet(queryset=Question.objects.none())
        }
        return render(request, "entities/create_question.html", context=ctx)

    def post(self, request):
        qsf = QuestionSetModelForm(request.POST)
        questions_form_set = QuestionFormSet(request.POST)
        if qsf.is_valid() and questions_form_set.is_valid():
            qs = qsf.save()
            # Save all questions and attach them to the new question set
            for qf in questions_form_set:
                q = qf.save()
                qs.questions.add(q)
            qs.save()
            return redirect('create.question')
        else:
            return HttpResponse(qsf.errors)

# utils/pyart.py from terry97-guel/POENet-MultiTracking (MIT license)
import torch
# def expm(vector):
#     if vector.shape[0] == 6:
#         return to_SE3(vector)
#     if vector.shape[0] == 4:
#         return to_SO3(vector)
def t2pr(t):
    p = t[:, 0:3, 3]
    r = t[:, 0:3, 0:3]
    return (p, r)
def t2p(t):
    p = t[:, 0:3, 3]
    return p
def pr2x(p, r):
    device = p.device
    X = torch.zeros(p.size()[0], 6, 6, dtype=torch.float).to(device)
    E = torch.transpose(r, 1, 2)
    X[:, :3, :3] = E
    X[:, 3:, :3] = -torch.matmul(E, skew(p))
    X[:, 3:, 3:] = E
    return X
def t2x(t):
    (p, r) = t2pr(t)
    X = pr2x(p, r)
    return X
def pr2t(p, r):
    device = p.device
    T = torch.zeros(p.size()[0], 4, 4, dtype=torch.float).to(device)
    T[:, 0:3, 0:3] = r[:]
    T[:, 0:3, 3] = p[:]
    T[:, 3, 3] = 1
    return T
def skew(p):
    device = p.device
    skew_p = torch.zeros(p.size()[0], 3, 3, dtype=torch.float).to(device)
    zero = torch.zeros(p.size()[0], dtype=torch.float).to(device)
    skew_p[:, 0, :] = torch.vstack([zero, -p[:, 2], p[:, 1]]).transpose(0, 1)
    skew_p[:, 1, :] = torch.vstack([p[:, 2], zero, -p[:, 0]]).transpose(0, 1)
    skew_p[:, 2, :] = torch.vstack([-p[:, 1], p[:, 0], zero]).transpose(0, 1)
    return skew_p
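The defining property of `skew` is that multiplying the skew-symmetric matrix of `p` by a vector `v` gives the cross product `p × v`. A plain-float sketch of that identity for a single vector (helper names `skew3` and `matvec` are ours, not part of the library):

```python
def skew3(p):
    """3x3 skew-symmetric matrix of a single 3-vector, using plain lists."""
    x, y, z = p
    return [[0.0, -z, y],
            [z, 0.0, -x],
            [-y, x, 0.0]]

def matvec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

p, v = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
# skew3(p) @ v equals the cross product p x v
print(matvec(skew3(p), v))
# → [-3.0, 6.0, -3.0]
```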
def rpy2r(rpy):
    device = rpy.device
    R = torch.zeros(rpy.size()[0], 3, 3, dtype=torch.float).to(device)
    r = rpy[:, 0]
    p = rpy[:, 1]
    y = rpy[:, 2]
    R[:, 0, :] = torch.vstack([
        torch.cos(y)*torch.cos(p),
        -torch.sin(y)*torch.cos(r) + torch.cos(y)*torch.sin(p)*torch.sin(r),
        torch.sin(y)*torch.sin(r) + torch.cos(y)*torch.sin(p)*torch.cos(r)
    ]).transpose(0, 1)
    R[:, 1, :] = torch.vstack([
        torch.sin(y)*torch.cos(p),
        torch.cos(y)*torch.cos(r) + torch.sin(y)*torch.sin(p)*torch.sin(r),
        -torch.cos(y)*torch.sin(r) + torch.sin(y)*torch.sin(p)*torch.cos(r)
    ]).transpose(0, 1)
    R[:, 2, :] = torch.vstack([
        -torch.sin(p),
        torch.cos(p)*torch.sin(r),
        torch.cos(p)*torch.cos(r)
    ]).transpose(0, 1)
    return R
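As a sanity check on the matrix entries above, the same ZYX roll-pitch-yaw composition can be written for a single sample with plain floats (the helper name `rpy2r_single` is ours). A pure yaw of 90 degrees should rotate the x axis onto the y axis, so the (1, 0) entry of R must be 1:

```python
import math

def rpy2r_single(r, p, y):
    """Single-sample version of the batched rpy2r above (ZYX convention)."""
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    cy, sy = math.cos(y), math.sin(y)
    return [
        [cy*cp, -sy*cr + cy*sp*sr,  sy*sr + cy*sp*cr],
        [sy*cp,  cy*cr + sy*sp*sr, -cy*sr + sy*sp*cr],
        [-sp,    cp*sr,             cp*cr],
    ]

R = rpy2r_single(0.0, 0.0, math.pi / 2)  # pure yaw of 90 degrees
print(round(R[1][0], 6))
# → 1.0
```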
def inv_x(x):
    device = x.device
    invX = torch.zeros(x.size()[0], 6, 6, dtype=torch.float).to(device)
    E = x[:, :3, :3]
    temp = x[:, 3:, :3]
    invX[:, :3, :3] = torch.transpose(E, 1, 2)
    invX[:, :3, 3:] = torch.zeros(x.size()[0], 3, 3).to(device)
    invX[:, 3:, :3] = torch.transpose(temp, 1, 2)
    invX[:, 3:, 3:] = torch.transpose(E, 1, 2)
    return invX
def srodrigues(twist, q_value, verbose=False):
    # One twist is shared across the batch; q_value holds one joint value per sample.
    eps = 1e-10
    device = twist.device
    batch_size = q_value.size(0)
    T = torch.zeros(batch_size, 4, 4, dtype=torch.float).to(device)
    w = twist[:3]
    v = twist[3:]
    theta = w.norm(dim=0)
    if theta.item() < eps:
        # Pure translation: normalize by the linear part instead
        theta = v.norm(dim=0)
    q_value = q_value * theta
    w = w / theta
    v = v / theta
    w_skew = skew(w.unsqueeze(0)).squeeze(0)
    T[:, :3, :3] = rodrigues(w, q_value)
    T[:, :3, 3] = torch.outer(q_value, v) \
        + torch.outer(1 - torch.cos(q_value), w_skew @ v) \
        + torch.outer(q_value - torch.sin(q_value), w_skew @ w_skew @ v)
    T[:, 3, 3] = 1
    return T
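When the rotational part w of the twist is (near) zero, the branch above normalizes by |v| instead; since q is rescaled by the same factor, the product q·v is unchanged and `srodrigues` reduces to a pure translation by q·v with an identity rotation block. A plain-float sketch of that special case (the helper name is ours):

```python
def twist_to_transform_translation_only(v, q):
    """SE(3) matrix for a zero-rotation twist: identity rotation, translate by q * v."""
    return [[1.0, 0.0, 0.0, q * v[0]],
            [0.0, 1.0, 0.0, q * v[1]],
            [0.0, 0.0, 1.0, q * v[2]],
            [0.0, 0.0, 0.0, 1.0]]

T = twist_to_transform_translation_only([0.0, 0.0, 1.0], 2.5)
print(T[2][3])
# → 2.5
```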
def rodrigues(w, q, verbose=False):
    eps = 1e-10
    device = q.device
    batch_size = q.size()[0]
    if torch.norm(w) < eps:
        # Zero axis: the rotation is the identity for every sample in the batch
        R = torch.tile(torch.eye(3), (batch_size, 1, 1)).to(device)
        return R
    if abs(torch.norm(w) - 1) > eps:
        if verbose:
            print("Warning: [rodrigues] >> joint twist not normalized")
        theta = torch.norm(w)
        w = w / theta
        q = q * theta
    w_skew = skew(w.unsqueeze(0)).squeeze(0)
    # R = I + sin(q) [w]x + (1 - cos(q)) [w]x^2, broadcast over the batch
    R = torch.tensordot(torch.ones_like(q).unsqueeze(0), torch.eye(3).unsqueeze(0).to(device), dims=([0], [0])) \
        + torch.tensordot(torch.sin(q).unsqueeze(0), w_skew.unsqueeze(0), dims=([0], [0])) \
        + torch.tensordot((1 - torch.cos(q)).unsqueeze(0), (w_skew @ w_skew).unsqueeze(0), dims=([0], [0]))
    return R
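The batched tensordot expression above implements the Rodrigues rotation formula R = I + sin(q)·[w]× + (1 − cos(q))·[w]ײ. A single-sample, plain-float sketch of the same formula (the helper name `rodrigues_single` is ours): rotating (1, 0, 0) by 90 degrees about the z axis should give (0, 1, 0).

```python
import math

def rodrigues_single(w, q):
    """Rodrigues formula for a unit axis w and angle q, using plain lists."""
    wx, wy, wz = w
    K = [[0.0, -wz, wy], [wz, 0.0, -wx], [-wy, wx, 0.0]]          # [w]x
    KK = [[sum(K[i][k] * K[k][j] for k in range(3)) for j in range(3)]
          for i in range(3)]                                       # [w]x^2
    s, c = math.sin(q), 1.0 - math.cos(q)
    return [[(1.0 if i == j else 0.0) + s * K[i][j] + c * KK[i][j]
             for j in range(3)] for i in range(3)]

R = rodrigues_single([0.0, 0.0, 1.0], math.pi / 2)   # 90 degrees about z
v = [1.0, 0.0, 0.0]
rotated = [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]
print([round(x, 6) for x in rotated])
# → [0.0, 1.0, 0.0]
```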
def bnum2ls(branchNum):
    # For each branch count, emit that many 0s followed by a terminating 1
    branchLs = []
    for Num in branchNum:
        for i in range(Num):
            branchLs.append(0)
        branchLs.append(1)
    return branchLs

# dymos/transcriptions/common/timeseries_output_comp.py from yonghoonlee/dymos (Apache-2.0 license)
import openmdao.api as om
from ...transcriptions.grid_data import GridData
from ...options import options as dymos_options
class TimeseriesOutputCompBase(om.ExplicitComponent):
    """
    Class definition of the TimeseriesOutputCompBase.

    TimeseriesOutputComp collects variable values from the phase and provides them in
    chronological order as outputs. Some phase types don't internally have access to a
    contiguous array of all values of a given variable in the phase. For instance, the
    GaussLobatto pseudospectral transcription has separate arrays of variable values at
    discretization and collocation nodes. These values need to be interleaved to provide
    a time series. Pseudospectral techniques provide timeseries data at 'all' nodes,
    while ExplicitPhase provides values at the step boundaries.

    Parameters
    ----------
    **kwargs : dict
        Dictionary of optional arguments.
    """
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self._no_check_partials = not dymos_options['include_check_partials']

    def initialize(self):
        """
        Declare component options.
        """
        self._timeseries_outputs = []
        self._vars = {}

        self.options.declare('input_grid_data', types=GridData,
                             desc='Container object for grid on which inputs are provided.')
        self.options.declare('output_grid_data', types=GridData, allow_none=True, default=None,
                             desc='Container object for grid on which outputs are interpolated.')
        self.options.declare('output_subset', types=str, default='all',
                             desc='Name of the node subset at which outputs are desired.')